US20230069816A9 - Image sensor, mobile terminal, and image capturing method - Google Patents
- Publication number: US20230069816A9 (application US 17/151,745)
- Authority
- US
- United States
- Prior art keywords
- subpixel
- pixel
- infrared
- green
- red
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- H04N25/703—SSIS architectures incorporating pixels for producing signals other than image signals
- H04N25/705—Pixels for depth measurement, e.g. RGBZ
- H01L27/14605—Structural or functional details relating to the position of the pixel elements, e.g. smaller pixel elements in the center of the imager compared to pixel elements at the periphery
- H01L27/14629—Reflectors
- H01L27/14634—Assemblies, i.e. hybrid structures
- H01L27/14645—Colour imagers
- H01L27/1465—Infrared imagers of the hybrid type
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/55—Optical parts specially adapted for electronic image sensors; mounting thereof
- H04N23/67—Focus control based on electronic image sensor signals
- H04N25/131—Arrangement of colour filter arrays [CFA] characterised by the spectral characteristics of the filter elements, including elements passing infrared wavelengths
- H04N5/33—Transforming infrared radiation
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals, the I/O peripheral being an integrated camera
Definitions
- the present disclosure relates to the field of image processing technologies, and in particular, to an image sensor, a mobile terminal, and an image capturing method.
- Abbreviations: CMOS (complementary metal oxide semiconductor); R (red); G (green); B (blue).
- As shown in FIG. 1a and FIG. 1b, although this arrangement mode can improve the dark-state photographing effect compared with the Bayer mode, its disadvantage is that an object distance cannot be detected, and this arrangement mode can only be used to receive natural light, for taking pictures and recording video under normal lighting.
- A pixel array arrangement mode of the dual pixel focusing (2PD) technology is shown in FIG. 1c and FIG. 1d.
- Such an arrangement mode can also only be used to receive natural light, to take pictures and record video.
- Compared with the technical solution of the four-in-one RGB arrangement, such an arrangement mode can be used to detect an object distance and complete a focusing action more quickly, but has a non-ideal dark-state photographing effect.
- In the 2PD technology, each of some R, G, and B subpixels in a pixel array is divided into two halves, and the two halves obtain different light energy according to different incident directions, so that a left subpixel point and a right subpixel point form a phase detection pair.
- When the luminance values of the left subpixel point and the right subpixel point both reach a relative maximum peak value, the image is at its sharpest, that is, in focus; an object distance is then obtained through calculation by using an algorithm, to achieve fast focusing.
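The left/right phase comparison described above can be sketched in a few lines of Python. This is an illustrative model only, not taken from the patent; the `disparity` and `best_focus` helpers and the simulated lens sweep are assumptions for demonstration. The idea: when the left and right half-images agree, disparity is minimal and the scene is in focus.

```python
def disparity(left, right):
    """Sum of absolute differences between the left and right
    subpixel luminance lists; smallest when the image is in focus."""
    return sum(abs(a - b) for a, b in zip(left, right))

def best_focus(samples):
    """Pick the lens position whose left/right half-images agree best."""
    return min(samples, key=lambda s: disparity(s["left"], s["right"]))

# Simulated sweep over three hypothetical lens positions;
# at position 1 the left and right half-images coincide (in focus).
sweep = [
    {"pos": 0, "left": [10, 40, 10], "right": [40, 10, 10]},
    {"pos": 1, "left": [10, 50, 10], "right": [10, 50, 10]},
    {"pos": 2, "left": [40, 10, 10], "right": [10, 40, 10]},
]
print(best_focus(sweep)["pos"])  # prints 1
```

A production autofocus pipeline would of course correlate full phase-detection line images and map the measured shift to a lens displacement; the sketch only captures the "minimal disparity means in focus" criterion.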
- In other words, the pixel array arrangement modes of image sensors in the related technologies either focus slowly or cannot improve the dark-state capturing effect, thereby affecting the capturing experience of users.
- Embodiments of the present disclosure provide an image sensor, a mobile terminal, and an image capturing method, to resolve the problem that the pixel array arrangement modes of image sensors in the related technologies either focus slowly or cannot improve the dark-state capturing effect, which affects the capturing experience of users.
- an image sensor including:
- the pixel array includes a preset quantity of pixel units arranged in a predetermined manner
- the pixel unit includes a first pixel and a second pixel adjacent to the first pixel
- the first pixel includes a red subpixel, a green subpixel, and a blue subpixel
- the second pixel comprises the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel
- both the first pixel and the second pixel are dual pixel focusing pixels
- each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner
- a position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other; or
- a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form the entire infrared subpixel.
- an embodiment of the present disclosure provides a mobile terminal, including an imaging system and an infrared transmit module, where the imaging system includes:
- a driver module configured to drive the lens module to move
- a filtering module disposed between the lens module and the image sensor
- an image data processing module connected to the image sensor
- the infrared transmit module is disposed on a periphery of the lens module.
- an embodiment of the present disclosure further provides an image capturing method, applied to a mobile terminal, where the mobile terminal includes an infrared transmit module and the image sensor, and the method includes:
- the image sensor is formed by using a 2PD pixel array that combines RGB subpixels and infrared subpixels in a four-in-one arrangement. Capturing is performed by using the image sensor, and the distance between the to-be-captured object and the mobile terminal can be detected while an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, the dark-state imaging effect of the image can be improved, and related stereophotographing application functions can be implemented, thereby enriching the functionality of the mobile terminal while ensuring the capturing experience of the user.
- FIG. 1a is a schematic diagram of a four-in-one RGB arrangement in a related technology.
- FIG. 1b is a sectional view of a four-in-one pixel.
- FIG. 1c is an arrangement diagram of a 2PD pixel array.
- FIG. 1d is a sectional view of a 2PD pixel.
- FIG. 2a is a schematic diagram 1 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 2b is a schematic diagram 2 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 2c is a schematic diagram 3 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 3a is a schematic diagram 4 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 3b is a schematic diagram 5 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 4a is a schematic diagram 6 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 4b is a schematic diagram 7 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 5a is a schematic diagram 8 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 5b is a schematic diagram 9 of a pixel unit according to an embodiment of the present disclosure.
- FIG. 6 is a sectional view of a pixel according to an embodiment of the present disclosure.
- FIG. 7a is a schematic diagram of a mobile terminal according to an embodiment of the present disclosure.
- FIG. 7b is a schematic diagram of an imaging system according to an embodiment of the present disclosure.
- FIG. 8 is a schematic diagram of an image capturing method according to an embodiment of the present disclosure.
- FIG. 9 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present disclosure.
- An embodiment of the present disclosure provides an image sensor, including:
- the pixel array includes a preset quantity of pixel units arranged in a predetermined manner
- the pixel unit includes a first pixel and a second pixel adjacent to the first pixel, as shown in FIG. 2a to FIG. 2c, FIG. 3a and FIG. 3b, and FIG. 4a and FIG. 4b
- the first pixel includes a red subpixel, a green subpixel, and a blue subpixel
- the second pixel includes the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel
- both the first pixel and the second pixel are dual pixel focusing pixels
- each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner, where
- a position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other; or
- a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form the entire infrared subpixel.
- the pixel array included in the image sensor provided in this embodiment of the present disclosure includes the preset quantity of pixel units, and the preset quantity of pixel units are arranged in the predetermined manner.
- Each of the preset quantity of pixel units includes the first pixel and the second pixel.
- the first pixel is different from the second pixel.
- the first pixel includes the red subpixel (R), the green subpixel (G), and the blue subpixel (B).
- the second pixel further includes the green subpixel and the infrared subpixel (IR).
- Each of the first pixel and the second pixel in this embodiment of the present disclosure is a dual pixel focusing (2PD) pixel, and an object distance may be detected by using the 2PD pixel, to complete a focusing action more quickly.
- both the first pixel and the second pixel are 2PD pixels, that is, the subpixels in the first pixel and the second pixel are all 2PD subpixels.
- all the subpixels in the first pixel and the second pixel are arranged in a four-in-one manner.
- one subpixel includes four corresponding units, and the four units are structured as follows: two units are located in an upper row, the other two units are located in a lower row, and the two lower units are aligned with the two upper units. That is, the first unit and the second unit are arranged in sequence and adjacent to each other, the third unit is located below the first unit, and the fourth unit is located below the second unit.
- each unit is divided into two.
- the red subpixel includes four red units
- the green subpixel includes four green units
- the blue subpixel includes four blue units
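The four-in-one, 2PD-split layout just described can be modeled as a small data structure. This is a hypothetical sketch; the `four_in_one` helper and its field names are illustrative, not part of the patent.

```python
def four_in_one(color):
    """One four-in-one subpixel: a 2x2 grid of same-colour units,
    each unit split into a left and a right 2PD photodiode half."""
    unit = {"color": color, "halves": ("left", "right")}
    return [[dict(unit), dict(unit)],   # first and second units (upper row)
            [dict(unit), dict(unit)]]   # third and fourth units (lower row)

red = four_in_one("R")
# four units x two halves = eight photosites per four-in-one subpixel
n_halves = sum(len(u["halves"]) for row in red for u in row)
print(n_halves)  # prints 8
```

The count makes explicit why each four-in-one subpixel contributes eight individually readable photosites once every unit is split for phase detection.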
- the red subpixel, the green subpixel, and the blue subpixel that are in the first pixel are arranged in a certain way, and the first pixel includes one red subpixel, one blue subpixel, and two green subpixels.
- the two green subpixels are respectively referred to as a first green subpixel and a second green subpixel, where the first green subpixel is the same as the second green subpixel.
- the red subpixel is adjacent to the first green subpixel
- the second green subpixel is below the red subpixel
- the blue subpixel is below the first green subpixel
- the second green subpixel is adjacent to the blue subpixel.
- the second pixel further includes the green subpixel and the infrared subpixel. That is, the second pixel may include the red subpixel, the green subpixel, and the infrared subpixel, or may include the green subpixel, the blue subpixel, and the infrared subpixel, or may include the green subpixel, the red subpixel, the blue subpixel, and the infrared subpixel.
- the position of the infrared subpixel in the second pixel may be the same as a position of a subpixel in the first pixel, or a position of half of each of two different adjacent subpixels in the first pixel.
- the position of half the infrared subpixel in the second pixel may be further the same as a position of half of any subpixel in the first pixel.
- half of an infrared subpixel in two adjacent second pixels can be combined to form an entire infrared subpixel.
- the position of half the infrared subpixel in the second pixel is the same as the position of half the red subpixel in the first pixel.
- the position of half the infrared subpixel in an adjacent second pixel is the same as the position of half the green subpixel in the first pixel. In this way, the half infrared subpixels in the two adjacent second pixels can be combined to form an entire infrared subpixel.
- In this embodiment of the present disclosure, the RGB pixel array arrangement mode is improved to a four-in-one RGB-infrared (IR) pixel array arrangement mode, so that the mobile terminal can capture an image when receiving infrared light, thereby implementing dark-state imaging and ensuring the capturing experience of the user.
- the image sensor in this embodiment of the present disclosure may detect a distance between a to-be-captured object and the mobile terminal, thereby implementing rapid focusing and bokeh.
- an imaging effect of the image can be improved, and related stereophotographing application functions can be implemented, thereby enhancing the functionality of the mobile terminal while ensuring the capturing experience of the user.
- the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel, the green subpixel, the blue subpixel, the first combination of subpixels, or the second combination of subpixels in the first pixel
- the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other
- the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other.
- the infrared subpixel replaces the red subpixel in the first pixel.
- the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel in the first pixel.
- the infrared subpixel replaces the blue subpixel in the first pixel.
- the position of the infrared subpixel in the second pixel is the same as the position of the blue subpixel in the first pixel.
- the infrared subpixel may replace one green subpixel in the first pixel.
- the position of the infrared subpixel in the second pixel is the same as the position of the one green subpixel in the first pixel.
- That the second pixel includes the red subpixel, the green subpixel, the blue subpixel, and the infrared subpixel may be that the second pixel includes one red subpixel, one green subpixel, half a blue subpixel, half a green subpixel, and one infrared subpixel.
- the infrared subpixel replaces half a green subpixel and half a blue subpixel in the first pixel, that is, the position of the infrared subpixel in the second pixel is the same as a position of half the green subpixel and half the blue subpixel that are adjacent in the first pixel.
- That the second pixel includes the red subpixel, the green subpixel, the blue subpixel, and the infrared subpixel may also be that the second pixel includes one blue subpixel, one green subpixel, half a red subpixel, half a green subpixel, and one infrared subpixel.
- the infrared subpixel replaces half a green subpixel and half a red subpixel in the first pixel, that is, the position of the infrared subpixel in the second pixel is the same as a position of half the green subpixel and half the red subpixel that are adjacent in the first pixel.
- When the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel, the green subpixel, the blue subpixel, the first combination of subpixels, or the second combination of subpixels in the first pixel, the pixel unit includes one second pixel and at least one first pixel.
- a quantity of second pixels in the pixel unit is one, and a quantity of first pixels may be greater than or equal to one.
- For example, as shown in FIG. 5a, the pixel unit may include one first pixel and one second pixel.
- the second pixel includes one red subpixel, two green subpixels, and one infrared subpixel
- the first pixel includes one red subpixel, two green subpixels, and one blue subpixel.
- the infrared subpixel accounts for 1/8 of the pixel unit. It should be noted that the subpixels herein are all arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two.
- the second pixel may include one blue subpixel, two green subpixels, and one infrared subpixel
- the first pixel includes one red subpixel, two green subpixels, and one blue subpixel.
- the infrared subpixel accounts for 1/12 of the pixel unit.
- the subpixels herein are all arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two.
- For example, as shown in FIG. 3a, the pixel unit may include three first pixels and one second pixel.
- the second pixel includes one blue subpixel, one green subpixel, half a red subpixel, half a green subpixel, and one infrared subpixel.
- Half a red subpixel and half a green subpixel among 2PD subpixels may be taken as the infrared subpixel based on the first pixel.
- the first pixel includes one red subpixel, two green subpixels, and one blue subpixel.
- the infrared subpixel accounts for 1/16 of the pixel unit.
- the subpixels herein are arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two.
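The 1/8, 1/12, and 1/16 proportions quoted above follow directly from having one infrared subpixel site among four subpixel sites per pixel. A quick check in Python (illustrative only; `ir_fraction` is an assumed helper name, not from the patent):

```python
from fractions import Fraction

def ir_fraction(pixels_per_unit, ir_subpixels=1):
    """Proportion of the pixel unit occupied by IR, assuming four
    subpixel sites per pixel and one whole IR subpixel per unit."""
    return Fraction(ir_subpixels, 4 * pixels_per_unit)

print(ir_fraction(2))  # two pixels per unit   -> 1/8
print(ir_fraction(3))  # three pixels per unit -> 1/12
print(ir_fraction(4))  # four pixels per unit  -> 1/16
```

Each additional first pixel in the unit adds four non-IR subpixel sites, which is why the IR proportion falls from 1/8 to 1/12 to 1/16 as the unit grows.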
- a position of half an infrared subpixel in the second pixel is the same as a position of half a red subpixel, half a green subpixel, or half a blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form an entire infrared subpixel.
- the second pixel may include only half an infrared subpixel, and an entire infrared subpixel may be obtained by combining two second pixels.
- the position of half the infrared subpixel in the second pixel may be the same as a position of half a red subpixel in the first pixel, or may be the same as a position of half a green subpixel in the first pixel, or may be the same as a position of half a blue subpixel in the first pixel.
- a position of half an infrared subpixel in one second pixel is the same as a position of half a red subpixel in the first pixel
- a position of half an infrared subpixel in another second pixel is the same as a position of half a green subpixel in the first pixel.
- a position of half an infrared subpixel in one second pixel is the same as a position of half a green subpixel in the first pixel
- a position of half an infrared subpixel in another second pixel is the same as a position of half a blue subpixel or half a red subpixel in the first pixel.
- When the position of half the infrared subpixel in the second pixel is the same as the position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and the half infrared subpixels in two adjacent second pixels are combined to form an entire infrared subpixel, the pixel unit includes two second pixels and zero or more first pixels.
- When the quantity of pixels in the pixel unit is at least two, the pixel unit includes two second pixels and zero or more first pixels. When the quantity of pixels in the pixel unit is two, two second pixels are included. For example, as shown in FIG. 5b, the pixel unit includes two second pixels. One second pixel includes one red subpixel, one green subpixel, one blue subpixel, half a green subpixel, and half an infrared subpixel. The other second pixel includes one red subpixel, two green subpixels, half a blue subpixel, and half an infrared subpixel.
- a position of half an infrared subpixel in one second pixel is the same as a position of half a green subpixel in the first pixel
- a position of half an infrared subpixel in the other second pixel is the same as a position of half a blue subpixel in the first pixel.
- the infrared subpixel accounts for 1/8 of the pixel unit.
- the pixel unit includes two second pixels and one first pixel.
- One second pixel includes one red subpixel, one green subpixel, one blue subpixel, half a green subpixel, and half an infrared subpixel.
- the other second pixel includes half a red subpixel, two green subpixels, one blue subpixel, and half an infrared subpixel.
- a position of half the infrared subpixel in the one second pixel is the same as a position of half a green subpixel in the first pixel
- a position of half the infrared subpixel in the other second pixel is the same as a position of half a red subpixel in the first pixel.
- the infrared subpixel accounts for 1/12 of the pixel unit.
- the pixel unit includes two second pixels and two first pixels.
- One second pixel includes one red subpixel, one green subpixel, one blue subpixel, half a green subpixel, and half an infrared subpixel.
- the other second pixel includes one blue subpixel, two green subpixels, half a red subpixel, and half an infrared subpixel.
- a position of half the infrared subpixel in the one second pixel is the same as a position of half a green subpixel in the first pixel
- a position of half the infrared subpixel in the other second pixel is the same as a position of half a red subpixel in the first pixel.
- the infrared subpixel accounts for 1/16 of the pixel unit.
- the subpixels herein are all arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two.
- An RGB+IR pixel unit with a proportion of 1/8, an RGB+IR pixel unit with a proportion of 1/12, or an RGB+IR pixel unit with a proportion of 1/16 may be used as a pixel unit array, and a periodic array arrangement is performed on the pixel unit array to form the pixel array.
- the pixel array may be in another form, which is not listed herein.
- The density of infrared subpixels in the pixel unit (that is, the proportion) is 1/(4n), where n is an integer greater than or equal to 2, and the size of the pixel array to which the infrared subpixel is applicable is not limited. Only a few corresponding sampling implementations of the infrared subpixel are described above; other sampling manners may also be used and are not described one by one in the embodiments of the present disclosure. The sampling position of the infrared subpixel in the pixel unit (that is, the location of the second pixel) is not specifically limited in this embodiment of the present disclosure.
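As an informal sanity check of the 1/(4n) proportion (the helper below is illustrative and not part of the disclosure), the one whole infrared subpixel per pixel unit can be counted against the 4n subpixel positions of the unit:

```python
from fractions import Fraction

def ir_density(num_first_pixels: int) -> Fraction:
    """Share of infrared subpixels in one pixel unit.

    Assumed model: the unit holds two second pixels whose two half
    infrared subpixels add up to one whole infrared subpixel, plus
    `num_first_pixels` first pixels; every pixel spans four subpixel
    positions, giving 4n positions for n pixels in total.
    """
    n = 2 + num_first_pixels       # n >= 2 pixels in the unit
    return Fraction(1, 4 * n)      # one IR subpixel out of 4n positions

print(ir_density(0))  # 1/8  (two second pixels only)
print(ir_density(1))  # 1/12 (two second pixels + one first pixel)
print(ir_density(2))  # 1/16 (two second pixels + two first pixels)
```

This reproduces the three proportions described above (1/8, 1/12, and 1/16) for n = 2, 3, and 4.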
- a four-in-one red, green, or blue subpixel among the 2PD subpixels may be used as the infrared subpixel, to facilitate form diversification of the pixel unit; or half a four-in-one red subpixel and half a four-in-one green subpixel among the 2PD subpixels or half a four-in-one blue subpixel and half a four-in-one green subpixel may be used as the infrared subpixel, so that impact of IR dead pixels during RGB processing can be reduced.
- the red subpixel includes a semiconductor layer, a metal layer, a photodiode, a red light filter, and a micromirror that are sequentially stacked.
- the green subpixel includes a semiconductor layer, a metal layer, a photodiode, a green light filter, and a micromirror that are sequentially stacked.
- the blue subpixel includes a semiconductor layer, a metal layer, a photodiode, a blue light filter, and a micromirror that are sequentially stacked.
- the infrared subpixel includes a semiconductor layer, a metal layer, a photodiode, an infrared light filter, and a micromirror that are sequentially stacked.
- the semiconductor layer, the metal layer, the photodiode, the red light filter, and the micromirror that are included in the red subpixel are sequentially arranged from bottom to top.
- the semiconductor layer, the metal layer, the photodiode, the green light filter, and the micromirror that are included in the corresponding green subpixel are sequentially arranged from bottom to top.
- the semiconductor layer, the metal layer, the photodiode, the blue light filter, and the micromirror that are included in the blue subpixel are sequentially arranged from bottom to top.
- the semiconductor layer, the metal layer, the photodiode, the infrared filter, and the micromirror included in the infrared subpixel are sequentially arranged from bottom to top.
- the semiconductor layer herein may be a silicon substrate.
- the present disclosure is not limited thereto.
- For the structures of the red, green, blue, and infrared subpixels, refer to FIG. 6 . Although FIG. 6 shows only the green and infrared subpixels, the structures of the red and blue subpixels may be obtained on this basis.
- the green light filter may be replaced with the red or blue light filter, so that the structures of the red subpixel or the blue subpixel can be obtained.
- the red, green, and blue subpixels are used to obtain color information of pixels for synthesizing an image, and block infrared light from entering.
- visible light with a wavelength of only 380 nm to 700 nm is allowed to enter, and an image with full and vivid colors can be directly generated under high illumination.
- An infrared wavelength is 750 nm to 1100 nm, and a dark-state imaging effect can be improved by using an infrared band, thereby implementing an infrared distance measurement function.
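A minimal sketch of the band split described above (the function name and the "blocked" label for the 700-750 nm gap are illustrative assumptions, not terms from the disclosure):

```python
def classify_wavelength(nm: float) -> str:
    """Classify light by the bands given in the description: the RGB
    filters admit visible light (380-700 nm), while the infrared
    filter admits the infrared band (750-1100 nm)."""
    if 380 <= nm <= 700:
        return "visible"
    if 750 <= nm <= 1100:
        return "infrared"
    return "blocked"

print(classify_wavelength(550))  # visible (green light)
print(classify_wavelength(940))  # infrared
```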
- an RGB subpixel point is a light receiving element corresponding to wavelength light in each RGB color
- an IR subpixel point is a light receiving element corresponding to infrared light.
- the image sensor is a complementary metal oxide semiconductor CMOS image sensor, a charge-coupled device CCD image sensor, or a quantum film image sensor.
- the image sensor may be an image sensor based on a CMOS, an image sensor based on a charge-coupled device (Charge-coupled Device, CCD), or an image sensor based on a quantum film.
- the image sensor may be further of another type.
- the image sensor in this embodiment of the present disclosure is applicable to any electronic product including a camera module.
- An embodiment of the present disclosure further provides a mobile terminal.
- the mobile terminal 1 includes an imaging system 2 and an infrared transmit module 3 .
- the imaging system 2 further includes:
- a lens module 22 ; a driver module 23 configured to drive the lens module 22 to move; a filtering module 24 disposed between the lens module 22 and the image sensor 21 ; an image data processing module 25 connected to the image sensor 21 ; and a display module 26 connected to the image data processing module 25 , where the infrared transmit module 3 is disposed on a periphery of the lens module 22 .
- the mobile terminal 1 in this embodiment of the present disclosure further includes the infrared transmit module 3 .
- the imaging system 2 further includes the lens module 22 configured to focus light.
- the lens module 22 is connected to the driver module 23 , and the driver module 23 is configured to adjust a position of the lens module 22 based on a distance to a to-be-captured object.
- the filtering module 24 is disposed between the lens module 22 and the image sensor 21 . After light is focused by using the lens module 22 and passes through the filtering module 24 , the light may be focused on a pixel array of the image sensor 21 .
- the image sensor 21 is connected to the image data processing module 25 , and the image data processing module 25 is connected to the display module 26 . After the light is focused on the pixel array of the image sensor 21 , and the image sensor 21 performs photoelectric conversion on the light, data is transmitted to the image data processing module 25 . After the image data processing module 25 processes the data, the data is presented on the display module 26 in an image form.
- the driver module 23 may obtain a phase difference by using a 2PD pixel of the image sensor 21 , to obtain a distance between an object and an imaging surface, thereby implementing fast focusing.
- the four-in-one RGB+IR pixel array arrangement mode based on the 2PD image sensor in the present disclosure can cooperate with the infrared transmit module 3 to achieve stereo-related functions, such as terminal applications like face recognition and unlocking, secure payment, and stereo imaging, thereby improving functionality of the mobile terminal on the basis of ensuring imaging.
- the filtering module 24 in this embodiment of the present disclosure can pass light with wavelengths from 380 nm to 1100 nm. In this case, after the light is focused by using the lens module 22 , the light can be filtered by using the filtering module 24 . Both natural light and infrared light can pass through the filtering module 24 , and the filtering module 24 can be configured to ensure an imaging effect of the imaging system 2 .
- the infrared transmit module 3 on the mobile terminal is disposed on a periphery of the lens module 22 .
- the infrared transmit module 3 emits infrared light, and the infrared light is reflected after encountering an obstacle.
- After the imaging system 2 captures the reflected infrared light and photoelectric conversion is performed by using the infrared subpixel, a time difference from transmitting the infrared light to receiving the infrared light can be obtained. Because the light propagation speed is fixed, a distance between the obstacle and the mobile terminal may be obtained, and a distance from each minimum unit on the obstacle to the mobile terminal may finally be obtained, to implement a stereo imaging and recording function. Certainly, the distance between each infrared light reflection point on the obstacle and the mobile terminal may alternatively be obtained by obtaining a phase difference of the infrared light.
- the image sensor is formed by using a 2PD pixel array combining an RGB subpixel and an infrared subpixel that are in a four-in-one arrangement. Capturing is performed by using the image sensor, and a distance between the to-be-captured object and the mobile terminal may be detected when an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, a dark-state imaging effect of the image can be improved, and a related application function of stereophotographing is implemented, thereby ensuring functional diversity of the mobile terminal while ensuring the capturing experience of a user.
- An embodiment of the present disclosure further provides an image capturing method, applied to a mobile terminal.
- the mobile terminal further includes an infrared transmit module. As shown in FIG. 8 , the method includes:
- Step 801 : Transmit infrared light by using the infrared transmit module.
- the infrared transmit module on the mobile terminal can emit infrared light. After encountering a to-be-captured object, the infrared light is reflected, and the reflected infrared light is received by an imaging system of the mobile terminal.
- the image sensor of the mobile terminal forms a four-in-one RGB-IR pixel array. Therefore, photoelectric conversion can be performed by using an infrared subpixel.
- Step 802 : Obtain a distance between each infrared light reflection point on a to-be-captured object and the mobile terminal based on the infrared light reflected by the to-be-captured object.
- a process of obtaining the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal is as follows: receiving, by using a pixel array including a second pixel, the infrared light reflected by each infrared light reflection point on the to-be-captured object; and according to the time difference between sending and receiving the infrared light and a propagation speed of the infrared light or by obtaining the phase difference of the infrared light, obtaining the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal.
- the time difference from transmitting the infrared light to receiving the infrared light can be obtained. Because the light propagation speed is fixed, the distance between the obstacle and the mobile terminal may be obtained as half of the product of the time difference and the propagation speed. Time points at which the mobile terminal receives infrared light reflected by the infrared light reflection points are different. Therefore, a distance may be correspondingly calculated for each infrared light reflection point, so that the distance between each infrared light reflection point and the mobile terminal may be obtained. The distance between each infrared light reflection point and the mobile terminal may alternatively be obtained by obtaining the phase difference of the infrared light. For details, refer to a time of flight (Time of Flight, TOF) technology. Details are not described herein.
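The round-trip calculation described above can be sketched as follows; the function name and the nanosecond figures are illustrative, not taken from the disclosure:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, fixed propagation speed

def tof_distance(t_transmit_s: float, t_receive_s: float) -> float:
    """Time-of-flight distance to one reflection point: the infrared
    light travels out and back, so the one-way distance is half of
    the product of the time difference and the propagation speed."""
    time_difference = t_receive_s - t_transmit_s
    return SPEED_OF_LIGHT * time_difference / 2.0

# An echo arriving 10 ns after transmission puts the reflection
# point roughly 1.5 m from the terminal.
print(round(tof_distance(0.0, 10e-9), 3))  # 1.499
```

Repeating this per reflection point, with each point's own receive time, yields the per-point distances used for stereo imaging.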
- the pixel array of the image sensor in this embodiment of the present disclosure includes a preset quantity of pixel units arranged in a predetermined manner, the pixel unit includes a first pixel and a second pixel, the first pixel includes a red subpixel, a green subpixel, and a blue subpixel, the second pixel includes the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel, both the first pixel and the second pixel are dual pixel focusing pixels, and each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner.
- a position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other.
- a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form an entire infrared subpixel.
- Step 803 : Obtain three-dimensional information of the to-be-captured object based on the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal.
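One way to turn the per-point distances into three-dimensional information is to back-project each reflection point along its viewing ray. The spherical-coordinate model below is an illustrative assumption (the disclosure does not prescribe a particular geometry); each pixel's azimuth and elevation angles are assumed known from calibration:

```python
import math

def point_from_distance(dist_m: float, az_rad: float, el_rad: float):
    """Convert a measured distance plus the reflection point's viewing
    direction into Cartesian coordinates (x right, y up, z forward)."""
    x = dist_m * math.cos(el_rad) * math.sin(az_rad)
    y = dist_m * math.sin(el_rad)
    z = dist_m * math.cos(el_rad) * math.cos(az_rad)
    return (x, y, z)

# A point straight ahead at 2 m lies on the optical axis.
print(point_from_distance(2.0, 0.0, 0.0))  # (0.0, 0.0, 2.0)
```

Collecting one such point per infrared light reflection point produces a point cloud describing the shape of the to-be-captured object.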
- the image sensor is formed by using a 2PD pixel array combining an RGB subpixel and an infrared subpixel that are in a four-in-one arrangement. Capturing is performed by using the image sensor, and a distance between the to-be-captured object and the mobile terminal may be detected when an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, a dark-state imaging effect of the image can be improved, and a related application function of stereophotographing is implemented, thereby ensuring functional diversity of the mobile terminal while ensuring the capturing experience of a user.
- FIG. 9 is a schematic diagram of a hardware structure of a mobile terminal according to various embodiments of the present disclosure.
- the mobile terminal 900 includes, but is not limited to: a radio frequency unit 901 , a network module 902 , an audio output unit 903 , an input unit 904 , a sensor 905 , a display unit 906 , a user input unit 907 , an interface unit 908 , a memory 909 , a processor 910 , a power supply 911 , and other components.
- the mobile terminal 900 further includes an imaging system and an infrared transmit module.
- the imaging system includes an image sensor and a lens module; a driver module configured to drive the lens module to move; a filtering module disposed between the lens module and the image sensor; an image data processing module connected to the image sensor; and a display module connected to the image data processing module, where the infrared transmit module is disposed on a periphery of the lens module.
- the filtering module can pass light with wavelengths from 380 nm to 1100 nm.
- the image sensor includes: a pixel array, where the pixel array includes a preset quantity of pixel units arranged in a predetermined manner, the pixel unit includes a first pixel and a second pixel adjacent to the first pixel, the first pixel includes a red subpixel, a green subpixel, and a blue subpixel, the second pixel includes the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel, both the first pixel and the second pixel are dual pixel focusing pixels, and each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner; and
- a position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other; or
- a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form an entire infrared subpixel.
- When the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel, the green subpixel, the blue subpixel, the first combination of subpixels, or the second combination of subpixels in the first pixel, the pixel unit includes one second pixel and at least one first pixel.
- When the position of half the infrared subpixel in the second pixel is the same as the position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half the infrared subpixel in each of two adjacent second pixels is combined to form the entire infrared subpixel, the pixel unit includes two second pixels and zero or more first pixels.
- the red subpixel includes a semiconductor layer, a metal layer, a photodiode, a red light filter, and a micromirror that are sequentially stacked;
- the green subpixel includes a semiconductor layer, a metal layer, a photodiode, a green light filter, and a micromirror that are sequentially stacked;
- the blue subpixel includes a semiconductor layer, a metal layer, a photodiode, a blue light filter, and a micromirror that are sequentially stacked;
- the infrared subpixel includes a semiconductor layer, a metal layer, a photodiode, an infrared light filter, and a micromirror that are sequentially stacked.
- the image sensor is a complementary metal oxide semiconductor CMOS image sensor, a charge-coupled device CCD image sensor, or a quantum film image sensor.
- the mobile terminal shown in FIG. 9 does not constitute a limitation on the mobile terminal, and the mobile terminal may include more or fewer components than those shown in the figure, or combine some components, or have different component arrangements.
- the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like.
- the processor 910 is configured to: transmit infrared light by using the infrared transmit module; obtain a distance between each infrared light reflection point on a to-be-captured object and the mobile terminal based on the infrared light reflected by the to-be-captured object; and control, based on the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal, the imaging system to obtain three-dimensional information of the to-be-captured object.
- the image sensor is formed by using a 2PD pixel array combining an RGB subpixel and an infrared subpixel that are in a four-in-one arrangement. Capturing is performed by using the image sensor, and a distance between the to-be-captured object and the mobile terminal may be detected when an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, a dark-state imaging effect of the image can be improved, and a related application function of stereophotographing is implemented, thereby ensuring functional diversity of the mobile terminal while ensuring the capturing experience of a user.
- the radio frequency unit 901 may be configured to receive and send signals during information reception and transmission or during a call. Specifically, after receiving downlink data from a base station, the radio frequency unit 901 sends the downlink data to the processor 910 for processing, and sends uplink data to the base station.
- the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
- the radio frequency unit 901 may further communicate with a network and another device by using a wireless communications system.
- the mobile terminal provides wireless broadband Internet access for a user by using a network module 902 , for example, helping the user send and receive an email, browse a web page, and access streaming media.
- the audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902 or stored in the memory 909 into an audio signal and output the audio signal as sound. Moreover, the audio output unit 903 may further provide audio output (for example, call signal receiving sound and message receiving sound) related to a specific function executed by the mobile terminal 900 .
- the audio output unit 903 includes a speaker, a buzzer, a telephone receiver, and the like.
- the input unit 904 is configured to receive an audio signal or a video signal.
- the input unit 904 may include a graphics processing unit (Graphics Processing Unit, GPU) 9041 and a microphone 9042 .
- the graphics processing unit 9041 is configured to process image data of a static picture or a video obtained by an image capturing apparatus (for example, a camera) in video capturing mode or image capturing mode. An image frame obtained through processing may be displayed on the display unit 906 .
- the display unit herein is the display module.
- the image frame obtained through processing by the graphics processing unit 9041 may be stored in the memory 909 (or another storage medium) or sent by using the radio frequency unit 901 or the network module 902 .
- the graphics processing unit 9041 is the image data processing module.
- the microphone 9042 may receive sound and can process such sound into audio data.
- the audio data obtained through processing may be converted, in a telephone call mode, into a format that can be sent to a mobile communications base station via the radio frequency unit 901 for output.
- the mobile terminal 900 further includes at least one sensor 905 , such as an optical sensor, a motion sensor, and another sensor.
- the optical sensor includes an ambient light sensor and a proximity sensor.
- the ambient light sensor may adjust luminance of the display panel 9061 based on brightness of ambient light
- the proximity sensor may disable the display panel 9061 and/or backlight when the mobile terminal 900 approaches an ear.
- an accelerometer sensor may detect the magnitude of acceleration in each direction (generally three axes), and may detect the magnitude and direction of gravity when the terminal is static.
- the accelerometer sensor may be used for recognizing a gesture of a mobile terminal (for example, horizontal and vertical screen switching, a related game, or magnetometer posture calibration), a function related to vibration recognition (for example, a pedometer or a strike), or the like.
- the sensor 905 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like. Details are not described herein.
- the display unit 906 is configured to display information entered by a user or information provided for the user.
- the display unit 906 may include a display panel 9061 , and the display panel 9061 may be configured in a form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like.
- the user input unit 907 may be configured to receive input digit or character information and generate key signal input related to user setting and function control of the mobile terminal.
- the user input unit 907 includes a touch panel 9071 and another input device 9072 .
- the touch panel 9071 , also referred to as a touch screen, may collect a touch operation of a user on or near the touch panel 9071 (for example, the user uses any suitable object or accessory such as a finger or a stylus to operate on or near the touch panel 9071 ).
- the touch panel 9071 may include two parts: a touch detection apparatus and a touch controller.
- the touch detection apparatus detects a touch position of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller.
- the touch controller receives touch information from the touch detection apparatus, converts the touch information into contact coordinates, transmits the contact coordinates to the processor 910 , receives a command sent by the processor 910 , and executes the command.
- the touch panel 9071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave.
- the user input unit 907 may further include the another input device 9072 .
- the another input device 9072 may include, but is not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick. Details are not described herein.
- the touch panel 9071 can cover the display panel 9061 .
- the touch panel 9071 transmits the touch operation to the processor 910 to determine a type of a touch event. Then the processor 910 provides corresponding visual output on the display panel 9061 based on the type of the touch event.
- the touch panel 9071 and the display panel 9061 are used as two independent components to implement input and output functions of the mobile terminal. However, in some embodiments, the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the mobile terminal. This is not specifically limited herein.
- the interface unit 908 is an interface connecting an external apparatus to the mobile terminal 900 .
- the external apparatus may include a wired or wireless headset port, an external power supply (or a battery charger) port, a wired or wireless data port, a memory card port, a port for connecting an apparatus having an identification module, an audio input/output (I/O) port, a video I/O port, a headset port, and the like.
- the interface unit 908 may be configured to receive input (such as data information and power) from the external apparatus and transmit the received input to one or more elements in the mobile terminal 900 , or may be configured to transmit data between the mobile terminal 900 and the external apparatus.
- the memory 909 may be configured to store a software program and various data.
- the memory 909 may mainly include a program storage area and a data storage area.
- the program storage area may store an operating system, an application program required by at least one function (such as a sound play function or an image display function), and the like.
- the data storage area may store data (such as audio data or an address book) or the like created based on use of the mobile phone.
- the memory 909 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
- the processor 910 is a control center of the mobile terminal, and is connected to all parts of the entire mobile terminal by using various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or executing the software program and/or the module that are/is stored in the memory 909 and invoking the data stored in the memory 909 , to implement overall monitoring on the mobile terminal.
- the processor 910 may include one or more processing units.
- the processor 910 may integrate an application processor with a modem processor.
- the application processor mainly processes the operating system, a user interface, the application program, and the like, and the modem processor mainly processes wireless communication. It may be understood that the foregoing modem processor may not be integrated into the processor 910 .
- the mobile terminal 900 may further include a power supply 911 (such as a battery) that supplies power to each component.
- the power supply 911 may be logically connected to the processor 910 by using a power management system, to implement functions such as charging, discharging, and power consumption management by using the power management system.
- the mobile terminal 900 includes some function modules not shown, and details are not described herein.
- the computer software product is stored in a storage medium (such as a read-only memory (Read-only Memory, ROM)/random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc) and includes several instructions for instructing user equipment (which may be a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present disclosure.
Abstract
Description
- This application is a continuation application of PCT International Application No. PCT/CN2019/095367 filed on Jul. 10, 2019, which claims priority to Chinese Patent Application No. 201810798680.4 filed on Jul. 19, 2018 in China, the disclosures of which are incorporated in their entireties by reference herein.
- The present disclosure relates to the field of image processing technologies, and in particular, to an image sensor, a mobile terminal, and an image capturing method.
- In the related technologies, smart electronic products have gradually become necessities in people's lives, and the camera function, as an important configuration of electronic products, is also developing gradually. However, with the promotion and popularization of the camera function, people are no longer satisfied with only the basic camera function of a current smart electronic product, and expect diversified camera effects, diversified playing methods, and diversified functions.
- On the market, among pixel array arrangements of an image sensor based on a complementary metal oxide semiconductor (Complementary Metal Oxide Semiconductor, CMOS), there is a commonly used four-in-one pixel array arrangement mode improved based on the Bayer mode of red (Red, R), green (Green, G), and blue (Blue, B). As shown in
FIG. 1 a and FIG. 1 b , although this arrangement mode can be used to improve a dark-state photographing effect compared to the Bayer mode, a disadvantage is that an object distance cannot be detected, and this arrangement mode can only be used to receive natural light for taking pictures and recording images during normal lighting. - A pixel array arrangement mode of a dual pixel focusing 2PD technology is shown in
FIG. 1 c and FIG. 1 d . Such an arrangement mode can also only be used to receive natural light and to take pictures and record images. Compared with a technical solution of a four-in-one RGB arrangement, such an arrangement mode may be used to detect an object distance and complete a focusing action more quickly, but has a non-ideal dark-state photographing effect. - A principle of a 2PD phase detection technology is described as follows: As seen from
FIG. 1 c and FIG. 1 d , each of some R, G, and B subpixels in a pixel array is divided into two, and each half obtains different light energy according to different incident directions, so that a left subpixel point and a right subpixel point form a phase detection pair. When the luminance value of the left subpixel point and the luminance value of the right subpixel point both reach a relative maximum peak value, the image is the clearest at this moment, that is, in focus, and then the object distance is obtained through calculation by using an algorithm, to achieve fast focusing. - In summary, the pixel array arrangement mode of the image sensor in the related technologies has the problem of slow focusing or an inability to improve the dark-state capturing effect, thereby affecting the capturing experience of users.
- Embodiments of the present disclosure provide an image sensor, a mobile terminal, and an image capturing method, to resolve the problem in related technologies that a pixel array arrangement mode of the image sensor either focuses slowly or cannot improve the dark-state capturing effect, thereby affecting the capturing experience of users.
- According to a first aspect, an embodiment of the present disclosure provides an image sensor, including:
- a pixel array, where the pixel array includes a preset quantity of pixel units arranged in a predetermined manner, the pixel unit includes a first pixel and a second pixel adjacent to the first pixel, the first pixel includes a red subpixel, a green subpixel, and a blue subpixel, the second pixel includes the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel, both the first pixel and the second pixel are dual pixel focusing pixels, and each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner; where
- a position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other; or
- a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form the entire infrared subpixel.
- According to a second aspect, an embodiment of the present disclosure provides a mobile terminal, including an imaging system and an infrared transmit module, where the imaging system includes:
- the image sensor;
- a lens module;
- a driver module configured to drive the lens module to move;
- a filtering module disposed between the lens module and the image sensor;
- an image data processing module connected to the image sensor; and
- a display module connected to the image data processing module, where the infrared transmit module is disposed on a periphery of the lens module.
- According to a third aspect, an embodiment of the present disclosure further provides an image capturing method, applied to a mobile terminal, where the mobile terminal includes an infrared transmit module and the image sensor, and the method includes:
- transmitting infrared light by using the infrared transmit module;
- obtaining a distance between each infrared light reflection point on a to-be-captured object and the mobile terminal based on infrared light reflected by the to-be-captured object; and
- obtaining three-dimensional information of the to-be-captured object based on the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal.
- In the technical solutions of the present disclosure, the image sensor is formed by using a 2PD pixel array combining RGB subpixels and an infrared subpixel that are in a four-in-one arrangement. Capturing is performed by using the image sensor, and a distance between a to-be-captured object and the mobile terminal may be detected when an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, a dark-state imaging effect of the image can be improved, and a related application function of stereophotographing is implemented, thereby ensuring functional diversity of the mobile terminal while the capturing experience of a user is ensured.
- The following clearly describes the technical solutions in the embodiments of the present disclosure with reference to the accompanying drawings in the embodiments of the present disclosure. Apparently, the described embodiments are some rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative efforts shall fall within the protection scope of the present disclosure.
-
FIG. 1a is a schematic diagram of a four-in-one RGB arrangement in a related technology; -
FIG. 1b is a sectional view of a four-in-one pixel; -
FIG. 1c is an arrangement diagram of a 2PD pixel array; -
FIG. 1d is a sectional view of a 2PD pixel; -
FIG. 2a is a schematic diagram 1 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 2b is a schematic diagram 2 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 2c is a schematic diagram 3 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 3a is a schematic diagram 4 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 3b is a schematic diagram 5 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 4a is a schematic diagram 6 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 4b is a schematic diagram 7 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 5a is a schematic diagram 8 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 5b is a schematic diagram 9 of a pixel unit according to an embodiment of the present disclosure; -
FIG. 6 is a sectional view of a pixel according to an embodiment of the present disclosure; -
FIG. 7a is a schematic diagram of a mobile terminal according to an embodiment of the present disclosure; -
FIG. 7b is a schematic diagram of an imaging system according to an embodiment of the present disclosure; -
FIG. 8 is a schematic diagram of an image capturing method according to an embodiment of the present disclosure; and -
FIG. 9 is a schematic diagram of a hardware structure of a mobile terminal according to an embodiment of the present disclosure.
- An embodiment of the present disclosure provides an image sensor, including:
- a pixel array, where the pixel array includes a preset quantity of pixel units arranged in a predetermined manner, the pixel unit includes a first pixel and a second pixel adjacent to the first pixel, as shown in
FIG. 2a to FIG. 2c, FIG. 3a and FIG. 3b, and FIG. 4a and FIG. 4b, the first pixel includes a red subpixel, a green subpixel, and a blue subpixel, the second pixel includes the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel, both the first pixel and the second pixel are dual pixel focusing pixels, and each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner, where - a position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other; or
- a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form the entire infrared subpixel.
- The pixel array included in the image sensor provided in this embodiment of the present disclosure includes the preset quantity of pixel units, and the preset quantity of pixel units are arranged in the predetermined manner. Each of the preset quantity of pixel units includes the first pixel and the second pixel. The first pixel is different from the second pixel. The first pixel includes the red subpixel (R), the green subpixel (G), and the blue subpixel (B). In addition to at least one of the red subpixel or the blue subpixel, the second pixel further includes the green subpixel and the infrared subpixel (IR). By disposing the infrared subpixel in the second pixel, an image may be captured when infrared light is received, thereby implementing dark-state imaging and ensuring the capturing experience of a user.
- Each of the first pixel and the second pixel in this embodiment of the present disclosure is a dual pixel focusing (2PD) pixel, and an object distance may be detected by using the 2PD pixel, to complete a focusing action more quickly. Herein both the first pixel and the second pixel are 2PD pixels, that is, the subpixels in the first pixel and the second pixel are all 2PD subpixels. In addition, all the subpixels in the first pixel and the second pixel are arranged in a four-in-one manner. In the four-in-one arrangement, one subpixel includes four corresponding units arranged in a 2×2 grid: a first unit and a second unit are sequentially arranged and adjacent to each other at an upper layer, a third unit is located below the first unit at a lower layer, and a fourth unit is located below the second unit, so that the two units at the lower layer correspond to the two units at the upper layer.
- In addition, because the subpixel is the 2PD subpixel, each unit is divided into two. For example, the red subpixel includes four red units, the green subpixel includes four green units, the blue subpixel includes four blue units, and each unit is divided into two.
- The red subpixel, the green subpixel, and the blue subpixel in the first pixel are arranged as follows: the first pixel includes one red subpixel, one blue subpixel, and two green subpixels. Herein, for distinguishing, the two green subpixels are respectively referred to as a first green subpixel and a second green subpixel, where the first green subpixel is the same as the second green subpixel. The red subpixel is adjacent to the first green subpixel, the second green subpixel is below the red subpixel, the blue subpixel is below the first green subpixel, and the second green subpixel is adjacent to the blue subpixel.
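The first-pixel layout just described (red and first green subpixels in the top row, second green and blue subpixels below, each subpixel a four-in-one block of 2PD units) can be visualized with a small sketch. The snippet below is purely illustrative; the labels and function name are hypothetical and not part of the disclosure:

```python
# Label map of the first pixel: a 2x2 arrangement of subpixels
# (R, G on top; G, B below), where each subpixel is a four-in-one
# block of 2x2 units and each 2PD unit is split into a left ("L")
# and right ("R") half.
SUBPIXELS = [["R", "G"],
             ["G", "B"]]

def first_pixel_unit_map():
    rows = []
    for sub_row in SUBPIXELS:
        for _ in range(2):          # two unit rows per subpixel row
            row = []
            for color in sub_row:
                for _ in range(2):  # two units per subpixel column
                    row.extend([color + ":L", color + ":R"])
            rows.append(row)
    return rows

grid = first_pixel_unit_map()  # 4 unit rows x 8 half-unit columns
```

Printing `grid` row by row shows the red block occupying the upper-left quadrant and the blue block the lower-right, matching the description above.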
- In addition to at least one of the red subpixel or the blue subpixel, the second pixel further includes the green subpixel and the infrared subpixel. That is, the second pixel may include the red subpixel, the green subpixel, and the infrared subpixel, or may include the green subpixel, the blue subpixel, and the infrared subpixel, or may include the green subpixel, the red subpixel, the blue subpixel, and the infrared subpixel.
- The position of the infrared subpixel in the second pixel may be the same as a position of one subpixel in the first pixel, or a position of half of each of two different adjacent subpixels in the first pixel. Alternatively, the position of half the infrared subpixel in the second pixel may be the same as a position of half of any subpixel in the first pixel. In this case, the halves of the infrared subpixels in two adjacent second pixels can be combined to form an entire infrared subpixel. For example, the position of half the infrared subpixel in one second pixel is the same as the position of half the red subpixel in the first pixel, and the position of half the infrared subpixel in the adjacent second pixel is the same as the position of half the green subpixel in the first pixel, so that the two halves may be combined to form an entire infrared subpixel.
- In this embodiment of the present disclosure, an RGB pixel array arrangement mode is improved to a four-in-one RGB-infrared (Infrared Radiation, IR) pixel array arrangement mode, so that the mobile terminal captures an image when receiving infrared light, thereby implementing dark-state imaging, and ensuring capturing experience of a user.
- In addition, the image sensor in this embodiment of the present disclosure may detect a distance between a to-be-captured object and the mobile terminal, thereby implementing rapid focusing and bokeh. In addition, through cooperation with an infrared transmit module, an imaging effect of the image can be improved, and a related application function of stereophotographing is implemented, thereby enhancing functionality of the mobile terminal while capturing experience of a user is ensured.
- In this embodiment of the present disclosure, as shown in
FIG. 2a to FIG. 2c, FIG. 3a, and FIG. 3b, the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel, the green subpixel, the blue subpixel, the first combination of subpixels, or the second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other. - When the second pixel includes the blue subpixel, the green subpixel, and the infrared subpixel, the infrared subpixel replaces the red subpixel in the first pixel. In this case, the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel in the first pixel.
- When the second pixel includes the red subpixel, the green subpixel, and the infrared subpixel, the infrared subpixel replaces the blue subpixel in the first pixel. In this case, the position of the infrared subpixel in the second pixel is the same as the position of the blue subpixel in the first pixel.
- When the second pixel includes the red subpixel, the green subpixel, the blue subpixel, and the infrared subpixel, the infrared subpixel may replace one green subpixel in the first pixel. In this case, the position of the infrared subpixel in the second pixel is the same as the position of the one green subpixel in the first pixel.
- That the second pixel includes the red subpixel, the green subpixel, the blue subpixel, and the infrared subpixel may be further that the second pixel includes one red subpixel, one green subpixel, half a blue subpixel, half a green subpixel, and one infrared subpixel. In this case, the infrared subpixel replaces half a green subpixel and half a blue subpixel in the first pixel, that is, the position of the infrared subpixel in the second pixel is the same as a position of half the green subpixel and half the blue subpixel that are adjacent in the first pixel.
- That the second pixel includes the red subpixel, the green subpixel, the blue subpixel, and the infrared subpixel may also be that the second pixel includes one blue subpixel, one green subpixel, half a red subpixel, half a green subpixel, and one infrared subpixel. In this case, the infrared subpixel replaces half a green subpixel and half a red subpixel in the first pixel, that is, the position of the infrared subpixel in the second pixel is the same as a position of half the green subpixel and half the red subpixel that are adjacent in the first pixel.
- Based on the foregoing embodiment, when the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel, the green subpixel, the blue subpixel, the first combination of subpixels, or the second combination of subpixels in the first pixel, the pixel unit includes one second pixel and at least one first pixel.
- If the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel, the green subpixel, the blue subpixel, the first combination of subpixels, or the second combination of subpixels in the first pixel, a quantity of second pixels in the pixel unit is one, and a quantity of first pixels may be greater than or equal to one. When the pixel unit includes one second pixel and one first pixel, referring to an example shown in
FIG. 5a, the pixel unit includes one first pixel and one second pixel. The second pixel includes one red subpixel, two green subpixels, and one infrared subpixel, and the first pixel includes one red subpixel, two green subpixels, and one blue subpixel. In this case, the infrared subpixel accounts for ⅛ of the pixel unit. It should be noted that the subpixels herein are all arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two. - When the pixel unit includes two first pixels and one second pixel, the second pixel may include one blue subpixel, two green subpixels, and one infrared subpixel, and the first pixel includes one red subpixel, two green subpixels, and one blue subpixel. In this case, the infrared subpixel accounts for 1/12 of the pixel unit. The subpixels herein are all arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two.
- When the pixel unit includes three first pixels and one second pixel, for example, as shown in
FIG. 3a, the pixel unit includes three first pixels and one second pixel. The second pixel includes one blue subpixel, one green subpixel, half a red subpixel, half a green subpixel, and one infrared subpixel. That is, based on the first pixel, half a red subpixel and half a green subpixel among the 2PD subpixels are taken as the infrared subpixel. The first pixel includes one red subpixel, two green subpixels, and one blue subpixel. In this case, the infrared subpixel accounts for 1/16 of the pixel unit. The subpixels herein are arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two. - The foregoing point-taking manners are only used for illustration, and other point-taking manners may also be used, which are not described one by one herein.
- In this embodiment of the present disclosure, as shown in
FIG. 4a and FIG. 4b, a position of half an infrared subpixel in the second pixel is the same as a position of half a red subpixel, half a green subpixel, or half a blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form an entire infrared subpixel. - The second pixel may include only half an infrared subpixel, and an entire infrared subpixel may be obtained by combining two second pixels.
- When the second pixel includes half the infrared subpixel, the position of half the infrared subpixel in the second pixel may be the same as a position of half a red subpixel in the first pixel, or may be the same as a position of half a green subpixel in the first pixel, or may be the same as a position of half a blue subpixel in the first pixel.
- When a position of half an infrared subpixel in one second pixel is the same as a position of half a red subpixel in the first pixel, a position of half an infrared subpixel in another second pixel is the same as a position of half a green subpixel in the first pixel. When a position of half an infrared subpixel in one second pixel is the same as a position of half a green subpixel in the first pixel, a position of half an infrared subpixel in another second pixel is the same as a position of half a blue subpixel or half a red subpixel in the first pixel.
- Correspondingly, when the position of half the infrared subpixel in the second pixel is the same as the position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and halves of infrared subpixels in two adjacent second pixels are combined to form the entire infrared subpixel, the pixel unit includes two second pixels and zero or more first pixels.
- When a quantity of pixels in the pixel unit is at least two, the pixel unit includes two second pixels and zero or more first pixels. When the quantity of pixels in the pixel unit is two, two second pixels are included. For example, as shown in
FIG. 5b, the pixel unit includes two second pixels. One second pixel includes one red subpixel, one green subpixel, one blue subpixel, half a green subpixel, and half an infrared subpixel. The other second pixel includes one red subpixel, two green subpixels, half a blue subpixel, and half an infrared subpixel. In this case, when a position of half an infrared subpixel in one second pixel is the same as a position of half a green subpixel in the first pixel, a position of half an infrared subpixel in the other second pixel is the same as a position of half a blue subpixel in the first pixel. The infrared subpixel accounts for ⅛ of the pixel unit. - When the quantity of pixels in the pixel unit is three, two second pixels and one first pixel are included. For example, the pixel unit includes two second pixels and one first pixel. One second pixel includes one red subpixel, one green subpixel, one blue subpixel, half a green subpixel, and half an infrared subpixel. The other second pixel includes half a red subpixel, two green subpixels, one blue subpixel, and half an infrared subpixel. In this case, a position of half the infrared subpixel in the one second pixel is the same as a position of half a green subpixel in the first pixel, and a position of half the infrared subpixel in the other second pixel is the same as a position of half a red subpixel in the first pixel. The infrared subpixel accounts for 1/12 of the pixel unit.
- When the quantity of pixels in the pixel unit is four, two second pixels and two first pixels are included. For example, as shown in
FIG. 4b, the pixel unit includes two second pixels and two first pixels. One second pixel includes one red subpixel, one green subpixel, one blue subpixel, half a green subpixel, and half an infrared subpixel. The other second pixel includes one blue subpixel, two green subpixels, half a red subpixel, and half an infrared subpixel. In this case, when a position of half the infrared subpixel in the one second pixel is the same as a position of half a green subpixel in the first pixel, a position of half the infrared subpixel in the other second pixel is the same as a position of half a red subpixel in the first pixel. The infrared subpixel accounts for 1/16 of the pixel unit. The subpixels herein are all arranged in a four-in-one manner, each subpixel includes four corresponding units, and each unit is divided into two. - An RGB+IR pixel unit with a proportion of ⅛, an RGB+IR pixel unit with a proportion of 1/12, or an RGB+IR pixel unit with a proportion of 1/16 may be used as a pixel unit array, and a periodic array arrangement is performed on the pixel unit array to form the pixel array. Certainly, the pixel array may be in another form, which is not listed herein.
- Density of infrared subpixels in the pixel unit (that is, the proportion of the infrared subpixel in the pixel unit) is 1/(4n), where n is an integer greater than or equal to 2, and a size of the pixel array to which the infrared subpixel is applicable is not limited. Only a few corresponding point-taking implementations of the infrared subpixel are described above, and other point-taking manners may be further used. The other point-taking manners in the embodiments of the present disclosure are not described one by one herein. A point-taking position of the infrared subpixel in the pixel unit (the location of the second pixel) is not specifically limited in this embodiment of the present disclosure.
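The stated density can be checked with a short sketch (illustrative only; the function name is hypothetical). With n pixels per pixel unit and four subpixels per pixel, one infrared subpixel per unit gives a proportion of 1/(4n):

```python
from fractions import Fraction

def ir_density(n: int) -> Fraction:
    """Proportion of infrared subpixels in a pixel unit of n pixels,
    each pixel holding four subpixels and the unit holding one
    infrared subpixel: density = 1/(4n), with n >= 2."""
    if n < 2:
        raise ValueError("n must be an integer >= 2")
    return Fraction(1, 4 * n)
```

For n = 2, 3, 4 this reproduces the three proportions discussed above: ⅛, 1/12, and 1/16.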
- For a form of the foregoing pixel unit, a four-in-one red, green, or blue subpixel among the 2PD subpixels may be used as the infrared subpixel, to facilitate form diversification of the pixel unit; or half a four-in-one red subpixel and half a four-in-one green subpixel among the 2PD subpixels, or half a four-in-one blue subpixel and half a four-in-one green subpixel, may be used as the infrared subpixel, so that impact of IR dead pixels during RGB processing can be reduced.
- In this embodiment of the present disclosure, the red subpixel includes a semiconductor layer, a metal layer, a photodiode, a red light filter, and a micromirror that are sequentially stacked. The green subpixel includes a semiconductor layer, a metal layer, a photodiode, a green light filter, and a micromirror that are sequentially stacked. The blue subpixel includes a semiconductor layer, a metal layer, a photodiode, a blue light filter, and a micromirror that are sequentially stacked. The infrared subpixel includes a semiconductor layer, a metal layer, a photodiode, an infrared light filter, and a micromirror that are sequentially stacked.
- The semiconductor layer, the metal layer, the photodiode, the red light filter, and the micromirror that are included in the red subpixel are sequentially arranged from bottom to top. The semiconductor layer, the metal layer, the photodiode, the green light filter, and the micromirror that are included in the corresponding green subpixel are sequentially arranged from bottom to top. The semiconductor layer, the metal layer, the photodiode, the blue light filter, and the micromirror that are included in the blue subpixel are sequentially arranged from bottom to top. The semiconductor layer, the metal layer, the photodiode, the infrared filter, and the micromirror included in the infrared subpixel are sequentially arranged from bottom to top. The semiconductor layer herein may be a silicon substrate. However, the present disclosure is not limited thereto. For structures of the red, green, blue, and infrared subpixels, refer to
FIG. 6. Although FIG. 6 shows only the green and infrared subpixels, the structures of the red and blue subpixels may be obtained on this basis: the green light filter may be replaced with the red or blue light filter, so that the structure of the red subpixel or the blue subpixel can be obtained. - The red, green, and blue subpixels are used to obtain color information of pixels for synthesizing an image, and block infrared light from entering. For example, only visible light with a wavelength of 380 nm to 700 nm is allowed to enter, and an image with full and vivid colors can be directly generated under high illumination. The infrared wavelength ranges from 750 nm to 1100 nm, and a dark-state imaging effect can be improved by using the infrared band, thereby implementing an infrared distance measurement function.
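The pass bands described above can be expressed as a simple classification sketch (illustrative only; the function name is hypothetical, and the boundaries are taken as visible 380–700 nm and infrared 750–1100 nm per the description):

```python
def band(wavelength_nm: float) -> str:
    """Classify a wavelength against the pass bands stated above:
    the R/G/B filters pass visible light (380-700 nm) and block
    infrared, while the IR filter passes 750-1100 nm."""
    if 380 <= wavelength_nm <= 700:
        return "visible"   # reaches the R/G/B photodiodes
    if 750 <= wavelength_nm <= 1100:
        return "infrared"  # reaches the IR photodiode
    return "blocked"
```

Wavelengths in the 700–750 nm gap, and anything outside 380–1100 nm, fall outside both stated bands and are treated as blocked here.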
- As can be seen from the above description, an RGB subpixel point is a light receiving element corresponding to wavelength light in each RGB color, and an IR subpixel point is a light receiving element corresponding to infrared light.
- In this embodiment of the present disclosure, the image sensor is a complementary metal oxide semiconductor CMOS image sensor, a charge-coupled device CCD image sensor, or a quantum film image sensor.
- In the present disclosure, types of image sensors to which the four-in-one RGB-IR pixel array arrangement mode is applicable are not limited. The image sensor may be an image sensor based on a CMOS, an image sensor based on a charge-coupled device (Charge-coupled Device, CCD), or an image sensor based on a quantum film. Certainly, the image sensor may be further of another type. In addition, the image sensor in this embodiment of the present disclosure is applicable to any electronic product including a camera module.
- An embodiment of the present disclosure further provides a mobile terminal. As shown in
FIG. 7a and FIG. 7b, the mobile terminal 1 includes an imaging system 2 and an infrared transmit module 3. In addition to the foregoing image sensor 21, the imaging system 2 further includes: - a
lens module 22; a driver module 23 configured to drive the lens module 22 to move; a filtering module 24 disposed between the lens module 22 and the image sensor 21; an image data processing module 25 connected to the image sensor 21; and a display module 26 connected to the image data processing module 25, where the infrared transmit module 3 is disposed on a periphery of the lens module 22. - In addition to the
imaging system 2, the mobile terminal 1 in this embodiment of the present disclosure further includes the infrared transmit module 3. In addition to the image sensor 21, the imaging system 2 further includes the lens module 22 configured to focus light. The lens module 22 is connected to the driver module 23, and the driver module 23 is configured to adjust a position of the lens module 22 based on a distance to a to-be-captured object. - The
filtering module 24 is disposed between the lens module 22 and the image sensor 21. After light is focused by using the lens module 22 and passes through the filtering module 24, the light may be focused on a pixel array of the image sensor 21. The image sensor 21 is connected to the image data processing module 25, and the image data processing module 25 is connected to the display module 26. After the light is focused on the pixel array of the image sensor 21 and the image sensor 21 performs photoelectric conversion on the light, data is transmitted to the image data processing module 25. After the image data processing module 25 processes the data, the data is presented on the display module 26 in an image form. - After adjusting the position of the
lens module 22, the driver module 23 may obtain a phase difference by using a 2PD pixel of the image sensor 21, to obtain a distance between an object and an imaging surface, thereby implementing fast focusing. - In addition, the four-in-one RGB+IR pixel array arrangement mode based on the 2PD image sensor in the present disclosure can cooperate with the infrared transmit
module 3 to achieve stereo-related functions, such as terminal applications like face recognition and unlocking, secure payment, and stereo imaging, thereby improving functionality of the mobile terminal on the basis of ensuring imaging. - The
filtering module 24 in this embodiment of the present disclosure passes light with wavelengths from 380 nm to 1100 nm. In this case, after the light is focused by using the lens module 22, the light can be filtered by using the filtering module 24. Both natural light and infrared light can pass through the filtering module 24, and the filtering module 24 is configured to ensure an imaging effect of the imaging system 2. - The infrared transmit
module 3 on the mobile terminal is disposed on a periphery of the lens module 22. The infrared transmit module 3 emits infrared light, and the infrared light is reflected after encountering an obstacle. After the imaging system 2 captures the reflected infrared light and photoelectric conversion is performed by using the infrared subpixel, a time difference from transmitting the infrared light to receiving the infrared light can be obtained. Because the propagation speed of light is fixed, a distance between the obstacle and the mobile terminal may be obtained, and a distance from each minimum unit on the obstacle to the mobile terminal may be finally obtained, to implement a stereo imaging and recording function. Certainly, a distance between each infrared light reflection point on the obstacle and the mobile terminal may alternatively be obtained by obtaining a phase difference of the infrared light. - For the mobile terminal in this embodiment of the present disclosure, the image sensor is formed by using a 2PD pixel array combining RGB subpixels and an infrared subpixel that are in a four-in-one arrangement. Capturing is performed by using the image sensor, and a distance between the to-be-captured object and the mobile terminal may be detected when an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, a dark-state imaging effect of the image can be improved, and a related application function of stereophotographing is implemented, thereby ensuring functional diversity of the mobile terminal while the capturing experience of a user is ensured.
- An embodiment of the present disclosure further provides an image capturing method, applied to a mobile terminal. In addition to the foregoing image sensor, the mobile terminal further includes an infrared transmit module. As shown in
FIG. 8, the method includes: -
Step 801. Transmit infrared light by using the infrared transmit module. - The infrared transmit module on the mobile terminal can emit infrared light. After encountering a to-be-captured object, the infrared light is reflected, and the reflected infrared light is received by an imaging system of the mobile terminal. Because the image sensor of the mobile terminal uses a four-in-one RGB-IR pixel array, photoelectric conversion can be performed by using the infrared subpixel.
-
Step 802. Obtain a distance between each infrared light reflection point on a to-be-captured object and the mobile terminal based on the infrared light reflected by the to-be-captured object. - When the distance between the to-be-captured object and the mobile terminal is obtained, what is actually obtained is a distance between the to-be-captured object and an imaging surface. The process of obtaining the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal is as follows: receive, by using a pixel array including the second pixel, the infrared light reflected by each infrared light reflection point on the to-be-captured object; and obtain the distance between each infrared light reflection point and the mobile terminal either from the time difference between sending and receiving the infrared light together with the propagation speed of the infrared light, or from the phase difference of the infrared light.
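The phase-difference alternative named above corresponds to indirect time-of-flight ranging: the emitted infrared light is amplitude-modulated, and the phase shift of the reflected light encodes the round-trip delay. The sketch below is only an illustration; the modulation frequency and the function name are assumptions, not values given in this disclosure.

```python
import math

C = 299_792_458.0  # propagation speed of light, m/s

def distance_from_phase(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Indirect ToF: distance = c * phase_shift / (4 * pi * f_mod).
    The factor 4*pi (rather than 2*pi) accounts for the round trip."""
    if not 0.0 <= phase_shift_rad < 2.0 * math.pi:
        raise ValueError("phase shift must lie in [0, 2*pi)")
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Example: a pi/2 phase shift at an assumed 20 MHz modulation frequency.
d = distance_from_phase(math.pi / 2, 20e6)  # ~1.87 m
```

Note that the phase wraps every 2*pi, so indirect ToF has a limited unambiguous range of c / (2 * f_mod); this limitation is a general property of the technique, not something stated in the disclosure.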
- After the reflected infrared light is captured and photoelectric conversion is performed by the infrared subpixel, the time difference from transmitting the infrared light to receiving the infrared light can be obtained. Because the propagation speed of light is fixed, the distance between the obstacle and the mobile terminal may be obtained as half of the product of the time difference and the propagation speed. The time points at which the mobile terminal receives the infrared light reflected by different infrared light reflection points differ; therefore, a distance may be calculated for each infrared light reflection point, so that the distance between each infrared light reflection point and the mobile terminal is obtained. Alternatively, the distance between each infrared light reflection point and the mobile terminal may be obtained from the phase difference of the infrared light. For details, refer to the time of flight (Time of Flight, TOF) technology. Details are not described herein.
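The time-difference calculation above (half the product of the time difference and the propagation speed) can be sketched as follows; the function and variable names are ours, chosen for illustration.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, the fixed propagation speed

def distance_from_time_of_flight(t_emit_s: float, t_receive_s: float) -> float:
    """Direct ToF: the infrared light travels to the reflection point and
    back, so the one-way distance is (time difference * speed) / 2."""
    dt = t_receive_s - t_emit_s
    if dt < 0:
        raise ValueError("receive time must not precede emit time")
    return SPEED_OF_LIGHT * dt / 2.0

# A reflection received 20 ns after emission is roughly 3 m away.
print(distance_from_time_of_flight(0.0, 20e-9))  # ~2.998 m
```

Applying this per infrared light reflection point, as the text describes, yields one distance per point, i.e. a depth map of the scene.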
- The pixel array of the image sensor in this embodiment of the present disclosure includes a preset quantity of pixel units arranged in a predetermined manner, the pixel unit includes a first pixel and a second pixel, the first pixel includes a red subpixel, a green subpixel, and a blue subpixel, the second pixel includes the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel, both the first pixel and the second pixel are dual pixel focusing pixels, and each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner. A position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other. Alternatively, a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form an entire infrared subpixel.
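One of the arrangements described above, where the infrared subpixel takes the position of the blue subpixel, can be made concrete with a small grid builder. This is only one hypothetical instance of the four-in-one layout (the paragraph lists several variants), and the helper names are ours.

```python
def four_in_one(ch: str) -> list[list[str]]:
    """Expand one subpixel channel into its 2x2 four-in-one block."""
    return [[ch, ch], [ch, ch]]

def build_pixel(channels: list[list[str]]) -> list[list[str]]:
    """Assemble a 4x4 pixel from a 2x2 grid of subpixel channels,
    each channel occupying a four-in-one block."""
    rows = []
    for ch_row in channels:
        blocks = [four_in_one(ch) for ch in ch_row]
        for r in range(2):
            rows.append([b[r][c] for b in blocks for c in range(2)])
    return rows

# First pixel: R/G/G/B. Second pixel: the blue block replaced by infrared,
# matching the variant where IR sits at the blue subpixel's position.
first_pixel = build_pixel([["R", "G"], ["G", "B"]])
second_pixel = build_pixel([["R", "G"], ["G", "IR"]])
```

Printing `first_pixel` row by row shows the familiar four-in-one Bayer-like tiling; `second_pixel` differs only in its lower-right quadrant.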
-
Step 803. Obtain three-dimensional information of the to-be-captured object based on the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal. - When the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal is obtained, a distance from each minimum unit on the to-be-captured object to the mobile terminal is in effect obtained, and then the process of capturing the to-be-captured object is executed, to implement a stereo imaging and recording function.
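Turning the per-point distances of Step 802 into the three-dimensional information of Step 803 requires a camera model. The disclosure does not fix one, so the sketch below assumes a simple pinhole model with hypothetical intrinsics (focal lengths `fx`, `fy` and principal point `cx`, `cy`) and back-projects each reflection point along its viewing ray.

```python
import math

def backproject(u: float, v: float, distance_m: float,
                fx: float, fy: float, cx: float, cy: float):
    """Convert a pixel coordinate (u, v) and its measured distance into an
    (x, y, z) point in camera coordinates, assuming a pinhole camera.
    The distance is measured along the viewing ray, so it is scaled onto
    the normalised ray direction."""
    rx, ry, rz = (u - cx) / fx, (v - cy) / fy, 1.0
    norm = math.sqrt(rx * rx + ry * ry + rz * rz)
    scale = distance_m / norm
    return rx * scale, ry * scale, rz * scale

# A point at the principal point maps straight down the optical axis.
x, y, z = backproject(320.0, 240.0, 2.0, fx=500.0, fy=500.0, cx=320.0, cy=240.0)
# (0.0, 0.0, 2.0)
```

Applying this to every reflection point yields a point cloud, one concrete form of the "three-dimensional information" the step refers to.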
- For the image capturing method in this embodiment of the present disclosure, the image sensor is formed by a 2PD pixel array that combines RGB subpixels and infrared subpixels in a four-in-one arrangement. Capturing is performed by the image sensor, and a distance between the to-be-captured object and the mobile terminal may be detected while an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, a dark-state imaging effect of the image can be improved, and a related application function of stereo photographing is implemented, thereby ensuring functional diversity of the mobile terminal while ensuring the capturing experience of a user.
-
FIG. 9 is a schematic diagram of a hardware structure of a mobile terminal according to various embodiments of the present disclosure. The mobile terminal 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, a processor 910, a power supply 911, and other components. - The
mobile terminal 900 further includes an imaging system and an infrared transmit module. The imaging system includes an image sensor and a lens module; a driver module configured to drive the lens module to move; a filtering module disposed between the lens module and the image sensor; an image data processing module connected to the image sensor; and a display module connected to the image data processing module. The infrared transmit module is disposed on a periphery of the lens module. - The filtering module can pass light with optical wavelengths from 380 nm to 1100 nm.
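The 380 nm to 1100 nm passband means the filtering module admits both visible light (roughly 380-780 nm) and the near-infrared band used by the infrared subpixels. A trivial sketch of that gate, where the bounds come from the text but the constant and function names are ours:

```python
PASSBAND_NM = (380.0, 1100.0)  # wavelength range the filtering module passes

def passes_filter(wavelength_nm: float) -> bool:
    """Return True if light of this wavelength passes the filtering module."""
    lo, hi = PASSBAND_NM
    return lo <= wavelength_nm <= hi

# Visible green (532 nm) and near-infrared (940 nm) pass; ultraviolet does not.
print(passes_filter(532.0), passes_filter(940.0), passes_filter(300.0))
```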
- The image sensor includes: a pixel array, where the pixel array includes a preset quantity of pixel units arranged in a predetermined manner, the pixel unit includes a first pixel and a second pixel adjacent to the first pixel, the first pixel includes a red subpixel, a green subpixel, and a blue subpixel, the second pixel includes the green subpixel, an infrared subpixel, and at least one of the red subpixel or the blue subpixel, both the first pixel and the second pixel are dual pixel focusing pixels, and each subpixel in the first pixel and the second pixel is arranged in a four-in-one manner; and
- a position of the infrared subpixel in the second pixel is the same as a position of the red subpixel, the green subpixel, the blue subpixel, a first combination of subpixels, or a second combination of subpixels in the first pixel, the first combination of subpixels is a combination of half a red subpixel and half a green subpixel that are adjacent to each other, and the second combination of subpixels is a combination of half a green subpixel and half a blue subpixel that are adjacent to each other; or
- a position of half the infrared subpixel in the second pixel is the same as a position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half an infrared subpixel in each of two adjacent second pixels is combined to form an entire infrared subpixel.
- When the position of the infrared subpixel in the second pixel is the same as the position of the red subpixel, the green subpixel, the blue subpixel, the first combination of subpixels, or the second combination of subpixels in the first pixel, the pixel unit includes one second pixel and at least one first pixel.
- When the position of half the infrared subpixel in the second pixel is the same as the position of half the red subpixel, half the green subpixel, or half the blue subpixel in the first pixel, and half the infrared subpixel in each of two adjacent second pixels is combined to form the entire infrared subpixel, the pixel unit includes two second pixels and zero or more first pixels.
- The red subpixel includes a semiconductor layer, a metal layer, a photodiode, a red light filter, and a micromirror that are sequentially stacked; the green subpixel includes a semiconductor layer, a metal layer, a photodiode, a green light filter, and a micromirror that are sequentially stacked; the blue subpixel includes a semiconductor layer, a metal layer, a photodiode, a blue light filter, and a micromirror that are sequentially stacked; the infrared subpixel includes a semiconductor layer, a metal layer, a photodiode, an infrared light filter, and a micromirror that are sequentially stacked.
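The stack described above is identical across the four subpixel types except for the light filter, which a small data structure makes explicit. This is an illustrative model only; the class and layer names are ours, not terms from the disclosure.

```python
from dataclasses import dataclass

# Layers shared by every subpixel, bottom to top, per the description.
COMMON_LAYERS = ("semiconductor layer", "metal layer", "photodiode")

@dataclass(frozen=True)
class Subpixel:
    """A subpixel modelled as its bottom-to-top layer stack: the common
    layers, then a channel-specific light filter, then the micromirror."""
    channel: str  # "red", "green", "blue", or "infrared"

    @property
    def stack(self) -> tuple:
        return COMMON_LAYERS + (f"{self.channel} light filter", "micromirror")

red = Subpixel("red")
infrared = Subpixel("infrared")
# The two stacks differ only in the filter layer.
```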
- The image sensor is a complementary metal oxide semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or a quantum film image sensor.
- A person skilled in the art may understand that a structure of the mobile terminal shown in
FIG. 9 does not constitute a limitation on the mobile terminal, and the mobile terminal may include more or fewer components than those shown in the figure, or combine some components, or have different component arrangements. In this embodiment of the present disclosure, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a laptop computer, a palmtop computer, an in-vehicle terminal, a wearable device, a pedometer, and the like. - The
processor 910 is configured to: transmit infrared light by using the infrared transmit module; obtain a distance between each infrared light reflection point on a to-be-captured object and the mobile terminal based on the infrared light reflected by the to-be-captured object; and control, based on the distance between each infrared light reflection point on the to-be-captured object and the mobile terminal, the imaging system to obtain three-dimensional information of the to-be-captured object. - In this way, the image sensor is formed by a 2PD pixel array that combines RGB subpixels and infrared subpixels in a four-in-one arrangement. Capturing is performed by the image sensor, and a distance between the to-be-captured object and the mobile terminal may be detected while an image is captured and recorded, thereby achieving fast focusing and bokeh. In addition, a dark-state imaging effect of the image can be improved, and a related application function of stereo photographing is implemented, thereby ensuring functional diversity of the mobile terminal while ensuring the capturing experience of a user.
- It should be understood that in this embodiment of the present disclosure, the
radio frequency unit 901 may be configured to receive and send signals during information reception and transmission or during a call. Specifically, after receiving downlink data from a base station, the radio frequency unit 901 sends the downlink data to the processor 910 for processing, and sends uplink data to the base station. Generally, the radio frequency unit 901 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 901 may further communicate with a network and other devices by using a wireless communications system. - The mobile terminal provides wireless broadband Internet access for a user by using a
network module 902, for example, helping the user send and receive an email, browse a web page, and access streaming media. - The
audio output unit 903 may convert audio data received by the radio frequency unit 901 or the network module 902, or stored in the memory 909, into an audio signal and output the audio signal as sound. Moreover, the audio output unit 903 may further provide audio output (for example, call signal receiving sound and message receiving sound) related to a specific function executed by the mobile terminal 900. The audio output unit 903 includes a speaker, a buzzer, a telephone receiver, and the like. - The
input unit 904 is configured to receive an audio signal or a video signal. The input unit 904 may include a graphics processing unit (Graphics Processing Unit, GPU) 9041 and a microphone 9042. The graphics processing unit 9041 is configured to process image data of a static picture or a video obtained by an image capturing apparatus (for example, a camera) in video capturing mode or image capturing mode. An image frame obtained through processing may be displayed on the display unit 906. The display unit herein is the display module. The image frame obtained through processing by the graphics processing unit 9041 may be stored in the memory 909 (or another storage medium) or sent by using the radio frequency unit 901 or the network module 902. The graphics processing unit 9041 is the image data processing module. The microphone 9042 may receive sound and can process such sound into audio data. The audio data obtained through processing may be converted, in a telephone call mode, into a format that can be sent to a mobile communications base station via the radio frequency unit 901 for output. - The
mobile terminal 900 further includes at least one sensor 905, such as an optical sensor, a motion sensor, and another sensor. Specifically, the optical sensor includes an ambient light sensor and a proximity sensor. The ambient light sensor may adjust luminance of the display panel 9061 based on brightness of ambient light, and the proximity sensor may disable the display panel 9061 and/or the backlight when the mobile terminal 900 approaches an ear. As a type of motion sensor, an accelerometer sensor may detect the magnitude of acceleration in each direction (generally three axes), and may detect the magnitude and direction of gravity when static. The accelerometer sensor may be used for recognizing a posture of the mobile terminal (for example, horizontal and vertical screen switching, a related game, or magnetometer posture calibration), a function related to vibration recognition (for example, a pedometer or tapping), or the like. The sensor 905 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like. Details are not described herein. - The
display unit 906 is configured to display information entered by a user or information provided for the user. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in a form of a liquid crystal display (Liquid Crystal Display, LCD), an organic light-emitting diode (Organic Light-Emitting Diode, OLED), or the like. - The
user input unit 907 may be configured to receive input digit or character information and generate key signal input related to user settings and function control of the mobile terminal. Specifically, the user input unit 907 includes a touch panel 9071 and another input device 9072. The touch panel 9071, also referred to as a touch screen, may collect a touch operation of a user on or near the touch panel 9071 (for example, an operation performed by the user on or near the touch panel 9071 with any suitable object or accessory such as a finger or a stylus). The touch panel 9071 may include two parts: a touch detection apparatus and a touch controller. The touch detection apparatus detects a touch position of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller. The touch controller receives touch information from the touch detection apparatus, converts the touch information into contact coordinates, transmits the contact coordinates to the processor 910, receives a command sent by the processor 910, and executes the command. In addition, the touch panel 9071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 9071, the user input unit 907 may further include the other input device 9072. Specifically, the other input device 9072 may include, but is not limited to, a physical keyboard, function keys (such as a volume control key and a switch key), a trackball, a mouse, and a joystick. Details are not described herein. - Further, the
touch panel 9071 can cover the display panel 9061. When detecting a touch operation on or near the touch panel 9071, the touch panel 9071 transmits the touch operation to the processor 910 to determine a type of the touch event. Then the processor 910 provides corresponding visual output on the display panel 9061 based on the type of the touch event. In FIG. 9, the touch panel 9071 and the display panel 9061 are used as two independent components to implement input and output functions of the mobile terminal. However, in some embodiments, the touch panel 9071 and the display panel 9061 may be integrated to implement the input and output functions of the mobile terminal. This is not specifically limited herein. - The
interface unit 908 is an interface connecting an external apparatus to the mobile terminal 900. For example, the external apparatus may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting an apparatus having an identification module, an audio input/output (I/O) port, a video I/O port, a headset port, and the like. The interface unit 908 may be configured to receive input (such as data information and power) from the external apparatus and transmit the received input to one or more elements in the mobile terminal 900, or may be configured to transmit data between the mobile terminal 900 and the external apparatus. - The
memory 909 may be configured to store a software program and various data. The memory 909 may mainly include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound play function or an image display function), and the like. The data storage area may store data (such as audio data or an address book) created based on use of the mobile phone, or the like. In addition, the memory 909 may include a high-speed random access memory, and may further include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. - The
processor 910 is a control center of the mobile terminal. It is connected to all parts of the entire mobile terminal through various interfaces and lines, and performs various functions of the mobile terminal and processes data by running or executing the software program and/or module stored in the memory 909 and invoking the data stored in the memory 909, to implement overall monitoring of the mobile terminal. The processor 910 may include one or more processing units. Optionally, the processor 910 may integrate an application processor and a modem processor. The application processor mainly handles the operating system, a user interface, the application program, and the like, and the modem processor mainly handles wireless communication. It may be understood that the modem processor may alternatively not be integrated into the processor 910. - The
mobile terminal 900 may further include a power supply 911 (such as a battery) that supplies power to each component. Optionally, the power supply 911 may be logically connected to the processor 910 by using a power management system, to implement functions such as charging, discharging, and power consumption management. - In addition, the
mobile terminal 900 includes some function modules that are not shown, and details are not described herein. - It should be noted that in this specification, the terms "comprise", "include", and any other variants thereof are intended to cover non-exclusive inclusion, so that a process, method, article, or apparatus that includes a series of elements includes not only those very elements but may also include other elements not expressly listed, or elements inherent to the process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "including a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that includes that element.
- By means of the foregoing description of the embodiments, a person skilled in the art may clearly understand that the methods in the foregoing embodiments may be implemented by software together with a necessary general-purpose hardware platform. Certainly, they may also be implemented by hardware, but in many cases the former is the preferred implementation. Based on such an understanding, the technical solutions of the present disclosure essentially, or the part contributing to the related technologies, may be implemented in the form of a software product. The computer software product is stored in a storage medium (such as a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc) and includes several instructions for instructing user equipment (which may be a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the methods described in the embodiments of the present disclosure.
- The embodiments of the present disclosure are described above with reference to the accompanying drawings, but the present disclosure is not limited to the foregoing specific implementations. The foregoing specific implementations are merely illustrative rather than restrictive. Inspired by the present disclosure, a person of ordinary skill in the art may make many variations without departing from the aims of the present disclosure and the protection scope of the claims, all of which fall within the protection of the present disclosure.
Claims (12)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810798680.4A CN108965704B (en) | 2018-07-19 | 2018-07-19 | image sensor, mobile terminal and image shooting method |
CN201810798680.4 | 2018-07-19 | ||
PCT/CN2019/095367 WO2020015561A1 (en) | 2018-07-19 | 2019-07-10 | Image sensor, mobile terminal, and image capturing method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/095367 Continuation WO2020015561A1 (en) | 2018-07-19 | 2019-07-10 | Image sensor, mobile terminal, and image capturing method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20220367541A1 US20220367541A1 (en) | 2022-11-17 |
US20230069816A9 true US20230069816A9 (en) | 2023-03-02 |
Family
ID=64481918
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/151,745 Pending US20230069816A9 (en) | 2018-07-19 | 2021-01-19 | Image sensor, mobile terminal, and image capturing method |
Country Status (5)
Country | Link |
---|---|
US (1) | US20230069816A9 (en) |
EP (1) | EP3826291A4 (en) |
KR (1) | KR102374428B1 (en) |
CN (1) | CN108965704B (en) |
WO (1) | WO2020015561A1 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180315788A1 (en) * | 2017-05-01 | 2018-11-01 | Visera Technologies Company Limited | Image sensor |
CN108965704B (en) * | 2018-07-19 | 2020-01-31 | 维沃移动通信有限公司 | image sensor, mobile terminal and image shooting method |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140008679A1 (en) * | 2010-08-31 | 2014-01-09 | Sanyo Electric Co., Ltd. | Substrate for mounting element and optical module |
US9749556B2 (en) * | 2015-03-24 | 2017-08-29 | Semiconductor Components Industries, Llc | Imaging systems having image sensor pixel arrays with phase detection capabilities |
US10593712B2 (en) * | 2017-08-23 | 2020-03-17 | Semiconductor Components Industries, Llc | Image sensors with high dynamic range and infrared imaging toroidal pixels |
US10629105B2 (en) * | 2017-06-15 | 2020-04-21 | Google Llc | Near-eye display with frame rendering based on reflected wavefront analysis for eye characterization |
US10686004B2 (en) * | 2016-03-31 | 2020-06-16 | Nikon Corporation | Image capturing element and image capturing device image sensor and image-capturing device |
US10848657B2 (en) * | 2016-04-29 | 2020-11-24 | Lg Innotek Co., Ltd. | Camera module having a slim overall structure and portable device comprising same |
US11004884B2 (en) * | 2017-04-11 | 2021-05-11 | Sony Semiconductor Solutions Corporation | Solid-state imaging apparatus |
US11210799B2 (en) * | 2017-10-04 | 2021-12-28 | Google Llc | Estimating depth using a single camera |
US11217617B2 (en) * | 2017-06-21 | 2022-01-04 | Sony Semiconductor Solutions Corporation | Imaging element and solid-state imaging device |
US11404454B2 (en) * | 2019-04-11 | 2022-08-02 | Guangzhou Luxvisions Innovation Technology Limited | Image sensing device |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2005072358A2 (en) * | 2004-01-28 | 2005-08-11 | Canesta, Inc. | Single chip red, green, blue, distance (rgb-z) sensor |
JP2006237737A (en) * | 2005-02-22 | 2006-09-07 | Sanyo Electric Co Ltd | Color filter array and solid state image sensor |
KR20100018449A (en) * | 2008-08-06 | 2010-02-17 | 삼성전자주식회사 | Pixel array of three dimensional image sensor |
KR20110137700A (en) * | 2010-06-17 | 2011-12-23 | 삼성전자주식회사 | Optical apparatus and imaging apparatus using optical apparatus |
JP5513326B2 (en) * | 2010-09-07 | 2014-06-04 | キヤノン株式会社 | Imaging device and imaging apparatus |
WO2013046973A1 (en) * | 2011-09-29 | 2013-04-04 | 富士フイルム株式会社 | Solid state image capture element, image capture device, and focus control method |
KR102086509B1 (en) * | 2012-11-23 | 2020-03-09 | 엘지전자 주식회사 | Apparatus and method for obtaining 3d image |
JP2014239290A (en) * | 2013-06-06 | 2014-12-18 | ソニー株式会社 | Focus detection apparatus, electronic apparatus, manufacturing apparatus, and manufacturing method |
US9692992B2 (en) * | 2013-07-01 | 2017-06-27 | Omnivision Technologies, Inc. | Color and infrared filter array patterns to reduce color aliasing |
JP6233188B2 (en) * | 2013-12-12 | 2017-11-22 | ソニー株式会社 | Solid-state imaging device, manufacturing method thereof, and electronic device |
US20160182846A1 (en) * | 2014-12-22 | 2016-06-23 | Google Inc. | Monolithically integrated rgb pixel array and z pixel array |
WO2016115338A1 (en) * | 2015-01-14 | 2016-07-21 | Emanuele Mandelli | Phase-detect autofocus |
KR20160114474A (en) * | 2015-03-24 | 2016-10-05 | (주)실리콘화일 | 4-color image sensor, process for 4-color image sensor and 4-color image sensor thereby |
CN105049829B (en) * | 2015-07-10 | 2018-12-25 | 上海图漾信息科技有限公司 | Optical filter, imaging sensor, imaging device and 3-D imaging system |
JP2017022624A (en) * | 2015-07-13 | 2017-01-26 | キヤノン株式会社 | Imaging device, driving method therefor, and imaging apparatus |
JP2017038311A (en) * | 2015-08-12 | 2017-02-16 | 株式会社東芝 | Solid-state imaging device |
KR101733309B1 (en) * | 2015-11-11 | 2017-05-08 | 재단법인 다차원 스마트 아이티 융합시스템 연구단 | Dual isp based camera system for 4 color image sensor |
CN106878690A (en) * | 2015-12-14 | 2017-06-20 | 比亚迪股份有限公司 | The imaging method of imageing sensor, imaging device and electronic equipment |
US10114465B2 (en) * | 2016-01-15 | 2018-10-30 | Google Llc | Virtual reality head-mounted devices having reduced numbers of cameras, and methods of operating the same |
CN105611136B (en) * | 2016-02-26 | 2019-04-23 | 联想(北京)有限公司 | A kind of imaging sensor and electronic equipment |
CN107360405A (en) * | 2016-05-09 | 2017-11-17 | 比亚迪股份有限公司 | Imaging sensor, imaging method and imaging device |
US10547829B2 (en) * | 2016-06-16 | 2020-01-28 | Samsung Electronics Co., Ltd. | Image detecting device and image detecting method using the same |
KR20180064629A (en) * | 2016-12-05 | 2018-06-15 | 삼성디스플레이 주식회사 | Display device |
CN107040724B (en) * | 2017-04-28 | 2020-05-15 | Oppo广东移动通信有限公司 | Dual-core focusing image sensor, focusing control method thereof and imaging device |
CN107205139A (en) * | 2017-06-28 | 2017-09-26 | 重庆中科云丛科技有限公司 | The imaging sensor and acquisition method of multichannel collecting |
CN107613210B (en) * | 2017-09-30 | 2020-12-29 | 北京小米移动软件有限公司 | Image display method and device, terminal and storage medium |
CN207543224U (en) * | 2017-11-24 | 2018-06-26 | 深圳先牛信息技术有限公司 | Imaging sensor |
CN207491128U (en) * | 2017-11-30 | 2018-06-12 | 北京中科虹霸科技有限公司 | A kind of RGB+IR image capture devices |
CN108282644B (en) * | 2018-02-14 | 2020-01-10 | 北京飞识科技有限公司 | Single-camera imaging method and device |
CN108965704B (en) * | 2018-07-19 | 2020-01-31 | 维沃移动通信有限公司 | image sensor, mobile terminal and image shooting method |
-
2018
- 2018-07-19 CN CN201810798680.4A patent/CN108965704B/en active Active
-
2019
- 2019-07-10 KR KR1020217004094A patent/KR102374428B1/en active IP Right Grant
- 2019-07-10 WO PCT/CN2019/095367 patent/WO2020015561A1/en active Application Filing
- 2019-07-10 EP EP19838484.4A patent/EP3826291A4/en active Pending
-
2021
- 2021-01-19 US US17/151,745 patent/US20230069816A9/en active Pending
Also Published As
Publication number | Publication date |
---|---|
KR20210028254A (en) | 2021-03-11 |
WO2020015561A1 (en) | 2020-01-23 |
US20220367541A1 (en) | 2022-11-17 |
EP3826291A4 (en) | 2021-08-11 |
CN108965704B (en) | 2020-01-31 |
CN108965704A (en) | 2018-12-07 |
KR102374428B1 (en) | 2022-03-15 |
EP3826291A1 (en) | 2021-05-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11962918B2 (en) | Image sensor, mobile terminal, and image photographing method | |
US11463641B2 (en) | Image sensor, mobile terminal, and photographing method | |
CN108900750B (en) | Image sensor and mobile terminal | |
US20220367550A1 (en) | Mobile terminal and image photographing method | |
CN108965666B (en) | Mobile terminal and image shooting method | |
US11463642B2 (en) | Image sensor including pixel array and mobile terminal | |
WO2020015538A1 (en) | Image data processing method and mobile terminal | |
WO2019091426A1 (en) | Camera assembly, image acquisition method, and mobile terminal | |
WO2019144956A1 (en) | Image sensor, camera module, mobile terminal, and facial recognition method and apparatus | |
CN107317963A (en) | A kind of double-camera mobile terminal control method, mobile terminal and storage medium | |
US20230069816A9 (en) | Image sensor, mobile terminal, and image capturing method | |
CN109257463B (en) | Terminal device, control method thereof and computer-readable storage medium | |
WO2021238869A1 (en) | Pixel unit, photoelectric sensor, camera module, and electronic device | |
WO2022206607A1 (en) | Display module, electronic device, and control method and control apparatus thereof | |
CN110248050B (en) | Camera module and mobile terminal | |
CN108965703A (en) | A kind of imaging sensor, mobile terminal and image capturing method | |
CN217508882U (en) | Camera device and mobile terminal |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: VIVO MOBILE COMMUNICATION CO., LTD., CHINA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WANG, DANMEI;ZHOU, HUAZHAO;ZHU, PANPAN;SIGNING DATES FROM 20201223 TO 20201228;REEL/FRAME:054948/0785
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |