CN112804438A - Sensor and shooting module


Info

Publication number
CN112804438A
Authority
CN
China
Prior art keywords
light
photosensitive
depth
shooting
photographing
Prior art date
Legal status
Pending
Application number
CN202110337173.2A
Other languages
Chinese (zh)
Inventor
Not disclosed (不公告发明人)
Current Assignee
Beijing Ivisual 3D Technology Co Ltd
Original Assignee
Beijing Ivisual 3D Technology Co Ltd
Application filed by Beijing Ivisual 3D Technology Co Ltd filed Critical Beijing Ivisual 3D Technology Co Ltd
Priority to CN202110337173.2A
Publication of CN112804438A

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50: Constructional details
    • H04N 23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N 23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N 13/00: Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/10: Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N 13/106: Processing image signals
    • H04N 13/122: Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
    • H04N 13/128: Adjusting depth or disparity

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to the technical field of sensors and discloses a sensor, which comprises a shooting photosensitive assembly and a depth of field measuring photosensitive assembly that are oppositely arranged. The shooting photosensitive assembly comprises a plurality of shooting pixel units; the depth of field measurement photosensitive assembly comprises a plurality of depth of field measurement pixel units. A shooting pixel unit is configured to receive shooting light and to generate shot image information of a local area of a target object based on the shooting light; a depth of field measurement pixel unit is configured to receive depth of field measurement light and to generate depth image information of a local area of the target object based on the depth of field measurement light. The plurality of shooting pixel units correspond to the plurality of depth of field measurement pixel units one to one. By arranging the shooting photosensitive assembly and the depth of field measuring photosensitive assembly in a single sensor, the sensor provided by the application can reduce the space occupied by the sensor. The application also discloses a shooting module.

Description

Sensor and shooting module
Technical Field
The application relates to the technical field of sensors, and relates, for example, to a sensor and a shooting module.
Background
Currently, a 3D camera needs to be separately provided with a photographing sensor for obtaining a 2D image and a depth sensor for obtaining a depth image. Obtaining the 3D image requires the 2D image and the depth image to be algorithmically compared and fused.
In the process of implementing the embodiments of the present disclosure, at least the following problems were found in the related art: at least two sensors are needed to obtain the 2D image and the depth image, which occupies a large amount of space; and the 2D image and the depth image need to be compared and fused to obtain the 3D image, which reduces the data processing speed.
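To illustrate the cost of that conventional pipeline, the sketch below shows a per-pixel registration and fusion pass over two separately captured arrays; the arrays `rgb` and `depth` and the `homography` mapping between the two sensors are hypothetical and are assumed only for illustration.

```python
import numpy as np

def fuse_rgbd(rgb, depth, homography):
    """Illustrative sketch of the conventional two-sensor pipeline: the depth map
    must be re-projected onto the 2D image grid and merged pixel by pixel before
    a 3D (RGB-D) frame exists."""
    h, w, _ = rgb.shape
    fused = np.zeros((h, w, 4), dtype=np.float32)
    for y in range(h):
        for x in range(w):
            # Map this 2D-image pixel into the depth sensor's coordinate frame.
            u, v, s = homography @ np.array([x, y, 1.0])
            u, v = int(round(u / s)), int(round(v / s))
            inside = 0 <= v < depth.shape[0] and 0 <= u < depth.shape[1]
            fused[y, x, :3] = rgb[y, x]
            fused[y, x, 3] = depth[v, u] if inside else 0.0
    return fused

# Every frame pays for this extra alignment/fusion pass when the 2D image and
# the depth image come from two separate sensors.
```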
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is neither an extensive overview nor intended to identify key or critical elements or to delineate the scope of such embodiments; rather, it serves as a prelude to the more detailed description that is presented later.
The embodiments of the present disclosure provide a sensor and a shooting module, intended to solve the technical problems that the sensors occupy a large space and that the comparison and fusion of a 2D image and a depth image reduce the data processing speed.
In some embodiments, the sensor comprises: a shooting photosensitive assembly and a depth of field measuring photosensitive assembly that are oppositely arranged;
the shooting photosensitive assembly comprises a plurality of shooting pixel units;
the depth of field measurement photosensitive assembly comprises a plurality of depth of field measurement pixel units;
the shooting pixel unit is configured to receive shooting light and to generate shot image information of a local area of a target object based on the shooting light;
the depth measurement pixel unit is configured to receive depth measurement light and generate depth image information of a local area of the target object based on the depth measurement light;
the plurality of shooting pixel units correspond to the plurality of depth of field measurement pixel units one to one.
In some embodiments, the corresponding shooting pixel unit and depth of field measurement pixel unit correspond to the same local area of the target object.
In some embodiments, the shooting pixel unit includes at least one shooting photosensitive layer.
In some embodiments, when one shooting photosensitive layer is included in the shooting pixel unit, the shooting photosensitive layer includes at least one shooting band light sensing region;
when the shooting photosensitive layer includes one shooting band light sensing region, that region is configured to absorb light of a certain band and to generate shot image information of a local region of the target object based on the absorbed light of that band;
when the shooting photosensitive layer comprises two or more shooting band light sensing regions, the different shooting band light sensing regions are configured to absorb light of different bands respectively, and to generate, based on the absorbed light, the shot image information corresponding to each shooting band light sensing region for the local region of the target object;
the shot image information corresponding to the different shooting band light sensing regions forms the shot image information of the local area of the target object.
In some embodiments, the corresponding capture pixel element and the depth measurement pixel element partially overlap or completely overlap in the projection direction.
In some embodiments, the photographing photosensitive layer includes a first photographing band light sensing region, a second photographing band light sensing region, and a third photographing band light sensing region;
the first shooting waveband light sensing area is configured to absorb first waveband light and generate first shooting image information of a local area of a target object based on the first waveband light;
the second shooting wave band light sensing area is configured to absorb second wave band light and generate second shooting image information of a local area of the target object based on the second wave band light;
the third shooting wave band light sensing area is configured to absorb third wave band light and generate third shooting image information of a local area of the target object based on the third wave band light;
the first captured image information, the second captured image information, and the third captured image information form captured image information of a local area of the target object.
In some embodiments, the depth measurement pixel unit includes a depth measurement photosensitive area configured to receive depth measurement light, and generate depth image information of a local area of the target object based on the depth measurement light;
the depth of field measurement photosensitive area overlaps or does not overlap with at least one of the following in the projection direction:
the first shooting waveband light sensing area;
the second shooting waveband light sensing area;
and the third shooting waveband light sensing area.
In some embodiments, the shooting photosensitive assembly is disposed between the target object and the depth of field measuring photosensitive assembly; alternatively,
the depth of field measuring photosensitive assembly is arranged between the target object and the shooting photosensitive assembly.
In some embodiments, when the photographing photosensitive assembly is disposed between the target object and the depth-of-field measuring photosensitive assembly, the photographing photosensitive layer includes a first photographing band photosensitive region, a second photographing band photosensitive region, and a third photographing band photosensitive region:
the sensor further includes a first filter, the first filter comprising: a first light-transmitting region, a second light-transmitting region, and a third light-transmitting region;
the first light transmission region is configured to filter light to be transmitted to the first shooting band light sensing region and the depth measurement pixel unit to obtain light required by the first shooting band light sensing region and the depth measurement pixel unit;
the second light transmission region is configured to filter light to be transmitted to the second shooting band light sensing region and the depth of field measurement pixel unit to obtain light required by the second shooting band light sensing region and the depth of field measurement pixel unit;
the third light transmission region is configured to filter light to be transmitted to the third shooting band light sensing region and the depth measurement pixel unit to obtain light required by the third shooting band light sensing region and the depth measurement pixel unit.
In some embodiments, the sensor further includes a second filter disposed between the shooting pixel unit and the depth of field measurement pixel unit, and the second filter is configured to filter light to be transmitted to the depth of field measurement pixel unit to obtain the light required by the depth of field measurement pixel unit.
In some embodiments, when three photographing photosensitive layers are included in the photographing pixel unit, the three photographing photosensitive layers are respectively: a first shooting photosensitive layer, a second shooting photosensitive layer and a third shooting photosensitive layer;
the first photographic sensitive layer is configured to absorb light of a first waveband and generate first photographic image information of a local area of a target object based on the light of the first waveband;
the second shooting photosensitive layer is configured to absorb light of a second wave band and generate second shooting image information of a local area of a target object based on the light of the second wave band;
the third shooting photosensitive layer is configured to absorb third wave band light and generate third shooting image information of a local area of a target object based on the third wave band light;
the first captured image information, the second captured image information, and the third captured image information form captured image information of a local area of the target object.
In some embodiments, the first, second, and third photographic sensitive layers overlap completely or partially in the projection direction.
In some embodiments, the depth of field measurement pixel cells, the first photographic sensitive layer, the second photographic sensitive layer, and the third photographic sensitive layer overlap completely or partially in a projection direction.
In some embodiments, the first photographic sensitive layer comprises a first photographic sensitive region; wherein the first photographing photosensitive area is configured to absorb light of a first wavelength band and generate first photographed image information of a local area of a target based on the light of the first wavelength band;
the second shooting photosensitive layer comprises a second shooting photosensitive area; wherein the second photographing photosensitive area is configured to absorb light of a second wavelength band and generate second photographed image information of a local area of the object based on the light of the second wavelength band;
the third shooting photosensitive layer comprises a third shooting photosensitive area; the third shooting photosensitive area is configured to absorb third wave band light and generate third shooting image information of a local area of the target based on the third wave band light;
the first shooting photosensitive area, the second shooting photosensitive area and the third shooting photosensitive area are overlapped or not overlapped in the projection direction.
In some embodiments, the depth measurement pixel unit includes a depth measurement photosensitive area configured to receive depth measurement light, and generate depth image information of a local area of the target object based on the depth measurement light;
the depth of field measuring photosensitive area, the first shooting photosensitive area, the second shooting photosensitive area and the third shooting photosensitive area are overlapped or not overlapped in the projection direction.
In some embodiments, the shooting photosensitive assembly is disposed between the target object and the depth of field measuring photosensitive assembly; alternatively,
the depth of field measuring photosensitive assembly is arranged between the target object and the shooting photosensitive assembly.
In some embodiments, when the photographing photosensitive assembly is disposed between the target object and the depth-of-field measuring photosensitive assembly:
the sensor comprises a third light filter film arranged between the target object and the first shooting photosensitive layer;
the third filter film is configured to filter light to be transmitted to the three-layer photographing photosensitive layer and the depth measurement pixel unit to obtain light required by the three-layer photographing photosensitive layer and the depth measurement pixel unit.
In some embodiments, when the photographing photosensitive assembly is disposed between the target object and the depth-of-field measuring photosensitive assembly:
the sensor comprises a fourth light filter film arranged between the first shooting photosensitive layer and the second shooting photosensitive layer;
the fourth filter is configured to filter light to be transmitted to the second photographing photosensitive layer, the third photographing photosensitive layer, and the depth measurement pixel unit to obtain light required by the second photographing photosensitive layer, the third photographing photosensitive layer, and the depth measurement pixel unit.
In some embodiments, when the photographing photosensitive assembly is disposed between the target object and the depth-of-field measuring photosensitive assembly:
the sensor comprises a fifth light filtering film arranged between the second shooting photosensitive layer and the third shooting photosensitive layer;
the fifth filter is configured to filter light to be transmitted to the third photographing photosensitive layer and the depth measurement pixel unit to obtain light required by the third photographing photosensitive layer and the depth measurement pixel unit.
In some embodiments, when the photographing photosensitive assembly is disposed between the target object and the depth-of-field measuring photosensitive assembly:
the sensor comprises a sixth filter film arranged between the third shooting photosensitive layer and the depth-of-field measurement pixel unit;
the sixth filter is configured to filter light to be transmitted to the depth measurement pixel unit to obtain light required by the depth measurement pixel unit.
In some embodiments, the depth measurement pixel cell comprises a depth measurement sensitive layer.
In some embodiments, the depth of field measurement sensitive layer comprises a depth of field measurement sensitive region.
In some embodiments, the shooting module comprises the sensor.
The sensor and the shooting module provided by the embodiment of the disclosure can realize the following technical effects:
by arranging the photographing photosensitive assembly and the depth-of-field measuring photosensitive assembly in one sensor, the space occupied by the sensor can be reduced. By enabling the shooting pixel units and the depth-of-field measurement pixel units to be in one-to-one correspondence at the pixel level, the shooting image information and the depth-of-field image information generated by the shooting pixel units and the depth-of-field measurement pixel units can be directly recorded by taking pixels as units, subsequent image comparison and data fusion processing are not needed, and the data processing speed can be increased.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
At least one embodiment is illustrated by the accompanying drawings, which do not limit the embodiments; elements having the same reference numeral designations denote similar elements, and the drawings are not drawn to scale, wherein:
FIG. 1 is a schematic diagram of a sensor provided by an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of a depth measurement pixel unit according to an embodiment of the present disclosure;
FIG. 3 is a schematic view of a structure of a depth of field measurement sensitive layer provided by an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a shooting pixel unit provided by the embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a shooting pixel unit provided by the embodiment of the present disclosure;
fig. 6 is a schematic structural diagram of a photographic sensitive layer provided by an embodiment of the present disclosure;
fig. 7 is a schematic structural diagram of a photographic sensitive layer provided by an embodiment of the present disclosure;
fig. 8 is a schematic structural diagram of a positional relationship between a corresponding shooting pixel unit and a depth measurement pixel unit according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a positional relationship between a corresponding shooting pixel unit and a depth measurement pixel unit according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a positional relationship between a light sensing area and a depth-of-field measurement light sensing area in a shooting band according to an embodiment of the present disclosure;
FIG. 11 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
FIG. 12 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
fig. 13 is a schematic structural diagram of a positional relationship among the first, second, and third photosensitive layers provided in the embodiment of the present disclosure;
fig. 14 is a schematic structural diagram of a positional relationship among the first, second, and third photosensitive layers provided in the embodiment of the present disclosure;
fig. 15 is a schematic structural diagram of a positional relationship among the depth-of-field measurement pixel unit, the first photosensitive layer, the second photosensitive layer, and the third photosensitive layer provided in the embodiment of the present disclosure;
fig. 16 is a schematic structural diagram of the first, second, and third photosensitive layers provided in the embodiment of the present disclosure;
fig. 17 is a schematic structural diagram illustrating a relative position relationship among the depth-of-field measuring photosensitive area, the first photographing photosensitive area, the second photographing photosensitive area, and the third photographing photosensitive area according to the embodiment of the disclosure;
FIG. 18 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
FIG. 19 is a schematic diagram of a sensor provided by an embodiment of the present disclosure;
FIG. 20 is a schematic structural diagram of a sensor provided by an embodiment of the present disclosure;
FIG. 21 is a schematic diagram of a sensor provided by an embodiment of the present disclosure;
fig. 22 is a schematic structural diagram of a shooting module provided in the embodiment of the present disclosure.
Reference numerals:
100: a sensor; 101: shooting a photosensitive assembly; 102: a depth-of-field measuring photosensitive assembly; 103: a shooting pixel unit; 104: a depth-of-field measurement pixel unit; 105: shooting a photosensitive layer; 106: a first photographing photosensitive layer; 107: a second photographing photosensitive layer; 108: a third photographing sensitive layer; 109: a depth of field measuring photosensitive layer; 110: a depth-of-field measurement photosensitive area; 111: depth of field measuring the non-photosensitive area; 112: a first shooting waveband light sensing area; 113: a second shooting waveband light sensing area; 114: a third shooting waveband light sensing area; 115: shooting a non-photosensitive area; 116: a first light filter film; 117: a first light-transmitting region; 118: a second light-transmitting region; 119: a third light-transmitting region; 120: a second light filter film; 121: a first photographing photosensitive area; 122: a second photographing photosensitive area; 123: a third photographing sensitive area; 124: a first photographing non-photosensitive area; 125: a second photographing non-photosensitive area; 126: a third shooting non-photosensitive area; 127: a third light filter film; 128: a fourth light filter film; 129: a fifth light filter film; 130: a sixth light filtering film; 200: a shooting module; 1: a first dashed box; 2: a second dashed box.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, at least one embodiment may be practiced without these specific details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
As shown in fig. 1, embodiments of the present disclosure provide a sensor 100, and the sensor 100 may include: a shooting photosensitive assembly 101 and a depth-of-field measuring photosensitive assembly 102 which are arranged oppositely;
the photographing photosensitive member 101 includes a plurality of photographing pixel units 103;
the depth-of-field measurement photosensitive assembly 102 includes a plurality of depth-of-field measurement pixel units 104;
a photographing pixel unit 103 configured to receive photographing light, generate photographing image information of a local area of a subject based on the photographing light;
a depth measurement pixel unit 104 configured to receive the depth measurement light, and generate depth image information of a local area of the object based on the depth measurement light;
the plurality of photographing pixel units 103 correspond to the plurality of depth-of-field measurement pixel units 104 one to one.
In some embodiments, the photographing photosensitive assembly 101 and the depth-of-field measuring photosensitive assembly 102 are arranged opposite each other. Optionally, the photographing photosensitive assembly 101 and the depth-of-field measuring photosensitive assembly 102 are stacked. Optionally, they may be stacked by being processed as a single whole. Alternatively, the photographing photosensitive assembly 101 and the depth-of-field measuring photosensitive assembly 102 may be processed separately and then stacked in direct contact with each other.
In some embodiments, a spacer layer may be optionally disposed between the photographing photosensitive assembly 101 and the depth-of-field measuring photosensitive assembly 102, for example: the spacer layer may be at least one of an adhesive and a filter film. Alternatively, no spacer layer may be optionally provided between the photographing photosensitive assembly 101 and the depth-of-field measuring photosensitive assembly 102. Alternatively, when no spacer layer is provided between the photographing photosensitive assembly 101 and the depth-of-field measuring photosensitive assembly 102, a gap may or may not remain between the photographing photosensitive assembly 101 and the depth-of-field measuring photosensitive assembly 102.
In some embodiments, the plurality of photographing pixel units 103 are distributed in an array. Optionally, the plurality of depth measurement pixel cells 104 are distributed in an array.
In some embodiments, the captured image information may be an analog signal. Alternatively, a 2D image may be generated based on the captured image information.
In some embodiments, the range image information may be an analog signal. Alternatively, the range image may be generated based on the range image information. Alternatively, the analog signal corresponding to the captured image information may be different from the analog signal corresponding to the range image information.
In some embodiments, the captured image information of the local area of the target object and the depth image information of the local area of the target object generated by the one-to-one corresponding capturing pixel unit 103 and depth measurement pixel unit 104 may be collected, and the captured image information of the local area of the target object and the depth image information of the local area of the target object may be directly recorded in units of pixels.
In some embodiments, the captured image information of the local region of the target object and the depth image information of the local region of the target object generated by the one-to-one corresponding capturing pixel unit 103 and depth measurement pixel unit 104 may be collected, respectively, a corresponding captured image is generated based on the captured image information of the local region of the target object, a corresponding depth image is generated based on the depth image information of the local region of the target object, the captured image and the depth image are fused, and the fused image is recorded in units of pixels.
By disposing the photographing photosensitive member 101 and the depth-of-field measuring photosensitive member 102 in one sensor 100, the space occupied by the sensor 100 can be reduced. By making the shooting pixel unit 103 and the depth of field measurement pixel unit 104 correspond to each other at the pixel level, the shot image information of the local area of the target object and the depth of field image information of the local area of the target object, which are generated by the shooting pixel unit 103 and the depth of field measurement pixel unit 104, respectively, can be directly recorded by taking the pixel as a unit, and subsequent image comparison and data fusion processing are not required, so that the data processing speed can be increased.
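As an illustration of the pixel-level recording described above, the sketch below shows one way the outputs of a one-to-one pair of pixel units could be written directly as a single pixel record; the field names (`r`, `g`, `b`, `depth`) and the list-based readout are assumptions made for this example, not the patent's implementation.

```python
from dataclasses import dataclass

@dataclass
class RGBDPixel:
    r: int        # shot image information from the shooting pixel unit
    g: int
    b: int
    depth: float  # depth-of-field image information from the corresponding
                  # depth-of-field measurement pixel unit (same local area)

def read_frame(shooting_readout, depth_readout):
    """Because the pixel units correspond one to one, index i addresses the same
    local area of the target object in both readouts, so the two kinds of
    information can be recorded per pixel with no later comparison or fusion."""
    return [RGBDPixel(*shooting_readout[i], depth_readout[i])
            for i in range(len(shooting_readout))]

# e.g. two corresponding pixel-unit pairs recorded directly as RGB-D pixels
frame = read_frame(shooting_readout=[(120, 80, 40), (10, 200, 90)],
                   depth_readout=[1.25, 0.80])
```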
As shown in fig. 2, in some embodiments, the depth-of-field measurement pixel unit 104 may include multiple depth-of-field measurement photosensitive layers 109. Optionally, the depth-of-field measurement pixel unit 104 may include one depth-of-field measurement photosensitive layer 109. The case where the depth-of-field measurement pixel unit 104 comprises one depth-of-field measurement photosensitive layer 109 is exemplarily depicted in fig. 2.
As shown in fig. 3, in some embodiments, when the depth-of-field measurement pixel unit 104 includes one depth-of-field measurement photosensitive layer 109, the depth-of-field measurement photosensitive layer 109 may include a plurality of depth-of-field measurement photosensitive regions 110. Optionally, the depth-of-field measurement photosensitive layer 109 may include one depth-of-field measurement photosensitive region 110. Optionally, the depth-of-field measurement photosensitive layer 109 may include a depth-of-field measurement non-photosensitive region 111. Optionally, the depth-of-field measurement photosensitive region 110 may be located anywhere within the depth-of-field measurement photosensitive layer 109. Fig. 3 schematically illustrates the case where the depth-of-field measurement photosensitive layer 109 includes one depth-of-field measurement photosensitive region 110 disposed at the middle of the depth-of-field measurement photosensitive layer 109.
In some embodiments, the photographic light may be visible light.
In some embodiments, the depth of field measurement light may be invisible light. Alternatively, the invisible light may be infrared or ultraviolet. Table 1 exemplarily illustrates wavelength distributions of visible light and invisible light.
TABLE 1 (image): wavelength distributions of visible light and invisible light
In some embodiments, the corresponding capture pixel cell 103 and depth measurement pixel cell 104 correspond to the same local area of the target object.
As shown in fig. 4 and 5, in some embodiments, the capture pixel unit 103 includes at least one capture photosensitive layer.
In some embodiments, each of the at least one photographic sensitive layer may absorb light in one or more wavelength bands.
In some embodiments, each of the at least one photographic sensitive layer may absorb light in one wavelength band. Alternatively, red light may be absorbed. Alternatively, green light may be absorbed. Alternatively, blue light may be absorbed.
In some embodiments, each of the at least one photographic sensitive layer may absorb light in two wavelength bands. Alternatively, red light and green light may be absorbed. Alternatively, blue and red light may be absorbed, and so on.
As shown in fig. 4, in some embodiments, the capture pixel cell 103 may include a layer of capture photosensitive layer 105.
As shown in fig. 5, in some embodiments, the capture pixel unit 103 may include three photosensitive layers 105, wherein the three photosensitive layers 105 are: a first photographing photosensitive layer 106, a second photographing photosensitive layer 107, and a third photographing photosensitive layer 108. Alternatively, the first photographic sensitive layer 106 may be used to absorb blue light. Alternatively, the second photographing photosensitive layer 107 may be used to absorb green light. Alternatively, the third photographic sensitive layer 108 may be used to absorb red light.
In some embodiments, the capture pixel unit 103 may include seven capture photosensitive layers, which are: a red light-sensitive layer, an orange light-sensitive layer, a yellow light-sensitive layer, a green light-sensitive layer, a blue light-sensitive layer, an indigo light-sensitive layer and a violet light-sensitive layer. Alternatively, a red photosensitive layer may be used to absorb red light. Alternatively, an orange photosensitive layer may be used to absorb orange light. Alternatively, a yellow photosensitive layer may be used to absorb yellow light. Alternatively, the green photosensitive layer may be used to absorb green light. Alternatively, a blue sensitive layer may be used to absorb blue light. Alternatively, the indigo photosensitive layer may be used to absorb indigo light. Alternatively, a violet photosensitive layer may be used to absorb violet light.
As shown in fig. 4, 6, and 7, in some embodiments, when the photographing pixel unit 103 includes one photographing photosensitive layer 105, the photographing photosensitive layer 105 includes at least one photographing band photosensitive region;
when the photographing photosensitive layer 105 includes one photographing band light sensing region, that region is configured to absorb light of a certain band and to generate photographed image information of a local region of the target object based on the absorbed light of that band;
when the photographing photosensitive layer 105 includes two or more photographing band photosensitive regions, the different photographing band photosensitive regions are configured to absorb light of different bands respectively and to generate, based on the absorbed light, the photographed image information corresponding to each region for the local region of the target object;
the photographed image information corresponding to the different photographing band light sensing regions forms the photographed image information of a local area of the target object.
In some embodiments, when the photographing photosensitive layer includes two or more photographing band photosensitive regions, the two or more photographing band photosensitive regions are in the same plane, and different photographing band photosensitive regions of the two or more photographing band photosensitive regions do not overlap in a direction perpendicular to the photographing band photosensitive regions.
In some embodiments, when the photographing pixel unit 103 includes one photographing photosensitive layer 105, the photographing photosensitive layer 105 may include one photographing band photosensitive area. Alternatively, the photographing photosensitive layer 105 may include a red photographing band photosensitive region configured to absorb red light, and generate photographed image information of a partial region of the target based on the red light. Alternatively, the photographing photosensitive layer 105 may include a green photographing band photosensitive region configured to absorb green light, and generate photographed image information of a partial area of the target object based on the green light. Alternatively, the photographing photosensitive layer 105 may include a blue photographing band photosensitive region configured to absorb blue light, and generate photographed image information of a local area of the target based on the blue light.
As shown in fig. 6 and 7, in some embodiments, the photographing photosensitive layer 105 includes a first photographing band light sensing region 112, a second photographing band light sensing region 113, and a third photographing band light sensing region 114;
a first photographing band light sensing region 112 configured to absorb first band light and generate first photographed image information of a local area of the target based on the first band light;
a second photographing band light sensing region 113 configured to absorb second band light and generate second photographed image information of a local area of the object based on the second band light;
a third photographing band light sensing area 114 configured to absorb the third band light and generate third photographed image information of a local area of the target based on the third band light;
the first captured image information, the second captured image information, and the third captured image information form captured image information of a local area of the target object.
In some embodiments, the first wavelength band light may be light of one or more wavelength bands. Optionally, the first wavelength band light may be light of a single band, for example red light, green light, or blue light. Optionally, the first wavelength band light may be light of two bands, for example red and green light, or blue and red light, and so on.
In some embodiments, the second wavelength band light may likewise be light of one or more wavelength bands: a single band (red, green, or blue light) or two bands (for example red and green light, or blue and red light), and so on.
In some embodiments, the third wavelength band light may likewise be light of one or more wavelength bands: a single band (red, green, or blue light) or two bands (for example red and green light, or blue and red light), and so on.
In some embodiments, when the photographing pixel unit 103 includes one photographing photosensitive layer 105, a plurality of photographing photosensitive layers 105 may be arranged in a regular manner to form the photographing photosensitive assembly 101.
As shown in fig. 6 and 7, in some embodiments, the photographic sensitive layer 105 may include a photographic non-sensitive area 115. Alternatively, the photographing non-photosensitive region 115 may be a region other than the photographing band photosensitive region in the photographing photosensitive layer 105.
In some embodiments, the first, second, and third photographing band light sensing regions 112, 113, and 114 may be distributed in any manner in the photographing photosensitive layer 105. For example: the distribution can be made on the photographic sensitive layer 105 in the manner shown in fig. 6; it may also be distributed over the photographic sensitive layer 105 in the manner of fig. 7.
In some embodiments, the first, second, and third photographing band light sensing regions 112, 113, and 114 are in the same plane and do not overlap with each other in a direction perpendicular to the first photographing band light sensing region 112.
In some embodiments, the first, second, and third photographing band light sensing regions 112, 113, and 114 may each independently be a square, a rectangle, a circle, a trapezoid, a parallelogram, or another polygon.
In some embodiments, the first band of light may be red light. Alternatively, the second band light may be blue light. Alternatively, the third wavelength band light may be green light.
In some embodiments, seven photographing wavelength bands of light sensing regions may be included on the photographing photosensitive layer 105. Alternatively, the light sensing areas in the seven shooting bands can be respectively: the light sensing area of the red shooting waveband, the light sensing area of the orange shooting waveband, the light sensing area of the yellow shooting waveband, the light sensing area of the green shooting waveband, the light sensing area of the blue shooting waveband, the light sensing area of the indigo shooting waveband and the light sensing area of the purple shooting waveband.
In some embodiments, the red capture band light sensing zone is configured to absorb red light, and red captured image information of a local area of the target is generated based on the red light. Alternatively, the orange photographing band light sensing region is configured to absorb orange light, and orange photographed image information of a partial region of the object is generated based on the orange light. Alternatively, the yellow photographing band light sensing region is configured to absorb yellow light, and yellow photographing image information of a local region of the object is generated based on the yellow light. Alternatively, the green photographing band light sensing area is configured to absorb green light, and green photographed image information of a partial area of the target object is generated based on the green light. Alternatively, the blue-photographing-band light sensing area is configured to absorb blue light, and blue-photographing image information of a local area of the target is generated based on the blue light. Alternatively, the indigo-color-photographing-band light-sensing region is configured to absorb indigo light, and generate indigo-color-photographed image information of a local region of the target based on the indigo light. Alternatively, the violet photographing band light sensing area is configured to absorb violet light, and violet photographed image information of a local area of the object is generated based on the violet light. Alternatively, the red captured image information, the orange captured image information, the yellow captured image information, the green captured image information, the blue captured image information, the indigo captured image information, and the purple captured image information form captured image information of a local area of the object.
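Putting the per-band regions together, the sketch below shows one way the readouts of the band light sensing regions of a single photographing photosensitive layer could be combined into the shot image information of one local area; the band assignment follows the red/blue/green example above, and the dictionary-based readout is an assumption made for illustration (the seven-band case works the same way with more entries).

```python
# Which wavelength band each photographing band light sensing region absorbs
# (first = red, second = blue, third = green, as in the example above).
BAND_OF_REGION = {"first": "red", "second": "blue", "third": "green"}

def local_area_info(region_readouts):
    """Combine the per-region readouts of one photographing photosensitive layer
    into the shot image information of one local area of the target object."""
    return {BAND_OF_REGION[region]: value for region, value in region_readouts.items()}

# e.g. the three regions of one layer yield one local-area record:
info = local_area_info({"first": 180, "second": 40, "third": 95})
# info == {"red": 180, "blue": 40, "green": 95}
```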
As shown in fig. 8 and 9, in some embodiments, the corresponding shooting pixel unit 103 and the depth measurement pixel unit 104 partially overlap or completely overlap in the projection direction. Alternatively, the corresponding shooting pixel unit 103 and the depth measurement pixel unit 104 partially overlap or completely overlap in a projection direction perpendicular to the shooting pixel unit 103.
In some embodiments, as shown in fig. 8, the corresponding shooting pixel unit 103 and the depth measurement pixel unit 104 may completely overlap in the projection direction. Alternatively, as shown in fig. 9, the corresponding shooting pixel unit 103 and depth measurement pixel unit 104 may partially overlap in the projection direction.
As shown in fig. 2, 3 and 10, in some embodiments, the depth-of-field measurement pixel unit 104 may include a depth-of-field measurement photosensitive layer 109, and the depth-of-field measurement photosensitive layer 109 may include a depth-of-field measurement photosensitive region 110. The depth-of-field measurement photosensitive region 110 is configured to receive the depth-of-field measurement light and to generate depth-of-field image information of a local area of the target object based on the depth-of-field measurement light;
the depth of field measuring photosensitive area 110 may or may not overlap with at least one of the following in a projection direction perpendicular to the depth of field measuring photosensitive area 110:
a first photographing band light sensing area 112;
a second photographing band light sensing area 113;
and a light sensing area 114 in a third photographing band.
In some embodiments, the depth-of-field measurement photosensitive region 110 and the first capture band photosensitive region 112 are in two different planes, respectively. Optionally, the plane of the depth-of-field measurement photosensitive region 110 is parallel to the plane of the first photographing wavelength band photosensitive region 112.
As shown in fig. 10, in some embodiments, the photographing photosensitive layer 105 and the depth measurement photosensitive layer 109 may overlap completely or partially in the projection direction. Fig. 10 exemplarily depicts a case where the photographing photosensitive layer 105 and the depth measurement photosensitive layer 109 partially overlap in the projection direction.
As shown in fig. 10, in some embodiments, the shot photosensitive layer 105 within the first dashed box 1 corresponds to the depth measurement photosensitive layer 109, and the shot photosensitive layer 105 within the second dashed box 2 is the shot photosensitive layer 105 adjacent to the shot photosensitive layer 105 within the first dashed box 1. Alternatively, when the photographing photosensitive layer 105 and the depth-of-field measuring photosensitive layer 109 do not overlap in the projection direction, the depth-of-field measuring photosensitive region 110 on the depth-of-field measuring photosensitive layer 109 cannot correspond to any one of the photographing wavelength band photosensitive regions on the photographing photosensitive layer 105 within the second dashed box 2.
In some embodiments, the depth-of-field measurement photosensitive region 110 may completely overlap or partially overlap the first photographing wavelength band photosensitive region 112 within the first dashed box 1 in the projection direction. Alternatively, the depth-of-field measuring photosensitive region 110 may completely overlap or partially overlap with the second photographing wavelength band photosensitive region 113 within the first dashed box 1. Alternatively, the depth-of-field measuring photosensitive region 110 may completely overlap or partially overlap with the third photographing band photosensitive region 114 within the first dashed box 1. Alternatively, the depth-of-field measurement photosensitive region 110 may completely overlap with the first photographing band light sensing region 112 and partially overlap with the second photographing band light sensing region 113 within the first dashed box 1. Alternatively, the depth-of-field measurement photosensitive region 110 may partially overlap with the first photographing band light sensing region 112 and partially overlap with the second photographing band light sensing region 113 within the first dashed box 1. Alternatively, the depth-of-field measurement photosensitive region 110 may completely overlap with the first-photographing-band light-sensing region 112, the second-photographing-band light-sensing region 113, and the third-photographing-band light-sensing region 114 within the first dashed box 1, and so on. Fig. 10 exemplarily shows a case where the depth-of-field measurement photosensitive region 110 partially overlaps the third photographing band photosensitive region 114 in the first dashed frame 1 in the projection direction.
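To make "overlap in the projection direction" concrete, here is a small sketch that projects two photosensitive regions onto the plane of the layers and classifies how their footprints relate; the rectangle model and the example coordinates are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Region:
    x: float  # footprint of a photosensitive region projected onto the plane of the
    y: float  # layers (the projection direction is the stacking direction,
    w: float  # perpendicular to that plane)
    h: float

def overlap(a: Region, b: Region) -> str:
    """Classify two projected footprints as 'complete', 'partial' or 'none' overlap."""
    ix = min(a.x + a.w, b.x + b.w) - max(a.x, b.x)  # overlap width
    iy = min(a.y + a.h, b.y + b.h) - max(a.y, b.y)  # overlap height
    if ix <= 0 or iy <= 0:
        return "none"
    if ix >= min(a.w, b.w) and iy >= min(a.h, b.h):
        return "complete"  # the smaller footprint lies inside the larger one
    return "partial"

# e.g. a depth-of-field measurement photosensitive region vs. a photographing band region
print(overlap(Region(0, 0, 2, 2), Region(1, 1, 2, 2)))  # -> "partial"
```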
In some embodiments, the photographing photosensitive assembly 101 is disposed between the target object and the depth-of-field measuring photosensitive assembly 102; alternatively, the depth-of-field measuring photosensitive assembly 102 is disposed between the subject and the photographing photosensitive assembly 101.
As shown in fig. 11, in some embodiments, when the photographing photosensitive assembly 101 is disposed between the object and the depth-of-field measuring photosensitive assembly 102, the photographing photosensitive layer 105 includes a first photographing band photosensitive area 112, a second photographing band photosensitive area 113, and a third photographing band photosensitive area 114:
the sensor 100 further includes a first filter 116, the first filter 116 including: a first light-transmitting region 117, a second light-transmitting region 118, and a third light-transmitting region 119;
a first light-transmitting region 117 configured to filter light to be transmitted to the first photographing band light sensing region 112 and the depth of field measurement pixel unit 104 to obtain light required by the first photographing band light sensing region 112 and the depth of field measurement pixel unit 104;
a second light-transmitting region 118 configured to filter light to be transmitted to the second photographing band light sensing region 113 and the depth-of-field measurement pixel unit 104 to obtain light required by the second photographing band light sensing region 113 and the depth-of-field measurement pixel unit 104;
a third light-transmitting region 119 configured to filter light to be transmitted to the third photographing band light sensing region 114 and the depth of field measurement pixel unit 104 to obtain light required by the third photographing band light sensing region 114 and the depth of field measurement pixel unit 104.
As shown in fig. 11, in some embodiments, sensor 100 further includes a first filter 116. Optionally, the first filter 116 is disposed between the target and the photosensitive photographing layer 105. Alternatively, when the photosensitive layer 105 includes the first, second, and third photographing wavelength light sensing regions 112, 113, and 114, the first filter 116 includes: a first light-transmissive region 117, a second light-transmissive region 118, and a third light-transmissive region 119.
In some embodiments, the first light transmitting area 117 corresponds to the first photographing band light sensing area 112, and the first light required by the first photographing band light sensing area 112 and the depth of field measurement pixel unit 104 is obtained by filtering through the first light transmitting area 117. Alternatively, the first light required by the first photographing band light sensing region 112 and the depth measurement pixel unit 104 may include light required by the first photographing band light sensing region 112 and the depth measurement pixel unit 104. Alternatively, the first light required by the first photographing band light sensing region 112 and the depth-of-field measurement pixel unit 104 may further include light unnecessary for the first photographing band light sensing region 112 and the depth-of-field measurement pixel unit 104, which does not interfere with the first photographing band light sensing region 112 and the depth-of-field measurement pixel unit 104. Alternatively, after the first photographing band light sensing region 112 absorbs the first photographing band light in the first light, the remaining light continues to be transmitted to the depth of field measurement pixel unit 104. For example: the first photographing band light sensing area 112 may absorb red light, the first light transmission area 117 may transmit the red light and the depth of field measurement light, and the red light and the depth of field measurement light are obtained by filtering through the first light transmission area 117, the first photographing band light sensing area 112 absorbs the red light, and the depth of field measurement light continues to be transmitted to the depth of field measurement pixel unit 104.
In some embodiments, the second light transmitting area 118 corresponds to the second photographing wavelength band light sensing area 113, and the second light required by the second photographing wavelength band light sensing area 113 and the depth of field measurement pixel unit 104 is obtained by filtering through the second light transmitting area 118. Alternatively, the second light required by the second photographing band light sensing region 113 and the depth measurement pixel unit 104 may include the light required by the second photographing band light sensing region 113 and the depth measurement pixel unit 104. Alternatively, the second light required by the second photographing band light sensing region 113 and the depth-of-field measurement pixel unit 104 may further include light unnecessary for the second photographing band light sensing region 113 and the depth-of-field measurement pixel unit 104, which does not interfere with the second photographing band light sensing region 113 and the depth-of-field measurement pixel unit 104. Alternatively, after the second photographing band light sensing region 113 absorbs the second photographing band light in the second light, the remaining light continues to be transmitted to the depth of field measurement pixel unit 104. For example: the second photographing band light sensing area 113 may absorb green light, the second light transmission area 118 may transmit the green light and the depth of field measurement light, the green light and the depth of field measurement light are obtained by filtering through the second light transmission area 118, the second photographing band light sensing area 113 absorbs the green light, and the depth of field measurement light is continuously transmitted to the depth of field measurement pixel unit 104.
In some embodiments, the third light transmitting region 119 corresponds to the third photographing band light sensing region 114, and the third light required by the third photographing band light sensing region 114 and the depth of field measurement pixel unit 104 is obtained by filtering through the third light transmitting region 119. Alternatively, the third light required by the third photographing band light sensing region 114 and the depth measurement pixel unit 104 may include the light required by the third photographing band light sensing region 114 and the depth measurement pixel unit 104. Alternatively, the third light required by the third photographing band light sensing region 114 and the depth-of-field measurement pixel unit 104 may also include light unnecessary for the third photographing band light sensing region 114 and the depth-of-field measurement pixel unit 104, which does not interfere with the third photographing band light sensing region 114 and the depth-of-field measurement pixel unit 104. Alternatively, after the third shooting band light sensing region 114 absorbs the third shooting band light in the third light, the remaining light continues to be transmitted to the depth-of-field measurement pixel unit 104. For example: the third photographing band light sensing area 114 can absorb blue light, the third light transmission area 119 can transmit the blue light and the depth of field measuring light, the blue light and the depth of field measuring light are obtained by filtering through the third light transmission area 119, the third photographing band light sensing area 114 absorbs the blue light, and the depth of field measuring light is continuously transmitted to the depth of field measuring pixel unit 104.
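The three worked examples above can be summarised with a small sketch of the first filter; the band names (red, green, blue) and treating the depth-of-field measurement light as a single "depth" band are assumptions for illustration, and the set-based model is a simplification rather than the patent's implementation.

```python
# Each light-transmitting region of the first filter passes its photographing
# band plus the depth-of-field measurement light; everything else is blocked.
TRANSMITS = {
    "first_region":  {"red",   "depth"},
    "second_region": {"green", "depth"},
    "third_region":  {"blue",  "depth"},
}
ABSORBS = {  # the photographing band light sensing region below each light-transmitting region
    "first_region":  "red",
    "second_region": "green",
    "third_region":  "blue",
}

def propagate(region, incident):
    """Filter the incident light, let the band region absorb its band, and return
    what continues on to the depth-of-field measurement pixel unit."""
    transmitted = incident & TRANSMITS[region]
    return transmitted - {ABSORBS[region]}

# White light plus depth-of-field measurement light hits the first region:
print(propagate("first_region", {"red", "green", "blue", "depth"}))  # -> {'depth'}
```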
As shown in fig. 12, in some embodiments, the sensor 100 may include a second filter 120 disposed between the capture pixel unit 103 and the depth measurement pixel unit 104, and the second filter 120 is configured to filter the light to be transmitted to the depth measurement pixel unit 104 to obtain the light required by the depth measurement pixel unit 104.
In some embodiments, the photographing photosensitive layer 105 includes a first photographing band light sensing region 112, a second photographing band light sensing region 113, and a third photographing band light sensing region 114 for absorbing light of different bands. When the first, second, and third photographing band light sensing regions 112, 113, and 114 can completely absorb the light of their corresponding bands, and the light they absorb does not interfere with the depth-of-field measurement pixel unit 104, the second filter 120 is not required. Optionally, when the first, second, and third photographing band light sensing regions 112, 113, and 114 cannot completely absorb the light of their respective photographing bands, the second filter 120 is required. Optionally, when the first, second, and third photographing band light sensing regions 112, 113, and 114 can completely absorb the light of their respective photographing bands, but the light in those bands can interfere with the depth-of-field measurement pixel unit 104, the second filter 120 is also required.
As shown in fig. 5, in some embodiments, when the photographing pixel unit 103 includes three photographing photosensitive layers 105, the three photographing photosensitive layers 105 are, respectively: a first photographing photosensitive layer 106, a second photographing photosensitive layer 107, and a third photographing photosensitive layer 108;
the first photographing photosensitive layer 106 is configured to absorb light of a first wavelength band and generate first captured image information of a local area of the target object based on the light of the first wavelength band;
the second photographing photosensitive layer 107 is configured to absorb light of a second wavelength band and generate second captured image information of a local area of the target object based on the light of the second wavelength band;
the third photographing photosensitive layer 108 is configured to absorb light of a third wavelength band and generate third captured image information of a local area of the target object based on the light of the third wavelength band;
the first captured image information, the second captured image information, and the third captured image information form captured image information of a local area of the target object.
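As a simple illustration of how the three per-layer outputs combine, the following Python sketch, which is not part of the patent and whose dataclass name and intensity values are hypothetical, merges the first, second, and third captured image information for one local area.

```python
from dataclasses import dataclass

@dataclass
class CapturedArea:
    first_band: float    # e.g. intensity recorded by the first photographing photosensitive layer
    second_band: float   # e.g. intensity recorded by the second layer
    third_band: float    # e.g. intensity recorded by the third layer

def merge_layers(first: float, second: float, third: float) -> CapturedArea:
    """The three per-band readings together form the captured image
    information of one local area of the target object."""
    return CapturedArea(first, second, third)

print(merge_layers(0.21, 0.54, 0.83))
```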
In some embodiments, the first, second, and third photosensitive layers 106, 107, and 108 are stacked. Alternatively, the first, second, and third photosensitive layers 106, 107, and 108 may completely overlap or partially overlap in a projection direction perpendicular to the first photosensitive layer 106.
In some embodiments, the first photographic sensitive layer 106 may be used to absorb blue light. Alternatively, the second photographing photosensitive layer 107 may be used to absorb green light. Alternatively, the third photographic sensitive layer 108 may be used to absorb red light.
As shown in fig. 5, 13, and 14, in some embodiments, the first, second, and third photographic sensitive layers 106, 107, and 108 overlap completely or partially in the projection direction.
As shown in fig. 5, in some embodiments, the first, second, and third photographic sensitive layers 106, 107, and 108 may completely overlap in the projection direction.
As shown in fig. 13, in some embodiments, the first and second photosensitive layers 106, 107 may completely overlap, and the second and third photosensitive layers 107, 108 may partially overlap.
As shown in fig. 14, in some embodiments, the first and second photosensitive layers 106, 107 may partially overlap, and the second and third photosensitive layers 107, 108 may partially overlap.
In some embodiments, the first and second photosensitive layers 106, 107 can partially overlap, the second and third photosensitive layers 107, 108 can completely overlap, and so on.
As shown in fig. 15, in some embodiments, the depth of field measurement pixel unit 104, the first photosensitive layer 106, the second photosensitive layer 107, and the third photosensitive layer 108 may overlap completely or partially in the projection direction. Alternatively, the depth of field measurement pixel unit 104, the first photosensitive layer 106, the second photosensitive layer 107, and the third photosensitive layer 108 are provided in a stacked manner. Alternatively, the depth of field measurement pixel unit 104, the first photosensitive layer 106, the second photosensitive layer 107, and the third photosensitive layer 108 may overlap completely or partially in a projection direction perpendicular to the depth of field measurement pixel unit 104.
In some embodiments, the depth of field measurement pixel unit 104 may be disposed on the side close to the first photographing photosensitive layer 106. Alternatively, the depth of field measurement pixel unit 104 may be disposed on the side close to the third photographing photosensitive layer 108.
As shown in fig. 15, in some embodiments, the depth of field measurement pixel unit 104, the first photosensitive layer 106, the second photosensitive layer 107, and the third photosensitive layer 108 may completely overlap in the projection direction.
In some embodiments, the depth of field measurement pixel unit 104, the first photosensitive layer 106, and the second photosensitive layer 107 may completely overlap, and the second photosensitive layer 107 and the third photosensitive layer 108 may partially overlap. Alternatively, the depth of field measurement pixel unit 104 and the first photosensitive layer 106 may overlap completely, the second photosensitive layer 107 and the first photosensitive layer 106 may overlap partially, and the third photosensitive layer 108 and the second photosensitive layer 107 may overlap partially or completely. Alternatively, the depth of field measurement pixel unit 104 and the first photosensitive layer 106 may partially overlap, the second photosensitive layer 107 and the first photosensitive layer 106 may completely overlap, the third photosensitive layer 108 and the second photosensitive layer 107 may partially or completely overlap, and so on.
As shown in fig. 16, in some embodiments, the first photographing photosensitive layer 106 includes a first photographing photosensitive area 121; wherein the first photographing photosensitive area 121 is configured to absorb light of a first wavelength band and generate first captured image information of a local area of the target object based on the light of the first wavelength band;
the second photographing photosensitive layer 107 includes a second photographing photosensitive area 122; wherein the second photographing photosensitive area 122 is configured to absorb light of a second wavelength band and generate second captured image information of a local area of the target object based on the light of the second wavelength band;
the third photographing photosensitive layer 108 includes a third photographing photosensitive area 123; wherein the third photographing photosensitive area 123 is configured to absorb light of a third wavelength band and generate third captured image information of a local area of the target object based on the light of the third wavelength band;
wherein the first photographing photosensitive area 121, the second photographing photosensitive area 122, and the third photographing photosensitive area 123 overlap or do not overlap in the projection direction.
In some embodiments, the first captured image information, the second captured image information, and the third captured image information form captured image information of a local area of the target object. Alternatively, the first, second, and third photographing photosensitive areas 121, 122, and 123 may or may not overlap in a projection direction perpendicular to the first photographing photosensitive area 121.
As shown in fig. 16, in some embodiments, the first photographing photosensitive layer 106 may include a first photographing photosensitive area 121 and a first photographing non-photosensitive area 124. Alternatively, the second photographing photosensitive layer 107 may include a second photographing photosensitive area 122 and a second photographing non-photosensitive area 125. Alternatively, the third photographing photosensitive layer 108 may include a third photographing photosensitive area 123 and a third photographing non-photosensitive area 126.
In some embodiments, the first photographing photosensitive area 121 may be used to absorb blue light. Alternatively, the second photographing photosensitive area 122 may be used to absorb green light. Alternatively, the third photographing photosensitive area 123 may be used to absorb red light.
In some embodiments, the first, second, and third photographing photosensitive areas 121, 122, and 123 may completely overlap in the projection direction. Alternatively, the first, second, and third photographing photosensitive areas 121, 122, and 123 may not overlap at all in the projection direction. Alternatively, the first and second photographing photosensitive areas 121 and 122 may completely overlap in the projection direction, and the second and third photographing photosensitive areas 122 and 123 may partially overlap in the projection direction. Alternatively, the first and second photographing photosensitive areas 121 and 122 may not overlap at all in the projection direction, and the second and third photographing photosensitive areas 122 and 123 may partially overlap in the projection direction. Alternatively, the first and second photographing photosensitive areas 121 and 122 may partially overlap in the projection direction, and the second and third photographing photosensitive areas 122 and 123 may partially overlap in the projection direction. Alternatively, the first and second photographing photosensitive areas 121 and 122 may partially overlap in the projection direction, the second and third photographing photosensitive areas 122 and 123 may not overlap at all in the projection direction, and so on. Fig. 16 exemplarily shows a case where the first, second, and third photographing photosensitive areas 121, 122, and 123 completely overlap.
As shown in fig. 2, 3 and 17, in some embodiments, the depth of view measurement pixel cell 104 may include a depth of view measurement sensitive layer 109; the depth of field measurement photosensitive layer 109 may include a depth of field measurement photosensitive region 110. A depth-of-field measurement photosensitive area 110 configured to receive the depth-of-field measurement light, and generate depth-of-field image information of a local area of the target object based on the depth-of-field measurement light;
the depth-of-field measuring photosensitive area 110, the first photographing photosensitive area 121, the second photographing photosensitive area 122, and the third photographing photosensitive area 123 overlap or do not overlap in the projection direction.
In some embodiments, the depth-of-field measuring photosensitive area 110, the first photographing photosensitive area 121, the second photographing photosensitive area 122, and the third photographing photosensitive area 123 overlap or do not overlap in a projection direction perpendicular to the depth-of-field measuring photosensitive area 110.
In some embodiments, the depth-of-field measuring photosensitive area 110, the first photographing photosensitive area 121, the second photographing photosensitive area 122, and the third photographing photosensitive area 123 may completely overlap in the projection direction. Alternatively, the depth-of-field measuring photosensitive area 110, the first shooting photosensitive area 121, the second shooting photosensitive area 122, and the third shooting photosensitive area 123 may not overlap at all in the projection direction. Alternatively, the depth-of-field measuring photosensitive area 110, the first photographing photosensitive area 121, and the second photographing photosensitive area 122 may completely overlap in the projection direction, and the second photographing photosensitive area 122 and the third photographing photosensitive area 123 may partially overlap or completely not overlap in the projection direction. Alternatively, the depth-of-field measuring photosensitive area 110 and the first photographing photosensitive area 121 may completely overlap in the projection direction, the first photographing photosensitive area 121 and the second photographing photosensitive area 122 may partially overlap in the projection direction, and the second photographing photosensitive area 122 and the third photographing photosensitive area 123 may completely overlap or may not overlap or partially overlap in the projection direction. Alternatively, the depth-of-field measuring photosensitive region 110 and the first photographing photosensitive region 121 may partially overlap in the projection direction, the first photographing photosensitive region 121 and the second photographing photosensitive region 122 may partially overlap in the projection direction, the second photographing photosensitive region 122 and the third photographing photosensitive region 123 may completely overlap or completely not overlap or partially overlap in the projection direction, and so on. Fig. 17 exemplarily shows a case where the depth-of-field measuring photosensitive area 110, the first photographing photosensitive area 121, the second photographing photosensitive area 122, and the third photographing photosensitive area 123 completely overlap.
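The overlap relations enumerated above can be checked mechanically once each region's projection is known. The sketch below is an illustrative assumption rather than part of the patent: it models each projected region as an axis-aligned rectangle and classifies a pair of regions as completely overlapping (coincident projections), partially overlapping, or non-overlapping.

```python
def overlap_kind(a, b):
    """Classify how two projected rectangles (x0, y0, x1, y1) relate."""
    ax0, ay0, ax1, ay1 = a
    bx0, by0, bx1, by1 = b
    ix0, iy0 = max(ax0, bx0), max(ay0, by0)   # intersection corners
    ix1, iy1 = min(ax1, bx1), min(ay1, by1)
    if ix0 >= ix1 or iy0 >= iy1:
        return "no overlap"
    if a == b:
        return "complete overlap"             # the projections coincide
    return "partial overlap"

print(overlap_kind((0, 0, 2, 2), (0, 0, 2, 2)))  # complete overlap
print(overlap_kind((0, 0, 2, 2), (1, 1, 3, 3)))  # partial overlap
print(overlap_kind((0, 0, 2, 2), (3, 3, 4, 4)))  # no overlap
```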
In some embodiments, the photographing photosensitive assembly 101 is disposed between the target object and the depth-of-field measuring photosensitive assembly 102; alternatively, the depth-of-field measuring photosensitive assembly 102 is disposed between the target object and the photographing photosensitive assembly 101.
As shown in fig. 18, in some embodiments, when the photographing photosensitive assembly 101 is disposed between the target object and the depth-of-field measuring photosensitive assembly 102:
the sensor 100 includes a third filter 127 disposed between the object and the first photosensitive layer 106;
the third filter 127 is configured to filter the light to be transmitted to the three photographing photosensitive layers 105 and the depth of field measurement pixel unit 104, so as to obtain the light required by the three photographing photosensitive layers 105 and the depth of field measurement pixel unit 104.
As shown in fig. 18, in some embodiments, the first photographing photosensitive layer 106 can absorb blue light, and when the light directed to the first photographing photosensitive layer 106 includes ultraviolet light, the third filter 127 can filter out the ultraviolet light to prevent it from interfering with the first photographing photosensitive layer 106.
As shown in fig. 19, in some embodiments, when the photographing photosensitive assembly 101 is disposed between the target object and the depth-of-field measuring photosensitive assembly 102:
the sensor 100 includes a fourth filter 128 disposed between the first photosensitive layer 106 and the second photosensitive layer 107;
the fourth filter 128 is configured to filter the light to be transmitted to the second photographing photosensitive layer 107, the third photographing photosensitive layer 108, and the depth of field measurement pixel unit 104 to obtain the light required by the second photographing photosensitive layer 107, the third photographing photosensitive layer 108, and the depth of field measurement pixel unit 104.
In some embodiments, the fourth filter 128 can filter out light that is not completely absorbed by the first photographing photosensitive layer 106. Optionally, the fourth filter 128 can filter out light that may interfere with the second photographing photosensitive layer 107. The sensor 100 may include the fourth filter 128 alone. Optionally, the sensor 100 may include both the third filter 127 and the fourth filter 128. Fig. 19 exemplarily shows a case where the sensor 100 includes the fourth filter 128.
As shown in fig. 20, in some embodiments, when the photographing photosensitive assembly 101 is disposed between the target object and the depth-of-field measuring photosensitive assembly 102:
the sensor 100 includes a fifth filter 129 disposed between the second photosensitive layer 107 and the third photosensitive layer 108;
the fifth filter 129 is configured to filter the light to be transmitted to the third photographing photosensitive layer 108 and the depth of field measurement pixel unit 104, so as to obtain the light required by the third photographing photosensitive layer 108 and the depth of field measurement pixel unit 104.
In some embodiments, the fifth filter 129 can filter out light that is not completely absorbed by the second photographing photosensitive layer 107. Optionally, the fifth filter 129 can filter out light that would interfere with the third photographing photosensitive layer 108. The sensor 100 may include the fifth filter 129 alone. Optionally, the sensor 100 may include the third filter 127 and the fifth filter 129. Optionally, the sensor 100 may include the fourth filter 128 and the fifth filter 129. Optionally, the sensor 100 may include the third filter 127, the fourth filter 128, and the fifth filter 129. Fig. 20 exemplarily shows a case where the sensor 100 includes the fifth filter 129.
As shown in fig. 21, in some embodiments, when the photographing photosensitive assembly 101 is disposed between the target object and the depth-of-field measuring photosensitive assembly 102:
the sensor 100 includes a sixth filter 130 disposed between the third photographing photosensitive layer 108 and the depth of field measurement pixel unit 104;
the sixth filter 130 is configured to filter the light to be transmitted to the depth of field measurement pixel unit 104 to obtain the light required by the depth of field measurement pixel unit 104.
In some embodiments, the sixth filter 130 can filter out light that is not completely absorbed by the third photographing photosensitive layer 108. Optionally, the sixth filter 130 can filter out light that may interfere with the depth of field measurement pixel unit 104. The sensor 100 may include the sixth filter 130 alone. Optionally, the sensor 100 may include the third filter 127 and the sixth filter 130. Optionally, the sensor 100 may include the fourth filter 128 and the sixth filter 130. Optionally, the sensor 100 may include the fifth filter 129 and the sixth filter 130. Optionally, the sensor 100 may include the third filter 127, the fourth filter 128, and the sixth filter 130. Optionally, the sensor 100 may include the third filter 127, the fifth filter 129, and the sixth filter 130. Optionally, the sensor 100 may include the fourth filter 128, the fifth filter 129, and the sixth filter 130. Optionally, the sensor 100 may include the third filter 127, the fourth filter 128, the fifth filter 129, and the sixth filter 130. Fig. 21 exemplarily shows a case where the sensor 100 includes the sixth filter 130.
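The stacked arrangement with its optional filters can be pictured as a pipeline: light meets an optional filter before each photographing photosensitive layer, each layer takes out its own band, and an optional last filter sits just before the depth of field measurement pixel unit 104. The Python sketch below is illustrative only and not from the patent; the band names and the particular combination of filters included are assumptions.

```python
def run_stack(incident, filters, layers):
    """filters: four sets of bands removed by the optional filters placed before
    layer 1, layer 2, layer 3 and the depth measurement unit, in that order;
    layers: the bands absorbed by the first/second/third photographing layers."""
    light = set(incident)
    captured = []
    for blocked, band in zip(filters[:3], layers):
        light -= blocked                 # third/fourth/fifth filter (if present)
        if band in light:
            captured.append(band)        # the layer absorbs its own band
            light.discard(band)
    light -= filters[3]                  # sixth filter, just before the depth unit
    return captured, light               # 'light' is what the depth unit receives

incident = {"ultraviolet", "blue", "green", "red", "infrared"}
filters = [{"ultraviolet"}, set(), set(), set()]   # e.g. only the third filter present
layers = ["blue", "green", "red"]
captured, to_depth = run_stack(incident, filters, layers)
print(captured)   # ['blue', 'green', 'red']
print(to_depth)   # {'infrared'}
```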
As shown in fig. 22, an embodiment of the present disclosure provides a camera module 200, and the camera module 200 includes the sensor 100.
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims.

Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms; the terms are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but they may not be the same element.

Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items. Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method, or apparatus that includes the element.

In this document, each embodiment may be described with emphasis on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another. For the methods, products, and the like disclosed in the embodiments, where they correspond to a method section disclosed herein, reference may be made to the description of that method section.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit may be merely a division of a logical function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the drawings, the width, length, thickness, etc. of structures such as elements or layers may be exaggerated for clarity and descriptive purposes. When an element or layer is referred to as being "disposed on" (or "mounted on," "laid on," "attached to," "coated on," or the like) another element or layer, the element or layer may be directly "disposed on" or "over" the other element or layer, or intervening elements or layers may be present, or even partially embedded in the other element or layer.

Claims (23)

1. A sensor, comprising: a shooting photosensitive assembly and a depth of field measurement photosensitive assembly that are oppositely arranged;
the shooting photosensitive assembly comprises a plurality of shooting pixel units;
the depth of field measurement photosensitive assembly comprises a plurality of depth of field measurement pixel units;
the shooting pixel unit is configured to receive shooting light and generate shot image information of a local area of a target object based on the shooting light;
the depth of field measurement pixel unit is configured to receive depth of field measurement light and generate depth image information of a local area of the target object based on the depth of field measurement light;
the plurality of shooting pixel units correspond to the plurality of depth of field measurement pixel units one to one.
2. The sensor according to claim 1, wherein the corresponding shooting pixel unit and depth of field measurement pixel unit correspond to the same local area of the target object.
3. The sensor of claim 1, wherein the shooting pixel unit comprises at least one shooting photosensitive layer.
4. The sensor according to claim 3, wherein when the shooting pixel unit includes one shooting photosensitive layer, the shooting photosensitive layer includes at least one shooting waveband light sensing area;
when the shooting photosensitive layer includes one shooting waveband light sensing area, the shooting waveband light sensing area is configured to absorb light of a certain waveband and generate shot image information of a local area of the target object based on the absorbed light of that waveband;
when the shooting photosensitive layer includes more than two shooting waveband light sensing areas, the different shooting waveband light sensing areas are configured to respectively absorb light of different wavebands and to generate, based on the absorbed light of those wavebands, the shot image information corresponding to the different shooting waveband light sensing areas for the local area of the target object;
the shot image information corresponding to the different shooting waveband light sensing areas forms the shot image information of the local area of the target object.
5. The sensor according to claim 4, wherein the corresponding shooting pixel unit and depth of field measurement pixel unit partially overlap or completely overlap in a projection direction.
6. The sensor according to claim 4 or 5, wherein the shooting photosensitive layer includes a first shooting waveband light sensing area, a second shooting waveband light sensing area, and a third shooting waveband light sensing area;
the first shooting waveband light sensing area is configured to absorb first waveband light and generate first shot image information of a local area of the target object based on the first waveband light;
the second shooting waveband light sensing area is configured to absorb second waveband light and generate second shot image information of a local area of the target object based on the second waveband light;
the third shooting waveband light sensing area is configured to absorb third waveband light and generate third shot image information of a local area of the target object based on the third waveband light;
the first shot image information, the second shot image information, and the third shot image information form the shot image information of the local area of the target object.
7. The sensor of claim 6, wherein the depth of field measurement pixel unit includes a depth of field measurement photosensitive area configured to receive depth of field measurement light and generate depth image information of a local area of the target object based on the depth of field measurement light;
the depth of field measurement photosensitive area overlaps or does not overlap with at least one of the following in the projection direction:
the first shooting waveband light sensing area;
the second shooting waveband light sensing area;
and the third shooting waveband light sensing area.
8. The sensor of claim 4,
the shooting photosensitive assembly is arranged between the target object and the depth of field measurement photosensitive assembly; or
the depth of field measurement photosensitive assembly is arranged between the target object and the shooting photosensitive assembly.
9. The sensor of claim 8, wherein when the shooting photosensitive assembly is disposed between the target object and the depth of field measurement photosensitive assembly, and the shooting photosensitive layer includes a first shooting waveband light sensing area, a second shooting waveband light sensing area, and a third shooting waveband light sensing area:
the sensor further includes a first filter, the first filter comprising: a first light-transmitting region, a second light-transmitting region, and a third light-transmitting region;
the first light-transmitting region is configured to filter the light to be transmitted to the first shooting waveband light sensing area and the depth of field measurement pixel unit to obtain the light required by the first shooting waveband light sensing area and the depth of field measurement pixel unit;
the second light-transmitting region is configured to filter the light to be transmitted to the second shooting waveband light sensing area and the depth of field measurement pixel unit to obtain the light required by the second shooting waveband light sensing area and the depth of field measurement pixel unit;
the third light-transmitting region is configured to filter the light to be transmitted to the third shooting waveband light sensing area and the depth of field measurement pixel unit to obtain the light required by the third shooting waveband light sensing area and the depth of field measurement pixel unit.
10. The sensor of claim 9, further comprising a second filter disposed between the shooting pixel unit and the depth of field measurement pixel unit, the second filter being configured to filter the light to be transmitted to the depth of field measurement pixel unit to obtain the light required by the depth of field measurement pixel unit.
11. The sensor according to claim 3, wherein when the shooting pixel unit includes three shooting photosensitive layers, the three shooting photosensitive layers are: a first shooting photosensitive layer, a second shooting photosensitive layer, and a third shooting photosensitive layer;
the first shooting photosensitive layer is configured to absorb light of a first waveband and generate first shot image information of a local area of the target object based on the light of the first waveband;
the second shooting photosensitive layer is configured to absorb light of a second waveband and generate second shot image information of a local area of the target object based on the light of the second waveband;
the third shooting photosensitive layer is configured to absorb light of a third waveband and generate third shot image information of a local area of the target object based on the light of the third waveband;
the first shot image information, the second shot image information, and the third shot image information form the shot image information of the local area of the target object.
12. The sensor of claim 11, wherein the first, second, and third shooting photosensitive layers completely overlap or partially overlap in a projection direction.
13. The sensor of claim 11 or 12, wherein the depth of field measurement pixel unit and the first, second, and third shooting photosensitive layers overlap completely or partially in the projection direction.
14. The sensor of claim 11,
the first shooting photosensitive layer comprises a first shooting photosensitive area; wherein the first shooting photosensitive area is configured to absorb light of a first waveband and generate first shot image information of a local area of the target object based on the light of the first waveband;
the second shooting photosensitive layer comprises a second shooting photosensitive area; wherein the second shooting photosensitive area is configured to absorb light of a second waveband and generate second shot image information of a local area of the target object based on the light of the second waveband;
the third shooting photosensitive layer comprises a third shooting photosensitive area; wherein the third shooting photosensitive area is configured to absorb light of a third waveband and generate third shot image information of a local area of the target object based on the light of the third waveband;
the first shooting photosensitive area, the second shooting photosensitive area, and the third shooting photosensitive area overlap or do not overlap in the projection direction.
15. The sensor of claim 14, wherein the depth of field measurement pixel unit includes a depth of field measurement photosensitive area configured to receive depth of field measurement light and generate depth image information of a local area of the target object based on the depth of field measurement light;
the depth of field measurement photosensitive area, the first shooting photosensitive area, the second shooting photosensitive area, and the third shooting photosensitive area overlap or do not overlap in the projection direction.
16. The sensor of claim 11,
the shooting photosensitive assembly is arranged between the target object and the depth of field measurement photosensitive assembly; or
the depth of field measurement photosensitive assembly is arranged between the target object and the shooting photosensitive assembly.
17. The sensor of claim 16, wherein when the shooting photosensitive assembly is disposed between the target object and the depth of field measurement photosensitive assembly:
the sensor comprises a third light filter film arranged between the target object and the first shooting photosensitive layer;
the third filter film is configured to filter the light to be transmitted to the three shooting photosensitive layers and the depth of field measurement pixel unit to obtain the light required by the three shooting photosensitive layers and the depth of field measurement pixel unit.
18. The sensor according to claim 16 or 17, wherein when the shooting photosensitive assembly is disposed between the target object and the depth of field measurement photosensitive assembly:
the sensor comprises a fourth light filter film arranged between the first shooting photosensitive layer and the second shooting photosensitive layer;
the fourth filter film is configured to filter the light to be transmitted to the second shooting photosensitive layer, the third shooting photosensitive layer, and the depth of field measurement pixel unit to obtain the light required by the second shooting photosensitive layer, the third shooting photosensitive layer, and the depth of field measurement pixel unit.
19. The sensor of claim 16, wherein when the shooting photosensitive assembly is disposed between the target object and the depth of field measurement photosensitive assembly:
the sensor comprises a fifth light filtering film arranged between the second shooting photosensitive layer and the third shooting photosensitive layer;
the fifth filter film is configured to filter the light to be transmitted to the third shooting photosensitive layer and the depth of field measurement pixel unit to obtain the light required by the third shooting photosensitive layer and the depth of field measurement pixel unit.
20. The sensor of claim 16, wherein when the shooting photosensitive assembly is disposed between the target object and the depth of field measurement photosensitive assembly:
the sensor comprises a sixth filter film arranged between the third shooting photosensitive layer and the depth-of-field measurement pixel unit;
the sixth filter film is configured to filter the light to be transmitted to the depth of field measurement pixel unit to obtain the light required by the depth of field measurement pixel unit.
21. The sensor of claim 1, wherein the depth of field measurement pixel unit comprises a depth of field measurement photosensitive layer.
22. The sensor of claim 21, wherein the depth of field measurement photosensitive layer comprises a depth of field measurement photosensitive area.
23. A camera module, characterized in that it comprises a sensor according to any one of claims 1 to 22.
CN202110337173.2A 2021-03-30 2021-03-30 Sensor and shooting module Pending CN112804438A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110337173.2A CN112804438A (en) 2021-03-30 2021-03-30 Sensor and shooting module

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110337173.2A CN112804438A (en) 2021-03-30 2021-03-30 Sensor and shooting module

Publications (1)

Publication Number Publication Date
CN112804438A true CN112804438A (en) 2021-05-14

Family

ID=75815866

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110337173.2A Pending CN112804438A (en) 2021-03-30 2021-03-30 Sensor and shooting module

Country Status (1)

Country Link
CN (1) CN112804438A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140263980A1 (en) * 2011-01-28 2014-09-18 Aptina Imaging Corporation Imagers with depth sensing capabilities
CN102623475A (en) * 2012-04-17 2012-08-01 上海中科高等研究院 Stacked CMOS (Complementary Metal Oxide Semiconductor) image sensor
CN107465849A (en) * 2016-06-03 2017-12-12 致伸科技股份有限公司 The image capture unit of image depth of field measuring method and application this method
CN106340527A (en) * 2016-11-15 2017-01-18 豪威科技(上海)有限公司 Back-illuminated image sensor chip
CN111314634A (en) * 2018-12-11 2020-06-19 江西合力泰科技有限公司 Manufacturing method of image sensor, sensor and pixel reading method
CN110299375A (en) * 2019-07-08 2019-10-01 芯盟科技有限公司 Semiconductor structure and forming method thereof
CN112363180A (en) * 2020-10-28 2021-02-12 Oppo广东移动通信有限公司 Imaging distance measuring sensor, method, system and storage medium

Similar Documents

Publication Publication Date Title
US9793308B2 (en) Imager integrated circuit and stereoscopic image capture device
US10498990B2 (en) Imaging systems with high dynamic range and phase detection pixels
EP2696570B1 (en) Lens array for partitioned image sensor having color filters
JP5399215B2 (en) Multi-lens camera device and electronic information device
EP2380345B1 (en) Improving the depth of field in an imaging system
US10686004B2 (en) Image capturing element and image capturing device image sensor and image-capturing device
KR102283111B1 (en) Integrated microoptic imager, processor and display
TWI606309B (en) Optical imaging apparatus, in particular for computational imaging, having further functionality
US20130222546A1 (en) Solid-state image pickup element and image pickup apparatus
CN211481355U (en) Multispectral sensing structure, sensor and camera
WO2005089369B1 (en) Apparatus for imaging using an array of lenses
KR101808355B1 (en) Imaging device
EP2225877A1 (en) Camera system with multiple pixel arrays on a chip
WO2020199674A1 (en) Electroluminescent display panel and display apparatus
JP2008091841A (en) Solid state imaging device and imaging device
JP5947217B2 (en) Color separation filter array, solid-state imaging device, imaging device, and display device
EP2782136A2 (en) Solid state imaging device and portable information terminal
JP5332423B2 (en) Imaging device
JP5532766B2 (en) Imaging device and imaging apparatus
WO2021168666A1 (en) Fingerprint identification apparatus and electronic device
TWI584643B (en) Camera devices and systems based on a single imaging sensor and methods for manufacturing the same
EP3979123A1 (en) Fingerprint detection apparatus and electronic device
CN112804438A (en) Sensor and shooting module
JP5476731B2 (en) Image sensor
CN112738385A (en) Sensor and shooting module

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210514