CN112738386A - Sensor, shooting module and image acquisition method - Google Patents

Sensor, shooting module and image acquisition method

Info

Publication number
CN112738386A
CN112738386A
Authority
CN
China
Prior art keywords
light
photosensitive layer
depth
filter
sensor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110337177.0A
Other languages
Chinese (zh)
Inventor
Inventor not announced
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ivisual 3D Technology Co Ltd
Original Assignee
Beijing Ivisual 3D Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ivisual 3D Technology Co Ltd filed Critical Beijing Ivisual 3D Technology Co Ltd
Priority to CN202110337177.0A priority Critical patent/CN112738386A/en
Publication of CN112738386A publication Critical patent/CN112738386A/en
Pending legal-status Critical Current

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50: Constructional details
    • H04N23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50: Constructional details
    • H04N23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/70: SSIS architectures; Circuits associated therewith

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application relates to the technical field of sensors and discloses a sensor that corresponds to a lens and comprises a photographing photosensitive layer and a depth-of-field measurement photosensitive layer. The photographing photosensitive layer is configured to receive photographing light and generate photographed image information of a target object based on the received photographing light; the depth-of-field measurement photosensitive layer is configured to receive depth-of-field measurement light and generate depth-of-field image information of the target object based on the received depth-of-field measurement light. With the sensor provided by the application, a single lens can simultaneously acquire photographed image information and depth-of-field image information. The application also discloses a shooting module and an image acquisition method.

Description

Sensor, shooting module and image acquisition method
Technical Field
The application relates to the technical field of sensors, for example to a sensor, a shooting module and an image acquisition method.
Background
At present, with the widespread use of cameras in electronic equipment, users expect an ever better photographing experience: the equipment must produce clear images and deliver a good visual experience. However, to obtain depth-of-field image information, a portable electronic device is usually provided with a dual-lens module, which acquires dual-camera raw data that is then processed by a dual-camera algorithm to generate image information with a depth-of-field effect; alternatively, at least two image sensors are provided to acquire the photographed image information and the depth-of-field image information separately.
In the process of implementing the embodiments of the present disclosure, it was found that the related art has at least the following problem: acquiring both photographed image information and depth-of-field image information requires at least two lenses, which occupy considerable space.
Disclosure of Invention
The following presents a simplified summary in order to provide a basic understanding of some aspects of the disclosed embodiments. This summary is not an extensive overview, nor is it intended to identify key or critical elements or to delineate the scope of the embodiments; rather, it serves as a prelude to the more detailed description presented later.
The embodiments of the disclosure provide a sensor, a shooting module and an image acquisition method, and aim to solve the technical problem that acquiring photographed image information and depth-of-field image information requires at least two lenses, which occupy considerable space.
In some embodiments, a sensor corresponding to a lens comprises: a photographing photosensitive layer and a depth-of-field measurement photosensitive layer. The photographing photosensitive layer is configured to receive photographing light and generate photographed image information of a target object based on the received photographing light; the depth-of-field measurement photosensitive layer is configured to receive depth-of-field measurement light and generate depth-of-field image information of the target object based on the received depth-of-field measurement light.
In some embodiments, the photographing photosensitive layer and the depth-of-field measurement photosensitive layer form a full-coverage, partial-coverage, or no-coverage structure.
In some embodiments, when the photographic sensitive layer fully covers or partially covers the depth measurement sensitive layer, the sensor may further include: a first optical filter disposed between the photographing photosensitive layer and the depth measurement photosensitive layer, configured to filter light to be transmitted to the depth measurement photosensitive layer to obtain the depth measurement light required by the depth measurement photosensitive layer.
In some embodiments, when the depth measurement sensitive layer fully covers or partially covers the capture sensitive layer, the sensor may further comprise: a second optical filter disposed between the photographing photosensitive layer and the depth measurement photosensitive layer, configured to filter light to be transmitted to the photographing photosensitive layer to obtain the photographing light required by the photographing photosensitive layer.
In some embodiments, when the photographing photosensitive layer and the depth-of-field measurement photosensitive layer form a no-coverage structure, the sensor may further include: a third optical filter disposed between the photographing photosensitive layer and the lens, and configured to filter light to be transmitted to the photographing photosensitive layer to obtain the photographing light required by the photographing photosensitive layer.
In some embodiments, when the photographing photosensitive layer and the depth measurement photosensitive layer are uncovered structures, the sensor may further include: a fourth optical filter disposed between the depth-of-field measurement photosensitive layer and the lens, and configured to filter light to be transmitted to the depth-of-field measurement photosensitive layer, so as to obtain the depth-of-field measurement light required by the depth-of-field measurement photosensitive layer.
In some embodiments, the third filter and the fourth filter are integrally disposed as a first patterned filter, or the third filter and the fourth filter are disposed separately.
In some embodiments, the first patterned filter may include different regions, and the capture light and the depth measurement light are transmitted through the different regions of the first patterned filter, respectively.
In some embodiments, the capture photosensitive layer comprises at least two photosensitive layer structures.
In some embodiments, the sensor may further comprise: a filter set configured to filter the photographing light to be transmitted to the at least two photosensitive layers, respectively, to obtain the photographing light required by the at least two photosensitive layers.
In some embodiments, the at least two photosensitive layer structures of the photographing photosensitive layer are full-coverage structures, partial-coverage structures, or no-coverage structures.
In some embodiments, when the at least two photosensitive layer structures form a no-coverage structure, the filter set may be a second patterned filter.
In some embodiments, the second patterned filter includes different regions, and the photographing light of different wavelength bands respectively passes through the different regions of the second patterned filter.
In some embodiments, when the at least two photosensitive layer structures form a full-coverage or partial-coverage structure, the filter set may include an optical filter disposed between the at least two photosensitive layers.
In some embodiments, the photographing photosensitive layer may include: a first photosensitive layer configured to receive light of a first wavelength band and convert the corresponding optical signal into first photographed image information; a second photosensitive layer configured to receive light of a second wavelength band and convert the corresponding optical signal into second photographed image information; and a third photosensitive layer configured to receive light of a third wavelength band and convert the corresponding optical signal into third photographed image information.
In some embodiments, the depth of field measurement light is light of a particular wavelength range.
In some embodiments, the sensor may further comprise: a transmitter configured to transmit the depth measurement light to the target object; a calculation unit configured to obtain a depth of field value of the target object by a round trip time of flight of the depth of field measurement light received based on the depth of field measurement photosensitive layer.
In some embodiments, a camera module includes the sensor.
In some embodiments, an image acquisition method comprises: receiving photographing light and generating photographed image information of a target object based on the received photographing light; and receiving depth-of-field measurement light and generating depth-of-field image information of the target object based on the received depth-of-field measurement light; wherein the photographing light and the depth-of-field measurement light come from the same lens.
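The acquisition method above can be illustrated with a minimal, hypothetical sketch: light arriving through a single lens is separated into photographing light (visible) and depth-of-field measurement light (here assumed infrared), and each portion yields its own image information. The function name, band boundaries, and the wavelength-list representation are illustrative, not part of the patent.

```python
# Hypothetical sketch of the image acquisition method: both kinds of light
# arrive through the same lens; each photosensitive layer produces its own
# output from its band. Band limits (nanometres) are assumed for illustration.

VISIBLE_MIN_NM, VISIBLE_MAX_NM = 380, 780  # assumed visible-light band

def acquire(wavelengths_through_lens_nm):
    """Split light from a single lens into photographed-image information
    (visible band) and depth-of-field image information (longer wavelengths)."""
    visible = [w for w in wavelengths_through_lens_nm
               if VISIBLE_MIN_NM <= w <= VISIBLE_MAX_NM]
    infrared = [w for w in wavelengths_through_lens_nm if w > VISIBLE_MAX_NM]
    photographed_image_info = {"band": "visible", "samples": visible}
    depth_image_info = {"band": "infrared", "samples": infrared}
    return photographed_image_info, depth_image_info

photo, depth = acquire([450, 550, 650, 850, 940])
```

The point of the sketch is only that one input (one lens) feeds two independent outputs, which is the claimed effect of the sensor.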
The sensor, the shooting module and the image acquisition method provided by the embodiment of the disclosure can realize the following technical effects: a single lens can simultaneously acquire shot image information and depth image information.
The foregoing general description and the following description are exemplary and explanatory only and are not restrictive of the application.
Drawings
At least one embodiment is illustrated by the accompanying drawings. The drawings do not limit the embodiments; elements with the same reference numerals are similar elements, and the drawings are not to scale.
FIG. 1 is a schematic view of a sensor provided by an embodiment of the present disclosure;
fig. 2A, 2B, and 2C are schematic diagrams illustrating several situations of the positional relationship between the photographing photosensitive layer and the depth measurement photosensitive layer according to the embodiment of the disclosure;
fig. 3 is a schematic diagram of a positional relationship between a filter and a photographing photosensitive layer and a depth measurement photosensitive layer according to an embodiment of the disclosure;
fig. 4 is a schematic diagram of a positional relationship between a filter and a photographing photosensitive layer and a depth measurement photosensitive layer according to an embodiment of the disclosure;
fig. 5A is a schematic diagram of a positional relationship between a filter and a photographing photosensitive layer and a depth measurement photosensitive layer according to an embodiment of the disclosure;
fig. 5B is a schematic diagram of a positional relationship between the patterned filter and the photographing photosensitive layer and the depth-of-field measurement photosensitive layer according to an embodiment of the disclosure;
fig. 6A and 6B are schematic structural diagrams of a photosensitive layer provided in the present disclosure;
fig. 7 is a schematic diagram of a positional relationship between a patterned filter and a photosensitive layer structure inside a photographic photosensitive layer provided in an embodiment of the present disclosure;
fig. 8 is a schematic diagram of a positional relationship between a filter and a photosensitive layer structure inside a photographic photosensitive layer provided in the embodiment of the present disclosure;
fig. 9 is a schematic flowchart of an image acquisition method according to an embodiment of the present disclosure.
Reference numerals:
100: a sensor; 101: a photographing photosensitive layer; 102: a depth-of-field measurement photosensitive layer; 103: a first optical filter; 1030: a second optical filter; 1031: a third optical filter; 1032: a fourth optical filter; 1033: a first patterned optical filter; 104: a lens; 105: a transmitter; 106: a calculation unit; 1011: a first photosensitive layer; 1012: a second photosensitive layer; 1013: a third photosensitive layer; 20: a target object; 107: a filter set; 1071: a region; 1072: a region; 1073: a region; 1074: an optical filter; 1075: an optical filter.
Detailed Description
So that the manner in which the features and elements of the disclosed embodiments can be understood in detail, a more particular description of the disclosed embodiments, briefly summarized above, may be had by reference to the embodiments, some of which are illustrated in the appended drawings. In the following description of the technology, for purposes of explanation, numerous details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, at least one embodiment may be practiced without these specific details. In other instances, well-known structures and devices may be shown in simplified form in order to simplify the drawing.
The sensor, the shooting module and the image acquisition method provided by the embodiment of the disclosure are applied to scenes of shooting pictures and videos, and the shooting module can be an imaging sensor and can be applied to lens modules in electronic equipment such as cameras, mobile phones, depth-of-field measuring equipment, stereo imaging equipment and the like.
Referring to fig. 1, the disclosed embodiment provides a sensor 100, the sensor 100 corresponding to a lens 104, the sensor 100 including: a photographing photosensitive layer 101 and a depth-of-field measurement photosensitive layer 102. The photographing photosensitive layer 101 is configured to receive photographing light and generate photographed image information of the target object 20 based on the received photographing light; the depth-of-field measurement photosensitive layer 102 is configured to receive depth-of-field measurement light and generate depth-of-field image information of the target object 20 based on the received depth-of-field measurement light.
In some embodiments, the photographic sensitive layer 101 may include one or more than two sensitive layer structures. The photographing light may be visible light, the depth of field measuring light may be invisible light, and the invisible light may be infrared or ultraviolet. Table 1 exemplarily illustrates wavelength distributions of visible light and invisible light.
TABLE 1
(Table 1 appears as an image in the original publication; it lists the wavelength ranges of visible light and of invisible light such as infrared and ultraviolet.)
In some embodiments, referring to figs. 2A, 2B, and 2C, the photographing photosensitive layer 101 and the depth-of-field measurement photosensitive layer 102 may form a full-coverage structure, a partial-coverage structure, or a no-coverage structure.
In some embodiments, full coverage, partial coverage, and no coverage describe the positional relationship between the photographing photosensitive layer 101 and the depth-of-field measurement photosensitive layer 102: full coverage means their orthographic projections coincide completely, partial coverage means their orthographic projections overlap partially, and no coverage means their orthographic projections do not overlap. As shown in fig. 2A, the photographing photosensitive layer 101 may fully cover the depth-of-field measurement photosensitive layer 102; as shown in fig. 2B, the photographing photosensitive layer 101 may partially cover it; likewise, the depth-of-field measurement photosensitive layer 102 may fully or partially cover the photographing photosensitive layer 101; or, as shown in fig. 2C, the two layers may be independent of each other with no covering relationship.
In some embodiments, referring to figs. 3 and 4, when the photographing photosensitive layer 101 fully or partially covers the depth-of-field measurement photosensitive layer 102, the sensor further includes: a first optical filter 103 disposed between the photographing photosensitive layer 101 and the depth-of-field measurement photosensitive layer 102, configured to filter light to be transmitted to the depth-of-field measurement photosensitive layer 102 to obtain the depth-of-field measurement light required by the depth-of-field measurement photosensitive layer 102.
In some embodiments, when the photosensitive layer 101 covers the depth measurement photosensitive layer 102 completely or partially, the first filter 103 is disposed between the photosensitive layer 101 and the depth measurement photosensitive layer 102. Alternatively, the first filter 103 is disposed on the surface of the photographing photosensitive layer 101 facing the depth-of-field measurement photosensitive layer 102; alternatively, the first filter 103 is disposed on the surface of the depth-of-field measurement photosensitive layer 102 facing the photographing photosensitive layer 101; alternatively, the first filter 103 is independently disposed between the photosensitive photographing layer 101 and the photosensitive depth-of-field measuring layer 102. For example: the depth of field measurement light required by the depth of field measurement photosensitive layer 102 is infrared light, and the first optical filter 103 filters out light other than infrared light from the light to be transmitted to the depth of field measurement photosensitive layer 102, so as to obtain the infrared depth of field measurement light required by the depth of field measurement photosensitive layer 102.
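The infrared example above can be sketched as a simple band-pass selection. This is a hypothetical illustration only: the patent does not specify pass-band edges, so the wavelength bounds and function name below are assumptions.

```python
# Hypothetical sketch of the first optical filter 103: it passes only the
# infrared depth-of-field measurement light and blocks everything else on the
# way to the depth measurement photosensitive layer. Bounds are illustrative.

IR_MIN_NM = 780   # assumed lower edge of the infrared pass band
IR_MAX_NM = 1100  # assumed upper edge of the infrared pass band

def first_filter(wavelengths_nm):
    """Keep only the wavelengths inside the infrared pass band."""
    return [w for w in wavelengths_nm if IR_MIN_NM <= w <= IR_MAX_NM]

# Visible components (450, 550, 650 nm) are blocked; a 940 nm depth
# measurement wavelength passes through to the layer behind the filter.
passed = first_filter([450, 550, 650, 940])
```

The second optical filter 1030 described next plays the mirror-image role, passing the visible photographing light instead.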
In some embodiments, when the depth-of-field measurement photosensitive layer 102 fully or partially covers the photographing photosensitive layer 101, the sensor further includes: a second optical filter 1030 disposed between the photographing photosensitive layer 101 and the depth-of-field measurement photosensitive layer 102, configured to filter light to be transmitted to the photographing photosensitive layer 101 to obtain the photographing light required by the photographing photosensitive layer 101.
In some embodiments, when the depth-of-field measurement photosensitive layer 102 fully or partially covers the photographing photosensitive layer 101, the second optical filter 1030 is disposed between the two layers. Alternatively, the second optical filter 1030 is disposed on the surface of the depth-of-field measurement photosensitive layer 102 facing the photographing photosensitive layer 101; or on the surface of the photographing photosensitive layer 101 facing the depth-of-field measurement photosensitive layer 102; or independently between the two layers. For example: the photographing photosensitive layer 101 requires red, green and blue light, and the second optical filter 1030 filters out all other light from the light to be transmitted to the photographing photosensitive layer 101, so as to obtain the red, green and blue photographing light required by the photographing photosensitive layer 101.
In some embodiments, referring to fig. 5A, when the photographing photosensitive layer 101 and the depth-of-field measurement photosensitive layer 102 form a no-coverage structure, the sensor further includes: a third optical filter 1031 disposed between the photographing photosensitive layer 101 and the lens 104, configured to filter the light to be transmitted to the photographing photosensitive layer 101 to obtain the photographing light required by the photographing photosensitive layer 101.
In some embodiments, the third optical filter 1031 is disposed between the photographing photosensitive layer 101 and the lens 104, the third optical filter 1031 may be disposed on a surface of the photographing photosensitive layer 101 facing the lens 104, or the third optical filter 1031 is independently disposed between the photographing photosensitive layer 101 and the lens 104.
In some embodiments, referring to fig. 5A, when the photographing photosensitive layer 101 and the depth-of-field measuring photosensitive layer 102 are in an uncovered structure, the sensor further includes: the fourth filter 1032 is disposed between the depth measurement sensitive layer 102 and the lens 104, and configured to filter the light to be transmitted to the depth measurement sensitive layer 102, so as to obtain the depth measurement light required by the depth measurement sensitive layer 102.
In some embodiments, the fourth filter 1032 is disposed between the sensitive depth of field measurement layer 102 and the lens 104, the fourth filter 1032 may be disposed on the surface of the sensitive depth of field measurement layer 102 facing the lens 104, or the fourth filter 1032 is independently disposed between the sensitive depth of field measurement layer 102 and the lens 104.
In some embodiments, when the photosensitive layer 101 and the photosensitive depth measurement layer 102 are not covered, the filter may filter light to be transmitted to the photosensitive layer 101 or the photosensitive depth measurement layer 102. Alternatively, the optical filter may filter both the light transmitted to the photographing photosensitive layer 101 and the light transmitted to the depth measurement photosensitive layer 102.
In some embodiments, referring to fig. 5B, the third optical filter 1031 and the fourth optical filter 1032 are integrally disposed as the first patterned optical filter 1033, or the third optical filter 1031 is disposed separately from the fourth optical filter 1032.
In some embodiments, the third optical filter 1031 and the fourth optical filter 1032 can be integrally formed as a patterned optical filter, with regions of two different functions produced by the same semiconductor process using different design structures. When the third optical filter 1031 and the fourth optical filter 1032 are integrally provided as the first patterned optical filter 1033, the light transmitted to the photographing photosensitive layer 101 and the light transmitted to the depth-of-field measurement photosensitive layer 102 can both be filtered at the same time. When the third optical filter 1031 is provided separately from the fourth optical filter 1032, the third optical filter 1031 may be selected to filter light to be transmitted to the photographing photosensitive layer 101, or the fourth optical filter 1032 may be selected to filter light to be transmitted to the depth-of-field measurement photosensitive layer 102.
In some embodiments, the first patterned filter 1033 includes different regions, and the capture light and the depth of field measurement light are transmitted through the different regions of the first patterned filter 1033, respectively.
In some embodiments, referring to fig. 5B, when the photosensitive layer 101 and the depth-of-field measurement photosensitive layer 102 are in an uncovered structure, the first patterned filter 1033 is integrally disposed by the third filter 1031 and the fourth filter 1032, and the third filter 1031 and the fourth filter 1032 respectively occupy two different regions. The first patterned filter 1033 filters both the light transmitted to the photosensitive layer 101 and the light transmitted to the photosensitive layer 102, and the captured light and the measured light pass through different regions of the first patterned filter 1033, respectively, so as to obtain the captured light required for capturing the photosensitive layer 101 and the measured light required for measuring the depth of field of the photosensitive layer 102.
In some embodiments, referring to fig. 6A, 6B, the photographic photosensitive layer 101 includes at least two photosensitive layer structures.
In some embodiments, the photographing photosensitive layer 101 may include at least two photosensitive layer structures, for example: a first photosensitive layer 1011 that receives light of a first wavelength band, a second photosensitive layer 1012 that receives light of a second wavelength band, and a third photosensitive layer 1013 that receives light of a third wavelength band. Possible structures include red-, orange-, yellow-, green-, blue-, indigo-, and violet-sensitive layer structures; for example, the red-sensitive layer structure receives red light and the blue-sensitive layer structure receives blue light.
In some embodiments, the filter set 107 is configured to filter the photographing light to be transmitted to the at least two photosensitive layers, respectively, to obtain the photographing light required by the at least two photosensitive layers. For example: the filter set 107 filters red light to be transmitted to the red light sensitive layer and blue light to be transmitted to the blue light sensitive layer, respectively, to obtain red light and blue light required by the red light sensitive layer and the blue light sensitive layer.
In some embodiments, the at least two photosensitive layer structures of the photographing photosensitive layer 101 form a full-coverage, partial-coverage, or no-coverage structure.
In some embodiments, referring to fig. 6A, 6B, the photosensitive layer positional relationship in the photographing photosensitive layer 101 may be a full-coverage relationship, a partial-coverage relationship, or a non-coverage relationship.
In some embodiments, referring to fig. 7, when the at least two photosensitive layer structures are uncovered structures, the filter set 107 is a second patterned filter.
In some embodiments, when the photosensitive layer structures in the photographing photosensitive layer 101 are independent, non-covering structures, the filter set 107 may be an integrally disposed patterned filter with different regions, or the filter set 107 may be separate, individually disposed filters.
In some embodiments, the second patterned filter includes different regions, and the different bands of captured light respectively pass through the different regions of the second patterned filter.
In some embodiments, referring to fig. 7, the filter set 107 is a second patterned filter divided into region 1071, region 1072, and region 1073. The second patterned filter can simultaneously filter the photographing light of different wavelength bands transmitted to the photographing photosensitive layer 101, with each band passing through its own region: region 1071 corresponds to the first photosensitive layer 1011 and passes the first-band light, yielding the first-band photographing light required by the first photosensitive layer 1011; region 1072 corresponds to the second photosensitive layer 1012 and passes the second-band light, yielding the second-band photographing light required by the second photosensitive layer 1012; region 1073 corresponds to the third photosensitive layer 1013 and passes the third-band light, yielding the third-band photographing light required by the third photosensitive layer 1013.
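The region-to-band correspondence of the second patterned filter can be sketched as a lookup. The band assignments below (red/green/blue with assumed nanometre edges) are purely illustrative; the patent only states that different regions pass different bands.

```python
# Hypothetical sketch of the second patterned filter (filter set 107):
# each region passes one band of photographing light to its matching
# photosensitive layer. Region/band assignments are illustrative.

REGION_PASS_BANDS = {
    "region_1071": ("first_band", 600, 700),   # e.g. red, for layer 1011
    "region_1072": ("second_band", 500, 600),  # e.g. green, for layer 1012
    "region_1073": ("third_band", 400, 500),   # e.g. blue, for layer 1013
}

def region_for(wavelength_nm):
    """Return the patterned-filter region that transmits this wavelength,
    or None if every region blocks it."""
    for region, (_band, lo, hi) in REGION_PASS_BANDS.items():
        if lo <= wavelength_nm < hi:
            return region
    return None
```

In a real device the "regions" are spatial areas of one filter die rather than dictionary entries, but the one-band-per-region mapping is the same idea.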
In some embodiments, referring to fig. 8, when the at least two photosensitive layer structures are a full-coverage structure or a partial-coverage structure, the filter set 107 includes: and the optical filter is arranged between the at least two photosensitive layers.
In some embodiments, when at least two photosensitive layer structures in the photographing photosensitive layer 101 form a full-coverage or partial-coverage structure, the filter set 107 may consist of separate filters disposed between the photosensitive layers. For example: a filter 1074 is provided between the first photosensitive layer 1011 and the second photosensitive layer 1012, and a filter 1075 is provided between the second photosensitive layer 1012 and the third photosensitive layer 1013; each selectively filters the photographing light of the band to be transmitted to the layer behind it. For example: the photographing photosensitive layer 101 includes a red-sensitive layer and a blue-sensitive layer, and when these form a full-coverage or partial-coverage structure, the filter 1074 is disposed between them. When the red-sensitive layer fully or partially covers the blue-sensitive layer, the filter 1074 filters the light transmitted to the blue-sensitive layer to obtain the blue photographing light; when the blue-sensitive layer fully or partially covers the red-sensitive layer, the filter filters the light transmitted to the red-sensitive layer to obtain the red photographing light.
In some embodiments, referring to fig. 6A, 6B, the photographing photosensitive layer 101 includes: a first photosensitive layer 1011 configured to receive light of a first wavelength band and convert a light signal corresponding to the light of the first wavelength band into first photographed image information; a second photosensitive layer 1012 configured to receive light of a second wavelength band and convert a light signal corresponding to the light of the second wavelength band into second photographed image information; the third photosensitive layer 1013 is configured to receive the third wavelength band light and convert an optical signal corresponding to the third wavelength band light into third captured image information.
In some embodiments, the photographing photosensitive layer 101 includes a red photosensitive layer, a green photosensitive layer, and a blue photosensitive layer, i.e., the first photosensitive layer 1011 can be a red photosensitive layer, which receives red light of the first wavelength band and converts a light signal corresponding to the received red light into first photographing image information; the second photosensitive layer 1012 may be a green photosensitive layer, which receives green light of the second wavelength band and converts a light signal corresponding to the received green light into second photographed image information; the third photosensitive layer 1013 may be a blue photosensitive layer, which receives blue light of the third wavelength band and converts an optical signal corresponding to the received blue light into third captured image information. Subsequently, processing may be performed based on the first captured image information, the second captured image information, and the third captured image information corresponding to the three colors of red, green, and blue to generate color image information of the target object 20.
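A minimal sketch of that final merging step, assuming each layer delivers a single-channel image as nested lists of intensity values (the function name and data layout are illustrative, not from the patent):

```python
def merge_rgb(first_info, second_info, third_info):
    """Combine the three per-layer images (red, green, blue) into one
    color image, pixel by pixel.

    Each input is a list of rows of intensity values from one
    photosensitive layer; the output holds an (R, G, B) tuple per pixel.
    """
    assert len(first_info) == len(second_info) == len(third_info)
    color = []
    for r_row, g_row, b_row in zip(first_info, second_info, third_info):
        # Pair up the co-located samples from the three layers.
        color.append(list(zip(r_row, g_row, b_row)))
    return color
```

Any real implementation would also handle demosaicing-free alignment and bit depth, but the per-pixel combination shown here is the core of producing the color image information.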
In some embodiments, the depth of field measurement light is light of a particular wavelength range.
In some embodiments, the depth of field measuring light may be infrared (far infrared, mid infrared, or near infrared) or ultraviolet in a specific wavelength range.
In some embodiments, the sensor 100 further comprises: an emitter 105 configured to emit the depth-of-field measurement light toward the target object 20; and a calculation unit 106 configured to derive a depth-of-field value of the target object 20 from the round-trip time of the depth-of-field measurement light received by the depth-of-field measurement photosensitive layer 102.
In some embodiments, the sensor 100 further includes an emitter 105. The emitter 105 emits the depth-of-field measurement light, which is reflected by the target object 20 and received by the depth-of-field measurement photosensitive layer 102; the calculation unit 106 then obtains the depth-of-field value of the target object 20 by determining the round-trip time of the received measurement light. Optionally, the emitter 105 and the calculation unit 106 may be integrally provided, and the emitter 105 may be arranged around the sensor 100 or the lens 104.
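The calculation performed by the calculation unit 106 reduces to the standard time-of-flight relation d = c·t/2: the measurement light covers the sensor-to-target distance twice. The sketch below is an illustrative implementation of that relation, not code from the patent:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s, exact by definition

def depth_from_round_trip(round_trip_s: float) -> float:
    """Depth-of-field value (m) from the round-trip time of the
    measurement light.

    The emitted light travels to the target and back, so the one-way
    distance is half the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_s / 2.0
```

For scale, a round trip of roughly 6.67 ns corresponds to a target about 1 m from the sensor, which is why practical ToF hardware must resolve picosecond-scale timing.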
According to the embodiment of the disclosure, a single sensor can simultaneously realize a shooting function and a depth of field measurement function, and a single shooting lens can simultaneously acquire shot image information and depth of field image information.
The embodiment of the present disclosure provides a shooting module including the sensor 100 described above.
Referring to fig. 9, the image acquisition method may be performed by the sensor 100 and includes the following steps:
S901, receiving the photographing light, and generating captured image information of the target object 20 based on the received photographing light.
S902, receiving the depth-of-field measurement light, and generating depth-of-field image information of the target object 20 based on the received measurement light; wherein the photographing light and the depth-of-field measurement light come from the same lens.
In some embodiments, the photographing light includes first, second, and third wavelength-band light, and step S901 may further include: receiving the first wavelength-band light and converting the corresponding optical signal into first captured image information; receiving the second wavelength-band light and converting the corresponding optical signal into second captured image information; and receiving the third wavelength-band light and converting the corresponding optical signal into third captured image information.
In some embodiments, the image acquisition method further comprises: emitting the depth-of-field measurement light to the target 20; the depth of field value of the target object 20 is obtained based on the round-trip time of the received depth of field measurement light.
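The two-step method, with S901 expanded into the three band conversions, can be sketched as follows; the function name and the dictionary-based image representation are assumptions for illustration only:

```python
def acquire_images(band1_signal, band2_signal, band3_signal, depth_signal):
    """Sketch of the image acquisition flow.

    S901: convert the first/second/third wavelength-band optical
    signals into the three components of the captured image
    information.
    S902: convert the depth-measurement signal into depth-of-field
    image information. Both light paths arrive through the same lens.
    """
    captured_image = {
        "first": band1_signal,
        "second": band2_signal,
        "third": band3_signal,
    }
    depth_image = {"depth": depth_signal}
    return captured_image, depth_image
```

The point of the single-lens arrangement is that the two returned images are inherently co-registered: no extrinsic calibration between a color camera and a separate depth camera is needed.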
The above description and drawings sufficiently illustrate embodiments of the disclosure to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. The examples merely typify possible variations. Individual components and functions are optional unless explicitly required, and the sequence of operations may vary. Portions and features of some embodiments may be included in or substituted for those of others. The scope of the disclosed embodiments includes the full ambit of the claims, as well as all available equivalents of the claims. Although the terms "first," "second," etc. may be used in this application to describe various elements, these elements should not be limited by these terms, which are only used to distinguish one element from another. For example, a first element could be termed a second element, and, similarly, a second element could be termed a first element, without changing the meaning of the description, so long as all occurrences of the "first element" are renamed consistently and all occurrences of the "second element" are renamed consistently. The first and second elements are both elements, but may not be the same element. Furthermore, the words used in the specification are words of description only and are not intended to limit the claims. As used in the description of the embodiments and the claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. Similarly, the term "and/or" as used in this application is meant to encompass any and all possible combinations of one or more of the associated listed items.
Furthermore, the terms "comprises" and/or "comprising," when used in this application, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. Without further limitation, an element defined by the phrase "comprising an …" does not exclude the presence of other like elements in a process, method, or apparatus that comprises the element. In this document, each embodiment may be described with emphasis on its differences from other embodiments, and the same or similar parts of the respective embodiments may be referred to one another. For the methods, products, etc. disclosed in the embodiments, where they correspond to a method section disclosed herein, reference may be made to the description of that method section.
Those of skill in the art would appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software may depend upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the disclosed embodiments. It is clear to those skilled in the art that, for convenience and brevity of description, the working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments disclosed herein, the disclosed methods, products (including but not limited to devices, apparatuses, etc.) may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, a division of a unit may be merely a division of a logical function, and an actual implementation may have another division, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form. Units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to implement the present embodiment. In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
In the drawings, the width, length, thickness, etc. of structures such as elements or layers may be exaggerated for clarity and descriptive purposes. When an element or layer is referred to as being "disposed on" (or "mounted on," "laid on," "attached to," "coated on," or the like) another element or layer, the element or layer may be directly "disposed on" or "over" the other element or layer, or intervening elements or layers may be present, or even partially embedded in the other element or layer.

Claims (19)

1. A sensor, wherein the sensor corresponds to a lens, the sensor comprising: a photographing photosensitive layer and a depth-of-field measurement photosensitive layer; wherein:
the photographic photosensitive layer is configured to receive photographic light and generate photographic image information of a target object based on the received photographic light;
the depth measurement photosensitive layer is configured to receive depth measurement light and generate depth image information of the target object based on the received depth measurement light.
2. The sensor of claim 1, wherein the photographing photosensitive layer and the depth-of-field measurement photosensitive layer form a full-coverage structure, a partial-coverage structure, or a non-coverage structure.
3. The sensor of claim 2, wherein, when the photographing photosensitive layer fully or partially covers the depth-of-field measurement photosensitive layer, the sensor further comprises:
a first optical filter disposed between the photographing photosensitive layer and the depth measurement photosensitive layer, configured to filter light to be transmitted to the depth measurement photosensitive layer to obtain the depth measurement light required by the depth measurement photosensitive layer.
4. The sensor of claim 2, wherein, when the depth-of-field measurement photosensitive layer fully or partially covers the photographing photosensitive layer, the sensor further comprises:
a second optical filter disposed between the photographing photosensitive layer and the depth measurement photosensitive layer, configured to filter light to be transmitted to the photographing photosensitive layer to obtain the photographing light required by the photographing photosensitive layer.
5. The sensor according to claim 2, wherein, when the photographing photosensitive layer and the depth-of-field measurement photosensitive layer form a non-coverage structure, the sensor further comprises:
a third optical filter disposed between the photographing photosensitive layer and the lens, and configured to filter light to be transmitted to the photographing photosensitive layer, so as to obtain the photographing light required by the photographing photosensitive layer.
6. The sensor of claim 5, wherein, when the photographing photosensitive layer and the depth-of-field measurement photosensitive layer form a non-coverage structure, the sensor further comprises:
a fourth optical filter disposed between the depth-of-field measurement photosensitive layer and the lens, and configured to filter light to be transmitted to the depth-of-field measurement photosensitive layer, so as to obtain the depth-of-field measurement light required by the depth-of-field measurement photosensitive layer.
7. The sensor of claim 6, wherein the third filter and the fourth filter are integrally disposed as a first patterned filter, or the third filter and the fourth filter are separately disposed.
8. The sensor of claim 7,
the first patterned optical filter comprises different areas, and the shooting light and the depth of field measuring light respectively penetrate through the different areas of the first patterned optical filter.
9. The sensor according to any one of claims 1 to 6,
the photographing photosensitive layer comprises at least two photosensitive layer structures.
10. The sensor of claim 9, further comprising:
a filter set configured to filter the photographing light to be transmitted to the at least two photosensitive layers, respectively, to obtain the photographing light required by the at least two photosensitive layers.
11. The sensor of claim 10,
the at least two photosensitive layers in the photographing photosensitive layer form a full-coverage structure, a partial-coverage structure, or a non-coverage structure.
12. The sensor of claim 11,
when the at least two photosensitive layer structures form a non-coverage structure, the filter set is a second patterned optical filter.
13. The sensor of claim 12,
the second patterned optical filter comprises different areas, and the shooting light with different wave bands respectively penetrates through the different areas of the second patterned optical filter.
14. The sensor of claim 11,
when the at least two photosensitive layer structures form a full-coverage structure or a partial-coverage structure, the filter set comprises:
an optical filter disposed between the at least two photosensitive layers.
15. The sensor of claim 9, wherein the photographing photosensitive layer comprises:
a first photosensitive layer configured to receive first wavelength-band light and convert an optical signal corresponding to the first wavelength-band light into first captured image information;
a second photosensitive layer configured to receive second wavelength-band light and convert an optical signal corresponding to the second wavelength-band light into second captured image information; and
a third photosensitive layer configured to receive third wavelength-band light and convert an optical signal corresponding to the third wavelength-band light into third captured image information.
16. A sensor according to any one of claims 1 to 6, wherein the depth of field measurement light is light of a particular wavelength range.
17. The sensor of claim 1, further comprising:
a transmitter configured to transmit the depth measurement light to the target object;
a calculation unit configured to obtain a depth-of-field value of the target object from the round-trip time of the depth-of-field measurement light received by the depth-of-field measurement photosensitive layer.
18. A camera module, characterized in that it comprises a sensor according to any one of claims 1 to 17.
19. An image acquisition method, comprising:
receiving photographing light, and generating photographing image information of a target object based on the received photographing light;
receiving depth-of-field measuring light, and generating depth-of-field image information of the target object based on the received depth-of-field measuring light;
wherein the shooting light and the depth of field measuring light come from the same lens.
CN202110337177.0A 2021-03-30 2021-03-30 Sensor, shooting module and image acquisition method Pending CN112738386A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110337177.0A CN112738386A (en) 2021-03-30 2021-03-30 Sensor, shooting module and image acquisition method


Publications (1)

Publication Number Publication Date
CN112738386A true CN112738386A (en) 2021-04-30

Family

ID=75597075




Legal Events

PB01 — Publication
SE01 — Entry into force of request for substantive examination
RJ01 — Rejection of invention patent application after publication (application publication date: 20210430)