WO2020181869A1 - Imaging device and electronic equipment - Google Patents

Imaging device and electronic equipment

Info

Publication number
WO2020181869A1
Authority
WO
WIPO (PCT)
Prior art keywords
lens
array
depth
sub
range
Application number
PCT/CN2019/125631
Other languages
English (en)
French (fr)
Inventor
王雷
张�林
丁小梁
王鹏鹏
王海生
Original Assignee
京东方科技集团股份有限公司
Application filed by 京东方科技集团股份有限公司 (BOE Technology Group Co., Ltd.)
Priority to US16/767,327 (published as US20210203821A1)
Publication of WO2020181869A1

Classifications

    • H04M1/0264 — Details of the structure or mounting of a camera module assembly in portable telephone sets
    • H04M1/026 — Details of the structure or mounting of specific components in portable telephone sets
    • H04N23/957 — Light-field or plenoptic cameras or camera modules
    • G02B3/0043 — Lens arrays: inhomogeneous or irregular arrays, e.g. varying shape, size, height
    • G02B3/0056 — Lens arrays arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
    • G02B5/201 — Filters in the form of arrays
    • G02F1/294 — Variable focal length devices for the control of the position or direction of light beams
    • G03B13/32 — Means for focusing
    • G03B3/00 — Focusing arrangements of general interest for cameras, projectors or printers
    • G03B30/00 — Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
    • H04N23/45 — Generating image signals from two or more image sensors of different type or operating in different modes
    • H04N23/54 — Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 — Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/57 — Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
    • H04N23/676 — Bracketing for image capture at varying focusing conditions
    • H04N23/80 — Camera processing pipelines; components thereof
    • H04N23/959 — Computational photography systems for extended depth-of-field imaging by adjusting depth of field during image capture
    • H04N25/13 — Arrangement of colour filter arrays characterised by the spectral characteristics of the filter elements

Definitions

  • Embodiments of the present disclosure relate to an imaging device and an electronic device including the imaging device.
  • The light field is a complete representation of the set of light rays in a space; capturing and displaying the light field can visually reproduce the real world. Compared with ordinary cameras, light field cameras can acquire more complete light field information.
  • The imaging part of a light field camera generally consists of a microlens array and an image sensor.
  • The microlens array divides the beam emitted from a single object point into discrete thin beams at different angles, and each microlens in the array focuses the thin beam passing through it onto the corresponding part of the image sensor.
  • Each part of the image sensor receives the thin beam passing through its corresponding microlens and records a complete image with a specific orientation, thereby capturing more complete light field information.
  • A light field camera can realize the function of "photograph first, focus later". For example, for a light field camera with m lens units (m is usually greater than 2, for example several tens), m original images can be obtained in one frame time. The m original images can then be fused by a fusion refocusing algorithm to obtain the desired final refocused image. Depending on the algorithm, the focal plane of the resulting refocused image can differ.
  • At least one embodiment of the present disclosure provides an imaging device including a lens array and an image sensor. The lens array includes a plurality of lens units and is configured to provide at least a first lens sub-array and a second lens sub-array. The first lens sub-array includes a first lens unit of the plurality of lens units, the first lens unit and the image sensor being configured to have a first depth-of-field range; the second lens sub-array includes a second lens unit of the plurality of lens units, the second lens unit and the image sensor being configured to have a second depth-of-field range. The first depth-of-field range and the second depth-of-field range partially overlap, and the common range of the first and second depth-of-field ranges is larger than either the first depth-of-field range or the second depth-of-field range.
  • For example, there is a first distance between the first lens unit of the first lens sub-array and the image sensor, and a second distance between the second lens unit of the second lens sub-array and the image sensor.
  • For example, the first lens unit of the first lens sub-array has a first focal length, and the second lens unit of the second lens sub-array has a second focal length.
  • For example, both configurations may be combined: the first and second lens units are at a first distance and a second distance from the image sensor, respectively, and have a first focal length and a second focal length, respectively.
  • For example, the lens units are convex lenses.
  • For example, the first lens sub-array and the second lens sub-array are regularly arranged.
  • For example, the plurality of lens units of the lens array are configured such that the focal length of each lens unit is adjustable, or the distance between each lens unit and the image sensor is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
  • For example, the lens units are a plurality of liquid crystal lens units configured such that the focal length of each liquid crystal lens unit is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
  • For example, the imaging device further includes a color filter array including a plurality of filters, each of which corresponds to one lens unit of the plurality of lens units and filters the light transmitted by that lens unit.
  • For example, the back depth point of the first depth-of-field range is at infinity, and the back depth point of the second depth-of-field range is not at infinity.
  • For example, the overlap of the first depth-of-field range and the second depth-of-field range is 0-100 mm.
  • For example, the image sensor includes a plurality of sub-image sensors corresponding one-to-one to the plurality of lens units, so that each sub-image sensor is configured to receive the incident light from one lens unit so as to form an image.
  • For example, the lens array is configured to further provide a third lens sub-array including a third lens unit of the plurality of lens units; the third lens unit and the image sensor are configured to have a third depth-of-field range, which at least partially overlaps the first depth-of-field range or the second depth-of-field range, and the common range of the first, second, and third depth-of-field ranges is larger than the range of any one or any two of them.
  • At least one embodiment of the present disclosure also provides an electronic device, which includes any imaging device as described above.
  • For example, the electronic device includes a housing and a display panel, the lens array is disposed in the housing opposite the display panel with a part of the lens array exposed outside the housing, and the image sensor is arranged inside the housing.
  • FIG. 1 shows a schematic diagram of the principle of an imaging device;
  • FIG. 2A shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
  • FIG. 2B shows a schematic side view of an electronic device including the imaging device shown in FIG. 2A;
  • FIG. 2C shows a schematic diagram of a lens array, a filter array, and an image sensor of an imaging device according to another embodiment of the present disclosure;
  • FIG. 3A and FIG. 3B show the image sensor in FIG. 2A;
  • FIG. 4A, FIG. 4B, FIG. 4C, and FIG. 4D show exemplary arrangements of a lens array according to another embodiment of the present disclosure;
  • FIG. 5 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
  • FIG. 6 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
  • FIG. 7 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
  • FIG. 8 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
  • FIG. 9 shows a perspective schematic view of an electronic device including an imaging device according to an embodiment of the present disclosure;
  • FIG. 10A shows a perspective schematic view of an electronic device including an imaging device according to another embodiment of the present disclosure;
  • FIG. 10B shows a schematic diagram of the lens array of the imaging device in FIG. 10A.
  • FIG. 1 shows an imaging device 10, such as a light field camera, which generally includes a lens array 11 and an image sensor 12.
  • The lens array 11 includes a plurality of lens units, and the image sensor 12 is arranged corresponding to the lens units.
  • In operation, rays from the target imaging object (a tree in the figure) pass through each lens unit and are respectively focused onto the corresponding different parts of the image sensor 12. Different parts of the image sensor 12 can therefore each record a complete image of the target imaging object from a different orientation, and each pixel of those parts records the spatial position of an object point (for example, object points A, B, and C on the target imaging object in FIG. 1).
  • In an imaging device having only a single lens array, each lens unit in the array and the image sensor are configured to have a single depth-of-field range.
  • For any lens unit, the depth-of-field range obtained by combining it with the image sensor is limited.
  • For example, a lens unit in a single lens array and the image sensor may have a depth of field from 20 cm to infinity when the focus point is on the focal plane, that is, the front depth point is 20 cm from the lens and the back depth point is at infinity. If the focus point is moved off the focal plane toward the lens unit by a certain distance, then as the front depth point moves forward, the back depth point also moves forward, and infinity can no longer be imaged clearly. Limited by means, space, cost, and so on, it is therefore difficult to increase the depth-of-field range of an imaging device with a single lens array.
  • Some embodiments of the present disclosure provide an imaging device including a lens array and an image sensor.
  • To extend the depth-of-field range, the lens array is configured to provide a first lens sub-array and a second lens sub-array in the spatial domain or in the time domain; the lens units of the first lens sub-array and the image sensor are configured to have a first depth-of-field range, the lens units of the second lens sub-array and the image sensor are configured to have a second depth-of-field range, and the common range of the first and second depth-of-field ranges is larger than the first range and also larger than the second range.
  • By providing the first and second lens sub-arrays, the imaging device can thus extend its depth-of-field range, so that it can refocus on any object plane within the extended range.
  • In some embodiments, the first depth-of-field range and the second depth-of-field range may not overlap at all.
  • In other embodiments, the first and second depth-of-field ranges may partially overlap.
  • For example, in the direction toward the image sensor, the front depth point of the first depth-of-field range lies before the back depth point of the second depth-of-field range; further, for example, the back depth point of the first depth-of-field range is at infinity. The common range of the two depth-of-field ranges is then continuous, so that the imaging device can refocus on any object plane within this continuous range.
  • For example, the overlap of the first and second depth-of-field ranges may be 0-100 mm, which extends the depth-of-field range as much as possible while keeping the extension efficient.
  • When the overlap equals 0 mm, the end points of the two depth-of-field ranges coincide.
  • In addition, the back depth point of the first depth-of-field range may be set at infinity while that of the second is not, which likewise extends the depth-of-field range as much as possible.
  • Imaging apparatuses according to various embodiments of the present disclosure are described in more detail below.
  • FIG. 2A shows a schematic diagram of an imaging device 100 according to an embodiment of the present disclosure.
  • For example, the imaging device 100 may be implemented as a light field camera.
  • The imaging device 100 includes a lens array 110 and an image sensor 120.
  • The lens array 110 includes a plurality of lens units 111 and 112 arranged side by side, and the image sensor 120 includes a plurality of sub-image sensors 121 and 122 arranged side by side.
  • In this embodiment, the lens array 110 includes a first lens sub-array and a second lens sub-array.
  • The first lens sub-array includes at least one first lens unit 111, for example a plurality of first lens units 111 (only one is shown in the figure), and the second lens sub-array includes at least one second lens unit 112, for example a plurality of second lens units 112 (only one is shown in the figure).
  • In this embodiment, the first lens unit 111 and the second lens unit 112 have the same specifications; for example, their focal length, size, and structure are identical, and both are, for example, solid lens units made of glass or resin.
  • Although each lens unit is shown as a complete double-sided convex lens, it is understood that each lens unit may also be formed as a partial convex lens or a single-sided convex lens, which can likewise modulate the transmitted light; the same applies to the embodiments described below, so this is not repeated.
  • The image sensor 120 includes a first sub-image sensor array and a second sub-image sensor array; the first sub-image sensor array includes at least one first sub-image sensor 121, and the second sub-image sensor array includes at least one second sub-image sensor 122.
  • The first sub-image sensor 121 and the second sub-image sensor 122 may be implemented, for example, by complementary metal oxide semiconductor (CMOS) devices or charge-coupled devices (CCD).
  • For example, each of the first sub-image sensor 121 and the second sub-image sensor 122 may include a pixel array and a detection circuit.
  • Each sub-pixel of the pixel array may include a photoelectric sensing device and a pixel drive circuit, and the detection circuit may include a reading circuit, an amplifier circuit, an analog-to-digital conversion circuit, and so on, so that the photoelectric signals collected from the pixel array can be read (for example, row by row) and processed to obtain the data of one image.
  • FIG. 2B shows a schematic side view of an exemplary electronic device including the imaging device 100 shown in FIG. 2A, and FIG. 3A and FIG. 3B show the image sensor 120 in FIG. 2A.
  • The lens units 111 and 112 in the lens array 110 correspond one-to-one to the sub-image sensors 121 and 122 in the image sensor 120.
  • Light from the target imaging object passing through each lens unit 111, 112 is focused onto the corresponding sub-image sensor 121, 122 that overlaps that lens unit in the light incident direction.
  • These sub-image sensors 121, 122 each form a complete image of the target imaging object.
  • For example, light from the target imaging object passing through the first lens unit 111 is focused onto the first sub-image sensor 121 corresponding to the first lens unit 111, which forms a complete first image of the target imaging object;
  • light from the target imaging object passing through the second lens unit 112 is focused onto the second sub-image sensor 122 corresponding to the second lens unit 112, which forms a complete second image of the target imaging object.
  • Because the lens units 111, 112 and their one-to-one corresponding sub-image sensors 121, 122 are at different positions, the complete images of the target imaging object formed by the sub-image sensors 121, 122 have different orientations. That is, the imaging device of this embodiment can obtain, in one shot, multiple complete images of the target imaging object with different orientations.
  • The sub-image sensors 121, 122 transmit the first and second images with different orientations to an image processing device included in the electronic device (not shown in the figure; for example, an image processing chip), and after shooting is completed the image processing device can, for example, fuse these first and second images through a fusion refocusing algorithm to obtain the desired final refocused image.
  • There is a first distance d1 between the first lens unit 111 and the corresponding first sub-image sensor 121, giving a first depth-of-field range; there is a second distance d2, smaller than d1, between the second lens unit 112 and the corresponding second sub-image sensor 122, giving a second depth-of-field range different from the first.
  • The first distance d1 and the second distance d2 are configured such that the front depth point Q13 of the second lens unit 112, the front depth point Q11 of the first lens unit 111, the back depth point Q14 of the second lens unit 112, and the back depth point Q12 of the first lens unit 111 successively approach infinity.
  • The common range of the first and second depth-of-field ranges is therefore the continuous range between the front depth point Q13 of the second lens unit 112 and the back depth point Q12 of the first lens unit 111, which extends the depth-of-field range of the imaging device 100. That is, the first sub-image sensor 121 obtains a complete image of the target imaging object with the first depth-of-field range, the second sub-image sensor 122 obtains a complete image with the second depth-of-field range, and the multiple complete images with different orientations obtained by the sub-image sensors together cover a wider depth-of-field range; the image processing device can therefore fuse these original images through the fusion refocusing algorithm to obtain a desired final refocused image with a wider depth-of-field range.
  • The distance between a lens unit and the image sensor refers to the distance between the optical center of the lens unit and the imaging surface of the image sensor along the main optical axis of the lens.
  • In this embodiment, the first and second depth-of-field ranges are realized by different distances between the lens units 111, 112 and the sub-image sensors 121, 122; since the lens units have the same specifications, the manufacturing cost and processing difficulty of the lens array can be reduced.
  • The sub-image sensors 121 and 122 are independent of each other and may, for example, be driven independently.
  • As shown in FIG. 3A and FIG. 3B, the image sensor 120 includes a plurality of sub-image sensors 121, 122 and a substrate 40, and the sub-image sensors 121, 122 are disposed on the substrate 40.
  • For example, the sub-image sensors 121, 122 may be placed on the substrate 40 by transfer printing, thereby reducing manufacturing costs; as another example, they may be mounted on the substrate 40 by a surface-mount process (SMT).
  • For example, the substrate 40 may be a glass substrate, a plastic substrate, a ceramic substrate, or the like; using a glass substrate makes the image sensor compatible with the fabrication processes of common display devices (such as liquid crystal or organic light-emitting display devices), which can further reduce manufacturing costs.
  • For example, circuit structures such as wires and contact pads are provided on the substrate 40, and the sub-image sensors 121 and 122 are bonded to the contact pads in any appropriate way (for example, soldering or conductive adhesive) to realize the circuit connections.
  • In other embodiments of the present disclosure, the image sensor 120 may be constructed as an integrated image sensor, with the sub-image sensors 121 and 122 being different areas of the overall sensor; in this case, light from the target imaging object passing through the different lens units 111, 112 is focused onto the corresponding different areas of the image sensor 120.
  • Each lens sub-array of the lens array may be regularly arranged, i.e., the arrangement of each lens sub-array follows a regular, for example periodic, pattern.
  • For example, as shown in FIG. 4A, all lens units are arranged in rows and columns to form a matrix, in which the first lens units 111 and the second lens units 112 are arranged at intervals along the rows and columns.
  • For example, as shown in FIG. 4B, all lens units are arranged in rows and columns to form a matrix, in which the first lens units 111 and the second lens units 112 are spaced apart from each other in both the row and column directions.
  • Regularly arranged first and second lens sub-arrays help simplify the algorithm.
  • The lens sub-arrays of the lens array may instead be randomly arranged; as shown in FIG. 4C, the first lens units 111 and the second lens units 112 are randomly arranged in the row and column directions, and adjacent lens units are not aligned with each other.
  • The lens sub-arrays of the lens array may also be arranged to form a specific mark; in one example, as shown in FIG. 4D, the first lens sub-array is arranged as the letter B, the second lens sub-array as the letter O, and a third lens sub-array as the letter E. Arranging the sub-arrays as a specific logo helps improve aesthetics.
  • The imaging device 100 may further include a color filter array 130, which includes a plurality of color filters (for example, red, green, and blue color filters) that filter the light passing through the lens units 111 and 112 of the lens array 110, so that the sub-image sensor corresponding to each lens unit produces a monochrome image for subsequent image processing.
  • The color filter array 130 may be arranged as a Bayer array with respect to each lens sub-array.
  • The color filter array 130 may be disposed between the lens array 110 and the image sensor 120.
  • The color filters can be prepared from, for example, colored resin materials.
  • FIG. 5 shows a schematic diagram of an imaging device 200 according to another embodiment of the present disclosure.
  • The imaging device 200 includes a lens array 210 and an image sensor 220.
  • The lens array 210 includes a plurality of lens units 211 and 212 arranged side by side, and the image sensor 220 includes a plurality of sub-image sensors 221 and 222 arranged side by side.
  • In this embodiment, the lens array 210 includes a first lens sub-array and a second lens sub-array.
  • The first lens sub-array includes a plurality of first lens units 211 (only one is shown in the figure), and the second lens sub-array includes a plurality of second lens units 212 (only one is shown in the figure).
  • The distances from the first lens units 211 and from the second lens units 212 to their corresponding sub-image sensors 221 and 222 are the same.
  • In one example, the lens array 210 is an integrated lens array in which the lens units are connected to one another, for example formed from a single resin film by a molding process or from a single glass film by an etching process.
  • The image sensor 220 includes a first sub-image sensor array including a plurality of sub-image sensors 221 and a second sub-image sensor array including a plurality of sub-image sensors 222.
  • Like in the imaging device 100, each lens unit 211, 212 corresponds one-to-one to a sub-image sensor 221, 222 so as to form complete images of the target imaging object with different orientations.
  • The first lens units 211 each have a first focal length f1, so that each with its corresponding sub-image sensor 221 has a first depth-of-field range, and the second lens units 212 each have a second focal length f2 greater than f1, so that each with its corresponding sub-image sensor 222 has a second depth-of-field range different from the first; the depth-of-field range of the imaging device 200 can thereby be extended.
  • The first focal length f1 of the first lens unit 211 and the second focal length f2 of the second lens unit 212 are configured such that the front depth point Q23 of the second lens unit 212, the front depth point Q21 of the first lens unit 211, the back depth point Q24 of the second lens unit 212, and the back depth point Q22 of the first lens unit 211 successively approach infinity. The common range of the first and second depth-of-field ranges is therefore the continuous range between the front depth point Q23 of the second lens unit 212 and the back depth point Q22 of the first lens unit 211, which extends the depth-of-field range of the imaging device 200.
  • In one example, the imaging device 200 may further include a color filter array including a plurality of color filters (for example, red, green, and blue color filters) that filter the light passing through each lens unit in the lens array.
  • FIG. 6 shows a schematic diagram of an imaging device 300 according to another embodiment of the present disclosure.
  • The imaging device 300 includes a lens array 310 and an image sensor 320.
  • The lens array 310 includes a plurality of lens units 311, 312, and 313 arranged side by side, and the image sensor 320 includes a plurality of sub-image sensors 321, 322, and 323 arranged side by side.
  • In this embodiment, the lens array 310 includes a first lens sub-array, a second lens sub-array, and further a third lens sub-array.
  • Correspondingly, the image sensor 320 includes a first sub-image sensor array, a second sub-image sensor array, and a third sub-image sensor array, which include a plurality of first sub-image sensors 321, a plurality of second sub-image sensors 322, and a plurality of third sub-image sensors 323, respectively.
  • Like in the imaging devices described above, each lens unit 311, 312, 313 corresponds one-to-one to a sub-image sensor 321, 322, 323 so as to form complete images of the target imaging object with different orientations.
  • The first lens sub-array includes a plurality of first lens units 311 (only one is shown in the figure), each having a front depth point Q31 and a back depth point Q32, the back depth point Q32 being at infinity.
  • The second lens sub-array includes a plurality of second lens units 312 (only one is shown in the figure), each having a front depth point Q33 and a back depth point Q34, and the third lens sub-array includes a plurality of third lens units 313 (only one is shown in the figure), each having a front depth point Q35 and a back depth point Q36.
  • The first lens unit 311, the second lens unit 312, and the third lens unit 313, together with their corresponding sub-image sensors 321, 322, and 323, are configured to have a first, a second, and a third depth-of-field range, respectively, thereby enabling the depth-of-field range of the imaging device to be extended.
  • For example, the front depth point Q31 of the first lens units 311 in the first lens sub-array lies before the back depth point Q34 of the second lens units 312 in the second lens sub-array, and the front depth point Q33 of the second lens units 312 lies before the back depth point Q36 of the third lens units 313 in the third lens sub-array. The common range of the first, second, and third depth-of-field ranges is therefore larger than any one or any two of those ranges, which extends the overall depth-of-field range of the imaging device.
  • For example, to obtain the different depth-of-field ranges, the first lens unit 311, the second lens unit 312, and the third lens unit 313 may have different focal lengths; additionally or alternatively, they may be at different distances from their corresponding sub-image sensors 321, 322, and 323.
  • In one example, the imaging device 300 may further include a color filter array including a plurality of color filters (for example, red, green, and blue color filters) that filter the light passing through each lens unit in the lens array.
  • The embodiments of the present disclosure do not limit the number of lens sub-arrays that, with the image sensor, have different depth-of-field ranges; there may be 2, 3, or more than 3.
  • For example, lens sub-arrays can be added as needed to further extend the depth-of-field range of the imaging device.
  • For example, the depth-of-field ranges of the multiple lens sub-arrays overlap pairwise. To minimize the number of different lens sub-arrays and increase the coverage efficiency of each sub-array for a given overall depth-of-field range, the overlap should not be too large.
  • In summary, the lens array of the imaging device includes a first lens sub-array and a second lens sub-array, the first lens sub-array includes a plurality of first lens units, and the second lens sub-array includes a plurality of second lens units.
  • There is a first distance between the first lens units and the image sensor and a second distance between the second lens units and the image sensor; the first lens units have a first focal length and the second lens units have a second focal length.
  • The first distance differs from the second distance, and/or the first focal length differs from the second focal length, so that the common range of the first and second depth-of-field ranges is larger than either range alone, thereby extending the depth-of-field range of the imaging device.
  • When the first lens sub-array and the second lens sub-array are provided in the spatial domain, they can be used for imaging at the same time; compared with forming the sub-arrays in different time periods, no additional imaging time is required.
  • In that case the lens units in the lens array have fixed focal lengths and fixed positions, so no special mechanism is needed to adjust the distance between the lens units and the image sensor, or to make the lens units' focal lengths adjustable, in order to obtain an extended depth of field; such lens units and imaging devices have a low cost.
  • Alternatively, the imaging device may provide the first lens sub-array and the second lens sub-array to the image sensor in the time domain; that is, the imaging device can adjust the lens units in the lens array so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
  • Forming the first and second lens sub-arrays separately in different time periods is beneficial for saving space.
  • FIG. 7 shows a schematic diagram of an imaging device 400 according to another embodiment of the present disclosure.
  • The imaging device 400 includes a lens array 410, 410' and an image sensor 420.
  • The lens array 410, 410' includes a plurality of liquid crystal lens units 411, 411', and the image sensor 420 includes a plurality of sub-image sensors 421.
  • The liquid crystal lens units 411, 411' correspond one-to-one to the sub-image sensors 421, and the focal lengths of the liquid crystal lens units 411, 411' are configured to be adjustable.
  • For example, each of the liquid crystal lens units 411, 411' includes a liquid crystal layer sandwiched between two transparent substrates and a plurality of liquid crystal control electrodes; the control electrodes may all be located on the same side of the liquid crystal layer (on one substrate), or partly on one side and partly on the other (on the two substrates). When a predetermined voltage combination (driving voltage) is applied to these electrodes, the resulting electric field distribution drives the deflection of the liquid crystal molecules in the liquid crystal layer and changes the refractive index of the layer at different positions, so that the liquid crystal layer acts as a lens and modulates the light passing through it.
  • Although the liquid crystal lens is drawn as a double-sided convex lens in FIG. 7 by way of example, it should be understood that the liquid crystal lens unit itself does not have the shape of a double-sided convex lens.
  • The focal lengths of the liquid crystal lens units 411 and 411' are adjusted by applying different driving voltages to them.
  • In this way, the lens array forms a first lens sub-array (shown by the solid lines in FIG. 7) and a second lens sub-array (shown by the dotted lines in FIG. 7) in different time periods.
  • For example, in a first time period, the first lens units 411 of the first lens sub-array may have a first focal length and correspond one-to-one to the sub-image sensors 421 of the image sensor 420 so as to have a first depth-of-field range, the front depth point of the first lens unit 411 being Q41 and the back depth point being Q42.
  • In a second time period, the second lens units 411' of the second lens sub-array may have a second focal length different from the first and likewise correspond one-to-one to the sub-image sensors 421 of the image sensor 420 so as to have a second depth-of-field range, the front depth point of the second lens unit 411' being Q43 and the back depth point being Q44.
  • For example, the focal lengths are configured such that the front depth point Q43 of the second lens unit 411', the front depth point Q41 of the first lens unit 411, the back depth point Q44 of the second lens unit 411', and the back depth point Q42 of the first lens unit 411 successively approach infinity, so that the overall depth-of-field range of the imaging device 400 is extended.
  • In imaging, the target imaging object can be imaged once at a first time point to obtain multiple complete images with different orientations, and then imaged again at a second time point to obtain another set of complete images with different orientations. Since the lens units 411, 411' and their corresponding sub-image sensors 421 have different depth-of-field ranges at the first and second time points, the two sets of images have different depth-of-field ranges. The image processing device can therefore fuse the original images of the two exposures, which together cover a wider depth of field, through the fusion refocusing algorithm to obtain a desired final refocused image with a wider depth-of-field range (see the sketch following this list).
  • In one example, the imaging device 400 may further include a color filter array including a plurality of color filters (for example, red, green, and blue color filters) that filter the light passing through each lens unit in the lens array.
  • The color filter array may be arranged between the lens array and the image sensor.
  • FIG. 8 shows a schematic diagram of an imaging device 500 according to another embodiment of the present disclosure.
  • The imaging device 500 includes a lens array 510, 510' and an image sensor 520.
  • The lens array 510, 510' includes a plurality of identical lens units 511, 511'.
  • The image sensor 520 includes a plurality of sub-image sensors 521.
  • The lens units 511, 511' correspond one-to-one to the sub-image sensors 521.
  • The imaging device 500 further includes a position adjustment mechanism 540 for adjusting the distance between each lens unit 511, 511' and the corresponding sub-image sensor 521. The position adjustment mechanism 540 can be realized in any appropriate form, for example mechanically or magnetically; a mechanical mechanism may include, for example, a motor and a gear, or a piece of magnetically induced strain material. That is, the distance between the lens units 511, 511' and the sub-image sensors 521 is variable, so that the lens array 510, 510' can form a first lens sub-array (shown by the solid lines in FIG. 8) and a second lens sub-array (shown by the dotted lines in FIG. 8) in different time periods.
  • For example, in a first time period, the first lens units 511 of the lens array 510 may be at a first distance d3 from the corresponding sub-image sensors 521, thereby having a first depth-of-field range, the front depth point of the first lens unit 511 being Q51 and the back depth point being Q52.
  • In a second time period, the second lens units 511' of the lens array 510' may be at a second distance d4, different from d3, from the corresponding sub-image sensors 521, thereby having a second depth-of-field range, the front depth point of the second lens unit 511' being Q53 and the back depth point being Q54.
  • For example, the first distance d3 and the second distance d4 are set such that the front depth point Q53 of the second lens unit 511', the front depth point Q51 of the first lens unit 511, the back depth point Q54 of the second lens unit 511', and the back depth point Q52 of the first lens unit 511 successively approach infinity, so that the overall depth-of-field range of the imaging device 500 is extended.
  • In one example, the imaging device 500 may further include a color filter array including a plurality of color filters (for example, red, green, and blue color filters) that filter the light passing through each lens unit in the lens array.
  • Some embodiments of the present disclosure also provide an electronic device, which includes the imaging device of any embodiment of the present disclosure, for example, any imaging device described above.
  • For example, the electronic device may be a mobile phone, a tablet computer, a notebook computer, or the like; the embodiments of the present disclosure are not limited in this respect.
  • FIG. 9 shows a perspective schematic view of an exemplary electronic device according to an embodiment of the present disclosure, which includes any of the imaging devices 100, 200, 300, 400 described above.
  • The electronic device is, for example, a mobile phone, and the imaging device 100, 200, 300, 400 is, for example, arranged on the back of the mobile phone (that is, the side facing away from the user) as a rear camera.
  • For example, the imaging device 100, 200, 300, 400 may be made as large as possible so as to obtain complete images of the target imaging object from more, and more widely separated, orientations.
  • the electronic device includes a display panel (not shown), a housing 50, an imaging device, and an image processing device (not shown).
  • the display surface of the display panel faces the user, that is, on the front of the electronic device; the imaging device is arranged on the opposite side of the display surface.
  • the imaging device includes a lens array and an image sensor.
  • the lens array is arranged in the housing 50 of the electronic device, opposite to the display panel, and exposed to the outside of the housing 50 at the back of the electronic device, and the image sensor is arranged inside the housing 50 and adjacent to the lens array.
  • the user can image the target object through the imaging device, and then execute the fusion refocusing algorithm through the image processing device of the electronic device to form the final image on the display panel; the image processing device is, for example, an image processor or an image processing chip.
  • the lens unit may be embedded in the housing 50 through a base (not shown), and the above-mentioned first distance and second distance may be determined by determining the position of the base.
  • FIG. 10A shows a perspective schematic view of an exemplary electronic device according to another embodiment of the present disclosure, which includes the imaging device 500.
  • The electronic device is, for example, a mobile phone, and the imaging device 500 is, for example, disposed on the back of the mobile phone (that is, the side facing away from the user), although the present disclosure is not limited to this.
  • The electronic device includes a display panel (not shown), a housing 550, the imaging device 500, and an image processing device (not shown).
  • The display surface of the display panel faces the user, that is, it is on the front of the electronic device, and the imaging device 500 is arranged on the side opposite the display surface.
  • The imaging device 500 includes a lens array 510 and an image sensor 520.
  • FIG. 10B shows a schematic diagram of the lens array of the imaging device in FIG. 10A.
  • The lens array 510 may be a liquid crystal lens array panel having a laminated structure.
  • The liquid crystal lens array panel may be embedded in the housing 550 and arranged opposite the display panel.
  • The liquid crystal lens array panel includes a first substrate 501, a second substrate 502, a first electrode layer 503 disposed on the side of the first substrate 501 facing the second substrate 502, a second electrode layer 504 disposed on the side of the second substrate 502 facing the first substrate 501, and a liquid crystal layer 505 between the first electrode layer 503 and the second electrode layer 504.
  • the first electrode layer 503 and the second electrode layer 504 may be, for example, ITO transparent electrodes.
  • In use, the liquid crystal lens array panel may be divided into a plurality of liquid crystal lens units, for example into a plurality of first liquid crystal lens units 511 and a plurality of second liquid crystal lens units 512.
  • For example, the first electrode layer 503 may be an integral common electrode, while the second electrode layer 504 may include a plurality of second sub-electrodes 5041 that independently control each liquid crystal lens unit 511, 512. That is, each liquid crystal lens unit 511, 512 includes the common first substrate 501, the common first electrode layer 503, an independent second sub-electrode 5041, the common liquid crystal layer 505, and the common second substrate 502.
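As referenced in the time-domain embodiments above, the two-shot capture of FIGS. 7 and 8 can be summarized in a few lines. The sketch below is a minimal, assumption-laden illustration rather than the patent's method: `capture` is a toy stand-in for driving the liquid crystal lenses (or the position adjustment mechanism) into one of the two depth-of-field configurations and exposing, and the per-pixel sharpness rule is just one simple way to merge the two exposures.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

rng = np.random.default_rng(0)
scene = rng.random((64, 64))  # stand-in for one sub-image of the scene

def capture(config):
    """Toy exposure: blur stands in for the parts of the scene that fall
    outside the chosen configuration's depth-of-field range."""
    blur = 3.0 if config == "near" else 0.5
    return gaussian_filter(scene, blur) + rng.normal(0, 0.01, scene.shape)

shot_a = capture("near")  # first time period: one depth-of-field range
shot_b = capture("far")   # second time period: the other range
# Keep, per pixel, whichever exposure is locally sharper; a real pipeline
# would refocus each light-field exposure before merging.
sharper = np.abs(laplace(shot_a)) >= np.abs(laplace(shot_b))
merged = np.where(sharper, shot_a, shot_b)
```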

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Nonlinear Science (AREA)
  • Human Computer Interaction (AREA)
  • Cameras In General (AREA)
  • Studio Devices (AREA)

Abstract

An imaging device includes a lens array and an image sensor. The lens array includes a plurality of lens units and is configured to provide a first lens sub-array and a second lens sub-array: the first lens sub-array includes a first lens unit of the plurality of lens units, the first lens unit and the image sensor being configured to have a first depth-of-field range, and the second lens sub-array includes a second lens unit of the plurality of lens units, the second lens unit and the image sensor being configured to have a second depth-of-field range. The first depth-of-field range partially overlaps the second depth-of-field range, and the common range of the two is larger than either the first or the second depth-of-field range. The imaging device can thus have an extended depth-of-field range. An electronic device including the imaging device is also provided.

Description

Imaging device and electronic equipment
This application claims priority to Chinese patent application No. 201910180440.2, filed on March 11, 2019; the entire disclosure of that Chinese patent application is incorporated herein by reference as a part of this application.
Technical Field
Embodiments of the present disclosure relate to an imaging device and an electronic device including the imaging device.
Background
The light field is a complete representation of the set of light rays in a space; capturing and displaying the light field can visually reproduce the real world. Compared with ordinary cameras, light field cameras can acquire more complete light field information. The imaging part of a light field camera generally consists of a microlens array and an image sensor. The microlens array divides the beam emitted from a single object point into discrete thin beams at different angles, and each microlens in the array focuses the thin beam passing through it onto the corresponding part of the image sensor. Each part of the image sensor receives the thin beam passing through its corresponding microlens and records a complete image with a specific orientation, thereby capturing more complete light field information.
A light field camera can realize the function of "photograph first, focus later". For example, for a light field camera with m lens units (m is usually greater than 2, for example several tens), m original images can be obtained in one frame time. The m original images can then be fused by a fusion refocusing algorithm to obtain the desired final refocused image. Depending on the algorithm, the focal plane of the resulting refocused image can differ.
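To make the "photograph first, focus later" idea concrete, the following is a minimal shift-and-add refocusing sketch. It is illustrative only and is not the patent's algorithm: `lens_offsets` and `alpha` are hypothetical names, and the integer-pixel shift is a crude approximation of real sub-pixel resampling.

```python
import numpy as np

def refocus(sub_images, lens_offsets, alpha):
    """Shift-and-add refocusing over m sub-images.

    sub_images   -- list of m HxW arrays, one complete view per lens unit
    lens_offsets -- m pairs (dx, dy): each lens unit's offset from the
                    array center, expressed in pixels
    alpha        -- refocusing parameter; each value selects a different
                    virtual focal plane
    """
    acc = np.zeros_like(sub_images[0], dtype=float)
    for img, (dx, dy) in zip(sub_images, lens_offsets):
        # Off-center views see near objects displaced by parallax;
        # shifting each view by alpha * offset realigns one chosen depth.
        shift = (int(round(alpha * dy)), int(round(alpha * dx)))
        acc += np.roll(img, shift, axis=(0, 1))
    return acc / len(sub_images)
```

Sweeping `alpha` and re-running the fusion yields refocused images with different focal planes from the same m originals, which is the "photograph first, focus later" behavior described above.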
Summary
At least one embodiment of the present disclosure provides an imaging device including a lens array and an image sensor. The lens array includes a plurality of lens units and is configured to provide at least a first lens sub-array and a second lens sub-array. The first lens sub-array includes a first lens unit of the plurality of lens units, the first lens unit and the image sensor being configured to have a first depth-of-field range; the second lens sub-array includes a second lens unit of the plurality of lens units, the second lens unit and the image sensor being configured to have a second depth-of-field range. The first depth-of-field range and the second depth-of-field range partially overlap, and the common range of the first depth-of-field range and the second depth-of-field range is larger than either the first depth-of-field range or the second depth-of-field range.
For example, according to an embodiment of the present disclosure, there is a first distance between the first lens unit of the first lens sub-array and the image sensor, and a second distance between the second lens unit of the second lens sub-array and the image sensor.
For example, according to an embodiment of the present disclosure, the first lens unit of the first lens sub-array has a first focal length, and the second lens unit of the second lens sub-array has a second focal length.
For example, according to an embodiment of the present disclosure, there is a first distance between the first lens unit of the first lens sub-array and the image sensor and a second distance between the second lens unit of the second lens sub-array and the image sensor, and the first lens unit of the first lens sub-array has a first focal length while the second lens unit of the second lens sub-array has a second focal length.
For example, according to an embodiment of the present disclosure, the lens units are convex lenses.
For example, according to an embodiment of the present disclosure, the first lens sub-array and the second lens sub-array are regularly arranged.
For example, according to an embodiment of the present disclosure, the plurality of lens units of the lens array are configured such that the focal length of each lens unit is adjustable, or the distance between each lens unit and the image sensor is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
For example, according to an embodiment of the present disclosure, the lens units are a plurality of liquid crystal lens units configured such that the focal length of each liquid crystal lens unit is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
For example, according to an embodiment of the present disclosure, the imaging device further includes a color filter array, wherein the filter array includes a plurality of filters, each of which corresponds to one lens unit of the plurality of lens units and filters the light transmitted by that lens unit.
For example, according to an embodiment of the present disclosure, the back depth point of the first depth-of-field range is at infinity, and the back depth point of the second depth-of-field range is not at infinity.
For example, according to an embodiment of the present disclosure, the overlap of the first depth-of-field range and the second depth-of-field range is 0-100 mm.
For example, according to an embodiment of the present disclosure, the image sensor includes a plurality of sub-image sensors corresponding one-to-one to the plurality of lens units, so that each sub-image sensor is configured to receive the incident light from one lens unit so as to form an image.
For example, according to an embodiment of the present disclosure, the lens array is configured to further provide a third lens sub-array including a third lens unit of the plurality of lens units; the third lens unit and the image sensor are configured to have a third depth-of-field range, which at least partially overlaps the first depth-of-field range or the second depth-of-field range, and the common range of the first, second, and third depth-of-field ranges is larger than the range of any one or any two of them.
At least one embodiment of the present disclosure also provides an electronic device including any of the imaging devices described above.
For example, according to an embodiment of the present disclosure, the electronic device includes a housing and a display panel, the lens array is disposed in the housing opposite the display panel with a part of the lens array exposed outside the housing, and the image sensor is disposed inside the housing.
Brief Description of the Drawings
To explain the technical solutions of the embodiments of the present disclosure more clearly, the drawings used in the embodiments are briefly introduced below. It should be understood that the following drawings show only some embodiments of the present disclosure and are therefore not to be taken as limiting its scope; for those of ordinary skill in the art, other related drawings can be obtained from these drawings without inventive effort.
FIG. 1 shows a schematic diagram of the principle of an imaging device;
FIG. 2A shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
FIG. 2B shows a schematic side view of an electronic device including the imaging device shown in FIG. 2A;
FIG. 2C shows a schematic diagram of a lens array, a filter array, and an image sensor of an imaging device according to another embodiment of the present disclosure;
FIG. 3A and FIG. 3B show the image sensor in FIG. 2A;
FIG. 4A, FIG. 4B, FIG. 4C, and FIG. 4D show exemplary arrangements of a lens array according to another embodiment of the present disclosure;
FIG. 5 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
FIG. 6 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
FIG. 7 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
FIG. 8 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure;
FIG. 9 shows a perspective schematic view of an electronic device including an imaging device according to an embodiment of the present disclosure;
FIG. 10A shows a perspective schematic view of an electronic device including an imaging device according to another embodiment of the present disclosure;
FIG. 10B shows a schematic diagram of the lens array of the imaging device in FIG. 10A.
Detailed Description
The imaging device according to embodiments of the present disclosure and an electronic device including it are described in detail below with reference to the drawings. To make the objectives, technical solutions, and advantages of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are described clearly and completely in conjunction with the drawings. Obviously, the described embodiments are only some of the embodiments of the present disclosure, not all of them.
Accordingly, the following detailed description of the embodiments of the present disclosure provided with reference to the drawings is not intended to limit the claimed scope of the present disclosure, but merely represents selected embodiments. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present disclosure without inventive effort fall within the protection scope of the present disclosure.
Unless the context defines otherwise, the singular form includes the plural form. Throughout the specification, the terms "include", "have", and the like are used herein to specify the presence of the stated features, numbers, steps, operations, elements, components, or combinations thereof, but do not exclude the presence or addition of one or more other features, numbers, steps, operations, elements, components, or combinations thereof.
In addition, although terms including ordinals such as "first" and "second" may be used to describe various components, these components are not limited by these terms, which are used only to distinguish one element from another.
FIG. 1 shows an imaging device 10, for example a light field camera, which generally includes a lens array 11 and an image sensor 12. As shown in FIG. 1, the lens array 11 includes a plurality of lens units, and the image sensor 12 is arranged corresponding to the lens units. In operation, rays from the target imaging object (a tree in the figure) pass through each lens unit and are respectively focused onto the corresponding different parts of the image sensor 12. Different parts of the image sensor 12 can therefore each record a complete image of the target imaging object from a different orientation, and each pixel of those parts records the spatial position of an object point (for example, object points A, B, and C on the target imaging object in FIG. 1).
In an imaging device having only a single lens array, each lens unit in the array and the image sensor are configured to have a single depth-of-field range. For any lens unit, the depth-of-field range obtained by combining it with the image sensor is limited. For example, a lens unit in a single lens array and the image sensor may have a depth of field from 20 cm to infinity when the focus point is on the focal plane, that is, the front depth point is 20 cm from the lens and the back depth point is at infinity. If the focus point is moved off the focal plane toward the lens unit by a certain distance, then as the front depth point moves forward, the back depth point also moves forward, and infinity can no longer be imaged clearly. Limited by means, space, cost, and so on, it is therefore difficult to increase the depth-of-field range of an imaging device with a single lens array.
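The limitation described above follows from the standard thin-lens depth-of-field formulas. The sketch below is an illustration under assumed parameters (the patent specifies none); the focal length, f-number, and circle of confusion are chosen so that focusing near the hyperfocal distance gives roughly the 20 cm-to-infinity range mentioned above.

```python
def depth_of_field(f_mm, N, c_mm, s_mm):
    """Front and back depth points for a thin lens.

    f_mm: focal length, N: f-number, c_mm: circle of confusion,
    s_mm: focus (object) distance.  Returns (front, back) in mm;
    back becomes infinity once s_mm reaches the hyperfocal distance.
    """
    H = f_mm * f_mm / (N * c_mm) + f_mm                # hyperfocal distance
    front = s_mm * (H - f_mm) / (H + s_mm - 2 * f_mm)
    back = float("inf") if s_mm >= H else s_mm * (H - f_mm) / (H - s_mm)
    return front, back

# Assumed small-lens parameters: f = 2 mm, f/2.4, c = 5 um.
print(depth_of_field(2.0, 2.4, 0.005, 400))  # ~ (182 mm, inf)
print(depth_of_field(2.0, 2.4, 0.005, 150))  # focus moved closer: back point
                                             # pulls in from infinity (~270 mm)
```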
Some embodiments of the present disclosure provide an imaging device including a lens array and an image sensor. To extend the depth-of-field range of the imaging device, the lens array is configured to provide a first lens sub-array and a second lens sub-array in the spatial domain or in the time domain; the lens units of the first lens sub-array and the image sensor are configured to have a first depth-of-field range, the lens units of the second lens sub-array and the image sensor are configured to have a second depth-of-field range, and the common range of the first and second depth-of-field ranges is larger than the first range and also larger than the second range. By providing the first and second lens sub-arrays, an imaging device according to some embodiments of the present disclosure can thus extend its depth-of-field range, so that it can refocus on any object plane within the extended range.
In some embodiments, the first depth-of-field range and the second depth-of-field range may not overlap at all.
Alternatively, in other embodiments, the first and second depth-of-field ranges may partially overlap. For example, in the direction toward the image sensor, the front depth point of the first depth-of-field range lies before the back depth point of the second depth-of-field range; further, for example, the back depth point of the first depth-of-field range is at infinity. The common range of the first and second depth-of-field ranges is then continuous, so that the imaging device can refocus on any object plane within this continuous range.
For example, in some embodiments, the overlap of the first and second depth-of-field ranges may be 0-100 mm, which extends the depth-of-field range as much as possible while keeping the extension efficient; when the overlap equals 0 mm, the end points of the two ranges coincide. In addition, the back depth point of the first depth-of-field range may be set at infinity while that of the second is not, which likewise extends the depth-of-field range as much as possible.
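The continuity condition on the two ranges can be stated compactly: the union of two depth-of-field intervals is itself one continuous interval exactly when the front depth point of the farther range does not lie beyond the back depth point of the nearer range. A small sketch; the interval endpoints below are assumed example values:

```python
def combined_dof(range_a, range_b):
    """Union of two depth-of-field ranges given as (front, back) pairs,
    where back may be infinity.  Returns the continuous combined range
    if the ranges touch or overlap, otherwise None (there would be a gap).
    """
    (n1, b1), (n2, b2) = sorted([range_a, range_b])  # sort by front point
    if n2 > b1:   # farther range starts beyond the nearer one's back point
        return None
    return (n1, max(b1, b2))

# An overlap keeps the union continuous:
print(combined_dof((104.0, 270.0), (182.0, float("inf"))))  # -> (104.0, inf)
print(combined_dof((104.0, 160.0), (182.0, float("inf"))))  # -> None (gap)
```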
The configuration of imaging devices according to several embodiments of the present disclosure is described in more detail below.
FIG. 2A is a schematic diagram of the principle of an imaging device 100 according to an embodiment of the present disclosure; the imaging device 100 may be implemented, for example, as a light field camera. As shown in FIG. 2A, the imaging device 100 includes a lens array 110 and an image sensor 120. The lens array 110 includes a plurality of lens units 111, 112 arranged side by side, and the image sensor 120 includes a plurality of sub image sensors 121, 122 arranged side by side.
In this embodiment, the lens array 110 includes a first lens sub-array and a second lens sub-array. The first lens sub-array includes at least one first lens unit 111, for example a plurality of first lens units 111 (only one is shown in the figure), and the second lens sub-array includes at least one second lens unit 112, for example a plurality of second lens units 112 (only one is shown). In this embodiment, the first lens units 111 and the second lens units 112 have identical specifications: for example, they are identical in focal length, size, and structure, and are, for example, solid lens units made of glass or resin.
In addition, although each lens unit is shown in FIG. 2A as a complete biconvex lens, it will be understood that each lens unit may also be formed as a partial convex lens or a lens convex on one side only, which can likewise modulate the light passing through it. The same applies to the embodiments described below and is not repeated there.
The image sensor 120 includes a first sub image sensor array and a second sub image sensor array; the first sub image sensor array includes at least one first sub image sensor 121, and the second sub image sensor array includes at least one second sub image sensor 122.
For example, the first sub image sensor 121 and the second sub image sensor 122 may be implemented with complementary metal oxide semiconductor (CMOS) devices or charge coupled devices (CCD). For example, each of them may include a pixel array and a detection circuit; each sub-pixel of the pixel array may include a photosensitive device and a pixel driving circuit, and the detection circuit may include a readout circuit, an amplification circuit, an analog-to-digital conversion circuit, and the like, so that the photoelectric signals collected from the pixel array can be read (for example, row by row) and processed to obtain the data corresponding to one image.
FIG. 2B is a schematic side view of an exemplary electronic apparatus including the imaging device 100 shown in FIG. 2A, and FIG. 3A and FIG. 3B show the image sensor 120 in FIG. 2A.
Referring to FIG. 2A, FIG. 2B, FIG. 3A, and FIG. 3B, the lens units 111, 112 of the lens array 110 correspond one-to-one to the sub image sensors 121, 122 of the image sensor 120. The light from the target object passing through each lens unit 111, 112 is focused onto the corresponding sub image sensor 121, 122 whose position overlaps that lens unit in the light incidence direction, and these sub image sensors each form a complete image of the target object. For example, the light from the target object passing through the first lens unit 111 is focused onto the corresponding first sub image sensor 121, which forms a complete first image of the target object, and the light passing through the second lens unit 112 is focused onto the corresponding second sub image sensor 122, which forms a complete second image of the target object.
Because the lens units 111, 112 and their corresponding sub image sensors 121, 122 occupy different positions, the complete images of the target object formed by the sub image sensors 121, 122 are complete images with different orientations. In other words, the imaging device of this embodiment can obtain, in a single shot, multiple complete images of the target object with different orientations. The sub image sensors 121, 122 transmit the first and second images with different orientations to an image processing device included in the electronic apparatus (not shown in the figure, for example an image processing chip); the image processing device can then, for example after the shot is completed, fuse these first and second images by a fusion refocusing algorithm to obtain the desired final refocused image.
Referring again to FIG. 2A, there is a first distance d1 between the first lens unit 111 and the corresponding first sub image sensor 121, giving a first depth-of-field range, and a second distance d2, smaller than d1, between the second lens unit 112 and the corresponding second sub image sensor 122, giving a second depth-of-field range different from the first. The distances d1 and d2 are configured such that the front depth-of-field point Q13 of the second lens unit 112, the front depth-of-field point Q11 of the first lens unit 111, the rear depth-of-field point Q14 of the second lens unit 112, and the rear depth-of-field point Q12 of the first lens unit 111 lie successively closer to infinity. The common range of the first and second depth-of-field ranges is therefore the continuous range between the front depth-of-field point Q13 of the second lens unit 112 and the rear depth-of-field point Q12 of the first lens unit 111, which extends the depth-of-field range of the imaging device 100. In other words, the first sub image sensor 121 obtains a complete image of the target object with the first depth-of-field range, the second sub image sensor 122 obtains a complete image with the second depth-of-field range, and the complete images with different orientations obtained by the sub image sensors together cover a wider depth-of-field range. The image processing device can thus fuse these raw images, which span a wider depth of field, by the fusion refocusing algorithm to obtain the desired final refocused image with a wider depth-of-field range.
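The disclosure does not specify the fusion refocusing algorithm itself. Purely as an illustration of the idea of merging two views with different depth-of-field ranges, the sketch below applies a common focus-stacking heuristic: at each pixel, keep the source whose local neighbourhood is sharper. Real light-field fusion must additionally handle the parallax between sub-images, which is omitted here.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def fuse_by_sharpness(img_near_dof, img_far_dof, window=9):
    """Naive focus-stacking fusion of two already-registered grayscale
    views covering different depth-of-field ranges: per pixel, pick the
    source with the larger local Laplacian energy (a sharpness proxy).
    An illustrative stand-in only, not the algorithm of the disclosure."""
    a = img_near_dof.astype(float)
    b = img_far_dof.astype(float)
    sharp_a = uniform_filter(laplace(a) ** 2, size=window)
    sharp_b = uniform_filter(laplace(b) ** 2, size=window)
    return np.where(sharp_a >= sharp_b, img_near_dof, img_far_dof)
```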
In the present disclosure, the distance between a lens unit and the image sensor refers to the distance between the optical center of the lens unit and the imaging surface of the image sensor along the principal optical axis of the lens.
In this embodiment, the first and second depth-of-field ranges are realized through the different distances between the lens units 111, 112 and the sub image sensors 121, 122. Since all lens units 111, 112 have identical specifications, the manufacturing cost and processing difficulty of the lens array can be reduced.
In one example of this embodiment, as shown in FIG. 2A to FIG. 2C and FIG. 3A to FIG. 3B, the sub image sensors 121, 122 are mutually independent sub image sensors that can, for example, be driven independently.
For example, as shown in FIG. 3A and FIG. 3B, the image sensor 120 includes a plurality of sub image sensors 121, 122 and a substrate 40, and the sub image sensors 121, 122 are disposed on the substrate 40. For example, the sub image sensors 121, 122 may be placed on the substrate 40 by transfer printing, which can reduce manufacturing cost; as another example, they may be mounted on the substrate 40 by surface-mount technology (SMT). The substrate 40 may be, for example, a glass substrate, a plastic substrate, or a ceramic substrate; using a glass substrate to obtain the image sensor is compatible with the fabrication processes of common display devices (for example, liquid crystal display devices or organic light-emitting display devices), which can further reduce manufacturing cost. For example, circuit structures such as wires and contact pads (bonding pads) are provided on the substrate 40, and the sub image sensors 121, 122 are bonded to the contact pads in any suitable manner (for example, by soldering or conductive adhesive) to establish circuit connections.
In other embodiments of the present disclosure, the image sensor 120 may be constructed as an integral image sensor, with the sub image sensors 121, 122 being different regions of this integral sensor; in this case, the light from the target object passing through the different lens units 111, 112 is focused onto the corresponding different regions of the image sensor 120.
The lens sub-arrays of the lens array may be arranged regularly; for example, the arrangement of the lens sub-arrays follows a pattern, for example a periodic one. For example, in this embodiment, as shown in FIG. 4A, all lens units are arranged in rows and columns to form a matrix, in which the first lens units 111 and the second lens units 112 are arranged alternately by rows and by columns.
In some embodiments, as shown in FIG. 4B, all lens units are arranged in rows and columns to form a matrix, in which the first lens units 111 and the second lens units 112 alternate with each other along both the row direction and the column direction. Regularly arranged first and second lens sub-arrays help simplify the algorithm.
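One reason a regular interleaving simplifies downstream processing is that it can be described by a simple index rule. The sketch below is only a plausible labelling of lens positions in the spirit of FIG. 4B, not a layout mandated by the disclosure.

```python
import numpy as np

def interleaved_layout(rows, cols):
    """Label each lens position with its sub-array (1 or 2) in a
    checkerboard pattern: along every row and every column, first
    and second lens units alternate, as in FIG. 4B."""
    r, c = np.indices((rows, cols))
    return np.where((r + c) % 2 == 0, 1, 2)

print(interleaved_layout(4, 4))
# [[1 2 1 2]
#  [2 1 2 1]
#  [1 2 1 2]
#  [2 1 2 1]]
```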
In some embodiments, the lens sub-arrays of the lens array may instead be arranged randomly; as shown in FIG. 4C, the first lens units 111 and the second lens units 112 are arranged randomly in the row and column directions, and adjacent lens units are not aligned with one another.
In some embodiments, the lens sub-arrays of the lens array may also be arranged to form a specific mark. In one example, as shown in FIG. 4D, a plurality of first lens sub-array units are arranged to form the letter B, a plurality of second lens sub-array units form the letter O, and a plurality of third lens sub-array units form the letter E. Arranging the sub-arrays as a specific mark helps improve the appearance.
In some embodiments of the present disclosure, as shown in FIG. 2C, the imaging device 100 may further include a color filter array 130, which includes a plurality of color filters, for example red, green, and blue color filters, that filter the light passing through the lens units 111, 112 of the lens array 110, so that the sub image sensors corresponding to the lens units 111, 112 produce monochromatic images, and the resulting monochromatic images are reserved for subsequent image processing. The color filter array 130 may be arranged in a Bayer pattern with respect to each lens sub-array, and may be disposed between the lens array 110 and the image sensor 120. The color filters may be made, for example, of colored resin materials.
FIG. 5 is a schematic diagram of the principle of an imaging device 200 according to another embodiment of the present disclosure. As shown in FIG. 5, the imaging device 200 includes a lens array 210 and an image sensor 220. The lens array 210 includes a plurality of lens units 211, 212 arranged side by side, and the image sensor 220 includes a plurality of sub image sensors 221, 222 arranged side by side.
In this embodiment, the lens array 210 includes a first lens sub-array and a second lens sub-array. The first lens sub-array includes a plurality of first lens units 211 (only one is shown), and the second lens sub-array includes a plurality of second lens units 212 (only one is shown). The first lens units 211 and the second lens units 212 are at the same distance from their corresponding sub image sensors 221, 222. In one example, the lens array 210 is an integral lens array in which the lens units are connected to one another, for example formed from a single resin film by molding or from a single glass film by etching.
The image sensor 220 includes a first sub image sensor array and a second sub image sensor array; the first includes a plurality of sub image sensors 221, and the second includes a plurality of sub image sensors 222.
Similarly to the imaging device 100 described with reference to FIG. 2A to FIG. 3B, in the imaging device 200 the lens units 211, 212 correspond one-to-one to the sub image sensors 221, 222 so as to form complete images of the target object with different orientations.
Each first lens unit 211 has a first focal length f1, so that it and its corresponding sub image sensor 221 have a first depth-of-field range, and each second lens unit 212 has a second focal length f2 larger than f1, so that it and its corresponding sub image sensor 222 have a second depth-of-field range different from the first, which makes it possible to extend the depth-of-field range of the imaging device 200. The focal lengths f1 and f2 are configured such that the front depth-of-field point Q23 of the second lens unit 212, the front depth-of-field point Q21 of the first lens unit 211, the rear depth-of-field point Q24 of the second lens unit 212, and the rear depth-of-field point Q22 of the first lens unit 211 lie successively closer to infinity. The common range of the first and second depth-of-field ranges is therefore the continuous range between the front depth-of-field point Q23 of the second lens unit 212 and the rear depth-of-field point Q22 of the first lens unit 211, which extends the depth-of-field range of the imaging device 200.
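The dependence of the focus distance on focal length at a fixed lens-to-sensor distance follows from the thin-lens conjugate equation 1/f = 1/u + 1/v. The sketch below illustrates the shift; the numbers are illustrative assumptions, not values from the disclosure.

```python
def focus_distance(focal_mm, sensor_dist_mm):
    """Thin-lens conjugate: 1/f = 1/u + 1/v with v the fixed
    lens-to-sensor distance, so u = f * v / (v - f). Changing only
    the focal length shifts the focused object plane, and with it
    the whole depth-of-field interval of that sub-array."""
    if sensor_dist_mm <= focal_mm:
        raise ValueError("sensor must lie beyond the focal point")
    return focal_mm * sensor_dist_mm / (sensor_dist_mm - focal_mm)

# Illustrative values only (not taken from the disclosure):
print(focus_distance(4.00, 4.05))  # one focal length: focuses near ~324 mm
print(focus_distance(4.02, 4.05))  # a slightly longer one: near ~543 mm
```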
Similarly, in at least one example, the imaging device 200 may further include a color filter array including a plurality of color filters, for example red, green, and blue color filters, that filter the light passing through the lens units of the lens array.
FIG. 6 is a schematic diagram of the principle of an imaging device 300 according to another embodiment of the present disclosure. As shown in FIG. 6, the imaging device 300 includes a lens array 310 and an image sensor 320. The lens array 310 includes a plurality of lens units 311, 312, 313 arranged side by side, and the image sensor 320 includes a plurality of sub image sensors 321, 322, 323 arranged side by side.
In this embodiment, the lens array 310 includes a first lens sub-array, a second lens sub-array, and additionally a third lens sub-array.
The image sensor 320 includes a first sub image sensor array, a second sub image sensor array, and a third sub image sensor array, where the first includes a plurality of first sub image sensors 321, the second includes a plurality of second sub image sensors 322, and the third includes a plurality of third sub image sensors 323.
Similarly to the imaging device 100 described with reference to FIG. 2A to FIG. 3B, in the imaging device 300 the lens units 311, 312, 313 correspond one-to-one to the sub image sensors 321, 322, 323 so as to form complete images of the target object with different orientations.
The first lens sub-array includes a plurality of first lens units 311 (only one is shown), each having a front depth-of-field point Q31 and a rear depth-of-field point Q32, the rear point Q32 being at infinity; the second lens sub-array includes a plurality of second lens units 312 (only one is shown), each having a front depth-of-field point Q33 and a rear depth-of-field point Q34; and the third lens sub-array includes a plurality of third lens units 313 (only one is shown), each having a front depth-of-field point Q35 and a rear depth-of-field point Q36. The first lens units 311, the second lens units 312, and the third lens units 313, together with their corresponding sub image sensors 321, 322, 323, are configured to have a first, a second, and a third depth-of-field range respectively, whereby the depth-of-field range of the imaging device can be extended.
In the direction toward the sub image sensors 321, 322, 323, the front depth-of-field point Q31 of the first lens units 311 lies before the rear depth-of-field point Q34 of the second lens units 312, and the front depth-of-field point Q33 of the second lens units 312 lies before the rear depth-of-field point Q36 of the third lens units 313. The common range of the first, second, and third depth-of-field ranges is therefore larger than the range of any one or any two of them, which extends the overall depth-of-field range of the imaging device.
To realize the first, second, and third depth-of-field ranges described above, the first lens units 311, the second lens units 312, and the third lens units 313 may have different focal lengths and, additionally or alternatively, may be at different distances from their corresponding sub image sensors 321, 322, 323.
Similarly, in at least one example, the imaging device 300 may further include a color filter array including a plurality of color filters, for example red, green, and blue color filters, that filter the light passing through the lens units of the lens array.
It should be noted that the embodiments of the present disclosure do not limit the number of lens sub-arrays in the imaging device that, together with the image sensor, have different depths of field; there may be two, three, or more than three. Lens sub-arrays can be added as needed to further extend the depth-of-field range of the imaging device. As described above, to keep the overall depth-of-field range continuous, the depth-of-field ranges of adjacent sub-arrays overlap pairwise. To keep the number of lens sub-arrays as small as possible for a given overall depth-of-field range, and to increase the coverage and efficiency of each sub-array, the overlaps should not be too large.
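The bookkeeping for chaining several sub-array ranges can be sketched as follows. This is an assumed helper for checking the two constraints just described (no gaps, small overlaps), not a procedure given in the disclosure.

```python
def chain_dof_ranges(ranges, max_overlap_mm=100.0):
    """Given per-sub-array DoF intervals (front_mm, rear_mm), verify
    that they chain into one continuous overall range with only small
    pairwise overlaps, and return that overall range. Assumes that at
    most the last range (sorted by front point) reaches infinity."""
    ranges = sorted(ranges)          # sort by front depth-of-field point
    overall_front, overall_rear = ranges[0]
    for front, rear in ranges[1:]:
        if front > overall_rear:
            raise ValueError("gap: overall depth of field not continuous")
        if overall_rear - front > max_overlap_mm:
            raise ValueError("overlap larger than needed: sub-array wasted")
        overall_rear = max(overall_rear, rear)
    return overall_front, overall_rear

# Three sub-arrays in the spirit of FIG. 6 (illustrative numbers):
print(chain_dof_ranges([(900.0, float("inf")), (450.0, 980.0), (200.0, 500.0)]))
# -> (200.0, inf)
```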
In other embodiments not shown, the lens array of the imaging device includes a first lens sub-array and a second lens sub-array, the first including a plurality of first lens units and the second including a plurality of second lens units. There is a first distance between the first lens units and the image sensor and a second distance between the second lens units and the image sensor; the first lens units have a first focal length and the second lens units have a second focal length. The first distance differs from the second distance and the first focal length differs from the second focal length, such that the common range of the first and second depth-of-field ranges is larger than either of them, thereby extending the depth-of-field range of the imaging device. Adjusting distance and focal length in combination makes it more convenient and flexible to configure different depth-of-field ranges.
In the imaging devices of some embodiments of the present disclosure, since the first and second lens sub-arrays are provided in the spatial domain, they can be used for imaging simultaneously; compared with first and second lens sub-arrays formed in different time periods, there is no requirement on imaging time. Moreover, the lens units of these lens arrays have fixed focal lengths and fixed positions, so the extended depth of field is obtained without providing a dedicated mechanism to adjust the lens-to-sensor distance and without configuring the lens units with adjustable focal lengths. Such lens units and imaging devices have relatively low cost.
In some other embodiments of the present disclosure, the imaging device may provide the first and second lens sub-arrays to the image sensor in the time domain. That is, by adjusting the lens units of the lens array, the imaging device can make the lens array form the first lens sub-array and the second lens sub-array in different time periods, respectively. Forming the two sub-arrays in different time periods helps save space.
FIG. 7 is a schematic diagram of the principle of an imaging device 400 according to another embodiment of the present disclosure. As shown in FIG. 7, the imaging device 400 includes lens arrays 410, 410' and an image sensor 420.
In this embodiment, the lens arrays 410, 410' include a plurality of liquid crystal lens units 411, 411'. The image sensor 420 includes a plurality of sub image sensors 421, and the liquid crystal lens units 411, 411' correspond one-to-one to the sub image sensors 421. The focal lengths of the liquid crystal lens units 411, 411' are configured to be adjustable.
For example, each liquid crystal lens unit 411, 411' includes a liquid crystal layer sandwiched between two transparent substrates and a plurality of liquid crystal control electrodes, which may be located on (the substrate on) the same side of the liquid crystal layer, or partly on one side and partly on the other. When a predetermined combination of voltages (driving voltages) is applied to these control electrodes, an electric field with a predetermined distribution can be formed in the liquid crystal layer; this field can drive the liquid crystal molecules to deflect, changing the refractive index of the layer at different positions so that the layer acts as a lens and modulates the light passing through it. Although the liquid crystal lens is drawn as a biconvex lens in FIG. 7, it should be understood that the outer shape of the liquid crystal lens unit itself is not biconvex.
The focal lengths of the liquid crystal lens units 411, 411' are adjusted by applying different driving voltages to them. By adjusting these focal lengths, the lens array forms the first lens sub-array (shown in solid lines in FIG. 7) and the second lens sub-array (shown in dashed lines in FIG. 7) in different time periods, respectively.
As shown by the solid lines in FIG. 7, at a first point in time, the first lens units 411 of the formed first lens sub-array may have a first focal length and, for example, correspond one-to-one to the sub image sensors 421 of the image sensor 420, thereby having a first depth-of-field range with front depth-of-field point Q41 and rear depth-of-field point Q42.
As shown by the dashed lines in FIG. 7, at a second point in time, the second lens units 411' of the formed second lens sub-array may have a second focal length different from the first and likewise correspond one-to-one to the sub image sensors 421, thereby having a second depth-of-field range with front depth-of-field point Q43 and rear depth-of-field point Q44.
By setting the first and second focal lengths appropriately, the front depth-of-field point Q43 of the second lens units 411', the front depth-of-field point Q41 of the first lens units 411, the rear depth-of-field point Q44 of the second lens units 411', and the rear depth-of-field point Q42 of the first lens units 411 lie successively closer to infinity, so that the overall depth-of-field range of the imaging device 400 is extended.
In other words, the target object can be imaged once at the first point in time to obtain multiple complete images of it with different orientations, and then imaged again at the second point in time to obtain further complete images with different orientations. Since at the two points in time the lens units 411, 411' and their corresponding sub image sensors 421 have different depth-of-field ranges, the images obtained in the two shots have different depth-of-field ranges. The image processing device can then fuse the raw images from the two shots, which together span a wider depth of field, by the fusion refocusing algorithm to obtain the desired final refocused image with a wider depth-of-field range.
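At the level of control flow, the time-domain scheme amounts to two exposures with the array reconfigured in between. The following sketch uses hypothetical interfaces (set_focal_length, capture) that are not APIs from the disclosure; fuse_by_sharpness refers to the illustrative fusion sketch given earlier.

```python
def capture_extended_dof(lens_array, sensor, focal_first, focal_second):
    """Hypothetical driver-level sequence for the time-domain scheme:
    the same liquid crystal lens array is switched between two focal
    lengths, one exposure is taken in each period, and the two raw
    light-field images are fused afterwards."""
    lens_array.set_focal_length(focal_first)    # first period: first sub-array
    image_first = sensor.capture()
    lens_array.set_focal_length(focal_second)   # second period: second sub-array
    image_second = sensor.capture()
    return fuse_by_sharpness(image_first, image_second)
```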
Similarly, in at least one example, the imaging device 400 may further include a color filter array including a plurality of color filters, for example red, green, and blue color filters, that filter the light passing through the lens units of the lens array. For example, the color filter array may be disposed between the lens array and the image sensor.
FIG. 8 is a schematic diagram of the principle of an imaging device 500 according to another embodiment of the present disclosure. As shown in FIG. 8, the imaging device 500 includes lens arrays 510, 510' and an image sensor 520.
In this embodiment, the lens arrays 510, 510' include a plurality of identical lens units 511, 511'. The image sensor 520 includes a plurality of sub image sensors 521, and the lens units 511, 511' correspond one-to-one to the sub image sensors 521.
The imaging device 500 further includes a position adjusting mechanism 540 for adjusting the distance between each lens unit 511, 511' and its corresponding sub image sensor 521. The position adjusting mechanism 540 can be implemented in any suitable form, for example mechanically or magnetically; a mechanical implementation may include, for example, a motor with gears, and a magnetic one a magnetostrictive component. In other words, the distance between the lens units 511, 511' and the sub image sensors 521 is variable, so that the lens arrays 510, 510' can form the first lens sub-array (shown in solid lines in FIG. 8) and the second lens sub-array (shown in dashed lines in FIG. 8) in different time periods, respectively.
As shown by the solid lines in FIG. 8, at a first point in time, the first lens units 511 of the lens array 510 may be at a first distance d3 from their corresponding sub image sensors 521, thereby having a first depth-of-field range with front depth-of-field point Q51 and rear depth-of-field point Q52.
As shown by the dashed lines in FIG. 8, at a second point in time, after the position adjusting mechanism 540 has adjusted the positions of the lens units 511, 511', the second lens units 511' of the lens array 510' may be at a second distance d4, different from d3, from their corresponding sub image sensors 521, thereby having a second depth-of-field range with front depth-of-field point Q53 and rear depth-of-field point Q54.
By setting the distances d3 and d4 appropriately, the front depth-of-field point Q53 of the second lens units 511', the front depth-of-field point Q51 of the first lens units 511, the rear depth-of-field point Q54 of the second lens units 511', and the rear depth-of-field point Q52 of the first lens units 511 lie successively closer to infinity, so that the overall depth-of-field range of the imaging device 500 is extended.
Similarly, in at least one example, the imaging device 500 may further include a color filter array including a plurality of color filters, for example red, green, and blue color filters, that filter the light passing through the lens units of the lens array.
Of course, as long as they do not contradict one another, the above embodiments may be combined to obtain further embodiments of the present disclosure, in which the distance from the lens units to the image sensor and their focal lengths are adjusted jointly; the embodiments of the present disclosure place no limitation on this.
Some embodiments of the present disclosure further provide an electronic apparatus including the imaging device of any embodiment of the present disclosure, for example any of the imaging devices described above. The electronic apparatus may be, for example, a mobile phone, a tablet computer, or a laptop computer; the embodiments of the present disclosure place no limitation on this.
FIG. 9 is a schematic perspective view of an exemplary electronic apparatus according to an embodiment of the present disclosure, which includes any of the imaging devices 100, 200, 300, 400 described above. The electronic apparatus is, for example, a mobile phone, and the imaging device 100, 200, 300, 400 is disposed, for example, on the back of the mobile phone (i.e., the side facing away from the user) as a rear camera; of course, the embodiments of the present disclosure are not limited to this. For example, the imaging device 100, 200, 300, 400 may be made as large as possible so as to acquire complete images of the target object from more and wider orientations.
For example, referring to FIG. 2A and FIG. 9, the electronic apparatus includes a display panel (not shown), a housing 50, the imaging device, and an image processing device (not shown). For example, the display surface of the display panel faces the user, i.e., it is on the front of the electronic apparatus, and the imaging device is disposed on the side opposite the display surface. The imaging device includes a lens array and an image sensor. The lens array is disposed in the housing 50 of the electronic apparatus, opposite the display panel, and is exposed outside the housing 50 on the back of the apparatus; the image sensor is disposed inside the housing 50, adjacent to the lens array. The user can image a target object with the imaging device, and the image processing device of the electronic apparatus then executes the fusion refocusing algorithm to form the final image on the display panel; the image processing device is, for example, an image processor or an image processing chip.
For example, the lens units may be embedded in the housing 50 via a base (not shown), and the first and second distances described above can be set by fixing the position of the base.
FIG. 10A is a schematic perspective view of an exemplary electronic apparatus according to another embodiment of the present disclosure, which includes the imaging device 500. The electronic apparatus is, for example, a mobile phone, and the imaging device 500 is disposed, for example, on the back of the mobile phone (i.e., the side facing away from the user); of course, the embodiments of the present disclosure are not limited to this.
For example, referring to FIG. 10A, the electronic apparatus includes a display panel (not shown), a housing 550, the imaging device 500, and an image processing device (not shown). For example, the display surface of the display panel faces the user, i.e., it is on the front of the electronic apparatus, and the imaging device 500 is disposed on the side opposite the display surface. The imaging device 500 includes a lens array 510 and an image sensor 520.
FIG. 10B is a schematic diagram of the lens array of the imaging device in FIG. 10A. In this embodiment, as shown in FIG. 10A and FIG. 10B, the lens array 510 may be a liquid crystal lens array plate with a laminated structure. The liquid crystal lens array plate may be embedded in the housing 550, opposite the display panel. It includes a first substrate 501, a second substrate 502, a first electrode layer 503 disposed on the side of the first substrate 501 facing the second substrate 502, a second electrode layer 504 disposed on the side of the second substrate 502 facing the first substrate 501, and a liquid crystal layer 505 between the first electrode layer 503 and the second electrode layer 504. For example, the first electrode layer 503 and the second electrode layer 504 may be transparent ITO electrodes. The liquid crystal lens array plate can be divided into a plurality of liquid crystal lens units, for example a plurality of first liquid crystal lens units 511 and a plurality of second liquid crystal lens units 512. In this embodiment, the first electrode layer 503 may be a single common electrode, and the second electrode layer may include a plurality of second sub-electrodes 5041 that control the liquid crystal lens units 511, 512 independently. In other words, each liquid crystal lens unit 511, 512 includes the common first substrate 501, the common first electrode layer 503, an independent second sub-electrode 5041, the common liquid crystal layer 505, and the common second substrate 502.
The scope of the present disclosure is defined not by the foregoing description but by the appended claims and their equivalents.

Claims (14)

  1. An imaging device, comprising a lens array and an image sensor,
    wherein the lens array comprises a plurality of lens units and is configured to provide at least a first lens sub-array and a second lens sub-array,
    the first lens sub-array comprises first lens units among the plurality of lens units, the first lens units and the image sensor being configured to have a first depth-of-field range,
    the second lens sub-array comprises second lens units among the plurality of lens units, the second lens units and the image sensor being configured to have a second depth-of-field range, and
    the first depth-of-field range partially overlaps the second depth-of-field range, and the common range of the first depth-of-field range and the second depth-of-field range is larger than the first depth-of-field range and larger than the second depth-of-field range.
  2. The imaging device according to claim 1, wherein
    there is a first distance between the first lens units of the first lens sub-array and the image sensor, and
    there is a second distance, different from the first distance, between the second lens units of the second lens sub-array and the image sensor.
  3. The imaging device according to claim 1 or 2, wherein
    the first lens units of the first lens sub-array have a first focal length, and
    the second lens units of the second lens sub-array have a second focal length different from the first focal length.
  4. The imaging device according to any one of claims 1-3, wherein
    the lens units are convex lenses.
  5. The imaging device according to any one of claims 1-4, wherein
    the first lens sub-array and the second lens sub-array are arranged regularly.
  6. The imaging device according to claim 1, wherein
    the plurality of lens units of the lens array are configured such that the focal length of each of the plurality of lens units is adjustable, or the distance between each of the plurality of lens units and the image sensor is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods, respectively.
  7. The imaging device according to claim 1, wherein
    the lens units are a plurality of liquid crystal lens units, and the plurality of liquid crystal lens units are configured such that the focal length of each of the plurality of liquid crystal lens units is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods, respectively.
  8. The imaging device according to any one of claims 1-7, further comprising a color filter array, wherein the color filter array comprises a plurality of color filters, each of the plurality of color filters corresponding to one lens unit of the plurality of lens units so as to filter the light passing through that lens unit.
  9. The imaging device according to any one of claims 1-7, wherein
    the rear depth-of-field point of the first depth-of-field range is at infinity, and the rear depth-of-field point of the second depth-of-field range is not at infinity.
  10. The imaging device according to any one of claims 1-7, wherein
    the overlap between the first depth-of-field range and the second depth-of-field range is in the range of 0-100 mm.
  11. The imaging device according to any one of claims 1-7, wherein
    the image sensor comprises a plurality of sub image sensors in one-to-one correspondence with the plurality of lens units, so that each sub image sensor is configured to receive the incident light of one lens unit for imaging.
  12. The imaging device according to any one of claims 1-11, wherein
    the lens array is configured to further provide a third lens sub-array,
    the third lens sub-array comprises third lens units among the plurality of lens units, the third lens units and the image sensor being configured to have a third depth-of-field range,
    the third depth-of-field range partially overlaps at least one of the first depth-of-field range and the second depth-of-field range, and
    the common range of the first depth-of-field range, the second depth-of-field range, and the third depth-of-field range is larger than the range of any one or any two of the first depth-of-field range, the second depth-of-field range, and the third depth-of-field range.
  13. An electronic apparatus, comprising the imaging device according to any one of claims 1-12.
  14. The electronic apparatus according to claim 13, wherein
    the electronic apparatus comprises a housing and a display panel,
    the lens array is disposed in the housing opposite to the display panel, with a portion of the lens array exposed to the outside of the housing, and
    the image sensor is disposed inside the housing.
PCT/CN2019/125631 2019-03-11 2019-12-16 Imaging device and electronic apparatus WO2020181869A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/767,327 US20210203821A1 (en) 2019-03-11 2019-12-16 Imaging device and electronic apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910180440.2A 2019-03-11 Imaging device and electronic apparatus
CN201910180440.2 2019-03-11

Publications (1)

Publication Number Publication Date
WO2020181869A1 true WO2020181869A1 (zh) 2020-09-17

Family

ID=72426155

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/125631 WO2020181869A1 (zh) 2019-03-11 2019-12-16 成像装置和电子设备

Country Status (3)

Country Link
US (1) US20210203821A1 (zh)
CN (1) CN111698348B (zh)
WO (1) WO2020181869A1 (zh)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2618466A (en) * 2021-02-20 2023-11-08 Boe Technology Group Co Ltd Image acquisition device, image acquisition apparatus, image acquisition method and manufacturing method
CN115524857B (zh) * 2022-11-24 2023-03-31 锐驰智光(北京)科技有限公司 Optical system and lidar having the same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1983140A (zh) * 2005-12-13 2007-06-20 邓仕林 Multi-lens optical-path imaging device for a pen-type optical mouse
US8478123B2 (en) * 2011-01-25 2013-07-02 Aptina Imaging Corporation Imaging devices having arrays of image sensors and lenses with multiple aperture sizes
CN106027861B (zh) * 2016-05-23 2019-06-04 西北工业大学 Light field acquisition device based on a micro-camera array and data processing method

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7830443B2 * 2004-12-21 2010-11-09 Psion Teklogix Systems Inc. Dual mode image engine
JP2006251613A (ja) * 2005-03-14 2006-09-21 Citizen Watch Co Ltd Imaging lens device
CN102131044A (zh) * 2010-01-20 2011-07-20 鸿富锦精密工业(深圳)有限公司 Camera module
CN104410784A (zh) * 2014-11-06 2015-03-11 北京智谷技术服务有限公司 Light field acquisition control method and device
CN104717482A (zh) * 2015-03-12 2015-06-17 天津大学 Multispectral multi-depth-of-field array imaging method and camera
CN105827922A (zh) * 2016-05-25 2016-08-03 京东方科技集团股份有限公司 Camera device and image capturing method thereof

Also Published As

Publication number Publication date
CN111698348A (zh) 2020-09-22
US20210203821A1 (en) 2021-07-01
CN111698348B (zh) 2021-11-09

Similar Documents

Publication Publication Date Title
US10594919B2 (en) Camera device and method for capturing images by using the same
CN108919573B (zh) Display panel, display device, imaging method and depth distance detection method
JP5040493B2 (ja) Imaging device and imaging method
JP5331838B2 (ja) Solid-state imaging device and portable information terminal
US9781311B2 (en) Liquid crystal optical device, solid state imaging device, portable information terminal, and display device
JP2010050707A (ja) Imaging device, display device and image processing device
US11531181B2 (en) Imaging lens module and electronic device
US20140118516A1 (en) Solid state imaging module, solid state imaging device, and information processing device
WO2021103872A1 (zh) Image sensor, camera device, electronic apparatus and imaging method
WO2020181869A1 (zh) Imaging device and electronic apparatus
US9462166B2 (en) Imaging device, portable information terminal, and display device
US20140284746A1 (en) Solid state imaging device and portable information terminal
TW201517257A (zh) Close spacing in a multi-lens array module
CN105241450B (zh) Sky polarization pattern detection method and system based on four-quadrant polarizers
US9411136B2 (en) Image capturing module and optical auxiliary unit thereof
CN110166676B (zh) Imaging device, imaging control method, electronic apparatus and medium
WO2019049193A1 (ja) Sensor module for fingerprint authentication and fingerprint authentication device
US11336806B2 (en) Dual-function display and camera
WO2019019813A1 (zh) Photosensitive assembly
US20220329774A1 (en) Tof camera
JP2005250026A (ja) Liquid crystal light control element, lens barrel and imaging device
WO2021166834A1 (ja) Imaging device and imaging method
US11726306B2 (en) Imaging device
JP2021132372A (ja) Imaging device and imaging method
KR20180051340A (ko) Camera module

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19919317

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19919317

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/05/2022)
