WO2020181869A1 - Imaging Device and Electronic Equipment - Google Patents
Imaging Device and Electronic Equipment
- Publication number
- WO2020181869A1 (PCT/CN2019/125631)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- lens
- array
- depth
- sub
- range
- Prior art date
Classifications
- H04M1/0264: Details of the structure or mounting of specific components for a camera module assembly
- H04M1/026: Details of the structure or mounting of specific components
- H04N23/957: Light-field or plenoptic cameras or camera modules
- G02B3/0043: Inhomogeneous or irregular lens arrays, e.g. varying shape, size, height
- G02B3/0056: Lens arrays arranged along two different directions in a plane, e.g. honeycomb arrangement of lenses
- G02B5/201: Filters in the form of arrays
- G02F1/294: Variable focal length devices
- G03B13/32: Means for focusing
- G03B3/00: Focusing arrangements of general interest for cameras, projectors or printers
- G03B30/00: Camera modules comprising integrated lens units and imaging units, specially adapted for being embedded in other devices, e.g. mobile phones or vehicles
- H04N23/45: Generating image signals from two or more image sensors of different type or operating in different modes
- H04N23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/55: Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/57: Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
- H04N23/676: Bracketing for image capture at varying focusing conditions
- H04N23/80: Camera processing pipelines; Components thereof
- H04N23/959: Extended depth of field imaging by adjusting depth of field during image capture
- H04N25/13: Arrangement of colour filter arrays [CFA] characterised by the spectral characteristics of the filter elements
Definitions
- Embodiments of the present disclosure relate to an imaging device and electronic equipment including the imaging device.
- The light field is a complete representation of the set of light rays in space. Capturing and displaying the light field makes it possible to visually reproduce the real world. Compared with an ordinary camera, a light field camera obtains more complete light field information.
- the imaging part of the light field camera is generally composed of a microlens array and an image sensor.
- the microlens array can divide the light beam emitted by the same object point into discrete thin beams of different angles, and each microlens in the microlens array focuses the corresponding thin beam that passes through it to the corresponding part of the image sensor.
- Each part of the image sensor receives a thin beam of light passing through the corresponding microlens to record a complete image with a specific orientation, thereby capturing more complete light field information.
- the light field camera can realize the function of "photograph first, focus later". For example, for a light field camera with m (m is usually greater than 2, for example, several tens) lenses, m original images can be obtained in one frame time. Then, through the fusion refocusing algorithm, the m original images are fused to obtain the desired final refocusing image. Depending on the algorithm, the refocused focal plane of the obtained refocused image can be different.
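The fusion refocusing described above can be sketched as a shift-and-add over the m original sub-images. The function below is an illustrative sketch only: the `offsets` (lens positions), the `alpha` refocus parameter, and integer-pixel shifting are simplifying assumptions, not the patent's actual algorithm:

```python
import numpy as np

def refocus(sub_images, offsets, alpha):
    """Shift-and-add refocusing over m sub-images.

    sub_images: list of m grayscale images (H x W arrays), one per lens.
    offsets:    list of m (dy, dx) lens positions relative to the array center.
    alpha:      refocus parameter; scaling the per-lens shift selects the
                synthetic focal plane of the output image.
    """
    acc = np.zeros_like(sub_images[0], dtype=float)
    for img, (dy, dx) in zip(sub_images, offsets):
        # Shift each sub-image in proportion to its lens position, then average.
        shifted = np.roll(img, (round(alpha * dy), round(alpha * dx)), axis=(0, 1))
        acc += shifted
    return acc / len(sub_images)
```

Varying `alpha` over repeated calls yields refocused images with different focal planes from the same single capture, which is the "photograph first, focus later" behavior.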
- At least one embodiment of the present disclosure provides an imaging device including a lens array and an image sensor. The lens array includes a plurality of lens units and is configured to provide at least a first lens sub-array and a second lens sub-array.
- The first lens sub-array includes a first lens unit of the plurality of lens units; the first lens unit and the image sensor are configured to have a first depth of field range.
- The second lens sub-array includes a second lens unit of the plurality of lens units; the second lens unit and the image sensor are configured to have a second depth of field range. The first depth of field range and the second depth of field range partially overlap, and the common range of the first depth of field range and the second depth of field range is larger than either of them.
- the first lens unit of the first lens sub-array and the image sensor there is a first distance between the first lens unit of the first lens sub-array and the image sensor, and the second lens unit of the second lens sub-array is connected to the image sensor. There is a second distance between the sensors.
- the first lens unit of the first lens sub-array has a first focal length
- the second lens unit of the second lens sub-array has a second focal length
- the lens unit is a convex lens.
- the first lens sub-array and the second lens sub-array are regularly arranged.
- In some examples, the plurality of lens units of the lens array are configured such that the focal length of each lens unit is adjustable, or the distance between the lens units and the image sensor is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
- In some examples, the lens units are a plurality of liquid crystal lens units configured such that the focal length of each liquid crystal lens unit is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
- In some examples, the imaging device further includes a color filter array, wherein the filter array includes a plurality of filters, each of which corresponds to one lens unit of the plurality of lens units and filters the light transmitted by that lens unit.
- the back depth point of the first depth of field range is at infinity, and the back depth point of the second depth of field range is not at infinity.
- the overlapping range of the first depth of field range and the second depth of field range is 0-100 mm.
- In some examples, the image sensor includes a plurality of sub-image sensors that correspond one-to-one to the plurality of lens units, so that each sub-image sensor receives light incident from one lens unit to form an image.
- In some examples, the lens array is configured to further provide a third lens sub-array, which includes a third lens unit of the plurality of lens units; the third lens unit and the image sensor are configured to have a third depth of field range.
- The third depth of field range at least partially overlaps the first depth of field range or the second depth of field range, and the common range of the first, second, and third depth of field ranges is larger than any one or any two of them.
- At least one embodiment of the present disclosure also provides an electronic device, which includes any imaging device as described above.
- In some examples, the electronic device includes a housing and a display panel; the lens array is disposed in the housing opposite to the display panel, with a part of the lens array exposed outside the housing, while the image sensor is arranged inside the housing.
- Figure 1 shows a schematic diagram of the principle of an imaging device
- FIG. 2A shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure
- FIG. 2B shows a schematic side view of an electronic device including the imaging device shown in FIG. 2A;
- FIG. 2C shows a schematic diagram of a lens array, a filter array, and an image sensor of an imaging device according to another embodiment of the present disclosure;
- FIGS. 3A and 3B show the image sensor in FIG. 2A;
- FIGS. 4A, 4B, 4C, and 4D show exemplary arrangements of a lens array according to another embodiment of the present disclosure;
- FIG. 5 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure
- FIG. 6 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure
- FIG. 7 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure
- FIG. 8 shows a schematic diagram of a lens array and an image sensor of an imaging device according to another embodiment of the present disclosure
- FIG. 9 shows a perspective schematic diagram of an electronic device including an imaging device according to an embodiment of the present disclosure.
- FIG. 10A shows a perspective schematic diagram of an electronic device including an imaging device according to another embodiment of the present disclosure
- FIG. 10B shows a schematic diagram of the lens array of the imaging device in FIG. 10A.
- FIG. 1 shows an imaging device 10, such as a light field camera, which generally includes a lens array 11 and an image sensor 12.
- the lens array 11 includes a plurality of lens units, and the image sensor 12 is arranged corresponding to each lens unit.
- Each ray bundle from the target imaging object (a tree in the figure) passes through a lens unit and is focused onto the part of the image sensor 12 corresponding to that lens unit. Therefore, different parts of the image sensor 12 each record a complete image of the target imaging object from a different orientation, and each pixel of each such part records a corresponding object point on the target imaging object (see FIG. 1).
- each lens unit and image sensor in the single lens array are configured to have a single depth range.
- the depth of field range obtained by combining it with the image sensor is limited.
- For example, the lens unit and the image sensor in a single lens array may have a depth of field from 20 cm to infinity when the focus is on the focal plane; that is, the foreground depth point is 20 cm from the lens and the back depth point is at infinity. If the focus point is not on the focal plane but moves a certain distance toward the lens unit, then as the foreground depth point moves forward, the back depth point also moves forward, and infinity can no longer be imaged clearly. Therefore, limited by means, space, cost, and so on, it is difficult to increase the depth of field range of an imaging device with a single lens array.
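The 20 cm-to-infinity example corresponds to focusing at the hyperfocal distance. The sketch below uses the standard thin-lens depth-of-field approximation from textbook optics; the formulas and all parameter values are assumptions for illustration and are not taken from the patent:

```python
def depth_of_field(f_mm, n_stop, c_mm, s_mm):
    """Approximate foreground and back depth points for a thin lens.

    f_mm:   focal length in mm
    n_stop: f-number of the lens
    c_mm:   acceptable circle of confusion in mm
    s_mm:   focus distance in mm

    Uses the common hyperfocal approximation H = f^2 / (N * c).
    Focusing at or beyond H puts the back depth point at infinity.
    """
    hyperfocal = f_mm ** 2 / (n_stop * c_mm)
    near = hyperfocal * s_mm / (hyperfocal + s_mm)
    if s_mm >= hyperfocal:
        far = float("inf")          # back depth point at infinity
    else:
        far = hyperfocal * s_mm / (hyperfocal - s_mm)
    return near, far
```

Focusing exactly at the hyperfocal distance yields a foreground depth point at half that distance and a back depth point at infinity; focusing closer pulls both points forward, as the text describes.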
- Some embodiments of the present disclosure provide an imaging device including a lens array and an image sensor.
- The lens array is configured to provide a first lens sub-array and a second lens sub-array in the spatial domain or in the time domain. The lens units of the first lens sub-array and the image sensor are configured to have a first depth of field range, the lens units of the second lens sub-array and the image sensor are configured to have a second depth of field range, and the common range of the first and second depth of field ranges is larger than either of them.
- By providing the first lens sub-array and the second lens sub-array, the imaging device expands its depth of field range, so that it can refocus on any object plane within the extended depth of field.
- the first depth-of-field range and the second depth-of-field range may not overlap at all.
- the first depth-of-field range and the second depth-of-field range may partially overlap.
- For example, the foreground depth point of the first depth of field range lies before the back depth point of the second depth of field range; further, for example, the back depth point of the first depth of field range is at infinity. The common range of the first and second depth of field ranges is then continuous, so that the imaging device can refocus on any object plane within that continuous range.
- the overlapping range of the first depth of field range and the second depth of field range may be 0-100 mm, so that the depth of field range can be expanded as much as possible, and the expansion efficiency can be improved.
- When the overlap range equals 0 mm, the end points of the two depth of field ranges coincide.
- the back depth point of the first depth of field range may be set to be at infinity and the back depth point of the second depth of field range may be set not to be at infinity, so that the depth of field range can also be expanded as much as possible.
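Whether two depth-of-field ranges combine into one continuous extended range, as discussed above, can be checked with a small helper. This is an illustrative sketch, not the patent's method; `float('inf')` models a back depth point at infinity:

```python
def combine_dof(range_a, range_b):
    """Union of two depth-of-field ranges, each (near_point, far_point).

    Returns the combined continuous range if the two ranges touch or
    overlap (overlap of 0 mm, i.e. coinciding end points, counts),
    otherwise None to signal a gap between them.
    """
    (n1, f1), (n2, f2) = sorted([range_a, range_b])  # order by near point
    if n2 > f1:
        return None                  # gap: the union is not continuous
    return (n1, max(f1, f2))         # continuous combined range
```

For example, a first range of 200 mm-to-infinity combined with a second range of 100-250 mm yields a continuous 100 mm-to-infinity depth of field.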
- imaging apparatuses according to various embodiments of the present disclosure will be described in more detail below.
- FIG. 2A shows a schematic diagram of the imaging device 100 according to an embodiment of the present disclosure.
- the imaging device 100 may be implemented as a light field camera.
- the imaging device 100 includes a lens array 110 and an image sensor 120.
- The lens array 110 includes a plurality of lens units 111 and 112 arranged side by side with each other;
- the image sensor 120 includes a plurality of sub-image sensors 121 and 122.
- the lens array 110 includes a first lens sub-array and a second lens sub-array.
- The first lens sub-array includes at least one first lens unit 111, for example a plurality of first lens units 111 (only one is shown in the figure), and the second lens sub-array includes at least one second lens unit 112, for example a plurality of second lens units 112 (only one is shown in the figure).
- the specifications of the first lens unit 111 and the second lens unit 112 are the same.
- the focal length, size, and structure of the first lens unit 111 and the second lens unit 112 are the same, and for example, both are solid lens units, for example, made of glass or resin materials.
- Although each lens unit is shown as a complete double-convex lens, it is understood that each lens unit can also be formed as a partial convex lens or a single-sided convex lens, which achieves the same light-modulation effect.
- the following embodiments to be described are the same as this, so they will not be repeated.
- the image sensor 120 includes a first sub image sensor array and a second sub image sensor array; the first sub image sensor array includes at least one first sub image sensor 121, and the second sub image sensor array includes at least one second sub image sensor 122.
- The first sub image sensor 121 and the second sub image sensor 122 may be implemented as complementary metal oxide semiconductor (CMOS) devices or charge-coupled devices (CCD).
- Each of the first sub-image sensor 121 and the second sub-image sensor 122 may include a pixel array and a detection circuit;
- each sub-pixel of the pixel array may include a photoelectric sensing device and a pixel drive circuit;
- the detection circuit may include reading circuits, amplifier circuits, analog-to-digital conversion circuits, and the like, and can therefore (for example, line by line) read and process the photoelectric signals collected from the pixel array to obtain the data of an image.
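The line-by-line readout and processing just described might be sketched as follows. The `gain` and `adc_bits` parameters stand in for the amplifier and analog-to-digital conversion stages named in the text; the structure is illustrative, not a real sensor driver:

```python
import numpy as np

def read_pixel_array(pixels, gain=1.0, adc_bits=8):
    """Line-by-line readout sketch: amplify each row, then quantize (ADC).

    pixels: H x W array of normalized photo-signals in [0, 1].
    Returns an H x W array of integer digital values.
    """
    levels = 2 ** adc_bits - 1
    rows = []
    for row in pixels:                              # read out one line at a time
        amplified = np.clip(row * gain, 0.0, 1.0)   # amplifier stage, clipped to full well
        rows.append(np.round(amplified * levels).astype(int))  # A/D conversion
    return np.stack(rows)
```

Each call produces the digital image data that would be handed to the downstream image processing device.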
- FIG. 2B shows a schematic side view of an exemplary electronic device including the imaging device 100 shown in FIG. 2A, and FIG. 3A and FIG. 3B show the image sensor 120 in FIG. 2A.
- the lens units 111 and 112 in the lens array 110 correspond to the sub-image sensors 121 and 122 in the image sensor 120, respectively.
- the light from the target imaging object passing through each lens unit 111, 112 is respectively focused to the corresponding sub-image sensors 121, 122 that overlap each lens unit 111, 112 in the light incident direction.
- These sub-image sensors 121, 122 respectively form a complete image of the target imaging object.
- light from the target imaging object passing through the first lens unit 111 is respectively focused on the first sub-image sensor 121 corresponding to the first lens unit 111, and the first sub-image sensor 121 forms a complete first image of the target imaging object;
- the light from the target imaging object passing through the second lens unit 112 is respectively focused on the second sub-image sensor 122 corresponding to the second lens unit 112, and the second sub-image sensor 122 forms a complete second image of the target imaging object.
- Because each lens unit 111, 112 corresponds to its own sub-image sensor 121, 122, the complete images of the target imaging object formed by the sub-image sensors 121, 122 are complete images with different orientations. That is, the imaging device of this embodiment can obtain, in one shot, multiple complete images of the target imaging object from different orientations.
- Each sub-image sensor 121, 122 transmits the first image and the second image, which have different orientations, to an image processing device (not shown in the figure; for example, an image processing chip) included in the electronic device. After shooting is completed, the image processing device can merge the first image and the second image through a fusion refocusing algorithm to obtain the desired final refocused image.
- There is a first distance d1 between the first lens unit 111 and the corresponding first sub-image sensor 121, giving a first depth of field range; there is a second distance d2, smaller than the first distance d1, between the second lens unit 112 and the corresponding second sub-image sensor 122, giving a second depth of field range different from the first.
- The first distance d1 and the second distance d2 are configured such that the foreground depth point Q13 of the second lens unit 112, the foreground depth point Q11 of the first lens unit 111, the back depth point Q14 of the second lens unit 112, and the back depth point Q12 of the first lens unit 111 lie successively closer to infinity.
- The common range of the first and second depth of field ranges is the continuous range between the foreground depth point Q13 of the second lens unit 112 and the back depth point Q12 of the first lens unit 111, which expands the depth of field range of the imaging device 100. That is, the first sub-image sensor 121 obtains a complete image of the target imaging object with the first depth of field range, the second sub-image sensor 122 obtains a complete image with the second depth of field range, and the multiple complete images with different orientations obtained by the sub-image sensors together cover a wider depth of field range. The image processing device can therefore fuse these original images through the fusion refocusing algorithm to obtain a final refocused image with a wider depth of field range.
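One simple way to fuse two sub-images with different depth-of-field ranges into an extended-depth-of-field result is to keep, per pixel, the value from whichever image is locally sharper. The Laplacian-based focus measure below is a minimal sketch of that idea, not the patent's actual fusion refocusing algorithm:

```python
import numpy as np

def fuse_by_sharpness(img_a, img_b):
    """Per-pixel fusion of two same-size grayscale images.

    Uses the magnitude of a 4-neighbor Laplacian as a simple focus
    measure: where img_a is at least as sharp, its pixel is kept,
    otherwise img_b's pixel is kept.
    """
    def laplacian_mag(img):
        pad = np.pad(img.astype(float), 1, mode="edge")
        lap = (pad[:-2, 1:-1] + pad[2:, 1:-1] +
               pad[1:-1, :-2] + pad[1:-1, 2:] - 4.0 * pad[1:-1, 1:-1])
        return np.abs(lap)

    mask = laplacian_mag(img_a) >= laplacian_mag(img_b)
    return np.where(mask, img_a, img_b)
```

Regions where one sub-image is defocused (low Laplacian response) are filled from the other sub-image, mimicking how the common range of the two depth-of-field ranges yields an overall sharper result.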
- the distance between the lens unit and the image sensor refers to the distance between the optical center of the lens unit and the imaging surface of the image sensor in the direction of the main optical axis of the lens.
- the first depth range and the second depth range are realized by different distances between different lens units 111 and 112 and sub-image sensors 121 and 122. Since the specifications of the lens units 111 and 112 are the same, the manufacturing cost and processing difficulty of the lens array can be reduced.
- the respective sub image sensors 121 and 122 are independent of each other, and may be independently driven, for example.
- the image sensor 120 includes a plurality of sub image sensors 121 and 122 and a substrate 40, and the sub image sensors 121 and 122 are disposed on the substrate 40.
- For example, each sub-image sensor 121, 122 in the image sensor 120 can be arranged on the substrate 40 by transfer, thereby reducing manufacturing costs; for another example, each sub-image sensor 121, 122 can be provided on the substrate 40 by a surface mount process (SMT).
- the substrate 40 can be a glass substrate, a plastic substrate, a ceramic substrate, etc.
- For example, the image sensor is obtained using a glass substrate, which is compatible with the preparation process of a common display device (such as a liquid crystal display device or an organic light-emitting display device), so that manufacturing costs can be further reduced.
- circuit structures such as wires and contact pads (pads) are provided on the substrate 40, and the sub-image sensors 121 and 122 are bonded to the contact pads in various appropriate ways (such as soldering, conductive glue), etc., so as to realize the circuit connection.
- Alternatively, the image sensor 120 may be constructed as an integrated image sensor, with the sub-image sensors 121 and 122 being different areas of the overall image sensor. In this case, the light rays from the target imaging object passing through the different lens units 111, 112 are respectively focused on the corresponding different areas of the image sensor 120.
- Each lens sub-array of the lens array may be regularly arranged, for example, the arrangement of each lens sub-array is regular, for example, periodic.
- For example, all lens units are arranged in rows and columns to form a matrix, with the first lens units 111 and the second lens units 112 alternating at intervals by rows and columns;
- or all lens units are arranged in rows and columns to form a matrix, with the first lens units 111 and the second lens units 112 spaced apart from each other in both the row and column directions.
- the regularly arranged first lens sub-array and second lens sub-array help simplify the algorithm.
- Alternatively, the individual lens sub-arrays of the lens array can be randomly arranged; as shown in FIG. 4C, the first lens units 111 and the second lens units 112 are randomly arranged in the row and column directions, and adjacent lens units are not aligned with each other.
- Each lens sub-array of the lens array may also be arranged to form a specific mark.
- For example, the lens units of a first lens sub-array are arranged in the shape of the letter B,
- the lens units of a second lens sub-array are arranged in the shape of the letter O,
- and the lens units of a third lens sub-array are arranged in the shape of the letter E.
- Arrangement as a specific logo helps to improve aesthetics.
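The checkerboard-style assignment of lens units to the two sub-arrays, as in the matrix arrangements above, can be generated programmatically; a minimal sketch (the 1/2 labels for the two sub-arrays are an arbitrary convention chosen here):

```python
def checkerboard(rows, cols):
    """Assign lens units to two sub-arrays in a checkerboard pattern.

    Returns a rows x cols grid where 1 marks a first-sub-array lens
    unit and 2 marks a second-sub-array lens unit, alternating in
    both the row and column directions.
    """
    return [[1 if (r + c) % 2 == 0 else 2 for c in range(cols)]
            for r in range(rows)]
```

A random arrangement or a logo-shaped arrangement would simply replace the `(r + c) % 2` rule with a random draw or a bitmap mask of the desired letter.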
- The imaging device 100 may further include a color filter array 130, which includes a plurality of color filters that filter the light passing through each lens unit 111, 112 in the lens array 110.
- The color filters (such as red, green, and blue color filters) cause the sub-image sensor corresponding to each lens unit 111, 112 to generate a monochrome image, and the resulting monochrome images are retained for subsequent image processing.
- the color filter array 130 may be arranged in a Bayer array for each lens sub-array.
- the color filter array 130 may be arranged between the lens array 110 and the image sensor 120.
- Various color filters can be prepared from, for example, color resin materials.
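Arranging the per-lens color filters in a Bayer pattern, as mentioned above, amounts to a simple position-to-color mapping; the RGGB layout below is one common Bayer variant, chosen here purely for illustration:

```python
def bayer_color(row, col):
    """Color of the filter at (row, col) in an RGGB Bayer layout.

    Each (row, col) cell stands for one lens unit's filter within a
    lens sub-array: even rows alternate red/green, odd rows alternate
    green/blue, so green filters occupy half the positions.
    """
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"
```

The sub-image behind each lens unit is then monochrome in that filter's color, and a full-color result is reconstructed later in the image processing device.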
- FIG. 5 shows a schematic diagram of an imaging device 200 according to another embodiment of the present disclosure.
- the imaging device 200 includes a lens array 210 and an image sensor 220.
- the lens array 210 includes a plurality of lens units 211 and 212 which are arranged side by side with each other;
- the image sensor 220 includes a plurality of sub-image sensors 221 and 222 which are arranged side by side with each other.
- the lens array 210 includes a first lens sub-array and a second lens sub-array.
- the first lens sub-array includes a plurality of first lens units 211 (only one is shown in the figure), and the second lens sub-array includes a plurality of second lens units 212 (only one is shown in the figure).
- the distances from the plurality of first lens units 211 and the plurality of second lens units 212 to their corresponding sub-image sensors 221 and 222 are the same.
- the lens array 210 is an integrated lens array, and the individual lens units of the lens array are connected to each other, for example, prepared from the same resin film through a molding process, or from the same glass film through an etching process.
- the image sensor 220 includes a first sub image sensor array including a plurality of sub image sensors 221 and a second sub image sensor array including a plurality of sub image sensors 222.
- each lens unit 211, 212 corresponds one-to-one to a sub-image sensor 221, 222 to form complete images of the target imaging object from different orientations.
- each first lens unit 211 has a first focal length f1, so that the corresponding sub-image sensor 221 has a first depth of field range, and each second lens unit 212 has a second focal length f2 greater than the first focal length f1, so that the corresponding sub-image sensor 222 has a second depth of field range different from the first depth of field range, thereby expanding the depth of field range of the imaging device 200.
- the first focal length f1 of the first lens unit 211 and the second focal length f2 of the second lens unit 212 are configured such that the foreground depth point Q23 of the second lens unit 212, the foreground depth point Q21 of the first lens unit 211, the back depth point Q24 of the second lens unit 212, and the back depth point Q22 of the first lens unit 211 are located sequentially closer to infinity. Therefore, the combined range of the first depth of field range and the second depth of field range is the continuous range between the foreground depth point Q23 of the second lens unit 212 and the back depth point Q22 of the first lens unit 211, which expands the depth of field range of the imaging device 200.
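As a rough numerical illustration (not from the disclosure; all parameters are hypothetical), the near and far depth-of-field points of two lens units can be estimated with the standard hyperfocal approximation, and the continuity of the combined range checked:

```python
def dof_limits(f_mm, N, s_mm, coc_mm=0.005):
    """Approximate near/far depth-of-field limits for a lens with
    focal length f (mm), f-number N, focused at distance s (mm),
    using the common hyperfocal approximation H = f^2 / (N * c),
    where c is the circle of confusion."""
    H = f_mm * f_mm / (N * coc_mm)
    near = H * s_mm / (H + s_mm)
    far = H * s_mm / (H - s_mm) if s_mm < H else float("inf")
    return near, far

# hypothetical parameters: a shorter-focal-length first lens unit
# and a longer-focal-length second lens unit, focused so the ranges overlap
near1, far1 = dof_limits(f_mm=2.0, N=2.0, s_mm=400.0)  # first lens unit
near2, far2 = dof_limits(f_mm=3.0, N=2.0, s_mm=220.0)  # second lens unit
# ordering Q23 < Q21 < Q24 < Q22 corresponds to near2 < near1 < far2 < far1,
# so the combined range [near2, far1] is continuous
```

With these example numbers the first range reaches infinity while the second covers the nearer region, and because the ranges overlap the union is one continuous interval.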
- the imaging device 200 may further include a color filter array, which includes a plurality of color filters that filter the light passing through each lens unit in the lens array, such as a red color filter, a green color filter, and a blue color filter.
- FIG. 6 shows a schematic diagram of the imaging device 300 according to another embodiment of the present disclosure.
- the imaging device 300 includes a lens array 310 and an image sensor 320.
- the lens array 310 includes a plurality of lens units 311, 312, and 313, the plurality of lens units 311, 312, and 313 are arranged side by side;
- the image sensor 320 includes a plurality of sub image sensors 321, 322, 323, and the plurality of sub image sensors 321 , 322, 323 are arranged side by side with each other.
- the lens array 310 includes a first lens sub-array, a second lens sub-array, and further includes a third lens sub-array.
- the image sensor 320 includes a first sub-image sensor array, a second sub-image sensor array, and a third sub-image sensor array, where the first sub-image sensor array includes a plurality of first sub-image sensors 321, the second sub-image sensor array includes a plurality of second sub-image sensors 322, and the third sub-image sensor array includes a plurality of third sub-image sensors 323.
- each lens unit 311, 312, 313 corresponds one-to-one to a sub-image sensor 321, 322, 323 to form complete images of the target imaging object from different orientations.
- the first lens sub-array includes a plurality of first lens units 311 (only one is shown in the figure), each of the plurality of first lens units 311 has a foreground depth point Q31 and a back depth point Q32, and the back depth point Q32 is at infinity
- the second lens sub-array includes a plurality of second lens units 312 (only one is shown in the figure), each of which has a foreground depth point Q33 and a back depth point Q34; the third lens sub-array includes a plurality of third lens units 313 (only one is shown in the figure), each of which has a foreground depth point Q35 and a back depth point Q36.
- the first lens unit 311, the second lens unit 312, and the third lens unit 313 and their respective corresponding sub-image sensors 321, 322, 323 are configured to have a first depth of field range, a second depth of field range, and a third depth of field range, respectively, thereby expanding the depth of field range of the imaging device.
- the foreground depth point Q31 of the first lens units 311 in the first lens sub-array is located in front of the back depth point Q34 of the second lens units 312 in the second lens sub-array
- the foreground depth point Q33 of the second lens units 312 in the second lens sub-array is located in front of the back depth point Q36 of the third lens units 313 in the third lens sub-array. Therefore, the combined range of the first depth of field range, the second depth of field range, and the third depth of field range is larger than any one or two of them, which expands the overall depth of field range of the imaging device.
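The continuity condition above (each range's near point lying in front of the next range's far point) can be sketched as an interval-merging check; the helper below is illustrative and not part of the disclosure:

```python
def combined_depth_range(ranges):
    """Merge the depth-of-field ranges of several lens sub-arrays,
    each given as a (near, far) pair. Return the single continuous
    (near, far) interval if every adjacent pair of sorted ranges
    overlaps, otherwise None (the union has a gap)."""
    ranges = sorted(ranges)              # sort by near point
    near, far = ranges[0]
    for lo, hi in ranges[1:]:
        if lo > far:                     # gap: union is not continuous
            return None
        far = max(far, hi)
    return near, far
```

Applied to three hypothetical ranges resembling Q35–Q36, Q33–Q34, and Q31–Q32 (with Q32 at infinity), the merged result spans from the nearest foreground point to infinity.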
- the first lens unit 311, the second lens unit 312, and the third lens unit 313 may have different focal lengths. Additionally or alternatively, they may be at different distances from the corresponding sub-image sensors 321, 322, and 323.
- the imaging device 300 may further include a color filter array, which includes a plurality of color filters that filter the light passing through each lens unit in the lens array, such as a red color filter, a green color filter, and a blue color filter.
- the various embodiments of the present disclosure do not limit the number of lens sub-arrays that, together with the image sensor, have different depth of field ranges in the imaging device; it can be 2, 3, or more than 3.
- a lens sub-array can be added as needed to further expand the depth of field range of the imaging device.
- the depth of field ranges of the multiple lens sub-arrays overlap pairwise. In order to minimize the number of different lens sub-arrays and to increase the coverage and efficiency of each lens sub-array for the same overall depth of field range, the overlap should not be too large.
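The trade-off just described, covering a target depth of field with as few sub-arrays as possible while keeping overlaps small, can be sketched as a greedy interval-covering problem (hypothetical helper, not part of the disclosure):

```python
def min_sub_arrays(candidates, target):
    """Greedy interval covering: choose the fewest candidate depth
    ranges (near, far) whose union continuously covers `target`.
    At each step, among ranges that overlap the covered region,
    pick the one reaching farthest."""
    t_near, t_far = target
    chosen, reach = [], t_near
    while reach < t_far:
        usable = [c for c in candidates if c[0] <= reach and c[1] > reach]
        if not usable:
            return None                  # an uncoverable gap remains
        best = max(usable, key=lambda c: c[1])
        chosen.append(best)
        reach = best[1]
    return chosen
```

The greedy choice naturally produces overlaps no larger than the candidates force, matching the design goal of small but nonzero overlap between adjacent ranges.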
- the lens array of the imaging device includes a first lens sub-array and a second lens sub-array
- the first lens sub-array includes a plurality of first lens units
- the second lens sub-array includes a plurality of second lens units.
- the first lens unit has a first distance from the image sensor and a first focal length
- the second lens unit has a second distance from the image sensor and a second focal length.
- the first distance is different from the second distance and/or the first focal length is different from the second focal length, so that the combined range of the first depth of field range and the second depth of field range is greater than either the first depth of field range or the second depth of field range alone, thereby expanding the depth of field range of the imaging device.
- since the first lens sub-array and the second lens sub-array are provided in the spatial domain, they can be used for imaging at the same time. Compared with forming the first lens sub-array and the second lens sub-array in different time periods, no additional imaging time is required.
- the lens units in the lens array have a fixed focal length and a fixed position. Therefore, there is no need to provide a special mechanism to adjust the distance between the lens unit and the image sensor, or to configure the lens unit with an adjustable focal length, to obtain an extended depth of field. Such a lens unit and imaging device have a low cost.
- the imaging device may provide the image sensor with a first lens sub-array and a second lens sub-array in the time domain. That is, the imaging device can adjust the lens unit in the lens array so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
- the first lens sub-array and the second lens sub-array are formed separately in different time periods, which is beneficial to save space.
- FIG. 7 shows a schematic diagram of an imaging device 400 according to another embodiment of the present disclosure.
- the imaging device 400 includes lens arrays 410, 410' and an image sensor 420.
- the lens array 410, 410' includes a plurality of liquid crystal lens units 411, 411'.
- the image sensor 420 includes a plurality of sub image sensors 421.
- the plurality of liquid crystal lens units 411, 411' correspond to the plurality of sub-image sensors 421 one-to-one.
- the focal lengths of the liquid crystal lens units 411, 411' are configured to be adjustable.
- each of the liquid crystal lens units 411, 411' includes a liquid crystal layer sandwiched between two transparent substrates and a plurality of liquid crystal control electrodes. The liquid crystal control electrodes may all be located on the same side of the liquid crystal layer (on one substrate), or partly on one side (on one substrate) and partly on the other side (on the other substrate). When a predetermined voltage combination (driving voltage) is applied to these liquid crystal control electrodes, the resulting electric field with a predetermined distribution drives the deflection of the liquid crystal molecules in the liquid crystal layer and changes the refractive index of the liquid crystal layer at different positions, so that the liquid crystal layer has a lens effect and modulates the light passing through it.
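As a rough, hedged illustration of how the voltage-induced refractive-index profile translates into lens power: for a gradient-index liquid crystal lens, a common textbook estimate of the focal length is f ≈ r²/(2·d·Δn), where r is the aperture radius, d the liquid crystal layer thickness, and Δn the center-to-edge refractive-index difference produced by the driving voltages. All numbers below are hypothetical:

```python
def lc_lens_focal_length(radius_mm, thickness_mm, delta_n):
    """Approximate focal length of a gradient-index liquid crystal
    lens: f ~ r^2 / (2 * d * delta_n), where delta_n is the
    refractive-index difference between the lens center and edge
    set by the driving-voltage distribution."""
    return radius_mm ** 2 / (2.0 * thickness_mm * delta_n)

# hypothetical dimensions: 0.5 mm aperture radius, 50 um LC layer
f_strong = lc_lens_focal_length(0.5, 0.05, 0.20)  # stronger modulation
f_weak   = lc_lens_focal_length(0.5, 0.05, 0.10)  # weaker modulation
```

A weaker index modulation (smaller Δn, e.g. a different driving voltage) yields a longer focal length, which is the mechanism the embodiment uses to form the two lens sub-arrays in time.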
- although the liquid crystal lens is illustrated as a biconvex lens in FIG. 7, it should be understood that the liquid crystal lens unit itself is not shaped as a biconvex lens.
- the focal lengths of the liquid crystal lens units 411 and 411' are adjusted by applying different driving voltages to the liquid crystal lens units 411 and 411'.
- the lens array forms a first lens sub-array (shown by the solid lines in FIG. 7) and a second lens sub-array (shown by the dotted lines in FIG. 7) in different time periods.
- the first lens units 411 of the formed first lens sub-array may have a first focal length and correspond one-to-one to the plurality of sub-image sensors 421 of the image sensor 420, so as to have a first depth of field range, wherein the foreground depth point of the first lens unit 411 is Q41 and the back depth point is Q42.
- the second lens units 411' of the formed second lens sub-array may have a second focal length different from the first focal length and likewise correspond one-to-one to the plurality of sub-image sensors 421 of the image sensor 420, so as to have a second depth of field range, wherein the foreground depth point of the second lens unit 411' is Q43 and the back depth point is Q44.
- the foreground depth point Q43 of the second lens unit 411', the foreground depth point Q41 of the first lens unit 411, the back depth point Q44 of the second lens unit 411', and the back depth point Q42 of the first lens unit 411 are located sequentially closer to infinity, so that the overall depth of field range of the imaging device 400 is expanded.
- the target imaging object can be imaged once at a first time point to obtain multiple complete images of the target imaging object from different orientations. Immediately afterwards, the target imaging object can be imaged again at a second time point to obtain another set of complete images from different orientations. Since each lens unit 411, 411' of the lens array and its corresponding sub-image sensor 421 have different depth of field ranges at the first and second time points, the two sets of complete images have different depth of field ranges. The image processing device can then merge the original images obtained by the two exposures through a fusion refocusing algorithm, so as to obtain the desired final refocused image with a wider overall depth of field.
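A minimal sketch of the fusion idea (greatly simplified and not the disclosure's algorithm; real fusion refocusing is considerably more involved): for each pixel, keep the value from whichever of the two captures is locally sharper, measured here by the absolute Laplacian response:

```python
def fuse_refocus(img_a, img_b):
    """Naive depth-of-field fusion of two grayscale images (lists
    of lists, same size): for each interior pixel, keep the value
    from whichever capture has the larger local Laplacian magnitude,
    i.e. is locally more in focus. Border pixels come from img_a."""
    def laplacian(img, r, c):
        return abs(4 * img[r][c] - img[r - 1][c] - img[r + 1][c]
                   - img[r][c - 1] - img[r][c + 1])
    rows, cols = len(img_a), len(img_a[0])
    out = [row[:] for row in img_a]
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            if laplacian(img_b, r, c) > laplacian(img_a, r, c):
                out[r][c] = img_b[r][c]
    return out
```

The sharper capture at each location corresponds to the lens configuration whose depth of field covers that scene depth, so the fused result appears in focus across the combined range.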
- the imaging device 400 may further include a color filter array, which includes a plurality of color filters that filter the light passing through each lens unit in the lens array, such as a red color filter, a green color filter, and a blue color filter.
- the color filter array may be arranged between the lens array and the image sensor.
- FIG. 8 shows a schematic diagram of an imaging device 500 according to another embodiment of the present disclosure.
- the imaging device 500 includes lens arrays 510, 510' and an image sensor 520.
- the lens array 510, 510' includes a plurality of identical lens units 511, 511'.
- the image sensor 520 includes a plurality of sub image sensors 521.
- the plurality of lens units 511, 511' correspond to the plurality of sub-image sensors 521 one to one.
- the imaging device 500 further includes a position adjustment mechanism 540, which is used to adjust the distance between each lens unit 511, 511' and the corresponding sub-image sensor 521. The position adjustment mechanism 540 can be realized in various appropriate forms, such as a mechanical form (for example, a motor and gears) or a magnetic form (for example, a piece of magnetically induced strain material). That is, the distance between the lens units 511, 511' and the sub-image sensors 521 is variable, so that the lens arrays 510, 510' can form the first lens sub-array (shown by the solid lines in FIG. 8) and the second lens sub-array (shown by the dotted lines in FIG. 8) in different time periods.
- the first lens units 511 of the lens array 510 may be at a first distance d3 from the corresponding sub-image sensors 521, thereby having a first depth of field range, wherein the foreground depth point of the first lens unit 511 is Q51 and the back depth point is Q52.
- the second lens units 511' of the lens array 510' may be at a second distance d4, different from the first distance d3, from the corresponding sub-image sensors 521, thereby having a second depth of field range, wherein the foreground depth point of the second lens unit 511' is Q53 and the back depth point is Q54.
- the first distance d3 and the second distance d4 are set appropriately so that the foreground depth point Q53 of the second lens unit 511', the foreground depth point Q51 of the first lens unit 511, the back depth point Q54 of the second lens unit 511', and the back depth point Q52 of the first lens unit 511 are located sequentially closer to infinity, so that the overall depth of field range of the imaging device 500 is expanded.
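The relationship between the lens-to-sensor distances d3, d4 and the in-focus object distance can be illustrated with the thin-lens equation 1/f = 1/u + 1/v (hypothetical numbers; the disclosure does not specify values):

```python
def in_focus_distance(f_mm, sensor_dist_mm):
    """Thin-lens relation 1/f = 1/u + 1/v: return the object
    distance u that is imaged sharply when the lens sits at
    distance v from the sensor. Requires v > f."""
    assert sensor_dist_mm > f_mm
    return 1.0 / (1.0 / f_mm - 1.0 / sensor_dist_mm)

# hypothetical: the same 4 mm lens at two sensor distances d3 < d4
u_far  = in_focus_distance(4.0, 4.05)   # smaller distance: focuses far away
u_near = in_focus_distance(4.0, 4.40)   # larger distance: focuses closer
```

Moving the lens slightly farther from the sensor shifts the in-focus plane much closer, which is why a small travel of the position adjustment mechanism suffices to produce two distinct depth of field ranges.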
- the imaging device 500 may further include a color filter array, which includes a plurality of color filters that filter the light passing through each lens unit in the lens array, such as a red color filter, a green color filter, and a blue color filter.
- Some embodiments of the present disclosure also provide an electronic device, which includes the imaging device of any embodiment of the present disclosure, for example, any imaging device described above.
- the electronic device may be a mobile phone, a tablet computer, a notebook computer, etc., which are not limited in the embodiments of the present disclosure.
- FIG. 9 shows a perspective schematic diagram of an exemplary electronic device according to an embodiment of the present disclosure, which includes any of the imaging devices 100, 200, 300, 400, etc. described above.
- the electronic device is, for example, a mobile phone, and the imaging devices 100, 200, 300, 400 are, for example, arranged on the back of the mobile phone (that is, the side facing away from the user) as a rear camera.
- the imaging devices 100, 200, 300, 400 may be made as large as possible to obtain complete images of the target imaging object from more orientations and over a larger range.
- the electronic device includes a display panel (not shown), a housing 50, an imaging device, and an image processing device (not shown).
- the display surface of the display panel faces the user, that is, on the front of the electronic device; the imaging device is arranged on the opposite side of the display surface.
- the imaging device includes a lens array and an image sensor.
- the lens array is arranged in the housing 50 of the electronic device, opposite to the display panel, and exposed to the outside of the housing 50 at the back of the electronic device, and the image sensor is arranged inside the housing 50 and adjacent to the lens array.
- the user can image the target object through the imaging device, and then execute the fusion refocusing algorithm through the image processing device of the electronic device to form the final image on the display panel; the image processing device is, for example, an image processor or an image processing chip.
- the lens unit may be embedded in the housing 50 through a base (not shown), and the above-mentioned first distance and second distance may be determined by determining the position of the base.
- FIG. 10A shows a perspective schematic diagram of an exemplary electronic device according to another embodiment of the present disclosure, which includes an imaging device 500.
- the electronic device is, for example, a mobile phone, and the imaging device 500 is, for example, disposed on the back of the mobile phone (that is, the side facing away from the user), of course, the implementation of the present disclosure does not limit this.
- the electronic device includes a display panel (not shown), a housing 550, an imaging device 500, and an image processing device (not shown).
- the display surface of the display panel faces the user, that is, on the front of the electronic device; the imaging device 500 is arranged on the opposite side of the display surface.
- the imaging device 500 includes a lens array 510 and an image sensor 520.
- FIG. 10B shows a schematic diagram of the lens array of the imaging device in FIG. 10A.
- the lens array 510 may be a liquid crystal lens array plate having a laminated structure.
- the liquid crystal lens array board may be embedded in the housing 550 and arranged opposite to the display panel.
- the liquid crystal lens array panel includes a first substrate 501, a second substrate 502, a first electrode layer 503 disposed on the side of the first substrate 501 facing the second substrate 502, a second electrode layer 504 disposed on the side of the second substrate 502 facing the first substrate 501, and a liquid crystal layer 505 between the first electrode layer 503 and the second electrode layer 504.
- the first electrode layer 503 and the second electrode layer 504 may be, for example, ITO transparent electrodes.
- the liquid crystal lens array panel may be divided into a plurality of liquid crystal lens units, for example, into a plurality of first liquid crystal lens units 511 and a plurality of second liquid crystal lens units 512.
- the first electrode layer 503 may be an integral common electrode
- the second electrode layer may include a plurality of second sub-electrodes 5041 that independently control each liquid crystal lens unit 511, 512. That is, each liquid crystal lens unit 511, 512 includes the common first substrate 501, the common first electrode layer 503, an independent second sub-electrode 5041, the common liquid crystal layer 505, and the common second substrate 502.
Claims (14)
- An imaging device, comprising a lens array and an image sensor, wherein the lens array includes a plurality of lens units and is configured to provide at least a first lens sub-array and a second lens sub-array; the first lens sub-array includes first lens units of the plurality of lens units, the first lens units and the image sensor being configured to have a first depth of field range; the second lens sub-array includes second lens units of the plurality of lens units, the second lens units and the image sensor being configured to have a second depth of field range; the first depth of field range partially overlaps the second depth of field range, and the combined range of the first depth of field range and the second depth of field range is greater than the first depth of field range and the second depth of field range.
- The imaging device according to claim 1, wherein there is a first distance between the first lens units of the first lens sub-array and the image sensor, and a second distance, different from the first distance, between the second lens units of the second lens sub-array and the image sensor.
- The imaging device according to claim 1 or 2, wherein the first lens units of the first lens sub-array have a first focal length, and the second lens units of the second lens sub-array have a second focal length different from the first focal length.
- The imaging device according to any one of claims 1-3, wherein the lens units are convex lenses.
- The imaging device according to any one of claims 1-4, wherein the first lens sub-array and the second lens sub-array are regularly arranged.
- The imaging device according to claim 1, wherein the plurality of lens units of the lens array are configured such that the focal length of each of the plurality of lens units is adjustable, or the distance between each of the plurality of lens units and the image sensor is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
- The imaging device according to claim 1, wherein the lens units are a plurality of liquid crystal lens units configured such that the focal length of each of the plurality of liquid crystal lens units is adjustable, so that the lens array forms the first lens sub-array and the second lens sub-array in different time periods.
- The imaging device according to any one of claims 1-7, further comprising a color filter array, wherein the color filter array includes a plurality of color filters, each of the plurality of color filters corresponding to one lens unit of the plurality of lens units to filter the light transmitted through that lens unit.
- The imaging device according to any one of claims 1-7, wherein the back depth point of the first depth of field range is at infinity, and the back depth point of the second depth of field range is not at infinity.
- The imaging device according to any one of claims 1-7, wherein the overlapping range of the first depth of field range and the second depth of field range is 0-100 mm.
- The imaging device according to any one of claims 1-7, wherein the image sensor includes a plurality of sub-image sensors in one-to-one correspondence with the plurality of lens units, such that each sub-image sensor is configured to receive the incident light of one lens unit for imaging.
- The imaging device according to any one of claims 1-11, wherein the lens array is configured to further provide a third lens sub-array, the third lens sub-array includes third lens units of the plurality of lens units, the third lens units and the image sensor are configured to have a third depth of field range, the third depth of field range at least partially overlaps one of the first depth of field range and the second depth of field range, and the combined range of the first depth of field range, the second depth of field range, and the third depth of field range is greater than the range of any one or two of the first depth of field range, the second depth of field range, and the third depth of field range.
- An electronic device, comprising the imaging device according to any one of claims 1-12.
- The electronic device according to claim 13, wherein the electronic device includes a housing and a display panel, the lens array is arranged in the housing opposite the display panel with a part of the lens array exposed to the outside of the housing, and the image sensor is arranged inside the housing.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/767,327 US20210203821A1 (en) | 2019-03-11 | 2019-12-16 | Imaging device and electronic apparatus |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910180440.2A CN111698348B (zh) | 2019-03-11 | 2019-03-11 | 成像装置和电子设备 |
CN201910180440.2 | 2019-03-11 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020181869A1 true WO2020181869A1 (zh) | 2020-09-17 |
Family
ID=72426155
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/125631 WO2020181869A1 (zh) | 2019-03-11 | 2019-12-16 | 成像装置和电子设备 |
Country Status (3)
Country | Link |
---|---|
US (1) | US20210203821A1 (zh) |
CN (1) | CN111698348B (zh) |
WO (1) | WO2020181869A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB2618466A (en) * | 2021-02-20 | 2023-11-08 | Boe Technology Group Co Ltd | Image acquisition device, image acquisition apparatus, image acquisition method and manufacturing method |
CN115524857B (zh) * | 2022-11-24 | 2023-03-31 | 锐驰智光(北京)科技有限公司 | 光学系统及具有此的激光雷达 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006251613A (ja) * | 2005-03-14 | 2006-09-21 | Citizen Watch Co Ltd | 撮像レンズ装置 |
US7830443B2 (en) * | 2004-12-21 | 2010-11-09 | Psion Teklogix Systems Inc. | Dual mode image engine |
CN102131044A (zh) * | 2010-01-20 | 2011-07-20 | 鸿富锦精密工业(深圳)有限公司 | 相机模组 |
CN104410784A (zh) * | 2014-11-06 | 2015-03-11 | 北京智谷技术服务有限公司 | 光场采集控制方法和装置 |
CN104717482A (zh) * | 2015-03-12 | 2015-06-17 | 天津大学 | 多光谱多景深阵列拍摄方法与拍摄相机 |
CN105827922A (zh) * | 2016-05-25 | 2016-08-03 | 京东方科技集团股份有限公司 | 一种摄像装置及其拍摄方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1983140A (zh) * | 2005-12-13 | 2007-06-20 | 邓仕林 | 笔形光鼠标的多透镜光路光学成像装置 |
US8478123B2 (en) * | 2011-01-25 | 2013-07-02 | Aptina Imaging Corporation | Imaging devices having arrays of image sensors and lenses with multiple aperture sizes |
CN106027861B (zh) * | 2016-05-23 | 2019-06-04 | 西北工业大学 | 基于微相机阵列的光场采集装置及数据处理方法 |
2019
- 2019-03-11 CN CN201910180440.2A patent/CN111698348B/zh active Active
- 2019-12-16 WO PCT/CN2019/125631 patent/WO2020181869A1/zh active Application Filing
- 2019-12-16 US US16/767,327 patent/US20210203821A1/en not_active Abandoned
Also Published As
Publication number | Publication date |
---|---|
CN111698348A (zh) | 2020-09-22 |
US20210203821A1 (en) | 2021-07-01 |
CN111698348B (zh) | 2021-11-09 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19919317 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 19919317 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/05/2022) |
|