WO2020114144A1 - Camera module, periscope camera module thereof, image acquisition method and working method - Google Patents
Camera module, periscope camera module thereof, image acquisition method and working method
- Publication number
- WO2020114144A1 (PCT/CN2019/113351)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- light
- camera module
- photosensitive
- photosensitive element
- processing unit
- Prior art date
Classifications
- H04N23/95 - Computational photography systems, e.g. light-field imaging systems
- H04N23/958 - Computational photography systems for extended depth of field imaging
- G03B33/12 - Colour photography: simultaneous recording or projection using beam-splitting or beam-combining systems, e.g. dichroic mirrors
- G02B27/10 - Beam splitting or combining systems
- H04N23/55 - Optical parts specially adapted for electronic image sensors; mounting thereof
- H04N23/60 - Control of cameras or camera modules
- H04N23/67 - Focus control based on electronic image sensor signals
Definitions
- the invention relates to the field of camera modules, and in particular to a camera module, a periscope camera module thereof, an image acquisition method, and a working method.
- the dual camera modules gradually appearing on the market include a main camera module and a secondary camera module, wherein the main camera module and the secondary camera module can each obtain an image, and the images obtained by the main camera module and the secondary camera module can be synthesized to obtain an image with higher resolution, which has made the dual camera module popular with a large number of consumers.
- the existing dual-camera modules include black-and-white dual-camera modules, color dual-camera modules, and wide-angle telephoto camera modules.
- the wide-angle telephoto camera module is one of the most widely used dual-camera modules.
- the wide-angle telephoto camera module includes a wide-angle camera module with a large field of view and a telephoto camera module with a narrow field of view.
- the images acquired by the two are fused to obtain the final image, which can meet the needs of both wide-angle shooting and telephoto shooting.
- however, the height of such a dual camera module is relatively large and its volume is relatively bulky, which is not in line with the current trend of mobile electronic devices becoming thinner and thinner.
- the telephoto camera module is therefore designed as a periscope camera module, wherein an optical element changes the light path direction of the beam entering the periscope camera module, so that the beam is turned from a first direction to a second direction perpendicular to the first direction and then passes through a lens module and a color filter before reaching a photosensitive chip; in this way, the height of the telephoto camera module can be reduced while the telephoto shooting effect is maintained, so that the dual camera module can be applied to thin and light mobile electronic devices.
- however, the existing periscope camera module also has many problems.
- because the existing periscope camera module has a long focal length, the depth of field of the images that the periscope camera module can acquire is relatively shallow.
- in addition, the existing periscope camera module requires high precision of the driving component.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein at least one periscope camera module of the camera module acquires the image of the photographed object by receiving multiple beams of monochromatic light, which helps to increase the depth of field.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module obtains images of monochromatic light of different colors, combines the clear images of the monochromatic light into the final image of the periscope camera module, and obtains the final image by superimposing the clear images of monochromatic light of different colors, thereby enhancing the depth of field of the periscope camera module.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the images of monochromatic light of different colors formed on a photosensitive surface of a photosensitive element of the periscope camera module correspond to different clear object plane positions of the photographed object; that is, the final image of the periscope camera module is obtained by synthesizing images of different clear object plane positions, which further improves the depth of field of the periscope camera module.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module provides a light processing unit, and a composite light from the photographed object is dispersed into multiple beams of monochromatic light of different colors by the light processing unit, so that the spread of the different clear object plane positions corresponding to the monochromatic lights of different colors is increased, which facilitates obtaining images corresponding to different clear object planes; the images corresponding to different clear object planes are subsequently superimposed and synthesized by an algorithm to increase the depth of field of the periscope camera module.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein each pixel of the photosensitive element can correspondingly receive monochromatic light of different colors and separately obtain clear images of the monochromatic light of each color.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module respectively acquires clear images of at least one red light, at least one green light, and at least one blue light among the multiple beams of monochromatic light, and subsequently synthesizes the images of the different clear object plane positions corresponding to the red light, the green light, and the blue light according to an algorithm; by superimposing the clear images of monochromatic light of different colors, an increased depth of field is achieved.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein each pixel of the photosensitive element can correspondingly receive light of the three colors red, green, and blue, and separately obtain the images formed by the red light, the green light, and the blue light.
- An object of the present invention is to provide a camera module, a periscope camera module, an image acquisition method, and a working method, wherein the camera module can obtain high-quality images.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the image obtained by at least one periscope camera module of the camera module carries depth information.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module can determine the position of the actual scene and obtain the outline of the actual scene.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module can determine the depth information of the actual scene by receiving the monochromatic light and obtain the outline of the actual scene.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module can determine the position of the actual scene by receiving the monochromatic light and obtain the outline of the actual scene.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module can receive monochromatic light of the three colors red, green, and blue, analyze the position of the actual scene according to the images of the monochromatic light of the three colors, and obtain the outline of the actual scene.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein each pixel of the photosensitive element can correspondingly receive light of the three colors red, green, and blue, so that the position of the actual scene can be analyzed according to the difference in sharpness of the three colors of light and the outline of the actual scene can then be obtained.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module can obtain the focus parameters of the current area according to the contour of the actual scene, so that the scene appearing in the image is kept within the depth of field of a lens unit of the periscope camera module, thereby improving the focusing efficiency of the periscope camera module.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module can obtain the focus parameters of the current area according to the contour of the actual scene, so that the scene appearing in the image is kept at the focal length of a lens unit of the periscope camera module, thereby reducing the precision requirements on the focus driving member that drives the lens unit to complete focusing.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the camera module provides at least one main camera module, and the main camera module and the periscope camera module cooperate to obtain high-quality images.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module assists the main camera module in completing focusing, so as to improve the focusing efficiency of the main camera module.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module further includes a cylindrical mirror, which can enlarge the degree of dispersion of the light passing through the light processing unit and reduce the overlap between the red light, the green light, and the blue light, so that the photosensitive element can obtain better images formed by the monochromatic light.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the periscope camera module further includes a free-form mirror, which can enlarge the degree of dispersion of the light passing through the light processing unit and reduce the overlap between the red light, the green light, and the blue light, so that the photosensitive element can obtain better images formed by the monochromatic light.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the light processing unit of the periscope camera module can be driven to rotate, so that the monochromatic light formed by the composite light passing through the light processing unit can all be received by the photosensitive element, thereby ensuring the imaging quality of the periscope camera module.
- Another object of the present invention is to provide a camera module, a periscope camera module thereof, an image acquisition method, and a working method, wherein the light processing unit of the periscope camera module can be driven to rotate, so that the photosensitive element can receive all of the red light, the green light, and the blue light without enlarging the area of the photosensitive element; in this way, the imaging quality of the periscope camera module is guaranteed while a thinner periscope camera module is also facilitated.
- the present invention further provides an image acquisition method of a periscope camera module, the image acquisition method includes the following steps:
- the step (b) further includes a step (d): the photosensitive element correspondingly receives at least one red light, at least one green light, and at least one blue light among the plurality of monochromatic lights.
- a red pixel unit, a green pixel unit, and a blue pixel unit of the photosensitive surface of the photosensitive element respectively receive the red light, the green light, and the blue light.
- the red pixel unit, the green pixel unit, and the blue pixel unit are arranged from top to bottom in the order of R-RB-RGB-RG-G so as to correspondingly receive the red light, the green light, and the blue light.
- before the step (b), the method further includes a step (e): condensing the monochromatic lights of different colors through a lens unit.
- a step (f) is further included: enlarging the distance between the monochromatic lights of different colors by an auxiliary element.
- before the step (b), the method further includes a step (g): filtering stray light through a color filter element.
- before the step (b), the method further includes a step (h): rotating the light processing unit.
- the present invention further provides a working method of a camera module.
- the working method includes the following steps:
- a light processing unit of at least one periscope camera module turns a composite light from a photographed object while dispersing the composite light to form multiple monochromatic lights;
- the periscope camera module correspondingly receives at least one red light, at least one green light, and at least one blue light among the multiple beams of monochromatic light through the photosensitive surface of the photosensitive element, thereby obtaining a first image of the photographed object.
- the step (c) further includes the following steps:
- the method further includes a step (c.4): acquiring the outline of the object according to the position of the object.
- the step (d) further includes the following steps:
- the periscope camera module determines a focus parameter according to the contour of the object being photographed.
- the main camera module completes focusing according to the focusing parameters.
- the main camera module obtains a second image about the photographed object.
- the first image and the second image are synthesized to obtain a final image of the camera module.
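- for illustration only, the following sketch outlines this working method as a simple control flow in Python; the object and method names (disperse_and_capture, fuse_channels, estimate_outline, focus_parameters, focus, capture, fuse) are hypothetical placeholders and are not defined by the patent.

```python
# High-level flow sketch of the working method described above (all names are
# placeholders introduced for illustration; they are not part of the patent).
import numpy as np

def camera_module_working_method(periscope, main_camera, processor) -> np.ndarray:
    channel_images = periscope.disperse_and_capture()       # dispersion + per-colour capture
    first_image = processor.fuse_channels(channel_images)   # first image (reference 101)
    outline = processor.estimate_outline(channel_images)    # step (c): scene position/outline
    focus_params = processor.focus_parameters(outline)      # step (d): focus parameters
    main_camera.focus(focus_params)                         # main camera completes focusing
    second_image = main_camera.capture()                    # second image (reference 102)
    return processor.fuse(first_image, second_image)        # final image (reference 103)
```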
- the present invention further provides a periscope camera module, which includes:
- a light processing unit, wherein a composite light passing through the light processing unit is dispersed to form multiple beams of monochromatic light;
- a lens unit, wherein the monochromatic light can be focused by the lens unit; and
- a photosensitive element, wherein the light processing unit and the lens unit are held in the photosensitive path of the photosensitive element, the lens unit is disposed between the light processing unit and the photosensitive element, and the monochromatic light, after passing through the lens unit, is received by a photosensitive surface of the photosensitive element.
- the photosensitive surface of the photosensitive element correspondingly receives at least one red light, at least one green light, and at least one blue light among the multiple beams of the monochromatic light.
- the periscope camera module further includes a color filter element, wherein the color filter element is held in the photosensitive path of the photosensitive element, and light passes through the color filter element before reaching the photosensitive element; the color filter element only allows the red light, the green light, and the blue light to pass through.
- the color filter element is disposed between the lens unit and the photosensitive element.
- the periscope camera module further includes an auxiliary element, wherein the auxiliary element is disposed between the light processing unit and the lens unit, and the auxiliary element is held in the photosensitive path of the photosensitive element.
- the auxiliary element is a cylindrical mirror.
- the auxiliary element is a freeform mirror.
- the periscope camera module further includes a driving element, wherein the driving element is disposed at the light processing unit, and the light processing unit is drivably connected to the driving element so as to be rotated.
- the photosensitive element includes multiple pixels, wherein each pixel includes multiple pixel units, and the pixel units are selected from one or more combinations of the pixel unit types consisting of a red pixel unit, a green pixel unit, and a blue pixel unit.
- the pixel units are arranged from top to bottom in the order of R-RB-RGB-RG-G.
- the light processing unit is a prism.
- the present invention further provides a camera module, which includes:
- at least one periscope camera module, wherein the periscope camera module includes a light processing unit, a lens unit, and a photosensitive element, wherein a composite light passing through the light processing unit is dispersed to form multiple beams of monochromatic light, the monochromatic light can be focused by the lens unit, the light processing unit and the lens unit are held in the photosensitive path of the photosensitive element, the lens unit is disposed between the light processing unit and the photosensitive element, and the monochromatic light is received by a photosensitive surface of the photosensitive element after passing through the lens unit, whereby a first image of the photographed object is obtained; and
- at least one main camera module, wherein the main camera module obtains a second image of the photographed object, and the first image and the second image are processed to form a final image.
- FIG. 1 is a schematic perspective view of a periscope camera module according to a preferred embodiment of the present invention.
- FIG. 2A is a schematic diagram of the optical path of the periscope camera module according to the above preferred embodiment of the present invention.
- FIG. 2B is a schematic diagram of the pixels of a photosensitive chip of the periscope camera module according to the above preferred embodiment of the present invention.
- FIG. 3A is a schematic diagram of the optical path of the periscope camera module according to the above preferred embodiment of the present invention.
- FIG. 3B is a partially enlarged view of a schematic diagram of the optical path of the periscope camera module according to the above preferred embodiment of the present invention.
- FIG. 4 is a schematic diagram of the optical path of the periscope camera module according to another preferred embodiment of the present invention.
- FIG. 5 is a schematic diagram of the optical path of the periscope camera module according to another preferred embodiment of the present invention.
- FIG. 6 is a schematic diagram of a camera module applied to a mobile electronic device according to a preferred embodiment of the present invention.
- FIG. 7 is a schematic diagram of a scene in which a mobile electronic device according to the above preferred embodiment of the present invention acquires an image through the camera module.
- the term "a" should be understood as "at least one" or "one or more"; that is, in one embodiment the number of an element can be one, while in other embodiments the quantity can be more than one, and the term "a" should not be understood as a limitation on the quantity.
- a periscope camera module 100 obtains the images of a plurality of monochromatic lights 300 of different colors by receiving multiple beams of monochromatic light 300, wherein the monochromatic lights 300 of different colors correspond to different clear object plane positions of the photographed object; specifically, the image of each monochromatic light 300 corresponds to one clear object plane position of the photographed object, and subsequently the images of the monochromatic lights 300 of different colors are superimposed on each other through an algorithm to synthesize the final image of the photographed object. In this way, the depth of field of the periscope camera module 100 is increased.
- the periscope camera module obtains a clear image of each monochromatic light 300, and the depth of field of the periscope camera module 100 is increased by superimposing the clear images of the monochromatic lights 300 on each other, while high-quality images are obtained.
- the periscope camera module 100 includes a light processing unit 10 and a photosensitive element 20, wherein the light processing unit 10 is held in the photosensitive path of the photosensitive element 20, and the light entering the periscope camera module 100, after passing through the light processing unit 10, can reach the photosensitive element 20 and be imaged on a photosensitive surface 210 of the photosensitive element 20.
- a composite light 400 from the photographed object is dispersed by the light processing unit 10, thereby forming a plurality of monochromatic lights 300 of different colors; that is, the composite light 400 undergoes dispersion after passing through the light processing unit 10 to form the monochromatic light 300 of the seven colors red, orange, yellow, green, blue, indigo, and purple, and subsequently the monochromatic lights 300 of different colors each form a clear image on the photosensitive surface of the photosensitive element 20.
- the clear images of the monochromatic lights 300 of different colors correspond to a plurality of different clear object plane positions of the photographed object,
- and the clear images of the monochromatic lights 300 of different colors are synthesized by an algorithm to obtain a final image of the photographed object.
- the light processing unit 10 increases the degree of dispersion of the multiple beams of monochromatic light 300 formed from the composite light 400 of the photographed object, thereby increasing the spread of the different clear object planes corresponding to the monochromatic lights 300, which is beneficial for the periscope camera module 100 to increase the depth of field by superimposing the images of the different monochromatic lights 300.
- the periscope camera module 100 includes a lens unit 30, wherein the lens unit 30 is disposed between the light processing unit 10 and the photosensitive element 20 and is held in the photosensitive path of the photosensitive element 20; the plurality of monochromatic lights 300 emitted from the light processing unit 10 reach the lens unit 30 and are focused by the lens unit 30, and further, the monochromatic lights 300 of different colors, after passing through the lens unit 30, are imaged on the photosensitive surface 210 of the photosensitive element 20 to obtain images of a plurality of different clear object plane positions.
- due to factors such as the different refractive indexes and different propagation speeds of light of different colors in the same medium, the red light, the green light, and the blue light formed by dispersing the composite light 400 reflected into the periscope camera module from the same distance to the photosensitive surface 210 of the photosensitive element 20 will be focused on different planes; that is, the clear image plane positions that the monochromatic lights 300 of different colors can form are also inconsistent, and the images of different clear object plane positions are synthesized by an algorithm to obtain the final image of the periscope camera module, thereby improving the depth of field of the periscope camera module 100.
- the composite light 400 undergoes dispersion after passing through the light processing unit 10 to form the monochromatic lights 300; because different monochromatic lights 300 have different wavelengths and frequencies, and the light processing unit 10 has a different refractive index for each wavelength, the monochromatic lights 300 exit the light processing unit 10 at different positions.
- the monochromatic lights 300 are arranged from top to bottom in the order of the red light, the orange light, the yellow light, the green light, the blue light, the indigo light, and the purple light; that is, when the monochromatic lights 300 of seven colors reach a flat surface perpendicular to the Z axis, for example the photosensitive surface 210 of the photosensitive element 20, the image areas formed by the monochromatic lights 300 of different colors are also arranged on the photosensitive surface 210 of the photosensitive element 20 from top to bottom, with the red light in the uppermost area, followed by the orange light, and the purple light in the lowermost area; this enhances the separation of the different clear object planes, so that the periscope camera module 100 can enhance the depth of field by superimposing the images of the different monochromatic lights 300.
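- as a rough numerical illustration of why the monochromatic lights exit the light processing unit at different angles and positions, the sketch below estimates the deviation of red, green, and blue light through a thin prism; the Cauchy coefficients and the thin-prism approximation are assumptions chosen for illustration and are not taken from the patent.

```python
# Minimal dispersion sketch (not from the patent): a Cauchy model n(lambda) = B + C/lambda^2
# with rough BK7-like coefficients, and the thin-prism deviation delta ~ (n - 1) * A.
B, C = 1.5046, 0.00420  # assumed Cauchy coefficients (wavelength in micrometres)

def refractive_index(wavelength_um: float) -> float:
    """Approximate refractive index of the prism glass at the given wavelength."""
    return B + C / wavelength_um**2

def deviation_deg(wavelength_um: float, apex_deg: float = 10.0) -> float:
    """Thin-prism deviation angle for a small apex angle (degrees)."""
    return (refractive_index(wavelength_um) - 1.0) * apex_deg

for name, lam in [("red", 0.65), ("green", 0.55), ("blue", 0.45)]:
    # Shorter wavelengths are refracted more strongly, so the coloured beams separate.
    print(f"{name:5s} light ({lam:.2f} um): deviation ~ {deviation_deg(lam):.3f} deg")
```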
- the light processing unit 10 is a prism.
- the specific implementation of the light processing unit 10 is only an example, and cannot be a limitation on the content and scope of the periscope camera module 100 of the present invention.
- the photosensitive element 20 correspondingly receives the monochromatic lights 300 of different colors, converts the light signal into an electrical signal, and transmits the electrical signal to a processing device 50 communicatively connected to the photosensitive element 20; the processing device 50 can further obtain the images of the monochromatic lights of different colors, that is, the images of a plurality of different clear object plane positions.
- the first image 101 is synthesized by superimposing images of different colors of monochromatic light through an algorithm.
- the photosensitive surface 210 of the photosensitive element 20 correspondingly receives the monochromatic light 300 of the three colors of the red light, the green light and the blue light, thereby obtaining the red light, Separate clear imaging of the green light and the blue light, that is, imaging at three different clear object plane positions.
- the image of the red light, the image of the green light, and the image of the blue light correspond to three different clear object plane positions, and the image of the red light, the image of the green light, and the image of the blue light are superimposed to enhance the depth of field.
- for example, the photosensitive surface 210 of the photosensitive element 20 correspondingly receives the monochromatic lights 300 of the three colors red, green, and blue, and respectively obtains an image A of the red light, an image B of the green light, and an image C of the blue light, where the image A, the image B, and the image C are equivalent to images of three different clear object plane positions.
- the object plane position A′, the object plane position B′, and the object plane position C′ of the photographed object can be obtained from the information in the image A of the red light, the image B of the green light, and the image C of the blue light, so that subsequently, according to the algorithm, the image A of the red light, the image B of the green light, and the image C of the blue light can be synthesized to obtain a clear image of the photographed object.
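- the patent leaves the synthesis algorithm unspecified; the sketch below shows one common assumption, a per-pixel sharpness-weighted blend of the images A, B, and C, purely for illustration.

```python
# Illustrative focus-stacking style fusion (an assumption, not the patent's algorithm):
# each pixel is weighted by the local sharpness of the channel image it comes from.
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def local_sharpness(img: np.ndarray, window: int = 9) -> np.ndarray:
    """Local sharpness map: smoothed magnitude of the Laplacian response."""
    return uniform_filter(np.abs(laplace(img.astype(np.float64))), size=window)

def fuse_channel_images(image_a: np.ndarray, image_b: np.ndarray, image_c: np.ndarray) -> np.ndarray:
    """Fuse the red/green/blue images A, B, C by per-pixel sharpness weighting."""
    stack = np.stack([image_a, image_b, image_c]).astype(np.float64)   # (3, H, W)
    weights = np.stack([local_sharpness(im) for im in stack]) + 1e-8   # avoid divide-by-zero
    weights /= weights.sum(axis=0, keepdims=True)
    return (weights * stack).sum(axis=0)
```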
- the photosensitive element 20 can also obtain an image of the object by receiving the monochromatic light 300 of other colors.
- the photosensitive surface 210 of the photosensitive element 20 includes multiple pixel points 21, that is, the photosensitive surface 210 of the photosensitive element 20 includes an array of pixel points, and each pixel point 21 receives the corresponding red light, green light, and blue light, so that clear images of the monochromatic light 300 in three colors can be obtained. That is, the images of the red light, the green light, and the blue light corresponding to the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113 of the photosensitive element 20 are equivalent to three optical systems with different clear object plane positions; after the algorithm combines these "three-shot" images of different clear object plane positions, the final image of the photographed object can be obtained. It should be understood that the "three-shot" images refer to the image of the red light, the image of the green light, and the image of the blue light obtained by the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113, respectively.
- the pixel point 21 includes a plurality of pixel units 211, wherein each pixel unit 211 can receive the monochromatic light 300 of a corresponding color, and the plurality of pixel units 211 are arranged in order from top to bottom, which further enables the pixel points 21 to receive the monochromatic light 300 of the corresponding color.
- the pixel unit 211 is selected from one or more combinations of the pixel unit types consisting of a red pixel unit 2111, a green pixel unit 2112, and a blue pixel unit 2113, wherein the red pixel unit 2111 can receive the red light, the green pixel unit 2112 can receive the green light, and the blue pixel unit 2113 can receive the blue light; the red pixel units 2111, the green pixel units 2112, and the blue pixel units 2113 are arranged according to a preset rule, so that the pixel points 21 can receive the monochromatic light 300 of the corresponding color.
- the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113 are represented by the letters "R", "G", and "B" respectively, as shown in FIG. 2B.
- the photosensitive element 20 is implemented as an irregular color filter array.
- the pixel units 211 are arranged from top to bottom in the order of R-RB-RGB-RG-G, so that the photosensitive element 20 can receive the monochromatic light of the corresponding color.
- since the wavelengths and frequencies of the different monochromatic lights 300 differ, and the refractive index of the light processing unit 10 for monochromatic light 300 of different wavelengths differs, the positions at which the different monochromatic lights 300 exit the light processing unit 10 also differ, and the monochromatic lights 300 are arranged in order from top to bottom as the red light, the orange light, the yellow light, the green light, the blue light, the indigo light, and the purple light.
- correspondingly, the pixel units 211 including only the red pixel unit 2111 are distributed at the top, and the pixel units 211 including only the blue pixel unit 2113 are distributed at the bottom.
- further, due to the influence of the refraction angle and the focal length, it is difficult to completely separate the red light, the green light, and the blue light, that is, the red light, the green light, and the blue light may overlap from top to bottom; therefore, the pixel units 211 that include at least one red pixel unit 2111, at least one green pixel unit 2112, and at least one blue pixel unit 2113 are located below the pixel units 211 including only the red pixel unit 2111 and above the pixel units 211 including only the blue pixel unit 2113.
- the pixel units 211 of the pixel points 21 of the photosensitive element 20 are designed according to the distribution of the monochromatic light 300 formed after dispersion, so that the photosensitive element 20 can better receive the monochromatic light 300 of the corresponding color, which is further conducive to improving the imaging effect.
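- purely as an illustration of the pixel-unit band arrangement described above, the following sketch builds a row-band sensitivity mask in the stated top-to-bottom order R, RB, RGB, RG, G; the equal band heights and the boolean-mask representation are assumptions introduced for this example.

```python
# Illustrative sketch (assumption, not the patent's exact layout): a simple row-band
# mask for the irregular colour filter array, with bands stacked R, RB, RGB, RG, G.
import numpy as np

BANDS = ["R", "RB", "RGB", "RG", "G"]  # top-to-bottom band order from the description

def build_cfa_mask(height: int, width: int) -> np.ndarray:
    """Return an (height, width, 3) boolean mask; channel order is (R, G, B).

    mask[y, x, c] is True when the pixel at (y, x) is sensitive to channel c."""
    mask = np.zeros((height, width, 3), dtype=bool)
    band_height = height // len(BANDS)
    for i, band in enumerate(BANDS):
        rows = slice(i * band_height, height if i == len(BANDS) - 1 else (i + 1) * band_height)
        for channel, letter in enumerate("RGB"):
            if letter in band:
                mask[rows, :, channel] = True
    return mask

# Example: a small 10 x 4 sensor; the top rows respond only to red, the bottom only to green.
print(build_cfa_mask(10, 4)[:, 0, :].astype(int))
```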
- each pixel unit 211 includes at least one red pixel unit 2111, at least one green pixel unit 2112, and at least one blue pixel unit 2113.
- the photosensitive element 20 is implemented as an RGBW color filter array, where "W" is a white pixel unit; that is, each pixel unit 211 includes at least one red pixel unit 2111, at least one green pixel unit 2112, at least one blue pixel unit 2113, and at least one white pixel unit, and brightness is supplemented by the white pixel unit to improve imaging quality.
- the periscope camera module 100 further includes a color filter element 40, wherein the color filter element 40 is held in the photosensitive path of the photosensitive element 20 and is arranged in front of the photosensitive element 20, so that the light entering the periscope camera module 100 passes through the color filter element 40 before reaching the photosensitive element 20.
- the color filter element 40 filters stray light to ensure the clarity of the image formed on the photosensitive surface 210 of the photosensitive element 20.
- the color filter element 40 only allows light of a predetermined wavelength band to pass, so as to ensure that the red light, the green light, and the blue light respectively form clear and independent images on the photosensitive surface 210 of the photosensitive element 20, thereby ensuring imaging quality.
- the color filter element 40 is disposed between the lens unit 30 and the photosensitive element 20; the light reaches the color filter element 40 after passing through the lens unit 30, and stray light is filtered by the color filter element 40.
- the color filter element 40 is disposed between the lens unit 30 and the light processing unit 10. It should be understood that the specific implementation of the color filter element 40 is only an example, and cannot be a limitation on the content and scope of the periscope camera module 100 of the present invention.
- the periscope camera module 100 further includes an auxiliary element 80, wherein the auxiliary element 80 is provided between the light processing unit 10 and the lens unit 30 and is held in the photosensitive path of the photosensitive element 20; after the monochromatic light 300 emitted from the light processing unit 10 passes through the auxiliary element 80 and is further dispersed, the overlapping portion of the monochromatic lights 300 of different colors is reduced, so that the overlapping area of the red light, the green light, and the blue light is reduced, which helps the photosensitive element 20 to better receive the monochromatic light 300 of the corresponding color and improves the clarity of the images formed separately by the red light, the green light, and the blue light; the auxiliary element 80 may further increase the depth of field.
- the auxiliary element 80 is implemented as a cylindrical mirror.
- the auxiliary element 80 is implemented as a free-form surface mirror.
- the difference between the periscope camera module 100 shown in FIG. 5 of the specification and the periscope camera module 100 shown in FIG. 2A is that the periscope camera module 100 shown in FIG. 5 further includes a driving element 90, wherein the light processing unit 10 is drivably connected to the driving element 90 so that the red light, the green light, and the blue light emitted from the light processing unit 10 can all be received by the photosensitive element 20; in this way, the photosensitive element 20 can receive all of the red light, the green light, and the blue light without increasing the area of the photosensitive surface 210 of the photosensitive element 20.
- otherwise the photosensitive element 20 would need a large area, which is not conducive to the thinning of the periscope camera module 100; by driving the light processing unit 10 to rotate around the Y axis, the photosensitive element 20 can sequentially receive the red light, the green light, and the blue light, and obtain images corresponding to the monochromatic lights 300 of the three colors, so that the first image 101 can be obtained later.
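- the following control-flow sketch illustrates, under assumed driver interfaces, how the driving element could rotate the light processing unit so that the red, green, and blue bands are captured in sequence; the class and function names are hypothetical and not defined by the patent.

```python
# Hypothetical sequential-capture sketch: rotate_prism_to and read_sensor are placeholder
# callables standing in for the driving element and the photosensitive element readout.
from dataclasses import dataclass
from typing import Callable, Dict
import numpy as np

@dataclass
class SequentialCapture:
    rotate_prism_to: Callable[[float], None]   # drives the light processing unit (degrees, about the Y axis)
    read_sensor: Callable[[], np.ndarray]      # reads a frame from the photosensitive element

    def capture_bands(self, angles_deg: Dict[str, float]) -> Dict[str, np.ndarray]:
        """Capture one frame per colour band at its assumed prism angle."""
        frames = {}
        for colour, angle in angles_deg.items():
            self.rotate_prism_to(angle)        # steer the chosen band onto the sensor
            frames[colour] = self.read_sensor()
        return frames                          # later fused into the first image 101
```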
- the present invention further provides an image acquisition method of a periscope camera module, wherein the image acquisition method includes the following steps:
- a light processing unit 10 turns a composite light 400 from a photographed object while dispersing the composite light 400 to form a plurality of monochromatic lights 300;
- the composite light 400 from the photographed object can be dispersed by the light processing unit 10 to form the monochromatic lights 300 of different colors; that is, the composite light 400 undergoes dispersion after passing through the light processing unit 10 to form the monochromatic light 300 of the seven colors red, orange, yellow, green, blue, indigo, and purple. Subsequently, the monochromatic lights 300 of different colors can be imaged on the photosensitive surface 210 of the photosensitive element 20, wherein the monochromatic lights 300 of different colors correspond to different clear object plane positions of the photographed object; specifically, each clear image of the monochromatic light 300 corresponds to one clear object plane position of the photographed object, and subsequently the clear images of the monochromatic lights 300 of different colors are superimposed on each other by an algorithm to synthesize the final image of the photographed object. In this way, the depth of field of the periscope camera module 100 is increased.
- the photosensitive surface 210 of the photosensitive element 20 correspondingly receives the red light, the green light, and the blue light, thereby obtaining clear images of the red light, the green light, and the blue light.
- the specific colors of the monochromatic light 300 received by the photosensitive element 20 are not a limitation on the content and scope of the image acquisition method of the periscope camera module of the present invention. More specifically, the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113 of the photosensitive surface 210 of the photosensitive element 20 respectively receive the red light, the green light, and the blue light, so as to respectively obtain the images of the red light, the green light, and the blue light.
- the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113, arranged from top to bottom in the order of R-RB-RGB-RG-G, correspondingly receive the red light, the green light, and the blue light.
- the method further includes the step (e): condensing the monochromatic light 300 by a lens unit 30.
- the step (f) is further included: the distance between the monochromatic lights 300 of different colors is enlarged by an auxiliary element 80.
- the monochromatic light 300 emitted from the light processing unit 10 is further dispersed after passing through the auxiliary element 80, so that the overlapping portion of the monochromatic lights 300 of different colors is reduced; the overlapping area of the red light, the green light, and the blue light is thus reduced, which facilitates the photosensitive element 20 in better receiving the monochromatic light 300 of the corresponding color and improves the clarity of the images formed separately by the red light, the green light, and the blue light.
- the method further includes a step (g): filtering stray light through a color filter element 40.
- the color filter element 40 only allows light of a predetermined wavelength band to pass, so as to ensure that the red light, the green light, and the blue light respectively form clear and independent images on the photosensitive surface 210 of the photosensitive element 20, thereby ensuring the imaging quality.
- the method further includes a step (h): rotating the light processing unit 10.
- a driving element 90 drives the light processing unit 10 to rotate, so that the photosensitive element 20 can receive the red light, the green light, and the blue light in sequence.
- the red light, the green light, and the blue light emitted from the light processing unit 10 can all be received by the photosensitive element 20, which is advantageous for making the periscope camera module 100 thin and light.
- a periscope camera module 100 according to another preferred embodiment of the present invention will be explained in the following description. Referring to FIGS. 6 and 7 of the specification, at least one periscope camera module 100 is applied to a camera module 1000, wherein the camera module 1000 can obtain high-quality images.
- the camera module 1000 includes at least one of the periscope camera module 100 and at least one main camera module 200, wherein the periscope camera module 100 and the main camera module 200 are capable of A first image 101 and a second image 102 are obtained respectively, wherein the first image 101 and the second image 102 can be synthesized by the algorithm into a final image 103 of the camera module 1000.
- the periscope camera module 100 can determine the position of the actual scene, and obtain the outline of the actual scene, and the first image 101 obtained by the periscope camera module 100 has depth information.
- the final image 103 obtained by fusing the first image 101 and the second image 102 can reflect the outline of the actual scene and present a stereoscopic effect, thereby improving the image quality acquired by the camera module 1000.
- the number of the periscope camera module 100 and the main camera module 200 of the camera module 1000 is not limited.
- the camera module 1000 can be implemented as a dual camera module, that is, the camera module 1000 includes one periscope camera module 100 and one main camera module 200; the camera module 1000 may also be implemented as a triple camera module, that is, the camera module 1000 includes two periscope camera modules 100 and one main camera module 200, or one periscope camera module 100 and two main camera modules 200.
- the specific numbers of the periscope camera modules 100 and the main camera modules 200 of the camera module 1000 are merely examples, and are not a limitation on the content and scope of the camera module 1000 and the periscope camera module 100 of the present invention.
- the camera module 1000 is implemented as a dual camera module as an example.
- the periscope camera module 100 forms an image by receiving the monochromatic light 300, analyzes the position of the actual scene according to the sharpness of the received monochromatic light 300, and obtains the outline of the actual scene.
- the periscope camera module 100 includes a light processing unit 10 and a photosensitive element 20, wherein the light processing unit 10 is held in the photosensitive path of the photosensitive element 20, and the light entering the periscope camera module 100 can reach the photosensitive element 20 after passing through the light processing unit 10 and be imaged on a photosensitive surface 210 of the photosensitive element 20.
- a composite light 400 in the external environment is dispersed by the light processing unit 10 to form the monochromatic lights 300, that is, the composite light 400 is dispersed by the light processing unit 10 to form the monochromatic light 300 of the seven colors of a red light, an orange light, a yellow light, a green light, a blue light, an indigo light, and a purple light; subsequently, the monochromatic lights 300 of the three colors red, green, and blue can be imaged on the photosensitive element 20, and the position of the actual scene corresponding to each monochromatic light 300 is then calculated by analyzing the images formed by the monochromatic lights 300 of the three colors, thereby obtaining the outline of the actual scene.
- the photosensitive surface 210 of the photosensitive element 20 is a flat surface, and because the red light, the green light, and the blue light differ in refractive index, propagation speed, and so on, the positions of the actual scene corresponding to the red light, the green light, and the blue light relative to the photosensitive surface 210 of the photosensitive element 20 also differ. Therefore, the sharpness of the different monochromatic lights received by the photosensitive element 20 also varies. Further, according to the images formed on the photosensitive surface 210 of the photosensitive element 20, the actual scene position or depth information corresponding to the different monochromatic lights can be analyzed. It should be noted that white light, sunlight, and the like in the external environment all belong to the composite light 400.
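- the patent does not specify how these sharpness differences are converted into depth; the sketch below illustrates one assumed approach, weighting precalibrated per-channel best-focus distances by local sharpness. The calibration values and the weighting scheme are assumptions made purely for illustration.

```python
# Rough depth-from-chromatic-sharpness sketch (assumption, not the patent's method).
import numpy as np
from scipy.ndimage import laplace, uniform_filter

# Hypothetical calibration: distance (in metres) at which each channel is sharpest.
BEST_FOCUS_DISTANCE_M = {"R": 2.5, "G": 2.0, "B": 1.6}

def sharpness_map(channel: np.ndarray, window: int = 15) -> np.ndarray:
    """Local sharpness: smoothed magnitude of the Laplacian response."""
    return uniform_filter(np.abs(laplace(channel.astype(np.float64))), size=window)

def relative_depth(red: np.ndarray, green: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Per-pixel depth estimate: sharpness-weighted average of the calibrated distances."""
    maps = {"R": sharpness_map(red), "G": sharpness_map(green), "B": sharpness_map(blue)}
    weights = np.stack(list(maps.values())) + 1e-8
    distances = np.array([BEST_FOCUS_DISTANCE_M[k] for k in maps])[:, None, None]
    return (weights * distances).sum(axis=0) / weights.sum(axis=0)
```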
- the light processing unit 10 can change the direction of the light path of the light beam passing through the periscope camera module 100.
- the composite light 400 propagating along the X-axis direction is incident on the light processing unit 10 and propagates along the Z-axis direction after passing through the light processing unit 10; that is, the optical path of the light beam entering the periscope camera module 100 can be turned by the light processing unit 10, so that the periscope camera module 100 can reduce the overall height of the camera module 1000 while providing a telephoto shooting effect. This is advantageous for applying the camera module 1000 to a mobile electronic device 2000 and is in line with the trend of the mobile electronic device 2000 becoming ever thinner, wherein the mobile electronic device 2000 is a mobile phone, an iPad, or the like.
- the composite light 400 undergoes dispersion after passing through the light processing unit 10 to form the monochromatic lights 300; because different monochromatic lights 300 have different wavelengths and frequencies, and the light processing unit 10 has a different refractive index for each wavelength, the monochromatic lights 300 exit the light processing unit 10 at different positions.
- the monochromatic lights 300 are arranged from top to bottom in the order of the red light, the orange light, the yellow light, the green light, the blue light, the indigo light, and the purple light; that is, when the monochromatic lights 300 of seven colors reach a flat surface perpendicular to the Z axis, for example the photosensitive surface 210 of the photosensitive element 20, the image areas formed by the monochromatic lights 300 of different colors are also arranged on the photosensitive surface 210 of the photosensitive element 20 from top to bottom, with the red light in the uppermost area, followed by the orange light, and the purple light in the lowermost area.
- the light processing unit 10 is a prism.
- the specific implementation of the light processing unit 10 is only an example, and cannot be a limitation on the content and scope of the periscope camera module 100 of the present invention.
- the periscope camera module 100 includes a lens unit 30, wherein the lens unit 30 is disposed between the light processing unit 10 and the photosensitive element 20 and is held in the photosensitive path of the photosensitive element 20; the monochromatic light 300 emitted from the light processing unit 10 can reach the lens unit 30 and be focused by the lens unit 30, and further, the monochromatic lights 300 of the three colors red, green, and blue, after passing through the lens unit 30, can be imaged on the photosensitive surface 210 of the photosensitive element 20.
- the periscope camera module 100 further includes a color filter element 40, wherein the color filter element 40 is held in the photosensitive path of the photosensitive element 20 and is arranged in front of the photosensitive element 20, so that the light entering the periscope camera module 100 passes through the color filter element 40 before reaching the photosensitive element 20.
- the color filter element 40 filters the monochromatic lights 300 of different colors, allowing only the red light, the green light, and the blue light to pass, which ensures that only the red light, the green light, and the blue light can be imaged on the photosensitive surface 210 of the photosensitive element 20, thereby ensuring imaging quality.
- the red light, the green light, and the blue light have different optical properties such as refractive index and propagation speed; that is, for the same point on the actual scene, or points in the same vertical plane at the same distance from the photosensitive surface 210 of the photosensitive element 20 of the periscope camera module, the red light, the green light, and the blue light formed by dispersing the composite light 400 reflected into the periscope camera module are focused on different planes.
- when the position of the photosensitive surface 210 of the photosensitive element 20 is fixed, the sharpness of the monochromatic lights 300 of different colors received by the photosensitive surface 210 of the photosensitive element 20 is different.
- for the composite light 400 reflected into the periscope camera module from the same position on the actual scene, the sharpness of the image formed on the photosensitive surface 210 of the photosensitive element 20 by the dispersed red light is greater than the sharpness of the image formed on the photosensitive surface 210 of the photosensitive element 20 by the dispersed blue light.
- the positions of the points on the actual scene corresponding to monochromatic lights 300 of different colors that are received by the photosensitive surface 210 of the photosensitive element 20 with the same sharpness will also be different.
- when the images of the red light and the blue light on the photosensitive surface 210 of the photosensitive element 20 have the same sharpness, the distance between the point on the actual scene corresponding to the red light and the periscope camera module is greater than the distance between the point on the actual scene corresponding to the blue light and the periscope camera module.
- the stray light is filtered by the color filter element 40 to ensure the definition of the red light, the green light, and the blue light imaged on the photosensitive surface 210 of the photosensitive element 20, and to reduce the interference of the monochromatic lights 300 of other colors with locating the actual scene, so as to improve the accuracy with which the periscope camera module acquires the position of the actual scene and thus better present the outline of the actual scene later.
- the stray light in the present invention refers to the monochromatic light 300 of colors other than the red light, the green light, and the blue light.
- the color filter element 40 is disposed between the lens unit 30 and the photosensitive element 20, and the color filter element 40 filters the monochromatic light 300 emitted from the lens unit 30, so that only the red light, the green light, and the blue light can reach the photosensitive element 20, and the first image 101 can be obtained later.
- the color filter element 40 is disposed between the lens unit 30 and the light processing unit 10, so that the monochromatic light 300 emitted from the light processing unit 10 can be filtered by the color filter element 40, and only the red light, the green light, and the blue light can be focused by the lens unit 30.
- Further, the photosensitive element 20 correspondingly receives the monochromatic light 300 of the three colors, the red light, the green light, and the blue light, converts the optical signal into an electrical signal, and transmits the electrical signal to a processing device 50 communicatively connected to the photosensitive element 20, so that the image of the red light, the image of the green light, and the image of the blue light are obtained respectively. The processing device 50 then analyzes these three images and, from the differences in sharpness of the monochromatic light 300 of the three colors, obtains the position of the actual scene and thus the outline of the actual scene.
- It is worth mentioning that the processing device 50 may be implemented as a processor of the mobile electronic device 2000, or as a processor built into the camera module 1000. Those skilled in the art should understand that the specific implementation of the processing device 50 is only an example and does not limit the content and scope of the camera module 1000 and the periscope camera module 100 of the present invention.
- Specifically, the photosensitive surface 210 of the photosensitive element 20 includes multiple pixel points 21, that is, the photosensitive surface 210 of the photosensitive element 20 includes an array of pixel points, and the pixel points 21 receive the corresponding red light, green light, and blue light, so that images corresponding to the monochromatic light 300 of the three colors can be obtained.
- More specifically, each pixel point 21 includes a plurality of pixel units 211, wherein each pixel unit 211 can receive the monochromatic light 300 of a corresponding color, and the plurality of pixel units 211 are arranged in order from top to bottom, so that the pixel point 21 can receive the monochromatic light 300 of the corresponding color.
- Each pixel unit 211 is selected from one or a combination of the pixel unit types consisting of a red pixel unit 2111, a green pixel unit 2112, and a blue pixel unit 2113, wherein the red pixel unit 2111 can receive the red light, the green pixel unit 2112 can receive the green light, and the blue pixel unit 2113 can receive the blue light. The red pixel units 2111, the green pixel units 2112, and the blue pixel units 2113 are arranged according to a preset rule, so that the pixel points 21 can receive the monochromatic light 300 of the corresponding colors.
- In the drawings and the following description, the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113 are represented by the letters "R", "G", and "B" respectively, as shown in FIG. 2B.
- the photosensitive element 20 is implemented as an irregular color filter array.
- the pixel units 211 are arranged from top to bottom in the order of R-RB-RGB-RG-G, so that the photosensitive element 20 can receive the monochromatic light of the corresponding color.
- Specifically, since the wavelengths and frequencies of the different monochromatic lights 300 differ, and the refractive index of the light processing unit 10 differs for monochromatic light 300 of different wavelengths, the positions at which the different monochromatic lights 300 exit the light processing unit 10 also differ, and the monochromatic lights 300 are arranged from top to bottom in the order of the red light, the orange light, the yellow light, the green light, the blue light, the indigo light, and the purple light.
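The wavelength dependence of refraction that the preceding paragraph relies on can be illustrated with a minimal sketch. It uses a Cauchy dispersion model and the minimum-deviation formula for a prism; the Cauchy coefficients, the apex angle, and the representative wavelengths are assumed values for illustration, not properties of the disclosed light processing unit 10. The sketch only shows that the deviation (and hence the exit position) varies monotonically with wavelength, which is what produces the top-to-bottom color ordering.

```python
import numpy as np

# Assumed glass and geometry (typical BK7-like values, 30 degree apex).
CAUCHY_A, CAUCHY_B_UM2 = 1.5046, 0.00420
APEX_ANGLE_RAD = np.deg2rad(30.0)

WAVELENGTHS_UM = {"red": 0.65, "orange": 0.60, "yellow": 0.58, "green": 0.55,
                  "blue": 0.47, "indigo": 0.445, "purple": 0.41}

def refractive_index(wavelength_um: float) -> float:
    """Cauchy dispersion model: shorter wavelengths see a higher index."""
    return CAUCHY_A + CAUCHY_B_UM2 / wavelength_um ** 2

def min_deviation_rad(n: float, apex_rad: float) -> float:
    """Minimum deviation of a thin prism: n = sin((A + delta)/2) / sin(A/2)."""
    return 2.0 * np.arcsin(n * np.sin(apex_rad / 2.0)) - apex_rad

for color, lam in WAVELENGTHS_UM.items():
    n = refractive_index(lam)
    dev_deg = np.degrees(min_deviation_rad(n, APEX_ANGLE_RAD))
    print(f"{color:>6}: n = {n:.4f}, deviation = {dev_deg:.2f} deg")
```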
- Accordingly, the pixel units 211 containing only the red pixel unit 2111 are distributed at the top, and the pixel units 211 containing only the blue pixel unit 2113 are distributed at the bottom. Further, because of the refraction angle and the focal length, it is difficult to separate the red light, the green light, and the blue light completely; that is, from top to bottom there are regions where the red light overlaps the green light, where the red light, the green light, and the blue light overlap, and where the green light overlaps the blue light. Correspondingly, the pixel units 211 that include at least one red pixel unit 2111 and at least one green pixel unit 2112 are located below the pixel units 211 containing only the red pixel unit 2111, and the pixel units 211 that include at least one red pixel unit 2111, at least one green pixel unit 2112, and at least one blue pixel unit 2113 are located above the pixel units 211 containing only the blue pixel unit 2113.
- That is to say, the pixel units 211 of the pixel points 21 of the photosensitive element 20 are designed according to the distribution of the monochromatic light 300 formed after dispersion, so that the photosensitive element 20 can better receive the monochromatic light 300 of the corresponding color, which further helps to improve the imaging effect.
- Optionally, the photosensitive element 20 is implemented as an RGB color filter array, that is, each pixel unit 211 includes at least one red pixel unit 2111, at least one green pixel unit 2112, and at least one blue pixel unit 2113.
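The banded arrangement described above can be summarized in a short sketch that maps sensor rows to the colors their pixel units are meant to receive, following the stated R-RB-RGB-RG-G order from top to bottom. The band heights and the sensor size are assumptions made only for illustration; the actual layout of the photosensitive element 20 is not specified beyond the ordering.

```python
# Colors each band of rows can receive, top band first (order as stated above).
BANDS = [("R",), ("R", "B"), ("R", "G", "B"), ("R", "G"), ("G",)]

def band_for_row(row: int, total_rows: int) -> tuple[str, ...]:
    """Map a sensor row to the colors its pixel units are designed to receive."""
    index = min(row * len(BANDS) // total_rows, len(BANDS) - 1)
    return BANDS[index]

if __name__ == "__main__":
    total = 10  # assumed number of rows for the illustration
    for r in range(total):
        print(f"row {r}: {'/'.join(band_for_row(r, total))}")
```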
- It is worth mentioning that the photosensitive surface 210 of the photosensitive element 20 is a flat plane and that the position of the photosensitive element 20 is fixed, that is, the image surface is fixed; when the photosensitive element 20 is disposed within a preset range, the red light, the green light, and the blue light can all be imaged on the photosensitive surface 210 of the photosensitive element 20. Since the distances between the points constituting the outline of the actual scene and the periscope camera module 100 differ, and the positions of the actual scene corresponding to the monochromatic light 300 of different colors also differ, the sharpness of the monochromatic light 300 of different colors received by the photosensitive element 20 differs as well.
- For example, referring to FIGS. 3A and 3B, the photosensitive element 20 can be disposed at any position between an A plane and a C plane, and an image surface corresponding to the actual scene can be formed on the photosensitive element 20. If the point at which one beam of the red light can be focused lies on the B plane, then when the photosensitive element 20 is disposed at the position corresponding to the B plane, the brightness of the image formed by that red light on the photosensitive element 20 is higher, that is, the sharpness of the red light is higher; when the photosensitive element 20 is disposed at the C plane in front of the B plane or at the A plane behind the B plane, the brightness of the red light that the photosensitive element 20 can receive decreases. Similarly, if the point at which another beam of the red light can be focused lies on the C plane, then when the photosensitive element 20 is disposed at the position corresponding to the C plane, the brightness of the image formed by that red light on the photosensitive element 20 is higher, that is, the sharpness of the red light is higher; when the photosensitive element 20 is disposed at the B plane behind the C plane, the brightness of the red light that the photosensitive element 20 can receive decreases, and when the photosensitive element 20 is disposed at the A plane behind the B plane, the brightness of the red light that the photosensitive element 20 can receive is even lower.
- the monochromatic light 300 of different colors formed by the dispersion of the same composite light 400 can be focused at different positions.
- In other words, the optical properties of the monochromatic light 300 of different colors differ, so the red light, the green light, and the blue light formed by dispersion of the composite light 400 reflected into the periscope camera module from the same point on the actual scene, or from points in the same vertical plane (that is, points at the same distance from the photosensitive surface 210 of the photosensitive element 20), are focused on different planes. When the position of the photosensitive surface 210 of the photosensitive element 20 is fixed, the sharpness of the monochromatic light 300 of different colors received by the photosensitive surface 210 is not the same; similarly, the points on the actual scene corresponding to monochromatic light 300 of different colors received by the photosensitive surface 210 with the same sharpness are also located at different positions.
- That is to say, by dispersing the composite light entering the camera module into the monochromatic light 300 of different colors and receiving the monochromatic light 300 on the photosensitive surface 210 of the photosensitive element 20, the periscope camera module can analyze the images of the monochromatic light 300 of different colors formed on the photosensitive surface 210 to obtain the position of each point on the actual scene and thereby the outline of the actual scene.
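As a purely illustrative sketch of the kind of per-channel analysis just described, the code below computes a local sharpness map for the red and blue images and turns their difference into a signed relative-depth cue. The local-contrast measure (windowed Laplacian variance), the window size, and the sign convention are all assumptions for the example; the disclosure does not specify the analysis algorithm used by the processing device 50.

```python
import numpy as np

def local_sharpness(channel: np.ndarray, win: int = 7) -> np.ndarray:
    """Local sharpness map: variance of a Laplacian response over a sliding window."""
    lap = (np.roll(channel, 1, 0) + np.roll(channel, -1, 0)
           + np.roll(channel, 1, 1) + np.roll(channel, -1, 1) - 4.0 * channel)
    pad = win // 2
    padded = np.pad(lap, pad, mode="edge")
    out = np.empty(channel.shape, dtype=float)
    for i in range(channel.shape[0]):
        for j in range(channel.shape[1]):
            out[i, j] = padded[i:i + win, j:j + win].var()
    return out

def depth_cue(red: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Signed cue per pixel: positive where red is locally sharper than blue
    (a farther point under the toy convention above), negative where blue is sharper."""
    s_r, s_b = local_sharpness(red), local_sharpness(blue)
    return (s_r - s_b) / (s_r + s_b + 1e-9)

rng = np.random.default_rng(0)
red_img, blue_img = rng.random((32, 32)), rng.random((32, 32))
print(depth_cue(red_img, blue_img).shape)  # (32, 32)
```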
- Referring to FIG. 1, the periscope camera module 100 further includes a focusing mechanism 60, wherein the focusing mechanism 60 keeps the image corresponding to the actual scene at the focal length of the lens unit 30, so as to improve the imaging effect of the periscope camera module 100.
- the focusing mechanism 60 is disposed on the lens unit 30, and the focusing mechanism 60 can drive the lens unit 30 to rotate to complete focusing.
- the focusing mechanism 60 can drive the lens unit 30 to move in the Z-axis direction to achieve focusing.
- It is worth mentioning that the processing device 50 can determine the focus parameters to be used for the current area according to the outline of the actual scene and keep all the scenery appearing in the photo at the focal length; that is, an algorithm can compensate for part of the error of the focusing mechanism 60 during focusing, which further reduces the accuracy requirements on the focusing mechanism 60 while still guaranteeing the imaging effect.
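A toy illustration of such focus-parameter selection from the scene outline is given below: it picks the focus distance whose assumed depth of field covers the most scene points. The depth-of-field model, the candidate grid, and the depth values are assumptions for the example and are not the module's actual focusing algorithm.

```python
def choose_focus_distance(scene_depths_m, candidate_distances_m, dof_half_width_m=0.5):
    """Pick the candidate focus distance covering the most points of the outline."""
    def points_covered(focus_m: float) -> int:
        return sum(abs(d - focus_m) <= dof_half_width_m for d in scene_depths_m)
    return max(candidate_distances_m, key=points_covered)

scene_depths = [0.9, 1.1, 1.3, 2.8]              # depths recovered from the outline (example)
candidates = [0.5 + 0.1 * k for k in range(40)]  # 0.5 m .. 4.4 m
print(f"chosen focus parameter: {choose_focus_distance(scene_depths, candidates):.1f} m")
```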
- Referring to FIG. 1, the periscope camera module 100 further includes an anti-shake mechanism 70, wherein the anti-shake mechanism 70 is provided on the lens unit 30 and/or the light processing unit 10, and the anti-shake mechanism 70 can reduce the influence of shaking on the shooting effect. It is worth mentioning that the periscope camera module 100 can keep objects in the range of 40 cm to infinity clear, and is less sensitive to shaking of the user's hand during shooting than autofocus.
- In the specific embodiment of the periscope camera module 100 shown in FIG. 4 of the specification, the periscope camera module 100 further includes an auxiliary element 80, wherein the auxiliary element 80 is provided between the light processing unit 10 and the lens unit 30 and is held in the photosensitive path of the photosensitive element 20. The monochromatic light 300 emitted from the light processing unit 10 is further spread after passing through the auxiliary element 80, so that the overlapping portions of the monochromatic light 300 of different colors are reduced; the overlapping area of the red light, the green light, and the blue light is thus reduced, which helps the photosensitive element 20 better receive the monochromatic light 300 of the corresponding color and improves the clarity of the images formed separately by the red light, the green light, and the blue light, so that the position of the actual scene can be obtained more accurately, the outline of the actual scene can be obtained, and a three-dimensional effect can be better presented.
- the auxiliary element 80 is implemented as a cylindrical mirror.
- Preferably, the auxiliary element 80 is implemented as a free-form surface mirror.
- The difference between the periscope camera module 100 shown in FIG. 5 of the specification and the periscope camera module 100 shown in FIG. 2A is that the periscope camera module 100 shown in FIG. 5 further includes a driving element 90, wherein the light processing unit 10 is drivably and rotatably connected to the driving element 90, so that all of the red light, the green light, and the blue light emitted from the light processing unit 10 can be received by the photosensitive element 20 without increasing the area of the photosensitive surface 210 of the photosensitive element 20. For example, since the refractive index of the light processing unit 10 differs for the monochromatic light 300 of different colors, the angles between the monochromatic light 300 emitted from the light processing unit 10 and the YZ plane differ, and the distances between the monochromatic lights 300 grow as the light propagates; receiving all of the red light, the green light, and the blue light at once would require a large photosensitive area of the photosensitive element 20, which is not conducive to keeping the periscope camera module 100 thin. By driving the light processing unit 10 to rotate about the Y axis, the photosensitive element 20 can receive the red light, the green light, and the blue light in sequence and obtain the images corresponding to the monochromatic light 300 of the three colors, so that the first image 101 can be obtained later.
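The capture sequence enabled by the driving element 90 can be sketched as follows. The angular positions and the two hardware calls are placeholders invented for the illustration (there is no real driver API implied); the sketch only shows the idea of rotating the prism so a small sensor receives the color bands one after another and stacking the frames.

```python
import numpy as np

PRISM_POSITIONS_DEG = {"red": -2.0, "green": 0.0, "blue": 2.0}  # assumed angles

def rotate_light_processing_unit(angle_deg: float) -> None:
    """Placeholder for commanding the driving element 90 (hypothetical call)."""
    pass

def read_photosensitive_element() -> np.ndarray:
    """Placeholder returning one monochrome frame from the photosensitive element 20."""
    return np.zeros((480, 640), dtype=np.uint16)

def capture_color_planes() -> np.ndarray:
    planes = []
    for color, angle in PRISM_POSITIONS_DEG.items():
        rotate_light_processing_unit(angle)           # rotate the light processing unit 10
        planes.append(read_photosensitive_element())  # sensor now sees this color band
    return np.stack(planes, axis=-1)                  # later combined into the first image 101

print(capture_color_planes().shape)  # (480, 640, 3)
```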
- the present invention further provides an image acquisition method of a periscope camera module, wherein the image acquisition method includes the following steps:
- (a) by a light processing unit 10, turning a composite light 400 from a photographed object while dispersing the composite light 400 to form a plurality of monochromatic lights 300, wherein at least one beam of the monochromatic light 300 is a red light, at least one beam of the monochromatic light 300 is a green light, and at least one beam of the monochromatic light 300 is a blue light;
- (b) receiving the red light, the green light, and the blue light by a photosensitive element 20; and
- (c) acquiring the outline of the photographed object.
- the light processing unit 10 changes the optical path direction of the light beam passing through the periscope camera module 100.
- For example, the composite light 400 propagating along the X-axis direction is incident on the light processing unit 10 and, after passing through the light processing unit 10, propagates along the Z-axis direction; that is, the light processing unit 10 turns the optical path of the light beam entering the periscope camera module 100, so that the periscope camera module 100 provides a telephoto shooting effect while reducing the overall height of the camera module 1000. This is advantageous for applying the camera module 1000 to a mobile electronic device 2000 and conforms to the trend of mobile electronic devices 2000 becoming thinner and lighter.
- Moreover, in the step (a), the composite light 400 in the external environment is dispersed by the light processing unit 10 into the monochromatic light 300 of different colors; that is, after passing through the light processing unit 10, the composite light 400 is dispersed into the monochromatic light 300 of the seven colors of the red light, an orange light, a yellow light, the green light, the blue light, an indigo light, and a purple light, and subsequently the monochromatic light 300 of the three colors of the red light, the green light, and the blue light can be imaged on a photosensitive surface 210 of the photosensitive element 20.
- In the step (b), the photosensitive element 20 correspondingly receives the red light, the green light, and the blue light.
- Specifically, the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113 of the photosensitive surface 210 of the photosensitive element 20 respectively receive the red light, the green light, and the blue light.
- More preferably, the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113, arranged from top to bottom in the order of R-RB-RGB-RG-G, correspondingly receive the red light, the green light, and the blue light.
- the method further includes the step (e): condensing the monochromatic light 300 by a lens unit 30.
- the step (f) is further included: the distance between the monochromatic lights 300 of different colors is enlarged by an auxiliary element 80.
- Specifically, the monochromatic light 300 emitted from the light processing unit 10 is further spread after passing through the auxiliary element 80, so that the overlapping portions of the monochromatic light 300 of different colors are reduced; the overlapping area of the red light, the green light, and the blue light is thus reduced, which helps the photosensitive element 20 better receive the monochromatic light 300 of the corresponding color and improves the clarity of the images formed separately by the red light, the green light, and the blue light, so that the position of the actual scene can be obtained more accurately, the outline of the actual scene can be obtained, and a three-dimensional effect can be better presented.
- the method further includes a step (g): filtering stray light through a color filter element 40.
- Specifically, the stray light is filtered in a manner that allows only the red light, the green light, and the blue light to pass, ensuring that only the red light, the green light, and the blue light can be imaged on the photosensitive element 20, so that the position of the actual scene can subsequently be analyzed from the images formed separately by the red light, the green light, and the blue light, and the outline of the actual scene can be obtained. Filtering the stray light ensures the clarity of the imaging and reduces the interference of the monochromatic light 300 of other colors with locating the actual scene, which improves the accuracy with which the periscope camera module acquires the position of the actual scene, so that the outline of the actual scene can be better presented later on.
- the method further includes a step (h): rotating the light processing unit 10.
- Specifically, a driving element 90 can drive the light processing unit 10 to rotate so that the photosensitive element 20 receives the red light, the green light, and the blue light in sequence. In this way, all of the red light, the green light, and the blue light emitted from the light processing unit 10 can be received by the photosensitive element 20.
- In the step (c), the following steps are further included: (c.1) forming the images of the red light, the green light, and the blue light respectively; (c.2) analyzing the sharpness of the red light, the green light, and the blue light; and (c.3) determining the position of the photographed object.
- After the step (c.3), the method further includes a step (c.4): acquiring the outline of the actual scene according to the position of the photographed object.
- Specifically, the red light, the green light, and the blue light have different optical properties such as refractive index and propagation speed. As a result, the red light, the green light, and the blue light formed by dispersion of the composite light 400 reflected into the periscope camera module from the same point on the actual scene, or from points in the same vertical plane (that is, points at the same distance from the photosensitive surface 210 of the photosensitive element 20 of the periscope camera module), are focused on different planes.
- When the position of the photosensitive surface 210 of the photosensitive element 20 is fixed, the sharpness of the monochromatic light 300 of different colors received by the photosensitive surface 210 differs. For example, the red light formed by dispersion of the composite light 400 reflected into the periscope camera module from a given position on the actual scene is imaged on the photosensitive surface 210 of the photosensitive element 20 with greater sharpness than the blue light formed by dispersion of the same composite light 400. Similarly, the points on the actual scene corresponding to monochromatic light 300 of different colors that is received by the photosensitive surface 210 with the same sharpness are also located at different positions.
- For example, when the red light and the blue light are imaged on the photosensitive surface 210 of the photosensitive element 20 with the same sharpness, the distance between the point on the actual scene corresponding to the red light and the periscope camera module is greater than the distance between the point on the actual scene corresponding to the blue light and the periscope camera module.
- Then, by receiving the monochromatic light 300 on the photosensitive surface 210 of the photosensitive element 20 and analyzing the images of the monochromatic light 300 of different colors formed on the photosensitive surface 210, the position of each point on the actual scene is obtained and the outline of the actual scene is acquired, thereby obtaining the depth information of the actual scene.
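Steps (c.1) through (c.4) can be sketched as a simple data flow. The sketch assumes a pre-measured calibration table mapping the red/blue sharpness ratio of a point to an object distance; the table values, the interpolation, and the input format are invented for illustration and are not part of the disclosed method.

```python
import numpy as np

CALIBRATION = [  # (red/blue sharpness ratio, object distance in meters); assumed values
    (0.5, 0.6), (0.8, 1.0), (1.0, 1.5), (1.3, 2.5), (1.8, 4.0),
]

def distance_from_ratio(ratio: float) -> float:
    """Step (c.3): interpolate the calibration table to estimate an object distance."""
    xs, ys = zip(*CALIBRATION)
    return float(np.interp(ratio, xs, ys))

def scene_outline(point_sharpness: dict) -> dict:
    """Steps (c.2)-(c.4): for each image point, compare the red and blue sharpness and
    keep the estimated distance; the collection of distances stands in for the outline."""
    return {point: round(distance_from_ratio(s_red / s_blue), 2)
            for point, (s_red, s_blue) in point_sharpness.items()}

# Step (c.1) would produce per-point sharpness values from the three color images.
sample = {(10, 12): (0.9, 0.8), (40, 55): (0.6, 1.1)}
print(scene_outline(sample))
```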
- the method further includes step (i): generating a focus parameter according to the outline of the actual scene. Further, the periscope camera module completes focusing according to the generated focusing parameters, and obtains a first image 101 of the periscope camera module 100.
- the present invention further provides an image acquisition method of a camera module, wherein the image acquisition method includes the following steps:
- (a) dispersing a composite light 400 by a light processing unit 10 of at least one periscope camera module 100 to form a plurality of monochromatic lights 300;
- (b) receiving a corresponding red light, green light, and blue light by a photosensitive element 20 of the periscope camera module 100;
- (c) acquiring the outline of the actual scene; and
- (d) assisting a main camera module 200 to complete focusing.
- Specifically, the periscope camera module 100 disperses the composite light 400 into the monochromatic light 300, so that images of the red light, the green light, and the blue light can be obtained.
- A processing device 50 analyzes the actual position of the actual scene corresponding to the monochromatic light 300 according to the differences in sharpness of the monochromatic light 300 of different colors, obtains the outline of the actual scene, and thereby obtains a focus parameter. The processing device 50 outputs the focus parameter to the main camera module 200, so that the main camera module 200 completes focusing according to the focus parameter to obtain a second image 102. This improves the focusing efficiency of the main camera module 200 and can reduce the accuracy requirements on the focusing mechanism of the main camera module 200.
- the method further includes the step (e): obtaining a first image 101 by the periscope camera module 100.
- a final image 103 of the camera module 1000 is synthesized.
- Specifically, the first image 101 and the second image 102 are synthesized by an algorithm to obtain the final image 103, wherein the first image 101 can carry accurate depth information, and the final image 103 obtained by fusing the first image 101 and the second image 102 can reflect the outline of the actual scene and present a three-dimensional effect, thereby improving the quality of the image acquired by the camera module 1000. For example, referring to FIG. 7, when a face is photographed with a mobile electronic device 2000 carrying the camera module 1000, the first image 101 obtained by the periscope camera module 100 of the camera module 1000 can reflect the actual position of the human face, that is, the relative positions of the points constituting the face, and thus the outline of the face, and the first image 101 presents the depth information of the face; after the first image 101 and the second image 102 acquired by the main camera module 200 are further synthesized into the final image 103, the final image 103 can present a three-dimensional outline of the human face.
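A minimal sketch of one possible fusion of the two images is given below: it attaches the normalized depth carried by the first image 101 as a fourth channel of the second image 102, producing an RGBD array that stands in for the final image 103. This is only one illustrative choice; the disclosure does not specify the fusion algorithm, and the array sizes and normalization are assumptions for the example.

```python
import numpy as np

def fuse_final_image(second_image_rgb: np.ndarray, first_image_depth: np.ndarray) -> np.ndarray:
    """Attach normalized depth from the first image 101 to the second image 102 (RGBD)."""
    depth = first_image_depth.astype(np.float32)
    span = float(depth.max() - depth.min())
    depth = (depth - depth.min()) / (span + 1e-9)        # normalize depth to [0, 1]
    rgb = second_image_rgb.astype(np.float32) / 255.0    # scale RGB to [0, 1]
    return np.dstack([rgb, depth])

second_image = np.zeros((480, 640, 3), dtype=np.uint8)  # from the main camera module 200
first_image_depth = np.linspace(0.5, 3.0, 480 * 640, dtype=np.float32).reshape(480, 640)
final_image = fuse_final_image(second_image, first_image_depth)
print(final_image.shape)  # (480, 640, 4)
```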
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computing Systems (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Microscoopes, Condenser (AREA)
- Studio Devices (AREA)
- Color Television Image Signal Generators (AREA)
Abstract
一摄像模组(1000)及其潜望式摄像模组(100)和图像获取方法及工作方法,其中潜望式摄像模组(100)包括一光处理单元(10)、一镜头单元(30)以及一感光元件(20),其中经过光处理单元(10)的一复合光(400)色散形成多单色光(300),单色光(300)能够被镜头单元(30)聚焦,其中光处理单元(10)和镜头单元(30)被保持于感光元件(20)的感光路径,镜头单元(30)被设置于光处理单元(10)和感光元件(20)之间,单色光(300)经过镜头单元(30)后被感光元件(20)的一感光面(210)接收,并能够在后续分析得到单色光(300)对应的实际景物的位置,以获得实际景物的轮廓,进而扩大潜望式摄像模组(100)的景深。
Description
本发明涉及一摄像模组领域,特别涉及一摄像模组及其潜望式摄像模组和图像获取方法及工作方法。
近年来,消费者对移动电子设备成像的要求越来越高,被配置于移动电子设备的单镜头摄像模组已经无法满足消费者的需求。市面上逐渐出现的双摄模组包括一主摄像模组和一副摄像模组,其中所述主摄像模组和所述副摄像模组能够分别获得图像,且所述主摄像模组和所述副摄像模组获得的图像能够被合成,进而得到具有高分辨率的图像,这也使得所述双摄模组受到更大消费者的青睐。现有的所述双摄模组包括黑白双摄模组、彩色双摄模组以及广角长焦摄像模组等多种类型,其中广角长焦摄像模组是被应用较为广泛的双摄模组之一。所述广角长焦摄像模组包括具有大视场角的一广角摄像模组和具有窄视场角的一长焦摄像模组,通过将所述广角摄像模组和所述长焦摄像模组获取的图像进行融合而获得最终的图像,能够满足广角拍摄和长焦拍摄的需求。但是在实际的应用过程中,由于长焦摄像模组的长度较长,造成所述双摄模组的高度偏高,体积偏大,不符合目前移动电子设备趋于轻薄化发展的趋势。
在现有的一些设计中,所述长焦摄像模组被设计成一潜望式摄像模组,其中所述潜望式摄像模组藉由一光学元件改变进入所述潜望式摄像模组的光束的光路方向,使得进入所述潜望式摄像模组的光束能够从一第一方向被转折至与所述第一方向相互垂直的一第二方向,再通过一镜头模块和一滤色片后到达一感光芯片,通过这样的方式,能够保障所述长焦距模组在满足长焦拍摄效果的同时降低所述长焦摄像模组的高度,以利于所述双摄模组被应用于轻薄化的移动电子设备。但是,现有的所述潜望式摄像模组也存在不少问题。首先,尽管现有的所述潜望式摄像模组的焦距较长,但是所述潜望式摄像模组能够获取的图像的景深较浅。其次,为了拍摄高质量的图像,需要通过一驱动件大范围地驱动所述镜头模块和/或所述光学元件实现对焦,而且,为了保障成像效果,在对焦过程中,要确保所述驱动件运动或转动的精度,因此,现有的潜望式摄像模组对所述驱动件 的精度要求较高。
发明内容
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述摄像模组的至少一个所述潜望式摄像模组通过接收多束单色光的方式获取被拍摄物体的图像,以利于提升景深。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组分别获得不同颜色的所述单色光的成像,并将清晰的所述单色光的成像合成所述潜望式摄像模组的最终图像,通过将不同颜色的所述单色光的清晰的像叠加得到最终图像,以提升所述潜望式摄像模组的景深。
本发明的另一目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中形成于所述潜望式摄像模组的一感光元件的一感光面的不同颜色的所述单色光的成像对应于所述被拍摄物体的不同清晰物面位置,即,通过将不同清晰物面位置的图像合成而得到所述潜望式摄像模组的最终图像,进而提升了所述潜望式摄像模组的景深。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组提供一光处理单元,来自被拍摄物体的一复合光经过所述光处理单元被分散成多束不同颜色的所述单色光,使得不同颜色的所述单色光对应的不同清晰物面位置的离散程度提升,以利于得到对应于不同位置的清晰物面的成像,并在后续通过算法将对应于不同清晰物面位置的图像相互叠加合成,以提高所述潜望式摄像模组的景深。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述感光元件的每一像素点能够对应地接收不同颜色的所述单色光,并分别得到不同颜色的所述单色光单独形成的清晰的像。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组分别获取多束所述单色光中的至少一红色光、至少一绿色光以及至少一蓝色光的清晰成像,并在后续根据算法将所述红色光、所述绿色光以及所述蓝色光对应的不同清晰物面位置的图像合成,通过将不同颜色的所述单色光的清晰成像的叠加以实现提升景深。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述感光元件的每一像素点能够对应地接收所述红色光、所述绿色光以及所述蓝色光三种颜色的光,进而分别得到所述红色光、所述绿色光以及所述蓝色光单独形成的图像。
本发明的一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述摄像模组能够获得高质量的图像。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述摄像模组藉由至少一个所述潜望式摄像模组获得的图像能够具有深度信息。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组能够确定实际景物的所在的位置,并获取实际景物的轮廓。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组通过接收所述单色光的方式能够确定实际景物的深度信息,并获取实际景物的轮廓。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组通过接收所述单色光的方式能够确定实际景物的位置,并获取实际景物的轮廓。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组能够接收所述红色光、所述绿色光以及所述蓝色光三种颜色的所述单色光,并根据三种颜色的所述单色光成的像分析实际景物所在的位置,并获取实际景物的轮廓。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述感光元件的每一像素点能够对应地接收所述红色光、所述绿色光以及所述蓝色光三种颜色的光,并根据三种颜色的光的锐利程度的差别分析实际景物所在的位置,进而获得实际景物的轮廓。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组能够根据实际景物的轮廓获取当前区域的对焦参数,并将图像中出现的景物保持于所述潜望式摄像模组的一镜头单元的景深内,以利于提高所述潜望式摄像模组的对焦效率。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组能够根据实际景物的轮廓获取当前区域的对焦参数,并将图像中出现的景物保持于所述潜望式摄像模组的一镜头单元的焦距上,进而降低了驱动所述镜头单元完成对焦的一对焦驱动件的要求。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述摄像模组提供至少一主摄像模组,所述主摄像模组和所述潜望式摄像模组相互配合获得高质量的图像。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组辅助所述主摄像模组完成对焦,以提高所述主摄像模组的对焦效率。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组进一步包括一柱面镜,所述柱面镜能够扩大经过所述光处理单元的光线的色散程度,减小红色光、绿色光以及蓝色光之间的重叠部分,使得所述感光元件能够获取更好的单色光形成的图像。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组进一步包括一自由曲面镜,所述自由曲面镜能够扩大经过所述光处理单元的光线的色散程度,减小所述红色光、所述绿色光和所述蓝色光之间的重叠部分,使得所述感光元件能够获取更好的单色光形成的图像。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组的所述光处理单元能够被驱动而转动,使得经过所述光处理单元的所述复合光形成的所述单色光能够全部被所述感光元件接收,以保障所述潜望式摄像模组成像的质量。
本发明的另一个目的在于提供一摄像模组及其潜望式摄像模组和图像获取方法及工作方法,其中所述潜望式摄像模组的所述光处理单元能够被驱动而转动,进而所述感光元件不需要扩大所述感光元件的面积就能够接收全部的所述红色光、所述绿色光以及所述蓝色光,通过这样的方式既保障了所述潜望式摄像模组成像质量的同时,又有利于所述潜望式摄像模组轻薄化。
依本发明的一个方面,本发明进一步提供一潜望式摄像模组的图像获取方法,所述图像获取方法包括如下步骤:
(a)藉由一光处理单元转折来自一被拍摄物体的一复合光的同时分散所述复合光,形成多束单色光;
(b)藉由一感光元件的一感光面接收所述单色光,分别获得不同颜色的所述单色光的成像;以及
(c)合成不同颜色的所述单色光的成像,以得到被拍摄物体的最终图像。
根据本发明的一个实施例,在所述步骤(b)中进一步包括步骤(d):所述感光元件对应地接收多束所述单色光中的至少一红色光、至少一绿色光以及至少一蓝色光。
根据本发明的一个实施例,在所述步骤(b)中,藉由所述感光元件的所述感光面的一红色像素单元、一绿色像素单元以及一蓝色像素单元分别地接收所述红色光、所述绿色光以及所述蓝色光。
根据本发明的一个实施例,在所述步骤(b)中,通过所述红色像素单元、所述滤色像素单元以及所述蓝色像素单元按照R-RB-RGB-RG-G的顺序自上而下地排列的方式对应地接收所述红色光、所述绿色光以及所述蓝色光。
根据本发明的一个实施例,在所述步骤(b)之前进一步包括步骤(e):藉由一镜头单元分别汇聚不同颜色的所述单色光。
根据本发明的一个实施例,在所述步骤(e)之后进一步包括步骤(f):藉由一辅助元件扩大不同颜色的所述单色光之间的距离。
根据本发明的一个实施例,在所述步骤(b)之前进一步包括步骤(g):藉由一滤色元件过滤杂光。
根据本发明的一个实施例,在所述步骤(b)之前进一步包括步骤(h):转动所述光处理单元。
依本发明的另一个方面,本发明进一步提供一摄像模组的工作方法,所述工作方法包括如下步骤:
(a)藉由至少一潜望式摄像模组的一光处理单元转折来自一被拍摄物体的一复合光的同时分散所述复合光,形成多束单色光;
(b)藉由一感光元件的一感光面接收所述单色光;
(c)获取所述被拍摄物体的轮廓;以及
(d)辅助至少一主摄像模组完成对焦。
根据本发明的一个实施例,在上述步骤中,其中所述潜望式摄像模组藉由所 述感光元件的所述感光面对应地接收多束所述单色光中的至少一束红色光、至少一束绿色光以及至少一束蓝色光的方式获得关于所述被拍摄物体的一第一图像。
根据本发明的一较佳实施例,在所述步骤(c)中如下步骤:
(c.1)形成所述红色光、所述绿色光以及所述蓝色光的像;
(c.2)分析所述红色光、所述绿色光以及所述蓝色光的锐利程度;以及
(c.3)确定所述被拍摄物体的位置。
根据本发明的一较佳实施例,在所述步骤(c.3)之后进一步包括步骤(c.4):根据被拍摄物体的位置获取所述被拍摄物体的轮廓。
根据本发明的一较佳实施例,在所述步骤(d)中进一步包括如下步骤:
所述潜望式摄像模组根据被拍摄物体的轮廓确定一对焦参数;和
所述主摄像模组根据所述对焦参数完成对焦。
根据本发明的一个实施例,在所述步骤(d)之后,所述主摄像模组获得关于所述被拍摄物体的一第二图像。
根据本发明的一个实施例,在上述方法中,合成所述第一图像和所述第二图像以得到所述摄像模组的一最终图像。
依本发明的另一个方面,本发明进一步提供一潜望式摄像模组,其包括:
一光处理单元,其中经过所述光处理单元的一复合光色散形成多单色光;
一镜头单元,其中所述单色光能够被所述镜头单元聚焦;以及
一感光元件,其中所述光处理单元和所述镜头单元被保持于所述感光元件的感光路径,所述镜头单元被设置于所述光处理单元和所述感光元件之间,所述单色光经过所述镜头单元后被所述感光元件的一感光面接收。
根据本发明的一个实施例,其中所述感光元件的所述感光面对应地接收多束所述单色光中的至少一红色光、至少一绿色光以及至少一蓝色光。
根据本发明的一个实施例,所述的潜望式摄像模组进一步包括一滤色元件,其中所述滤色元件被保持于所述感光元件的感光路径,且光通过所述滤光元件后达到所述感光元件,所述滤色元件仅允许所述红色光、所述绿色光以及所述蓝色光通过。
根据本发明的一个实施例,所述滤色元件被设置于所述镜头单元和所述感光元件之间。
根据本发明的一个实施例,所述的潜望式摄像模组进一步包括一辅助元件, 其中所述辅助元件被设置于所述光处理单元和所述镜头单元之间,且所述辅助元件被保持于所述感光元件的感光路径。
根据本发明的一个实施例,所述辅助元件为一柱面镜。
根据本发明的一个实施例,所述辅助元件为一自由曲面镜。
根据本发明的一个实施例,所述的潜望式摄像模组进一步包括一驱动元件,其中所述驱动元件被设置于所述光处理单元,且所述光处理单元被可驱动以转动地连接于所述驱动元件。
根据本发明的一个实施例,所述感光元件包括多像素点,其中所述像素点包括多个像素单元,其中所述像素单元选自:一红色像素单元、一绿色像素单元、以及一蓝色像素单元组成的像素单元类型中的一种或是多种的组合。
根据本发明的一个实施例,所述像素单元按照R-RB-RGB-RG-G的顺序自上而下地排列。
根据本发明的一个实施例,所述光处理单元为一棱镜。
依本发明的一个方面,本发明进一步提供一摄像模组,其包括:
至少一潜望式摄像模组,其中所述潜望式摄像模组包括一光处理单元、一镜头单元以及一感光元件,其中经过所述光处理单元的一复合光色散形成多单色光,其中所述单色光能够被所述镜头单元聚焦,其中所述光处理单元和所述镜头单元被保持于所述感光元件的感光路径,所述镜头单元被设置于所述光处理单元和所述感光元件之间,所述单色光经过所述镜头单元后被所述感光元件的一感光面接收,并获得关于所述被拍摄物体的一第一图像;和
至少一主摄像模组,其中所述主摄像模组获得关于所述被拍摄物体的一第二图像,所述第一图像和所述第二图像被处理后形成一最终图像。
图1是根据本发明的一较佳实施例的一潜望式摄像模组的立体示意图。
图2A是根据本发明的上述较佳实施例的所述潜望式摄像模组的光路示意图。
图2B是根据本发明的上述较佳实施例的所述潜望式摄像模组的一感光芯片的所述像素点的示意图。
图3A是根据本发明的上述较佳实施例的所述潜望式摄像模组的光路示意 图。
图3B是根据本发明的上述较佳实施例的所述潜望式摄像模组的光路示意图的部分放大图。
图4是根据本发明的另一较佳实施例的所述潜望式摄像模组的光路示意图。
图5是根据本发明的另一较佳实施例的所述潜望式摄像模组的光路示意图。
图6是根据本发明的一较佳实施例的一摄像模组的应用于一移动电子设备的示意图。
图7是根据本发明的上述较佳实施例所述的移动电子设备通过所述摄像模组获取图像的场景示意图。
以下描述用于揭露本发明以使本领域技术人员能够实现本发明。以下描述中的优选实施例只作为举例,本领域技术人员可以想到其他显而易见的变型。在以下描述中界定的本发明的基本原理可以应用于其他实施方案、变形方案、改进方案、等同方案以及没有背离本发明的精神和范围的其他技术方案。
本领域技术人员应理解的是,在本发明的揭露中,术语“纵向”、“横向”、“上”、“下”、“前”、“后”、“左”、“右”、“竖直”、“水平”、“顶”、“底”“内”、“外”等指示的方位或位置关系是基于附图所示的方位或位置关系,其仅是为了便于描述本发明和简化描述,而不是指示或暗示所指的装置或元件必须具有特定的方位、以特定的方位构造和操作,因此上述术语不能理解为对本发明的限制。
可以理解的是,术语“一”应理解为“至少一”或“一个或多个”,即在一个实施例中,一个元件的数量可以为一个,而在另外的实施例中,该元件的数量可以为多个,术语“一”不能理解为对数量的限制。
参照说明书图1至图3B,依据本发明一较佳实施例的一潜望式摄像模组100将在接下来的描述中被阐述,其中所述潜望式摄像模组100接收不同颜色的多束单色光300的方式获得多个不同颜色的所述单色光300的成像,其中不同颜色的所述单色光300对应所述被拍摄物体不同清晰物面位置,具体地,每一所述单色光300的成像对应于被拍摄物体的一个清晰物面位置,并在后续,通过算法将不同颜色的所述单色光300的成像相互叠加,以合成所述被拍摄物体的最终图像,通过这样的方式提升了所述潜望式摄像模组100的景深。优选地,所述潜望式摄 像模组获取每一所述单色光300的清晰的成像,通过将所述单色光300的清晰的成像相互叠加的方式提高所述摄像模组100的景深,同时获得高质量的图像。
所述潜望式摄像模组100包括一光处理单元10和一感光元件20,其中所述光处理单元10被保持于所述感光元件20的感光路径,进入所述潜望式摄像模组100的光线经过所述光处理单元10后能够到达所述感光元件20,并成像于所述感光元件20的一感光面210。具体来说,来自一被拍摄物体的一复合光400经过所述光处理单元10发生色散,进而形成多束不同颜色的所述单色光300,即,所述复合光400经过所述光处理单元10后发生色散形成一红色光、一橙色光、一黄色光、一绿色光、一蓝色光、一靛色光以及一紫色光七种颜色的所述单色光300,并在后续,不同颜色的所述单色光300分别形成清晰的成像于所述感光元件20的所述感光面,不同颜色的所述单色光300的清晰成像相当于多个所述被拍摄物体不同清晰物面位置,通过算法合成不同颜色的所述单色光300的清晰成像,以得到所述被拍摄物体的最终图像。换句话说,所述光处理元件10使得来自所述被拍摄物体的所述复合光400的多束所述单色光300的色散程度提升,进而提升了所述单色光300对应的不同清晰物面的离散程度,以利于所述潜望式摄像模组100通过将不同单色光300的成像叠加的方式实现提升景深。
进一步地,参照图1至图3B,所述潜望式摄像模组100包括一镜头单元30,其中所述镜头单元30被设置于所述光处理单元10和所述感光元件20之间,且所述镜头单元30被保持于所述感光元件20的感光路径,从所述光处理单元10射出的多束所述单色光300到达所述镜头单元30,且所述单色光300被所述镜头单元30聚焦,进一步地,经过所述镜头单元30后的不同颜色的所述单色光300分别成像于所述感光元件20的所述感光面210,以获得多个不同清晰物面位置的成像。由于不同颜色的光在同一介质中的折射率不同、传播速度也不相同等因素的影响,与所述潜望式摄像模组的所述感光元件20的所述感光面210距离相同的物面所反射进入所述潜望式摄像模组内的所述复合光400被分散形成的所述红色光、所述绿色光以及所述蓝色光会被聚焦于不同的平面,即,不同颜色的所述单色光300能够形成的清晰的像面位置也不一致,利用算法将不同清晰物面位置的图像合成后得到所述潜望式摄像模组的最终图像,进而提升了所述潜望式摄像模组100的景深。
具体来说,参照图3A和图3B,所述复合光400经过所述光处理单元10后 发生色散形成所述单色光300,由于不同的所述单色光300的波长和频率不同,且所述光处理单元10对具有不同波长的所述单色光300的折射率不同,使得不同的所述单色光300从所述光处理单元10射出后的位置也不相同。比如说,沿X轴方向传播的所述复合光400经过所述光处理单元10后形成的沿Z轴方向传播的所述单色光300按照所述红色光、所述橙色光、所述黄色光、所述绿色光、所述蓝色光、所述靛色光以及所述紫色光的顺序依次自上而下排列,即,当七种颜色的所述单色光300到达垂直于Z轴方向的平面,如,所述感光元件20的所述感光面210,不同颜色的单色光300形成的图像区域也自上而下排列于所述感光元件20的所述感光面210,其中所述红色光位于最上方的区域,所述橙色光次之,所述紫色光位于最下方的区域,进而提升了不同清晰物面的离散程度,以利于所述潜望式摄像模组100通过将不同单色光300的成像叠加的方式实现提升景深。在本发明的一较佳实施例中,所述光处理单元10为一棱镜。本领域技术人员应该知晓的是,所述光处理单元10的具体实施方式仅仅作为示例,不能成为对本发明所述潜望式摄像模组100的内容和范围的限制。
进一步地,所述感光元件20对应地接收不同颜色的所述单色光300,所述感光元件20将光信号转换成电信号,并将电信号传输至与所述感光元件20可通信地连接的一处理装置50,进而能够分别得到不同颜色的所述单色光的图像,即,获得多个不同清晰物面位置的成像。通过算法将不同颜色的单色光的图像进行叠加后合成所述第一图像101。
优选地,所述感光元件20的所述感光面210对应地接收所述红色光、所述绿色光以及所述蓝色光这三种颜色的所述单色光300,进而得到所述红色光、所述绿色光以及所述蓝色光单独的清晰的成像,即,获得三个不同清晰物面位置的成像。所述红色光的图像、所述绿色光的图像以及所述蓝色光的图像对应于三个不同清晰物面位置,根据算法合成所述红色光的图像、所述绿色光的图像以及所述蓝色光的图像,以实现提升景深。比如说,参照图3A,所述感光元件20的所述感光面210对应地接收所述红色光、所述绿色光以及所述蓝色光这三种颜色的所述单色光300,进而分别得到所述红色光的成像A、所述绿色光图像B以及所述蓝色光的图像C,其中所述红色光的成像A、所述绿色光图像B以及所述蓝色光的图像C相当于三个不同清晰物面位置的成像,根据所述红色光的成像A和所述绿色光的图像B以及所述蓝色光的图像C的信息能够获得被拍摄物体的物面位 置A’、物面位置B’以及物面位置C’,从而在后续,根据算法能够获得合成所述红色光的成像A、所述绿色光图像B以及所述蓝色光的图像C,以得到所述被拍摄物体的清晰图像。应该理解的是,所述感光元件20也可以通过接收其他颜色的所述单色光300获得被拍摄物体的图像。
具体来说,参照图2B,所述感光元件20的所述感光面210包括多像素点21,即,所述感光元件20的所述感光面210包括一像素点阵列,并藉由多个所述像素点21接收对应的所述红色光、所述绿色光以及所述蓝色光,进而能够得到三种颜色的所述单色光300的清晰的图像。也就是说,所述感光元件20的所述红色像素单元2111、所述绿色像素单元2112、以及所述蓝色像素单元2113对应的所述红色光、所述绿色光以及所述蓝色光的成像相当于是三个不同清晰物面位置的光学系统,通过算法将不同清晰物面位置的“三摄”的图像合成后,能够得到被拍摄物体的最终图像。应该理解的是,“三摄”的图像是指通过藉由所述红色像素单元2111、所述绿色像素单元2112、以及所述蓝色像素单元2113所得到的所述红色光的成像、所述绿色光的成像以及所述蓝色光的成像。
更具体地,所述像素点21包括多个所述像素单元211,其中每个像素单元211能够接收对应颜色的所述单色光300,多个所述像素单元211有序地自上而下排列,进而使得所述像素点211能够接收对应颜色的所述单色光300。所述像素单元211选自:一红色像素单元2111、一绿色像素单元2112、以及一蓝色像素单元2113组成的像素单元类型中的一种或是多种的组合,其中所述红色像素单元2111能够接收所述红色光,所述绿色像素单元2112能够接收所述绿色光,所述蓝色像素单元2113能够接收所述蓝色光,所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113被按照一预设规则排列,进而使得所述像素点21能够接收对应颜色的所述单色光300。在说明书附图和下面的描述中,所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113分别用字母“R”、“G”以及“B”代表,如图2B所示。
优选地,所述感光元件20被实施为不规则颜色滤光器阵列。具体来说,所述像素单元211按照R-RB-RGB-RG-G的顺序自上而下地排列,使得所述感光元件20能够接收对应颜色的所述单色光。具体来说,由于不同的所述单色光300的波长和频率不同,且所述光处理单元10对具有不同波长的所述单色光300的折射率不同,使得不同的所述单色光300从所述光处理单元10射出后的位置也不 相同,且所述单色光300按照所述红色光、所述橙色光、所述黄色光、所述绿色光、所述蓝色光、所述靛色光以及所述紫色光的顺序依次自上而下排列,因此,对应的仅包含所述红色像素单元2111的所述像素单元211分布于最上方,仅包含所述蓝色像素单元2113的所述像素单元211分布于最下方。进一步地,由于折射角度和焦距的影响,所述红色光、所述绿色光以及所述蓝色光难以完全分离,即,自上而下会出现所述红色光和所述绿色光重合,所述红色光、所述绿色光和所述蓝色光,所述绿色光和所述蓝色光重合的现象,对应地,包括至少一个所述红色像素单元2111和至少一个所述绿色像素单元2112的所述像素单元211位于仅包含所述红色像素单元2111的所述像素单元211的下方,包括至少一个所述红色像素单元2111、至少一个所述绿色像素单元2112以及至少一个所述蓝色像素单元2113的所述像素单元211位于仅包括所述蓝色像素单元2113的所述像素单元211的上方。也就是说,按照色散后形成的所述单色光300的分布设计对应的所述感光元件20的所述像素点21的所述像素单元211,使得所述感光元件20能够更好地接收对应颜色的所述单色光300,进一步有利于提升成像效果。
可选地,所述感光元件20被实施为RGB颜色滤光器阵列,即每个所述像素单元211包括至少一个所述红色像素单元2111、至少一个所述绿色像素单元2112以及至少一个所述蓝色像素单元2113。
可选地,所述感光元件20被实施为RGBW颜色滤光器阵列,其中“W”为一白像素单元,即,每个所述像素单元211包括至少一个所述红色像素单元2111、至少一个所述绿色像素单元2112、至少一个所述蓝色像素单元2113以及至少一个白色像素单元,通过所述白像素单元来补充亮度,以提高成像质量。
参照图1和图2A,所述潜望式摄像模组100进一步包括一滤色元件40,其中所述滤色元件40被保持于所述感光元件20的感光路径,且所述滤色元件40被设置于所述感光元件20前,使得进入所述潜望式摄像模组100的光线先经过所述滤色元件40后再到达所述感光元件20。所述滤色元件40对杂光进行过滤,以保障形成于所述感光元件20的所述感光面210的成像的清晰程度。比如说,比如说,所述滤色元件40仅允许一预设波段的光线通过,以保障所述红色光、所述绿色光以及所述蓝色光分别形成清晰的独立的图像于所述感光元件20的所述感光面210,从而保障成像质量。优选地,所述滤色元件40被设置于所述镜头单元30和所述感光元件20之间,光线经过所述镜头单元30后到达所述滤色 元件40,所述滤色元件40对杂光进行过滤。优选地,所述滤色元件40被设置于所述镜头单元30和所述光处理模块10之间。应该理解的是,所述滤色元件40的具体实施方式仅仅作为示例,不能成为对本发明所述潜望式摄像模组100的内容及范围的限制。
在说明书附图4所示出的所述潜望式摄像模组100的这个具体的实施例中,所述潜望式摄像模组100进一步包括一辅助元件80,其中所述辅助元件80被设置于所述光处理单元10和所述镜头单元30之间,且所述辅助元件80被保持于所述感光元件20的感光路径,从所述光处理单元10射出的所述单色光300经过所述辅助元件80后被分散,使不同颜色的所述单色光300的重叠部分被减少,这样,所述红色光、所述绿色光以及所述蓝色光的重叠区域被减少,以利于所述感光元件20更好地接收对应颜色的所述单色光300,并提高了所述红色光、所述绿色光以及所述蓝色光单独形成的图像的清晰度,所述辅助元件80可进一步提升景深。优选地,所述辅助元件80被实施为一柱面镜。优选地,所述辅助元件80被实施为一自由曲面。
说明书附图5中示出的所述潜望式摄像模组100与附图2A中示出的所述潜望式摄像模组100的差异在于,图5中示出的所述潜望式摄像模组100进一步包括一驱动元件90,其中所述光处理单元10被可驱动以转动地连接于所述驱动元件90,使得从所述光处理单元10射出的所述红色光、所述绿色光以及所述蓝色光能够全部被所述感光元件20接收,通过这样的方式,使得所述感光元件20不需要扩大所述感光元件20的所述感光面210的面积就能够接收全部的所述红色光、所述绿色光以及所述蓝色光。举例来说,由于所述光处理单元10对不同的颜色的所述单色光300的折射率不同,从所述光处理单元10射出的所述单色光300与YZ平面所成的角度不同,随着光的传播,所述单色光300之间的距离会变大,若要接收全部的所述红色光、所述绿色光以及所述蓝色光,需要的所述感光元件20的感光面积偏大,不利于所述潜望式摄像模组100轻薄化,而通过驱动所述光处理单元10绕Y轴转动,使得所述感光元件20能够依次接收所述红色光、所述绿色光以及所述蓝色光,并分别得到三种颜色的所述单色光300对应的图像,以在后续能够得到所述第一图像101。
依本发明的另一个方面,本发明进一步提供一潜望式摄像模组的图像获取方法,其中所述图像获取方法包括如下步骤:
(a)藉由一光处理单元10转折来自一被拍摄物体的一复合光400的同时分散所述复合光400,形成多束单色光300;
(b)藉由一感光元件20的一感光面210接收所述单色光300,分别获得不同颜色的所述单色光300的成像;以及
(c)合成不同颜色的所述单色光200的成像,以得到被拍摄物体的最终图像。
具体来说,在所述步骤(a)中,来自被拍摄物体的所述复合光400经过所述光处理单元10能够发生色散,进而形成不同颜色的所述单色光300,即,所述复合光400经过所述光处理单元10后发生色散形成所述红色光、一橙色光、一黄色光、所述绿色光、所述蓝色光、一靛色光以及一紫色光七种颜色的所述单色光300,并在后续,不同颜色的所述单色光300能够成像于所述感光元件20的所述感光面210,其中不同颜色的所述单色光300对应所述被拍摄物体不同清晰物面位置,具体地,每一所述单色光300的清晰的成像对应于被拍摄物体的一个清晰物面位置,并在后续,通过算法将不同颜色的所述单色光300的清晰成像相互叠加,以合成所述被拍摄物体的最终图像,通过这样的方式提升了所述潜望式摄像模组100的景深。
优选地,在所述步骤(b)中,所述感光元件20的所述感光面210对应地接收所述红色光、所述绿色光以及所述蓝色光,进而获得所述红色光、所述绿色光以及所述蓝色光的清晰的成像。值得一提的是,所述感光元件20接收的所述单色光300的具体颜色不能成为对本发明所述潜望式摄像模组的图像获取方法的内容和范围的限制。更具体地,藉由所述感光元件210的所述感光面210的所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113分别地接收所述红色光、所述绿色光以及所述蓝色光,进而分别获得所述红色光、所述绿色光以及所述蓝色光的成像。更优选地,所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113按照R-RB-RGB-RG-G的顺序自上而下地排列的方式对应地接收所述红色光、所述绿色光以及所述蓝色光。
在所述步骤(b)之前进一步包括步骤(e):藉由一镜头单元30汇聚所述单色光300。
优选地,在所述步骤(e)之后进一步包括步骤(f):藉由一辅助元件80扩大不同颜色的所述单色光300之间的距离。具体来说,从所述光处理单元10射 出的所述单色光300经过所述辅助元件80后被分散,使不同颜色的所述单色光300的重叠部分被减少,这样,所述红色光、所述绿色光以及所述蓝色光的重叠区域被减少,以利于所述感光元件20更好地接收对应颜色的所述单色光300,并提高了所述红色光、所述绿色光以及所述蓝色光单独形成的图像的清晰度。
在所述步骤(b)之前进一步包括步骤(g):藉由一滤色元件40过滤杂光。具体地,所述滤色元件40仅允许一预设波段的光线通过,以保障所述红色光、所述绿色光以及所述蓝色光分别形成清晰的独立的图像于所述感光元件20的所述感光面210,从而保障成像质量。
优选地,在所述步骤(b)之前进一步包括步骤(h):转动所述光处理单元10。具体来说,藉由一驱动元件90够驱动所述光处理单元10转动,使得所述感光元件20能够依次接收所述红色光、所述绿色光以及所述蓝色光,通过这样的方式,能够使得从所述光处理单元20射出的所述红色光、所述绿色光以及所述蓝色光能够全部被所述感光元件20接收,利于所述潜望式摄像模组100轻薄化。
根据本发明的另一个方面,依本发明的另一较佳实施例的一潜望式摄像模组100将在接下来的描述中被阐述。参照说明书附图6和图7,至少一所述潜望式摄像模组100被应用一摄像模组1000,其中所述摄像模组1000能够获得高质量的图像。具体来说,所述摄像模组1000包括至少一所述潜望式摄像模组100和至少一主摄像模组200,其中所述潜望式摄像模组100和所述主摄像模组200能够分别获得一第一图像101和一第二图像102,其中所述第一图像101和所述第二图像102通过算法能够合成所述摄像模组1000的一最终图像103。进一步地,所述潜望式摄像模组100能够确定实际景物的位置,并获得实际景物的轮廓,且所述潜望式摄像模组100获得的所述第一图像101具有深度信息,通过所述第一图像101和所述第二图像102融合得到的所述最终图像103能够体现出实际景物的轮廓,并呈现出立体的效果,进而提高了所述摄像模组1000获取的图像质量。
值得一提的是,所述摄像模组1000的所述潜望式摄像模组100和所述主摄像模组200的数量不受限制,比如说,所述摄像模组1000可以被实施为双摄模组,即,所述摄像模组1000包括一个所述潜望式摄像模组100和一个所述主摄像模组200;所述摄像模组1000也可以被实施为三摄模组,即,所述摄像模组1000包括两个所述潜望式摄像模组100和一个所述主摄像模组200,或是,所述摄像模组1000包括一个所述潜望式摄像模组100和两个所述主摄像模组200。 所述摄像模组1000的所述潜望式摄像模组100和所述主摄像模组200的具体数量仅仅作为示例,不能成为对本发明所述摄像模组1000及其所述潜望式摄像模组100的内容及范围的限制。在本发明的说明书附图及下面的描述中,以所述摄像模组1000被实施为双摄模组为例。
参照图1至图3B,所述潜望式摄像模组100通过接收一单色光300的方式成像,并根据接收的所述单色光300的锐利程度分析实际景物的实际位置,并获得实际景物的轮廓。具体来说,所述潜望式摄像模组100包括一光处理单元10和一感光元件20,其中所述光处理单元10被保持于所述感光元件20的感光路径,进入所述潜望式摄像模组100的光线能够经过所述光处理单元10后能够到达所述感光元件20,并成像于所述感光元件20的一感光面210。进一步地,外部环境中的一复合光400经过所述光处理单元10发生色散,进而形成所述单色光300,即,所述复合光400经过所述光处理单元10后发生色散形成一红色光、一橙色光、一黄色光、一绿色光、一蓝色光、一靛色光以及一紫色光七种颜色的所述单色光300,并在后续,所述红色光、所述绿色光以及所述蓝色光这三种颜色的所述单色光300能够成像于所述感光元件20,进而根据这三种颜色的所述单色光300形成的图像分析计算每个所述单色光300对应的实际景物的位置,从而得到实际景物的轮廓。具体来说,所述感光元件20的所述感光面210为一平面,由于所述红色光、所述绿色光和所述蓝色光,对应折射率不一致、传播速度等存在一定差异,且所述红色光、所述绿色光以及所述蓝色光对应的实际景物的位置和所述感光元件20的所述感光面210之间的相对位置也存在差异,因此,藉由所述感光元件20接收到的不同单色光的锐利程度也存在差异。进一步地,根据形成于所述感光元件20的所述感光面210的图像,能够分析出不同单色光对应的实际景物位置或者深度信息。应该知晓的是,外部环境中的白光、太阳光等都属于所述复合光400。
值得一提的是,所述光处理单元10能够改变经过进入所述潜望式摄像模组100的光束的光路方向。比如说,沿着X轴方向传播的所述复合光400入射至所述光处理单元10,经过所述光处理单元10后能够沿着Z轴方向传播,即,藉由所述光处理单元10能够使得进入所述潜望式摄像模组100的光束的光路发生转折,进而所述潜望式摄像模组100在具有长焦拍摄效果的同时能够降低所述摄像模组1000的整体高度,以利于所述摄像模组1000被应用于一移动电子设备2000, 并符合所述移动电子设备2000轻薄化发展的趋势,其中所述移动电子设备2000如手机、iPad等。
参照图3A和图3B,所述复合光400经过所述光处理单元10后发生色散形成所述单色光300,由于不同的所述单色光300的波长和频率不同,且所述光处理单元10对具有不同波长的所述单色光300的折射率不同,使得不同的所述单色光300从所述光处理单元10射出后的位置也不相同。比如说,沿X轴方向传播的所述复合光400经过所述光处理单元10后形成的沿Z轴方向传播的所述单色光300按照所述红色光、所述橙色光、所述黄色光、所述绿色光、所述蓝色光、所述靛色光以及所述紫色光的顺序依次自上而下排列,即,当七种颜色的所述单色光300到达垂直于Z轴方向的平面,如,所述感光元件20的所述感光面210,不同颜色的单色光300形成的图像区域也自上而下排列于所述感光元件20的所述感光面210,其中所述红色光位于最上方的区域,所述橙色光次之,所述紫色光位于最下方的区域。在本发明的一较佳实施例中,所述光处理单元10为一棱镜。本领域技术人员应该知晓的是,所述光处理单元10的具体实施方式仅仅作为示例,不能成为对本发明所述潜望式摄像模组1000的内容和范围的限制。
进一步地,参照图1和图2A,所述潜望式摄像模组100包括一镜头单元30,其中所述镜头单元30被设置于所述光处理单元10和所述感光元件20之间,且所述镜头单元30被保持于所述感光元件20的感光路径,从所述光处理单元10射出的所述单色光300能够到达所述镜头单元30,且所述单色光300能够被所述镜头单元30聚焦,进一步地,经过所述镜头单元30后的所述红色光、所述绿色光以及所述蓝色光这三种颜色的所述单色光300能够成像于所述感光元件20的所述感光面210。
参照图1和图2A,所述潜望式摄像模组100进一步包括一滤色元件40,其中所述滤色元件40被保持于所述感光元件20的感光路径,且所述滤色元件40被设置于所述感光元件20前,使得进入所述潜望式摄像模组100的光线先经过所述滤色元件40后再到达所述感光元件20。所述滤色元件40对不同颜色的所述单色光300进行过滤,仅允许所述红色光、所述绿色光以及所述蓝色光通过,保障仅所述红色光、所述绿色光以及所述蓝色光能够成像于所述感光元件20的所述感光面210,从而保障成像质量。进一步地,三种颜色的所述单色光300对应的实际景物的位置存在差异,分析三种颜色的所述单色光300形成于所述感光 元件20的所述感光面210的像,能够获得分析实际景物的位置或深度信息,并获取实际景物的轮廓。
具体来说,所述红色光、所述绿色光以及所述蓝色光的折射率、传播速度等光学性质不一样,由实际景物上的同一个点,或是位于同一垂直平面内的点,即,与所述潜望式摄像模组的所述感光元件20的所述感光面210距离相同的点所反射进入所述潜望式摄像模组内的所述复合光400被分散形成的所述红色光、所述绿色光以及所述蓝色光会被聚焦于不同的平面,当所述感光元件20的所述感光面210的位置固定不变时,所述感光元件20的所述感光面210接收到的不同颜色的所述单色光300的锐利程度不相同。比如说,实际景物的上的同一位置反射进入所述潜望式摄像模组的复合光400分散形成的所述红色光形成于所述感光元件20的所述感光面210的像的锐利程度大于由所述复合光400分散形成的所述蓝色光于所述感光元件20的所述感光面210的成像。同样的,由所述感光元件20的所述感光面210接收的具有相同锐利程度的不同颜色的所述单色光300对应的实际景物的上的各个点位置也会不相同。比如说,所述红色光和所述蓝色光于所述感光元件20的所述感光面210的成像具有相同的锐利程度,对应的所述红色光对应的实际景物上的点与所述潜望式摄像模组的距离大于所述蓝色光对应的实际景物上的点与所述潜望式摄像模组的距离。进一步地,通过所述滤色元件40过滤杂光的方式保障了所述红色光、所述绿色光以及所述蓝色光于所述感光元件20的所述感光面210成像的清晰度,并减少了其他颜色的所述单色光300对获取实际景物所在位置的干扰,以利于提高所述潜望式摄像模组获取实际景物所在位置的准确性,以在后续能够更好地呈现出实际景物的轮廓。应该理解的是,本发明所述的杂光是指除了红色光、绿色光以及蓝色光以外的其他颜色的所述单色光300。
优选地,所述滤色元件40被设置于所述镜头单元30和所述感光元件20之间,且所述滤色元件40对从所述镜头单元30射出的所述单色光300进行过滤,使得仅所述红色光、所述绿色光以及所述蓝色光能够到达所述感光元件20,并在后续能够得到所述第一图像101。优选地,所述滤色元件40被设置于所述镜头单元30和所述光处理模块10之间,使得自所述光处理单元10射出的所述单色光300能够被所述滤色元件40过滤,进而仅所述红色光、所述绿色光以及所述蓝色光能够被所述镜头单元30聚焦。
进一步地,所述感光元件20对应地接收所述红色光、所述绿色光以及所述蓝色光三种颜色的所述单色光300,所述感光元件20将光信号转换成电信号,并将电信号传输至与所述感光元件20可通信地连接的一处理装置50,进而能够分别得到所述红色光的图像、所述绿色光的图像以及所述蓝色光的图像,并在后续通过所述处理装置50对所述红色光的图像、所述绿色光的图像以及所述蓝色光的图像进行分析,根据三种颜色的所述单色光300的锐利程度的差别得到实际景物所在的位置,以获得实际景物的轮廓。应该理解的是,所述红色光的图像、所述绿色光的图像以及所述蓝色光的图像能够被合成所述第一图像101。值得一提的是,所述处理装置50可以被实施为所述移动电子设备2000的处理器,且所述处理装置50也可以被实施为所述摄像模组1000自带的处理器,本领域技术人员应该理解的是,所述处理装置50的具体实施方式仅仅作为示例,不能成为对本发明所述的摄像模组1000和所述潜望式摄像模组100的内容和范围的限制。
具体来说,参照图2B,所述感光元件20的所述感光面210包括多像素点21,即,所述感光元件20的所述感光面210包括一像素点阵列,并藉由多个所述像素点21接收对应的所述红色光、所述绿色光以及所述蓝色光,进而能够得到三种颜色的所述单色光300对应的图像。更具体地,所述像素点21包括多个所述像素单元211,其中每个像素单元211能够接收对应颜色的所述单色光300,多个所述像素单元211有序地自上而下排列,进而使得所述像素点211能够接收对应颜色的所述单色光300。所述像素单元211选自:一红色像素单元2111、一绿色像素单元2112、以及一蓝色像素单元2113组成的像素单元类型中的一种或是多种的组合,其中所述红色像素单元2111能够接收所述红色光,所述绿色像素单元2112能够接收绿色光,所述蓝色像素单元2113能够接收所述蓝色光,所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113被按照一预设规则排列,进而使得所述像素点21能够接收对应颜色的所述单色光300。在说明书附图和下面的描述中,所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113分别用字母“R”、“G”以及“B”代表,如图2B所示。
优选地,所述感光元件20被实施为不规则颜色滤光器阵列。具体来说,所述像素单元211按照R-RB-RGB-RG-G的顺序自上而下地排列,使得所述感光元件20能够接收对应颜色的所述单色光。具体来说,由于不同的所述单色光300的 波长和频率不同,且所述光处理单元10对具有不同波长的所述单色光300的折射率不同,使得不同的所述单色光300从所述光处理单元10射出后的位置也不相同,且所述单色光300按照所述红色光、所述橙色光、所述黄色光、所述绿色光、所述蓝色光、所述靛色光以及所述紫色光的顺序依次自上而下排列,因此,对应的仅包含所述红色像素单元2111的所述像素单元211分布于最上方,仅包含所述蓝色像素单元2113的所述像素单元211分布于最下方。进一步地,由于折射角度和焦距的影响,所述红色光、所述绿色光以及所述蓝色光难以完全分离,即,自上而下会出现所述红色光和所述绿色光重合,所述红色光、所述绿色光和所述蓝色光,所述绿色光和所述蓝色光重合的现象,对应地,包括至少一个所述红色像素单元2111和至少一个所述绿色像素单元2112的所述像素单元211位于仅包含所述红色像素单元2111的所述像素单元211的下方,包括至少一个所述红色像素单元2111、至少一个所述绿色像素单元2112以及至少一个所述蓝色像素单元2113的所述像素单元211位于仅包括所述蓝色像素单元2113的所述像素单元211的上方。也就是说,按照色散后形成的所述单色光300的分布设计对应的所述感光元件20的所述像素点21的所述像素单元211,使得所述感光元件20能够更好地接收对应颜色的所述单色光300,进一步有利于提升成像效果。
可选地,所述感光元件20被实施为RGB颜色滤光器阵列,即每个所述像素单元211包括至少一个所述红色像素单元2111、至少一个所述绿色像素单元2112以及至少一个所述蓝色像素单元2113。
值得一提的是,所述感光元件20的所述感光面210为一平整平面,且所述感光元件20的位置是固定的,即像面是固定的,当所述感光元件20被设置于一预设范围内时,且所述红色光、所述绿色光以及所述蓝色光都能够成像于所述感光元件10的感光面。由于构成实际景物的轮廓的各个点与所述潜望式摄像模组100的之间的距离不同,且不同颜色的所述单色光300对应的实际景物的位置也不一样,所述感光元件20接收的不同颜色的所述单色光300的锐利程度也不相同。举例来说,参照图3A和图3B,所述感光元件20能够被设置于一A平面和一C平面之间的任意位置,并且所述实际景物对应的像面能够形成于所述感光元件20,一束所述红色光能够被聚焦的点位于所述平面B,则当所述感光元件10被设置于所述平面B对应的位置时,所述红色光形成于所述感光元件10的像的亮度较高,即,所述红色光的锐利程度较高,当所述感光元件20被设置于所述 B平面之前的所述C平面或是所述B平面之后的所述A平面,所述感光元件20能够接收到的所述红色光的亮度降低;同样的,另一束所述红色光能够被聚焦的点位于所述平面C,则当所述感光元件10被设置于所述平面C对应的位置时,所述红色光形成于所述感光元件10的像的亮度较高,即,所述红色光的锐利程度较高,当所述感光元件20被设置于所述C平面之后得所述B平面,所述感光元件20能够接收到的所述红色光的亮度降低,当所述感光元件20被设置于所述B平面之后得所述A平面,所述感光元件20能够接收到的所述红色光的亮度更低。应该理解的是,同一所述复合光400发生色散后形成的不同颜色的所述单色光300能够被聚焦的位置也不相同。换句话说,不同颜色的所述单色光300的光学性质不一样,由实际景物上的同一个点,或是位于同一垂直平面内的点,即,距离所述潜望式摄像模组的所述感光元件20的所述感光面210距离相同的点所反射进入所述潜望式摄像模组内的所述复合光400被分散形成的所述红色光、所述绿色光以及所述蓝色光会被聚焦于不同的平面,当所述感光元件20的所述感光面210的位置固定不变时,所述感光元件20的所述感光面210接收到的不同颜色的所述单色光300的锐利程度不相同;同样的,由所述感光元件20的所述感光面210接收的具有相同锐利程度的不同颜色的所述单色光300对应的实际景物的上的各个点位置也会不相同。也就是说,所述潜望式摄像模组通过将进入所述摄像模组的复合光分散成不同颜色的所述单色光300,并藉由所述感光元件20的所述感光面210接收所述单色光300的方式有利于根据不同颜色的所述单色光300形成于所述感光元件20的所述感光面210的像分析等得到实际景物上的各个点的位置,并获得实际景物的轮廓。
参照图1,所述潜望式摄像模组100进一步包括一对焦机构60,其中所述对焦机构60能够使得实际景物对应的图像被保持于所述镜头单元30的焦距上,以提高所述潜望式摄像模组100的成像效果。优选地,所述对焦机构60被设置于所述镜头单元30,所述对焦机构60能够驱动所述镜头单元30转动而完成对焦。比如说,所述对焦机构60能够驱动所述镜头单元30沿着Z轴方向运动而实现对焦。值得一提的是,所述处理装置50能够根据实际景物的轮廓决定当前区域采用的对焦参数,将照片中出现的所有景物保持在焦距上,即,可以通过算法去补偿所述对焦机构60在对焦过程中存在部分的误差,进而降低对所述对焦机构60的精度要求,并能够保障成像的效果。
参照图1,所述潜望式摄像模组100进一步包括一防抖机构70,其中所述防抖机构70被设置于所述镜头单元30和/或所述光处理单元10,所述防抖机构70能够减小抖动对拍摄效果的影响。值得一提的是,所述潜望式摄像模组100能够让40cm至无限远范围的物体保持清晰,且在拍摄过程中对使用者手部的抖动没有自动对焦那么敏感。
在说明书附图4所示出的所述潜望式摄像模组100的这个具体的实施例中,所述潜望式摄像模组100进一步包括一辅助元件80,其中所述辅助元件80被设置于所述光处理单元10和所述镜头单元30之间,且所述辅助元件80被保持于所述感光元件20的感光路径,从所述光处理单元10射出的所述单色光300经过所述辅助元件80后被分散,使不同颜色的所述单色光300的重叠部分被减少,这样,所述红色光、所述绿色光以及所述蓝色光的重叠区域被减少,以利于所述感光元件20更好地接收对应颜色的所述单色光300,并提高了所述红色光、所述绿色光以及所述蓝色光单独形成的图像的清晰度,从而能够更准确地获取实际景物所在的位置,以得到实际景物的轮廓,并更好地呈现出立体效果。优选地,所述辅助元件80被实施为一柱面镜。优选地,所述辅助元件80被实施为一自由曲面。
说明书附图5中示出的所述潜望式摄像模组100与附图2A中示出的所述潜望式摄像模组100的差异在于,图5中示出的所述潜望式摄像模组100进一步包括一驱动元件90,其中所述光处理单元10被可驱动以转动地连接于所述驱动元件90,使得从所述光处理单元10射出的所述红色光、所述绿色光以及所述蓝色光能够全部被所述感光元件20接收,通过这样的方式,使得所述感光元件20不需要扩大所述感光元件20的所述感光面210的面积就能够接收全部的所述红色光、所述绿色光以及所述蓝色光。举例来说,由于所述光处理单元10对不同的颜色的所述单色光300的折射率不同,从所述光处理单元10射出的所述单色光300与YZ平面所成的角度不同,随着光的传播,所述单色光300之间的距离会变大,若要接收全部的所述红色光、所述绿色光以及所述蓝色光,需要的所述感光元件20的感光面积偏大,不利于所述潜望式摄像模组100轻薄化,而通过驱动所述光处理单元10绕Y轴转动,使得所述感光元件20能够依次接收所述红色光、所述绿色光以及所述蓝色光,并分别得到三种颜色的所述单色光300对应的图像,以在后续能够得到所述第一图像101。
依本发明的另一个方面,本发明进一步提供一潜望式摄像模组的图像获取方法,其中所述图像获取方法包括如下步骤:
(a)藉由一光处理单元10转折来自一被拍摄物体的一复合光400的同时分散所述复合光400,并形成多束单色光300,其中至少一束所述单色光300是一红色光,至少一束所述单色光300是一绿色光,至少一束所述单色光300是一蓝色光;
(b)藉由一感光元件20接收所述红色光、所述绿色光以及所述蓝色光;以及
(c)获取所述被拍摄物体的轮廓。
具体来说,所述光处理单元10改变了经过进入所述潜望式摄像模组100的光束的光路方向。比如说,沿着X轴方向传播的所述复合光400入射至所述光处理单元10,经过所述光处理单元10后能够沿着Z轴方向传播,即,藉由所述光处理单元10能够使得进入所述潜望式摄像模组100的光束的光路发生转折,进而所述潜望式摄像模组100在具有长焦拍摄效果的同时能够降低所述摄像模组1000的整体高度,以利于所述摄像模组1000被应用于一移动电子设备2000,并符合所述移动电子设备2000轻薄化发展的趋势。并且,在所述步骤(a)中,外部环境中的所述复合光400经过所述光处理单元10能够发生色散,进而形成不同颜色的所述单色光300,即,所述复合光400经过所述光处理单元10后发生色散形成所述红色光、一橙色光、一黄色光、所述绿色光、所述蓝色光、一靛色光以及一紫色光七种颜色的所述单色光300,并在后续,所述红色光、所述绿色光以及所述蓝色光这三种颜色的所述单色光300能够成像于所述感光元件20的一感光面210。
在所述步骤(b)中,所述感光元件对应地接收所述红色光、所述绿色光以及所述蓝色光。具体来说,藉由所述感光元件210的所述感光面210的所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113分别地接收所述红色光、所述绿色光以及所述蓝色光。更优选地,所述红色像素单元2111、所述绿色像素单元2112以及所述蓝色像素单元2113按照R-RB-RGB-RG-G的顺序自上而下地排列的方式对应地接收所述红色光、所述绿色光以及所述蓝色光。
在所述步骤(b)之前进一步包括步骤(e):藉由一镜头单元30汇聚所述单色光300。
优选地,在所述步骤(e)之后进一步包括步骤(f):藉由一辅助元件80扩大不同颜色的所述单色光300之间的距离。具体来说,从所述光处理单元10射出的所述单色光300经过所述辅助元件80后被分散,使不同颜色的所述单色光300的重叠部分被减少,这样,所述红色光、所述绿色光以及所述蓝色光的重叠区域被减少,以利于所述感光元件20更好地接收对应颜色的所述单色光300,并提高了所述红色光、所述绿色光以及所述蓝色光单独形成的图像的清晰度,从而能够更准确地获取实际景物所在的位置,以得到实际景物的轮廓,并更好地呈现出立体效果。
在所述步骤(b)之前进一步包括步骤(g):藉由一滤色元件40过滤杂光。具体来说,以仅允许所述红色光、所述绿色光以及所述蓝色光通过的方式过滤杂光,保障仅所述红色光、所述绿色光以及所述蓝色光能够成像于所述感光元件20,以在后续能够根据所述红色光、所述绿色光以及所述蓝色光分别成的像分析实际景物所在的位置,并获取实际景物的轮廓。通过过滤杂光的方式保障了成像的清晰度,并减少了其他颜色的所述单色光300对获取实际景物所在位置的干扰,以利于提高所述潜望式摄像模组获取实际景物所在位置的准确性,以在后续能够更好地呈现出实际景物的轮廓。
优选地,在所述步骤(b)之前进一步包括步骤(h):转动所述光处理单元10。具体来说,藉由一驱动元件90够驱动所述光处理单元10转动,使得所述感光元件20能够依次接收所述红色光、所述绿色光以及所述蓝色光,通过这样的方式,能够使得从所述光处理单元20射出的所述红色光、所述绿色光以及所述蓝色光能够全部被所述感光元件20接收。
在所述步骤(c)中进一步包括如下步骤:
(c.1)分别地形成所述红色光、所述绿色光以及所述蓝色光的像;
(c.2)分析所述红色光、所述绿色光以及所述蓝色光的锐利程度;以及
(c.3)确定所述被拍摄物体所在的位置。
在所述步骤(c.3)之后进一步包括步骤(c.4):根据所述被拍摄物体的位置获取实际景物的轮廓。具体来说,所述红色光、所述绿色光以及所述蓝色光的折射率、传播速度等光学性质不一样,由实际景物上的同一个点,或是位于同一垂直平面内的点,即,与所述潜望式摄像模组的所述感光元件20的所述感光面210距离相同的点所反射进入所述潜望式摄像模组内的所述复合光400被分散形 成的所述红色光、所述绿色光以及所述蓝色光会被聚焦于不同的平面,当所述感光元件20的所述感光面210的位置固定不变时,所述感光元件20的所述感光面210接收到的不同颜色的所述单色光300的锐利程度不相同。比如说,实际景物的上的同一位置反射进入所述潜望式摄像模组的复合光400分散形成的所述红色光形成于所述感光元件20的所述感光面210的像的锐利程度大于由所述复合光400分散形成的所述蓝色光于所述感光元件20的所述感光面210的成像。同样的,由所述感光元件20的所述感光面210接收的具有相同锐利程度的不同颜色的所述单色光300对应的实际景物的上的各个点位置也会不相同。比如说,所述红色光和所述蓝色光于所述感光元件20的所述感光面210的成像具有相同的锐利程度,对应的所述红色光对应的实际景物上的点与所述潜望式摄像模组的距离大于所述蓝色光对应的实际景物上的点与所述潜望式摄像模组的距离。进而藉由所述感光元件20的所述感光面210接收所述单色光300的方式分析不同颜色的所述单色光300形成于所述感光元件20的所述感光面210的像,以得到实际景物上的各个点的位置,并获得实际景物的轮廓,从而获取实际景物的深度信息。
在所述步骤(c.4)之后进一步包括步骤(i):根据实际景物的轮廓生成一对焦参数。进一步地,所述潜望式摄像模组根据生成的所述对焦参数完成对焦,并获得所述潜望式摄像模组100的一第一图像101。
依本发明的另一个方面,本发明进一步提供一摄像模组的图像获取方法,其中所述图像获取方法包括如下步骤:
(a)藉由至少一潜望式摄像模组100的一光处理单元10分散一复合光400,并形成多单色光300;
(b)藉由所述潜望式摄像模组100的一感光元件20接收对应的一红色光、一绿色光以及一蓝色光;
(c)获取实际景物的轮廓;以及
(d)辅助一主摄像模组200完成对焦。
具体来说,藉由所述潜望式摄像模组100将所述符合光400分散成所述单色光300,并能够得到所述红色光、所述绿色光以及所述蓝色光的像,一处理装置50根据不同颜色的所述单色光300的锐利程度的差异分析所述单色光300对应的实际景物存在的实际位置,并得到实际景物的轮廓,进而获得一对焦参数,所述处理装置50将所述对焦参数输出至所述主摄像模组200,使得所述主摄像模 组200根据所述对焦参数完成对焦,以获得一第二图像102,并提高了所述主摄像模组200的对焦效率,并能够降低所述主摄像模组200的对焦机构的精度要求。
在所述步骤(c)之后,进一步包括步骤(e):藉由所述潜望式摄像模组100获得一第一图像101。
在所述步骤(d)之后,合成所述摄像模组1000的一最终图像103。具体来说,通过算法将所述第一图像101和所述第二图像102进行合成,以得到所述最终图像103,其中所述第一图像101能够具有准确的深度信息,通过所述第一图像101和所述第二图像102融合得到的所述最终图像103能够体现出实际景物的轮廓,并呈现出立体的效果,进而提高了所述摄像模组1000获取的图像的质量。举例来说,参照图7所示,使用带有所述摄像模组1000的一移动电子设备2000进行人脸拍摄时,所述摄像模组1000的所述潜望式摄像模组100获取的所述第一图像101能够体现人脸的实际存在的位置,即,构成人脸的点和点之间的相对位置,进而得到人脸的轮廓,且所述第一图像101中呈现出了人脸的深度信息,当第一图像101和所述主摄像模组100获取的所述第二图像102进一步合成所述最终图像103后,所述最终图像103能够呈现出人脸的立体轮廓。
本领域的技术人员可以理解的是,以上实施例仅为举例,其中不同实施例的特征可以相互组合,以得到根据本发明揭露的内容很容易想到但是在附图中没有明确指出的实施方式。
本领域的技术人员应理解,上述描述及附图中所示的本发明的实施例只作为举例而并不限制本发明。本发明的目的已经完整并有效地实现。本发明的功能及结构原理已在实施例中展示和说明,在没有背离所述原理下,本发明的实施方式可以有任何变形或修改。
Claims (37)
- 一潜望式摄像模组的图像获取方法,其特征在于,所述图像获取方法包括如下步骤:(a)藉由一光处理单元转折来自一被拍摄物体的一复合光的同时分散所述复合光,形成多束单色光;(b)藉由一感光元件的一感光面接收所述单色光,分别获得不同颜色的所述单色光的成像;以及(c)合成不同颜色的所述单色光的成像,以得到被拍摄物体的最终图像。
- 根据权利要求1所述的图像获取方法,其中在所述步骤(b)中进一步包括步骤(d):所述感光元件对应地接收多束所述单色光中的至少一红色光、至少一绿色光以及至少一蓝色光。
- 根据权利要求2所述的图像获取方法,其中在所述步骤(b)中,藉由所述感光元件的所述感光面的一红色像素单元、一绿色像素单元以及一蓝色像素单元分别地接收所述红色光、所述绿色光以及所述蓝色光,形成所述红色光的成像、所述绿色光的成像以及所述蓝色光的成像。
- 根据权利要求2所述的图像获取方法,其中在所述步骤(b)中,藉由所述红色像素单元、所述滤色像素单元以及所述蓝色像素单元按照R-RB-RGB-RG-G的顺序自上而下地排列的方式对应地接收所述红色光、所述绿色光以及所述蓝色光。
- 根据权利要求2所述的图像获取方法,其中在所述步骤(b)之前进一步包括步骤(e):藉由一镜头单元分别汇聚不同颜色的所述单色光。
- 根据权利要求3所述的图像获取方法,其中在所述步骤(e)之后进一步包括步骤(f):藉由一辅助元件扩大不同颜色的所述单色光之间的距离。
- 根据权利要求3所述的图像获取方法,其中在所述步骤(b)之前进一步包括步骤(g):藉由一滤色元件过滤杂光。
- 根据权利要求3所述的图像获取方法,其中在所述步骤(b)之前进一步包括步骤(h):转动所述光处理单元。
- 一摄像模组的工作方法,其特征在于,所述工作方法包括如下步骤:(a)藉由至少一潜望式摄像模组一光处理单元转折来自一被拍摄物体的一复合光的同时分散所述复合光,并形成多束单色光;(b)藉由一感光元件的一感光面接收所述单色光;(c)获取所述被拍摄物体的轮廓;以及(d)辅助至少一主摄像模组完成对焦。
- 根据权利要求9所述的工作方法,其中在上述步骤中,其中所述潜望式摄像模组藉由所述感光元件的所述感光面对应地接收多束所述单色光中的至少一束红色光、至少一束绿色光以及至少一束蓝色光的方式获得关于所述被拍摄物体的一第一图像。
- 根据权利要求9所述的工作方法,其中在上述步骤中,在所述步骤(c)中如下步骤:(c.1)形成所述红色光、所述绿色光以及所述蓝色光的像;(c.2)分析所述红色光、所述绿色光以及所述蓝色光的锐利程度;以及(c.3)确定所述被拍摄物体的位置。
- 根据权利要求11所述的图像获取方法,其中在所述步骤(c.3)之后进一步包括步骤(c.4):根据被拍摄物体的位置获取所述被拍摄物体的轮廓。
- 根据权利要求12所述的图像获取方法,其中在所述步骤(d)中进一步包括如下步骤:所述潜望式摄像模组根据被拍摄物体的轮廓确定一对焦参数;和所述主摄像模组根据所述对焦参数完成对焦。
- 根据权利要求13所述的工作方法,其中在所述步骤(d)之后,所述主摄像模组获得关于所述被拍摄物体的一第二图像。
- 根据权利要求14所述的工作方法,其中在上述方法中,合成所述第一图像和所述第二图像以得到所述摄像模组的一最终图像。
- 一潜望式摄像模组,其特征在于,包括:一光处理单元,其中经过所述光处理单元的一复合光色散形成多单色光;一镜头单元,其中所述单色光能够被所述镜头单元聚焦;以及一感光元件,其中所述光处理单元和所述镜头单元被保持于所述感光元件的感光路径,所述镜头单元被设置于所述光处理单元和所述感光元件之间,所述单色光经过所述镜头单元后被所述感光元件的一感光面接收。
- 根据权利要求16所述的潜望式摄像模组,其中所述感光元件的所述感光面对应地接收多束所述单色光中的至少一红色光、至少一绿色光以及至少一蓝色光。
- 根据权利要求17所述的潜望式摄像模组,进一步包括一滤色元件,其中所述滤色元件被保持于所述感光元件的感光路径。
- 根据权利要求18所述的潜望式摄像模组,其中所述滤色元件被设置于所述镜头 单元和所述感光元件之间。
- 根据权利要求18所述的潜望式摄像模组,进一步包括一辅助元件,其中所述辅助元件被设置于所述光处理单元和所述镜头单元之间,且所述辅助元件被保持于所述感光元件的感光路径。
- 根据权利要求20所述的潜望式摄像模组,其中所述辅助元件为一柱面镜。
- 根据权利要求20所述的潜望式摄像模组,其中所述辅助元件为一自由曲面镜。
- 根据权利要求16至22任一所述的潜望式摄像模组,进一步包括一驱动元件,其中所述驱动元件被设置于所述光处理单元,且所述光处理单元被可驱动以转动地连接于所述驱动元件。
- 根据权利要求16至22任一所述的潜望式摄像模组,其中所述感光元件包括多像素点,其中所述像素点包括多个像素单元,其中所述像素单元选自:一红色像素单元、一绿色像素单元、以及一蓝色像素单元组成的像素单元类型中的一种或是多种的组合。
- 根据权利要求24所述的潜望式摄像模组,其中所述像素单元按照R-RB-RGB-RG-G的顺序自上而下地排列。
- 根据权利要求16所述的潜望式摄像模组,其中所述光处理单元为一棱镜。
- 一摄像模组,其特征在于,包括:一潜望式摄像模组,其中所述潜望式摄像模组获得关于所述被拍摄物体的一第一图像;和至少一主摄像模组,其中所述主摄像模组获得关于所述被拍摄物体的一第二图像,所述第一图像和所述第二图像被处理后形成一最终图像;其中所述潜望式摄像模组包括:一光处理单元,其中经过所述光处理单元的一复合光色散形成多单色光;一镜头单元,其中所述单色光能够被所述镜头单元聚焦;以及一感光元件,其中所述光处理单元和所述镜头单元被保持于所述感光元件的感光路径,所述镜头单元被设置于所述光处理单元和所述感光元件之间,所述单色光经过所述镜头单元后被所述感光元件的一感光面接收。
- 根据权利要求27所述的摄像模组,其中所述感光元件的所述感光面对应地接收多束所述单色光中的至少一红色光、至少一绿色光以及至少一蓝色光。
- 根据权利要求28所述的摄像模组,其中所述潜望式摄像模组进一步包括一滤色元件,其中所述滤色元件被保持于所述感光元件的感光路径。
- 根据权利要求29所述的摄像模组,其中所述滤色元件被设置于所述镜头单元和所述感光元件之间。
- 根据权利要求29所述的摄像模组,其中所述潜望式摄像模组进一步包括一辅助元件,其中所述辅助元件被设置于所述光处理单元和所述镜头单元之间,且所述辅助元件被保持于所述感光元件的感光路径。
- 根据权利要求31所述的摄像模组,其中所述辅助元件为一柱面镜。
- 根据权利要求31所述的摄像模组,其中所述辅助元件为一自由曲面镜。
- 根据权利要求27至33任一所述的摄像模组,其中所述潜望式摄像模组进一步包括一驱动元件,其中所述驱动元件被设置于所述光处理单元,且所述光处理单元被可驱动以转动地连接于所述驱动元件。
- 根据权利要求27至33任一所述的摄像模组,其中所述感光元件包括多像素点,其中所述像素点包括多个像素单元,其中所述像素单元选自:一红色像素单元、一绿色像素单元、以及一蓝色像素单元组成的像素单元类型中的一种或是多种的组合。
- 根据权利要求35所述的摄像模组,其中所述像素单元按照R-RB-RGB-RG-G的顺序自上而下地排列。
- 根据权利要求27所述的摄像模组,其中所述光处理单元为一棱镜。
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811465390.4A CN111258166B (zh) | 2018-12-03 | 2018-12-03 | 摄像模组及其潜望式摄像模组和图像获取方法及工作方法 |
CN201811465390.4 | 2018-12-03 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2020114144A1 true WO2020114144A1 (zh) | 2020-06-11 |
Family
ID=70948776
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2019/113351 WO2020114144A1 (zh) | 2018-12-03 | 2019-10-25 | 摄像模组及其潜望式摄像模组和图像获取方法及工作方法 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN111258166B (zh) |
WO (1) | WO2020114144A1 (zh) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114338997B (zh) * | 2021-12-30 | 2024-02-27 | 维沃移动通信有限公司 | 摄像模组、拍摄方法、装置和电子设备 |
CN118509510A (zh) * | 2023-02-14 | 2024-08-16 | Oppo广东移动通信有限公司 | 电子设备 |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8149270B1 (en) * | 2006-12-18 | 2012-04-03 | Visionsense Ltd. | High resolution endoscope |
CN103155543A (zh) * | 2010-10-01 | 2013-06-12 | 富士胶片株式会社 | 成像设备 |
CN107121783A (zh) * | 2017-06-23 | 2017-09-01 | 中山联合光电科技股份有限公司 | 一种成像系统 |
CN107948470A (zh) * | 2017-11-22 | 2018-04-20 | 德淮半导体有限公司 | 摄像头模块和移动设备 |
CN207835595U (zh) * | 2017-12-14 | 2018-09-07 | 信利光电股份有限公司 | 一种双摄像头模组以及终端 |
CN108663775A (zh) * | 2017-04-01 | 2018-10-16 | 华为技术有限公司 | 一种镜头模组及终端 |
CN108810386A (zh) * | 2018-08-02 | 2018-11-13 | Oppo广东移动通信有限公司 | 摄像头模组、摄像头组件和电子装置 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5075795B2 (ja) * | 2008-11-14 | 2012-11-21 | 株式会社東芝 | 固体撮像装置 |
US9258468B2 (en) * | 2012-02-15 | 2016-02-09 | Fluxdata, Inc. | Method and apparatus for separate spectral imaging and sensing |
Also Published As
Publication number | Publication date |
---|---|
CN111258166B (zh) | 2021-11-05 |
CN111258166A (zh) | 2020-06-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103119516B (zh) | 光场摄像装置和图像处理装置 | |
TWI419551B (zh) | 固態全景影像擷取裝置 | |
CN105578019B (zh) | 一种可获得深度信息的影像提取系统与对焦方法 | |
JP4495041B2 (ja) | ピンホール投影により表示面上のレーザポイントと関連付けられるプロジェクタ画素を求める方法 | |
TWI660629B (zh) | 自我調整三維立體成像系統 | |
US9848118B2 (en) | Phase detection autofocus using opposing filter masks | |
CN102823230B (zh) | 摄像装置 | |
JP7104296B2 (ja) | 測距カメラ | |
US9544570B2 (en) | Three-dimensional image pickup apparatus, light-transparent unit, image processing apparatus, and program | |
WO2011095026A1 (zh) | 摄像方法及系统 | |
US20140085422A1 (en) | Image processing method and device | |
WO2020114144A1 (zh) | 摄像模组及其潜望式摄像模组和图像获取方法及工作方法 | |
CN107948470A (zh) | 摄像头模块和移动设备 | |
CN104849829A (zh) | 一种镜头及拍摄设备 | |
CN111190325A (zh) | 一种全固态全息拍摄器 | |
CN108181782A (zh) | 无盲点的折返式全景成像仪 | |
JP7288226B2 (ja) | 測距カメラ | |
CN103098480A (zh) | 图像处理装置、三维摄像装置、图像处理方法、以及图像处理程序 | |
JP5348258B2 (ja) | 撮像装置 | |
JP7104301B2 (ja) | 測距カメラ | |
JP6756898B2 (ja) | 距離計測装置、ヘッドマウントディスプレイ装置、携帯情報端末、映像表示装置、及び周辺監視システム | |
CN110553585A (zh) | 一种基于光学阵列的3d信息获取装置 | |
CN111031227A (zh) | 一种辅摄模组及包括该辅摄模组的摄像装置 | |
WO2020213595A1 (ja) | 交換レンズ、情報処理装置、情報処理方法、及び、プログラム | |
KR20180130378A (ko) | 산란매질 신호처리 기능이 탑재된 라이트필드 카메라 어뎁터 |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 19892266; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
122 | Ep: pct application non-entry in european phase | Ref document number: 19892266; Country of ref document: EP; Kind code of ref document: A1 |