CN111258166B - Camera module, periscopic camera module, image acquisition method and working method


Info

Publication number: CN111258166B
Application number: CN201811465390.4A
Authority: CN (China)
Prior art keywords: light, camera module, image, monochromatic, red
Legal status: Active
Other languages: Chinese (zh)
Other versions: CN111258166A
Inventors: 陈振宇, 王明珠, 姚立锋, 黄乾友, 周凯伦
Assignee: Ningbo Sunny Opotech Co Ltd
Events:
- Application filed by Ningbo Sunny Opotech Co Ltd
- Priority to CN201811465390.4A
- Priority to PCT/CN2019/113351
- Publication of CN111258166A
- Application granted
- Publication of CN111258166B

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/10 Beam splitting or combining systems
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B33/12 Simultaneous recording or projection using beam-splitting or beam-combining systems, e.g. dichroic mirrors
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/67 Focus control based on electronic image sensor signals

Abstract

The invention discloses a camera module, a periscopic camera module, an image acquisition method, and a working method thereof. The periscopic camera module comprises a light processing unit, a lens unit, and a photosensitive element. A composite light passing through the light processing unit is dispersed into a plurality of monochromatic lights, which can be focused by the lens unit. The light processing unit and the lens unit are held in the photosensitive path of the photosensitive element, with the lens unit arranged between the light processing unit and the photosensitive element, so that the monochromatic lights are received by the photosensitive surface of the photosensitive element after passing through the lens unit. The position of the actual scene corresponding to each monochromatic light can then be obtained by subsequent analysis, yielding the outline of the actual scene and thereby extending the depth of field of the periscopic camera module.

Description

Camera module, periscopic camera module, image acquisition method and working method
Technical Field
The invention relates to the field of camera modules, in particular to a camera module, a periscopic camera module, an image acquisition method and a working method.
Background
In recent years, consumer demand for the imaging capability of mobile electronic devices has increased, and single-lens camera modules can no longer satisfy it. Dual camera modules, which have gradually appeared on the market, include a main camera module and a secondary camera module; each acquires an image, and the two images can be synthesized into an image with higher resolution, which has made dual camera modules increasingly popular with consumers. Conventional dual camera modules come in various types, such as black-and-white, color, and wide-angle/telephoto dual camera modules, of which the wide-angle/telephoto type is among the most widely used. It comprises a wide-angle camera module with a large field angle and a telephoto camera module with a narrow field angle, and the images acquired by the two are fused into a final image, satisfying both wide-angle and telephoto shooting requirements. In practice, however, the telephoto camera module is long, so the dual camera module is tall and bulky, which runs counter to the trend toward lighter and thinner mobile electronic devices.
In some existing designs, the telephoto camera module is implemented as a periscopic camera module, in which an optical element changes the path of light entering the module, turning it from a first direction to a second direction perpendicular to the first, after which the light passes through a lens module and a color filter before reaching a photosensitive chip. However, periscopic camera modules still have several problems. First, although the focal length of an existing periscopic camera module is long, the depth of field of the images it can acquire is shallow. Second, capturing a high-quality image requires a driving member to move the lens module and/or the optical element over a large range to achieve focusing, and the movement or rotation of the driving member must be precise to guarantee the imaging effect, so existing periscopic camera modules place high demands on the precision of the driving member.
Disclosure of Invention
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein at least one periscopic camera module of the camera module obtains an image of a photographed object by receiving a plurality of monochromatic lights, so as to facilitate depth of field improvement.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the periscopic camera module respectively obtains images of the monochromatic lights with different colors, synthesizes the clear images of the monochromatic lights into a final image of the periscopic camera module, and superimposes the clear images of the monochromatic lights with different colors to obtain the final image, so as to improve the depth of field of the periscopic camera module.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the images of the monochromatic light with different colors formed on a photosensitive surface of a photosensitive element of the periscopic camera module correspond to different sharp object surface positions of the object to be photographed, that is, the final image of the periscopic camera module is obtained by synthesizing the images with different sharp object surface positions, so as to improve the depth of field of the periscopic camera module.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the periscopic camera module provides a light processing unit, a composite light from a photographed object is dispersed into a plurality of monochromatic lights with different colors by the light processing unit, so that the dispersion degree of different sharp object surface positions corresponding to the monochromatic lights with different colors is increased, thereby facilitating to obtain images of sharp object surfaces corresponding to different positions, and images corresponding to different sharp object surface positions are superposed and synthesized with each other through an algorithm in the following process, so as to increase the depth of field of the periscopic camera module.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein each pixel of the photosensitive element can correspondingly receive the monochromatic lights with different colors, and respectively obtain clear images formed by the monochromatic lights with different colors.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the periscopic camera module respectively obtains clear images of at least one red light, at least one green light and at least one blue light in a plurality of monochromatic lights, and then synthesizes images of different clear object plane positions corresponding to the red light, the green light and the blue light according to an algorithm, so as to achieve depth of field enhancement by superimposing the clear images of the monochromatic lights of different colors.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image capturing method and an operating method thereof, wherein each pixel of the photosensitive element can correspondingly receive the red light, the green light and the blue light, so as to obtain images formed by the red light, the green light and the blue light respectively.
An object of the present invention is to provide a camera module, a periscopic camera module thereof, an image acquisition method and a working method thereof, wherein the camera module can obtain high-quality images.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the image obtained by at least one periscopic camera module of the camera module can have depth information.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and an operating method thereof, wherein the periscopic camera module can determine the position of an actual scene and obtain the outline of the actual scene.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and an operating method thereof, wherein the periscopic camera module can determine depth information of an actual scene by receiving monochromatic light, and obtain an outline of the actual scene.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the periscopic camera module can determine the position of an actual scene by receiving the monochromatic light, and obtain the outline of the actual scene.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the periscopic camera module can receive the monochromatic lights of the three colors of red light, green light and blue light, analyze the position of an actual scene according to the images formed by the monochromatic lights of the three colors, and obtain the outline of the actual scene.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein each pixel of the photosensitive element can correspondingly receive the light of the three colors of red light, green light and blue light, and analyze the position of an actual scene according to the difference of the sharpness of the light of the three colors, so as to obtain the outline of the actual scene.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and an operating method thereof, wherein the periscopic camera module can obtain a focusing parameter of a current area according to an outline of an actual scene, and keep the scene appearing in the image within a depth of field of a lens unit of the periscopic camera module, so as to improve the focusing efficiency of the periscopic camera module.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the periscopic camera module can obtain a focusing parameter of a current area according to an outline of an actual scene, and keep the scene appearing in the image at a focal length of a lens unit of the periscopic camera module, thereby reducing a requirement of a focusing driving member for driving the lens unit to complete focusing.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the camera module provides at least one main camera module, and the main camera module and the periscopic camera module cooperate with each other to obtain a high-quality image.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the periscopic camera module assists the main camera module to complete focusing, so as to improve the focusing efficiency of the main camera module.
Another objective of the present invention is to provide a camera module, a periscopic camera module, an image obtaining method and a working method thereof, wherein the periscopic camera module further includes a cylindrical mirror, and the cylindrical mirror can expand the dispersion degree of light passing through the light processing unit, and reduce the overlapping portion between red light, green light and blue light, so that the photosensitive element can obtain an image formed by better monochromatic light.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and an operating method thereof, wherein the periscopic camera module further includes a free-form surface mirror, and the free-form surface mirror can expand the dispersion degree of light passing through the light processing unit, and reduce the overlapping portion between the red light, the green light and the blue light, so that the photosensitive element can obtain a better image formed by monochromatic light.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image obtaining method and a working method thereof, wherein the light processing unit of the periscopic camera module can be driven to rotate, so that the monochromatic light formed by the composite light passing through the light processing unit can be all received by the photosensitive element, thereby ensuring the imaging quality of the periscopic camera module.
Another objective of the present invention is to provide a camera module, a periscopic camera module thereof, an image capturing method and a working method thereof, wherein the light processing unit of the periscopic camera module can be driven to rotate, and further the photosensitive element can receive all of the red light, the green light and the blue light without enlarging the area of the photosensitive element, so that the periscopic camera module is not only ensured in imaging quality, but also is beneficial to being light and thin.
According to an aspect of the present invention, the present invention further provides an image obtaining method of a periscopic camera module, the image obtaining method includes the following steps:
(a) a light processing unit turns a composite light from a photographed object and disperses the composite light to form a plurality of monochromatic lights;
(b) receiving the monochromatic light by a photosensitive surface of a photosensitive element to respectively obtain images of the monochromatic light with different colors; and
(c) synthesizing the images of the monochromatic lights of different colors to obtain a final image of the photographed object.
According to an embodiment of the present invention, the step (b) further comprises a step (d): the photosensitive element correspondingly receives at least one red light, at least one green light and at least one blue light in the plurality of monochromatic lights.
According to an embodiment of the present invention, in the step (b), the red light, the green light and the blue light are respectively received by a red pixel unit, a green pixel unit and a blue pixel unit of the photosensitive surface of the photosensitive element.
According to an embodiment of the present invention, in the step (b), the red light, the green light, and the blue light are received by the red pixel unit, the green pixel unit, and the blue pixel unit, respectively, the pixel units being arranged in the order R-RG-RGB-GB-B from top to bottom.
According to one embodiment of the present invention, step (e) is further included before step (b): the monochromatic lights with different colors are respectively converged by a lens unit.
According to one embodiment of the present invention, the method further comprises a step (f) after the step (e): the distance between the monochromatic lights of different colors is enlarged by an auxiliary device.
According to one embodiment of the present invention, step (b) is preceded by step (g): the stray light is filtered by a color filter.
According to an embodiment of the present invention, the method further comprises, before the step (b), a step (h): rotating the light processing unit.
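Steps (a) through (c) amount to a per-channel focus stack: each color channel is sharpest for a different object distance, and the final image keeps, at every pixel, the detail of whichever channel is locally sharpest. The following NumPy sketch illustrates step (c) only, assuming the three channel images have already been captured and registered; the function names and the Laplacian sharpness measure are illustrative choices, not the patent's algorithm:

```python
import numpy as np

def local_sharpness(img, k=7):
    """Mean absolute Laplacian over a k x k window: a simple local focus measure."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    e = np.abs(lap)
    pad = k // 2
    ep = np.pad(e, pad, mode="edge")
    out = np.zeros_like(img, dtype=float)
    # Direct k x k window mean (a box filter); slow but short and dependency-free.
    for dy in range(k):
        for dx in range(k):
            out += ep[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def fuse_channels(red, green, blue):
    """Step (c): at each pixel, keep the value of the locally sharpest channel."""
    stack = np.stack([red, green, blue])                    # shape (3, H, W)
    sharp = np.stack([local_sharpness(c) for c in stack])
    best = np.argmax(sharp, axis=0)                         # sharpest channel index per pixel
    fused = np.take_along_axis(stack, best[None], axis=0)[0]
    return fused, best
```

In a real module the three channel images would first be registered against one another, since the dispersed colors land on different regions of the photosensitive surface.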
According to another aspect of the present invention, the present invention further provides a working method of the camera module, the working method includes the following steps:
(a) turning a composite light from a shot object by using a light processing unit of at least one periscopic camera module and simultaneously dispersing the composite light to form a plurality of monochromatic lights;
(b) receiving the monochromatic light by a light sensing surface of a light sensing element;
(c) acquiring the outline of the shot object; and
(d) assisting at least one main camera module to complete focusing.
According to an embodiment of the present invention, in the above step, the periscopic camera module obtains a first image of the object by receiving at least one red light beam, at least one green light beam and at least one blue light beam of the monochromatic light beams by the photosensitive surface of the photosensitive element.
According to a preferred embodiment of the present invention, in the step (c), the following steps are performed:
(c.1) forming images of the red, green, and blue light;
(c.2) analyzing the sharpness of the red, green, and blue light; and
(c.3) determining the position of the photographed object.
According to a preferred embodiment of the present invention, the method further comprises step (c.4) after step (c.3): acquiring the outline of the photographed object according to the position of the photographed object.
According to a preferred embodiment of the present invention, the step (d) further comprises the steps of:
the periscopic camera module determines a focusing parameter according to the outline of the photographed object; and
the main camera module completes focusing according to the focusing parameter.
According to an embodiment of the present invention, after the step (d), the main camera module obtains a second image of the object.
According to an embodiment of the present invention, in the method, the first image and the second image are synthesized to obtain a final image of the camera module.
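Steps (c.1) through (c.3) infer where the object sits from which color channel images it most sharply: with longitudinal chromatic aberration, blue tends to focus for nearer object planes and red for farther ones, so the relative sharpness of the three channels is a coarse depth cue that step (d) can translate into a focusing parameter. A hedged sketch follows; the three-zone classification and the focus-parameter mapping are illustrative assumptions, not the patent's actual algorithm:

```python
import numpy as np

def channel_sharpness(img):
    """Global focus measure: variance of the discrete Laplacian."""
    lap = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1) - 4 * img)
    return float(lap.var())

def classify_zone(red, green, blue):
    """Steps (c.2)-(c.3): decide which channel's sharp object plane the subject is nearest."""
    scores = {"red": channel_sharpness(red),
              "green": channel_sharpness(green),
              "blue": channel_sharpness(blue)}
    return max(scores, key=scores.get), scores

def focus_parameter(zone):
    """Step (d): map the zone to a coarse lens adjustment for the main module
    (sign convention and step size are hypothetical)."""
    return {"blue": -1, "green": 0, "red": +1}[zone]
```

Because this yields a starting focus position rather than a blind sweep, it is consistent with the stated goal of reducing how far and how precisely the focusing driving member must move.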
According to another aspect of the present invention, there is further provided a periscopic camera module, comprising:
a light processing unit, wherein a composite light passing through the light processing unit is dispersed to form a plurality of monochromatic lights;
a lens unit, wherein the monochromatic light can be focused by the lens unit; and
a photosensitive element, wherein the light processing unit and the lens unit are held in a photosensitive path of the photosensitive element, the lens unit is arranged between the light processing unit and the photosensitive element, and the monochromatic light is received by a photosensitive surface of the photosensitive element after passing through the lens unit.
According to an embodiment of the present invention, the photosensitive surface of the photosensitive element correspondingly receives at least one red light, at least one green light and at least one blue light in the plurality of monochromatic lights.
According to an embodiment of the present invention, the periscopic camera module further includes a color filter, wherein the color filter is held in a photosensitive path of the photosensitive element, and light reaches the photosensitive element after passing through the color filter, and the color filter allows only the red light, the green light, and the blue light to pass through.
According to one embodiment of the present invention, the color filter is disposed between the lens unit and the light sensing element.
According to an embodiment of the present invention, the periscopic camera module further includes an auxiliary element, wherein the auxiliary element is disposed between the light processing unit and the lens unit, and the auxiliary element is held in a photosensitive path of the photosensitive element.
According to one embodiment of the invention, the auxiliary element is a cylindrical mirror.
According to one embodiment of the invention, the auxiliary element is a free-form surface mirror.
According to an embodiment of the present invention, the periscopic camera module further comprises a driving element, wherein the driving element is disposed on the light processing unit, and the light processing unit is rotatably driven by the driving element.
According to one embodiment of the present invention, the photosensitive element comprises a plurality of pixel points, wherein each pixel point comprises a plurality of pixel units, and each pixel unit is selected from the group consisting of a red pixel unit, a green pixel unit, and a blue pixel unit, alone or in combination.
According to one embodiment of the present invention, the pixel cells are arranged in the order of R-RG-RGB-GB-B from top to bottom.
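The R-RG-RGB-GB-B ordering describes which channels each horizontal band of the photosensitive surface receives once the dispersed colors land top to bottom: the top band sees only red, the middle band sees all three overlapping, and the bottom band only blue. A small sketch that builds this band-to-channel map for a sensor of h rows; the equal five-band split is an illustrative assumption, since the patent does not specify band heights:

```python
def band_channels(h):
    """Map each of h sensor rows to the set of colors it receives,
    following the R-RG-RGB-GB-B order from top to bottom
    (five equal-height bands assumed for illustration)."""
    bands = ["R", "RG", "RGB", "GB", "B"]
    return [set(bands[min(4, row * 5 // h)]) for row in range(h)]
```

For example, `band_channels(10)` assigns rows 0-1 to red only, rows 4-5 to all three channels, and rows 8-9 to blue only.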
According to one embodiment of the invention, the light processing unit is a prism.
According to an aspect of the present invention, the present invention further provides a camera module, which includes:
a periscopic camera module, wherein the periscopic camera module comprises a light processing unit, a lens unit and a photosensitive element, wherein a composite light passing through the light processing unit is dispersed to form a plurality of monochromatic lights, the monochromatic lights can be focused by the lens unit, the light processing unit and the lens unit are held in a photosensitive path of the photosensitive element, the lens unit is arranged between the light processing unit and the photosensitive element, and the monochromatic lights are received by a photosensitive surface of the photosensitive element after passing through the lens unit to obtain a first image of a photographed object; and
a main camera module, wherein the main camera module obtains a second image of the photographed object, and the first image and the second image are processed to form a final image.
Drawings
Fig. 1 is a perspective view of a periscopic camera module according to a preferred embodiment of the present invention.
Fig. 2A is a schematic optical path diagram of the periscopic camera module according to the above preferred embodiment of the present invention.
Fig. 2B is a schematic diagram of the pixel points of a photosensitive chip of the periscopic camera module according to the above preferred embodiment of the present invention.
Fig. 3A is a schematic optical path diagram of the periscopic camera module according to the above preferred embodiment of the present invention.
Fig. 3B is a partially enlarged view of the schematic optical path diagram of the periscopic camera module according to the above preferred embodiment of the present invention.
Fig. 4 is a schematic optical path diagram of the periscopic camera module according to another preferred embodiment of the present invention.
Fig. 5 is a schematic optical path diagram of the periscopic camera module according to another preferred embodiment of the present invention.
Fig. 6 is a schematic diagram of a camera module applied to a mobile electronic device according to a preferred embodiment of the invention.
Fig. 7 is a schematic view of a scene of the mobile electronic device acquiring an image through the camera module according to the above preferred embodiment of the invention.
Detailed Description
The following description is presented to disclose the invention so as to enable any person skilled in the art to practice the invention. The preferred embodiments in the following description are given by way of example only, and other obvious variations will occur to those skilled in the art. The basic principles of the invention, as defined in the following description, may be applied to other embodiments, variations, modifications, equivalents, and other technical solutions without departing from the spirit and scope of the invention.
It will be understood by those skilled in the art that in the present disclosure, the terms "longitudinal," "lateral," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for ease of description and simplicity of description, and do not indicate or imply that the referenced devices or components must be constructed and operated in a particular orientation and thus are not to be considered limiting.
It is understood that the term "a" or "an" should be interpreted as "at least one" or "one or more"; that is, the number of an element may be one in one embodiment and plural in another, so the terms "a" and "an" are not to be interpreted as limiting the number.
Referring to fig. 1 to 3B, a periscopic camera module 100 according to a preferred embodiment of the present invention is described below. The periscopic camera module 100 obtains images of a plurality of monochromatic lights 300 of different colors by receiving them, wherein the monochromatic lights 300 of different colors correspond to different sharp object plane positions of the photographed object; specifically, each image of a monochromatic light 300 corresponds to one sharp object plane position of the photographed object. Subsequently, the images of the monochromatic lights 300 of different colors are superimposed on one another by an algorithm to synthesize a final image of the photographed object, so as to improve the depth of field of the periscopic camera module 100. Preferably, the periscopic camera module obtains a clear image of each monochromatic light 300, and by overlapping these clear images it both improves the depth of field of the camera module 100 and obtains a high-quality image.
The periscopic camera module 100 includes a light processing unit 10 and a photosensitive element 20, wherein the light processing unit 10 is held in a photosensitive path of the photosensitive element 20, so that light entering the periscopic camera module 100 reaches the photosensitive element 20 after passing through the light processing unit 10 and forms an image on a photosensitive surface 210 of the photosensitive element 20. Specifically, a composite light 400 from a photographed object is dispersed by the light processing unit 10 into a plurality of monochromatic lights 300 of different colors, that is, into the monochromatic lights 300 of seven colors: red, orange, yellow, green, blue, indigo, and violet. The monochromatic lights 300 of different colors then respectively form clear images on the photosensitive surface of the photosensitive element 20; these clear images correspond to different clear object plane positions of the photographed object, and they are synthesized by an algorithm to obtain a final image of the photographed object. In other words, the light processing unit 10 increases the dispersion of the plurality of monochromatic lights 300 formed from the composite light 400, and thus the dispersion of the different clear object planes to which they correspond, which helps the periscopic camera module 100 improve its depth of field by superimposing the images of the different monochromatic lights 300.
Further, referring to fig. 1 to 3B, the periscopic camera module 100 includes a lens unit 30, wherein the lens unit 30 is disposed between the light processing unit 10 and the photosensitive element 20 and is held in the photosensitive path of the photosensitive element 20. The plurality of monochromatic lights 300 emitted from the light processing unit 10 reach the lens unit 30 and are focused by it, and the monochromatic lights 300 of different colors passing through the lens unit 30 are respectively imaged on the photosensitive surface 210 of the photosensitive element 20, yielding images of a plurality of different sharp object plane positions. Because light of different colors has different refractive indexes in the same medium, its propagation speed in that medium also differs. Consequently, when the composite light 400 reflected from object planes at the same distance from the photosensitive surface 210 enters the periscopic camera module and is dispersed into the red light, the green light, and the blue light, these lights are focused on different planes; that is, the sharp image plane positions formed by the monochromatic lights 300 of different colors are also different. The images of the different sharp object plane positions are synthesized by an algorithm to obtain the final image of the periscopic camera module, which improves the depth of field of the periscopic camera module 100.
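That different colors "focus on different planes" follows from the thin-lens equation with a wavelength-dependent focal length: for a fixed image distance (the photosensitive surface), each channel is in sharp focus for a different object distance. A numeric illustration; the focal-length values are made up for a lens with ordinary dispersion (blue refracted most strongly) and are not data from the patent:

```python
def sharp_object_distance(f, v):
    """Thin lens: 1/f = 1/u + 1/v. Returns the object distance u that
    images sharply at image distance v for focal length f (same units)."""
    if v <= f:
        raise ValueError("image plane inside focal length: no real object conjugate")
    return 1.0 / (1.0 / f - 1.0 / v)

# Hypothetical focal lengths in mm: f_blue < f_green < f_red for ordinary dispersion.
f_red, f_green, f_blue = 10.06, 10.00, 9.92
v = 10.5  # fixed sensor (image) distance in mm

u_red = sharp_object_distance(f_red, v)
u_green = sharp_object_distance(f_green, v)
u_blue = sharp_object_distance(f_blue, v)
# u_blue < u_green < u_red: the blue channel is sharp for nearer objects and
# the red channel for farther ones, so superimposing the sharp regions of
# all three channels extends the overall depth of field.
```

With these sample numbers the green channel is sharp at 210 mm while blue and red are sharp roughly 30 mm nearer and farther respectively, which is the spread of sharp object planes the description exploits.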
Specifically, referring to fig. 3A and 3B, the composite light 400 is dispersed to form the monochromatic lights 300 after passing through the light processing unit 10, and since the wavelengths and frequencies of the monochromatic lights 300 are different and the refractive indexes of the light processing unit 10 for the monochromatic lights 300 of different wavelengths are different, the positions of the monochromatic lights 300 emitted from the light processing unit 10 are different. For example, the monochromatic lights 300 propagating in the Z-axis direction, formed by the composite light 400 propagating in the X-axis direction passing through the light processing unit 10, are arranged in the order of the red light, the orange light, the yellow light, the green light, the blue light, the indigo light and the violet light from top to bottom. That is, when the monochromatic lights 300 of seven colors reach a plane perpendicular to the Z-axis direction, for example the photosensitive surface 210 of the photosensitive element 20, the image areas formed by the monochromatic lights 300 of different colors are likewise arranged on the photosensitive surface 210 of the photosensitive element 20 from top to bottom, wherein the red light occupies the uppermost region, the orange light is next below it, and the violet light occupies the lowermost region. This increases the separation of the different clear object planes, which is beneficial for the periscopic camera module 100 in improving the depth of field by superimposing the images of the different monochromatic lights 300. In a preferred embodiment of the present invention, the light processing unit 10 is a prism. It should be understood by those skilled in the art that this embodiment of the light processing unit 10 is only an example and is not intended to limit the content and scope of the periscopic camera module 100 according to the present invention.
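The red-to-violet ordering follows from Snell's law combined with normal dispersion, under which the refractive index decreases with wavelength, so red light is deviated least and violet light most. A small sketch illustrating this, assuming illustrative Cauchy coefficients rather than the actual glass of the module's prism:

```python
import math

def cauchy_n(wavelength_nm, A=1.50, B=6000.0):
    """Cauchy's empirical dispersion relation n = A + B / lambda^2.
    A and B here are illustrative glass constants, not the module's."""
    return A + B / (wavelength_nm ** 2)

def deviation_deg(wavelength_nm, incident_deg=40.0):
    """Angular deviation at a single air-to-glass surface (Snell's law):
    larger n bends the ray more, so shorter wavelengths deviate more."""
    n = cauchy_n(wavelength_nm)
    refracted = math.degrees(
        math.asin(math.sin(math.radians(incident_deg)) / n))
    return incident_deg - refracted
```

Since the deviation is larger for violet (about 400 nm) than for red (about 650 nm), the seven monochromatic bands land in the top-to-bottom order described above.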
Further, the photosensitive element 20 correspondingly receives the monochromatic light 300 with different colors, the photosensitive element 20 converts the light signal into an electrical signal, and transmits the electrical signal to a processing device 50 communicably connected to the photosensitive element 20, so as to obtain images of the monochromatic light with different colors, i.e., obtain images of a plurality of different sharp object plane positions. Images of monochromatic light of different colors are superimposed by an algorithm to synthesize the first image 101.
Preferably, the photosensitive surface 210 of the photosensitive element 20 correspondingly receives the monochromatic lights 300 of the three colors of the red light, the green light and the blue light, so as to obtain separate clear images of the red light, the green light and the blue light, that is, images of three different clear object plane positions. The red light image, the green light image and the blue light image correspond to three different clear object plane positions, and the red light image, the green light image and the blue light image are synthesized by an algorithm to improve the depth of field. For example, referring to fig. 3A, the photosensitive surface 210 of the photosensitive element 20 correspondingly receives the monochromatic lights 300 of the three colors of the red light, the green light and the blue light, and thereby obtains an image A of the red light, an image B of the green light and an image C of the blue light, respectively, where the image A, the image B and the image C correspond to images of three different sharp object plane positions, and an object plane position A', an object plane position B' and an object plane position C' of the photographed object can be obtained from the information of the image A, the image B and the image C, so that the image A, the image B and the image C can subsequently be synthesized by an algorithm to obtain a clear image of the photographed object. It should be understood that the photosensitive element 20 may also obtain an image of the photographed object by receiving the monochromatic lights 300 of other colors.
Specifically, referring to fig. 2B, the photosensitive surface 210 of the photosensitive element 20 includes a plurality of pixel points 21, that is, the photosensitive surface 210 of the photosensitive element 20 includes a pixel point array, and the pixel points 21 correspondingly receive the red light, the green light and the blue light, so as to obtain clear images of the monochromatic lights 300 of the three colors. That is, the red light, the green light and the blue light received by the red pixel units 2111, the green pixel units 2112 and the blue pixel units 2113 of the photosensitive element 20 form three images at different object plane positions, and the three images at different object plane positions are combined by an algorithm to obtain a final image of the photographed object. It should be understood that the three images refer to the image of the red light formed by the red pixel units 2111, the image of the green light formed by the green pixel units 2112 and the image of the blue light formed by the blue pixel units 2113.
More specifically, the pixel point 21 includes a plurality of pixel units 211, wherein each pixel unit 211 can receive the monochromatic light 300 of a corresponding color, and the plurality of pixel units 211 are sequentially arranged from top to bottom, so that the pixel point 21 can receive the monochromatic light 300 of the corresponding color. The pixel units 211 are selected from one or a combination of a plurality of pixel types including a red pixel unit 2111, a green pixel unit 2112 and a blue pixel unit 2113, wherein the red pixel unit 2111 can receive the red light, the green pixel unit 2112 can receive the green light, and the blue pixel unit 2113 can receive the blue light, and the red pixel units 2111, the green pixel units 2112 and the blue pixel units 2113 are arranged according to a predetermined rule, so that the pixel point 21 can receive the monochromatic light 300 of the corresponding color. In the drawings of the specification and the following description, the red pixel unit 2111, the green pixel unit 2112 and the blue pixel unit 2113 are represented by the letters "R", "G" and "B", respectively, as shown in fig. 2B.
Preferably, the photosensitive element 20 is implemented as an irregular color filter array. Specifically, the pixel units 211 are arranged in the order of R-RG-RGB-GB-B from top to bottom, so that the photosensitive element 20 can receive the monochromatic light of the corresponding color. Since the wavelengths and frequencies of the different monochromatic lights 300 are different, and the refractive indexes of the light processing unit 10 for the monochromatic lights 300 of different wavelengths are different, the positions of the different monochromatic lights 300 emitted from the light processing unit 10 are also different, and the monochromatic lights 300 are sequentially arranged from top to bottom in the order of the red light, the orange light, the yellow light, the green light, the blue light, the indigo light and the violet light. Correspondingly, the pixel units 211 including only the red pixel unit 2111 are distributed at the top, and the pixel units 211 including only the blue pixel unit 2113 are distributed at the bottom. Further, due to the influence of the refraction angle and the focal length, the red light, the green light and the blue light are difficult to separate completely; that is, going from top to bottom, the red light and the green light may overlap, and then the red light, the green light and the blue light may all overlap. Accordingly, the pixel units 211 including at least one red pixel unit 2111 and at least one green pixel unit 2112 are located below the pixel units 211 including only the red pixel unit 2111, and the pixel units 211 including at least one red pixel unit 2111, at least one green pixel unit 2112 and at least one blue pixel unit 2113 are located above the pixel units 211 including only the blue pixel unit 2113.
That is to say, the pixel units 211 of the pixel points 21 of the photosensitive element 20 are arranged according to the distribution of the monochromatic lights 300 formed after dispersion, so that the photosensitive element 20 can better receive the monochromatic lights 300 of the corresponding colors, which further improves the imaging effect.
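The banded R-RG-RGB-GB-B arrangement described above can be laid out programmatically. The sketch below is an illustration only: the patent does not fix the band heights, so the equal five-way split of the sensor rows and the column-cycling of each band's admissible colors are assumptions:

```python
def irregular_cfa(rows, cols):
    """Row-banded filter layout in the R-RG-RGB-GB-B order:
    the top band sees only red light, the middle band all three
    colors, and the bottom band only blue light."""
    bands = ["R", "RG", "RGB", "GB", "B"]
    grid = []
    for r in range(rows):
        # map the row to one of the five equal-height bands
        band = bands[min(r * len(bands) // rows, len(bands) - 1)]
        # cycle the band's admissible colors across the columns
        grid.append([band[c % len(band)] for c in range(cols)])
    return grid
```

In a real module the band boundaries would instead follow the measured dispersion pattern, including the overlap regions where red/green and red/green/blue coincide.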
Optionally, the light sensing element 20 is implemented as an RGB color filter array, i.e. each of the pixel cells 211 comprises at least one of the red pixel cells 2111, at least one of the green pixel cells 2112 and at least one of the blue pixel cells 2113.
Optionally, the photosensitive element 20 is implemented as an RGBW color filter array, where "W" is a white pixel unit, that is, each of the pixel units 211 includes at least one of the red pixel units 2111, at least one of the green pixel units 2112, at least one of the blue pixel units 2113 and at least one white pixel unit, and brightness is supplemented by the white pixel unit to improve imaging quality.
Referring to fig. 1 and 2A, the periscopic camera module 100 further includes a color filter 40, wherein the color filter 40 is held in a photosensitive path of the photosensitive element 20 and is disposed in front of the photosensitive element 20, so that light entering the periscopic camera module 100 passes through the color filter 40 before reaching the photosensitive element 20. The color filter 40 filters stray light to ensure the sharpness of the image formed on the photosensitive surface 210 of the photosensitive element 20. For example, the color filter 40 only allows light of predetermined wavelength bands to pass through, so as to ensure that the red light, the green light and the blue light respectively form clear independent images on the photosensitive surface 210 of the photosensitive element 20, thereby ensuring the imaging quality. Preferably, the color filter 40 is disposed between the lens unit 30 and the photosensitive element 20; light passes through the lens unit 30 and reaches the color filter 40, and the color filter 40 filters the stray light. Alternatively, the color filter 40 may be disposed between the lens unit 30 and the light processing unit 10. It should be understood that these specific embodiments of the color filter 40 are only examples and are not intended to limit the content and scope of the periscopic camera module 100 according to the present invention.
In the specific embodiment of the periscopic camera module 100 illustrated in fig. 4 of the specification, the periscopic camera module 100 further comprises an auxiliary element 80, wherein the auxiliary element 80 is disposed between the light processing unit 10 and the lens unit 30 and is held in the photosensitive path of the photosensitive element 20. The monochromatic lights 300 emitted from the light processing unit 10 are further spread after passing through the auxiliary element 80, so that the overlapping portions of the monochromatic lights 300 of different colors are reduced. In this way, the overlapping areas of the red light, the green light and the blue light are reduced, the photosensitive element 20 can better receive the monochromatic light 300 of the corresponding color, the sharpness of the separate images formed by the red light, the green light and the blue light is improved, and the auxiliary element 80 can further improve the depth of field. Preferably, the auxiliary element 80 is implemented as a cylindrical mirror. Alternatively, the auxiliary element 80 is implemented as a free-form surface element.
The periscopic camera module 100 shown in fig. 5 of the specification differs from the periscopic camera module 100 shown in fig. 2A in that the periscopic camera module 100 shown in fig. 5 further includes a driving element 90, wherein the light processing unit 10 is rotatably connected to the driving element 90, so that the red light, the green light and the blue light emitted from the light processing unit 10 can all be received by the photosensitive element 20 without enlarging the area of the photosensitive surface 210 of the photosensitive element 20. Specifically, since the refractive indexes of the monochromatic lights 300 of different colors in the light processing unit 10 are different, the angles formed by the monochromatic lights 300 emitted from the light processing unit 10 and the YZ plane are different, and the spacing between the monochromatic lights 300 increases as the light propagates. If all of the red light, the green light and the blue light were to be received at once, the required photosensitive area of the photosensitive element 20 would be relatively large, which is unfavorable for keeping the periscopic camera module 100 light and thin. By driving the light processing unit 10 to rotate about the Y axis, the photosensitive element 20 can sequentially receive the red light, the green light and the blue light, and obtain the images corresponding to the monochromatic lights 300 of the three colors respectively, so as to obtain the first image 101 subsequently.
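The time-sequential capture enabled by the driving element 90 can be sketched as a simple control loop. The `rotate` and `read_sensor` callbacks below are hypothetical stand-ins for the driving element and the photosensitive element, which the patent does not expose as software interfaces:

```python
def capture_sequentially(rotate, read_sensor,
                         colors=("red", "green", "blue")):
    """Aim the light processing unit at each color's exit angle in turn,
    so one color at a time falls on the fixed photosensitive surface,
    then read that channel's frame."""
    frames = {}
    for color in colors:
        rotate(color)                 # driving element re-aims the prism
        frames[color] = read_sensor() # capture the channel now in view
    return frames
```

Trading three sequential exposures for a smaller sensor is exactly the light-and-thin compromise the paragraph describes.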
According to another aspect of the present invention, the present invention further provides an image capturing method of a periscopic camera module, wherein the image capturing method includes the following steps:
(a) a light processing unit 10 disperses a composite light 400 from a photographed object to form a plurality of monochromatic lights 300;
(b) receiving the monochromatic light 300 by a photosensitive surface 210 of a photosensitive element 20, and respectively obtaining images of the monochromatic light 300 with different colors; and
(c) the images of the monochromatic lights 300 of different colors are synthesized to obtain a final image of the photographed object.
Specifically, in the step (a), the composite light 400 from the photographed object is dispersed by the light processing unit 10 to form the monochromatic lights 300 of different colors, that is, the composite light 400 is dispersed by the light processing unit 10 to form the monochromatic lights 300 of the seven colors of red, orange, yellow, green, blue, indigo and violet. The monochromatic lights 300 of different colors can then be imaged on the photosensitive surface 210 of the photosensitive element 20, wherein the monochromatic lights 300 of different colors correspond to different object plane positions of the photographed object; specifically, a clear image of each monochromatic light 300 corresponds to a clear object plane position of the photographed object. Subsequently, the clear images of the monochromatic lights 300 of different colors are superimposed by an algorithm so as to synthesize the final image of the photographed object, and the depth of field of the periscopic camera module 100 is improved in this way.
Preferably, in the step (b), the photosensitive surface 210 of the photosensitive element 20 correspondingly receives the red light, the green light and the blue light, so as to obtain clear images of the red light, the green light and the blue light. It should be noted that the specific colors of the monochromatic lights 300 received by the photosensitive element 20 are not a limitation to the content and scope of the image acquisition method of the periscopic camera module according to the present invention. More specifically, the red pixel unit 2111, the green pixel unit 2112 and the blue pixel unit 2113 of the photosensitive surface 210 of the photosensitive element 20 respectively receive the red light, the green light and the blue light, and thus respectively obtain the images of the red light, the green light and the blue light. More preferably, the red pixel units 2111, the green pixel units 2112 and the blue pixel units 2113 correspondingly receive the red light, the green light and the blue light by being arranged in the order of R-RG-RGB-GB-B from top to bottom.
The image acquisition method further includes a step (e) before the step (b): the monochromatic lights 300 are converged by a lens unit 30.
Preferably, the step (e) is followed by a step (f): the distance between the monochromatic lights 300 of different colors is enlarged by an auxiliary element 80. Specifically, the monochromatic lights 300 emitted from the light processing unit 10 are further spread after passing through the auxiliary element 80, so that the overlapping portions of the monochromatic lights 300 of different colors are reduced. In this way, the overlapping areas of the red light, the green light and the blue light are reduced, which helps the photosensitive element 20 better receive the monochromatic light 300 of the corresponding color and improves the sharpness of the images formed separately by the red light, the green light and the blue light.
The image acquisition method further includes a step (g) before the step (b): the stray light is filtered by a color filter 40. Specifically, the color filter 40 only allows light of predetermined wavelength bands to pass through, so as to ensure that the red light, the green light and the blue light respectively form clear independent images on the photosensitive surface 210 of the photosensitive element 20, thereby ensuring the imaging quality.
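The band-pass behaviour of the color filter 40 amounts to keeping only wavelengths inside the red, green and blue bands and rejecting everything else as stray light. A sketch with commonly cited, but here merely illustrative, band edges in nanometres (the patent does not specify the filter's pass bands):

```python
def passes_filter(wavelength_nm):
    """Return True if the wavelength falls in the red, green or blue
    pass band; anything else is treated as stray light and blocked.
    Band edges are illustrative, not the module's actual filter spec."""
    bands = [
        (620, 750),  # red
        (495, 570),  # green
        (450, 495),  # blue
    ]
    return any(lo <= wavelength_nm <= hi for lo, hi in bands)
```

Under this model the orange, yellow, indigo and violet components of the dispersed composite light 400 are exactly the wavelengths the predicate rejects.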
Preferably, the image acquisition method further includes a step (h) before the step (b): the light processing unit 10 is rotated. Specifically, the light processing unit 10 can be driven to rotate by a driving element 90, so that the photosensitive element 20 can sequentially receive the red light, the green light and the blue light. In this way, all of the red light, the green light and the blue light emitted from the light processing unit 10 can be received by the photosensitive element 20, which is beneficial to making the periscopic camera module 100 light and thin.
In accordance with another aspect of the present invention, a camera module 1000 employing a periscopic camera module 100 according to another preferred embodiment of the present invention will be described in the following description. Referring to fig. 6 and 7 of the specification, at least one periscopic camera module 100 is applied to the camera module 1000, wherein the camera module 1000 can obtain high-quality images. Specifically, the camera module 1000 includes at least one periscopic camera module 100 and at least one main camera module 200, wherein the periscopic camera module 100 and the main camera module 200 can respectively obtain a first image 101 and a second image 102, and the first image 101 and the second image 102 can be synthesized into a final image 103 of the camera module 1000 by an algorithm. Further, the periscopic camera module 100 can determine the position of an actual scene and obtain the outline of the actual scene; the periscopic camera module 100 obtains the first image 101 having depth information, and the first image 101 and the second image 102 are fused to obtain the final image 103, which can embody the outline of the actual scene and present a stereoscopic effect, thereby improving the image quality obtained by the camera module 1000.
It should be noted that the numbers of the periscopic camera modules 100 and the main camera modules 200 of the camera module 1000 are not limited. For example, the camera module 1000 may be implemented as a dual camera module, that is, the camera module 1000 includes one periscopic camera module 100 and one main camera module 200; the camera module 1000 may also be implemented as a triple camera module, that is, the camera module 1000 includes two periscopic camera modules 100 and one main camera module 200, or one periscopic camera module 100 and two main camera modules 200. The specific numbers of the periscopic camera modules 100 and the main camera modules 200 of the camera module 1000 are only examples and are not a limitation to the content and scope of the camera module 1000 and the periscopic camera module 100 thereof according to the present invention. In the drawings and the following description of the present invention, the camera module 1000 is implemented as a dual camera module as an example.
Referring to fig. 1 to 3B, the periscopic camera module 100 forms images by receiving the monochromatic lights 300, analyzes the actual position of the actual scene according to the sharpness of the monochromatic lights 300, and obtains the outline of the actual scene. Specifically, the periscopic camera module 100 includes a light processing unit 10 and a photosensitive element 20, wherein the light processing unit 10 is held in a photosensitive path of the photosensitive element 20, and light entering the periscopic camera module 100 can pass through the light processing unit 10, then reach the photosensitive element 20, and form an image on a photosensitive surface 210 of the photosensitive element 20. Further, a composite light 400 from the external environment is dispersed after passing through the light processing unit 10 to form the monochromatic lights 300, that is, the composite light 400 is dispersed by the light processing unit 10 to form the monochromatic lights 300 of the seven colors of a red light, an orange light, a yellow light, a green light, a blue light, an indigo light and a violet light. Subsequently, the red light, the green light and the blue light among the monochromatic lights 300 are imaged on the photosensitive element 20, and the position of the actual scene corresponding to each monochromatic light 300 is calculated by analyzing the images formed by the monochromatic lights 300 of the three colors, thereby obtaining the outline of the actual scene.
Particularly, the photosensitive surface 210 of the photosensitive element 20 is a plane, and since the refractive indexes and propagation speeds corresponding to the red light, the green light and the blue light differ to a certain extent, the relative positions between the actual scene positions corresponding to the red light, the green light and the blue light and the photosensitive surface 210 of the photosensitive element 20 also differ, so that there is a difference in the sharpness of the different monochromatic lights received by the photosensitive element 20. Further, from the images formed on the photosensitive surface 210 of the photosensitive element 20, the actual scene positions or depth information corresponding to the different monochromatic lights can be analyzed. It should be appreciated that white light, sunlight, etc. from the external environment are all included in the composite light 400.
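The patent does not name a focus measure for quantifying the per-channel sharpness difference described above; a standard choice is the variance of a Laplacian response, sketched here in plain Python on a list-of-lists channel (the function name and the 4-neighbour kernel are assumptions for illustration):

```python
def sharpness(channel):
    """Variance of a 4-neighbour Laplacian over the channel interior --
    a common focus measure. The channel (color) whose image scores
    highest indicates the in-focus object plane for that color."""
    h, w = len(channel), len(channel[0])
    vals = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # discrete Laplacian: sum of 4 neighbours minus 4x centre
            lap = (channel[y - 1][x] + channel[y + 1][x]
                   + channel[y][x - 1] + channel[y][x + 1]
                   - 4 * channel[y][x])
            vals.append(lap)
    mean = sum(vals) / len(vals)
    return sum((v - mean) ** 2 for v in vals) / len(vals)
```

Comparing this score across the red, green and blue images yields the sharpness ordering from which the actual scene position or depth information can then be inferred.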
It should be noted that the light processing unit 10 can change the optical path direction of the light entering the periscopic camera module 100. For example, the composite light 400 propagating along the X-axis direction enters the light processing unit 10 and then propagates along the Z-axis direction after passing through the light processing unit 10; that is, the light processing unit 10 can turn the light path of the light entering the periscopic camera module 100, so that the periscopic camera module 100 can reduce the overall height of the camera module 1000 while providing a long-focus shooting effect. This facilitates applying the camera module 1000 to a mobile electronic device 2000 and meets the trend toward light and thin mobile electronic devices 2000, wherein the mobile electronic device 2000 is a mobile phone, an iPad, or the like.
Referring to fig. 3A and 3B, the composite light 400 is dispersed to form the monochromatic lights 300 after passing through the light processing unit 10, and the positions of the monochromatic lights 300 after being emitted from the light processing unit 10 are different, because the wavelengths and frequencies of the monochromatic lights 300 are different and the refractive indexes of the light processing unit 10 for the monochromatic lights 300 of different wavelengths are different. For example, the monochromatic lights 300 formed by the composite light 400 propagating along the X-axis direction after passing through the light processing unit 10, and propagating along the Z-axis direction, are sequentially arranged from top to bottom in the order of the red light, the orange light, the yellow light, the green light, the blue light, the indigo light and the violet light. That is, when the monochromatic lights 300 of seven colors reach a plane perpendicular to the Z-axis direction, for example the photosensitive surface 210 of the photosensitive element 20, the image areas formed by the monochromatic lights 300 of different colors are likewise arranged from top to bottom on the photosensitive surface 210 of the photosensitive element 20, wherein the red light is located in the uppermost area and the violet light is located in the lowermost area. In a preferred embodiment of the present invention, the light processing unit 10 is a prism. It should be understood by those skilled in the art that this embodiment of the light processing unit 10 is only an example and is not intended to limit the content and scope of the periscopic camera module 100 according to the present invention.
Further, referring to fig. 1 and 2A, the periscopic camera module 100 includes a lens unit 30, wherein the lens unit 30 is disposed between the light processing unit 10 and the photosensitive element 20, the lens unit 30 is maintained in a photosensitive path of the photosensitive element 20, the monochromatic light 300 emitted from the light processing unit 10 can reach the lens unit 30, the monochromatic light 300 can be focused by the lens unit 30, and further, the monochromatic light 300 passing through the lens unit 30 can be imaged on the photosensitive surface 210 of the photosensitive element 20.
Referring to fig. 1 and 2A, the periscopic camera module 100 further includes a color filter 40, wherein the color filter 40 is held in a photosensitive path of the photosensitive element 20 and is disposed in front of the photosensitive element 20, so that light entering the periscopic camera module 100 passes through the color filter 40 before reaching the photosensitive element 20. The color filter 40 filters the monochromatic lights 300 of different colors, only allows the red light, the green light and the blue light to pass through, and ensures that only the red light, the green light and the blue light can form images on the photosensitive surface 210 of the photosensitive element 20, thereby ensuring the imaging quality. Further, since the positions of the actual scene corresponding to the monochromatic lights 300 of the three colors are different, by analyzing the images formed by the monochromatic lights 300 of the three colors on the photosensitive surface 210 of the photosensitive element 20, the position or depth information of the actual scene can be analyzed, and the outline of the actual scene can be obtained.
Particularly, optical properties such as the refractive index and the propagation speed of the red light, the green light and the blue light are different. Therefore, the red light, the green light and the blue light, which are formed by dispersing the composite light 400 reflected into the periscopic camera module from the same point on the actual scene, or from points located in the same vertical plane, i.e., points at the same distance from the photosensitive surface 210 of the photosensitive element 20, are focused on different planes. When the position of the photosensitive surface 210 of the photosensitive element 20 is fixed, the sharpness of the monochromatic lights 300 of different colors received by the photosensitive surface 210 of the photosensitive element 20 is not the same. For example, the sharpness of the image formed on the photosensitive surface 210 of the photosensitive element 20 by the red light dispersed from the composite light 400 reflected into the periscopic camera module from a certain position on the actual scene may be greater than that of the image formed on the photosensitive surface 210 by the blue light dispersed from the same composite light 400. Similarly, the positions of the points on the actual scene corresponding to monochromatic lights 300 of different colors but of the same sharpness received by the photosensitive surface 210 of the photosensitive element 20 may be different. For example, when the images of the red light and the blue light on the photosensitive surface 210 of the photosensitive element 20 have the same sharpness, the distance between the point on the actual scene corresponding to the red light and the periscopic camera module is greater than the distance between the point on the actual scene corresponding to the blue light and the periscopic camera module.
Further, filtering the stray light by the color filter 40 ensures the sharpness of the images formed by the red light, the green light and the blue light on the photosensitive surface 210 of the photosensitive element 20, and reduces the interference of the monochromatic lights 300 of other colors with the acquisition of the position of the actual scene, so as to improve the accuracy with which the periscopic camera module obtains the position of the actual scene and thereby better present the outline of the actual scene subsequently. It should be understood that the stray light according to the present invention refers to the monochromatic lights 300 of colors other than the red light, the green light and the blue light.
Preferably, the color filter 40 is disposed between the lens unit 30 and the photosensitive element 20, and the color filter 40 filters the monochromatic lights 300 emitted from the lens unit 30, so that only the red light, the green light and the blue light can reach the photosensitive element 20 and the first image 101 can be obtained subsequently. Alternatively, the color filter 40 is disposed between the lens unit 30 and the light processing unit 10, so that the monochromatic lights 300 emitted from the light processing unit 10 are filtered by the color filter 40, and only the red light, the green light and the blue light are focused by the lens unit 30.
Further, the photosensitive element 20 correspondingly receives the monochromatic lights 300 of the red light, the green light and the blue light, converts the optical signals into electrical signals, and transmits the electrical signals to a processing device 50 communicably connected to the photosensitive element 20, so as to obtain the image of the red light, the image of the green light and the image of the blue light respectively. Subsequently, the processing device 50 analyzes the image of the red light, the image of the green light and the image of the blue light, and the position of the actual scene is obtained according to the difference in sharpness of the monochromatic lights 300 of the three colors, so as to obtain the outline of the actual scene. It should be understood that the image of the red light, the image of the green light and the image of the blue light can be synthesized into the first image 101. It should be noted that the processing device 50 can be implemented as a processor of the mobile electronic device 2000, or as a processor of the camera module 1000 itself, and it should be understood by those skilled in the art that the specific implementation of the processing device 50 is merely an example and is not intended to limit the content and scope of the camera module 1000 and the periscopic camera module 100 according to the present invention.
Specifically, referring to fig. 2B, the photosensitive surface 210 of the photosensitive element 20 includes a plurality of pixel points 21, that is, the photosensitive surface 210 of the photosensitive element 20 includes a pixel point array, and the pixel points 21 correspondingly receive the red light, the green light, and the blue light, so as to obtain images corresponding to the monochromatic light 300 of the three colors. More specifically, each pixel point 21 includes a plurality of pixel units 211, where each pixel unit 211 can receive the monochromatic light 300 of a corresponding color, and the pixel units 211 are arranged sequentially from top to bottom, so that the pixel point 21 can receive the monochromatic light 300 of the corresponding color. Each pixel unit 211 is selected from one or a combination of a plurality of pixel unit types including a red pixel unit 2111, a green pixel unit 2112, and a blue pixel unit 2113, wherein the red pixel unit 2111 can receive the red light, the green pixel unit 2112 can receive the green light, and the blue pixel unit 2113 can receive the blue light, and the red pixel units 2111, the green pixel units 2112, and the blue pixel units 2113 are arranged according to a predetermined rule, so that the pixel point 21 can receive the monochromatic light 300 of the corresponding color. In the drawings of the specification and the following description, the red pixel unit 2111, the green pixel unit 2112, and the blue pixel unit 2113 are represented by the letters "R", "G", and "B", respectively, as shown in fig. 2B.
Preferably, the light sensing element 20 is implemented with an irregular color filter array. Specifically, the pixel units 211 are arranged in the order of R-RG-RGB-GB-B from top to bottom, so that the light sensing element 20 can receive the monochromatic light of the corresponding color. Because the wavelengths and frequencies of the different monochromatic lights 300 are different, the refractive indexes of the light processing unit 10 for the monochromatic lights 300 of different wavelengths are also different, so the positions at which the different monochromatic lights 300 exit the light processing unit 10 differ, and the monochromatic lights 300 are arranged sequentially from top to bottom in the order of the red light, the orange light, the yellow light, the green light, the blue light, the indigo light, and the violet light. Correspondingly, the pixel units 211 including only the red pixel unit 2111 are distributed at the top, and the pixel units 211 including only the blue pixel unit 2113 are distributed at the bottom. Further, because of the influence of the refraction angle and the focal length, the red light, the green light, and the blue light are difficult to separate completely, that is, from top to bottom the red light may overlap the green light, and the red light, the green light, and the blue light may overlap one another. Accordingly, the pixel units 211 including at least one red pixel unit 2111 and at least one green pixel unit 2112 are located below the pixel units 211 including only the red pixel unit 2111, and the pixel units 211 including at least one red pixel unit 2111, at least one green pixel unit 2112, and at least one blue pixel unit 2113 are located above the pixel units 211 including only the blue pixel unit 2113.
That is to say, the pixel units 211 of the pixel points 21 of the photosensitive element 20 are arranged according to the distribution of the monochromatic light 300 formed after dispersion, so that the photosensitive element 20 can better receive the monochromatic light 300 of the corresponding color, which further helps improve the imaging effect.
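The irregular R-RG-RGB-GB-B column described above can be illustrated with a short sketch. This is an illustrative assumption about how such a layout might be generated, not the patent's actual sensor design; the band proportions are hypothetical.

```python
# Sketch of the irregular filter-array column described above: pixel
# units are stacked top-to-bottom in the order R, RG, RGB, GB, B,
# matching the vertical spread of the dispersed monochromatic light.
# Band sizes are an illustrative assumption.

def build_column(n_rows: int) -> list[str]:
    """Assign each sensor row a set of pixel-unit colors.

    The top band sees only red, the middle band all three colors,
    and the bottom band only blue, mirroring the R-RG-RGB-GB-B order.
    """
    bands = ["R", "RG", "RGB", "GB", "B"]
    rows_per_band = n_rows // len(bands)
    column = []
    for band in bands:
        column.extend([band] * rows_per_band)
    # Pad any remainder with the last band so every row is assigned.
    column.extend([bands[-1]] * (n_rows - len(column)))
    return column

layout = build_column(10)
# The first rows receive only red light, the last rows only blue light.
print(layout)  # ['R', 'R', 'RG', 'RG', 'RGB', 'RGB', 'GB', 'GB', 'B', 'B']
```

A real sensor would fix the band boundaries from the measured dispersion geometry rather than splitting the rows evenly.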
Optionally, the light sensing element 20 is implemented with an RGB color filter array, i.e., each of the pixel units 211 comprises at least one red pixel unit 2111, at least one green pixel unit 2112, and at least one blue pixel unit 2113.
It should be noted that the photosensitive surface 210 of the photosensitive element 20 is a flat surface and the position of the photosensitive element 20 is fixed, that is, the image plane is fixed; when the photosensitive element 20 is set within a predetermined range, the red light, the green light, and the blue light can all be imaged on the photosensitive surface of the photosensitive element 20. Because the distances between the periscopic camera module 100 and the individual points forming the outline of the actual scene are different, and the positions on the actual scene corresponding to the monochromatic light 300 of different colors are different, the sharpness of the monochromatic light 300 of different colors received by the photosensitive element 20 also differs. For example, referring to fig. 3A and 3B, the photosensitive element 20 can be disposed at any position between a plane A and a plane C, and an image plane corresponding to the actual scene can be formed on the photosensitive element 20. If the point at which a beam of the red light is focused lies on the plane B, then when the photosensitive element 20 is disposed at the position corresponding to the plane B, the image formed by that red light on the photosensitive element 20 is brightest, that is, the sharpness of the red light is highest, whereas when the photosensitive element 20 is disposed at the plane C in front of the plane B or at the plane A behind the plane B, the luminance of the red light received by the photosensitive element 20 decreases. Similarly, if the point at which another beam of the red light is focused lies on the plane C, the image formed by that red light is brightest when the photosensitive element 20 is disposed at the position corresponding to the plane C, and the received luminance decreases when the photosensitive element 20 is disposed on the plane B behind the plane C or on the plane A behind the plane B. It should be understood that the monochromatic lights 300 of different colors formed by the dispersion of the same composite light 400 are focused at different positions. In other words, the monochromatic lights 300 of different colors have different optical properties: the red light, the green light, and the blue light formed by dispersing the composite light 400 that is reflected into the periscopic camera module from the same point on the actual scene, or from points in the same vertical plane (that is, points at the same distance from the photosensitive surface 210 of the photosensitive element 20), are focused on different planes. When the position of the photosensitive surface 210 of the photosensitive element 20 is fixed, the sharpness of the monochromatic lights 300 of different colors received by the photosensitive surface 210 differs; conversely, when monochromatic lights 300 of different colors are received with the same sharpness, the corresponding points on the actual scene may lie at different positions. That is to say, by dispersing the composite light entering the camera module into monochromatic lights 300 of different colors and receiving the monochromatic light 300 on the photosensitive surface 210 of the photosensitive element 20, the periscopic camera module can analyze the images that the monochromatic lights 300 of different colors form on the photosensitive surface 210 to obtain the position of each point on the actual scene, and thereby obtain the outline of the actual scene.
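The sharpness comparison described above can be sketched in a few lines. The gradient-energy measure and the three sample edge profiles below are illustrative assumptions, not the patent's actual analysis algorithm.

```python
# A minimal sketch of the sharpness comparison described above: the
# relative blur of the red, green and blue images hints at where each
# point of the scene lies along the optical axis.

def sharpness(row: list[float]) -> float:
    """Gradient energy of one image row: sharper edges -> larger value."""
    return sum((b - a) ** 2 for a, b in zip(row, row[1:]))

# Hypothetical intensity profiles of the same edge in each channel:
red   = [0, 0, 10, 10]   # crisp edge -> red is focused near this plane
green = [0, 3, 7, 10]    # softer edge
blue  = [0, 4, 6, 10]    # soft edge

scores = {"red": sharpness(red), "green": sharpness(green), "blue": sharpness(blue)}
# The channel with the highest score is best focused at the sensor plane,
# which constrains the distance of the corresponding scene point.
best = max(scores, key=scores.get)
print(best)  # red
```

In practice a two-dimensional focus measure (e.g. variance of a Laplacian filter response) would be used per channel, but the principle of comparing per-color sharpness is the same.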
Referring to fig. 1, the periscopic camera module 100 further includes a focusing mechanism 60, wherein the focusing mechanism 60 keeps the image corresponding to the actual scene at the focal plane of the lens unit 30, so as to improve the imaging effect of the periscopic camera module 100. Preferably, the focusing mechanism 60 is disposed on the lens unit 30, and the focusing mechanism 60 can drive the lens unit 30 to move to complete focusing. For example, the focusing mechanism 60 can drive the lens unit 30 to move along the Z-axis direction to achieve focusing. It should be noted that the processing device 50 can determine the focusing parameters for the current region according to the contour of the actual scene and keep all the scenery in the picture in focus; that is, some errors in the focusing process of the focusing mechanism 60 can be compensated by an algorithm, so as to reduce the precision requirement on the focusing mechanism 60 while ensuring the imaging effect.
Referring to fig. 1, the periscopic camera module 100 further includes an anti-shake mechanism 70, wherein the anti-shake mechanism 70 is disposed on the lens unit 30 and/or the light processing unit 10, and the anti-shake mechanism 70 can reduce the influence of shake on the shooting effect. It should be noted that the periscopic camera module 100 can keep objects in a range from 40 cm to infinity clear, and is therefore less sensitive to the shaking of the user's hand during shooting than an auto-focusing module.
In the specific embodiment of the periscopic camera module 100 shown in fig. 4, the periscopic camera module 100 further includes an auxiliary element 80, wherein the auxiliary element 80 is disposed between the light processing unit 10 and the lens unit 30, and the auxiliary element 80 is held in the photosensitive path of the photosensitive element 20. The monochromatic light 300 emitted from the light processing unit 10 is further dispersed after passing through the auxiliary element 80, so that the overlap between the monochromatic lights 300 of different colors is reduced. In this way, the overlapping region of the red light, the green light, and the blue light is reduced, which helps the photosensitive element 20 better receive the monochromatic light 300 of the corresponding color and improves the definition of the images formed by the red light, the green light, and the blue light, so that the position of the actual scene can be acquired more accurately, the outline of the actual scene can be obtained, and the stereoscopic effect can be better presented. Preferably, the auxiliary element 80 is implemented as a cylindrical mirror. Alternatively, the auxiliary element 80 is implemented as a free-form surface mirror.
The periscopic camera module 100 shown in fig. 5 of the specification differs from the periscopic camera module 100 shown in fig. 2A in that the periscopic camera module 100 shown in fig. 5 further includes a driving element 90, wherein the light processing unit 10 is rotatably connected to the driving element 90, so that the red light, the green light, and the blue light emitted from the light processing unit 10 can all be received by the photosensitive element 20 without enlarging the area of the photosensitive surface 210 of the photosensitive element 20. Specifically, because the refractive indexes of the light processing unit 10 for the monochromatic lights 300 of different colors are different, the angles formed between the YZ plane and the monochromatic lights 300 emitted from the light processing unit 10 are different, and the distance between the monochromatic lights 300 increases as the light propagates. If all of the red light, the green light, and the blue light were to be received at once, the required photosensitive area of the photosensitive element 20 would be relatively large, which is unfavorable for keeping the periscopic camera module 100 light and thin. By driving the light processing unit 10 to rotate around the Y axis, the photosensitive element 20 can sequentially receive the red light, the green light, and the blue light, and obtain images corresponding to the monochromatic lights 300 of the three colors respectively, so as to obtain the first image 101 subsequently.
According to another aspect of the present invention, the present invention further provides an image capturing method of a periscopic camera module, wherein the image capturing method includes the following steps:
(a) a light processing unit 10 is used for turning a composite light 400 from a shot object and simultaneously dispersing the composite light 400 to form a plurality of monochromatic lights 300, wherein at least one monochromatic light 300 is red light, at least one monochromatic light 300 is green light, and at least one monochromatic light 300 is blue light;
(b) receiving the red light, the green light and the blue light by a photosensitive element 20; and
(c) acquiring the outline of the shot object.
Specifically, the light processing unit 10 changes the optical path direction of the light beam that has entered the periscopic camera module 100. For example, the composite light 400 propagating along the X-axis direction enters the light processing unit 10 and then propagates along the Z-axis direction after passing through the light processing unit 10; that is, the light processing unit 10 can turn the light path of the light beam entering the periscopic camera module 100, so that the periscopic camera module 100 has a long-focus shooting effect while the overall height of the camera module 1000 is reduced, which facilitates applying the camera module 1000 to a mobile electronic device 2000 and meets the trend of mobile electronic devices 2000 toward thinner designs. In the step (a), the composite light 400 in the external environment is dispersed by the light processing unit 10 to form the monochromatic light 300 of different colors, that is, the composite light 400 is dispersed by the light processing unit 10 to form the monochromatic light 300 of the seven colors of red, orange, yellow, green, blue, indigo, and violet, and the monochromatic light 300 of the three colors of red, green, and blue can then be imaged on a photosensitive surface 210 of the photosensitive element 20.
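The dispersion relied on in step (a) follows from the wavelength dependence of the refractive index. The sketch below uses a Cauchy-form index with illustrative coefficients and an assumed 30-degree incidence angle; these values are examples, not parameters from the patent.

```python
import math

# Why the prism separates the composite light: the refractive index
# depends on wavelength (Cauchy relation, illustrative coefficients),
# so Snell's law bends red, green and blue by different amounts.

def cauchy_index(wavelength_nm: float, a: float = 1.50, b: float = 5000.0) -> float:
    """Approximate refractive index n(lambda) = A + B / lambda^2."""
    return a + b / wavelength_nm ** 2

def refraction_angle(incidence_deg: float, n: float) -> float:
    """Angle inside the medium from Snell's law (air -> glass)."""
    return math.degrees(math.asin(math.sin(math.radians(incidence_deg)) / n))

for name, wl in [("red", 650.0), ("green", 550.0), ("blue", 450.0)]:
    n = cauchy_index(wl)
    print(f"{name}: n={n:.4f}, refracted angle={refraction_angle(30.0, n):.3f} deg")
# Blue sees the larger index and is bent more strongly than red, so the
# light exits the prism spread out by color, as described above.
```

The same relation explains why the exit positions of the monochromatic lights 300 differ and why the corresponding pixel units are stacked by color.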
In the step (b), the photosensitive element 20 receives the red light, the green light, and the blue light, respectively. Specifically, the red light, the green light, and the blue light are received by the red pixel units 2111, the green pixel units 2112, and the blue pixel units 2113 of the photosensitive surface 210 of the photosensitive element 20, respectively. More preferably, the red pixel units 2111, the green pixel units 2112, and the blue pixel units 2113 correspondingly receive the red light, the green light, and the blue light by being arranged in the order of R-RG-RGB-GB-B from top to bottom.
Further comprising step (e) before said step (b): the monochromatic light 300 is converged by a lens unit 30.
Preferably, said step (e) is followed by a step (f): the distance between the monochromatic lights 300 of different colors is enlarged by an auxiliary element 80. Specifically, the monochromatic light 300 emitted from the light processing unit 10 is further dispersed after passing through the auxiliary element 80, so that the overlap between the monochromatic lights 300 of different colors is reduced. In this way, the overlapping region of the red light, the green light, and the blue light is reduced, which helps the photosensitive element 20 better receive the monochromatic light 300 of the corresponding color and improves the definition of the separate images formed by the red light, the green light, and the blue light, so that the position of the actual scene can be acquired more accurately, the outline of the actual scene can be obtained, and the stereoscopic effect can be better presented.
Further comprising step (g) before said step (b): the stray light is filtered by a color filter 40. Specifically, the stray light is filtered in a manner that allows only the red light, the green light, and the blue light to pass through, ensuring that only the red light, the green light, and the blue light can be imaged on the photosensitive element 20, so that the position of the actual scene can be analyzed according to the images formed by the red light, the green light, and the blue light respectively, and the outline of the actual scene can be obtained. Filtering out the stray light in this way ensures the definition of the image and reduces the interference of the monochromatic light 300 of other colors with acquiring the position of the actual scene, which helps improve the accuracy with which the periscopic camera module acquires the position of the actual scene, so that the outline of the actual scene can be better presented subsequently.
Preferably, step (h) is further included before step (b): the light processing unit 10 is rotated. Specifically, the light processing unit 10 can be driven to rotate by a driving element 90, so that the photosensitive element 20 can sequentially receive the red light, the green light, and the blue light, and thus all of the red light, the green light, and the blue light emitted from the light processing unit 10 can be received by the photosensitive element 20.
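The sequential capture of step (h) can be sketched as a simple rotate-then-expose loop. The steering angles and the `rotate`/`expose` hardware hooks below are hypothetical stand-ins for the driving element 90 and the sensor, not an actual device API.

```python
# Sketch of step (h): rotating the light processing unit so one small
# sensor receives red, green and blue in turn. Angles and the capture
# hooks are illustrative assumptions.

CHANNEL_ANGLES = {"red": -2.0, "green": 0.0, "blue": 2.0}  # assumed degrees

def capture_sequential(rotate, expose):
    """Rotate the prism to steer each color onto the sensor, then expose.

    `rotate(angle)` and `expose()` are assumed hardware hooks.
    """
    frames = {}
    for color, angle in CHANNEL_ANGLES.items():
        rotate(angle)             # driving element turns the prism about Y
        frames[color] = expose()  # sensor now sees only this color band
    return frames

# Minimal fake hardware for demonstration:
log = []
frames = capture_sequential(lambda a: log.append(a), lambda: "frame")
print(sorted(frames))  # ['blue', 'green', 'red']
```

This trades capture time for sensor area: three exposures on a small photosensitive surface instead of one exposure on a large one.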
The step (c) further comprises the steps of:
(c.1) forming images of the red, green, and blue light, respectively;
(c.2) analyzing the sharpness of the red, green, and blue light; and
(c.3) determining the position of the shot object.
Further comprising step (c.4) after said step (c.3): acquiring the outline of the actual scene according to the position of the shot object. In particular, the optical properties of the red light, the green light, and the blue light, such as refractive index and propagation speed, are different, so the red light, the green light, and the blue light formed by dispersing the composite light 400 that is reflected into the periscopic camera module from the same point on the actual scene, or from points in the same vertical plane (that is, points at the same distance from the photosensitive surface 210 of the photosensitive element 20), are focused on different planes. When the position of the photosensitive surface 210 of the photosensitive element 20 is fixed, the sharpness of the monochromatic lights 300 of different colors received by the photosensitive surface 210 differs. For example, the image formed on the photosensitive surface 210 of the photosensitive element 20 by the red light dispersed from the composite light 400 reflected into the periscopic camera module from a given position on the actual scene may be sharper than the image formed on the photosensitive surface 210 by the blue light dispersed from the same composite light 400. Similarly, when monochromatic lights 300 of different colors are received by the photosensitive surface 210 of the photosensitive element 20 with the same sharpness, the corresponding points on the actual scene may lie at different positions. For example, if the images of the red light and the blue light on the photosensitive surface 210 of the photosensitive element 20 have the same sharpness, the distance between the periscopic camera module and the point on the actual scene corresponding to the red light is greater than the distance between the periscopic camera module and the point corresponding to the blue light. By receiving the monochromatic light 300 on the photosensitive surface 210 of the photosensitive element 20 and analyzing the images that the monochromatic lights 300 of different colors form on the photosensitive surface 210, the position of each point on the actual scene can be obtained and the outline of the actual scene can be obtained, thereby acquiring the depth information of the actual scene.
Further comprising step (i) after said step (c.4): a focus parameter is generated based on the contour of the actual scene. Further, the periscopic camera module completes focusing according to the generated focusing parameters, and obtains a first image 101 of the periscopic camera module 100.
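Generating a focusing parameter from the recovered contour can be sketched with the thin-lens equation: an estimated object distance yields an image distance, and the difference from the current lens position is the drive the focusing mechanism 60 must apply. The focal length, the current lens position, and the sample depths below are illustrative assumptions, not values from the patent.

```python
# Sketch of step (i): turn the contour's depth estimates into a lens
# displacement. Thin-lens model with assumed numbers.

def image_distance(object_mm: float, focal_mm: float) -> float:
    """Thin-lens equation 1/f = 1/do + 1/di, solved for di."""
    return 1.0 / (1.0 / focal_mm - 1.0 / object_mm)

def focus_parameter(contour_depths_mm, focal_mm=5.0, current_di_mm=5.05):
    """Required lens displacement (mm) to focus the median scene depth."""
    depths = sorted(contour_depths_mm)
    median = depths[len(depths) // 2]
    return image_distance(median, focal_mm) - current_di_mm

# Hypothetical depths (mm) of contour points recovered in step (c.4):
shift = focus_parameter([400.0, 500.0, 600.0])
print(f"drive lens by {shift:+.4f} mm along Z")
```

Choosing the median depth is one simple policy for "keeping all the scenery in the picture in focus"; a real module could instead optimize over the whole contour and compensate residual error algorithmically, as the specification notes.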
According to another aspect of the present invention, the present invention further provides an image obtaining method of a camera module, wherein the image obtaining method includes the following steps:
(a) dispersing a composite light 400 by a light processing unit 10 of at least one periscopic camera module 100 and forming a plurality of monochromatic lights 300;
(b) receiving a red light, a green light and a blue light by a photosensitive element 20 of the periscopic camera module 100;
(c) acquiring the outline of an actual scene; and
(d) assisting a main camera module 200 to complete focusing.
Specifically, the periscopic camera module 100 disperses the composite light 400 into the monochromatic light 300 and can obtain images of the red light, the green light, and the blue light. A processing device 50 analyzes the actual position of the actual scene corresponding to the monochromatic light 300 according to the difference in sharpness of the monochromatic lights 300 of different colors, obtains the contour of the actual scene, and further obtains a focusing parameter. The processing device 50 outputs the focusing parameter to the main camera module 200, so that the main camera module 200 completes focusing according to the focusing parameter to obtain a second image 102. This improves the focusing efficiency of the main camera module 200 and can reduce the precision requirement on the focusing mechanism of the main camera module 200.
After the step (c), the method further comprises a step (e): a first image 101 is obtained by the periscopic camera module 100.
After the step (d), a final image 103 of the camera module 1000 is synthesized. Specifically, the first image 101 and the second image 102 are synthesized by an algorithm to obtain the final image 103, wherein the first image 101 carries accurate depth information, and the final image 103 obtained by fusing the first image 101 and the second image 102 can embody the outline of the actual scene and present a stereoscopic effect, thereby improving the quality of the image acquired by the camera module 1000. For example, referring to fig. 7, when a mobile electronic device 2000 with the camera module 1000 is used to shoot a human face, the first image 101 acquired by the periscopic camera module 100 of the camera module 1000 can represent the actual position of the human face, that is, the relative positions of the points constituting the human face, so as to obtain the contour of the human face, and the first image 101 presents the depth information of the human face. When the first image 101 and the second image 102 acquired by the main camera module 200 are further synthesized into the final image 103, the final image 103 can present the stereoscopic contour of the human face.
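The synthesis step above can be sketched as a per-pixel fusion in which the first image contributes depth and the second contributes color. The tiny 2x2 arrays and the fusion rule are illustrative assumptions, not the patent's actual algorithm.

```python
# Sketch of the final synthesis: attach depth recovered by the
# periscopic module (first image 101) to each color pixel of the
# main module's capture (second image 102).

def fuse(depth_map, color_img):
    """Pair each color pixel with its depth to form the final image."""
    return [
        [{"color": c, "depth_mm": d} for c, d in zip(color_row, depth_row)]
        for color_row, depth_row in zip(color_img, depth_map)
    ]

depth_map = [[420.0, 430.0], [600.0, 610.0]]         # from first image 101
color_img = [[(200, 10, 10), (190, 12, 9)],
             [(20, 20, 200), (22, 18, 190)]]         # from second image 102
final = fuse(depth_map, color_img)
print(final[0][0])  # {'color': (200, 10, 10), 'depth_mm': 420.0}
```

A production pipeline would first register the two images (they come from different lenses) before fusing; that alignment step is omitted here.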
It will be appreciated by persons skilled in the art that the above embodiments are only examples, wherein features of different embodiments may be combined with each other to obtain embodiments which are easily conceivable in accordance with the disclosure of the invention, but which are not explicitly indicated in the drawings.
It will be appreciated by persons skilled in the art that the embodiments of the invention described above and shown in the drawings are given by way of example only and are not limiting of the invention. The objects of the invention have been fully and effectively accomplished. The functional and structural principles of the present invention have been shown and described in the examples, and any variations or modifications of the embodiments of the present invention may be made without departing from the principles.

Claims (21)

1. An image acquisition method of a periscopic camera module is characterized by comprising the following steps:
(a) a light processing unit turns a composite light from a photographed object and disperses the composite light to form a plurality of monochromatic lights;
(b) receiving the monochromatic light by a light sensing surface of a light sensing element to respectively obtain images of the monochromatic light with different colors, wherein the light sensing element correspondingly receives at least one red light, at least one green light and at least one blue light in the monochromatic light, and the red light, the green light and the blue light are correspondingly received by a red pixel unit, a green pixel unit and a blue pixel unit which are arranged from top to bottom according to the sequence of R-RG-RGB-GB-B; and
(c) synthesizing the images of the monochromatic light with different colors to obtain a final image of the shot object.
2. The image acquisition method as claimed in claim 1, wherein in step (b), the red light, the green light and the blue light are respectively received by a red pixel unit, a green pixel unit and a blue pixel unit of the photosensitive surface of the photosensitive element, so as to form images of the red light, the green light and the blue light respectively.
3. The image acquisition method according to claim 1, further comprising, before the step (b), a step (e): the monochromatic lights with different colors are respectively converged by a lens unit.
4. The image acquisition method according to claim 3, further comprising a step (f) after the step (e): the distance between the monochromatic lights of different colors is enlarged by an auxiliary device.
5. The image acquisition method according to claim 2, further comprising, before the step (b), a step (g): the stray light is filtered by a color filter.
6. The image acquisition method according to claim 2, further comprising, before the step (b), a step (h): rotating the light processing unit.
7. A working method of a camera module, characterized by comprising the following steps:
(a) turning a composite light from a shot object by using a light processing unit of at least one periscopic camera module, simultaneously dispersing the composite light, and forming a plurality of monochromatic lights;
(b) receiving the monochromatic light by a light sensing surface of a light sensing element to respectively obtain images of the monochromatic light with different colors, wherein the light sensing element correspondingly receives at least one red light, at least one green light and at least one blue light in the monochromatic light, and the red light, the green light and the blue light are correspondingly received by a red pixel unit, a green pixel unit and a blue pixel unit which are arranged from top to bottom according to the sequence of R-RG-RGB-GB-B; synthesizing the images of the monochromatic light with different colors to obtain an image of a shot object;
(c) acquiring the outline of the shot object; and
(d) assisting at least one main camera module to finish focusing.
8. The working method of claim 7, wherein the step (c) comprises the steps of:
(c.1) forming images of the red, green, and blue light;
(c.2) analyzing the sharpness of the red, green, and blue light; and
(c.3) determining the position of the photographed object.
9. The method of operation of claim 8, further comprising, after said step (c.3), a step (c.4): and acquiring the outline of the shot object according to the position of the shot object.
10. The operating method as claimed in claim 9, wherein the step (d) further comprises the steps of:
the periscopic camera module determines a focusing parameter according to the outline of the shot object; and
and the main camera module finishes focusing according to the focusing parameters.
11. The method of claim 10, wherein after step (d), the master camera module obtains a second image of the object.
12. The operating method as claimed in claim 11, wherein after said step (c), further comprising the step (e): obtaining a first image by the periscopic camera module; in the method, the first image and the second image are synthesized to obtain a final image of the camera module.
13. A periscopic camera module, characterized by comprising:
a light processing unit, wherein a composite light passing through the light processing unit is dispersed to form a plurality of monochromatic lights;
a lens unit, wherein the monochromatic light can be focused by the lens unit; and
a photosensitive element, wherein the light processing unit and the lens unit are held in a photosensitive path of the photosensitive element, the lens unit is disposed between the light processing unit and the photosensitive element, the monochromatic light passes through the lens unit and is received by a photosensitive surface of the photosensitive element, wherein the photosensitive surface of the photosensitive element correspondingly receives a plurality of beams of at least one red light, at least one green light and at least one blue light in the monochromatic light, wherein the photosensitive element comprises a plurality of pixel points, wherein the pixel points comprise a plurality of pixel units, and wherein the pixel units are arranged from top to bottom in an order of R-RG-RGB-GB-B.
14. The periscopic camera module of claim 13, further comprising a color filter, wherein the color filter is retained in a photosensitive path of the photosensitive element.
15. The periscopic camera module of claim 14, wherein the color filter is disposed between the lens unit and the light sensing element.
16. The periscopic camera module of claim 14, further comprising an auxiliary component, wherein the auxiliary component is disposed between the light processing unit and the lens unit, and the auxiliary component is held in a photosensitive path of the photosensitive component.
17. The periscopic camera module of claim 16, wherein said auxiliary component is a cylindrical mirror.
18. The periscopic camera module of claim 16, wherein the auxiliary device is a free-form surface mirror.
19. The periscopic camera module of any one of claims 13-18, further comprising a driving element, wherein the driving element is disposed on the light processing unit, and the light processing unit is rotatably driven by the driving element.
20. The periscopic camera module of claim 13, wherein said light processing unit is a prism.
21. A camera module, comprising:
at least one periscopic camera module according to any one of claims 13-20, wherein said periscopic camera module obtains a first image of the object; and
a main camera module, wherein the main camera module obtains a second image related to the shot object, and wherein the first image and the second image are processed to form a final image.
CN201811465390.4A 2018-12-03 2018-12-03 Camera module, periscopic camera module, image acquisition method and working method Active CN111258166B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811465390.4A CN111258166B (en) 2018-12-03 2018-12-03 Camera module, periscopic camera module, image acquisition method and working method
PCT/CN2019/113351 WO2020114144A1 (en) 2018-12-03 2019-10-25 Camera module, periscope camera module thereof, image obtaining method and operating method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811465390.4A CN111258166B (en) 2018-12-03 2018-12-03 Camera module, periscopic camera module, image acquisition method and working method

Publications (2)

Publication Number Publication Date
CN111258166A CN111258166A (en) 2020-06-09
CN111258166B true CN111258166B (en) 2021-11-05

Family

ID=70948776

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811465390.4A Active CN111258166B (en) 2018-12-03 2018-12-03 Camera module, periscopic camera module, image acquisition method and working method

Country Status (2)

Country Link
CN (1) CN111258166B (en)
WO (1) WO2020114144A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114338997B (en) * 2021-12-30 2024-02-27 维沃移动通信有限公司 Image pickup module, image pickup method, image pickup device and electronic equipment

Family Cites Families (9)

Publication number Priority date Publication date Assignee Title
IL188169A (en) * 2006-12-18 2011-06-30 Visionsense Ltd High resolution endoscope
JP5075795B2 (en) * 2008-11-14 2012-11-21 株式会社東芝 Solid-state imaging device
CN103155543B (en) * 2010-10-01 2014-04-09 富士胶片株式会社 Imaging device
US9258468B2 (en) * 2012-02-15 2016-02-09 Fluxdata, Inc. Method and apparatus for separate spectral imaging and sensing
CN108663775A (en) * 2017-04-01 2018-10-16 华为技术有限公司 A kind of camera lens module and terminal
CN107121783A (en) * 2017-06-23 2017-09-01 中山联合光电科技股份有限公司 A kind of imaging system
CN107948470B (en) * 2017-11-22 2020-07-10 德淮半导体有限公司 Camera module and mobile device
CN207835595U (en) * 2017-12-14 2018-09-07 信利光电股份有限公司 A kind of dual camera module and terminal
CN108810386A (en) * 2018-08-02 2018-11-13 Oppo广东移动通信有限公司 Camera module, CCD camera assembly and electronic device

Also Published As

Publication number Publication date
WO2020114144A1 (en) 2020-06-11
CN111258166A (en) 2020-06-09

Similar Documents

Publication Publication Date Title
CN105578019B (en) Image extraction system capable of obtaining depth information and focusing method
CN105590939B (en) Imaging sensor and output method, phase focusing method, imaging device and terminal
US9398272B2 (en) Low-profile lens array camera
EP3480648B1 (en) Adaptive three-dimensional imaging system
JP2018077190A (en) Imaging apparatus and automatic control system
JP2008294819A (en) Image pick-up device
CN105282443A (en) Method for imaging full-field-depth panoramic image
WO2017155622A1 (en) Phase detection autofocus using opposing filter masks
CN107948470B (en) Camera module and mobile device
US20140085422A1 (en) Image processing method and device
US9148552B2 (en) Image processing apparatus, image pickup apparatus, non-transitory storage medium storing image processing program and image processing method
WO2011095026A1 (en) Method and system for photography
JP7378219B2 (en) Imaging device, image processing device, control method, and program
US20210211580A1 (en) Phase detection autofocus (pdaf) optical system
US20060092314A1 (en) Autofocus using a filter with multiple apertures
CN110998228A (en) Distance measuring camera
JP2006126652A (en) Imaging apparatus
JP5348258B2 (en) Imaging device
CN111258166B (en) Camera module, periscopic camera module, image acquisition method and working method
CN106934349A (en) Dual camera is imaged and iris capturing identification integration apparatus
JP6756898B2 (en) Distance measuring device, head-mounted display device, personal digital assistant, video display device, and peripheral monitoring system
CN105979128A (en) Image acquisition device and panoramic camera
US20210208482A1 (en) Phase detection autofocus (pdaf) optical system
JP2019016975A (en) Image processing system and image processing method, imaging apparatus, program
CN114080565B (en) System and method for obtaining ultra-macro images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant