CN114302046A - Camera device applied under screen - Google Patents
- Publication number
- CN114302046A CN114302046A CN202111678388.7A CN202111678388A CN114302046A CN 114302046 A CN114302046 A CN 114302046A CN 202111678388 A CN202111678388 A CN 202111678388A CN 114302046 A CN114302046 A CN 114302046A
- Authority
- CN
- China
- Prior art keywords
- unit
- lenses
- applied under
- camera
- units
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses a camera device applied under a screen, and relates to the technical field of camera devices. The device includes a group of conventional array first lenses, at least one group of diffractive second lenses, and a sensor matched with the second lenses. A diffractive optical element (DOE) is a phase element whose microstructure changes the phase of light propagating through it; with a suitable microstructure design it can produce a spot output of almost any desired intensity distribution or shape. A DOE is generally designed for one specific wavelength. A diffractive optical lens (DOE lens) has a concentric microstructure that grows denser farther from the center of the optical axis; it behaves like an ordinary lens in that it has a focal length and can form images, but differs in that its chromatic dispersion is large, so different wavelengths have markedly different focal lengths.
Description
Technical Field
The invention belongs to the technical field of camera devices, and particularly relates to a camera device applied under a screen.
Background
An image pickup device records pictures. Shot types include long shot, panorama, medium shot, close-up, and macro. Camera technique covers lens movements such as pushing, pulling, panning, tracking and following, and lens transitions such as fade-out, fade-in, cut, and superimposition; the lens frame is kept aligned with the subject from the wide-angle stage at the start of a movement to the telephoto stage at its end, while the focal length of the lens is changed or the camera position is moved.
For a diffractive lens, the difference in focal length between wavelengths is much larger than for an ordinary lens, so the imaging planes of the different wavelengths lie relatively far apart. For general applications the image therefore suffers severe chromatic aberration, and such a lens cannot be used on its own.
Disclosure of Invention
The invention aims to provide a camera device applied under a screen, which solves the technical problem that the existing camera device has serious chromatic aberration in imaging.
In order to achieve the purpose, the invention is realized by the following technical scheme:
a camera device applied under a screen comprises a group of traditional array first lenses, at least one group of diffraction second lenses and a sensor matched with the second lenses, wherein the difference of the wavelength focal lengths of the second lenses is larger than that of the first lenses, and the imaging colors on the sensor have different definition degrees;
the acquisition unit is used for acquiring a target object from a shooting picture displayed on a display screen;
the synthesis unit is used for obtaining a high-resolution image by synthesis after the acquisition unit acquires the target object;
and the control unit is used for acquiring the voice information of the user in the camera switching mode and identifying the camera number in the voice information.
Optionally, the second lens is composed of first units and a second unit, where the second unit is neither connected to nor overlapped with the first units. In planar geometry the area of the second unit is smaller than that of a first unit, and in thickness the second unit is thinner than the imaging unit. In one arrangement the second unit is located among four first units arranged in a square; in another it is located among three first units arranged in a regular triangle; in a further arrangement the second unit and the imaging units are placed randomly.
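The square and regular-triangle arrangements above can be sketched as unit-center coordinates. This is a minimal illustration with coordinates of my own choosing (the patent gives no dimensions); in both variants the diffractive second unit sits at the centroid of the surrounding imaging first units.

```python
import math

# Illustrative coordinates only -- the patent specifies no dimensions.
# Square variant: one diffractive second unit among four imaging first units.
first_units = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
second_unit = (0.5, 0.5)

cx = sum(x for x, _ in first_units) / len(first_units)
cy = sum(y for _, y in first_units) / len(first_units)
assert (cx, cy) == second_unit  # second unit sits at the centroid of the square

# Regular-triangle variant: three first units, second unit at the centroid.
tri = [(0.0, 0.0), (1.0, 0.0), (0.5, math.sqrt(3.0) / 2.0)]
tcx = sum(x for x, _ in tri) / 3.0
tcy = sum(y for _, y in tri) / 3.0
dists = [math.hypot(x - tcx, y - tcy) for x, y in tri]
assert max(dists) - min(dists) < 1e-9  # equidistant from all three first units
```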
Optionally, the control unit is configured to obtain the offset length between the center point of the spacer width and the center point of the unit width, to calculate from a display-screen operation the direction and angle through which the camera needs to rotate, and to control the camera to rotate accordingly; the acquisition unit is configured to calculate, from a touch operation on the display screen, the direction and distance by which the shot picture needs to move.
The embodiment of the invention has the following beneficial effects:
one embodiment of the present invention is a Diffractive Optical Element (DOE) which is a phase element that changes the phase of light propagating through it by microstructure design, and can obtain a spot output of almost any desired intensity distribution or shape as long as the microstructure design is correct, and in general, DOEs are designed for a specific wavelength, and a diffractive optical lens (DOELens) whose microstructure design is concentric distribution, the farther from the center of the optical axis the more dense, has the function of a general lens, has a focal length, can image, and is different from a general lens in that chromatic dispersion is large, and different wavelengths have different focal length values, and are greatly different.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a first diagram of object, image and focal lengths according to an embodiment of the present invention;
FIG. 2 is a second schematic diagram of object distance, image distance and focal length according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a conventional lens structure according to an embodiment of the invention;
FIG. 4 is a schematic bottom view of a sensor according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a DOE lens structure according to an embodiment of the present invention;
FIG. 6 is a schematic side view of a sensor according to an embodiment of the present invention;
FIG. 7 is a first schematic view of a conventional lens and DOE lens arrangement in accordance with one embodiment of the present invention;
FIG. 8 is a schematic diagram of a second arrangement of conventional lenses and DOE lenses according to one embodiment of the present invention;
FIG. 9 is a third schematic view of a conventional lens and DOE lens arrangement according to an embodiment of the present invention;
FIG. 10 is a schematic diagram of a first array of imaging units and diffraction units according to one embodiment of the present invention;
FIG. 11 is a diagram illustrating a second array of imaging units and diffractive units according to an embodiment of the present invention;
FIG. 12 is a diagram illustrating a third array of imaging units and diffractive units according to an embodiment of the present invention;
FIG. 13 is a diagram illustrating a fourth array of imaging units and diffraction units according to one embodiment of the present invention.
Wherein the figures include the following reference numerals:
a first unit 1 and a second unit 2.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
To keep the following description of the embodiments of the present invention clear and concise, detailed descriptions of known functions and known components have been omitted.
Referring to fig. 1 to 13, in the present embodiment, an image capturing apparatus applied under a screen is provided, including: the array comprises a group of traditional array first lenses, at least one group of diffraction second lenses and a sensor matched with the second lenses, wherein the difference of the wavelength focal lengths of the second lenses is larger than that of the first lenses, and the imaging colors on the sensor have different definition degrees;
the acquisition unit is used for acquiring a target object from a shooting picture displayed on a display screen;
the synthesis unit is used for obtaining a high-resolution image by synthesis after the acquisition unit acquires the target object;
and the control unit is used for acquiring the voice information of the user in the camera switching mode and identifying the camera number in the voice information.
When the image distances l' are the same, the RGB wavelengths correspond to different object distances lr, lg, lb; as shown in the left diagram, the three RGB colors can then be imaged simultaneously for three object distances. As mentioned above, the focal-length difference between wavelengths of a DOE lens is much larger than that of an ordinary lens, so the corresponding object-distance differences are also much larger, and the distance of an object can be calculated by judging how blurred each color image of the object is. In other words, without performing focusing, the absolute and relative distances of all subjects in the picture can be calculated just by transmitting the differently colored photographs.
A diffractive optical element (DOE) is a phase element whose microstructure changes the phase of light propagating through it; with a suitable microstructure design it can produce a spot output of almost any desired intensity distribution or shape, and it is generally designed for one specific wavelength. A diffractive optical lens (DOE lens) has a concentric microstructure that grows denser farther from the center of the optical axis; like an ordinary lens it has a focal length and can form images, but it is strongly dispersive, so different wavelengths have markedly different focal lengths.
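The concentric microstructure that "grows denser farther from the axis" can be illustrated with the standard Fresnel zone-plate approximation, in which the n-th zone boundary lies at r_n = sqrt(n·λ·f). This is a generic textbook relation, not a design taken from the patent; the wavelength and focal length below are illustrative values.

```python
import math

def zone_radii(wavelength_m, focal_length_m, n_zones):
    """Radii of Fresnel zone boundaries for a diffractive lens.

    Standard zone-plate approximation: r_n = sqrt(n * lambda * f).
    Successive ring spacings shrink with radius, i.e. the
    microstructure gets denser away from the optical axis.
    """
    return [math.sqrt(n * wavelength_m * focal_length_m)
            for n in range(1, n_zones + 1)]

# Green light (550 nm), 5 mm focal length -- plausible but assumed values.
radii = zone_radii(550e-9, 5e-3, 5)
spacings = [b - a for a, b in zip(radii, radii[1:])]
assert all(s2 < s1 for s1, s2 in zip(spacings, spacings[1:]))  # denser outward
```

Because the zone radii scale with sqrt(λ), the same microstructure focuses different wavelengths at different distances, which is exactly the strong dispersion the text describes.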
Referring to fig. 1, the second lens of the present embodiment is composed of a first unit 1 and a second unit 2, where the second unit 2 is neither connected to nor overlapped with the first unit 1. In planar geometry the area of the second unit 2 is smaller than that of the first unit 1, and in thickness the second unit 2 is thinner than the imaging unit. The second unit 2 may be located among four first units 1 arranged in a square, or among three first units 1 arranged in a regular triangle, or the second unit 2 and the imaging units may be arranged randomly. The focal-length difference between wavelengths of a DOE lens is larger than that of an ordinary lens, so the imaging-plane positions of the wavelengths lie relatively far apart, the image has severe chromatic aberration for general applications, and the lens cannot be used alone. With f'r, f'g and f'b denoting the RGB focal lengths, the Gaussian imaging formula is

1/l' − 1/l = 1/f'
where l is the object distance, l' the image distance, and f' the focal length. When the object distance l is the same, the focal lengths f'r, f'g, f'b of the RGB wavelengths give different image distances l'r, l'g, l'b, so a single imaging position suffers severe chromatic aberration. Conversely, when the image distance l' is the same, the RGB wavelengths correspond to different object distances lr, lg, lb; as shown in the left figure, the three RGB colors can then be imaged simultaneously for three object distances. As mentioned above, the per-wavelength focal-length differences of a DOE lens are much larger than those of an ordinary lens, so the object-distance differences are correspondingly much larger, and the distance of an object can be calculated by judging how blurred each color image of the object is. In other words, without focusing, the absolute and relative distances of all subjects in the picture can be calculated just by transmitting the differently colored photographs.
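The per-wavelength imaging argument above can be sketched numerically with the Gaussian formula. The per-channel focal lengths below are hypothetical values chosen only to illustrate a strongly dispersive DOE lens; the patent does not give numbers.

```python
def image_distance(f_prime, l):
    """Solve 1/l' - 1/l = 1/f' for l' (Gaussian sign convention:
    the object distance l is negative for a real object)."""
    return 1.0 / (1.0 / f_prime + 1.0 / l)

# Hypothetical per-channel focal lengths (illustrative, not from the patent).
f_r, f_g, f_b = 5.2e-3, 5.0e-3, 4.8e-3   # metres
l = -0.5                                  # object 0.5 m in front of the lens

l_r = image_distance(f_r, l)
l_g = image_distance(f_g, l)
l_b = image_distance(f_b, l)

# Each colour focuses at a different image distance, so at any single
# sensor position only one channel is sharpest; comparing per-channel
# blur therefore encodes the object distance.
assert l_r > l_g > l_b
```

Inverting the same relation per channel is what lets the device infer object distance from which color channel is in best focus.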
A conventional lens array with a 3D stereoscopic imaging function is established technology: similar to the triangulation principle of binocular vision, the image formed by each lens is angularly offset, so the distance of an object can be calculated and a 3D stereoscopic image constructed. Replacing the conventional lens array with a DOE lens array keeps the original 3D stereoscopic imaging function and adds the ranging function of focusing different colors at different object distances, which enhances the reproduction accuracy of the 3D stereoscopic image. In the preferred embodiment, the first unit (1) is an imaging unit, namely a micro lens; the second unit (2) is a diffraction unit, which may be a diffractive optical element (DOE), a Fresnel lens, or a metalens.
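The binocular-style triangulation mentioned above reduces to the standard relation Z = f·B/d between depth, focal length, lens baseline, and disparity. A minimal sketch with assumed numbers (the patent specifies none):

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Binocular triangulation: Z = f * B / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- distance between two lens centres in the array
    disparity_px -- horizontal shift of a feature between the two views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers (not from the patent): 800 px focal length,
# 10 mm baseline between array lenses, 16 px disparity.
z = depth_from_disparity(800, 0.010, 16)
assert abs(z - 0.5) < 1e-9  # 0.5 m
```

In the DOE-array variant this geometric depth estimate can be cross-checked against the chromatic-defocus ranging described earlier, which is the accuracy gain the text claims.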
The above embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and claims of this application and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the application described herein are capable of operation in sequences other than those illustrated or described herein.
In the description of the present invention, it is to be understood that the orientation or positional relationship indicated by the orientation words such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom", etc. are usually based on the orientation or positional relationship shown in the drawings, and are only for convenience of description and simplicity of description, and in the case of not making a reverse description, these orientation words do not indicate and imply that the device or element being referred to must have a specific orientation or be constructed and operated in a specific orientation, and therefore, should not be considered as limiting the scope of the present invention; the terms "inner and outer" refer to the inner and outer relative to the profile of the respective component itself.
Claims (10)
1. An image pickup apparatus applied under a screen, comprising:
the array comprises a group of traditional array first lenses, at least one group of diffraction second lenses and a sensor matched with the second lenses, wherein the difference of the wavelength focal lengths of the second lenses is larger than that of the first lenses, and the imaging colors on the sensor have different definition degrees;
the acquisition unit is used for acquiring a target object from a shooting picture displayed on a display screen;
the synthesis unit is used for obtaining a high-resolution image by synthesis after the acquisition unit acquires the target object;
and the control unit is used for acquiring the voice information of the user in the camera switching mode and identifying the camera number in the voice information.
2. The device as claimed in claim 1, wherein the second lens is composed of a first unit (1) and a second unit (2), and the second unit (2) is not connected to the first unit (1), and the second unit (2) is not overlapped with the first unit (1).
3. An image pick-up device applied under a screen according to claim 2, characterized in that the second unit (2) has a smaller area than the first unit (1) in terms of planar geometry.
4. An image pick-up device applied under a screen according to claim 2, characterized in that the thickness of the second unit (2) is smaller than the thickness of the imaging unit in terms of thickness geometry.
5. An image pick-up device applied under a screen according to claim 2, characterized in that the second unit (2) is located in four first units (1), and the four first units (1) are arranged in a square.
6. An image pick-up device applied under a screen according to claim 2, characterized in that the second unit (2) is located in three first units (1), and the three first units (1) are arranged in a regular triangle.
7. An image pickup apparatus applied under a screen according to claim 4, wherein the second unit (2) and the imaging unit are arranged randomly.
8. The image pickup apparatus applied under a screen according to claim 1, wherein the control unit is configured to acquire an offset length between a center point of the space width and a center point of the cell width.
9. The camera device as claimed in claim 1, wherein the control unit calculates the direction and angle of the rotation of the camera according to the operation of the display screen and controls the camera to rotate accordingly.
10. The image pickup apparatus as claimed in claim 1, wherein the acquisition unit is configured to calculate a direction and a distance of movement required for the photographing picture according to a touch operation on the display screen.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111678388.7A CN114302046A (en) | 2021-12-31 | 2021-12-31 | Camera device applied under screen |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202111678388.7A CN114302046A (en) | 2021-12-31 | 2021-12-31 | Camera device applied under screen |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114302046A true CN114302046A (en) | 2022-04-08 |
Family
ID=80976259
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202111678388.7A Pending CN114302046A (en) | 2021-12-31 | 2021-12-31 | Camera device applied under screen |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114302046A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102494771A (en) * | 2011-12-14 | 2012-06-13 | 中国科学院光电研究院 | Diffractive optical imaging system and imaging spectrometer comprising same |
WO2014187187A1 (en) * | 2013-05-20 | 2014-11-27 | 爱佩仪光电技术(深圳)有限公司 | Method for realizing tilt-shift photography and three-dimensional multi-area auto-focus via touch screen operation |
US20150229815A1 (en) * | 2014-02-07 | 2015-08-13 | Olympus Corporation | Imaging system, display system, and optical device |
CN107370945A (en) * | 2017-07-25 | 2017-11-21 | 广东虹勤通讯技术有限公司 | A kind of camera control method and shooting head controlling device |
CN111552138A (en) * | 2020-05-29 | 2020-08-18 | Oppo广东移动通信有限公司 | Under-screen camera, imaging method and terminal |
CN112202991A (en) * | 2020-09-17 | 2021-01-08 | 欧菲微电子技术有限公司 | Camera module, electronic equipment, optical element and preparation method of camera module |
CN112272271A (en) * | 2020-09-30 | 2021-01-26 | 富盛科技股份有限公司 | Camera control method and device |
CN113067961A (en) * | 2020-10-19 | 2021-07-02 | 上海鲲游科技有限公司 | Under-screen camera imaging system, phase compensation element thereof and manufacturing method |
- 2021-12-31: application CN202111678388.7A filed in CN, published as CN114302046A, status Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6027560B2 (en) | Automatic tracking imaging device | |
JP6641470B2 (en) | Stereo camera and stereo camera control method | |
US7397501B2 (en) | Digital camera with viewfinder designed for improved depth of field photographing | |
JP5943785B2 (en) | IMAGING DEVICE, IMAGING SYSTEM, IMAGE PROCESSING DEVICE, AND IMAGING DEVICE CONTROL METHOD | |
CN111580237A (en) | Electronic device and control method thereof | |
JP2010008873A (en) | Focus detecting device and imaging device | |
JP5617157B2 (en) | Focus detection apparatus and imaging apparatus | |
JP4995002B2 (en) | Imaging device, focusing device, imaging method, and focusing method | |
JP2017107204A (en) | Information processing apparatus and information processing method | |
JP4693727B2 (en) | 3D beam input device | |
US11039117B2 (en) | Dual lens imaging module and capturing method thereof | |
JP5116168B2 (en) | Stereoscopic video lens system | |
JPH1172717A (en) | Microscopic digital photographing system | |
CN101395519B (en) | 3-D photographing lens system | |
CN114302046A (en) | Camera device applied under screen | |
JP5240591B2 (en) | Imaging device, focusing device, imaging method, and focusing method | |
KR101608404B1 (en) | Single lens Microscope for three dimensional image | |
JP2010249965A (en) | Method of photographing different focal point images by using optical element, and device for the same | |
CN111095072A (en) | Optical system, projection device, and imaging device | |
JP4740477B2 (en) | Stereoscopic imaging adapter lens and stereoscopic imaging system | |
TW202125082A (en) | Dual lens imaging module and capturing method thereof | |
TWI747038B (en) | Dual lens imaging module and capturing method thereof | |
KR101957357B1 (en) | Multiscale Imaging system using one mirror | |
Oberdörster et al. | Folded multi-aperture camera system for thin mobile devices | |
CN113826036A (en) | Optical zoom system and camera of mobile device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||