CN211152041U - Electronic equipment and camera assembly thereof - Google Patents

Electronic equipment and camera assembly thereof

Info

Publication number
CN211152041U
Authority
CN
China
Prior art keywords
light
prism
sensing chip
refracted
camera assembly
Prior art date
Legal status
Active
Application number
CN202020211016.8U
Other languages
Chinese (zh)
Inventor
姚坤
Current Assignee
Realme Chongqing Mobile Communications Co Ltd
Original Assignee
Realme Chongqing Mobile Communications Co Ltd
Priority date
Filing date
Publication date
Application filed by Realme Chongqing Mobile Communications Co Ltd
Priority to CN202020211016.8U
Application granted
Publication of CN211152041U
Legal status: Active
Anticipated expiration

Landscapes

  • Studio Devices (AREA)

Abstract

The application discloses an electronic device and a camera assembly thereof. The camera assembly includes a lens, a beam splitting prism, and a light sensing unit. The beam splitting prism is arranged downstream of the lens and has a first prism surface and a second prism surface; incident light entering through the lens is divided into reflected light and refracted light at the first prism surface, and the refracted light travels on to the second prism surface. The light sensing unit is arranged downstream of the beam splitting prism and includes a first light sensing chip and a second light sensing chip; the first light sensing chip receives the reflected light, and the second light sensing chip receives the refracted light passing through the second prism surface. Adopting a single-lens structure not only makes the camera assembly compact and reduces its volume, but also, because the optical axes of the two images coincide, improves the stability of the optical axis, reduces the workload of the image algorithm, and thereby improves the accuracy of depth-of-field calculation and the quality of background blurring in photographs.

Description

Electronic equipment and camera assembly thereof
Technical Field
The present disclosure relates to electronic devices, and particularly to an electronic device and a camera assembly thereof.
Background
With the development of science and technology, electronic devices such as smartphones have gradually become necessities of everyday life.
A dual-camera structure imitates the parallax between the left and right eyes of a human and measures the depth-of-field information of an object by the principle of triangulation, thereby enabling functions such as 3D modeling and background blurring. To ensure the accuracy of the depth information, the relative positions of the two cameras must remain fixed, which places high demands on the assembly process. Moreover, if the device is dropped or bumped during use, the relative positions of the optical axes of the two cameras easily change and deviate from the data calibrated at the factory, degrading the background blurring effect when photographing.
Summary of the Utility Model
In one aspect, an embodiment of the present application provides a camera assembly, which includes: a lens; a beam splitting prism arranged downstream of the lens, the beam splitting prism having a first prism surface and a second prism surface, incident light entering through the lens being divided into reflected light and refracted light at the first prism surface, with the refracted light traveling on to the second prism surface; and a light sensing unit arranged downstream of the beam splitting prism, the light sensing unit including a first light sensing chip for receiving the reflected light and a second light sensing chip for receiving the refracted light passing through the second prism surface.
On the other hand, an embodiment of the present application also provides an electronic device. The electronic device includes a housing and the camera assembly described above; the camera assembly is arranged in the housing, and a light-collecting opening corresponding to the lens is formed in the housing.
In the embodiments of the present application, a beam splitting prism is provided, and incident light entering from a single lens is reflected and refracted by the beam splitting prism to form an image corresponding to the reflected light and an image corresponding to the refracted light; functions such as 3D modeling and background blurring are realized from these two images. Adopting a single-lens structure not only makes the camera assembly compact and reduces its volume, but also, because the optical axes of the two images coincide, improves the stability of the optical axis, reduces dynamic changes in the optical-axis included angle during collisions or drops, reduces the workload of the image algorithm, and thereby improves the accuracy of depth-of-field calculation and the quality of background blurring in photographs.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings from them without inventive effort, wherein:
fig. 1 is a schematic diagram illustrating a principle of imaging using a dual-camera structure in the related art;
FIG. 2 is a schematic diagram of a camera assembly according to an embodiment of the present application;
FIG. 3 is a schematic view of a camera assembly according to another embodiment of the present application;
FIG. 4 is a schematic structural diagram of a camera assembly according to yet another embodiment of the present application;
FIG. 5 is a schematic view of a camera assembly according to another embodiment of the present application;
FIG. 6 is a schematic structural diagram of a camera assembly according to yet another embodiment of the present application;
fig. 7 is an exploded schematic view of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the specific embodiments described herein are merely illustrative of the application and are not limiting of the application. It should be further noted that, for the convenience of description, only some of the structures related to the present application are shown in the drawings, not all of the structures. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second", etc. in this application are used to distinguish between different objects and not to describe a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Referring to fig. 1, fig. 1 is a schematic diagram illustrating the imaging principle of a dual-camera structure in the related art. The dual-camera structure generally comprises a main camera 1 and an auxiliary camera 2; the main camera 1 mainly implements the photographing function, and the auxiliary camera 2 mainly provides depth-of-field information. When shooting, the main camera 1 and the auxiliary camera 2 capture a picture of the same scene at the same time, an image algorithm searches for the corresponding positions of the same object from the actual scene in the main image and the auxiliary image, and the depth-of-field information is calculated by triangulation.
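As background for this triangulation step, a minimal depth-from-disparity sketch is shown below. It assumes an idealized, rectified stereo pair with a known focal length (in pixels) and baseline (in meters); the function name and the numbers in the example are illustrative assumptions, not figures from the patent.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate object distance from the pixel disparity between two rectified views.

    Classic pinhole triangulation: Z = f * b / d. Assumes the two optical axes are
    parallel and the images are rectified; disparity is the horizontal shift (in pixels)
    of the same object point between the main and auxiliary images.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive; zero disparity means infinite depth")
    return focal_px * baseline_m / disparity_px


# Example: a focal length of ~2800 px, a 10 mm baseline, and a measured disparity
# of 14 px give a depth of about 2 m.
print(depth_from_disparity(focal_px=2800.0, baseline_m=0.010, disparity_px=14.0))
```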
Generally, the main camera 1 and the auxiliary camera 2 are fixed by a common bracket, a common substrate, or the structure of the mobile phone itself, and these fixing methods tend to make the camera structure relatively large. Moreover, because the optical axes of the main camera 1 and the auxiliary camera 2 do not coincide, there is a distance and an included angle between them, and meeting the requirements of the algorithm generally places high demands on the assembly process. If the device is dropped or bumped during use, the relative positions of the two optical axes easily change and deviate from the data calibrated at the factory, degrading the background blurring effect when photographing.
In addition, because the centers of the two cameras do not coincide, the relative positions of the same object in the two pictures differ. In the process of calculating the depth-of-field information, finding the corresponding points of the same object in the two images requires a complex image algorithm, and the precision of that algorithm directly affects the precision of the depth-of-field information, which in turn determines the accuracy of 3D modeling and the quality of background blurring.
Referring to fig. 2, fig. 2 is a schematic structural diagram of a camera assembly according to an embodiment of the present application. The camera assembly 100 includes a lens 120, a beam splitter prism 140, and a light sensing unit 160.
The beam splitting prism 140 is disposed downstream of the lens 120 to receive the incident light 112. The beam splitting prism 140 has a first prism surface 142 and a second prism surface 144; the incident light 112 enters through the lens 120 and strikes the first prism surface 142, where part of it is reflected to form the reflected light 114 and part of it is refracted to form the refracted light 116. The refracted light 116 travels inside the beam splitting prism 140 to the second prism surface 144 and exits through it.
The incident light 112 incident through the lens 120 may directly irradiate the beam splitter prism 140, or at least one reflecting element may be disposed between the lens 120 and the beam splitter prism 140, so as to reflect the incident light 112 incident through the lens 120 to the beam splitter prism 140.
The photosensitive unit 160 is disposed downstream of the beam splitter prism 140, and is configured to receive the light passing through the beam splitter prism 140 for imaging. In the present embodiment, the photosensitive unit 160 includes a first photosensitive chip 162 and a second photosensitive chip 164. The first photo-sensing chip 162 is used for receiving the reflected light 114, and the second photo-sensing chip 164 is used for receiving the refracted light 116 passing through the second prism surface 144.
Specifically, as shown in fig. 2, the first photosensitive chip 162 is disposed opposite to the first prism surface 142 at an interval, the second photosensitive chip 164 is disposed opposite to the second prism surface 144 at an interval, the reflected light 114 reflected by the first prism surface 142 is incident to the first photosensitive chip 162, and the first photosensitive chip 162 receives the reflected light 114 and performs imaging. The refracted light 116 refracted by the first prism surface 142 enters the second prism surface 144, is refracted by the second prism surface 144 and then enters the second photosensitive chip 164, and the second photosensitive chip 164 receives the refracted light 116 and performs imaging.
In the embodiment of the present application, the beam splitting prism 140 is provided, and it reflects and refracts the incident light 112 entering from the single lens 120 to form an image corresponding to the reflected light 114 and an image corresponding to the refracted light 116; functions such as 3D modeling and background blurring are realized from these two images. Adopting a single-lens structure not only makes the camera assembly 100 compact and reduces its volume, but also, because the optical axes of the two images coincide, improves the stability of the optical axis, reduces dynamic changes in the optical-axis included angle during collisions or drops, reduces the workload of the image algorithm, and thereby improves the accuracy of depth-of-field calculation and the quality of background blurring in photographs.
In the present embodiment, the reflected light 114 is incident directly on the first photosensitive chip 162, and the refracted light 116 is incident directly on the second photosensitive chip 164 after passing through the second prism surface 144. This arrangement reduces the transmission loss of the light, thereby increasing the light intensity incident on the first photosensitive chip 162 and the second photosensitive chip 164 and improving the detection precision.
Alternatively, in another embodiment, at least one reflecting member may be disposed between the first prism surface 142 and the first photosensitive chip 162 to reflect the reflected light 114 onto the first photosensitive chip 162. Likewise, at least one reflecting member may be disposed between the second prism surface 144 and the second photosensitive chip 164 to reflect the refracted light 116 onto the second photosensitive chip 164. By arranging at least one reflecting member between the first prism surface 142 and the first photosensitive chip 162 and/or between the second prism surface 144 and the second photosensitive chip 164, the positions of the first photosensitive chip 162 and the second photosensitive chip 164 relative to the beam splitting prism 140 can be adjusted flexibly, which facilitates the layout of the camera assembly 100 and improves its compatibility.
Further, the reflected light 114 is perpendicular to the plane of the first photo-sensing chip 162, and the refracted light 116 is perpendicular to the plane of the second photo-sensing chip 164.
Specifically, the angle between the first prism surface 142 and the first photosensitive chip 162 can be adjusted so that the reflected light 114 is perpendicularly incident on the first photosensitive chip 162, and the angle between the second prism surface 144 and the second photosensitive chip 164 can be adjusted so that the refracted light 116 is perpendicularly incident on the second photosensitive chip 164. This arrangement reduces the loss of the reflected light 114 and the refracted light 116, improves the imaging precision of the first photosensitive chip 162 and the second photosensitive chip 164, and thereby improves the accuracy of 3D modeling and the quality of background blurring.
In this embodiment, as shown in fig. 2, the central axis of the lens 120 is disposed along a vertical direction, and the direction of the incident light 112 is parallel to the central axis of the lens 120. The incident angle of the incident light 112 on the first prism surface 142 is 45 °, and at this time, the plane where the first photosensitive chip 162 is located is parallel to the central axis of the lens 120, so that the reflected light 114 is perpendicular to the plane where the first photosensitive chip 162 is located. The plane of the second photo-sensing chip 164 is perpendicular to the central axis of the lens 120, so that the refracted light 116 is perpendicular to the plane of the second photo-sensing chip 164.
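As a quick check on this geometry, the sketch below reflects the incident direction about the normal of a surface tilted 45° to the lens axis and confirms that the reflected ray leaves at 90° to the incoming ray, i.e. perpendicular to a chip mounted parallel to the lens axis. The vectors and helper names are illustrative assumptions, not part of the patent.

```python
import numpy as np

def reflect(direction: np.ndarray, normal: np.ndarray) -> np.ndarray:
    """Mirror a unit direction vector about a unit surface normal: d' = d - 2(d.n)n."""
    return direction - 2.0 * np.dot(direction, normal) * normal

# Incident ray travels along the (vertical) lens axis, pointing downward.
incident = np.array([0.0, -1.0])
# Prism surface tilted 45° to the lens axis; its unit normal bisects the 90° turn.
normal_45 = np.array([np.cos(np.radians(45)), np.sin(np.radians(45))])

reflected = reflect(incident, normal_45)
print(reflected)                    # -> approximately [1, 0]: a horizontal ray
print(np.dot(incident, reflected))  # -> ~0: the reflected ray is perpendicular to the incident ray
```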
Further, the incident point of the reflected light ray 114 on the first photo-sensing chip 162 may be set as the center of the first photo-sensing chip 162, and the incident point of the refracted light ray 116 on the second photo-sensing chip 164 may be set as the center of the second photo-sensing chip 164.
Specifically, the center of the first photosensitive chip 162 refers to the geometric center of the first photosensitive chip 162, and the center of the second photosensitive chip 164 refers to the geometric center of the second photosensitive chip 164. In this embodiment, the first photosensitive chip 162 and the second photosensitive chip 164 may be rectangular, in which case the center of the first photosensitive chip 162 is the intersection of its two diagonals and the center of the second photosensitive chip 164 is the intersection of its two diagonals.
By setting the incident point of the reflected light 114 on the first photosensitive chip 162 as the center of the first photosensitive chip 162 and the incident point of the refracted light 116 on the second photosensitive chip 164 as the center of the second photosensitive chip 164, the centers of the pictures captured by the first photosensitive chip 162 and the second photosensitive chip 164 coincide. This makes it simple to find the relative position of the same object point in the two pictures, improves the alignment precision of the two pictures, reduces the workload of the image algorithm, saves software computation time, and makes 3D modeling and background blurring both more accurate and faster.
Further, the camera assembly 100 may further include an optical filter 180, where the optical filter 180 is disposed on the light incident sides of the first photosensitive chip 162 and the second photosensitive chip 164 to filter stray light and scattered light, so as to avoid interference on the imaging of the first photosensitive chip 162 and the second photosensitive chip 164.
In this embodiment, the optical filter 180 may be an infrared filter, and the infrared filter is used for filtering infrared light to avoid interference on imaging.
Alternatively, in other alternative embodiments, other types of optical filters 180 may also be provided, and the embodiments of the present application are not particularly limited.
As shown in fig. 2, in the present embodiment, the number of the optical filters 180 is two, and the optical filters include a first optical filter 182 and a second optical filter 184, the first optical filter 182 is disposed between the first photosensitive chip 162 and the first prism surface 142, and the second optical filter 184 is disposed between the second photosensitive chip 164 and the second prism surface 144.
In another embodiment, as shown in fig. 3, fig. 3 is a schematic structural diagram of a camera assembly in another embodiment of the present application. The number of the optical filters 180 may also be one: a single optical filter 180 is disposed between the lens 120 and the beam splitting prism 140 to filter out stray light, scattered light, and the like from the incident light 112 before it reaches the beam splitting prism 140.
Alternatively, in another embodiment, as shown in fig. 4, fig. 4 is a schematic structural diagram of a camera assembly in yet another embodiment of the present application. The optical filter 180 may be disposed upstream of the lens 120, that is, on the side of the lens 120 facing the external environment, so that stray light or scattered light is filtered out before the external light enters the camera assembly 100. The optical filter 180 may, for example, be integrated with a cover lens of the camera assembly 100, filtering stray or scattered light from the external environment while protecting the camera assembly 100 and reducing its volume. The present embodiment does not limit the position of the optical filter 180.
Optionally, as shown in fig. 2, the camera assembly 100 may include a first antireflection film 110, and the first antireflection film 110 is disposed on the first prism face 142 for enhancing the light intensity of the refracted light 116.
Specifically, in this embodiment, a layer of the first antireflection film 110 may be covered on the surface of the first prism surface 142 to enhance the light intensity of the refracted light 116, so that the light intensities of the light reflected and refracted by the first prism surface 142 are as equal as possible, and the light intensity incident on the first photosensitive chip 162 is approximately equal to the light intensity incident on the second photosensitive chip 164, thereby reducing the detection error.
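As a rough intensity budget for this two-chip arrangement, the sketch below tracks how the split chosen at the first prism surface sets the share of light reaching each photosensitive chip. It is an idealized illustration that neglects absorption; the function and parameter names are made up, not taken from the patent.

```python
def two_chip_intensities(first_face_reflectance: float, second_face_transmittance: float = 1.0):
    """Fraction of the incoming light reaching each photosensitive chip.

    Chip 1 receives the light reflected at the first prism surface; chip 2 receives
    the refracted light after it exits through the second prism surface. Absorption
    inside the prism is neglected.
    """
    chip1 = first_face_reflectance
    chip2 = (1.0 - first_face_reflectance) * second_face_transmittance
    return chip1, chip2

# An unbalanced split leaves the two chips with very different exposures; the coating
# on the first prism surface is chosen so the effective split approaches 50/50.
print(two_chip_intensities(0.30))   # unbalanced: (0.30, 0.70)
print(two_chip_intensities(0.50))   # design target: (0.50, 0.50)
```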
The first antireflection film 110 may be made of materials and structures in the related art, and may be formed on the surface of the first prism face 142 by electroplating, coating, or the like, which is not specifically limited in this embodiment.
Optionally, the camera assembly 100 may further include a second antireflection film 130, and the second antireflection film 130 is disposed on the second prism face 144 for enhancing the emission rate of the refracted light 116.
Specifically, as shown in fig. 2, a second antireflection film 130 may be covered on the surface of the second prism surface 144 so that as much of the refracted light 116 as possible exits to the second photosensitive chip 164 for imaging, which reduces light loss, improves imaging accuracy, and reduces interference caused by reflection of the refracted light 116 inside the beam splitting prism 140.
Alternatively, in another embodiment, the angle of the second prism surface 144 relative to the refracted light 116 may be adjusted so that the refracted light 116 is perpendicularly incident on the second prism surface 144; the refracted light 116 then exits the second prism surface 144 with as little reflection loss as possible, preventing internal reflection from reducing its intensity.
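For a sense of scale, the sketch below evaluates the Fresnel power reflectance at normal incidence for an uncoated glass-air interface; the few-percent residual reflection it predicts is the kind of loss the antireflection film on an exit face is meant to suppress. The refractive-index value is a typical assumption, not a figure from the patent.

```python
def normal_incidence_reflectance(n1: float, n2: float) -> float:
    """Fresnel power reflectance at normal incidence: R = ((n1 - n2) / (n1 + n2))**2."""
    return ((n1 - n2) / (n1 + n2)) ** 2

# Typical crown glass (n ≈ 1.52) to air: about 4% of the light is reflected back
# into the prism at an uncoated exit face, even at normal incidence.
print(normal_incidence_reflectance(1.52, 1.0))   # ≈ 0.043
```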
Referring to fig. 5, fig. 5 is a schematic structural diagram of a camera assembly according to another embodiment of the present application. The structure of the camera assembly 200 in this embodiment is substantially the same as that of the camera assembly 100 shown in fig. 2; for the common parts, reference is made to the description of the above embodiment, which is not repeated here. The present embodiment differs from the above embodiment in that the beam splitter prism 240 further has a third prism surface 246 and the photosensitive unit further includes a third photosensitive chip 266.
Specifically, in the present embodiment, the first photosensitive chip 262 is disposed opposite to and spaced apart from the first prism surface 242, the second photosensitive chip 264 is disposed opposite to and spaced apart from the second prism surface 244, and the third photosensitive chip 266 is disposed opposite to and spaced apart from the third prism surface 246.
After the incident light 212 entering through the lens 220 strikes the first prism surface 242, part of it is reflected by the first prism surface 242 to form the reflected light 214, which is incident on the first photosensitive chip 262. The rest of the incident light 212 is refracted by the first prism surface 242 to form the refracted light 216, which travels inside the beam splitter prism 240 to the second prism surface 244. Part of the refracted light 216 is refracted by the second prism surface 244 to form the first refracted light 215, which exits the beam splitter prism 240 and strikes the second photosensitive chip 264. Part of the refracted light 216 is reflected by the second prism surface 244 to form the first reflected light 217, which strikes the third prism surface 246, exits there, and strikes the third photosensitive chip 266 for imaging. In this embodiment, the beam splitter prism 240 is provided with a first prism surface 242, a second prism surface 244, and a third prism surface 246, and with a first photosensitive chip 262, a second photosensitive chip 264, and a third photosensitive chip 266 corresponding to them, so that triple-camera imaging can be realized through the single lens 220, enriching the functions of the camera assembly 200.
In this embodiment, a first antireflection film 210 may be disposed on the surface of the first prism face 242 to enhance the light intensity of the refracted light 216, so that the light intensity of the refracted light 216 is twice that of the reflected light 214. A second antireflection film 230 may be disposed on the surface of the second prism face 244 so that the light intensities of the first reflected light 217 and the first refracted light 215 are approximately equal. A third antireflection film 250 may be disposed on the surface of the third prism surface 246 to enhance the emergence rate of the first reflected light 217, so that as much of it as possible exits to the third photosensitive chip 266 for imaging; this reduces light loss, improves imaging accuracy, and reduces interference caused by reflection of the first reflected light 217 inside the beam splitter prism 240. In addition, the first antireflection film 210, the second antireflection film 230, and the third antireflection film 250 together make the light intensities of the photos imaged by the first photosensitive chip 262, the second photosensitive chip 264, and the third photosensitive chip 266 substantially the same, improving imaging accuracy and imaging quality.
Further, in the present embodiment, as shown in fig. 5, the camera assembly 200 may further include a third optical filter 286. The third optical filter 286 is disposed between the third photosensitive chip 266 and the third prism surface 246 to filter stray light and scattered light and avoid interference with the imaging of the third photosensitive chip 266.
Alternatively, as shown in fig. 5, the first reflected light ray 217 may be arranged perpendicular to a plane on which the third photosensitive chip 266 is located.
Specifically, the angle between the third prism surface 246 and the third photosensitive chip 266 may be adjusted so that the first reflected light 217 is perpendicularly incident on the third photosensitive chip 266. This arrangement reduces the loss of the first reflected light 217, improves the imaging accuracy of the third photosensitive chip 266, and thereby improves the accuracy of 3D modeling and the quality of background blurring.
Optionally, the incident point of the first reflected light 217 on the third photosensitive chip 266 may be set as the center of the third photosensitive chip 266, so that the centers of the photos captured by the first photosensitive chip 262, the second photosensitive chip 264, and the third photosensitive chip 266 coincide. This makes it simple to find the relative position of the same object point in the three photos, improves the alignment precision of the three photos, reduces the workload of the image algorithm, saves software computation time, and makes 3D modeling and background blurring both more accurate and faster.
Referring to fig. 6, fig. 6 is a schematic structural diagram of a camera assembly according to another embodiment of the present application. The structure of the camera assembly 300 in this embodiment is substantially the same as that of the camera assembly 200 shown in fig. 5; for the common parts, reference is made to the description of the above embodiments, which is not repeated here. The present embodiment differs from the above embodiments in that the beam splitter prism 340 further has a fourth prism surface 348 and the photosensitive unit includes a fourth photosensitive chip 368.
Specifically, in the present embodiment, the first photosensitive chip 362 is disposed opposite to and spaced from the first prism surface 342, the second photosensitive chip 364 is disposed opposite to and spaced from the second prism surface 344, the third photosensitive chip 366 is disposed opposite to and spaced from the third prism surface 346, and the fourth photosensitive chip 368 is disposed opposite to and spaced from the fourth prism surface 348.
After the incident light 312 entering through the lens 320 strikes the first prism surface 342, part of it is reflected by the first prism surface 342 to form the reflected light 314, which is incident on the first photosensitive chip 362. The rest of the incident light 312 is refracted by the first prism surface 342 to form the refracted light 316, which travels inside the beam splitter prism 340 to the second prism surface 344. Part of the refracted light 316 is refracted by the second prism surface 344 to form the first refracted light 315, which exits the beam splitter prism 340 and strikes the second photosensitive chip 364. Part of the refracted light 316 is reflected by the second prism surface 344 to form the first reflected light 317, which strikes the third prism surface 346; part of the first reflected light 317 is refracted by the third prism surface 346 to form the second refracted light 318, which exits the beam splitter prism 340 and strikes the third photosensitive chip 366, and part of the first reflected light 317 is reflected by the third prism surface 346 to form the second reflected light 319, which exits the beam splitter prism 340 and strikes the fourth photosensitive chip 368 for imaging.
In the present embodiment, the beam splitter prism 340 is provided with a first prism surface 342, a second prism surface 344, a third prism surface 346, and a fourth prism surface 348, and with a first photosensitive chip 362, a second photosensitive chip 364, a third photosensitive chip 366, and a fourth photosensitive chip 368 corresponding to them, so that quad-camera imaging can be realized through the single lens 320, enriching the functions of the camera assembly 300.
In this embodiment, a first antireflection film 310 may be disposed on the surface of the first prism face 342 to enhance the light intensity of the refracted light 316, so that the light intensity of the refracted light 316 is three times that of the reflected light 314. A second antireflection film 330 may be disposed on the surface of the second prism face 344 so that the light intensity of the first reflected light 317 is twice that of the first refracted light 315. A third antireflection film 350 may be disposed on the surface of the third prism surface 346 so that the light intensities of the second refracted light 318 and the second reflected light 319 are approximately equal. A fourth antireflection film 370 may be disposed on the surface of the fourth prism face 348 to enhance the emergence rate of the second reflected light 319, so that as much of it as possible exits to the fourth photosensitive chip 368 for imaging; this reduces light loss, improves imaging accuracy, and reduces interference caused by reflection of the second reflected light 319 inside the beam splitter prism 340. In addition, the first antireflection film 310, the second antireflection film 330, the third antireflection film 350, and the fourth antireflection film 370 together make the light intensities of the photos imaged by the first photosensitive chip 362, the second photosensitive chip 364, the third photosensitive chip 366, and the fourth photosensitive chip 368 substantially the same, improving imaging accuracy and imaging quality.
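The split ratios quoted for the three- and four-chip embodiments follow a simple pattern: to give N chips equal intensity, prism face k should send out 1/(N − k + 1) of the light reaching it and pass the rest on. A small sketch of that arithmetic follows (an illustration under lossless assumptions, not code from the patent; the function name is made up), and the same rule extends to the five- or six-face variants mentioned below.

```python
def equal_split_fractions(num_chips: int):
    """Per-face split plan for a prism feeding `num_chips` chips with equal intensity.

    Returns, for each prism face in light-path order, the fraction of the total
    incident light that leaves at that face (reaching its chip) and the fraction
    that continues inside the prism to the next face. Losses are neglected.
    """
    remaining = 1.0
    plan = []
    for k in range(1, num_chips + 1):
        out_share = remaining / (num_chips - k + 1)  # face k sends out 1/(N-k+1) of what reaches it
        remaining -= out_share
        plan.append((k, round(out_share, 4), round(remaining, 4)))
    return plan

# N = 3 reproduces the 1:2 split at the first face (reflected vs refracted) and the
# 1:1 split at the second face; N = 4 reproduces the 1:3 and 1:2 splits described above.
for n in (3, 4):
    print(n, equal_split_fractions(n))
```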
Further, in the present embodiment, as shown in fig. 6, the camera assembly 300 may further include a fourth optical filter 388. The fourth optical filter 388 is disposed between the fourth photosensitive chip 368 and the fourth prism surface 348 to filter stray light and scattered light and avoid interference with the imaging of the fourth photosensitive chip 368.
Alternatively, as shown in fig. 6, the second reflected light 319 may be arranged perpendicular to the plane of the fourth photosensitive chip 368.
Specifically, the angle between the fourth prism face 348 and the fourth photosensitive chip 368 can be adjusted so that the second reflected light 319 is perpendicularly incident on the fourth photosensitive chip 368. This arrangement reduces the loss of the second reflected light 319, improves the imaging precision of the fourth photosensitive chip 368, and thereby improves the accuracy of 3D modeling and the quality of background blurring.
Optionally, the incident point of the second reflected light 319 on the fourth photosensitive chip 368 may be set as the center of the fourth photosensitive chip 368, so that the centers of the photos captured by the first photosensitive chip 362, the second photosensitive chip 364, the third photosensitive chip 366, and the fourth photosensitive chip 368 coincide. This makes it simple to find the relative position of the same object point in the four photos, improves the alignment precision of the four photos, reduces the workload of the image algorithm, saves software computation time, and makes 3D modeling and background blurring both more accurate and faster.
Optionally, in other embodiments, the beam splitting prism may also have five, six, or more prism faces; these variants are arranged in the manner described in the above embodiments and are not repeated here.
Referring to fig. 7, fig. 7 is an exploded schematic view of an electronic device according to an embodiment of the present application. In this embodiment, the electronic device 400 may be a smartphone. In other embodiments, the electronic device 400 may also be a tablet computer, a palmtop computer, a smart watch, or the like.
In this embodiment, the electronic device 400 includes a housing 410 and a camera assembly 420. The camera assembly 420 is disposed in the housing 410, a light-collecting opening corresponding to the lens is formed in the housing 410, and external ambient light enters the lens through the light-collecting opening for imaging.
The structure of the camera assembly 420 in this embodiment is the same as that in the above embodiments; please refer to the description above, which is not repeated here.
The above description is only for the purpose of illustrating embodiments of the present application and is not intended to limit the scope of the present application, and all modifications of equivalent structures and equivalent processes, which are made by the contents of the specification and the drawings of the present application or are directly or indirectly applied to other related technical fields, are also included in the scope of the present application.

Claims (10)

1. A camera assembly, characterized in that the camera assembly comprises:
a lens;
a beam splitting prism arranged downstream of the lens, the beam splitting prism having a first prism surface and a second prism surface, wherein incident light entering through the lens is divided into reflected light and refracted light after striking the first prism surface, and the refracted light travels to the second prism surface; and
a light sensing unit arranged downstream of the beam splitting prism, the light sensing unit comprising a first light sensing chip and a second light sensing chip, wherein the first light sensing chip is configured to receive the reflected light and the second light sensing chip is configured to receive the refracted light passing through the second prism surface.
2. The camera assembly of claim 1, wherein the reflected light is perpendicular to a plane of the first photo-sensing chip and the refracted light is perpendicular to a plane of the second photo-sensing chip.
3. The camera assembly of claim 1, wherein the incident point of the reflected light on the first photo-sensing chip is a center of the first photo-sensing chip, and the incident point of the refracted light on the second photo-sensing chip is a center of the second photo-sensing chip.
4. The camera assembly of claim 1, wherein the camera assembly comprises an optical filter disposed on the light incident sides of the first photo-sensing chip and the second photo-sensing chip.
5. The camera assembly of claim 4, wherein the optical filter comprises a first optical filter and a second optical filter, the first optical filter is disposed between the first photo-sensing chip and the first prism face, and the second optical filter is disposed between the second photo-sensing chip and the second prism face.
6. The camera assembly of claim 1, wherein the camera assembly comprises a first antireflection film disposed on the first prism face for enhancing the light intensity of the refracted light.
7. The camera assembly of claim 1, wherein the camera assembly comprises a second antireflection film disposed on the second prism face for enhancing the emergence rate of the refracted light.
8. The camera assembly of claim 1, wherein the beam splitter prism has a third prism face, the light sensing unit includes a third light sensing chip, the refracted light is incident on the second prism face and is divided into a first sub-reflected light and a first sub-refracted light, the second light sensing chip is configured to receive the first sub-refracted light, and the third light sensing chip is configured to receive the first sub-reflected light passing through the third prism face.
9. The camera assembly of claim 8, wherein the beam splitter prism has a fourth prism surface, the light sensing unit includes a fourth light sensing chip, the first sub-reflected light is incident on the third prism surface and is divided into a second sub-reflected light and a second sub-refracted light, the third light sensing chip is configured to receive the second sub-refracted light, and the fourth light sensing chip is configured to receive the second sub-reflected light.
10. An electronic device, comprising a housing and the camera assembly according to any one of claims 1 to 9, the camera assembly being disposed in the housing, and a light-collecting opening corresponding to the lens being formed in the housing.
CN202020211016.8U 2020-02-25 2020-02-25 Electronic equipment and camera assembly thereof Active CN211152041U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202020211016.8U CN211152041U (en) 2020-02-25 2020-02-25 Electronic equipment and camera assembly thereof

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202020211016.8U CN211152041U (en) 2020-02-25 2020-02-25 Electronic equipment and camera assembly thereof

Publications (1)

Publication Number Publication Date
CN211152041U true CN211152041U (en) 2020-07-31

Family

ID=71749032

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202020211016.8U Active CN211152041U (en) 2020-02-25 2020-02-25 Electronic equipment and camera assembly thereof

Country Status (1)

Country Link
CN (1) CN211152041U (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112295953A (en) * 2020-10-14 2021-02-02 合肥泰禾光电科技股份有限公司 Infrared sorting machine with three light paths
WO2021052190A1 (en) * 2019-09-16 2021-03-25 RealMe重庆移动通信有限公司 Electronic apparatus
CN112600995A (en) * 2020-12-04 2021-04-02 Oppo广东移动通信有限公司 Camera assembly, calibration method thereof and electronic equipment
WO2022089113A1 (en) * 2020-11-02 2022-05-05 Oppo广东移动通信有限公司 Lens assembly, electronic device, depth detection method, and storage medium
WO2022206560A1 (en) * 2021-03-29 2022-10-06 维沃移动通信有限公司 Camera module and electronic device


Similar Documents

Publication Publication Date Title
CN211152041U (en) Electronic equipment and camera assembly thereof
WO2020057208A1 (en) Electronic device
Breitbarth et al. Measurement accuracy and dependence on external influences of the iPhone X TrueDepth sensor
TW201939087A (en) Electronic device
CN107920211A (en) A kind of photographic method, terminal and computer-readable recording medium
CN105681687B (en) Image processing apparatus and mobile camera including the same
Kawahara et al. A pixel-wise varifocal camera model for efficient forward projection and linear extrinsic calibration of underwater cameras with flat housings
CN106444042A (en) Dual-purpose display equipment for augmented reality and virtual reality, and wearable equipment
CN110213491B (en) Focusing method, device and storage medium
TW201925845A (en) Lens system, projection device, detecting module and electronic device
CN109974659A (en) A kind of embedded range-measurement system based on binocular machine vision
US20230017668A1 (en) Mobile communication terminal
WO2022089113A1 (en) Lens assembly, electronic device, depth detection method, and storage medium
CN109194851A (en) A kind of ultrashort burnt Vision imaging system of miniaturization
CN210274243U (en) Sensor with depth camera and common camera integrated
CN106839994B (en) A kind of measuring system for image
TWI565320B (en) Mixed optical device for image taking and light sensing
Zhu et al. Three-dimensional measurement of fringe projection based on the camera response function of the polarization system
US11997247B2 (en) Three-dimensional space camera and photographing method therefor
CN111953875A (en) Depth detection assembly and electronic equipment
CN217643416U (en) Electronic device
CN220568070U (en) Position detection module and device
CN109788196B (en) Electronic equipment and mobile platform
CN205537631U (en) Range finding module and three -dimensional scanning system
Yang et al. Calibration of photometric stereo point light source based on standard block

Legal Events

Date Code Title Description
GR01 Patent grant