CN112954153B - Camera device, electronic equipment, depth of field detection method and depth of field detection device


Info

Publication number
CN112954153B
Authority
CN
China
Prior art keywords
polarized light
polarization
light
polarization angle
determining
Prior art date
Legal status
Active
Application number
CN202110104744.8A
Other languages
Chinese (zh)
Other versions
CN112954153A (en)
Inventor
Nie Lei (聂磊)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202110104744.8A
Publication of CN112954153A
Application granted
Publication of CN112954153B


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/50 Constructional details
    • H04N 23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B 13/00 Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B 13/18 Focusing aids
    • G03B 13/30 Focusing aids indicating depth of field
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/50 Depth or shape recovery
    • G06T 7/514 Depth or shape recovery from specularities

Abstract

The application discloses a camera device, an electronic device, a depth of field detection method and a depth of field detection device, belongs to the technical field of communication, and can solve the problem that multi-machine interference is difficult to overcome when depth of field is detected by conventional electronic equipment. The camera device comprises a transmitting component and a receiving component. The transmitting component comprises at least one transmitting partition, and each transmitting partition is provided with at least one laser used for emitting polarized light of a first specified polarization angle. The receiving component comprises a plurality of polarization detection units, each comprising a polarization filter element and a photoelectric imaging element; when the receiving component receives polarized light of a first polarization angle, that polarized light can only pass through the polarization filter element corresponding to the first polarization angle and irradiate onto the photoelectric imaging element. The scheme greatly reduces the possibility that the receiving component receives other light (i.e., reflections of light emitted by other transmitting components), thereby avoiding multi-machine interference.

Description

Camera device, electronic equipment, depth of field detection method and depth of field detection device
Technical Field
The application belongs to the technical field of communication, and particularly relates to a camera device, an electronic device, a depth of field detection method and a depth of field detection device.
Background
With the increasing popularity of Augmented Reality (AR), Virtual Reality (VR) and related three-dimensional (3D) applications, more and more equipment manufacturers equip their products with 3D cameras to support these applications. At present, the mainstream 3D camera adopts a Time of Flight (TOF) scheme: the distance from the device to a point in the environment is calculated from the flight time of the light, thereby obtaining depth of field information of the environment.
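The TOF relationship described above can be illustrated with a short sketch (illustrative only and not part of the application; the function name and example timing values are assumptions):

```python
# Minimal sketch of the TOF principle: the emitter timestamps the outgoing light,
# the receiver timestamps the reflected light, and depth is half of the round-trip
# distance travelled at the speed of light.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def tof_depth(emission_time_s: float, reflection_time_s: float) -> float:
    """Depth (in metres) of the reflecting point, from the round-trip flight time."""
    time_of_flight_s = reflection_time_s - emission_time_s
    return SPEED_OF_LIGHT_M_S * time_of_flight_s / 2.0

# A round trip of about 10 ns corresponds to a depth of roughly 1.5 m.
print(tof_depth(0.0, 10e-9))
```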
In existing schemes, when a plurality of transmitting ends work in the same environment, the light emitted by these transmitting ends is reflected by the environment and can be received by the same receiving end, producing a multi-machine interference phenomenon.
Disclosure of Invention
The embodiment of the application aims to provide a camera device, an electronic device, a depth of field detection method and a depth of field detection device, and can solve the problem that multi-machine interference is difficult to overcome when the depth of field is detected by the conventional electronic device.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides a camera apparatus, including a transmitting component and a receiving component;
the emission component comprises at least one emission subarea, and each emission subarea is provided with at least one laser; the laser is used for emitting polarized light of a first specified polarization angle;
the receiving assembly comprises a plurality of polarization detection units; each polarization detection unit comprises a polarization filter element and a photoelectric imaging element; when the receiving assembly receives polarized light of a first polarization angle, the polarized light of the first polarization angle can only pass through the polarization filter element corresponding to the first polarization angle and irradiate on the photoelectric imaging element.
In a second aspect, an embodiment of the present application provides an electronic device, including the camera apparatus according to the first aspect.
In a third aspect, an embodiment of the present application provides a depth of field detection method, which is applied to the electronic device according to the second aspect, and includes:
emitting first polarized light to a target object in response to a detection instruction for depth information of the target object;
receiving second polarized light and determining polarization information of the second polarized light;
judging whether the polarization information of the second polarized light is matched with the polarization information of the first polarized light;
if so, determining the second polarized light as the polarized light reflected by the target object based on the first polarized light; and determining depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light.
In a fourth aspect, an embodiment of the present application provides a depth-of-field detection apparatus, including:
an emitting module, configured to emit first polarized light to a target object in response to a detection instruction for depth information of the target object;
the first determining module is used for receiving the second polarized light and determining the polarization information of the second polarized light;
the judging module is used for judging whether the polarization information of the second polarized light is matched with the polarization information of the first polarized light;
a second determining module, configured to determine, if yes, that the second polarized light is polarized light reflected by the target object based on the first polarized light; and determining depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light.
In a fifth aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or an instruction stored on the memory and executable on the processor, wherein the program or instruction, when executed by the processor, implements the steps of the depth of field detection method according to the third aspect.
In a sixth aspect, the present application provides a readable storage medium, on which a program or instructions are stored, and when executed by a processor, the program or instructions implement the steps of the depth detection method according to the third aspect.
In a seventh aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the steps of the depth detection method according to the third aspect.
In an embodiment of the present application, a camera device includes an emitting component and a receiving component. The emitting component includes at least one emitting partition, each emitting partition is provided with at least one laser for emitting polarized light of a first designated polarization angle, the receiving component includes a plurality of polarization detection units, and each polarization detection unit includes a polarization filter element and a photoelectric imaging element. This scheme therefore realizes a camera device capable of emitting and receiving polarized light of various angles. Compared with a traditional camera device, when the receiving component receives polarized light of a first polarization angle, that polarized light can only pass through the polarization filter element corresponding to the first polarization angle and irradiate onto the photoelectric imaging element, which greatly reduces the possibility that the receiving component receives other light (i.e., reflections of light emitted by other emitting components), so that multi-machine interference can be avoided.
Further, in the depth of field detection method applied to the electronic device provided with the camera device according to the embodiment of the present application, first polarized light is emitted to the target object in response to a detection instruction for the depth information of the target object; second polarized light is received and its polarization information is determined; it is then judged whether the polarization information of the second polarized light matches the polarization information of the first polarized light. When they match, the second polarized light is determined to be the polarized light reflected by the target object based on the first polarized light, and the depth information corresponding to the target object is determined according to the light information of the first polarized light and the light information of the second polarized light. In this technical scheme, the depth of field information is determined using a camera device capable of emitting and receiving polarized light of various angles; because this camera device avoids multi-machine interference, the depth of field information obtained is more accurate than that determined by a traditional camera device.
Drawings
Fig. 1 is a schematic block diagram of a camera device in one embodiment of the present application.
Fig. 2 is a schematic block diagram of a transmit assembly in one embodiment of the present application.
FIG. 3 is a schematic block diagram of a polarization detection unit in one embodiment of the present application.
Fig. 4 is a schematic structural diagram of a polarization detection unit in an embodiment of the present application.
Fig. 5 is a schematic flow chart of a depth of field detection method in an embodiment of the present application.
Fig. 6 is a schematic flow chart of a depth of field detection method in another embodiment of the present application.
Fig. 7 is a schematic structural diagram of a depth-of-field detection apparatus according to an embodiment of the present application.
Fig. 8 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms first, second and the like in the description and in the claims of the present application are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that embodiments of the application may be practiced in sequences other than those illustrated or described herein, and that the terms "first," "second," and the like are generally used herein in a generic sense and do not limit the number of terms, e.g., the first term can be one or more than one. In addition, "and/or" in the specification and claims means at least one of connected objects, a character "/" generally means that a preceding and succeeding related objects are in an "or" relationship.
The following describes in detail a camera device provided in an embodiment of the present application with reference to the accompanying drawings and specific embodiments and application scenarios thereof.
Fig. 1 is a schematic block diagram of a camera device in one embodiment of the present application. As shown in fig. 1, the camera apparatus includes a transmitting component 10 and a receiving component 20, wherein:
the emission component 10 includes at least one emission sub-section 11 (only one emission sub-section 11 is schematically illustrated in fig. 1), and each emission sub-section 11 is provided with at least one laser 12 (only one laser 12 is schematically illustrated in fig. 1), and the laser 12 is configured to emit polarized light of a first specified polarization angle. The receiving assembly 20 comprises a plurality of polarization detection units 21 (only two polarization detection units 21 are schematically illustrated in fig. 1), each polarization detection unit 21 comprising a polarization filter element 22 and a photo imaging element 23, respectively. When the receiving assembly 20 receives the polarized light of the first polarization angle, the polarized light of the first polarization angle can only pass through the polarization filter element 22 corresponding to the first polarization angle and irradiate onto the photoelectric imaging element 23.
The number of lasers 12 may be set according to the power requirement of each camera device for emitting light; the higher the power of the emitted light, the greater the number of lasers 12. The laser 12 may be a circular light emitting hole provided on the emission sub-area 11. As shown in fig. 2, an emission assembly 10 comprising 4 emission partitions 11 is schematically illustrated, with 5 lasers 12 disposed on each emission partition 11.
In an embodiment of the present application, a camera apparatus includes an emitting assembly and a receiving assembly. The emitting assembly includes at least one emitting partition, each emitting partition is provided with at least one laser for emitting polarized light of a first specified polarization angle, the receiving assembly includes a plurality of polarization detection units, and each polarization detection unit includes a polarization filter element and a photoelectric imaging element. This scheme therefore realizes a camera device capable of emitting and receiving polarized light of various angles. Compared with a traditional camera device, when the receiving assembly receives polarized light of a first polarization angle, that polarized light can only pass through the polarization filter element corresponding to the first polarization angle and irradiate onto the photoelectric imaging element, which greatly reduces the possibility that the receiving assembly receives other light (i.e., reflections of light emitted by other emitting assemblies), so that multi-machine interference can be avoided.
In one embodiment, the emission assembly 10 may include a plurality of emission partitions 11, each emission partition 11 corresponding to a respective first designated polarization angle. The number of emission partitions 11 may be 2, 4 or 8, each corresponding to one first specified polarization angle, and the first specified polarization angles may be chosen at fixed intervals within 0 to 180°. For example, at intervals of 60°, the first specified polarization angle may be any one of 0°, 60°, and 120°; at intervals of 30°, it may be any one of 0°, 30°, 60°, 90°, 120°, and 150°. The number of selectable first specified polarization angles increases as the interval decreases.
As the number of emission partitions increases and the polarization-angle interval decreases, the differentiation between emission assemblies can be ensured, so that multi-machine interference is avoided. For example, if the first specified polarization angle is chosen at intervals of 30° for an emission assembly comprising 4 emission partitions, there are 6 choices per emission partition and 6^4 = 1296 combinations for the 4 emission partitions, corresponding to 1296 distinguishable emission assemblies.
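The combinatorics in this example can be checked with a short sketch (illustrative only; the interval and partition count are simply the values assumed in the example above):

```python
# Number of distinguishable emission assemblies when each of 4 emission partitions
# is independently assigned one first specified polarization angle chosen at 30°
# intervals in [0°, 180°): 0°, 30°, 60°, 90°, 120°, 150°.
angle_interval_deg = 30
num_partitions = 4

selectable_angles = list(range(0, 180, angle_interval_deg))  # 6 choices per partition
num_assemblies = len(selectable_angles) ** num_partitions    # 6**4 = 1296
print(selectable_angles, num_assemblies)
```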
When the transmitting assembly leaves the factory, the first specified polarization angle of each transmitting partition can be recorded, for example by burning it into a memory or encoding it in a two-dimensional code.
In this embodiment, by providing an emission assembly that includes a plurality of emission partitions, each corresponding to its own first designated polarization angle, differences between emission assemblies can be ensured, so that multi-machine interference is avoided.
In one embodiment, the polarization filter element 22 may include a micro lens 221, a filter 222, and a filter material 223 disposed on the micro lens 221 and/or the filter 222 for filtering polarized light that does not match the second designated polarization angle corresponding to the polarization detection unit 21.
The second specified polarization angle corresponding to the polarization detection unit indicates that only the polarized light with the second specified polarization angle can pass through the polarization detection unit. The filter material may be a coating film. The photo imaging element 23 may be a photodiode.
In one embodiment, the receiving assembly 20 may include a plurality of pixel arrays, as shown in fig. 3, each pixel array consisting of 9 pixels. The polarization detection unit 21 may be disposed at the center position 310 of the pixel array, and the 8 edge positions are regular pixels (which may be provided with photodiodes only). Polarization detection units 21 for filtering polarized light of different polarization angles may be distributed over the receiving assembly 20. The polarization angles detectable by the receiving assembly 20 may include the first specified polarization angles of the polarized light emitted by the emitting assembly 10. When polarized light passes through a pixel array of the receiving assembly 20, the regular pixels acquire information through photoelectric conversion as usual, whereas in the polarization detection unit photoelectric conversion occurs and a signal is output only when the polarization angle of the polarized light is consistent with the inherent angle of the filter material 223.
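The selective response of the polarization detection unit described above might be sketched as follows (a minimal illustration; the class and method names are assumptions, and a real sensor performs this filtering optically rather than in software):

```python
# Sketch: a regular pixel converts any incident light, while the polarization
# detection unit outputs a signal only when the incident polarization angle
# matches the inherent angle of its filter material.
from dataclasses import dataclass

@dataclass
class PolarizationDetectionUnit:
    # Inherent angle of the filter material (the second specified polarization angle).
    intrinsic_angle_deg: int

    def converts(self, incident_angle_deg: int) -> bool:
        """Photoelectric conversion happens only when the incident angle matches."""
        return incident_angle_deg == self.intrinsic_angle_deg

unit = PolarizationDetectionUnit(intrinsic_angle_deg=60)
print(unit.converts(60))  # True  -> photoelectric conversion, signal output
print(unit.converts(90))  # False -> light filtered out, no signal
```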
In one embodiment, the polarization detection unit 21 is shown in fig. 4, and includes a micro lens 221, an optical filter 222, a photoelectric imaging element 23, and a coating on the micro lens 221 or the optical filter 222 (the coating is indicated at the polarization-angle position in fig. 4). Only polarized light whose polarization angle is consistent with the inherent angle of the coating is allowed to pass through and irradiate onto the photoelectric imaging element 23.
In an embodiment, the camera device may be disposed in an electronic device, so that a user may detect depth information using the electronic device. The electronic device may include a mobile phone, a computer, a personal game machine, an unmanned aerial vehicle, and the like. The camera device may be a TOF camera device, and the electronic device provided with the TOF camera device may implement detection of depth of field information by TOF principles.
Fig. 5 is a schematic flow chart of a depth of field detection method in an embodiment of the present application, which is applied to an electronic device provided with a camera apparatus. The method of fig. 5 may include:
s502, in response to a detection instruction for the depth information of the target object, emitting first polarized light to the target object.
S504, receiving the second polarized light, and determining the polarization information of the second polarized light.
S506, judging whether the polarization information of the second polarized light is matched with the polarization information of the first polarized light.
S508, if the polarization information of the second polarized light is matched with the polarization information of the first polarized light, determining the second polarized light as the polarized light reflected by the target object based on the first polarized light; and determining depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light.
In the embodiment of the present application, in response to a detection instruction for the depth information of a target object, first polarized light is emitted to the target object; second polarized light is received and its polarization information is determined; it is then judged whether the polarization information of the second polarized light matches the polarization information of the first polarized light. When they match, the second polarized light is determined to be the polarized light reflected by the target object based on the first polarized light, and the depth information corresponding to the target object is determined according to the light information of the first polarized light and the light information of the second polarized light. In this technical scheme, the depth of field information is determined using a camera device capable of emitting and receiving polarized light of various angles; because this camera device avoids multi-machine interference, the depth of field information obtained is more accurate than that determined by a traditional camera device.
In one embodiment, the light information of the first polarized light comprises a light emission time and the light information of the second polarized light comprises a light reflection time. When determining the depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light, the flight time of the second polarized light may first be determined based on the light reflection time and the light emission time, and the depth information corresponding to the target object may then be determined according to the flight time of the second polarized light and the speed of light.
In the embodiment, the flight time of the second polarized light is determined, so that the depth information corresponding to the target object is determined according to the flight time and the light speed, and the effect of simply and quickly determining the depth information is achieved.
In one embodiment, if the polarization information of the second polarized light matches the polarization information of the first polarized light, the illumination intensity of the second polarized light may be further determined, and whether the illumination intensity is greater than or equal to a first preset threshold value may be determined; if the illumination intensity is greater than or equal to a first preset threshold value, determining that the second polarized light is the polarized light reflected by the target object based on the first polarized light; and if the illumination intensity is smaller than the first preset threshold value, determining that the second polarized light is not the polarized light reflected by the target object based on the first polarized light.
The first preset threshold may be an empirical value set for the illumination intensity of the polarized light reflected by the polarized light based on the first specified polarization angle received by the receiving component in each electronic device, and the empirical value may be obtained through a plurality of tests performed on each electronic device.
In this embodiment, by comparing the illumination intensity of the second polarized light with the first preset threshold, it can be effectively avoided that the received second polarized light is actually a component of natural light whose polarization angle happens to coincide with the first specified polarization angle, so that the accuracy of the determined depth of field information can be improved.
In addition, when the polarization information of the second polarized light is matched with the polarization information of the first polarized light, the illumination intensity of the second polarized light is further determined, and whether the difference value between the illumination intensity of the second polarized light and the illumination intensity of the first polarized light is smaller than a second preset threshold value or not is judged; if the difference value is smaller than a second preset threshold value, determining that the second polarized light is the polarized light reflected by the target object based on the first polarized light; and if the difference value is larger than or equal to a second preset threshold value, determining that the second polarized light is not the polarized light reflected by the target object based on the first polarized light.
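The two optional intensity checks described above might look as follows in outline (a sketch only; the threshold constants, function names and the absolute-difference interpretation are assumptions, since the application does not prescribe specific values):

```python
# Sketch of the two intensity checks applied after the polarization angles match:
# (a) the received intensity must reach the first preset threshold;
# (b) the intensity difference must stay below the second preset threshold.
FIRST_PRESET_THRESHOLD = 0.2   # placeholder empirical value, device dependent
SECOND_PRESET_THRESHOLD = 0.5  # placeholder empirical value, device dependent

def accepted_by_absolute_intensity(received_intensity: float) -> bool:
    # First variant: compare the received intensity against the first preset threshold.
    return received_intensity >= FIRST_PRESET_THRESHOLD

def accepted_by_intensity_difference(emitted_intensity: float,
                                     received_intensity: float) -> bool:
    # Second variant: compare the emitted/received intensity difference against the
    # second preset threshold (interpreting "difference value" as an absolute difference).
    return abs(emitted_intensity - received_intensity) < SECOND_PRESET_THRESHOLD
```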
In one embodiment, the polarization information of the first polarized light comprises a first specified polarization angle and the polarization information of the second polarized light comprises the first polarization angle. When judging whether the polarization information of the second polarized light matches the polarization information of the first polarized light, it can be judged whether the first polarization angle of the second polarized light matches the first specified polarization angle of the first polarized light.
In one embodiment, the emission sub-section may include a plurality of emission sub-sections, each of the emission sub-sections corresponding to a respective first designated polarization angle. When the first polarized light is emitted to the target object, a plurality of first polarized lights can be emitted to the target object sequentially through the emission subareas according to a preset polarized light emission sequence corresponding to the emission subareas. When judging whether the polarization information of the second polarized light is matched with the polarization information of the first polarized light, whether the first polarization angle of each second polarized light is matched with the first specified polarization angle of the corresponding first polarized light can be sequentially judged according to the polarized light emission sequence.
In this embodiment, whether the first polarization angle of each second polarized light matches the first specified polarization angle of the corresponding first polarized light is judged sequentially according to the polarized light emission sequence, so that chance coincidences can be effectively ruled out and the accuracy of the judgment result can be ensured; relatively accurate depth of field information can then be obtained based on this accurate judgment result.
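The sequential matching against the preset emission order could be sketched as follows (illustrative only; the list structure, angle values and function name are assumptions):

```python
# Sketch: the emission partitions fire in a preset order, each with its own first
# specified polarization angle, and the received returns must match that same
# sequence of angles, in order, to be accepted.
def matches_emission_sequence(emitted_angles_deg, received_angles_deg) -> bool:
    """True only if every received return matches the corresponding emitted angle, in order."""
    if len(received_angles_deg) != len(emitted_angles_deg):
        return False
    return all(r == e for e, r in zip(emitted_angles_deg, received_angles_deg))

# Preset polarized light emission sequence for an assembly with 4 emission partitions.
emission_sequence = [0, 60, 120, 30]
print(matches_emission_sequence(emission_sequence, [0, 60, 120, 30]))  # True
print(matches_emission_sequence(emission_sequence, [0, 60, 90, 30]))   # False
```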
Fig. 6 is a schematic flow chart of a depth of field detection method in another embodiment of the present application, applied to an electronic device provided with a TOF camera apparatus, where the method of fig. 6 may include:
s601, in response to a detection instruction for depth information of a target object, emitting first polarized light to the target object.
S602, receiving the second polarized light, and determining the polarization information of the second polarized light.
The polarization information of the second polarized light may include a first polarization angle of the second polarized light.
S603, judging whether the first polarization angle of the second polarized light is matched with the first appointed polarization angle of the first polarized light; if yes, go to S604; if not, go to S606.
S604, the illumination intensity of the second polarized light is further determined.
S605, judging whether the illumination intensity is greater than or equal to a first preset threshold value; if not, executing S606; if yes, S607 is executed.
And S606, determining that the second polarized light is not the polarized light reflected by the target object based on the first polarized light.
S607, the second polarized light is determined as the polarized light reflected by the target object based on the first polarized light.
S608, determining a time of flight of the second polarized light based on the light reflection time of the second polarized light and the light emission time of the first polarized light.
And S609, determining depth information corresponding to the target object according to the flight time and the light speed of the second polarized light.
The electronic device can determine depth information corresponding to the target object by using the flight time and the light speed of the second polarized light according to the TOF principle.
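Putting steps S601–S609 together, one possible outline is the following (a non-authoritative sketch; all names, threshold values and units are assumptions, and a real device performs these steps in sensor/ISP hardware rather than in Python):

```python
# Sketch of the S601-S609 flow: match the polarization angle, check the intensity
# against the first preset threshold, then convert the measured flight time into
# depth using the TOF principle.
SPEED_OF_LIGHT_M_S = 299_792_458.0

def detect_depth(first_specified_angle_deg, received_angle_deg, received_intensity,
                 emission_time_s, reflection_time_s, first_preset_threshold=0.2):
    # S603: does the received polarization angle match the emitted one?
    if received_angle_deg != first_specified_angle_deg:
        return None  # S606: not a reflection of our first polarized light
    # S604/S605: is the light intense enough to be a genuine reflection?
    if received_intensity < first_preset_threshold:
        return None  # S606
    # S607-S609: accepted; depth from the flight time and the speed of light.
    time_of_flight_s = reflection_time_s - emission_time_s
    return SPEED_OF_LIGHT_M_S * time_of_flight_s / 2.0

print(detect_depth(60, 60, 0.8, 0.0, 10e-9))  # ~1.5 m
print(detect_depth(60, 90, 0.8, 0.0, 10e-9))  # None (polarization mismatch)
```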
In one embodiment, the emission sector may include a plurality of emission sectors, each emission sector corresponding to a respective first designated polarization angle. When the first polarized light is emitted to the target object, a plurality of pieces of first polarized light can be emitted to the target object sequentially through the emission subareas according to a preset polarized light emission sequence corresponding to the emission subareas. When judging whether the polarization information of the second polarized light is matched with the polarization information of the first polarized light, whether the first polarization angle of each second polarized light is matched with the first specified polarization angle of the corresponding first polarized light can be sequentially judged according to the polarized light emission sequence.
The specific processes of S601-S609 are described in detail in the above embodiments, and are not described herein again.
In the embodiment of the application, in response to a detection instruction for the depth information of a target object, first polarized light is emitted to the target object; second polarized light is received and its polarization information is determined; it is then judged whether the polarization information of the second polarized light matches the polarization information of the first polarized light. When they match, the second polarized light is determined to be the polarized light reflected by the target object based on the first polarized light, and the depth information corresponding to the target object is determined according to the light information of the first polarized light and the light information of the second polarized light. In this technical scheme, the depth of field information is determined using a TOF camera device capable of emitting and receiving polarized light of various angles; because this camera device avoids multi-machine interference, the depth of field information obtained is more accurate than that determined by a traditional camera device.
It should be noted that, in the depth of field detection method provided in the embodiment of the present application, the execution subject may be a depth of field detection device, or a control module in the depth of field detection device, which is used for executing the depth of field detection method. In the embodiment of the present application, a depth of field detection device is taken as an example to execute a depth of field detection method, and the depth of field detection device provided in the embodiment of the present application is described.
Fig. 7 is a schematic structural diagram of a depth-of-field detection apparatus according to an embodiment of the present application. Referring to fig. 7, the depth-of-field detection apparatus includes:
an emitting module 710, configured to emit first polarized light to a target object in response to a detection instruction for depth information of the target object;
a first determining module 720, configured to receive the second polarized light and determine polarization information of the second polarized light;
the judging module 730, configured to judge whether the polarization information of the second polarized light matches the polarization information of the first polarized light;
a second determining module 740, configured to determine, if yes, that the second polarized light is polarized light reflected by the target object based on the first polarized light; and determining depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light.
In one embodiment, the light information of the first polarized light comprises a light emission time; the optical information of the second polarized light comprises a light reflection time;
the second determining module 740 includes:
a first determination unit for determining a time of flight of the second polarized light based on the light reflection time and the light emission time;
and the second determining unit is used for determining the depth information corresponding to the target object according to the flight time and the light speed of the second polarized light.
In one embodiment, the second determining module 740 includes:
a third determining unit, configured to further determine the illumination intensity of the second polarized light if the polarization information of the second polarized light matches the polarization information of the first polarized light;
the first judging unit is used for judging whether the illumination intensity is greater than or equal to a first preset threshold value or not;
and if so, determining the second polarized light as the polarized light reflected by the target object based on the first polarized light.
In one embodiment, the polarization information of the first polarized light comprises a first specified polarization angle; the polarization information of the second polarized light comprises a first polarization angle;
the determining module 730 includes:
and the second judging unit is used for judging whether the first polarization angle of the second polarized light is matched with the first appointed polarization angle of the first polarized light.
In one embodiment, the emission sector includes a plurality; each emission subarea corresponds to a first appointed polarization angle;
the transmitting module 710 includes:
the transmitting unit is used for transmitting a plurality of first polarized lights to the target object sequentially through the transmitting subareas according to a preset polarized light transmitting sequence corresponding to the transmitting subareas;
the determining module 730 includes:
and the third judging unit is used for sequentially judging whether the first polarization angle of each second polarized light is matched with the first appointed polarization angle of the corresponding first polarized light according to the polarized light emission sequence.
In the embodiment of the application, in response to a detection instruction for the depth information of a target object, the apparatus emits first polarized light to the target object, receives second polarized light, determines its polarization information, and then judges whether the polarization information of the second polarized light matches the polarization information of the first polarized light. When they match, the second polarized light is determined to be the polarized light reflected by the target object based on the first polarized light, and the depth information corresponding to the target object is determined according to the light information of the first polarized light and the light information of the second polarized light. The apparatus thus determines depth of field information using a camera device capable of emitting and receiving polarized light of various angles; because this camera device avoids multi-machine interference, the depth of field information obtained is more accurate than that determined by a traditional camera device.
The depth of field detection device in the embodiment of the present application may be a device, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The depth detection device in the embodiment of the present application may be a device having an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present application are not limited specifically.
The depth of field detection device provided in the embodiment of the present application can implement each process implemented by the depth of field detection method embodiments of fig. 5 to 6, and is not described here again to avoid repetition.
Optionally, as shown in fig. 8, an electronic device 800 is further provided in this embodiment of the present application, and includes a processor 801, a memory 802, and a program or an instruction stored in the memory 802 and executable on the processor 801, where the program or the instruction is executed by the processor 801 to implement each process of the foregoing depth of field detection method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 9 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 900 includes, but is not limited to: a radio frequency unit 901, a network module 902, an audio output unit 903, an input unit 904, a sensor 905, a display unit 906, a user input unit 907, an interface unit 908, a memory 909, and a processor 910.
Those skilled in the art will appreciate that the electronic device 900 may further include a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 910 through a power management system, so as to manage charging, discharging, and power consumption management functions through the power management system. The electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
Wherein, the processor 910 is configured to emit first polarized light to the target object in response to a detection instruction for the depth information of the target object; receiving the second polarized light and determining the polarization information of the second polarized light; judging whether the polarization information of the second polarized light is matched with the polarization information of the first polarized light; if so, determining the second polarized light as the polarized light reflected by the target object based on the first polarized light; and determining depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light.
Optionally, the light information of the first polarized light includes a light emission time; the optical information of the second polarized light comprises a light reflection time;
a processor 910, further configured to determine a time of flight for the second polarized light based on the light reflection time and the light emission time; and determining depth information corresponding to the target object according to the flight time and the light speed of the second polarized light.
Optionally, the processor 910 is further configured to further determine the illumination intensity of the second polarized light if the polarization information of the second polarized light matches the polarization information of the first polarized light; judging whether the illumination intensity is greater than or equal to a first preset threshold value or not; if so, determining the second polarized light as the polarized light reflected by the target object based on the first polarized light.
Optionally, the polarization information of the first polarized light includes a first specified polarization angle; the polarization information of the second polarized light comprises a first polarization angle;
the processor 910 is further configured to determine whether the first polarization angle of the second polarized light matches the first designated polarization angle of the first polarized light.
Optionally, the emission partition comprises a plurality of emission partitions; each emission subarea corresponds to a first appointed polarization angle;
the processor 910 is further configured to transmit a plurality of first polarized lights to the target object sequentially through the emission partitions according to a preset polarized light emission sequence corresponding to each emission partition; and sequentially judging whether the first polarization angle of each second polarized light is matched with the first appointed polarization angle of the corresponding first polarized light according to the polarized light emission sequence.
In the embodiment of the application, in response to a detection instruction for the depth information of a target object, the electronic device emits first polarized light to the target object, receives second polarized light, determines its polarization information, and then judges whether the polarization information of the second polarized light matches the polarization information of the first polarized light. When they match, the second polarized light is determined to be the polarized light reflected by the target object based on the first polarized light, and the depth information corresponding to the target object is determined according to the light information of the first polarized light and the light information of the second polarized light. The electronic device thus determines depth of field information using a camera device capable of emitting and receiving polarized light of various angles; because this camera device avoids multi-machine interference, the depth of field information obtained is more accurate than that determined by a traditional camera device.
It should be understood that, in the embodiment of the present application, the input unit 904 may include a Graphics Processing Unit (GPU) 9041 and a microphone 9042, and the graphics processor 9041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 906 may include a display panel 9061, and the display panel 9061 may be configured in the form of a liquid crystal display, an organic light emitting diode, or the like. The user input unit 907 includes a touch panel 9071 and other input devices 9072. The touch panel 9071 is also referred to as a touch screen. The touch panel 9071 may include two parts, a touch detection device and a touch controller. Other input devices 9072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here. The memory 909 may be used to store software programs as well as various data including, but not limited to, application programs and an operating system. The processor 910 may integrate an application processor, which primarily handles the operating system, user interfaces, applications, etc., and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 910.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the above depth of field detection method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the above depth-of-field detection method embodiment, and can achieve the same technical effect, and in order to avoid repetition, the details are not repeated here.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions recited, e.g., the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
Through the description of the foregoing embodiments, it is clear to those skilled in the art that the method of the foregoing embodiments may be implemented by software plus a necessary general hardware platform, and certainly may also be implemented by hardware, but in many cases, the former is a better implementation. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (14)

1. A camera apparatus, comprising a transmitting component and a receiving component;
the emission component comprises a plurality of emission subareas, and each emission subarea is provided with at least one laser; the laser is used for emitting polarized light of a first specified polarization angle; each emission subarea corresponds to the first appointed polarization angle;
the receiving assembly comprises a plurality of polarization detection units; each polarization detection unit comprises a polarization filter element and a photoelectric imaging element; when the receiving assembly receives polarized light of a first polarization angle, the polarized light of the first polarization angle can only pass through the polarization filter element corresponding to the first polarization angle and irradiate on the photoelectric imaging element.
2. The apparatus of claim 1, wherein the polarizing filter element comprises a microlens, a filter, and a filter material; the light filtering material is arranged on the micro lens and/or the light filter and is used for filtering polarized light which is not matched with the second specified polarization angle corresponding to the polarization detection unit.
3. The apparatus of claim 2, wherein the filter material is a coating.
4. An electronic device characterized by comprising the camera apparatus according to any one of claims 1 to 3.
5. A depth of field detection method applied to the electronic device of claim 4, comprising:
emitting first polarized light to a target object in response to a detection instruction for depth information of the target object;
receiving second polarized light, and determining a first polarization angle of the second polarized light;
judging whether the first polarization angle of the second polarized light is matched with a first appointed polarization angle of the first polarized light;
if so, determining the second polarized light as the polarized light reflected by the target object based on the first polarized light; determining depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light;
the determining a first polarization angle of the second polarized light comprises: determining the first polarization angle of the second polarized light according to a specified polarization angle corresponding to a polarization detection unit for receiving the second polarized light; and only the polarized light with the specified polarization angle can penetrate through the corresponding polarization detection unit.
6. The method of claim 5, wherein the light information of the first polarized light comprises a light emission time; the light information of the second polarized light comprises light reflection time;
the determining the depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light includes:
determining a time of flight for the second polarized light based on the light reflection time and the light emission time;
and determining depth information corresponding to the target object according to the flight time and the light speed of the second polarized light.
7. The method of claim 5 or 6, wherein if so, determining that the second polarized light is the polarized light reflected by the target object based on the first polarized light comprises:
if the polarization information of the second polarized light is matched with the polarization information of the first polarized light, further determining the illumination intensity of the second polarized light;
judging whether the illumination intensity is greater than or equal to a first preset threshold value or not;
if yes, determining that the second polarized light is the polarized light reflected by the target object based on the first polarized light.
8. The method of claim 5, wherein there are a plurality of emission subareas, and each emission subarea corresponds to a respective first specified polarization angle;
the emitting first polarized light to the target object comprises:
sequentially emitting, through the emission subareas according to a preset polarized light emission order corresponding to the emission subareas, a plurality of first polarized light beams to the target object;
the judging whether the first polarization angle of the second polarized light matches a first specified polarization angle of the first polarized light comprises:
sequentially judging, according to the polarized light emission order, whether the first polarization angle of each second polarized light matches the first specified polarization angle of the corresponding first polarized light.
9. A depth of field detection apparatus, comprising:
an emitting module, configured to emit first polarized light to a target object in response to a detection instruction for depth information of the target object, wherein the first polarized light is emitted through a plurality of emission subareas, each emission subarea is configured to emit first polarized light with a first specified polarization angle, and each emission subarea corresponds to a respective first specified polarization angle;
a first determining module, configured to receive second polarized light and determine a first polarization angle of the second polarized light;
a judging module, configured to judge whether the first polarization angle of the second polarized light matches a first specified polarization angle of the first polarized light;
a second determining module, configured to determine, if so, that the second polarized light is the polarized light reflected by the target object based on the first polarized light, and to determine depth information corresponding to the target object according to the light information of the first polarized light and the light information of the second polarized light;
wherein the determining a first polarization angle of the second polarized light comprises: determining the first polarization angle of the second polarized light according to a specified polarization angle corresponding to the polarization detection unit that receives the second polarized light; and only polarized light with the specified polarization angle can pass through the corresponding polarization detection unit.
10. The apparatus of claim 9, wherein the light information of the first polarized light comprises a light emission time, and the light information of the second polarized light comprises a light reflection time;
the second determining module includes:
a first determining unit, configured to determine a time of flight of the second polarized light based on the light reflection time and the light emission time;
a second determining unit, configured to determine depth information corresponding to the target object according to the time of flight of the second polarized light and the speed of light.
11. The apparatus of claim 9 or 10, wherein the second determining module comprises:
a third determining unit, configured to further determine the illumination intensity of the second polarized light if the polarization information of the second polarized light matches the polarization information of the first polarized light;
a first judging unit, configured to judge whether the illumination intensity is greater than or equal to a first preset threshold;
a fourth determining unit, configured to determine, if yes, that the second polarized light is polarized light reflected by the target object based on the first polarized light.
12. The apparatus of claim 9, wherein there are a plurality of emission subareas, and each emission subarea corresponds to a respective first specified polarization angle;
the emitting module comprises:
an emitting unit, configured to sequentially emit, through the emission subareas according to a preset polarized light emission order corresponding to the emission subareas, a plurality of first polarized light beams to the target object;
the judging module comprises:
a third judging unit, configured to sequentially judge, according to the polarized light emission order, whether the first polarization angle of each second polarized light matches the first specified polarization angle of the corresponding first polarized light.
13. An electronic device comprising a processor, a memory, and a program or instructions stored on the memory and executable on the processor, the program or instructions, when executed by the processor, implementing the steps of the depth of field detection method according to any one of claims 5 to 8.
14. A readable storage medium, on which a program or instructions are stored, which when executed by a processor, implement the steps of the depth of field detection method according to any one of claims 5 to 8.
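For reference, the matching step in claim 5 amounts to reading off which polarization detection unit responded and comparing its specified angle with the angle of the emitted light. The sketch below is a minimal, hypothetical illustration of that logic; the class and function names, the tolerance value, and the use of the strongest photo signal to pick the responding unit are assumptions for illustration, not the patented implementation.

```python
# Hypothetical sketch of the polarization-angle determination and matching in claim 5.
# All names and the 1-degree tolerance are assumptions for illustration only.
from dataclasses import dataclass
from typing import List, Optional

ANGLE_TOLERANCE_DEG = 1.0  # assumed matching tolerance

@dataclass
class PolarizationDetectionUnit:
    specified_angle_deg: float  # only light at this polarization angle passes its filter element
    photo_signal: float         # reading from the photoelectric imaging element

def received_polarization_angle(units: List[PolarizationDetectionUnit]) -> Optional[float]:
    """Infer the first polarization angle of the second polarized light from the
    specified angle of the detection unit that actually registered light."""
    responding = max(units, key=lambda u: u.photo_signal, default=None)
    if responding is None or responding.photo_signal <= 0.0:
        return None
    return responding.specified_angle_deg

def angles_match(first_specified_deg: float, received_deg: Optional[float]) -> bool:
    """Judge whether the received angle matches the first specified polarization angle."""
    return received_deg is not None and abs(first_specified_deg - received_deg) <= ANGLE_TOLERANCE_DEG
```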
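Claims 6 and 7 combine a time-of-flight depth calculation with an illumination-intensity check. A minimal sketch follows, assuming the usual round-trip convention (half the flight time multiplied by the speed of light) and an arbitrary placeholder threshold; neither value is fixed by the claims themselves.

```python
# Hypothetical sketch of the depth computation (claim 6) and intensity check (claim 7).
SPEED_OF_LIGHT_M_S = 299_792_458.0
INTENSITY_THRESHOLD = 0.05  # assumed "first preset threshold" (arbitrary units)

def is_reflected_first_light(polarization_matches: bool, illumination_intensity: float) -> bool:
    """Accept the second polarized light only if its polarization matches and its
    illumination intensity reaches the preset threshold."""
    return polarization_matches and illumination_intensity >= INTENSITY_THRESHOLD

def depth_from_times(light_emission_time_s: float, light_reflection_time_s: float) -> float:
    """Depth from time of flight: the light travels to the target object and back,
    so the one-way distance is (time of flight * speed of light) / 2."""
    time_of_flight_s = light_reflection_time_s - light_emission_time_s
    return time_of_flight_s * SPEED_OF_LIGHT_M_S / 2.0
```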
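Claim 8 sequences the emission subareas so that each returned light can be matched against the angle of the subarea that emitted it. The sketch below illustrates that ordering under stated assumptions: the two callables stand in for the emitter and receiver hardware, and the tolerance is a placeholder.

```python
# Hypothetical sketch of claim 8's sequential emission and order-based matching.
from typing import Callable, List

def sequential_emit_and_match(
    subarea_angles_deg: List[float],                  # preset emission order: one specified angle per subarea
    emit_from_subarea: Callable[[int, float], None],  # fires subarea i at its first specified polarization angle
    receive_angle: Callable[[], float],               # angle inferred from the responding detection unit
    tolerance_deg: float = 1.0,                       # assumed matching tolerance
) -> List[bool]:
    """For each emission subarea, in the preset order, emit and then judge whether
    the received polarization angle matches the emitted specified angle."""
    results: List[bool] = []
    for index, specified_angle in enumerate(subarea_angles_deg):
        emit_from_subarea(index, specified_angle)
        received = receive_angle()
        results.append(abs(received - specified_angle) <= tolerance_deg)
    return results
```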
CN202110104744.8A 2021-01-26 2021-01-26 Camera device, electronic equipment, depth of field detection method and depth of field detection device Active CN112954153B (en)

Priority Applications (1)

Application Number: CN202110104744.8A
Priority Date: 2021-01-26
Filing Date: 2021-01-26
Title: Camera device, electronic equipment, depth of field detection method and depth of field detection device

Publications (2)

Publication Number Publication Date
CN112954153A (en) 2021-06-11
CN112954153B (en) 2022-09-02

Family

ID=76237096

Family Applications (1)

Application Number: CN202110104744.8A (Active; granted as CN112954153B)
Priority Date: 2021-01-26
Filing Date: 2021-01-26
Title: Camera device, electronic equipment, depth of field detection method and depth of field detection device

Country Status (1)

Country Link
CN (1) CN112954153B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113743343A (en) * 2021-09-10 2021-12-03 Vivo Mobile Communication Co., Ltd. Image information acquisition module, information processing method and device and electronic equipment

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109946710A (en) * 2019-03-29 2019-06-28 Shanghai Institute of Technical Physics, Chinese Academy of Sciences Dual-wavelength multi-polarization laser imaging device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050018450A1 (en) * 2002-06-14 2005-01-27 Tseng-Lu Chien Fiber optic light kits for footwear
JP4566930B2 (en) * 2006-03-03 2010-10-20 Fujitsu Limited Imaging device
US8723118B2 (en) * 2009-10-01 2014-05-13 Microsoft Corporation Imager for constructing color and depth images
CN203251357U (en) * 2013-03-22 2013-10-23 Lu Quan Night vision device
CN106878697A (en) * 2016-06-29 2017-06-20 Luban Dixi Robot Image pickup method and imaging method, apparatus and device thereof

Also Published As

Publication number Publication date
CN112954153A (en) 2021-06-11

Similar Documents

Publication Publication Date Title
US10663691B2 (en) Imaging devices having autofocus control in response to the user touching the display screen
EP2898399B1 (en) Display integrated camera array
CN107424186A (en) depth information measuring method and device
CN110035218B (en) Image processing method, image processing device and photographing equipment
US8614694B2 (en) Touch screen system based on image recognition
CN109639896A (en) Block object detecting method, device, storage medium and mobile terminal
CN105320270A (en) Method for performing a face tracking function and an electric device having the same
TW201945759A (en) Time of flight ranging with varying fields of emission
CN112954153B (en) Camera device, electronic equipment, depth of field detection method and depth of field detection device
CN113473007A (en) Shooting method and device
CN112543284B (en) Focusing system, method and device
CN112291473B (en) Focusing method and device and electronic equipment
CN113014320B (en) Visible light communication control method and device for electronic equipment and electronic equipment
CN109242782B (en) Noise processing method and device
CN112437231B (en) Image shooting method and device, electronic equipment and storage medium
US20200409479A1 (en) Method, device and equipment for launching an application
US11418707B2 (en) Electronic device and notification method
CN108600623B (en) Refocusing display method and terminal device
EP3962062A1 (en) Photographing method and apparatus, electronic device, and storage medium
CN112261300B (en) Focusing method and device and electronic equipment
CN105718121A (en) Optical touch device
CN113743343A (en) Image information acquisition module, information processing method and device and electronic equipment
CN111722240B (en) Electronic equipment, object tracking method and device
CN112532879B (en) Image processing method and device
CN109542231B (en) Method and device for feeding back information, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant