CN108957914B - Laser projection module, depth acquisition device and electronic equipment - Google Patents


Info

Publication number
CN108957914B
Authority
CN
China
Prior art keywords
depth
laser
target object
memory
acquisition device
Prior art date
Legal status
Active
Application number
CN201810828544.5A
Other languages
Chinese (zh)
Other versions
CN108957914A (en)
Inventor
欧锦荣
周海涛
郭子青
谭筱
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN201810828544.5A
Publication of CN108957914A
Priority to PCT/CN2019/070851 (WO2020019682A1)
Application granted
Publication of CN108957914B
Legal status: Active

Classifications

    • G — PHYSICS
    • G03 — PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B — APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00 — Projectors or projection-type viewers; Accessories therefor
    • G03B17/00 — Details of cameras or camera bodies; Accessories therefor
    • G03B17/48 — adapted for combination with other photographic or optical apparatus
    • G03B17/54 — with projector

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

The invention discloses a laser projection module comprising a laser projector and a memory. The laser projector projects laser light onto a target object to form a laser pattern. The memory is integrated with the laser projector and stores calibration data for that projector; the calibration data and the laser pattern are used together to obtain depth information. The invention also discloses a depth acquisition device and electronic equipment. In the laser projection module, depth acquisition device and electronic equipment of the embodiments of the invention, the memory is integrated with the laser projector and stores the calibration data of the corresponding laser projector, so that after the laser projector is replaced, the calibration data corresponding to the replacement projector can be read from the memory, and accurate depth information can be obtained from that calibration data together with the laser pattern.

Description

Laser projection module, depth acquisition device and electronic equipment
Technical Field
The present invention relates to imaging technologies, and more particularly, to a laser projection module, a depth acquisition device, and an electronic apparatus.
Background
When depth information is acquired by combining a laser projector with an image collector, the depth information can be calculated only by combining the laser pattern captured by the image collector with the calibration data of the laser projector. Different laser projectors generally correspond to different calibration data, and after a laser projector is replaced, the original calibration data no longer suit the new projector. How to enable the laser projector to acquire depth information normally after replacement therefore becomes a technical problem to be solved.
Disclosure of Invention
The embodiment of the invention provides a laser projection module, a depth acquisition device and electronic equipment.
The laser projection module comprises a laser projector and a memory. The laser projector is used for projecting laser to a target object to form a laser pattern. The memory is integrated with the laser projector, the memory is used for storing calibration data of the laser projector, and the calibration data and the laser pattern are jointly used for acquiring depth information.
The depth acquisition device comprises the laser projection module and the image collector. The image collector is used for receiving the laser modulated by the target object to form the laser pattern.
The electronic equipment comprises a depth acquisition device, wherein the depth acquisition device comprises the laser projection module and an image collector. The image collector is used for receiving the laser modulated by the target object to form the laser pattern.
The laser projection module, the depth acquisition device and the electronic equipment of the embodiments of the invention integrate the laser projector with the memory, and the memory stores the calibration data of the corresponding laser projector. After the laser projector is replaced, the calibration data corresponding to the replacement projector can be read from the memory, and accurate depth information can be obtained from that calibration data together with the laser pattern.
Additional aspects and advantages of embodiments of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic view of a laser projection module according to some embodiments of the present invention.
FIG. 2 is a schematic view of a depth-acquisition device according to some embodiments of the present invention.
FIG. 3 is a schematic view of an electronic device according to some embodiments of the invention.
Fig. 4-6 are schematic plan views of electronic devices according to some embodiments of the invention.
Fig. 7 is a schematic view of an application scenario of the depth acquisition device according to some embodiments of the present invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings. The same or similar reference numbers in the drawings identify the same or similar elements or elements having the same or similar functionality throughout.
In addition, the embodiments of the present invention described below with reference to the accompanying drawings are exemplary only for the purpose of explaining the embodiments of the present invention, and are not to be construed as limiting the present invention.
In the present invention, unless otherwise expressly stated or limited, a first feature "on" or "under" a second feature may be in direct contact with the second feature, or in indirect contact through an intermediate element. Moreover, a first feature "on," "over," or "above" a second feature may be directly or obliquely above the second feature, or may simply be at a higher level than the second feature. A first feature "under," "below," or "beneath" a second feature may be directly or obliquely beneath the second feature, or may simply be at a lower level than the second feature.
Referring to fig. 1, a laser projection module 10 according to an embodiment of the present invention includes a laser projector 12 and a memory 14. The laser projector 12 is used to project laser light toward the target object 2000 to form a laser light pattern. A memory 14 is provided integrally with the laser projector 12, the memory 14 being used to store calibration data for the laser projector 12, which calibration data is used together with the laser pattern to obtain depth information.
Referring to fig. 2, a depth acquisition apparatus 100 according to an embodiment of the present invention includes a laser projection module 10 and an image collector 20. The laser projection module 10 includes a laser projector 12 and a memory 14. The laser projector 12 is used to project laser light toward the target object 2000 to form a laser light pattern. A memory 14 is provided integrally with the laser projector 12, the memory 14 being used to store calibration data for the laser projector 12, which calibration data is used together with the laser pattern to obtain depth information. The image collector 20 is used for receiving the laser light modulated by the target object 2000 to form a laser light pattern. The image collector 20 is, for example, an infrared camera.
Referring to fig. 3, an electronic apparatus 1000 according to an embodiment of the invention includes a depth obtaining device 100. The depth acquisition device 100 comprises a laser projection module 10 and an image collector 20. The laser projection module 10 includes a laser projector 12 and a memory 14. The laser projector 12 is used to project laser light toward the target object 2000 to form a laser light pattern. A memory 14 is provided integrally with the laser projector 12, the memory 14 being used to store calibration data for the laser projector 12, which calibration data is used together with the laser pattern to obtain depth information. The image collector 20 is used for receiving the laser light modulated by the target object 2000 to form a laser light pattern.
That is, the electronic device 1000 according to the embodiment of the present invention may include the depth acquisition apparatus 100 according to the embodiment of the present invention, and the depth acquisition apparatus 100 according to the embodiment of the present invention may include the laser projection module 10 according to the embodiment of the present invention.
The laser projection module 10, the depth acquisition device 100 and the electronic apparatus 1000 of the embodiments of the invention integrate the laser projector 12 with the memory 14; that is, the memory 14 and the laser projector 12 form a single module, the laser projection module 10. The memory 14 stores the calibration data of the corresponding laser projector 12, so when the laser projector 12 is replaced, the memory 14 is replaced with it (the laser projection module 10 is replaced as a whole). After the replacement, the calibration data corresponding to the new laser projector 12 can be read from the memory 14, and accurate depth information can be acquired from that calibration data and the laser pattern.
In some embodiments, the laser projector 12 is integrated with the memory 14 and may be: a memory 14 is provided within the laser projector 12. Thus, on one hand, the size of the laser projection module 10 can be reduced, and on the other hand, the laser projector 12 can protect the memory 14 to a certain extent from water, dust, collision, and the like.
In some embodiments, the laser projector 12 is integrated with the memory 14, and may be: the memory 14 is disposed outside of the laser projector 12 and the memory 14 is packaged with the laser projector 12. In this way, when the memory 14 is provided, it is possible to avoid the influence on the original circuit structure of the laser projector 12 as much as possible, and it is also possible to avoid the memory 14 from being damaged by high temperature generated when the laser projector 12 operates.
The electronic device 1000 may be a camera, a mobile phone, a tablet computer, a laptop computer, a game console, a wearable device (a smart watch, smart bracelet, smart glasses, smart helmet, etc.), an access control system, a teller machine, and so on. The depth acquisition device 100 may be a device that performs depth measurement using structured light; when measuring depth with structured light, the depth information of the target object 2000 must be calculated from the calibration data. The calibration data may be obtained by calibrating the laser projector 12 through a series of tests before shipment.
In some embodiments, the memory 14 is used only to store the calibration data, so the storage space it requires is small, which can reduce both the size and the manufacturing cost of the memory 14. In addition, the electronic device 1000 may include storage elements other than the memory 14 and can use those elements to store other files and information.
Referring to fig. 3, in some embodiments, the electronic apparatus 1000 further includes a depth acquisition device 200 and a processor 300. The processor 300 is configured to control the depth acquisition device 200 to acquire the reference depth h1 of the target object 2000, control the depth acquisition device 100 to acquire the test depth h2 of the target object 2000, determine whether a deviation between the test depth h2 and the reference depth h1 is greater than a predetermined deviation threshold, and determine that the depth acquisition device 100 is not correctly installed on the electronic apparatus 1000 when a deviation between the test depth h2 and the reference depth h1 is greater than the deviation threshold.
Specifically, due to assembly errors, if the depth acquisition device 100 is not correctly installed on the electronic apparatus 1000 (that is, the laser projection module 10 is not correctly installed, the image collector 20 is not correctly installed, or neither is correctly installed), the depth information measured by the depth acquisition device 100 contains an error. In addition, when another device in the electronic apparatus 1000 (e.g., a speaker or a proximity sensor) is replaced and the position of the depth acquisition device 100 changes as a result, the depth information measured by the depth acquisition device 100 may also contain an error.
The depth acquisition device 200 is configured to obtain the reference depth h1 of the target object 2000, where the reference depth h1 represents the actual depth of the target object 2000 in the current scene. The depth acquisition device 100 is configured to acquire the test depth h2 of the target object 2000, where the test depth h2 represents the depth of the target object 2000 calculated from the calibration data.
In some embodiments, the processor 300 is configured to control the laser projector 12 to project laser light onto the target object 2000, control the image collector 20 to obtain a laser light pattern modulated by the target object 2000, and obtain the test depth h2 according to the laser light pattern and the calibration data. In this manner, the test depth h2 of the target object 2000 can be acquired by the depth acquisition apparatus 100.
In the embodiment of the present invention, the deviation of the test depth h2 from the reference depth h1 may refer to the absolute value of the difference between the two. The reference depth h1 and the test depth h2 each include depth information such as a maximum depth, a minimum depth, and an average depth, so the deviation may be the deviation of the maximum depths, of the minimum depths, or of the average depths, and so on. In addition, the reference depth h1 and the test depth h2 of the target object 2000 may refer to the depth of the entire target object 2000, or to the depth of one or more features on the target object 2000.
Now, taking the minimum depth as an example, the deviation of the test depth h2 from the reference depth h1 is explained. When the depth acquisition device 100 is replaced or reinstalled, its position on the electronic device 1000 may not coincide with the position calibrated in the production-line environment. When the depth acquisition device 100 is correctly installed (solid-line box in fig. 3), the deviation of the test depth h2 from the reference depth h1 is smaller than the predetermined deviation threshold. For example, if the threshold is 2 cm, the current reference depth h1 is 70 cm and the test depth h2 is 69 cm, the deviation between them is less than the 2 cm threshold, indicating that the test depth h2 measured from the calibration data is close to the actual depth and the depth acquisition device 100 is correctly installed on the electronic device 1000. When the depth acquisition device 100 is not correctly installed (broken-line box in fig. 3), the deviation of the test depth h2 from the reference depth h1 exceeds the predetermined threshold. For example, if the threshold is 2 cm, the current reference depth h1 is 70 cm and the test depth h2 is 65 cm, the deviation exceeds the 2 cm threshold, indicating that the test depth h2 measured from the calibration data differs substantially from the actual depth and the depth acquisition device 100 is not correctly installed on the electronic device 1000.
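The installation check described above reduces to a threshold comparison. A minimal Python sketch follows; the 2 cm threshold and the 69 cm / 65 cm test depths are only the illustrative numbers from this paragraph, and the function name is a label chosen here, not one fixed by the patent:

```python
def correctly_installed(test_depth_cm: float, reference_depth_cm: float,
                        threshold_cm: float = 2.0) -> bool:
    """Return True when the test depth h2 (computed from the stored
    calibration data) deviates from the reference depth h1 (the actual
    depth) by no more than the predetermined threshold."""
    return abs(test_depth_cm - reference_depth_cm) <= threshold_cm

# Correctly installed: 69 cm vs 70 cm deviates by only 1 cm.
assert correctly_installed(69.0, 70.0)
# Incorrectly installed: 65 cm vs 70 cm deviates by 5 cm, over the 2 cm threshold.
assert not correctly_installed(65.0, 70.0)
```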
Therefore, by evaluating the deviation between the test depth h2 and the reference depth h1, whether the depth acquisition device 100 is correctly installed on the electronic device 1000 can be determined quickly, and the electronic device 1000 is prevented from acquiring depth with production-line calibration data while the depth acquisition device 100 is incorrectly installed. Specifically, this avoids acquiring depth with new calibration data when a replacement depth acquisition device 100 has not been correctly installed on the electronic device 1000, and it likewise avoids acquiring depth with the original calibration data when mounting or dismounting other modules has displaced the depth acquisition device 100.
It is understood that the depth acquisition device 200 may refer to any device having depth measurement capability, such as a binocular vision depth acquisition device, a time of flight (TOF) depth acquisition device, a single-camera depth acquisition device, a structured light depth acquisition device, and the like.
Referring to fig. 4, in some embodiments, the depth acquisition device 200 is a binocular vision depth acquisition device. Specifically, the depth acquisition device 200 includes a first imaging device 210 and a second imaging device 230. The processor 300 is configured to control the first imaging device 210 to acquire a first planar image of the target object 2000, control the second imaging device 230 to acquire a second planar image of the target object 2000, and acquire the reference depth h1 from the first and second planar images.
In some embodiments, the first imaging device 210 is a visible light camera or an infrared camera; the second imaging device 230 is a visible light camera or an infrared camera.
Specifically, the first imaging device 210 and the second imaging device 230 may both be visible light cameras, and correspondingly, the first planar image and the second planar image are both visible light images. Alternatively, the first imaging device 210 may be a visible light camera, and the second imaging device 230 may be an infrared camera, and accordingly, the first planar image is a visible light image, and the second planar image is an infrared image. Alternatively, the first imaging device 210 may be an infrared camera, and the second imaging device 230 may be a visible light camera, and accordingly, the first planar image is an infrared image, and the second planar image is a visible light image. Alternatively, the first imaging device 210 and the second imaging device 230 may both be infrared cameras, and correspondingly, the first planar image and the second planar image are both infrared images. After acquiring the first planar image and the second planar image, the processor 300 may acquire the reference depth h1 of the target object 2000 by using the triangulation principle.
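The patent does not spell out the triangulation step, but for rectified stereo cameras the standard relation is depth = focal length × baseline / disparity. A sketch under that assumption; all numeric values are purely illustrative:

```python
def stereo_depth(focal_px: float, baseline_cm: float, disparity_px: float) -> float:
    """Rectified-stereo triangulation: a feature whose image positions in the
    two planar images differ by d pixels lies at depth f * B / d."""
    if disparity_px <= 0.0:
        raise ValueError("feature must appear in both images with positive disparity")
    return focal_px * baseline_cm / disparity_px

# With a 1000 px focal length and a 5 cm baseline, a 50 px disparity
# places the matched feature at a 100 cm reference depth.
reference_depth_cm = stereo_depth(1000.0, 5.0, 50.0)
```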
In some embodiments, when both the depth acquisition apparatus 100 and the depth acquisition apparatus 200 include infrared cameras, the depth acquisition apparatus 200 and the depth acquisition apparatus 100 may share the same infrared camera (e.g., fig. 4, the first imaging device 210 and the image collector 20 are the same infrared camera; similarly, the second imaging device 230 and the image collector 20 may also be the same infrared camera). Of course, the depth acquisition device 200 and the depth acquisition device 100 may also use different infrared cameras, and are not limited in this respect.
Referring to fig. 5, in some embodiments, the depth acquisition device 200 is a TOF depth acquisition device. Specifically, the depth acquisition device 200 includes a light emitter 250 and a light receiver 270. The processor 300 is configured to control the optical transmitter 250 to transmit an optical signal to the target object 2000, control the optical receiver 270 to receive an optical signal reflected by the target object 2000, and obtain the reference depth h1 according to a transmission time of the optical transmitter 250 and a time when the optical receiver 270 receives the reflected optical signal.
Specifically, the optical transmitter 250 may be an infrared light transmitter and the optical receiver 270 may be an infrared light receiver (e.g., an infrared camera). Obtaining the reference depth h1 from the emission time of the optical transmitter 250 and the time at which the optical receiver 270 receives the reflected optical signal amounts to obtaining the reference depth h1 from the flight time of the infrared light. The flight time may be calculated directly from the emission and reception times of the optical signal, or from the phase difference between the electrical signal formed by the optical receiver 270 on receiving the optical signal and a reference electrical signal; the reference depth h1 is then calculated from the flight time and the propagation speed of the infrared light.
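Both flight-time variants follow from depth = c · t / 2 (the light travels out and back). A hedged sketch; the function names, nanosecond units, and the modulation-frequency parameter of the phase variant are assumptions for illustration, not values taken from the patent:

```python
import math

C_CM_PER_NS = 29.9792458  # speed of light in cm per nanosecond

def tof_depth(emit_time_ns: float, receive_time_ns: float) -> float:
    """Direct time of flight: the one-way depth is half the round-trip
    distance covered between emission and reception."""
    round_trip_ns = receive_time_ns - emit_time_ns
    return C_CM_PER_NS * round_trip_ns / 2.0

def tof_depth_from_phase(phase_rad: float, modulation_hz: float) -> float:
    """Indirect time of flight: the phase difference between the received
    signal and the reference electrical signal maps to the round-trip time
    t = phase / (2 * pi * f); the depth then follows as c * t / 2."""
    round_trip_s = phase_rad / (2.0 * math.pi * modulation_hz)
    return (C_CM_PER_NS * 1e9) * round_trip_s / 2.0  # result in cm

# A 70 cm target returns the pulse after a ~4.67 ns round trip.
depth_cm = tof_depth(0.0, 140.0 / C_CM_PER_NS)
```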
In some embodiments, when both the depth acquisition device 100 and the depth acquisition device 200 include infrared cameras, the two devices may share the same infrared camera (e.g., in fig. 5, the optical receiver 270 and the image collector 20 are the same infrared camera). Of course, the depth acquisition device 200 and the depth acquisition device 100 may also use different infrared cameras, and this is not limited here.
Referring to fig. 6, in some embodiments, the depth acquisition device 200 is a single-camera depth acquisition device. Specifically, the depth acquisition device 200 includes an image acquirer 290. The processor 300 is configured to control the image acquirer 290 to acquire a third planar image containing the target object 2000, and to process the third planar image to obtain the reference depth h1. The image acquirer 290 may be a visible light camera, an infrared camera, or the like. When the image acquirer 290 is an infrared camera, the image acquirer 290 and the image collector 20 may be the same infrared camera or two different infrared cameras.
In some embodiments, the target object 2000 may be a preset model, which may be planar or three-dimensional, such as a dummy face model, a sphere, or a cube. In some embodiments, the preset model includes a plurality of features, one or more of which may be selected as target features. The relative positional relationships among the features are known, so the positions of the other features relative to the target feature do not vary with individual differences. The choice of target feature is arbitrary; for example, the most protruding structure on the preset model may be selected as the target feature. Taking a dummy face model as an example, the nose is selected as the target feature, and the positions of features such as the eyes, mouth, and eyebrows relative to the nose are determined before detection. Of course, the target feature may also be the pixels of a certain region on the preset model; taking a dummy face model as an example, the pixels of the forehead region are selected as the target feature. After the depth of the target feature is determined, the depths of the other features can be determined from their relative positions to the target feature. Note that the reference depth h1 and the test depth h2 of the target object 2000 may refer to the depth of the entire target object 2000, or to the depth of one or more features (e.g., target features) of the target object 2000.
It is understood that in other embodiments, the target object 2000 may be any other object besides the preset model, and is not limited in detail herein.
With continued reference to fig. 6, in some embodiments, processing the third plane image to obtain the reference depth h1 may be: acquiring the proportion occupied by the area of the target object 2000 in the third plane image, and acquiring the reference depth h1 of the target object 2000 from that proportion. That is, the processor 300 is configured to acquire the proportion occupied by the area of the target object 2000 in the third plane image, and to acquire the reference depth h1 of the target object 2000 according to the proportion.
Specifically, referring to fig. 7, on the third plane image acquired by the image acquirer 290, the proportion W of the area of the target object 2000 is inversely related to the reference depth h1: as the reference depth h1 increases, that is, as the distance S between the target object 2000 and the electronic apparatus 1000 increases, the proportion W of the area of the target object 2000 on the third plane image decreases. For example, when the distance between the target object 2000 and the electronic apparatus 1000 is S1 = 20 cm, the area proportion W1 is 80%; when the distance is S2 = 40 cm, the proportion W2 is 60%; and when the distance is S3 = 60 cm, the proportion W3 is 40%. The reference depth h1 can therefore be obtained by calculating the proportion W of the area of the target object 2000 in the third plane image. In the embodiment of the present invention, the correspondence between the distance and the proportion W may be stored in advance on the electronic device 1000; by looking up the distance corresponding to the measured proportion W, the reference depth h1 of the target object 2000 can be acquired quickly. Of course, the reference depth h1 of a target feature may likewise be obtained from the proportion of the area of that target feature in the third plane image.
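A pre-stored correspondence table can be turned into a reference depth by lookup. The sketch below uses the three example (distance, proportion) pairs from this paragraph; the linear interpolation between entries is an assumption made here for illustration, since the patent only requires that a correspondence be stored in advance:

```python
# Pre-stored correspondence between distance S (cm) and the proportion W of
# the third plane image occupied by the target object -- the example values
# from the paragraph above.
DISTANCE_TO_RATIO = [(20.0, 0.80), (40.0, 0.60), (60.0, 0.40)]

def reference_depth_from_ratio(ratio: float) -> float:
    """Look up the reference depth h1 for a measured area proportion W,
    linearly interpolating between stored entries (an assumption; the
    patent only requires a pre-stored correspondence)."""
    table = sorted(DISTANCE_TO_RATIO, key=lambda entry: entry[1])  # ascending W
    s_prev, w_prev = table[0]
    for s_next, w_next in table[1:]:
        if w_prev <= ratio <= w_next:
            t = (ratio - w_prev) / (w_next - w_prev)
            # A larger proportion means a closer (smaller) distance.
            return s_prev + t * (s_next - s_prev)
        s_prev, w_prev = s_next, w_next
    raise ValueError("area proportion outside the calibrated range")

# 60% of the frame maps back to the stored 40 cm entry; 50% falls halfway
# between the 40 cm and 60 cm entries.
```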
In some embodiments, the depth acquisition device 200 may be a structured light depth acquisition device different from the depth acquisition device 100; when the depth acquisition device 200 is correctly mounted on the electronic device 1000, the depth it acquires may be used as the reference depth h1.
In some embodiments, the reference depth h1 may also be determined from user input. Specifically, the user may place the depth acquisition device 100 at a preset position, for example 60 cm from the target object 2000, and then input that distance as the reference depth h1.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above are not necessarily intended to refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present invention, "a plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present invention in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present invention.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a processor-containing system, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, for instance via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present invention may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: discrete logic circuits having logic gates for implementing logic functions on data signals, application-specific integrated circuits having appropriate combinational logic gates, programmable gate arrays (PGA), field-programmable gate arrays (FPGA), and the like.
It will be understood by those skilled in the art that all or part of the steps of the above method embodiments may be implemented by a program instructing related hardware. The program may be stored in a computer-readable storage medium and, when executed, performs one or a combination of the steps of the method embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing module, or each unit may exist alone physically, or two or more units may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer-readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic disk, an optical disc, or the like. Although embodiments of the present invention have been shown and described above, it is to be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art may make variations, modifications, substitutions, and alterations to the above embodiments within the scope of the present invention.

Claims (7)

1. An electronic device, characterized in that the electronic device comprises a depth acquisition device, the depth acquisition device comprises a laser projection module and an image collector, and the laser projection module comprises:
a laser projector for projecting laser light toward a target object to form a laser pattern;
a memory integrated with the laser projector and configured to store calibration data of the laser projector, the calibration data and the laser pattern being used together to acquire depth information;
wherein the image collector is configured to receive the laser light modulated by the target object to form the laser pattern;
the electronic device further comprises a processor, and the processor is configured to:
controlling the depth acquisition device to acquire a reference depth of the target object;
controlling the depth acquisition device to acquire a test depth of the target object;
determining whether the deviation of the test depth from the reference depth is greater than a preset deviation threshold;
and when the deviation of the test depth from the reference depth is greater than the deviation threshold, determining that the depth acquisition device is not correctly installed on the electronic device.
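The installation check recited in claim 1 reduces to a single threshold comparison between the two depth measurements. A minimal sketch (the function name, units, and return convention are illustrative and not specified by the patent):

```python
def depth_device_installed_correctly(test_depth, reference_depth, threshold):
    """Return True when the deviation between the test depth and the
    reference depth is within the preset threshold; False indicates the
    depth acquisition device is not correctly installed.
    All arguments share the same distance unit."""
    deviation = abs(test_depth - reference_depth)
    return deviation <= threshold
```

For example, with a 5 cm threshold, a 2 cm deviation passes and a 20 cm deviation flags an incorrect installation.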
2. The electronic device of claim 1, wherein the memory is disposed within the laser projector, or,
the memory is disposed outside of the laser projector and the memory is packaged with the laser projector.
3. The electronic device of claim 1 or 2, wherein the processor is configured to:
controlling the laser projector to project the laser light toward the target object;
controlling the image collector to obtain the laser pattern modulated by the target object;
and acquiring the test depth according to the laser pattern and the calibration data.
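Claim 3 describes the structured-light pipeline: the captured laser pattern is compared against the stored calibration data, and the test depth follows from the disparity between the captured pattern and the calibrated reference pattern. The patent does not give the formula; the sketch below uses the common structured-light triangulation relation, with the focal length `f_px`, projector-camera baseline `baseline_m`, and reference-plane distance `z_ref_m` assumed to come from the stored calibration data:

```python
def depth_from_disparity(disparity_px, f_px, baseline_m, z_ref_m):
    """Structured-light depth from the pixel disparity between the captured
    laser pattern and the calibrated reference pattern recorded at
    distance z_ref_m. Zero disparity recovers the reference distance;
    positive disparity corresponds to a closer object."""
    return (baseline_m * f_px * z_ref_m) / (baseline_m * f_px + z_ref_m * disparity_px)
```

With `f_px = 500`, `baseline_m = 0.05`, and a reference plane at 1 m, zero disparity returns exactly 1 m, and increasing disparity returns smaller depths.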
4. The electronic device of claim 1 or 2, wherein the depth acquisition device comprises a first imaging device and a second imaging device, and wherein the processor is configured to:
controlling the first imaging device to acquire a first plane image of the target object;
controlling the second imaging device to acquire a second plane image of the target object;
and acquiring the reference depth according to the first plane image and the second plane image.
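Claim 4 acquires the reference depth from two plane images, i.e., by stereo triangulation. A hedged sketch of the classic relation Z = f · B / d, assuming rectified images and a matched pixel pair (parameter names are illustrative):

```python
def stereo_depth(f_px, baseline_m, disparity_px):
    """Reference depth from two plane images of the same scene point:
    focal length f_px (pixels), baseline between the two imaging
    devices (metres), and horizontal disparity (pixels)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return f_px * baseline_m / disparity_px
```

For instance, a 700 px focal length, 10 cm baseline, and 35 px disparity give a depth of 2 m.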
5. The electronic device of claim 4, wherein the first imaging device is a visible light camera or an infrared camera; the second imaging device is a visible light camera or an infrared camera.
6. The electronic device of claim 1 or 2, wherein the depth acquisition device comprises a light emitter and a light receiver, and the processor is configured to:
controlling the light emitter to emit a light signal toward the target object;
controlling the light receiver to receive the light signal reflected by the target object;
and acquiring the reference depth according to the emission time of the light emitter and the time at which the light receiver receives the reflected light signal.
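Claim 6 is a time-of-flight measurement: the reference depth is half the round-trip distance the light signal travels between emission and reception. A minimal sketch (timestamps in seconds are an assumed interface):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_depth(t_emit_s, t_receive_s):
    """Time-of-flight reference depth: the signal covers the
    emitter-to-target-to-receiver path, so depth is half the
    round-trip distance."""
    return SPEED_OF_LIGHT * (t_receive_s - t_emit_s) / 2.0
```

A round-trip time of 20 ns corresponds to a depth of roughly 3 m.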
7. The electronic device of claim 1 or 2, wherein the depth acquisition device further comprises an image acquirer, and the processor is configured to:
controlling the image acquirer to acquire a third planar image including the target object;
acquiring the proportion of the area of the target object in the third plane image;
and acquiring the reference depth of the target object according to the proportion.
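Claim 7 infers the reference depth from how large the target object appears in the third plane image. The claim does not state the mapping from area proportion to depth; under a pinhole camera model the imaged area of a fixed-size object scales with 1/Z², so one calibration pair (a known proportion at a known depth, an assumed input not recited in the claim) suffices:

```python
import math

def depth_from_area_ratio(ratio, ratio_cal, depth_cal):
    """Depth of the target from the proportion of image area it occupies.
    Under a pinhole model, imaged area scales with 1/Z**2, so
    Z = depth_cal * sqrt(ratio_cal / ratio), where (ratio_cal, depth_cal)
    is a calibration measurement of the same object."""
    return depth_cal * math.sqrt(ratio_cal / ratio)
```

For example, if the object filled the frame (ratio 1.0) at 1 m, occupying a quarter of the frame implies a depth of 2 m.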
CN201810828544.5A 2018-07-25 2018-07-25 Laser projection module, depth acquisition device and electronic equipment Active CN108957914B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201810828544.5A CN108957914B (en) 2018-07-25 2018-07-25 Laser projection module, depth acquisition device and electronic equipment
PCT/CN2019/070851 WO2020019682A1 (en) 2018-07-25 2019-01-08 Laser projection module, depth acquisition apparatus and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810828544.5A CN108957914B (en) 2018-07-25 2018-07-25 Laser projection module, depth acquisition device and electronic equipment

Publications (2)

Publication Number Publication Date
CN108957914A CN108957914A (en) 2018-12-07
CN108957914B true CN108957914B (en) 2020-05-15

Family

ID=64464920

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810828544.5A Active CN108957914B (en) 2018-07-25 2018-07-25 Laser projection module, depth acquisition device and electronic equipment

Country Status (2)

Country Link
CN (1) CN108957914B (en)
WO (1) WO2020019682A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108957914B (en) * 2018-07-25 2020-05-15 Oppo广东移动通信有限公司 Laser projection module, depth acquisition device and electronic equipment
CN114034246B (en) * 2021-11-11 2023-10-13 易思维(杭州)科技有限公司 Calibration system and method for laser light plane

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09141473A (en) * 1995-11-22 1997-06-03 Shibuya Kogyo Co Ltd Laser beam processing machine of head section exchange type
CN2359713Y (en) * 1998-06-08 2000-01-19 北京大学 Relief sculpture and carving three-D laser scanning instrument
AU2002356548A1 (en) * 2001-10-09 2003-04-22 Dimensional Photonics, Inc. Device for imaging a three-dimensional object
JP4504031B2 (en) * 2004-01-22 2010-07-14 オリンパス株式会社 Camera system
EP2568253B1 (en) * 2010-05-07 2021-03-10 Shenzhen Taishan Online Technology Co., Ltd. Structured-light measuring method and system
JP6162681B2 (en) * 2011-03-31 2017-07-12 エーティーエス オートメーション ツーリング システムズ インコーポレイテッドAts Automation Tooling Systems Inc. Three-dimensional light detection through optical media
TWI503618B (en) * 2012-12-27 2015-10-11 Ind Tech Res Inst Device for acquiring depth image, calibrating method and measuring method therefore
CN203492137U (en) * 2013-09-10 2014-03-19 中国船舶重工集团公司第七一九研究所 Multifunctional photographing and evidence-taking device capable of realizing target positioning
CN103760025B (en) * 2014-02-10 2016-10-05 深圳三思纵横科技股份有限公司 extensometer and measuring method thereof
CN104918031B (en) * 2014-03-10 2018-08-07 联想(北京)有限公司 depth recovery device and method
CN205218309U (en) * 2015-12-29 2016-05-11 同高先进制造科技(太仓)有限公司 Laser head quick change device
CN108227361B (en) * 2018-03-12 2020-05-26 Oppo广东移动通信有限公司 Control method, control device, depth camera and electronic device
CN108957914B (en) * 2018-07-25 2020-05-15 Oppo广东移动通信有限公司 Laser projection module, depth acquisition device and electronic equipment

Also Published As

Publication number Publication date
CN108957914A (en) 2018-12-07
WO2020019682A1 (en) 2020-01-30

Similar Documents

Publication Publication Date Title
US20200162655A1 (en) Exposure control method and device, and unmanned aerial vehicle
KR102660109B1 (en) Method and apparatus for determining depth map for image
US9807371B2 (en) Depth perceptive trinocular camera system
CN109831660B (en) Depth image acquisition method, depth image acquisition module and electronic equipment
RU2769303C2 (en) Equipment and method for formation of scene representation
EP3191888B1 (en) Scanning laser planarity detection
CN109557669B (en) Method for determining image drift amount of head-mounted display equipment and head-mounted display equipment
US8488872B2 (en) Stereo image processing apparatus, stereo image processing method and program
CN112422939A (en) Trapezoidal correction method and device for projection equipment, projection equipment and medium
US11644570B2 (en) Depth information acquisition system and method, camera module, and electronic device
KR20150080863A (en) Apparatus and method for providing heatmap
CN110400342B (en) Parameter adjusting method and device of depth sensor and electronic equipment
CN108957914B (en) Laser projection module, depth acquisition device and electronic equipment
US20170292827A1 (en) Coordinate measuring system
US20160277728A1 (en) Method and apparatus for calibrating a dynamic auto stereoscopic 3d screen device
US12067741B2 (en) Systems and methods of measuring an object in a scene of a captured image
CN112771575A (en) Distance determination method, movable platform and computer readable storage medium
US20190156505A1 (en) Video processing technique for 3d target location identification
CN110072044B (en) Depth camera control method and device, terminal and readable storage medium
CN108760059B (en) Detection method, detection device and detection system of laser projector
JP7504688B2 (en) Image processing device, image processing method and program
CN108931202B (en) Detection method and apparatus, electronic apparatus, computer device, and readable storage medium
JP6548076B2 (en) Pattern image projection apparatus, parallax information generation apparatus, pattern image generation program
CN113936316B (en) DOE (DOE-out-of-state) detection method, electronic device and computer-readable storage medium
CN108833884B (en) Depth calibration method and device, terminal, readable storage medium and computer equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant