CN110196023B - Dual-zoom structured light depth camera and zooming method - Google Patents


Info

Publication number
CN110196023B
CN110196023B (application CN201910277754.4A)
Authority
CN
China
Prior art keywords: zoom, lens group, mirror, structured light, depth camera
Prior art date
Legal status
Active
Application number
CN201910277754.4A
Other languages
Chinese (zh)
Other versions
CN110196023A (en)
Inventor
刘龙
许星
Current Assignee
Orbbec Inc
Original Assignee
Orbbec Inc
Priority date
Filing date
Publication date
Application filed by Orbbec Inc filed Critical Orbbec Inc
Priority to CN201910277754.4A
Publication of CN110196023A
Application granted
Publication of CN110196023B
Legal status: Active


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/22 Measuring arrangements characterised by the use of optical techniques for measuring depth
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04B TRANSMISSION
    • H04B10/00 Transmission systems employing electromagnetic waves other than radio-waves, e.g. infrared, visible or ultraviolet light, or employing corpuscular radiation, e.g. quantum communication
    • H04B10/40 Transceivers
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20 Image signal generators
    • H04N13/204 Image signal generators using stereoscopic image cameras
    • H04N13/271 Image signal generators wherein the generated image signals comprise depth maps or disparity maps
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50 Constructional details
    • H04N23/54 Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/55 Optical parts specially adapted for electronic image sensors; Mounting thereof

Abstract

The invention provides a dual-zoom structured light depth camera and a zooming method. The dual-zoom structured light depth camera comprises: an emitting end, comprising a light source, a zoom projection lens group and a DOE, for projecting a structured light beam toward a person or object in space; a receiving end, comprising a zoom imaging lens group and an image sensor, for receiving the structured light modulated and reflected by the person or object and generating a structured light image; and a processor for receiving the structured light image from the receiving end and calculating a depth image of the person or object. The depth camera further comprises an actuator for driving the zoom projection lens group and the zoom imaging lens group to realize optical zooming. The emitting end and the receiving end further comprise a first reflecting mirror and a second reflecting mirror, respectively, which reflect the light beam to change its propagation direction. The dual-zoom structured light depth camera and the zooming method give the depth camera a more flexible and variable depth of field, thereby enabling depth measurement over a larger range.

Description

Dual-zoom structured light depth camera and zooming method
Technical Field
The invention relates to the technical field of optics, in particular to a dual-zoom structured light depth camera and a zooming method.
Background
A structured light depth camera projects a structured light pattern into a target space, collects the pattern reflected by objects, and computes a depth image from it. Based on the depth image, functions such as 3D modeling, face recognition and gesture interaction can be realized. Owing to advantages such as high resolution, high precision and low power consumption, structured light depth cameras are widely used in smart devices such as mobile phones, computers, robots and AR/VR equipment.
However, current structured light depth cameras face a significant challenge: the measurement range is limited. For example, short-range structured light depth cameras used in products such as mobile phones measure roughly 0.2-1 m, and medium-range cameras used in products such as robots measure roughly 0.6-5 m. The measurement range is limited mainly because, on the one hand, the projection distance of the structured light is limited and, on the other hand, the depth of field of the acquisition camera is limited. The invention provides a zoom depth camera that aims to solve the problem of the limited measurement distance of depth cameras.
Disclosure of Invention
The invention aims to solve the problem that the measuring distance of a depth camera in the prior art is limited, and provides a dual-zoom structured light depth camera and a zooming method.
The dual-zoom structured light depth camera of the present invention comprises: an emitting end, comprising a light source, a zoom projection lens group and a DOE, for projecting a structured light beam toward a person or object in space; the light source is used for emitting light beams; the zoom projection lens group is used for converging the light beams emitted by the light source onto the DOE; the DOE is configured to diffract, split or replicate the received light beam so as to project outward a structured light beam composed of multiple sub-beams; a receiving end, comprising a zoom imaging lens group and an image sensor, for receiving the structured light modulated and reflected by the person or object and generating a structured light image; the zoom imaging lens group images the received structured light beam on the image sensor to generate the structured light image; and a processor for receiving the structured light image from the receiving end and calculating the depth image of the person or object. The depth camera further comprises an actuator for driving the zoom projection lens group and the zoom imaging lens group to realize optical zooming. The emitting end and the receiving end further comprise a first reflecting mirror and a second reflecting mirror, respectively, which reflect the light beam to change its propagation direction.
In a preferred embodiment, the zoom projection lens group and the zoom imaging lens group each include two or more lenses, and the spacing between the lenses can be adjusted by the actuator so that both the zoom projection lens group as a whole and the zoom imaging lens group as a whole can achieve optical zooming. More preferably, the zoom projection lens group as a whole and/or the zoom imaging lens group as a whole can be moved by the actuator to achieve focusing.
In a preferred embodiment, the first mirror is used to reflect the structured light beam projected by the DOE and project it toward the person or object in space; the second mirror is used to receive the structured light beam that was projected by the emitting end and then modulated and reflected by the person or object, and to reflect it onto the zoom imaging lens group; the zoom imaging lens group receives the structured light beam reflected by the second mirror and images it on the image sensor to generate a structured light image.
In a preferred embodiment, the emitting end further includes a third mirror disposed between the light source and the zoom projection lens group for reflecting the light beam to change a propagation direction of the light beam; and/or the receiving end further comprises a fourth reflecting mirror, wherein the fourth reflecting mirror is arranged between the image sensor and the zoom imaging lens group and is used for reflecting the light beam to change the propagation direction of the light beam. More preferably, the first mirror and/or the third mirror of the transmitting end are convex mirrors for increasing the field angle of the transmitting end, and the second mirror and/or the fourth mirror of the receiving end are plane mirrors or concave mirrors; or the second reflecting mirror and/or the fourth reflecting mirror of the receiving end are/is concave reflecting mirrors for reducing the field angle of the receiving end and compensating the distortion of the zooming imaging lens group, and the first reflecting mirror and/or the third reflecting mirror of the transmitting end are/is plane reflecting mirrors or convex reflecting mirrors.
In a preferred embodiment, the first mirror and the second mirror are arranged between the light source and the image sensor; alternatively, the light source and the image sensor are disposed between the first mirror and the second mirror. In a preferred embodiment, the baseline length of the depth camera is adjustable by the actuator. In a preferred embodiment, the dual-zoom structured light depth camera of the present invention further comprises a controller for controlling the adjustment of the focal lengths of the zoom projection lens group and the zoom imaging lens group by the actuator; the focal lengths of the transmitting end and the receiving end conform to a specific constraint relationship, so that the structured light beam emitted by the transmitting end achieves high-quality imaging at the receiving end.
In a preferred embodiment, the first mirror is further used to split the zoom projection lens group into two lenses or two lens groups, and the first mirror is arranged at the pupil position of the zoom projection lens group so as to reduce the area of the first mirror; and/or the second mirror is further used to split the zoom imaging lens group into two lenses or two lens groups, and the second mirror is arranged at the pupil position of the zoom imaging lens group so as to reduce the area of the second mirror.
In a preferred embodiment, the light source is an array light source comprising two or more sub-array light sources; the sub-arrays contain unequal numbers of emitters and each can be controlled individually, enabling the emitting end to generate structured light beams of different densities.
The invention also provides a zooming method for the dual-zoom structured light depth camera, comprising the following steps: S1: preset at least two focal length modes and store them in a memory; S2: upon receiving a zooming instruction, the processor reads the constraint relationship of the corresponding focal length mode from the memory and converts it into control instructions to control the transmitting end and/or the receiving end to zoom; S3: in the current focal length mode, the processor controls the transmitting end to project the structured light beam and the receiving end to collect the reflected structured light so as to form a structured light image, and calculates a depth image based on the structured light image.
Compared with the prior art, the invention has the beneficial effects that:
according to the dual-zoom structured light depth camera and the zoom method, the focal lengths of the transmitting end and the receiving end can be adjusted, so that the depth camera has a more flexible and variable depth of field, and a larger range of depth measurement is realized.
Drawings
For a clearer illustration of the embodiments of the invention or of the technical solutions in the prior art, the drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the invention, and a person skilled in the art can obtain drawings of other embodiments from these drawings without inventive effort.
FIG. 1 is a schematic diagram of a dual-zoom structured light depth camera according to an embodiment of the present invention.

FIG. 2 is a schematic diagram of a dual-zoom structured light depth camera according to another embodiment of the present invention.

FIG. 3 is a schematic diagram of a dual-zoom structured light depth camera according to yet another embodiment of the present invention.
Fig. 4 is a schematic structural view of a transmitting end in an embodiment of the present invention.
FIG. 5 is a flow chart of the steps of a zooming method for a dual-zoom structured light depth camera according to one embodiment of the present invention.
Detailed Description
The invention will be described in further detail with reference to the following detailed description and with reference to the accompanying drawings. Wherein like reference numerals refer to like parts throughout unless otherwise specified. It should be emphasized that the following description is merely exemplary in nature and is in no way intended to limit the scope of the invention or its applications.
The dual-zoom structured light depth camera of the present invention comprises: an emitting end for projecting a structured light beam toward a person or object in space; a receiving end for receiving the structured light modulated and reflected by the person or object and generating a structured light image; and a processor for receiving the structured light image from the receiving end and calculating the depth image of the person or object. The emitting end and the receiving end comprise a zoom projection lens group and a zoom imaging lens group, respectively, and the depth camera also comprises an actuator for driving the zoom projection lens group and the zoom imaging lens group to realize optical zooming. The emitting end and the receiving end further comprise a first reflecting mirror and a second reflecting mirror, respectively, which reflect the light beam to change its propagation direction.
FIG. 1 is a schematic diagram of a dual-zoom structured light depth camera according to one embodiment of the present invention. The dual-zoom structured light depth camera 10 comprises an emitting end 112, a receiving end 113 and a processor (not shown). The emitting end 112 projects a structured light beam toward an object 111 in space, for example an infrared speckle-pattern beam; other wavelengths or other kinds of structured light beams are also possible. The structured light beam is modulated and reflected by the object 111 and is then received by the receiving end 113, which generates a structured light image. The processor receives the structured light image from the receiving end 113 and calculates a depth image of the object using a structured light imaging algorithm: for example, a matching algorithm first obtains the pixel deviation between the structured light image and a pre-stored reference structured light image, the pixel deviation is then converted into depth values by triangulation, and finally the depth image of the object 111 in space is obtained.
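By way of non-limiting illustration, the disparity-to-depth conversion described above can be sketched as follows. The focal length (in pixels), baseline and reference-plane distance used here are illustrative assumptions, not parameters disclosed for the camera 10; the sketch only shows the triangulation step that follows block matching.

```python
import numpy as np

def depth_from_disparity(disparity_px, focal_px, baseline_m, z_ref_m):
    """Triangulation against a reference pattern recorded at distance z_ref_m.

    Using d = f * b * (1/z - 1/z_ref), depth follows as
    z = f * b * z_ref / (f * b + d * z_ref).
    All parameter values below are assumptions for illustration only.
    """
    fb = focal_px * baseline_m
    return fb * z_ref_m / (fb + disparity_px * z_ref_m)

# Example: disparities of 0, 1.2 and 5 pixels with an assumed 580 px focal length,
# 50 mm baseline and a reference plane at 1 m.
print(depth_from_disparity(np.array([0.0, 1.2, 5.0]), 580.0, 0.05, 1.0))
# 0 px maps back to the reference distance; larger disparity maps to a closer object.
```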
The emitting end 112 includes a light source 101, a zoom projection lens group 102, a diffractive optical element (DOE) 104 and a first mirror 105. The light beam emitted from the light source 101 is converged by the zoom projection lens group 102 and is incident on the DOE 104, which diffracts (replicates) the received light beam to project a structured light beam outward (such as a speckle-pattern beam composed of multiple sub-beams); the first mirror 105 then reflects the structured light beam and projects it toward the object in space.
The light source 101 may be a single light source or an array of multiple light sources, for example a single edge-emitting laser, or an array of multiple vertical-cavity surface-emitting lasers (VCSELs).
The zoom projection lens group 102 is composed of a plurality of (two or more) lenses together with a first actuator 103 (represented by arrows in the figure for convenience). The positions of the lenses are changed by the first actuator; for example, changing the lens spacing along the beam propagation direction changes the focal length of the zoom projection lens group as a whole, realizing optical zooming. In one embodiment, the first actuator may also move the lens group as a whole to achieve focusing. The actuator may be of the microelectromechanical system (MEMS) type, an electromagnetically driven actuator, an electromechanical actuator, or the like. The first actuator may include a plurality of sub-actuators to control different lenses separately.
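The effect of changing the lens spacing can be illustrated with the standard two-thin-lens formula, 1/f = 1/f1 + 1/f2 - d/(f1*f2); the focal lengths and spacings below are illustrative assumptions rather than design values of the zoom projection lens group 102.

```python
def combined_focal_length(f1_mm: float, f2_mm: float, spacing_mm: float) -> float:
    """Effective focal length of two thin lenses separated by spacing_mm:
    1/f = 1/f1 + 1/f2 - d/(f1*f2)."""
    return 1.0 / (1.0 / f1_mm + 1.0 / f2_mm - spacing_mm / (f1_mm * f2_mm))

# Moving the lenses apart (what the first actuator 103 does) changes the group focal length.
for spacing in (2.0, 10.0, 20.0):           # assumed spacings in mm
    print(spacing, round(combined_focal_length(10.0, 20.0, spacing), 2))
# 2 mm -> ~7.14 mm, 10 mm -> 10 mm, 20 mm -> 20 mm
```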
When the focal length of the zoom projection lens group 102 is changed, the convergence of the light beam emitted by the light source 101 also changes; for example, increasing or decreasing the focal length changes the focusing distance and depth of field of the beam so as to adapt to different measuring ranges.
The receiving end 113 includes an image sensor 109, a zoom imaging lens group 107 and a second mirror 106. The structured light beam projected by the emitting end 112 is modulated and reflected (diffusely reflected) by the object 111, then received by the second mirror 106 and reflected onto the zoom imaging lens group 107, which images the incident beam on the image sensor 109 to generate a structured light image. In an embodiment, the receiving end 113 further includes an optical filter; for example, when the light source 101 emits infrared light, an infrared filter may be disposed between the second mirror 106 and the zoom imaging lens group 107, or between the image sensor 109 and the zoom imaging lens group 107. The infrared filter has a very high transmittance for infrared light of the corresponding wavelength, which improves imaging quality. The image sensor 109 may be a CMOS, CCD or similar image sensor comprising a plurality of imaging pixels for imaging a field-of-view region of the space.
The zoom imaging lens group 107 is composed of a plurality of (two or more) lenses together with a second actuator 108 (represented by arrows in the figure for convenience). The positions of the lenses are changed by the second actuator; for example, changing the lens spacing along the beam propagation direction changes the focal length of the zoom imaging lens group as a whole, realizing optical zooming. In one embodiment, the second actuator may also move the lens group as a whole to achieve focusing. The actuator may be of the microelectromechanical system (MEMS) type, an electromagnetically driven actuator, an electromechanical actuator, or the like. The second actuator may comprise a plurality of sub-actuators to control different lenses separately.
When the focal length of the zoom imaging lens group 107 changes, the in-focus object distance and the depth of field change accordingly, so objects at different distances can be imaged by changing the focal length of the zoom imaging lens group 107; as a result, the dual-zoom structured light depth camera 10 can measure objects over different measuring ranges.
The first mirror 105 and the second mirror 106 may be separate optical devices or may be integrated into a single optical device, such as a prism, and in fig. 1, two surfaces of the prism are coated with a reflective film to serve as the first mirror 105 and the second mirror 106, respectively.
In one embodiment, the dual-zoom structured light depth camera 10 further comprises a third actuator 110 for adjusting the distance between the first mirror 105 and the second mirror 106. Generally, for a structured light depth camera, the distance between the transmitting end and the receiving end (i.e., the baseline) affects the imaging accuracy: for short-range measurement the baseline is set smaller, while for long-range measurement the baseline is set larger to improve accuracy. In the present invention, the baseline is the distance between the first mirror 105 and the second mirror 106 (more precisely, the distance between the intersection of the extended optical axis of the zoom projection lens group 102 with the first mirror 105 and the intersection of the extended optical axis of the zoom imaging lens group 107 with the second mirror 106). Therefore, when the focal length of the transmitting end 112 and/or the receiving end 113 is changed to measure objects at different distances, it is preferable to adjust the baseline length at the same time, so that a depth image of higher accuracy can be obtained.
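The benefit of pairing a longer baseline with long-range measurement can be seen from the usual triangulation error model, in which the depth uncertainty grows roughly as sigma_z ≈ z² · sigma_d / (f · b) for a disparity uncertainty of sigma_d pixels. The numbers in the sketch below are illustrative assumptions.

```python
def depth_uncertainty_m(z_m: float, focal_px: float, baseline_m: float,
                        sigma_disp_px: float = 0.1) -> float:
    """Approximate triangulation depth error: sigma_z ~ z^2 * sigma_d / (f * b)."""
    return (z_m ** 2) * sigma_disp_px / (focal_px * baseline_m)

# Assumed 580 px focal length; compare a short and a long baseline at several distances.
for baseline in (0.03, 0.075):
    errors = [round(depth_uncertainty_m(z, 580.0, baseline), 4) for z in (0.5, 2.0, 5.0)]
    print(f"baseline {baseline} m -> depth error at 0.5/2/5 m: {errors}")
```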
It will be appreciated that the actuator typically operates under the control of a driver, which in turn is controlled by the processor. Since the processor uses parameters such as the baseline length and the focal length when performing the depth calculation, adjustments of the baseline length, focal length and the like should be performed under the control of the processor; in some embodiments, the current adjustment amount also needs to be fed back to the processor in real time to achieve more accurate control.
Generally, when depth measurement is performed, the focal lengths of the transmitting end 112 and the receiving end 113 should conform to a specific constraint relationship; under this constraint it is ensured that the structured light beam emitted by the transmitting end 112 can achieve high-quality imaging at the receiving end 113. For example, in one embodiment, the focal length of the transmitting end 112 and the focal length of the receiving end 113 remain equal during zooming; other constraint relationships are also possible. In addition, the focal length of the transmitting end 112 or the receiving end 113 and the baseline may also conform to a certain constraint relationship. The constraint relationship may be obtained through accurate calculation or through experience; the final purpose is to ensure that a high-precision depth image can be obtained provided the focal length and the baseline satisfy the constraint. In one embodiment, these constraint relationships are saved to memory, and when sending an adjustment instruction to the actuator the processor first retrieves them (or addresses a file that stores them) and translates them into corresponding control instructions to control the zoom and/or baseline adjustment.
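One possible organisation of such stored constraint relationships is a small focal-length-mode table that the processor looks up and converts into actuator set-points. The mode names, the equal-focal-length constraint and the baseline values below are assumptions used only to illustrate the idea.

```python
from dataclasses import dataclass

@dataclass
class FocalMode:
    f_tx_mm: float       # transmitting-end focal length
    rx_over_tx: float    # constraint: f_rx = rx_over_tx * f_tx (1.0 means equal focal lengths)
    baseline_mm: float   # baseline paired with this focal length

# Hypothetical mode table held in memory.
MODES = {
    "near": FocalMode(f_tx_mm=2.0, rx_over_tx=1.0, baseline_mm=30.0),
    "far":  FocalMode(f_tx_mm=4.5, rx_over_tx=1.0, baseline_mm=75.0),
}

def control_instruction(mode_name: str) -> dict:
    """Translate a focal-length mode into actuator set-points (illustrative only)."""
    m = MODES[mode_name]
    return {"tx_focal_mm": m.f_tx_mm,
            "rx_focal_mm": m.f_tx_mm * m.rx_over_tx,
            "baseline_mm": m.baseline_mm}

print(control_instruction("far"))
```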
The embodiment shown in fig. 1 has at least one of the following advantages over prior art fixed-focus structured light depth cameras:
the measuring range is large. The focal length can be adjusted, so that the depth camera has a more flexible and variable depth of field, and a larger range of depth measurement is realized.
The precision is high. Since the focal length and the baseline can be adjusted, the optimal baseline length can be selected for the corresponding focal length to obtain higher measurement accuracy.
The volume is small. A traditional fixed-focus structured light depth camera often requires a long-focus (telephoto) lens for long-range measurement, and the greater length of such a lens makes the depth camera thicker and bulkier, which makes it difficult to use in thin devices. In this embodiment, a folded optical path using mirrors is adopted, avoiding the problem of large thickness.
FIG. 2 is a schematic diagram of a dual-zoom structured light depth camera according to another embodiment of the present invention. Compared with the embodiment shown in fig. 1, the transmitting end 201 of the dual-zoom structured light depth camera 20 of this embodiment further includes a third mirror 203, and/or the receiving end 202 further includes a fourth mirror 204. The third mirror 203 is disposed between the light source and the zoom projection lens group, and the fourth mirror 204 is disposed between the image sensor and the zoom imaging lens group; each reflects the light beam to change its propagation direction. With the third mirror 203 and the fourth mirror 204, the light source and the image sensor can be arranged in more varied orientations.
Fig. 3 is a schematic diagram of a dual-zoom structured light depth camera according to a third embodiment of the present invention. Unlike the embodiments shown in figs. 1 and 2, the light source, the image sensor and so on are disposed between the first mirror 303 and the second mirror 304, rather than outside them as in figs. 1 and 2. The advantage of this arrangement is that, for structured light depth cameras with larger baselines, placing the light source, image sensor, zoom lens groups and so on between the mirrors can further reduce the volume of the depth camera; moreover, the adjustable baseline between the mirrors can reach a higher upper limit, and its lower limit may already reach the upper limit of the baseline adjustment in the embodiments shown in figs. 1 and 2. The solutions of figs. 1 and 2 are therefore more suitable for close-range measurement, while the solution of fig. 3 is relatively more suitable for long-range measurement. It should be noted that, as in the embodiments shown in figs. 1 and 2, in the embodiment shown in fig. 3 the zoom projection lens group in the transmitting end and the zoom imaging lens group in the receiving end can both be moved by an actuator to achieve optical zooming and/or focusing; in other embodiments based on fig. 3, similarly to the embodiments shown in figs. 1 and 2, the distance between the mirror at the transmitting end and the mirror at the receiving end (i.e., the baseline) can also be adjusted by an actuator to obtain a depth image of higher accuracy.
Fig. 4 is a schematic diagram of a transmitting end according to one embodiment of the invention. The transmitting end 40 includes a light source 401, a first lens 402, a mirror 404, a second lens 405 and a diffractive optical element (DOE) 406. The light beam emitted from the light source 401 is converged by the first lens 402 and then incident on the mirror 404; the light beam reflected by the mirror 404 enters the second lens 405, and is then diffracted and split by the DOE 406 to generate a structured light beam that is emitted outward.
The light source 401 is an array light source composed of a plurality of sub-light sources, such as a vertical-cavity surface-emitting laser (VCSEL) array chip. After the array light source emits a plurality of diverging beams, the first lens 402 receives and converges them into a single pupil 403 (the pupil here can be regarded as the position where the converging beam narrows to its minimum cross-sectional area), after which the beam diverges again. In this embodiment, the mirror 404 is disposed at the single pupil 403 (when the focal length of the first lens 402 is changed, the pupil position also changes, so the mirror 404 does not need to be placed strictly at the pupil position and a certain tolerance is allowed) in order to reflect the beam and change its propagation direction; the beam reflected by the mirror 404 is then incident on the second lens 405, where it is converged again (focused or collimated), and the re-converged beam is diffracted and split by the DOE to generate a structured light beam that is emitted outward.
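The remark that the pupil position shifts when the focal length of the first lens 402 changes can be illustrated with the thin-lens imaging equation 1/s_o + 1/s_i = 1/f; the source distance and focal lengths below are assumptions chosen only to show why the mirror 404 need only sit near, not exactly at, the pupil.

```python
def pupil_distance_mm(source_dist_mm: float, focal_mm: float) -> float:
    """Thin-lens estimate of where the beam converges behind the first lens:
    1/s_o + 1/s_i = 1/f  =>  s_i = f * s_o / (s_o - f)."""
    return focal_mm * source_dist_mm / (source_dist_mm - focal_mm)

# Assume the VCSEL array sits 3 mm in front of the first lens; zooming the lens
# from 2.0 mm to 2.5 mm focal length moves the convergence point noticeably,
# which is why a positioning tolerance for mirror 404 is acceptable.
for f in (2.0, 2.2, 2.5):
    print(f, round(pupil_distance_mm(3.0, f), 2))
```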
The transmitting end 40 further includes an actuator to adjust the first lens 402 and/or the second lens 405 to achieve zooming, and the first lens 402 and the second lens 405 may be a single lens or a lens group formed by a plurality of lenses. There are various ways of achieving zooming, such as zooming only the first lens 402 or the second lens 405, or both, or adjusting the distance between the two lenses to achieve zooming, etc.
It will be appreciated that, since the cross-sectional area of the beam is smallest at the pupil, placing the mirror 404 at the pupil 403 allows the mirror area to be reduced substantially, which lowers cost and manufacturing difficulty. It should be noted that, compared with the embodiments shown in figs. 1-3, in the embodiment shown in fig. 4 the first mirror of the transmitting end (i.e., the mirror 404 in fig. 4) is used not only to reflect the beam and change its propagation direction, but also to split the zoom projection lens group into two lenses or two lens groups, which broadens the ways in which the transmitting end can realize zooming and reduces the area of the first mirror. This improvement of the transmitting end is equally applicable to the receiving end: in other embodiments the receiving end may adopt the same or a similar arrangement, with the second mirror of the receiving end used not only to reflect the beam and change its propagation direction but also to split the zoom imaging lens group into two lenses or two lens groups, broadening the ways in which the receiving end can realize zooming and reducing the area of the second mirror; details are not repeated here.
In some embodiments, when the light source is an array light source, the array light source is configured as a plurality of sub-array light sources that can be controlled in groups; this configuration can be applied to, and is not limited to, the depth camera embodiments described above. In one embodiment, the light source comprises a first sub-array light source and a second sub-array light source that can be controlled independently, the first sub-array containing fewer emitters than the second, so that the emitting end built from this light source can generate at least three structured light beams of different densities (turning on the first sub-array alone, the second sub-array alone, or both simultaneously yields three structured light beams). The larger the density, the more suitable the beam is for long-range measurement, so a measurement range composed of at least three different measurement intervals can be realized. For example, in the embodiments of figs. 1 to 4, the zoom magnification of the zoom lens determines the number of measurement intervals and the overall measurement range, so the array light source can likewise be controlled in groups to generate structured light beams with a number of different densities not lower than the zoom magnification.
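A minimal sketch of the grouped sub-array control described above, assuming two independently addressable sub-arrays as in this paragraph; the distance thresholds that select a pattern density are illustrative assumptions.

```python
# Which sub-arrays the emitting end would enable for each pattern density
# (sub-array A is assumed to contain fewer emitters than sub-array B).
DENSITY_MODES = {
    "sparse": {"A": True,  "B": False},   # A only: lowest dot density
    "medium": {"A": False, "B": True},    # B only: medium dot density
    "dense":  {"A": True,  "B": True},    # A + B:  highest density, long range
}

def select_density(target_distance_m: float) -> str:
    """Pick a pattern density for the expected measurement distance (assumed thresholds)."""
    if target_distance_m < 1.0:
        return "sparse"
    if target_distance_m < 3.0:
        return "medium"
    return "dense"

mode = select_density(4.2)
print(mode, DENSITY_MODES[mode])
```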
It will be appreciated that, in some embodiments, when the emitting end can emit structured light beams of different densities, the zoom projection lens group may be omitted and only a zoom imaging lens group matched to the structured light beams of different densities is used at the receiving end; a good depth imaging effect can still be achieved while reducing cost.
In any of the above embodiments, the mirrors may also be curved mirrors. In one embodiment, because the diffraction angle of the diffractive optical element at the transmitting end is limited, it is generally difficult for the transmitting end to achieve a large field angle; to ensure that the field angle of the transmitting end can cover the field angle of the receiving end over a large measurement range, the mirror in the transmitting end (such as the first mirror and/or the third mirror) is configured as a convex mirror, which increases the field angle of the transmitting end to a certain extent, while the mirror in the receiving end (such as the second mirror and/or the fourth mirror) can be a plane mirror or a concave mirror. In another embodiment, the mirror at the receiving end (such as the second mirror and/or the fourth mirror) is configured as a concave mirror, which on the one hand reduces the field angle and on the other hand can compensate the distortion of the zoom imaging lens group; reducing the field angle benefits long-focal-length, long-range measurement. In this case the mirror at the transmitting end (such as the first mirror and/or the third mirror) can be a plane mirror or a convex mirror.
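For orientation, the field angle of the receiving end is tied to the imaging focal length by FOV = 2·arctan(w / 2f) for a sensor of width w, which is why a longer focal length (or a concave mirror that narrows the field) suits long-range measurement. The sensor width and focal lengths below are illustrative assumptions.

```python
import math

def field_angle_deg(sensor_width_mm: float, focal_mm: float) -> float:
    """Horizontal field angle: FOV = 2 * atan(w / (2 * f))."""
    return math.degrees(2.0 * math.atan(sensor_width_mm / (2.0 * focal_mm)))

# Assumed 4 mm wide sensor: a short focal length gives a wide field, a long one a narrow field.
for focal in (2.0, 4.5):
    print(focal, round(field_angle_deg(4.0, focal), 1))   # ~90.0 deg vs ~47.9 deg
```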
Based on the dual-zoom structured light depth camera of the above embodiments, the present invention further provides a zooming method for the dual-zoom structured light depth camera, as shown in fig. 5, specifically comprising the following steps:
S1: at least two focal length modes are preset and stored in a memory. Different focal length modes have different constraint relationships, where a constraint relationship may refer to specific focal length values or to relations among the focal lengths. For example, in one embodiment, the constraint relationship includes the value of the focal length of the transmitting end and the focal length of the receiving end, or it includes the value of the focal length of the transmitting end and the ratio of the focal length of the receiving end to that of the transmitting end. In one embodiment, the constraint relationship further includes the baseline length value, or a relationship between the baseline length and the focal length of the transmitting end and/or the receiving end.
S2: a zooming instruction is received, after which the processor reads the constraint relationship of the corresponding focal length mode from the memory and converts it into control instructions to control the transmitting end and/or the receiving end to zoom. The zooming instruction may come from external control or from adaptive adjustment by the depth camera itself. In one embodiment, the user may issue the zooming instruction by touch, mouse or other input. In one embodiment, the depth camera performs adaptive zooming based on the data of the previous frame: for example, when the processor detects that the depth image acquired in the previous frame contains many holes, indicating that the measured object may lie outside the measurement range corresponding to the current focal length, the processor automatically issues a zooming instruction for the next frame (a sketch of this trigger is given after step S3 below). Based on the zooming instruction, such as increasing or decreasing the focal length, the processor reads the constraint relationship of the corresponding focal length mode from the memory and converts it into control command signals, which further control the actuators to drive the transmitting end, the receiving end or the mirrors so as to adjust the focal length and/or the baseline.
S3: in the current focal length mode, the processor controls the transmitting end to project the structured light beam and the receiving end to collect the reflected structured light so as to form a structured light image, and calculates a depth image based on the structured light image.
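The adaptive trigger mentioned in step S2 (issue a new zooming instruction when the previous depth frame contains too many holes) can be sketched as follows; the hole threshold and the two-mode switch are illustrative assumptions.

```python
import numpy as np

def hole_ratio(depth_frame: np.ndarray) -> float:
    """Fraction of pixels with no valid depth (holes encoded as 0 or NaN)."""
    invalid = (depth_frame <= 0) | np.isnan(depth_frame)
    return float(np.count_nonzero(invalid)) / depth_frame.size

def next_focal_mode(depth_frame: np.ndarray, current_mode: str,
                    hole_threshold: float = 0.25) -> str:
    """If the last frame has too many holes, the target is probably outside the
    current measurement range, so switch mode for the next frame (assumed logic)."""
    if hole_ratio(depth_frame) > hole_threshold:
        return "far" if current_mode == "near" else "near"
    return current_mode

frame = np.zeros((480, 640), dtype=np.float32)   # a frame consisting entirely of holes
print(next_focal_mode(frame, "near"))            # -> "far": issue a zooming instruction
```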
In the description of the present invention, it should be noted that the terms "center", "longitudinal", "lateral", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the apparatus or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present invention. Furthermore, the terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
In the description of the present invention, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be either fixedly connected, detachably connected, or integrally connected, for example; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
It will be understood that when an element is referred to as being "fixed to" another element, it can be directly on the other element or intervening elements may also be present. When an element is referred to as being "connected" to another element, it can be directly connected to the other element or intervening elements may also be present. The terms "vertical," "horizontal," "left," "right," and the like are used herein for illustrative purposes only and are not meant to be the only embodiment.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. The technical features of the above-described embodiments may be combined arbitrarily; for brevity, not all possible combinations of these technical features are described, but as long as a combination of technical features contains no contradiction, it should be considered to fall within the scope of this description.
It should be noted that each step/component described in the present application may be split into more steps/components, or two or more steps/components or part of the operations of the steps/components may be combined into new steps/components, as needed for implementation, to achieve the object of the present invention.
The zooming method according to the present invention described above may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, RAM, floppy disk, hard disk or magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium and downloaded over a network to be stored in a local recording medium, so that the method described herein can be carried out by such software stored on a recording medium using a general-purpose computer, a special-purpose processor, or programmable or dedicated hardware such as an ASIC or FPGA. It will be understood that a computer, processor, microprocessor controller or programmable hardware includes a memory component (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code which, when accessed and executed by the computer, processor or hardware, implements the processing methods described herein. Furthermore, when a general-purpose computer accesses code for implementing the processing shown herein, execution of the code converts the general-purpose computer into a special-purpose computer for executing the processing shown herein.
The foregoing is a further detailed description of the invention in connection with the preferred embodiments, and it is not intended that the invention be limited to the specific embodiments described. It will be apparent to those skilled in the art that several equivalent substitutions and obvious modifications can be made without departing from the spirit of the invention, and the same should be considered to be within the scope of the invention.

Claims (12)

1. A dual-zoom structured light depth camera, comprising:

the emitting end comprises a light source, a zoom projection lens group and a DOE, and is used for projecting a structured light beam to a person or an object in space; the light source is used for emitting light beams; the zoom projection lens group is used for converging the light beams emitted by the light source onto the DOE; the DOE is configured to diffract, split, or replicate the received light beam to project outwards a structured light beam composed of multiple beams of light;
a receiving end including a zoom imaging lens group and an image sensor for receiving the structured light modulated and reflected by the person or object and generating a structured light image; the zoom imaging lens group is used for imaging the received structured light beam on the image sensor to generate a structured light image;
the processor is used for receiving the structured light image from the receiving end and calculating the depth image of the person or the object;
the depth camera further comprises an actuator for driving the zoom projection lens group and the zoom imaging lens group to realize optical zooming;
the transmitting end and the receiving end respectively further comprise a first reflecting mirror and a second reflecting mirror, and the first reflecting mirror and the second reflecting mirror are used for reflecting the light beam so as to change the propagation direction of the light beam.
2. The dual-zoom structured light depth camera of claim 1, wherein the zoom projection lens group and the zoom imaging lens group each comprise two or more lenses, and wherein the spacing between the lenses is adjustable by the actuator such that optical zoom is achieved for both the zoom projection lens group as a whole and the zoom imaging lens group as a whole.
3. The dual-zoom structured light depth camera of claim 1, wherein the zoom projection lens group as a whole and/or the zoom imaging lens group as a whole is movable under the driving action of the actuator to achieve focusing.
4. The dual-zoom structured light depth camera of claim 1, wherein the first mirror is for reflecting the structured light beam projected by the DOE and projecting it toward the person or object in space;

the second reflecting mirror is used for receiving the structured light beam that is projected by the transmitting end and modulated and reflected by the person or object, and reflecting the received structured light beam to the zoom imaging lens group;

the zoom imaging lens group is used for receiving the structured light beam reflected by the second reflecting mirror and imaging the received structured light beam on the image sensor to generate a structured light image.
5. The dual-zoom structured light depth camera of claim 4, wherein the emission end further comprises a third mirror disposed between the light source and the zoom projection lens group for reflecting the light beam to change a direction of propagation of the light beam; and/or the receiving end further comprises a fourth reflecting mirror, wherein the fourth reflecting mirror is arranged between the image sensor and the zoom imaging lens group and is used for reflecting the light beam to change the propagation direction of the light beam.
6. The dual-zoom structured light depth camera of claim 5, wherein the first mirror and/or the third mirror of the emission end is a convex mirror for increasing the field angle of the emission end, and the second mirror and/or the fourth mirror of the receiving end is a flat mirror or a concave mirror; or the second reflecting mirror and/or the fourth reflecting mirror of the receiving end are/is concave reflecting mirrors for reducing the field angle of the receiving end and compensating the distortion of the zoom imaging lens group, and the first reflecting mirror and/or the third reflecting mirror of the transmitting end are/is plane reflecting mirrors or convex reflecting mirrors.
7. The dual-zoom structured light depth camera of claim 4, wherein the first mirror and the second mirror are disposed between the light source and the image sensor; alternatively, the light source and the image sensor are disposed between the first mirror and the second mirror.
8. The dual-zoom structured light depth camera of claim 1, wherein a baseline length of the depth camera is adjustable by the actuator.
9. The dual-zoom structured light depth camera of claim 1, further comprising a controller for controlling adjustment of the focal lengths of the zoom projection lens group and the zoom imaging lens group by the actuator; the focal lengths of the transmitting end and the receiving end conform to a specific constraint relationship, so that the structured light beam emitted by the transmitting end achieves high-quality imaging at the receiving end.
10. The dual-zoom structured light depth camera of claim 1, wherein the first mirror is further configured to split the zoom projection lens group into two lenses or two lens groups, and the first mirror is disposed at a pupil position of the zoom projection lens group for reducing an area of the first mirror; and/or the second mirror is further configured to split the zoom imaging lens group into two lenses or two lens groups, and the second mirror is disposed at a pupil position of the zoom imaging lens group for reducing an area of the second mirror.
11. The dual-zoom structured light depth camera of claim 1, wherein the light source is an array light source comprising two or more sub-array light sources, each sub-array light source having an unequal number of light sources and each being individually controllable for causing the emitting end to produce structured light beams of different densities.
12. A zooming method of the dual-zoom structured light depth camera according to any one of claims 1 to 11, comprising the steps of:
S1: presetting at least two focal length modes, and storing the focal length modes in a memory;

S2: receiving a zooming instruction, after which the processor reads the constraint relationship of the corresponding focal length mode from the memory and converts it into control instructions to control the transmitting end and/or the receiving end to zoom;

S3: in the current focal length mode, the processor controls the transmitting end to project the structured light beam and the receiving end to collect the reflected structured light so as to form a structured light image, and calculates a depth image based on the structured light image.
CN201910277754.4A 2019-04-08 2019-04-08 Dual-zoom structured light depth camera and zooming method Active CN110196023B (en)

Priority Applications (1)

Application Number: CN201910277754.4A · Priority date: 2019-04-08 · Filing date: 2019-04-08 · Title: Dual-zoom structured light depth camera and zooming method (granted as CN110196023B)

Applications Claiming Priority (1)

Application Number: CN201910277754.4A · Priority date: 2019-04-08 · Filing date: 2019-04-08 · Title: Dual-zoom structured light depth camera and zooming method (granted as CN110196023B)

Publications (2)

Publication Number Publication Date
CN110196023A CN110196023A (en) 2019-09-03
CN110196023B 2024-03-12

Family

ID=67751859

Family Applications (1)

Application Number: CN201910277754.4A (granted as CN110196023B, Active) · Title: Dual-zoom structured light depth camera and zooming method

Country Status (1)

Country Link
CN (1) CN110196023B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110824490B (en) * 2019-09-27 2023-01-03 深圳奥锐达科技有限公司 Dynamic distance measuring system and method
CN111025321B (en) * 2019-12-28 2022-05-27 奥比中光科技集团股份有限公司 Variable-focus depth measuring device and measuring method
CN111025317B (en) * 2019-12-28 2022-04-26 奥比中光科技集团股份有限公司 Adjustable depth measuring device and measuring method
CN111025318B (en) * 2019-12-28 2022-05-27 奥比中光科技集团股份有限公司 Depth measuring device and measuring method
CN111522474B (en) * 2020-04-20 2023-11-07 歌尔光学科技有限公司 Touch structure and touch system
CN112749610A (en) * 2020-07-27 2021-05-04 腾讯科技(深圳)有限公司 Depth image, reference structured light image generation method and device and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201974159U (en) * 2008-04-01 2011-09-14 感知器公司 Contour sensor with MEMS reflector
CN103134444A (en) * 2013-02-01 2013-06-05 同济大学 Double-field variable-focus three-dimensional measurement system
CN106842529A (en) * 2017-01-23 2017-06-13 清华大学 Quick three-dimensional micro imaging system
CN107743628A (en) * 2015-06-12 2018-02-27 微软技术许可有限责任公司 The luminous structured light in LED faces
CN108337492A (en) * 2018-01-15 2018-07-27 深圳奥比中光科技有限公司 Dynamic projection imaging device
CN108833903A (en) * 2018-05-23 2018-11-16 努比亚技术有限公司 Structured light projection mould group, depth camera and terminal
CN110174075A (en) * 2019-04-08 2019-08-27 深圳奥比中光科技有限公司 A kind of list Zoom structure optical depth camera and Zooming method
CN209783544U (en) * 2019-04-08 2019-12-13 深圳奥比中光科技有限公司 Zoom structure optical depth camera

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102163728B1 (en) * 2013-12-05 2020-10-08 삼성전자주식회사 Camera for depth image measure and method of measuring depth image using the same
KR102312273B1 (en) * 2014-11-13 2021-10-12 삼성전자주식회사 Camera for depth image measure and method of operating the same
TWI563843B (en) * 2015-08-21 2016-12-21 Everready Prec Ind Corp Imaging and lighting apparatus
US10551614B2 (en) * 2017-08-14 2020-02-04 Facebook Technologies, Llc Camera assembly with programmable diffractive optical element for depth sensing
US10586342B2 (en) * 2017-08-31 2020-03-10 Facebook Technologies, Llc Shifting diffractive optical element for adjustable depth sensing resolution

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201974159U (en) * 2008-04-01 2011-09-14 感知器公司 Contour sensor with MEMS reflector
CN103134444A (en) * 2013-02-01 2013-06-05 同济大学 Double-field variable-focus three-dimensional measurement system
CN107743628A (en) * 2015-06-12 2018-02-27 微软技术许可有限责任公司 The luminous structured light in LED faces
CN106842529A (en) * 2017-01-23 2017-06-13 清华大学 Quick three-dimensional micro imaging system
CN108337492A (en) * 2018-01-15 2018-07-27 深圳奥比中光科技有限公司 Dynamic projection imaging device
CN108833903A (en) * 2018-05-23 2018-11-16 努比亚技术有限公司 Structured light projection mould group, depth camera and terminal
CN110174075A (en) * 2019-04-08 2019-08-27 深圳奥比中光科技有限公司 A kind of list Zoom structure optical depth camera and Zooming method
CN209783544U (en) * 2019-04-08 2019-12-13 深圳奥比中光科技有限公司 Zoom structure optical depth camera

Also Published As

Publication number Publication date
CN110196023A (en) 2019-09-03


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
CB02: Change of applicant information
    Address after: 11-13 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000
    Applicant after: Obi Zhongguang Technology Group Co.,Ltd.
    Address before: 12 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Nanshan District, Shenzhen, Guangdong 518000
    Applicant before: SHENZHEN ORBBEC Co.,Ltd.
GR01: Patent grant