CN112068144B - Light projection system and 3D imaging device - Google Patents

Light projection system and 3D imaging device

Info

Publication number
CN112068144B
CN112068144B CN201910500339.0A
Authority
CN
China
Prior art keywords
discrete
light
projector
target object
control state
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910500339.0A
Other languages
Chinese (zh)
Other versions
CN112068144A (en)
Inventor
朱力
吕方璐
汪博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Guangjian Technology Co Ltd
Original Assignee
Shenzhen Guangjian Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Guangjian Technology Co Ltd filed Critical Shenzhen Guangjian Technology Co Ltd
Priority to CN201910500339.0A
Publication of CN112068144A
Application granted
Publication of CN112068144B

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 - Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 - Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 - Systems determining position data of a target
    • G01S17/08 - Systems determining position data of a target for measuring distance only
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B11/22 - Measuring arrangements characterised by the use of optical techniques for measuring depth

Abstract

The invention provides a light projection system and a 3D imaging device, comprising a discrete beam projector, a surface light source projector and a controller. The discrete beam projector projects a plurality of discrete collimated light beams toward a target object; the surface light source projector projects floodlight toward the target object; and the controller controls the two projectors to open and close according to a preset control rule comprising at least a first control state and a second control state. In the first control state the controller turns the discrete beam projector on and the surface light source projector off, so that a plurality of discrete collimated light beams are projected onto the target object; in the second control state it turns the discrete beam projector off and the surface light source projector on, so that floodlight is projected onto the target object. The invention increases the projection time proportion of the discrete light beams when the illuminance is higher or the distance is longer, and conversely increases the projection time proportion of the floodlight, thereby broadening the application scenarios of a 3D camera equipped with the light projection system.

Description

Light projection system and 3D imaging device
Technical Field
The invention relates to the field of 3D imaging, in particular to a light projection system and a 3D imaging device.
Background
In recent years, with the development of the consumer electronics industry, 3D cameras with depth sensing capability have attracted increasing attention in the consumer electronics field. A currently well-established depth measurement method is the structured light scheme, in which a specific structured light pattern is projected onto an object and the depths of different positions on the object are calculated from the deformation or displacement of the pattern.
The ToF (time of flight) technique is a 3D imaging technique in which measurement light emitted from a projector is reflected by a target object back to a receiver; the spatial distance from the object to the sensor is then obtained from the propagation time of the measurement light along this path. Common ToF techniques include the single-point scanning projection method and the area light projection method.
The single-point scanning projection ToF method uses a single-point projector to project one collimated beam whose projection direction is controlled by a scanning device so that it can be aimed at different target positions. After the single collimated beam is reflected by the target object, part of the light is received by a single-point photodetector, which yields the depth measurement for the current projection direction. The method concentrates all the optical power on one target point, achieving a high signal-to-noise ratio at that point and therefore high-precision depth measurement. Scanning of the entire target object relies on scanning devices such as mechanical motors, MEMS mirrors, or optical phased arrays. The depth data points obtained by scanning are stitched together into the discrete point cloud data required for 3D imaging. The method lends itself to long-distance 3D imaging, but it requires a complex projection scanning system and is costly.
The area light projection ToF method projects a surface beam with a continuous energy distribution, so that the projected light continuously covers the surface of the target object. The light detector is a detector array capable of acquiring the propagation time of the light beam. When the optical signal reflected by the target object is imaged onto the detector array through an optical imaging system, the depth obtained at each detector image point is the depth of the object position corresponding to it under the object-image relationship. This method dispenses with a complex scanning system. However, because the optical power density of area light projection is far lower than that of a single collimated beam, the signal-to-noise ratio is greatly reduced compared with single-point scanning projection, so the method is only applicable to shorter-range, lower-precision scenes.
Disclosure of Invention
In view of the defects of the prior art, the object of the present invention is to provide a light projection system and a 3D imaging device that can project either discrete collimated light beams or floodlight, thereby broadening the application scenarios of a 3D camera equipped with the light projection system.
The invention provides a light projection system, which comprises a discrete light beam projector, a surface light source projector and a controller;
the discrete light beam projector is used for projecting a plurality of discrete collimated light beams to a target object;
the surface light source projector is used for projecting floodlight to the target object;
the controller is used for controlling the discrete light beam projector and the surface light source projector to be opened and closed according to preset control rules, the control rules at least comprise a first control state and a second control state, in the first control state, the controller controls the discrete light beam projector to be opened and the surface light source projector to be closed so as to project a plurality of discrete collimated light beams to the target object, and in the second control state, the controller controls the discrete light beam projector to be closed and the surface light source projector to be opened so as to project floodlight to the target object.
Preferably, the discrete beam projector comprises an edge-emitting laser and a beam projector disposed on an optical path;
the edge-emitting laser is used for projecting laser to the beam projector;
the beam projector is used for projecting the incident laser light into a plurality of discrete collimated light beams.
Preferably, the discrete beam projector comprises a laser array, a collimating lens and a beam splitting device arranged on an optical path;
the laser array is used for projecting laser of a first order of magnitude to the collimating lens;
the collimating lens is used for collimating the incident multiple laser beams and then emitting collimated light beams with a first order of magnitude;
the beam splitting device is used for splitting the incident collimated light beam with the first order of magnitude to emit a collimated light beam with a second order of magnitude;
the second order of magnitude is greater than the first order of magnitude.
Preferably, when the illuminance is greater than a preset illuminance threshold or the distance is greater than a preset distance threshold, the duration ratio of the second control state to the first control state is reduced to enlarge the projection time proportion of the discrete light beams;
and when the illuminance is smaller than a preset illuminance threshold or the distance is smaller than a preset distance threshold, increasing the duration ratio of the second control state to the first control state so as to enlarge the projection time proportion of the floodlight.
Preferably, the surface light source projector employs an LED light source.
Preferably, the control rules further comprise controlling the switching between the first control state and the second control state two or more times during an image acquisition phase, so that the ratio of the floodlight projection time to the discrete collimated beam projection time reaches a first predetermined ratio.
Preferably, the first predetermined ratio comprises any one of:
- the ratio of the floodlight projection time to the discrete collimated beam projection time is 1:1;
- the ratio of the floodlight projection time to the discrete collimated beam projection time is 1:10;
- the ratio of the floodlight projection time to the discrete collimated beam projection time is any value between 1:1 and 1:10.
Preferably, the control rules further comprise controlling the first control state and the second control state to alternate during an image acquisition phase, so that the ratio of the floodlight projection time to the discrete collimated beam projection time reaches a second predetermined ratio.
The invention provides a 3D imaging device which is characterized by comprising the light projection system and an imaging module;
the imaging module is used for receiving the discrete collimated light beam reflected by the target object, obtaining first depth data of the surface of the target object according to the reflected discrete collimated light beam, receiving floodlight reflected by the target object, obtaining second depth data of the surface of the target object according to the reflected floodlight, and further generating a target depth image according to the first depth data and the second depth data.
Preferably, the imaging module comprises an optical imaging lens, a photodetector array, a driving circuit and a processor; the light detector array comprises a plurality of light detectors distributed in an array;
the optical imaging lens is used for enabling direction vectors of the collimated light beams which penetrate through the optical imaging lens and enter the light detector array to be in one-to-one correspondence with the light detectors;
the light detector is used for receiving the discrete collimated light beam and the floodlight reflected by the target object;
the driving circuit is used for measuring the propagation time of a plurality of discrete collimated light beams and further generating first depth data of the surface of the target object, and measuring the propagation time of the floodlight and further generating second depth data of the surface of the target object;
the processor is configured to generate a target depth image from the first depth data and the second depth data.
Compared with the prior art, the invention has the following beneficial effects:
the invention is provided with the stray light beam projector and the surface light source projector, can project stray light beams or floodlight to a target object under the control of the controller, can enlarge the projection time proportion of the stray light beams when the illuminance is higher or the distance is longer, and enlarge the projection time proportion of the floodlight beams when the illuminance is lower or the distance is closer, thereby improving the application scene of a 3D camera provided with the light projection system; according to the invention, the first depth data of the surface of the target object can be obtained according to the reflected discrete collimated light beams, the second depth data of the surface of the target object can be obtained according to the reflected floodlight of the target object, and the target depth image can be generated according to the first depth data and the second depth data, so that the precision of the depth image can be improved.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative effort. Other features, objects and advantages of the invention will become more apparent upon reading the detailed description of non-limiting embodiments with reference to the following drawings:
FIG. 1 is a schematic view of a light projection system according to the present invention;
FIG. 2 is a schematic diagram of one construction of a discrete beam projector according to the present invention;
FIG. 3 is a schematic view of another construction of the discrete beam projector of the present invention;
FIGS. 4 (a), (b), and (c) are schematic diagrams of the periodic arrangement of a plurality of discrete collimated light beams according to the present invention;
FIGS. 5 (a), (b), and (c) are schematic illustrations of non-periodic arrangements of a plurality of discrete collimated light beams in accordance with the present invention;
FIG. 6 is a schematic diagram of a 3D imaging apparatus according to the present invention; and
fig. 7 is a schematic structural view of an imaging module according to the present invention.
In the figure:
1 is a light projection system;
2 is an imaging module;
3 is a target object;
101 is a discrete beam projector;
102 is a surface light source projector;
1011 is an edge-emitting laser;
1012 is a beam projector;
1013 is a laser array;
1014 is a collimating lens;
1015 is a beam splitting device;
201 is a photodetector array;
202 is an optical imaging lens.
Detailed Description
The present invention will be described in detail below with reference to specific embodiments. The following embodiments will help those skilled in the art to further understand the invention, but do not limit the invention in any way. It should be noted that persons skilled in the art can make variations and modifications without departing from the concept of the invention, all of which fall within the scope of the present invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element. The connection may be for fixation or for circuit connection.
It is to be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the drawings, are used only for convenience in describing the embodiments of the present invention and to simplify the description, and do not indicate or imply that the referenced device or element must have a particular orientation or be constructed and operated in a particular orientation; they are therefore not to be construed as limiting the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present invention, "a plurality" means two or more unless specifically limited otherwise.
Fig. 1 is a schematic structural diagram of the light projection system according to the present invention. As shown in fig. 1, the light projection system 1 according to the present invention includes a discrete beam projector 101, a surface light source projector 102, and a controller; the discrete beam projector 101 is used for projecting a plurality of discrete collimated beams of light towards the target object 3;
the surface light source projector 102 is used for projecting floodlight to the target object 3;
the controller is configured to control the discrete beam projector 101 and the surface light source projector 102 to open and close according to a preset control rule, where the control rule includes at least a first control state and a second control state; in the first control state, the controller turns on the discrete beam projector 101 and turns off the surface light source projector 102 so as to project a plurality of discrete collimated light beams onto the target object 3, and in the second control state, the controller turns off the discrete beam projector 101 and turns on the surface light source projector 102 so as to project floodlight onto the target object 3.
In an embodiment of the present invention, when the illuminance is greater than a preset illuminance threshold or the distance is greater than a preset distance threshold, the duration ratio between the second control state and the first control state is decreased to expand the projection time ratio of the discrete light beams; and when the illuminance is smaller than a preset illuminance threshold or the distance is smaller than a preset distance threshold, increasing the duration ratio of the second control state to the first control state so as to enlarge the projection time proportion of the floodlight.
That is, by providing the discrete beam projector 101 and the surface light source projector 102, discrete light beams or floodlight can be projected onto the target object 3 under the control of the controller; the projection time proportion of the discrete light beams can be increased when the illuminance is high and the distance is long, and the projection time proportion of the floodlight can be increased when the illuminance is low and the distance is short, which broadens the application scenarios of a 3D camera equipped with the light projection system of the invention.
For example, the ratio of the floodlight projection time to the discrete collimated beam projection time is 1:2 when the distance is greater than 1 meter and 2:1 when the distance is less than 1 meter; similarly, the ratio is 1:2 when the illuminance is greater than 50 lux and 2:1 when the illuminance is less than 50 lux.
In an embodiment of the present invention, the illuminance threshold is any value from 10 lux to 50 lux, and the distance threshold is any value from 1 meter to 5 meters.
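For illustration only, the following Python sketch shows one possible way a controller could pick the floodlight-to-discrete-beam time ratio from the thresholds and example ratios quoted above; the function name, default thresholds and return convention are assumptions and not part of the disclosed embodiments.

```python
# Illustrative sketch (not the patent's controller logic): select the
# floodlight : discrete-beam projection-time ratio from ambient illuminance
# and target distance, using the example values given in the description.

def choose_time_ratio(illuminance_lux: float, distance_m: float,
                      lux_threshold: float = 50.0,
                      dist_threshold_m: float = 1.0) -> tuple[int, int]:
    """Return (floodlight_time, discrete_beam_time) as a time ratio."""
    if illuminance_lux > lux_threshold or distance_m > dist_threshold_m:
        # Bright scene or distant target: favour the discrete collimated beams.
        return (1, 2)
    # Dim scene or near target: favour floodlight.
    return (2, 1)


flood, discrete = choose_time_ratio(illuminance_lux=80.0, distance_m=2.5)
print(f"floodlight : discrete = {flood}:{discrete}")  # -> 1:2
```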
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the drawings of the embodiments, so that the above objects, features and advantages of the present invention can be understood more clearly; the foregoing is the core idea of the present invention. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present invention.
In an embodiment of the present invention, the plurality of discrete collimated light beams projected by the discrete beam projector 101 are reflected by the target object 3, and the partially reflected collimated beams are received by the photodetector array 201. Each photodetector obtains the flight time t of its corresponding beam from emission to reception, so that the flight distance s = c·t is obtained from the speed of light c, and the depth of the surface position of the target object 3 illuminated by each discrete beam can be measured. These depth data points at discrete positions form point cloud data that reproduces the 3D shape of the object, thereby enabling 3D imaging of the target object 3. The plurality of discrete collimated light beams is distributed in a tapered (cone-like) pattern.
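As a worked illustration of the time-of-flight relation above, the sketch below assumes that the projector and detector are co-located, so that the depth of the illuminated point is half the round-trip flight distance; this halving and the names used are assumptions for illustration only.

```python
# Illustrative sketch of the ToF relation s = c * t described above.
# Assumption: projector and detector are co-located, so the depth of the
# illuminated surface point is half the round-trip flight distance.

C = 299_792_458.0  # speed of light, m/s

def depth_from_flight_time(t_seconds: float) -> float:
    """Depth along one discrete beam for a measured round-trip flight time t."""
    flight_distance = C * t_seconds  # s = c * t (projector -> object -> detector)
    return flight_distance / 2.0

# A beam returning after 10 nanoseconds corresponds to a depth of about 1.5 m.
print(round(depth_from_flight_time(10e-9), 3))  # 1.499
```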
In an embodiment of the present invention, the surface light source projector 102 employs an LED light source. In a variant, other light sources, such as infrared light sources, may also be used.
In an embodiment of the invention, the control rule further comprises controlling the switching between the first control state and the second control state two or more times during an image acquisition phase, so that the ratio of the floodlight projection time to the discrete collimated beam projection time reaches a first predetermined ratio.
The duration of the first control state and the duration of the second control state may be equal, i.e. a ratio of 1:1, or may differ, such as a ratio of 1:2; the two durations may also be controlled according to the illuminance.
The image acquisition phase is the acquisition time of at least one image, for example 0.1 s, or any other duration.
The first predetermined ratio comprises any one of:
- the ratio of the floodlight projection time to the discrete collimated beam projection time is 1:1;
- the ratio of the floodlight projection time to the discrete collimated beam projection time is 1:10;
- the ratio of the floodlight projection time to the discrete collimated beam projection time is any value between 1:1 and 1:10.
In an embodiment of the invention, the control rule further comprises controlling the first control state and the second control state to alternate during an image acquisition phase, so that the ratio of the floodlight projection time to the discrete collimated beam projection time reaches a second predetermined ratio. The second predetermined ratio may be the same as the first predetermined ratio or may differ from it.
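The alternation described above can be pictured with a short sketch: one image acquisition phase is divided into slices assigned to the first or second control state so that the overall floodlight-to-discrete-beam time ratio reaches the predetermined value. The slicing scheme, function name and slice count below are assumptions for illustration only, not the patent's firmware.

```python
# Illustrative sketch: slice one image acquisition phase (e.g. 0.1 s) into
# alternating control states so that the overall floodlight : discrete-beam
# time ratio equals the predetermined ratio.

def build_schedule(acquisition_s: float, flood: int, discrete: int,
                   slices: int = 12) -> list[tuple[str, float]]:
    """Return (state, duration_s) slices; slices should be a multiple of flood + discrete."""
    slice_s = acquisition_s / slices
    schedule = []
    for i in range(slices):
        # Within each group of (flood + discrete) slices, the first `flood`
        # slices use the second control state (floodlight on).
        state = ("second_state_floodlight" if i % (flood + discrete) < flood
                 else "first_state_discrete_beams")
        schedule.append((state, slice_s))
    return schedule

for state, dt in build_schedule(0.1, flood=1, discrete=1):
    print(f"{state}: {dt:.4f} s")
```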
FIG. 2 is a schematic diagram of a discrete beam projector according to the present invention, and as shown in FIG. 2, the discrete beam projector 101 includes an edge-emitting laser 1011 and a beam projector 1012 disposed in an optical path;
the edge-emitting laser 1011 is used for projecting laser light toward the beam projector 1012;
the beam projector 1012 is configured to project the incident laser light into a plurality of discrete collimated beams.
In an embodiment of the invention, the beam projector 1012 has a micro-nano structured optical chip fabricated on its inner surface and is matched with an optical lens. The beam projector 1012 performs the function of splitting the incident light from the edge-emitting laser 1011 into a plurality of collimated beams. The emission direction of the edge-emitting laser 1011 and the projection direction of the beam projector 1012 may be the same, or may differ by 90 degrees or by any other angle required by the optical system design.
In an embodiment of the invention, the number of discrete collimated light beams ranges from two to several tens of thousands of beams, for example from 2 to 100,000 beams.
In one embodiment of the present invention, the beam projector employs a beam splitter.
Fig. 3 is a schematic view showing another structure of the discrete beam projector of the present invention, and as shown in fig. 3, the discrete beam projector 101 includes a laser array 1013, a collimator lens 1014, and a beam splitting device 1015 arranged on an optical path;
the laser array 1013 is used for projecting laser light of a first order of magnitude to the collimating lens 1014;
the collimating lens 1014 is configured to collimate the incident multiple laser beams and emit collimated light beams of a first order of magnitude;
the beam splitting device 1015 is configured to split the incident collimated light beam of the first order of magnitude and emit a collimated light beam of a second order of magnitude;
the second order of magnitude is greater than the first order of magnitude.
In an embodiment of the invention, the second order of magnitude is one to two times the first order of magnitude.
In the embodiment of the present invention, the laser array 1013 may be formed by a plurality of vertical cavity surface emitting lasers (VCSELs) or a plurality of edge emitting lasers (EELs). After passing through the collimating lens 1014, the multiple laser beams become highly parallel collimated beams. The beam splitting device 1015 can be used to obtain more collimated beams, according to the number of discrete beams required in the practical application. The beam splitting device 1015 may employ a diffractive optical element (DOE), a spatial light modulator (SLM), or the like.
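As a purely numerical illustration of how the beam splitting device multiplies the beam count, the figures below are assumed and not taken from the patent.

```python
# Assumed numbers for illustration: the laser array 1013 emits a first number of
# beams, and the beam splitting device 1015 splits each collimated beam, so the
# emitted count rises to the larger second number.

laser_array_beams = 300   # beams from the VCSEL/EEL array (first quantity)
split_factor = 4          # collimated beams produced by the DOE per input beam

discrete_beams = laser_array_beams * split_factor
print(discrete_beams)     # 1200 discrete collimated beams leave the projector
```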
In an embodiment of the present invention, the plurality of discrete collimated light beams are periodically arranged in a predetermined shape, that is, in a geometrically regular distribution.
Fig. 4 (a), (b), and (c) are schematic diagrams of the periodic arrangement of a plurality of discrete collimated light beams in the present invention. As shown in fig. 4, in an embodiment of the present invention, the preset shape includes any one of the following shapes, or any plurality of these shapes that can be switched between one another:
- a straight line;
-a triangle;
-a quadrilateral;
-a rectangle;
-a circular shape;
-a hexagon;
-a pentagon.
The shape of the periodic arrangement of the plurality of discrete collimated light beams is not limited to the above shapes; the plurality of discrete collimated light beams may also be arranged in other shapes. As shown in fig. 4 (a), when the preset shape is a rectangle, the unit arrangement of the collimated light beams within one period is a rectangle that repeats periodically in space. As shown in fig. 4 (b), when the preset shape is a triangle, the unit arrangement within one period is a triangle that repeats periodically in space. As shown in fig. 4 (c), when the preset shape is a hexagon, the unit arrangement within one period is a hexagon that repeats periodically in space. Because the implementation is constrained by the optical system, the actual arrangement of the collimated beams in a cross section may exhibit distortion such as stretching or twisting, and the energy distribution of each collimated beam in the cross section may be a circle, an annulus, an ellipse, or the like. An arrangement such as that shown in fig. 4 simplifies the spatial correspondence between the plurality of discrete collimated light beams and the photodetector array 201.
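A periodic arrangement of this kind can be described, for illustration, by repeating a unit cell over the projection field; the rectangular-grid generator below is an assumed example, not a construction prescribed by the patent.

```python
# Illustrative sketch: spot centres of a periodic (rectangular) arrangement,
# i.e. a unit cell repeated with fixed pitches, as in Fig. 4 (a).

def rectangular_grid(rows: int, cols: int, pitch_x: float, pitch_y: float):
    """Return (x, y) spot centres of a rows-by-cols periodic arrangement."""
    return [(c * pitch_x, r * pitch_y) for r in range(rows) for c in range(cols)]

spots = rectangular_grid(rows=4, cols=6, pitch_x=1.0, pitch_y=0.8)
print(len(spots), spots[:3])  # 24 spots; the first three lie on the bottom row
```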
In an embodiment of the invention, the plurality of discrete collimated light beams are non-periodically arranged in another predetermined shape.
In an embodiment of the present invention, the aperiodic arrangement includes any one of the following arrangements or any plurality of arrangements that can be switched with each other:
-a random arrangement;
-a spatial coding arrangement;
-a quasi-lattice arrangement.
The shape of the non-periodic arrangement of the plurality of discrete collimated light beams is likewise not limited to the above; the beams may also be arranged in other ways. As shown in fig. 5 (a), in the spatial coding arrangement, some of the beams of a periodic arrangement are deleted so as to encode the arrangement positions spatially; the coding actually adopted is not limited to the example in fig. 5 (a). As shown in fig. 5 (b), in the random arrangement, the collimated beams are randomly distributed so that the similarity between the arrangements at different positions is small or close to zero. As shown in fig. 5 (c), in the quasi-lattice arrangement, the collimated beams are arranged non-periodically among nearby neighbours but periodically over long distances. Because the implementation is constrained by the optical system, the actual arrangement of the collimated beams in a cross section may exhibit distortion such as stretching or twisting, and the energy distribution of each collimated beam in the cross section may be a circle, an annulus, an ellipse, or the like. An arrangement such as that shown in fig. 5 facilitates uniform sampling of non-deterministic targets and optimizes the quality of the final 3D depth map.
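The spatial coding arrangement can likewise be illustrated by deleting a coded subset of spots from a periodic grid; the pseudo-random keep/drop rule below is only an assumed stand-in for an actual code.

```python
# Illustrative sketch of a spatially coded (aperiodic) arrangement: start from a
# periodic grid and delete a subset of spots so that local neighbourhoods differ.
import random

def spatially_coded(rows: int, cols: int, keep_fraction: float = 0.7, seed: int = 1):
    """Periodic grid with a pseudo-random subset of spots removed (the 'code')."""
    rng = random.Random(seed)
    return [(c, r) for r in range(rows) for c in range(cols)
            if rng.random() < keep_fraction]

pattern = spatially_coded(rows=8, cols=8)
print(len(pattern), "of 64 grid positions kept")
```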
In an embodiment of the present invention, the light detector employs any one of the following light sensors:
-a CMOS light sensor;
-a CCD light sensor;
- a SPAD light sensor.
The light detector is not limited to the above light sensors and may also employ other types of light sensors.
In the embodiment of the present invention, the present invention further provides a 3D imaging apparatus, comprising the light projection system 1 and an imaging module 2;
the imaging module 2 is configured to receive the discrete collimated light beam reflected by the target object 3, obtain first depth data of the surface of the target object 3 according to the reflected discrete collimated light beam, receive floodlight reflected by the target object 3, obtain second depth data of the surface of the target object 3 according to the reflected floodlight, and generate a target depth image according to the first depth data and the second depth data.
According to the invention, the first depth data of the surface of the target object 3 can be obtained according to the reflected discrete collimated light beams, the second depth data of the surface of the target object 3 can be obtained according to the reflected floodlight of the target object 3, and then the target depth image can be generated according to the first depth data and the second depth data, so that the precision of the depth image can be improved.
In the first depth data acquired with the discrete collimated light beams, each point has high precision but, owing to the limited number of beams, the points are few; in the second depth data acquired with the floodlight, the points are numerous and cover the surface comprehensively owing to the area light source projection, but each point has lower precision. By alternately projecting with the discrete beam projector 101 and the surface light source projector 102 to generate the first depth data and the second depth data, and then generating the target depth image from their complementary characteristics, the present invention improves the accuracy of the depth image.
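One way to picture this complementary fusion is sketched below: the dense flood-derived depth map is kept everywhere, and pixels hit by a discrete beam are overwritten with the higher-precision sample. This simple overwrite rule and the array layout are assumptions for illustration; the patent does not specify a particular fusion algorithm.

```python
# Illustrative sketch (assumed fusion rule, not the patent's algorithm): combine
# dense but lower-precision floodlight depth with sparse, higher-precision
# discrete-beam depth into one target depth image.
import numpy as np

def fuse_depth(dense_flood: np.ndarray, sparse_beam: np.ndarray) -> np.ndarray:
    """sparse_beam holds NaN wherever no discrete beam hit that pixel."""
    fused = dense_flood.copy()
    hit = ~np.isnan(sparse_beam)
    fused[hit] = sparse_beam[hit]  # trust the high-precision samples where available
    return fused

dense = np.full((4, 4), 1.20)      # floodlight depth map (metres)
sparse = np.full((4, 4), np.nan)
sparse[1, 1] = 1.17                # one discrete-beam depth sample
print(fuse_depth(dense, sparse))
```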
Fig. 7 is a schematic structural diagram of the imaging module according to the present invention. As shown in fig. 7, the imaging module 2 includes an optical imaging lens 202, a photodetector array 201, a driving circuit and a processor; the photodetector array 201 includes a plurality of photodetectors distributed in an array;
the optical imaging lens 202 is configured to enable a direction vector of the collimated light beam entering the photodetector array 201 through the optical imaging lens 202 to have a one-to-one correspondence with the photodetectors;
the light detector is used for receiving the discrete collimated light beam and the floodlight reflected by the target object 3;
the driving circuit is used for measuring the propagation time of a plurality of discrete collimated light beams and further generating first depth data of the surface of the target object, and measuring the propagation time of the floodlight and further generating second depth data of the surface of the target object;
the processor is configured to generate a target depth image from the first depth data and the second depth data.
In an embodiment of the invention, the driving circuit is further configured to control the discrete beam projector 101 and the imaging module 2 to be turned on or off simultaneously.
The driving circuit may be an independent dedicated circuit, such as a dedicated SOC chip, an FPGA chip or an ASIC chip, or may include a general-purpose processor; for example, when the depth camera is integrated into an intelligent terminal such as a mobile phone, a television or a computer, a processor in the terminal may serve as at least part of the driving circuit.
To filter background noise, a narrow band filter is usually also mounted in the optical imaging lens 202, so that only incident collimated light of a preset wavelength can reach the photodetector array 201. The preset wavelength may be the wavelength of the incident collimated light beam, or may lie within 50 nanometers below to 50 nanometers above that wavelength. The photodetector array 201 may be arranged periodically or aperiodically. Each photodetector, together with an auxiliary circuit, can measure the time of flight of the collimated beam. Depending on the number of discrete collimated beams required, the photodetector array 201 may be a combination of multiple single-point photodetectors or a sensor chip integrating multiple photodetectors. To further optimize the sensitivity of the light detectors, the illumination spot of one discrete collimated beam on the target object 3 may correspond to one or more photodetectors. When several photodetectors correspond to the same illumination spot, their signals can be connected by a circuit, so that they combine into a photodetector with a larger detection area.
In an embodiment of the present invention, the imaging module 2 ensures the spatial position correspondence between the projected plurality of discrete collimated light beams and the photodetector array 201, so that each photodetector in the photodetector array 201 can measure the propagation time of the light by a ToF method using temporally modulated or pulsed light, and the distance travelled by the light is then calculated from the speed of light.
In an embodiment of the present invention, the imaging module 2 includes an RGB camera and an infrared camera;
the RGB camera is used for collecting a 2D image of the surface of the target object 3 when the illuminance is greater than a preset illuminance value;
the infrared camera is used for obtaining a target depth image of the surface of the target object 3 from the received light spot pattern formed by the discrete collimated light beams reflected by the target object 3, and for collecting a 2D image of the surface of the target object 3 under the floodlight when the illuminance is smaller than the preset illuminance value.
In one embodiment of the invention, when the illuminance is lower than a preset value, the floodlight projected by the surface light source projector is used for illumination, and under the illumination of the floodlight, the infrared camera can capture a 2D image of the surface of the target object; and finally, generating a 3D image of the target object according to the target depth image and the 2D image of the surface of the target object.
The embodiments in the present description are described in a progressive manner, each embodiment focuses on differences from other embodiments, and the same and similar parts among the embodiments are referred to each other. The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
The foregoing description of specific embodiments of the present invention has been presented. It is to be understood that the present invention is not limited to the specific embodiments described above, and that various changes and modifications may be made by one skilled in the art within the scope of the appended claims without departing from the spirit of the invention.

Claims (9)

1. A light projection system comprising a discrete beam projector, a surface light source projector, and a controller;
the discrete light beam projector is used for projecting a plurality of discrete collimated light beams to a target object;
the surface light source projector is used for projecting floodlight to the target object;
the controller is used for controlling the discrete light beam projector and the surface light source projector to open and close according to preset control rules, the control rules at least comprising a first control state and a second control state; in the first control state, the controller controls the discrete light beam projector to be opened and the surface light source projector to be closed so as to project a plurality of discrete collimated light beams to the target object, and in the second control state, the controller controls the discrete light beam projector to be closed and the surface light source projector to be opened so as to project floodlight to the target object;
when the illuminance is greater than a preset illuminance threshold or the distance is greater than a preset distance threshold, reducing the duration ratio of the second control state to the first control state to expand the projection time proportion of the discrete light beams;
and when the illuminance is smaller than a preset illuminance threshold or the distance is smaller than a preset distance threshold, increasing the duration ratio of the second control state to the first control state so as to enlarge the projection time proportion of the floodlight.
2. The light projection system of claim 1, wherein the discrete beam projector includes an edge-emitting laser and a beam projector disposed in an optical path;
the edge-emitting laser is used for projecting laser to the beam projector;
the beam projector is used for projecting the incident laser light into a plurality of discrete collimated light beams.
3. The light projection system of claim 1, wherein the discrete beam projector includes an array of lasers, a collimating lens, and a beam splitting device disposed in an optical path;
the laser array is used for projecting laser of a first order of magnitude to the collimating lens;
the collimating lens is used for collimating the incident multiple laser beams and then emitting collimated light beams with a first order of magnitude;
the beam splitting device is used for splitting the incident collimated light beam with the first order of magnitude to emit a collimated light beam with a second order of magnitude;
the second order of magnitude is greater than the first order of magnitude, and the second order of magnitude is one to two times the first order of magnitude.
4. The light projection system of claim 1, wherein the control rules further comprise controlling the switching between the first control state and the second control state two or more times during an image acquisition phase, so that the ratio of the floodlight projection time to the discrete collimated beam projection time reaches a first predetermined ratio.
5. The light projection system of claim 4,
the first predetermined ratio comprises any one of:
- the ratio of the floodlight projection time to the discrete collimated beam projection time is 1:1;
- the ratio of the floodlight projection time to the discrete collimated beam projection time is 1:10;
- the ratio of the floodlight projection time to the discrete collimated beam projection time is any value between 1:1 and 1:10.
6. A light projection system as claimed in claim 1, wherein the control rules further comprise controlling the first control state and the second control state to alternate during an image acquisition phase, so that the ratio of the floodlight projection time to the discrete collimated beam projection time reaches a second predetermined ratio.
7. A light projection system as claimed in claim 1, characterized in that the area light source projector employs an LED light source.
8. A 3D imaging device comprising the light projection system of any one of claims 1 to 7, further comprising an imaging module;
the imaging module is used for receiving the discrete collimated light beam reflected by the target object, obtaining first depth data of the surface of the target object according to the reflected discrete collimated light beam, receiving floodlight reflected by the target object, obtaining second depth data of the surface of the target object according to the reflected floodlight, and further generating a target depth image according to the first depth data and the second depth data.
9. The 3D imaging device according to claim 8, wherein the imaging module comprises an optical imaging lens, a photodetector array, a driving circuit, and a processor; the light detector array comprises a plurality of light detectors distributed in an array;
the optical imaging lens is used for enabling direction vectors of the collimated light beams which penetrate through the optical imaging lens and enter the light detector array to be in one-to-one correspondence with the light detectors;
the light detector is used for receiving the discrete collimated light beam reflected by the target object and the floodlight;
the driving circuit is used for measuring the propagation time of a plurality of discrete collimated light beams and further generating first depth data of the surface of the target object, and measuring the propagation time of the floodlight and further generating second depth data of the surface of the target object;
the processor is configured to generate a target depth image from the first depth data and the second depth data.
CN201910500339.0A 2019-06-11 2019-06-11 Light projection system and 3D imaging device Active CN112068144B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910500339.0A CN112068144B (en) 2019-06-11 2019-06-11 Light projection system and 3D imaging device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910500339.0A CN112068144B (en) 2019-06-11 2019-06-11 Light projection system and 3D imaging device

Publications (2)

Publication Number Publication Date
CN112068144A (en) 2020-12-11
CN112068144B (en) 2022-10-21

Family

ID=73658731

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910500339.0A Active CN112068144B (en) 2019-06-11 2019-06-11 Light projection system and 3D imaging device

Country Status (1)

Country Link
CN (1) CN112068144B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1370368A (en) * 1971-01-16 1974-10-16 Ibm Optical examination of surfaces
JP2000193926A (en) * 1998-12-25 2000-07-14 Seiko Epson Corp Light source unit, illuminating optical system and projection type display device
JP2004006326A (en) * 2002-04-22 2004-01-08 Mitsubishi Rayon Co Ltd Surface light source and photo conductive member therefor
CN101657716A (en) * 2007-03-15 2010-02-24 科学技术设备委员会 Illumination of diffusely scattering media

Family Cites Families (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE4316348A1 (en) * 1993-05-15 1994-11-17 Wild Heerbrugg Ag Distance measuring device
JP4031306B2 (en) * 2002-07-12 2008-01-09 日本放送協会 3D information detection system
JP2004188604A (en) * 2002-12-06 2004-07-08 Ts Corporation Optical three-dimensional shaping apparatus
CN2779424Y (en) * 2005-03-24 2006-05-10 南京德朔实业有限公司 Distance measurer
US9841496B2 (en) * 2014-11-21 2017-12-12 Microsoft Technology Licensing, Llc Multiple pattern illumination optics for time of flight system
CN106576159B (en) * 2015-06-23 2018-12-25 华为技术有限公司 A kind of photographing device and method obtaining depth information
CN107105217B (en) * 2017-04-17 2018-11-30 深圳奥比中光科技有限公司 Multi-mode depth calculation processor and 3D rendering equipment
CN107167996A (en) * 2017-06-05 2017-09-15 深圳奥比中光科技有限公司 The laser projection module and depth camera adaptively adjusted
EP3662406B1 (en) * 2017-08-01 2023-11-22 Apple Inc. Determining sparse versus dense pattern illumination
CN108052878B (en) * 2017-11-29 2024-02-02 上海图漾信息科技有限公司 Face recognition device and method
CN108169981A (en) * 2018-01-15 2018-06-15 深圳奥比中光科技有限公司 Multi-functional lighting module
CN108540717A (en) * 2018-03-31 2018-09-14 深圳奥比中光科技有限公司 Target image obtains System and method for
CN108769649B (en) * 2018-06-28 2019-08-23 Oppo广东移动通信有限公司 Advanced treating device and three dimensional image apparatus
CN109118581B (en) * 2018-08-22 2023-04-11 Oppo广东移动通信有限公司 Image processing method and device, electronic equipment and computer readable storage medium
CN109194856A (en) * 2018-09-30 2019-01-11 Oppo广东移动通信有限公司 The control method and electronic device of electronic device
CN109798838B (en) * 2018-12-19 2020-10-27 西安交通大学 ToF depth sensor based on laser speckle projection and ranging method thereof
CN109709742B (en) * 2019-01-09 2021-08-10 深圳市光鉴科技有限公司 Structured light projector and 3D camera
CN109831660B (en) * 2019-02-18 2021-04-23 Oppo广东移动通信有限公司 Depth image acquisition method, depth image acquisition module and electronic equipment
CN109862275A (en) * 2019-03-28 2019-06-07 Oppo广东移动通信有限公司 Electronic equipment and mobile platform

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1370368A (en) * 1971-01-16 1974-10-16 Ibm Optical examination of surfaces
JP2000193926A (en) * 1998-12-25 2000-07-14 Seiko Epson Corp Light source unit, illuminating optical system and projection type display device
JP2004006326A (en) * 2002-04-22 2004-01-08 Mitsubishi Rayon Co Ltd Surface light source and photo conductive member therefor
CN101657716A (en) * 2007-03-15 2010-02-24 科学技术设备委员会 Illumination of diffusely scattering media

Also Published As

Publication number Publication date
CN112068144A (en) 2020-12-11

Similar Documents

Publication Publication Date Title
CA3012691C (en) Lidar based 3-d imaging with far-field illumination overlap
CA3017817C (en) Lidar based 3-d imaging with varying illumination field density
US11073617B2 (en) Integrated illumination and detection for LIDAR based 3-D imaging
CN110244318B (en) 3D imaging method based on asynchronous ToF discrete point cloud
CA3017811C (en) Lidar based 3-d imaging with varying pulse repetition
CA3017819C (en) Lidar based 3-d imaging with varying illumination intensity
CA3005902C (en) Three dimensional lidar system with targeted field of view
US11435446B2 (en) LIDAR signal acquisition
CN110221309B (en) 3D imaging device and electronic equipment based on asynchronous ToF discrete point cloud
CN109557522A (en) Multi-beam laser scanner
CN112066906A (en) Depth imaging device
WO2020221188A1 (en) Synchronous tof discrete point cloud-based 3d imaging apparatus, and electronic device
CN112066907B (en) Depth imaging device
CN210128694U (en) Depth imaging device
CN210835244U (en) 3D imaging device and electronic equipment based on synchronous ToF discrete point cloud
CN112068144B (en) Light projection system and 3D imaging device
CN111947565A (en) 3D imaging method based on synchronous ToF discrete point cloud
EP3665503A1 (en) Lidar signal acquisition
CN112834434A (en) 4D camera device and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant