CN113238248A - 3D imaging device with structured light and TOF technology integrated - Google Patents


Info

Publication number
CN113238248A
Authority
CN
China
Prior art keywords: tof, structured light, light, module, target object
Legal status
Pending
Application number
CN202110459066.7A
Other languages
Chinese (zh)
Inventor
陈驰 (Chen Chi)
李安 (Li An)
张俊君 (Zhang Junjun)
Current Assignee
Shenzhen Angstrong Technology Co., Ltd.
Original Assignee
Shenzhen Angstrong Technology Co., Ltd.
Application filed by Shenzhen Angstrong Technology Co., Ltd.
Priority: CN202110459066.7A
Publication: CN113238248A
Legal status: Pending


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01S — RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 17/00 — Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S 17/88 — Lidar systems specially adapted for specific applications
    • G01S 17/89 — Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S 17/02 — Systems using the reflection of electromagnetic waves other than radio waves
    • G01S 17/06 — Systems determining position data of a target
    • G01S 17/08 — Systems determining position data of a target for measuring distance only
    • G01S 17/10 — Systems determining position data of a target for measuring distance only using transmission of interrupted, pulse-modulated waves

Abstract

The invention discloses a 3D imaging device fusing structured light and TOF technology, comprising: a projection module (10) comprising a light source (101), a collimating lens (102), and a diffractive optical element (103), the projection module (10) having a structured light emission mode, in which it projects a structured light pattern toward a target object, and a TOF emission mode, in which it projects a flood illumination pattern toward the target object; a receiving module (20) for collecting the optical signal reflected by the target object; and a control and calculation module (30) for controlling the projection module (10) and the receiving module (20) and calculating depth information of the target object from the optical signals collected by the receiving module (20). The device integrates the two working modes of structured light and TOF without adding further components, reducing the part count and the module size, effectively improving integration, and greatly widening the range of application scenarios.

Description

3D imaging device with structured light and TOF technology integrated
Technical Field
The invention relates to the field of optoelectronic technology, and in particular to a method for fusing structured light with time-of-flight (TOF) technology and a corresponding 3D imaging device.
Background
3D imaging devices have already appeared in consumer electronics on the market, for example for motion capture in motion-sensing games and for structured-light 3D face recognition in new-generation iPhones. A 3D imaging device greatly enriches the user experience and improves product competitiveness. In particular, because it adds one more dimension of information, 3D face recognition far surpasses 2D face recognition in experience and security; and compared with traditional biometrics such as fingerprint recognition, its reliability and security are a step higher still.
Unlike a traditional 2D imaging device such as a video camera, which can only acquire planar 2D information about an object, a 3D imaging device can also acquire the object's depth information and construct a three-dimensional model. It is therefore widely applied in industrial measurement, part modeling, medical diagnosis, security monitoring, machine vision, biometrics, augmented reality (AR), virtual reality (VR), and other fields, and has great application value.
3D imaging technologies divide into active and passive types: structured light and time-of-flight are the mainstream active methods, and binocular vision is the mainstream passive method. Because passive binocular vision is affected by objective factors such as the external environment and the surface texture of the photographed object, and its automatic feature-point matching algorithm is comparatively complex, it has not yet been popularized in 3D-imaging consumer electronics; the market, especially 3D face recognition and payment, is dominated by the active structured light and time-of-flight (TOF) technologies. A structured light solution comprises a speckle projector and an IR imaging module; within a short range of 1 m its precision reaches the sub-millimeter level, meeting the most demanding 3D face recognition requirements of the financial payment industry. A TOF solution comprises an illumination emitting module and a TOF sensor receiving module; it can perform 3D imaging out to about 10 m and suits many application scenarios.
On the market, structured light and time-of-flight exist only as independent solutions; the technical schemes described in Chinese patent applications CN109343070A and CN206805630U each have advantages at different imaging distances but cannot combine the strengths of both technologies.
Disclosure of Invention
To combine the advantages of structured light and TOF while achieving higher module integration, the invention provides a 3D imaging device that fully fuses the two technologies. Compared with a single structured light scheme, no additional components are required, so the device can achieve both high-precision 3D imaging at short range and ordinary-precision 3D imaging at long range; at the same time, the part count is reduced, the module size shrinks, integration is effectively improved, and the range of application scenarios is greatly widened.
The invention adopts the following specific technical scheme:
a structured light and TOF technique fused 3D imaging apparatus comprising:
a projection module comprising a light source, a collimating lens, and a diffractive optical element; the projection module has a structured light emission mode and a TOF emission mode; projecting a structured light pattern towards a target object in a structured light emission mode; projecting a flood illumination pattern toward a target object in a TOF transmit mode;
the receiving module is used for collecting the optical signal emitted by the target object;
and the control calculation module is used for controlling the projection module and the receiving module and calculating the depth information of the target object according to the optical signal collected by the receiving module.
The projection module has two working modes, the structured light emission mode and the TOF emission mode; the two modes share the same downstream emission light path, and switching between them is controlled by the control and calculation module.
In order to integrate the two emission modes in the same projection module, the invention improves the arrangement of the light source or the collimating lens.
As a further improvement, a power component is provided to drive the light source along the optical axis; when the light source is at the in-focus position of the optical system, the projection module is in the structured light emission mode, and when the light source is at the out-of-focus position, the projection module is in the TOF emission mode.
In the above preferred embodiment, the power component moves the light source along the optical axis so that the light source occupies either the in-focus or the out-of-focus position of the optical system, thereby controlling the emission mode of the projection module. Further preferably, the power component may be a micro-mechanical device or a micro motion platform.
As a further improvement, the collimating lens of the projection module can move along the optical axis, switching the projection module between the structured light emission mode and the TOF emission mode.
Similarly, in this preferred embodiment, moving the collimating lens along the optical axis places the light source at the in-focus or out-of-focus position of the optical system and thereby controls the emission mode of the projection module.
In addition, in the technical scheme of the invention, the projection module can also obtain a structured light emission mode and a TOF emission mode by changing the arrangement of the light-emitting holes of the light source, without adding any other components.
As a further improvement, the light source is divided into a TOF working area and a structured light working area that differ in height along the optical axis; the optical-axis position of the TOF working area is the out-of-focus position of the projection optical system and serves to project a flood illumination pattern onto the target object, while the optical-axis position of the structured light working area is the in-focus position and serves to project the structured light pattern onto the target object.
In this preferred technical solution, the light-emitting holes of the light source are divided into a structured light working area and a TOF working area corresponding to the two emission modes. The holes of the two areas differ in height along the optical axis and therefore sit at the out-of-focus and in-focus positions of the same collimating lens; lighting the holes of one area or the other produces the corresponding emission mode.
As a further improvement, the light-emitting holes in the TOF working area are arranged in a regular lattice or distributed as random scattered points.
In another preferred technical solution, the projection module can be switched between the structured light emission mode and the TOF emission mode by controlling the lighting density of the light-emitting holes.
As a further improvement, the light source has light-emitting holes on a single light-emitting surface; when only part of the holes emit light beams the light source is in the structured light emission mode, and when all of the holes emit light beams it is in the TOF emission mode.
As a further improvement, the light-emitting holes of the light source are divided into a TOF working area and a structured light working area, the structured light working area being an independent connected region containing part of the holes, while the light-emitting holes of the TOF working area cover the entire light-emitting surface of the light source.
As a further improvement, the light-emitting holes of the TOF working area and of the structured light working area together cover the entire light-emitting surface of the light source, and the hole density in the structured light working area is lower than that in the TOF working area.
As a further improvement, the receiving module comprises an image sensor, an optical filter, and an imaging lens, wherein the image sensor is composed of array pixel units and collects optical signals and converts them into electrical signals.
As a further improvement, in the structured light emission mode the control and calculation module calculates the depth information of the target object from the collected light signals based on the triangulation ranging principle;
in the TOF emission mode, the control and calculation module calculates the depth information of the target object from the collected optical signals based on the time-of-flight ranging principle.
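The two ranging principles can be sketched numerically. The following Python snippet is illustrative only; the focal length, baseline, and timing values are hypothetical and not taken from the patent:

```python
# Illustrative sketch of the two ranging principles; all numbers hypothetical.
C = 299_792_458.0  # speed of light, m/s

def depth_from_triangulation(focal_px: float, baseline_m: float,
                             disparity_px: float) -> float:
    """Structured light mode: triangulation gives z = f * b / d, where d is
    the speckle disparity between the reference and the captured image."""
    return focal_px * baseline_m / disparity_px

def depth_from_tof(round_trip_s: float) -> float:
    """TOF mode: the light pulse travels to the target and back, so z = c*t/2."""
    return C * round_trip_s / 2.0

# Hypothetical numbers: 1400 px focal length, 40 mm baseline, 56 px disparity.
z_near = depth_from_triangulation(1400.0, 0.040, 56.0)  # 1.0 m
z_far = depth_from_tof(66.7e-9)                         # roughly 10 m
```

Note how the error sources differ: triangulation precision degrades with the square of distance (disparity shrinks), while TOF precision is set by timing resolution and is nearly distance-independent — which is why the patent pairs them.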
Compared with the prior art, the device provided by the invention integrates the two working modes of structured light and TOF technology without adding other components; the number of integrated parts is reduced, the module size shrinks, integration is effectively improved, and the range of application scenarios is greatly widened.
Drawings
FIG. 1 is a schematic diagram of a 3D imaging apparatus fusing structured light and TOF technology in an embodiment of the invention;
FIG. 2 is a schematic diagram of a projection module in an embodiment of the invention, wherein (a) shows the internal structure of the projection module and (b) shows the speckle patterns corresponding to the two projection modes;
FIG. 3 shows a structured light speckle pattern and a TOF flood illumination pattern in an embodiment of the invention, wherein (a) is the speckle pattern in the structured light working mode and (b) is the TOF flood illumination pattern;
FIG. 4 is a schematic diagram of a projection module in an embodiment of the invention;
FIG. 5 is a schematic diagram of a projection module in an embodiment of the invention, wherein (a) shows the internal structure of the projection module and (b) shows the distribution of the light-emitting holes over the two working areas;
FIG. 6 is a schematic diagram of a projection module in an embodiment of the invention, wherein (a) shows the internal structure of the projection module and (b) shows the distribution of the light-emitting holes over the two working areas;
FIG. 7 is a schematic diagram of a high-density speckle pattern and a sparse speckle pattern in an embodiment of the invention, wherein (a) is the high-density speckle pattern and (b) is the sparse speckle pattern;
FIG. 8 is a schematic view of a laser source partition in an embodiment of the invention;
FIG. 9 is a schematic diagram of an image sensor pixel array in an embodiment of the invention;
FIG. 10 is a schematic diagram of the internal circuitry of a single pixel of an image sensor in an embodiment of the invention;
FIG. 11 is a signal timing diagram for the structured light mode of operation;
fig. 12 is a signal timing diagram in TOF mode of operation.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those specifically described herein, and thus the present invention is not limited to the specific embodiments disclosed below.
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are illustrative and intended to be illustrative of the invention and are not to be construed as limiting the invention.
In the present invention, unless otherwise expressly specified or limited, the terms "mounted," "connected," "secured," and the like are to be construed broadly: for example, a connection may be fixed, detachable, or integral; mechanical or electrical; direct, or indirect through an intermediate medium; or an internal communication between two elements. Those skilled in the art can understand the specific meanings of the above terms in the present invention according to the specific situation.
The 3D imaging apparatus as shown in fig. 1, comprises:
a projection module 10 including a light source 101, a collimating lens 102, and a diffractive optical element 103; the projection module 10 has a structured light emission mode, in which it projects a structured light pattern toward the target object, and a TOF emission mode, in which it projects a flood illumination pattern toward the target object;
a receiving module 20, configured to collect the optical signal reflected by the target object;
and a control and calculation module 30, configured to control the projection module 10 and the receiving module 20, and to calculate the depth information of the target object from the light signal collected by the receiving module 20.
In this embodiment, the projection module 10 projects a structured light pattern onto the target object in front in the structured light working mode; the pattern may include one or more of pseudo-random speckle, stripe, two-dimensional-code, and similar patterns, which may be combined in different embodiments. In the TOF working mode the projection module 10 instead projects a flood illumination pattern onto the target object, which may be a uniformly lit rectangular spot or a dot-matrix pattern of sufficiently high density. Under the control of the control and calculation module 30, the projection module 10 therefore has two working modes, the structured light emission mode and the TOF emission mode. The two modes share the same downstream emission light path and differ in the active light-source working area and in the driving pulses: the structured light emission mode uses low-frequency wide pulses, while the TOF emission mode uses high-frequency narrow pulses.
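The contrast between the two drive regimes can be made concrete with a small sketch. The specific numbers below are hypothetical, chosen only to respect the low-frequency/wide-pulse versus high-frequency/narrow-pulse split described above (the description later bounds the structured light mode at under 120 Hz and over 1 ms):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class DriveConfig:
    frequency_hz: float   # pulse repetition rate
    pulse_width_s: float  # width of each current pulse

# Hypothetical values: structured light uses low-frequency wide pulses,
# TOF uses high-frequency narrow pulses.
STRUCTURED_LIGHT = DriveConfig(frequency_hz=60.0, pulse_width_s=2e-3)
TOF = DriveConfig(frequency_hz=10e6, pulse_width_s=30e-9)

def duty_cycle(cfg: DriveConfig) -> float:
    """Fraction of each repetition period during which the source is on."""
    return cfg.frequency_hz * cfg.pulse_width_s
```

The wide pulse gives the image sensor a long exposure window for a sharp speckle image; the narrow pulse gives the TOF pixel a short, well-defined timing edge to measure flight time against.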
The receiving module 20 receives the optical signal emitted by the projection module 10 and reflected back by the target object, and comprises three sub-modules: an image sensor 201, an optical filter 202, and an imaging lens 203. Under the control of the control and calculation module 30 it likewise has two working modes: a structured light receive mode and a TOF receive mode.
The control and calculation module 30 selects the working mode of the projection module 10 and supplies it with the corresponding driving signals; it also controls the working mode of the receiving module 20 so that the returned light signals are collected properly, and it calculates the depth (distance) information of the image by processing the emitted and returned light signals with the corresponding algorithm.
As shown in fig. 1, the projection module specifically comprises a laser light source 101, a collimating lens 102, and a diffractive optical element (DOE) 103. The laser light source 101 is a vertical-cavity surface-emitting laser (VCSEL) whose emitters are distributed in a pseudo-random lattice; the emission wavelength is 940 nm here but can be chosen to suit the system. The collimating lens 102 is drawn as a single lens but may comprise several lenses; it collimates the beams emitted from all the light-emitting holes 1010 of the laser light source 101 so that each beam enters the subsequent DOE 103 as an approximately parallel beam. The DOE 103 is manufactured by a micro-nano process and spatially modulates each incident beam; its preferred function in this embodiment is to split and replicate each beam. For example, if the laser light source contains 400 light-emitting holes and the DOE replicates each beam 80-fold, 400 × 80 = 32,000 beams are obtained.
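The beam-count arithmetic of that example is simply multiplicative — each VCSEL hole produces one collimated beam, and the DOE replicates every beam the same number of times:

```python
def projected_beam_count(light_emitting_holes: int, doe_copies: int) -> int:
    """Total projected beams after the DOE splits/replicates each
    collimated beam from the VCSEL array."""
    return light_emitting_holes * doe_copies

# The example from the text: 400 holes, each beam replicated 80-fold.
beams = projected_beam_count(400, 80)  # 32000
```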
Fig. 2 (a) shows a projection module 10 according to an embodiment of the invention in which the laser light source 101 is mounted on a movable micro-mechanical device serving as the power component and can move back and forth along the optical axis z under the control of the control and calculation module 30. By basic imaging optics, when the laser light source is at the in-focus position of the optical system the projected speckles are in a focused state: smallest, sharp, and bright. Conversely, when the laser source is at an out-of-focus position the projected speckles spread out, the spots enlarge, and the light energy distribution disperses accordingly, as shown in fig. 2 (b). In this embodiment, when the system works in the structured light mode, the z-axis position of the laser source 101 is held at the "structured light working mode" position of fig. 2 (a), and the projection module 10 projects the speckle of fig. 3 (a): discrete, focused, sharp, and bright. The speckle pattern is projected onto the target object and reflected back; the receiving module 20 collects it in the corresponding working mode, and the control and calculation module 30 calculates the depth information of the target object from the collected speckle image based on the triangulation ranging principle. When the system works in the TOF mode, the laser light source 101 is moved to the "TOF working mode" position of fig. 2 (a), an out-of-focus position; the speckles projected by the projection module 10, shown in fig. 3 (b), spread out, fill the gaps between one another, and merge into a uniformly lit flood illumination pattern.
The flood pattern is projected onto the target object and reflected back; the receiving module 20 collects it in the corresponding working mode, and the control and calculation module 30 calculates the depth information of the target object from the collected flood image based on the time-of-flight ranging principle. Note that the travel of the laser light source 101 is adjusted to the focal length of the particular optics and may be along either the +z or the −z direction.
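A rough geometric-optics estimate shows why a small axial shift of the source is enough to merge the speckles into flood illumination. The sketch below uses the thin-lens approximation — a point source displaced δ from the focal plane of a collimator of aperture D and focal length f leaves a residual divergence of roughly D·δ/(2f²) — and all numeric values are hypothetical, not taken from the patent:

```python
def defocus_spot_diameter(aperture_m: float, focal_m: float,
                          defocus_m: float, range_m: float) -> float:
    """Approximate far-field blur diameter of one speckle at distance
    `range_m` when the source sits `defocus_m` away from the focal plane.
    Residual half-angle ~ (D/2) * delta / f**2, so the spot grows to
    about D * delta * L / f**2 (initial aperture neglected for large L)."""
    return aperture_m * defocus_m * range_m / focal_m ** 2

# Hypothetical collimator: D = 1 mm, f = 4 mm, source shifted 40 um.
# At a 1 m target each speckle blurs to ~2.5 mm, so neighbors overlap.
blur = defocus_spot_diameter(1e-3, 4e-3, 40e-6, 1.0)
```

With typical speckle pitches of a few millimeters at 1 m, a blur of that order is what lets adjacent spots "fill the gaps between one another" as the text describes.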
In another embodiment, shown in fig. 4, the projection optical system is instead brought to the out-of-focus state by moving the collimating lens 102, so that the projected speckle changes from sharp focus to diffuse blur; this likewise switches the projection module from projecting the sharp speckle pattern required by the structured light working mode to projecting the flood illumination pattern required by the TOF working mode.
In another embodiment, shown in fig. 5 (a), the laser light source 101 of the projection module 10 is divided along the y-axis into two regions: a TOF working region 1011 and a structured light working region 1012. Fig. 5 (b) is a top view of the laser light source; the two regions differ in height along the z-axis, the z position of the TOF working region being the out-of-focus position of the projection optical system and that of the structured light working region being the in-focus position. When the system is in the TOF working mode, the control and calculation module applies the corresponding driving signal to the TOF working region 1011 to light its holes, and the defocusing effect produces a uniformly lit flood illumination pattern. When the system is in the structured light working mode, the control and calculation module applies the corresponding driving signal to the structured light working region 1012 to light its holes and produce a sharply focused structured light speckle pattern. The receiving module operates as before, and the system finally obtains structured-light depth information of the target object based on the triangulation ranging principle and TOF depth information based on the time-of-flight ranging principle, which are not repeated here. Note that the arrangement of the light-emitting holes in the TOF working region may be the random scatter distribution illustrated in this embodiment or a regular lattice, because TOF ranging places no requirement on the hole arrangement; only the overall flood illumination pattern on the target object matters.
The laser light source may equally be divided along the x-axis.
In another embodiment, shown in fig. 6 (a), the structured light working region 1012 of the laser light source 101 occupies only half of the light source, while the TOF working region 1011 is the entire light source and therefore includes the structured light working region. Fig. 6 (b) is a top view of the laser light source 101: all light-emitting hole areas sit at the in-focus position of the projection optical system and therefore project a focused, sharp speckle pattern. When the system is in the TOF working mode, the control and calculation module drives all the holes of the TOF working region 1011; because all holes lase and their number is large enough, a focused high-density speckle pattern forms on the target object, dense enough that its illumination approaches a uniformly lit flood pattern. When the system is in the structured light working mode, the control and calculation module drives only the holes of the structured light working region 1012, producing a sharply focused structured light speckle pattern that is sparse compared with the TOF-mode pattern, as shown in fig. 7 (b). The receiving module operates as before, and the system finally obtains structured-light depth information based on the triangulation ranging principle and TOF depth information based on the time-of-flight ranging principle, which are not repeated here. Note that the partitioning of the laser light source 101 is not limited to the bisection of this embodiment; various other partitions are possible, as shown in fig. 8. The ultimate aim is to light more hole regions to form a high-density speckle or dot-matrix pattern in the TOF working mode and to light fewer hole regions to form a relatively sparse speckle pattern in the structured light working mode. Alternatively, if the total number of light-emitting holes is large enough (> 300), both working modes can use the same high-density speckle pattern without any partitioning, since both high- and low-density speckle patterns work normally in the structured light mode; the difference is that at low density the drive-current ceiling of each hole is higher, the power is higher, and the device then suits structured-light 3D imaging of more distant targets.
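The power-budget argument at the end of this embodiment — fewer lit holes leave more drive current, and hence more optical power, per hole — can be sketched as follows. The 2000 mA module budget and the hole counts are hypothetical, chosen only for illustration:

```python
def per_hole_current_ma(total_budget_ma: float, holes_lit: int) -> float:
    """Drive current available per light-emitting hole at a fixed
    module-level current budget, assuming equal sharing."""
    return total_budget_ma / holes_lit

# Hypothetical 2000 mA budget:
dense = per_hole_current_ma(2000.0, 400)   # 5 mA/hole  (TOF / dense pattern)
sparse = per_hole_current_ma(2000.0, 100)  # 20 mA/hole (sparse structured light)
# The sparse pattern drives each emitter 4x harder, so each projected
# speckle is brighter and remains detectable on more distant targets.
```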
Correspondingly, the receiving module 20 in this embodiment also has two working modes, the TOF working mode and the structured light working mode, and under the control of the control and calculation module it collects the illumination pattern emitted by the projection module 10 in the corresponding mode. The receiving module 20 comprises an image sensor 201, an optical filter 202, and an imaging lens 203. The image sensor 201 is composed of array pixel units 2010, as shown in fig. 9; their horizontal and vertical counts follow from the system design, and this embodiment preferably uses a resolution of 1280 × 800. The array pixel unit 2010 may be a charge-coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) sensor; its role is to collect the light signal, convert it into an electrical signal, integrate it, and quantify the light intensity. As shown in fig. 10, each array pixel unit 2010 contains at least two high-speed gate switches G1 and G2 and charge integration units C1 and C2 connected to them. Light falling on the array pixel unit 2010 is received by the photoelectric conversion element PD, which converts the optical energy into a charge signal that is stored in whichever charge integration unit's high-speed gate switch is open; the amount of charge directly represents the light intensity. A readout circuit, comprising one or more of a signal amplifier, an analog-to-digital converter (ADC), and the like, is also typically connected to the array pixel unit 2010. The optical filter 202 blocks unwanted ambient light from entering the image sensor 201 and causing interference, and is typically a narrow-band filter matched to the laser wavelength of the projection module.
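The patent does not spell out the TOF demodulation formula, but the two-gate pixel of fig. 10 matches the standard pulsed two-tap scheme, in which G1 integrates during the emitted pulse and G2 during the window of equal width immediately after it; the returning pulse's charge then splits between C1 and C2 in proportion to its delay. A minimal sketch under that assumption (the formula is a common textbook scheme, not quoted from the patent):

```python
C_LIGHT = 299_792_458.0  # speed of light, m/s

def two_tap_tof_depth(q1: float, q2: float, pulse_width_s: float) -> float:
    """Assumed standard two-tap pulsed-TOF estimate: with gate G1 open
    during the emitted pulse of width T0 and G2 open during the next
    window of equal width, the echo delay is t = T0 * Q2 / (Q1 + Q2)
    and the depth is z = c * t / 2."""
    t = pulse_width_s * q2 / (q1 + q2)
    return C_LIGHT * t / 2.0

# Equal charge in both taps with a 30 ns pulse -> t = 15 ns -> z ~ 2.25 m.
z = two_tap_tof_depth(q1=1.0, q2=1.0, pulse_width_s=30e-9)
```

In practice a second exposure without illumination is usually subtracted to cancel the ambient-light charge that the narrow-band filter cannot fully remove.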
A typical 3D imaging device is also equipped with a color imaging camera, a proximity sensor, an IMU (inertial measurement unit) and similar devices. These are auxiliary accessories rather than core functional modules, intended to enable richer functions such as 3D texture modeling and infrared face recognition. Their technology is mature and is not described in detail in this embodiment.
Based on the device structure of this embodiment, the specific working process of the device of the invention is as follows:
When the 3D imaging apparatus starts to operate, as shown in fig. 11, it first enters the structured light operating mode. The control and calculation module 30 provides a low-frequency, wide-pulse current driving signal to the emission module 10, forming the light emission driving timing 401, typically with a frequency below 120 Hz and a pulse width greater than 1 ms. Under the control of the control and calculation module 30, the laser light source 101 switches to the "structured light operating mode" and emits laser light following the light emission driving timing 401; the light passes through the subsequent emission optical path, finally forming a structured light speckle pattern on the target object in the front space, and part of the speckle light signal returns to the receiving module 20 to be collected. When the projection module 10 starts to work, the control and calculation module 30 provides a light receiving synchronization timing 402 signal to the receiving module 20. After receiving this signal, the array pixel units 2010 of the image sensor 201 keep both high-speed gate switches G1 and G2 on during the high-level period of the synchronization timing, convert the returned light signal in that period into an electrical signal, and integrate and store it in the corresponding charge integration units, completing one exposure of the structured light image. The receiving module 20 sends the acquired structured light image to the control and calculation module 30 for algorithmic processing, which, based on the triangulation distance measurement principle, yields a high-precision depth map of the target object within the short-distance range.
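As a hedged illustration of the triangulation step (the reference-plane disparity model, the function name and all numbers below are assumptions for illustration, not taken from the patent), a monocular structured-light system typically compares the captured speckle image with a reference image recorded at a known distance:

```python
# Illustrative sketch: depth from speckle disparity against a reference
# plane, using the similar-triangles relation common in monocular
# structured-light systems:  1/Z = 1/Z_ref + d / (f * b)
# (d: disparity in pixels, f: focal length in pixels, b: baseline in m).
def triangulation_depth(disparity_px, focal_px, baseline_m, ref_depth_m):
    return 1.0 / (1.0 / ref_depth_m + disparity_px / (focal_px * baseline_m))

# Zero disparity reproduces the reference distance; positive disparity
# (object nearer than the reference plane) reduces the computed depth.
print(round(triangulation_depth(0.0, 1200.0, 0.05, 0.6), 3))    # 0.6
print(round(triangulation_depth(100.0, 1200.0, 0.05, 0.6), 3))  # 0.3
```

The strong depth sensitivity of disparity at small Z is what makes this mode high-precision at short range, consistent with the short-distance role the text assigns to it.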
In general, the light emission driving timing 401 lasts for one or more pulse periods, and the light receiving synchronization timing 402 likewise. In this preferred embodiment it lasts 1 pulse period at a frequency of 30 Hz, i.e. the image frame rate of the receiving module 20 is 30 fps, so one frame of the high-precision short-range depth map of the target object is obtained within 33 ms.
After the structured light operating mode is completed, as shown in fig. 12, the control and calculation module 30 provides a high-frequency, narrow-pulse current driving signal to the emission module 10 to form the light emission driving timing 403, typically with a frequency of 1 MHz to 100 MHz and a pulse width of less than 100 microseconds. Under the control of the control and calculation module 30, the laser light source 101 switches to the "TOF operating mode" and emits laser light following the light emission driving timing 403; the light passes through the subsequent emission optical path, finally forming a TOF flood illumination pattern on the target object in the front space, and part of the illumination light signal returns to the receiving module 20 to be collected. While the projection module 10 is switched to the "TOF operating mode", the control and calculation module 30 provides the C1 operation timing 404 and the C2 operation timing 405 signals to the high-speed gate switches G1 and G2 respectively, a high level indicating that the switch is on, so that high-speed switching is realized under these timing signals. As can be seen from fig. 7, the C1 operation timing 404 is completely synchronized with the light emission driving timing 403 and completely complementary to the C2 operation timing 405; that is, the high-speed gate switch G2 must be in an off state during the period in which the high-speed gate switch G1 is on, and the sampling integration periods of the charge integration units C1 and C2 are likewise complementary.
In the TOF operating mode, the projection module 10 emits a flood illumination pattern; after it illuminates an object in the front space, part of the light is reflected back to the receiving module 20 to form the reflected light signal 500 in fig. 12. It can be seen that the reflected light signal 500 is delayed in time relative to the light emission driving timing 403, i.e. a phase delay is produced. By obtaining the phase delays of all the array pixel units 2010, the depth information, i.e. the distance, of the sampled points on all objects in the front space can be directly calculated using the following formula:
distance = phase delay × speed of light / 2
In this preferred embodiment, the phase delay is Q2/(Q1 + Q2), where Q1 is the charge integrated by the charge integration unit C1 during the high-level interval of the reflected light signal 500, i.e. the sampled reflected light energy, as shown by the shaded area in fig. 7, and Q2 is the charge integrated by the charge integration unit C2 during the same interval. This method of obtaining and calculating the phase delay is commonly used in the industry; in some embodiments, improved calculation methods may be used, including ambient light suppression and long-range schemes with more high-speed gate switches, whose basic principles are similar and are not described in detail here. The conventional four-step phase-shifting method may also be used to calculate the phase delay, with a correspondingly different array pixel unit 2010 structure; the final purpose is still to obtain the phase delay, without departing from the scope of this embodiment.
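Combining the charge ratio with the distance formula gives a short sketch (the function name and the 20 ns pulse width are assumptions for illustration, not values from the patent):

```python
# Illustrative sketch of the formulas above: the charge ratio
# Q2 / (Q1 + Q2) gives the echo delay as a fraction of the pulse width,
# and distance = delay * c / 2 (round trip halved).
C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(q1, q2, pulse_width_s):
    delay_s = pulse_width_s * q2 / (q1 + q2)
    return C * delay_s / 2.0

# Equal charges mean the echo is delayed by half the pulse width:
# a 20 ns pulse -> 10 ns delay -> about 1.5 m.
print(round(tof_distance_m(1000.0, 1000.0, 20e-9), 3))  # 1.499
```

Note the pulse width sets the unambiguous range: a delay equal to the full pulse width (Q1 = 0) corresponds to the maximum measurable distance of this simple two-tap scheme.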
Under the control of the control and calculation module 30, the light emission driving timing 403, the C1 operation timing 404 and the C2 operation timing 405 may continue until the end of the single-frame exposure time (e.g. 33 ms), which typically contains thousands of pulse timing cycles. When the single-frame exposure of the receiving module 20 ends, one frame of an ordinary-precision depth map over the long-distance range can be calculated from the phase delays obtained for all the array pixel units 2010.
The system thus obtains one frame of a high-precision short-range depth map and one frame of an ordinary-precision long-range depth map, and fully fuses the information of the two through the depth map segmentation and inter-frame fusion algorithm embedded in the control and calculation module 30 to form a complete depth map output: the short range (within 1 m) is high-precision, obtained by structured light 3D imaging, and the long range (1-10 m) is ordinary-precision, obtained by time-of-flight 3D imaging. The fusion of structured light and time-of-flight technology provided by this embodiment therefore achieves high-precision 3D imaging at short range and ordinary-precision 3D imaging at long range, greatly extending the range of application scenarios of the device.
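A minimal per-pixel version of such a fusion might look like the following sketch. The 1 m threshold comes from the text above; the use of 0 as an "invalid" code, the function name and the NumPy representation are assumptions for illustration, not the patent's actual segmentation and fusion algorithm.

```python
import numpy as np

def fuse_depth(sl_depth, tof_depth, near_limit_m=1.0):
    """Keep the structured-light value where it is valid and inside its
    high-accuracy range; fall back to the TOF value elsewhere.
    Inputs are float arrays in metres, with 0 meaning 'no measurement'."""
    sl_valid = (sl_depth > 0) & (sl_depth <= near_limit_m)
    return np.where(sl_valid, sl_depth, tof_depth)

sl  = np.array([0.45, 0.90, 0.00, 0.00])   # structured light: near range only
tof = np.array([0.47, 0.95, 2.60, 7.10])   # TOF: full range, coarser
print(fuse_depth(sl, tof).tolist())        # [0.45, 0.9, 2.6, 7.1]
```

A real implementation would also have to handle the transition band around the threshold (e.g. blending or consistency checks) rather than switching hard, but the selection logic is the same.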
In a static 3D imaging application scenario, the device needs only two frames to complete a measurement. In a dynamic 3D imaging scenario, the output frame rate is half the frame rate of the receiving module 20; for example, in this preferred embodiment the receiving module 20 runs at 30 fps, so the final depth output frame rate of the system is 15 fps.
The above description is only exemplary of the preferred embodiments of the present invention, and is not intended to limit the present invention, and any modifications, equivalents, improvements, etc. made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A structured light and TOF technology fused 3D imaging apparatus comprising:
a projection module (10) comprising a light source (101), a collimating lens (102), and a diffractive optical element (103); the projection module (10) has a structured light emission mode and a TOF emission mode; projecting a structured light pattern toward a target object in the structured light emission mode; projecting a flood illumination pattern toward the target object in the TOF emission mode;
a receiving module (20) for collecting the optical signal reflected from the target object;
and the control calculation module (30) is used for controlling the projection module (10) and the receiving module (20) and calculating the depth information of the target object according to the optical signals collected by the receiving module (20).
2. The structured light and TOF technology fused 3D imaging apparatus according to claim 1, wherein a power component is provided to drive the light source (101) to move in the direction of the optical axis; when the light source (101) is in the in-focus position of the optical system, the projection module (10) is in a structured light emission mode; the projection module (10) is in a TOF transmit mode when the light source (101) is in an out-of-focus position of the optical system.
3. The structured light and TOF technique fused 3D imaging apparatus according to claim 1 wherein the collimating lens (102) of the projection module (10) is movable in the direction of the optical axis controlling the projection module (10) to switch between the structured light emission mode and the TOF emission mode.
4. The structured light and TOF technology fused 3D imaging apparatus according to claim 1 wherein the light source (101) is divided into a TOF working area (1011) and a structured light working area (1012) with height difference in the optical axis direction; the optical axis position corresponding to the TOF working area (1011) is the out-of-focus position of the optical projection system and is used for projecting a floodlighting pattern to a target object; the optical axis position corresponding to the structured light work area (1012) is an in-focus position of the optical projection system for projecting the structured light pattern towards the target object.
5. The structured light and TOF technology fused 3D imaging apparatus according to claim 4 wherein the light emitting apertures within the TOF working area (1011) are in a regular lattice arrangement or a random scatter distribution.
6. The structured light and TOF technology fused 3D imaging device according to claim 1, wherein the light emitting holes of the light source (101) are on the same light emitting face; the projection module is in the structured light emission mode when only part of the light emitting holes emit light beams, and in the TOF emission mode when all the light emitting holes emit light beams.
7. The structured light and TOF technology fused 3D imaging apparatus according to claim 6 wherein the light emitting aperture of the light source (101) is divided into a TOF working area (1011) and a structured light working area (1012), and part of the light emitting apertures in the structured light working area (1012) are independent connected areas; the light emitting aperture of the TOF working area (1011) covers the entire light emitting face of the light source.
8. The structured light and TOF technology fused 3D imaging apparatus according to claim 6, wherein the light emitting apertures of the TOF working area (1011) and the structured light working area (1012) each cover the entire light emitting face of the light source, and the density of light emitting apertures within the structured light working area (1012) is smaller than the TOF working area (1011).
9. The structured light and TOF technology fused 3D imaging apparatus according to claim 1, wherein the receiving module (20) comprises an image sensor (201), an optical filter (202) and an imaging lens (203), wherein the image sensor (201) is composed of an array of pixel units (2010) for collecting and converting optical signals into electrical signals.
10. The structured light and TOF technology fused 3D imaging apparatus according to claim 1, wherein in the structured light emission mode, the control calculation module (30) calculates depth information of the target object based on a triangulation distance measurement principle from the collected light signals;
in a TOF emission mode, the control calculation module (30) calculates depth information of the target object based on a time-of-flight distance measurement principle according to the collected optical signals.
CN202110459066.7A 2021-04-27 2021-04-27 3D imaging device with structured light and TOF technology integrated Pending CN113238248A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110459066.7A CN113238248A (en) 2021-04-27 2021-04-27 3D imaging device with structured light and TOF technology integrated


Publications (1)

Publication Number Publication Date
CN113238248A true CN113238248A (en) 2021-08-10

Family

ID=77129330

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110459066.7A Pending CN113238248A (en) 2021-04-27 2021-04-27 3D imaging device with structured light and TOF technology integrated

Country Status (1)

Country Link
CN (1) CN113238248A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113542534A (en) * 2021-09-17 2021-10-22 珠海视熙科技有限公司 TOF camera control method and device and storage medium
CN114428437A (en) * 2022-01-14 2022-05-03 深圳市安思疆科技有限公司 3D projector and electronic equipment that structured light and floodlight illumination closed and put
CN114779487A (en) * 2022-04-28 2022-07-22 深圳市安思疆科技有限公司 Optical device and optical system
CN114428437B (en) * 2022-01-14 2024-05-14 深圳市安思疆科技有限公司 3D projector and electronic equipment that structured light and floodlight illumination put together



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination