CN216246133U - Structured light projection device, depth data measuring head and computing equipment - Google Patents

Structured light projection device, depth data measuring head and computing equipment

Info

Publication number
CN216246133U
CN216246133U CN202120814643.5U
Authority
CN
China
Prior art keywords
light
sub
image sensor
depth data
projection
Prior art date
Legal status
Active
Application number
CN202120814643.5U
Other languages
Chinese (zh)
Inventor
王敏捷
梁雨时
Current Assignee
Shanghai Tuyang Information Technology Co ltd
Original Assignee
Shanghai Tuyang Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Tuyang Information Technology Co Ltd
Priority to CN202120814643.5U
Application granted
Publication of CN216246133U
Active legal status
Anticipated expiration

Abstract

A structured light projection device, a depth data measuring head and a computing device are disclosed. The structured light projection device includes: a light source module for generating and emitting a textured light beam; and a steering projection module comprising a reflection device, arranged on the exit path of the light beam, for reflecting the incident beam so that it exits the device, and a driving device, connected to the reflection device, for changing the angle of the reflection device relative to the incident beam and thereby the exit direction of the beam. The utility model uses this improved structured light projection device, which reflects the structured light generated by the light source module at different angles, to realize multi-pattern projection that is faster, more economical and less failure-prone. Furthermore, the structured light projection device can be combined with multiple pairs of binocular sensors sharing a common optical path, further shortening the frame interval and improving the quality of the fused depth data.

Description

Structured light projection device, depth data measuring head and computing equipment
Technical Field
The utility model relates to the technical field of three-dimensional detection, in particular to a structured light projection device, a depth data measuring head and computing equipment.
Background
In recent years, three-dimensional imaging technology has developed rapidly. A binocular detection scheme based on structured light can currently measure the surface of an object in three dimensions in real time. Briefly, the scheme first projects a two-dimensional laser texture pattern carrying coded information, such as a discretized speckle pattern, onto the surface of the object. Two image acquisition devices with a fixed relative position continuously capture the laser texture, and a processing unit samples, through a sampling window, the two images acquired simultaneously by the two devices, determines the matching laser texture patterns within the sampling window, calculates the depth of each laser texture segment projected onto the surface from the difference between the matched patterns, and thereby obtains the three-dimensional data of the measured surface.
In the matching process, the larger the sampling window, the more pattern information a single sample contains and the easier matching becomes, but the coarser the granularity of the resulting depth image. Conversely, the smaller the sampling window, the finer the image granularity, but the higher the mismatch rate. Although the sampling window can be reduced by capturing multiple sets of different images in succession, this adds system complexity and can lower the frame rate.
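To make the window-size trade-off concrete, the following is a minimal, illustrative sketch of window-based binocular matching and disparity-to-depth triangulation. It is not taken from the patent; the SAD cost, the function name and the parameter values are assumptions chosen only to show how the sampling window enters the calculation.

```python
import numpy as np

def match_depth(left, right, focal_px, baseline_m, win=7, max_disp=64):
    """Coarse depth map from a rectified image pair (hypothetical helper).

    A larger `win` packs more texture into each sample, so matching is easier
    but the depth map is coarser; a smaller `win` gives finer granularity but
    more mismatches, as described in the text.
    """
    h, w = left.shape
    r = win // 2
    depth = np.zeros((h, w), dtype=np.float32)
    for y in range(r, h - r):
        for x in range(r + max_disp, w - r):
            ref = left[y - r:y + r + 1, x - r:x + r + 1].astype(np.int32)
            costs = [
                np.abs(ref - right[y - r:y + r + 1,
                                   x - d - r:x - d + r + 1].astype(np.int32)).sum()
                for d in range(max_disp)
            ]
            d_best = int(np.argmin(costs))
            if d_best > 0:                      # triangulation: Z = f * B / d
                depth[y, x] = focal_px * baseline_m / d_best
    return depth
```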
To this end, there is a need for an improved structured light projection device, and a depth data measurement scheme utilizing the same.
SUMMARY OF THE UTILITY MODEL
A technical problem to be solved by the present disclosure is to provide a depth data measurement scheme using an improved structured light projection device that can reflect the structured light generated by the light source module at different angles, thereby enabling faster, more economical and less failure-prone multi-pattern projection. Furthermore, the structured light projection device can be combined with multiple pairs of binocular sensors sharing a common optical path, further shortening the frame interval and improving the quality of the fused depth data.
According to a first aspect of the present disclosure, there is provided a structured light projection device comprising: a light source module for generating and emitting a textured light beam; and a steering projection module comprising: a reflection device, arranged on the exit path of the light beam, for reflecting the incident light beam so that it exits the device; and a driving device, connected to the reflection device, for changing the angle of the reflection device relative to the incident light beam so as to change the exit direction of the light beam.
According to a second aspect of the present disclosure, there is provided a depth data measuring head comprising: the structured light projection device according to the first aspect, configured to project the textured light beam into the measured space at different projection angles under the drive of the driving device, so as to form different textures on the object to be detected in the measured space; and a first image sensor and a second image sensor, arranged on either side of the structured light projection device and having a predetermined relative spatial position, which image the measured space at least twice during the motion of the reflection device so as to obtain at least two sets of images with different texture distributions, the at least two sets of images being used to obtain the single-measurement depth data of the object to be detected.
According to a third aspect of the present disclosure, there is provided a depth data computing device comprising: the depth data measuring head according to the second aspect, and a processor configured to acquire the at least two sets of images, determine the depth data of the texture in each set of images from the predetermined relative spatial position of the first and second image sensors, and fuse the depth data determined from the at least two sets of images into new depth data serving as the single-measurement depth data of the object to be detected.
Therefore, rotating the reflector that redirects the structured light increases the projection flexibility of the structured light projection device. The device can further be combined with a coaxial binocular scheme to improve the accuracy and imaging speed of a multi-frame fusion scheme.
Drawings
The above and other objects, features and advantages of the present disclosure will become more apparent by describing in greater detail exemplary embodiments thereof with reference to the attached drawings, in which like reference numerals generally represent like parts throughout.
FIG. 1 shows a schematic composition diagram of a structured light projection device according to one embodiment of the present invention.
Fig. 2 shows a schematic view of the present invention for changing the projection direction.
FIG. 3 illustrates a perspective view of a structured light projection device, according to one embodiment of the present invention.
Fig. 4A-B illustrate examples of structured light projected by the structured light projection apparatus of fig. 3 at different perspective angles.
FIG. 5 is a schematic diagram showing the composition of a depth data measuring head according to one embodiment of the present invention.
FIG. 6 shows a comparative timing diagram of coaxial two-group imaging versus single-group imaging.
Fig. 7 shows timing diagrams for coaxial three-group binocular imaging.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
As described above, in the matching process of binocular imaging, the larger the sampling window, the more pattern information a single sample contains and the easier matching becomes, but the coarser the granularity of the resulting depth image. Conversely, the smaller the sampling window, the finer the image granularity, but the higher the mismatch rate. The sampling window can be reduced by capturing multiple sets of different images in succession.
For example, the same light source module could be rotated by a driving device so that it projects from different angles. Even though the light source module projects the same pattern, the pattern appears different in the field of view of the imaging device (image sensor) because of the different projection angles. However, the light source module has its own weight and requires wiring for its power supply, which limits how well a driving device can actuate it. In addition, depth fusion based on multiple frames reduces the frame rate and degrades the capture of moving objects.
Therefore, the present invention provides a depth data measurement scheme using an improved structured light projection device that reflects the structured light generated by the light source module at different angles, thereby enabling faster, more economical and less failure-prone multi-pattern projection. Furthermore, the structured light projection device can be combined with multiple pairs of binocular sensors sharing a common optical path, further shortening the frame interval and improving the quality of the fused depth data.
First, fig. 1 shows a schematic composition diagram of a structured light projection device according to an embodiment of the present invention.
As shown, the structured light projection device 110 may include a light source module I, located above the dotted line in the figure, and a steering projection module II, located below it. The light source module generates the light beam to be projected, and the steering projection module redirects the beam and emits it.
Specifically, the light source module is used for generating and emitting a textured light beam. Generally, the light source module does not project the beam emitted by its light-emitting device directly, but first applies some optical processing to it so that the beam exhibits the desired distribution, brightness or pattern. For this purpose, as shown in fig. 1, the light source module I may include a laser generator 111 and a diffractive optical element (DOE) 112. The laser generator 111 emits a laser beam (illustrated by the single arrow). The DOE 112, disposed in the exit path of the laser beam, modulates the incident laser light, for example diffracting it into discrete spots with a specific projection rule (shown by double arrows to indicate that the diffracted beam has a certain width).
In other embodiments, the light source module I can also be implemented with schemes other than laser plus diffraction. For example, the light source module may comprise a flood light source for generating flood light, and a mask arranged over the flood light source for converting the flood light into discrete spots with a specific projection rule.
Further, the steering projection module reflects the light beam (e.g. the speckle pattern) generated by the light source module out of the device. Unlike a conventional reflective module, however, the steering projection module of the present invention can drive the reflection device to move, so that the direction of the beam it reflects changes.
Specifically, as shown in fig. 1, the steering projection module II may include a reflection device 113 and a driving device 114. The reflection device 113, disposed on the exit path of the light beam, reflects the incident beam so that it exits the device, and the driving device 114, connected to the reflection device, changes the angle of the reflection device relative to the incident beam so as to change the exit direction of the beam.
In the embodiments of the present invention, for convenience of description, the direction in which light exits the measuring head is defined as the z direction, the horizontal direction of the imaging plane as the x direction, and its vertical direction as the y direction. Accordingly, fig. 1, as well as figs. 2, 3 and 4A-B described below, detail an example in which the light source module projects the structured light downward (in the y direction) and the steering projection module redirects it so that it exits along the z direction (in practice, a direction deviating from z by a slight angle). It should be understood that in other embodiments the structured light projection device of the present invention can be arranged or oriented differently, depending on the imaging needs.
For ease of understanding, fig. 2 is a schematic view of how the present invention changes the projection direction. Light emitted by the light source is projected downward along the y axis onto a reflection device, such as a mirror, which reflects it into the measured space, where it forms a spot on an imaging plane perpendicular to the z axis. The reflection device can rotate about the x axis, for example within the angular range A-B shown, so that the spot correspondingly moves within the range A'-B' on the imaging plane. When the light source module I of fig. 1 projects discrete spots with a two-dimensional distribution, the pattern arriving at the mirror is always the same (because the relative positions of the laser generator 111 and the DOE 112 are fixed), but the rotation of the mirror offsets the projected pattern in angle, so that the patterns captured by the image sensor at different projection angles can be treated as different patterns when used together with binocular imaging as described below.
Although fig. 2 shows a mirror that can rotate through a large A-B angular range in order to illustrate the principle of the present invention, it should be understood that in practical applications the angular difference between projections of the same pattern may be small, e.g. 1°, which keeps the imaging ranges substantially coincident while still ensuring that the imaged patterns differ.
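As a quick check on the geometry, a mirror tilted by an angle α deflects the reflected beam by 2α, so the pattern on a plane at distance D shifts by roughly D·tan(2α). The short sketch below illustrates this; the 1 m working distance is an assumption used only for the example, not a value from the patent.

```python
import math

def spot_shift(alpha_deg: float, plane_dist_m: float) -> float:
    """Approximate lateral shift of the projected pattern on the imaging plane
    when the mirror is tilted by alpha_deg (law of reflection: the reflected
    ray turns by twice the mirror rotation)."""
    return plane_dist_m * math.tan(math.radians(2.0 * alpha_deg))

# A +/-1 degree mirror swing at an assumed 1 m working distance:
print(f"pattern offset ~ {spot_shift(1.0, 1.0) * 100:.1f} cm")  # about 3.5 cm
```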
Further, FIG. 3 shows a perspective view of a structured light projection device, according to one embodiment of the present invention. Fig. 4A-B illustrate examples of structured light projected by the structured light projection apparatus of fig. 3 at different perspective angles.
As shown, the laser generator 311 may be disposed inside the housing (or fixed structure) 315. The light it generates is diffracted by the DOE 312 arranged on the exit light path into a diffraction pattern with a certain two-dimensional distribution. This diffraction pattern propagates along the exit light path to the mirror 313, which reflects it so that it exits (approximately) along the z direction and forms a speckle pattern on a projection plane perpendicular to the z direction (as shown in figs. 4A and 4B, where fig. 4B can be regarded as the device of figs. 3 and 4A rotated by 90° about the y axis). The projection plane of figs. 4A and 4B can be regarded as a three-dimensional depiction of the imaging plane shown in fig. 1.
Further, the driving device may control the reflection device to rotate about an axis, wherein the light beam emitted from the light source module is incident on the reflection device in a direction perpendicular to that axis (the x direction), and the exit direction changes as the reflection device rotates about the axis.
As shown in the figure, a rotating shaft extending from the driving device 314 (which may be a motor) is fixedly connected to the mirror 313, so that when the motor operates, the shaft drives the mirror 313 to rotate about its axis, forming projected patterns with angular offsets on the projection plane. To the image sensor that captures the projection, these amount to different patterns, so the projection apparatus of the present invention can conveniently project "different" patterns by rotationally reflecting the structured light (for example, a diffraction pattern with a two-dimensional distribution).
In some embodiments, the steering projection module may be a galvanometer, in which the motor drives the mirror to reciprocate within a certain range. In other embodiments, the steering projection module may be a rotating mirror, in which the motor rotates in only one direction about the axis.
Specifically, the steering projection module may be a mechanical galvanometer that oscillates back and forth at a predetermined frequency, so that it projects the structured light toward the measured area at that frequency and presents a two-dimensional diffraction pattern (discrete spots) moving up and down in the y direction on the projection plane. Because the mechanical galvanometer is controllable, the steering projection module can also be kept stationary for a predetermined window period during its continuous motion.
For example, a mechanical galvanometer may have a motion range of ±1° about the z direction and a very high oscillation frequency, up to 2k times per second. When used together with the image sensors, for example when three frames of images are combined into one depth frame, the mechanical galvanometer can pause briefly each time it reaches -1°, 0° and 1°, for example remaining still for the exposure time required by the photosensitive unit (e.g. 1 ms), so that the projected pattern stays unchanged while the image sensor is exposed, which improves imaging accuracy.
In embodiments using a rotating mirror, although the mirror can rotate in only one direction and its speed is difficult to vary, its rotation angle can be sensed by a photodiode or the like, and structured light projection and dynamic imaging can be carried out within an appropriate angular range, for example projecting and imaging only within a ±1° interval about the z direction out of the full 360° rotation.
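A minimal control-loop sketch of the angle-gated exposure described above is given below. The mirror and camera interfaces (read_angle, hold, resume, trigger_exposure) are hypothetical placeholders and the tolerance value is an assumption; the patent does not prescribe a particular API.

```python
import time

TARGET_ANGLES = (-1.0, 0.0, 1.0)   # degrees, per the example in the text
ANGLE_TOLERANCE = 0.05             # assumed angular tolerance
EXPOSURE_S = 0.001                 # 1 ms exposure, per the example

def capture_at_angles(mirror, camera):
    """Expose once near each target angle, holding the mirror still during
    the exposure (galvanometer case) so the projected pattern does not move."""
    frames = []
    for target in TARGET_ANGLES:
        while abs(mirror.read_angle() - target) > ANGLE_TOLERANCE:
            time.sleep(1e-5)                 # poll the PD / encoder reading
        mirror.hold()                        # pause at -1, 0 or +1 degrees
        frames.append(camera.trigger_exposure(EXPOSURE_S))
        mirror.resume()
    return frames
```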
In addition, it should be understood that although fig. 1-4B illustrate examples in which the projected structured light is projected along the z-direction and varies in the vertical direction (y-direction), in other embodiments, the projected structured light may be projected along the z-direction and varies in the horizontal direction (x-direction), or there may be angular variations in both the x-and y-directions.
The structured light projection device of the present invention may be used for depth data measurement. FIG. 5 is a schematic diagram showing the composition of a depth data measuring head according to one embodiment of the present invention. For simplicity, one example of the composition of the image sensor (520) is shown in greater detail.
As shown in fig. 5, the binocular principle-based depth data measuring head 500 includes a projection device 510 and first and second image sensors 520 and 530 having a predetermined relative positional relationship. The projection device 510 may be a structured light projection device as previously described in connection with fig. 1 and 3.
Although not shown in the figure for convenience of illustration, the measuring head 500 may further include a housing enclosing the above devices, and the connection structure 540 shown in fig. 5 can be regarded as a mechanism that fixes those devices and connects them to the housing. In some embodiments, the connection structure 540 may be a circuit board carrying the control circuitry. It should be understood that in other implementations the devices 510-530 may be connected to the housing in other ways and perform the corresponding data transmission and command reception.
Here, the projection device 510 projects structured light, for example the same pattern diffracted by the DOE, onto the shooting area; because of the steering mechanism, that same pattern can be projected at different angles, so the first image sensor 520 and the second image sensor 530, which have a predetermined relative positional relationship, capture a set of image-frame pairs of the shooting area containing different patterns. This set of image-frame pairs is then used for a single depth data calculation for the shooting area.
Specifically, the first image sensor 520 and the second image sensor 530 may be arranged on either side of the structured light projection device 510. The two sensors have a predetermined relative spatial position and image the measured space at least twice during the motion of the reflection device, so as to obtain at least two sets of images with different texture distributions, which are used to obtain the single-measurement depth data of the object to be detected.
For example, the projection device 510, driven by its driving device, may project the textured beam into the measured space at a continuously changing projection angle, forming different textures on the object to be detected. The image sensors can image several times during the steering, for example once each when the mirror reaches -1°, 0° and 1°, yielding a set of image-frame pairs containing three pairs (6 frames). These 6 frames are used jointly in one depth data calculation for the shooting area, that is, one frame of the depth image is computed from them.
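One simple way to fuse the three per-pair depth maps into a single measurement is validity-masked averaging, sketched below. The patent only states that the depth data determined from the image sets are fused; the averaging rule and the function name are assumptions for illustration.

```python
import numpy as np

def fuse_depth(depth_maps, min_valid=1):
    """Fuse the depth maps computed from one set of image-frame pairs
    (e.g. the three pairs taken at -1, 0 and +1 degrees) into one frame."""
    stack = np.stack(depth_maps)            # shape: (n_pairs, H, W)
    valid = stack > 0                       # zero marks pixels with no match
    count = valid.sum(axis=0)
    total = np.where(valid, stack, 0.0).sum(axis=0)
    fused = np.zeros_like(total)
    ok = count >= min_valid
    fused[ok] = total[ok] / count[ok]       # average over valid observations
    return fused
```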
In some embodiments, the light source module in the projection device 510 can be kept on continuously during operation, and the image sensors can image multiple times at specific angles, or at arbitrary angles (or at arbitrary angles within a predetermined motion range), of the steering device's rotation.
For example, in a scenario using a controllable mechanical mirror, imaging can be triggered according to the mirror's rotation angle, for example once each when it moves to -1°, 0° and 1°. Exposure can also be triggered according to the measured rotation angle.
In some embodiments, the rotation angle and the exposure of the image sensors need not be synchronized. For example, when the mechanical rotating mirror is set to rotate within ±1°, the image sensors can capture the required set of images at arbitrary times and arbitrary intervals; as long as the capture intervals do not coincide exactly with the rotation frequency of the mirror, different images will be captured. In other words, there is no need to know which pattern is being projected, since in a binocular scheme the images within the same set are compared against each other rather than against a reference pattern.
In other embodiments, the light source module in the projection device 510 may be synchronized with the exposure of the image sensors. In this case, the measuring head 500 may further include a controller for turning the light source module on simultaneously with the exposure of the first and second image sensors.
In addition, the controller may further be configured to keep the driving device stationary during the exposure of the first and second image sensors. Compared with imaging while the driving device is moving, this yields a clearer projection pattern, e.g. sharper discrete spots.
In some embodiments of the utility model, the first and second image sensors may be conventional image sensors. In other embodiments, however, the first and second image sensors each comprise at least two sub-image sensors sharing at least part of the optical path, each sub-image sensor performing one of the at least two imaging passes.
Unlike a configuration in which the first image sensor 520 and the second image sensor 530 would each include only one photosensitive unit and would therefore each have to image three times to acquire a set of three image-frame pairs (6 frames), in the present invention the first and second image sensors each include at least two sub-image sensors sharing at least part of an optical path, and the at least two sub-image sensors respectively image the different patterns of structured light projected in succession by the projection apparatus.
Fig. 5 shows an example in which the first and second image sensors each include two sub-image sensors (photosensitive units). As shown, the first image sensor 520 includes the sub-image sensors 523 and 524, and the second image sensor 530 includes the sub-image sensors 533 and 534. Here, the sub-image sensors 523 and 524 share the optical path up to the beam-splitting surface of the beam splitting device 522 and are equidistant from the beam-splitting area. Similarly, the sub-image sensors 533 and 534 share the optical path up to the beam-splitting surface of the beam splitting device 532 and are equidistant from the beam-splitting area. In other words, the present invention introduces multiple mutually coaxial binocular groups. The sub-image sensors 523 and 533 can be regarded as a first group of image sensors (a first binocular group) for imaging the structured light at one projection angle, and the sub-image sensors 524 and 534 can be regarded as a second group (a second binocular group) for imaging the structured light at another projection angle. In other words, the sub-image sensors 524 and 534, which are coaxial with 523 and 533 respectively, can be regarded as standing in for them (i.e. having equivalent optical paths) and performing the imaging of the next pattern of structured light in their place. Thus two adjacent frames can be captured at an interval much smaller than the frame interval of any single image sensor.
To this end, the measuring head 500 may further comprise a synchronization device. While the projection device projects at least two different patterns of structured light at a first interval smaller than the frame imaging interval of the sub-image sensors, the synchronization device causes the at least two sub-image sensors included in each of the first and second image sensors 520 and 530 to image those patterns in turn, simultaneously in both sensors, at the same first interval. Each sub-image sensor still performs its own next frame imaging at a second interval not smaller than its frame imaging interval (i.e. at its own frame rate), and these imaging operations are synchronized with the projection by the synchronization device.
Fig. 6 shows a comparative timing diagram of coaxial two-group imaging versus single-group imaging. For convenience of description, assume that the frame rate of each photosensitive unit (sub-image sensor) is 100 frames/s, so its frame interval is 10 ms, and that the exposure time required by each photosensitive unit is 1 ms.
If the first and second image sensors 520 and 530 were conventional image sensors each containing only a single photosensitive unit, then computing depth data from three patterns would require three exposures, at 0 ms, 10 ms and 20 ms, as shown in the lower part of fig. 6. Synthesizing each depth data image would therefore require the subject to remain still for 21 ms (making it harder to capture moving subjects), and the frame rate would drop from 100 frames/s to 33.3 frames/s.
In contrast, if the first and second image sensors 520 and 530 are image sensors of the present invention that each contain two photosensitive units (i.e. sensor 520 contains the sub-image sensors 523 and 524, and sensor 530 contains 533 and 534), then, as shown in the upper part of fig. 6, depth data can be computed from three patterns as follows. The first group of photosensitive units images pattern 1 (e.g. the pattern at the first projection angle) at 0 ms, and the second group images pattern 2 (the pattern at the second projection angle) at 1 ms; after the 10 ms frame interval, the first group images pattern 3 (the pattern at the third projection angle) at 10 ms, completing the three images required for one depth data image. At 11 ms, the second group can already start the next round by imaging pattern 1; at 20 ms the first group images pattern 2; and at 21 ms the second group images pattern 3. In this way, exposures by different groups of photosensitive units only need to be separated by the exposure time (e.g. 1 ms), while re-imaging by the same group still follows the frame interval corresponding to its frame rate (e.g. 10 ms). By introducing two coaxial binocular groups, synthesizing each depth data image only requires the subject to remain still for 11 ms (making it easier to capture moving subjects), while the frame rate stays close to 66.6 frames/s.
Although an example with two groups of coaxial photosensitive units has been described in connection with figs. 5 and 6, in other embodiments the first and second image sensors may each include more photosensitive units. Fig. 7 shows the timing for coaxial three-group binocular imaging, in which the first and second image sensors each contain three coaxial photosensitive units (sub-image sensors). As shown in fig. 7, the first group images pattern 1 at 0 ms, the second group images pattern 2 at 1 ms, and the third group images pattern 3 at 2 ms; the next round of three exposures starts at 10 ms, the round after that at 20 ms, and so on. By introducing three coaxial binocular groups, the three image pairs (6 frames) needed to synthesize one depth data image are obtained in only 3 ms, i.e. the subject only needs to remain still for 3 ms, which greatly improves the capture of moving subjects, while the frame rate stays close to 100 frames/s (in this example, 1003 ms, i.e. 1.003 s, is needed to capture 100 frames).
Thus it should be appreciated that adding only one extra coaxial binocular (or monocular) group can roughly double the frame rate of depth data synthesized from multiple frames and shorten the imaging time of each frame. In theory, as many coaxial binocular groups can be arranged as there are patterns projected by the projection device, so that the capture time of each depth frame no longer depends on the sensor frame interval but only on a multiple of the exposure time (provided the frame interval is greater than the exposure time multiplied by the number of coaxial groups). For example, when a depth frame is synthesized from four patterns, using two coaxial binocular groups as in fig. 5 raises the imaging time for the four frames slightly to 12 ms and the frame rate falls to nearly 50 frames/s, whereas using four coaxial binocular groups keeps the imaging time for the four frames at only 4 ms and the frame rate close to 100 frames/s. However, introducing too many coaxial groups makes the image sensor harder to construct, so cost, feasibility and imaging speed must be traded off.
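The interleaved schedule can be reproduced with a few lines of arithmetic. The sketch below, under the text's assumptions (10 ms frame interval per sub-sensor, 1 ms exposure), prints the capture span of one depth frame for one, two and three coaxial groups; the scheduling rule itself is an illustration, not the patent's controller.

```python
def schedule(n_patterns, n_groups, frame_interval=10.0, exposure=1.0, rounds=3):
    """Exposure start times when patterns are assigned to coaxial groups in
    round-robin order and each group must respect its own frame interval."""
    last_start = [-frame_interval] * n_groups   # allow the first start at t=0
    t, plan = 0.0, []
    for _ in range(rounds):
        starts = []
        for p in range(n_patterns):
            g = p % n_groups
            t = max(t, last_start[g] + frame_interval)
            starts.append(t)
            last_start[g] = t
            t += exposure
        plan.append((starts, starts[-1] + exposure - starts[0]))
    return plan

for groups in (1, 2, 3):
    span = schedule(3, groups)[0][1]
    print(f"{groups} coaxial group(s): {span:.0f} ms per depth frame")
# -> 21 ms, 11 ms and 3 ms, matching the timings discussed above
```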
In addition, it should be understood that figs. 6 and 7 show, to illustrate the performance of coaxial imaging, an example in which the second sub-image sensor 524 is exposed for 1 ms immediately after the first sub-image sensor 523 finishes its exposure. In practice, the imaging interval between the first and second sub-image sensors must also take the driving device into account. If imaging is performed directly during driving and no exact projection angle is required, the second sub-image sensor can be exposed immediately after the first one, as shown in figs. 6 and 7. If, however, an exact projection angle is required, or the driving device must remain stationary during exposure, it is necessary to wait, after the first sub-image sensor completes its imaging, for the driving device to move to the appropriate position (and/or reach the appropriate motion state, e.g. completely stationary) before exposing the second sub-image sensor. Since rotating mirrors and galvanometers move quickly, this wait is relatively short, for example a few tens of microseconds.
In order to arrange different photosensitive units coaxially within the same image sensor, the optical path needs to be designed accordingly. The example of fig. 5 shows a coaxial arrangement based on beam splitting. Taking the first image sensor 520 as an example, it may include: a lens unit 521 for receiving the incident returning structured light; a beam splitting device 522 for splitting the incident returning structured light into at least a first beam and a second beam; a first sub-image sensor 523 for imaging the first beam; and a second sub-image sensor 524 for imaging the second beam of returning structured light, which corresponds to a different pattern.
In one embodiment, the beam splitting device 522 is an optical prism, such as a square prism or a triangular prism. The portion of the incident light reflected at the beam-splitting surface reaches the second sub-image sensor 524, while the transmitted portion travels straight on to the first sub-image sensor 523.
As shown, a beam splitting device 522 in the form of a prism splits the incident light into two beams whose propagation directions are perpendicular to each other. Accordingly, the first sub-image sensor 523 and the second sub-image sensor 524 are also arranged perpendicular to each other, so that each receives its incident beam perpendicularly.
In order to eliminate parallax and achieve pixel-level alignment, the components of the incident light need to travel equal optical path lengths. For this, when a square prism is used as the beam splitting device 522, the first and second sub-image sensors 523 and 524 can be placed at equal distances from the beam-splitting area of the beam splitting device 522. When a triangular prism is used as the beam splitting device 522, the distances between the two photosensitive units and the beam-splitting area can be adjusted flexibly according to the ratio of the refractive indices of air and the prism material.
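The adjustment mentioned above follows from a standard paraxial-optics relation, sketched here as an assumption rather than a statement from the patent: for image-plane positioning, a stretch of glass of geometric length t and refractive index n behaves like an air path of reduced length t/n, so the two sensor arms should satisfy

```latex
% Equal effective (reduced) path condition for the two sub-image sensors;
% d denotes the air path and t the glass path along each arm (assumed notation).
d_{1,\mathrm{air}} + \frac{t_{1,\mathrm{glass}}}{n}
  \;=\;
d_{2,\mathrm{air}} + \frac{t_{2,\mathrm{glass}}}{n}
```

With a cube (square) beam splitter the glass lengths along the two arms are equal, which is consistent with placing the two sensors at equal distances; with a triangular prism the glass lengths generally differ, which is why the air distances are traded off against the refractive-index ratio.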
Making the incident light share most of the optical path and travel equal optical path lengths can, in principle, achieve pixel-level alignment between the first sub-image sensor 523 and the second sub-image sensor 524. In the actual manufacture of the image sensor, however, the two sub-image sensors may not be mounted in the ideal perpendicular and equidistant positions, which causes a deviation between their images. In that case the manufactured image sensor can be corrected in software, for example by introducing a calibration target and registering the images of both the first sub-image sensor 523 and the second sub-image sensor 524 to it, thereby achieving true pixel-level correction. In other words, the pixel-level alignment between the first and second sub-image sensors may be exact, or may be off by a few pixels and brought into alignment by calibration.
As shown, the image sensor 520 of the present invention may be implemented as a separate module. To this end, the image sensor 520 may further include a housing for fixing relative positions of the lens unit, the beam splitting device, and the two photosensitive units. Preferably, the housing may form a seal in combination with the lens unit 521 to avoid contamination of the contained device from the external environment. In other embodiments, the image sensor 520 of the present invention may be part of a larger module (e.g., a depth data measurement head) and the fixation between the elements is achieved by the housing of the larger module.
Preferably, the image sensor 520 may further include cables connected to the first and second sub-image sensors 523 and 524, respectively. The housing has an opening for cable access. In one embodiment, the cable may be a flexible cable, such as an FPC (flexible circuit board) line.
In one embodiment, the light beams may also pass through an optical filter before reaching the first sub-image sensor 523 and the second sub-image sensor 524, to further filter out the influence of light of other wavelengths. In one embodiment, the projection device projects infrared laser light, so the optical filter arranged in the image sensor is a corresponding infrared-pass unit that transmits infrared light of a specific wavelength range, for example infrared light with a wavelength of 780-. In other embodiments, the projection device may also project visible light, such as red or blue laser light, e.g. 635 nm red light or 450 nm blue light. Although ambient light may also contain red or blue components, the exposure time is short and the laser is instantaneously intense, so imaging with a high signal-to-noise ratio is still possible with the help of the corresponding red-pass or blue-pass filter.
Preferably, when the beam splitting device is a square prism, one side of the optical filter can be in direct physical contact with the square prism and the other side in physical contact with the photosensitive unit, with the photosensitive unit and the square prism clamped in the housing, thereby keeping the relative positions of these devices unchanged.
In some embodiments, especially when the first and second sub-image sensors are infrared sensors receiving the projected infrared pattern, an additional visible-light photosensitive unit (not shown in the figures) can be arranged in the image sensor to capture image information of the measured object, so that the images captured by the image sensor contain both image information and depth information of the object. The visible-light photosensitive unit may be a grayscale sensor or a color sensor: a grayscale sensor captures only brightness information, while a color sensor captures the color of the measured object and may consist of three primary-color sensing units, where the primaries may be red, green and blue (RGB) or cyan, magenta and yellow (CMY).
It should be understood that although the structure of the first image sensor 520 has been described in detail with reference to fig. 5, the second image sensor 530 may have the same structure. In addition, although 523 and 533 may be regarded as the first binocular group and 524 and 534 as the second, 523 and 534 could equally be taken as the first group and 524 and 533 as the second, as long as imaging is enabled after the corresponding pattern becomes incident.
When optical paths are shared by beam splitting as shown in fig. 5, the light reaching each photosensitive unit is reduced, so the imaging sensitivity or the effective distance range can be maintained by increasing the projection brightness or enlarging the entrance aperture.
Alternatively, optical path sharing can therefore also be realized by optical path conversion. In that case, the first and second image sensors 520 and 530 may each include: a lens unit for receiving the incident returning structured light; an optical path conversion device for directing the incident returning structured light onto at least a first sub-path and a second sub-path; a first sub-image sensor for imaging the returning structured light on the first sub-path; and a second sub-image sensor for imaging, on the second sub-path, the returning structured light corresponding to a different pattern. In one embodiment, the optical path conversion device may be a steerable mirror that reflects the incident light to the photosensitive unit 523 at, for example, 0 ms, to the photosensitive unit 524 at 1 ms, and so on. In other embodiments, the optical path conversion device may perform the conversion based on other mechanical, chemical or electrical principles.
Where the projection device projects the whole pattern at once (rather than scanning it), the image sensors may be rolling-shutter image sensors, but are preferably global-shutter image sensors (i.e. all pixels are exposed at the same time).
As mentioned above, the projection device may include a galvanometer, such as a MEMS galvanometer or a mechanical galvanometer, oscillating back and forth at a predetermined frequency, for scanning the projected structured light over the capture area at a predetermined frequency and range of motion. Because a galvanometer can reach extremely high oscillation frequencies, e.g. 2k times per second, a MEMS galvanometer cannot be synchronized directly by command (its latency is not reliable). Therefore, in scenarios where synchronization is required (for example, where the rotation angle must be known), the synchronization device may include, given the phase-oscillation characteristics of the micromirror device, a measuring device that measures the oscillation phase of the galvanometer in real time.
In one embodiment, this measurement can be based on the emitted light itself. The measuring device may then be one or more photosensors (e.g. two photodiodes, PDs), arranged in any of the following ways: on different exit paths of the projection device; on different reflected paths inside the projection device; or distributed over an exit path and a reflected path, one inside and one outside the projection device. The arrangement of the photosensors can be chosen so that the phase is measured accurately without disturbing the normal projection of the structured light. A PD may be mounted inside the projection device and the instantaneous oscillation phase determined by measuring the angle at which the laser exits the optical window. Since the oscillation phase of the MEMS galvanometer follows a sinusoidal distribution, one PD is enough to determine that distribution, while more PDs help measure the phase more accurately. In other embodiments, the PD may also be mounted outside the projection device, e.g. on the optical window, for example near its edge so as not to affect the projection into the capture area. In still other embodiments, the phase may be measured in other ways, for example by a capacitance measurement.
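As an illustration of how a single PD can pin down a sinusoidal phase, the sketch below assumes the galvanometer angle follows θ(t) = A·sin(2πft + φ) and that the PD fires at two successive times bracketing the same peak; by symmetry their midpoint is the peak time, which fixes φ. The model and the function names are assumptions, not the patent's method.

```python
import math

def estimate_phase(t1, t2, freq_hz):
    """Phase of theta(t) = A*sin(2*pi*f*t + phi) from two PD trigger times
    t1, t2 (seconds) that straddle the same peak of the oscillation."""
    t_peak = 0.5 * (t1 + t2)                 # symmetry: the peak lies midway
    # at the peak: 2*pi*f*t_peak + phi = pi/2 (mod 2*pi)
    return (math.pi / 2.0 - 2.0 * math.pi * freq_hz * t_peak) % (2.0 * math.pi)

def mirror_angle(t, amplitude_deg, freq_hz, phi):
    """Instantaneous mirror angle under the assumed sinusoidal model."""
    return amplitude_deg * math.sin(2.0 * math.pi * freq_hz * t + phi)
```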
In other embodiments, the projection device may instead include a mechanical rotating mirror that rotates in a single direction; in that case, when synchronization is required, the measuring device included in the synchronization device may be an angle measurer that measures the rotation angle of the reflection device's motor in real time.
In the above embodiments, synchronization between projection and exposure is achieved by controlling the exposure of the image sensors. This can be used when the projection angle of the light source is controllable (e.g. the rotation angle and speed of a mechanical galvanometer can be controlled via voltage and current), and it is particularly suitable when the phase and speed of the light-source scan are not controllable (e.g. for a MEMS galvanometer or a mechanical rotating mirror). Accordingly, a MEMS galvanometer can detect its angle with a PD or by capacitance, and a mechanical rotating mirror can detect its position by voltage detection or photoelectric encoding.
According to another embodiment of the present invention, a depth data computing apparatus is also realized, comprising: a depth data measuring head as described above; and a processor connected to the depth data measuring head for determining, under a binocular scheme, the depth data of the object in the capture area from the predetermined relative positions of the first and second image sensors and the set of image-frame pairs obtained by imaging the structured light.
The structured light projection device, measuring head and computing apparatus described above can be used to measure and compute depth data as follows.
First, the light beam with the speckle pattern emitted by the light source module is rotationally reflected.
Subsequently, the measured space is imaged at least twice by the first and second image sensors, whose relative positions are fixed, to acquire at least two sets of images in which the measured space exhibits different speckle patterns because of the rotational reflection. Depth data can then be extracted from the at least two sets of images and fused.
When the coaxial binocular structure is used, imaging the measured space at least twice with the first and second infrared image sensors of fixed relative position comprises: performing a first imaging with a first pair of sub-image sensors having a predetermined relative positional relationship to acquire a first pair of image frames; and performing a second imaging with a second pair of sub-image sensors to acquire a second pair of image frames, wherein one sub-image sensor from each of the first and second pairs shares at least part of an optical path and together they constitute the first image sensor, while the other sub-image sensor from each pair shares at least part of an optical path and together they constitute the second image sensor; the first and second pairs of image frames are used for a single depth data calculation for the capture area.
The structured light projection device, depth data measuring head, computing apparatus and measuring method according to the present invention have been described in detail above with reference to the accompanying drawings. The depth measurement scheme of the utility model uses an improved structured light projection device that reflects the structured light generated by the light source module at different angles, thereby achieving multi-pattern projection that is faster, more economical and less failure-prone. Furthermore, the structured light projection device can be combined with multiple pairs of binocular sensors sharing a common optical path, further shortening the frame interval and improving the quality of the fused depth data.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (20)

1. A structured light projection device, comprising:
the light source module is used for generating and emitting a light beam with textures;
a steering projection module comprising:
reflection means arranged on an exit path of the light beam for reflecting the light beam incident thereto to exit the light beam;
and the driving device is connected to the reflecting device and is used for changing the angle of the reflecting device relative to the incident light beam so as to change the emergent direction of the light beam.
2. The structured light projection device of claim 1, wherein the light source module comprises:
a laser generator for emitting a laser beam;
a diffractive optical element arranged on an exit optical path of the laser beam, for diffracting and modulating incident laser light into discrete spots having a specific projection rule; or
A flood light source for generating a flood light; and
a mask arranged over the flood light source for converting the flood light into discrete spots having a particular projection rule.
3. The structured light projection device according to claim 1, wherein the driving means controls the reflecting means to move in an axial direction, wherein the light beam emitted from the light source module is incident on the reflecting means in a direction perpendicular to the axial direction, and the emitting direction is changed based on the axial movement of the reflecting means.
4. The structured light projection device of claim 1, wherein the steered projection module is a mechanical galvanometer and the reflecting device reciprocates axially; or
The steering projection module is a mechanical steering mirror, and the reflecting device moves in a single direction along the axial direction.
5. The structured light projection device of claim 1, wherein the steered projection module remains stationary for a predetermined window period during continuous motion.
6. A depth data measurement head, comprising:
the structured light projection device according to any one of claims 1 to 5, for projecting the light beam with the texture to the space under test under different projection angles driven by the driving device to form different textures on the object to be detected in the space under test; and
the first image sensor and the second image sensor are respectively arranged on two sides of the structured light projection device, a preset relative spatial position relation exists between the first image sensor and the second image sensor, and the measured space is imaged at least twice during the movement of the reflection device so as to obtain at least two groups of images with different texture distributions for generating single-time measurement depth data.
7. The depth data measurement head of claim 6, further comprising:
and the controller is used for controlling the light source module to be synchronously lightened when the first image sensor and the second image sensor are exposed.
8. The depth data measurement head of claim 6, further comprising:
a controller for controlling the driving means to remain stationary during exposure of the first and second image sensors.
9. The depth data measurement head of claim 6, wherein the first and second image sensors each comprise at least two sub-image sensors sharing at least part of the optical path, each for performing one of the at least two imaging operations.
10. The depth data measurement head of claim 9, comprising:
a controller for causing the first and second image sensors to each include at least two sub-image sensors to sequentially image at a first interval in synchronization, the first interval being less than a frame imaging interval of the sub-image sensors.
11. The depth data measurement head of claim 10, comprising:
a controller for causing each sub-image sensor to perform its next frame imaging at a second interval not less than a frame imaging interval of the sub-image sensor.
12. The depth data measurement head of claim 9, wherein the first and second image sensors each comprise:
a lens unit for receiving incident return structured light;
beam splitting means for splitting incident return structured light into at least a first beam and a second beam;
a first sub-image sensor for imaging the first light beam;
a second sub-image sensor for imaging a second beam of return structured light corresponding to a different pattern.
13. The depth data measuring head of claim 12, wherein the first sub-image sensor and the second sub-image sensor are equidistant from a beam splitting area of the beam splitting device.
14. The depth data measurement head of claim 12, wherein the first sub-image sensor and the second sub-image sensor are pixel-level aligned.
15. The depth data measuring head of claim 12, wherein the at least two sub-image sensors that each include at least a portion of the optical path shared by the first and second image sensors are infrared light sensors; and/or
The first and second image sensors each include:
a visible light image sensor for imaging incident structured light, wherein the visible light image sensor shares at least part of the light path with the first and/or second image sub-sensor.
16. The depth data measurement head of claim 9, wherein the first and second image sensors each comprise:
a lens unit for receiving incident return structured light;
optical path conversion means for conveying incident return structured light to at least a first sub-path and a second sub-path;
a first sub-image sensor for imaging the returning structured light on a first sub-path;
a second sub-image sensor for imaging the returned structured light corresponding to the different pattern on a second sub-path.
17. The depth data measuring head of claim 16, wherein the first sub-image sensor and the second sub-image sensor are equidistant from a light path conversion area of the light path conversion device.
18. The depth data measurement head of claim 16, wherein the first sub-image sensor and the second sub-image sensor are pixel-level aligned.
19. The depth data measuring head of claim 16, wherein the at least two sub-image sensors that each include at least a portion of the optical path shared by the first and second image sensors are infrared light sensors; and/or
The first and second image sensors each include:
a visible light image sensor for imaging incident structured light, wherein the visible light image sensor shares at least part of the light path with the first and/or second image sub-sensor.
20. A depth data computing device, comprising:
the depth data measurement head of any one of claims 6 to 19, and
and the processor is used for acquiring the at least two groups of images and calculating single-measurement depth data of the object to be detected according to the preset relative spatial position relation between the first image sensor and the second image sensor and the at least two groups of images.
CN202120814643.5U 2021-04-20 2021-04-20 Structured light projection device, depth data measuring head and computing equipment Active CN216246133U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202120814643.5U CN216246133U (en) 2021-04-20 2021-04-20 Structured light projection device, depth data measuring head and computing equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202120814643.5U CN216246133U (en) 2021-04-20 2021-04-20 Structured light projection device, depth data measuring head and computing equipment

Publications (1)

Publication Number Publication Date
CN216246133U true CN216246133U (en) 2022-04-08

Family

ID=80939029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202120814643.5U Active CN216246133U (en) 2021-04-20 2021-04-20 Structured light projection device, depth data measuring head and computing equipment

Country Status (1)

Country Link
CN (1) CN216246133U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114995028A (en) * 2022-05-09 2022-09-02 中国科学院半导体研究所 Speckle projection device and projection imaging method for enlarging field of view


Legal Events

Date Code Title Description
GR01 Patent grant
GR01 Patent grant