WO2022222496A1 - Depth data measurement head, computing device and measurement method - Google Patents
Depth data measurement head, computing device and measurement method
- Publication number: WO2022222496A1 (PCT/CN2021/137790)
- Authority: WIPO (PCT)
- Prior art keywords: sub, light, depth data, image sensor, image
Classifications
- G01B11/2513—Measuring contours or curvatures by projecting a pattern, e.g. moiré fringes, with several lines being projected in more than one direction, e.g. grids, patterns
- G01B11/254—Projection of a pattern, viewing through a pattern, e.g. moiré
- G01B11/22—Measuring arrangements characterised by the use of optical techniques for measuring depth
- G01B11/2527—Projection by scanning of the object with phase change by in-plane movement of the pattern
- G01B11/2545—Pattern projection with one projection direction and several detection directions, e.g. stereo
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
- G06T7/529—Depth or shape recovery from texture
- G06T7/55—Depth or shape recovery from multiple images
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/20221—Image fusion; Image merging
Definitions
- the invention relates to the technical field of three-dimensional detection, in particular to a depth data measuring head, a computing device and a measuring method.
- a binocular detection scheme based on structured light can measure the surface of an object in three dimensions in real time.
- the scheme first projects a two-dimensional laser texture pattern with encoded information on the surface of the target object, such as a discretized speckle pattern.
- a matching unit uses a sampling window to sample the two images simultaneously collected by the two image acquisition devices, determines the matching laser texture patterns within the window, and calculates the depth of each laser texture segment projected on the object surface from the difference (disparity) between the matched patterns, thereby obtaining three-dimensional data of the surface of the object to be measured.
- the smaller the sampling window, the finer the matching granularity, but the higher the false-match rate. While the effective window can be reduced by taking multiple sets of different images consecutively, this introduces additional system complexity and reduces the frame rate.
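The window-based matching and triangulation described above can be sketched in code. This is an illustrative example only, not the patented implementation: the function names, window size, and camera parameters are hypothetical, and a rectified binocular pair is assumed.

```python
import numpy as np

def match_disparity(left, right, row, col, win=5, max_disp=32):
    """Slide a sampling window from the left image along the same row of
    the right image and return the disparity with the lowest SAD cost."""
    h = win // 2
    ref = left[row - h:row + h + 1, col - h:col + h + 1]
    best_d, best_cost = 0, np.inf
    for d in range(max_disp):
        c = col - d                      # the match shifts left in the right image
        if c - h < 0:
            break
        cand = right[row - h:row + h + 1, c - h:c + h + 1]
        cost = np.abs(ref - cand).sum()  # sum of absolute differences
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

def depth_from_disparity(disp, focal_px, baseline_m):
    """Triangulate: depth = focal length * baseline / disparity."""
    return focal_px * baseline_m / disp

# Toy data: the "right" view is the "left" speckle image shifted 4 px.
rng = np.random.default_rng(0)
left = rng.random((32, 64))
right = np.roll(left, -4, axis=1)
d = match_disparity(left, right, row=16, col=40)   # finds d == 4
depth_m = depth_from_disparity(d, focal_px=600.0, baseline_m=0.05)
```

A smaller `win` gives finer granularity but, on real noisy images, a higher false-match rate, which is exactly the trade-off the passage describes.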
- a technical problem to be solved by the present disclosure is to provide a depth data measurement solution, which uses multi-angle pattern projection, in particular an improved structured light projection device capable of reflecting the structured light generated by the light source module at different angles, so as to enable faster, more economical projection with a lower failure rate. Further, the structured light projection device can cooperate with multiple pairs of binocular sensors sharing an optical path, thereby further shortening the frame interval and improving the quality of the fused depth data.
- a depth data measurement head comprising: a structured light projection device for projecting a textured light beam into the measured space at different projection angles under the drive of a driving device, so that different textures are formed on the object to be detected in the measured space; and first and second image sensors respectively arranged on both sides of the structured light projection device, with a predetermined relative spatial positional relationship between them, which image the measured space at least twice during the movement of the reflection device to obtain at least two sets of images with different texture distributions, wherein the at least two sets of images are used to obtain single-measurement depth data of the object to be detected.
- a depth data computing device comprising: the depth data measurement head according to the first aspect of the present invention, and a processor for acquiring the at least two sets of images, determining the depth data of the textures in each set of images according to the predetermined relative spatial positional relationship between the first and second image sensors, and fusing the depth data determined from the at least two sets of images to obtain new depth data as the single-measurement depth data of the object to be detected.
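A minimal sketch of the fusion step described above: per-group depth maps are stacked and invalid pixels (encoded here as 0) are ignored, so each projection angle can contribute measurements the others missed. Averaging the valid values is only one possible fusion rule, chosen here for illustration.

```python
import numpy as np

def fuse_depth_maps(depth_maps):
    """Fuse single-group depth maps into one depth frame: treat 0 as
    'no measurement' and average whatever valid values remain per pixel."""
    stack = np.stack(depth_maps).astype(float)
    stack[stack == 0] = np.nan          # mark missing measurements
    fused = np.nanmean(stack, axis=0)   # average over the groups
    return np.nan_to_num(fused, nan=0.0)

# Two depth maps from two projection angles; each has a hole the other fills.
d1 = np.array([[1.0, 0.0],
               [2.0, 2.2]])
d2 = np.array([[1.2, 3.0],
               [0.0, 2.0]])
fused = fuse_depth_maps([d1, d2])       # [[1.1, 3.0], [2.0, 2.1]]
```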
- a method for measuring depth data comprising: projecting light beams with speckle patterns emitted from a light source module at different angles; imaging the measured space at least twice using first and second image sensors with fixed relative positions to obtain at least two sets of images, wherein in the at least two imagings the measured space carries different speckle patterns produced by projection from different angles; and obtaining depth data from the at least two sets of images and performing depth data fusion.
- by rotating the reflecting element to redirect the structured light, the projection flexibility of the structured light projection device is improved.
- the device can be further combined with a mask light source and a coaxial binocular solution to further improve the accuracy and imaging speed of the multi-frame fusion solution.
- FIG. 1 shows a schematic composition diagram of a structured light projection device according to an embodiment of the present invention.
- FIG. 2 shows a schematic diagram of changing the projection direction of the present invention.
- Figure 3 shows a perspective view of a structured light projection device according to one embodiment of the present invention.
- FIGS. 4A-B illustrate examples of structured light projected by the structured light projection device shown in FIG. 3 at different projection angles.
- FIGS. 5A-B illustrate examples of speckle projection using the apparatus shown in FIG. 3 or FIG. 4.
- FIG. 6 shows a block diagram of a depth data measurement head according to an embodiment of the present invention.
- FIG. 7 shows a schematic diagram of the composition of a depth data measuring head according to an embodiment of the present invention.
- FIG. 8 shows a comparison timing diagram of coaxial two-group imaging and single-group imaging.
- FIG. 9 shows the timing diagram of coaxial three-group binocular imaging.
- FIG. 10 shows a schematic flowchart of a method for measuring depth data according to an embodiment of the present invention.
- the sampling window can be reduced by taking multiple sets of different images in succession.
- the pattern from the same light source module can be projected from different angles by using the driving device to drive the light source module to rotate.
- although the patterns projected by the light source module are the same, they still appear as different patterns in the field of view of the imaging device (image sensor) due to the different projection angles.
- the driving function of the driving device is limited.
- multi-frame-based depth fusion reduces the frame rate and degrades the shooting performance for dynamic objects.
- the present invention provides a depth data measurement solution, which preferably uses an improved structured light projection device capable of reflecting the structured light generated by the light source module at different angles, so as to achieve faster, more economical multi-pattern projection with a lower failure rate.
- the structured light projection device can cooperate with multiple pairs of binocular sensors sharing an optical path, thereby further shortening the frame interval and improving the quality of the fused depth data.
- FIG. 1 shows a schematic composition diagram of a structured light projection device according to an embodiment of the present invention.
- the structured light projection device can be used for structured light projection in the depth data measuring head of the present invention.
- the structured light projection device 110 may include a light source module I located at the upper part of the dashed line in the figure and a turning projection module II located at the lower part of the dashed line in the figure.
- the light source module is used for generating the light beam to be projected
- the steering projection module is used for redirecting the light beam before it exits.
- the light source module is used to generate and emit a textured light beam.
- the light source module does not directly project the light beam emitted by the light emitting device, but performs certain optical processing on the light beam projected by the light emitting device to make it present a desired distribution, brightness or pattern. For this reason, in different implementations, different speckle generation schemes can be used to achieve different projection effects.
- the light source module 1 may include a laser generator 111 and a diffractive optical element (DOE) 112 .
- the laser generator 111 is used for emitting a laser beam (as shown by the single arrow leaving 111 in the figure).
- the DOE 112 arranged on the outgoing optical path of the laser beam can modulate the incident laser light, for example, diffracting the incident laser light and modulating it into discrete light spots with specific projection rules (as indicated by the double arrows leaving 112 in the figure).
- the diffracted beam is a surface beam with a certain width, i.e. it occupies a certain area in the plane of the DOE 112.
- the laser generator 111 may be a laser diode (LD), such as an edge-type laser diode.
- the laser light generated by the LD can be collimated by a collimating lens (not shown in the figure), and then diffracted by the DOE 112.
- the DOE 112 may have a relatively complex structure to diffract a single incident laser spot into a pattern with multiple spots (e.g., two thousand, or even twenty thousand points) to form the projected complex speckle. Because the light is collimated, fixed-point projection can be performed by diffraction, which offers a long working distance and low power consumption.
- the laser generator 111 may also be implemented as a VCSEL (Vertical Cavity Surface Emitting Laser). Since the VCSEL itself includes a plurality of light-emitting particles, one VCSEL chip can emit a certain pattern by itself, for example a pattern formed by 100 light-emitting particles. In this case the DOE 112 only needs to perform simple replication, such as direct copying, interleaved copying, or rotational copying, to realize a speckle pattern of two thousand or even twenty thousand points as mentioned above. Although collimation is generally not possible, VCSEL schemes can be more energy efficient and require less DOE complexity.
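The replication idea can be illustrated numerically: a base dot pattern from the VCSEL array is copied side by side, so the projected point count scales with the number of copies. The grid sizes below are invented for the example, and only the direct-copy mode is shown.

```python
import numpy as np

def replicate_pattern(base, nx, ny):
    """Direct-copy replication: tile the base dot pattern nx times
    horizontally and ny times vertically."""
    return np.tile(base, (ny, nx))

# Hypothetical base pattern: 100 dots, like a 100-emitter VCSEL chip,
# represented as a binary occupancy grid.
rng = np.random.default_rng(1)
base = np.zeros((20, 20), dtype=int)
base.flat[rng.choice(400, size=100, replace=False)] = 1

copied = replicate_pattern(base, nx=20, ny=10)   # 200 copies in total
```

200 copies of a 100-point base pattern gives the twenty-thousand-point speckle regime mentioned in the passage.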
- VCSEL: Vertical Cavity Surface Emitting Laser
- the light source module 1 may also be implemented using other solutions than lasers and diffractive elements.
- the light source module may comprise a flood light source for generating flood light; and a mask arranged on the flood light source for converting the flood light into a light spot with a specific projection encoding.
- the steering projection module is a module for reflecting light beams (eg, speckle patterns) generated by the light source module to exit.
- the steering projection module of the present invention can drive the reflection module to move, so that the direction of the light beam reflected by the reflection module can be changed.
- the steering projection module II may include a reflection device 113 and a driving device 114 .
- the reflection device 113, arranged on the exit path of the light beam, can be used to reflect the incident light beam so that it exits the device.
- the driving device 114, connected to the reflection device, can be used to change the angle of the reflection device relative to the incident beam, thereby changing the exit direction of the beam.
- the direction in which the light exits the measuring head may be designated as the z direction, the horizontal direction of the imaging plane as the x direction, and the vertical direction as the y direction.
- in the following, an example will be described in detail in which the light source module projects the structured light downward (the y direction) and the steering projection module changes the outgoing direction to the z direction (in fact, along the z direction with a slight angular deviation).
- the structured light projection device of the present invention may also be arranged or placed in other directions according to the needs of the actual imaging scene.
- FIG. 2 shows a schematic diagram of changing the projection direction of the present invention.
- the light emitted by the light source is projected down the y-axis to a reflection device, such as a mirror, and reflected by the reflection device to be projected into the measured space, and can form a light spot on the imaging plane perpendicular to the z-axis.
- the reflector can be rotated about the x-axis, for example within the A-B angle range shown in the figure, whereby a light spot moving within the A'-B' range on the imaging plane is obtained accordingly.
- the pattern projected by the light source module I onto the mirror is always the same (because the relative positions of the laser generator 111 and the DOE 112 are fixed and are not themselves driven), but due to the rotation of the mirror the projected pattern acquires an angular offset, so when used in combination with binocular imaging as described below, the patterns captured by the image sensors at different projection angles can be regarded as different patterns.
- FIG. 2 shows a mirror that can be rotated within a large A-B angle range in order to illustrate the principle of the present invention
- in practice, the angular difference between projections of the same pattern can be very small, for example 1°, which ensures that the imaged patterns are not identical while the imaging ranges substantially coincide.
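The relation between mirror rotation and pattern offset follows from plane-mirror geometry: rotating the mirror by an angle deflects the reflected beam by twice that angle, so at working distance D the pattern shifts by roughly D * tan(2θ). A quick sketch (the 1 m working distance is an assumed figure, not from the text):

```python
import math

def pattern_shift(distance_m, mirror_rot_deg):
    """Shift of the projected pattern on a plane `distance_m` away when
    the mirror rotates by `mirror_rot_deg` (the beam deflects by 2x)."""
    return distance_m * math.tan(math.radians(2.0 * mirror_rot_deg))

shift_m = pattern_shift(1.0, 1.0)   # 1 deg mirror rotation at 1 m
```

Even the small 1° difference quoted above therefore moves the speckle field by a few centimetres at typical working distances, enough for the two exposures to sample different surface points while their fields of view still largely overlap.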
- FIG. 3 shows a perspective view of a structured light projection device according to an embodiment of the present invention.
- FIGS. 4A-B illustrate examples of structured light projected by the structured light projection device shown in FIG. 3 at different projection angles.
- the laser generator 311 can be arranged inside the casing (or fixed structure) 315, and the light generated by the laser generator 311 can be diffracted through the DOE 312 arranged on the outgoing optical path to obtain a diffraction pattern with a certain two-dimensional distribution.
- the diffraction pattern propagates up to the mirror 313, which reflects the diffraction pattern so that it exits (approximately) along the z-direction and is projected on a plane perpendicular to the z-direction (as shown in Figs. 4A and 4B, where Fig. 4B can be viewed as a view of the device shown in FIGS. 3 and 4A rotated 90° counterclockwise along the y-axis) to form a speckle pattern.
- the projection planes of FIGS. 4A and 4B can be viewed as three-dimensional representations of the imaging plane shown in FIG. 1 .
- the driving device can drive the reflection device to move about its axis, wherein the light beam emitted from the light source module is incident on the reflection device in a direction perpendicular to that axis (the x direction), and the exit direction is changed by the axial movement of the reflection device.
- the rotating shaft extending from the driving device (e.g., a motor) 314 is fixedly connected to the mirror 313, so that when the motor operates, the shaft drives the mirror 313 to rotate about the axis.
- a projection pattern with a certain angular offset is formed on the projection plane.
- these projection patterns appear as different patterns to the image sensor that captures the projection, so the projection device of the present invention can conveniently realize "different"-pattern projection by rotating the reflected structured light (e.g., a diffraction pattern with a two-dimensional distribution).
- in some embodiments, the steering projection module can be a galvanometer whose motor drives the mirror to reciprocate within a certain range; in other embodiments, it can be a rotating mirror whose motor rotates in only one direction about the axis.
- the steering projection module may be a mechanical galvanometer that vibrates reciprocally at a predetermined frequency, so that structured light is projected onto the measured area at that frequency, presenting on the projection plane a two-dimensional diffraction pattern (discrete spots) that moves up and down along the y direction. Due to the controllability of the mechanical galvanometer, it is also possible to keep the steering projection module stationary for a predetermined window period during its continuous motion.
- a mechanical galvo mirror can have a range of motion of ±1° about the z direction and can have a vibration frequency of up to 2,000 cycles per second.
- the mechanical galvanometer can be made to pause when it moves to -1°, 0°, and 1°, for example remaining stationary for the exposure time required by the photosensitive unit (e.g., 1 ms), thereby keeping the projected pattern unchanged during the exposure of the image sensor and improving imaging accuracy.
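The pause-and-expose behaviour can be written down as a simple schedule: the galvanometer dwells at each preset angle for the sensor exposure time and spends some transit time moving between dwell positions. The 0.5 ms transit figure is invented for the example; only the 1 ms exposure and the -1°/0°/1° stops come from the passage.

```python
def capture_schedule(angles_deg, exposure_ms, transit_ms):
    """Return (start_time_ms, angle) for each exposure, plus the total
    time needed to capture the whole group of image pairs."""
    schedule, t = [], 0.0
    for a in angles_deg:
        schedule.append((t, a))          # sensor exposes while mirror dwells
        t += exposure_ms + transit_ms    # dwell, then move to the next stop
    group_time_ms = t - transit_ms       # no transit after the final dwell
    return schedule, group_time_ms

sched, group_ms = capture_schedule([-1, 0, 1], exposure_ms=1.0, transit_ms=0.5)
# sched == [(0.0, -1), (1.5, 0), (3.0, 1)], group_ms == 4.0
```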
- the rotating mirror can only rotate in one direction, and variable-speed control is difficult.
- in this case, the moving angle of the rotating mirror can be sensed by a device such as a photodiode, and structured light projection and corresponding dynamic imaging can be performed within a suitable angular interval, for example within ±1° about the z direction out of the full 360° rotation.
- FIGS. 5A-B illustrate examples of speckle projection using the apparatus shown in FIG. 3 or FIG. 4.
- 5A and 5B can be regarded as the patterns projected by the mechanical galvanometer when moving to 1° and -1°, respectively, and can also be regarded as the imaging of the image sensor when the structured light projection device cooperates with the image sensor to form a measuring head.
- the projection range of the speckle is shown by a black box in the figure, and it should be understood that in actual use, the black box in the figure will not be projected.
- the speckle occupies less than 30% of the area, so it can be considered that the projection scheme of the laser generator and DOE is adopted.
- the patterns projected in the two figures are actually "the same", but due to the different projection angles, the pattern of FIG. 5B is shifted down some distance along the y direction.
- the speckle pattern falling on the object is actually different (the projected pattern within the dotted box in FIG. 5B is the same as the projected pattern in FIG. 5A), so it can be considered that different patterns are projected onto the same measured object.
- due to the irregularity (or randomness) of the speckle pattern, under different projection angles different positions on the surface of the square object receive speckles, so more surface depth information can be obtained in the subsequent depth data fusion.
- the projected structured light is projected along the z direction and varies in the vertical direction (y direction), but in other implementations it can also be projected along the z direction while varying in the horizontal direction (x direction), or with angular changes in both the x and y directions.
- the dotted frame shown in FIG. 5B can be moved up and down, left and right, or even along an oblique line.
- the number of image groups required for acquiring single-measurement depth data of the object to be detected may be determined according to the specific application scenario. For example, when the number of image groups that can be processed per second is fixed, the imaging frame rate can be increased by reducing the number of image groups required for a single depth measurement, or the imaging accuracy can be improved by increasing that number.
- the structured light projection device of the present invention may have, for example, five adjustable imaging rates.
- a frame rate of, for example, 100 frames per second can be achieved.
- a frame rate of, for example, 50 frames per second can be achieved.
- each depth measurement uses two sets of image frames taken at different times, and the motor projection angle is changed accordingly, for example alternating between 0.5° and 0°, shooting at each of the two angles to obtain two sets of image frames for fusion.
- at an intermediate (mid-range) gear, a frame rate of, for example, 25 frames per second can be achieved.
- a frame rate of, for example, 20 frames/second can be achieved.
- the user can select the corresponding gear as needed, and make the drive mechanism, light source, image sensor and processor cooperate with each other to achieve the required image capture and fusion calculation.
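The gear arithmetic above reduces to one division: with a fixed rate of raw image groups per second, the fused depth frame rate is that rate divided by the number of groups consumed per measurement. A sketch assuming 100 groups per second, consistent with the frame rates quoted:

```python
def fused_frame_rate(groups_per_second, groups_per_measurement):
    """Depth frames per second when each fused frame consumes
    `groups_per_measurement` binocular image groups."""
    return groups_per_second / groups_per_measurement

# Gears matching the text: 1, 2, 4 and 5 groups per fused frame.
rates = {n: fused_frame_rate(100, n) for n in (1, 2, 4, 5)}
# {1: 100.0, 2: 50.0, 4: 25.0, 5: 20.0}
```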
- FIG. 6 shows a block diagram of a depth data measurement head according to an embodiment of the present invention.
- a depth data measurement head 600 includes a structured light projection device 610 and a first image sensor 620 and a second image sensor 630 having a predetermined relative positional relationship.
- the structured light projection device 610 can, driven by the driving device included in it, project textured light beams into the measured space at different projection angles, so as to form different textures on the object to be detected in the measured space (for example, the different textures shown in FIGS. 5A and 5B).
- the first and second image sensors 620 and 630, respectively arranged on both sides of the structured light projection device, image the measured space at least twice during the movement of the reflection device, so as to obtain at least two sets of images with different texture distributions.
- the structured light projection device 610 may include a light source module for generating and emitting a textured light beam, and a driving device for driving the light source module so as to project the textured light beam into the measured space at different projection angles.
- the driving device can directly drive the light source module to perform angle transformation.
- the driving device may be, for example, a voice coil motor on which the light source module is mounted.
- the structured light projection device 610 may be a structured light projection device including a reflection device as described above in conjunction with FIG. 1 , FIG. 3 and FIG. 4 .
- the implementation of the measurement head based on this structured light projection device will be described in detail below with reference to FIG. 7 .
- the example of FIG. 7 further includes a preferred imaging scheme of sharing the optical path.
- FIG. 7 shows a schematic diagram of the composition of a depth data measuring head according to an embodiment of the present invention.
- an example of the composition of one of the image sensors 720 is shown in more detail in the figure.
- the depth data measurement head 700 based on the binocular principle includes a projection device 710 and a first image sensor 720 and a second image sensor 730 having a predetermined relative positional relationship.
- the projection device 710 may be a structured light projection device as previously described in conjunction with FIGS. 1 and 3-4.
- the measuring head 700 may further include a casing for surrounding the above-mentioned device, and the connection structure 740 shown in FIG. 7 can be regarded as a mechanism for fixing the above-mentioned device and connecting to the casing.
- the connection structure 740 may be a circuit board that includes control circuitry thereon. It should be understood that, in other implementations, the above-mentioned devices 710-730 may be connected to the housing in other ways, and perform corresponding data transmission and instruction receiving operations.
- the projection device 710 is used to project structured light to the shooting area, such as the same pattern diffracted by the DOE; due to the steering mechanism, the same pattern can be projected at different angles, so that the first image sensor 720 and the second image sensor 730, which have a predetermined relative positional relationship, photograph the shooting area to obtain a set of image frame pairs with different patterns. This set of image frame pairs can then be used for a single calculation of the depth data of the shooting area.
- the first image sensor 720 and the second image sensor 730 may be arranged on both sides of the structured light projection device 710, respectively, with a predetermined relative spatial positional relationship between them; the measured space is imaged at least twice during the movement of the reflection device to obtain at least two groups of images with different texture distributions, wherein the at least two groups of images are used to obtain single-measurement depth data of the object to be detected.
- the projection device 710 may, under the driving of its driving device, project a light beam with textures to the measured space at constantly changing projection angles, so as to form different textures on the object to be detected in the measured space.
- the image sensors can perform multiple imagings during the turning process, for example once each when the mirror moves to -1°, 0° and 1°, thereby obtaining a set of image frame pairs comprising three pairs (6 frames). These 6 frames are jointly used for a single depth data calculation for the shooting area, i.e., one frame of a depth image can be calculated.
- the light source module in the projection device 710 can be always on during operation, and the image sensors can perform multiple imagings at specific angles, or at any angle (or any angle within a predetermined range of motion) of the steering device's rotation.
- corresponding imaging can be performed according to the rotation angle of the rotating mirror, for example once each when the movement reaches -1°, 0°, and 1°.
- Corresponding exposure can also be performed according to the measured rotation angle.
- the rotation angle and exposure of the image sensor may also not be synchronized.
- the image sensors can capture a required set of images at any time and at any interval; as long as the shooting interval does not exactly coincide with the rotation frequency of the rotating mirror, different images will be captured.
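The angle-triggered variant above ("imaging when moving to -1°, 0° and 1°") can be sketched as follows; the sampling format, target angles, and tolerance are all illustrative assumptions, not an interface defined by the patent:

```python
# Fire one exposure trigger the first time the measured mirror angle reaches
# each target projection angle. All values are hypothetical.

TARGET_ANGLES = [-1.0, 0.0, 1.0]  # degrees; assumed trigger points
TOLERANCE = 0.05                  # degrees; assumed angle-match tolerance

def exposure_triggers(angle_samples):
    """angle_samples: list of (timestamp_ms, angle_deg) from the angle sensor.
    Returns {target_angle: timestamp_ms of first match}."""
    triggers = {}
    for t, angle in angle_samples:
        for target in TARGET_ANGLES:
            if target not in triggers and abs(angle - target) <= TOLERANCE:
                triggers[target] = t
    return triggers

# One simulated sweep of the mirror from -1.2 deg to +1.2 deg:
samples = [(i * 0.1, -1.2 + i * 0.05) for i in range(49)]
print(exposure_triggers(samples))
```

In a real head this loop would run against the synchronization device's live angle measurements rather than a precomputed list.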
- since the differences between images of the same group are compared, there is no need to compare against a reference pattern, and it is not necessary to know exactly which pattern is projected.
- the light source module in the projection device 710 may be synchronized with the exposure of the image sensor.
- the measuring head 700 may also include a controller for controlling the light source module to light up synchronously when the first and second image sensors are exposed.
- the controller may also be used to control the driving device to remain stationary during exposure of the first and second image sensors.
- a clearer projection pattern, such as discrete light spots, can thus be obtained.
- the first and second image sensors may be conventional image sensors.
- the first and second image sensors each include at least two sub-image sensors that share at least part of the optical path, each of the at least two sub-image sensors being used to perform one of the at least two imagings.
- when each of the first image sensor and the second image sensor includes only one photosensitive unit, each photosensitive unit performs imaging three times to obtain a set of three image frame pairs (6 frames).
- the first and second image sensors each include at least two sub-image sensors sharing at least part of the optical path, and the at least two sub-image sensors are respectively used to image the structured light of different patterns successively projected by the projection device.
- FIG. 7 shows an example in which each of the first and second image sensors includes two sub-image sensors (photosensitive units).
- the first image sensor 720 includes sub-image sensors 723 and 724
- the second image sensor 730 includes sub-image sensors 733 and 734 .
- the sub-image sensors 723 and 724 share the optical path up to the beam splitting surface of the beam splitting device 722, and are at equal distances from the beam splitting area.
- similarly, the sub-image sensors 733 and 734 share the optical path up to the beam splitting surface of the beam splitting device 732, and are at equal distances from the beam splitting area.
- the present invention introduces sets of binocular structures that are coaxial with each other.
- the sub-image sensors 723 and 733 located in different image sensor housings can be regarded as the first group of image sensors (the first group of binoculars) for imaging structured light under one projection angle.
- sub-image sensors 724 and 734 which can be regarded as a second group of image sensors (a second group of binoculars)
- the sub-image sensors 724 and 734, which are coaxial with 723 and 733 (i.e., have equivalent optical paths), take over from 723 and 733 to image the next pattern of structured light. Therefore, two adjacent frames can be imaged at a smaller interval, without depending on the frame interval of each individual image sensor.
- the measuring head 700 may further include: a synchronization device for causing the projection device to project structured light of at least two different patterns at a first interval smaller than the frame imaging interval of the sub-image sensors, so that the at least two sub-image sensors included in each of the first and second image sensors 720 and 730 sequentially image the at least two different patterns of structured light synchronously at the first interval. Meanwhile, each sub-image sensor still performs its own next frame imaging at a second interval not less than its frame imaging interval (for example, at its own frame interval), and the synchronization device synchronizes the above imaging operations with the projection of the projection device.
- FIG. 8 shows a comparison timing diagram of coaxial two-group imaging and single-group imaging.
- the frame rate of each photosensitive unit can be set to 100 frames/s, so the frame interval is 10 ms (i.e., the minimum frame interval is 10 ms), and the required exposure time of each photosensitive unit is 1 ms.
- when the first and second image sensors 720 and 730 are conventional image sensors including only a single photosensitive unit, and the three patterns shown in FIG. 1 (corresponding to six acquired images) are to be used for depth data calculation, then as shown in the lower part of FIG. 8, three imagings are required at the 0th, 10th and 20th millisecond. As a result, compositing each depth data image requires the subject to remain motionless for 21 ms (making moving subjects harder to capture), and the frame rate drops from 100 frames/s to 33.3 frames/s.
- in contrast, when the first and second image sensors 720 and 730 of the present invention are each composed of two photosensitive units (e.g., the first and second image sensors 720 and 730 include sub-image sensors 723 and 724, and sub-image sensors 733 and 734, respectively), the second group of photosensitive units images pattern 2 at the 1st millisecond, immediately after the first group; then, after an interval of 10 ms, the first group of photosensitive units images pattern 3 (for example, the pattern at the third projection angle) at the 10th millisecond, thus completing the three imagings required for one depth data image.
- the second group of photosensitive units can start the next round of imaging for pattern 1.
- the first group of photosensitive units then images pattern 2.
- the second group of photosensitive units perform imaging for pattern 3 again.
- the imaging interval between different groups of photosensitive units only needs the time required for one imaging (for example, 1 ms), while the re-imaging interval of the same group still follows the minimum frame interval corresponding to the frame rate (for example, 10 ms).
- thus, synthesizing each depth data image only requires the subject to remain motionless for 11 ms (making it easier to capture moving subjects), and the frame rate can be kept close to 66.6 frames/s.
- each of the first and second image sensors may further include more photosensitive units.
- FIG. 9 shows the timing diagram of coaxial three-group binocular imaging.
- each of the first and second image sensors may include three photosensitive units (sub-image sensors) that are coaxial.
- the first group of photosensitive units performs imaging for pattern 1 at the 0th millisecond, followed by the second group of photosensitive units to perform imaging for pattern 2 at the 1st millisecond, followed by the third group of photosensitive units.
- the group photosensitive unit performs imaging for pattern 3 at the second millisecond.
- the next round of three-group imaging starts at the 10th millisecond
- the next round of three-group imaging starts at the 20th millisecond
- by introducing three sets of coaxial binoculars, it takes only 3 ms to obtain the three groups (6 frames) of images required to synthesize one depth data image; that is, the object only needs to remain motionless for 3 ms, which greatly improves the ability to capture moving objects, and the frame rate can be kept close to 100 frames/s (in this example, shooting 100 frames takes 1003 ms, or 1.003 seconds).
- in this way, the frame rate of depth data based on multi-frame synthesis can be multiplied, and the imaging time of each frame can be shortened.
- as many coaxial binocular structures as there are image groups projected by the projection device can be arranged, so that the framing time of each depth frame depends only on a multiple of the exposure time rather than on the sensor's frame interval (provided the frame interval is greater than the exposure time × the number of coaxial structure groups).
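The timing arithmetic above (21 ms / 33.3 fps for one group, 11 ms / 66.6 fps for two, 3 ms / ≈100 fps for three) can be checked with a small model; the exposure-packing rule below is an assumption consistent with FIGS. 8-9, not code from the patent:

```python
def depth_frame_timing(n_patterns=3, n_groups=1, frame_interval_ms=10, exposure_ms=1):
    """Return (ms the subject must stay still, resulting depth frame rate)."""
    # Start time of exposure k: group (k % n_groups) fires back-to-back within
    # a round; the same group must wait one frame interval before refiring.
    starts = [(k // n_groups) * frame_interval_ms + (k % n_groups) * exposure_ms
              for k in range(n_patterns)]
    still_ms = starts[-1] + exposure_ms - starts[0]
    # n_groups exposures occur per frame interval; each depth frame uses n_patterns.
    depth_fps = n_groups * (1000 / frame_interval_ms) / n_patterns
    return still_ms, round(depth_fps, 1)

for groups in (1, 2, 3):
    print(groups, depth_frame_timing(n_groups=groups))
```

With the document's numbers this reproduces (21, 33.3), (11, 66.7) and (3, 100.0) for one, two and three coaxial groups respectively.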
- FIGS. 8 and 9 give an example in which the second sub-image sensor (e.g., 724) images for 1 ms immediately after the first sub-image sensor (e.g., 723) has imaged for 1 ms.
- in practice, the imaging interval of the first and second sub-image sensors also needs to account for the driving of the driving device. Specifically, if imaging is performed directly during the driving process and there is no requirement on the exact projection angle, the second sub-image sensor can image directly after the first sub-image sensor completes imaging, as shown in FIGS. 8 and 9.
- otherwise, the waiting time is relatively short, for example tens of microseconds.
- the actual depth framing rate can also be set as required, for example by offering users high, medium and low gears.
- the number of image groups shot per depth frame differs between gears, for example two groups for the high gear, four for the medium gear, and six for the low gear.
- the motor's angle selection and rhythm, the structured light projection, and the imaging timing are set accordingly to meet different imaging needs.
- different matching windows can also be set according to the number of image groups used to generate a single piece of depth data, thereby improving imaging accuracy.
- the optical path needs to be designed.
- the first image sensor 720 may include: a lens unit 721 for receiving the incident returning structured light; a beam splitting device 722 for splitting the incident returning structured light into at least a first light beam and a second light beam; a first sub-image sensor 723 for imaging the first light beam; and a second sub-image sensor 724 for imaging second light beams of returning structured light corresponding to different patterns.
- the beam splitting device 722 is an optical prism, such as a square prism or a triangular prism.
- the reflected infrared light in the incident light reaches the second sub-image sensor 724 , and the unreflected visible light in the incident light can propagate to the first sub-image sensor 723 in a straight line.
- the beam splitting device 722 in the form of a prism can split the incident light into two beams whose propagation directions are perpendicular to each other.
- the first sub-image sensor 723 and the second sub-image sensor 724 may also be vertically arranged so as to receive incident visible light and infrared light beams at a vertical angle, respectively.
- the components in the incident light need to have the same optical path.
- the first sub-image sensor 723 and the second sub-image sensor 724 may be arranged at equal distances from the beam splitting area of the beam splitting device 722 .
- the distance between the two photosensitive units and the beam splitting device 722 can be flexibly adjusted according to the ratio of the refractive index of air to the prism material.
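That refractive-index adjustment can be made concrete with a back-of-envelope sketch (the index value and distances below are illustrative assumptions; for longitudinal placement, a glass path of thickness d behaves roughly like d/n of air):

```python
N_PRISM = 1.5168  # assumed BK7-like refractive index, for illustration only

def air_gap_for_equal_path(reference_air_mm, glass_mm, n_glass=N_PRISM):
    """Air gap behind the prism so that the sensor whose path passes through
    glass_mm of prism matches a sensor sitting reference_air_mm of pure air
    from the beam splitting area: the glass counts as glass_mm / n_glass of air."""
    return reference_air_mm - glass_mm / n_glass

# Sensor A: 10 mm of air. Sensor B: 5 mm of glass plus this air gap.
print(round(air_gap_for_equal_path(10.0, 5.0), 3))
```

So the sensor behind the prism sits slightly farther (in physical air distance) than a naive equal-distance placement would suggest, which is the "flexible adjustment" the text refers to.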
- Pixel-level alignment (or approximate alignment) between the first sub-image sensor 723 and the second sub-image sensor 724 can be theoretically achieved by making the incident light share most of the optical path and have the same optical path.
- the actual arrangement of the first sub-image sensor 723 and the second sub-image sensor 724 may not achieve the ideal perpendicular and equidistant conditions, resulting in a deviation between their images.
- software correction can also be performed on the manufactured image sensor. For example, by introducing a calibration target and aligning the images of both the first sub-image sensor 723 and the second sub-image sensor 724 with the calibration target, true pixel-level alignment is achieved.
- the pixel-level alignment between the first sub-image sensor 723 and the second sub-image sensor 724 may be precise pixel-level alignment, or may have several pixel differences and achieve alignment through calibration.
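One plausible form of that calibration-based correction (all data below is synthetic, and the patent does not prescribe a particular correction model) is a least-squares affine fit between matched calibration-target points seen by the two sub-image sensors:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least-squares 2x3 affine A such that dst ~= src @ A[:, :2].T + A[:, 2]."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    X = np.hstack([src, np.ones((len(src), 1))])    # N x 3 homogeneous coords
    coef, *_ = np.linalg.lstsq(X, dst, rcond=None)  # 3 x 2 solution
    return coef.T                                   # 2 x 3 affine matrix

# Simulated misalignment between the two sub-sensors: small rotation + shift.
rng = np.random.default_rng(0)
pts_723 = rng.uniform(0, 1000, size=(20, 2))        # target points seen by 723
theta = np.deg2rad(0.1)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
pts_724 = pts_723 @ R.T + np.array([2.0, -1.5])     # same points seen by 724

A = fit_affine(pts_724, pts_723)                    # maps 724 pixels -> 723 frame
mapped = pts_724 @ A[:, :2].T + A[:, 2]
print(float(np.abs(mapped - pts_723).max()))        # residual after correction
```

Since the simulated deviation is exactly affine, the residual collapses to numerical noise; real sensors might need a richer model (e.g. a homography), but the workflow is the same.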
- the image sensor 720 of the present invention may be implemented as a separate module.
- the image sensor 720 may further include a housing for fixing the relative positions of the lens unit, the beam splitting device, and the two photosensitive units.
- the housing can be combined with the lens unit 721 to form a sealing body, so as to avoid the contamination of the contained devices by the external environment.
- the image sensor 720 of the present invention may be part of a larger module (eg, a depth data measurement head), and the securing between the various elements is accomplished by the housing of the larger module.
- the image sensor 720 may further include cables connected to the first sub-image sensor 723 and the second sub-image sensor 724, respectively.
- the housing then has openings for cable access.
- the cables may be flexible cables, such as FPC (flexible circuit board) wires.
- before entering the first sub-image sensor 723 and the second sub-image sensor 724, the light beam may also pass through a filter to further filter out the influence of light of other wavelengths.
- the projection device can project infrared laser light, so the filter arranged in the image sensor can be a corresponding infrared transmission unit that transmits infrared light in a specific frequency range, for example infrared light with wavelengths of 780-1100 nm.
- the projection device can also project visible light, such as red laser or blue laser, such as red light at 635 nm or blue light at 450 nm.
- the ambient light may also include red light or blue light
- imaging with high signal-to-noise ratio can also be performed with the help of the corresponding red or blue light filter.
- since the sensor has better photoelectric conversion efficiency for blue light than for red light, and shorter wavelengths give higher accuracy, the cost of the laser (blue lasers usually cost more), the photoelectric conversion efficiency and the imaging accuracy can be considered comprehensively to choose an appropriate laser wavelength.
- the beam splitting device is a square prism
- one side of the filter can be in direct physical contact with the square prism and the other side in physical contact with the photosensitive unit, with the photosensitive unit and the square prism clamped in the housing; this ensures a high degree of invariance in the relative positions of the components.
- additional visible light photosensitive units may be arranged in the image sensor to capture the image information of the object under test, so that the images captured by the image sensor contain both the image information and the depth information of the object under test.
- the visible light sensing unit can be a grayscale sensor or a color sensor. A grayscale sensor captures only brightness information, while a color sensor can capture the color information of the measured object.
- the visible light sensing unit can be composed of three primary color sensing units, where the three primary colors can be red, green, and blue (RGB), or cyan, red, and yellow (CRY).
- the second image sensor 730 may also have the same structure.
- 723 and 733 can be regarded as the first set of binoculars
- 724 and 734 can be regarded as the second set of binoculars
- alternatively, 723 and 734 can be regarded as the first set of binoculars, and 724 and 733 as the second set of binoculars.
- an optical path transformation device for changing the optical path so as to deliver the incident returning structured light to the first sub-image sensor and the second sub-image sensor.
- the first and second image sensors 720 and 730 may each include: a lens unit for receiving the incident returning structured light; an optical path conversion device for delivering the incident returning structured light to at least a first sub-path and a second sub-path; a first sub-image sensor for imaging the returning structured light on the first sub-path; and a second sub-image sensor for imaging returning structured light corresponding to different patterns on the second sub-path.
- the optical path conversion device may be a rotating mirror, which may reflect the incident light to the photosensitive unit 723 at the 0th millisecond, reflect the incident light to the photosensitive unit 724 at the 1st millisecond, and so on.
- the optical path conversion device may also be a device for performing optical path conversion based on other mechanical, chemical or electrical principles.
- the image sensors may be rolling-shutter image sensors or global-shutter image sensors (i.e., all pixels are exposed at the same time).
- global-shutter sensors can achieve higher frame rates, while rolling-shutter sensors can offer adjustable dynamic range, so the sensor type can be selected according to the actual application scenario.
- the projection device may include a galvanometer that vibrates reciprocally at a predetermined frequency, such as a MEMS galvanometer or a mechanical galvanometer, for scanning and projecting structured light toward the shooting area with a predetermined frequency and a range of motion.
- since the galvanometer can achieve a very high vibration frequency, for example 2 kHz, it is impossible to directly use the start signal of the MEMS galvanometer for synchronization (because the delay is unreliable); therefore, when synchronization is required (for example, to know the rotation angle), and considering the phase vibration characteristics of the micromirror device, a measurement device for real-time measurement of the vibration phase of the galvanometer can be included in the synchronization device.
- the above measurements may be based on the outgoing light itself.
- the above-mentioned measurement device may be one or more photoelectric sensors (e.g., two photodiodes, PDs), arranged in any of the following ways: on different exit paths of the projection device; on different reflection paths within the projection device; or respectively on an exit path and a reflection path outside and inside the projection device.
- the arrangement of the photoelectric sensor can be reasonably selected so that it does not affect the normal projection of structured light while accurately measuring the phase.
- the PD can be installed in the projection device, and the instantaneous vibration phase can be determined by measuring the reflection angle when the laser exits the light window.
- since the vibration phase of the MEMS mirror is sinusoidally distributed, one PD can determine the sinusoidal distribution information, and more PDs help to measure the phase more accurately.
- the PD can also be installed outside the projection device, for example, on a light window, for example, near the edge of the light window to prevent the impact on the projection in the shooting area.
- phase measurements may also be performed in other manners, such as capacitance measurements.
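To make the PD-based phase measurement concrete, here is a hedged sketch: the sinusoidal model θ(t) = A·sin(2πft + φ) follows the sinusoidal-distribution statement above, while the amplitude and PD position are made-up illustration values. A single rising-edge crossing time then pins down the phase:

```python
import math

A_DEG = 5.0      # vibration amplitude in degrees, assumed for illustration
F_HZ = 2000.0    # vibration frequency (the "2 kHz" figure from the text)
THETA_PD = 2.5   # mirror angle at which the PD sees the beam, assumed

def phase_from_trigger(t_rising_s):
    """Recover phi in theta(t) = A*sin(2*pi*f*t + phi) from one rising-edge
    PD crossing time (result is modulo one vibration period)."""
    return (math.asin(THETA_PD / A_DEG)
            - 2 * math.pi * F_HZ * t_rising_s) % (2 * math.pi)

# Self-check: synthesize the rising crossing for a known phase, then recover it.
phi_true = 1.0
t0 = ((math.asin(THETA_PD / A_DEG) - phi_true) % (2 * math.pi)) / (2 * math.pi * F_HZ)
recovered = phase_from_trigger(t0)
print(round(recovered, 6))  # ≈ phi_true
```

With two PDs (two crossing equations) the amplitude ambiguity and rising/falling ambiguity can also be resolved, which matches the remark that more PDs measure the phase more accurately.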
- alternatively, the projection device may include a unidirectionally rotating mechanical mirror. Accordingly, when synchronization is required, the measurement device included in the synchronization device may be an angle measurer for measuring the rotation angle of the motor of the reflective device in real time.
- synchronization between projection and exposure is achieved by controlling the exposure of the image sensor.
- this is useful when the projection angle of the light source is controllable (for example, the angle and rotational speed of a mechanical galvanometer can be controlled by voltage and current), and is especially useful when the phase and speed of the light source scanning are not controllable (for example, for MEMS galvanometers or mechanical rotating mirrors). Accordingly, a MEMS galvanometer can use a PD or capacitance to detect the angle, and a mechanical rotating mirror can realize position detection through voltage detection or photoelectric encoding.
- the present invention can also be implemented as a depth data computing device, comprising: the depth data measurement head as described above; and a processor connected to the depth data measurement head for determining, under the binocular scheme, the depth data of the photographed object in the shooting area according to the predetermined relative positions of the first and second image sensors and the sets of image frame pairs obtained by imaging the structured light.
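For each matched texture point in an image frame pair, the processor's binocular computation reduces to triangulation. A minimal sketch (focal length and baseline are illustrative values, and rectified images are assumed; the patent does not specify these numbers):

```python
FOCAL_PX = 1400.0   # focal length in pixels, assumed for illustration
BASELINE_M = 0.08   # spacing between the first and second image sensors, assumed

def depth_from_disparity(x_left_px, x_right_px):
    """Depth of a matched texture point from its horizontal disparity
    (rectified binocular pair): Z = f * B / d."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        return float("inf")   # no valid match / point at infinity
    return FOCAL_PX * BASELINE_M / disparity

print(depth_from_disparity(650.0, 600.0))  # ≈ 2.24 m for a 50-pixel disparity
```

Because only differences between the two simultaneously captured frames are matched, no reference pattern is needed, which is the advantage noted earlier for the binocular scheme.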
- FIG. 10 shows a schematic flowchart of a method for measuring depth data according to an embodiment of the present invention. The method can be implemented in combination with the structured light projection device, the measuring head and the computing device of the present invention.
- in step S1010, the light beam with the speckle pattern emitted by the light source module is rotationally reflected.
- in step S1020, the measured space is imaged at least twice using the first and second image sensors whose relative positions are fixed, to obtain at least two groups of images, wherein in the at least two imagings the measured space is projected with different speckle patterns appearing due to the rotational reflection.
- step S1030 depth data is obtained from the at least two sets of images and depth data fusion is performed.
- using the first and second infrared light image sensors with fixed relative positions to image the measured space at least twice may include: performing first imaging using a first sub-image sensor pair having a predetermined relative positional relationship to acquire a first image frame pair; and performing second imaging using a second sub-image sensor pair to acquire a second image frame pair, wherein one sub-image sensor of each of the first and second sub-image sensor pairs shares at least part of an optical path and forms the first image sensor, the other sub-image sensor of each pair shares at least part of an optical path and forms the second image sensor, and the first and second image frame pairs are used for a single depth data calculation of the shooting area.
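The method ends in step S1030 with depth data fusion across the image groups. The rule below, a validity-masked average of per-group depth maps, is one plausible illustration; the method text does not fix a specific fusion operator:

```python
import numpy as np

def fuse_depth(depth_maps):
    """Fuse per-group depth maps into one frame.
    depth_maps: list of HxW arrays, with 0 marking invalid (unmatched) pixels.
    Each output pixel is the mean of the valid measurements covering it."""
    stack = np.stack(depth_maps).astype(float)
    valid = stack > 0
    counts = valid.sum(axis=0)
    summed = np.where(valid, stack, 0.0).sum(axis=0)
    return np.where(counts > 0, summed / np.maximum(counts, 1), 0.0)

# Two toy 2x2 depth maps from two image groups (0 = no match in that group):
a = np.array([[1.0, 0.0], [2.0, 4.0]])
b = np.array([[3.0, 5.0], [0.0, 4.0]])
print(fuse_depth([a, b]))
```

Because each group samples the scene under a different projected texture, fusion fills holes left by any single pattern and averages down matching noise, which is the quality benefit of multi-group capture described above.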
- the depth data measurement head, the computing device and the measurement method according to the present invention have been described in detail above with reference to the accompanying drawings.
- the depth measurement solution of the present invention especially uses an improved structured light projection device capable of reflecting structured light generated by the light source module at different angles, thereby enabling faster, more economical and lower failure rate multi-pattern projection.
- the structured light projection device can cooperate with multiple pairs of binocular sensors sharing a light path, thereby further shortening the frame interval and improving the quality of deep fusion data.
- the speckle measurement solution of the present invention is especially suitable for the depth measurement of continuous planes, for example, for loading and unloading material grabbing or welding seam detection in shipyards.
- each block in the flowchart or block diagrams may represent a module, segment, or portion of code that contains one or more executable instructions for implementing the specified logical function(s).
- the functions noted in the block may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
- each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented by dedicated hardware-based systems that perform the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Optics & Photonics (AREA)
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
Description
Claims (19)
- 1. A depth data measurement head, comprising: a structured light projection device for projecting a textured light beam to a measured space at different projection angles under the drive of a driving device, so as to form different textures on an object to be detected in the measured space; and first and second image sensors respectively arranged on either side of the structured light projection device, the first and second image sensors having a predetermined relative spatial positional relationship therebetween and being used to image the measured space at least twice to obtain at least two groups of images with different texture distributions, wherein the at least two groups of images are used to obtain single-measurement depth data of the object to be detected.
- 2. The measurement head of claim 1, wherein the structured light projection device comprises: a light source module for generating and emitting a textured light beam; and a driving device for driving the light source module to project the textured light beam to the measured space at different projection angles.
- 3. The measurement head of claim 1, wherein the structured light projection device comprises: a light source module for generating and emitting a textured light beam; and a steering projection module comprising: a reflective device arranged on the exit path of the light beam for reflecting the incident light beam so that the light beam exits; and a driving device connected to the reflective device for changing the angle of the reflective device relative to the incident light beam so as to change the exit direction of the light beam.
- 4. The measurement head of claim 3, wherein the steering projection module is a mechanical galvanometer and the reflective device reciprocates about its axis; or the steering projection module is a mechanical rotating mirror and the reflective device moves unidirectionally about its axis.
- 5. The measurement head of claim 1, wherein the driving device remains stationary during predetermined window periods of its continuous motion.
- 6. The measurement head of claim 1, wherein the light source module comprises: a laser generator for emitting a laser beam, and a diffractive optical element arranged on the exit path of the laser beam for diffracting the incident laser light and modulating it into discrete spots with a specific projection rule; or a floodlight source for generating flood light, and a mask arranged on the floodlight source for converting the flood light into spots with a specific projection code.
- 7. The depth data measurement head of claim 1, further comprising: a controller for controlling the light source module to light up synchronously when the first and second image sensors are exposed.
- 8. The depth data measurement head of claim 1, wherein the first and second image sensors each comprise at least two sub-image sensors sharing at least part of an optical path, each of the at least two sub-image sensors being used to perform one of the at least two imagings.
- 9. The depth data measurement head of claim 8, comprising: a controller for causing the at least two sub-image sensors included in each of the first and second image sensors to successively perform imaging synchronously at a first interval, the first interval being smaller than the minimum frame imaging interval of the sub-image sensors.
- 10. The depth data measurement head of claim 9, wherein the controller is configured to cause each sub-image sensor to perform its own next frame imaging at a second interval not smaller than the minimum frame imaging interval of the sub-image sensor.
- 11. The depth data measurement head of claim 8, wherein the first and second image sensors each comprise: a lens unit for receiving incident returning structured light; an optical path transformation device for changing the optical path so as to deliver the incident returning structured light to the first sub-image sensor and the second sub-image sensor; and first and second sub-image sensors for imaging different patterns at different times.
- 12. The depth data measurement head of claim 11, wherein the optical path transformation device comprises: a beam splitting device for splitting the incident returning structured light into at least a first light beam and a second light beam, wherein the first sub-image sensor is used to image the first light beam, and the second sub-image sensor is used to image second light beams of returning structured light corresponding to different patterns.
- 13. The depth data measurement head of claim 11, wherein the optical path transformation device comprises: an optical path conversion device for delivering the incident returning structured light to at least a first sub-path and a second sub-path, wherein the first sub-image sensor is used to image the returning structured light on the first sub-path, and the second sub-image sensor is used to image returning structured light corresponding to different patterns on the second sub-path.
- 14. The depth data measurement head of claim 11, wherein the first sub-image sensor and the second sub-image sensor are at equal distances from the beam splitting area of the beam splitting device or from the optical path conversion area of the optical path conversion device.
- 15. The depth data measurement head of claim 11, wherein the at least two sub-image sensors sharing at least part of an optical path included in each of the first and second image sensors are infrared light sensors; and/or the first and second image sensors each comprise: a visible light image sensor for imaging the incident structured light, wherein the visible light sensor shares at least part of an optical path with the first and/or second sub-image sensor.
- 16. The depth data measurement head of claim 1, wherein, based on the different numbers of image groups required for single-measurement depth data, the corresponding combination of projection angles of the driving device, the projection timing of the light beam, and the imaging timing of the first and second image sensors are set.
- 17. A depth data computing device, comprising: the depth data measurement head of any one of claims 1-16; and a processor for acquiring the at least two groups of images, determining depth data of the textures in each group of images according to the predetermined relative spatial positional relationship between the first and second image sensors, and fusing the depth data determined based on the at least two groups of images to obtain new depth data as the single-measurement depth data of the object to be detected.
- 18. A depth data measurement method, comprising: projecting a speckle-patterned light beam emitted by a light source module at different angles; imaging the measured space at least twice using first and second image sensors with fixed relative positions to acquire at least two groups of images, wherein in the at least two imagings the measured space is projected with different speckle patterns presented due to the projection at different angles; and obtaining depth data from the at least two groups of images and performing depth data fusion.
- 19. The method of claim 18, wherein imaging the measured space at least twice using the first and second infrared light image sensors with fixed relative positions comprises: performing first imaging using a first sub-image sensor pair having a predetermined relative positional relationship to acquire a first image frame pair; and performing second imaging using a second sub-image sensor pair to acquire a second image frame pair, wherein one sub-image sensor of each of the first and second sub-image sensor pairs shares at least part of an optical path and forms the first image sensor, the other sub-image sensor of each of the first and second sub-image sensor pairs shares at least part of an optical path and forms the second image sensor, and the first and second image frame pairs are used for a single depth data calculation of the shooting area.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/284,690 US20240167811A1 (en) | 2021-04-20 | 2021-12-14 | Depth data measuring head, computing device and measurement method |
EP21937718.1A EP4328541A1 (en) | 2021-04-20 | 2021-12-14 | Depth data measuring head, computing device and measurement method |
JP2023561894A JP2024513936A (ja) | 2021-04-20 | 2021-12-14 | 深度データ測定ヘッド、計算装置及び測定方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110423509.7 | 2021-04-20 | ||
CN202110423509.7A CN115218820A (zh) | 2021-04-20 | 2021-04-20 | 结构光投射装置、深度数据测量头、计算设备和测量方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022222496A1 true WO2022222496A1 (zh) | 2022-10-27 |
Family
ID=83604661
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/137790 WO2022222496A1 (zh) | 2021-04-20 | 2021-12-14 | 深度数据测量头、计算设备和测量方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20240167811A1 (zh) |
EP (1) | EP4328541A1 (zh) |
JP (1) | JP2024513936A (zh) |
CN (1) | CN115218820A (zh) |
WO (1) | WO2022222496A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116593282A (zh) * | 2023-07-14 | 2023-08-15 | 四川名人居门窗有限公司 | 一种基于结构光的玻璃抗冲击反应测试系统及方法 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3682553A (en) * | 1968-09-19 | 1972-08-08 | Optics Technology Inc | Apparatus for acquiring and laying real time 3-d information |
CN107369156A (zh) * | 2017-08-21 | 2017-11-21 | 上海图漾信息科技有限公司 | 深度数据检测系统及其红外编码投影装置 |
CN108650447A (zh) * | 2018-07-06 | 2018-10-12 | 上海图漾信息科技有限公司 | 图像传感器、深度数据测量头及测量系统 |
WO2019209064A1 (ko) * | 2018-04-26 | 2019-10-31 | 엘지이노텍 주식회사 | 카메라 모듈 및 그의 깊이 정보 추출 방법 |
CN111239729A (zh) * | 2020-01-17 | 2020-06-05 | 西安交通大学 | 融合散斑和泛光投射的ToF深度传感器及其测距方法 |
CN111692987A (zh) * | 2019-03-15 | 2020-09-22 | 上海图漾信息科技有限公司 | 深度数据测量头、测量装置和测量方法 |
CN111721239A (zh) * | 2020-07-22 | 2020-09-29 | 上海图漾信息科技有限公司 | 深度数据测量设备和结构光投射装置 |
CN111829449A (zh) * | 2019-04-23 | 2020-10-27 | 上海图漾信息科技有限公司 | 深度数据测量头、测量装置和测量方法 |
CN212747701U (zh) * | 2020-07-22 | 2021-03-19 | 上海图漾信息科技有限公司 | 结构光投射装置和深度数据测量头 |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012020380A1 (en) * | 2010-08-11 | 2012-02-16 | Primesense Ltd. | Scanning projectors and image capture modules for 3d mapping |
US10368056B2 (en) * | 2015-06-19 | 2019-07-30 | Shanghai Percipio Technology Limited | Depth data detection and monitoring apparatus |
-
2021
- 2021-04-20 CN CN202110423509.7A patent/CN115218820A/zh active Pending
- 2021-12-14 JP JP2023561894A patent/JP2024513936A/ja active Pending
- 2021-12-14 EP EP21937718.1A patent/EP4328541A1/en active Pending
- 2021-12-14 US US18/284,690 patent/US20240167811A1/en active Pending
- 2021-12-14 WO PCT/CN2021/137790 patent/WO2022222496A1/zh active Application Filing
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3682553A (en) * | 1968-09-19 | 1972-08-08 | Optics Technology Inc | Apparatus for acquiring and laying real time 3-d information |
CN107369156A (zh) * | 2017-08-21 | 2017-11-21 | Shanghai Percipio Technology Limited | Depth data detection system and infrared-coded projection device thereof |
WO2019209064A1 (ko) * | 2018-04-26 | 2019-10-31 | LG Innotek Co., Ltd. | Camera module and depth information extraction method thereof |
CN108650447A (zh) * | 2018-07-06 | 2018-10-12 | Shanghai Percipio Technology Limited | Image sensor, depth data measuring head, and measuring system |
CN111692987A (zh) * | 2019-03-15 | 2020-09-22 | Shanghai Percipio Technology Limited | Depth data measuring head, measuring device, and measuring method |
CN111829449A (zh) * | 2019-04-23 | 2020-10-27 | Shanghai Percipio Technology Limited | Depth data measuring head, measuring device, and measuring method |
CN111239729A (zh) * | 2020-01-17 | 2020-06-05 | Xi'an Jiaotong University | ToF depth sensor fusing speckle and flood projection, and ranging method thereof |
CN111721239A (zh) * | 2020-07-22 | 2020-09-29 | Shanghai Percipio Technology Limited | Depth data measuring device and structured-light projection apparatus |
CN212747701U (zh) * | 2020-07-22 | 2021-03-19 | Shanghai Percipio Technology Limited | Structured-light projection apparatus and depth data measuring head |
Non-Patent Citations (1)
Title |
---|
See also references of EP4328541A4 * |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116593282A (zh) * | 2023-07-14 | 2023-08-15 | Sichuan Mingrenju Door and Window Co., Ltd. | Structured-light-based glass impact-resistance test system and method |
CN116593282B (zh) * | 2023-07-14 | 2023-11-28 | Sichuan Mingrenju Door and Window Co., Ltd. | Structured-light-based glass impact-resistance test system and method |
Also Published As
Publication number | Publication date |
---|---|
EP4328541A4 (en) | 2024-02-28 |
EP4328541A1 (en) | 2024-02-28 |
US20240167811A1 (en) | 2024-05-23 |
JP2024513936A (ja) | 2024-03-27 |
CN115218820A (zh) | 2022-10-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5281923B2 (ja) | Projection display device | |
CN110446965B (zh) | Method and system for tracking eye movement in conjunction with a light-scanning projector | |
US7078720B2 (en) | Range finder for measuring three-dimensional geometry of object and method thereof | |
US20140313519A1 (en) | Depth scanning with multiple emitters | |
CN107369156B (zh) | Depth data detection system and infrared-coded projection device thereof | |
US9013711B2 (en) | Contour sensor incorporating MEMS mirrors | |
WO2017038203A1 (ja) | Projector device with distance-image acquisition device and projection mapping method | |
JPH10232626A (ja) | Stereoscopic image display device | |
EP3640678A1 (en) | Tracker, surveying apparatus and method for tracking a target | |
WO2022222496A1 (zh) | Depth data measuring head, computing device, and measuring method | |
WO2022017441A1 (zh) | Depth data measuring device and structured-light projection apparatus | |
CN107836112B (zh) | Projection system | |
JP2010085472A (ja) | Image projection and imaging device | |
CN216246133U (zh) | Structured-light projection apparatus, depth data measuring head, and computing device | |
WO2021032298A1 (en) | High resolution optical depth scanner | |
CN212747701U (zh) | Structured-light projection apparatus and depth data measuring head | |
WO2022222497A1 (zh) | Depth data measuring head, computing device, and corresponding method | |
CN216283296U (zh) | Depth data measuring head and depth data computing device | |
CN217604922U (zh) | Depth data measuring head and local depth data measuring device | |
JP3668466B2 (ja) | Real-time rangefinder | |
JPH1026724A (ja) | Active multi-point distance measuring device | |
CN117369197B (zh) | 3D structured-light module, imaging system, and method for obtaining a depth map of a target object | |
CN212567304U (zh) | Depth data measuring head | |
US20210152809A1 (en) | Structured light emitting device and image acquisition device using same |
KR100902176B1 (ko) | 3D scanner using a rotating polygonal mirror |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21937718; Country of ref document: EP; Kind code of ref document: A1
WWE | Wipo information: entry into national phase |
Ref document number: 18284690; Country of ref document: US
WWE | Wipo information: entry into national phase |
Ref document number: 2023561894; Country of ref document: JP
WWE | Wipo information: entry into national phase |
Ref document number: 2021937718; Country of ref document: EP
NENP | Non-entry into the national phase |
Ref country code: DE
ENP | Entry into the national phase |
Ref document number: 2021937718; Country of ref document: EP; Effective date: 20231120