WO2022222496A1 - Depth data measuring head, computing device and measuring method - Google Patents

Depth data measuring head, computing device and measuring method

Info

Publication number
WO2022222496A1
WO2022222496A1 (PCT/CN2021/137790)
Authority
WO
WIPO (PCT)
Prior art keywords
sub
light
depth data
image sensor
image
Prior art date
Application number
PCT/CN2021/137790
Other languages
English (en)
French (fr)
Inventor
王敏捷
梁雨时
Original Assignee
上海图漾信息科技有限公司
Priority date
Filing date
Publication date
Application filed by 上海图漾信息科技有限公司
Priority to US18/284,690 (published as US20240167811A1)
Priority to EP21937718.1A (published as EP4328541A1)
Priority to JP2023561894 (published as JP2024513936A)
Publication of WO2022222496A1

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2513Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/22Measuring arrangements characterised by the use of optical techniques for measuring depth
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2518Projection by scanning of the object
    • G01B11/2527Projection by scanning of the object with phase change by in-plane movement of the patern
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/2545Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/521Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/529Depth or shape recovery from texture
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Definitions

  • the invention relates to the technical field of three-dimensional detection, in particular to a depth data measuring head, a computing device and a measuring method.
  • a binocular detection scheme based on structured light can measure the surface of an object in three dimensions in real time.
  • the scheme first projects a two-dimensional laser texture pattern with encoded information on the surface of the target object, such as a discretized speckle pattern.
  • the processing unit uses a sampling window to sample the two images simultaneously collected by the two image acquisition devices, determines the matching laser texture patterns within the sampling window, calculates the depth distance of each laser texture segment projected on the object surface from the disparity between the matched texture patterns, and thereby further obtains the three-dimensional data of the surface of the object to be measured.
  • the smaller the sampling window, the finer the granularity of the image, but the higher the false matching rate. While it is possible to reduce the sampling window by taking multiple sets of different images consecutively, this introduces additional system complexity and reduces the frame rate.
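  • As an illustration of the window-based binocular matching described above, the following is a minimal sketch, not the patent's implementation; it assumes rectified grayscale images, and the window size, disparity range, focal length and baseline are hypothetical values.

```python
import numpy as np

def block_match_depth(left, right, win=11, max_disp=64, f_px=1400.0, baseline_m=0.05):
    """Minimal window-based stereo matching sketch (rectified grayscale images).

    For each pixel, the window of size `win` in the left image is compared
    against candidate windows along the same row in the right image; the
    disparity with the smallest sum of absolute differences wins, and depth
    is recovered as Z = f * B / d. f_px and baseline_m are illustrative values.
    """
    h, w = left.shape
    r = win // 2
    depth = np.zeros((h, w), dtype=np.float32)
    for y in range(r, h - r):
        for x in range(r + max_disp, w - r):
            ref = left[y - r:y + r + 1, x - r:x + r + 1].astype(np.float32)
            costs = [np.abs(ref - right[y - r:y + r + 1,
                                        x - d - r:x - d + r + 1]).sum()
                     for d in range(1, max_disp)]
            d_best = 1 + int(np.argmin(costs))      # disparity with lowest SAD cost
            depth[y, x] = f_px * baseline_m / d_best
    return depth
```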
  • a technical problem to be solved by the present disclosure is to provide a depth data measurement solution that uses multi-angle pattern projection, in particular an improved structured light projection device capable of reflecting the structured light generated by the light source module at different angles, so as to enable faster, more economical projection with a lower failure rate. Further, the structured light projection device can cooperate with multiple pairs of binocular sensors sharing a light path, thereby further shortening the frame interval and improving the quality of depth fusion data.
  • a depth data measurement head is provided, comprising: a structured light projection device for projecting a textured light beam to a measured space at different projection angles under the driving of a driving device, so as to form different textures on the object to be detected in the measured space; and first and second image sensors respectively arranged on both sides of the structured light projection device, the first and second image sensors having a predetermined relative spatial positional relationship and imaging the measured space at least twice during the movement of the reflection device to obtain at least two sets of images with different texture distributions, wherein the at least two sets of images are used to obtain single-measurement depth data of the object to be detected.
  • a depth data computing device is provided, comprising: the depth data measurement head according to the first aspect of the present invention, and a processor for acquiring the at least two sets of images, determining the depth data of the texture in each set of images according to the predetermined relative spatial positional relationship between the first and second image sensors, and fusing the depth data determined based on the at least two sets of images to obtain new depth data as the single-measurement depth data of the object to be detected.
  • a method for measuring depth data is provided, comprising: projecting the speckle-patterned light beams emitted from a light source module at different angles; imaging the measured space at least twice using first and second image sensors with fixed relative positions to obtain at least two sets of images, wherein, in the at least two imagings, the measured space is projected with different speckle patterns resulting from the projections at different angles; and obtaining depth data from the at least two sets of images and performing depth data fusion.
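  • To make the fusion step concrete, the following is a hedged sketch of one plausible way to fuse the per-set depth maps (averaging valid samples); the patent does not specify the fusion rule, and the zero-as-invalid convention is an assumption.

```python
import numpy as np

def fuse_depth_maps(depth_maps, invalid_value=0.0):
    """Fuse several single-set depth maps into one measurement (sketch).

    Each input map may have holes (pixels with no valid match, here encoded
    as 0). Valid samples from the different projection angles are averaged;
    a pixel that is valid in any map becomes valid in the fused result.
    """
    stack = np.stack(depth_maps, axis=0).astype(np.float32)
    valid = stack != invalid_value
    counts = valid.sum(axis=0)
    fused = np.where(counts > 0,
                     (stack * valid).sum(axis=0) / np.maximum(counts, 1),
                     invalid_value)
    return fused
```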
  • thus, the projection flexibility of the structured light projection device is improved by rotationally reflecting the structured light.
  • the device can be further combined with a mask light source and a coaxial binocular solution to further improve the accuracy and imaging speed of the multi-frame fusion solution.
  • FIG. 1 shows a schematic composition diagram of a structured light projection device according to an embodiment of the present invention.
  • FIG. 2 shows a schematic diagram of changing the projection direction of the present invention.
  • Figure 3 shows a perspective view of a structured light projection device according to one embodiment of the present invention.
  • FIGS. 4A-B illustrate examples of structured light projected by the structured light projection device shown in FIG. 3 from different viewing angles.
  • FIGS. 5A-B illustrate examples of speckle projection using the apparatus shown in FIG. 3 or FIG. 4.
  • FIG. 6 shows a block diagram of a depth data measurement head according to an embodiment of the present invention.
  • FIG. 7 shows a schematic diagram of the composition of a depth data measuring head according to an embodiment of the present invention.
  • FIG. 8 shows a comparison timing diagram of coaxial two-group imaging and single-group imaging.
  • FIG. 9 shows the timing diagram of coaxial three-group binocular imaging.
  • FIG. 10 shows a schematic flowchart of a method for measuring depth data according to an embodiment of the present invention.
  • the sampling window can be reduced by taking multiple sets of different images in succession.
  • for example, the same light source module can be made to project from different angles by using a driving device to drive the light source module to rotate.
  • in this case, even though the patterns projected by the light source module are the same, they still appear as different patterns in the field of view of the imaging device (image sensor) due to the different projection angles.
  • however, because the light source module has its own weight and requires wiring for power supply, the driving capability of the driving device is limited.
  • in addition, multi-frame-based depth fusion reduces the frame rate and degrades the shooting performance for dynamic objects.
  • to this end, the present invention provides a depth data measurement solution, which preferably uses an improved structured light projection device capable of reflecting the structured light generated by the light source module at different angles, so as to achieve faster, more economical and lower-failure-rate multi-pattern projection.
  • further, the structured light projection device can cooperate with multiple pairs of binocular sensors sharing a light path, thereby further shortening the frame interval and improving the quality of depth fusion data.
  • FIG. 1 shows a schematic composition diagram of a structured light projection device according to an embodiment of the present invention.
  • the structured light projection device can be used for structured light projection in the depth data measuring head of the present invention.
  • the structured light projection device 110 may include a light source module I located above the dashed line in the figure and a steering projection module II located below the dashed line in the figure.
  • the light source module is used for generating the light beam to be projected
  • the steering projection module is used for turning and exiting the light beam.
  • the light source module is used to generate and emit a textured light beam.
  • the light source module does not directly project the light beam emitted by the light emitting device, but performs certain optical processing on the light beam projected by the light emitting device to make it present a desired distribution, brightness or pattern. For this reason, in different implementations, different speckle generation schemes can be used to achieve different projection effects.
  • the light source module 1 may include a laser generator 111 and a diffractive optical element (DOE) 112 .
  • the laser generator 111 is used for emitting a laser beam (as shown by the single arrow leaving 111 in the figure).
  • the DOE 112 arranged on the outgoing optical path of the laser beam can modulate the incident laser light, for example diffracting the incident laser light and modulating it into discrete light spots with specific projection rules (as indicated by the double arrows leaving 112 in the figure, which represent that the diffracted beam has a certain width, i.e., occupies a certain area in the plane of the DOE 112).
  • the laser generator 111 may be a laser diode (LD), such as an edge-type laser diode.
  • the laser light generated by the LD can be collimated by a collimating lens (not shown in the figure), and then diffracted by the DOE 112.
  • in the case where the laser generator 111 is a single-beam laser generator such as an LD, the DOE 112 may have a relatively complex structure to diffract the single incident laser spot into a pattern with many spots (e.g., two thousand or even twenty thousand points) to form the projected complex speckle. Because the beam is collimated, fixed-point projection can be performed by diffraction, which has the advantages of a long working distance and low power consumption.
  • the laser generator 111 may also be implemented as a VCSEL (Vertical Cavity Surface Emitting Laser). Since the VCSEL itself can include a plurality of light-emitting particles, one VCSEL chip itself can emit a certain pattern, for example, a pattern formed by 100 light-emitting particles. At this time, the DOE 112 only needs to simply copy, such as direct copy, interleaved copy, or rotational copy, to realize the speckle pattern of two thousand or even twenty thousand points as mentioned above. Although collimation is generally not possible, VCSEL schemes can be more energy efficient and require less DOE complexity.
  • VCSEL: Vertical Cavity Surface Emitting Laser
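  • The replication of a VCSEL base pattern by the DOE described above can be sketched as follows; this is an illustrative model only, and the tile counts, pitch and replication modes are hypothetical values, not taken from the patent.

```python
import numpy as np

def replicate_pattern(base_xy, nx=5, ny=4, pitch=1.0, mode="direct"):
    """Replicate a VCSEL base dot pattern into a larger speckle field (sketch).

    base_xy: (N, 2) array of dot positions in one tile (e.g., ~100 emitters).
    The DOE is modeled as copying the tile onto an nx-by-ny grid; "interleaved"
    shifts every other row by half a pitch, "rotated" rotates alternate tiles
    by 90 degrees about their centers.
    """
    tiles = []
    for j in range(ny):
        for i in range(nx):
            tile = base_xy.copy()
            if mode == "interleaved" and j % 2 == 1:
                tile[:, 0] += 0.5 * pitch
            if mode == "rotated" and (i + j) % 2 == 1:
                c = tile.mean(axis=0)
                tile = (tile - c) @ np.array([[0.0, 1.0], [-1.0, 0.0]]) + c  # 90 deg
            tiles.append(tile + np.array([i * pitch, j * pitch]))
    return np.concatenate(tiles, axis=0)   # e.g., 100 dots -> 100 * nx * ny dots
```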
  • the light source module 1 may also be implemented using other solutions than lasers and diffractive elements.
  • the light source module may comprise a flood light source for generating flood light; and a mask arranged on the flood light source for converting the flood light into a light spot with a specific projection encoding.
  • the steering projection module is a module for reflecting light beams (eg, speckle patterns) generated by the light source module to exit.
  • the steering projection module of the present invention can drive the reflection module to move, so that the light beam reflected by the reflection module can be changed.
  • the steering projection module II may include a reflection device 113 and a driving device 114 .
  • the reflection device 113, arranged on the exit path of the light beam, can be used to reflect the incident light beam so that the beam exits, while the driving device 114, connected to the reflection device, can be used to change the angle of the reflection device relative to the incident beam, thereby changing the exit direction of the beam.
  • for convenience of description, the direction in which the light exits the measuring head (that is, the direction along which the light leaves the measuring head) may be designated as the z direction, the horizontal direction of the photographing plane as the x direction, and the vertical direction as the y direction.
  • hereinafter, an example in which the light source module projects the structured light downward (in the y direction) and the steering projection module changes the exit direction to the z direction (in fact, along the z direction with a slight angular deviation) will be described in detail.
  • the structured light projection device of the present invention may also be arranged or placed in other directions according to the needs of the actual imaging scene.
  • FIG. 2 shows a schematic diagram of changing the projection direction of the present invention.
  • the light emitted by the light source is projected down the y-axis to a reflection device, such as a mirror, and reflected by the reflection device to be projected into the measured space, and can form a light spot on the imaging plane perpendicular to the z-axis.
  • the mirror can be rotated about the x-axis, for example within the A-B angle range shown in the figure, whereby a light spot moving within the A'-B' range on the imaging plane is obtained accordingly.
  • the pattern projected by the light source module I onto the mirror is always the same (because the relative positions of the laser generator 111 and the DOE 112 are fixed and are not themselves driven), but because of the rotation of the mirror the projected pattern acquires an angular offset, so when used in combination with binocular imaging as described below, the patterns captured by the image sensors at different projection angles can be regarded as different patterns.
  • FIG. 2 shows a mirror that can be rotated within a large A-B angle range in order to illustrate the principle of the present invention; in an actual implementation, the angle difference between projections of the same pattern can be very small, for example 1°, thereby ensuring that the imaged patterns are not identical while the imaging ranges substantially coincide.
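  • A rough feel for the resulting pattern shift can be obtained from the mirror geometry: a mirror rotation of delta deflects the reflected beam by roughly 2*delta. The sketch below is illustrative only; the working distance is a hypothetical value, and it assumes the quoted angle refers to the mirror rather than the outgoing beam.

```python
import math

def pattern_shift(distance_m, mirror_delta_deg):
    """Approximate lateral shift of the projected pattern on a plane at
    `distance_m` when the mirror tilts by `mirror_delta_deg` (sketch).

    A mirror rotation of delta deflects the reflected beam by 2*delta, so the
    spot on a plane perpendicular to the nominal exit direction moves by about
    distance * tan(2*delta).
    """
    return distance_m * math.tan(math.radians(2.0 * mirror_delta_deg))

# e.g. a 1 degree mirror swing at a 1 m working distance shifts the pattern by ~35 mm
print(round(pattern_shift(1.0, 1.0) * 1000, 1), "mm")
```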
  • FIG. 3 shows a perspective view of a structured light projection device according to an embodiment of the present invention.
  • FIGS. 4A-B illustrate examples of projecting structured light by the structured light projection device shown in FIG. 3 from different viewing angles.
  • the laser generator 311 can be arranged inside the casing (or fixed structure) 315, and the light generated by the laser generator 311 can be diffracted by the DOE 312 arranged on its outgoing optical path to obtain a diffraction pattern with a certain two-dimensional distribution.
  • the diffraction pattern propagates up to the mirror 313, which reflects the diffraction pattern so that it exits (approximately) along the z-direction and is projected on a plane perpendicular to the z-direction (as shown in Figs. 4A and 4B, where Fig. 4B can be viewed as a view of the device shown in FIGS. 3 and 4A rotated 90° counterclockwise along the y-axis) to form a speckle pattern.
  • the projection planes of FIGS. 4A and 4B can be viewed as three-dimensional representations of the imaging plane shown in FIG. 1 .
  • the driving device can control the reflection device to move about its axis, wherein the light beam emitted from the light source module is incident on the reflection device in a direction perpendicular to the axial direction (x direction), and the exit direction is changed based on the axial motion of the reflection device.
  • the rotating shaft extending from the driving device (eg, motor) 314 is fixedly connected to the mirror 313, so that when the motor works, the rotating shaft drives the reflecting mirror 313 to move axially.
  • a projection pattern with a certain angular offset is formed on the projection plane.
  • these projected patterns appear as different patterns to the image sensors that capture the projection, so the projection device of the present invention can conveniently realize the projection of "different" patterns by rotationally reflecting the structured light (e.g., a diffraction pattern with a two-dimensional distribution).
  • the steering projection module can be a galvanometer, and the motor can drive the mirror to reciprocate within a certain range. In other embodiments, the steering projection module can be a rotating mirror, and the motor can only move in one direction along the axial direction.
  • the steering projection module may be a mechanical galvanometer that vibrates reciprocally at a predetermined frequency, so that structured light can be projected to the measured area at a predetermined frequency, presenting on the projection plane a two-dimensional diffraction pattern (discrete spots) that moves up and down along the y direction. Due to the controllability of the mechanical galvanometer, it is also possible to keep the steering projection module stationary for a predetermined window period during its continuous motion.
  • a mechanical galvanometer can have, for example, a range of motion of ±1° around the z direction and a vibration frequency of up to 2,000 times per second.
  • the mechanical galvanometer can be made to pause for a period of time when it moves to -1°, 0° and 1°, for example remaining stationary for the exposure time required by the photosensitive unit (e.g., 1 ms), so that the projected pattern remains unchanged during the exposure of the image sensor, thereby improving imaging accuracy.
  • since a rotating mirror can only rotate in one direction and variable-speed control is difficult, the moving angle of the rotating mirror can instead be sensed by a device such as a photodiode, and structured light projection and dynamic imaging can be performed within a suitable angle range, for example projection and corresponding imaging within a ±1° interval around the z direction out of the full 360° rotation.
  • FIGS. 5A-B illustrate examples of speckle projection using the apparatus shown in FIG. 3 or FIG. 4.
  • FIGS. 5A and 5B can be regarded as the patterns projected when the mechanical galvanometer moves to 1° and -1°, respectively, and can also be regarded as the images captured by the image sensors when the structured light projection device cooperates with image sensors to form a measuring head.
  • the projection range of the speckle is shown by a black box in the figure, and it should be understood that in actual use, the black box in the figure will not be projected.
  • in this example the speckle occupies less than 30% of the area, so it can be assumed that the laser generator plus DOE projection scheme is adopted.
  • as shown, the patterns projected in the two figures are actually "the same", but because of the different projection angles, the pattern of FIG. 5B is shifted down some distance along the y direction.
  • as a result, the speckle actually projected onto the object differs (the projected pattern within the range shown by the dotted box in FIG. 5B is the same as the projected pattern in FIG. 5A), so it can be considered that different patterns are projected onto the same measured object.
  • due to the irregularity (or randomness) of the speckle pattern, under different projection angles different positions on the surface of the square object receive speckle, so more surface depth information can be obtained in the subsequent depth data fusion.
  • in the above example, the projected structured light is projected along the z direction and varies in the vertical direction (y direction), but in other implementations the projected structured light may also be projected along the z direction while varying in the horizontal direction (x direction), or its angle may vary in both the x and y directions.
  • the dotted frame shown in FIG. 5B can be moved up and down, left and right, or even along an oblique line.
  • the number of image groups required for acquiring single-measurement depth data of the object to be detected may be determined according to a specific application scenario. For example, when the number of image groups that can be processed per second is determined, the imaging frame rate can be improved by reducing the number of image groups required for a single measurement of depth data; or by increasing the number of image groups required for a single measurement of depth data to improve imaging accuracy.
  • the structured light projection device of the present invention may have, for example, five adjustable imaging rates.
  • a frame rate of, for example, 100 frames per second can be achieved.
  • a frame rate of, for example, 50 frames per second can be achieved.
  • each measurement of depth data then uses two sets of image frames taken at different times, and the motor projection angle may need to change back and forth accordingly, for example between 0.5° and 0°, with shooting performed separately at 0.5° and at 0° to obtain the two sets of image frames for fusion.
  • at an intermediate gear (i.e., mid-range), an intermediate frame rate is obtained.
  • a frame rate of, for example, 25 frames per second can be achieved.
  • a frame rate of, for example, 20 frames/second can be achieved.
  • the user can select the corresponding gear as needed, and make the drive mechanism, light source, image sensor and processor cooperate with each other to achieve the required image capture and fusion calculation.
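  • The relationship between the gear (number of image groups per depth frame) and the achievable depth frame rate can be sketched as follows; the 100 frames/s sensor rate is taken from the later examples, and the mapping of gears to group counts is an assumption for illustration.

```python
def depth_frame_rate(sensor_fps=100.0, groups_per_depth_frame=2):
    """Depth-data frame rate for a single binocular pair that reuses the same
    sensors for every image group (sketch of the 'gear' trade-off above).

    With a 100 frames/s sensor, 1 group per depth frame gives 100 depth fps,
    2 groups give 50 fps, 4 groups 25 fps, 5 groups 20 fps, matching the
    example frame rates; higher gears trade frame rate for fusion accuracy.
    """
    return sensor_fps / groups_per_depth_frame

for groups in (1, 2, 3, 4, 5):
    print(groups, "group(s) ->", round(depth_frame_rate(100.0, groups), 1), "depth frames/s")
```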
  • FIG. 6 shows a block diagram of a depth data measurement head according to an embodiment of the present invention.
  • a depth data measurement head 600 includes a structured light projection device 610 and a first image sensor 620 and a second image sensor 630 having a predetermined relative positional relationship.
  • the structured light projection device 610 can, under the driving of the driving device included in it, project light beams with textures to the measured space through different projection angles, so as to form different textures on the object to be detected in the measured space (for example, as shown in Fig. Different textures shown in 5A and 5B).
  • the first and second image sensors 620 and 630, respectively arranged on both sides of the structured light projection device, image the measured space at least twice during the movement of the reflection device, so as to obtain at least two sets of images with different texture distributions.
  • in one embodiment, the structured light projection device 610 may include a light source module for generating and emitting a textured light beam, and a driving device for driving the light source module so that the same textured light beam is projected to the measured space at different projection angles.
  • the driving device can directly drive the light source module to perform angle transformation.
  • the driving device may be, for example, a voice coil motor on which the light source module is mounted.
  • the structured light projection device 610 may be a structured light projection device including a reflection device as described above in conjunction with FIG. 1 , FIG. 3 and FIG. 4 .
  • the implementation of the measurement head based on this structured light projection device will be described in detail below with reference to FIG. 7 .
  • the example of FIG. 7 further includes a preferred imaging scheme of sharing the optical path.
  • FIG. 7 shows a schematic diagram of the composition of a depth data measuring head according to an embodiment of the present invention.
  • an example of the composition of one of the image sensors 720 is shown in more detail in the figure.
  • the depth data measurement head 700 based on the binocular principle includes a projection device 710 and a first image sensor 720 and a second image sensor 730 having a predetermined relative positional relationship.
  • the projection device 710 may be a structured light projection device as previously described in conjunction with FIGS. 1 and 3-4.
  • the measuring head 700 may further include a casing for surrounding the above-mentioned device, and the connection structure 740 shown in FIG. 7 can be regarded as a mechanism for fixing the above-mentioned device and connecting to the casing.
  • the connection structure 740 may be a circuit board that includes control circuitry thereon. It should be understood that, in other implementations, the above-mentioned devices 710-730 may be connected to the housing in other ways, and perform corresponding data transmission and instruction receiving operations.
  • the projection device 710 is used to project structured light to the shooting area, for example the same pattern diffracted by the DOE; because of the steering mechanism, however, this same pattern can be projected at different angles, so that the first image sensor 720 and the second image sensor 730, which have a predetermined relative positional relationship, photograph the shooting area to obtain a set of image frame pairs with different patterns. This set of image frame pairs can then be used for a single calculation of the depth data of the shooting area.
  • the first image sensor 720 and the second image sensor 730 may be arranged on both sides of the structured light projection device 710, respectively. There is a predetermined relative spatial positional relationship between the first and second image sensors, and the measured space is imaged at least twice during the movement of the reflective device to obtain at least two images with different texture distributions. A group of images, wherein the at least two groups of images are used to obtain single-measurement depth data of the object to be detected.
  • the projection device 710 may, under the driving of its driving device, project a light beam with textures to the measured space at constantly changing projection angles, so as to form different textures on the object to be detected in the measured space.
  • the image sensor can perform multiple imaging during the turning process, for example, when moving to -1°, 0° and 1°, each image is performed once, thereby obtaining a set of image frame pairs including three pairs (6 frames). These 6 frames of images are jointly used for the calculation of one-time depth data for the shooting area, that is, a depth image of one frame can be calculated.
  • in some embodiments, the light source module in the projection device 710 can be always on during operation, and the image sensors can perform multiple imagings at specific angles, or at arbitrary angles (or any angle within a predetermined range of motion), of the steering device's rotation.
  • corresponding imaging can be performed according to the rotation angle of the rotating mirror, for example, when the movement reaches -1°, 0°, and 1°, each imaging is performed once.
  • Corresponding exposure can also be performed according to the measured rotation angle.
  • the rotation angle and exposure of the image sensor may also not be synchronized.
  • the image sensors can capture a required set of images at any time and at any interval, as long as the shooting interval does not exactly coincide with the rotation frequency of the rotating mirror, so that different images can be captured.
  • since the images within the same group are compared against each other and there is no need to compare them with a reference pattern, it is not necessary to know exactly which pattern is projected.
  • the light source module in the projection device 710 may be synchronized with the exposure of the image sensor.
  • the measuring head 700 can also be controlled by a controller for controlling the light source module to light up synchronously when the first and second image sensors are exposed.
  • the controller may also be used to control the driving device to remain stationary during the exposure of the first and second image sensors.
  • in this way, a clearer projection pattern, such as discrete light spots, can be obtained.
  • the first and second image sensors may be conventional image sensors.
  • the first and second image sensors each include at least two sub-image sensors that share at least part of the optical path, the at least two sub-image sensors each being used to perform one imaging of the at least two imagings .
  • each of the first image sensor and the second image sensor includes only one photosensitive unit, and each photosensitive unit performs imaging three times to obtain a set of three pairs (6 frames) of image frame pairs.
  • the first and second image sensors each include at least two sub-image sensors sharing at least part of the optical path, and the at least two sub-image sensors are respectively used to image the structured light of different patterns successively projected by the projection device.
  • FIG. 7 shows an example in which each of the first and second image sensors includes two sub-image sensors (photosensitive units).
  • the first image sensor 720 includes sub-image sensors 723 and 724
  • the second image sensor 730 includes sub-image sensors 733 and 734 .
  • the sub-image sensors 723 and 724 share the optical path up to the beam-splitting surface of the beam-splitting device 722 and are equidistant from that beam-splitting area.
  • similarly, the sub-image sensors 733 and 734 share the optical path up to the beam-splitting surface of the beam-splitting device 732 and are equidistant from that beam-splitting area.
  • the present invention introduces sets of binocular structures that are coaxial with each other.
  • the sub-image sensors 723 and 733 located in different image sensor housings can be regarded as the first group of image sensors (the first group of binoculars) for imaging structured light under one projection angle.
  • sub-image sensors 724 and 734 which can be regarded as a second group of image sensors (a second group of binoculars)
  • before the sub-image sensors 723 and 733 can perform their next frame imaging, the sub-image sensors 724 and 734, which are coaxial with 723 and 733 (i.e., have equivalent optical paths), take over and image the next pattern of structured light in their place. Therefore, two adjacent frames can be imaged at a smaller interval, without being limited by the frame interval of each individual image sensor.
  • to this end, the measuring head 700 may further include a synchronization device for causing the projection device to project structured light of at least two different patterns at a first interval smaller than the frame imaging interval of the sub-image sensors, so that the at least two sub-image sensors included in each of the first and second image sensors 720 and 730 sequentially and synchronously image the at least two different patterns of structured light at the first interval. Accordingly, each sub-image sensor still performs its own next frame imaging at a second interval not less than its frame imaging interval (for example, imaging at its own frame interval), and the above imaging operations are synchronized by the synchronization device with the projection of the projection device.
  • FIG. 8 shows a comparison timing diagram of coaxial two-group imaging and single-group imaging.
  • for example, the frame rate of each photosensitive unit can be set to 100 frames/s, so that the frame interval is 10 ms (i.e., the minimum frame interval is 10 ms), and the required exposure time of each photosensitive unit is 1 ms.
  • if the first and second image sensors 720 and 730 are conventional image sensors each including only a single photosensitive unit, then, when three different patterns (corresponding to six acquired images) are to be used for the depth data calculation, three imagings are required at the 0th, 10th and 20th milliseconds, as shown in the lower part of FIG. 8. As a result, compositing each depth data image requires the subject to remain motionless for 21 ms (making it harder to capture moving subjects), and the frame rate is reduced from 100 frames/s to 33.3 frames/s.
  • in contrast, with the present invention, where the first and second image sensors 720 and 730 each consist of two photosensitive units (e.g., the first and second image sensors 720 and 730 include sub-image sensors 723 and 724, and sub-image sensors 733 and 734, respectively), the first group of photosensitive units images pattern 1 at the 0th millisecond, the second group of photosensitive units images pattern 2 at the 1st millisecond immediately afterwards, and then, after an interval of 10 ms, the first group of photosensitive units images pattern 3 (for example, the pattern at the third projection angle) at the 10th millisecond, thus completing the three imagings required for one pair of depth data images.
  • subsequently, the second group of photosensitive units can start the next round by imaging pattern 1, the first group of photosensitive units then images pattern 2, and the second group of photosensitive units images pattern 3 again, and so on.
  • the imaging interval of different groups of photosensitive units only needs the time required for imaging (for example, 1 ms), and the re-imaging interval of the same group of photosensitive units still follows the minimum frame interval time (for example, 10 ms) corresponding to the frame rate.
  • synthesizing each depth data image only requires the subject to remain motionless for 11ms (so it is easier to capture moving subjects), and the frame rate can be kept close to 66.6 frames/s.
  • each of the first and second image sensors may further include more photosensitive units.
  • FIG. 9 shows the timing diagram of coaxial three-group binocular imaging.
  • each of the first and second image sensors may include three photosensitive units (sub-image sensors) that are coaxial.
  • the first group of photosensitive units images pattern 1 at the 0th millisecond, the second group of photosensitive units images pattern 2 at the 1st millisecond, and the third group of photosensitive units images pattern 3 at the 2nd millisecond.
  • the next round of three-group imaging starts at the 10th millisecond, the round after that at the 20th millisecond, and so on.
  • in this case, by introducing three groups of coaxial binoculars, it takes only 3 ms to obtain the three sets (6 frames) of images required to synthesize one depth data image, that is, the object only needs to remain motionless for 3 ms, which greatly improves the ability to capture moving objects, while the frame rate can be kept close to 100 frames/s (in this example, shooting 100 frames takes 1003 ms, or 1.003 seconds).
  • thus, by multiplexing coaxial sub-image sensors, the frame rate of depth data based on multi-frame synthesis can be multiplied, and the imaging time of each depth frame can be shortened.
  • preferably, the number of coaxial binocular groups can equal the number of image sets projected by the projection device, so that the compositing time of each depth frame is related only to a multiple of the exposure time rather than to the frame interval of the sensor (provided the frame interval is greater than the exposure time × the number of coaxial groups).
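  • The timing logic of FIGS. 8-9 can be summarized with the following sketch, which computes how long the subject must stay still for one depth frame and the steady-state depth frame rate; the 1 ms exposure, 10 ms frame interval and 3 patterns per depth frame are the example values used above, and the scheduling rule is a simplified model.

```python
def coaxial_timing(n_groups, patterns_per_depth_frame=3,
                   exposure_ms=1.0, frame_interval_ms=10.0):
    """Sketch of the coaxial multi-group schedule in FIGS. 8-9.

    Successive patterns are imaged by different sub-sensor groups back to back
    (one exposure apart); a group may only re-image after its own frame
    interval. Returns (time to capture one depth frame, steady-state depth fps).
    Assumes frame_interval_ms > exposure_ms * n_groups, as stated above.
    """
    t = 0.0
    group_ready = [0.0] * n_groups           # earliest time each group may start
    for k in range(patterns_per_depth_frame):
        g = k % n_groups
        t = max(t, group_ready[g])            # wait for the group's frame interval
        group_ready[g] = t + frame_interval_ms
        t += exposure_ms                      # exposure for this pattern
    capture_ms = t
    depth_fps = 1000.0 * n_groups / (patterns_per_depth_frame * frame_interval_ms)
    return capture_ms, depth_fps

print(coaxial_timing(1))   # ~ (21.0, 33.3): single group, lower part of FIG. 8
print(coaxial_timing(2))   # ~ (11.0, 66.7): two groups, the 11 ms / ~66.6 fps example
print(coaxial_timing(3))   # ~ (3.0, 100.0): three groups, as in FIG. 9
```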
  • note that FIGS. 8 and 9 give an example in which the second sub-image sensor images for 1 ms immediately after the first sub-image sensor has imaged for 1 ms.
  • in practice, the imaging interval of the first and second sub-image sensors also needs to take the driving of the driving device into account. Specifically, if imaging is performed directly during the driving process and there is no requirement on the exact projection angle, the second sub-image sensor can perform its imaging directly after the first sub-image sensor completes imaging, as shown in FIGS. 8 and 9.
  • even if a specific projection angle has to be waited for, the waiting time is relatively short, for example tens of μs.
  • the actual framing rate can also be set as required, for example, to provide users with high, medium and low framing gears.
  • the number of image groups to be shot for each frame is different, for example, two groups of high-grade, four groups of mid-grade, and six groups of low-grade.
  • the motor selection angle and rhythm, structured light projection and imaging time are set accordingly to meet different imaging needs.
  • in addition, different matching windows can be set according to the number of image groups used to generate the single-measurement depth data, thereby improving the imaging accuracy.
  • to allow the sub-image sensors to share part of the optical path, the optical path needs to be designed accordingly.
  • specifically, the first image sensor 720 may include: a lens unit 721 for receiving the incident returning structured light; a beam splitting device 722 for splitting the incident returning structured light into at least a first light beam and a second light beam; a first sub-image sensor 723 for imaging the first light beam; and a second sub-image sensor 724 for imaging the second light beam of the returning structured light corresponding to a different pattern.
  • the beam splitting device 722 is an optical prism, such as a square prism or a triangular prism.
  • for example, the reflected infrared light in the incident light reaches the second sub-image sensor 724, while the unreflected visible light in the incident light propagates in a straight line to the first sub-image sensor 723.
  • the beam splitting device 722 in the form of a prism can split the incident light into two beams whose propagation directions are perpendicular to each other.
  • the first sub-image sensor 723 and the second sub-image sensor 724 may also be vertically arranged so as to receive incident visible light and infrared light beams at a vertical angle, respectively.
  • the components in the incident light need to have the same optical path.
  • the first sub-image sensor 723 and the second sub-image sensor 724 may be arranged at equal distances from the beam splitting area of the beam splitting device 722 .
  • the distance between the two photosensitive units and the beam splitting device 722 can be flexibly adjusted according to the ratio of the refractive index of air to the prism material.
  • Pixel-level alignment (or approximate alignment) between the first sub-image sensor 723 and the second sub-image sensor 724 can be theoretically achieved by making the incident light share most of the optical path and have the same optical path.
  • in consideration of manufacturing tolerances, the actual arrangement of the first sub-image sensor 723 and the second sub-image sensor 724 may not achieve the ideal perpendicular and equidistant conditions, resulting in a deviation between the images of the two.
  • forced software correction can be performed on the manufactured image sensor. For example, by introducing a calibration target and aligning the images of both the first sub-image sensor 723 and the second sub-image sensor 724 with the calibration target, true pixel-level correction is achieved.
  • the pixel-level alignment between the first sub-image sensor 723 and the second sub-image sensor 724 may be precise pixel-level alignment, or may have several pixel differences and achieve alignment through calibration.
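  • As one hedged illustration of such a software correction, the residual offset between the two coaxial sub-sensors can be modeled as an affine transform estimated from matched calibration-target points; the affine model and the function names below are assumptions for illustration, not the patent's specified method.

```python
import numpy as np

def fit_alignment(pts_a, pts_b):
    """Least-squares affine map aligning sub-image sensor B to sub-image sensor A (sketch).

    pts_a, pts_b: (N, 2) pixel coordinates of the same calibration-target
    features seen by the two coaxial sub-sensors. Residual misalignment of a
    few pixels from imperfect mounting is absorbed by this correction.
    """
    n = len(pts_b)
    design = np.hstack([pts_b, np.ones((n, 1))])             # rows of [x  y  1]
    coeffs, *_ = np.linalg.lstsq(design, pts_a, rcond=None)  # (3, 2) affine parameters
    return coeffs

def apply_alignment(coeffs, pts_b):
    """Map sensor-B pixel coordinates into sensor-A coordinates."""
    n = len(pts_b)
    return np.hstack([pts_b, np.ones((n, 1))]) @ coeffs
```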
  • the image sensor 720 of the present invention may be implemented as a separate module.
  • the image sensor 720 may further include a housing for fixing the relative positions of the lens unit, the beam splitting device, and the two photosensitive units.
  • the housing can be combined with the lens unit 721 to form a sealing body, so as to avoid the contamination of the contained devices by the external environment.
  • the image sensor 720 of the present invention may be part of a larger module (eg, a depth data measurement head), and the securing between the various elements is accomplished by the housing of the larger module.
  • the image sensor 720 may further include cables connected to the first sub-image sensor 723 and the second sub-image sensor 724, respectively.
  • the housing then has openings for cable access.
  • the cables may be flexible cables, such as FPC (flexible circuit board) wires.
  • optionally, before entering the first sub-image sensor 723 and the second sub-image sensor 724, the light beam may also pass through a filter to further filter out the influence of light at other wavelengths.
  • the projection device can project infrared laser light, so the filter arranged in the image sensor can be a corresponding infrared transmission unit for transmitting infrared light within a specific wavelength range, for example infrared light with a wavelength of 780-1100 nm.
  • the projection device can also project visible light, such as red laser or blue laser, such as red light at 635 nm or blue light at 450 nm.
  • although the ambient light may also include red or blue light, imaging with a high signal-to-noise ratio can still be performed with the help of the corresponding red or blue light filter.
  • since the sensor's photoelectric conversion efficiency for blue light is better than that for red light, and shorter wavelengths give higher accuracy, the appropriate laser wavelength can be chosen by comprehensively weighing laser cost (blue lasers are usually more expensive), photoelectric conversion efficiency and imaging accuracy.
  • the beam splitting device is a square prism
  • in this case, one side of the filter can be in direct physical contact with the square prism and the other side in physical contact with the photosensitive unit, with the photosensitive unit and the square prism clamped in the housing, which ensures a high degree of invariance in the relative positions of the components.
  • in one embodiment, additional visible-light photosensitive units (not shown) may be arranged in the image sensor to capture the image information of the object under test, so that the data captured by the image sensor contains both the image information and the depth information of the object under test.
  • the visible light sensing unit can be a grayscale sensor or a color sensor. The grayscale sensor only captures the brightness information, and the color sensor can be used to capture the color information of the measured object.
  • the visible light sensing unit can be composed of three-primary-color sensing units, where the three primary colors can be red, green and blue (RGB), or cyan, magenta and yellow (CMY).
  • the second image sensor 730 may also have the same structure.
  • 723 and 733 can be regarded as the first set of binoculars
  • 724 and 734 can be regarded as the second set of binoculars
  • alternatively, 723 and 734 can be regarded as the first group of binoculars, and 724 and 733 as the second group of binoculars.
  • in other embodiments, an optical path conversion device may be used instead of the beam splitting device to change the optical path, delivering the incident returning structured light in turn to the first sub-image sensor and the second sub-image sensor.
  • the first and second image sensors 720 and 730 may each include: a lens unit for receiving the incident return structured light; a light path conversion device for delivering the incident return structured light to at least the first sub-path and the second sub-path Two sub-paths; a first sub-image sensor for imaging the returned structured light on the first sub-path; a second sub-image sensor for imaging the returning structured light corresponding to different patterns on the second sub-path .
  • the optical path conversion device may be a rotating mirror, which may reflect the incident light to the photosensitive unit 723 at the 0th millisecond, reflect the incident light to the photosensitive unit 724 at the 1st millisecond, and so on.
  • the optical path conversion device may also be a device for performing optical path conversion based on other mechanical, chemical or electrical principles.
  • the image sensor may be a rolling-shutter image sensor or a global-shutter image sensor (i.e., one in which all pixels are exposed at the same time).
  • Global sensors can achieve higher frame rates, and rolling sensors can have adjustable dynamic range, so the sensor type to be used can be selected according to the actual application scenario.
  • as mentioned above, the projection device may include a galvanometer that vibrates reciprocally at a predetermined frequency, such as a MEMS galvanometer or a mechanical galvanometer, for scanning and projecting structured light toward the shooting area with a predetermined frequency and range of motion.
  • since the galvanometer can reach a very high vibration frequency, for example 2,000 times per second, the start signal of the MEMS galvanometer cannot be used directly for synchronization (because its delay is unreliable); when synchronization is required (for example, when the rotation angle needs to be known), and considering the phase-vibration characteristics of the micromirror device, a measurement device for real-time measurement of the vibration phase of the galvanometer can be included in the synchronization device.
  • the above measurements may be based on the outgoing light itself.
  • the above-mentioned measurement device may be one or more photosensors (eg, two photodiodes PD), and the two photosensors are arranged in any of the following ways: on different exit paths of the projection device; arranged on different reflection paths within the projection device; and respectively arranged on outgoing and reflection paths inside and outside the projection device.
  • the arrangement of the photoelectric sensor can be reasonably selected so that it does not affect the normal projection of structured light while accurately measuring the phase.
  • the PD can be installed in the projection device, and the instantaneous vibration phase can be determined by measuring the reflection angle when the laser exits the light window.
  • the vibration phase of the MEMS mirror is sinusoidally distributed, one PD can determine the sinusoidal distribution information, and more PDs help to measure the phase more accurately.
  • the PD can also be installed outside the projection device, for example, on a light window, for example, near the edge of the light window to prevent the impact on the projection in the shooting area.
  • phase measurements may also be performed in other manners, such as capacitance measurements.
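  • A hedged sketch of how a single PD trigger time could be turned into a phase estimate is given below; the sinusoidal angle model matches the statement above, but the function, parameter names and the rising-edge assumption are illustrative, not taken from the patent.

```python
import math

def galvo_phase_from_pd(t_cross_s, freq_hz, pd_angle_deg, amp_deg):
    """Estimate the vibration phase of a sinusoidally vibrating galvanometer
    from one photodiode (PD) trigger time (sketch).

    The mirror angle is assumed to follow theta(t) = amp * sin(2*pi*f*t + phi).
    When the beam sweeps past the PD mounted at pd_angle (rising direction
    assumed), sin(2*pi*f*t_cross + phi) = pd_angle / amp, which yields phi;
    theta can then be predicted for any exposure time. A second PD would
    resolve the rising/falling ambiguity and improve accuracy.
    """
    phi = math.asin(pd_angle_deg / amp_deg) - 2.0 * math.pi * freq_hz * t_cross_s
    return phi % (2.0 * math.pi)

def galvo_angle(t_s, freq_hz, amp_deg, phi):
    """Predicted mirror angle at time t given the estimated phase."""
    return amp_deg * math.sin(2.0 * math.pi * freq_hz * t_s + phi)
```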
  • in other embodiments, the projection device may include a unidirectionally rotating mechanical mirror. Accordingly, when synchronization is required, the measurement device included in the synchronization device may be an angle measurer for measuring the rotation angle of the motor of the reflection device in real time.
  • synchronization between projection and exposure is achieved by controlling the exposure of the image sensor.
  • this approach is applicable when the projection angle of the light source is controllable (for example, the angle and rotational speed of a mechanical galvanometer can be controlled by voltage and current), and it is especially useful when the phase and speed of the light-source scanning are not controllable (for example, for MEMS galvanometers or rotating mechanical mirrors). Thus, the MEMS galvanometer can use a PD or capacitance to detect the angle, while the mechanical mirror can realize position detection through voltage detection or photoelectric encoding.
  • the present invention can also be implemented as a depth data computing device, comprising: the depth data measurement head as described above; and a processor connected with the depth data measurement head for binocular Under the scheme, the depth data of the photographed object in the photographing area is determined according to the predetermined relative positions of the first and second image sensors and the set of image frame pairs obtained by imaging the structured light.
  • FIG. 10 shows a schematic flowchart of a method for measuring depth data according to an embodiment of the present invention. The method can be implemented in combination with the structured light projection device, the measuring head and the computing device of the present invention.
  • in step S1010, the speckle-patterned light beam emitted by the light source module is rotationally reflected, i.e., reflected at varying angles.
  • in step S1020, the measured space is imaged at least twice using first and second image sensors whose relative positions are fixed, to obtain at least two sets of images, wherein, in the at least two imagings, the measured space is projected with the different speckle patterns that result from the rotational reflection.
  • in step S1030, depth data is obtained from the at least two sets of images and depth data fusion is performed.
  • in one embodiment, imaging the measured space at least twice using the first and second infrared image sensors with fixed relative positions includes: performing a first imaging using a pair of first sub-image sensors having a predetermined relative positional relationship to acquire a first image frame pair; and performing a second imaging using a pair of second sub-image sensors to acquire a second image frame pair, where one sub-image sensor in each of the first and second sub-image sensor pairs shares at least part of an optical path and forms the first image sensor, the other sub-image sensor in each pair shares at least part of an optical path and forms the second image sensor, and the first and second image frame pairs are used for a single calculation of depth data for the shooting area.
  • the depth data measurement head, the computing device and the measurement method according to the present invention have been described in detail above with reference to the accompanying drawings.
  • the depth measurement solution of the present invention especially uses an improved structured light projection device capable of reflecting structured light generated by the light source module at different angles, thereby enabling faster, more economical and lower failure rate multi-pattern projection.
  • further, the structured light projection device can cooperate with multiple pairs of binocular sensors sharing a light path, thereby further shortening the frame interval and improving the quality of depth fusion data.
  • the speckle measurement solution of the present invention is especially suitable for the depth measurement of continuous planes, for example for material loading/unloading grasping or weld seam detection in shipyards.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of code that contains one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the block may occur out of the order noted in the figures. For example, two blocks in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.
  • each block of the block diagrams and/or flowchart illustrations, and combinations of blocks in the block diagrams and/or flowchart illustrations, can be implemented in dedicated hardware-based systems that perform the specified functions or operations, or can be implemented in a combination of dedicated hardware and computer instructions.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Measurement Of Optical Distance (AREA)

Abstract

A depth data measuring head (600, 700), comprising: a structured light projection device (110, 610, 710) for projecting a textured light beam to a measured space at different projection angles under the driving of a driving device (114, 314), so as to form different textures on an object to be detected; and first and second image sensors (620, 630; 720, 730) respectively arranged on both sides of the structured light projection device (110, 610, 710), having a predetermined relative spatial positional relationship and imaging the measured space at least twice to obtain at least two sets of images with different texture distributions, wherein the at least two sets of images are used to obtain single-measurement depth data of the object to be detected. By using a structured light projection device (110, 610, 710) that reflects the structured light generated by a light source module (I) at different angles, faster, more economical and lower-failure-rate multi-pattern projection can be achieved. Further, the structured light projection device (110, 610, 710) cooperates with multiple pairs of binocular sensors (723, 724; 733, 734) sharing a light path, thereby further shortening the frame interval and improving the quality of depth fusion data. A depth data computing device and a measuring method are also disclosed.

Description

深度数据测量头、计算设备和测量方法 技术领域
本发明涉及三维检测技术领域,尤其涉及一种深度数据测量头、计算设备和测量方法。
背景技术
近年来,三维成像技术得到蓬勃发展。目前,一种基于结构光的双目检测方案能够实时地对物体表面进行三维测量。简单地说,该方案首先向目标对象的表面投射带有编码信息的二维激光纹理图案,例如离散化的散斑图,由位置相对固定的两个图像采集装置对激光纹理进行连续采集,处理单元使用采样窗口对两个图像采集装置同时采集的两幅图像进行采样,确定采样窗口内匹配的激光纹理图案,根据匹配的纹理图案之间的差异,计算出投射在自然体表面的各个激光纹理序列片段的纵深距离,并进一步测量得出待测物表面的三维数据。
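For reference, the depth of a matched laser-texture patch in such a rectified binocular setup follows the standard triangulation relation (a textbook relation, not a formula quoted from this publication), with focal length $f$, baseline $B$ between the two image acquisition devices, and disparity $d$ between the matched sampling windows:

$$ Z = \frac{f\,B}{d}, \qquad d = x_{\text{left}} - x_{\text{right}} $$

The smaller the disparity, the farther the surface point, so matching errors translate directly into depth errors.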
在匹配处理中,采样窗口越大,单次采样中包含的图案信息量也越大,因此也越容易进行匹配,但会导致得到的深度图像颗粒度越大。相应地,采样窗口越小,图像的颗粒度越精细,但误匹配率也越大。虽然可以通过连续拍摄多组不同图像来减小采样窗口,但这又引入了额外的系统复杂度并且会降低帧率。
为此,需要一种改进的深度数据测量方案。
发明内容
本公开要解决的一个技术问题是提供一种深度数据测量方案,该方案通过多角度图案投射,尤其是使用改进的、能够以不同角度反射光源模块产生的结构光的结构光投射装置,从而能够实现更为快速、经济且故障率低的投射。进一步地,该结构光投射装置能够与共有光路的多对双目传感器相配合,从而进一步缩短帧间隔,提升深度融合数据的质量。
根据本公开的第一个方面,提供了一种深度数据测量头,包括:结构光投射装置,用于在驱动装置驱动不同的投射角度下向被测空间投射带有纹理的光束,以在被测空间中的待检测物体上形成不同的纹理;以及分别布置在所述结构光投射装置两侧的第一和第二图像传感器,所述第一和第二两个图像传感器之间具有预定的相对空间位置关系,并在所述反射装置的运动过程中对被测空间进行至少两次成像,以获取具有不同纹理分布的至少两组图像,其中所述至少两组图像用于获取待检测物体的单次测量深度数据。
根据本公开的第二个方面,提供了一种深度数据计算设备,包括:如本发明第一方面所述的深度数据测量头,以及处理器,用于获取所述至少两组图像,并根据所述第一 和第二图像传感器之间的预定相对空间位置关系,确定每组图像中所述纹理的深度数据,将基于所述至少两组图像确定的深度数据融合,得到新的深度数据,作为待检测物体的单次测量深度数据。
根据本公开的第三个方面,提供了一种深度数据测量方法,包括:对光源模块出射的带散斑图案的光束进行不同角度的投射;使用相对位置固定的第一和第二图像传感器对被测空间进行至少两次成像,以获取至少两组图像,其中,在所述至少两次成像中,被测空间被投射了由于所述不同角度的投射而呈现出的不同散斑图案;从所述至少两组图像中求取深度数据并进行深度数据融合。
由此,通过转动反射结构光来提升结构光投射装置的投射灵活度。该装置可以进一步与掩膜光源和同轴双目方案结合,以进一步提升多帧融合方案的精度和成像速度。
附图说明
通过结合附图对本公开示例性实施方式进行更详细的描述,本公开的上述以及其它目的、特征和优势将变得更加明显,其中,在本公开示例性实施方式中,相同的参考标号通常代表相同部件。
图1示出了根据本发明一个实施例的结构光投射装置的组成示意图。
图2示出了本发明改变投射方向的示意图。
图3示出了根据本发明一个实施例的结构光投射装置的透视图。
图4A-B示出了不同透视角度下图3所示结构光投射装置投射结构光的例子。
图5A-B示出了使用图3或图4所示装置投射散斑的例子。
图6示出了根据本发明一个实施例的深度数据测量头的组成框图。
图7示出了根据本发明一个实施例的深度数据测量头的组成示意图。
图8示出了同轴两组成像和单组成像的对比时序图。
图9示出了同轴三组双目成像的时序图。
图10示出了根据本发明一个实施例的深度数据测量方法的示意性流程图。
具体实施方式
下面将参照附图更详细地描述本公开的优选实施方式。虽然附图中显示了本公开的优选实施方式,然而应该理解,可以以各种形式实现本公开而不应被这里阐述的实施方式所限制。相反,提供这些实施方式是为了使本公开更加透彻和完整,并且能够将本公开的范围完整地传达给本领域的技术人员。
如前所述,在双目成像的匹配处理中,采样窗口越大,单次采样中包含的图案信息量也越大,因此也越容易进行匹配,但会导致得到的深度图像颗粒度越大。相应地,采样窗口越小,图像的颗粒度越精细,但误匹配率也越大。因此可以通过连续拍摄多组不同图像来减小采样窗口。
例如,可以通过使用驱动装置驱动光源模块转动,来实现同一光源模块从不同角度进行投射。此时,即便是光源模块投射的图案相同,但由于投射角度不同,因此在 成像装置(图像传感器)的视野中仍然呈现为不同的图案。但由于光源模块本身自重,并且需要布线供电,因此驱动装置的驱动功能受限。另外,基于多帧的深度融合会降低帧率,并且降低动态对象的拍摄性能。
为此,本发明提供一种深度数据测量方案,该方案优选使用改进的、能够以不同角度反射光源模块产生的结构光的结构光投射装置,从而能够实现更为快速、经济且故障率低的多图案投射。进一步地,该结构光投射装置能够与共有光路的多对双目传感器相配合,从而进一步缩短帧间隔,提升深度融合数据的质量。
图1示出了根据本发明一个实施例的结构光投射装置的组成示意图。该结构光投射装置可以在本发明的深度数据测量头中用于结构光投射。
如图所示,结构光投射装置110可以包括位于图示虚线上部的光源模块I以及位于图示虚线下部的转向投射模块II。光源模块用于生成要进行投射的光束,转向投射模块则用于使得光束转向并出射。
具体地,光源模块用于生成并出射带纹理的光束。通常,光源模块不会直接投射发光装置发出的光束,而是会对发光装置投射出的光束进行一定的光学处理,使其呈现想要的分布、亮度或是图案。为此,在不同的实现中,可以采用不同的散斑生成方案,以实现不同的投射效果。
如图1所示,光源模块I可以包括激光发生器111以及衍射光学元件(DOE)112。其中,激光发生器111用于发射激光光束(如图中离开111的单箭头所示)。而布置在激光光束的出射光路上的DOE 112则可对入射的激光进行调制,例如使得入射激光发生衍射并使其被调制成具有特定投射规则的离散光斑(如图中离开112的双箭头所示,用于表示经衍射后的光束具有一定宽度,即,在DOE 112的所在平面上占据一定面积)。
在一个实施例中,激光发生器111可以是激光二极管(LD),例如边发型激光二极管。LD生成的激光可由准直透镜(图中未示出)进行准直,再经由DOE 112进行衍射。在激光发生器111是诸如LD的单束激光发生器的情况下,DOE 112可以具有相对复杂的结构,以将入射的一个激光点衍射成具有多个点的图案(例如具有两千、甚至两万的点),以构成投射的复杂散斑。由于经过准直,因此能够通过衍射进行定点投射,具有工作距离远和功耗小等优点。
在另一个实施例中,激光发生器111还可以实现为VCSEL(垂直腔面发射激光器)。由于VCSEL本身可以包括多颗发光颗粒,因此一个VCSEL芯片本身就能够发出一定的图案,例如由100颗发光颗粒形成的图案。此时,DOE 112仅需简单复制,例如直接复制、交错复制或是旋转复制,以实现如上提及的两千、甚至两万的点的散斑图案。虽然通常无法进行准直,但VCSEL方案可以具有更高的能量效率,并且对DOE的复杂度要求也更低。
在其他实施例中,光源模块I也可以利用激光器和衍射元件之外的其他方案实现。例如,光源模块可以包括泛光光源,用于生成泛光;以及布置在所述泛光光源上的掩膜,用于将所述泛光转换成具有特定投射编码的光斑。
在利用泛光光源加掩膜的实现中,由于掩膜相比于DOE能够被更高精度的设计,因此能够获取更高精细度的散斑图案,并且其信息比值(投射区域内散斑所占面积)相比于在前的DOE方案可以更高,例如高达50%。对于投射散斑的空域解码深度数据求取,更高的信息比值意味着能够在单次投射中获取更多表面积的深度信息,也就意味着获取了更多的信息。但由于被掩膜区域的光能实际上无法被利用,因此掩膜方案的能量转换效率不高。换句话说,利用泛光光源加掩膜的结构光生成方案更适用于近距离的高精度成像场景。
进一步地,转向投射模块是用于将光源模块生成的光束(例如,散斑图案)经过反射以便出射的模块。然而不同于常规的反射模块,本发明的转向投射模块能够驱动反射模块运动,从而使得由反射模块反射的光束能够发生变化。
具体地,如图1所示,转向投射模块II可以包括反射装置113和驱动装置114。其中,布置在所述光束的出射路径上的反射装置113可以用于对入射的所述光束进行反射,以使所述光束出射,而连接至反射装置的驱动装置114则可用于改变反射装置相对于入射光束的角度,由此改变光束的出射方向。
在本发明的实施例中,为了描述方便,可以将光线出射测量头的方向(即,光线离开测量头的方向)约定为z方向,拍摄平面的水平方向为x方向,竖直方向为y方向。为此,在图1以及如下将描述的图2、图3以及图4A-B中,将对光源模块向下(y方向)投射结构光,并由转向投射模块将出射方向改变为z方向(其实是沿z方向有微小角度偏离的方向)放置的例子进行详述。应该理解的是,在其他实施例中,也可以根据实际成像场景的需要,以其他方向布置或是放置本发明的结构光投射装置。
为了方便理解,图2示出了本发明改变投射方向的示意图。由光源发射的光沿着y轴向下投射至反射装置,例如反射镜,并由反射装置反射,以投射至被测空间,并且能够在与z轴垂直的成像平面上形成光斑。反射装置能够沿着x轴轴向转动,例如如图所示在A-B角度范围内转动,由此能够相应地获得在成像平面上在A’-B’范围内移动的光斑。当图1所示的光源模块I投射具有二维分布图案的离散光斑时,虽然光源模块I投射到反射镜的图案是相同的(因为激光发射器111和DOE 112的相对位置固定,且本身不被驱动),但由于反射镜的转动,会使得投射出的图案具有角度偏移,因此在如下所述结合双目成像使用的情况下,图像传感器在不同投射角度下拍摄的图案可以看作是不同的图案。
虽然图2中为了阐明本发明原理而示出了能够在较大的A-B角度范围内转动的反射镜,但应该理解的是,在实际的应用场景中,相同图案投射的角度差异可以很小,例如,1°,由此能够在确保成像图案不相同的同时,保证成像范围的大致重合。
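A back-of-the-envelope sketch of how far the projected pattern shifts for such a small angular change, assuming a flat target plane at distance L perpendicular to the nominal exit direction; the factor of 2 applies if the quoted angle is a mirror rotation (law of reflection) and should be dropped if it already describes the exit beam:

```python
import math

def pattern_shift_mm(angle_deg: float, distance_m: float, angle_is_mirror_rotation: bool = True) -> float:
    """Lateral shift of the projected speckle pattern on a plane at `distance_m`
    when the projection direction changes by `angle_deg` (or when the mirror
    rotates by `angle_deg`, which deflects the beam by twice that amount)."""
    beam_deg = 2 * angle_deg if angle_is_mirror_rotation else angle_deg
    return distance_m * math.tan(math.radians(beam_deg)) * 1000.0

# a 1 degree change at 1 m: ~17.5 mm (beam angle) or ~34.9 mm (mirror rotation)
print(pattern_shift_mm(1.0, 1.0, angle_is_mirror_rotation=False))
print(pattern_shift_mm(1.0, 1.0))
```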
进一步地,图3示出了根据本发明一个实施例的结构光投射装置的透视图。图4A-B示出了不同透视角度下图3所示结构光投射装置投射结构光的例子。
如图所示,激光发生器311可以被设置在壳体(或是固定结构)315的内部,其产生的光束可以经由布置在出射光路上的DOE 312进行衍射而得到具有一定二维分布的衍射图案,衍射图案向上传播至反射镜313,反射镜313反射衍射图案使其(大致)沿着z方向出射,并在垂直于z方向的投射平面(如图4A和图4B所示,其中图4B可以看作是图3和图4A所示装置沿y轴逆时针旋转90°的视图)上形成散斑图案。图4A和图4B的投射平面可以看作是图1所示成像平面的三维图示。
进一步地,驱动装置可以控制反射装置沿轴向运动,其中,从光源模块出射的光束以与轴向(x方向)垂直的方向入射至反射装置,并基于所述反射装置的轴向运动,改变出射方向。
具体如图所示,从驱动装置(例如,电机)314伸出的转轴与反射镜313固定连接,由此在电机工作时,由转轴带动反射镜313进行轴向运动。从而在投射平面上形成具有一定角度偏移的投影图案。这些投射图案对进行投影拍摄的图像传感器而言可以是不同的图案,由此,本发明的投射装置能够通过转动反射结构光(例如,具有二维分布的衍射图案)来实现对“不同”图案的方便投射。
在某些实施例中,转向投射模块可以是振镜,电机可以带动反射镜在一定的范围内往复运动。在另一些实施例中,转向投射模块可以是转镜,电机只能沿轴向进行单向运动。
具体地,转向投射模块可以是以预定频率往复振动的机械振镜,由此能够以预定频率向所述被测区域投射结构光,由此,在投射平面呈现沿着y方向上下运动的二维衍射图案(离散光斑)。由于机械振镜的可控制性,还可以使得转向投射模块在连续运动期间的预定窗口期保持静止。
例如,机械振镜可以具有沿着z方向±1°的运动范围,并且可以具有高达每秒2k的振动频率。在与图像传感器配合使用时,例如在三帧图像合成一帧图像的应用场景下,可以使得机械振镜在运动到-1°、0°和1°时分别暂停运动一段时间,例如在感光单元所需的曝光时间(例如1ms)内保持静止,由此使得在图像传感器的曝光期间,投射的图案保持不变,从而提升成像精度。
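A minimal control-loop sketch of the pause-and-expose behaviour described above. `set_mirror_angle` and `trigger_exposure` are caller-supplied stand-ins for real galvo and sensor drivers (hypothetical names, not an API from this publication), and the 50 µs settling time is an assumption of the same order as the tens of microseconds mentioned later in the text:

```python
import time

def capture_sequence(set_mirror_angle, trigger_exposure, angles_deg=(-1.0, 0.0, 1.0),
                     exposure_s=0.001, settle_s=50e-6):
    """Drive one multi-pattern capture: park the mirror at each target angle,
    let it settle, hold it still for the exposure, then move to the next angle."""
    frames = []
    for a in angles_deg:
        set_mirror_angle(a)                           # command the galvo to the target angle
        time.sleep(settle_s)                          # wait for the mirror to come to rest
        frames.append(trigger_exposure(exposure_s))   # expose while the mirror is static
    return frames

# usage with stub drivers, just to show the call pattern
if __name__ == "__main__":
    frames = capture_sequence(lambda a: print(f"mirror -> {a:+.1f} deg"),
                              lambda t: f"frame({t * 1e3:.1f} ms)")
    print(frames)
```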
而在使用转镜的实施例中,虽然转镜只能够单向旋转,且难以进行变速控制,但是可以通过光电二极管等装置来感测转镜的运动角度,并且在合适的角度内进行结构光投射和动态成像,例如在360°中沿着z方向±1°的区间内进行投射和相应的成像。
图5A-B示出了使用图3或图4所示装置投射散斑的例子。图5A和图5B可以分别看作是机械振镜在运动到1和-1°时投射的图案,也可以看作是在结构光投射装置配合图像传感器形成测量头时,图像传感器的成像。为了方便理解,图中用黑框示出了散斑的投射范围,应该理解的是在实际使用中,并不会投射图中的黑框。另外,由于散斑投射范围内,散斑所占面积不到30%,因此可以认为采用了激光发生器和DOE的投射方案。
由于使用了同一块DOE(更具体地,使用同一块DOE上的同一块图案),因此如图5A和图5B所示,两图投射的图案实际上是“相同”的,但由于投射角度不同,因此图5B的图案沿着y方向下移了一段距离。当测量位于同一位置的物体时,例如图中的灰色方形时,该物体上投射的散斑实际上是不一样的(图5B中的虚线框所示范 围的投射图案才与图5A的投射图案相同),因此可以认为对相同的被测物体投射了不同的图案。并且由于散斑图案的不规律性(或随机性),在不同角度的投射中,方形物体表面的不同位置被投射到散斑,因此在后续的深度数据融合中能够获取更多的表面深度信息。
另外,应该理解的是,虽然图1-图5B给出的例子中,投射的结构光都是沿着z方向投射,并在竖直方向(y方向)上变动的结构光,但在其他实施例中,投射的结构光也可以是沿着z方向投射,并在水平方向(x方向)上变动的结构光,或是在x和y方向上都存在角度变换的投射。换句话说,通过在不同方向上改变投射角度,图5B所示的虚线框可以上下、左右甚至沿斜线移动。
另外,可以根据具体的应用场景来确定用于获取待检测物体的单次测量深度数据所需的图像组数。例如,当每秒能处理的图像组数量确定的情况下,可以通过减少单次测量深度数据所需的图像组数来提升成像帧率;也可以通过增加单次测量深度数据所需的图像组数来提升成像精度。
在利用图4A-B所示的投射结构时,如要减少单次测量深度数据所需的图像组数,可以减少电机变化的角度;如要增加单次测量深度数据所需的图像组数,则可相应增加电机变化的角度。
在一个实施例中,本发明的结构光投射装置例如可以具有五档可调成像速率。在最高成帧率时(即,最高档),可以实现例如100帧/秒的帧率。此时,每次测量深度数据的计算仅使用在同一时刻拍摄的一组图像帧,电机投射角度可以一直保持不变。在次高成帧率时(即,次高档),可以实现例如50帧/秒的帧率。此时,每次测量深度数据的计算使用在不同时刻拍摄的两组图像帧,并且电机投射角度需要相应变化,例如,以0.5°和0°来回变换,并在0.5°和0°上分别进行拍摄,以获取两组图像帧进行融合。类似地,在中间成帧率时(即,中档),则可在0.5°、0°、-0.5°上分别进行拍摄,以获取三组图像帧进行融合,实现33帧/秒的帧率。在次低成帧率时(即,次低档),可以实现例如25帧/秒的帧率。而在最低成帧率时(即,最低档),可以实现例如20帧/秒的帧率。
帧率越高,每帧包含的像素点越少,但适用于需要高速成像的场合;帧率越低,每帧包含的像素点越多,并且由于用于检测的图像数量增加,可以同时降低匹配窗口尺寸,从而获得更高精度,适用于需要高精度成像但对成像帧率要求不太高的场合。在实际使用中,用户可以根据需要选择相应的档位,并使得驱动机构、光源、图像传感器和处理器彼此配合,实现所需的图像拍摄和融合计算。
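The five speed levels above reduce to a simple division between the per-group capture rate and the number of image groups fused into one depth frame; a minimal sketch, assuming the 100 groups-per-second capture rate quoted above and assuming (as the 25 and 20 fps figures imply but the text does not state outright) that the two lowest gears fuse four and five groups:

```python
def depth_frame_rate(groups_per_depth_frame: int, group_capture_rate_hz: float = 100.0) -> float:
    """Depth frame rate when each depth frame fuses N image groups that a single
    binocular pair captures back to back at `group_capture_rate_hz`."""
    return group_capture_rate_hz / groups_per_depth_frame

# gears 1..5 -> 100.0, 50.0, 33.3, 25.0, 20.0 depth frames/s, matching the figures above
for gear, groups in enumerate((1, 2, 3, 4, 5), start=1):
    print(f"gear {gear}: {groups} group(s) per depth frame -> {depth_frame_rate(groups):.1f} fps")
```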
本发明的结构光投射装置可以用于深度数据的测量。图6示出了根据本发明一个实施例的深度数据测量头的组成框图。
如图所示的一种深度数据测量头600包括结构光投射装置610以及具有预定相对位置关系的第一图像传感器620和第二图像传感器630。
结构光投射装置610能够在其包含的驱动装置的驱动下,通过不同的投射角度向被测空间投射带有纹理的光束,以在被测空间中的待检测物体上形成不同的纹理(例如图5A和5B所示的不同纹理)。
分别布置在所述结构光投射装置两侧的第一和第二图像传感器620和630在所述反射装置的运动过程中对被测空间进行至少两次成像,以获取具有不同纹理分布的至少两组图像,其中所述至少两组图像用于获取待检测物体的单次测量深度数据。
在一个实施例中,结构光投射装置610可以包括光源模块,用于生成并出射带纹理的光束;以及驱动装置,用于驱动所述光源模块以不同的投射角度下向被测空间投射带有纹理的光束。换句话说,驱动装置可以直接驱动光源模块进行角度变换。此时,驱动装置例如可以是其上安装有光源模块的音圈马达。
在更为优选的实施例中,结构光投射装置610可以是如上结合图1、图3和图4描述的包括反射装置的结构光投射装置。如下将结合图7详细描述基于此种结构光投射装置的测量头实现。另外,图7的例子中还进一步包括了共用光路的优选成像方案。
图7示出了根据本发明一个实施例的深度数据测量头的组成示意图。出于简明的考虑,图中更为详尽地给出了其中一个图像传感器720的组成实例。
如图7所示,基于双目原理的深度数据测量头700包括投影装置710以及具有预定相对位置关系的第一图像传感器720和第二图像传感器730。该投影装置710可以是如前结合图1和图3-4描述的结构光投射装置。
虽然图中为了方便说明而没有示出,测量头700还可以包括用于包围上述装置的壳体,并且图7所示的连接结构740可以看作是固定上述装置并连接至壳体的机构。在某些实施例中,连接结构740可以是其上包括控制电路的电路板。应该理解的是,在其他实现中,上述装置710-730可以以其他方式连接至壳体,并进行相应的数据传输和指令接收操作。
在此,投影装置710用于向拍摄区域投射结构光,例如经由DOE衍射的相同图案,但由于转向机制的存在,可以使得同一图案沿不同角度投射,从而使得具有预定相对位置关系的第一图像传感器720和第二图像传感器730对所述拍摄区域进行拍摄以获得具有不同图案的一组图像帧对。这一组图像帧对则可用于所述拍摄区域的单次深度数据计算。
具体地,第一图像传感器720和第二图像传感器730可被分别布置在所述结构光投射装置710的两侧。所述第一和第二两个图像传感器之间具有预定的相对空间位置关系,并在所述反射装置的运动过程中对被测空间进行至少两次成像,以获取具有不同纹理分布的至少两组图像,其中所述至少两组图像用于获取待检测物体的单次测量深度数据。
例如,投影装置710可以在其驱动装置的驱动下,以不停变换的投射角度下向被测空间投射带有纹理的光束,以在被测空间中的待检测物体上形成不同的纹理。图像传感器可以在转向过程中进行多次成像,例如在运动到-1°、0°和1°时各自进行一次成像,由此得到包括三对(6帧)的一组图像帧对。这6帧图像共同用于针对拍摄区域的一次深度数据的计算,即,能够计算出一帧的深度图像。
在某些实施例中,投影装置710中的光源模块可以在工作中保持常亮,图像传感器则可在转向装置转动的特定角度或是任意角度(或是预定运动范围内的任意角度)进行多次成像。
例如,在利用可控的机械转镜的场景下,可以根据转镜的旋转角度,进行相应的成像,例如在运动到-1°、0°和1°时各自进行一次成像。也可以根据测得的旋转角度,进行相应的曝光。
在某些实施例中,也可以不对旋转角度和图像传感器的曝光进行同步。例如,当设定机械转镜在±1°内进行转动时,图像传感器可以在任意时刻以任意间隔进行所需的一组图像的拍摄,只要拍摄间隔不与转镜的转动频率完全重合,就可以拍摄到不同的图像。换句话说,由于在双目场景下比较的是同组图像间的差异,而无需与参考图案进行比较,因此具体投射了什么图案并非是需要被规定的。
在另一些实施例中,投影装置710中的光源模块可以与图像传感器的曝光同步。此时,测量头700还可以包括控制器,用于控制所述光源模块在所述第一和第二图像传感器曝光时同步点亮。
另外,控制器还可以用于控制所述驱动装置在所述第一和第二图像传感器的曝光期间保持静止。由此,相比于驱动装置运动中的成像,能够得到更为清晰的投影图案,例如,离散光斑。
在本发明的某些实施例中,第一和第二图像传感器可以是常规的图像传感器。但在其他实施例中,所述第一和第二图像传感器各自包括至少共用部分光路的至少两个子图像传感器,所述至少两个子图像传感器各自用于进行所述至少两次成像中的一次成像。
在现有技术中,第一图像传感器和第二图像传感器各自仅包括一块感光单元,并且每一块感光单元分别进行三次成像来获取三对(6帧)的一组图像帧对,在本发明中,第一和第二图像传感器则各自包括至少共用部分光路的至少两个子图像传感器,所述至少两个子图像传感器分别用于对所述投影装置相继投射的不同图案的结构光进行成像。
图7示出了第一和第二图像传感器各自包括两个子图像传感器(感光单元)的例子。如图所示,第一图像传感器720包括子图像传感器723和724,第二图像传感器730则包括子图像传感器733和734。在这其中,子图像传感器723和724共用光路直到分束装置722的分束面,并且与上述分束区域相距的距离相等。同样地,子图像传感器733和734共用光路直到分束装置732的分束面,并且与上述分束区域相距的距离相等。换句话说,本发明引入了彼此同轴的多组双目结构。在此,可以将位于不同图像传感器壳体内的子图像传感器723和733看作是第一组图像传感器(第一组双目),用于在一个投射角度下对结构光进行成像。随后,可以将被看作是第二组图像传感器(第二组双目)的子图像传感器724和734用于对另一个投射角度下的结构光进行成像。换句话说,此时可以看作分别与723和733同轴的子图像传感器724和734在原地(即,具有等效光路),代替723和733进行了后一幅图案结构光的成像。由此,相邻两帧的成像间隔就可以不依赖于每个图像传感器的帧间隔,而以更小的间隔进行成像。
为此,测量头700还可以包括:同步装置,用于在所述投影装置以小于所述子图像传感器的帧成像间隔的第一间隔投射至少两个不同图案的结构光的同时,使得第一和第二图像传感器720和730各自包括至少两个子图像传感器同步地以所述第一间隔相继分别对所述至少两个不同图案的结构光进行成像。相应地,每一个子图像传感器仍然以不小于所述子图像传感器的帧成像间隔的第二间隔进行自身的下一帧成像(例如,就以本身的帧间隔成像),并且上述成像操作能够在同步装置的同步下与所述投影装置的投射同步。
图8示出了同轴两组成像和单组成像的对比时序图。这里为了方便说明,可以设每一个感光单元(子图像传感器)的帧率为100帧/s,则其帧间隔为10ms(例如,最小帧间隔为10ms),并且可以设每个感光单元所需的曝光时间为1ms。
如果第一和第二图像传感器720和730是仅包括单个感光单元的常规图像传感器,在要利用图1所示的三幅图案(对应于获取的六幅图像)进行深度数据计算时,则如图8下部所示,需要在第0、第10和第20毫秒处进行三次成像。为此,合成每一幅深度数据图像需要拍摄对象持续21ms保持不动(因此更难以拍摄运动对象),并且帧率也从100帧/s降至33.3帧/s。
相比之下,如果第一和第二图像传感器720和730是包括两个感光单元(例如,第一和第二图像传感器720和730各自包括子图像传感器723和724,以及子图像传感器733和734)的本发明的图像传感器,在要利用三幅图案进行深度数据计算时,则如图8上部所示,第一组感光单元在第0毫秒处进行针对图案1(例如,第一投射角度下的图案)的成像,紧接着第二组感光单元在第1毫秒处就进行针对图案2(例如,第二投射角度下的图案)的成像,随后在间隔10ms之后,第一组感光单元在第10毫秒处进行针对图案3(例如,第三投射角度下的图案)的成像,这样就完成一副深度数据图像所需的三次成像。随后,在第11毫秒,第二组感光单元就能开始下一轮针对图案1的成像。在第20毫秒,第一组感光单元进行针对图案2的成像。在第21毫秒,第二组感光单元再进行针对图案3的成像。这样,不同组感光单元成像的间隔仅需间隔成像所需时间(例如,1ms),同一组感光单元的再次成像间隔则仍然遵循帧率对应的最小帧间隔时间(例如,10ms)。此时,通过引入两组同轴双目,合成每一幅深度数据图像仅需要拍摄对象持续11ms保持不动(因此更易于拍摄运动对象),并且帧率能保持在接近66.6帧/s。
虽然结合图7和图8描述具有两组同轴(同光轴)感光单元的例子,但在其他实施例中,第一和第二图像传感器各自还可以包括更多个感光单元。图9示出了同轴三组双目成像的时序图。此时,第一和第二图像传感器各自可以包括同轴的三个感光单元(子图像传感器)。为此,如图9所示,第一组感光单元在第0毫秒处进行针对图案1的成像,紧接着第二组感光单元在第1毫秒处就进行针对图案2的成像,紧接着第三组感光单元在第2毫秒处就进行针对图案3的成像。随后,在第10毫秒开始下一轮的三组成像,在第20毫秒开始再下一轮的三组成像,并以此类推。此时,通过引入三组同轴双目,仅需3ms就可获取合成一幅深度数据图像所需的三组(6帧)图像,即拍摄对象只需要持续3ms保持不动,因此大大提升了针对运动对象的拍摄水平,并 且帧率能保持在接近100帧/s(在此例中,拍摄100帧需要1003ms,即1.003秒)。
由此,应该理解的是,仅通过引入额外的一组同轴双目结构(或单目结构),就可以将基于多帧合成的深度数据帧率提升一倍,并缩短每一帧的成像时间。理论上,可以布置与投射装置投射图像数量相同组数的同轴双目结构,由此使得每一深度帧的成帧时间与传感器的帧间隔无关,仅与曝光时间的倍数相关(在帧间隔大于曝光时间×同轴结构组数的情况下)。例如,在基于四幅图案合成深度帧的情况下,如果是使用如图7所示的两组同轴双目,则获取四帧的成像时间微涨至12ms,但帧率则跌至接近50帧/s。但如果使用四组同轴双目,则获取四帧的成像时间仅为4ms,并且帧率仍然保持为接近100帧/s。但过多的引入同轴结构会增加图像传感器的构造难度,为此需要在成本、可行性和成像速度上进行折衷考虑。
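The frame-interval arithmetic in the two timing examples above can be reproduced with a small helper; the 10 ms frame interval, 1 ms exposure and 100 fps sensor rate are the figures assumed in those examples, and mirror settling time (a few tens of µs, per the next paragraph) is ignored:

```python
import math

def capture_window_ms(patterns: int, coaxial_groups: int,
                      frame_interval_ms: float = 10.0, exposure_ms: float = 1.0) -> float:
    """Time the scene must stay still to collect all patterns of one depth frame.
    Consecutive coaxial groups can expose back to back (exposure_ms apart);
    re-using the same group again must wait out its frame interval."""
    batches = math.ceil(patterns / coaxial_groups)            # how many times each group is reused
    last_batch = patterns - (batches - 1) * coaxial_groups    # patterns captured in the final batch
    return (batches - 1) * frame_interval_ms + last_batch * exposure_ms

def depth_fps(patterns: int, coaxial_groups: int, sensor_fps: float = 100.0) -> float:
    """Steady-state depth frame rate: each coaxial group delivers sensor_fps image sets per second."""
    return sensor_fps * min(coaxial_groups, patterns) / patterns

for p, g in [(3, 1), (3, 2), (3, 3), (4, 2), (4, 4)]:
    print(f"{p} patterns, {g} coaxial group(s): "
          f"still for {capture_window_ms(p, g):.0f} ms, ~{depth_fps(p, g):.1f} depth fps")
# reproduces the figures above: 21 ms / 33.3 fps, 11 ms / 66.7 fps, 3 ms / 100 fps,
# 12 ms / 50 fps, 4 ms / 100 fps
```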
另外,应该理解的是,图8和图9为了说明同轴成像的性能而给出了在第一子图像传感器723成像1ms之后,第二子图像传感器724立即成像1ms的例子。然而在实际应用中,第一和第二子图像传感器的成像间隔还需要考虑驱动装置的驱动。具体地,如果是在驱动过程中直接进行成像并且对投射的确切角度没有要求,则可以在第一子图像传感器完成成像之后,直接进行如图8和图9所示的第二子图像传感器的成像。但如果对投射的确切角度有要求,或者需要驱动装置在曝光期间保持静止,则在第一子图像传感器完成成像之后,需要等待驱动装置运动到合适的位置(和/或变为合适的运动状态,例如,完全静止),再进行第二子图像传感器的曝光。由于转镜和振镜的运动速度很快,因此等待时间相对较短,例如几十μs。
在结合多组同轴双目结构进行成像的情况下,同样可以根据需要对实际的成帧速率进行设置,例如,为用户提供高中低的成帧档位。在不同的档位中,每一帧需要拍摄的图像组数不同,例如,高档两组,中档四组,低档六组。当用户选择了相应的档位,则对电机选择角度和节律、结构光投射和成像时刻进行相应的设置,以满足不同的成像需求。另外,在处理器进行计算时,还可以根据生成单次深度数据所包括的图像组数,设定不同的匹配窗口,例如,低档六组的左右图像匹配窗口可以小于高档两组的匹配窗口,从而提升成像精度。
为了实现同一图像传感器内不同感光单元的同轴配置,需要对光路进行设计。在图7的例子中,示出了基于分束实现的同轴布置。此时,以第一图像传感器720为例,可以包括:镜片单元721,用于接收入射的返回结构光;分束装置722,用于将入射的返回结构光分成至少第一光束和第二光束;第一子图像传感器723,用于对第一光束进行成像;第二子图像传感器724,用于对对应于不同图案的返回结构光的第二光束进行成像。
在一个实施例中,分束装置722是光学棱镜,例如四方棱镜或三棱镜。由此,入射光中经反射的红外光到达第二子图像传感器724,入射光中未经反射的可见光则可进行直线传播至第一子图像传感器723。
如图所示,采用棱镜形式的分束装置722可以将入射光分成传播方向互相垂直的两束光束。相应地,第一子图像传感器723和第二子图像传感器724也可以垂直布置, 以便各自以垂直角度接收入射的可见光和红外光光束。
为了消除视差并实现像素级对齐,需要入射光中的成分具有相同的光程。为此,在使用四分棱镜作为分束装置722的情况下,可以将第一子图像传感器723和第二子图像传感器724布置在与分束装置722的分束区域相距相等的距离处。而在使用三棱镜作为分束装置722的情况下,则可以根据空气与棱镜材料的折射率之比,灵活调整两个感光单元与分束装置722,尤其是与分束区域的距离。
第一子图像传感器723和第二子图像传感器724之间的像素级对齐(或近似对齐)可以通过使得入射光共享大部分光路并具有相同的光程来理论实现。但在图像传感器的实际制造过程中,会因为第一子图像传感器723和第二子图像传感器724的实际布置无法呈现理想的垂直和等距状况而造成两者成像之间的偏差。这时,可以对制造好的图像传感器进行强制软件矫正。例如,通过引入标定靶并使得第一子图像传感器723和第二子图像传感器724的成像都与标定靶对齐,从而实现真正的像素级矫正。换句话说,第一子图像传感器723和第二子图像传感器724之间的像素级对齐可以是精确像素级对齐,也可以是具有若干像素差并通过标定实现对齐。
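One illustrative way to perform the software correction against a calibration target described above, using a planar chessboard target and OpenCV; the target type, the homography model and the specific function choices are assumptions made for this sketch, not steps taken from this publication:

```python
import cv2
import numpy as np

def align_sub_sensor(img_ref: np.ndarray, img_other: np.ndarray,
                     pattern_size=(9, 6)) -> np.ndarray:
    """Estimate a homography between the two coaxial sub-image sensors from a
    planar calibration target and warp the second image onto the first, so the
    residual pixel offset left by imperfect mechanical placement is removed."""
    ok1, corners_ref = cv2.findChessboardCorners(img_ref, pattern_size)
    ok2, corners_other = cv2.findChessboardCorners(img_other, pattern_size)
    if not (ok1 and ok2):
        raise RuntimeError("calibration target not found in both sub-sensor views")
    H, _ = cv2.findHomography(corners_other, corners_ref, cv2.RANSAC)
    return cv2.warpPerspective(img_other, H, img_ref.shape[::-1])
```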
如图所示,本发明的图像传感器720可以实现为单独的模块。为此,该图像传感器720还可以包括壳体,用于固定镜片单元、分束装置、和两个感光单元的相对位置。优选地,壳体可以结合镜片单元721形成密封体,以避免外界环境对所含器件的污染。在其他实施例中,本发明的图像传感器720可以是更大的模块(例如,深度数据测量头)的一部分,并且由该更大模块的壳体实现各元件之间的固定。
优选地,图像传感器720还可以包括分别连接至第一子图像传感器723和第二子图像传感器724的线缆。壳体则具有用于线缆接入的开口。在一个实施例中,线缆可以是柔性线缆,例如FPC(柔性电路板)线。
在一个实施例中,光束在入射第一子图像传感器723和第二子图像传感器724之前,还可以经过滤光片,以进一步滤除其他波长的光的影响。在一个实施例中,投射装置可以投射红外激光,因此图像传感器中布置的滤光片可以是相应的红外光透射单元,用于透过特定频率范围红外光,例如本发明中使用波长为780-1100nm的红外光。在其他实施例中,投射装置也可以投射可见光,例如投射红色激光或是蓝色激光,例如635nm的红光或者450nm的蓝光。虽然环境光中可能也包括红光或是蓝光,但是由于曝光时间短且激光瞬时光强大,因此也能够在对应的投射红光或是蓝光的滤光片的帮助下进行高信噪比的成像。另外,由于传感器对于蓝光有着优于红光的光电转换效率,并且波段越短,精度越高,因此可以在综合考虑激光器成本(蓝光激光器通常成本较高)、光电转换效率和成像精度的情况下,选择合适波长的激光器。
优选地,在分束装置是四方棱镜的情况下,滤光片的一侧可以直接与四方棱镜物理接触,另一侧与感光单元物理接触,而感光单元和四方棱镜则卡接在壳体内,由此确保各器件相对位置的高度不变性。
在某些实施例中,尤其是在第一和第二子图像传感器是用于接收投射的红外图案的红外光传感器的情况下,图像传感器中还可以布置额外的可见光感光单元(图中未 示出)用来捕获被测物体的图像信息,从而使得图像传感器捕获的图像中既包含被测物体的图像信息又包含深度信息。可见光感应单元可以是灰度传感器,或是彩色传感器。其中灰度传感器仅捕获亮度信息,彩色传感器则可用于捕获被测物体的色彩信息,此时可见光感应单元可由三原色感应单元组成,其中三原色可以是红绿蓝三原色(RGB)也可以是青红黄三原色(CMY)。相比于深度数据,可见光图像能够更好地分辨目标对象的边缘(特别是在目标对象与背景环境颜色和/或亮度相差明显的情况下),由此结合可见光图像能够更全面地获取目标对象的三维信息。
应该理解的是,虽然以上基于图7具体描述了第一图像传感器720的结构,但第二图像传感器730也可以具有相同的结构。另外,应该理解的是,可以将723和733看作是第一组双目,724和734看作是第二组双目,但也可以将723和734看作第一组双目,724和733看作第二组双目,只要在相应的图案入射后接通成像即可。
除了图7所示的分束装置之外,在不同的实施例中,还可以包括光路变换装置的其他实现,用于改变光路以将入射的返回结构光输送到第一子图像传感器和第二子图像传感器。
在如图7所示利用分束实现光路共享的情况下,由于每一个感光单元获取的光亮会减少,为此可以通过增加投射亮度或是扩大入射光圈的方法来确保成像的敏感性或是有效距离范围。
作为替换,还可以基于光路转换来实现光路共享。此时,第一和第二图像传感器720和730可以各自包括:镜片单元,用于接收入射的返回结构光;光路转换装置,用于将入射的返回结构光输送至至少第一子路径和第二子路径;第一子图像传感器,用于在第一子路径上对返回结构光进行成像;第二子图像传感器,用于在第二子路径上对对应于不同图案的返回结构光进行成像。在一个实施例中,光路转换装置可以是转镜,其可以在例如第0毫秒将入射光反射至感光单元723,在第1毫秒将入射光反射至感光单元724等等。在其他实施例中,光路转换装置也可以是基于其他机械、化学或电学原理进行光路转换的装置。
在投影装置进行全图案投影(而非扫描式投影)的情况下,图像传感器可以是卷帘式图像传感器,也可以是全局图像传感器(即,所有像素同时进行成像)。全局传感器能够实现更高的帧率,卷帘传感器则能够具有可调的动态范围,因此可以根据实际应用场景选择要使用的传感器类型。
如前所述,投影装置可以包括以预定频率往复振动的振镜,例如MEMS振镜或是机械振镜,用于以预定频率和运动范围向所述拍摄区域扫描投射结构光。由于振镜可以实现极高的振动频率,例如,每秒2k,因此使得无法直接利用MEMS振镜的启动信号来进行同步(因为延时不可靠),因此在需要进行同步(例如,获知转动角度)的场景中,考虑到微镜器件相位振动的特性,可以在同步装置中包括用于实时测量振镜的振动相位的测量装置。
在一个实施例中,上述测量可以基于出射光本身。于是,上述测量装置可以是一个或多个光电传感器(例如,两个光电二极管PD),并且所述两个光电传感器以如下 任一方式布置:布置在所述投影装置的不同出射路径上;布置在所述投影装置内的不同反射路径上;以及分别布置在所述投影装置内外的出射和反射路径上。可以合理选择光电传感器的布置方式,以使其在准确测量相位的同时,不对结构光的正常投影产生影响。可将PD安装在投影装置内,通过测量激光出射光窗时的反射角来确定瞬时的振动相位。由于MEMS振镜的振动相位成正弦分布,因此一个PD就能确定正弦分布信息,而更多的PD有助于更准确的测量相位。在其他实施例中,也可以将PD安装在投影装置外,例如,安装在光窗上,例如靠近光窗边缘以防止对拍摄区域内投影的影响。在其他实施例中,还可以利用其他方式进行相位测量,例如进行电容测量。
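A small numerical sketch of recovering the galvo phase from a photodiode crossing under the sinusoidal-motion assumption stated above; the amplitude, frequency and crossing-angle values a caller would pass in are illustrative, and the two returned candidates reflect the ambiguity that, as the text notes, a second PD helps resolve:

```python
import math

def mirror_angle(t_s: float, amplitude_deg: float, freq_hz: float, phase_rad: float) -> float:
    """Instantaneous mirror angle under the sinusoidal assumption
    theta(t) = A * sin(2*pi*f*t + phi)."""
    return amplitude_deg * math.sin(2 * math.pi * freq_hz * t_s + phase_rad)

def phase_from_pd_crossing(t_cross_s: float, pd_angle_deg: float,
                           amplitude_deg: float, freq_hz: float):
    """Return the two phase candidates consistent with one photodiode trigger at
    a known crossing angle; a second PD or the crossing direction disambiguates."""
    s = math.asin(pd_angle_deg / amplitude_deg)
    base = 2 * math.pi * freq_hz * t_cross_s
    return ((s - base) % (2 * math.pi), (math.pi - s - base) % (2 * math.pi))
```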
但是在其他实施例中,投影装置可以包括单向转动的机械转镜,相应地,在需要进行同步时,同步装置所包括的测量装置可以是用于实时测量所述反射装置的电机旋转角度的角测量器。
在如上的实施例中,投射与曝光之间的同步通过控制图像传感器的曝光来实现。这可以用于光源投射角度可控的情况下(例如,可以通过电压和电流来控制机械振镜的角度和转速),并且尤其适用于光源扫描的相位和速度不可控(例如,对于MEMS振镜或机械转镜)的情况。于是,MEMS振镜可以使用PD或者电容来检测角度,机械转镜也可以通过电压检测或光电编码来实现位置检测。
根据本发明的另一个实施例,还可以实现为一种深度数据计算设备,包括:如上所述的深度数据测量头;以及与所述深度数据测量头相连接的处理器,用于在双目方案下根据第一和第二图像传感器的预定相对位置及其对所述结构光成像得到的所述一组图像帧对,确定所述拍摄区域中拍摄对象的深度数据。
本发明还可以实现为一种深度数据测量方法。图10示出了根据本发明一个实施例的深度数据测量方法的示意性流程图。该方法可以结合本发明的结构光投射装置、测量头和计算装置来实现。
在步骤S1010,对光源模块出射的带散斑图案的光束进行转动反射。
在步骤S1020,使用相对位置固定的第一和第二图像传感器对被测空间进行至少两次成像,以获取至少两组图像,其中,在所述至少两次成像中,被测空间被投射了由于所述转动反射而呈现出的不同散斑图案。在步骤S1030,从所述至少两组图像中求取深度数据并进行深度数据融合。
在使用同轴双目结构的情况下,使用相对位置固定的第一和第二红外光图像传感器对被测空间进行至少两次成像包括:使用具有预定相对位置关系的第一子图像传感器对进行第一次成像以获取第一图像帧对;使用第二子图像传感器对进行第二次成像以获取第二图像帧对,其中,第一和第二子图像传感器对中各自的一个子图像传感器共用至少部分光路并组成第一图像传感器,第一和第二子图像传感器对中各自的另一个子图像传感器共用至少部分光路并组成第二图像传感器,所述第一和第二图像帧对用于所述拍摄区域的单次深度数据计算。
上文中已经参考附图详细描述了根据本发明的深度数据测量头、计算设备和测量方法。本发明的深度测量方案尤其使用改进的、能够以不同角度反射光源模块产生的 结构光的结构光投射装置,从而能够实现更为快速、经济且故障率低的多图案投射。进一步地,该结构光投射装置能够与共有光路的多对双目传感器相配合,从而进一步缩短帧间隔,提升深度融合数据的质量。
本发明的散斑测量方案尤其适用于对连续平面的深度测量,例如,用于上下料抓取或是船厂的焊缝检测等。
附图中的流程图和框图显示了根据本发明的多个实施例的系统和方法的可能实现的体系架构、功能和操作。在这点上,流程图或框图中的每个方框可以代表一个模块、程序段或代码的一部分,所述模块、程序段或代码的一部分包含一个或多个用于实现规定的逻辑功能的可执行指令。也应当注意,在有些作为替换的实现中,方框中所标记的功能也可以以不同于附图中所标记的顺序发生。例如,两个连续的方框实际上可以基本并行地执行,它们有时也可以按相反的顺序执行,这依所涉及的功能而定。也要注意的是,框图和/或流程图中的每个方框、以及框图和/或流程图中的方框的组合,可以用执行规定的功能或操作的专用的基于硬件的系统来实现,或者可以用专用硬件与计算机指令的组合来实现。
以上已经描述了本发明的各实施例,上述说明是示例性的,并非穷尽性的,并且也不限于所披露的各实施例。在不偏离所说明的各实施例的范围和精神的情况下,对于本技术领域的普通技术人员来说许多修改和变更都是显而易见的。本文中所用术语的选择,旨在最好地解释各实施例的原理、实际应用或对市场中的技术的改进,或者使本技术领域的其它普通技术人员能理解本文披露的各实施例。

Claims (19)

  1. 一种深度数据测量头,包括:
    结构光投射装置,用于在驱动装置驱动不同的投射角度下向被测空间投射带有纹理的光束,以在被测空间中的待检测物体上形成不同的纹理;以及
    分别布置在所述结构光投射装置两侧的第一和第二图像传感器,所述第一和第二两个图像传感器之间具有预定的相对空间位置关系,用于对被测空间进行至少两次成像,以获取具有不同纹理分布的至少两组图像,其中所述至少两组图像用于获取待检测物体的单次测量深度数据。
  2. 如权利要求1所述的测量头,其中,所述结构光投射装置包括:
    光源模块,用于生成并出射带纹理的光束;
    驱动装置,用于驱动所述光源模块以不同的投射角度下向被测空间投射带有纹理的光束。
  3. 如权利要求1所述的测量头,其中,所述结构光投射装置包括:
    光源模块,用于生成并出射带纹理的光束;
    转向投射模块,包括:
    布置在所述光束的出射路径上的反射装置,用于对入射的所述光束进行反射,以使所述光束出射;
    连接至所述反射装置的驱动装置,用于改变所述反射装置相对于入射的所述光束的角度,以改变所述光束的出射方向。
  4. 如权利要求3所述的测量头,其中,所述转向投射模块是机械振镜,并且所述反射装置沿轴向往复运动;或者
    所述转向投射模块是机械转镜,并且所述反射装置沿轴向进行单向运动。
  5. 如权利要求1所述的测量头,其中,所述驱动装置在连续运动期 间的预定窗口期保持静止。
  6. 如权利要求1所述的测量头,其中,所述光源模块包括:
    激光发生器,用于发射激光光束;
    布置在所述激光光束的出射光路上的衍射光学元件,用于使得入射激光发生衍射并使其被调制成具有特定投射规则的离散光斑;或者
    泛光光源,用于生成泛光;以及
    布置在所述泛光光源上的掩膜,用于将所述泛光转换成具有特定投射编码的光斑。
  7. 如权利要求1所述的深度数据测量头,还包括:
    控制器,用于控制所述光源模块在所述第一和第二图像传感器曝光时同步点亮。
  8. 如权利要求1所述的深度数据测量头,其中,所述第一和第二图像传感器各自包括至少共用部分光路的至少两个子图像传感器,所述至少两个子图像传感器各自用于进行所述至少两次成像中的一次成像。
  9. 如权利要求8所述的深度数据测量头,包括:
    控制器,使得所述第一和第二图像传感器各自包括至少两个子图像传感器同步地以所述第一间隔相继进行成像,所述第一间隔小于所述子图像传感器的最小帧成像间隔。
  10. 如权利要求9所述的深度数据测量头,其中,所述控制器用于:
    使得每个子图像传感器以不小于所述子图像传感器的最小帧成像间隔的第二间隔进行自身的下一帧成像。
  11. 如权利要求8所述的深度数据测量头,其中,所述第一和第二图像传感器各自包括:
    镜片单元,用于接收入射的返回结构光;
    光路变换装置,用于改变光路以将入射的返回结构光输送到第一子图像传感器和第二子图像传感器;
    第一子图像传感器和第二子图像传感器,用于在不同时刻对不同的图案进行成像。
  12. 如权利要求11所述的深度数据测量头,其中,光路变换装置包括:
    分束装置,用于将入射的返回结构光分成至少第一光束和第二光束,
    其中,第一子图像传感器,用于对第一光束进行成像;
    第二子图像传感器,用于对对应于不同图案的返回结构光的第二光束进行成像。
  13. 如权利要求11所述的深度数据测量头,其中,光路变换装置包括:
    光路转换装置,用于将入射的返回结构光输送到至少第一子路径和第二子路径,
    其中,第一子图像传感器,用于在第一子路径上对返回结构光进行成像;
    第二子图像传感器,用于在第二子路径上对对应于不同图案的返回结构光进行成像。
  14. 如权利要求11所述的深度数据测量头,其中,所述第一子图像传感器和所述第二子图像传感器与所述分束装置的分束区域或所述光路转换装置的光路转换区域相距相等的距离。
  15. 如权利要求11所述的深度数据测量头,其中,所述第一和第二图像传感器各自包括的至少共用部分光路的至少两个子图像传感器是红外光传感器;和/或
    所述第一和第二图像传感器各自包括:
    可见光图像传感器,用于对入射的结构光进行成像,其中所述可见光传感器与第一和/或第二图像子传感器共用至少部分光路。
  16. 如权利要求1所述的深度数据测量头,其中,基于单次测量深度数据所需的不同图像组数,设置相应的驱动装置的投射角度组合、光束的投射时刻以及第一和第二图像传感器的成像时刻。
  17. 一种深度数据计算设备,包括:
    如权利要求1-16中任一项所述的深度数据测量头,以及
    处理器,用于获取所述至少两组图像,并根据所述第一和第二图像传感器之间的预定相对空间位置关系,确定每组图像中所述纹理的深度数据,将基于所述至少两组图像确定的深度数据融合,得到新的深度数据,作为待检测物体的单次测量深度数据。
  18. 一种深度数据测量方法,包括:
    对光源模块出射的带散斑图案的光束进行不同角度的投射;
    使用相对位置固定的第一和第二图像传感器对被测空间进行至少两次成像,以获取至少两组图像,其中,在所述至少两次成像中,被测空间被投射了由于所述不同角度的投射而呈现出的不同散斑图案;
    从所述至少两组图像中求取深度数据并进行深度数据融合。
  19. 如权利要求18所述的方法,其中,使用相对位置固定的第一和第二红外光图像传感器对被测空间进行至少两次成像包括:
    使用具有预定相对位置关系的第一子图像传感器对进行第一次成像以获取第一图像帧对;
    使用第二子图像传感器对进行第二次成像以获取第二图像帧对,其中,第一和第二子图像传感器对中各自的一个子图像传感器共用至少部分光路并组成第一图像传感器,第一和第二子图像传感器对中各自的另一个子图像传感器共用至少部分光路并组成第二图像传感器,所述第一和第二图像帧对用于所述拍摄区域的单次深度数据计算。
PCT/CN2021/137790 2021-04-20 2021-12-14 深度数据测量头、计算设备和测量方法 WO2022222496A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US18/284,690 US20240167811A1 (en) 2021-04-20 2021-12-14 Depth data measuring head, computing device and measurement method
EP21937718.1A EP4328541A1 (en) 2021-04-20 2021-12-14 Depth data measuring head, computing device and measurement method
JP2023561894A JP2024513936A (ja) 2021-04-20 2021-12-14 深度データ測定ヘッド、計算装置及び測定方法

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110423509.7 2021-04-20
CN202110423509.7A CN115218820A (zh) 2021-04-20 2021-04-20 结构光投射装置、深度数据测量头、计算设备和测量方法

Publications (1)

Publication Number Publication Date
WO2022222496A1 true WO2022222496A1 (zh) 2022-10-27

Family

ID=83604661

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/137790 WO2022222496A1 (zh) 2021-04-20 2021-12-14 深度数据测量头、计算设备和测量方法

Country Status (5)

Country Link
US (1) US20240167811A1 (zh)
EP (1) EP4328541A1 (zh)
JP (1) JP2024513936A (zh)
CN (1) CN115218820A (zh)
WO (1) WO2022222496A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116593282A (zh) * 2023-07-14 2023-08-15 四川名人居门窗有限公司 一种基于结构光的玻璃抗冲击反应测试系统及方法

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3682553A (en) * 1968-09-19 1972-08-08 Optics Technology Inc Apparatus for acquiring and laying real time 3-d information
CN107369156A (zh) * 2017-08-21 2017-11-21 上海图漾信息科技有限公司 深度数据检测系统及其红外编码投影装置
CN108650447A (zh) * 2018-07-06 2018-10-12 上海图漾信息科技有限公司 图像传感器、深度数据测量头及测量系统
WO2019209064A1 (ko) * 2018-04-26 2019-10-31 엘지이노텍 주식회사 카메라 모듈 및 그의 깊이 정보 추출 방법
CN111239729A (zh) * 2020-01-17 2020-06-05 西安交通大学 融合散斑和泛光投射的ToF深度传感器及其测距方法
CN111692987A (zh) * 2019-03-15 2020-09-22 上海图漾信息科技有限公司 深度数据测量头、测量装置和测量方法
CN111721239A (zh) * 2020-07-22 2020-09-29 上海图漾信息科技有限公司 深度数据测量设备和结构光投射装置
CN111829449A (zh) * 2019-04-23 2020-10-27 上海图漾信息科技有限公司 深度数据测量头、测量装置和测量方法
CN212747701U (zh) * 2020-07-22 2021-03-19 上海图漾信息科技有限公司 结构光投射装置和深度数据测量头

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012020380A1 (en) * 2010-08-11 2012-02-16 Primesense Ltd. Scanning projectors and image capture modules for 3d mapping
US10368056B2 (en) * 2015-06-19 2019-07-30 Shanghai Percipio Technology Limited Depth data detection and monitoring apparatus

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3682553A (en) * 1968-09-19 1972-08-08 Optics Technology Inc Apparatus for acquiring and laying real time 3-d information
CN107369156A (zh) * 2017-08-21 2017-11-21 上海图漾信息科技有限公司 深度数据检测系统及其红外编码投影装置
WO2019209064A1 (ko) * 2018-04-26 2019-10-31 엘지이노텍 주식회사 카메라 모듈 및 그의 깊이 정보 추출 방법
CN108650447A (zh) * 2018-07-06 2018-10-12 上海图漾信息科技有限公司 图像传感器、深度数据测量头及测量系统
CN111692987A (zh) * 2019-03-15 2020-09-22 上海图漾信息科技有限公司 深度数据测量头、测量装置和测量方法
CN111829449A (zh) * 2019-04-23 2020-10-27 上海图漾信息科技有限公司 深度数据测量头、测量装置和测量方法
CN111239729A (zh) * 2020-01-17 2020-06-05 西安交通大学 融合散斑和泛光投射的ToF深度传感器及其测距方法
CN111721239A (zh) * 2020-07-22 2020-09-29 上海图漾信息科技有限公司 深度数据测量设备和结构光投射装置
CN212747701U (zh) * 2020-07-22 2021-03-19 上海图漾信息科技有限公司 结构光投射装置和深度数据测量头

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4328541A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116593282A (zh) * 2023-07-14 2023-08-15 四川名人居门窗有限公司 一种基于结构光的玻璃抗冲击反应测试系统及方法
CN116593282B (zh) * 2023-07-14 2023-11-28 四川名人居门窗有限公司 一种基于结构光的玻璃抗冲击反应测试系统及方法

Also Published As

Publication number Publication date
EP4328541A4 (en) 2024-02-28
EP4328541A1 (en) 2024-02-28
US20240167811A1 (en) 2024-05-23
JP2024513936A (ja) 2024-03-27
CN115218820A (zh) 2022-10-21

Similar Documents

Publication Publication Date Title
JP5281923B2 (ja) 投射型表示装置
CN110446965B (zh) 用于结合光扫描投影仪跟踪眼睛运动的方法和系统
US7078720B2 (en) Range finder for measuring three-dimensional geometry of object and method thereof
US20140313519A1 (en) Depth scanning with multiple emitters
CN107369156B (zh) 深度数据检测系统及其红外编码投影装置
US9013711B2 (en) Contour sensor incorporating MEMS mirrors
WO2017038203A1 (ja) 距離画像取得装置付きプロジェクタ装置及びプロジェクションマッピング方法
JPH10232626A (ja) 立体画像表示装置
EP3640678A1 (en) Tracker, surveying apparatus and method for tracking a target
WO2022222496A1 (zh) 深度数据测量头、计算设备和测量方法
WO2022017441A1 (zh) 深度数据测量设备和结构光投射装置
CN107836112B (zh) 投影系统
JP2010085472A (ja) 画像投影・撮像装置
CN216246133U (zh) 结构光投射装置、深度数据测量头和计算设备
WO2021032298A1 (en) High resolution optical depth scanner
CN212747701U (zh) 结构光投射装置和深度数据测量头
WO2022222497A1 (zh) 深度数据测量头、计算装置及其对应方法
CN216283296U (zh) 深度数据测量头和深度数据计算设备
CN217604922U (zh) 深度数据测量头和局部深度数据测量设备
JP3668466B2 (ja) 実時間レンジファインダ
JPH1026724A (ja) アクティブ式多点測距装置
CN117369197B (zh) 3d结构光模组、成像系统及获得目标物体深度图的方法
CN212567304U (zh) 深度数据测量头
US20210152809A1 (en) Structured light emitting device and image acquisiton device uisng same
KR100902176B1 (ko) 회전다면경을 이용한 3d 스캐너

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21937718; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 18284690; Country of ref document: US)
WWE Wipo information: entry into national phase (Ref document number: 2023561894; Country of ref document: JP)
WWE Wipo information: entry into national phase (Ref document number: 2021937718; Country of ref document: EP)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2021937718; Country of ref document: EP; Effective date: 20231120)