CN117128885A - Depth data measuring apparatus and method - Google Patents


Info

Publication number
CN117128885A
Authority
CN
China
Prior art keywords
period
projection
imaging
pattern
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210542889.0A
Other languages
Chinese (zh)
Inventor
王敏捷
梁雨时
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Tuyang Information Technology Co., Ltd.
Original Assignee
Shanghai Tuyang Information Technology Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Tuyang Information Technology Co., Ltd.
Priority to CN202210542889.0A
Priority to PCT/CN2023/100797 (published as WO2023222139A1)
Publication of CN117128885A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/22: Measuring arrangements characterised by the use of optical techniques for measuring depth
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/254: Projection of a pattern, viewing through a pattern, e.g. moiré

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A depth data measurement apparatus and method are disclosed. The apparatus comprises a first measuring head and a second measuring head fixed in relative position; the two measuring heads complete their scans sequentially, and the results are used to synthesize single depth data. Each measuring head comprises: a projection device for projecting line-type light toward an imaging region; and an image sensor comprising N groups of pixels uniformly distributed on the imaging surface, each group of pixels exposing with an exposure switching period t_e spaced 2π/N in phase from the others. The projection device completes one pattern scan within a scanning period T_scan comprising a plurality of cyclic sub-periods. Within each cyclic sub-period, the projection period t_p of the line light comprises N waveform projection regions of width 2π/N whose light intensities are encoded based on the imaging pattern. Thus one pattern scan of a measuring head yields a set of N-step phase-shift patterns. The two measuring heads can photograph the same imaging region from different angles at high speed and high precision, and the device can in particular be realized as a handheld modeling device used together with calibration points.

Description

Depth data measuring apparatus and method
Technical Field
The present invention relates to the field of three-dimensional imaging, and in particular to a depth data measurement apparatus and method.
Background
In the prior art, stripe light encoding can be used to achieve high-precision imaging. However, stripe encoding requires capturing several different stripe images to synthesize a single depth image, so the resulting depth map has a low frame rate and cannot meet the requirements of real-time, high-precision dynamic imaging.
In addition, since stripe coding is actively projected structured light, parts of the object surface may not be irradiated by the structured light, or the reflected light may not reach the image sensor, because of occlusion. This also adversely affects high-precision imaging of the imaging subject.
For this reason, an improved depth data measurement device is needed.
Disclosure of Invention
An object of the present disclosure is to provide an improved depth data measurement device comprising two measuring heads that are fixed in relative position and image in succession. Each head is equipped with an image sensor whose different pixel groups perform phase-shifted exposure, imaging line-type light projected with successive phase shifts in different sub-periods, so that in a single scan of the line-type light the different pixel groups of the image sensor each acquire a differently phase-shifted fringe image; a plurality of fringe images is thus obtained from a single line-type light scan. The two measuring heads can photograph the same imaging region from different angles at high speed, and the device can in particular be realized as a handheld modeling device used together with calibration points.
According to a first aspect of the present disclosure, there is provided a depth data measurement device comprising a first depth imaging measurement head and a second depth imaging measurement head fixed in relative position, each comprising: a projection device for projecting, toward an imaging region, line-type light moving in a first direction, the length direction of the line-type light being a second direction perpendicular to the first direction; and an image sensor comprising N groups of pixels uniformly distributed on the imaging surface, each group of pixels exposing with an exposure switching period t_e spaced 2π/N in phase from the others, N being an integer greater than 1. The projection device completes one pattern scan within a scanning period T_scan, the scanning period T_scan comprising a plurality of cyclic sub-periods T_cyc. In each cyclic sub-period T_cyc the line light undergoes light-dark variation with a projection period t_p; the projection period t_p has the same duration as the exposure switching period t_e and comprises N waveform projection regions of width 2π/N, the projected light intensity of each waveform projection region being encoded such that, upon completion of one pattern scan within the scanning period T_scan, each of the N groups of pixels of the image sensor images a different fringe pattern, the N fringe patterns forming a set of N-step phase-shift patterns with a 2π/N phase shift relative to each other,
wherein, after the first depth imaging measurement head completes a first pattern scan, the second depth imaging measurement head performs a second pattern scan, and the first set of N-step phase-shift patterns obtained by the first pattern scan and the second set of N-step phase-shift patterns obtained by the second pattern scan are synthesized, based on the relative position, into depth information of the imaging subject in the imaging region.
Optionally, the image sensor includes a plurality of pixel units, each pixel unit including one pixel from each of the N groups of pixels.
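The 2x2 layout below is one concrete way (our assumption; the patent does not fix a particular layout) to tile pixel units over the sensor so that each unit contributes exactly one pixel to each of N = 4 groups:

```python
import numpy as np

N = 4
H, W = 6, 8  # toy sensor size (assumed), multiples of 2

# group_map[y, x] gives the group index (0..3) of the pixel at (y, x);
# a 2x2 pixel unit is tiled over the whole imaging surface.
unit = np.array([[0, 1],
                 [2, 3]])
group_map = np.tile(unit, (H // 2, W // 2))

# Each group exposes with a phase offset of 2*pi/N relative to the
# previous one (the exposure switching period t_e).
phase_offsets = 2 * np.pi * np.arange(N) / N
```

With this tiling, the N groups are uniformly distributed over the imaging surface, as the claim requires.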
Optionally, the projection period t_p of the line light is synchronized with the exposure switching period t_e of the first group of pixels.
Optionally, each waveform projection region projects either a rectangular wave of width 2π/N or nothing (intensity 0), and the light intensity of each waveform projection region is determined based on the set of N-step phase-shift patterns corresponding to the cyclic sub-period T_cyc.
Optionally, the set of N-step phase-shift patterns is a sine-wave four-step phase-shift pattern, and the light intensity of each waveform projection region in each projection period t_p is determined, based on the exposure of the N groups of pixels to the waveform projection regions, so as to be no smaller than zero.
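The non-negativity requirement exists because projected light intensity cannot go below zero. A minimal sketch of standard sine-wave four-step phase recovery, the general technique this claim builds on (the symbols A, B, phi are ours, not the patent's):

```python
import numpy as np

# The four fringe images differ by 2*pi/4 phase steps:
#   I_k = A + B * cos(phi + k*pi/2), k = 0..3, with A >= B so every I_k >= 0
A, B = 0.6, 0.4            # offset and modulation, chosen so intensity >= 0
phi_true = 1.2             # true phase at some pixel (radians)

I = [A + B * np.cos(phi_true + k * np.pi / 2) for k in range(4)]

# Expanding the cosines: I0 - I2 = 2B*cos(phi), I3 - I1 = 2B*sin(phi),
# so the wrapped phase is recovered with a four-quadrant arctangent.
phi_rec = np.arctan2(I[3] - I[1], I[0] - I[2])
```

The recovered phase equals phi_true up to 2π wrapping; the offset A and modulation B cancel out, which is what makes the phase-shift method robust to ambient light and surface reflectivity.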
Optionally, the dwell time t_c of the line-light scan on each pixel column is not less than the scanning period T_scan divided by the number of columns C, and the dwell time t_c is more than 10 times the projection period t_p.
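As a quick numeric check of the timing relations above (all numbers below are illustrative assumptions, not values from the patent):

```python
# Claimed relations: t_c >= T_scan / C and t_c > 10 * t_p
T_scan = 10e-3   # one scanning period: 10 ms (assumption)
C = 1000         # number of pixel columns (assumption)
t_p = 0.5e-6     # projection period: 0.5 us (assumption)

t_c = T_scan / C             # dwell time per column: 10 us
assert t_c >= T_scan / C     # holds with equality at the lower bound
assert t_c >= 10 * t_p       # 10 us >= 5 us, so many projection
                             # periods fit while the line rests on a column
```

The point of the second relation is that the line light completes many light-dark projection periods while it dwells on one column, so each pixel column sees the full encoded waveform.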
Optionally, in each sub-period T_i the line light is projected m times with projection period t_p, and each sub-period T_i is longer than the dwell time t_c.
Optionally, each pixel in the image sensor includes a corresponding charge storage unit; when one pattern scan is completed within the scanning period T_scan, a set of N-step phase-shift patterns is read out from the N groups of charge storage units corresponding to the N groups of pixels, and this set of N-step phase-shift patterns is used to generate a depth map of the imaging region.
Optionally, the projection device completes one pattern scan within a first scanning period such that each of the N groups of pixels of the image sensor images a different fringe pattern and the N fringe patterns form a set of Gray-code patterns; the projection device then completes another pattern scan within a second scanning period such that each of the N groups of pixels images a different fringe pattern and the N fringe patterns form the set of N-step phase-shift patterns, wherein a depth map of the imaged region is generated from the set of N-step phase-shift patterns based on the set of Gray-code patterns.
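A minimal sketch of how a Gray-code set typically disambiguates the wrapped phase obtained from a phase-shift set (a standard combination; the function name and all values are ours):

```python
import numpy as np

def gray_to_binary(bits):
    """Decode an MSB-first Gray-code bit sequence to an integer."""
    b = 0
    n = 0
    for g in bits:
        b ^= g              # running XOR reproduces each binary bit
        n = (n << 1) | b
    return n

# The Gray code gives a coarse fringe-period index k; the phase-shift
# set gives a fine wrapped phase in [-pi, pi) or [0, 2*pi).
phi_wrapped = 0.7                      # wrapped phase at a pixel (radians)
k = gray_to_binary([1, 1, 1])          # Gray code 111 decodes to period 5
phi_abs = phi_wrapped + 2 * np.pi * k  # absolute (unwrapped) phase
```

Gray codes are preferred over plain binary for this step because adjacent regions differ in exactly one bit, so a decoding error at a stripe boundary shifts the period index by at most one.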
Optionally, the projection device includes: a light emitting device for generating line-type light; and a reflection device for reflecting the line-type light so as to project, toward the photographing region at a predetermined frequency, line-type light moving in a direction perpendicular to the stripe direction, the length direction of the line-type light being the length direction of the projected stripes. The reflection device comprises one of: a mechanical galvanometer mirror reciprocally vibrating at the predetermined frequency; a micromirror device reciprocating at the predetermined frequency; and a mechanical rotating mirror rotating unidirectionally at the predetermined frequency.
Optionally, the image sensor includes a first image sensor and a second image sensor fixed in relative position, wherein the first image sensor and the second image sensor each include the N groups of pixels and are exposed in synchronization with each other.
Optionally, the projection device completes one pattern scan in each of α scanning periods T_scan, each scanning period T_scan comprising a plurality of cyclic sub-periods T_cyc, and each cyclic sub-period T_cyc comprising N sub-periods T_1-T_N. In each sub-period T_i the line light undergoes light-dark variation with a projection period t_p; the projection period t_p has the same duration as the exposure switching period t_e and comprises one bright region. Over the sub-periods T_1-T_N, the position of the bright region within the projection period t_p is shifted in steps of 2π/αN phase, so that each time one pattern scan is completed within a scanning period T_scan, the N groups of pixels of the image sensor each image a different fringe pattern, and when α pattern scans are completed over the α scanning periods, the αN fringe patterns form a set of αN-step phase-shift patterns with a 2π/αN phase shift relative to each other, where α is an integer greater than or equal to 2.
Optionally, the projection device completes one pattern scan within a scanning period T_scan, the scanning period T_scan comprising a plurality of cyclic sub-periods T_cyc, and each cyclic sub-period T_cyc comprising N sub-periods T_1-T_N. In each sub-period T_i the line light undergoes light-dark variation with a projection period t_p; the projection period t_p has the same duration as the exposure switching period t_e and comprises one bright region. Over the sub-periods T_1-T_N, the position of the bright region within the projection period t_p is shifted in steps of 2π/N phase, so that upon completion of one pattern scan within the scanning period T_scan, the N groups of pixels of the image sensor each image a different fringe pattern, and the N fringe patterns form a set of N-step phase-shift patterns with a 2π/N phase shift relative to each other. Optionally, N = 2^n, where n is an integer of 1 or more.
According to a second aspect of the present disclosure, there is provided a depth data measurement method comprising: performing a first pattern scan of an imaging region using a first depth imaging measurement head of a depth data measurement device to obtain a first set of N-step phase-shift patterns; performing a second pattern scan of the imaging region using a second depth imaging measurement head of the depth data measurement device to obtain a second set of N-step phase-shift patterns; and synthesizing depth information of the imaging subject within the imaging region from the first and second sets of N-step phase-shift patterns based on the relative positions of the first and second depth imaging measurement heads. The scanning imaging of each of the first and second depth imaging measurement heads comprises: projecting, toward the imaging region, line-type light moving in a first direction, the length direction of the line-type light being a second direction perpendicular to the first direction, the projected line-type light completing one pattern scan within a scanning period T_scan, the scanning period T_scan comprising a plurality of cyclic sub-periods T_cyc, in each of which the line light undergoes light-dark variation with a projection period t_p, the projection period t_p having the same duration as the exposure switching period t_e and comprising N waveform projection regions of width 2π/N whose projected light intensities are encoded, N being an integer greater than 1; photographing the imaging region using an image sensor comprising N groups of pixels uniformly distributed on the imaging surface to obtain N image frames under the line-type light scanning projection, each group of pixels exposing with an exposure switching period t_e spaced 2π/N in phase from the others; and determining depth data of the object to be measured in the imaging region based on the image frames, wherein the projected light intensity of each waveform projection region is encoded such that, upon completion of one pattern scan within the scanning period T_scan, the N groups of pixels of the image sensor each image a different fringe pattern, and the N fringe patterns form a set of N-step phase-shift patterns with a 2π/N phase shift relative to each other.
Optionally, the method further comprises: the depth data measurement device captures a plurality of first sets of N-step phase-shift patterns and a plurality of second sets of N-step phase-shift patterns while moving relative to the imaging subject; and the depth information generated from the plurality of first and second sets of N-step phase-shift patterns is synthesized into model information of the imaging subject based on calibration points.
The depth data measurement device of the present disclosure can thus acquire N-step phase-shift patterns of the same imaging region from different angles, each set within a single scan of the line-type light, thereby increasing imaging speed.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout exemplary embodiments of the disclosure.
Fig. 1 shows the principle of depth imaging with structured light encoded in stripes.
Fig. 2 shows another example of projecting stripe coded structured light.
Fig. 3 shows a schematic composition of a depth data measurement head according to one embodiment of the invention.
Fig. 4A-B show an enlarged operation example of the projection apparatus shown in fig. 3.
Fig. 5 shows a simplified perspective schematic diagram of a projection device for use with the present invention.
Fig. 6 shows an example in which a single camera is missing part of depth information of an imaging object when imaging using structured light.
Fig. 7 shows an example in which the dual camera imaging improves the loss of depth information of a part of an imaging subject.
Fig. 8 shows a schematic diagram of a depth data measurement device according to an embodiment of the invention.
Fig. 9 shows an example of the pixel composition of the image sensor used in the present invention.
Fig. 10 shows an example of the relative relationship of exposure periods between different groups of pixels of the same image sensor.
FIG. 11 shows an imaging schematic of the different pixel groups within one cyclic sub-period T_cyc.
FIG. 12 shows an example of the relative relationship between the projected light waveform and the exposure periods of pixel groups 1-4 for sine-wave four-step phase-shift pattern imaging.
FIG. 13 shows one example of the patterns 1-4 obtained from pixel groups 1-4, respectively, upon completion of one scanning period T_scan.
Fig. 14 shows an example of Gray code combined with four-step phase shift for depth-map imaging.
FIG. 15 shows an imaging schematic of the different pixel groups within one cyclic sub-period T_cyc.
FIGS. 16A-D show the relative relationship between the projected light waveform and the exposure periods of pixel groups 1-4 in sub-periods T_1-T_4.
FIG. 17 shows one example of the patterns 1-4 obtained from pixel groups 1-4, respectively, upon completion of one scanning period T_scan.
Fig. 18 shows a composition schematic diagram of a depth data measurement device according to an embodiment of the present invention.
Fig. 19 shows a schematic flow chart of a depth data measurement method according to an embodiment of the invention.
Detailed Description
Preferred embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
According to the principle of structured-light measurement, whether the scan angle α can be precisely determined is the key to the whole measurement system. The scan angle can be calculated and determined by mechanical devices such as rotating mirrors; the purpose of image encoding and decoding is to determine the scan angle of the coded structured light, i.e., of a surface structured-light system. Fig. 1 shows the principle of depth imaging with stripe-coded structured light. For ease of understanding, the coding principle of stripe structured light is briefly illustrated with a two-gray-level, three-bit binary time code. The projection device sequentially projects the three patterns shown onto the measured object in the photographing region; using the two gray levels, bright and dark, the three patterns divide the projection space into 8 regions. Each region corresponds to its own projection angle, where bright regions can be taken to encode "1" and dark regions "0". Combining, in projection order, the code values that a scene point takes in the three coding patterns yields the region code of that point; this determines the region in which the point lies, which is then decoded to obtain the scan angle of the point.
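The three-pattern, two-gray-level decoding just described can be sketched as follows (bright = 1, dark = 0; the total scan angle value is an illustrative assumption, not from the patent):

```python
def region_code(bits_msb_first):
    """Combine the per-pattern codes of one scene point into its region index."""
    code = 0
    for b in bits_msb_first:
        code = (code << 1) | b
    return code

# A point seen as bright, dark, bright across the three projected patterns
# falls in region 5 of the 8 regions.
region = region_code([1, 0, 1])

# Each region maps to a known projection angle; with a total scan angle
# alpha (as in the triangulation setup), the region center lies at:
alpha = 40.0                         # total scan angle in degrees (assumed)
angle = (region + 0.5) * alpha / 8   # center angle of region 5
```

With the region's projection angle known, depth follows from triangulation between the projector and the camera.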
In order to improve the matching accuracy, the number of projection patterns in time encoding may be increased. Fig. 2 shows another example of projecting stripe coded structured light. Specifically, a two-gray level five-bit binary time code is shown. In an application scenario such as binocular imaging, this means that each pixel in each of the left and right image frames contains 5 or 0 or 1 region code values, for example, whereby left and right image matching can be achieved with higher accuracy (e.g., pixel level). In the case of a constant projection rate of the projection device, the example of fig. 2 corresponds to achieving a higher accuracy of image matching at a higher time-domain cost than the three coding patterns of fig. 1.
Fig. 3 shows a schematic diagram of projecting fringe images with line-type light to acquire depth data. As shown in fig. 3, the depth data measuring head 300 includes a projection device 310 and two image sensors 320_1 and 320_2. In a monocular implementation, the depth data measuring head 300 may instead photograph with a single image sensor.
The projection device 310 is used for scanning and projecting stripe-coded structured light onto the photographing region. For example, the projection device 310 may project the three patterns shown in fig. 1 in three successive image-frame projection periods, and the imaging results of these patterns can be used for the generation of depth data. The image sensors 320_1 and 320_2, which may be referred to as the first and second image sensors, have a predetermined relative positional relationship and photograph the photographing region to obtain first and second two-dimensional image frames, respectively, under the structured-light irradiation. For example, when the projection device 310 projects the three patterns of fig. 1, the first and second image sensors 320_1 and 320_2 may each image the photographing region (e.g., the imaging plane and a certain range around it in fig. 3) onto which the three patterns are projected, in three synchronized image-frame imaging periods.
As shown in fig. 3, the projection device 310 may project linear light extending in the x-direction in the z-direction (i.e., toward the photographing region). The projected line light can be continuously moved in the y-direction to cover the entire imaging area. The lower part of fig. 3 gives a more understandable illustration of the scanning of line light for a perspective view of the shot area.
In the present disclosure, the direction in which light exits the measuring head is designated the z-direction, the vertical direction of the photographing plane is the x-direction, and the horizontal direction is the y-direction. The stripe-structured light projected by the projection device is then the result of line light extending in the x-direction moving in the y-direction. Although in other embodiments the synchronization and imaging process can equally be performed with stripe-structured light obtained by moving line light extending in the horizontal y-direction along the x-direction, vertical stripe light is used for the description in the present disclosure.
Fig. 4A-B show an enlarged operation example of the projection apparatus shown in fig. 3. Specifically, as shown in fig. 3, in the projection apparatus 310, laser light emitted from a laser generator (such as the laser generator 411 shown in detail in fig. 4A-B) is scanned and projected onto a photographing region (gray region in fig. 3) by a projection mechanism (such as the projection mechanism 412 shown in detail in fig. 4A-B) to perform active structured light projection on an object to be measured (e.g., a person in fig. 3) in the photographing region. The pair of image sensors 320_1 and 320_2 images a photographing region, thereby acquiring image frames required for depth data calculation. As shown in fig. 3, the dashed lines from the projection device 310 are used to represent the projection ranges thereof, while the dashed lines from the image sensors 320_1 and 320_2 are used to represent the respective imaging ranges thereof. The photographing region is generally located in an overlapping region of respective projection and imaging ranges of the three.
In practice, the laser generator is used to generate line-type and/or infrared laser light, and the laser generator is switched at high speed so as to scan and project structured light whose alternating bright and dark stripes correspond to the stripe code. The high-speed switching may include high-speed on-off switching of the laser generator and high-speed code switching.
In one embodiment, the laser generator may emit laser light of the same intensity, and the projected fringe pattern is achieved by turning the laser generator on and off. In this case, since the laser generator projects light of one intensity only at different duty cycles, each pixel of the image sensor integrates the projected light to determine the "presence or absence" of the irradiated light, and thus the equipped image sensor may be a black-and-white image sensor.
In another embodiment, the laser generator itself may emit laser light with varying intensity, for example, laser light with sinusoidal variation of the emitted intensity over a large period depending on the applied power. The sinusoidal laser may be combined with stripe projection, whereby a pattern with alternate brightness and different brightness between bright and dark stripes is scanned and projected. In this case, the image sensor needs to have the capability of differentially imaging different light intensities, and thus may be a multi-level gray scale image sensor. It is apparent that gray scale projection and imaging can provide more accurate inter-pixel matching than black and white projection and imaging, thereby improving the accuracy of depth data measurements.
In one embodiment, the laser generator 411 may be a line laser generator that generates line light extending in the x-direction (the direction perpendicular to the page in figs. 4A-B). The line light is then projected onto the imaging plane by a reflecting mechanism 412 that can swing about an axis in the x-direction. As shown in fig. 4B, the reflecting mechanism 412 (e.g., a mirror) can swing through an angle α, thereby realizing a line-type light scan that sweeps back and forth across the range AB of the imaging plane.
It will be appreciated that in order to project a fringe pattern, the line light itself needs to undergo a light-dark change (or, in a simple implementation, an on-off change) as it moves continuously in the y-direction. For example, to scan the first pattern of fig. 1, the laser generator 411 remains off while the reflecting mechanism 412 sweeps the first α/2 of the scan angle, and turns on for the remaining α/2, producing a pattern that is dark on the left and bright on the right. To scan the second pattern of fig. 1, the laser generator 411 stays off from 0 to α/4, turns on from α/4 to α/2, turns off again from α/2 to 3α/4, and turns on from 3α/4 to α, producing a dark-bright-dark-bright pattern. Similarly, the third pattern of fig. 1 and the finer stripe patterns of fig. 2 can be realized with more frequent switching as a function of the rotation angle.
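The on-off rule described above (the laser state at a given mirror position equals one bit of that position's region code) can be sketched as follows; plain binary coding as in fig. 1 is assumed, and the function name is ours:

```python
def laser_on(u, p, n_patterns=3):
    """Laser state for stripe pattern p (1-based, MSB first) when the
    mirror is at normalized scan position u in [0, 1)."""
    region = int(u * (1 << n_patterns))          # which of the 2^n regions
    return (region >> (n_patterns - p)) & 1      # bit p of the region code

# Pattern 1 of fig. 1: dark left half, bright right half.
assert laser_on(0.25, 1) == 0 and laser_on(0.75, 1) == 1

# Pattern 2 of fig. 1: dark-bright-dark-bright quarters.
assert [laser_on(u, 2) for u in (0.1, 0.3, 0.6, 0.9)] == [0, 1, 0, 1]
```

Finer patterns simply read lower-order bits, which toggle more often per sweep, matching the "more frequent switching" noted above.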
In one embodiment, the reflecting mechanism 412 may be a micromirror device (also referred to as a digital micromirror device, DMD), and may be implemented as a MEMS (micro-electro-mechanical system). Fig. 5 shows a simplified perspective schematic of a projection device usable with the present invention. As shown in fig. 5, the point laser generated by the laser source is shaped by a lens into line light (corresponding to the line laser generator 411 of fig. 4); the line light is reflected by a MEMS micromirror device, and the reflected line light is projected into the external space through a light window. Micromirror devices offer extremely high performance; for example, commercially available DMDs can perform highly stable reciprocating vibration at a frequency of 2 kHz, laying the foundation for high-performance depth imaging.
In order to acquire one frame of a high-precision depth map, the depth data measuring head shown in fig. 3 needs to project a plurality of different fringe patterns in sequence. In other words, in existing schemes that synthesize a depth map from photographed fringe patterns, accuracy is bought at a cost in the time domain. Further, since the different fringe patterns photographed in N successive imaging periods are used to synthesize one depth map, existing depth data measurement methods are only suitable for objects that remain stationary throughout the N imaging periods, which greatly limits the application range of depth data determination using actively projected fringe images.
In view of this, the present invention proposes a new depth data measurement scheme that uses an image sensor whose different pixel groups are capable of phase-shifted exposure. Through careful design of the light-dark variation of the projected line-type light, the different pixel groups of the image sensor each acquire a differently phase-shifted fringe image within a single scan of the line-type light, so that a plurality of fringe images is obtained from one line-type light scan. This greatly increases the synthesis speed of the depth map and makes the scheme suitable for photographing moving objects.
Further, shadow and blind-spot problems arise when imaging with actively projected structured light. Fig. 6 shows an example in which a single camera misses part of the depth information of an imaging subject. As shown in the figure, because the imaging subject's own structure is uneven, the structured light is blocked by the subject itself, casting a shadow on part of the subject's camera-facing surface; since the shadowed part receives no structured-light irradiation, the corresponding depth information is missing when depth information is subsequently obtained (corresponding to "missing 1" in the figure). Moreover, whatever the structure of the imaging subject, the part facing away from the structured-light generator cannot receive the structured-light irradiation, so the corresponding depth information is likewise missing (corresponding to "missing 2" in the figure).
Fig. 7 shows an example in which dual-camera imaging reduces the loss of depth information of part of an imaging subject. As shown in the figure, by arranging cameras on both the left and right sides, more of the imaging subject's surface receives the structured-light irradiation, so more of the subject's depth information can be measured.
In view of this, the invention may be embodied in particular as a dual camera (i.e. dual measuring head) depth data measuring device. Fig. 8 shows a schematic diagram of a depth data measurement device according to an embodiment of the invention. As shown, the depth data measurement device 800 may include a first depth imaging measurement head 810 and a second depth imaging measurement head 820 that are fixed in relative position. In particular, the first depth imaging measurement head 810 and the second depth imaging measurement head 820 may be mounted within the same housing and thereby ensure a fixed relative position.
Further, the measuring head 810 and the measuring head 820 may be controlled by the same control device (e.g., a processor, not shown) to perform sequential structured light projection and imaging. Here, "sequential" means that the measuring head 810 first performs one scan and imaging, and then the measuring head 820 performs one scan and imaging; that is, the two heads perform line light projection and imaging in periods that do not overlap.
As shown, the measuring head 810 and the measuring head 820 may have the same configuration and be arranged to sequentially photograph the same imaging region (i.e., the imaging subject in the imaging region) from different angles.
Here, sequential photographing means that after the measuring head 810 completes a first pattern scan (the projection device 811 performs scanning projection of the line light while the image sensor 812 images the projected line light), the measuring head 820 performs a second pattern scan (the projection device 821 performs scanning projection of the line light while the image sensor 822 images the projected line light). The first set of N-step phase-shift patterns obtained by the first pattern scan and the second set of N-step phase-shift patterns obtained by the second pattern scan are then synthesized, based on the relative positions of the two heads, into depth information of the imaging object in the imaging region.
Further, the housing of the depth data measurement device 800 may also include a handle 831, allowing the device to be implemented as a handheld modeling device. In this implementation, the depth data measurement device captures a plurality of first and second sets of N-step phase-shift patterns while moving relative to the imaging object, and the depth information generated from these sets may be synthesized into model information of the imaging object based on calibration points.
In one embodiment, each of the measuring head 810 and the measuring head 820 may be implemented as a depth imaging measuring head comprising a projection device and an image sensor. The projection device may be adapted to project, toward the imaging area, line light moving in a first direction (e.g., the y direction in fig. 3), the length direction of the line light being a second direction (e.g., the x direction in fig. 3) perpendicular to the first direction. In one embodiment, the projection device may have the implementation structure shown in fig. 5, comprising a line light generating device and a projection mechanism that reflects and projects the line light and can vary the projection direction within a certain angle. Here, it should be understood that the measuring head 810 and the measuring head 820 have their own xyz directions, and the relative relationship between the xyz directions of the two measuring heads is determined by their relative positions.
An image sensor comprising a plurality of groups of pixels uniformly distributed over the imaging surface, each group of pixels performing exposure with an exposure switching period t_e, the exposure phases of the groups being spaced 2π/N apart from one another, where N is an integer greater than 1.
Here, for ease of understanding, the structure and exposure of the image sensor will be described taking N = 4 as an example. Fig. 9 shows an example of the pixel composition of the image sensor used in the present invention. Also for ease of explanation, an example of 16×24 pixels is shown in fig. 9. It should be appreciated that the image sensor actually used may have more pixels, for example 600×800 pixels. The image sensor shown in fig. 9 includes 4 (N = 4) groups of pixels uniformly distributed across the imaging surface, labeled 1, 2, 3, 4 in the illustrated blocks. Here, the "uniform distribution" of the four groups of pixels over the imaging surface means that when the line light is scanned and projected in the y direction, the same (or approximately the same) number of pixels from each group falls within the currently illuminated area. In a preferred embodiment, the 4 groups of pixels are spaced from one another in units of one pixel, as shown in fig. 9. That is, the image sensor shown in fig. 9 can be considered to comprise a plurality of "pixel units" (as shown by the bold black frame in the drawing; in the example of fig. 9, 8×12 identically configured pixel units may be included), each pixel unit including one pixel from each of the 4 groups of pixels.
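As a minimal sketch of this layout, the group of a pixel can be derived from its row and column parity, and counting over the 16×24 illustration confirms the uniform-distribution property. The exact in-unit ordering of groups 1-4 below is an assumption for illustration (fig. 9 is not reproduced here):

```python
# Sketch of a (row, col) -> pixel-group mapping for the 2x2 pixel-unit layout
# described above. The in-unit ordering of groups 1-4 is an assumption.
from collections import Counter

ROWS, COLS = 16, 24  # the illustration size used in the text

def pixel_group(row: int, col: int) -> int:
    """Group 1-4 of the pixel at (row, col), repeating per 2x2 pixel unit."""
    return (row % 2) * 2 + (col % 2) + 1

# Every group appears equally often -> the "uniform distribution" property.
counts = Counter(pixel_group(r, c) for r in range(ROWS) for c in range(COLS))
```

With any fixed in-unit ordering, each of the four groups covers exactly one quarter of the sensor, which is what makes the per-group fringe images spatially complete.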
In other embodiments, each group of pixels may be distributed at intervals in units of two pixels (for example, two pixels arranged adjacently in the x direction).
Fig. 10 shows an example of the relative relationship between the exposure periods of different groups of pixels of the same image sensor. In the example of fig. 10, the 4 groups of pixels have the same exposure switching period t_e and all switch at a 50% duty cycle; in other words, all pixels of the image sensor have the same exposure switching waveform. The difference is that the waveforms of successive groups of pixels have a π/2 phase difference between them. In one implementation, the exposure switching period t_e takes, for example, a typical value of 20 ns. This means that each pixel in the image sensor turns on for 10 ns of exposure at intervals of 10 ns off, but the on-time of the group 2 pixels is 5 ns later than group 1, the on-time of the group 3 pixels is 5 ns later than group 2, and the on-time of the group 4 pixels is 5 ns later than group 3 (which can also be regarded as 5 ns earlier than the group 1 pixels).
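A small sketch of this timing (assuming the 20 ns period, 50% duty cycle and 5 ns group offsets given above) shows that at any instant exactly two of the four groups are exposing:

```python
# Sketch of the group-wise phase-shifted exposure timing described above,
# assuming t_e = 20 ns, 50% duty cycle and N = 4 groups offset by t_e/N = 5 ns.
T_E = 20.0  # exposure switching period, ns
N = 4       # number of pixel groups

def exposure_on(group: int, t_ns: float) -> bool:
    """True if pixels of 0-based `group` are exposing at time t_ns."""
    phase = (t_ns - group * T_E / N) % T_E
    return phase < T_E / 2

# With 50% windows spaced pi/2 apart, exactly two groups expose at any instant.
active_at_0 = [g for g in range(N) if exposure_on(g, 0.0)]    # groups 1 and 4
active_at_12 = [g for g in range(N) if exposure_on(g, 12.0)]  # groups 2 and 3
```

This pairwise overlap is what later lets the projected energy of each quarter-period be shared by exactly two pixel groups.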
When the image sensor used can perform group-wise phase-shifted exposure as shown in figs. 9 and 10, the line light projection of the projection device can be carefully arranged to achieve multi-image acquisition in a single scan.
Specifically, the projection device may complete one pattern scan in one scanning period (denoted here as T_scan). Within the scanning period T_scan, it is assumed that the line light sweeps across the imaging area at a constant speed and the projection repeats with a cyclic sub-period T_cycle. A light projection embodiment is described in detail below for each cyclic sub-period T_cycle.
In each cyclic sub-period T_cycle, the line light undergoes light-dark variation with a projection period t_p, where the projection period t_p has the same duration as the exposure switching period t_e. The projection period t_p comprises N waveform projection areas of 2π/N each, and the projection light intensity of each waveform projection area is encoded so that, when one pattern scan is completed in the scanning period T_scan, the N groups of pixels of the image sensor each image a different fringe pattern, and the N fringe patterns form a set of N-step phase-shift patterns with a 2π/N phase shift relative to each other.
In one embodiment, each waveform projection area correspondingly projects a rectangular wave of width 2π/N or 0 (0 can also be regarded as a rectangular wave of intensity 0), and the light intensity of each waveform projection area is determined based on the set of N-step phase-shift patterns corresponding to one cyclic sub-period T_cycle.
In one embodiment, the set of N-step phase-shift patterns is a sine-wave four-step phase-shift pattern, and the light intensity of each waveform projection area in each projection period t_p, derived based on the exposure of the N groups of pixels to the waveform projection areas, is not smaller than zero.
For convenience of explanation, assume here that N = 4, and let the laser intensity in each 1/4 of the line laser projection period t_p be Q_1/Q_2/Q_3/Q_4. Then, within one cyclic sub-period T_cycle, the integrated brightnesses of the four groups of pixels are as follows:

P_1 = Σ(Q_1 + Q_2)
P_2 = Σ(Q_2 + Q_3)
P_3 = Σ(Q_3 + Q_4)
P_4 = Σ(Q_4 + Q_1)    (1)
Thus, the respective values of Q_1~Q_4 can be determined according to the pattern type required for the 4-step phase-shift imaging.
Fig. 11 shows an imaging schematic of the different groups of pixels over one cyclic sub-period T_cycle. In the example shown in fig. 11, the desired phase-shift pattern is a sinusoidal optical waveform with a phase difference of π/2. At this time, the values of the four groups of pixels P_1~P_4 may be:

P_1 = Q/2·sin t + Q/2
P_2 = −Q/2·cos t + Q/2
P_3 = −Q/2·sin t + Q/2
P_4 = Q/2·cos t + Q/2    (2)
Here, Q may be regarded as the integrated brightness at the brightest position of the phase-shift stripes shown in fig. 11, and the value of t corresponds to a position within one cyclic sub-period T_cycle; when the line light is scanned at a uniform speed, the value of t corresponds to pixels at different positions in the image sensor.
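Although not spelled out in the text, equation (2) is a standard four-step phase-shift set: P_1 − P_3 = Q·sin t and P_4 − P_2 = Q·cos t, so the phase t at each pixel can be recovered with an arctangent. A sketch with synthetic values:

```python
# Sketch (not from the patent text): recovering the phase t at a pixel from
# the four integrated brightnesses of equation (2), using
# P1 - P3 = Q*sin(t) and P4 - P2 = Q*cos(t).
import math

def phase_from_four_step(p1: float, p2: float, p3: float, p4: float) -> float:
    return math.atan2(p1 - p3, p4 - p2) % (2 * math.pi)

Q, t_true = 2.0, 1.234  # synthetic example values
p1 = Q / 2 * math.sin(t_true) + Q / 2
p2 = -Q / 2 * math.cos(t_true) + Q / 2
p3 = -Q / 2 * math.sin(t_true) + Q / 2
p4 = Q / 2 * math.cos(t_true) + Q / 2
t_rec = phase_from_four_step(p1, p2, p3, p4)  # ~ t_true
```

The differencing cancels the common Q/2 offset, which is why the ambient/offset term drops out of the phase estimate.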
The brightness values of Q_1~Q_4 with respect to t can be found inversely from equation (2). Since in equation (2) there are N = 4 unknowns Q_1~Q_4 while the rank of equation (2) is N − 1 = 3, Q_1~Q_4 actually admit infinitely many solutions, one of which is given as follows:

Q_1 = A/2·sin t + A/2
Q_2 = 0
Q_3 = −A/2·cos t + A/2
Q_4 = A/2·(cos t − sin t)    (3)
where O × A = Q, and O is, for example, the number of exposure periods each pixel undergoes while being swept by the line light (for example, in the example below, each pixel column can complete 100 exposure turn-ons within the 2 μs it is swept by the line light, which can be regarded as O = 100).
However, since light intensity cannot be negative, the respective values of Q_1~Q_4 must remain non-negative. Equation (3) holds in the range t = 0 to π/4; when the line light scans into the range t = π/4 to π/2, another solution can be given based on equation (2):

Q_1 = A/2·cos t + A/2
Q_2 = A/2·(sin t − cos t)
Q_3 = −A/2·sin t + A/2
Q_4 = 0    (4)
FIG. 12 shows an example of the relative relationship between the projected light waveform values and the exposure periods of pixel groups 1-4 when performing sine wave four-step phase shift pattern imaging.
As can be seen from the figure, in combination with fig. 11 and equation (1), at t = 0 the respective values of P_1~P_4 correspond to Q/2, 0, Q/2 and Q.
Assume the scanning period T_scan is 3.84 ms, the image sensor includes 1920 columns, and the line light projection period t_p is 20 ns. The time each pixel is swept by the line light, i.e., the dwell time t_c, can be equal to the scanning period T_scan divided by the number of columns C, i.e., 3.84 ms/1920 = 2 μs; or, taking into account that the line light has a certain width, the dwell time t_c is not less than 2 μs. Each pixel column is thus able to complete 100 exposure turn-ons within the 2 μs it is swept by the line light. At this time, O = 100.
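The arithmetic of this example, kept in integer nanoseconds (values as assumed above):

```python
# Arithmetic sketch of the timing example above, in integer nanoseconds.
T_SCAN_NS = 3_840_000  # scanning period T_scan = 3.84 ms
COLUMNS = 1920         # pixel columns in the image sensor
T_P_NS = 20            # line light projection period t_p

dwell_ns = T_SCAN_NS // COLUMNS  # dwell time t_c per column: 2 us
O = dwell_ns // T_P_NS           # exposure turn-ons completed per column
```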
In one embodiment, at the start of the current cyclic sub-period T_cycle, the line light may maintain, for the first 100 projection periods t_p, the waveform corresponding to t = 0, i.e., Q_1 = A/2, Q_2 = Q_3 = 0, Q_4 = A/2 (here A = Q/O = Q/100). In a preferred embodiment, the respective values of Q_1~Q_4 are then fine-tuned in each projection period t_p based on the small change of t, for example using the solved values of equation (3).
As previously described, the cyclic sub-period T_cycle is repeated up to a predetermined number of times, e.g., M times, thereby completing one scanning period T_scan. Fig. 13 shows patterns 1-4 obtained from pixel groups 1-4, respectively, upon completion of one scanning period T_scan. For example, in the example of fig. 13, M may be equal to 16, i.e., 16 cyclic sub-periods T_cycle, thereby obtaining a four-step phase-shift pattern with a phase difference of π/2; fig. 11 shows an example of such a sine-wave 4-step phase-shift pattern. When the 1920-column image sensor of the above example is used to acquire the four-step phase-shift map with 32 sine-wave stripe repetitions shown in fig. 13, each sine-wave stripe covers 60 pixel columns (1920/32 = 60), i.e., 30 pixel-unit columns. In other words, 60 × 100 = 6000 projections need to be completed in one cyclic sub-period T_cycle of the line light scan. In embodiments in which the respective values of Q_1~Q_4 are fine-tuned in each projection period t_p, e.g., by the solution of equation (3), each projection period t_p has a small increment of t relative to the previous projection period t_p, Δt = 2π/6000 = π/3000, and thus takes a different value.
Returning to fig. 12, as the line light scans from the position corresponding to t = 0 to the position corresponding to t = π/6, the respective values of Q_1~Q_4 can be solved based on, for example, equation (3). At t = π/6, the line light can be considered to have scanned to the 5th of the 60 pixel columns of the current cycle, and solving equation (3) at t = π/6 gives Q_1 = 3A/4, Q_2 = 0, Q_3 = 0.067A, Q_4 = 0.183A.
As the line light scans from the position corresponding to t = π/6 to the position corresponding to t = π/3, the respective values of Q_1~Q_4 can continue to be solved. Since cos t = sin t at t = π/4 and cos t < sin t from π/4 to 3π/4, equation (3), in which Q_4 would fall below 0, no longer applies; solving based on equation (4) at t = π/3 gives Q_1 = 3A/4, Q_2 = 0.183A, Q_3 = 0.067A, Q_4 = 0. At t = π/3, the line light can be considered to have scanned to the 10th of the 60 pixel columns of the current cycle.
As the line light scans from the position corresponding to t = π/3 to the position corresponding to t = π/2, the respective values of Q_1~Q_4 can continue to be solved based on equation (4). At t = π/2, the line light can be considered to have scanned to the 15th of the 60 pixel columns of the current cycle, and solving equation (4) at t = π/2 gives Q_1 = A/2, Q_2 = A/2, Q_3 = 0, Q_4 = 0. It should be noted that since pixels 1-4 are imaged with a 4-step phase shift, Q_1 + Q_2 + Q_3 + Q_4 = A holds in every projection period t_p; and since the groups of pixels 1-4 are 2π/N apart in phase and all perform exposure with the same projection period t_p at a 50% duty cycle, pixels 1-4 can obtain a total integrated brightness of 2A·t_p/4 in each projection period t_p.
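The quoted spot values follow directly from evaluating the piecewise solution; a sketch with A = 1:

```python
# Spot-check sketch of the quoted values, evaluating solution (3) for
# t <= pi/4 and solution (4) for t > pi/4, with A = 1.
import math

A = 1.0

def q_piecewise(t: float):
    if t <= math.pi / 4:   # equation (3)
        return (A / 2 * math.sin(t) + A / 2,
                0.0,
                -A / 2 * math.cos(t) + A / 2,
                A / 2 * (math.cos(t) - math.sin(t)))
    return (A / 2 * math.cos(t) + A / 2,   # equation (4)
            A / 2 * (math.sin(t) - math.cos(t)),
            -A / 2 * math.sin(t) + A / 2,
            0.0)

q_pi6 = q_piecewise(math.pi / 6)  # ~ (0.75, 0, 0.067, 0.183)
q_pi3 = q_piecewise(math.pi / 3)  # ~ (0.75, 0.183, 0.067, 0)
q_pi2 = q_piecewise(math.pi / 2)  # ~ (0.5, 0.5, 0, 0)
```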
The above, in connection with fig. 12, describes the values of Q_1~Q_4 for each of pixels 1-4 over the first π/2 of one 2π cyclic period of the sine-wave N-step phase-shift pattern. Those skilled in the art can continue to find example values of Q_1~Q_4 over the remaining 3π/2 of the 2π cyclic period based on equation (2) and the example of fig. 12.
In addition, it should be understood that although figs. 11 and 12 show the sub-periods T_1-T_4 for convenience in illustrating the variation of the fringe image and the waveform (the sub-periods T_1-T_4 respectively correspond to the 0~π/2, π/2~π, π~3π/2 and 3π/2~2π portions of a 2π cyclic sub-period T_cycle), when realizing the above sine-wave 4-step phase-shift pattern it is only necessary to vary the brightness value of the projected laser within one cyclic sub-period T_cycle according to equation (2), without any additional division into sub-periods T_1-T_4.
In one embodiment, other fringe light can be combined with N-step phase-shift imaging to obtain a more accurate depth image. Fig. 14 shows an example of Gray code combined with four-step phase shift for depth map imaging. In practice, a greater number of cyclic sub-periods T_cycle can be used in each 4-step phase-shift pattern to improve the imaging accuracy of the depth map. However, since the 4-step phase-shift pattern repeats periodically, depth jumps across periods cannot be distinguished. In this case, the photographic subject in the shooting space can first be coarsely imaged using Gray codes, after which 4-step phase-shift imaging is performed. Specifically, the projection device completes one pattern scan in a first scanning period T_scan1 (e.g., solving the waveform corresponding to each line light projection period from the illustrated light-dark diagram), so that the N groups of pixels of the image sensor each image a different stripe pattern and the N stripe patterns form a set of Gray code patterns; the projection device then completes one pattern scan in a second scanning period T_scan2, so that the N groups of pixels of the image sensor each image a different fringe pattern and the N fringe patterns form the set of N-step phase-shift patterns, wherein a depth map of the imaging region is generated from the set of N-step phase-shift patterns based on the set of Gray code patterns.
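As context for how Gray codes resolve the cross-period ambiguity: the Gray-code bits observed at a pixel identify which fringe period it lies in, and that period index unwraps the wrapped phase measured by the phase-shift pattern. A sketch (the helper names are illustrative, not from the patent):

```python
# Sketch: decoding a Gray-code period index and using it to unwrap the
# wrapped phase from an N-step phase-shift measurement.
import math

def gray_to_binary(g: int) -> int:
    """Convert a Gray-code value to its binary (period index) value."""
    b = g
    while g:
        g >>= 1
        b ^= g
    return b

def unwrap_phase(wrapped: float, gray_code: int) -> float:
    """Absolute phase = period index * 2*pi + wrapped phase in [0, 2*pi)."""
    return gray_to_binary(gray_code) * 2 * math.pi + wrapped

# e.g. a pixel whose Gray code reads 0b101 lies in fringe period 6
absolute = unwrap_phase(math.pi / 3, 0b101)
```

The coarse Gray-code image only needs to be reliable to within one fringe period, which is why a small number of additional scans suffices.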
In addition, when multiple scanning periods T_scan are introduced, not only can other stripe patterns be combined with N-step phase-shift patterns, but an αN-step phase-shift pattern can also be implemented, where α is an integer greater than or equal to 2. Specifically, the projection device completes α pattern scans in α scanning periods T_scan. Each scanning period T_scan comprises a plurality of cyclic sub-periods T_cycle, and each cyclic sub-period T_cycle includes N sub-periods T_1-T_N. In each sub-period T_i, the line light undergoes light-dark variation projection with projection period t_p, where the projection period t_p has the same duration as the exposure switching period t_e and the projection period t_p comprises a bright region. Over the sub-periods T_1-T_N, the position of the bright region within the projection period t_p varies at intervals of 2π/αN phase, so that when one pattern scan is completed in each scanning period T_scan, the N groups of pixels of the image sensor each image a different fringe pattern, and when the α pattern scans are completed in the α scanning periods T_scan, the αN fringe patterns form a set of αN-step phase-shift patterns with a 2π/αN phase shift between one another.
For ease of understanding, α = 2, N = 4 is taken as an example. That is, an image sensor comprising 4 groups of pixels exposed with exposure switching periods t_e spaced 2π/N apart in phase is used, and an eight-step phase-shift pattern is acquired in two scans.
To acquire an eight-step phase-shift pattern with a 4-group pixel image sensor, the projection device needs to perform two scans. The first scanning period T_scan1 comprises a plurality of cyclic sub-periods T_cycle, each of which includes 4 sub-periods T_1-T_4; the position of the bright region within the projection period t_p varies at intervals of π/4 phase, so that when one pattern scan is completed in the first scanning period T_scan1, the first 4 patterns of the eight-step phase-shift pattern are acquired. The second scanning period T_scan2 likewise comprises a plurality of cyclic sub-periods T_cycle, each of which likewise includes 4 sub-periods T_1-T_4; the position of the bright region within the projection period t_p again varies at intervals of π/4 phase (but its phase differs from that of the first scan), so that when one pattern scan is completed in the second scanning period T_scan2, the last 4 patterns of the eight-step phase-shift pattern are acquired. Thus, the eight-step phase-shift pattern can be acquired through two scans, thereby achieving higher depth imaging precision.
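Under the assumptions above (α = 2, N = 4, with the second scan's bright regions shifted by 2π/(αN) relative to the first), the eight captured patterns cover the phase circle uniformly; a sketch:

```python
# Sketch: phase offsets of the patterns captured in each of the two scans of
# an eight-step (alpha = 2, N = 4) acquisition, assuming the second scan is
# shifted by 2*pi/(alpha*N) relative to the first.
import math

ALPHA, N = 2, 4

def pattern_offsets(scan_index: int):
    """Phase offsets of the N patterns captured in 0-based scan `scan_index`."""
    base = scan_index * 2 * math.pi / (ALPHA * N)  # per-scan shift
    return [base + k * 2 * math.pi / N for k in range(N)]

all_offsets = sorted(pattern_offsets(0) + pattern_offsets(1))
# all_offsets spans 0..7*pi/4 with uniform 2*pi/(alpha*N) = pi/4 spacing
```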
In addition, although the example of N = 4 is given above for convenience of explanation, N may take other values in other embodiments. Specifically, N = 2^n, where n is an integer greater than or equal to 1. Thus, for example, an 8-step phase shift using an 8-group pixel image sensor, a 16-step phase shift using a 16-group pixel image sensor, and the like can be realized with higher accuracy.
Each pixel in the image sensor may include a corresponding charge storage unit. When one pattern scan is completed in the scanning period T_scan, a set of N-step phase-shift patterns is acquired from the N groups of charge storage units respectively corresponding to the N groups of pixels, and the set of N-step phase-shift patterns is used to generate a depth map of the imaging region.
Fig. 15 shows an imaging schematic of the different groups of pixels over one cyclic sub-period T_cycle. In the example shown in fig. 15, the desired phase-shift pattern is a bright-dark fringe waveform with a phase difference of π/2. At this time, the values of the four groups of pixels P_1~P_4 may be:

P_1 = Q when t = 0~π; = 0 when t = π~2π
P_2 = 0 when t = 0~π/2; = Q when t = π/2~3π/2; = 0 when t = 3π/2~2π
P_3 = 0 when t = 0~π; = Q when t = π~2π
P_4 = Q when t = 0~π/2; = 0 when t = π/2~3π/2; = Q when t = 3π/2~2π    (5)
Then, solving for Q_1~Q_4 based on equations (1) and (5), one optimized solution is obtained as follows:

Q_1 = A when t = 0~π/2; = 0 when t = π/2~2π
Q_2 = 0 when t = 0~π/2; = A when t = π/2~π; = 0 when t = π~2π
Q_3 = 0 when t = 0~π; = A when t = π~3π/2; = 0 when t = 3π/2~2π
Q_4 = 0 when t = 0~3π/2; = A when t = 3π/2~2π    (6)
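A numerical sketch (with Q normalized to A, i.e., O = 1) checking that solution (6) reproduces the bright-dark targets of equation (5) through the pairwise sums of equation (1):

```python
# Sketch: solution (6) makes each Q_i equal to A in exactly one quarter of the
# cycle; summing adjacent Q_i per equation (1) yields the half-cycle bright
# windows of equation (5).
import math

A = 1.0
TWO_PI = 2 * math.pi

def q6(t):
    """Piecewise solution (6): Q_i is A in quarter i of the cycle, else 0."""
    quarter = int(t / (math.pi / 2)) % 4
    return tuple(A if i == quarter else 0.0 for i in range(4))

def p5(t):
    """Targets of equation (5): each P_i is Q over one half of the cycle."""
    p1 = A if t < math.pi else 0.0
    p2 = A if math.pi / 2 <= t < 3 * math.pi / 2 else 0.0
    p3 = A if t >= math.pi else 0.0
    p4 = A if (t < math.pi / 2 or t >= 3 * math.pi / 2) else 0.0
    return p1, p2, p3, p4

ok = True
for k in range(400):
    t = TWO_PI * (k + 0.5) / 400  # sample away from the switching instants
    q1, q2, q3, q4 = q6(t)
    p1, p2, p3, p4 = p5(t)
    ok = ok and (q1 + q2, q2 + q3, q3 + q4, q4 + q1) == (p1, p2, p3, p4)
```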
where O × A = Q, and O is, for example, the number of exposure periods each pixel undergoes while being swept by the line light.
In the optimized solution based on equation (6), each cyclic sub-period T_cycle can be divided into 4 sub-periods T_1-T_4, which respectively correspond to the 0~π/2, π/2~π, π~3π/2 and 3π/2~2π portions of the 2π cyclic sub-period T_cycle, and the position of the bright region within the projection period t_p varies at intervals of 2π/4 phase.
In the example of N = 4, each cyclic sub-period T_cycle includes 4 sub-periods T_1-T_4. Figs. 16A-D show the relative relationship between the projected light waveform and the exposure periods of pixel groups 1-4 in sub-periods T_1-T_4.
As shown in figs. 16A-D, the laser is projected with a projection period t_p equal in duration to the exposure switching period t_e. Each laser projection period t_p can be considered to remain synchronized with the exposure switching period t_e of the first group of pixels (although the laser projection period t_p turns on at different phases in different sub-periods) and to switch at a 25% duty cycle (i.e., the bright region spans 2π/N); that is, the waveform is a rectangular wave with a 25% duty cycle. In the 4 sub-periods T_1-T_4, the projection on-times of the projected laser within each projection period t_p are aligned with the exposure on-times of pixel groups 1-4, respectively.
Specifically, first, as shown in fig. 16A, in sub-period T_1 the projected laser remains on during the first π/2 phase of each projection period t_p; the first and fourth groups of pixels are also on at this time, so the reflected projection light can be exposed within the first π/2 phase of each projection period t_p and charge can accumulate in the corresponding pixels, as indicated by the gray rectangles in the figure. After a predetermined m_1 projection periods are completed, sub-period T_1 ends and sub-period T_2 is entered, in which the projected laser remains on during the π/2~π phase of each projection period t_p.

As shown in fig. 16B, in sub-period T_2 the projected laser remains on during the π/2~π phase of each projection period t_p; the first and second groups of pixels are also on at this time, so the reflected projection light can be exposed within the π/2~π phase of each projection period t_p and charge can accumulate in the corresponding pixels, as indicated by the gray rectangles in the figure. After a predetermined m_2 projection periods are completed, sub-period T_2 ends and sub-period T_3 is entered, in which the projected laser remains on during the π~3π/2 phase of each projection period t_p.

As shown in fig. 16C, in sub-period T_3 the projected laser remains on during the π~3π/2 phase of each projection period t_p; the second and third groups of pixels are also on at this time, so the reflected projection light can be exposed within the π~3π/2 phase of each projection period t_p and charge can accumulate in the corresponding pixels, as indicated by the gray rectangles in the figure. After a predetermined m_3 projection periods are completed, sub-period T_3 ends and sub-period T_4 is entered, in which the projected laser remains on during the 3π/2~2π phase of each projection period t_p.

As shown in fig. 16D, in sub-period T_4 the projected laser remains on during the 3π/2~2π phase of each projection period t_p; the third and fourth groups of pixels are also on at this time, so the reflected projection light can be exposed within the 3π/2~2π phase of each projection period t_p and charge can accumulate in the corresponding pixels, as indicated by the gray rectangles in the figure. After a predetermined m_4 projection periods are completed, sub-period T_4 ends. At this point, one cyclic sub-period T_cycle is completed and sub-period T_1 of the next cyclic sub-period T_cycle is entered.
In a simple implementation, the number of projection periods in each sub-period can be made the same, i.e., m_1 = m_2 = m_3 = m_4, i.e., T_1-T_4 have equal durations; in this case, if the line light sweeps across the imaging plane at a uniform speed, a phase-shift map as shown in fig. 12 can be obtained. In summary, as shown in figs. 16A-D: in sub-period T_1, the on-interval of the projected laser falls within the exposure intervals of the first and fourth groups of pixels, so the first and fourth groups of pixels correspond to bright lines; in sub-period T_2, the on-interval of the projected laser falls within the exposure intervals of the first and second groups of pixels, so the first and second groups of pixels correspond to bright lines; in sub-period T_3, the on-interval of the projected laser falls within the exposure intervals of the second and third groups of pixels, so the second and third groups of pixels correspond to bright lines; and in sub-period T_4, the on-interval of the projected laser falls within the exposure intervals of the third and fourth groups of pixels, so the third and fourth groups of pixels correspond to bright lines. The above cyclic sub-period T_cycle is repeated up to a predetermined number of times, e.g., M times, thereby completing one scanning period T_scan. Fig. 14 shows patterns 1-4 obtained from pixel groups 1-4, respectively, upon completion of the scanning period T_scan. For example, in the example of fig. 14, M may be equal to 16, i.e., 16 cyclic sub-periods T_cycle, thereby obtaining a four-step phase-shift map of bright-dark fringes with a phase difference of π/2.
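The pairing described above can be expressed compactly: with the laser bright region occupying quarter i of the cycle in sub-period T_(i+1) and each group's 50%-duty exposure covering two adjacent quarters, the bright groups are:

```python
# Sketch: which 1-based pixel groups image a bright line in each sub-period
# T_1..T_4 (N = 4). Group g's 50%-duty exposure covers quarters (g-1) and
# (g mod 4) of the cycle; the laser bright region occupies quarter i in T_(i+1).
N = 4

def bright_groups(i: int):
    """Groups whose exposure window contains the laser's bright quarter i (0-based)."""
    return sorted(g for g in range(1, N + 1) if i in {(g - 1) % N, g % N})

pairing = [bright_groups(i) for i in range(N)]
# pairing mirrors the text: T_1 -> groups 1&4, T_2 -> 1&2, T_3 -> 2&3, T_4 -> 3&4
```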
Thus, when obtaining an N-step phase-shift map of bright-dark fringes, each cyclic sub-period T_cycle includes N sub-periods T_1-T_N. In each sub-period T_i, the line light undergoes light-dark variation with projection period t_p, where the projection period t_p has the same duration as the exposure switching period t_e and the projection period t_p comprises a bright region. Over the sub-periods T_1-T_N, the position of the bright region within the projection period t_p varies at intervals of 2π/N phase, so that when one pattern scan is completed in the scanning period T_scan, each of the N groups of pixels of the image sensor forms a different stripe pattern, with a 2π/N phase shift between the N stripe patterns. Thereby, with a single sweep of the line light, for example from left to right (corresponding to one scanning period T_scan), a set of N-step phase-shift patterns can be obtained directly from the N groups of pixels of the image sensor.
The dwell time t_c of the line light scan on each column of pixels is not less than the scanning period T_scan divided by the number of columns C. To ensure adequate exposure, the dwell time t_c is more than 10 times the projection period t_p; preferably, the dwell time t_c is more than 50 times the projection period t_p. For example, assume the scanning period T_scan is 3.84 ms (i.e., a frame rate of 1000/3.84 ≈ 260 frames/s). In the case of an image sensor comprising 1920 columns of pixels (i.e., 960 columns of pixel units in the case of the pixel unit shown in fig. 9), the time each pixel is swept by the line light, i.e., the dwell time t_c, can be equal to the scanning period T_scan divided by the number of columns C, i.e., 3.84 ms/1920 = 2 μs; or, taking into account that the line light has a certain width, the dwell time t_c is not less than 2 μs. With a line light projection period t_p of 20 ns at a 25% duty cycle, each pixel column can complete 100 exposure turn-ons within the 2 μs it is swept by the line light, and the two groups of pixels whose exposure corresponds to the bright-region position of the current projection period t_p can achieve an exposure of duration 5 ns × 100 = 0.5 μs. In addition, since the time the line light takes to scan across a column of pixels is sufficiently long compared with the exposure period of the pixels (e.g., 100 times as long in the above example), the line light can be approximately regarded, for the currently illuminated pixel column, as a light source whose position does not shift during that 2 μs.
While the dwell time t_c on each pixel column is not less than 2 μs, if the pixel units have the 2×2 four-pixel structure shown in fig. 9, each pixel unit is swept by the line light for not less than 4 μs. To achieve a four-step phase shift, the duration of each sub-period T_i must be not less than the time the line light takes to sweep across a column of pixel units. When the 1920-column image sensor of the above example is used to acquire the four-step phase-shift map with 32 stripes (16 bright stripes and 16 dark stripes) shown in fig. 10, each stripe covers 60 pixel columns (1920/32 = 60), i.e., 30 pixel-unit columns. As shown in fig. 9, since each sub-period T_i corresponds to half a stripe, it covers 30 pixel columns, i.e., 15 pixel-unit columns, for a duration of 2 μs × 30 = 60 μs. Thus in this example, m_1 = m_2 = m_3 = m_4 = m = 60 μs/20 ns = 3000. Since the line light projection period t_p has the same duration as the exposure switching period t_e of each pixel group, each corresponding group of pixels has likewise switched 3000 times by the time the line light has scanned across the distance of half a stripe.
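The sub-period sizing arithmetic of this example, in integers (assumed values as above):

```python
# Arithmetic sketch of the sub-period sizing above.
COLUMNS = 1920     # pixel columns
STRIPES = 32       # 16 bright + 16 dark stripes in the phase-shift map
T_P_NS = 20        # projection period t_p, ns
DWELL_NS = 2000    # line light dwell per pixel column (2 us), ns

cols_per_stripe = COLUMNS // STRIPES          # pixel columns per stripe
cols_per_subperiod = cols_per_stripe // 2     # each sub-period T_i spans half a stripe
subperiod_ns = cols_per_subperiod * DWELL_NS  # duration of one sub-period
m = subperiod_ns // T_P_NS                    # projection periods per sub-period
```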
The line light projected as described above in connection with figs. 15-17 has the projection period t_p and a bright region of 2π/N phase. In this example, the projected line light always has a 100/N% duty cycle within the projection period t_p (in the above example of N = 4, a 25% duty cycle, corresponding to a bright region of 2π/N phase), and the projected light intensity of the bright regions in different projection periods t_p is preferably the same rectangular wave. In each sub-period T_i, the waveform projected by the line light within each projection period t_p is the same: a rectangular wave with a bright region of 2π/N phase and a dark region of (N−1)·2π/N phase, i.e., 6π/N for N = 4 (e.g., within each projection period t_p the laser is on for t_p/N of the time and off for (N−1)·t_p/N of the time). Between successive sub-periods, the projection period of the line light and the proportion of bright and dark regions are unchanged; only the position of the bright region changes. This is repeated within each cyclic sub-period T_cycle making up the scanning period T_scan. Thus, the resulting set of N-step phase-shift patterns is a stripe pattern with distinct, repeatedly alternating bright and dark regions as shown in fig. 17.
The above embodiment, which obtains the bright-dark stripes shown in fig. 17 from a rectangular wave of constant brightness with a duty cycle of 100/N %, can be regarded as a special case of the imaging scheme of the present invention in which the light intensity within each 2π/N phase of the line-light projection period t_p is individually adjustable, combined with exposure at 2π/N phase differences with an exposure period likewise equal to t_p, to generate a set of N-step phase shift patterns.
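The bright-region placement across sub-periods can be sketched as follows. This is an illustrative discretization (not taken verbatim from the patent): one projection period t_p is split into N phase slots of width 2π/N, and in sub-period T_i the single bright slot sits at position i, so the bright region shifts by a 2π/N phase between consecutive sub-periods while the duty cycle stays at 100/N %.

```python
# Illustrative sketch: on/off slot pattern of one projection period t_p
# for each of the N sub-periods T_1..T_N. Each period has exactly one
# bright slot (duty cycle 100/N %), shifted by one slot (2*pi/N phase)
# between consecutive sub-periods.
def subperiod_waveforms(n):
    """Return n waveforms; waveform i is the slot pattern used throughout
    sub-period T_(i+1): a 1 in slot i (laser on), 0 elsewhere (laser off)."""
    return [[1 if slot == i else 0 for slot in range(n)] for i in range(n)]

for i, w in enumerate(subperiod_waveforms(4)):
    print(f"T_{i + 1}: {w}")
# T_1: [1, 0, 0, 0] ... T_4: [0, 0, 0, 1]
```

With N = 4 this matches the worked example: the laser is on for t_p/4 and off for 3t_p/4 in every projection period, and only the position of the on-slot changes from one sub-period to the next.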
The projection and imaging scheme used by the two measuring heads of the depth data measuring device of the present invention can be used in a monocular scheme (i.e., a scheme equipped with one image sensor) or a binocular scheme (i.e., a scheme equipped with two image sensors fixed in relative position for synchronous imaging). When the image sensor includes a first image sensor and a second image sensor fixed in relative position, the first image sensor and the second image sensor may each include the N groups of pixels and be exposed in synchronization with each other. In other words, in the binocular scheme, when one pattern scan is completed within the scan period, the first image sensor acquires N fringe images and the second image sensor acquires N fringe images, and these 2N fringe images can be used to perform one depth data determination.
Further, in order to achieve scanning projection, the projection device of the measuring head includes: a light emitting device for generating line light; and a reflection device for reflecting the line light, which projects line light moving in a direction perpendicular to the stripe direction toward the photographing region at a predetermined frequency, the length direction of the line light being the length direction of the projected stripes, the reflection device comprising one of: a mechanical galvanometer mirror reciprocally vibrating at the predetermined frequency; a micromirror device reciprocating at the predetermined frequency; and a mechanical rotating mirror rotating unidirectionally at the predetermined frequency. Here, the projected line light may be line light having a higher-order Gaussian or flat-top Gaussian distribution, thereby providing a highly uniform brightness distribution in the width direction of the line light.
The invention also discloses a measuring device using the measuring head. In particular, a depth data measuring device may comprise a depth data measuring head as described above, and a processor connected to the depth data measuring head for obtaining a depth map of the imaging region from the N fringe patterns obtained when one pattern scan is completed within said scan period. In the binocular solution, the processor may determine depth data of the photographed object in the photographing region from the predetermined relative positions of the first and second image sensors and the N first two-dimensional image frames and N second two-dimensional image frames obtained by imaging the structured light. In various embodiments, the measuring head may have a relatively independent package, or may be packaged together with the processor in the measuring device.
Fig. 18 shows a schematic diagram of a depth data measurement device according to an embodiment of the invention. As shown, the measurement device 1800 may include two measurement heads 1810 and 1820 as described above, as well as a processor 1830. The measurement head 1810 includes a projection device 1811 and an image sensor 1812. The measurement head 1820 includes a projection device 1821 and an image sensor 1822.
The processor 1830 is coupled to the two measuring heads, i.e., to the projection device 1811 and image sensor 1812, and to the projection device 1821 and image sensor 1822. The projection device 1811 may perform one scan projection as described above under the control of the processor 1830, with the N groups of pixels of the image sensor 1812 imaged accordingly, thereby obtaining a set of N-step phase-shifted images after the scan projection. The projection device 1821 may then perform one scan projection as described above under the control of the processor 1830, with the N groups of pixels of the image sensor 1822 imaged accordingly, likewise obtaining a set of N-step phase-shifted images. The two sets of N-step phase-shifted images may each be used to synthesize one depth map of the imaging subject, and the two depth maps may be stitched based on the positional relationship of the two measurement heads, thereby acquiring more comprehensive data of the imaging subject.
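The stitching step can be sketched as follows. This is a hypothetical helper, not the patent's own implementation: it assumes the calibrated rigid transform (R, t) mapping points from head 1820's coordinate frame into head 1810's frame is known from the fixed relative position of the two heads, and that each head's depth map has already been converted to a 3D point cloud. All names and array shapes are illustrative.

```python
# Hypothetical sketch of merging the two heads' reconstructions, assuming a
# known rigid transform (R, t) from head 2's frame to head 1's frame.
import numpy as np

def merge_point_clouds(pts_head1, pts_head2, R, t):
    """pts_head1, pts_head2: (M, 3) arrays of 3D points from the two heads.
    R: (3, 3) rotation, t: (3,) translation of the head-2-to-head-1 transform.
    Expresses head 2's points in head 1's frame and concatenates both clouds."""
    pts2_in_1 = pts_head2 @ R.T + t   # rigid transform applied row-wise
    return np.vstack([pts_head1, pts2_in_1])
```

With R the identity and t zero (heads hypothetically coincident), the function simply concatenates the two clouds unchanged; with a real calibration it places both views in one common frame.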
Fig. 19 shows a schematic flow chart of a depth data measurement method according to an embodiment of the invention. The method may be implemented by the depth data measuring apparatus of the present invention.
In step S1910, a first pattern scan is performed to an imaging region using a first depth imaging measurement head of a depth data measurement device to obtain a first set of N-step phase shift patterns.
In step S1920, a second pattern scan is performed to the imaging region using a second depth imaging measurement head of the depth data measurement device to obtain a second set of N-step phase shift patterns.
In step S1930, depth information of an imaging subject within the imaging region is synthesized from the first set of N-step phase shift patterns and the second set of N-step phase shift patterns based on relative positions of the first depth imaging measurement head and the second depth imaging measurement head.
The scanning imaging of the first depth imaging measurement head and the second depth imaging measurement head may each comprise: projecting line light moving along a first direction toward the imaging region, the length direction of the line light being a second direction perpendicular to the first direction, wherein the projected line light completes one pattern scan within a scan period, the scan period comprising a plurality of cyclic sub-periods; within each cyclic sub-period, the line light undergoes brightness changes with a projection period t_p, the projection period t_p having the same duration as the exposure switching period t_e, and the projection period t_p comprising N waveform projection regions of width 2π/N whose projected light intensities are encoded, N being an integer greater than 1; photographing the imaging region using an image sensor comprising N groups of pixels uniformly distributed on the imaging surface to obtain N image frames under the line-light scanning projection, wherein the groups of pixels are exposed with exposure switching periods t_e spaced 2π/N in phase from one another; and obtaining depth data of the measured object in the imaging region based on the image frames. The projected light intensity of each waveform projection region is encoded such that, when one pattern scan is completed within the scan period, the N groups of pixels of the image sensor each image a different fringe pattern, and the N fringe patterns form a set of N-step phase shift patterns with a 2π/N phase shift relative to one another.
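Once a set of four-step (N = 4) phase shift patterns is acquired, the wrapped phase at each pixel can be recovered by the textbook four-step relation (a standard structured-light formula, not quoted from the patent): with fringe intensities I_k = A + B·cos(φ + (k−1)·π/2), the wrapped phase is φ = atan2(I4 − I2, I1 − I3). A minimal per-pixel sketch:

```python
# Standard four-step phase-shift recovery (textbook relation; the patent's own
# decoding details are not reproduced here). I_k = A + B*cos(phi + (k-1)*pi/2).
import math

def wrapped_phase(i1, i2, i3, i4):
    """Wrapped phase phi in (-pi, pi] from four phase-shifted intensities."""
    return math.atan2(i4 - i2, i1 - i3)

# Check against synthetic intensities for a known phase:
phi_true = 0.7
I = [1.0 + 0.5 * math.cos(phi_true + k * math.pi / 2) for k in range(4)]
print(round(wrapped_phase(*I), 6))  # recovers 0.7
```

The recovered wrapped phase is then unwrapped (e.g., with the Gray code patterns of claim 7) and converted to depth via triangulation using the calibrated projector-sensor geometry.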
Steps S1910 to S1930 may be repeated as above, for example for dynamic imaging of an imaging subject in motion. In this case, the depth data measuring apparatus of the present invention may remain stationary.
In another embodiment, the depth data measuring device of the present invention may instead actively change position. As previously shown in fig. 8, the depth data measuring apparatus of the present invention may in particular be implemented as a hand-held device comprising a handle 831. In such an implementation, the depth data measurement device captures a plurality of first and second sets of N-step phase shift patterns while in relative motion with respect to the imaging subject, and the depth information generated from these sets of patterns may be synthesized into model information of the imaging subject based on calibration points.
The depth data measuring apparatus and method according to the present invention have been described in detail above with reference to the accompanying drawings. The depth data measuring device comprises two measuring heads fixed in relative position and imaged in succession, each with an image sensor whose different groups of pixels can expose the projected phase-shifted line light with phase-shifted timing, so that within a single scan of the line light the different pixel groups of the image sensor each acquire a fringe image of a different phase shift, thereby realizing the acquisition of multiple fringe images from a single line-light scan.
The two measuring heads can photograph the same imaging region at high speed from different angles, and the device can in particular be implemented as a hand-held modeling device used in combination with calibration points.
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems and methods according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The foregoing description of embodiments of the invention has been presented for purposes of illustration and description, and is not intended to be exhaustive or limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the various embodiments described. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or the improvement of technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (14)

1. A depth data measurement device comprising a first depth imaging measurement head and a second depth imaging measurement head fixed in relative position, and each comprising:
a projection device for projecting line-type light moving in a first direction toward an imaging region, wherein a length direction of the line-type light is a second direction perpendicular to the first direction;
an image sensor comprising N groups of pixels uniformly distributed on an imaging surface, the groups of pixels being exposed with exposure switching periods t_e spaced 2π/N in phase from one another, wherein N is an integer greater than 1,
wherein the projection device is configured to complete one pattern scan within a scan period, the scan period comprising a plurality of cyclic sub-periods; within each cyclic sub-period, the line light undergoes brightness changes with a projection period t_p, the projection period t_p having the same duration as the exposure switching period t_e, and the projection period t_p comprising N waveform projection regions of width 2π/N, the projected light intensity of each waveform projection region being encoded such that, when one pattern scan is completed within said scan period, the N groups of pixels of the image sensor each image a different fringe pattern, and the N fringe patterns form a set of N-step phase shift patterns with a 2π/N phase shift relative to one another,
after the first depth imaging measurement head completes a first pattern scan, the second depth imaging measurement head performs a second pattern scan, and the first set of N-step phase shift patterns obtained by the first pattern scan and the second set of N-step phase shift patterns obtained by the second pattern scan are synthesized, based on the relative position, into depth information of the imaging subject in the imaging region.
2. The apparatus of claim 1, wherein the image sensor comprises a plurality of pixel cells, each pixel cell comprising one pixel each belonging to N groups of pixels.
3. The apparatus of claim 1, wherein each waveform projection region corresponds either to a projected rectangular wave of width 2π/N or to zero, and the light intensity of each waveform projection region is determined based on the values of the set of N-step phase shift patterns corresponding to one cyclic sub-period.
4. The apparatus of claim 3, wherein the set of N-step phase shift patterns is a sinusoidal four-step phase shift pattern, and the light intensity of each waveform projection region in each projection period t_p, derived based on the exposure of the N groups of pixels to the waveform projection regions, is not less than zero.
5. The device of claim 1, wherein the dwell time t_c of the line-light scan on each pixel column is not less than the scan period divided by the number of columns C, and the dwell time t_c is more than 10 times the projection period t_p.
6. The apparatus of claim 5, wherein, in each sub-period T_i, the line light is projected m times with the projection period t_p, and the duration of each sub-period T_i is longer than the dwell time t_c.
7. The apparatus of claim 1, wherein the projection device completes one pattern scan within a first scan period, so that the N groups of pixels of the image sensor each image a different stripe pattern, the N stripe patterns forming a set of Gray code patterns;

the projection device completes one pattern scan within a second scan period, so that the N groups of pixels of the image sensor each image a different fringe pattern, the N fringe patterns forming the set of N-step phase shift patterns,
wherein a depth map of the imaging region is generated from the set of N-step phase shift patterns based on the set of gray code patterns.
8. The apparatus of claim 1, wherein the projection device comprises:
a light emitting device for generating linear light; and
a reflection device for reflecting line-type light to project line-type light moving in a direction perpendicular to the stripe direction toward a photographing region at a predetermined frequency, a length direction of the line-type light being a length direction of the projected stripe, the reflection device comprising one of:
a mechanical vibrating mirror reciprocally vibrating at the predetermined frequency;
a micromirror device reciprocating at a predetermined frequency; and
a mechanically rotating mirror that rotates unidirectionally at a predetermined frequency.
9. The apparatus of claim 1, wherein the image sensor comprises a first image sensor and a second image sensor that are fixed in relative position, wherein the first image sensor and the second image sensor each comprise the N sets of pixels and are exposed in synchronization with each other.
10. The apparatus of claim 1, wherein the projection device is configured to complete α pattern scans within α scan periods, each scan period comprising a plurality of cyclic sub-periods, and each cyclic sub-period comprising N sub-periods T_1 to T_N; in each sub-period T_i, the line light performs bright-dark projection with the projection period t_p, the projection period t_p having the same duration as the exposure switching period t_e and comprising one bright region, wherein over the sub-periods T_1 to T_N the position of the bright region within the projection period t_p changes in steps of 2π/αN phase, so that each time one pattern scan is completed within a scan period, the N groups of pixels of the image sensor each image a different stripe pattern, and when the α pattern scans are completed over the α scan periods, the αN stripe patterns form a set of αN-step phase shift patterns with a 2π/αN phase shift relative to one another, α being an integer greater than or equal to 2.
11. The apparatus of claim 1, wherein each cyclic sub-period comprises N sub-periods T_1 to T_N; in each sub-period T_i, the line light performs bright-dark projection with the projection period t_p, the projection period t_p having the same duration as the exposure switching period t_e and comprising one bright region, wherein over the sub-periods T_1 to T_N the position of the bright region within the projection period t_p changes in steps of 2π/N phase, so that when one pattern scan is completed within said scan period, the N groups of pixels of the image sensor each image a different stripe pattern, and the N stripe patterns form a set of bright-dark-stripe N-step phase shift patterns with a 2π/N phase shift relative to one another.
12. The apparatus of claim 1, wherein the depth data measurement apparatus is an imaging object modeling apparatus that performs continuous shooting with respect to an imaging object with relative motion, and synthesizes depth information of the imaging object into model information based on a calibration point.
13. A depth data measurement method, comprising:
performing a first pattern scan to an imaging region using a first depth imaging measurement head of a depth data measurement device to obtain a first set of N-step phase shift patterns;
performing a second pattern scan to the imaging region using a second depth imaging measurement head of the depth data measurement device to obtain a second set of N-step phase shift patterns;
synthesizing depth information of an imaging subject within the imaging region from the first set of N-step phase shift patterns and the second set of N-step phase shift patterns based on the relative positions of the first depth imaging measurement head and the second depth imaging measurement head,
wherein the scanning imaging of the first depth imaging measurement head and the second depth imaging measurement head each comprises:
projecting line light moving along a first direction toward the imaging region, the length direction of the line light being a second direction perpendicular to the first direction, wherein the projected line light completes one pattern scan within a scan period, the scan period comprising a plurality of cyclic sub-periods; within each cyclic sub-period, the line light undergoes brightness changes with a projection period t_p, the projection period t_p having the same duration as the exposure switching period t_e, and the projection period t_p comprising N waveform projection regions of width 2π/N, the projected light intensity of each waveform projection region being encoded, N being an integer greater than 1;
photographing the imaging region using an image sensor comprising N groups of pixels uniformly distributed on the imaging surface to obtain N image frames under the line-light scanning projection, wherein the groups of pixels are exposed with exposure switching periods t_e spaced 2π/N in phase from one another; and
obtaining depth data of the measured object in the imaging region based on the image frames,
wherein the projected light intensity of each waveform projection region is encoded such that, when one pattern scan is completed within the scan period, the N groups of pixels of the image sensor each image a different fringe pattern, and the N fringe patterns form a set of N-step phase shift patterns with a 2π/N phase shift relative to one another.
14. The method of claim 13, further comprising:
the depth data measuring device captures a plurality of first sets of N-step phase shift patterns and a plurality of second sets of N-step phase shift patterns while in relative motion with respect to the imaging subject; and
depth information generated from the plurality of first and second sets of N-step phase shift patterns is synthesized into model information of the imaging subject based on the calibration points.
CN202210542889.0A 2022-05-18 2022-05-18 Depth data measuring apparatus and method Pending CN117128885A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210542889.0A CN117128885A (en) 2022-05-18 2022-05-18 Depth data measuring apparatus and method
PCT/CN2023/100797 WO2023222139A1 (en) 2022-05-18 2023-06-16 Depth data measuring head, measuring apparatus, and measuring method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210542889.0A CN117128885A (en) 2022-05-18 2022-05-18 Depth data measuring apparatus and method

Publications (1)

Publication Number Publication Date
CN117128885A true CN117128885A (en) 2023-11-28

Family

ID=88858714

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210542889.0A Pending CN117128885A (en) 2022-05-18 2022-05-18 Depth data measuring apparatus and method

Country Status (1)

Country Link
CN (1) CN117128885A (en)

Similar Documents

Publication Publication Date Title
US11105617B2 (en) Hybrid light measurement method for measuring three-dimensional profile
CN107923737B (en) Method and apparatus for superpixel modulation and ambient light rejection
JP7224708B2 (en) Depth data measuring head, measuring device and measuring method
CN111829449B (en) Depth data measuring head, measuring device and measuring method
US20140085426A1 (en) Structured light systems with static spatial light modulators
CN105806259B (en) A kind of method for three-dimensional measurement based on the projection of two-value grating defocus
JP7371443B2 (en) 3D measuring device
KR101824888B1 (en) Three dimensional shape measuring apparatus and measuring methode thereof
CN109581360A (en) Device and method for light detection and ranging
KR20170027776A (en) Method and system for adjusting light pattern for structured light imaging
JP2002257528A (en) Three-dimensional shape measuring device by phase shift method
CN110291359A (en) Three-dimensional measuring apparatus
JP7409443B2 (en) Imaging device
CN112712585B (en) Three-dimensional imaging system and method based on arc binary coding phase shift fringe projection
CN111692987A (en) Depth data measuring head, measuring device and measuring method
CN108668127B (en) Imaging device time for exposure test device
JP6714665B2 (en) Device and method for light detection and ranging
CN117128885A (en) Depth data measuring apparatus and method
CN112019773B (en) Depth data measuring head, measuring device and method
CN117128891A (en) Depth data measuring head, measuring device and measuring method
CN117128890A (en) Depth data measuring head, measuring device and measuring method
CN115655153B (en) Light source modulation method, MEMS scanning 3D imaging system and imaging method thereof
CN111982022A (en) Spatial structure detection method and system
CN117989997A (en) Depth data measuring head, measuring device and measuring method
WO2023222139A1 (en) Depth data measuring head, measuring apparatus, and measuring method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination