CA2475391A1 - Optical 3d digitizer with enlarged non-ambiguity zone - Google Patents
Info
- Publication number
- CA2475391A1 CA002475391A CA2475391A
- Authority
- CA
- Canada
- Prior art keywords
- camera
- images
- cameras
- related functions
- projector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Links
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2545—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with one projection direction and several detection directions, e.g. stereo
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/50—Depth or shape recovery
- G06T7/521—Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
An optical 3D digitizer with an enlarged non-ambiguity zone comprises a structured light projector for projecting a fringe pattern over a target area, the fringe pattern having a shiftable position over the target area. First and second cameras having overlapping measurement fields are directed toward the target area and positioned with respect to the projector to define distinct triangulation planes therewith. The second camera has a larger non-ambiguity depth than the first camera. A computer evaluates a same set of camera-projector related functions from images captured by the cameras including the projected pattern at shifted positions. From these functions it builds a low depth resolution 3D model with respect to the second camera and a degenerated 3D model with respect to the first camera, determines chromatic texture from the images, and builds a complete textured 3D model from data corresponding between the low depth resolution and degenerated 3D models within a tolerance range.
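The core step in the abstract is resolving the fringe-order ambiguity of the fine but degenerated measurement using the coarse measurement whose non-ambiguity depth covers the whole working volume. A minimal sketch of that combination follows, assuming a wrapped phase map from the first camera and an absolute depth map from the second camera have already been computed and registered to a common pixel grid; the function and variable names are illustrative rather than taken from the patent.

```python
import numpy as np

def combine_coarse_and_fine(phase_fine, depth_coarse, period_fine, tolerance):
    """Resolve the fringe order of a fine but ambiguous measurement using a
    coarse measurement whose non-ambiguity depth spans the whole target.

    phase_fine   -- wrapped phase map from the first camera, values in [0, 2*pi)
    depth_coarse -- absolute depth map from the second (low depth resolution) camera
    period_fine  -- depth spanned by one 2*pi fringe period of the first camera
    tolerance    -- acceptance range for agreement between the two models
    """
    # Depth known only up to an integer number of fringe periods (the "degenerated" model).
    depth_wrapped = phase_fine / (2.0 * np.pi) * period_fine

    # Choose the fringe order that brings the fine depth closest to the coarse depth.
    order = np.round((depth_coarse - depth_wrapped) / period_fine)
    depth_abs = depth_wrapped + order * period_fine

    # Keep only pixels where the two models correspond within the tolerance range.
    valid = np.abs(depth_abs - depth_coarse) <= tolerance
    return np.where(valid, depth_abs, np.nan)
```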
Claims (25)
1. An optical 3D digitizer with an enlarged non-ambiguity zone, comprising:
at least one structured light projector for projecting a fringe pattern over a target area, the fringe pattern having a shiftable position over the target area;
a first camera directed toward the target area and positioned with respect to said at least one structured light projector to define a first triangulation plane therewith;
a second camera directed toward the target area and positioned with respect to said at least one structured light projector to define a second triangulation plane therewith, the second triangulation plane being distinct from the first triangulation plane, the first and second cameras having at least partially overlapping measurement fields, the second camera having a larger non-ambiguity depth than the first camera; and
a computer means connected to the cameras, for performing an image processing of images captured by the cameras, the image processing including evaluating a same set of camera-projector related functions from images including the pattern projected by said at least one structured light projector at shifted positions as captured by the cameras, building a low depth resolution 3D model from the camera-projector related functions evaluated with respect to the second camera, building a degenerated 3D model from the camera-projector related functions evaluated with respect to the first camera, determining chromatic texture from the images captured by the cameras, and building a complete textured 3D model from data corresponding between the low depth resolution and degenerated 3D models within a tolerance range.
2. The optical 3D digitizer according to claim 1, wherein the fringe pattern has a periodic sinusoidal section profile.
3. The optical 3D digitizer according to claim 1, wherein the set of camera-projector related functions comprises relations determined from phase shifting algorithms used in phase shifting interferometry.
4. The optical 3D digitizer according to claim 1, wherein the set of camera-projector related functions comprises relations determined from spatial phase shifting algorithms relying on a single fringe image in temporal phase shifting interferometry.
5. The optical 3D digitizer according to claim 1, wherein the images captured by the cameras including the pattern projected by said at least one structured light projector are written as:
I_n(i,j) = I_Ave(i,j) + I_Mod(i,j)[1 + cos(φ(i,j) + m(i,j)·2π + (n-1)·α)]
where I represents one of the images, n represents a shift position of the pattern, i,j represent pixel coordinates in said one of the images, I_Ave(i,j) represents a local intensity diffused or reflected back to the camera which captured said one of the images, including surrounding ambient light and local temporal average intensity of the pattern, I_Mod(i,j) represents a local amplitude modulation of the pattern, φ(i,j) represents a local phase function wrapped over 2π range, m(i,j) represents a local order of the phase function, and α represents a 90° phase shift.
6. The optical 3D digitizer according to claim 1, wherein the camera-projector related functions comprise a phase function, a phase-shift function, an amplitude modulation function and an average function.
7. The optical 3D digitizer according to claim 6, wherein the computer means evaluates the chromatic texture from the amplitude modulation function and the average function using separate color channels.
8. The optical 3D digitizer according to claim 6, wherein the computer means evaluates the set of camera-projector related functions using:
I_Ave(i,j) = (I_1(i,j) + I_2(i,j) + I_3(i,j) + I_4(i,j))/4
where I represents one of the images, n represents a shift position of the pattern, i,j represent pixel coordinates in said one of the images, I_Ave(i,j) represents a local intensity diffused or reflected back to the camera which captured said one of the images, including surrounding ambient light and local temporal average intensity of the pattern, I_Mod(i,j) represents a local amplitude modulation of the pattern, φ(i,j) represents a local phase function wrapped over 2π range, and α represents a phase shift;
and wherein a local order of the phase function m(i,j) is evaluated directly.
9. The optical 3D digitizer according to claim 8, wherein the computer means builds the low depth resolution and degenerated 3D models using look-up tables.
10. The optical 3D digitizer according to claim 1, wherein the image processing includes comparing the camera-projector related functions of the cameras and pixel coordinates in the images captured by the cameras, determining rejection of pixels without correspondences if a field of view of the second camera is entirely covered by the first camera, attaching pixels showing acceptable functions to surrounding pixels by local phase unwrapping, and extracting 3D coordinates forming the complete textured 3D model using table data for building the low depth resolution 3D model applied to corrected order values and current phase values.
11. An optical 3D digitizing method with an enlarged non-ambiguity zone, comprising:
controllably projecting a fringe pattern over a target area using at least one structured light projector, the fringe pattern having a shiftable position over the target area;
positioning a first camera directed toward the target area with respect to said at least one structured light projector to define a first triangulation plane therewith;
positioning a second camera directed toward the target area with respect to said at least one structured light projector to define a second triangulation plane therewith, the second triangulation plane being distinct from the first triangulation plane, the first and second cameras having at least partially overlapping measurement fields, the second camera having a larger non-ambiguity depth than the first camera; and performing an image processing of images captured by the cameras, the image processing including evaluating a same set of camera-projector related functions from images including the pattern projected by said at least one structured light projector at shifted positions as captured by the cameras, building a low depth resolution 3D model from the camera-projector related functions evaluated with respect to the second camera, building a degenerated 3D model from the camera-projector related functions evaluated with respect to the first camera, determining chromatic texture from the images captured by the cameras, and building a complete textured 3D model from data corresponding between the low depth resolution and degenerated 3D models within a tolerance range.
12. The optical 3D digitizing method according to claim 11, wherein the fringe pattern has a periodic sinusoidal section profile.
13. The optical 3D digitizing method according to claim 11, wherein the set of camera-projector related functions comprises relations determined from phase shifting algorithms used in phase shifting interferometry.
14. The optical 3D digitizing method according to claim 11, wherein the set of camera-projector related functions comprises relations determined from spatial phase shifting algorithms relying on a single fringe image in temporal phase shifting interferometry.
15. The optical 3D digitizing method according to claim 11, wherein the images captured by the cameras including the pattern projected by said at least one structured light projector are written as:
I_n(i,j) = I_Ave(i,j) + I_Mod(i,j)[1 + cos(φ(i,j) + m(i,j)·2π + (n-1)·α)]
where I represents one of the images, n represents a shift position of the pattern, i,j represent pixel coordinates in said one of the images, I_Ave(i,j) represents a local intensity diffused or reflected back to the camera which captured said one of the images, including surrounding ambient light and local temporal average intensity of the pattern, I_Mod(i,j) represents a local amplitude modulation of the pattern, φ(i,j) represents a local phase function wrapped over 2π range, m(i,j) represents a local order of the phase function, and α represents a 90° phase shift.
16. The optical 3D digitizing method according to claim 11, wherein the camera-projector related functions comprise a phase function, a phase-shift function, an amplitude modulation function and an average function.
17. The optical 3D digitizing method according to claim 16, wherein the chromatic texture is evaluated from the amplitude modulation function and the average function using separate color channels.
18. The optical 3D digitizing method according to claim 16, wherein the camera-projector related functions are evaluated using:
where I represents one of the images, n represents a shift position of the pattern, i,j represent pixel coordinates in said one of the images, I_Ave(i,j) represents a local intensity diffused or reflected back to the camera which captured said one of the images, including surrounding ambient light and local temporal average intensity of the pattern, I_Mod(i,j) represents a local amplitude modulation of the pattern, φ(i,j) represents a local phase function wrapped over 2π range, and α represents a phase shift;
and wherein a local order of the phase function m(i,j) is evaluated directly.
19. The optical 3D digitizing method according to claim 18, wherein the low depth resolution and degenerated 3D models are built using look-up tables.
20. The optical 3D digitizing method according to claim 11, wherein the image processing includes comparing the camera-projector related functions of the cameras and pixel coordinates in the images captured by the cameras, determining rejection of pixels without correspondences if a field of view of the second camera is entirely covered by the first camera, attaching pixels showing acceptable functions to surrounding pixels by local phase unwrapping, and extracting 3D coordinates forming the complete textured 3D model using table data for building the low depth resolution 3D model applied to corrected order values and current phase values.
21. A computer apparatus for performing an image processing of images captured by first and second cameras, the second camera having a larger non-ambiguity depth than the first camera, comprising:
means for evaluating a same set of camera-projector related functions from images captured by the cameras, at least some of the images including a pattern projected at shifted positions;
means for building a low depth resolution 3D model from the camera-projector related functions evaluated with respect to the second camera;
means for building a degenerated 3D model from the camera-projector related functions evaluated with respect to the first camera;
means for determining chromatic texture from the images captured by the cameras; and means for building a complete textured 3D model from data corresponding between the low depth resolution and degenerated 3D models within a tolerance range.
22. A computer readable medium having recorded thereon statements and instructions for execution by a computer to perform an image processing of images captured by first and second cameras directed toward a target area, the second camera having a larger non-ambiguity depth than the first camera, the image processing including evaluating a same set of camera-projector related functions from the images captured by the cameras, at least some of the images including a pattern projected at shifted positions, building a low depth resolution 3D model from the camera-projector related functions evaluated with respect to the second camera, building a degenerated 3D model from the camera-projector related functions evaluated with respect to the first camera, determining chromatic texture from the images captured by the cameras, and building a complete textured 3D model from data corresponding between the low depth resolution and degenerated 3D models within a tolerance range.
23. A computer program product, comprising a memory having computer readable code embodied therein, for execution by a CPU, for performing an image processing of images captured by first and second cameras directed toward a target area, the second camera having a larger non-ambiguity depth than the first camera, said code comprising:
code means for evaluating a same set of camera-projector related functions from the images captured by the cameras, at least some of the images including a pattern projected at shifted positions;
code means for building a low depth resolution 3D model from the camera-projector related functions evaluated with respect to the second camera;
code means for building a degenerated 3D model from the camera-projector related functions evaluated with respect to the first camera;
code means for determining chromatic texture from the images captured by the cameras; and code means for building a complete textured 3D model from data corresponding between the low depth resolution and degenerated 3D models within a tolerance range.
24. A carrier wave embodying a computer data signal representing sequences of statements and instructions which, when executed by a processor, cause the processor to perform an image processing of images captured by first and second cameras directed toward a target area, the second camera having a larger non-ambiguity depth than the first camera, the statements and instructions comprising:
evaluating a same set of camera-projector related functions from the images captured by the cameras, at least some of the images including a pattern projected at shifted positions;
building a low depth resolution 3D model from the camera-projector related functions evaluated with respect to the second camera;
building a degenerated 3D model from the camera-projector related functions evaluated with respect to the first camera;
determining chromatic texture from the images captured by the cameras;
and building a complete textured 3D model from data corresponding between the low depth resolution and degenerated 3D models within a tolerance range.
25. An optical 3D digitizing method with an enlarged non-ambiguity zone, comprising:
controllably projecting a fringe pattern having a shiftable position over a target area;
capturing images obtained by high depth resolution sensing and low depth resolution sensing from respective measurement fields at least partially overlapping each other over the target area;
determining absolute pixel 3D positions in the images obtained by low depth resolution sensing and high depth resolution sensing as a function of relations depending on the fringe pattern in the captured images and correspondence between the absolute pixel 3D positions in the images;
extracting chromatic texture from the captured images; and building a complete textured 3D model from the absolute pixel 3D positions and the chromatic texture.
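To make the fringe model of claims 5 and 15 concrete, the sketch below synthesizes fringe images with the stated intensity relation and then recovers the camera-projector related functions named in claims 6 and 8. The average is the claim 8 relation; the phase and modulation formulas are the standard four-step (α = 90°) phase-shifting relations, which the claims do not reproduce, so they are stated here as an assumption.

```python
import numpy as np

def synthesize_fringes(I_ave, I_mod, phi, alpha=np.pi / 2, n_shifts=4):
    """Fringe images per the model of claim 5:
    I_n = I_Ave + I_Mod * [1 + cos(phi + (n - 1) * alpha)]
    (the fringe order term m * 2*pi is absorbed into phi here)."""
    return [I_ave + I_mod * (1.0 + np.cos(phi + n * alpha)) for n in range(n_shifts)]

def evaluate_functions(I1, I2, I3, I4):
    """Recover average, modulation and wrapped phase from four 90-degree shifts.
    The average is the claim 8 relation; the phase and modulation are the usual
    four-bucket formulas (assumed, not quoted from the patent)."""
    I_ave = (I1 + I2 + I3 + I4) / 4.0  # claim 8 relation (under the claim 5 model
                                       # this mean evaluates to I_Ave + I_Mod)
    I_mod = 0.5 * np.sqrt((I4 - I2) ** 2 + (I1 - I3) ** 2)  # local amplitude modulation
    phi = np.arctan2(I4 - I2, I1 - I3) % (2.0 * np.pi)      # local phase wrapped to [0, 2*pi)
    return I_ave, I_mod, phi

# Example: round trip on a synthetic phase ramp
h, w = 4, 8
phi_true = np.linspace(0, 2 * np.pi, w, endpoint=False)[None, :].repeat(h, axis=0)
frames = synthesize_fringes(I_ave=np.full((h, w), 50.0),
                            I_mod=np.full((h, w), 20.0),
                            phi=phi_true)
I_ave, I_mod, phi = evaluate_functions(*frames)
```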
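Claims 7 and 17 obtain the chromatic texture from the amplitude modulation and average functions evaluated on separate colour channels, but do not give the exact combination. The sketch below adds the per-channel average and modulation, one plausible reading: under the claim 5 model this sum is the intensity each channel would show under full, unmodulated projector illumination.

```python
import numpy as np

def chromatic_texture(images_rgb):
    """images_rgb: four H x W x 3 fringe images captured at 90-degree pattern shifts.
    Returns an H x W x 3 texture estimate with the projected fringes removed."""
    I1, I2, I3, I4 = images_rgb
    texture = np.empty_like(I1, dtype=float)
    for c in range(3):  # evaluate the functions on separate colour channels (claims 7, 17)
        ave = (I1[..., c] + I2[..., c] + I3[..., c] + I4[..., c]) / 4.0
        mod = 0.5 * np.sqrt((I4[..., c] - I2[..., c]) ** 2 +
                            (I1[..., c] - I3[..., c]) ** 2)
        # Assumed combination: average plus modulation, i.e. the channel intensity
        # the surface would show under full, unmodulated projector illumination.
        texture[..., c] = ave + mod
    return texture
```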
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA2475391A CA2475391C (en) | 2003-07-24 | 2004-07-21 | Optical 3d digitizer with enlarged non-ambiguity zone |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CA002435935A CA2435935A1 (en) | 2003-07-24 | 2003-07-24 | Optical 3d digitizer with enlarged non-ambiguity zone |
CA2,435,935 | 2003-07-24 | ||
CA2475391A CA2475391C (en) | 2003-07-24 | 2004-07-21 | Optical 3d digitizer with enlarged non-ambiguity zone |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2475391A1 true CA2475391A1 (en) | 2005-01-24 |
CA2475391C CA2475391C (en) | 2011-10-25 |
Family
ID=34105194
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2475391A Expired - Lifetime CA2475391C (en) | 2003-07-24 | 2004-07-21 | Optical 3d digitizer with enlarged non-ambiguity zone |
Country Status (1)
Country | Link |
---|---|
CA (1) | CA2475391C (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3425437A4 (en) * | 2016-03-04 | 2019-04-10 | Koh Young Technology Inc. | Patterned light irradiation apparatus and method |
US11002534B2 (en) | 2016-03-04 | 2021-05-11 | Koh Young Technology Inc. | Patterned light projection apparatus and method |
EP3876017A1 (en) * | 2016-03-04 | 2021-09-08 | Koh Young Technology Inc. | Patterned light irradiation apparatus and method |
EP4332499A1 (en) * | 2022-08-30 | 2024-03-06 | Canon Kabushiki Kaisha | Three-dimensional measuring apparatus, three-dimensional measuring method, storage medium, system, and method for manufacturing an article |
Also Published As
Publication number | Publication date |
---|---|
CA2475391C (en) | 2011-10-25 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Zhang | High-speed 3D shape measurement with structured light methods: A review | |
US10935371B2 (en) | Three-dimensional triangulational scanner with background light cancellation | |
Zhang et al. | High-resolution, real-time three-dimensional shape measurement | |
Zhang | Recent progresses on real-time 3D shape measurement using digital fringe projection techniques | |
US9392262B2 (en) | System and method for 3D reconstruction using multiple multi-channel cameras | |
US8294958B2 (en) | Scanner system and method for scanning providing combined geometric and photometric information | |
JP3525964B2 (en) | 3D shape measurement method for objects | |
JP5032943B2 (en) | 3D shape measuring apparatus and 3D shape measuring method | |
US20120281087A1 (en) | Three-dimensional scanner for hand-held phones | |
US20120176478A1 (en) | Forming range maps using periodic illumination patterns | |
US20120176380A1 (en) | Forming 3d models using periodic illumination patterns | |
JP4830871B2 (en) | 3D shape measuring apparatus and 3D shape measuring method | |
KR20130032368A (en) | Three-dimensional measurement apparatus, three-dimensional measurement method, and storage medium | |
CN110264540B (en) | Parallel single-pixel imaging method | |
Ke et al. | A flexible and high precision calibration method for the structured light vision system | |
CA2475391A1 (en) | Optical 3d digitizer with enlarged non-ambiguity zone | |
Li et al. | Lasers structured light with phase-shifting for dense depth perception | |
Petković et al. | Multiprojector multicamera structured light surface scanner | |
Dizeu et al. | Frequency shift triangulation: a robust fringe projection technique for 3D shape acquisition in the presence of strong interreflections | |
Ke et al. | A fast and accurate calibration method for the structured light system based on trapezoidal phase-shifting pattern | |
CA2569798C (en) | Full-field three-dimensional measurement method | |
KR20070054784A (en) | System and method for obtaining shape and reflectance information of object surfaces from two images | |
KR100585272B1 (en) | Monochrome sinusoidal pattern phase shifting based system and method for range imaging from two images | |
KR100585270B1 (en) | System and method for obtaining shape and color information of object surfaces from two images | |
CN112945086A (en) | Structured light coding method based on space sequence and light intensity threshold segmentation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
EEER | Examination request |