CN217133735U - 3D structured light recognition device and system - Google Patents

3D structured light recognition device and system

Info

Publication number
CN217133735U
CN217133735U (application CN202121530840.0U)
Authority
CN
China
Prior art keywords
light
angle
image
structured light
light spot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202121530840.0U
Other languages
Chinese (zh)
Inventor
王百顺
余建男
徐小岚
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Bosheng Photoelectric Technology Co ltd
Original Assignee
Shenzhen Bosheng Photoelectric Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Bosheng Photoelectric Technology Co ltd filed Critical Shenzhen Bosheng Photoelectric Technology Co ltd
Priority to CN202121530840.0U
Application granted
Publication of CN217133735U
Legal status: Active
Anticipated expiration

Landscapes

  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application discloses a 3D structured light recognition device and system. The 3D structured light recognition device includes: a transmitting module configured to emit light toward the surface of the object to be measured; and a receiving module configured to receive at least part of the light reflected from the surface of the object and to acquire an identification image of the object. The identification image comprises a plurality of light spot blocks arranged in an array, with the light spot blocks staggered in a first direction and staggered in a second direction. In the 3D structured light recognition device provided by the embodiments of the application, the angles of the transmitting module and the receiving module are adjusted so that the spots in the acquired identification image are arranged more randomly along the first direction and the second direction. This lowers the repeatability of the spots along both directions during the search; because the probability of repeated spots inside the search frame drops, the probability of black holes appearing in the depth map is also greatly reduced.

Description

3D structured light recognition device and system
Technical Field
The present application relates generally to the field of 3D technology, and more particularly, to a 3D structured light recognition apparatus and system.
Background
3D modules are increasingly used in the payment industry. In the emerging face-scanning payment sector, the 3D structured light module has become the mainstream choice in existing equipment thanks to its high precision, low cost, and low power consumption.
However, when the prior art searches the image to obtain the 3D depth information of the object under test, the coded pattern repeats within the search frame, so the algorithm cannot clearly distinguish candidate matches during the search calculation. The search becomes disordered, the matched position differs significantly from the reference image, and black holes appear in the depth map.
SUMMARY OF THE UTILITY MODEL
In view of the above-mentioned defects or shortcomings in the prior art, it is desirable to provide a 3D structured light recognition apparatus and system, which can avoid the problem of black holes when calculating 3D depth information of a measured object.
In a first aspect, the present application provides a 3D structured light recognition apparatus, comprising:
the transmitting module is configured for transmitting light to the surface of the object to be measured;
the receiving module is configured for receiving at least part of light reflected by the surface of the object to be measured and acquiring an identification image related to the object to be measured;
the identification image comprises a plurality of light spot blocks which are arranged in an array, wherein the light spot blocks are arranged in a staggered mode in a first direction and the light spot blocks are arranged in a staggered mode in a second direction.
Optionally, the angle of the emitted light of the emission module is adjusted, so that the light spot blocks in the identification image are arrayed along a third direction and a fourth direction, the third direction and the first direction are arranged at an included angle, and the fourth direction and the second direction are arranged at an included angle.
Optionally, the light spot blocks in the identification image are staggered in a third direction and/or the light spot blocks are staggered in a fourth direction.
Optionally, the shooting angle of the receiving module is adjusted, so that the light spot blocks in the identification image are arrayed along a fifth direction and a sixth direction, the fifth direction and the first direction form an included angle, and the sixth direction and the second direction form an included angle.
Optionally, by adjusting the shooting angle of the receiving module and the emission angle of the transmitting module, the light spot blocks in the identification image are arranged in an array along a seventh direction and an eighth direction, the seventh direction forming an included angle with the first direction and the eighth direction forming an included angle with the second direction.
Optionally, the angle between the first direction and the second direction is 90 °.
Optionally, the light spot block comprises a plurality of randomly arranged light spots.
Optionally, the transmitting module includes:
a light source configured to emit a structured light beam, the structured light beam being a random speckle pattern;
a diffractive optical element configured to receive the structured light beam and expand the structured light beam into a patterned light beam, and configured to project the patterned light beam onto a surface of an object to be measured;
the patterned light beam is configured to form a plurality of light spot blocks arranged in an array on the surface of the object to be measured, the light spot blocks are arranged in a staggered mode in a first direction, and the light spot blocks are arranged in a staggered mode in a second direction.
In a second aspect, the present application provides a 3D structured light recognition system comprising the above apparatus and a data processing unit, wherein the data processing unit is configured to acquire the identification image and construct 3D depth information of the object to be measured based on the identification image.
In a third aspect, the present application provides a 3D structured light recognition method, including:
the transmitting module transmits light to the surface of the measured object;
the receiving module receives the light reflected by the surface of the measured object and acquires an identification image related to the measured object;
the data processing unit acquires the identification image and presets a picture frame with a square structure, wherein the picture frame comprises a first edge extending along a first direction and a second edge extending along a second direction;
and the data processing unit searches the identification image through the picture frame to obtain the 3D depth information of the measured object.
The technical scheme provided by the embodiment of the application can have the following beneficial effects:
the 3D structure light recognition device that this application embodiment provided, through the angle setting of adjustment emission module and receiving module for the facula is more random along the range on first direction and the second direction in the identification image who obtains, makes the repeatability in first direction and second direction of looking for the facula reduce, like this because the facula reduces at the inside repeatability probability of search frame, the also greatly reduced of depth map black hole problem probability that appears.
Drawings
Other features, objects and advantages of the present application will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
fig. 1 is a schematic diagram of 3D structured light recognition provided by an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a frame search principle according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating a principle of a repetitive situation of searching for a spot according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a 3D structured light recognition apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of another 3D structured light recognition apparatus provided in an embodiment of the present application;
fig. 6 is a schematic diagram illustrating a situation of a light spot emitted by the emission module according to an embodiment of the present application;
fig. 7 is a schematic diagram of an identification image received by a receiving module according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram of a shooting angle of a receiving module according to an embodiment of the present disclosure;
fig. 9 is a flowchart of a 3D structured light recognition method according to an embodiment of the present application.
Detailed Description
The present application will be described in further detail with reference to the following drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings.
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The present application will be described in detail below with reference to the embodiments with reference to the attached drawings.
A set of beams projected in known spatial directions is called structured light. The principle of 3D structured light is shown in fig. 1: an invisible infrared laser of a specific wavelength serves as the light source; the emitted light passes through a diffractive optical element (DOE) to form an image with a certain coding rule, which is projected onto the object. The receiving module captures this coded image on the object surface, and the distortion of the returned coding pattern is then calculated using the optical triangulation measuring principle to obtain the position and depth information of the object.
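As a minimal illustration of the triangulation step, the Python/NumPy sketch below converts the per-pixel shift between the captured pattern and the reference pattern into depth. The baseline, focal length, reference distance, and the sign convention of the disparity are illustrative assumptions, not values taken from this utility model.

```python
import numpy as np

def disparity_to_depth(disparity_px, baseline_mm=40.0, focal_px=800.0,
                       ref_distance_mm=600.0):
    """Convert the shift (in pixels) against the reference image into depth.

    Uses the usual structured-light triangulation relation
    1/Z = 1/Z_ref - d / (f * b); the sign of d depends on the
    emitter/receiver layout and is assumed here.
    """
    d = np.asarray(disparity_px, dtype=np.float64)
    inv_z = 1.0 / ref_distance_mm - d / (focal_px * baseline_mm)
    # Non-positive values mean no valid match -- the "black hole" pixels.
    return np.where(inv_z > 0, 1.0 / np.where(inv_z > 0, inv_z, 1.0), 0.0)

# A spot shifted by 3 px relative to the reference plane at 600 mm:
print(disparity_to_depth(3.0))   # roughly 636 mm under the assumed parameters
```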
That is, the 3D structured light module stores (burns in) a reference image captured on a reference plane at a specific distance; this reference image contains specific coded information. The image actually captured on the object surface is compared with this pre-determined reference image, the differences between the two are identified, and after algorithmic calculation they are converted into a depth map for display.
When the actually captured image is compared with the reference image, a picture frame is used to compare within a certain range: once a similar coding pattern is confirmed, the comparison calculation is carried out and the depth image is output. As shown in fig. 2, the thin-line frame marks the extent of each comparison patch and the thick-line frame marks the search range defined by the algorithm; the thin-line frame is slid from left to right and from top to bottom inside the thick-line frame, and the calculation is performed once the position most similar to the reference image is found.
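The frame search itself can be sketched as a block-matching loop. The sum-of-absolute-differences (SAD) similarity measure, window size, and search radius below are assumptions for illustration; the utility model does not fix the metric or the exact search range.

```python
import numpy as np

def search_patch(reference, patch, top_left, search_radius=8):
    """Slide `patch` (the thin-line frame) inside the search window of
    `reference` (the thick-line frame) and return the best offset and score."""
    ph, pw = patch.shape
    r0, c0 = top_left
    best_offset, best_sad = (0, 0), np.inf
    for dr in range(-search_radius, search_radius + 1):       # top to bottom
        for dc in range(-search_radius, search_radius + 1):   # left to right
            r, c = r0 + dr, c0 + dc
            if r < 0 or c < 0 or r + ph > reference.shape[0] or c + pw > reference.shape[1]:
                continue
            sad = np.abs(reference[r:r + ph, c:c + pw].astype(np.int32)
                         - patch.astype(np.int32)).sum()
            if sad < best_sad:
                best_sad, best_offset = sad, (dr, dc)
    return best_offset, best_sad
```

When the projected pattern repeats inside the search window, several offsets produce nearly identical scores, which is exactly the ambiguity described next.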
As shown in fig. 3, when the spot pattern repeats, the algorithm cannot clearly distinguish between candidate frames during the search calculation; the search becomes disordered and the matched position differs significantly from the reference image.
The present application addresses this technical problem from the hardware side, avoiding the black holes in the 3D depth information that result from flaws in the algorithmic search.
Referring to fig. 4 and 5 in detail, the present application provides a 3D structured light recognition device, including:
the transmitting module 10 is configured to transmit light to the surface of the object to be measured;
the receiving module 20 is configured to receive at least part of the light reflected by the surface of the object to be measured and acquire an identification image related to the object to be measured;
the identification image comprises a plurality of light spot blocks arranged in an array, the light spot blocks are arranged in a staggered mode in a first direction X1, and the light spot blocks are arranged in a staggered mode in a second direction X2.
By adjusting the transmitting module or the receiving module, the present application makes the spots in the identification image finally obtained for the measured object more random, thereby avoiding the spot repeatability problem.
It should be noted that, in the embodiment of the present application, the first direction X1 is defined as a horizontal direction, the second direction X2 is defined as a vertical direction, an included angle between the first direction X1 and the second direction X2 is 90 °, and in some other embodiments, the first direction X1 and the second direction X2 may be interchanged.
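A minimal sketch of such a staggered arrangement is given below, under assumed geometry (the block pitch and the per-row/per-column shifts are illustrative): block centres sit on a pitch-spaced grid, but each row carries a different shift along X1 and each column a different shift along X2, so the arrangement no longer repeats at a fixed spacing along either direction.

```python
import numpy as np

def staggered_block_centres(rows=8, cols=8, pitch=40.0, seed=0):
    """Block centres on a pitch-spaced grid, with a different horizontal
    shift per row and a different vertical shift per column, so the
    arrangement does not repeat along X1 or X2."""
    rng = np.random.default_rng(seed)
    row_shift = rng.uniform(0.0, pitch, size=rows)   # offsets along X1
    col_shift = rng.uniform(0.0, pitch, size=cols)   # offsets along X2
    centres = np.empty((rows, cols, 2))
    for i in range(rows):
        for j in range(cols):
            centres[i, j] = (j * pitch + row_shift[i],   # x, first direction X1
                             i * pitch + col_shift[j])   # y, second direction X2
    return centres.reshape(-1, 2)
```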
The transmission module includes:
a light source configured to emit a structured light beam, the structured light beam being a random speckle pattern.
A diffractive optical element (DOE) configured to receive the structured light beam and expand it into a patterned beam with multi-path output; the diffractive optical element is further configured to project the patterned beam onto the surface of the object to be measured.
The patterned beams are configured to form a plurality of light spot blocks arranged in an array on the surface of the object to be measured, wherein the light spot blocks are staggered in the first direction X1 and staggered in the second direction X2.
The light source may emit an infrared light beam and may be a VCSEL (vertical cavity surface emitting laser); through the diffractive optical element, the light source emits infrared structured light and/or flood light as the object identification signal.
In an embodiment of the present application, the light source includes one or more of a high-contrast vertical cavity surface emitting laser, a single-aperture wide-area vertical cavity surface emitting laser, an array vertical cavity surface emitting laser, a laser diode, and an LED light source. The emission module may also include optical elements with other functions, such as a micro-lens array or a grating, and in a specific arrangement the light source can be adapted to different application scenarios.
In some embodiments, the emission module 10 is adjusted to rotate by a certain angle, as shown in fig. 4, so that the light emitted from the emission module to the surface of the object to be measured will also rotate by a certain angle. Illustratively, the light spot blocks in the identification image are arrayed along a third direction X3 and a fourth direction X4 by adjusting the angle of the emitted light of the emission module, wherein the third direction X3 forms an included angle with the first direction X1, and the fourth direction X4 forms an included angle with the second direction X2.
It should be noted that the included angles of the third direction X3 and the fourth direction X4 with respect to the first direction X1 and the second direction X2 depend on the adjustment angle of the emitting module; the third direction X3 and the fourth direction X4 are also related to the optical path through the light source and the diffractive optical element in the emitting module. The present application does not limit the size of these included angles; in a specific arrangement they may be adjusted differently for different application scenarios.
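The geometric effect of rotating the emission module can be sketched by applying a plane rotation to the spot coordinates: positions that are axis-aligned in the emitter's own frame appear, in the X1/X2 frame of the identification image, arranged along the rotated directions X3 and X4. The rotation angle and the helper `staggered_block_centres` from the previous sketch are illustrative assumptions.

```python
import numpy as np

def rotate_pattern(points_xy, theta_deg):
    """Rotate N x 2 spot coordinates by theta_deg about the pattern centre."""
    t = np.deg2rad(theta_deg)
    rot = np.array([[np.cos(t), -np.sin(t)],
                    [np.sin(t),  np.cos(t)]])
    centre = points_xy.mean(axis=0)
    return (points_xy - centre) @ rot.T + centre

# e.g. a 7 degree tilt of the (assumed) staggered grid from the sketch above:
# spots_along_x3_x4 = rotate_pattern(staggered_block_centres(), 7.0)
```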
In some embodiments, the internal structure of the emission module is adjusted so that the light spot blocks in the identification image are staggered in the third direction and/or in the fourth direction.
Because the third direction X3 and the fourth direction X4 also depend on the optical path through the light source and the diffractive optical element in the emission module, the orientation of the spots projected onto the surface of the measured object can be controlled by adjusting the structure or optical path of these optical elements. In the embodiments of the application, this is achieved by adjusting the optical structure inside the transmitting module.
As shown in fig. 6, in the embodiments of the present application, the spot pattern emitted by the emission module is adjusted by changing the angle of the emission module or its internal structure. Fig. 6 shows an example with a staggered spot image along the third direction and a linear spot image along the fourth direction. The present application does not limit the arrangement of the light spot blocks in the third direction X3 and the fourth direction X4; a staggered or random arrangement makes the spots along the first direction X1 and the second direction X2 in the image received by the receiving module more random, as shown in fig. 7, so that candidate matches can be clearly distinguished when the search frame is compared and calculated, preventing a disordered search in which the matched position differs greatly from the reference image.
In some embodiments, the receiving module 20 is rotated by a certain angle, as shown in fig. 5, so that the shooting angle of the receiving module also rotates by that angle. By adjusting the shooting angle of the receiving module, the light spot blocks in the identification image are arranged in an array along a fifth direction X5 and a sixth direction X6, the fifth direction X5 forming an included angle with the first direction X1 and the sixth direction X6 forming an included angle with the second direction X2.
It should be noted that the included angles of the fifth direction X5 and the sixth direction X6 with respect to the first direction X1 and the second direction X2 depend on the adjustment angle of the receiving module; the fifth direction X5 and the sixth direction X6 are also related to the optical path through the light source and the diffractive optical element in the emitting module. The size of these included angles is not limited; in a specific arrangement they may be adjusted differently for different application scenarios.
As shown in fig. 8, in this embodiment the spot pattern captured from the surface of the measured object is obtained by adjusting the shooting angle of the receiving module. Fig. 8 shows an example with a staggered spot image along the fifth direction X5 and a linear spot image along the sixth direction X6. The present application does not limit the arrangement of the light spot blocks in the fifth direction X5 and the sixth direction X6; a staggered or random arrangement makes the spots along the first direction X1 and the second direction X2 in the received image more random, so that candidate matches can be clearly distinguished when the search frame is compared and calculated, preventing a disordered search in which the matched position differs greatly from the reference image.
In some embodiments, the light spot blocks in the identification image are arranged in an array along a seventh direction and an eighth direction by adjusting the shooting angle of the receiving module and the angle of the emitted light of the emitting module, the seventh direction is disposed at an angle to the first direction X1, and the eighth direction is disposed at an angle to the second direction X2.
It should be noted that the included angle between the seventh direction and the eighth direction and the first direction X1 and the second direction X2 is related to the adjustment angle of the emitting module, and meanwhile, the seventh direction and the eighth direction are also related to the light path direction in the light source and the diffractive optical element in the emitting module. The size of the included angle between the seventh direction and the eighth direction and between the first direction X1 and the second direction X2 is not limited by the application, and different adjustments can be performed according to different application scenes when the application is specifically set.
It should be noted that in the embodiments of the present application, the shooting angle refers to the overall external mounting angle of the receiving module, while the image region used by the search algorithm is still displayed along the first direction and the second direction, as shown in fig. 7. The specific imaging principle follows the prior art and is not described in detail here.
In addition, in some embodiments of the present application, the transmitting module and the receiving module are adjusted at the same time to adjust the arrangement of the light spots. Different angles can be chosen as required to obtain a better effect.
In the embodiments of the present application, each light spot block includes a plurality of randomly arranged light spots. The emitted spots are designed as random speckle, and a dislocation scheme is used when designing the diffractive optical element (DOE), so that each replica of the VCSEL spot block is displaced and the spot blocks become more random. The shapes of the light spot blocks shown in the embodiments are exemplary; in some embodiments the light spot blocks take the form of an irregular lattice, a grid, stripes, or a coded pattern.
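The benefit of the dislocated DOE tiling can be illustrated with a small numerical sketch: one random speckle block is replicated across the field either on a plain rectangular lattice or with per-row/per-column displacements, and the similarity of the field to a copy of itself shifted by one block pitch along the first direction is compared. Tile size, fill ratio, and displacement values are assumptions for illustration only.

```python
import numpy as np

def speckle_field(staggered=True, blocks=6, tile=24, fill=0.1, seed=2):
    """Stamp one random speckle block onto a blocks x blocks lattice; with
    `staggered`, each block row gets a horizontal shift and each block
    column a vertical shift (the dislocation idea described above)."""
    rng = np.random.default_rng(seed)
    base = (rng.random((tile, tile)) < fill).astype(float)
    dx = rng.integers(0, tile, size=blocks) if staggered else np.zeros(blocks, int)
    dy = rng.integers(0, tile, size=blocks) if staggered else np.zeros(blocks, int)
    field = np.zeros(((blocks + 2) * tile, (blocks + 2) * tile))
    for i in range(blocks):
        for j in range(blocks):
            r, c = i * tile + int(dy[j]), j * tile + int(dx[i])
            field[r:r + tile, c:c + tile] = np.maximum(field[r:r + tile, c:c + tile], base)
    return field

def horizontal_repeat(field, lag):
    """Normalised similarity between the field and a copy of itself shifted
    by `lag` pixels along the first direction X1."""
    a, b = field[:, :-lag], field[:, lag:]
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

print(horizontal_repeat(speckle_field(staggered=False), lag=24))  # high: plain lattice repeats at one pitch
print(horizontal_repeat(speckle_field(staggered=True),  lag=24))  # much lower with dislocation
```

The second value being much smaller reflects the reduced chance that the search frame locks onto a wrong but equally similar offset.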
In a second aspect, the present application provides a 3D structured light recognition system comprising the device described in any of the above embodiments and a data processing unit, wherein:
the transmitting module 10 is configured to transmit light to the surface of the object to be measured;
the receiving module 20 is configured to receive at least part of the light reflected by the surface of the object to be measured and acquire an identification image related to the object to be measured;
the data processing unit is configured to acquire the identification image and construct 3D depth information of the measured object based on the identification image.
In a third aspect, as shown in fig. 9, the present application provides a 3D structured light recognition method, including:
S02: the emission module emits light to the surface of the measured object;
S04: the receiving module receives the light reflected by the surface of the measured object and acquires an identification image related to the measured object;
S06: the data processing unit acquires the identification image and presets a picture frame with a square structure, the picture frame comprising a first edge extending along the first direction and a second edge extending along the second direction;
S08: the data processing unit searches the identification image using the picture frame to obtain the 3D depth information of the measured object.
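Steps S02 to S08 can be tied together in a short pipeline sketch that reuses the hypothetical helpers `search_patch` and `disparity_to_depth` from the earlier sketches; the frame size, stride, and search radius are illustrative assumptions rather than values from the utility model.

```python
import numpy as np

def build_depth_map(captured, reference, frame=16, stride=16, search_radius=8):
    """S06/S08: slide a square picture frame over the captured identification
    image, match each frame against the reference image, and turn the
    horizontal shift into depth."""
    h, w = captured.shape
    n_rows = (h - frame) // stride + 1
    n_cols = (w - frame) // stride + 1
    depth = np.zeros((n_rows, n_cols))
    for bi, r in enumerate(range(0, h - frame + 1, stride)):
        for bj, c in enumerate(range(0, w - frame + 1, stride)):
            patch = captured[r:r + frame, c:c + frame]
            (dr, dc), _ = search_patch(reference, patch, (r, c), search_radius)
            depth[bi, bj] = disparity_to_depth(dc)   # horizontal shift only
    return depth
```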
It should be noted that, in the embodiments of the present application, the captured image is post-processed using other prior-art processing methods to obtain the depth map, which is not described again here; any processing method that does not depart from the inventive concept of the present application falls within its scope of protection.
In the image processing domain, the following techniques may be involved.
Image transformation: indirect processing techniques such as the Fourier transform, Walsh transform, and discrete cosine transform convert processing in the spatial domain into processing in the transform domain, which not only reduces the amount of computation but also allows more effective processing.
Image coding and compression: compression techniques reduce the amount of data (the number of bits) describing an image, saving transmission and processing time and reducing the memory occupied.
Image enhancement and restoration: the purpose of image enhancement and restoration is to improve image quality, for example by removing noise or improving sharpness.
Image segmentation: image segmentation is one of the key techniques of digital image processing. It extracts the meaningful feature parts of an image, such as edges and regions, which form the basis for further image recognition, analysis, and understanding.
Image description: image description is a necessary prerequisite for image recognition and understanding. For the simplest binary images, geometric characteristics can be used to describe object features; general image description methods use two-dimensional shape description, which includes boundary description and region description.
Image recognition: image recognition belongs to the field of pattern recognition. Its main content is, after certain preprocessing (enhancement, restoration, compression), to perform image segmentation and feature extraction so as to carry out judgment and classification.
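As a small, self-contained illustration of the segmentation step mentioned above, the sketch below thresholds an identification image and extracts spot centroids using plain NumPy; the fixed threshold and the 8-connectivity grouping are illustrative assumptions, not part of the utility model.

```python
import numpy as np

def spot_centroids(image, threshold=128):
    """Return (row, col) centroids of bright pixels grouped by 8-connectivity
    using an iterative flood fill."""
    mask = image > threshold
    visited = np.zeros_like(mask, dtype=bool)
    centroids = []
    for seed in zip(*np.nonzero(mask)):
        if visited[seed]:
            continue
        stack, pixels = [seed], []
        visited[seed] = True
        while stack:                      # grow one connected spot
            r, c = stack.pop()
            pixels.append((r, c))
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                            and mask[rr, cc] and not visited[rr, cc]):
                        visited[rr, cc] = True
                        stack.append((rr, cc))
        centroids.append(np.mean(pixels, axis=0))
    return np.array(centroids)
```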
Compared with the prior art, the 3D structured light recognition method provided by the application makes repeated spot matches far less likely to occur during the search; because the probability of repeated spots inside the search frame is reduced, the probability of black holes appearing in the depth map is also greatly reduced.
It will be understood that the terms "length," "width," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in an orientation or positional relationship indicated in the drawings for convenience in describing the invention and to simplify the description, and are not intended to indicate or imply that the device or element so referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus should not be construed as limiting the invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically limited otherwise.
Unless defined otherwise, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. Terms such as "disposed" and the like, as used herein, may refer to one element being directly attached to another element or one element being attached to another element through intervening elements. Features described herein in one embodiment may be applied to another embodiment, either alone or in combination with other features, unless the feature is otherwise inapplicable or otherwise stated in the other embodiment.
The present invention has been described in terms of the above embodiments, but it should be understood that the above embodiments are for purposes of illustration and description only and are not intended to limit the invention to the scope of the described embodiments. It will be appreciated by those skilled in the art that many more modifications and variations are possible in light of the above teaching and are intended to be included within the scope of the invention.

Claims (9)

1. A 3D structured light recognition apparatus, comprising:
the transmitting module is configured for transmitting light to the surface of the object to be measured;
the receiving module is configured for receiving at least part of light reflected by the surface of the object to be measured and acquiring an identification image related to the object to be measured;
the identification image comprises a plurality of light spot blocks which are arranged in an array, wherein the light spot blocks are arranged in a staggered mode in a first direction and the light spot blocks are arranged in a staggered mode in a second direction.
2. The 3D structured light recognition device according to claim 1, wherein the light beam emitting angle of the emitting module is adjusted to array the light spot blocks in the recognition image along a third direction and a fourth direction, the third direction is disposed at an angle with respect to the first direction, and the fourth direction is disposed at an angle with respect to the second direction.
3. The 3D structured light recognition device according to claim 1, wherein the light spot blocks in the recognition image are staggered in a third direction and/or the light spot blocks are staggered in a fourth direction.
4. The 3D structured light recognition device according to claim 1, wherein the shooting angle of the receiving module is adjusted to arrange the light spot blocks in the recognition image in an array along a fifth direction and a sixth direction, the fifth direction is disposed at an angle with respect to the first direction, and the sixth direction is disposed at an angle with respect to the second direction.
5. The 3D structured light recognition device according to claim 1, wherein the light spot blocks in the recognition image are arranged in an array along a seventh direction and an eighth direction by adjusting a shooting angle of the receiving module and adjusting an angle of the emitted light of the emitting module, the seventh direction is disposed at an angle with respect to the first direction, and the eighth direction is disposed at an angle with respect to the second direction.
6. A 3D structured light recognition device according to claim 1 wherein the first direction and the second direction are at an angle of 90 °.
7. The 3D structured light recognition device according to claim 1, wherein the light spot block comprises a plurality of light spots arranged randomly.
8. The 3D structured light recognition device according to claim 1, wherein the emission module comprises:
a light source configured to emit a structured light beam, the structured light beam being a random speckle pattern;
a diffractive optical element configured to receive the structured light beam and expand the structured light beam into a patterned light beam, and configured to project the patterned light beam onto a surface of an object to be measured;
the patterned light beam is configured to form a plurality of light spot blocks arranged in an array on the surface of the object to be measured, the light spot blocks are arranged in a staggered mode in a first direction, and the light spot blocks are arranged in a staggered mode in a second direction.
9. A 3D structured light recognition system comprising a 3D structured light recognition device according to any of claims 1 to 8 and a data processing unit, wherein the data processing unit is configured to acquire the recognition image and to construct 3D depth information of the object under test based on the recognition image.
CN202121530840.0U 2021-07-06 2021-07-06 3D structured light recognition device and system Active CN217133735U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202121530840.0U CN217133735U (en) 2021-07-06 2021-07-06 3D structured light recognition device and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202121530840.0U CN217133735U (en) 2021-07-06 2021-07-06 3D structured light recognition device and system

Publications (1)

Publication Number Publication Date
CN217133735U true CN217133735U (en) 2022-08-05

Family

ID=82613826

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202121530840.0U Active CN217133735U (en) 2021-07-06 2021-07-06 3D structured light recognition device and system

Country Status (1)

Country Link
CN (1) CN217133735U (en)

Similar Documents

Publication Publication Date Title
US11054506B2 (en) Spatially self-similar patterned illumination for depth imaging
US10608002B2 (en) Method and system for object reconstruction
EP3276300B1 (en) Projector for projecting a high resolution dot pattern
US8836756B2 (en) Apparatus and method for acquiring 3D depth information
US9501833B2 (en) Method and system for providing three-dimensional and range inter-planar estimation
US20140002610A1 (en) Real-time 3d shape measurement system
US20160267682A1 (en) Object detection device
EP3444782B1 (en) Coding distance topologies for structured light patterns for 3d reconstruction
CN110689577B (en) Active rigid body pose positioning method in single-camera environment and related equipment
KR20110084029A (en) Apparatus and method for obtaining 3d image
CA2771727A1 (en) Device and method for obtaining three-dimensional object surface data
JP2002191058A (en) Three-dimensional image acquisition device and three- dimensional image acquisition method
US20170061634A1 (en) Code domain power control for structured light
CN108362228A (en) A kind of hybrid three-dimensional measuring apparatus of finishing tool grating and measurement method based on double ray machines
CN217133735U (en) 3D structured light recognition device and system
TWI753129B (en) Optoelectronic devices for collecting three-dimensional data
CN108428244A (en) Image matching method and depth data measurement method and system
CN113269180A (en) 3D structured light recognition device, system and method
CN113283422A (en) 3D structured light recognition device, system and method
JP2004110804A (en) Three-dimensional image photographing equipment and method
CN215219710U (en) 3D recognition device and 3D recognition system
JP2002027501A (en) Three-dimensional image pickup device and method
CN113326802A (en) 3D recognition device, 3D recognition system and recognition method thereof
CN217133736U (en) 3D structured light recognition device and system
US11195290B2 (en) Apparatus and method for encoding in structured depth camera system

Legal Events

Date Code Title Description
GR01 Patent grant