CN106910213B - Three-dimensional information acquisition method for optically invisible scenes based on computational imaging - Google Patents
Three-dimensional information acquisition method for optically invisible scenes based on computational imaging
- Publication number: CN106910213B (application CN201710049708.XA)
- Authority: CN (China)
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Abstract
The present invention provides a method for acquiring three-dimensional information of optically invisible scenes based on computational imaging. The method combines computational imaging with active three-dimensional sensing and requires no complex operations such as stereo matching. It comprises three processes: generation of the projected structured-light fringes, computational reconstruction of the carrier-deformed fringes, and acquisition of the three-dimensional coordinate information of the detected scene. The system consists of a digital projector, a camera, and the optically invisible scene. The method offers high precision and a simple apparatus, overcomes occlusion and covering, and enables acquisition of three-dimensional information of optically invisible scenes.
Description
Technical Field
The invention relates to three-dimensional information acquisition technology, and in particular to a method for acquiring three-dimensional information of optically invisible scenes based on computational imaging.
Background
Three-dimensional information acquisition technology obtains the three-dimensional coordinates of a detected scene and has important practical significance and broad application prospects in fields such as object profiling, industrial inspection, virtual reality, machine vision, and intelligent interaction. Traditional three-dimensional information acquisition targets optically visible scenes and obtains the three-dimensional coordinate information of the detected scene with passive or active three-dimensional sensing methods based on photogrammetric principles. In practical environments such as industrial inspection and fire rescue, however, the detected scene is often shielded or covered, i.e., it is an optically invisible scene, and traditional three-dimensional reconstruction cannot be applied to obtain its three-dimensional coordinate information. Computational imaging is an indirect imaging technique: the sensor does not directly record a lens image of the detected scene but instead acquires information correlated with the scene, from which a lens-equivalent image can be computed. Traditional computational imaging can extract information about optically non-line-of-sight scenes, but it yields only two-dimensional information of the detected scene, not three-dimensional information, and this has limited its adoption in practical applications.
Disclosure of Invention
The invention aims to provide a method for acquiring the three-dimensional information of a detected scene that offers high precision, a simple apparatus, and the ability to see through shielding or covering. To this end, the invention provides a method for acquiring three-dimensional information of optically invisible scenes based on computational imaging, which combines computational imaging with active three-dimensional sensing and requires no complex operations such as stereo matching. The method comprises three processes: generation of the projected structured-light fringes, computational reconstruction of the carrier-deformed fringes, and acquisition of the three-dimensional coordinate information of the detected scene.
The method can be realized with a digital projector, a camera, and an optically invisible scene, as shown in FIG. 1. In the invention, a static piece of ground glass is placed in front of the detected scene to simulate the optically invisible scene. The digital projector and the camera are precisely synchronized: during acquisition, the projector projects structured-light fringes onto the surface of the detected scene while the camera synchronously records the light-intensity information on the ground glass.
The invention uses the digital projector to project multiple frames of specially structured light fringes, which form corresponding light-intensity information on the synchronized camera after reflection from the detected scene and scattering by the ground glass. Using Fourier analysis, the Fourier spectrum of the deformed fringes is reconstructed, from which their spatial-domain form is obtained. With these computationally reconstructed deformed fringes, and based on the principle of active three-dimensional sensing, the three-dimensional information of the optically invisible scene can be acquired with high precision. The flow of the method is shown in FIG. 2.
In the generation process of the projected structured-light fringes, the resolution of the projected fringes is set to M × N according to the preset resolution M × N of the three-dimensional information of the detected scene. The projected structured-light fringe, denoted P(x, y, f_x, f_y, δ, δ0), is a dual-frequency fringe, the product of a signal wave P′(x, y, f_x, f_y, δ) and a carrier P0(x, y, f_x0, f_y0, δ0):

P(x, y, f_x, f_y, δ, δ0) = P′(x, y, f_x, f_y, δ) · P0(x, y, f_x0, f_y0, δ0)   (1)

where the signal wave and the carrier respectively satisfy

P′(x, y, f_x, f_y, δ) = A + B·cos(2π f_x x + 2π f_y y + δ)   (2)

P0(x, y, f_x0, f_y0, δ0) = A0 + B0·cos(2π f_x0 x + 2π f_y0 y + δ0)   (3)

Here x and y are the pixel coordinates of the projected fringe, with x an integer in [0, M−1] and y an integer in [0, N−1]; f_x and f_y are the frequency components of the signal wave in the x-axis and y-axis directions, with f_x an integer in [0, M−1] and f_y an integer in [0, N−1]; A is the background intensity and B the contrast of the signal wave; f_x0 and f_y0 are the frequency components of the carrier in the x-axis and y-axis directions, which are fixed constants; A0 is the background intensity and B0 the contrast of the carrier; δ and δ0 are the phase offsets of the signal wave and the carrier, each taking values in the set {0, π/2, π, 3π/2}. Projected structured-light fringes with different parameters generated in the invention are shown in FIG. 3.
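As an illustrative sketch (not part of the patent), the dual-frequency fringe of Eqs. (1)–(3) can be generated with NumPy. One assumption is made for a discrete implementation: the pixel coordinates are normalized by the resolution (x/M, y/N), so that integer f_x and f_y index discrete Fourier frequencies; the function name and default carrier frequency are illustrative.

```python
import numpy as np

def projected_fringe(M, N, fx, fy, delta, fx0=20, fy0=0, delta0=0.0,
                     A=127.5, B=127.5, A0=127.5, B0=127.5):
    """Dual-frequency structured-light fringe P = P' * P0, Eqs. (1)-(3).

    Pixel coordinates are normalized (x/M, y/N) so that integer fx, fy
    index discrete Fourier frequencies -- an implementation assumption."""
    y, x = np.meshgrid(np.arange(N), np.arange(M), indexing="ij")
    # Signal wave P' (Eq. (2)): stepped frequency (fx, fy), phase shift delta
    Ps = A + B * np.cos(2*np.pi*fx*x/M + 2*np.pi*fy*y/N + delta)
    # Carrier P0 (Eq. (3)): fixed frequency (fx0, fy0), phase offset delta0
    P0 = A0 + B0 * np.cos(2*np.pi*fx0*x/M + 2*np.pi*fy0*y/N + delta0)
    return Ps * P0   # Eq. (1); array indexed [y, x]

fringe = projected_fringe(8, 8, fx=1, fy=0, delta=0.0)
```

At the origin pixel with zero phase offsets, both factors reach their maxima A + B and A0 + B0, so the product peaks at 255 × 255 with the default parameters.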
In the computational reconstruction process of the carrier-deformed fringes, the digital projector projects the structured-light fringes onto the surface of the detected scene. Let the surface reflectivity of the detected scene be R(x, y); the reflected fringe E(x, y, f_x, f_y, δ, δ0) can then be expressed as

E(x, y, f_x, f_y, δ, δ0) = R(x, y) · P(x, y, f_x, f_y, δ, δ0)   (4)

The reflected fringe E(x, y, f_x, f_y, δ, δ0) passes through the scattering of the ground glass, and the camera synchronously acquires the light-intensity information D(f_x, f_y, δ, δ0) of the detected scene, which can be expressed as the integral of the reflected fringe:

D(f_x, f_y, δ, δ0) = ∬_Ω E(x, y, f_x, f_y, δ, δ0) dx dy   (5)

where Ω is the integration area of the reflected fringe, which in the invention is the pixel-coordinate area of the ground glass imaged on the camera. From formulas (1) to (5), the four signal-wave phase offsets δ ∈ {0, π/2, π, 3π/2} combine into one Fourier coefficient of the carrier-deformed fringe:

C(f_x, f_y) = (1 / 2B) · { [D(f_x, f_y, 0, δ0) − D(f_x, f_y, π, δ0)] + j·[D(f_x, f_y, π/2, δ0) − D(f_x, f_y, 3π/2, δ0)] }   (6)
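A minimal numerical check of Eqs. (4)–(6), under the same normalized-coordinate assumption as above: four phase-shifted bucket measurements D are simulated for one (f_x, f_y), and their combination recovers the corresponding coefficient of the discrete Fourier transform of C(x, y) = R(x, y)·P0. The function name and the 1/(2B) normalization are illustrative, not taken from the patent text.

```python
import numpy as np

def fourier_coefficient(C, fx, fy, A=127.5, B=127.5):
    """Recover one Fourier coefficient of C(x, y) = R * P0 from four
    phase-shifted bucket measurements, Eqs. (4)-(6)."""
    N, M = C.shape
    y, x = np.meshgrid(np.arange(N), np.arange(M), indexing="ij")
    def D(delta):
        # Eq. (5): the camera-side intensity is the integral (here: sum)
        # of the scene term times the phase-shifted signal wave
        signal = A + B * np.cos(2*np.pi*fx*x/M + 2*np.pi*fy*y/N + delta)
        return np.sum(C * signal)
    # Eq. (6): combine the four measurements into one complex coefficient
    return ((D(0.0) - D(np.pi)) + 1j * (D(np.pi/2) - D(3*np.pi/2))) / (2*B)
```

With this combination the background term A·ΣC cancels in the two differences, and the result equals the corresponding sample of `np.fft.fft2(C)`.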
The carrier-deformed fringe C(x, y) is the carrier of the projected structured-light fringe after reflection by the detected scene, and satisfies

C(x, y) = R(x, y) · P0(x, y, f_x0, f_y0, δ0)   (7)

From Fourier analysis theory, it follows from equation (7) that

C(f_x, f_y) = FFT{C(x, y)}   (8)

where C(f_x, f_y) denotes the Fourier spectrum of the carrier-deformed fringe C(x, y), and FFT{·} denotes the Fourier transform. In the invention, when the projected structured-light fringe P(x, y, f_x, f_y, δ, δ0) has f_x traverse all integers in [0, M−1] and f_y traverse all integers in [0, N−1], the complete Fourier spectrum C(f_x, f_y) of the carrier-deformed fringe is obtained. By the inverse Fourier transform, the carrier-deformed fringe C(x, y) produced by reflection of the projected carrier from the detected scene can then be reconstructed:

C(x, y) = IFFT{C(f_x, f_y)}   (9)

where IFFT{·} denotes the inverse Fourier transform. When the carrier phase offset δ0 takes the values 0, π/2, π, and 3π/2, the computationally reconstructed carrier-deformed fringes are denoted C_0(x, y), C_π/2(x, y), C_π(x, y), and C_3π/2(x, y), as shown in FIG. 4.
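The full traversal and inverse transform of Eqs. (7)–(9) can be sketched as follows: for every (f_x, f_y) on a small grid, four phase-shifted bucket measurements are combined into one spectrum sample, and the assembled spectrum is inverted with an IFFT to recover the carrier-deformed fringe. The grid size, function name, and normalized coordinates are illustrative assumptions.

```python
import numpy as np

def reconstruct_carrier_fringe(C_true, A=127.5, B=127.5):
    """Traverse all (fx, fy), assemble the spectrum, invert: Eqs. (7)-(9).

    C_true stands for the carrier-deformed fringe R * P0; each spectrum
    sample is built from four phase-shifted bucket measurements."""
    N, M = C_true.shape
    y, x = np.meshgrid(np.arange(N), np.arange(M), indexing="ij")
    spec = np.zeros((N, M), dtype=complex)
    for fy in range(N):            # fy traverses [0, N-1]
        for fx in range(M):        # fx traverses [0, M-1]
            th = 2*np.pi*fx*x/M + 2*np.pi*fy*y/N
            D = [np.sum(C_true * (A + B*np.cos(th + d)))   # Eqs. (4)-(5)
                 for d in (0, np.pi/2, np.pi, 3*np.pi/2)]
            spec[fy, fx] = ((D[0] - D[2]) + 1j*(D[1] - D[3])) / (2*B)  # Eq. (6)
    return np.fft.ifft2(spec).real                         # Eq. (9)
```

On noise-free simulated data the reconstruction is exact up to floating-point error, since the assembled spectrum equals the discrete Fourier transform of the input.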
In the process of obtaining the three-dimensional coordinate information of the detected scene, the computationally reconstructed carrier-deformed fringes C_0(x, y), C_π/2(x, y), C_π(x, y), and C_3π/2(x, y) are first used with the 4-step phase-shift method to obtain the phase function φ(x, y) of the carrier-deformed fringe:

φ(x, y) = arctan{ [C_3π/2(x, y) − C_π/2(x, y)] / [C_0(x, y) − C_π(x, y)] }   (10)

The obtained phase function φ(x, y) is then unwrapped into a continuous phase function Φ(x, y) with a dynamic-programming phase-unwrapping algorithm for the wrapped phase. In the invention, the height value H(x, y) of the detected scene is proportional to the unwrapped continuous phase function Φ(x, y):

H(x, y) = K · Φ(x, y)   (11)

where K is a scale factor obtained by system calibration. The invention thus obtains the three-dimensional coordinate information of the detected scene; a comparison with the true three-dimensional coordinate information of the detected scene is shown in FIG. 5.
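The 4-step phase-shift demodulation of Eq. (10) and the proportional height mapping of Eq. (11) can be sketched as follows. The simulated phase map, the use of arctan2 for a quadrant-correct arctangent, NumPy's simple 1D unwrap standing in for the patent's dynamic-programming unwrapping, and the scale factor K = 0.01 are all illustrative assumptions.

```python
import numpy as np

def phase_from_four_steps(C0, C90, C180, C270):
    # Eq. (10), using arctan2 for a quadrant-correct arctangent
    return np.arctan2(C270 - C90, C0 - C180)

# Simulate four carrier-deformed fringes sharing a known phase map
n = 64
xs = np.arange(n)
true_phase = 0.5 * np.sin(2 * np.pi * xs / n)   # stand-in deformation phase
C0, C90, C180, C270 = (127.5 + 100.0 * np.cos(true_phase + d)
                       for d in (0.0, np.pi/2, np.pi, 3*np.pi/2))
phi = phase_from_four_steps(C0, C90, C180, C270)
Phi = np.unwrap(phi)            # stand-in for the patent's unwrapping algorithm
H = 0.01 * Phi                  # Eq. (11) with an assumed scale factor K = 0.01
```

Because the simulated phase stays within (−π, π], the unwrap is a no-op here and the demodulated phase matches the ground truth exactly.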
The invention provides a method for acquiring three-dimensional information of optically invisible scenes based on computational imaging. The method combines computational imaging with active three-dimensional sensing, requires no complex operations such as stereo matching, and acquires the three-dimensional information of the detected scene by computationally reconstructing the Fourier spectrum of the carrier-deformed fringes. It offers high precision and a simple apparatus, overcomes shielding or covering, and enables three-dimensional information acquisition of optically invisible scenes.
Drawings
FIG. 1 is a diagram of the system realizing the method for acquiring three-dimensional information of optically invisible scenes based on computational imaging
FIG. 2 is a flow chart of the method of the present invention
FIG. 3 shows projected structured-light fringes with various parameters generated in the present invention
FIG. 4 shows the carrier-deformed fringes obtained by computational reconstruction in the present invention
FIG. 5 compares (a) the true three-dimensional coordinate information of the detected scene with (b) the three-dimensional coordinate information obtained by the present invention
The reference numbers in the figures are:
1 — digital projector; 2 — camera; 3 — ground glass; 4 — detected scene.
It should be understood that the above-described figures are merely schematic and are not drawn to scale.
Detailed Description
The following exemplary embodiment of the method for acquiring three-dimensional information of optically invisible scenes based on computational imaging is described in detail to further illustrate the invention. It should be noted that the following example is for illustration only and should not be construed as limiting the scope of the invention; those skilled in the art may make modifications and adaptations of the invention without departing from that scope.
The invention provides an optical invisible scene three-dimensional information acquisition method based on computational imaging. The method comprises three processes of generation of projection structure light stripes, calculation and reconstruction of carrier deformation stripes and acquisition of three-dimensional coordinate information of a detected scene.
The method provided by the invention can be realized with a digital projector, a camera, and an optically invisible scene, as shown in FIG. 1. In this embodiment, a static piece of ground glass is placed in front of the detected scene to simulate the optically invisible scene. The digital projector and the camera are precisely synchronized: during acquisition, the projector projects structured-light fringes onto the surface of the detected scene while the camera synchronously records the light-intensity information on the ground glass.
In this embodiment, the digital projector projects multiple frames of specially structured light fringes, which form corresponding light-intensity information on the synchronized camera after reflection from the detected scene and scattering by the ground glass. Using Fourier analysis, the Fourier spectrum of the deformed fringes is reconstructed, from which their spatial-domain form is obtained. With these computationally reconstructed deformed fringes, and based on the principle of active three-dimensional sensing, the three-dimensional information of the optically invisible scene is acquired with high precision. The flow of this embodiment is shown in FIG. 2.
In the generation process of the projected structured-light fringes, the resolution of the projected fringes is set to M × N = 500 × 500 according to the preset resolution M × N = 500 × 500 of the three-dimensional information of the detected scene. The projected structured-light fringe, denoted P(x, y, f_x, f_y, δ, δ0), is a dual-frequency fringe, the product of a signal wave and a carrier:

P(x, y, f_x, f_y, δ, δ0) = P′(x, y, f_x, f_y, δ) · P0(x, y, f_x0, f_y0, δ0)   (1)

P′(x, y, f_x, f_y, δ) = A + B·cos(2π f_x x + 2π f_y y + δ)   (2)

P0(x, y, f_x0, f_y0, δ0) = A0 + B0·cos(2π f_x0 x + 2π f_y0 y + δ0)   (3)

Here x and y are the pixel coordinates of the projected fringe, with x and y integers in [0, 499]; f_x and f_y are the frequency components of the signal wave in the x-axis and y-axis directions, both integers in [0, 499]; A is the background intensity and B the contrast of the signal wave, with A = 127.5 and B = 127.5 in this embodiment; f_x0 and f_y0 are the frequency components of the carrier in the x-axis and y-axis directions, fixed in this embodiment to f_x0 = 200 and f_y0 = 0; A0 is the background intensity and B0 the contrast of the carrier, with A0 = 127.5 and B0 = 127.5 in this embodiment; δ and δ0 are the phase offsets of the signal wave and the carrier, each taking values in the set {0, π/2, π, 3π/2}. The projected structured-light fringes with different parameters generated in this embodiment are shown in FIG. 3.
In the computational reconstruction process of the carrier-deformed fringes, the digital projector projects the structured-light fringes onto the surface of the detected scene. Let the surface reflectivity of the detected scene be R(x, y); the reflected fringe E(x, y, f_x, f_y, δ, δ0) can then be expressed as

E(x, y, f_x, f_y, δ, δ0) = R(x, y) · P(x, y, f_x, f_y, δ, δ0)   (4)

The reflected fringe E(x, y, f_x, f_y, δ, δ0) passes through the scattering of the ground glass, and the camera synchronously acquires the light-intensity information D(f_x, f_y, δ, δ0) of the detected scene, which can be expressed as the integral of the reflected fringe:

D(f_x, f_y, δ, δ0) = ∬_Ω E(x, y, f_x, f_y, δ, δ0) dx dy   (5)

where Ω is the integration area of the reflected fringe, which in this embodiment is the pixel-coordinate area of the ground glass imaged on the camera. From formulas (1) to (5), the four signal-wave phase offsets δ ∈ {0, π/2, π, 3π/2} combine into one Fourier coefficient of the carrier-deformed fringe:

C(f_x, f_y) = (1 / 2B) · { [D(f_x, f_y, 0, δ0) − D(f_x, f_y, π, δ0)] + j·[D(f_x, f_y, π/2, δ0) − D(f_x, f_y, 3π/2, δ0)] }   (6)
The carrier-deformed fringe C(x, y) is the carrier of the projected structured-light fringe after reflection by the detected scene, and satisfies

C(x, y) = R(x, y) · P0(x, y, f_x0, f_y0, δ0)   (7)

From Fourier analysis theory, it follows from equation (7) that

C(f_x, f_y) = FFT{C(x, y)}   (8)

where C(f_x, f_y) denotes the Fourier spectrum of the carrier-deformed fringe C(x, y), and FFT{·} denotes the Fourier transform. In this embodiment, when the projected structured-light fringe P(x, y, f_x, f_y, δ, δ0) has f_x traverse all integers in [0, 499] and f_y traverse all integers in [0, 499], the complete Fourier spectrum C(f_x, f_y) of the carrier-deformed fringe is obtained. By the inverse Fourier transform, the carrier-deformed fringe C(x, y) produced by reflection of the projected carrier from the detected scene can then be reconstructed:

C(x, y) = IFFT{C(f_x, f_y)}   (9)

where IFFT{·} denotes the inverse Fourier transform. When the carrier phase offset δ0 takes the values 0, π/2, π, and 3π/2, the computationally reconstructed carrier-deformed fringes are denoted C_0(x, y), C_π/2(x, y), C_π(x, y), and C_3π/2(x, y), as shown in FIG. 4.
In the process of obtaining the three-dimensional coordinate information of the detected scene, the computationally reconstructed carrier-deformed fringes C_0(x, y), C_π/2(x, y), C_π(x, y), and C_3π/2(x, y) are first used with the 4-step phase-shift method to obtain the phase function φ(x, y) of the carrier-deformed fringe:

φ(x, y) = arctan{ [C_3π/2(x, y) − C_π/2(x, y)] / [C_0(x, y) − C_π(x, y)] }   (10)

The obtained phase function φ(x, y) is then unwrapped into a continuous phase function Φ(x, y) with a dynamic-programming phase-unwrapping algorithm for the wrapped phase. In this embodiment, the height value H(x, y) of the detected scene is proportional to the unwrapped continuous phase function Φ(x, y):

H(x, y) = K · Φ(x, y)   (11)

where K is a scale factor obtained by system calibration. This embodiment thus obtains the three-dimensional coordinate information of the detected scene; a comparison with the true three-dimensional coordinate information of the detected scene is shown in FIG. 5.
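An end-to-end sketch of this embodiment, scaled down from 500 × 500 with carrier frequency 200 to a 16 × 16 grid with carrier frequency 3 so it runs quickly; the scene reflectivity, the Gaussian deformation phase, the normalized coordinates, and all names are illustrative assumptions. The recovered 4-step phase equals the carrier phase plus the deformation phase (wrapped), from which Φ and H = K·Φ would follow after unwrapping and calibration.

```python
import numpy as np

M = N = 16                      # scaled down from the embodiment's 500 x 500
fx0, fy0 = 3, 0                 # carrier frequency (embodiment uses 200, 0)
A = B = A0 = B0 = 127.5
y, x = np.meshgrid(np.arange(N), np.arange(M), indexing="ij")
R = 0.5 + 0.4 * np.random.default_rng(2).random((N, M))   # scene reflectivity
deform = 0.4 * np.exp(-((x - M/2)**2 + (y - N/2)**2) / 20)  # height-induced phase

def reconstruct(delta0):
    # "True" carrier-deformed fringe: carrier reflected by the scene and
    # phase-modulated by the height-induced deformation (Eq. (7))
    C_true = R * (A0 + B0 * np.cos(2*np.pi*fx0*x/M + 2*np.pi*fy0*y/N
                                   + deform + delta0))
    spec = np.zeros((N, M), dtype=complex)
    for fy in range(N):
        for fx in range(M):
            th = 2*np.pi*fx*x/M + 2*np.pi*fy*y/N
            D = [np.sum(C_true * (A + B*np.cos(th + d)))     # Eqs. (4)-(5)
                 for d in (0, np.pi/2, np.pi, 3*np.pi/2)]
            spec[fy, fx] = ((D[0]-D[2]) + 1j*(D[1]-D[3])) / (2*B)  # Eq. (6)
    return np.fft.ifft2(spec).real                           # Eq. (9)

C0, C90, C180, C270 = (reconstruct(d) for d in (0, np.pi/2, np.pi, 3*np.pi/2))
phi = np.arctan2(C270 - C90, C0 - C180)                      # Eq. (10)
```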
Claims (4)
1. A method for acquiring three-dimensional information of optically invisible scenes based on computational imaging, characterized by comprising three processes — generation of projected structured-light fringes, computational reconstruction of carrier-deformed fringes, and acquisition of three-dimensional coordinate information of a detected scene — wherein the method is realized by a digital projector, a camera, and an optically invisible scene; in the generation process, the projected structured-light fringe is a dual-frequency fringe comprising a signal wave and a carrier; in the computational reconstruction process, the reflected fringe E(x, y, f_x, f_y, δ, δ0) passes through the scattering of the ground glass and the camera synchronously acquires the light-intensity information D(f_x, f_y, δ, δ0) of the detected scene; when the projected structured-light fringe P(x, y, f_x, f_y, δ, δ0) has f_x traverse all integers in [0, M−1] and f_y traverse all integers in [0, N−1], the complete Fourier spectrum C(f_x, f_y) of the carrier-deformed fringe is obtained, wherein x and y are the pixel coordinates of the projected structured-light fringe, M × N is the resolution of the projected structured-light fringe, x is an integer in [0, M−1] and y is an integer in [0, N−1], f_x and f_y are the frequency components of the structured-light fringe signal wave in the x and y directions, and δ and δ0 are the phase offsets of the signal wave and the carrier, each taking values in the set {0, π/2, π, 3π/2}; in the process of acquiring the three-dimensional coordinate information of the detected scene, the computationally reconstructed carrier-deformed fringes C_0(x, y), C_π/2(x, y), C_π(x, y), and C_3π/2(x, y) are used with the 4-step phase-shift method to obtain the phase function φ(x, y) = arctan{[C_3π/2(x, y) − C_π/2(x, y)] / [C_0(x, y) − C_π(x, y)]} of the carrier-deformed fringe.
2. The method according to claim 1, characterized in that the generation process of the projected structured-light fringes determines the resolution of the projected fringes to be M × N according to a preset resolution M × N of the three-dimensional information of the detected scene, the projected structured-light fringe being denoted P(x, y, f_x, f_y, δ, δ0) and given by P(x, y, f_x, f_y, δ, δ0) = P′(x, y, f_x, f_y, δ) · P0(x, y, f_x0, f_y0, δ0); the projected structured-light fringe is a dual-frequency fringe, where P′(x, y, f_x, f_y, δ) is the signal wave and P0(x, y, f_x0, f_y0, δ0) is the carrier, satisfying P′(x, y, f_x, f_y, δ) = A + B·cos(2π f_x x + 2π f_y y + δ), with f_x an integer in [0, M−1] and f_y an integer in [0, N−1]; A is the background intensity and B the contrast of the signal wave; f_x0 and f_y0 are the frequency components of the carrier in the x-axis and y-axis directions, which are fixed constants; A0 is the background intensity and B0 the contrast of the carrier.
3. The method according to claim 1, characterized in that in the computational reconstruction of the carrier-deformed fringes, the digital projector projects the structured-light fringes onto the surface of the detected scene; assuming the surface reflectivity of the detected scene is R(x, y), the reflected fringe E(x, y, f_x, f_y, δ, δ0) is given by E(x, y, f_x, f_y, δ, δ0) = R(x, y) · P(x, y, f_x, f_y, δ, δ0); the reflected fringe passes through the scattering of the ground glass and the camera synchronously acquires the light-intensity information D(f_x, f_y, δ, δ0) of the detected scene, expressed as the integral D(f_x, f_y, δ, δ0) = ∬_Ω E(x, y, f_x, f_y, δ, δ0) dx dy, where Ω is the integration area of the reflected fringe; the carrier-deformed fringe C(x, y) is the carrier of the projected structured-light fringe after reflection by the detected scene and satisfies C(x, y) = R(x, y) · P0(x, y, f_x0, f_y0, δ0); according to Fourier analysis theory, C(f_x, f_y) = FFT{C(x, y)}, where C(f_x, f_y) denotes the Fourier spectrum of the carrier-deformed fringe C(x, y) and FFT{·} denotes the Fourier transform; when the projected structured-light fringe P(x, y, f_x, f_y, δ, δ0) has f_x traverse all integers in [0, M−1] and f_y traverse all integers in [0, N−1], the complete Fourier spectrum C(f_x, f_y) of the carrier-deformed fringe is obtained; by the inverse Fourier transform, the carrier-deformed fringe C(x, y) produced by reflection of the projected carrier from the detected scene can be reconstructed; when the carrier phase offset δ0 takes the values 0, π/2, π, and 3π/2, the computationally reconstructed carrier-deformed fringes are denoted C_0(x, y), C_π/2(x, y), C_π(x, y), and C_3π/2(x, y).
4. The method according to claim 1, characterized in that the process of acquiring the three-dimensional coordinate information of the detected scene uses the computationally reconstructed carrier-deformed fringes C_0(x, y), C_π/2(x, y), C_π(x, y), and C_3π/2(x, y) with the 4-step phase-shift method to obtain the phase function φ(x, y) = arctan{[C_3π/2(x, y) − C_π/2(x, y)] / [C_0(x, y) − C_π(x, y)]}; according to a dynamic-programming phase-unwrapping algorithm for the wrapped phase, the obtained phase function φ(x, y) is unwrapped into a continuous phase function Φ(x, y).
Priority Applications (1)
- CN201710049708.XA (priority date 2017-01-23, filing date 2017-01-23): CN106910213B (en) — Three-dimensional information acquisition method for optically invisible scenes based on computational imaging
Publications (2)
- CN106910213A (application publication) — 2017-06-30
- CN106910213B (grant) — 2019-09-03
Family ID: 59206636
Citations (4)
- CN101451826A * — priority 2008-12-17, published 2009-06-10 — Shanghai Institute of Optics and Fine Mechanics, Chinese Academy of Sciences — Object three-dimensional profile measuring device and method
- CN101806587A * — priority 2010-04-29, published 2010-08-18 — Zhejiang Normal University — Optical three-dimensional measurement method with absolute phase measurement
- EP2428162A1 * — priority 2010-09-10, published 2012-03-14 — Dimensional Photonics International, Inc. — Method of data acquisition for three-dimensional imaging of the intra-oral cavity
- CN106289109A * — priority 2016-10-26, published 2017-01-04 — Chang'an University — Three-dimensional reconstruction system and method based on structured light
Non-Patent Citations (1)
- Li Tengfei, "Design and Research of a Three-Dimensional Deformation Measurement System Based on Fringe Structured-Light Projection" (基于条纹结构光投影的三维形变测量系统的设计与研究), Wanfang dissertations, 2015-07-30, pp. 1-24.
Similar Documents
- CN110288642B — Three-dimensional object rapid reconstruction method based on camera array
- US9322643B2 — Apparatus and method for 3D surface measurement
- Liu et al. — Real-time 3D surface-shape measurement using background-modulated modified Fourier transform profilometry with geometry-constraint
- CN113237435B — High-light-reflection surface three-dimensional vision measurement system and method
- CN111288925B — Three-dimensional reconstruction method and device based on digital focusing structure illumination light field
- CN110514143A — A fringe projection system scaling method based on reflecting mirror
- Huang et al. — Fast full-field out-of-plane deformation measurement using fringe reflectometry
- CN107990846A — Active-passive combined depth information acquisition method based on single-frame structured light
- Iwai et al. — Shadow removal of projected imagery by occluder shape measurement in a multiple overlapping projection system
- Liu et al. — Deflectometry for phase retrieval using a composite fringe
- CN110618537B — Coated lens device and three-dimensional reconstruction imaging system applying same
- CN104567718B — Integral imaging micro-image array generating method based on multi-angle projection PMP
- Heist et al. — High-speed 3D shape measurement by GOBO projection of aperiodic sinusoidal fringes: a performance analysis
- CN111947600B — Robust three-dimensional phase unwrapping method based on phase-level cost filtering
- Zhou et al. — Three-dimensional shape measurement using color random binary encoding pattern projection
- US6512844B2 — 3D rendering
- CN106910213B — Three-dimensional information acquisition method for optically invisible scenes based on computational imaging
- Marrugo et al. — Fourier transform profilometry in LabVIEW
- Zhu et al. — Single frame phase estimation based on Hilbert transform and Lissajous ellipse fitting method in fringe projection technology
- KR101555027B1 — Apparatus for three-dimensional shape measurement and method for the same
- Liu et al. — Investigation of phase pattern modulation for digital fringe projection profilometry
- Wenzel et al. — New optical equipment in 3D surface measuring
- Li et al. — Single-shot absolute 3D measurement based on speckle-embedded fringe projection
- Kayaba et al. — Non-contact full field vibration measurement based on phase-shifting
- KR101613829B1 — Method and apparatus for 3D shape measuring by using derivative moire
Legal Events
- PB01 — Publication
- SE01 — Entry into force of request for substantive examination
- GR01 — Patent grant