CN114216409B - Parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging

Parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging

Info

Publication number
CN114216409B
Authority
CN
China
Prior art keywords
pixel
projector
parallel
slice
optical center
Prior art date
Legal status
Active
Application number
CN202111000654.0A
Other languages
Chinese (zh)
Other versions
CN114216409A (en)
Inventor
赵慧洁
王云帆
姜宏志
李旭东
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University
Priority to CN202111000654.0A
Publication of CN114216409A
Application granted
Publication of CN114216409B

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/254 — Projection of a pattern, viewing through a pattern, e.g. moiré
    • G01B 11/2504 — Calibration devices

Abstract

A parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging is used to measure steep, vertically oriented three-dimensional features such as deep holes, steps and grooves on an object's surface. The method is based on the principle of projector optical-axis triangulation: the projector and a telecentric camera are arranged with parallel optical axes, and depth is reconstructed from the optical center distance of the projector's outgoing rays. The optical center distance observed by each camera pixel is determined from the projections of the light transport coefficients computed by slice parallel single-pixel imaging. By projecting and capturing the horizontal- and vertical-slice parallel single-pixel imaging basis fringes, the corresponding sub-pixel coordinate in the projector coordinate system is located precisely, and the optical center distance is then computed from the projector optical center coordinate. During calibration, the optical center distance is sampled on planes at different depths and the distance-to-depth mapping is fitted with a third-order polynomial. During measurement, depth is reconstructed from the calibrated polynomial. The invention achieves robust, high-precision vertical-scanning three-dimensional measurement of self-occluding objects such as deep holes, grooves and steps, and also promotes the development of vertical-scanning three-dimensional measurement configurations.

Description

Parallel axis three-dimensional measurement method based on slice parallel single-pixel imaging
Technical Field
The invention relates to a parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging, which extends the parallel single-pixel imaging technique to a vertical-scanning three-dimensional measurement system and improves the robustness, signal-to-noise ratio and accuracy of vertical-scanning three-dimensional measurement. The invention belongs to the field of machine vision.
Background
Three-dimensional measurement is widely used in manufacturing, entertainment, security, medical and other industries; among the available techniques, fringe projection is one of the most popular because it is accurate, fast, flexible and versatile. However, for objects with deep holes, grooves and steps, the large baseline of conventional methods based on binocular stereo vision often prevents complete measurement of self-occluded regions.
Vertical-scanning devices, in which the optical axes of the projector and the camera are placed parallel (parallel-axis measurement) or coaxial (coaxial measurement), can overcome this drawback. However, none of the existing coaxial or parallel-axis three-dimensional measurement methods achieves robust, high signal-to-noise-ratio, high-precision measurement. Coaxial methods based on fringe defocus require the projector to exhibit a strong defocus variation over the depth range, and extracting parameters from the degraded, defocused projections also lowers the signal-to-noise ratio. Methods based on phase and projector optical-axis triangulation require complex zero-phase point detection and are poorly robust to errors in the detected zero-phase coordinates.
Disclosure of Invention
In order to achieve robust, high-precision three-dimensional measurement of self-occluding objects such as deep holes, grooves and steps, the invention provides a parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging, introducing the parallel single-pixel imaging technique into a vertical-scanning three-dimensional measurement device for the first time.
The basic principle of the invention is to reconstruct the three-dimensional point cloud of an object according to the projector optical-axis triangulation principle, using the optical center distance of the projector's outgoing ray received by each telecentric camera pixel as the independent variable of a polynomial mapping calibrated at different depths. The optical center distance is obtained by slice parallel single-pixel imaging, which enables robust, high-precision, occlusion-free three-dimensional scanning. In an actual scan, the optical axes of the telecentric lens and the projector are aligned in parallel through a beam-splitter prism; a series of slice parallel single-pixel imaging basis fringe images is projected and captured; the one-dimensional Fourier coefficients are computed; the projector sub-pixel coordinate corresponding to each camera pixel is determined from the horizontal and vertical projections of the light transport coefficients obtained by the one-dimensional inverse Fourier transform; the optical center distance is then computed and substituted into the pre-calibrated polynomial to obtain the three-dimensional coordinates.
The technical problem to be solved by the invention is: achieving robust, high-precision vertical-scanning three-dimensional measurement of self-occluding objects such as deep holes, grooves and steps.
The technical solution of the invention is as follows: a parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging is characterized in that the measurement process comprises the following steps:
(1) Projecting and capturing a series of slice parallel single-pixel imaging basis fringe images;
(2) Using the slice parallel single-pixel imaging basis fringe images acquired in step (1), calculating the projector sub-pixel coordinate corresponding to each camera pixel's observation region;
(3) Calculating the optical center distance in the projector image coordinate system from the projector sub-pixel coordinates calculated in step (2);
(4) Calibrating the optical center distance observed by each pixel on planes at different depths within the measurement range, and fitting the optical center distance-depth mapping pixel by pixel with a third-order polynomial to obtain the polynomial coefficients;
(5) Reconstructing the three-dimensional point cloud, pixel by pixel, using the magnification of the telecentric lens and the polynomial coefficients calibrated in step (4).
In step (1), the series of slice parallel single-pixel imaging basis fringes are the horizontal- and vertical-slice sinusoidal basis fringe patterns of parallel single-pixel imaging. They are divided into two parts, coarse fringes and fine fringes, whose mathematical expressions are as follows:
Coarse fringe pattern:

P_c^V(x_p, y_p; u, φ) = a + b·cos(2π·u·x_p/M + φ),  P_c^H(x_p, y_p; v, φ) = a + b·cos(2π·v·y_p/N + φ)
Fine fringe pattern:

P_f^V(x_p, y_p; u, φ) = a + b·cos(2π·u·x_p/M_s + φ),  P_f^H(x_p, y_p; v, φ) = a + b·cos(2π·v·y_p/N_s + φ)
where (x_p, y_p) are the discrete coordinates in the projector image coordinate system, (u, v) are the discrete spatial frequency coordinates in the Fourier domain, φ takes the values 0, π/2, π and 3π/2, a is the average gray level, b is the modulation depth, M and N are the width and height of the projector resolution, and M_s and N_s are the width and height of the region observed by a single camera pixel. The coarse fringes use only the low- and mid-frequency part of the full resolution, while the fine fringes cover all frequencies of the observation region.
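As an illustration only, the following NumPy sketch generates coarse and fine basis fringes of the form given above; the function name, the default gray levels and the number of coarse frequencies kept are assumptions of this sketch, not values taken from the patent.

```python
import numpy as np

def slice_psi_fringes(M, N, Ms, Ns, a=127.0, b=100.0, u_max_coarse=32):
    """Generate horizontal/vertical slice-PSI basis fringes (a sketch).

    Coarse fringes: frequencies u = 0..u_max_coarse over the full width M
    (and v over the full height N), i.e. only the low/mid-frequency part.
    Fine fringes: all frequencies of a single-pixel observation window of
    size Ms x Ns, periodic over the full projector image.
    Four phase shifts are generated for every frequency.
    """
    xp = np.arange(M)
    yp = np.arange(N)
    phases = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]  # four-step phase shift

    coarse, fine = [], []
    # vertical fringes vary along x_p; horizontal fringes vary along y_p
    for u in range(u_max_coarse + 1):
        for phi in phases:
            col = a + b * np.cos(2 * np.pi * u * xp / M + phi)
            coarse.append(np.tile(col, (N, 1)))            # replicate over rows
    for u in range(Ms):
        for phi in phases:
            col = a + b * np.cos(2 * np.pi * u * xp / Ms + phi)
            fine.append(np.tile(col, (N, 1)))
    for v in range(u_max_coarse + 1):
        for phi in phases:
            row = a + b * np.cos(2 * np.pi * v * yp / N + phi)
            coarse.append(np.tile(row[:, None], (1, M)))   # replicate over columns
    for v in range(Ns):
        for phi in phases:
            row = a + b * np.cos(2 * np.pi * v * yp / Ns + phi)
            fine.append(np.tile(row[:, None], (1, M)))
    return np.stack(coarse), np.stack(fine)
```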
In step (2), the horizontal and vertical one-dimensional Fourier coefficients at the frequencies corresponding to each pixel are computed from the coarse-fringe and fine-fringe images collected in step (1). For the coarse fringes, the high-frequency coefficients that were not acquired are set to zero, and the position of the maximum of the full-resolution one-dimensional inverse Fourier transform is taken as the coarse localization coordinate. For the fine fringes, the one-dimensional inverse Fourier transform is obtained at the resolution of the observation region and extended periodically until it covers the coarse localization coordinate. The erroneous parts of the extended signal are set to zero according to the coarse localization coordinate, keeping only the one period that covers the coarse position. Finally, the sub-pixel coordinate of the peak is obtained by the gray-scale centroid method and taken as the fine localization coordinate.
In step (3), the distance from the sub-pixel projector coordinate to the optical center in the projector image coordinate system is computed from the fine localization result of step (2). The mathematical expression is

r_p = sqrt((x*_p − x_Op)^2 + (y*_p − y_Op)^2)
where (x*_p, y*_p) is the sub-pixel coordinate from the fine localization in step (2) and (x_Op, y_Op) is the projector optical center coordinate. The projector optical center coordinate is obtained by calibrating a projector pinhole model, or the center (M/2, N/2) of the projector pixel coordinate system can be used directly.
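A minimal sketch of this computation, assuming the sub-pixel coordinates have already been obtained; the helper name and the default use of the image center are illustrative.

```python
import numpy as np

def optical_center_distance(x_sub, y_sub, M, N, x_op=None, y_op=None):
    """Distance r_p from a sub-pixel projector coordinate to the optical center.

    If the projector has been calibrated with a pinhole model, pass its
    principal point (x_op, y_op); otherwise the projector image center
    (M/2, N/2) is used, as the description permits.
    """
    if x_op is None:
        x_op, y_op = M / 2.0, N / 2.0
    return np.hypot(x_sub - x_op, y_sub - y_op)
```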
In step (4), the optical center distance observed by each camera pixel is collected on planes at different depths, and the depth of each plane is recorded. For the 'optical center distance-depth' data collected at each pixel, a third-order polynomial is fitted pixel by pixel, and the polynomial coefficients are stored as the calibration result. For a given camera pixel, the mapping expression is:
Z = a_0 + a_1·r_p + a_2·r_p^2 + a_3·r_p^3
where Z is the plane depth, r_p is the optical center distance computed for that pixel using the method in step (3), and a_i are the calibration polynomial coefficients of that pixel.
In step (5), the measured object is placed within the measurement range and steps (1) to (3) are repeated. The lateral coordinates (X, Y) are computed from the pixel size and magnification of the telecentric camera, and the optical center distance obtained in the actual measurement is substituted into the polynomial calibrated in step (4) to obtain the depth Z. That is, for a camera pixel with coordinate (x_c, y_c), the three-dimensional point (X, Y, Z) is computed as

X = μ_c·x_c/β,  Y = μ_c·y_c/β,  Z = a_0 + a_1·r_p + a_2·r_p^2 + a_3·r_p^3
where μ_c is the telecentric camera pixel size, β is the magnification, r_p is the optical center distance computed during the actual measurement, and a_i are the calibration polynomial coefficients of that pixel.
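As an illustrative sketch only, the per-pixel reconstruction of this step could be written as follows in Python/NumPy; the per-pixel coefficient array layout and the function name are assumptions.

```python
import numpy as np

def reconstruct_point_cloud(r_p, coeffs, mu_c, beta):
    """Reconstruct (X, Y, Z) for every camera pixel (a sketch).

    r_p    : (H, W) optical center distances measured on the object
    coeffs : (H, W, 4) per-pixel calibration coefficients a0..a3
    mu_c   : telecentric camera pixel size
    beta   : telecentric lens magnification
    """
    H, W = r_p.shape
    yc, xc = np.mgrid[0:H, 0:W]
    X = mu_c * xc / beta                       # lateral coordinates from the
    Y = mu_c * yc / beta                       # telecentric (orthographic) model
    Z = (coeffs[..., 0] + coeffs[..., 1] * r_p
         + coeffs[..., 2] * r_p**2 + coeffs[..., 3] * r_p**3)
    return np.dstack([X, Y, Z])
```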
Compared with the prior art, the invention has the advantages that:
(1) Compared with defocus-based vertical three-dimensional scanning techniques, the method of the invention achieves high sensitivity without requiring a drastic change of the projector point spread function with depth, is insensitive to defocus, and has a higher signal-to-noise ratio.
(2) Compared with phase-based vertical three-dimensional scanning techniques, the method does not require a complex zero-phase point detection algorithm; it can use projector pinhole-model calibration or, within an error tolerance, directly adopt the center of the projector image coordinate system.
(3) Compared with conventional parallel single-pixel imaging, the invention adopts slice parallel single-pixel imaging and only needs to project horizontal and vertical sinusoidal basis fringe patterns, which greatly reduces the number of projected patterns and shortens the measurement time.
Drawings
FIG. 1 is a measurement flow chart of the method of the present invention.
FIG. 2 is a hardware schematic of the method of the present invention. The device comprises a telecentric camera 1, a semi-transparent semi-reflecting beam splitter prism 2, a projector 3 and a measured object 4.
FIG. 3 is a calibration flow chart of the method of the present invention.
FIG. 4 is a schematic diagram of the calibration of the method of the present invention. The device comprises a telecentric camera 1, a semi-transparent semi-reflecting beam splitter prism 2, a projector 3, a calibrated (depth) plane 4 and a translation device 5.
FIG. 5 is a flow chart of the light-transport-coefficient projection sub-pixel localization algorithm.
FIG. 6 is a schematic diagram of the light-transport-coefficient projection sub-pixel localization.
Detailed Description
For a better understanding of the present invention, the technical solutions of the present invention will be described in detail below with reference to the accompanying drawings and examples.
Referring to the attached figure 1, a parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging comprises the following steps:
1. Before starting the measurement, referring to FIG. 2, the positions of the projector and the telecentric camera are adjusted so that their optical axes are parallel. At most half of the projector image is projected onto the measured object through the semi-transparent, semi-reflective prism.
2. Calibrate the coefficients of the depth-reconstruction polynomial. The calibration procedure is shown in FIG. 3 and the calibration setup in FIG. 4. Before calibration, the translation mechanism is adjusted so that its motion axis is parallel to the optical axis of the telecentric camera and the calibration plane is perpendicular to the motion axis. The plane is moved to a depth Z_i, a series of slice parallel single-pixel imaging basis fringes is projected onto the calibration plane and captured, and the current depth value is recorded. The horizontal and vertical projections of the light transport coefficients observed by each camera pixel are obtained with the one-dimensional inverse Fourier transform and the parallel single-pixel imaging technique. The projector sub-pixel coordinate corresponding to each camera pixel is determined from the peak position of the one-dimensional light-transport projection, and the optical center distance is computed from the optical center position. The optical center position can be determined by calibrating the projector in advance, or the center of the projector image coordinate system can be used directly. The optical center distance r_pi at the current depth is recorded. The plane is then moved to a new depth and the above steps are repeated until the whole depth range is covered. A third-order polynomial is fitted to the optical center distances and depth values acquired at all depths; the mathematical expression is:
Z_i = a_0 + a_1·r_pi + a_2·r_pi^2 + a_3·r_pi^3
The polynomial coefficients of each pixel are recorded.
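The per-pixel cubic fit of this calibration step could look like the following sketch, assuming the optical center distance maps have already been computed at every calibration depth with the slice parallel single-pixel imaging procedure described in the following steps; the names and array shapes are illustrative.

```python
import numpy as np

def fit_depth_mapping(r_p_stack, depths):
    """Fit Z = a0 + a1*r_p + a2*r_p^2 + a3*r_p^3 pixel by pixel (a sketch).

    r_p_stack : (K, H, W) optical center distances measured on K calibration
                planes (one slice-PSI measurement per depth)
    depths    : (K,) recorded depth value of each plane
    returns   : (H, W, 4) coefficients ordered a0, a1, a2, a3
    """
    K, H, W = r_p_stack.shape
    samples = r_p_stack.reshape(K, -1)
    coeffs = np.empty((H * W, 4))
    for idx in range(H * W):
        # np.polyfit returns highest order first, so reverse to a0..a3
        c = np.polyfit(samples[:, idx], depths, deg=3)
        coeffs[idx] = c[::-1]
    return coeffs.reshape(H, W, 4)
```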
3. Place the measured object within the calibrated range, project a series of slice parallel single-pixel imaging basis fringes onto it, and capture the images with the camera. The series of slice parallel single-pixel imaging basis fringes are the horizontal- and vertical-slice sinusoidal basis fringe patterns of parallel single-pixel imaging, divided into a coarse fringe pattern and a fine fringe pattern; a four-step phase-shift technique is adopted, and their mathematical expressions are respectively:
Coarse fringe pattern:

P_c^V(x_p, y_p; u, φ) = a + b·cos(2π·u·x_p/M + φ),  P_c^H(x_p, y_p; v, φ) = a + b·cos(2π·v·y_p/N + φ)
Fine fringe pattern:

P_f^V(x_p, y_p; u, φ) = a + b·cos(2π·u·x_p/M_s + φ),  P_f^H(x_p, y_p; v, φ) = a + b·cos(2π·v·y_p/N_s + φ)
where (x_p, y_p) are the discrete coordinates in the projector image coordinate system, (u, v) are the discrete spatial frequency coordinates in the Fourier domain, φ takes the values 0, π/2, π and 3π/2, a is the average gray level, b is the modulation depth, M and N are the width and height of the projector resolution, and M_s and N_s are the width and height of the region observed by a single camera pixel. The coarse fringes use only the low- and mid-frequency part of the full resolution, while the fine fringes cover all frequencies of the observation region.
4. Solve for the light-transport-coefficient projections and determine the projector sub-pixel coordinate corresponding to each pixel; the flow chart is shown in FIG. 5 and a schematic in FIG. 6. First, for the coarse fringe images, the horizontal and vertical one-dimensional Fourier coefficients are obtained with the four-step phase-shift principle. The mathematical expression is:
C_V(u) = [(R_V(u, 0) − R_V(u, π)) + j·(R_V(u, π/2) − R_V(u, 3π/2))]/(2b)
C_H(v) = [(R_H(v, 0) − R_H(v, π)) + j·(R_H(v, π/2) − R_H(v, 3π/2))]/(2b)
where R_V(u, φ) and R_H(v, φ) are the intensities collected by the camera pixel when the vertical (V) or horizontal (H) fringe of frequency u or v and phase shift φ is projected, b is the modulation depth, and j is the imaginary unit. Because only the low-frequency coefficients are acquired, the demodulated one-dimensional Fourier coefficients are zero-padded until the vertical-fringe coefficients have length M and the horizontal-fringe coefficients have length N. The zero-padded one-dimensional Fourier coefficients are then inverse-transformed, i.e.:
h_V(x_p) = IFFT{C_V(u)},  h_H(y_p) = IFFT{C_H(v)}
where h is the one-dimensional projection signal of the light transport coefficients. Taking the maxima of the inverse-transformed signals gives the integer projector pixel coordinates (x'_p, y'_p) corresponding to the camera pixel, i.e.:

x'_p = argmax |h_V(x_p)|,  y'_p = argmax |h_H(y_p)|
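For one camera pixel and the vertical-fringe direction, the coarse localization described above could be sketched as follows; the storage of the four phase-shifted responses per frequency and the function name are assumptions of this sketch.

```python
import numpy as np

def coarse_locate(R, M, b=1.0):
    """Coarse integer localization along x_p for one camera pixel (a sketch).

    R : (U, 4) intensities observed by this pixel for the U coarse vertical
        frequencies, at phase shifts 0, pi/2, pi, 3*pi/2.
    Returns the integer coordinate where the light-transport projection peaks.
    """
    U = R.shape[0]
    # four-step phase-shift demodulation of the 1-D Fourier coefficients
    C = ((R[:, 0] - R[:, 2]) + 1j * (R[:, 1] - R[:, 3])) / (2.0 * b)
    C_full = np.zeros(M, dtype=complex)
    C_full[:U] = C                      # uncollected high frequencies set to 0
    h = np.fft.ifft(C_full)             # 1-D light-transport projection
    return int(np.argmax(np.abs(h)))
```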
Similarly, for the fine fringe images, the one-dimensional Fourier coefficients of length M_s and N_s are computed with the four-step phase-shift technique, and the one-dimensional inverse Fourier transform is applied to them directly, that is:
h_V^s(x_p) = IFFT{C_V^s(u)},  h_H^s(y_p) = IFFT{C_H^s(v)}
The signals h_V^s and h_H^s are then extended periodically until they cover the coarse coordinates (x'_p, y'_p), and only the light-transport projection signal of the one period covering that range is retained; the process is illustrated in FIG. 6. Finally, the sub-pixel coordinate (x*_p, y*_p) is computed with the gray-scale centroid method, i.e.:
x*_p = Σ x_p·h_V^s(x_p) / Σ h_V^s(x_p),  y*_p = Σ y_p·h_H^s(y_p) / Σ h_H^s(y_p)

where the sums run over the single retained period of the periodically extended signals.
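A corresponding sketch of the fine localization along x_p, assuming the fine-fringe coefficients for the observation window have been demodulated in the same way as above; aligning the retained period to multiples of M_s is a simplifying assumption of this sketch, and the names are illustrative.

```python
import numpy as np

def fine_locate(C_s, Ms, M, x_coarse):
    """Sub-pixel localization along x_p for one camera pixel (a sketch).

    C_s      : (Ms,) 1-D Fourier coefficients of the fine vertical fringes
    Ms       : width of the single-pixel observation window
    M        : full projector width
    x_coarse : integer coordinate from the coarse localization
    """
    h_s = np.abs(np.fft.ifft(C_s))               # one period of the projection
    reps = int(np.ceil(M / Ms))
    h = np.tile(h_s, reps)[:M]                   # periodic extension over 0..M-1
    # keep only the single period that covers the coarse position
    start = (x_coarse // Ms) * Ms
    mask = np.zeros(M)
    mask[start:start + Ms] = 1.0
    h *= mask
    x = np.arange(M)
    return float(np.sum(x * h) / np.sum(h))      # gray-scale centroid
```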
5. Compute the optical center distance r_p observed by each camera pixel from the projector sub-pixel coordinate obtained in step 4. The mathematical expression is:
r_p = sqrt((x*_p − x_Op)^2 + (y*_p − y_Op)^2)
where (x_Op, y_Op) is the projector optical center coordinate, obtained by calibrating a projector pinhole model or taken directly as the center (M/2, N/2) of the projector pixel coordinate system.
6. Compute the three-dimensional coordinate of each pixel from the calibration coefficients of step 2 and the optical center distance obtained in step 5. The lateral coordinates (X, Y) are computed from the pixel size and magnification of the telecentric camera, and the computed optical center distance r_p is substituted into the polynomial calibrated in step 2 to obtain the depth Z.
For a camera image point (x_c, y_c), the mathematical expression is:

X = μ_c·x_c/β,  Y = μ_c·y_c/β,  Z = a_0 + a_1·r_p + a_2·r_p^2 + a_3·r_p^3
where μ_c is the telecentric camera pixel size, β is the magnification, r_p is the optical center distance computed during the actual measurement, and a_i are the calibration polynomial coefficients of that pixel.

Claims (5)

1. A parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging is characterized in that the measurement process comprises the following steps:
(1) projecting and capturing a series of slice parallel single-pixel imaging basis fringe images;
(2) calculating the projector sub-pixel coordinate corresponding to each camera pixel's observation region using the slice parallel single-pixel imaging basis fringe images acquired in step (1);
(3) calculating the optical center distance in the projector image coordinate system from the projector sub-pixel coordinates calculated in step (2);
(4) acquiring the optical center distance corresponding to each camera pixel on planes at different depths and recording the depth value of each position; for the 'optical center distance-depth' data collected at each pixel, performing a third-order polynomial fit pixel by pixel and storing the polynomial coefficients as the calibration result; for a given camera pixel, the mapping expression is:
Z = a_0 + a_1·r_p + a_2·r_p^2 + a_3·r_p^3
wherein Z is the plane depth, r_p is the optical center distance computed for that pixel using the method in step (3), and a_0, a_1, a_2, a_3 are respectively the constant, linear, quadratic and cubic coefficients of the calibrated third-order polynomial for that pixel;
(5) reconstructing the three-dimensional point of each pixel using the magnification of the telecentric lens and the polynomial coefficients calibrated in step (4).
2. The parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging according to claim 1, characterized in that: in step (1), the series of slice parallel single-pixel imaging basis fringes are the horizontal- and vertical-slice sinusoidal basis fringe patterns of parallel single-pixel imaging, divided into a coarse fringe part and a fine fringe part, whose mathematical expressions are as follows:
coarse fringe pattern:

P_c^V(x_p, y_p; u, φ) = a + b·cos(2π·u·x_p/M + φ),  P_c^H(x_p, y_p; v, φ) = a + b·cos(2π·v·y_p/N + φ)
fine fringe pattern:

P_f^V(x_p, y_p; u, φ) = a + b·cos(2π·u·x_p/M_s + φ),  P_f^H(x_p, y_p; v, φ) = a + b·cos(2π·v·y_p/N_s + φ)
wherein (x_p, y_p) are the discrete coordinates in the projector image coordinate system, (u, v) are the discrete spatial frequency coordinates in the Fourier domain, φ takes the values 0, π/2, π and 3π/2, a is the average gray level, b is the modulation depth, M and N are the width and height of the projector resolution, and M_s and N_s are the width and height of the region observed by a single camera pixel; the coarse fringes use only the low- and mid-frequency part of the full resolution, while the fine fringes cover all frequencies of the observation region.
3. The parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging according to claim 1, characterized in that: in step (2), the horizontal and vertical one-dimensional Fourier coefficients at the frequencies corresponding to each pixel are computed from the coarse-fringe and fine-fringe images collected in step (1); for the coarse fringes, the high-frequency coefficients that were not acquired are set to zero, and the position of the maximum of the full-resolution one-dimensional inverse Fourier transform is taken as the coarse localization coordinate; the one-dimensional inverse Fourier transform is obtained at the resolution of the fine-fringe observation region and extended periodically until it covers the coarse localization coordinate; the erroneous parts of the extended signal are set to zero according to the coarse localization coordinate, keeping only the one period that covers the coarse position; and the sub-pixel coordinate of the peak, obtained by the gray-scale centroid method, is taken as the fine localization coordinate.
4. The parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging according to claim 1, characterized in that: in step (3), the distance from the sub-pixel projector coordinate obtained by fine localization in step (2) to the optical center in the projector image coordinate system, namely the optical center distance, is calculated; the projector optical center coordinate is obtained by calibrating a projector pinhole model, or the center (M/2, N/2) of the projector pixel coordinate system is taken directly.
5. The parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging according to claim 1, characterized in that: the reconstruction of the three-dimensional point in step (5) means that, for a camera pixel (x_c, y_c), the lateral coordinates (X, Y) are computed using the telecentric camera pixel size μ_c and the magnification β, and the optical center distance r_p computed in the actual measurement is substituted into the polynomial calibrated in step (4) to obtain the depth Z, i.e.

X = μ_c·x_c/β,  Y = μ_c·y_c/β,  Z = a_0 + a_1·r_p + a_2·r_p^2 + a_3·r_p^3
Wherein a is 0 、a 1 、a 2 、a 3 And respectively calibrating constant terms, primary terms, secondary terms and tertiary term coefficients in the obtained third-order polynomial, thereby obtaining the coordinate of the reconstructed three-dimensional point.
CN202111000654.0A 2021-08-30 2021-08-30 Parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging Active CN114216409B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111000654.0A CN114216409B (en) 2021-08-30 2021-08-30 Parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging

Publications (2)

Publication Number Publication Date
CN114216409A CN114216409A (en) 2022-03-22
CN114216409B (en) 2023-01-24

Family

ID=80695915

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111000654.0A Active CN114216409B (en) 2021-08-30 2021-08-30 Parallel-axis three-dimensional measurement method based on slice parallel single-pixel imaging

Country Status (1)

Country Link
CN (1) CN114216409B (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102073863B (en) * 2010-11-24 2012-08-15 中国科学院半导体研究所 Method for acquiring characteristic size of remote video monitored target on basis of depth fingerprint
CN103065359A (en) * 2013-01-14 2013-04-24 厦门大学 Optical imaging three-dimensional contour reconstruction system and reconstruction method
CN109087348B (en) * 2017-06-14 2022-04-29 北京航空航天大学 Single-pixel imaging method based on adaptive area projection
CN110425986B (en) * 2019-07-17 2020-10-16 北京理工大学 Three-dimensional calculation imaging method and device based on single-pixel sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant