CN112325799A - High-precision three-dimensional face measurement method based on near-infrared light projection - Google Patents

High-precision three-dimensional face measurement method based on near-infrared light projection Download PDF

Info

Publication number
CN112325799A
Authority
CN
China
Prior art keywords
phase
dimensional
precision
camera
projection
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110016306.6A
Other languages
Chinese (zh)
Inventor
张晓磊
左超
胡岩
沈德同
许明珠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University Of Technology Intelligent Computing Imaging Research Institute Co ltd
Original Assignee
Nanjing University Of Technology Intelligent Computing Imaging Research Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University Of Technology Intelligent Computing Imaging Research Institute Co ltd filed Critical Nanjing University Of Technology Intelligent Computing Imaging Research Institute Co ltd
Priority to CN202110016306.6A priority Critical patent/CN112325799A/en
Publication of CN112325799A publication Critical patent/CN112325799A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B11/254Projection of a pattern, viewing through a pattern, e.g. moiré
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a high-precision three-dimensional face measurement method based on near-infrared light projection. The method first acquires the wrapped-phase information in the fringe images by a three-step phase-shift method, using an infrared projection module in place of a digital projector to project the phase-shifted fringes and a black-and-white camera fitted with an optical filter in place of a specialized infrared camera; the absolute phase is then robustly recovered with a multi-frequency temporal phase unwrapping method, realizing high-precision three-dimensional data acquisition. Experiments on real faces show that, compared with traditional methods, the method acquires three-dimensional face data at lower cost and with higher precision.

Description

High-precision three-dimensional face measurement method based on near-infrared light projection
Technical Field
The invention relates to the technical field of three-dimensional imaging, in particular to a high-precision three-dimensional face measurement method based on near infrared light projection.
Background
Fringe projection is one of the most popular optical non-contact three-dimensional shape measurement technologies and is used in a wide variety of fields, such as mechanical engineering, industrial monitoring, medical imaging, computer vision, education, biomedicine, and virtual/augmented reality (Gorthi, S. S. and Rastogi, P., "Fringe projection techniques: Whither we are?", Optics and Lasers in Engineering 48(2), 133-140 (2010)). Compared with traditional two-dimensional face recognition, three-dimensional face recognition overcomes the influence of pose, expression and illumination changes on recognition, and therefore has broad application prospects (Taskiran, M., Kahraman, N., and Erdem, C. E., "Face recognition: Past, present and future (a review)", Digital Signal Processing 106, 102809 (2020)).
To realize high-precision three-dimensional face measurement by fringe projection, a light source in the infrared or near-infrared band should first be selected: it lies outside the visible range and is not perceived by the naked eye, which minimizes interference with the eyes and avoids discomfort during acquisition (Guo, K., Wu, S., and Xu, Y., "Face recognition using both visible light image and near-infrared image and a deep network", CAAI Transactions on Intelligence Technology 2(1), 39-47 (2017)). Second, the patterns should be projected as fast as possible to minimize the disturbance caused by human motion (Yang, C., Zhou, H., Sun, S., Liu, R., Zhao, J., and Ma, J., Infrared Physics & Technology 67, 111-115 (2014)). Finally, to broaden market applications, the hardware cost of the projection system needs to be controlled. For example, the successful application of three-dimensional measurement products such as the Microsoft Kinect, Intel RealSense and Apple iPhone X has pushed the development of related applications and the demand for better three-dimensional face imaging technology (Zhang, Z., "Review of single-shot 3D shape measurement by phase calculation-based fringe projection techniques", Optics and Lasers in Engineering 50(8), 1097-1106 (2012)).
Fringe projection techniques can be broadly classified into Fourier transform profilometry (M. Takeda and K. Mutoh, "Fourier-transform profilometry for the automatic measurement of 3-D object shapes", Applied Optics 22, 3977-3982 (1983)) and phase-shifting profilometry (V. Srinivasan, H.-C. Liu, and M. Halioua, "Automated phase-measuring profilometry of 3-D diffuse objects", Applied Optics 23, 3105-3108 (1984)). The former is based on spatial filtering and can perform three-dimensional reconstruction from a single frame, so it is insensitive to motion and suitable for measuring dynamic scenes; however, its measurement accuracy is lower because it suppresses ambient light less effectively than phase-shifting profilometry. The latter adopts a multi-frame fringe projection strategy that effectively suppresses the influence of ambient light and the reflectivity of the measured object on phase recovery, and therefore achieves higher precision; but it requires at least three images to obtain high-quality three-dimensional data and is sensitive to motion errors. A digital projector can raise the speed of the hardware, but high-precision, low-cost and small-size face measurement remains a great challenge, which limits the development of fringe projection in the field of face recognition.
Disclosure of Invention
The invention aims to provide a high-precision three-dimensional face measurement method based on near-infrared light projection.
The technical scheme of the invention is as follows: a high-precision three-dimensional face measurement method based on near-infrared light projection comprises the following steps:
step one, a projection system is built using a near-infrared light projection module and two cameras, and system calibration is completed;
step two, the wrapped phase of the measured object is obtained according to phase-shift profilometry, and the absolute phase is obtained using a multi-frequency temporal phase unwrapping method, which specifically comprises the following steps:
step 2.1, three images with a three-step phase shift are projected by the projection module and the fringe-deformed images are synchronously acquired by the two cameras; for phase-shift profilometry, the three-step phase-shifted images acquired by a camera are expressed as

I_1(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v) - 2\pi/3]

I_2(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v)]

I_3(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v) + 2\pi/3]

wherein (u,v) represents the coordinates of a pixel point on the camera, I_1, I_2, I_3 represent the three intensity maps acquired by the camera, A(u,v) represents the average intensity, B(u,v) represents the modulation intensity, and \Phi(u,v) represents the absolute phase of the fringe pattern;

step 2.2, the wrapped phase is solved from the acquired images by the phase-shift technique,

\phi(u,v) = \arctan\frac{\sqrt{3}\,[I_1(u,v) - I_3(u,v)]}{2I_2(u,v) - I_1(u,v) - I_3(u,v)}

wherein \phi(u,v) represents the solved wrapped phase; the wrapped phase is related to the absolute phase by

\Phi(u,v) = \phi(u,v) + 2\pi k(u,v), \quad k(u,v) \in \{0, 1, \ldots, N-1\}

wherein k represents the fringe order and N represents the number of fringes;

step 2.3, the wrapped phase is unwrapped to obtain the absolute phase: assume the wrapped phase to be unwrapped is \phi_h(u,v) and the projection frequency is f; a set of fringe patterns with frequency 1 is projected onto the object to find its wrapped phase \Phi_1(u,v); according to the multi-frequency temporal method, the absolute phase is expressed as

\Phi_h(u,v) = \phi_h(u,v) + 2\pi k(u,v)

so that the fringe order is obtained as

k(u,v) = \mathrm{Round}\!\left[\frac{f\,\Phi_1(u,v) - \phi_h(u,v)}{2\pi}\right]

wherein \mathrm{Round} is the rounding function;
step three, the corresponding points of the two cameras are solved, and the sub-pixel matching points of the main camera are obtained by interpolation;
step four, the spatial three-dimensional points of the measured object are calculated from the fused sub-pixel matching points to obtain the three-dimensional data.
Preferably, step three is specifically: after the absolute phase is obtained, the corresponding three-dimensional point under the main camera's view angle is obtained from the calibration parameters of the main and auxiliary cameras; this three-dimensional point is mapped into the auxiliary camera by epipolar mapping to obtain the main camera's corresponding point in the auxiliary camera; finally, the sub-pixel matching point under phase-shift profilometry is found by interpolation.
Preferably, in step four, the spatial three-dimensional point coordinates of the measured object are calculated from the sub-pixel matching points by linear triangulation,

\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} =
\begin{bmatrix}
m_{11} - u_1 m_{31} & m_{12} - u_1 m_{32} & m_{13} - u_1 m_{33} \\
m_{21} - v_1 m_{31} & m_{22} - v_1 m_{32} & m_{23} - v_1 m_{33} \\
p_{11} - u_2 p_{31} & p_{12} - u_2 p_{32} & p_{13} - u_2 p_{33}
\end{bmatrix}^{-1}
\begin{bmatrix} u_1 m_{34} - m_{14} \\ v_1 m_{34} - m_{24} \\ u_2 p_{34} - p_{14} \end{bmatrix}

wherein (u_1, v_1) is a pixel in the main camera, u_2 is the horizontal coordinate of its sub-pixel matching point in the auxiliary camera, (X, Y, Z) represents the reconstructed three-dimensional data, and m_{ij} and p_{ij} are the two-dimensional-to-three-dimensional mapping parameters of the two cameras obtained through calibration.
Compared with traditional methods, the method has the following advantages: (1) infrared light is used as the projection source instead of visible light, avoiding stimulation of the human eye during information acquisition; (2) a small projection module and an optical filter replace expensive digital projection and infrared image acquisition equipment, reducing both the size and the hardware cost of the system; (3) compared with existing infrared scanning schemes, higher-precision data acquisition is achieved by using the phase-shift method and temporal phase unwrapping algorithms.
Drawings
Fig. 1 is a schematic flow chart of a high-precision three-dimensional face measurement method based on near-infrared light projection according to an embodiment.
Fig. 2 is a schematic diagram of a binocular infrared fringe projection system of an embodiment.
FIG. 3 shows a ceramic precision sphere to be measured in a first precision measurement experiment of the example.
Fig. 4 shows the result of the proposed near-infrared light projection measurement method in the first precision measurement experiment of the embodiment.
FIG. 5 shows the results of the first precision measurement experiment of the example, the error between the measured spherical surface and the spherical surface fitted with the results.
Fig. 6 is a first precision measurement experiment of the example, showing the histogram of fig. 5.
FIG. 7 shows a ceramic plate to be measured in a second precision measurement experiment of the example.
Fig. 8 shows a second precision measurement experiment of the embodiment, which provides a result measured by the near infrared light projection measurement method.
FIG. 9 shows the error between the measured plate result and the plane fitted with the result of the second precision measurement experiment of the example.
Fig. 10 is a second precision measurement experiment of the example, showing the histogram of fig. 9.
Fig. 11 shows a high-precision measurement scene experiment of a real face, in which a camera captures a raw infrared image.
Fig. 12 shows a three-dimensional face reconstruction result at a first view angle in a high-precision measurement scene experiment of a real face according to an embodiment.
Fig. 13 shows a three-dimensional face reconstruction result at a second viewing angle in a high-precision measurement scene experiment of a real face according to the embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application.
Fig. 1 is a schematic flow chart of a high-precision three-dimensional human face measurement method based on near-infrared light projection according to this embodiment, and specific implementation steps of the method are as follows.
First, the measuring system (also called a binocular infrared fringe projection system) is set up and system calibration is completed. The system comprises a computer, two black-and-white cameras and an infrared projection module; the projector is connected to the cameras through two trigger lines, and the cameras are connected to the computer through data lines. The components are arranged as shown in figure 2.
Then, the whole system is calibrated into a unified world coordinate system using Zhang's method (Z. Zhang, "A flexible new technique for camera calibration", IEEE Transactions on Pattern Analysis and Machine Intelligence 22(11), 1330-1334 (2000)) to obtain the calibration parameters of the two cameras in the world coordinate system, and these parameters are converted into two-dimensional-to-three-dimensional and two-dimensional-to-two-dimensional mapping parameters (K. Liu, Y. Wang, D. L. Lau, et al., "Dual-frequency pattern scheme for high-speed 3-D shape measurement", Optics Express 18(5), 5229-5244 (2010)).
Step two: the wrapped phase of the measured object is obtained by phase-shift profilometry, and the phase is unwrapped with a multi-frequency temporal phase unwrapping method (T. Tao, Q. Chen, S. Feng, Y. Hu, M. Zhang, and C. Zuo, "High-precision real-time 3D shape measurement based on a quad-camera system", Journal of Optics 20, 014009 (2018)). The specific steps are: three images with a three-step phase shift are projected by the projection module and the fringe-deformed images are synchronously acquired by the two cameras; the wrapped phase is then solved from the acquired images by the phase-shift technique.
The solving process is as follows. For phase-shift profilometry, the three-step phase-shifted images acquired by the camera can be represented as

I_1(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v) - 2\pi/3]

I_2(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v)]

I_3(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v) + 2\pi/3]

In the above formulas, (u,v) represents the coordinates of a pixel point on the camera, I_1, I_2, I_3 represent the three intensity maps acquired by the camera, A(u,v) represents the average intensity, B(u,v) represents the modulation intensity, and \Phi(u,v) represents the phase of the fringe pattern.

Due to the truncation effect of the arctangent function, only the wrapped phase can be obtained from the acquired fringe patterns,

\phi(u,v) = \arctan\frac{\sqrt{3}\,[I_1(u,v) - I_3(u,v)]}{2I_2(u,v) - I_1(u,v) - I_3(u,v)}

where \phi(u,v) represents the solved wrapped phase.

The wrapped phase is related to the absolute phase by

\Phi(u,v) = \phi(u,v) + 2\pi k(u,v), \quad k(u,v) \in \{0, 1, \ldots, N-1\}

where k represents the fringe order and N represents the number of fringes; the process of solving the fringe order is called unwrapping.
The wrapped phase is then unwrapped to obtain the absolute phase. To improve the reliability of unwrapping, the invention uses multi-frequency temporal phase unwrapping (C. Zuo, L. Huang, M. Zhang, Q. Chen, and A. Asundi, "Temporal phase unwrapping algorithms for fringe projection profilometry: A comparative review", Optics and Lasers in Engineering 85, 84-103 (2016)). The unwrapping process is as follows: assume the wrapped phase to be unwrapped is \phi_h(u,v) and the projection frequency is f; a set of fringe patterns with frequency 1 is projected onto the object to find its wrapped phase \Phi_1(u,v), which at unit frequency is already absolute. According to the multi-frequency temporal method, the absolute phase can be expressed as

\Phi_h(u,v) = \phi_h(u,v) + 2\pi k(u,v)

and the fringe order can be obtained from

k(u,v) = \mathrm{Round}\!\left[\frac{f\,\Phi_1(u,v) - \phi_h(u,v)}{2\pi}\right]

where \mathrm{Round} is the rounding function.
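The multi-frequency unwrapping step can likewise be sketched with synthetic data (an illustration, not the patent's implementation); the high-frequency fringe count f below is an assumed example value.

```python
import numpy as np

f = 16.0                                  # assumed high-frequency fringe count
x = np.linspace(0.0, 1.0, 200)            # normalized pixel coordinate

Phi_true = 2 * np.pi * f * x              # true absolute high-frequency phase
phi_h = np.angle(np.exp(1j * Phi_true))   # wrapped high-frequency phase, in (-pi, pi]
Phi_1 = 2 * np.pi * x                     # unit-frequency phase: absolute, no wrapping

# Multi-frequency temporal unwrapping: the scaled unit-frequency phase
# f*Phi_1 predicts the absolute phase, so the fringe order is
# k = Round((f*Phi_1 - phi_h) / 2pi), and the absolute phase is
# Phi_h = phi_h + 2*pi*k.
k = np.round((f * Phi_1 - phi_h) / (2 * np.pi))
Phi_h = phi_h + 2 * np.pi * k

assert np.allclose(Phi_h, Phi_true)
```

Because the fringe order comes from rounding, the unit-frequency phase only needs to predict the absolute phase to within half a fringe period, which is what makes the method robust in practice.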
Step three: after the absolute phase is obtained, epipolar mapping is used to search, along the epipolar line of the corresponding image, for the matching point whose phase difference is within a given threshold, and the sub-pixel matching point is finally found by interpolation.
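The sub-pixel search of step three can be sketched in one dimension (a simplification that assumes rectified images, so the epipolar line of a main-camera row is the same row in the auxiliary camera; the phase profiles are synthetic, not the patent's exact procedure):

```python
import numpy as np

# One epipolar line, rectified geometry: the auxiliary camera's absolute
# phase increases monotonically along the row (synthetic linear profile).
cols = np.arange(50, dtype=float)
phase_aux = 0.4 * cols + 1.0   # auxiliary-camera phase at integer columns
target = 0.4 * 12.3 + 1.0      # phase of one main-camera pixel
                               # (its true match is column 12.3)

# Locate the integer column whose phase brackets the target, then
# interpolate linearly between neighbors to obtain the sub-pixel column.
j = int(np.searchsorted(phase_aux, target)) - 1
t = (target - phase_aux[j]) / (phase_aux[j + 1] - phase_aux[j])
u_sub = j + t

assert abs(u_sub - 12.3) < 1e-6
```

The linear interpolation between the two bracketing integer columns is what turns the pixel-level phase match into a sub-pixel matching point.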
Step four: the spatial three-dimensional points of the measured object are calculated from the fused sub-pixel matching points to obtain the three-dimensional data. Specifically, after the fused sub-pixel matching points are obtained, the reconstructed three-dimensional data can be computed by linear triangulation,

\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} =
\begin{bmatrix}
m_{11} - u_1 m_{31} & m_{12} - u_1 m_{32} & m_{13} - u_1 m_{33} \\
m_{21} - v_1 m_{31} & m_{22} - v_1 m_{32} & m_{23} - v_1 m_{33} \\
p_{11} - u_2 p_{31} & p_{12} - u_2 p_{32} & p_{13} - u_2 p_{33}
\end{bmatrix}^{-1}
\begin{bmatrix} u_1 m_{34} - m_{14} \\ v_1 m_{34} - m_{24} \\ u_2 p_{34} - p_{14} \end{bmatrix}

where (u_1, v_1) is a pixel in the main camera, u_2 is the horizontal coordinate of its sub-pixel matching point in the auxiliary camera, (X, Y, Z) represents the reconstructed three-dimensional data, and m_{ij} and p_{ij} are the two-dimensional-to-three-dimensional mapping parameters of the two cameras obtained through calibration.
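The triangulation of step four can be sketched as standard linear triangulation with two 3x4 projection matrices; the matrices and the world point below are toy assumptions for illustration, not calibration results from the patent.

```python
import numpy as np

# Toy 3x4 projection matrices for the main (M) and auxiliary (P) cameras
# (assumed values: focal length 800 px, principal point (320, 240)).
M = np.array([[800.0, 0.0, 320.0, 0.0],
              [0.0, 800.0, 240.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
P = np.array([[800.0, 0.0, 320.0, -80000.0],  # auxiliary camera 100 mm along +X
              [0.0, 800.0, 240.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])

Xw = np.array([30.0, -20.0, 500.0, 1.0])      # ground-truth world point (homogeneous)

def project(C, X):
    """Pinhole projection of a homogeneous 3D point to pixel coordinates."""
    p = C @ X
    return p[0] / p[2], p[1] / p[2]

u1, v1 = project(M, Xw)
u2, _ = project(P, Xw)   # only the horizontal coordinate is needed

# Linear triangulation: two equations from (u1, v1) in the main camera and
# one from u2 in the auxiliary camera, solved for the 3D point (X, Y, Z).
Arows = np.array([M[0] - u1 * M[2],
                  M[1] - v1 * M[2],
                  P[0] - u2 * P[2]])
X = np.linalg.solve(Arows[:, :3], -Arows[:, 3])

assert np.allclose(X, Xw[:3])
```

Each row of the linear system states that the reconstructed point must reproject to the observed pixel coordinate in one of the two cameras, which is the matrix equation given above written out for a solver.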
To test the feasibility and real-time performance of the method, a binocular infrared high-precision three-dimensional imaging system for face measurement scenes was built according to this embodiment; its structure is shown in figure 2. In this system, the infrared projection module is of type UFGKH0A1IR01, with a scan angle of 60°, a maximum scan speed of 32 kHz and a laser center wavelength of 830 nm. Two black-and-white cameras (Basler acA640-750um, maximum frame rate 750 fps, maximum resolution 640 x 480) fitted with 12 mm lenses are used for stereo phase unwrapping. The optical filters mounted on the lenses transmit in the 800-1100 nm range. In the experiment, the projector ran at 500 Hz, and both cameras were triggered by the projector.
First, a precision measurement scene based on a precision sphere was designed; the measurement results are shown in figures 3-6. Figure 3 shows the ceramic precision sphere to be measured, figure 4 the result measured by the proposed near-infrared light projection method, figure 5 the error between the measured spherical surface and the sphere fitted to the result, and figure 6 the histogram of figure 5. The measurement results show that the accuracy of the method of the invention is 0.14 mm.
To further verify the system precision, a second precision experiment was designed, in which a ceramic plate was measured. The final measurement results are shown in figures 7-10: figure 7 shows the ceramic plate to be measured, figure 8 the result measured by the proposed near-infrared light projection method, figure 9 the error between the measured plate and the plane fitted to the result, and figure 10 the histogram of figure 9. The measurement results show that the precision of the method is 0.15 mm. Together, the two experiments demonstrate that the precision of the proposed method is better than 0.15 mm.
Finally, in order to further verify that the system can be well applied to a real scene, high-precision measurement experiments of a real face are carried out, and the experimental results are shown in fig. 11 to 13, wherein fig. 11 is an original infrared image captured by a camera, and fig. 12 to 13 are three-dimensional face reconstruction results under different viewing angles.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present application. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the application. Thus, the present application is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (3)

1. A high-precision three-dimensional face measurement method based on near infrared light projection is characterized by comprising the following steps:
step one, a projection system is built using a near-infrared light projection module and two cameras, and system calibration is completed;
step two, the wrapped phase of the measured object is obtained according to phase-shift profilometry, and the absolute phase is obtained using a multi-frequency temporal phase unwrapping method, which specifically comprises the following steps:
step 2.1, three images with a three-step phase shift are projected by the projection module and the fringe-deformed images are synchronously acquired by the two cameras; for phase-shift profilometry, the three-step phase-shifted images acquired by a camera are expressed as

I_1(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v) - 2\pi/3]

I_2(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v)]

I_3(u,v) = A(u,v) + B(u,v)\cos[\Phi(u,v) + 2\pi/3]

wherein (u,v) represents the coordinates of a pixel point on the camera, I_1, I_2, I_3 represent the three intensity maps acquired by the camera, A(u,v) represents the average intensity, B(u,v) represents the modulation intensity, and \Phi(u,v) represents the absolute phase of the fringe pattern;

step 2.2, the wrapped phase is solved from the acquired images by the phase-shift technique,

\phi(u,v) = \arctan\frac{\sqrt{3}\,[I_1(u,v) - I_3(u,v)]}{2I_2(u,v) - I_1(u,v) - I_3(u,v)}

wherein \phi(u,v) represents the solved wrapped phase; the wrapped phase is related to the absolute phase by

\Phi(u,v) = \phi(u,v) + 2\pi k(u,v), \quad k(u,v) \in \{0, 1, \ldots, N-1\}

wherein k represents the fringe order and N represents the number of fringes;

step 2.3, the wrapped phase is unwrapped to obtain the absolute phase: assume the wrapped phase to be unwrapped is \phi_h(u,v) and the projection frequency is f; a set of fringe patterns with frequency 1 is projected onto the object to find its wrapped phase \Phi_1(u,v); according to the multi-frequency temporal method, the absolute phase is expressed as

\Phi_h(u,v) = \phi_h(u,v) + 2\pi k(u,v)

so that the fringe order is obtained as

k(u,v) = \mathrm{Round}\!\left[\frac{f\,\Phi_1(u,v) - \phi_h(u,v)}{2\pi}\right]

wherein \mathrm{Round} is the rounding function;
step three, the corresponding points of the two cameras are solved, and the sub-pixel matching points of the main camera are obtained by interpolation;
step four, the spatial three-dimensional points of the measured object are calculated from the fused sub-pixel matching points to obtain the three-dimensional data.
2. The high-precision three-dimensional face measurement method based on near-infrared light projection according to claim 1, wherein step three is specifically: after the absolute phase is obtained, the corresponding three-dimensional point under the main camera's view angle is obtained from the calibration parameters of the main and auxiliary cameras; this three-dimensional point is mapped into the auxiliary camera by epipolar mapping to obtain the main camera's corresponding point in the auxiliary camera; finally, the sub-pixel matching point under phase-shift profilometry is found by interpolation.
3. The high-precision three-dimensional face measurement method based on near-infrared light projection according to claim 1, wherein in step four the spatial three-dimensional point coordinates of the measured object are calculated from the sub-pixel matching points by linear triangulation,

\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} =
\begin{bmatrix}
m_{11} - u_1 m_{31} & m_{12} - u_1 m_{32} & m_{13} - u_1 m_{33} \\
m_{21} - v_1 m_{31} & m_{22} - v_1 m_{32} & m_{23} - v_1 m_{33} \\
p_{11} - u_2 p_{31} & p_{12} - u_2 p_{32} & p_{13} - u_2 p_{33}
\end{bmatrix}^{-1}
\begin{bmatrix} u_1 m_{34} - m_{14} \\ v_1 m_{34} - m_{24} \\ u_2 p_{34} - p_{14} \end{bmatrix}

wherein (u_1, v_1) is a pixel in the main camera, u_2 is the horizontal coordinate of its sub-pixel matching point in the auxiliary camera, (X, Y, Z) represents the reconstructed three-dimensional data, and m_{ij} and p_{ij} are the two-dimensional-to-three-dimensional mapping parameters of the two cameras obtained through calibration.
CN202110016306.6A 2021-01-07 2021-01-07 High-precision three-dimensional face measurement method based on near-infrared light projection Pending CN112325799A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110016306.6A CN112325799A (en) 2021-01-07 2021-01-07 High-precision three-dimensional face measurement method based on near-infrared light projection

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110016306.6A CN112325799A (en) 2021-01-07 2021-01-07 High-precision three-dimensional face measurement method based on near-infrared light projection

Publications (1)

Publication Number Publication Date
CN112325799A true CN112325799A (en) 2021-02-05

Family

ID=74302351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110016306.6A Pending CN112325799A (en) 2021-01-07 2021-01-07 High-precision three-dimensional face measurement method based on near-infrared light projection

Country Status (1)

Country Link
CN (1) CN112325799A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113108719A (en) * 2021-03-23 2021-07-13 南京理工大学 High-precision three-dimensional face measurement method based on near-infrared fringe projection

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110298891A1 (en) * 2010-06-04 2011-12-08 Iowa State University Research Foundation, Inc. Composite phase-shifting algorithm for 3-d shape compression
US20120120412A1 (en) * 2010-11-15 2012-05-17 Seikowave, Inc. Structured Light 3-D Measurement Module and System for Illuminating an Area-under-test using a Fixed-pattern Optic
CN109253708A (en) * 2018-09-29 2019-01-22 南京理工大学 A kind of fringe projection time phase method of deploying based on deep learning
CN109579741A (en) * 2018-11-01 2019-04-05 南京理工大学 A kind of Full-automatic multimould state three-dimensional colour measurement method based on multi-angle of view
CN109903376A (en) * 2019-02-28 2019-06-18 四川川大智胜软件股份有限公司 A kind of the three-dimensional face modeling method and system of face geological information auxiliary
CN110500970A (en) * 2019-08-01 2019-11-26 佛山市南海区广工大数控装备协同创新研究院 A kind of multi-frequency structural light three-dimensional measuring device and method
CN111207693A (en) * 2020-01-10 2020-05-29 西安交通大学 Three-dimensional measurement method of turbine blade ceramic core based on binocular structured light

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110298891A1 (en) * 2010-06-04 2011-12-08 Iowa State University Research Foundation, Inc. Composite phase-shifting algorithm for 3-d shape compression
US20120120412A1 (en) * 2010-11-15 2012-05-17 Seikowave, Inc. Structured Light 3-D Measurement Module and System for Illuminating an Area-under-test using a Fixed-pattern Optic
CN109253708A (en) * 2018-09-29 2019-01-22 南京理工大学 A kind of fringe projection time phase method of deploying based on deep learning
CN109579741A (en) * 2018-11-01 2019-04-05 南京理工大学 A kind of Full-automatic multimould state three-dimensional colour measurement method based on multi-angle of view
CN109903376A (en) * 2019-02-28 2019-06-18 四川川大智胜软件股份有限公司 A kind of the three-dimensional face modeling method and system of face geological information auxiliary
CN110500970A (en) * 2019-08-01 2019-11-26 佛山市南海区广工大数控装备协同创新研究院 A kind of multi-frequency structural light three-dimensional measuring device and method
CN111207693A (en) * 2020-01-10 2020-05-29 西安交通大学 Three-dimensional measurement method of turbine blade ceramic core based on binocular structured light

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113108719A (en) * 2021-03-23 2021-07-13 南京理工大学 High-precision three-dimensional face measurement method based on near-infrared fringe projection

Similar Documents

Publication Publication Date Title
CN109506589B (en) Three-dimensional profile measuring method based on structural light field imaging
CN110288642B (en) Three-dimensional object rapid reconstruction method based on camera array
CN110514143B (en) Stripe projection system calibration method based on reflector
CN111288925B (en) Three-dimensional reconstruction method and device based on digital focusing structure illumination light field
CN111563564B (en) Speckle image pixel-by-pixel matching method based on deep learning
Liu et al. Real-time 3D surface-shape measurement using background-modulated modified Fourier transform profilometry with geometry-constraint
CN104335005B (en) 3D is scanned and alignment system
CN109579741B (en) Full-automatic multi-mode three-dimensional color measurement method based on multiple visual angles
US20140247326A1 (en) Method and system for alignment of a pattern on a spatial coded slide image
CN107967697B (en) Three-dimensional measurement method and system based on color random binary coding structure illumination
CN104061879A (en) Continuous-scanning structured light three-dimensional surface shape perpendicular measuring method
Aliaga et al. A self-calibrating method for photogeometric acquisition of 3D objects
CN111028295A (en) 3D imaging method based on coded structured light and dual purposes
CN113205592B (en) Light field three-dimensional reconstruction method and system based on phase similarity
CN107990846B (en) Active and passive combination depth information acquisition method based on single-frame structured light
CN108596008B (en) Face shake compensation method for three-dimensional face measurement
US20210254968A1 (en) Method and System for Automatic Focusing for High-Resolution Structured Light 3D Imaging
WO2018032841A1 (en) Method, device and system for drawing three-dimensional image
CN112945141A (en) Structured light rapid imaging method and system based on micro-lens array
KR101411568B1 (en) A Hologram Generating Method using Virtual View-point Depth Image Synthesis
CN113108721A (en) High-reflectivity object three-dimensional measurement method based on multi-beam self-adaptive complementary matching
CN117450955B (en) Three-dimensional measurement method for thin object based on space annular feature
CN112325799A (en) High-precision three-dimensional face measurement method based on near-infrared light projection
Zhou et al. Three-dimensional shape measurement using color random binary encoding pattern projection
CN111121663B (en) Object three-dimensional topography measurement method, system and computer-readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210205