CN111811432A - Three-dimensional imaging system and method


Info

Publication number
CN111811432A
Authority
CN
China
Prior art keywords
projector
camera
dimensional
phase
measured
Prior art date
Legal status
Pending
Application number
CN202010548295.1A
Other languages
Chinese (zh)
Inventor
魏永超
邓春艳
敖良忠
夏桂书
Current Assignee
Civil Aviation Flight University of China
Original Assignee
Civil Aviation Flight University of China
Priority date
Filing date
Publication date
Application filed by Civil Aviation Flight University of China filed Critical Civil Aviation Flight University of China
Priority to CN202010548295.1A
Publication of CN111811432A
Legal status: Pending (current)


Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01B — MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 — Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 — Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/2504 — Calibration devices
    • G01B 11/2518 — Projection by scanning of the object
    • G01B 11/2527 — Projection by scanning of the object with phase change by in-plane movement of the pattern

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a three-dimensional imaging system and method, and relates to the technical field of spectral imaging. The system comprises a PC, a camera, a projector I and a projector II, with a trigger control channel and a data transmission channel arranged between the PC and each of the camera, the projector I and the projector II. The projection light source wave bands of the projector I and the projector II are different: the projector I can project structured light of a single wave band, while the projector II can project structured light of a plurality of different wave bands. The camera is located between the projector I and the projector II. The camera, the projector I and the projector II are all height-adjustable, and the horizontal and vertical orientations of their lenses can be adjusted. The projection focuses of the camera, the projector I and the projector II converge at one point. The invention can project structured light of a plurality of light source wave bands, is suitable for three-dimensional imaging of objects made of different materials, avoids loss and incompleteness of the basic imaging information of the object, realizes information acquisition over a plurality of spectral wave bands with a single system, and makes the later-stage fusion simple with a small error.

Description

Three-dimensional imaging system and method
Technical Field
The invention relates to the technical field of spectral imaging, in particular to a multispectral structured-light three-dimensional imaging system and method.
Background
Multispectral imaging is a remote sensing technology that appeared in the early 1960s. The selection of the wave band range and the number of wave bands is directly related to the application target. By acquiring spectral information of ground objects in several or many wave bands, it realizes synchronous acquisition of target spatial information, radiation information and spectral information, improves comprehensive detection, perception and identification of target characteristics, and greatly expands the target discrimination and monitoring capability of remote sensing.
Three-dimensional imaging is a technique for extracting three-dimensional information of an object by optical means and completely restoring the three-dimensional characteristics of the object during reconstruction. How to acquire three-dimensional information of a scene quickly and accurately is the key to three-dimensional imaging technology. Passive three-dimensional imaging adopts non-structured illumination and reconstructs the three-dimensional shape of an object by computer matching and computation from two-dimensional images acquired from different viewing directions by one or more camera systems. Active three-dimensional imaging, by contrast, projects structured light onto the measured object and determines its three-dimensional information from the deformation of the structured light (or from the time of flight, etc.).
The existing active structured light three-dimensional imaging system can only project structured light with a single light source wave band, and has the following problems:
1) Because different materials absorb different light source wave bands differently, a single light source greatly limits the materials of objects that can be measured.
2) A single light source causes loss and incompleteness of the imaging information of the object.
3) Information acquisition over a plurality of spectral wave bands can only be completed by combining several systems, and the later-stage fusion is complex and error-prone.
Disclosure of Invention
The present invention aims to provide a three-dimensional imaging system and method which alleviates the above problems.
In order to alleviate the above problems, the technical scheme adopted by the invention is as follows:
in a first aspect, the invention provides a three-dimensional imaging system, which comprises a PC, a camera, a projector I and a projector II, wherein a trigger control channel and a data transmission channel are arranged between the PC and each of the camera, the projector I and the projector II;
the projection light source wave bands of the projector I and the projector II are different;
the projector I can project structured light of a single waveband;
the projector II can project structured light of a plurality of different wave bands;
the camera is positioned between the projector I and the projector II;
the camera, the projector I and the projector II are all height-adjustable, and the horizontal and vertical orientations of their lenses can be adjusted;
the projection focuses of the camera, the projector I and the projector II converge at one point.
The technical effect of the technical scheme is as follows: the three-dimensional imaging system can project a plurality of light source wave band structured light, can be suitable for three-dimensional imaging of measured objects made of various different materials, can avoid loss and incompleteness of object imaging basic information, can realize information acquisition of a plurality of spectrum wave bands by a single system, and is simple in later-stage fusion and small in error.
Furthermore, the camera, the projector I and the projector II are respectively mounted on three pan-tilt heads; the pan-tilt heads provide height adjustment and adjust the horizontal and vertical orientation of each lens, and are electrically connected with the PC.
The technical effect of the technical scheme is as follows: pan-tilt heads are mature, readily available equipment, and they can meet the attitude and position adjustment requirements of the camera, the projector I and the projector II.
Further, the camera uses a 6-Pin circular connector as an IO interface, which includes a trigger control port and a data input port.
The technical effect of the technical scheme is as follows: the structure is flexible, the imaging field range and the imaging field area can be dynamically adjusted according to the equipment characteristics or the needs of the object to be measured, and the structure does not need to be redesigned.
In a second aspect, the present invention provides a three-dimensional imaging method, which uses the three-dimensional imaging system, and the method includes:
s1, generating structured light stripes with a PC and uploading them to the projector I and the projector II;
s2, carrying out system calibration in the common field of view of the camera, the projector I and the projector II, where the measured object is placed, and acquiring the relative calibration parameters among the camera, the projector I and the projector II through the PC;
s3, adjusting the exposure brightness of the camera according to the ambient light and the intensity of the light projected by the projectors onto the measured object;
s4, placing the projector II on standby, projecting and scanning the single-wave-band structured light stripes onto the measured object with the projector I, capturing the measured object with the camera to obtain one group of deformed fringe images of the object, and uploading the fringe images to the PC;
s5, placing the projector I on standby, sequentially projecting and scanning N kinds of structured light stripes of different wave bands onto the measured object with the projector II, capturing the measured object with the camera during this process to obtain N groups of deformed fringe images of the object corresponding to the N kinds of structured light stripes of different wave bands, and uploading them to the PC;
s6, three-dimensional scanning, wherein the PC calculates the N+1 captured groups of deformed fringe images respectively according to the calibration parameters to obtain N+1 groups of three-dimensional scanning data of the object;
and s7, data reconstruction, wherein the N+1 groups of three-dimensional scanning data of the object are fused according to the calibration parameters to obtain three-dimensional data in a unified coordinate system, the three-dimensional data are filtered to obtain the final three-dimensional point cloud data of the measured object, and the three-dimensional imaging of the measured object is completed.
The technical effect of the technical scheme is as follows: by projecting structured light of a plurality of light source wave bands onto the measured object and then capturing images of the object, the method is suitable for three-dimensional imaging of measured objects made of various materials and avoids loss and incompleteness of the basic imaging information of the object; the whole method is simple to operate and very convenient to use. All calibrations between the camera and the projectors can be completed at one time, and the N+1 groups of measurement data can be fused directly with the calibration parameters, without fusion based on features or marker points, so the calculation is simple and of high precision.
Further, the projector II can project structured light of four different wave bands.
The technical effect of the technical scheme is as follows: can cover the common visible light projection wave band.
Furthermore, the projector I is an infrared projector, and the projector II is an RGB projector.
The technical effect of the technical scheme is as follows: can cover the reflection wave band range of common material of the object to be measured.
Further, in step S1, the structured light stripes are generated based on a temporal phase-unwrapping and phase-shifting structured light algorithm according to the conditions of the three-dimensional imaging task.
Further, in step S2, the calibration parameters are obtained by a nine-point calibration method.
The technical effect of the technical scheme is as follows: the nine-point calibration method uses a two-dimensional translation stage to realize horizontal translation and 360-degree rotation. The calibration plate is fixed on the two-dimensional translation stage, is translated horizontally to three positions and, at each position, is rotated to three angles so that the plane of the calibration plate forms angles of 90°, 90° − α and 90° + α with the center line of the camera field of view, where α < 30°, thereby ensuring the imaging quality of the calibration plate and the calibration accuracy. After the images of the calibration plate at the nine positions are collected, the calibration parameters of the camera and the projectors can be calculated from the known calibration plate data. The calibration plate may use calibration points or a checkerboard.
Further, in step S3, the exposure brightness of the camera is adjusted so that the camera can clearly acquire the image of the measured object.
Further, in step S6, for each group of deformed fringe images, the wrapped (truncated) phase is first obtained based on the temporal phase-shifting algorithm, the unwrapped continuous phase is then obtained from the wrapped phase, and finally the three-dimensional scanning data of the object are calculated using the calibration parameters and the unwrapped continuous phase.
The technical effect of the technical scheme is as follows: the method is suitable for a common active structured light algorithm, and can quickly and effectively obtain the complete profile of the measured object with high precision.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present invention and therefore should not be considered as limiting the scope, and for those skilled in the art, other related drawings can be obtained according to the drawings without inventive efforts.
FIG. 1 is a block diagram schematic of a three-dimensional imaging system in accordance with an embodiment of the present invention;
FIG. 2 is a schematic diagram of a calibration board position point when a nine-point calibration method is used for system calibration according to an embodiment of the present invention;
FIG. 3 is a schematic layout of a camera and projector according to an embodiment of the invention;
FIG. 4 is a schematic diagram of trigger control according to an embodiment of the present invention;
FIG. 5 is a flow chart of a method of three-dimensional imaging according to an embodiment of the present invention;
FIG. 6 is a flow chart of phase calculation and unwrapping by the difference-frequency (heterodyne) method according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. The components of embodiments of the present invention generally described and illustrated in the figures herein may be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present invention, presented in the figures, is not intended to limit the scope of the invention, as claimed, but is merely representative of selected embodiments of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the invention provides a three-dimensional imaging system, which includes a PC, a camera, a projector I and a projector II, wherein a trigger control channel and a data transmission channel are arranged between the PC and each of the camera, the projector I and the projector II.
A trigger controller is connected between the PC and the camera, the projector I and the projector II to form the trigger control channels.
The projection light source wave bands of the projector I and the projector II are different; the projector I can project structured light of a single wave band; the projector II can project structured light of a plurality of different wave bands; the camera is positioned between the projector I and the projector II; the camera, the projector I and the projector II are all height-adjustable, and the horizontal and vertical orientations of their lenses can be adjusted; the projection focuses of the camera, the projector I and the projector II converge at one point.
In this embodiment, the camera, the projector I and the projector II are respectively mounted on three pan-tilt heads, which provide height adjustment and adjust the horizontal and vertical orientation of each lens. The pan-tilt heads are electrically connected with the PC, which can send attitude and position adjustment signals to them so as to adjust the attitude and the vertical position of the camera, the projector I and the projector II.
In this embodiment, to synchronize each projector with the displayed pattern, both the projector II and the projector I support a set of trigger inputs and outputs, which are configured through a "trigger mode" section and a "trigger control" sub-tab. The camera uses a 6-Pin circular connector as its IO interface, which comprises a trigger control port for forming the trigger control channel and a data input port for forming the data transmission channel.
When the system is connected, the trigger input pin of the camera is connected to the trigger output interface of the projector II; the trigger output of the projector II is set to the required logic voltage level by placing a jumper across the corresponding pins, and the trigger output port of the projector I is connected in the same manner.
In this embodiment, the projector II is a device capable of projecting structured light of four different wave bands; the projector I is an infrared projector, and the projector II is an RGB projector. The model selected for the projector I is PDC03, used for projecting an infrared image onto the measured object; the model selected for the projector II is Lightcraft 4500, used for projecting red, blue, green and white images onto the measured object; the model selected for the camera is M3S501M-H, used for capturing and acquiring images of the measured object.
Fig. 3 is a schematic layout of the camera and the projectors: the camera is located between the two projectors, with the projector II on its left and the projector I on its right. The lens of the projector II, the camera lens and the lens of the projector I are at the same height in the z direction; the distance between the center of the camera lens and the center of the lens of the projector II is d1, the distance between the center of the camera lens and the center of the lens of the projector I is d2, and usually d1 = d2. The included angle between the optical axis of the camera lens and the optical axis of the lens of the projector II is α, and the included angle between the optical axis of the camera lens and the optical axis of the lens of the projector I is β, where α = β for system symmetry.
Because projectors differ in projection direction, a projector may have orthographic projection (the projection direction coincides with the optical axis of the lens), upward offset projection (the projection direction forms an included angle with the optical axis of the lens, i.e. the projector projects upward at a certain angle) or downward offset projection (the projection direction forms an included angle with the optical axis of the lens, i.e. the projector projects downward at a certain angle). Therefore, to keep the system convenient, pan-tilt heads are provided and the attitude and position of each device are adjusted through its pan-tilt head. Each projector can be adjusted to any required angle through its pan-tilt head; at the same time, to guarantee the imaging picture range of the camera, the camera's pan-tilt head can be raised and lowered and the camera can be leveled as required, so that the imaging range is appropriate.
In the present embodiment, for the projector I and the camera, the optical axes coincide with the projection or imaging direction, so when the projector I and the camera are placed horizontally their optical axes are also horizontal. The projection direction of the projector II, however, forms an included angle with the optical axis of its lens, i.e. the projector II projects upward by θ. To ensure that the center of the image projected by the projector II and the center of the camera image lie in the same plane and intersect at the measuring distance D, the projector II must therefore be tilted downward by θ, i.e. at an angle θ to the plane xoz.
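The convergence geometry described above can be expressed in a few lines. The following Python sketch is illustrative only: the trigonometric relations are an assumption drawn from the layout in fig. 3, and the numerical values are placeholders rather than values from the patent.

```python
import math

def convergence_angles(d1, d2, D):
    """Assumed relation: if the projector lens centers sit at horizontal offsets
    d1 and d2 from the camera lens center and all three optical axes converge at
    the measuring distance D, the yaw angles follow from triangle geometry."""
    alpha = math.degrees(math.atan2(d1, D))  # projector II toward the camera axis
    beta = math.degrees(math.atan2(d2, D))   # projector I toward the camera axis
    return alpha, beta

def pitch_compensation(theta_offset_deg):
    """Projector II projects upward by theta relative to its lens axis, so its
    head pitches the projector downward by the same theta to keep the projected
    image center in the camera's horizontal plane (xoz)."""
    return -theta_offset_deg

if __name__ == "__main__":
    # Placeholder numbers: d1 = d2 and alpha = beta, as stated in the text.
    alpha, beta = convergence_angles(d1=0.3, d2=0.3, D=1.0)
    print(f"alpha = {alpha:.1f} deg, beta = {beta:.1f} deg, "
          f"pitch = {pitch_compensation(8.0)} deg")
```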
Referring to fig. 1 to 4, an embodiment of the present invention further provides a three-dimensional imaging method, where the three-dimensional imaging system is adopted, and the method includes:
1) Generating structured light stripes with the PC and uploading them to the projector I and the projector II.
In this embodiment, the PC implements the structured light calculation based on a temporal phase-unwrapping and phase-shifting structured light algorithm according to the conditions of the three-dimensional imaging task, and generates the structured light stripes.
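For illustration, a minimal Python sketch of phase-shifted fringe generation is given below. The resolution, number of phase steps and frequency set are assumptions for the example, not values specified in the patent; for the projector II the same grayscale pattern would be written into the red, green, blue or white channel as required.

```python
import numpy as np

def make_fringe_patterns(width, height, periods, n_steps=4):
    """Generate n_steps phase-shifted sinusoidal fringe images with 'periods'
    full cycles across the image width, returned as uint8 grayscale images."""
    x = np.arange(width)
    patterns = []
    for k in range(n_steps):
        phase_shift = 2.0 * np.pi * k / n_steps
        row = 0.5 + 0.5 * np.cos(2.0 * np.pi * periods * x / width + phase_shift)
        img = np.tile((255.0 * row).astype(np.uint8), (height, 1))
        patterns.append(img)
    return patterns

if __name__ == "__main__":
    # Illustrative three-frequency set for multi-frequency (heterodyne) unwrapping.
    for f in (70, 64, 59):
        imgs = make_fringe_patterns(912, 1140, periods=f)
        print(f, len(imgs), imgs[0].shape)
```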
The projector I and the projector II are powered on and connected with the PC through USB data lines to form the data transmission channels; the PC uploads the generated structured light stripes to the projectors through these channels, where they are stored.
2) System calibration
System calibration is carried out in the common field of view of the camera, the projector I and the projector II, where the measured object is placed, and the relative calibration parameters among the camera, the projector I and the projector II are obtained through the PC.
As shown in fig. 2, in the embodiment of the present invention, a nine-point calibration method is used to perform system calibration to obtain calibration parameters.
When placing the calibration plate, it is necessary to ensure that the entire calibration plate is in the common field of view of the camera and the projectors; the calibration plate is placed at 9 different positions within the field of view and scanned at each. The PC sends trigger control instructions to the camera and the projectors through the trigger control channel to complete the scanning of the calibration plate.
After scanning at the 9 positions is finished, the calibration parameters are calculated and the calibration is complete; if the calibration fails, the calibration process is executed again. In fig. 2, c1 is the minimum imaging distance and c2 is the maximum imaging distance; during three-dimensional imaging the measured object is kept between c1 and c2 so that a clear image is obtained.
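For the camera side of the nine-position calibration, a minimal sketch using OpenCV and a checkerboard plate might look as follows. The board size, square pitch and the use of OpenCV are assumptions of this example; the projector side, which is calibrated by treating the projector as an inverse camera using the decoded fringe phase, is not shown.

```python
import cv2
import numpy as np

def calibrate_from_nine_views(image_files, board_size=(11, 8), square_mm=10.0):
    """Estimate camera intrinsics and per-view extrinsics from the 9
    calibration-board images (board_size = inner corners, square_mm = pitch)."""
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2) * square_mm

    obj_points, img_points, shape = [], [], None
    for f in image_files:                       # the 9 board images
        gray = cv2.imread(f, cv2.IMREAD_GRAYSCALE)
        shape = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, board_size)
        if found:
            corners = cv2.cornerSubPix(
                gray, corners, (5, 5), (-1, -1),
                (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
            obj_points.append(objp)
            img_points.append(corners)

    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, shape, None, None)
    return rms, K, dist, rvecs, tvecs
```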
3) Projection snapshot
Firstly, the exposure brightness of the camera is adjusted according to the ambient light and the projection light intensity of the projector on the measured object.
The exposure brightness of the camera is adjusted so that the camera can clearly acquire the image of the measured object: with the projector in video mode, the measured object should be clearly visible in the camera picture; the adjustment is complete when there is no excessive overexposure, and otherwise the exposure time is adjusted to change the picture brightness.
In this embodiment, adaptive exposure adjustment is adopted: the camera captures a test image, and the computer automatically adjusts the exposure brightness according to the brightness distribution so that the exposure stays within a suitable brightness range.
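A minimal sketch of such an adaptive exposure loop is shown below; capture_fn and set_exposure_fn are hypothetical camera-SDK wrappers, since the patent does not name a specific camera API, and the thresholds are illustrative.

```python
import numpy as np

def adjust_exposure(capture_fn, set_exposure_fn, exposure_us,
                    target=(80, 180), max_iters=8):
    """Scale the exposure time until the mean brightness of a grayscale test
    image falls inside the target interval without heavy overexposure."""
    for _ in range(max_iters):
        set_exposure_fn(exposure_us)
        img = capture_fn()                       # grayscale uint8 test image
        mean = float(np.mean(img))
        saturated = float(np.mean(img >= 250))   # fraction of overexposed pixels
        if target[0] <= mean <= target[1] and saturated < 0.01:
            break
        # scale exposure toward the middle of the target brightness window
        exposure_us = int(exposure_us * (0.5 * (target[0] + target[1])) / max(mean, 1.0))
    return exposure_us
```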
Projection scanning is carried out after the exposure brightness adjustment of the camera is finished.
First, the projector II is placed on standby, the projector I projects and scans infrared structured light stripes onto the measured object, and the camera captures the measured object to obtain one group of deformed fringe images of the object, which is uploaded to the PC.
Then, the projector I is placed on standby, and the projector II sequentially projects and scans N = 4 kinds of structured light stripes of different wave bands onto the measured object, namely red, blue, green and white structured light. During this process the camera captures the measured object, 4 groups of deformed fringe images corresponding to the 4 kinds of structured light stripes are obtained, and the 4 groups of deformed fringe images are uploaded to the PC.
The PC thus acquires N + 1 = 5 groups of deformed fringe images of the object.
In the present embodiment, the trigger principle of the projection snapshot is shown in fig. 4. When the system is in operation, the projector I and the camera form a measurement system T1, and the projector II and the camera form a measurement system T2. The camera in the measurement system T1 is in a state of waiting for a trigger signal; the trigger signal generated by the projector I is transmitted to the camera, which captures an image upon receiving it. The measurement system T2 operates in the same way as T1. During operation of the system, the measurement system T1 first scans the object while the projector II is in a standby state and does not project any image. After the measurement system T1 completes scanning, the projector I is placed in a standby state, its image projection is turned off, and the measurement system T2 is activated to generate red, blue, green and white structured light in turn to scan the object.
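The projection and snapshot sequence can be summarized by the following control-flow sketch; the device objects and their methods (standby, play_sequence, capture_on_trigger) are hypothetical placeholders, not a real SDK.

```python
def acquire_all_bands(projector1, projector2, camera, n_steps):
    """Collect the deformed fringe images: T1 scans with the infrared projector
    while projector II is on standby; T2 then scans with red, blue, green and
    white structured light while projector I is idle."""
    frames = {}

    projector2.standby()
    frames["ir"] = [camera.capture_on_trigger()            # camera waits for the
                    for _ in projector1.play_sequence(n_steps)]  # projector trigger

    projector1.standby()
    for band in ("red", "blue", "green", "white"):
        frames[band] = [camera.capture_on_trigger()
                        for _ in projector2.play_sequence(n_steps, band=band)]
    return frames   # N + 1 = 5 groups of deformed fringe images
```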
4) Three-dimensional scanning
For each group of deformed fringe images, the PC first obtains the wrapped (truncated) phase based on the temporal phase-shifting algorithm, then obtains the unwrapped continuous phase from the wrapped phase, and finally calculates the three-dimensional scanning data of the object using the calibration parameters and the unwrapped continuous phase.
Finally 5 groups of three-dimensional scanning data of the object are obtained through calculation.
In this embodiment, a multi-frequency heterodyne three-dimensional scanning algorithm is used for calculation:
the heterodyne principle measures the phase of a modulated optical signal using optical modulation techniques and an electronic phase meter, using a wavelength of λ1,λ221) Of the phase function phi1And phi2And obtaining a new phase function phi after superposition, wherein the wavelength of the phase function phi is lambda. Then there are:
Figure BDA0002541554970000071
formula (1) shows that the low-frequency phase function can be obtained by utilizing the high-frequency phase function heterodyne, and then the time phase expansion is realized by utilizing the relationship between the low-frequency phase display and the high-frequency function. Is known as phi1And phi2Then the heterodyne low frequency phase function can be calculated by equation (2).
Figure BDA0002541554970000081
Taking three-frequency temporal phase unwrapping as an example, let the projected fringes have three frequencies with corresponding wavelengths λ3 > λ2 > λ1. The gratings are constructed so that the final difference (beat) frequency equals 1, i.e. the heterodyned equivalent fringe covers the whole measurement field with a single period. The unwrapping of the high-frequency phase can then be achieved by temporal phase unwrapping, as shown in fig. 6.
Finally, the height information is calculated back from the continuous phase using the geometric model parameters obtained by calibration.
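As a concrete illustration of the phase computation described above, the following Python sketch (an assumption of this write-up, not code from the patent) computes the wrapped phase from four phase-shifted fringe images, forms the heterodyne beat phase of formula (2), and unwraps a high-frequency phase against a continuous lower-frequency reference; the three-frequency cascade is indicated in the trailing comments.

```python
import numpy as np

def wrapped_phase(frames):
    """Four-step phase shifting: frames are 4 images with shifts 0, pi/2, pi,
    3*pi/2. Returns the wrapped (truncated) phase in [0, 2*pi)."""
    I1, I2, I3, I4 = [f.astype(np.float64) for f in frames]
    return np.mod(np.arctan2(I4 - I2, I1 - I3), 2.0 * np.pi)

def heterodyne(phi_a, phi_b):
    """Formula (2): beat phase of two wrapped phases (wavelength of phi_b > phi_a)."""
    return np.mod(phi_a - phi_b, 2.0 * np.pi)

def unwrap_with_reference(phi_high, phi_ref, wavelength_ratio):
    """Unwrap a high-frequency phase using a lower-frequency reference phase that
    is already continuous; wavelength_ratio = lambda_ref / lambda_high."""
    k = np.round((wavelength_ratio * phi_ref - phi_high) / (2.0 * np.pi))
    return phi_high + 2.0 * np.pi * k

# Three-frequency cascade (a common arrangement, assumed here):
#   phi12 = heterodyne(phi1, phi2); phi23 = heterodyne(phi2, phi3)
#   phi123 = heterodyne(phi12, phi23)   -> single fringe, already continuous
# then unwrap phi12 and phi23 against phi123, and finally phi1 against phi12.
```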
5) Data reconstruction
The 5 groups of three-dimensional scanning data of the object are fused according to the calibration parameters to obtain three-dimensional data in a unified coordinate system; the three-dimensional data are filtered to obtain the final three-dimensional point cloud data of the measured object, completing the three-dimensional imaging of the measured object.
In this embodiment, a data reconstruction method based on the Poisson equation is adopted, which achieves local shape preservation and global optimization. The process is as follows: the boundary and normal vectors of each group of scan data are calculated, and the Poisson equation is solved so that the gradient of the model's indicator function matches the surface normal field.
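A minimal sketch of the fusion, filtering and Poisson step is given below, assuming the Open3D library (the patent does not name a library); the parameter values are illustrative only.

```python
import numpy as np
import open3d as o3d

def fuse_and_reconstruct(point_clouds, depth=9):
    """Concatenate the 5 scans (already in one coordinate system via the
    calibration parameters), filter outliers, estimate normals and run Poisson
    reconstruction, which solves for an indicator function whose gradient
    matches the oriented normal field."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(np.vstack(point_clouds))

    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)  # filtering
    pcd.estimate_normals(
        search_param=o3d.geometry.KDTreeSearchParamHybrid(radius=2.0, max_nn=30))
    pcd.orient_normals_consistent_tangent_plane(30)

    mesh, densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=depth)
    return pcd, mesh
```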
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. A three-dimensional imaging system, characterized by comprising a PC, a camera, a projector I and a projector II, wherein a trigger control channel and a data transmission channel are arranged between the PC and each of the camera, the projector I and the projector II;
the projection light source wave bands of the projector I and the projector II are different;
the projector I can project structured light of a single waveband;
the projector II can project structured light of a plurality of different wave bands;
the camera is positioned between the projector I and the projector II;
the camera, the projector I and the projector II are all height-adjustable, and the horizontal and vertical orientations of their lenses can be adjusted;
the projection focuses of the camera, the projector I and the projector II converge at one point.
2. The system of claim 1, wherein the camera, the projector I and the projector II are respectively mounted on three pan-tilt heads, which provide height adjustment and adjust the horizontal and vertical orientation of each lens, the pan-tilt heads being electrically connected to the PC.
3. The system of claim 1, wherein the camera uses a 6-Pin circular connector as an IO interface, including a trigger control port and a data input port.
4. A three-dimensional imaging method, characterized in that the three-dimensional imaging system of any one of claims 1 to 3 is used, the method comprising:
s1, generating structured light stripes with a PC and uploading them to the projector I and the projector II;
s2, carrying out system calibration in the common field of view of the camera, the projector I and the projector II, where the measured object is placed, and acquiring the relative calibration parameters among the camera, the projector I and the projector II through the PC;
s3, adjusting the exposure brightness of the camera according to the ambient light and the intensity of the light projected by the projectors onto the measured object;
s4, placing the projector II on standby, projecting and scanning the single-wave-band structured light stripes onto the measured object with the projector I, capturing the measured object with the camera to obtain one group of deformed fringe images of the object, and uploading the fringe images to the PC;
s5, placing the projector I on standby, sequentially projecting and scanning N kinds of structured light stripes of different wave bands onto the measured object with the projector II, capturing the measured object with the camera during this process to obtain N groups of deformed fringe images of the object corresponding to the N kinds of structured light stripes of different wave bands, and uploading them to the PC;
s6, three-dimensional scanning, wherein the PC calculates the N+1 captured groups of deformed fringe images respectively according to the calibration parameters to obtain N+1 groups of three-dimensional scanning data of the object;
and s7, data reconstruction, wherein the N+1 groups of three-dimensional scanning data of the object are fused according to the calibration parameters to obtain three-dimensional data in a unified coordinate system, the three-dimensional data are filtered to obtain the final three-dimensional point cloud data of the measured object, and the three-dimensional imaging of the measured object is completed.
5. The method of claim 4, wherein projector II is capable of projecting four different wavelength bands of structured light.
6. The method of claim 5, wherein projector I is an infrared projector and projector II is an RGB projector.
7. The method according to claim 4, wherein in step S1, the structured light stripes are generated based on a temporal phase-unwrapping and phase-shifting structured light algorithm according to the conditions of the three-dimensional imaging task.
8. The method as claimed in claim 4, wherein in step S2, the calibration parameters are obtained by a nine-point calibration method.
9. The method according to claim 4, wherein in step S3, the exposure brightness of the camera is adjusted so that the camera can clearly acquire the image of the measured object.
10. The method according to claim 4, wherein in step S6, for each group of deformed fringe images of the object, the wrapped (truncated) phase is first obtained based on the temporal phase-shifting algorithm, the unwrapped continuous phase is then obtained from the wrapped phase, and finally the three-dimensional scanning data of the object are obtained using the calibration parameters and the unwrapped continuous phase.
CN202010548295.1A 2020-06-16 2020-06-16 Three-dimensional imaging system and method Pending CN111811432A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010548295.1A CN111811432A (en) 2020-06-16 2020-06-16 Three-dimensional imaging system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010548295.1A CN111811432A (en) 2020-06-16 2020-06-16 Three-dimensional imaging system and method

Publications (1)

Publication Number Publication Date
CN111811432A true CN111811432A (en) 2020-10-23

Family

ID=72846258

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010548295.1A Pending CN111811432A (en) 2020-06-16 2020-06-16 Three-dimensional imaging system and method

Country Status (1)

Country Link
CN (1) CN111811432A (en)


Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102645406A (en) * 2012-05-17 2012-08-22 天津理工大学 Online vision detection light source based on multispectral characteristics
CN108475145A (en) * 2016-01-13 2018-08-31 精工爱普生株式会社 Pattern recognition device, image-recognizing method and image identification unit
CN110032915A (en) * 2018-01-12 2019-07-19 杭州海康威视数字技术股份有限公司 A kind of human face in-vivo detection method, device and electronic equipment
CN108490000A (en) * 2018-03-13 2018-09-04 北京科技大学 A kind of Bar Wire Product surface defect on-line measuring device and method
CN110312079A (en) * 2018-03-20 2019-10-08 北京中科奥森科技有限公司 Image collecting device and its application system
CN210570529U (en) * 2019-03-05 2020-05-19 盎锐(上海)信息科技有限公司 Calibration system for camera and projector

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112361991A (en) * 2020-11-04 2021-02-12 深圳广成创新技术有限公司 Three-dimensional scanning method and device, computer equipment and storage medium
CN113532328A (en) * 2021-07-16 2021-10-22 燕山大学 Surface profile real-time measurement system and method in medium plate straightening process
CN114295076A (en) * 2022-01-05 2022-04-08 南昌航空大学 Measuring method for solving shadow measuring problem of tiny object based on structured light
CN114295076B (en) * 2022-01-05 2023-10-20 南昌航空大学 Measuring method for solving shadow measuring problem of tiny object based on structured light


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination