CN105651173A - 3D measuring method and system - Google Patents
- Publication number: CN105651173A (Application number: CN201610113222.3A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8887—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
Abstract
The invention discloses a 3D measuring method and system. The method can include the steps that measured image data of the surface of a measured object is de-noised, and phase information of the measured image data is obtained; based on reference image data, the phase information of the measured image data is corrected, and corrected phase information is obtained; the corrected phase information is subjected to phase analysis, and 3D data of the surface of the measured object is obtained; based on the 3D data of the surface of the measured object, surface information of the measured object is obtained and analyzed.
Description
Technical Field
The present invention relates to a 3D measuring method, and more particularly, to a high-speed 3D measuring method and system.
Background
With the rapid growth in the total length of roads, bridges, railways and tunnels, measuring the degree of damage to such infrastructure and performing the corresponding maintenance consume ever more manpower, material and financial resources. Existing measurement technology mainly comprises manual measurement and automatic measurement. Manual measurement is slow, yields discontinuous data, cannot meet the precision requirement, and cannot keep pace with the continuously growing demand. Automatic measurement techniques, particularly 3D measurement techniques, have been developed in recent years. Traditional 3D measurement includes contact methods, in which a probe stylus touches the surface of the object, and laser scanning, in which a laser scanner performs an omnidirectional scan of the 3D spatial information of the measured object.
The inventor finds that existing 3D measurement technology is limited in the rapid measurement of large-area objects (such as roads and bridges), and its speed is severely constrained. It is therefore necessary to develop a high-speed 3D measurement method and system.
The information disclosed in this background section of the disclosure is only for enhancement of understanding of the general background of the disclosure and should not be taken as an acknowledgement or any form of suggestion that this information constitutes prior art already known to a person skilled in the art.
Disclosure of Invention
The disclosure provides a 3D measurement method and a system, which can collect image data of the surface of a measured object, process and analyze the image data, and realize high-speed measurement of the surface of a large-area object.
According to an aspect of the present disclosure, there is provided a 3D measurement method, which may include: denoising the measured image data of the surface of the measured object to obtain phase information of the measured image data; carrying out correction processing on the phase information of the measured image data based on reference image data to obtain corrected phase information; carrying out phase analysis on the corrected phase information to obtain 3D data of the surface of the measured object; and obtaining surface information of the measured object based on the 3D data of the surface of the measured object, and analyzing the surface information of the measured object.
According to another aspect of the present disclosure, a 3D measurement system is presented, which may include: an emission unit that projects structured light onto a surface of an object to be measured; an acquisition unit that acquires measured image data reflected from the surface of the measured object; and a processing unit that processes and analyzes the measured image data.
The methods and apparatus of the present disclosure have other features and advantages which will be apparent from or are set forth in detail in the accompanying drawings and the following detailed description, which are incorporated herein, and which together serve to explain certain principles of the disclosure.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the exemplary embodiments of the disclosure.
Fig. 1 shows a flow chart of the steps of a 3D measurement method according to the present disclosure.
Fig. 2 shows a schematic diagram of the frequency domain information of measured image data according to one embodiment of the present disclosure.
Fig. 3 shows a schematic diagram of filtering frequency domain information according to one embodiment of the present disclosure.
Fig. 4 shows a schematic diagram of a 3D measurement system according to the present disclosure.
Detailed Description
The present disclosure will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Example 1
Fig. 1 shows a flow chart of the steps of a 3D measurement method according to the present disclosure.
As shown in fig. 1, according to an embodiment of the present disclosure, there is provided a 3D measurement method, which may include: step 101, denoising the measured image data of the surface of the measured object to obtain phase information of the measured image data; step 102, correcting the phase information of the measured image data based on reference image data to obtain corrected phase information; step 103, carrying out phase analysis on the corrected phase information to obtain 3D data of the surface of the measured object; and step 104, obtaining the surface information of the measured object based on the 3D data of the surface of the measured object, and analyzing the surface information of the measured object.
The embodiment performs denoising, correction and phase analysis on the measured image data to acquire and analyze the surface information of the measured object, thereby realizing high-speed 3D measurement of the surface of the measured object.
Specific steps of the 3D measurement method according to the present disclosure are explained in detail below.
Denoising process
In one example, the measured image data of the surface of the measured object may be denoised to obtain phase information of the measured image data.
In one example, the denoising process may include: performing a Fourier transform on the measured image data to obtain frequency domain information of the measured image data; filtering the frequency domain information to obtain denoised frequency domain information; and performing an inverse Fourier transform on the denoised frequency domain information to obtain time domain information of the measured image data, and further the phase information of the measured image data. The phase information of the measured image data can be expressed as:

Ω(x, y) = arctan(imag[c(x, y)] / real[c(x, y)])    (1)

wherein c(x, y) may represent the time domain information of the measured image data, imag[c(x, y)] may represent the imaginary part of c(x, y), and real[c(x, y)] may represent the real part of c(x, y).
Specifically, structured light, which may be stripe light diffracted by a grating, may be projected onto the surface of the measured object, so that image data with interference fringes reflected from the surface of the measured object may be collected as the measured image data. The light intensity at point (x, y) of the measured image data can be expressed as:

I(x, y) = I_b(x, y) + I_c(x, y)·cos(2πf_x0·x + 2πf_y0·y + Ω(x, y))    (2)

wherein I(x, y) may represent the light intensity at point (x, y); I_b(x, y) may represent the background brightness; I_c(x, y) may represent the contrast of the interference fringes; Ω(x, y) may represent the phase information, which encodes the three-dimensional information of the object; and f_x0 and f_y0 may represent the carrier frequencies in the x and y directions.
Transforming equation (2) using the Euler formula yields:

I(x, y) = I_b(x, y) + C(x, y) + C*(x, y),  with  C(x, y) = (1/2)·I_c(x, y)·exp[i(2πf_x0·x + 2πf_y0·y + Ω(x, y))]    (3)

wherein C*(x, y) may represent the complex conjugate of the complex function C(x, y).
Fig. 2 shows a schematic diagram of the frequency domain information of measured image data according to an embodiment of the present disclosure, wherein the horizontal axis is the carrier frequency f and the vertical axis is the amplitude A. As shown in Fig. 2, the measured image data can be converted from the time domain to the frequency domain by a fast Fourier transform to obtain the frequency domain information of the measured image data, where C*(f + f_0, y), C(f − f_0, y) and A(f, y) may respectively represent the functions corresponding to the respective waveforms. Performing a fast Fourier transform on equation (3) yields:

Î(f_x, f_y) = Î_b(f_x, f_y) + Ĉ(f_x − f_x0, f_y − f_y0) + Ĉ*(f_x + f_x0, f_y + f_y0)    (4)

wherein Î(f_x, f_y), Î_b(f_x, f_y), Ĉ(f_x − f_x0, f_y − f_y0) and Ĉ*(f_x + f_x0, f_y + f_y0) may respectively represent the Fourier transforms of I(x, y), I_b(x, y), C(x, y) and C*(x, y), and f_x and f_y may respectively represent the components of the frequency f in the x and y directions.
The frequency domain information may then be filtered to obtain the denoised frequency domain information Ĉ(f_x − f_x0, f_y − f_y0). Fig. 3 shows a schematic diagram of filtering the frequency domain information according to one embodiment of the present disclosure, where H(f − f_0, y) represents the transfer function of the chosen filter. As shown in Fig. 3, according to the frequency domain distribution of the measured object (as shown in Fig. 2) and the function of equation (4), a corresponding filter function can be designed according to the required final technical parameters (retaining high-frequency or low-frequency information), retaining the useful object information in the frequency domain and removing the noise part.
In one example, filtering the frequency domain information may include: automatically finding the number and positions of the peaks in the frequency domain with an automatic peak-finding filter, and retaining the valid frequency domain information. Referring to Fig. 3, an automatic peak-finding filter may be adopted, which automatically finds the number and positions of the peaks in the frequency domain and generates a mask according to the position of the DC component (the center peak). The mask can cover the invalid noise to the left of the center peak, retaining only the frequency domain information of the valid object on the right for subsequent analysis. The automatic peak-finding filter greatly facilitates the processing of video images and is more convenient than manually configuring a separate filter.
The denoised frequency domain information may then be inverse Fourier transformed to obtain the denoised measured image data. Using an inverse Fourier transform, the denoised frequency domain information Ĉ(f_x − f_x0, f_y − f_y0) can be converted back into the time domain information c(x, y), from which the phase information of the measured image data is obtained:

Ω(x, y) = arctan(imag[c(x, y)] / real[c(x, y)])    (5)

wherein Ω(x, y) may represent the phase information of the measured image data, imag[c(x, y)] may represent the imaginary part of c(x, y), and real[c(x, y)] may represent the real part of c(x, y).
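The denoising and phase-extraction pipeline above can be sketched with NumPy. This is a minimal illustration, not the disclosure's implementation; the function name, the rectangular mask shape, and the `band_halfwidth` parameter are assumptions:

```python
import numpy as np

def extract_phase(fringe_image, band_halfwidth=0.15):
    """Fourier-transform profilometry sketch: isolate the +f0 carrier
    lobe of a fringe image and recover the wrapped phase map Omega(x, y).
    band_halfwidth (a fraction of the image width) is an illustrative
    choice and must keep the DC bin outside the mask."""
    rows, cols = fringe_image.shape
    cy, cx = rows // 2, cols // 2
    spectrum = np.fft.fftshift(np.fft.fft2(fringe_image))

    # Automatic peak finding: search only the right half-plane so the
    # DC term and the conjugate (-f0) lobe are excluded.
    search = np.abs(spectrum).copy()
    search[:, :cx + 3] = 0
    py, px = np.unravel_index(np.argmax(search), search.shape)

    # Rectangular band-pass mask around the +f0 carrier peak.
    hw = max(1, int(band_halfwidth * cols))
    mask = np.zeros_like(spectrum)
    mask[max(py - hw, 0):py + hw, max(px - hw, 0):px + hw] = 1

    # Shift the carrier lobe to the origin (removes the 2*pi*f0*x tilt),
    # then invert; np.angle == arctan(imag/real), wrapped to (-pi, pi].
    centered = np.roll(spectrum * mask, (cy - py, cx - px), axis=(0, 1))
    c = np.fft.ifft2(np.fft.ifftshift(centered))
    return np.angle(c)
```

For a flat surface the recovered phase map should be nearly constant once the carrier tilt has been removed.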
In fact, the phase information Ω(x, y) in equation (5) also includes phase components other than that of the object itself, which greatly complicates the phase analysis. In order to obtain three-dimensional information of the object alone, the phase must be corrected.
Correction processing
In one example, the phase information of the measured image data may be subjected to correction processing based on the reference image data, obtaining corrected phase information.
Specifically, before the measured image data of the object is acquired, reference image data is collected at the acquisition plane according to the height of the object; the reference image data can then be used for correction to remove the phase components not belonging to the object. Through correction of the phase information Ω(x, y) of equation (5), the corrected phase information Ω_w(x, y) can be obtained.

In addition, correction can remove image distortion or off-axis effects caused by the camera itself, angular deviation of the photograph, and the like.
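A minimal sketch of the correction step, assuming a reference phase map has already been extracted (by the same procedure) from reference image data of the bare acquisition plane; the function names are illustrative:

```python
import numpy as np

def correct_phase(object_phase, reference_phase):
    """Subtract the reference (system/carrier) phase from the object
    phase and re-wrap the difference into (-pi, pi]."""
    diff = object_phase - reference_phase
    # np.angle(exp(i*diff)) re-wraps without explicit branching logic.
    return np.angle(np.exp(1j * diff))
```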
Phase analysis

In one example, the corrected phase information may be phase analyzed to obtain the 3D data of the surface of the measured object.

In one example, the phase analysis may include: performing continuation processing on the corrected measured image data by an unwrapping method to obtain continuous phase information; and performing phase conversion on the continuous phase information based on the fringe-free measured image data to obtain the 3D data of the surface of the measured object.
In fact, the corrected phase information Ω_w(x, y) is discontinuous (wrapped), with values in [−π, π]. In order to remove the 2π discontinuities, continuation processing is required to convert the wrapped phase information Ω_w(x, y) into continuous (unwrapped) phase information Ω_u(x, y), as shown in equation (6):

Ω_u(x, y) = Ω_w(x, y) + 2πk    (6)

wherein k may be an integer coefficient accounting for the number of 2π jumps.
An independently designed unwrapping algorithm can be used to obtain the continuous phase information. Those skilled in the art will appreciate that various unwrapping algorithms known in the art may be employed to acquire the continuous phase information.
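As one possible illustration of equation (6) (not the disclosure's own unwrapping algorithm), a naive row-then-column scheme built on NumPy's one-dimensional np.unwrap might look like:

```python
import numpy as np

def unwrap_rows_then_cols(wrapped):
    """Naive 2D unwrapping sketch: unwrap each row independently, then
    stitch the rows together by unwrapping the first column. Adequate
    only for smooth, low-noise phase maps."""
    unwrapped = np.unwrap(wrapped, axis=1)        # remove 2*pi jumps along x
    col0 = np.unwrap(unwrapped[:, 0])             # consistent offsets along y
    return unwrapped + (col0 - unwrapped[:, 0])[:, None]
```

Robust unwrapping of noisy real-world data generally requires quality-guided or branch-cut algorithms rather than this row scan.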
Then, the continuous phase information is converted based on the fringe-free measured image data, so that the 3D data of the surface of the measured object can be obtained; the 3D data may include texture information of the surface of the measured object. GPS coordinate information of the measured object may then be added to the 3D data in order to locate the measured object.

In one example, the unit of measure of the continuous phase information may be converted from radians to standard metric units.
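The unit conversion can be as simple as a calibrated linear scale; the conversion factor below is a hypothetical calibration constant, since the disclosure does not specify the optical geometry:

```python
def phase_to_height_mm(unwrapped_phase, mm_per_radian=0.85):
    """Convert unwrapped phase (radians) to surface height (millimetres)
    using a calibration constant obtained from a reference measurement.
    mm_per_radian is a hypothetical value for illustration."""
    return unwrapped_phase * mm_per_radian
```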
Analyzing surface information
In one example, the surface information of the object to be measured may be obtained based on 3D data of the surface of the object to be measured, and analyzed.
In one example, analyzing the surface information of the measured object may include: judging the damage condition of the measured object based on the surface information, locating the damaged position, and displaying the result. The surface information of the measured object may include: the curvature of the surface; the texture of the surface; the flatness of the surface; information on the depth, length and volume of cracks on the surface; and the degree of damage to the surface.
In fact, based on the 3D data of the surface of the measured object, various kinds of surface information can be obtained. The surface information can then be analyzed according to the user's requirements and the results displayed. For example, when the measured object is a road, a bridge or a rail, the curvature, texture and flatness of its surface, the depth, length and volume of surface cracks, and the degree of surface damage can be analyzed in order to judge the damage condition of the road, bridge or track, locate the damaged position and display the results as required by the user, providing a reference for subsequent repair or maintenance.
It should be understood by those skilled in the art that the above is only an example of surface information of the object to be measured, which is not limited thereto, and various surface information of the object to be measured known in the art may be analyzed and displayed.
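Purely as an illustrative sketch (the plane-fit approach and the threshold are assumptions, not part of the disclosure), crack candidates could be flagged as points lying well below a reference plane fitted to the height map:

```python
import numpy as np

def crack_mask(height_mm, depth_threshold_mm=2.0):
    """Fit a least-squares plane to the height map and flag pixels that
    lie more than depth_threshold_mm below it as crack candidates."""
    rows, cols = height_mm.shape
    y, x = np.mgrid[0:rows, 0:cols]
    A = np.column_stack([x.ravel(), y.ravel(), np.ones(rows * cols)])
    coeffs, *_ = np.linalg.lstsq(A, height_mm.ravel(), rcond=None)
    plane = (A @ coeffs).reshape(rows, cols)
    return (plane - height_mm) > depth_threshold_mm
```

Crack length and volume could then be estimated from connected regions of the mask together with the pixel pitch.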
In one example, the 3D data of the surface of the measured object may be subjected to further post-analysis processing. For example, grade information for maintenance-scheme grading can be generated based on the 3D data of the surface of the measured object and GPS positioning information, and a management database of the road conditions or infrastructure assets of one or more areas can be established, enabling data sharing, data retrieval and the like.
It will be understood by those skilled in the art that the foregoing description of the embodiments of the present disclosure is for the purpose of illustrating the beneficial effects of the embodiments of the present disclosure only and is not intended to limit the embodiments of the present disclosure to any of the examples given.
Example 2
Fig. 4 shows a schematic diagram of a 3D measurement system according to the present disclosure.
As shown in fig. 4, according to an embodiment of the present disclosure, there is provided a 3D measurement system, which may include: a transmitting unit 401, an acquisition unit 402 and a processing unit 403. The emitting unit 401 may project the structured light to the surface of the object to be measured; the acquisition unit 402 may acquire measured image data reflected from the surface of the measured object; the processing unit 403 may process and analyze the measured image data.
The embodiment can acquire and process the image data of the measured object, obtain the surface information of the measured object and realize the high-speed 3D measurement of the surface of the measured object.
In one example, the emission unit 401 may include a light source, a grating assembly and a support assembly. Light emitted by the light source passes through the grating assembly to generate structured light with interference fringes, which is projected onto the surface of the measured object; the support assembly supports the light source and the grating assembly, and their positions can be adjusted by adjusting the support assembly. When invisible light is used, the light source may be infrared, and parameters such as its brightness and contrast may be adjusted by adjusting its intensity. When visible light is used, the emission unit 401 may instead include a DLP projection assembly and a control assembly: the DLP projection assembly projects light with interference fringes onto the surface of the measured object, and the control assembly adjusts the background brightness, light intensity, interference fringe density and the like to ensure measurement accuracy.
In one example, the acquisition unit 402 may acquire measured image data reflected from a surface of the measured object. After the emission unit 401 projects light onto the surface of the object to be measured, measured image data with deformed interference fringes reflected from the surface of the object to be measured, that is, image data of structured light whose surface is deformed according to the structural shape of the object to be measured, may be collected by the collection unit 402. The acquisition unit 402 may comprise a CCD or CMOS camera, and may employ a high definition dynamic industrial camera.
In one example, the 3D measurement system according to the present disclosure may further comprise a synchronization unit that may set the acquisition frequency of the acquisition unit 402 as a function of the emission frequency of the emission unit 401 and synchronize the emission of the emission unit 401 with the acquisition of the acquisition unit 402.

Specifically, the synchronization unit may set the acquisition frequency of the acquisition unit 402 as a function of the emission frequency of the emission unit 401, for example twice the emission frequency, and synchronize the emission and acquisition of the two units. In this way, after each frame of measured image data with interference fringes, the acquisition unit 402 can acquire one frame of measured image data without interference fringes. The fringe-free measured image data preserves the 2D information of the measured object (such as its texture and color) well and can be used in subsequent processing to realize the conversion to 3D data; reference image data may also be obtained from the fringe-free measured image data and used for the correction processing. In addition, the synchronization unit can also save power and energy.

It will be understood by those skilled in the art that the present disclosure is not so limited, and that the acquisition frequency can be set to any function of the emission frequency in order to acquire measured image data both with and without interference fringes.
In one example, the processing unit 403 may process and analyze the measured image data. The processing and analysis may include: denoising the measured image data of the surface of the measured object to obtain phase information of the measured image data; carrying out correction processing on the phase information of the measured image data based on the reference image data to obtain corrected phase information; carrying out phase analysis on the corrected phase information to obtain 3D data of the surface of the measured object; and obtaining surface information of the measured object based on the 3D data, and analyzing the surface information of the measured object.
In one example, hardware corrections may also be made to the 3D measurement system of the present disclosure. The hardware correction can comprise correction of the camera, removal of image deformation or off-axis caused by factors of the camera, and the like. The hardware correction may also include correction of the 3D measurement system, for example, the emitting unit 401 and the collecting unit 402 may be interchanged, and the emitting and collecting angles of the emitting unit 401 and the collecting unit 402 may be adjusted to ensure that the emitting and collecting optical paths are reversible and reduce the image deformation that may occur.
Furthermore, various parameters of the 3D measurement system of the present disclosure may also be set by the processing unit 403, which may include: system parameters such as optical path design parameters, mechanical design parameters; hardware parameters, such as camera parameters of the acquisition unit 402, projector parameters of the emission unit 401, emission and acquisition angle parameters of the emission unit 401 and the acquisition unit 402, and the like; and software parameters such as GUI, etc.
It will be understood by those skilled in the art that the foregoing description of the embodiments of the present disclosure is for the purpose of illustrating the beneficial effects of the embodiments of the present disclosure only and is not intended to limit the embodiments of the present disclosure to any of the examples given.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.
Claims (10)
1. A 3D measurement method, comprising:
denoising the measured image data on the surface of the measured object to obtain phase information of the measured image data;
carrying out correction processing on the phase information of the measured image data based on reference image data to obtain corrected phase information;
carrying out phase analysis on the corrected phase information to obtain 3D data of the surface of the measured object; and
obtaining surface information of the measured object based on the 3D data of the surface of the measured object, and analyzing the surface information of the measured object.
2. The 3D measurement method according to claim 1, wherein the denoising process includes:
carrying out Fourier transform on the measured image data to obtain frequency domain information of the measured image data;
filtering the frequency domain information to obtain denoised frequency domain information; and
performing inverse Fourier transform on the denoised frequency domain information to obtain time domain information of the measured image data and further obtain phase information of the measured image data,
wherein the phase information of the measured image data is expressed as:

Ω(x, y) = arctan(imag[c(x, y)] / real[c(x, y)])

wherein c(x, y) represents the time domain information of the measured image data, imag[c(x, y)] represents the imaginary part of c(x, y), and real[c(x, y)] represents the real part of c(x, y).
3. The 3D measurement method of claim 2, wherein filtering the frequency domain information comprises: automatically finding the number and positions of the peaks in the frequency domain with an automatic peak-finding filter, and retaining the valid frequency domain information.
4. The 3D measurement method of claim 1, wherein phase resolving comprises:
performing continuation processing on the corrected measured image data by an unwrapping method to obtain continuous phase information; and
performing phase conversion on the continuous phase information based on the fringe-free measured image data to obtain the 3D data of the surface of the measured object.
5. The 3D measurement method of claim 1, wherein analyzing the surface information of the object under test comprises:
and judging the damage condition of the measured object based on the surface information of the measured object, positioning the damage position of the measured object, and displaying the result.
6. The 3D measurement method of claim 1, wherein the surface information of the measured object includes: the curvature of the surface of the measured object; the texture of the surface of the measured object; the flatness of the surface of the measured object; information on the depth, length and volume of cracks on the surface of the measured object; and the degree of damage to the surface of the measured object.
7. A 3D measurement system comprising:
an emission unit that projects structured light onto the surface of a measured object;
an acquisition unit that acquires measured image data reflected from the surface of the measured object; and
a processing unit that processes and analyzes the measured image data.
8. The 3D measurement system of claim 7, further comprising:
a synchronization unit that adjusts the acquisition frequency of the acquisition unit according to the emission frequency of the emission unit, and synchronizes the emission of the emission unit with the acquisition of the acquisition unit.
9. The 3D measurement system of claim 7, wherein the emission unit comprises:
a light source;
a grating assembly, through which light emitted by the light source passes to generate structured light with interference fringes, the structured light being projected onto the surface of the measured object; and
a support assembly that supports the light source and the grating assembly.
10. The 3D measurement system of claim 7, wherein processing and analyzing the measured image data by the processing unit comprises:
denoising the measured image data of the surface of the measured object to obtain phase information of the measured image data;
correcting the phase information of the measured image data based on reference image data to obtain corrected phase information;
performing phase resolving on the corrected phase information to obtain 3D data of the surface of the measured object; and
obtaining surface information of the measured object based on the 3D data of the surface of the measured object, and analyzing the surface information of the measured object.
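The four processing steps of claim 10 fit together as a Fourier-transform profilometry pipeline. The sketch below combines them under stated assumptions: the carrier is found automatically per spectrum, the reference image supplies the correction phase, and `scale` is a hypothetical linear phase-to-height calibration factor.

```python
import numpy as np

def ftp_pipeline(measured, reference, band_half_width=3, scale=1.0):
    """Minimal sketch of claim 10: denoise in the frequency domain,
    correct against the reference phase, unwrap, and scale to height."""
    def phase(img):
        F = np.fft.fft(img, axis=1)                  # spectrum per fringe row
        mag = np.abs(F)
        mag[:, 0] = 0.0                              # suppress the DC term
        k = int(np.argmax(mag.sum(axis=0)[: img.shape[1] // 2]))  # carrier bin
        mask = np.zeros_like(F)
        mask[:, max(k - band_half_width, 1): k + band_half_width + 1] = 1.0
        c = np.fft.ifft(F * mask, axis=1)            # denoised analytic signal
        return np.angle(c)                           # wrapped phase
    dphi = phase(measured) - phase(reference)        # correction vs. reference
    dphi = np.unwrap(dphi, axis=1)                   # continuous phase
    return dphi * scale                              # assumed linear height map

# Demo: fringes with a 0.5 rad phase offset between measured and reference
# should recover a flat "height" of 0.5 (with scale=1).
x = np.arange(128)
ref = np.tile(1.0 + np.cos(2 * np.pi * 16 * x / 128), (4, 1))
meas = np.tile(1.0 + np.cos(2 * np.pi * 16 * x / 128 + 0.5), (4, 1))
h = ftp_pipeline(meas, ref)
```

Subtracting the reference phase before unwrapping plays the role of the correction step: carrier tilt and systematic distortion common to both images cancel, leaving only the object-induced phase.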
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610113222.3A CN105651173A (en) | 2016-02-29 | 2016-02-29 | 3D measuring method and system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN105651173A true CN105651173A (en) | 2016-06-08 |
Family
ID=56492850
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610113222.3A Pending CN105651173A (en) | 2016-02-29 | 2016-02-29 | 3D measuring method and system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105651173A (en) |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1758020A (en) * | 2005-11-18 | 2006-04-12 | 北京航空航天大学 | Stereo vision detection system based on adaptive sine streak projection |
CN101050949A (en) * | 2007-05-22 | 2007-10-10 | 天津大学 | Measuring system and its measuring method for large field object micro surface three dimension topography |
CN101975558A (en) * | 2010-09-03 | 2011-02-16 | 东南大学 | Rapid three-dimensional measurement method based on color grating projection |
CN102316355A (en) * | 2011-09-15 | 2012-01-11 | 丁少华 | Generation method of 3D machine vision signal and 3D machine vision sensor |
CN103940371A (en) * | 2014-05-12 | 2014-07-23 | 电子科技大学 | High-precision three-dimensional shape measurement method for jump object |
2016-02-29: application CN201610113222.3A filed in CN; status: Pending
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107221025A (en) * | 2017-05-31 | 2017-09-29 | 天津大学 | A kind of synchronous system and method for obtaining body surface three-dimensional colour point clouds model |
CN107221025B (en) * | 2017-05-31 | 2020-01-03 | 天津大学 | System and method for synchronously acquiring three-dimensional color point cloud model of object surface |
CN112381731A (en) * | 2020-11-12 | 2021-02-19 | 四川大学 | Single-frame stripe image phase analysis method and system based on image denoising |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106802138B (en) | A kind of 3 D scanning system and its scan method | |
CN103534583B (en) | The defect inspection method of tire | |
CN105783775B (en) | A kind of minute surface and class minute surface object surface appearance measuring device and method | |
Huang et al. | Comparison of Fourier transform, windowed Fourier transform, and wavelet transform methods for phase extraction from a single fringe pattern in fringe projection profilometry | |
CN104006765B (en) | Single width carrier frequency interference fringe phase extraction method and detecting device | |
CN101986098A (en) | Tricolor raster projection-based Fourier transform three-dimensional measuring method | |
CN109186496B (en) | Three-dimensional surface shape measuring method based on moving least square method | |
CN101996416B (en) | 3D face capturing method and equipment | |
CN104680496A (en) | Kinect deep image remediation method based on colorful image segmentation | |
CN105091748B (en) | Rail vehicle tolerance dimension measuring system | |
CN101422787A (en) | Strip-steel flatness measuring method based on single-step phase-shift method | |
CN102519393A (en) | Method for realizing rapid modulation degree profilometry by use of two orthogonal sinusoidal gratings | |
CN109507198B (en) | Mask detection system and method based on fast Fourier transform and linear Gaussian | |
CN106996748A (en) | Wheel diameter measuring method based on binocular vision | |
CN109506592A (en) | Object dimensional surface shape measurement method and device based on striped light stream | |
Wang et al. | Development of three-dimensional pavement texture measurement technique using surface structured light projection | |
CN112097688A (en) | Multispectral three-dimensional shape measurement method and device based on grating projection three-dimensional imaging | |
CN108332684A (en) | A kind of measuring three-dimensional profile method based on Structured Illumination microtechnic | |
CN110057312A (en) | A kind of monocular vision three-dimensional scanning measurement device and measurement method based on structure light | |
CN105651173A (en) | 3D measuring method and system | |
CN205718873U (en) | A kind of double frequency phase shift tripleplane measuring instrument | |
JP6544018B2 (en) | Tire analysis apparatus and tire analysis method | |
Quan et al. | Determination of surface contour by temporal analysis of shadow moiré fringes | |
CN112212806B (en) | Three-dimensional phase unfolding method based on phase information guidance | |
Jiang et al. | Fringe pattern analysis by S-transform |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20160608 |