CN108827184B - Structured light self-adaptive three-dimensional measurement method based on camera response curve - Google Patents
- Publication number
- CN108827184B CN108827184B CN201810403285.1A CN201810403285A CN108827184B CN 108827184 B CN108827184 B CN 108827184B CN 201810403285 A CN201810403285 A CN 201810403285A CN 108827184 B CN108827184 B CN 108827184B
- Authority
- CN
- China
- Prior art keywords
- exposure time
- image
- value
- exposure
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The application provides a structured light self-adaptive three-dimensional measurement method based on a camera response curve, which comprises: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and a reference exposure time within the preset measurement range; selecting reference points in the reference image, and calculating the camera response function from the exposure amounts and gray values obtained as the exposure time of the reference points changes; obtaining the relative irradiance values of all points by using the camera response function, and calculating the number of exposures and the exposure times based on the relative irradiance values and the camera response function; and fusing the images acquired at the different exposure times into a new fringe image sequence used for three-dimensional reconstruction. The invention can self-adaptively reconstruct objects with complex surface reflectivity, enlarges the application range of the structured light measurement technology, improves the degree of automation of the measurement, and achieves a good measurement effect on surfaces with complex reflectivity.
Description
Technical Field
The invention belongs to the technical field of optical measurement, and particularly relates to a structured light self-adaptive three-dimensional measurement method based on a camera response curve.
Background
The structured light measurement technology is an active optical measurement technology in which grating fringe images in a certain pattern are projected onto the surface of the object to be measured, a camera acquires the grating image modulated and deformed by the shape of the object, and the three-dimensional information of the object is then calculated. When the surface is over- or under-exposed, for example under specular reflection, the camera image cannot faithfully reflect the true intensity of the surface, which reduces measurement accuracy or prevents reconstruction altogether. The multiple exposure technique has been applied to structured light reconstruction to solve the reflection problem: after images at multiple exposures are acquired, the unsaturated pixel with the highest gray value at each position is selected to participate in the reconstruction. Although combining the structured light technique with multiple exposures achieves a better measurement effect, both the exposure times and the number of exposures depend on the experience of the operator, which affects measurement efficiency and industrial application.
It should be noted that the above background description is only for the convenience of clear and complete description of the technical solutions of the present application and for the understanding of those skilled in the art. Such solutions are not considered to be known to the person skilled in the art merely because they have been set forth in the background section of the present application.
Disclosure of Invention
The invention aims to provide a structured light self-adaptive three-dimensional measurement method based on a camera response curve, which can self-adaptively reconstruct an object with complex surface reflectivity, enlarge the application range of the structured light measurement technology, improve the automation degree of measurement and obtain good measurement effect on the complex reflectivity surface.
The invention provides a structured light self-adaptive three-dimensional measurement method based on a camera response curve, which comprises the following steps:
s1: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and reference exposure time in the preset measurement range;
s2: selecting a reference point in the reference image, and calculating a camera response function according to an exposure amount and a gray value obtained along with a change in exposure time of the reference point;
s3: obtaining the relative irradiance values of all the points by using the camera response function, and calculating the number of exposures and the exposure times based on the relative irradiance values and the camera response function;
s4: the images acquired at different exposure times are fused into a new sequence of fringe images and used for three-dimensional reconstruction.
Further, presetting the measurement gray value range and the exposure time range specifically includes:
s21: the preset measurement gray value range is V1~V2, and the preset exposure time range is defined by a preset minimum camera exposure time t1 and a preset maximum camera exposure time t2; t1 is set such that, when the fringes are projected at t1, the maximum gray value of the region where the target object is located in the camera view is smaller than V2; t2 is set such that, when the fringes are projected at t2, the minimum gray value of the region where the target object is located in the camera view is larger than V1;
The acquiring of the reference image and the reference exposure time within the preset measurement range comprises: the projector projects a pure white image onto the target object; the exposure time of the camera is continuously changed from t1 to t2 within the preset exposure time range; during this adjustment, the number of pixels in the camera view whose gray values fall within the measurement gray value range is counted, and the images and the corresponding exposure times are stored; the image with the largest count of in-range pixels is selected as the reference image, and its exposure time is taken as the reference exposure time.
Further, the pixels whose gray value on the reference image is V1 are selected as reference points, and the illuminance value of the reference points is uniformly set to an arbitrary constant, such as 300;
building a template image F with the same size as the reference image, the gray value of each template image pixel being determined by the following formula:
F(x, y) = 0 when V1 ≤ VR(x, y) ≤ V2, and F(x, y) = 1 otherwise (1)
in the above formula (1), VR(x, y) is the gray value of the reference image at coordinate (x, y);
increasing the exposure time from the reference exposure time until the pixel gray value of the reference point is greater than or equal to V2, storing the pixel gray values and the exposure times in this process, and representing the camera response function by a fourth-degree polynomial with five coefficients, as shown in formula (2):
y = Ax^4 + Bx^3 + Cx^2 + Dx + E (2)
the exposure amounts of the reference point during the adjustment are used as input values and the corresponding pixel gray values as output values, the 5 unknown coefficients A, B, C, D, E are obtained by a least-squares fitting algorithm, that is, the camera response function is obtained by calculation, from which the exposure amount H1 corresponding to V1 and the exposure amount H2 corresponding to V2 are respectively obtained.
Further, obtaining the relative irradiance values of all the points by using the camera response function, and calculating the number of exposures and the exposure times by combining the obtained relative irradiance values with the camera response function, comprises:
s41: for each point whose pixel gray value in the template image F is 1, the image sequence obtained in step S21 is traversed; when an image in the sequence is found whose pixel value at that coordinate lies within the optimal measurement range, the exposure amount of the point is obtained by using the calculated camera response function, and this exposure amount is divided by the exposure time of that image to obtain the relative irradiance value of the point;
s42: for each point whose pixel gray value in the template image F is 0, the exposure amount corresponding to the point is obtained from the pixel gray value of the reference image at that point and the camera response function, and this exposure amount is divided by the reference exposure time to obtain the relative irradiance value of the point;
s43: after the relative irradiance values of all target points are obtained, the minimum value E1 is found; the value of H1/E1 is one exposure time, and the relative irradiance values of the points whose exposure amounts at this exposure time fall between H1 and H2 are deleted; step S43 is repeated until all irradiance values are deleted, and the exposure times obtained in this calculation process are the exposure times required for the measurement.
Further, combining the N-step phase shift technique and the multiple exposure method to perform image synthesis comprises:
s51: the projector projects a pure white image, the camera exposure time is changed in turn according to the result of step S43, and a target image sequence I is acquired;
s52: the projector then projects N fringe images, the camera exposure time is changed in the same way, and a target image sequence J is acquired; first the template images are obtained from the image sequence I, where the calculation formula is:
M_i(x, y) = 1 if I_i(x, y) = max_j { I_j(x, y) : I_j(x, y) < 255 }, and M_i(x, y) = 0 otherwise (3)
in the above formula (3), M_i(x, y) is the pixel gray value at coordinate (x, y) of the template image corresponding to the i-th exposure, and I_i(x, y), I_j(x, y) are the pixel gray values at coordinate (x, y) of the camera images corresponding to the i-th and j-th exposures; the fringe image sequence is then fused into a new fringe image sequence P by using the template images:
P_j(x, y) = Σ_i M_i(x, y) · J_{i,j}(x, y) (4)
in the above formula (4), J_{i,j}(x, y) is the pixel gray value at coordinate (x, y) of the camera image corresponding to the i-th exposure when the j-th fringe pattern is projected, and P_j(x, y) is the pixel gray value at coordinate (x, y) of the j-th fused image.
Further, using the fused image sequence P and regarding the projector as a virtual camera, the points in the projector view corresponding to the camera view are found, and the phase solution and three-dimensional reconstruction are performed by trigonometry.
Specific embodiments of the present application are disclosed in detail with reference to the following description and drawings, indicating the manner in which the principles of the application may be employed. It should be understood that the embodiments of the present application are not so limited in scope. The embodiments of the application include many variations, modifications and equivalents within the spirit and scope of the appended claims.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments, in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
FIG. 1 is a flow chart of a structured light adaptive three-dimensional measurement method based on a camera response curve according to the present invention;
FIG. 2 is a graph of camera response;
FIG. 3 is a schematic diagram of reference exposure time selection;
FIG. 4 is a schematic illustration of the measured target reflectance during an experiment;
FIG. 5 is a schematic diagram of experimentally calculated camera response curves;
FIG. 6 is a graph of the effect of image fusion in an experiment;
FIG. 7 is a graph showing the effect of model measurement in the experiment.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The invention relates to a structured light self-adaptive three-dimensional measurement method based on a camera response curve, in which the camera response curve is calculated by using the gray value of a reference point as it changes with the exposure time, and finally the number of exposures and the exposure times are calculated from the response curve and the images acquired at different exposure times. The illumination response curve of the camera is the relationship between the camera exposure amount H and the image pixel value v, as shown in fig. 2.
The exposure amount H is numerically equal to the product of the illuminance E and the exposure time t, i.e. H = E · t. The camera response curve may be expressed as:
v = f(H)
the invention provides a structured light self-adaptive three-dimensional measurement method based on a camera response curve, which comprises the following steps:
s1: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and reference exposure time in the preset measurement range;
s2: selecting a reference point in the reference image, and calculating a camera response function according to an exposure amount and a gray value obtained along with a change in exposure time of the reference point;
s3: obtaining the relative irradiance values of all the points by using the camera response function, and calculating the number of exposures and the exposure times based on the relative irradiance values and the camera response function;
s4: the images acquired at different exposure times are fused into a new sequence of fringe images and used for three-dimensional reconstruction.
In this embodiment, presetting the measurement gray value range and the exposure time range in step S1 specifically includes:
s21: the preset measurement gray value range is V1~V2, and the preset exposure time range is defined by a preset minimum camera exposure time t1 and a preset maximum camera exposure time t2; t1 is set such that, when the fringes are projected at t1, the maximum gray value of the region where the target object is located in the camera view is smaller than V2; t2 is set such that, when the fringes are projected at t2, the minimum gray value of the region where the target object is located in the camera view is larger than V1;
The acquiring of the reference image and the reference exposure time within the preset measurement range in step S1 comprises: the projector projects a pure white image onto the target object; the exposure time of the camera is continuously changed from t1 to t2 within the preset exposure time range; during this adjustment, the number of pixels in the camera view whose gray values fall within the measurement gray value range is counted, and the images and the corresponding exposure times are stored; the image with the largest count of in-range pixels is selected as the reference image, and its exposure time is taken as the reference exposure time.
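The reference-exposure search described above can be sketched in Python. This is an illustrative sketch, not the patent's implementation: `capture` is a hypothetical camera callback, and the range limits and step count are example values.

```python
import numpy as np

def select_reference_exposure(capture, t1, t2, v1, v2, steps=20):
    """Scan exposure times from t1 to t2 and pick the reference image.

    `capture(t)` is a hypothetical callback returning the camera frame
    (a 2-D uint8 array) taken at exposure time t while the projector
    projects a pure white image.
    """
    best_count, ref_img, ref_t = -1, None, None
    for t in np.linspace(t1, t2, steps):
        img = capture(t)
        # Count pixels whose gray value falls in the preset range [v1, v2].
        count = np.count_nonzero((img >= v1) & (img <= v2))
        if count > best_count:
            best_count, ref_img, ref_t = count, img, t
    return ref_img, ref_t
```

The frame with the most pixels inside the measurable range becomes the reference image, and its exposure time the reference exposure time.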
In this embodiment, in step S2 the pixels whose gray value on the reference image is V1 are selected as reference points, and the illuminance value of the reference points is uniformly set to an arbitrary constant, such as 300;
building a template image F with the same size as the reference image, the gray value of each template image pixel being determined by the following formula:
F(x, y) = 0 when V1 ≤ VR(x, y) ≤ V2, and F(x, y) = 1 otherwise (1)
in the above formula (1), VR(x, y) is the gray value of the reference image at coordinate (x, y);
increasing the exposure time from the reference exposure time until the pixel gray value of the reference point is greater than or equal to V2, storing the pixel gray values and the exposure times in this process, and representing the camera response function by a fourth-degree polynomial with five coefficients, as shown in formula (2):
y = Ax^4 + Bx^3 + Cx^2 + Dx + E (2)
the exposure amounts of the reference point during the adjustment are used as input values and the corresponding pixel gray values as output values, the 5 unknown coefficients A, B, C, D, E are obtained by a least-squares fitting algorithm, that is, the camera response function is obtained by calculation, from which the exposure amount H1 corresponding to V1 and the exposure amount H2 corresponding to V2 are respectively obtained.
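The least-squares fit of formula (2) and the inversion that yields H1 and H2 can be sketched with NumPy. The (H, v) samples and the range limits V1 = 50, V2 = 200 below are assumed example values, not data from the patent.

```python
import numpy as np

# Hypothetical measured data: exposure amounts H of the reference point and
# the pixel gray values v recorded while the exposure time was increased.
H = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
v = np.array([40.0, 70.0, 120.0, 160.0, 190.0, 215.0, 235.0])

# Least-squares fit of the five coefficients A..E of
# v = A*H**4 + B*H**3 + C*H**2 + D*H + E  (formula (2)).
A, B, C, D, E = np.polyfit(H, v, deg=4)
f = np.poly1d([A, B, C, D, E])

def exposure_for_gray(target, lo=0.0, hi=6.0, iters=60):
    # Invert the fitted response by bisection; valid because the response
    # is monotonically increasing over the fitted exposure range.
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if f(mid) < target else (lo, mid)
    return 0.5 * (lo + hi)

# Exposure amounts corresponding to the assumed limits V1 = 50, V2 = 200.
H1, H2 = exposure_for_gray(50.0), exposure_for_gray(200.0)
```

Any monotone curve model would do here; the quartic simply matches formula (2).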
In this embodiment, obtaining the relative irradiance values of all the points by using the camera response function in step S3, and calculating the number of exposures and the exposure times by combining the obtained relative irradiance values with the camera response function, includes the following steps:
s41: for each point whose pixel gray value in the template image F is 1, the image sequence obtained in step S21 is traversed; when an image in the sequence is found whose pixel value at that coordinate lies within the optimal measurement range, the exposure amount of the point is obtained by using the calculated camera response function, and this exposure amount is divided by the exposure time of that image to obtain the relative irradiance value of the point;
s42: for each point whose pixel gray value in the template image F is 0, the exposure amount corresponding to the point is obtained from the pixel gray value of the reference image at that point and the camera response function, and this exposure amount is divided by the reference exposure time to obtain the relative irradiance value of the point;
s43: after the relative irradiance values of all target points are obtained, the minimum value E1 is found; the value of H1/E1 is one exposure time, and the relative irradiance values of the points whose exposure amounts at this exposure time fall between H1 and H2 are deleted; step S43 is repeated until all irradiance values are deleted, and the exposure times obtained in this calculation process are the exposure times required for the measurement.
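Steps S41–S43 reduce to a greedy covering loop over the relative irradiance values. A minimal sketch, assuming the irradiance values and the bounds H1, H2 have already been computed:

```python
def plan_exposures(irradiances, H1, H2):
    """Greedy selection of exposure times (step S43).

    `irradiances` holds the relative irradiance value of every target
    point; H1 and H2 are the exposure amounts that map to the gray-value
    limits V1 and V2 under the fitted response function.
    Returns the list of exposure times needed to cover every point.
    """
    remaining = sorted(irradiances)
    times = []
    while remaining:
        e_min = remaining[0]   # dimmest point not yet covered
        t = H1 / e_min         # one exposure time (H1/E1 in the text)
        times.append(t)
        # Points whose exposure amount E*t falls inside [H1, H2] are
        # measurable at this exposure time; remove them.
        remaining = [e for e in remaining if not (H1 <= e * t <= H2)]
    return times
```

Each pass covers at least the dimmest remaining point, so the loop always terminates with a finite exposure list.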
In the present embodiment, combining the N-step phase shift technique and the multiple exposure method to perform image synthesis includes:
s51: the projector projects a pure white image, the camera exposure time is changed in turn according to the result of step S43, and a target image sequence I is acquired;
s52: the projector then projects N fringe images, the camera exposure time is changed in the same way, and a target image sequence J is acquired; first the template images are obtained from the image sequence I, where the calculation formula is:
M_i(x, y) = 1 if I_i(x, y) = max_j { I_j(x, y) : I_j(x, y) < 255 }, and M_i(x, y) = 0 otherwise (3)
in the above formula (3), M_i(x, y) is the pixel gray value at coordinate (x, y) of the template image corresponding to the i-th exposure, and I_i(x, y), I_j(x, y) are the pixel gray values at coordinate (x, y) of the camera images corresponding to the i-th and j-th exposures; the fringe image sequence is then fused into a new fringe image sequence P by using the template images:
P_j(x, y) = Σ_i M_i(x, y) · J_{i,j}(x, y) (4)
in the above formula (4), J_{i,j}(x, y) is the pixel gray value at coordinate (x, y) of the camera image corresponding to the i-th exposure when the j-th fringe pattern is projected, and P_j(x, y) is the pixel gray value at coordinate (x, y) of the j-th fused image.
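Formulas (3) and (4), as reconstructed here, amount to a per-pixel selection of the brightest unsaturated exposure followed by a masked sum. A NumPy sketch under that assumption (array shapes and the 255 saturation level are illustrative):

```python
import numpy as np

def fuse_fringes(I, J, saturation=255):
    """Fuse multi-exposure fringe images (formulas (3) and (4)).

    I: (K, H, W) white-image stack, one frame per exposure time.
    J: (K, N, H, W) fringe stack, K exposures x N phase-shifted patterns.
    At each pixel the exposure with the highest unsaturated white-image
    gray value is selected (templates M_i), then every fringe pattern is
    taken from that exposure.
    """
    I = I.astype(np.int32)
    # Exclude saturated pixels from the per-pixel maximum search.
    masked = np.where(I < saturation, I, -1)
    best = np.argmax(masked, axis=0)           # (H, W) chosen exposure index
    K = I.shape[0]
    M = (np.arange(K)[:, None, None] == best)  # template images M_i
    # P_j(x, y) = sum_i M_i(x, y) * J_{i,j}(x, y)
    P = (M[:, None, :, :] * J).sum(axis=0)
    return P.astype(np.uint8)
```

If every exposure saturates at some pixel, this sketch falls back to the first exposure there; a production version would flag such pixels instead.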
In the present embodiment, in combination with the algorithm principle in the document "Automated phase-measuring profilometry of 3-D diffuse objects", the fused image sequence P is used, the projector is regarded as a virtual camera, the points in the projector view corresponding to the camera view are found, and the phase solution and three-dimensional reconstruction are performed by trigonometry.
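The phase solution mentioned above is the standard N-step phase-shift computation; a minimal sketch of the wrapped-phase step (the unwrapping and triangulation stages of the patent are not shown):

```python
import numpy as np

def wrapped_phase(P):
    """Wrapped phase from the fused N-step phase-shift images.

    P: (N, H, W) fused fringe stack, pattern j shifted by 2*pi*j/N,
    i.e. P_j = A + B*cos(phi + 2*pi*j/N) at each pixel.
    Returns phi in (-pi, pi] per pixel.
    """
    N = P.shape[0]
    shifts = 2 * np.pi * np.arange(N) / N
    # Projections onto the shifted sine/cosine basis.
    num = (P * np.sin(shifts)[:, None, None]).sum(axis=0)
    den = (P * np.cos(shifts)[:, None, None]).sum(axis=0)
    return -np.arctan2(num, den)
```

With at least three shifted patterns the constant offset A cancels, so the fused gray values alone determine the wrapped phase.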
The beneficial effects of the present invention will be further illustrated by the following experiments:
the measuring device for testing The method of The invention comprises a computer, a projector (NEC NP43 +), and an industrial CCD camera (The Imaging Source, DMK 23G 445). The experimental measurement object is a metal blade with a complex reflectivity surface, an image directly projected to the object is shown in fig. 3, an experimentally calculated camera response curve is shown in fig. 4, and exposure times obtained through an algorithm are respectively as follows: 63.73 ms, 30.3 ms, 13.3 ms, 11.49 ms, 5.81 ms and 1.17 ms. The fused image is shown in fig. 5, and the reconstruction result is shown in fig. 6.
As can be seen from fig. 5 and 6, the fused image phase information is better preserved, and the three-dimensional data of the object surface is completely recovered.
Therefore, compared with the prior art, the invention has the following beneficial effects:
(1) the invention can adaptively calculate the required number of exposures and exposure times for the measured surface, improving the measurement efficiency and the degree of automation, and has great application value in industry;
(2) the invention can set the optimal intensity range according to the measurement requirements, thereby improving measurement flexibility;
(3) the invention can calculate the relative irradiance values of all points within the field of view, thereby improving exposure accuracy and robustness;
(4) the automatic exposure algorithm is not only suitable for the structured light measurement mode, but can also be applied to other three-dimensional vision measurement modes.
The above is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above-mentioned embodiments, and all technical solutions belonging to the idea of the present invention belong to the protection scope of the present invention. It should be noted that modifications and embellishments within the scope of the invention may be made by those skilled in the art without departing from the principle of the invention.
Claims (4)
1. A structured light self-adaptive three-dimensional measurement method based on a camera response curve is characterized by comprising the following steps:
s1: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and reference exposure time in the preset measurement range;
s2: selecting a reference point in the reference image, and calculating a camera response function according to an exposure amount and a gray value obtained along with a change in exposure time of the reference point;
s3: obtaining the relative irradiance values of all the points by using the camera response function, and calculating the number of exposures and the exposure times based on the relative irradiance values and the camera response function;
s4: the images acquired at different exposure times are fused into a new sequence of fringe images and used for three-dimensional reconstruction,
the preset measurement gray value range and the exposure time range specifically include:
s21: the preset measurement gray value range is V1~V2, and the preset exposure time range is defined by a preset minimum camera exposure time t1 and a preset maximum camera exposure time t2; t1 is set such that, when the fringes are projected at t1, the maximum gray value of the region where the target object is located in the camera view is smaller than V2; t2 is set such that, when the fringes are projected at t2, the minimum gray value of the region where the target object is located in the camera view is larger than V1;
The acquiring of the reference image and the reference exposure time within the preset measurement range comprises: the projector projects a pure white image onto the target object; within the preset exposure time range the camera exposure time is continuously changed from t1 to t2; during this adjustment the number of pixels in the camera view whose gray values fall within the measurement gray value range is counted, and the images and the corresponding exposure times are stored; the image with the largest count of in-range pixels is selected as the reference image, and its exposure time is taken as the reference exposure time; the pixels whose gray value on the reference image is V1 are selected as reference points, and the illuminance value of the reference points is uniformly set to an arbitrary constant;
establishing a template image F with the same size as the reference image, wherein the pixel gray value of the template image is determined by the following formula:
F(x, y) = 0 when V1 ≤ VR(x, y) ≤ V2, and F(x, y) = 1 otherwise (1)
In the above formula (1), VR(x, y) is the gray value of the reference image at coordinate (x, y);
continuously increasing the exposure time from the reference exposure time until the pixel gray value of the reference point is greater than or equal to V2, storing the pixel gray values and the exposure times in this process, and representing the camera response function by a fourth-degree polynomial with five coefficients, as shown in formula (2):
y = Ax^4 + Bx^3 + Cx^2 + Dx + E (2)
the exposure amounts of the reference point during the adjustment are used as input values and the corresponding pixel gray values as output values, the 5 unknown coefficients A, B, C, D, E are obtained by a least-squares fitting algorithm, namely the camera response function is obtained through calculation, and from the camera response function the exposure amount H1 corresponding to V1 and the exposure amount H2 corresponding to V2 can be obtained respectively.
2. The structured light self-adaptive three-dimensional measurement method based on a camera response curve according to claim 1, wherein obtaining the relative irradiance values of all points by using the camera response function and calculating the number of exposures and the exposure times by combining the obtained relative irradiance values with the camera response function comprises:
s41: traversing the image sequence obtained in step S21 for each point whose pixel gray value in the template image F is 1; when an image in the sequence is found whose pixel value at that coordinate lies within the optimal measurement range, the exposure amount of the point is obtained by using the calculated camera response function, and this exposure amount is divided by the exposure time of that image to obtain the relative irradiance value of the point;
s42: for each point whose pixel gray value in the template image F is 0, obtaining the exposure amount corresponding to the point from the pixel gray value of the reference image at that point and the camera response function, and dividing this exposure amount by the reference exposure time to obtain the relative irradiance value of the point;
s43: after the relative irradiance values of all target points are obtained, finding the minimum value E1; the value of H1/E1 is one exposure time, and the relative irradiance values of the points whose exposure amounts at this exposure time fall between H1 and H2 are deleted; repeating step S43 until all irradiance values are deleted, wherein the exposure times obtained in this calculation process are the exposure times required for the measurement.
3. The adaptive three-dimensional measurement method for structured light based on camera response curve according to claim 2, wherein the image synthesis is performed by combining N-step phase shift technology and multiple exposure method, and comprises:
s51: the projector projects a pure white image, the exposure time of the camera is respectively changed according to the result of the step S43, and a target image sequence I is obtained;
s52: the projector then projects N fringe patterns, the camera exposure time is changed in the same way, and a target image sequence J is obtained; first the template images are obtained by using the image sequence I, where the calculation formula is:
M_i(x, y) = 1 if I_i(x, y) = max_j { I_j(x, y) : I_j(x, y) < 255 }, and M_i(x, y) = 0 otherwise (3)
In the above formula (3), M_i(x, y) is the pixel gray value at coordinate (x, y) of the template image corresponding to the i-th exposure, and I_i(x, y), I_j(x, y) are the pixel gray values at coordinate (x, y) of the camera images corresponding to the i-th and j-th exposures; the fringe image sequence is then fused into a new fringe image sequence P by using the template images:
P_j(x, y) = Σ_i M_i(x, y) · J_{i,j}(x, y) (4)
In the above formula (4), J_{i,j}(x, y) is the pixel gray value at coordinate (x, y) of the camera image corresponding to the i-th exposure when the j-th fringe pattern is projected, and P_j(x, y) is the pixel gray value of the j-th fused image at coordinate (x, y).
4. The method according to claim 3, wherein the fused image sequence P is used, the projector is regarded as a virtual camera, and the point corresponding to the camera view in the projector view is found, and the phase solution and the three-dimensional reconstruction are performed by using a trigonometry method.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810403285.1A CN108827184B (en) | 2018-04-28 | 2018-04-28 | Structured light self-adaptive three-dimensional measurement method based on camera response curve |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108827184A CN108827184A (en) | 2018-11-16 |
CN108827184B true CN108827184B (en) | 2020-04-28 |
Family
ID=64147518
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810403285.1A Active CN108827184B (en) | 2018-04-28 | 2018-04-28 | Structured light self-adaptive three-dimensional measurement method based on camera response curve |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108827184B (en) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109883354B (en) * | 2019-03-05 | 2021-01-26 | 盎锐(上海)信息科技有限公司 | Adjusting system and method for projection grating modeling |
CN110440712B (en) * | 2019-08-26 | 2021-03-12 | 英特维科技(苏州)有限公司 | Self-adaptive large-field-depth three-dimensional scanning method and system |
CN110475078B (en) * | 2019-09-03 | 2020-12-15 | 河北科技大学 | Camera exposure time adjusting method and terminal equipment |
WO2021078300A1 (en) * | 2019-10-24 | 2021-04-29 | 先临三维科技股份有限公司 | Three-dimensional scanner and three-dimensional scanning method |
CN111707221B (en) * | 2020-06-29 | 2021-11-16 | 西安工业大学 | Multi-exposure scattering signal fusion surface roughness measurement method |
CN112118435B (en) * | 2020-08-04 | 2021-06-25 | 山东大学 | Multi-projection fusion method and system for special-shaped metal screen |
CN112764358B (en) * | 2020-12-28 | 2022-04-15 | 中国人民解放军战略支援部队航天工程大学 | Dynamic control method for optical observation exposure time of geosynchronous orbit target |
CN113340235B (en) * | 2021-04-27 | 2022-08-12 | 成都飞机工业(集团)有限责任公司 | Projection system based on dynamic projection and phase shift pattern generation method |
CN113870430B (en) * | 2021-12-06 | 2022-02-22 | 杭州灵西机器人智能科技有限公司 | Workpiece data processing method and device |
CN116614713B (en) * | 2023-07-14 | 2023-10-27 | 江南大学 | Self-adaptive multiple exposure method for three-dimensional morphology measurement |
Citations (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2013012335A1 (en) * | 2011-07-21 | 2013-01-24 | Ziv Attar | Imaging device for motion detection of objects in a scene, and method for motion detection of objects in a scene |
CN103002225A (en) * | 2011-04-20 | 2013-03-27 | CSR Technology Inc. | Multiple exposure high dynamic range image capture |
CN103411533A (en) * | 2013-08-05 | 2013-11-27 | 上海交通大学 | Structured light self-adapting repeated multi-exposure method |
CN103761712A (en) * | 2014-01-21 | 2014-04-30 | 太原理工大学 | Image blind convolution method based on adaptive optical system point spread function reconstruction |
CN104113946A (en) * | 2013-04-17 | 2014-10-22 | Baoshan Iron & Steel Co., Ltd. | Light source illuminance adaptive control device and light source illuminance adaptive control method |
CN104539921A (en) * | 2014-11-26 | 2015-04-22 | 北京理工大学 | Illumination compensation method based on multi-projector system |
CN104835130A (en) * | 2015-04-17 | 2015-08-12 | 北京联合大学 | Multi-exposure image fusion method |
CN104954701A (en) * | 2015-06-19 | 2015-09-30 | 长春理工大学 | Camera response curve generating method |
CN105651203A (en) * | 2016-03-16 | 2016-06-08 | 广东工业大学 | High-dynamic-range three-dimensional shape measurement method for self-adaptation fringe brightness |
CN105872398A (en) * | 2016-04-19 | 2016-08-17 | 大连海事大学 | Space camera self-adaption exposure method |
CN106091986A (en) * | 2016-06-08 | 2016-11-09 | 韶关学院 | A kind of method for three-dimensional measurement being applicable to glossy surface |
CN107845128A (en) * | 2017-11-03 | 2018-03-27 | 安康学院 | A kind of more exposure high-dynamics image method for reconstructing of multiple dimensioned details fusion |
CN107894215A (en) * | 2017-12-26 | 2018-04-10 | 东南大学 | HDR optical grating projection method for three-dimensional measurement based on fully automatic exposure |
Non-Patent Citations (2)
Title |
---|
A topography measurement method based on automatic multiple-exposure surface structured light; Li Zhaojie; Acta Optica Sinica; No. 11, 2018-11-30; full text *
An adaptive-ROI high-speed camera system for three-dimensional product contour inspection; Wang Qian; China Master's Theses Full-text Database, Information Science and Technology; 2015-07-15; I138-957 *
Also Published As
Publication number | Publication date |
---|---|
CN108827184A (en) | 2018-11-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108827184B (en) | Structured light self-adaptive three-dimensional measurement method based on camera response curve | |
CN110689581B (en) | Structured light module calibration method, electronic device and computer readable storage medium | |
CN107894215B (en) | High dynamic range grating projection three-dimensional measurement method based on full-automatic exposure | |
CN108168464B (en) | phase error correction method for defocusing phenomenon of fringe projection three-dimensional measurement system | |
JP5576726B2 (en) | Three-dimensional measuring apparatus, three-dimensional measuring method, and program | |
JP2012103239A (en) | Three dimensional measurement device, three dimensional measurement method, and program | |
JP6519265B2 (en) | Image processing method | |
CN107071248B (en) | High dynamic range imaging method for extracting geometric features of strong reflection surface | |
US20190005607A1 (en) | Projection device, projection method and program storage medium | |
JP6444233B2 (en) | Distance measuring device, distance measuring method, and program | |
JP6937642B2 (en) | Surface evaluation method and surface evaluation device | |
JP6418884B2 (en) | Three-dimensional measuring apparatus, three-dimensional measuring method and program | |
CN109474814A (en) | Two-dimensional calibration method, projector and the calibration system of projector | |
US11449970B2 (en) | Three-dimensional geometry measurement apparatus and three-dimensional geometry measurement method | |
Chen et al. | Automated exposures selection for high dynamic range structured-light 3-D scanning | |
CN114612409A (en) | Projection calibration method and device, storage medium and electronic equipment | |
CN116608794B (en) | Anti-texture 3D structured light imaging method, system, device and storage medium | |
CN109587463A (en) | Calibration method, projector and the calibration system of projector | |
CN112985302A (en) | Three-dimensional measurement system, method, apparatus, medium, and electronic device | |
US11898838B2 (en) | Adjustment method and measurement method | |
CN115127481A (en) | Stripe projection 3D measuring method, terminal device and computer readable storage medium | |
JP6776004B2 (en) | Image processing equipment, image processing methods and programs | |
JP5968370B2 (en) | Three-dimensional measuring apparatus, three-dimensional measuring method, and program | |
CN110857855B (en) | Image data acquisition method, device and system | |
TWI604411B (en) | Structured-light-based exposure control method and exposure control apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
EE01 | Entry into force of recordation of patent licensing contract |
Application publication date: 2018-11-16; Assignee: NANJING KINGYOUNG INTELLIGENT SCIENCE AND TECHNOLOGY Co.,Ltd.; Assignor: Nanjing University of Aeronautics and Astronautics; Contract record no.: X2021320000050; Denomination of invention: An adaptive 3D measurement method of structured light based on camera response curve; Granted publication date: 2020-04-28; License type: Common License; Record date: 2021-07-19 |