CN108827184B - Structured light self-adaptive three-dimensional measurement method based on camera response curve - Google Patents

Structured light self-adaptive three-dimensional measurement method based on camera response curve Download PDF

Info

Publication number
CN108827184B
CN108827184B CN201810403285.1A CN201810403285A CN108827184A
Authority
CN
China
Prior art keywords
exposure time
image
value
exposure
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810403285.1A
Other languages
Chinese (zh)
Other versions
CN108827184A (en)
Inventor
Cui Haihua
Cheng Xiaosheng
Li Zhaojie
Tian Wei
Zhang Xiaodi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201810403285.1A priority Critical patent/CN108827184B/en
Publication of CN108827184A publication Critical patent/CN108827184A/en
Application granted granted Critical
Publication of CN108827184B publication Critical patent/CN108827184B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The application provides a structured light self-adaptive three-dimensional measurement method based on a camera response curve, which comprises: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and a reference exposure time within the preset measurement range; selecting a reference point in the reference image, and calculating the camera response function from the exposure amounts and gray values recorded as the exposure time of the reference point is varied; obtaining the relative irradiance values of all points using the camera response function, and calculating the number of exposures and the exposure times from the relative irradiance values and the camera response function; and fusing the images acquired at the different exposure times into a new fringe image sequence used for three-dimensional reconstruction. The invention can self-adaptively reconstruct objects with complex surface reflectivity, enlarges the application range of the structured light measurement technique, improves the degree of automation of measurement, and obtains a good measurement effect on surfaces with complex reflectivity.

Description

Structured light self-adaptive three-dimensional measurement method based on camera response curve
Technical Field
The invention belongs to the technical field of optical measurement, and particularly relates to a structured light self-adaptive three-dimensional measurement method based on a camera response curve.
Background
Structured light measurement is an active optical measurement technique in which grating fringe images of a certain pattern are projected onto the surface of the object to be measured, a camera captures the grating images deformed by modulation of the object's shape, and the three-dimensional data of the object are then calculated. When the surface is over- or under-exposed, for example because of specular reflection, the camera image cannot faithfully reflect the true intensity of the surface, which reduces measurement accuracy or prevents reconstruction altogether. The multiple-exposure technique has been applied to structured light reconstruction to address such reflection problems: after images at multiple exposures are acquired, the pixel at each position with the highest gray value that is still unsaturated is selected to participate in reconstruction. Although combining structured light with multiple exposures yields a better measurement effect, both the exposure times and the number of exposures depend on the operator's experience, which limits measurement efficiency and industrial application.
It should be noted that the above background description is only for the convenience of clear and complete description of the technical solutions of the present application and for the understanding of those skilled in the art. Such solutions are not considered to be known to the person skilled in the art merely because they have been set forth in the background section of the present application.
Disclosure of Invention
The invention aims to provide a structured light self-adaptive three-dimensional measurement method based on a camera response curve, which can self-adaptively reconstruct an object with complex surface reflectivity, enlarge the application range of the structured light measurement technology, improve the automation degree of measurement and obtain good measurement effect on the complex reflectivity surface.
The invention provides a structured light self-adaptive three-dimensional measurement method based on a camera response curve, comprising the following steps:
s1: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and a reference exposure time within the preset measurement range;
s2: selecting a reference point in the reference image, and calculating the camera response function from the exposure amounts and gray values recorded as the exposure time of the reference point is varied;
s3: obtaining the relative irradiance values of all points using the camera response function, and calculating the number of exposures and the exposure times from the relative irradiance values and the camera response function;
s4: fusing the images acquired at the different exposure times into a new fringe image sequence used for three-dimensional reconstruction.
Further, presetting the measurement gray value range and the exposure time range specifically includes:
s21: the preset measurement gray value range is V1~V2, and the preset exposure time range is defined by the minimum camera exposure time t1 and the maximum camera exposure time t2. When setting t1, the maximum gray value of the region occupied by the target object in the camera view at this value, with the fringes projected, must be smaller than V2; when setting t2, the minimum gray value of the region occupied by the target object in the camera view at this value, with the fringes projected, must be larger than V1.
Acquiring the reference image and the reference exposure time within the preset measurement range includes: the projector projects a pure white image onto the target object; within the preset exposure time range, the camera exposure time is changed continuously from t1 to t2; during this adjustment, the number of pixels in the camera view whose gray values fall within the measurement gray value range is counted, and the images and their corresponding exposure times are stored. The image for which this count is largest is selected as the reference image, and its exposure time is taken as the reference exposure time.
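The reference-exposure search described above can be sketched as follows; `capture` is a hypothetical camera-driver callable (not part of the patent), and the gray bounds and exposure range are assumed parameters:

```python
import numpy as np

def find_reference_exposure(capture, t_min, t_max, v1, v2, steps=20):
    """Scan exposure times from t_min to t_max; return the image whose
    pixel count inside the measurement gray range [v1, v2] is largest,
    together with its exposure time and the whole recorded sequence."""
    best_count, best_img, best_t = -1, None, None
    sequence = []  # (exposure_time, image) pairs kept for later steps
    for t in np.linspace(t_min, t_max, steps):
        img = capture(t)  # hypothetical camera-driver call
        sequence.append((t, img))
        count = int(np.count_nonzero((img >= v1) & (img <= v2)))
        if count > best_count:
            best_count, best_img, best_t = count, img, t
    return best_img, best_t, sequence
```

The stored sequence is returned as well, since later steps (S41 below) traverse the same images.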
Further, a pixel whose gray value on the reference image equals V1 is selected as the reference point, and the illuminance value of the reference point is uniformly set to an arbitrary constant, such as 300.
A template image F of the same size as the reference image is built, the gray value of each template image pixel being determined by the following formula:
F(x, y) = 1, if VR(x, y) < V1 or VR(x, y) > V2; F(x, y) = 0, otherwise (1)
In the above formula (1), VR(x, y) is the gray value of the reference image at coordinate (x, y).
The exposure time is increased from the reference exposure time until the pixel gray value of the reference point is greater than or equal to V2; the pixel gray values and exposure times throughout this process are stored, and the camera response function is represented by a polynomial with five coefficients, as shown in formula (2):
y = Ax^4 + Bx^3 + Cx^2 + Dx + E (2)
Taking the exposure amounts of the reference point during the adjustment as input values and the corresponding pixel gray values as output values, the five unknown coefficients A, B, C, D, E are obtained by a least-squares fitting algorithm; this gives the calculated camera response function, from which the exposure amount H1 corresponding to V1 and the exposure amount H2 corresponding to V2 can be obtained.
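The response-function fitting can be sketched as below, with synthetic data standing in for the recorded reference-point samples (the exponential "true" response, the illuminance constant 300, and the gray bounds 50/200 are assumptions for illustration; the coefficient names A..E follow formula (2)):

```python
import numpy as np

# Synthetic stand-in for the recorded samples: the reference point's
# illuminance is fixed to an arbitrary constant (here 300), so the
# exposure amount is H = 300 * t for each stored exposure time t.
t = np.linspace(0.001, 0.02, 15)              # exposure times in s (assumed)
H = 300.0 * t                                 # exposure amounts
gray = 255.0 * (1.0 - np.exp(-0.5 * H))       # assumed "true" response

# Least-squares fit of the five coefficients A..E of formula (2):
# y = A*x^4 + B*x^3 + C*x^2 + D*x + E
A, B, C, D, E = np.polyfit(H, gray, 4)
f = np.poly1d([A, B, C, D, E])                # fitted response function

# Numerically invert the fitted curve to get the exposure amounts H1, H2
# that correspond to the gray-value bounds V1 = 50 and V2 = 200 (assumed).
def exposure_for_gray(v, lo=H.min(), hi=H.max(), n=100000):
    xs = np.linspace(lo, hi, n)
    return xs[np.argmin(np.abs(f(xs) - v))]

H1 = exposure_for_gray(50.0)
H2 = exposure_for_gray(200.0)
```

Note that `np.polyfit` returns coefficients in descending powers, which matches the A..E ordering of formula (2).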
Further, obtaining the relative irradiance values of all points using the camera response function and calculating the number of exposures and the exposure times from the relative irradiance values and the camera response function comprises:
s41: for each point with pixel gray value 1 in the template image F, the image sequence obtained in step S21 is traversed; when an image in the sequence is found whose pixel value at that coordinate lies within the optimal measurement range, the exposure amount of the point is obtained from the calculated camera response function, and this exposure amount divided by the exposure time of that image gives the relative irradiance value of the point;
s42: for each point with pixel gray value 0 in the template image F, the exposure amount of the point is obtained from the gray value of the reference image at that point and the camera response function, and this exposure amount divided by the reference exposure time gives the relative irradiance value of the point;
s43: after the relative irradiance values of all target points are obtained, the minimum value E1 is found; the value of H1/E1 is one required exposure time, and the relative irradiance values of points whose exposure amounts at this exposure time fall between H1 and H2 are deleted; step S43 is repeated until all relative irradiance values are deleted, and the exposure times obtained during this calculation are the exposure times required for measurement.
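A minimal sketch of the greedy loop in step S43, assuming the relative irradiance values and the exposure bounds H1, H2 are already known:

```python
def plan_exposures(irradiances, h1, h2):
    """Greedy selection of exposure times (step S43): repeatedly take the
    smallest remaining relative irradiance E1, use t = H1/E1 as one
    required exposure time, and drop every point whose exposure amount
    E*t then falls inside the measurable band [H1, H2]."""
    remaining = sorted(irradiances)
    times = []
    while remaining:
        e1 = remaining[0]          # smallest remaining relative irradiance
        t = h1 / e1                # one required exposure time
        times.append(t)
        # points whose exposure amount lands in [H1, H2] at time t are covered
        remaining = [e for e in remaining if not (h1 <= e * t <= h2)]
    return times
```

Each pass covers all points whose irradiance lies in [E1, E1*H2/H1], so the loop terminates after as many passes as there are required exposures.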
Further, performing image synthesis by combining the N-step phase-shift technique with the multiple-exposure method comprises:
s51: the projector projects a pure white image; according to the result of step S43, the camera exposure time is set to each calculated exposure time in turn and the target image sequence I is acquired;
s52: the projector then projects the N fringe images; the camera exposure time is changed in the same way and the target image sequence J is acquired. First, the template images are obtained from the image sequence I, with the calculation formula:
Mi(x, y) = 1, if Ii(x, y) is the largest of the values Ij(x, y) over all exposures j that remain below V2; Mi(x, y) = 0, otherwise (3)
In the above formula (3), Mi(x, y) is the pixel gray value at coordinate (x, y) of the template image corresponding to the i-th exposure, and Ii(x, y), Ij(x, y) are the pixel gray values at coordinate (x, y) of the camera images corresponding to the i-th and j-th exposures. The fringe image sequence is then fused into a new fringe image sequence P using the template images:
Pj(x, y) = Σi Mi(x, y) · Ji,j(x, y) (4)
In the above formula (4), Ji,j(x, y) is the pixel gray value at coordinate (x, y) of the camera image corresponding to the i-th exposure while the j-th fringe pattern is projected, and Pj(x, y) is the pixel gray value of the j-th fused image at coordinate (x, y).
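The fusion of formulas (3) and (4) can be sketched with NumPy as follows (the array shapes are assumptions: K exposures, N fringe patterns; V2 is the saturation bound):

```python
import numpy as np

def fuse_fringes(I, J, v2):
    """I: (K, H, W) white-image sequence over K exposures.
    J: (K, N, H, W) fringe sequences (N patterns) over the same exposures.
    Returns (N, H, W): at each pixel, every fused fringe image takes its
    gray value from the exposure whose white image is brightest while
    staying below v2 (formulas (3) and (4))."""
    I = I.astype(np.float64)
    masked = np.where(I < v2, I, -np.inf)      # exclude saturated values
    best = np.argmax(masked, axis=0)           # (H, W): chosen exposure i
    K, N, H, W = J.shape
    M = np.zeros_like(I)                       # template images Mi(x, y)
    M[best, np.arange(H)[:, None], np.arange(W)[None, :]] = 1.0
    # Pj(x, y) = sum_i Mi(x, y) * J[i, j](x, y)   (formula (4))
    return np.einsum('khw,knhw->nhw', M, J.astype(np.float64))
```

Because each template pixel is 1 for exactly one exposure index, the sum in formula (4) simply selects one gray value per pixel per fringe pattern.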
Further, using the fused image sequence P and regarding the projector as a virtual camera, the points in the projector view corresponding to the camera view are found, and phase solution and three-dimensional reconstruction are performed by triangulation.
Specific embodiments of the present application are disclosed in detail with reference to the following description and drawings, indicating the manner in which the principles of the application may be employed. It should be understood that the embodiments of the present application are not so limited in scope. The embodiments of the application include many variations, modifications and equivalents within the spirit and scope of the appended claims.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments, in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
FIG. 1 is a flow chart of a structured light adaptive three-dimensional measurement method based on a camera response curve according to the present invention;
FIG. 2 is a graph of camera response;
FIG. 3 is a schematic diagram of reference exposure time selection;
FIG. 4 is a schematic illustration of the measured target reflectance during an experiment;
FIG. 5 is a schematic diagram of experimentally calculated camera response curves;
FIG. 6 is a graph of the effect of image fusion in an experiment;
FIG. 7 is a graph showing the effect of model measurement in the experiment.
Detailed Description
In order to make those skilled in the art better understand the technical solutions in the present application, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The invention relates to a structured light self-adaptive three-dimensional measurement method based on a camera response curve: the camera response curve is calculated from the gray value of a reference point as it changes with exposure time, and the number of exposures and the exposure times are then calculated from the response curve and from images taken at different exposure times. The illumination response curve of the camera is the relationship between the camera exposure amount H and the image pixel value V, as shown in fig. 2.
The exposure amount H is numerically equal to the product of the illuminance E and the exposure time t, i.e., H = E·t. The camera response curve may therefore be expressed as:
V = f(H) = f(E·t) (5)
the invention provides a structured light self-adaptive three-dimensional measurement method based on a camera response curve, which comprises the following steps:
s1: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and reference exposure time in the preset measurement range;
s2: selecting a reference point in the reference image, and calculating a camera response function according to an exposure amount and a gray value obtained along with a change in exposure time of the reference point;
s3: obtaining relative radiance values of all the points by using a camera response function, and calculating exposure times and exposure time based on the relative radiance values and the camera response function;
s4: the images acquired at different exposure times are fused into a new sequence of fringe images and used for three-dimensional reconstruction.
In this embodiment, presetting the measurement gray value range and the exposure time range in step S1 specifically includes:
s21: the preset measurement gray value range is V1~V2, and the preset exposure time range is defined by the minimum camera exposure time t1 and the maximum camera exposure time t2. When setting t1, the maximum gray value of the region occupied by the target object in the camera view at this value, with the fringes projected, must be smaller than V2; when setting t2, the minimum gray value of the region occupied by the target object in the camera view at this value, with the fringes projected, must be larger than V1.
Acquiring the reference image and the reference exposure time within the preset measurement range in step S1 includes: the projector projects a pure white image onto the target object; within the preset exposure time range, the camera exposure time is changed continuously from t1 to t2; during this adjustment, the number of pixels in the camera view whose gray values fall within the measurement gray value range is counted, and the images and their corresponding exposure times are stored. The image for which this count is largest is selected as the reference image, and its exposure time is taken as the reference exposure time.
In this embodiment, in step S2 a pixel whose gray value on the reference image equals V1 is selected as the reference point, and the illuminance value of the reference point is uniformly set to an arbitrary constant, such as 300.
A template image F of the same size as the reference image is built, the gray value of each template image pixel being determined by the following formula:
F(x, y) = 1, if VR(x, y) < V1 or VR(x, y) > V2; F(x, y) = 0, otherwise (1)
In the above formula (1), VR(x, y) is the gray value of the reference image at coordinate (x, y).
The exposure time is increased from the reference exposure time until the pixel gray value of the reference point is greater than or equal to V2; the pixel gray values and exposure times throughout this process are stored, and the camera response function is represented by a polynomial with five coefficients, as shown in formula (2):
y = Ax^4 + Bx^3 + Cx^2 + Dx + E (2)
Taking the exposure amounts of the reference point during the adjustment as input values and the corresponding pixel gray values as output values, the five unknown coefficients A, B, C, D, E are obtained by a least-squares fitting algorithm; this gives the calculated camera response function, from which the exposure amount H1 corresponding to V1 and the exposure amount H2 corresponding to V2 can be obtained.
In this embodiment, obtaining the relative irradiance values of all points using the camera response function in step S3, and calculating the number of exposures and the exposure times from the obtained relative irradiance values and the camera response function, includes the following steps:
s41: for each point with pixel gray value 1 in the template image F, the image sequence obtained in step S21 is traversed; when an image in the sequence is found whose pixel value at that coordinate lies within the optimal measurement range, the exposure amount of the point is obtained from the calculated camera response function, and this exposure amount divided by the exposure time of that image gives the relative irradiance value of the point;
s42: for each point with pixel gray value 0 in the template image F, the exposure amount of the point is obtained from the gray value of the reference image at that point and the camera response function, and this exposure amount divided by the reference exposure time gives the relative irradiance value of the point;
s43: after the relative irradiance values of all target points are obtained, the minimum value E1 is found; the value of H1/E1 is one required exposure time, and the relative irradiance values of points whose exposure amounts at this exposure time fall between H1 and H2 are deleted; step S43 is repeated until all relative irradiance values are deleted, and the exposure times obtained during this calculation are the exposure times required for measurement.
In the present embodiment, performing image synthesis by combining the N-step phase-shift technique with the multiple-exposure method includes:
s51: the projector projects a pure white image; according to the result of step S43, the camera exposure time is set to each calculated exposure time in turn and the target image sequence I is acquired;
s52: the projector then projects the N fringe images; the camera exposure time is changed in the same way and the target image sequence J is acquired. First, the template images are obtained from the image sequence I, with the calculation formula:
Mi(x, y) = 1, if Ii(x, y) is the largest of the values Ij(x, y) over all exposures j that remain below V2; Mi(x, y) = 0, otherwise (3)
In the above formula (3), Mi(x, y) is the pixel gray value at coordinate (x, y) of the template image corresponding to the i-th exposure, and Ii(x, y), Ij(x, y) are the pixel gray values at coordinate (x, y) of the camera images corresponding to the i-th and j-th exposures. The fringe image sequence is then fused into a new fringe image sequence P using the template images:
Pj(x, y) = Σi Mi(x, y) · Ji,j(x, y) (4)
In the above formula (4), Ji,j(x, y) is the pixel gray value at coordinate (x, y) of the camera image corresponding to the i-th exposure while the j-th fringe pattern is projected, and Pj(x, y) is the pixel gray value of the j-th fused image at coordinate (x, y).
In the present embodiment, following the algorithm principle of the document "Automated phase-measuring profilometry of 3-D diffuse objects", the fused image sequence P is used, the projector is regarded as a virtual camera, the points in the projector view corresponding to the camera view are found, and phase solution and three-dimensional reconstruction are performed by triangulation.
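The phase solution on the fused fringe sequence can be sketched with the standard N-step phase-shift relation, assuming fringe patterns of the form A + B·cos(φ + 2πj/N); the triangulation step, which needs the calibrated camera-projector geometry, is omitted:

```python
import numpy as np

def wrapped_phase(P):
    """P: (N, H, W) fused fringe images with phase shifts 2*pi*j/N.
    Returns the wrapped phase map via the standard N-step formula:
    phi = -atan2( sum_j Pj*sin(2*pi*j/N), sum_j Pj*cos(2*pi*j/N) )."""
    N = P.shape[0]
    j = np.arange(N).reshape(-1, 1, 1)
    delta = 2.0 * np.pi * j / N
    num = np.sum(P * np.sin(delta), axis=0)   # proportional to -B*sin(phi)
    den = np.sum(P * np.cos(delta), axis=0)   # proportional to  B*cos(phi)
    return -np.arctan2(num, den)
```

The background term A cancels because the sines and cosines of the N equally spaced shifts sum to zero; the wrapped phase must still be unwrapped before triangulation.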
The beneficial effects of the present invention are further illustrated by the following experiment:
The measuring device used to test the method of the invention comprises a computer, a projector (NEC NP43+), and an industrial CCD camera (The Imaging Source, DMK 23G445). The measured object is a metal blade with a complex-reflectivity surface; an image of the object under direct projection is shown in fig. 4, the experimentally calculated camera response curve is shown in fig. 5, and the exposure times obtained by the algorithm are 63.73 ms, 30.3 ms, 13.3 ms, 11.49 ms, 5.81 ms and 1.17 ms. The fused images are shown in fig. 6, and the reconstruction result is shown in fig. 7.
As can be seen from figs. 6 and 7, the phase information of the fused images is well preserved, and the three-dimensional data of the object surface is recovered completely.
Therefore, compared with the prior art, the invention has the following beneficial effects:
(1) the invention can self-adaptively calculate the required number of exposures and exposure times for the measured surface, improving measurement efficiency and the degree of automation, and has considerable industrial application value;
(2) the invention can set the optimal intensity range according to the measurement requirement, improving measurement flexibility;
(3) the invention calculates the relative irradiance values of all points within the field of view, improving exposure accuracy and robustness;
(4) the automatic exposure algorithm is not only suitable for structured light measurement but can also be applied to other three-dimensional vision measurement modes.
The above is only a preferred embodiment of the present invention, and the protection scope of the present invention is not limited to the above-mentioned embodiments, and all technical solutions belonging to the idea of the present invention belong to the protection scope of the present invention. It should be noted that modifications and embellishments within the scope of the invention may be made by those skilled in the art without departing from the principle of the invention.

Claims (4)

1. A structured light self-adaptive three-dimensional measurement method based on a camera response curve, characterized by comprising the following steps:
s1: presetting a measurement gray value range and an exposure time range, and acquiring a reference image and a reference exposure time within the preset measurement range;
s2: selecting a reference point in the reference image, and calculating the camera response function from the exposure amounts and gray values recorded as the exposure time of the reference point is varied;
s3: obtaining the relative irradiance values of all points using the camera response function, and calculating the number of exposures and the exposure times from the relative irradiance values and the camera response function;
s4: fusing the images acquired at the different exposure times into a new fringe image sequence used for three-dimensional reconstruction,
wherein presetting the measurement gray value range and the exposure time range specifically includes:
s21: the preset measurement gray value range is V1~V2, and the preset exposure time range is defined by the minimum camera exposure time t1 and the maximum camera exposure time t2; when setting t1, the maximum gray value of the region occupied by the target object in the camera view at this value, with the fringes projected, must be smaller than V2; when setting t2, the minimum gray value of the region occupied by the target object in the camera view at this value, with the fringes projected, must be larger than V1;
the acquiring of the reference image and the reference exposure time within the preset measurement range comprises: the projector projects a pure white image onto the target object; within the preset exposure time range, the camera exposure time is changed continuously from t1 to t2; during this adjustment, the number of pixels in the camera view whose gray values fall within the measurement gray value range is counted, and the images and their corresponding exposure times are stored; the image for which this count is largest is selected as the reference image, and its exposure time is taken as the reference exposure time; a pixel whose gray value on the reference image equals V1 is selected as the reference point, and the illuminance value of the reference point is uniformly set to an arbitrary constant;
a template image F of the same size as the reference image is established, the pixel gray values of which are determined by the following formula:
F(x, y) = 1, if VR(x, y) < V1 or VR(x, y) > V2; F(x, y) = 0, otherwise (1)
in the above formula (1), VR(x, y) is the gray value of the reference image at coordinate (x, y);
the exposure time is continuously increased from the reference exposure time until the pixel gray value of the reference point is greater than or equal to V2; the pixel gray values and exposure times throughout this process are stored, and the camera response function is represented by a polynomial with five coefficients, as shown in formula (2):
y = Ax^4 + Bx^3 + Cx^2 + Dx + E (2)
taking the exposure amounts of the reference point during the adjustment as input values and the corresponding pixel gray values as output values, the five unknown coefficients A, B, C, D, E are obtained by a least-squares fitting algorithm; this gives the calculated camera response function, from which the exposure amount H1 corresponding to V1 and the exposure amount H2 corresponding to V2 are respectively obtained.
2. The structured light self-adaptive three-dimensional measurement method based on a camera response curve according to claim 1, wherein obtaining the relative irradiance values of all points using the camera response function and calculating the number of exposures and the exposure times from the obtained relative irradiance values and the camera response function comprises:
s41: for each point with pixel gray value 1 in the template image F, the image sequence obtained in step S21 is traversed; when an image in the sequence is found whose pixel value at that coordinate lies within the optimal measurement range, the exposure amount of the point is obtained from the calculated camera response function, and this exposure amount divided by the exposure time of that image gives the relative irradiance value of the point;
s42: for each point with pixel gray value 0 in the template image F, the exposure amount of the point is obtained from the gray value of the reference image at that point and the camera response function, and this exposure amount divided by the reference exposure time gives the relative irradiance value of the point;
s43: after the relative irradiance values of all target points are obtained, the minimum value E1 is found; the value of H1/E1 is one required exposure time, and the relative irradiance values of points whose exposure amounts at this exposure time fall between H1 and H2 are deleted; step S43 is repeated until all relative irradiance values are deleted, and the exposure times obtained during this calculation are the exposure times required for measurement.
3. The structured light self-adaptive three-dimensional measurement method based on a camera response curve according to claim 2, wherein the image synthesis performed by combining the N-step phase shift technique with the multiple exposure method comprises:
s51: the projector projects a pure white image, and the exposure time of the camera is changed in turn according to the result of step S43 to obtain a target image sequence I;
s52: the projector then projects N fringe patterns, and the exposure time of the camera is changed to obtain a target image sequence J; a template image is first obtained from the image sequence I, with the calculation formula:
$$M_i(x,y)=\begin{cases}1, & I_i(x,y)=\max_{k}\{\,I_k(x,y)\mid I_k(x,y)<255\,\}\\ 0, & \text{otherwise}\end{cases}\tag{3}$$
In the above formula (3), Mi(x, y) is the pixel gray value at coordinate (x, y) of the template image corresponding to the ith exposure, and Ii(x, y), Ij(x, y) are the pixel gray values at coordinate (x, y) of the camera images corresponding to the ith and jth exposures. The fringe image sequence is then fused into a new fringe image sequence P by using the template image:
$$P_j(x,y)=\sum_{i}M_i(x,y)\,J_{i,j}(x,y)\tag{4}$$
In the above formula (4), Ji,j(x, y) is the pixel gray value at coordinate (x, y) of the camera image corresponding to the ith exposure when the jth fringe pattern is projected, and Pj(x, y) is the pixel gray value of the jth fused image at coordinate (x, y).
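Because formulas (3) and (4) appear as image placeholders in the original, the sketch below is only one consistent reading of the described fusion: the template selects, per pixel, the exposure giving the largest unsaturated gray value in the white-image sequence I, and each fused fringe image takes that exposure's pixel from J. The array shapes and the 255 saturation threshold are assumptions:

```python
import numpy as np

def fuse_fringes(I, J, sat=255):
    """Fuse multi-exposure fringe images using a template built from white images.

    I: (K, H, W) white-pattern images over K exposure times.
    J: (K, N, H, W) fringe images (N phase-shift steps) over the same exposures.
    Returns P: (N, H, W) fused fringe sequence.
    """
    masked = np.where(I < sat, I, -1)    # treat saturated pixels as invalid
    best = masked.argmax(axis=0)         # per-pixel index of the best exposure
    K, N, H, W = J.shape
    P = np.empty((N, H, W), dtype=J.dtype)
    for j in range(N):
        # pick, at each pixel, the j-th fringe value from the best exposure
        P[j] = np.take_along_axis(J[:, j], best[None], axis=0)[0]
    return P
```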
4. The method according to claim 3, wherein the fused image sequence P is used, the projector is regarded as a virtual camera, the points in the projector view corresponding to the camera view are found, and phase solving and three-dimensional reconstruction are performed by triangulation.
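The phase solving referred to in claim 4 follows the standard N-step phase-shift relation I_j = A + B·cos(φ + δ_j), which gives the wrapped phase φ = arctan2(−Σ_j P_j·sin δ_j, Σ_j P_j·cos δ_j). A minimal sketch; the equal shift convention δ_j = 2πj/N is an assumption, since the patent does not state it:

```python
import numpy as np

def wrapped_phase(P):
    """Wrapped phase from an N-step phase-shifted fringe stack P of shape (N, H, W)."""
    N = P.shape[0]
    delta = 2 * np.pi * np.arange(N) / N   # assumed equal phase shifts
    s = np.tensordot(np.sin(delta), P, axes=(0, 0))
    c = np.tensordot(np.cos(delta), P, axes=(0, 0))
    return np.arctan2(-s, c)               # wrapped phase in (-pi, pi]
```

The result still needs phase unwrapping and camera-projector triangulation, which depend on the system calibration and are not sketched here.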
CN201810403285.1A 2018-04-28 2018-04-28 Structured light self-adaptive three-dimensional measurement method based on camera response curve Active CN108827184B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810403285.1A CN108827184B (en) 2018-04-28 2018-04-28 Structured light self-adaptive three-dimensional measurement method based on camera response curve

Publications (2)

Publication Number Publication Date
CN108827184A CN108827184A (en) 2018-11-16
CN108827184B true CN108827184B (en) 2020-04-28

Family

ID=64147518

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810403285.1A Active CN108827184B (en) 2018-04-28 2018-04-28 Structured light self-adaptive three-dimensional measurement method based on camera response curve

Country Status (1)

Country Link
CN (1) CN108827184B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109883354B (en) * 2019-03-05 2021-01-26 盎锐(上海)信息科技有限公司 Adjusting system and method for projection grating modeling
CN110440712B (en) * 2019-08-26 2021-03-12 英特维科技(苏州)有限公司 Self-adaptive large-field-depth three-dimensional scanning method and system
CN110475078B (en) * 2019-09-03 2020-12-15 河北科技大学 Camera exposure time adjusting method and terminal equipment
WO2021078300A1 (en) * 2019-10-24 2021-04-29 先临三维科技股份有限公司 Three-dimensional scanner and three-dimensional scanning method
CN111707221B (en) * 2020-06-29 2021-11-16 西安工业大学 Multi-exposure scattering signal fusion surface roughness measurement method
CN112118435B (en) * 2020-08-04 2021-06-25 山东大学 Multi-projection fusion method and system for special-shaped metal screen
CN112764358B (en) * 2020-12-28 2022-04-15 中国人民解放军战略支援部队航天工程大学 Dynamic control method for optical observation exposure time of geosynchronous orbit target
CN113340235B (en) * 2021-04-27 2022-08-12 成都飞机工业(集团)有限责任公司 Projection system based on dynamic projection and phase shift pattern generation method
CN113870430B (en) * 2021-12-06 2022-02-22 杭州灵西机器人智能科技有限公司 Workpiece data processing method and device
CN116614713B (en) * 2023-07-14 2023-10-27 江南大学 Self-adaptive multiple exposure method for three-dimensional morphology measurement

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013012335A1 (en) * 2011-07-21 2013-01-24 Ziv Attar Imaging device for motion detection of objects in a scene, and method for motion detection of objects in a scene
CN103002225A (en) * 2011-04-20 2013-03-27 Csr技术公司 Multiple exposure high dynamic range image capture
CN103411533A (en) * 2013-08-05 2013-11-27 上海交通大学 Structured light self-adapting repeated multi-exposure method
CN103761712A (en) * 2014-01-21 2014-04-30 太原理工大学 Image blind convolution method based on adaptive optical system point spread function reconstruction
CN104113946A (en) * 2013-04-17 2014-10-22 宝山钢铁股份有限公司 light source illuminance adaptive control device and light source illuminance adaptive control method
CN104539921A (en) * 2014-11-26 2015-04-22 北京理工大学 Illumination compensation method based on multi-projector system
CN104835130A (en) * 2015-04-17 2015-08-12 北京联合大学 Multi-exposure image fusion method
CN104954701A (en) * 2015-06-19 2015-09-30 长春理工大学 Camera response curve generating method
CN105651203A (en) * 2016-03-16 2016-06-08 广东工业大学 High-dynamic-range three-dimensional shape measurement method for self-adaptation fringe brightness
CN105872398A (en) * 2016-04-19 2016-08-17 大连海事大学 Space camera self-adaption exposure method
CN106091986A (en) * 2016-06-08 2016-11-09 韶关学院 A kind of method for three-dimensional measurement being applicable to glossy surface
CN107845128A (en) * 2017-11-03 2018-03-27 安康学院 A kind of more exposure high-dynamics image method for reconstructing of multiple dimensioned details fusion
CN107894215A (en) * 2017-12-26 2018-04-10 东南大学 HDR optical grating projection method for three-dimensional measurement based on fully automatic exposure

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A topography measurement method based on automatic multiple-exposure surface structured light; Li Zhaojie; Acta Optica Sinica (《光学学报》); No. 11, Nov. 2018; full text *
An adaptive-ROI high-speed camera system for three-dimensional product contour inspection; Wang Qian; China Master's Theses Full-text Database, Information Science and Technology; Jul. 15, 2015; I138-957 *

Similar Documents

Publication Publication Date Title
CN108827184B (en) Structured light self-adaptive three-dimensional measurement method based on camera response curve
CN110689581B (en) Structured light module calibration method, electronic device and computer readable storage medium
CN107894215B (en) High dynamic range grating projection three-dimensional measurement method based on full-automatic exposure
CN108168464B (en) phase error correction method for defocusing phenomenon of fringe projection three-dimensional measurement system
JP5576726B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and program
JP2012103239A (en) Three dimensional measurement device, three dimensional measurement method, and program
JP6519265B2 (en) Image processing method
CN107071248B (en) High dynamic range imaging method for extracting geometric features of strong reflection surface
US20190005607A1 (en) Projection device, projection method and program storage medium
JP6444233B2 (en) Distance measuring device, distance measuring method, and program
JP6937642B2 (en) Surface evaluation method and surface evaluation device
JP6418884B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method and program
CN109474814A (en) Two-dimensional calibration method, projector and the calibration system of projector
US11449970B2 (en) Three-dimensional geometry measurement apparatus and three-dimensional geometry measurement method
Chen et al. Automated exposures selection for high dynamic range structured-light 3-D scanning
CN114612409A (en) Projection calibration method and device, storage medium and electronic equipment
CN116608794B (en) Anti-texture 3D structured light imaging method, system, device and storage medium
CN109587463A (en) Calibration method, projector and the calibration system of projector
CN112985302A (en) Three-dimensional measurement system, method, apparatus, medium, and electronic device
US11898838B2 (en) Adjustment method and measurement method
CN115127481A (en) Stripe projection 3D measuring method, terminal device and computer readable storage medium
JP6776004B2 (en) Image processing equipment, image processing methods and programs
JP5968370B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and program
CN110857855B (en) Image data acquisition method, device and system
TWI604411B (en) Structured-light-based exposure control method and exposure control apparatus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20181116

Assignee: NANJING KINGYOUNG INTELLIGENT SCIENCE AND TECHNOLOGY Co.,Ltd.

Assignor: Nanjing University of Aeronautics and Astronautics

Contract record no.: X2021320000050

Denomination of invention: An adaptive 3D measurement method of structured light based on camera response curve

Granted publication date: 20200428

License type: Common License

Record date: 20210719
