CN107835931A - Method for monitoring linear dimensions of three-dimensional objects - Google Patents


Info

Publication number
CN107835931A
Authority
CN
China
Prior art keywords
line
pixel
matrix
video camera
camera
Prior art date
Legal status
Granted
Application number
CN201580080870.7A
Other languages
Chinese (zh)
Other versions
CN107835931B (en)
Inventor
Andrei Vladimirovich Klimov
Aleksandr Georgievich Lomakin
Current Assignee
Anna Stibwa
Lomakin Aleksandr Georgievich
Original Assignee
Aleksandr Georgievich Lomakin
Priority date
Filing date
Publication date
Application filed by Aleksandr Georgievich Lomakin
Publication of CN107835931A
Application granted
Publication of CN107835931B
Status: Expired - Fee Related
Anticipated expiration


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention provides a method for 3D measurement of objects, comprising: projecting onto the object, by means of a projector, an image with a periodic structure composed of lines; recording the projector light reflected from the object with the pixels of the camera's receiving matrix, a triangulation angle being formed between the central beam of the projector and the central beam of the camera; and then identifying the lines formed by the reflected light on the receiving pixels of the camera matrix, in order to determine their coordinates and identify the lines in the camera image. The projector projects an image composed of two groups of intersecting lines; within each group the lines are parallel to one another and inclined at an angle to the vertical axis of the plane of the triangulation angle. The mutual intersection points of each pair of lines are then determined and identified, together with the pixel columns and rows of the camera matrix on which they are registered. Each intersection found on the matrix between a pair of projected lines and a vertical pixel column is taken as the coordinate Xn of a point N on the object; the intersection of a horizontal pixel row with a pair of lines is taken as the coordinate Yn on the object; and the coordinate Z is determined from the relation Z = M*Yn/sin(α), where M is the lens scale factor expressing a pixel in spatial dimensions and α is the triangulation angle. The groups of parallel projected lines are mutually perpendicular, the lines within each group are equidistant from one another, and the inclination angle of the projected lines is chosen acute. Of each pair of mutually perpendicular lines, one lies at an acute angle to the pixel columns of the camera matrix and the other to its pixel rows, the angle being selected from the relation β = arcsin(Tv2*M/((z2 - z1)*sin α)), where β is the angle of the projected line position, Tv2 is the distance between adjacent projected lines, M is the lens scale factor expressing a pixel in spatial dimensions, and z1 and z2 are the borders of the shared working area of the projector (1) and the camera (5). The invention shortens the duration of the measurement and reduces the probability of imaging and measurement errors for the object under test.

Description

Method for monitoring linear dimensions of three-dimensional objects
Technical field
The present invention relates to measuring equipment and can be used for accurate three-dimensional (3D) measurement and for visualizing the profiles of three-dimensional objects by observing a known projected pattern at different triangulation angles.
Background art
A known method for the optical measurement of surface shape comprises: placing the surface in the illumination field of a projection optical system and simultaneously in the field of view of an image detector; projecting, with the projection optical system, a set of images with a given structure of the luminous flux onto the measured surface; detecting the corresponding set of images of the surface observed at an angle different from the projection angle; and determining the shape of the measured surface from the recorded images. In addition, at least three periodic light-intensity distributions are projected onto the surface in turn; these are sets of bands whose intensity varies sinusoidally in the transverse direction and which differ from one another by a controlled shift of the band structure in the direction transverse to the bands. The recorded images are processed to obtain a preliminary phase distribution corresponding to the surface points. A complementary intensity distribution is additionally projected onto the surface, making it possible to determine, for each point of the surface, the band number within the band set; an additional image of the surface is recorded, and the resulting phase distribution of each visible point of the surface is obtained from the image of the object illuminated by the preliminary phase distribution and the image of the object illuminated by the complementary distribution. The absolute coordinates of the surface points are obtained from the resulting phase distribution using pre-calibration data. When measurements are performed by this method it is assumed that the image of each surface point is registered under the condition that the point is illuminated only by the direct beam of the projector, and that the illumination of the image of that point in the image detector is proportional to the brightness of the beam incident on it directly from the projector (RU 2148793).
The shortcomings of this method are the complexity of its implementation and the duration of the process: in the case of mechanical oscillation of the positions of the devices (projector and camera), the process requires considerable time to perform the measurements and to take the error sources into account.
A known method, and apparatus implementing it, for the contactless inspection and identification of the surfaces of three-dimensional objects by structured illumination comprises: a radiation source and a transparency capable of forming an aperiodic band structure of light; an afocal optical system for projecting the image of the transparency onto the inspected surface; a receiving lens, which forms the image of the band pattern that appears on the inspected surface, distorted by the relief of that surface; a photorecorder, which converts the image formed by the receiving lens into a digital image; and a computing digital electronic unit, which recalculates the digital image produced by the photorecorder into the coordinates of the inspected surface, all the above components being installed in succession along the radiation path. It is further provided with: N-1 additional radiation sources, each differing from the rest in spectral range; N-1 transparencies, each differing from the rest in at least one band; N-1 lenses installed behind the transparencies; N-1 mirrors installed in front of the second component of the afocal optical system at 45° to the optical axis of each of the N-1 lenses, and a further N-1 mirrors installed behind the receiving lens at 45° to its optical axis; N-1 secondary receiving lenses, each installed behind one of the second N-1 mirrors and forming, together with the receiving lens, the image of the band pattern that appears on the inspected surface, distorted by its relief; N-1 photorecorders, each with a spectral sensitivity region matching the spectral range of one of the N-1 radiation sources; N-1 computing digital electronic units; and an image-addition electronic unit with a number of inputs equal to the number of computing digital electronic units, each input of the image-addition unit being connected to the output of one computing digital electronic unit, the number N being determined by the formula N = log2(L), where L is the number of spatial-resolution element pairs of the photorecorder (RU 2199718).
The shortcomings of this method likewise lie in the complexity of its implementation and the duration of the process: in the case of mechanical oscillation of the positions of the devices (projector and camera), considerable time is required for the measurements and for taking the error sources into account.
A known method, and apparatus implementing it, monitors the linear dimensions of three-dimensional objects along three Cartesian coordinates. Two cameras are placed to the right and to the left of the projector, forming a stereo pair, similar to human vision. The projector projects a band image onto the object. Images are acquired from the right and left cameras, and the two images are then compared by a correlation method: for each band of the right image, a similar band is sought in the left image by direct search among all the bands of the left image (US 6377700).
The shortcoming of this method is that searching through all possible band pairs and running the correlation algorithm on a computer takes a very long time.
A known method of measuring three-dimensional objects by structured illumination proceeds as follows. A predetermined image with at least two non-intersecting lines is projected by a projector onto the object under inspection along one of the longitudinal axes; the light of the projector reflected from the object is recorded by at least two cameras placed at different distances from the projector, so that different triangulation angles are formed between the central beam of the projector and the central beams of the cameras. Each line projected by the projector and formed by the reflected light received by each camera is then identified by comparing the line coordinates obtained by the cameras. The triangulation angle between the central beam of the projector and the central beam of the first camera, located at the minimum distance from the projector, is chosen equal to the arc tangent of the ratio of the distance between the projected bands to the depth of field of the camera lens. In the image of the first camera, the longitudinal coordinate of the centre of a line is determined, and the vertical coordinate is determined as the quotient of the longitudinal coordinate and the tangent of the triangulation angle between the central beam of the projector and the central beam of the first camera. To refine the vertical coordinate, its value is obtained with a second camera positioned at a triangulation angle greater than the first: in the image of the second camera, the position of the same line is identified near the longitudinal coordinate, the ordinate being calculated as the product of the vertical coordinate determined by the first camera and the tangent of the triangulation angle of the second camera. The refined values of the longitudinal and vertical coordinates are then determined for these lines (WO 2014074003, prototype).
The shortcomings of this method are as follows. Its implementation requires at least two cameras, and preferably three or more. If the Z coordinate is determined with a single camera, an appreciable error can arise, caused by the error in determining a point on the line reflected from the object and by the error in associating the period of that line with the pixel columns and rows of the camera matrix recorded by the camera, since the reception field associated with the period of the reflected line spans the maximum number of pixel columns and rows of the camera matrix. In the image obtained by the first camera, at a small angle to the projector, the region onto which a given line can be projected never occupies the region of another line for any position of the object in the working area; however, the accuracy of the 3D coordinates determined in this way is not high, and the second camera is needed for refinement.
The field of a line in the camera image is the region of camera-matrix pixels within which the centre of the projected line can be located; the size of this field depends on the period between the projected lines and on the thickness of the projected line. Without a second camera, it is practically impossible to narrow down the region of the matrix onto which a given point is projected.
Summary of the invention
The technical purpose of the present invention is to develop an effective method of performing 3D measurements of objects using structured illumination, and to extend the range of methods for 3D measurement by structured illumination.
The technical effect of the claimed solution is to shorten the duration of the process and to reduce the probability of imaging and measurement errors of the object under test, errors connected with determining a point on a line reflected from the object and with associating the field recorded by the camera with the pixel columns of the camera matrix. This is achieved because each point in the camera image is sought and formed along two lines, namely as the intersection of two mutually perpendicular lines, which virtually eliminates erroneous searches for points along a line and erroneous determination of line numbers: the field received from the reflected intersecting lines, rotated relative to the columns and rows of the matrix, occupies the minimum possible number of pixel columns and rows of the single camera matrix required to implement the claimed method. In this application, the projected horizontal lines, perpendicular to the vertical ones, play the role of the second camera: all intersections of the vertical and horizontal lines uniquely assign numbers to the horizontal lines, and the 3D coordinates are determined from the intersections of the lines, or from the horizontal lines, using a single camera and within the image of that single camera.
As a result, the region of ambiguity, i.e. the region containing the sought point, comprises the minimum number of pixels and can therefore be considerably smaller than when the known methods are implemented.
The essence of the invention is that the method of 3D measurement of objects comprises: projecting onto the object, with a projector, an image having a periodic structure composed of lines; recording the projector light reflected from the object with the pixels of the camera's receiving matrix, a triangulation angle being formed between the central beam of the projector and the central beam of the camera; then identifying the lines formed by the reflected light on the receiving pixels of the camera matrix, so as to determine their coordinates and identify the lines in the camera image. The projector projects an image composed of two groups of intersecting lines, the lines within each group being parallel to one another and inclined at an angle to the vertical axis of the plane of the triangulation angle; the mutual intersection points of each pair of lines are then determined and identified, together with the pixel columns and rows of the camera matrix on which they are registered.
Preferably, each intersection found on the matrix between a pair of projected lines and a vertical pixel column is taken as the coordinate Xn of a point N on the object, the intersection of a horizontal pixel row with the pair of lines is taken as the coordinate Yn on the object, and the coordinate Z is determined from the relation Z = M*Yn/sin(α), where M is the lens scale factor expressing a pixel in spatial dimensions and α is the triangulation angle.
Preferably, the groups of parallel projected lines are mutually perpendicular, the lines within each group are equidistant from one another, and the inclination angle of the projected lines is chosen acute.
Preferably, of each pair of mutually perpendicular lines, one lies at an acute angle to the pixel columns of the camera matrix and the other to its pixel rows, the angle being selected from the relation β = arcsin(Tv2*M/((z2 - z1)*sin α)), where β is the angle of the projected line position, Tv2 is the distance between adjacent projected lines, M is the lens scale factor expressing a pixel in spatial dimensions, z1 and z2 are the borders of the shared working area of the projector 1 and the camera 5, and α is the triangulation angle.
Preferably, the measurements and the determination of the coordinates are performed by a computer processor, and a 3D image of the measured object is formed on the computer monitor.
Brief description of the drawings
Fig. 1 shows the layout of the projector and the camera when a single horizontal line is projected onto the object;
Fig. 2 shows the layout of the projector and the camera when a line rotated by the angle β relative to the pixel columns and rows of the camera is projected onto the object;
Fig. 3 shows the layout of the projector and the camera when two mutually perpendicular lines, rotated relative to the pixel columns and rows of the camera, are projected onto the object;
Fig. 4 shows the intersections of the projected mutually perpendicular lines with the pixel columns on the camera matrix;
Fig. 5 shows the region of ambiguity formed where a column of the camera matrix crosses the projected mutually perpendicular lines.
In the accompanying drawings, the reference positions denote: projector 1, comprising radiation source 2, slide 3 bearing the pattern of the projected image, and lens 4; and camera 5, comprising receiving matrix 6 and a lens 4 identical to the projection lens.
The slide 3 (equivalently: transparency, template, lantern slide, etc.) is, for example, a thin plate whose points, in the plane illuminated by the beam of the radiation source 2, differ in absorptance or refractive index. The projector 1 and the camera 5 are positioned with the distance A between their lenses 4, so that the triangulation angle α and the triangulation plane are formed between the central beam of the projector 1 and the central beam of the camera 5. Here z1 and z2 in Fig. 1 are the borders (depth) of the shared working area of the projector 1 and the camera 5. The working area of the scanner, in size and geometry, is taken to be the region of space where the beams of the projector, which form the image on the object, intersect the beams defining the field of view of the camera.
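For illustration only, the geometric quantities introduced above can be collected in a short Python sketch; the class name, field names and numeric values are assumptions of this sketch, not taken from the patent:

```python
from dataclasses import dataclass
import math

@dataclass
class ScannerGeometry:
    """Illustrative container for the quantities named above (values assumed)."""
    A: float      # baseline between the projector lens and the camera lens, mm
    alpha: float  # triangulation angle between the central beams, radians
    z1: float     # near border of the shared working area (depth), mm
    z2: float     # far border of the shared working area (depth), mm
    M: float      # lens scale factor: mm of object space per matrix pixel

geom = ScannerGeometry(A=100.0, alpha=math.radians(20.0), z1=300.0, z2=500.0, M=0.1)
print(f"working depth {geom.z2 - geom.z1:.0f} mm, alpha {math.degrees(geom.alpha):.0f} deg")
```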
Embodiment
In Fig. 1, the horizontal line 8 projected by the projector 1 onto the object 7 under test is reflected from it and recorded, over the whole width of the region Ly (delimited in the figure by horizontal dashed lines), by the pixels of the matrix 6 of the camera 5. In Fig. 2, the intersection point of the horizontal line 8 with the line 9, projected at the angle β onto the object 7, is reflected from the object and recorded by the pixels of the matrix 6 of the camera 5 in the region Ly*Tv2, which contains a noticeably smaller number of pixels, lying in the region delimited in the figure by the inclined and horizontal dashed lines; these dashed lines are the recorded images of the mutually perpendicular lines 8 and 9, rotated by the angle β in the plane of the slide 3 and therefore lying at the angle β to the pixel columns and rows in the plane of the matrix 6 of the camera 5. In Fig. 3, the mutually perpendicular lines 10 and 11, projected at the acute angle β onto the object 7, are reflected from it and recorded by the pixels of the matrix 6 of the camera 5 in the region of their intersection, which contains the smaller number of pixels lying in the region on pixel column 13 delimited by the inclined dashed lines (the recorded images of lines 10 and 11).
The top of Fig. 4 shows the intersection points, on the object 7, of the mutually perpendicular lines 9 and 11, 12 projected at the angle β; they are reflected from the object 7 and recorded by the pixels of the two columns 13, 14 of the matrix 6 of the camera 5 in regions containing the minimum number of pixel columns and rows, delimited in the figure by two bold points (the recorded images of the intersections of line 9 with lines 11 and 12).
The bottom of Fig. 4 shows the intersection points, on the object 7, of the mutually perpendicular lines 9 and 11, 12 projected at the angle β to the triangulation plane (to the pixel columns and rows of the camera matrix); they are reflected from the object 7 and recorded by the pixels of the two columns 13, 14 of the matrix 6 of the camera 5 in regions containing the minimum number of pixel columns and rows, delimited in the figure by four bold points (the recorded images of the intersections of the projected lines).
Fig. 5 shows the field of intersection of the reflected lines 9 and 11 (of thickness b), rotated relative to the triangulation plane and to the matrix column 13; the recorded field 15 occupies the minimum possible number of pixel columns and rows of the matrix 6 of the camera 5.
The method is implemented as follows.
The method comprises projecting, with the projector 1, the image of a periodic structure onto the surface of the object 7. The light of the projector 1 reflected from the object 7 is recorded with the pixels of the receiving matrix 6 of the camera 5, which is displaced by the distance A relative to the projection system of the projector 1 and positioned so that the triangulation angle α is formed between the central beam of the projector 1 and the central beam of the camera 5.
Using the projector 1, the image of a periodic structure formed by two groups of intersecting lines is projected onto the object 7 under study: for example, mutually perpendicular lines 9, 10, 11 at the acute angle β to the plane of the triangulation angle (the triangulation plane), i.e. in general to the pixel columns 13 and rows of the matrix 6 of the camera 5. The light of the projector 1 reflected from the object 7 is recorded by the pixels of the receiving matrix 6 of the camera 5. One group of lines provides the initial measurement of the shape of the object 7, and the second group (for example, perpendicular to the first) serves for its refinement.
In Fig. 1, as in the known analogues, the projector 1 projects the image of the slide 3, consisting of a single horizontal line 8 passing through the centre of the projector image. The camera 5 observes the object 7 at the angle α. Depending on the position of the object 7 in the working area z1-z2, the line 8 reflected from the object 7 is projected at different positions within the region Ly of the matrix 6 of the camera 5, where Ly = ((z2 - z1)*sin(α))/M and M is the scale factor of the lens 4 that projects the image onto the matrix 6 of the camera 5. It can thus be seen that, depending on the position of the object 7 in the working area, the projected line 8 can occupy any position within the range Ly on the matrix 6 of the camera. Hence, for the projected lines to be identified uniquely and unambiguously on the matrix 6 of the camera 5, they must be projected with a period greater than Ly, i.e. Tv1 > Ly = ((z2 - z1)*sin(α))/M.
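As a numeric illustration of the condition Tv1 > Ly, a minimal sketch; the function name and the example values are assumptions, not taken from the patent:

```python
import math

def line_sweep_region_ly(z1: float, z2: float, alpha: float, M: float) -> float:
    """Ly = (working depth * sin(alpha)) / M: the band of matrix rows, in
    pixels, that one projected line can sweep as the object moves between
    the working-area borders z1 and z2."""
    return abs(z2 - z1) * math.sin(alpha) / M

# Illustrative values only: 200 mm working depth, 20 deg, 0.1 mm per pixel.
Ly = line_sweep_region_ly(z1=300.0, z2=500.0, alpha=math.radians(20.0), M=0.1)
print(f"Ly = {Ly:.0f} px, so an unambiguous period needs Tv1 > {Ly:.0f} px")
```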
Here, for the sake of clarity, it is assumed that identical lenses 4 with the same scale factor are used for both the projector 1 and the camera 5. If different lenses are used, the value M should be taken as the ratio of the scale factors of the projection lenses of the projector 1 and of the camera 5. Moreover, M may be not simply a number but a matrix for each lens, containing scale corrections for the horizontal and vertical directions of the projected image; these corrections serve to compensate the distortion (optical spatial distortion) of the lenses.
If the image in the projector 1 is rotated and, instead of a horizontal line, a line 9 at the angle β to the triangulation plane is projected, as shown in Fig. 2, then more parallel lines can be projected, i.e. with a shorter period. In this case the period between the lines depends on the rotation angle β of the projected image in the projector 1, and the distance between the parallel lines is Tv2 > Ly*sin(β).
If the period Tv2 is less than Ly*sin(β), a line may fall within the Tv2 region of another line and its number may be detected falsely; consequently, the position Z of the object 7 in the working area may be determined erroneously.
A larger number of projectors can be used (for example, two projectors whose central beams lie in one triangulation plane) to design composite periodic images of the lines 9, 11, 12, but in that case the computation becomes more complicated.
All the lines 9 projected with this rotation angle β and this period are unique: whatever the position of the object 7 in the working area z2-z1, each projected line is projected onto its own specific region of the matrix 6 of the camera 5.
In Figs. 4 and 5 the mutually perpendicular lines 9 and 11, 12 lie at the acute angle β relative to the vertical axis of the plane of the triangulation angle α and relative to the pixel columns of the camera 5. Of each pair of mutually perpendicular lines, one, for example line 9, lies at an acute angle to the pixel columns of the matrix 6 of the camera 5, and the others (for example lines 11, 12) at an acute angle to its rows. The acute angle β is preferably determined from the relation β = arcsin(Tv2*M/((z2 - z1)*sin α)), where β is the angle of the projected line, Tv2 is the distance between adjacent projected lines, M is the lens scale factor expressing a pixel in spatial dimensions, z1 and z2 are the borders of the shared working area of the projector 1 and the camera 5, and α is the triangulation angle.
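The relation for β can be evaluated directly; a minimal sketch under the same assumed units as above (M in millimetres per pixel, Tv2 in pixels, all values illustrative):

```python
import math

def tilt_angle(Tv2: float, M: float, z1: float, z2: float, alpha: float) -> float:
    """beta = arcsin(Tv2 * M / ((z2 - z1) * sin(alpha))): the acute tilt that
    keeps lines with period Tv2 unique over the working depth z1..z2."""
    return math.asin(Tv2 * M / (abs(z2 - z1) * math.sin(alpha)))

beta = tilt_angle(Tv2=60.0, M=0.1, z1=300.0, z2=500.0, alpha=math.radians(20.0))
print(f"beta = {math.degrees(beta):.1f} deg")   # about 5 degrees with these numbers
```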
Therefore, when the image 3 is rotated in the projector 1, more lines 9, 11, 12 can be projected onto the matrix 6 of the camera 5 and more information about the object 7 can be obtained, which narrows the region of ambiguity on the matrix 6 of the camera 5 for each point of the object 7.
Because the camera 5 is positioned in the vertical plane at the angle α relative to the projector 1, movement of the object 7 along the axis Z in the working area makes the points of all the projected lines move along the vertical pixel columns of the matrix 6 of the camera 5.
For each pair of projected lines, the determination (location and study) of the regions where they intersect each other and the pixel columns and rows on the matrix 6 of the camera 5 is based on the following.
If a line 9, projected at the angle β to the vertical in the projector image (hereinafter the "vertical" line), intersects a line 11 perpendicular to it and at the angle β to the horizontal (hereinafter the "horizontal" line), then the intersection point 10 of these lines will always be projected onto the same vertical column of the matrix 6 of the camera 5.
If the line 9 intersects the lines 11 and 12, each intersection point is projected onto its own column of the matrix 6, as shown in Fig. 4: the intersection of lines 9 and 12 is projected onto column 14, and the intersection of lines 9 and 11 onto column 13.
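For illustration, the image coordinates of such an intersection can be computed by intersecting the two recorded lines; a minimal sketch, assuming the detected line centres have been fitted to straight lines in the general form a*x + b*y + c = 0 (a representation assumed here, not prescribed by the patent):

```python
def intersect(l1, l2):
    """Intersection of two image lines given as (a, b, c) with a*x + b*y + c = 0;
    returns pixel coordinates (x, y), or None for (near-)parallel lines."""
    (a1, b1, c1), (a2, b2, c2) = l1, l2
    det = a1 * b2 - a2 * b1          # Cramer's rule denominator
    if abs(det) < 1e-12:
        return None
    x = (b1 * c2 - b2 * c1) / det
    y = (c1 * a2 - c2 * a1) / det
    return x, y

# A near-vertical line tilted by beta and a near-horizontal line (illustrative).
print(intersect((1.0, -0.09, -100.0), (0.09, 1.0, -400.0)))  # ~ (134.9, 387.9)
```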
Each line formed by the reflected light, together with the recording pixels of the matrix 6 of the camera 5 that perceive it, is identified in order to determine the coordinates of the line in the image of the camera 5.
If measurements as accurate as possible are required for a particular slide 3, the zero positions of the pixel rows and column numbers on the matrix 6 of the camera 5 can be preset (before operating the system formed by the camera 5 and the projector 1). This operation can also be used for the advance correction of lens distortion (optical spatial distortion) and for refining the scale factor M mentioned above.
The zero position is set by a predetermined calibration procedure (before the object 7 is put in place): an arbitrary calibration plane (for example, in the form of a movable screen) is moved along the coordinate Z within the working area of the equipment, and all the columns of the matrix 6 of the camera 5 followed by the moving intersection points of the projected lines are recorded. The position of the calibration plane at the point of intersection of the beams of the projector 1 and the camera 5 is chosen as the zero position. At the zero position, the line 9 passing through the centre of the image 3 of the projector 1 is projected onto the matrix 6 of the camera 5 at its centre, and this position of the projected line 9 on the camera matrix is called the zero point. In Fig. 1 the zero position is marked 0 along the axis Y. When the calibration plane moves towards or away from the system formed by the camera 5 and the projector 1, the deviation ΔYn of a line on the matrix 6 of the camera 5 is used to refine the location of the line intersections on the matrix 6.
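As an illustration of how the deviation ΔYn might be obtained from stored zero positions, a minimal sketch in which the per-line tables and all values are purely hypothetical:

```python
from typing import Dict

def row_offsets(zero_rows: Dict[int, float],
                object_rows: Dict[int, float]) -> Dict[int, float]:
    """Delta-Yn per line number: the matrix row recorded on the object minus
    the row recorded for the same line at the calibration (zero) plane."""
    return {n: object_rows[n] - zero_rows[n]
            for n in object_rows if n in zero_rows}

# Line number -> matrix row in pixels; all values illustrative.
zero = {0: 512.0, 1: 575.0, 2: 638.0}   # recorded at the zero position
meas = {0: 540.5, 1: 604.0, 2: 668.2}   # recorded with the object in place
print(row_offsets(zero, meas))          # offsets of roughly 28.5, 29.0, 30.2 px
```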
The line 9 is located on the matrix 6 of the camera 5 by searching for the centre of the line 9 of Fig. 5, because the actually projected line has a certain thickness b on the matrix 6 of the camera 5 and occupies several pixels. When the image 3 of the projector 1 is rotated, the thickness of the line 9 crossing the pixel column 13 on the matrix 6 increases, and the location of the line 9 may become less accurate, which causes ambiguity in determining the intersection points of the "vertical" and "horizontal" lines. In this connection it is preferable to choose the period between the "horizontal" lines greater than the region of ambiguity 15, shown in Fig. 5, at the intersection of the column 13 with the line 9, i.e. Tgor > b/tan(β), where b is the thickness of the projected line 9 and Tgor is the period between the "horizontal" lines 11, 12.
To project more lines 11, 12 than in the case of horizontal lines, the "vertical" line 9 must be rotated: as shown in Fig. 1, a single unrotated line and its position region Ly occupy almost the whole matrix 6 of the camera 5; the "vertical" lines 9 are therefore projected at the angle β to the vertical. Fig. 2 shows that the region Tv2 is much smaller than the region Ly. The "vertical" lines 9 intersect the "horizontal" lines 11, 12, and the intersection points of these lines unambiguously provide the data on the numbers of the vertical and horizontal lines crossing at a given point.
It is rational to choose the number of "horizontal" lines 11, 12 equal to or less than the number of vertical lines, to avoid ambiguity when determining the line intersections. At the same time, the period of the horizontal lines should be chosen greater than the region of confusion that appears where the lines intersect, since that region causes ambiguity in determining the intersection points of the vertical and horizontal lines.
Thus, the image 3 projected by the projector 1 can be realized with the vertical grid period Tv2 > ((z2 - z1)*sin(α)*sin(β))/M and the horizontal grid period Tgor > b/tan(β); the image 3 should be rotated by the angle β relative to the vertical pixel columns 13 of the matrix 6.
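The two period conditions can be checked together when designing the slide; a minimal sketch with assumed example values:

```python
import math

def grid_periods_ok(Tv2: float, Tgor: float, b: float, beta: float,
                    z1: float, z2: float, alpha: float, M: float) -> bool:
    """Checks the two design conditions stated above:
    Tv2  > (working depth * sin(alpha) * sin(beta)) / M   (tilted-line period)
    Tgor > b / tan(beta)                                  (cross-line period vs blur zone)"""
    tv2_min = abs(z2 - z1) * math.sin(alpha) * math.sin(beta) / M
    tgor_min = b / math.tan(beta)
    return Tv2 > tv2_min and Tgor > tgor_min

print(grid_periods_ok(Tv2=60.0, Tgor=40.0, b=3.0, beta=math.radians(5.0),
                      z1=300.0, z2=500.0, alpha=math.radians(20.0), M=0.1))
```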
An image projected onto the object 7 in this way allows the numbers of the projected "horizontal" lines 11, 12 to be determined accurately. With a known geometry (the relative position of the camera 5 and the projector 1, i.e. the angle α of the system formed by the camera 5 and the projector 1), this makes it possible to determine the shape of the object 7 in the working area of that system as Z = M*Yn/sin(α), where Yn is the offset (in pixels) of the horizontal line 11 on the matrix 6 of the camera 5 relative to its centre, i.e. relative to its position when it passes through the centre of the matrix 6. When the object 7 is in the middle of the working area, the line 11 crosses the centre of the matrix 6.
It is thus possible to determine quickly and accurately the intersections of the projected lines with each other and with the pixel columns on the camera matrix. The intersection of a pair of projected lines with the nearest vertical column on the camera matrix is taken as the coordinate Xn of the point N on the object, the intersection of the pair of lines with the nearest horizontal pixel row is taken as the coordinate Yn on the object, and the coordinate Z is determined from the relation Z = M*Yn/sin(α), where M is the lens scale factor expressing a pixel in spatial dimensions and the angle α is the triangulation angle.
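For illustration, one reading of these relations as a computation for a single intersection point; the column scaling used for Xn is an assumption of this sketch (the patent states only that the column position of the intersection determines Xn):

```python
import math

def point_from_intersection(col: float, row: float, row_zero: float,
                            M: float, alpha: float):
    """Xn from the matrix column of the intersection, Yn as the pixel-row
    offset against the calibrated zero position, Z = M * Yn / sin(alpha)."""
    Yn = row - row_zero            # offset in pixels against the zero position
    Xn = col * M                   # column scaled to object space (assumed)
    Z = M * Yn / math.sin(alpha)   # depth from the stated triangulation relation
    return Xn, Yn, Z

print(point_from_intersection(col=135.0, row=540.5, row_zero=512.0,
                              M=0.1, alpha=math.radians(20.0)))
```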
Fig. 5 shows that the region of confusion 15, the field within which the sought point must be located in order to determine the line numbers, contains the minimum number of pixels and is therefore considerably smaller than when the known methods are implemented. A second camera is not needed, which simplifies the design of the equipment used, the measurement procedure and the processing of the results. The measurements (calculation of the specific characteristics) and the determination of the coordinates are performed by a computer processor, and a 3D image of the measured object is formed on the computer monitor.
The duration of the process is thus shortened, and the probability of imaging and measurement errors for the object under test is reduced.
Industrial usability
The invention is implemented using common equipment that is widely used in industry.

Claims (5)

1. A method of 3D measurement of objects, comprising: projecting onto the object, with a projector, an image having a periodic structure composed of lines; recording the projector light reflected from the object with the pixels of the camera's receiving matrix, a triangulation angle being formed between the central beam of the projector and the central beam of the camera; then identifying the lines formed by the reflected light on the receiving pixels of the camera matrix, in order to determine their coordinates and identify the lines in the camera image; projecting with the projector an image composed of two groups of intersecting lines, the lines within each group being parallel to one another and inclined at an angle to the vertical axis of the plane of the triangulation angle; and then determining and identifying the mutual intersection points of each pair of lines and the pixel columns and rows of the camera matrix on which they are registered.
2. The method according to claim 1, characterised in that each intersection, found on the matrix, of a pair of projected lines with a vertical pixel column is taken as the coordinate Xn of a point N on the object; the intersection of a horizontal pixel row with the pair of lines is taken as the coordinate Yn on the object; and the coordinate Z is determined from the relation Z = M*Yn/sin(α), where M is the lens scale factor expressing a pixel in spatial dimensions and α is the triangulation angle.
3. The method according to claim 1 or 2, characterised in that the groups of parallel projected lines are mutually perpendicular, the lines within each group are equidistant from one another, and the inclination angle of the projected lines is chosen acute.
4. The method according to claim 3, characterised in that, of each pair of mutually perpendicular lines, one lies at an acute angle to the pixel columns of the camera matrix and the other to its pixel rows, the angle being selected from the relation β = arcsin(Tv2*M/((z2 - z1)*sin α)), where β is the angle of the projected line position, Tv2 is the distance between adjacent projected lines, M is the lens scale factor expressing a pixel in spatial dimensions, z1 and z2 are the borders of the shared working area of the projector (1) and the camera (5), and α is the triangulation angle.
5. The method according to claim 1, 2 or 4, characterised in that the measurements and the determination of the coordinates are performed by a computer processor, and a 3D image of the measured object is formed on the computer monitor.
CN201580080870.7A 2015-12-04 2015-12-04 Method for monitoring linear dimension of three-dimensional entity Expired - Fee Related CN107835931B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/RU2015/000851 WO2017095259A1 (en) 2015-12-04 2015-12-04 Method for monitoring linear dimensions of three-dimensional entities

Publications (2)

Publication Number Publication Date
CN107835931A 2018-03-23
CN107835931B CN107835931B (en) 2020-11-10

Family

ID: 58797378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201580080870.7A Expired - Fee Related CN107835931B (en) 2015-12-04 2015-12-04 Method for monitoring linear dimension of three-dimensional entity

Country Status (2)

Country Link
CN (1) CN107835931B (en)
WO (1) WO2017095259A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10198808B2 (en) * 2016-01-15 2019-02-05 Instrumental, Inc. Methods for automatically generating a common measurement across multiple assembly units
CN112017238B (en) * 2019-05-30 2024-07-19 北京初速度科技有限公司 Method and device for determining spatial position information of linear object

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120092463A1 (en) * 2010-08-06 2012-04-19 University Of Kentucky Research Foundation (Ukrf) Dual-frequency Phase Multiplexing (DFPM) and Period Coded Phase Measuring (PCPM) Pattern Strategies in 3-D Structured Light Systems, and Lookup Table (LUT) Based Data Processing
CN104006762A (en) * 2014-06-03 2014-08-27 深圳市大族激光科技股份有限公司 Method, device and system for obtaining three-dimensional information of object
CN104014905A (en) * 2014-06-06 2014-09-03 哈尔滨工业大学 Observation device and method of three-dimensional shape of molten pool in GTAW welding process
WO2015006431A1 (en) * 2013-07-10 2015-01-15 Faro Technologies, Inc. Triangulation scanner having motorized elements
CN104380033A (en) * 2012-06-07 2015-02-25 法罗技术股份有限公司 Coordinate measurement machines with removable accessories

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7065242B2 (en) * 2000-03-28 2006-06-20 Viewpoint Corporation System and method of three-dimensional image capture and modeling
CN103228228B (en) * 2010-07-12 2016-04-13 3形状股份有限公司 Use the 3D object modeling of textural characteristics
RU125335U1 (en) * 2012-11-07 2013-02-27 Общество с ограниченной ответственностью "Артек Венчурз" DEVICE FOR MONITORING LINEAR SIZES OF THREE-DIMENSIONAL OBJECTS

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jason Geng: "DLP-Based Structured Light 3D Imaging Technologies and Applications", Proc. of SPIE *

Also Published As

Publication number Publication date
WO2017095259A1 (en) 2017-06-08
CN107835931B (en) 2020-11-10

Similar Documents

Publication Publication Date Title
US10401143B2 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US9858682B2 (en) Device for optically scanning and measuring an environment
EP1364226B1 (en) Apparatus and method for obtaining three-dimensional positional data from a two-dimensional captured image
EP2183544B1 (en) Non-contact measurement apparatus and method
EP2568253B1 (en) Structured-light measuring method and system
KR20190085151A (en) Distance sensor with adjustable focus imaging sensor
CN105960569B (en) The method of three-dimension object is checked using two dimensional image processing
WO2006013635A1 (en) Three-dimensional shape measuring method and apparatus for the same
JP2011227093A (en) Method for measuring shape of reflection surface and system
TWI740237B (en) Optical phase profilometry system
JP2012058076A (en) Three-dimensional measurement device and three-dimensional measurement method
WO2014074003A1 (en) Method for monitoring linear dimensions of three-dimensional objects
JP4402458B2 (en) Method for determining corresponding points in 3D measurement
JP2005195335A (en) Three-dimensional image photographing equipment and method
US11727635B2 (en) Hybrid photogrammetry
JP2019045299A (en) Three-dimensional information acquisition device
CN107835931A (en) The method for monitoring the linear dimension of 3D solid
US20240175677A1 (en) Measuring system providing shape from shading
JP2011075336A (en) Three-dimensional shape measuring instrument and method
JP2010243273A (en) Measuring method and measuring apparatus for object having cylindrical shape
US20160349045A1 (en) A method of measurement of linear dimensions of three-dimensional objects
RU125335U1 (en) DEVICE FOR MONITORING LINEAR SIZES OF THREE-DIMENSIONAL OBJECTS
TW201432222A (en) Three-dimensional range finding method and system thereof
JP2017181114A (en) Radiation intensity distribution measurement system and method
RU164082U1 (en) DEVICE FOR MONITORING LINEAR SIZES OF THREE-DIMENSIONAL OBJECTS

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200927

Address after: 25-2-21 Spring Street, Moscow, Russian Federation

Applicant after: Anna Stibwa

Applicant after: LOMAKIN ALEKSANDR GEORGIEVICH

Address before: 40, building 1, 25 Vernadsky Road, Moscow

Applicant before: Andrei Vladimirovich Klimov

Applicant before: LOMAKIN ALEKSANDR GEORGIEVICH

GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201110

Termination date: 20211204