WO2017095259A1 - Procédé de contrôle de dimensions linéaires d'objets tridimensionnels - Google Patents


Info

Publication number
WO2017095259A1
WO2017095259A1 (PCT/RU2015/000851)
Authority
WO
WIPO (PCT)
Prior art keywords
lines
camera
matrix
pixels
projector
Prior art date
Application number
PCT/RU2015/000851
Other languages
English (en)
Russian (ru)
Inventor
Андрей Владимирович КЛИМОВ
Александр Георгиевич ЛОМАКИН
Original Assignee
Андрей Владимирович КЛИМОВ
Александр Георгиевич ЛОМАКИН
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Андрей Владимирович КЛИМОВ, Александр Георгиевич ЛОМАКИН
Priority: PCT/RU2015/000851 (WO2017095259A1)
Priority: CN201580080870.7A (CN107835931B)
Publication of WO2017095259A1


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures

Definitions

  • The invention relates to measurement technology and can be used for sufficiently accurate 3D measurement and visualization of the profiles of three-dimensional objects, by observing a projected, previously known pattern at different triangulation angles.
  • A known method of optical measurement of surface shape includes placing the surface in the illumination field of a projection optical system and simultaneously in the field of view of an image-registration device, projecting onto the measured surface, by means of the projection optical system, a set of images with a given light-flux structure, registering the corresponding set of surface images while observing at an angle different from the projection angle, and determining the shape of the measured surface from the recorded images.
  • An additional light-intensity distribution is projected onto said surface once, allowing the strip number from said set of strips to be determined for each point of the surface; an additional image of the surface is recorded, and for each visible point the resulting phase distribution is computed from the image of the object illuminated by the preliminary phase distribution and the image of the object illuminated by the additional illumination distribution. From this resulting phase distribution, the absolute coordinates of the points of the surface are obtained using preliminary calibration data.
  • A known method and device for contactless inspection and recognition of the surfaces of three-dimensional objects by structured illumination contain a source of optical radiation and, installed sequentially along the beam, a transparency configured to form an aperiodic line structure of strips; an afocal optical system for projecting the transparency image onto the inspected surface; a receiving lens forming an image of the line structure arising on the inspected object, distorted by the relief of its surface; a photorecorder converting the image formed by the receiving lens into digital form; and a computing digital electronic unit recalculating the digital images recorded by the photorecorder into coordinates of the inspected surface. The device is additionally equipped with N-1 radiation sources, each differing from the rest in spectral range, N-1 transparencies, each differing from the rest in at least one strip, N-1 lenses mounted behind the transparencies, and N-1 mirrors mounted at an angle of 45 degrees to the optical axis of each of the N-1 lenses in front of the second component.
  • A known method, and the device implementing it, controls the linear dimensions of three-dimensional objects in three Cartesian coordinates: two cameras are located to the right and left of the projector, forming a stereo pair like human vision. The projector projects a striped image onto the object. Images are obtained from the right and left cameras, and the two images are then compared by correlation methods, i.e. for each strip in the right image a matching pair is sought in the left image by sorting through all the strips of the left image (US 6377700).
  • The disadvantage of this method is the long time required to sort through all possible pairs of strips and the long running time of the correlation algorithms on a computer.
  • A known method performs 3D measurements of an object using structured illumination: using a projector, a previously known image having at least two disjoint lines along one of the longitudinal axes is projected onto the object under study; the projector light reflected from the object is recorded using at least two cameras placed at different distances from the projector, forming different triangulation angles between the central beam of the projector and the central beams of the cameras; each line projected by the projector and formed by the reflected light received by each camera is then identified by comparing the coordinates of the lines received by the cameras. The triangulation angle between the central beam of the projector and the central beam of the first camera, located at the minimum distance from the projector, is chosen equal to the arctangent of the ratio of the period of the projected strips to the depth of field of that camera's lens; the longitudinal coordinates of the line centers are determined, and the vertical coordinates on the image of the first camera are determined as the quotient of the longitudinal coordinate divided by the tangent of the triangulation angle between the central beam of the projector and the central beam of the first camera.
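The prior-art rule quoted above for choosing the first camera's triangulation angle can be written out as a one-line calculation. This is an illustrative sketch, not part of the patent; the function name and parameters are hypothetical.

```python
import math

def first_camera_angle(strip_period, depth_of_field):
    """Prior-art rule: the triangulation angle of the first camera equals
    the arctangent of the ratio of the projected strip period to the
    depth of field of that camera's lens.  Returns degrees."""
    return math.degrees(math.atan(strip_period / depth_of_field))
```

A larger strip period or a shallower depth of field gives a steeper angle, which matches the intuition that coarser patterns tolerate larger parallax.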
  • The disadvantages of this method are that its implementation requires at least two cameras (and preferably three or more), and that significant errors are possible when a single camera is used to determine the Z coordinate. These errors arise in locating points along the line reflected by the object, because the field received by the camera, associated with the period of the reflected line, is spread over the largest possible number of columns and rows of pixels of the camera matrix.
  • In the image obtained by the first camera, which is located at a small angle to the projector, the area where a line can be projected never occupies the area of another line, wherever in the working area the object reflecting the line is; but the accuracy of determining the Z coordinate is not very high, and a second camera is used to refine it.
  • The line field in the camera image is the area of pixels of the camera matrix where the center of a projected line can be located; the size of the field depends on the period between the projected lines and the thickness of the projected line. Without a second camera, it is practically impossible to refine the area onto which points of the object are projected on the matrix.
  • An object of the invention is to create an effective method for performing 3D measurements of an object using structured illumination, and to expand the arsenal of such methods.
  • The technical result that solves the problem is a reduction in the duration and in the likelihood of errors in measurement and image construction of the studied object, associated with errors in locating points along the line reflected by the object and the field received by the camera on the columns and rows of pixels of the camera matrix. Each point in the camera image is sought and formed along two lines, namely as the intersection point of two mutually perpendicular lines, which virtually eliminates the probability of an erroneous search for a point along a line and an erroneous determination of the line number; and the received field of reflected intersecting lines, rotated relative to the columns and rows of the matrix, occupies the minimum possible number of columns and rows of pixels of the single camera needed to implement the proposed method.
  • The role of the second camera is played by the projected horizontal lines perpendicular to the vertical ones: the intersection points of the vertical and horizontal lines uniquely specify the numbers of the horizontal lines, the determination of coordinates takes place either from the intersection points of the lines or along the horizontal lines, and everything is done with one camera, on the image of one camera.
  • The region of uncertainty (the field for finding the desired point) contains the minimum number of pixels and is therefore significantly smaller than in implementations of the known methods.
  • The essence of the invention is that, in the method of performing 3D measurements of an object, an image with a periodic structure consisting of lines is projected onto the studied object by means of a projector; the projector light reflected from the object is recorded using the pixels of a camera receiving matrix arranged to form a triangulation angle between the central beam of the projector and the central beam of the camera; and the lines formed by the reflected light and the receiving pixels of the camera matrix are then identified to locate coordinates and identify the lines in the camera image. The projected image consists of two intersecting groups of lines; in each group the lines are parallel to each other and located at an angle to the vertical axis of the plane of the triangulation angle, with subsequent determination and identification, on the camera matrix, of the points in the intersection zone of each registered pair of lines with each other and with the columns and rows of pixels of the camera matrix.
  • Each intersection point of a projected line pair with a vertical column of pixels found on the matrix determines the coordinate Xn of the point N on the object.
  • The intersection point of a horizontal pixel row with this pair of lines determines the Yn coordinate on the object.
  • The groups of parallel projected lines are pairwise mutually perpendicular, the lines in each group are located at equal distances from one another, and the angle of inclination of the projected lines is chosen acute.
  • One line of each mutually perpendicular pair is located at an acute angle to the columns, and the other to the rows, of the pixels of the camera matrix, selected from the relation sin β = Tv2·M / ((Z2 − Z1)·sin α), where β is the tilt angle of the projected lines, Tv2 is the distance between adjacent projected lines, M is the scale factor of the lens converting pixels into spatial dimensions, Z1 and Z2 are the boundaries of the joint working area of the projector 1 and camera 5, and α is the triangulation angle.
  • The measurements and the determination of coordinates are carried out using a computer processor, and a 3D image of the measured object is formed on the computer monitor.
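The coordinate determination described above can be sketched in code. This is a minimal illustration, not part of the patent: the function name and parameters are hypothetical, and it assumes a single scale factor M (spatial units per pixel) and the triangulation relation Z = ΔY·M / tan α implied by the geometry of the method.

```python
import math

def recover_point(col, row, dy_pixels, M, alpha_deg):
    """Convert a detected line-intersection pixel into object coordinates.

    col, row   -- pixel column/row of the intersection on the camera matrix
    dy_pixels  -- vertical offset of the line from its calibrated zero position
    M          -- lens scale factor (spatial units per pixel)
    alpha_deg  -- triangulation angle between projector and camera central beams
    """
    alpha = math.radians(alpha_deg)
    x = col * M                          # longitudinal coordinate from the pixel column
    y = row * M                          # vertical coordinate from the pixel row
    z = dy_pixels * M / math.tan(alpha)  # depth from the line's displacement
    return x, y, z
```

With zero displacement the point lies on the zero (calibration) plane; larger displacements along the column map linearly to depth under this assumed model.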
  • Figure 1 shows the arrangement of the projector and camera when projecting one horizontal line onto the object; figure 2, the arrangement of the projector and camera when projecting a line rotated by an angle β relative to the columns and rows of camera pixels onto the object; figure 3, the arrangement of the projector and camera when projecting two mutually perpendicular lines, rotated relative to the columns and rows of camera pixels, onto the object; figure 4, the intersection points of the projected mutually perpendicular lines and columns of pixels on the camera matrix; figure 5, the region of uncertainty formed at the intersection of a column of the camera matrix and the projected mutually perpendicular lines.
  • Camera 5 includes a receiving matrix 6 and a lens 4 identical to the projector's lens.
  • Stencil 3 (equivalently: a transparency, template, slide, etc.) is, for example, a thin plate whose plane, onto which the light beam of the radiation source 2 falls, has a different absorbance or refractive index at different points.
  • The projector 1 and the camera 5 are placed at a distance A between their lenses 4, forming a triangulation angle α between the central beam of the projector 1 and the central beam of the camera 5, and a triangulation plane.
  • Z1 and Z2 in Fig. 1 are the boundaries (depth) of the joint working area of the projector 1 and camera 5.
  • Geometrically, the working area of the scanner is the region of space where the projection rays forming the image on the object intersect the rays bounding the camera's field of view.
  • Figure 4 shows the intersection of mutually perpendicular lines 9 and 11, 12 projected at an angle β to the triangulation plane (and hence to the columns and rows of pixels of the camera matrix) onto the measured object 7; reflected from the latter, they are received by two columns 13, 14 of pixels on the matrix 6 of the camera 5, in an area containing the minimum number of columns and rows of pixels, bounded in the drawing by four bold points, which are the received images of the intersection points of the projected lines.
  • Figure 5 shows the received field 15 of the reflected intersecting fields (of thickness b) of lines 9, 11, rotated relative to the triangulation plane and to the column 13 of the matrix; it occupies the minimum possible number of columns and rows of pixels of the matrix 6 of the camera 5.
  • The method is implemented as follows.
  • The projector 1 projects onto the surface of the object 7 an image with a periodic structure.
  • The light of the projector 1 reflected from the object 7 is recorded using the pixels of the receiving matrix 6 of the camera 5, which is offset by a distance A with respect to the projection system of the projector 1 and placed so as to form a triangulation angle α between the central beam of the projector 1 and the central beam of the camera 5.
  • An image with a periodic structure is projected, consisting of two groups of pairwise intersecting lines, for example mutually perpendicular lines 9, 10, 11, located at an acute angle β to the plane of the triangulation angle (the triangulation plane), i.e., as a rule, to the columns 13 and rows of pixels of the matrix 6 of the camera 5; the light of the projector 1 reflected from the object 7 is recorded using the pixels of the receiving matrix 6 of the camera 5.
  • One of the groups of lines provides the initial measurement of the shape of the object 7, and the second group (for example, perpendicular to the first) provides its refinement.
  • The projector 1 projects, as in the known analogues, the image of the stencil 3, consisting of one horizontal line 8 which passes through the center of the image of the projector 1.
  • The camera 5 observes the object 7 at an angle α, and the line 8 reflected from the object 7 is projected onto the matrix 6 of the camera 5 at different places within the zone Ly, depending on the position of the object 7 in the working area z1-z2.
  • Ly = ((z1 − z2)·sin α) / M, where M is the scale factor of the lens 4 used to project the image onto the matrix 6 of camera 5.
  • The projected line 8 can take any position within the range Ly on the camera matrix 6, depending on the position of the object 7 in the working area.
  • M may be not just a number but a per-lens matrix containing scale corrections in the horizontal and vertical directions of the projected image; these corrections compensate for the distortion (optical distortion of space) of the lens. If the image in the projector 1 is rotated so as to project not horizontal lines but lines 9 located at an angle β to the triangulation plane, as shown in figure 2, then more parallel lines can be projected, i.e. with a shorter period. The period between the lines in this case depends on the angle β of rotation of the projected image in the projector 1; the distance between the parallel lines is Tv2 > Ly·sin β.
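The two relations above, Ly = ((z1 − z2)·sin α)/M and Tv2 > Ly·sin β, can be sketched together. A hypothetical helper with illustrative numbers only; the scalar M here is the simple case, not the per-lens correction matrix mentioned in the text.

```python
import math

def line_period_bounds(z1, z2, alpha_deg, beta_deg, M):
    """Return the shift range Ly of a projected line on the matrix (pixels)
    and the minimum safe period Tv2 between tilted parallel lines,
    per Ly = ((z1 - z2)*sin(alpha))/M and Tv2 > Ly*sin(beta)."""
    alpha = math.radians(alpha_deg)
    beta = math.radians(beta_deg)
    ly = abs(z2 - z1) * math.sin(alpha) / M  # range of possible line positions
    tv2_min = ly * math.sin(beta)            # minimum period between lines
    return ly, tv2_min
```

A smaller tilt β permits a shorter period Tv2, i.e. more lines across the object, at the cost of a narrower separation between line fields.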
  • A line may fall into the field of an adjacent line, in which case the line number, and consequently the position Z of the object 7 in the working area, may be determined incorrectly.
  • A composite periodic image of lines 9, 11, 12 can also be projected using a larger number of projectors, for example two projectors (with central rays lying in one triangulation plane), but in that case the calculation formulas become more complicated.
  • The mutually perpendicular lines 9 and 11, 12 in Figs. 4 and 5 are located at an acute angle β to the vertical axis of the plane of the triangulation angle α and to the pixel columns of the camera 5.
  • One line of each mutually perpendicular pair, for example line 9, is located at an acute angle to the columns, and the other, for example lines 11, 12, to the rows of pixels of the matrix 6 of camera 5.
  • The acute angle β is preferably equal to the arcsine of the ratio of the distance between the projected lines to the depth of the working area multiplied by the sine of the triangulation angle α, taking into account the scale factor, i.e. sin β = Tv2·M / ((Z2 − Z1)·sin α).
  • Movement of the object 7 in the working area along the Z axis causes movement of all lines, and of all points on the lines, along the vertical columns of pixels on the matrix 6 of the camera 5.
  • Each line formed by the reflected light and the receiving pixels of the matrix 6 of the camera 5 is identified in order to determine the coordinates of the lines in the image of the camera 5.
  • The most accurate measurements for each specific stencil 3 can be achieved by setting the zero position of the numbering of columns and rows of pixels on the matrix 6 of the camera 5 beforehand (before working use of the system consisting of camera 5 and projector 1). This operation can also serve for early correction of the distortion (optical distortion of space) of the lens and for refinement of the aforementioned scale factor M.
  • The zero position is established by a preliminary calibration process (prior to installing the object 7), in which an arbitrarily chosen calibration plane (for example, a movable screen) is moved along the Z coordinate in the working area of the device, and all columns on the matrix 6 of camera 5 along which the intersections of the projected lines move are recorded.
  • The position of the calibration plane at the intersection of the rays of the projector 1 and the camera 5 is selected as the zero position.
  • The line 9 passing through the center of the stencil 3 of the projector 1 will be projected onto the center of the matrix 6 of the camera 5, and this position of the projected line 9 on the camera matrix will also be called zero.
  • The zero position is designated 0 along the Y axis. The deviation of the line on the matrix 6 of the camera 5 as the calibration plane moves closer to or further from the system consisting of the camera 5 and the projector 1 is used to refine the found intersection points of this line on the matrix 6.
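The calibration step above records where lines land on the matrix for known positions of the calibration plane. One plausible way to turn those recordings into a usable mapping is a least-squares line fit; this is a sketch under the assumption (not stated in the patent) that the row-versus-depth relation is approximately linear over a small working area, and all names are hypothetical.

```python
def fit_line(zs, rows):
    """Least-squares fit rows ~ a*z + b, relating the depth Z of the
    calibration plane to the row at which a projected line intersection
    lands on the camera matrix.  zs and rows are equal-length sequences
    of calibration samples."""
    n = len(zs)
    mz = sum(zs) / n                       # mean depth
    mr = sum(rows) / n                     # mean row position
    a = (sum((z - mz) * (r - mr) for z, r in zip(zs, rows))
         / sum((z - mz) ** 2 for z in zs))  # slope: rows per unit depth
    b = mr - a * mz                         # intercept
    return a, b
```

The zero position is then the depth Z at which the fitted row equals the center row of the matrix.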
  • The position of the line 9 on the matrix 6 of the camera 5 is determined by searching for the center of the line 9 (Fig. 5), because in reality the projected lines have a certain thickness b on the matrix 6 of the camera 5 and can occupy several pixels.
  • Because of the tilt, the zone in which the line 9 crosses the column 13 of pixels on the matrix 6 is enlarged.
  • The determination of the position of line 9 could thus be less accurate, leading to ambiguity in determining the intersections of the "vertical" and "horizontal" lines; for this reason the period between the "horizontal" lines 11, 12 should be chosen larger than the uncertainty zone 15, shown in figure 5 at the intersection of column 13 and line 9, i.e. Tgor > b / tan β, where b is the thickness of the projected line 9 and Tgor is the period between the "horizontal" lines 11, 12.
  • The image of the stencil 3 projected by the projector 1 has a vertical lattice period Tv2 > ((z1 − z2)·sin α·sin β) / M and a horizontal lattice period Tgor > b / tan β, and the image of the stencil 3 should be rotated through an angle β with respect to the vertical columns 13 of pixels of the matrix 6.
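The stencil-design constraints can be combined: once a vertical period Tv2 is chosen, the tilt angle follows from sin β = Tv2·M / ((z2 − z1)·sin α) stated above, and the horizontal period must then satisfy Tgor > b / tan β. A hypothetical sketch, with illustrative parameters only:

```python
import math

def stencil_angles(tv2, b, z1, z2, alpha_deg, M):
    """Given a chosen vertical period Tv2, recover the tilt angle beta from
    sin(beta) = Tv2*M/((z2 - z1)*sin(alpha)), then the minimum horizontal
    period Tgor = b/tan(beta).  b is the projected line thickness."""
    alpha = math.radians(alpha_deg)
    s = tv2 * M / (abs(z2 - z1) * math.sin(alpha))
    beta = math.asin(min(s, 1.0))        # clamp against rounding overshoot
    tgor_min = b / math.tan(beta)        # minimum period between "horizontal" lines
    return math.degrees(beta), tgor_min
```

A shorter Tv2 gives a smaller β, which in turn forces a larger horizontal period Tgor; the two grids therefore trade density against each other.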
  • Such an image projected onto the object 7 allows the numbers of the projected "horizontal" lines 11, 12 to be determined without errors, which makes it possible, knowing the geometry (the relative location of camera 5 and projector 1, namely the angle α), to determine the coordinates of points on the object.
  • Yn is the offset (number) of the horizontal line 11 on the matrix 6 of the camera 5 relative to its central position, i.e. the position in which it passes through the center of the matrix 6.
  • Line 11 intersects the center of the matrix 6 when the object 7 is placed in the middle of the working area.
  • Figure 5 shows that the uncertainty region 15 (the field for finding the desired point when determining the line number) contains a minimum number of pixels and is therefore significantly smaller than in implementations of the known methods. No second camera is required, which simplifies the design of the equipment, the technology, and the processing of measurement results.
  • Measurements (calculation of specific characteristics) and determination of coordinates are carried out using a computer processor, and a 3D image of the measured object is formed on the computer monitor.
  • The present invention is implemented using universal equipment widely used in industry.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention concerns 3D measurements of an object. In the method of the invention, an image consisting of lines is projected onto the examined object; the reflected light is then recorded by the pixels of the receiving matrix of the camera, arranged so as to form a triangulation angle between the central rays of the projector and the camera; and the lines formed by the reflected light and the pixels of the matrix are then identified. The projected image consists of two intersecting groups of lines arranged at an angle to the vertical axis of the plane of the triangulation angle, with subsequent determination and identification, on the matrix, of points in the intersection zone of each pair of registered lines with each other and with the columns and rows of pixels of the camera matrix. The technical result consists in reducing the duration and the risk of errors in measuring and in constructing images of the examined object.
PCT/RU2015/000851 2015-12-04 2015-12-04 Procédé de contrôle de dimensions linéaires d'objets tridimensionnels WO2017095259A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/RU2015/000851 WO2017095259A1 (fr) 2015-12-04 2015-12-04 Procédé de contrôle de dimensions linéaires d'objets tridimensionnels
CN201580080870.7A CN107835931B (zh) 2015-12-04 2015-12-04 监测三维实体的线性尺寸的方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/RU2015/000851 WO2017095259A1 (fr) 2015-12-04 2015-12-04 Procédé de contrôle de dimensions linéaires d'objets tridimensionnels

Publications (1)

Publication Number Publication Date
WO2017095259A1 true WO2017095259A1 (fr) 2017-06-08

Family

ID=58797378

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/RU2015/000851 WO2017095259A1 (fr) 2015-12-04 2015-12-04 Procédé de contrôle de dimensions linéaires d'objets tridimensionnels

Country Status (2)

Country Link
CN (1) CN107835931B (fr)
WO (1) WO2017095259A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112017238A (zh) * 2019-05-30 2020-12-01 北京初速度科技有限公司 一种线状物体的空间位置信息确定方法及装置
US20220020139A1 (en) * 2016-01-15 2022-01-20 Instrumental, Inc. Methods for automatically generating a common measurement across multiple assembly units

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060227133A1 (en) * 2000-03-28 2006-10-12 Michael Petrov System and method of three-dimensional image capture and modeling
WO2012007003A1 (fr) * 2010-07-12 2012-01-19 3Shape A/S Modélisation 3d d'un objet en utilisant des éléments texturaux
RU125335U1 (ru) * 2012-11-07 2013-02-27 Общество с ограниченной ответственностью "Артек Венчурз" Устройство контроля линейных размеров трехмерных объектов
WO2015006431A1 (fr) * 2013-07-10 2015-01-15 Faro Technologies, Inc. Scanner à triangulation ayant des éléments motorisés

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8947677B2 (en) * 2010-08-06 2015-02-03 University Of Kentucky Research Foundation Dual-frequency phase multiplexing (DFPM) and period coded phase measuring (PCPM) pattern strategies in 3-D structured light systems, and lookup table (LUT) based data processing
JP5816773B2 (ja) * 2012-06-07 2015-11-18 ファロ テクノロジーズ インコーポレーテッド 取り外し可能なアクセサリーを備える座標測定マシン
CN104006762B (zh) * 2014-06-03 2017-01-04 大族激光科技产业集团股份有限公司 获取物体三维信息的方法、装置和系统
CN104014905A (zh) * 2014-06-06 2014-09-03 哈尔滨工业大学 Gtaw焊接过程中熔池三维形貌观测装置及方法

Also Published As

Publication number Publication date
CN107835931B (zh) 2020-11-10
CN107835931A (zh) 2018-03-23

Similar Documents

Publication Publication Date Title
US9142025B2 (en) Method and apparatus for obtaining depth information using optical pattern
JP4290733B2 (ja) 3次元形状計測方法及びその装置
KR101605224B1 (ko) 패턴 광을 이용한 깊이 정보 획득 장치 및 방법
CN101821580B (zh) 用于实物形状的三维测量的系统和方法
US7548324B2 (en) Three-dimensional shape measurement apparatus and method for eliminating 2π ambiguity of moire principle and omitting phase shifting means
EP2568253B1 (fr) Procédé et système de mesure de lumière structurée
EP2918967B1 (fr) Procédé de vérification des dimensions linéaires d'objets tridimensionnels
CN104266608B (zh) 视觉传感器现场标定装置和标定方法
US20030123707A1 (en) Imaging-based distance measurement and three-dimensional profiling system
WO2007015059A1 (fr) Procédé et système pour saisie de données en trois dimensions
JP2012058076A (ja) 3次元計測装置及び3次元計測方法
CN101901501A (zh) 一种生成激光彩色云图的方法
CN108195305B (zh) 一种双目检测系统及其深度检测方法
CN105303572B (zh) 基于主被动结合的深度信息获取方法
WO2017095259A1 (fr) Procédé de contrôle de dimensions linéaires d'objets tridimensionnels
US10801834B2 (en) Fringe projection for determining topography of a body
JP4382430B2 (ja) 頭部の三次元形状計測システム
RU125335U1 (ru) Устройство контроля линейных размеров трехмерных объектов
US20160349045A1 (en) A method of measurement of linear dimensions of three-dimensional objects
RU164082U1 (ru) Устройство контроля линейных размеров трехмерных объектов
RU2708940C1 (ru) Способ измерения трехмерной геометрии выпуклых и протяженных объектов
RU2685761C1 (ru) Фотограмметрический способ измерения расстояний вращением цифрового фотоаппарата
Fiedler et al. A Novel Method for Digitalisation of Test Fields by Laser Scanning
Liu et al. Comparison study of three camera calibration methods considering the calibration board quality and 3D measurement accuracy
CN103411535A (zh) 一种针对回光反射标志的可变权重像点定位方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15909877

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 12.11.2018)

122 Ep: pct application non-entry in european phase

Ref document number: 15909877

Country of ref document: EP

Kind code of ref document: A1