WO2021094636A1 - Procédé et système pour le suivi spatial d'objets - Google Patents


Info

Publication number
WO2021094636A1
WO2021094636A1 (PCT/ES2020/070697)
Authority
WO
WIPO (PCT)
Prior art keywords
camera
artifact
yaw
pitch
points
Prior art date
Application number
PCT/ES2020/070697
Other languages
English (en)
Spanish (es)
Inventor
Aitor Olarra Urberuaga
Gorka KORTABERRIA BERRIOZABAL
Brahim Ahmed CHEKH OUMAR
Andoni Delgado Castrillo
Original Assignee
Fundación Tekniker
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fundación Tekniker
Publication of WO2021094636A1


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/42Simultaneous measurement of distance and other co-ordinates

Definitions

  • the present invention relates to the field of spatial object tracking.
  • it relates to non-contact measurement techniques to improve the precision that can be achieved in spatial tracking of objects by means of a spatial resection procedure.
  • photogrammetry is based on extracting three-dimensional measurements from two-dimensional data (that is, images).
  • photogrammetry uses the spatial resection method to obtain the exterior orientation of a single image.
  • in spatial resection, the spatial position and orientation of a camera are determined based on the central projection of the camera and the modeling of the optical distortion due to lens shape errors.
  • the pinhole camera model represents the mathematical definition of the light path through a camera lens between the 3D world (the object space) and a 2D image (the sensor plane). This is outlined in Figure 1, which depicts how to obtain the position and orientation of an object 11 using a single camera 10 by applying the spatial resection method.
  • the following equation describes the rigid transformation between the camera coordinate system and the object coordinate system: P cam = R(Φ, Θ, Ψ) · P obj + T, where P cam indicates the spatial position of the camera 10, P obj indicates the spatial position of object 11, and the rotation matrix R together with the translation vector T represents the relative rotation and displacement from object 11 to camera 10.
  • Θ represents the elevation angle (also referred to as pitch angle) of object 11.
  • Ψ represents the yaw angle of object 11.
  • Φ represents the roll angle of object 11.
  • (T x , T y , T z ) represents the relative translation T x , T y and T z .
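  • As an illustrative sketch (not part of the patent text), the rigid transformation above can be expressed in code; the Z-Y-X composition order of the yaw, pitch and roll rotations is an assumption made for illustration:

```python
import numpy as np

def rotation_matrix(phi, theta, psi):
    """Compose a rotation from roll (phi), pitch (theta) and yaw (psi),
    here assumed applied in Z-Y-X order (yaw, then pitch, then roll)."""
    cf, sf = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(psi), np.sin(psi)
    Rz = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])   # yaw about Z
    Ry = np.array([[ct, 0, st], [0, 1, 0], [-st, 0, ct]])   # pitch about Y
    Rx = np.array([[1, 0, 0], [0, cf, -sf], [0, sf, cf]])   # roll about X
    return Rz @ Ry @ Rx

def object_to_camera(p_obj, phi, theta, psi, t):
    """Rigid transformation P_cam = R · P_obj + T."""
    return rotation_matrix(phi, theta, psi) @ p_obj + t
```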
  • the spatial resection method iteratively minimizes the planar distance between the observed image points and those that are theoretically projected, in order to determine the position and orientation of the camera 10 with six degrees of freedom (6 DOF) that best fit the corresponding points.
  • At least three reference points must be marked on the object.
  • six circles depicted on the object 11 represent corresponding object reference points. Each circle has an Xi, Yi, Zi coordinate.
  • the target reference point distribution (how the reference points are distributed on the object) generates a different image after camera translation and a different image after camera rotation.
  • the target distribution generates nearly identical images after camera translation and camera rotation.
  • Autocollimation is an optical setup in which a collimated beam of parallel light rays exits an optical system and is reflected back to the same system by a flat mirror. Autocollimation is used to measure small tilt angles of the mirror with high precision.
  • most autocollimation techniques have a limitation when estimating the roll angle of a mirror.
  • most existing autocollimation devices are capable of measuring only two tilt angles (pitch and yaw) of a mirror.
  • new autocollimators have been developed, which have a certain ability to measure an angle of rotation around the normal axis of the mirror based on a special lens. This is the case, for example, of TriAngle®3D, from the company TRIOPTICS.
  • the accuracy of the roll angle measurement is significantly reduced compared to the accuracy of the pitch and yaw angle measurements.
  • the method and system described in the present invention are intended to solve the drawbacks of the prior art.
  • two non-contact measurement techniques are combined in order to improve the precision that can be achieved in spatial monitoring or tracking of objects.
  • the two techniques are (a) photogrammetry, which is applied with a single camera, and (b) autocollimation, which is used to estimate the absolute tilt angles of a specular surface that, with a calibrated transformation, represent the pitch and yaw angles of the object being tracked.
  • in photogrammetry, the problem of the exterior orientation and translation of a camera from a single image is addressed by applying the spatial resection technique.
  • the camera orientation (pitch and yaw angles) is obtained with respect to the object's coordinate system that is constructed from a set of targets.
  • the spatial resection technique is restricted with the orientation values obtained (the pitch and yaw angles), and the remaining parameters (T x , T y , T z and the roll angle Φ) can be estimated with higher precision and lower correlation between them. Therefore, these are estimated with a better (lower) uncertainty.
  • a restricted spatial resection is applied instead of a conventional spatial resection.
  • a first aspect of the invention refers to a method for estimating the position and orientation of an object with 6 degrees of freedom, comprising: attaching an artifact to the object whose position and orientation are to be estimated, where the artifact comprises a specular surface and a set of N reference points, where each reference point is defined by a position X i , Y i , Z i in a coordinate system defined in the artifact, where 1 ≤ i ≤ N and N > 2; placing a camera facing the artifact that is attached to the object; measuring the pitch angle Θ and yaw angle Ψ that represent the inclination of the specular surface and, therefore, of the object, said measurement being carried out by applying an autocollimation technique; capturing an image of the set of reference points comprised in the artifact that is attached to the object, thereby obtaining, in the plane of the camera sensor, observed image points corresponding to said reference points; and obtaining the roll angle Φ of the object and a translation vector (T x , T y , T z ) between the camera and the object.
  • said roll angle Φ and said translation vector between the camera and the object are obtained by implementing a spatial resection technique, restricted by the pitch angle Θ and yaw angle Ψ already obtained, iteratively solving the following optimization problem: min Σ i=1..N [(x' i − x' Ei )² + (y' i − y' Ei )²], where N is the number of observed image points; (x' i , y' i ) are the observed image points; and (x' Ei , y' Ei ) are the estimated image points.
  • the estimated image points (x' Ei , y' Ei ) are obtained by applying a pinhole camera model that represents the projection (x' Ei , y' Ei ) of each 3D reference point on the camera sensor image plane: λ · [x' Ei , y' Ei , 1] T = [A] · [R | T] · [X i , Y i , Z i , 1] T , where [A] is the intrinsic matrix of the camera and λ is a scale factor.
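  • A minimal sketch of this pinhole projection, assuming a zero-skew intrinsic matrix with illustrative focal length and principal point values:

```python
import numpy as np

def project_points(points_3d, R, t, A):
    """Project 3D reference points (object frame) onto the image plane
    using the pinhole model: lambda * [x, y, 1]^T = A [R | T] [X, Y, Z, 1]^T."""
    pts_cam = (R @ points_3d.T).T + t   # object frame -> camera frame
    uvw = (A @ pts_cam.T).T             # apply intrinsic matrix
    return uvw[:, :2] / uvw[:, 2:3]     # divide by the scale factor lambda

# Illustrative intrinsic matrix: focal length 800 px, principal point
# (320, 240), zero skew. These values are assumptions, not from the patent.
A = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
```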
  • alternatively, the estimated image points (x' Ei , y' Ei , 0) are calculated by applying the central projection equation used in computer vision.
  • alternatively, the estimated image points (x' Ei , y' Ei , 0) are calculated by applying collinearity equations.
  • the proposed method can be implemented either in a single device that encompasses both the functionality of a camera and an autocollimator; or in two separate devices - a camera and an autocollimator - that work together.
  • a second aspect of the invention refers to a system for estimating the position and orientation of an object with 6 degrees of freedom, comprising: means for taking photogrammetry measurements; means for carrying out autocollimation; and an artifact that is attached to the object whose position and orientation are to be estimated, wherein said artifact comprises a specular surface and a set of N reference points, where each reference point is defined by a position in a coordinate system defined in the artifact, where 1 ≤ i ≤ N and N > 2; wherein the means for carrying out autocollimation are configured to measure the pitch angle Θ and yaw angle Ψ that represent the inclination of the specular surface and, therefore, of the object; wherein the means for taking photogrammetry measurements are configured to: capture an image of the set of reference points comprised in the artifact that is attached to the object, thereby obtaining observed image points corresponding to said reference points; and obtain the roll angle Φ of the object and a translation vector (T x , T y , T z ) between the camera and the object.
  • the means for taking photogrammetric measurements and the means for carrying out autocollimation are comprised in a single camera.
  • the means for taking photogrammetric measurements comprise a camera and the means for performing autocollimation comprise an autocollimator.
  • the N reference points represent a checkerboard pattern, or a circular pattern, or a square pattern, or a cross-shaped pattern, or follow special markers.
  • the specular surface is a flat mirror or any flat or semi-flat surface capable of producing a specular reflection of light.
  • the system further comprises mounting means for mounting a plurality of measurement means for measuring the spatial location and orientation of the mirror.
  • the device can be embodied in at least two different configurations.
  • a single device integrates the two technologies (camera and autocollimator).
  • the device (the camera) incorporates a light source, such as an LED source, and a beam splitter.
  • the light source and beam splitter can be mounted on the camera. Therefore, simply by modifying the acquisition parameters of the camera, the camera can act as an autocollimator.
  • the device is comprised of a single camera and a separate 2D autocollimator. This configuration requires a previous extrinsic calibration phase, to collect independent data and obtain output data with respect to the same reference system.
  • a third aspect of the invention relates to a computer program product comprising computer program instructions / code for carrying out the disclosed method.
  • a fourth aspect of the invention relates to a computer-readable memory / medium that stores instructions / program code to carry out the disclosed method.
  • the combination of a camera configured for photogrammetry and an autocollimator, either in a single device or in two separate devices, provides the following advantages: (a) It enhances the precision of the measurements that can be obtained, as a consequence of the reduction in the uncertainty in identifying four camera parameters (the roll angle Φ and T x , T y , T z ) using spatial resection techniques, thanks in turn to the restriction of the orientation parameters (the pitch Θ and the yaw Ψ) by the autocollimator.
  • Figure 1 depicts the spatial resection technique, which is a well-known technique for determining the spatial position and orientation of a camera based on the central projection of the camera.
  • Figure 2 shows exemplary photogrammetric configurations with different correlation between translation and rotation of a camera.
  • Figures 3 (a) and 3 (b) show side views of a camera used in embodiments of the present invention. Part of the outer casing has been removed to show the inside of the camera.
  • Figure 3 (c) shows an exploded view of the camera shown in Figures 3 (a) and 3 (b).
  • Figure 4 shows a measurement artifact to be attached to an object to perform 6 DOF follow-up measurements, according to embodiments of the present invention.
  • the artifact includes a mirror surface and a set of reference points.
  • Figure 5 shows a schematic according to embodiments of the invention, in which autocollimation measurements are taken with the camera of Figures 3 (a) - (c) and the measurement artifact of Figure 4.
  • Figure 6 shows exemplary components of a device for performing the autocollimation technique.
  • Figure 7A shows a schematic according to embodiments of the invention, in which photogrammetry measurements are taken with the camera of Figures 3 (a) - (c) and the artifact of Figure 4.
  • Figure 7B shows the schematic of Figure 7A, further including the 6 parameters to be measured.
  • Figure 8 shows an alternative scheme according to some embodiments of the invention, in which the spatial monitoring or tracking of an object is achieved by means of a camera and a separate 2D autocollimator, plus a measurement artifact as shown in Figure 4.
  • Figures 3 (a) to 3 (c) show different views of a camera 40 suitable for carrying out spatial monitoring or tracking of an object according to an embodiment of the invention.
  • the camera 40 integrates the two technologies that are required to improve the precision in measurements necessary for the spatial tracking of an object: a camera as such, for photogrammetry measurements, and an autocollimator.
  • in Figures 3 (a) and 3 (b), part of the outer casing has been removed to show the inner part of the camera.
  • Figure 3 (c) shows an exploded view of the camera shown in Figures 3 (a) and 3 (b).
  • Camera 40 also includes a light source 43 necessary to implement autocollimation functionality.
  • Light source 43 can be a light emitting diode (LED). Alternatively, it can be a laser emitter, which can increase the power and therefore the operating range, or make it possible to work in environments with ambient lighting.
  • Camera 40 also includes a beam splitter 44, also necessary to implement autocollimation functionality. Beam splitter 44 can be implemented, for example, as a plate or a cube.
  • Camera 40 also has structural elements 45-50 to enable mounting, docking, and / or support of optical and / or electrical components.
  • the body 45 is the main structural element of the camera 40, to which the other elements of the camera are attached.
  • the beam splitter is supported by element 46, which ensures contact between the beam splitter and the camera sensor on one side, and between the beam splitter and the light source on the other side.
  • the camera 40 also comprises special spacer elements 47, which are used to set the focal length depending on the camera lens.
  • the camera sensor is fixed by means of the structural element 48.
  • Camera 40 also comprises a light source support 49.
  • a cover 50 encapsulates and protects all internal elements.
  • Camera 40 also has fixing elements 51-54 for fixing and coupling the different structural mounting, optical and / or electrical components.
  • the measurement artifact 60 shown in Figure 4 is composed of a mirror surface 55 and a set or grid of reference points 56 that is attached or fixed to the artifact.
  • the specular surface 55 is implemented as a flat mirror.
  • Alternative specular surfaces can be any flat or semi-flat surface capable of producing a specular reflection of light, such as a part that has a good surface finish.
  • Flat mirror 55 is used for self-collimation measurements. It is mounted on a frame 57.
  • the set of reference points 56 are targets that are required for the measurements that are carried out by the camera 40 using the spatial resection technique.
  • the set of reference points 56 requires at least 3 points or targets. It should be noted that, in general, the more reference points there are, the less uncertainty there will be in the measurements.
  • the targets are set up in the form of a checkerboard.
  • the reference points correspond to the corners of each square on the chessboard.
  • the artifact 60 shown in Figure 4 is comprised of a flat mirror 55 and a calibrated chessboard 56.
  • the chessboard is widely used for internal camera calibration.
  • the coordinates of the reference points 56 are known in a coordinate system defined in the artifact 60 (the mirror 55 and the set of reference points 56).
  • the geometric relationship between the plane of the plane mirror 55 and the coordinate system formed by the set of reference points 56 has to be established.
  • a characterization phase of the artifact 60 (mirror 55 and set of reference points 56) is carried out, preferably offline, before starting the phase of measuring the position and orientation of an object. The results of this characterization are used later during the measurements.
  • a common reference system has to be defined for the plane mirror 55 and the set of reference points 56.
  • the normal vector of the mirror 55 and the Z axis of the coordinate system that is created from the set of reference points 56 (e.g., a chessboard) may be aligned, but they may not coincide.
  • the artifact 60 shown in figure 4 (the mirror 55 with the set of reference points 56) is measured, for example, in a Coordinate Measuring Machine (CMM).
  • the artifact 60 further includes a mounting means, such as a plurality of housings, for mounting a plurality of measurement means to measure the spatial location and orientation of the artifact 60.
  • the measurement means may be retroreflectors, so that the spatial location and orientation of the artifact can be measured by conventional laser tracker technology, which is outside the scope of the present invention.
  • an artifact 60, comprising a plane mirror and a set of reference points, is attached to the object whose position and orientation (T x , T y , T z , Φ, Θ, Ψ) with respect to the camera will be estimated.
  • the camera 40 is positioned facing the artifact 60 that is attached to the object, for example, as shown in Figures 5, 7A and 7B, in such a way that the images of the set of reference points that are comprised in the artifact 60 can be captured by camera 40.
  • the pitch angle Θ and yaw angle Ψ are measured, which represent the tilt of the plane mirror 55. This is done by applying the autocollimation functionality available in the camera 40 and based on the specular surface 55 (the plane mirror 55). The set of reference points 56 is not used in this phase.
  • Figure 5 shows a scheme for carrying out autocollimation measurements to obtain the pitch angle Θ and yaw angle Ψ with the camera 40 and a flat mirror 55.
  • Figure 6 shows the components of an optical device, such as a camera, relevant for performing autocollimation measurements, plus the required flat mirror. Figure 6 shows in detail how the pitch angle Θ is obtained.
  • a light source 43, such as an LED, emits a non-collimated light beam 62 which is redirected by a beam splitter 44, also producing non-collimated beams 63.
  • These beams 63 are collimated by a collimation lens 41.
  • the focal length of the camera is represented by the letter f (the distance between the plane of the collimating lens and the plane of the sensor or image plane).
  • the collimated beams 65 are reflected by the mirror 55, thereby providing reflected collimated light beams 66. Some of these again reach the collimating lens 41, which provides a focused reflected beam of light 67 to the camera sensor 42.
  • This configuration allows the tilt angle of the mirror 55 (the pitch Θ or the yaw Ψ) and, therefore, of the object that is attached to the artifact 60 to be determined, but not the roll angle Φ.
  • the plane offset d represents the offset between a camera sensor reference point 76 and the position 75 where the focused reflected beam of light is captured.
  • Point 75 is the point on the image plane 61 at which the reflected light beam 67 is focused.
  • the offset d indicates the tilt or pitch angle of the mirror. A similar approach is taken to obtain the yaw angle Ψ.
  • a non-collimated beam of light is emitted by the light source 43 arranged in the camera 40 (not shown in Figure 5, see, for example, Figure 6).
  • the emitted light beam is redirected by beam splitter 44, also producing non-collimated beams (not shown). These beams are collimated by the camera lens 41.
  • the collimated beams 65 are directed towards the plane mirror 55.
  • the collimated beams 65 are reflected by the plane mirror 55, thereby providing reflected collimated light beams 66.
  • the reflected collimated beams 66 reach the camera lens 41, which provides focused reflected light beams 67 to the camera sensor 42.
  • the point (dx', dy') represents the in-plane shift between the position (at the camera sensor 42) at which the focused reflected light beam 67 is captured, that is, point 75, and a fixed reference point 76 in the plane 61 of the camera sensor 42.
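  • As an illustrative sketch of the autocollimation principle (not part of the patent text): a mirror tilt of α deflects the reflected collimated beam by 2α, so the focused spot shifts by d = f · tan(2α) on the sensor; the mapping of dx to pitch and dy to yaw is an assumption:

```python
import math

def tilt_from_spot_offset(dx, dy, f):
    """Recover the mirror tilt angles from the focused-spot offset (dx, dy)
    on the sensor plane. A tilt of alpha deflects the reflected collimated
    beam by 2*alpha, so the spot offset is d = f * tan(2*alpha)."""
    theta = 0.5 * math.atan2(dx, f)  # pitch (assumed mapped to dx)
    psi = 0.5 * math.atan2(dy, f)    # yaw (assumed mapped to dy)
    return theta, psi
```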
  • FIG. 7A shows a scheme for carrying out photogrammetry measurements with camera 40 and the set of reference points 56 that is comprised in artifact 60 and attached to the object (not shown) to be tracked.
  • in Figure 7A, the reference points are referred to as 71. These points 71 are reference points in the 3D coordinate system of the object whose position and orientation are being tracked.
  • the coordinates of the reference points 71 are known in a coordinate system defined in the artifact 60.
  • the camera 40 captures an image of the object - in general, the artifact - which has attached the set of reference points 71 that are located at positions (X i , Y i , Z i ).
  • image points 72 are obtained which correspond to respective reference points 71.
  • Image points 72 are hereinafter referred to as observed image points.
  • the observed image points define the projection of the 3D reference points 71 on the plane 61 of the sensor. Because the camera sensor 42 defines a 2D plane, the Z component is set to 0.
  • the observed image points are expressed as (x ' i , y' i , 0).
  • the position of the image points observed in the sensor plane is known.
  • computational image processing such as operators and / or filters, can be applied to identify the center of the features of interest (such as intersecting edge points, centers of ellipses, etc.).
  • a surface contour detection function can be applied in order to obtain the position of the image points observed in the sensor plane.
  • a spatial resection algorithm is applied, restricted by the already known orientation angles (pitch angle Θ and yaw angle Ψ). Therefore, these two angles are entered as known parameters in the spatial resection algorithm, which is thus simplified.
  • the roll angle Φ of the artifact 60 is obtained, which is attached to the object to be measured (not shown), together with a translation vector (T x , T y , T z ) between camera 40 and the object. This translation vector corresponds to the position of the object.
  • the restricted spatial resection algorithm is preferably applied as follows: min Σ i=1..N [(x' i − x' Ei )² + (y' i − y' Ei )²].
  • N is the number of observed image points 72 (and hence the number of reference points 71). The difference between the observed image points and the estimated image points is iteratively minimized until a certain precision threshold is reached.
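  • A sketch of the restricted resection, assuming SciPy's `least_squares` for the iterative minimisation and a Z-Y-X rotation order (both are illustrative choices, not specified by the patent): pitch Θ and yaw Ψ are fixed from the autocollimation measurement, and only the roll Φ and the translation (T x , T y , T z ) are optimised.

```python
import numpy as np
from scipy.optimize import least_squares

def constrained_resection(obj_pts, img_pts, A, theta, psi, x0=None):
    """Restricted spatial resection: theta (pitch) and psi (yaw) are fixed
    from autocollimation; roll phi and translation (Tx, Ty, Tz) are found
    by minimising the reprojection residual over the observed points."""
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(psi), np.sin(psi)
    Ry = np.array([[ct, 0, st], [0, 1, 0], [-st, 0, ct]])   # fixed pitch
    Rz = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])   # fixed yaw

    def residuals(params):
        phi, tx, ty, tz = params
        cf, sf = np.cos(phi), np.sin(phi)
        Rx = np.array([[1, 0, 0], [0, cf, -sf], [0, sf, cf]])  # free roll
        R = Rz @ Ry @ Rx                   # assumed Z-Y-X composition
        pts_cam = (R @ obj_pts.T).T + np.array([tx, ty, tz])
        uvw = (A @ pts_cam.T).T
        proj = uvw[:, :2] / uvw[:, 2:3]
        return (proj - img_pts).ravel()    # planar reprojection error

    x0 = np.zeros(4) if x0 is None else x0
    return least_squares(residuals, x0).x  # [phi, Tx, Ty, Tz]
```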
  • the estimated image points (x' Ei , y' Ei , 0) are obtained by applying the pinhole camera model that represents the projection of a 3D point (X i , Y i , Z i ) on an image plane (a camera sensor) as (x' Ei , y' Ei ): λ · [x' Ei , y' Ei , 1] T = [A] · [R | T] · [X i , Y i , Z i , 1] T .
  • the quantities involved are: the observed image points 72 at the sensor of the camera; the position of the reference points in the object's coordinate system; a scale factor λ, which represents a conversion factor between reference points on the object and image points observed on the camera sensor; and the intrinsic camera matrix [A], which is a matrix that contains the different internal characteristics of the camera (the focal length f, the principal point (cx, cy), and the skew factor, among others).
  • the scale factor λ is not required if a normalization process is previously applied.
  • the intrinsic matrix of the camera [A] can be obtained, for example, by applying a calibration process.
  • the disclosed algorithm can be implemented either taking into account the distortion of the image produced by the camera lens or without considering this distortion. This distortion can be accounted for using the radial and tangential distortion models that are well known in photogrammetric applications.
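  • For instance, the commonly used radial-tangential (Brown) distortion model can be sketched as follows for normalised image coordinates; the choice of two radial and two tangential coefficients is illustrative:

```python
def distort(x, y, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) lens distortion to
    normalised image coordinates (x, y), following the Brown model."""
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2 * r2
    xd = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    yd = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return xd, yd
```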
  • the input restriction parameters that are obtained after applying the autocollimation technique are: the pitch rotation matrix R(Θ), obtained from the pitch angle Θ; and the yaw rotation matrix R(Ψ), obtained from the yaw angle Ψ.
  • the output parameters, estimated by applying the spatial resection algorithm, are: the roll rotation matrix R(Φ) (in general, the roll angle Φ) and the translation vector (T x , T y , T z ) between the camera coordinate system and the object coordinate system, which corresponds to the position of the object.
  • (T x , T y , T z , Φ, Θ, Ψ) represent the relative position and orientation between the camera 40 and the object.
  • Figure 8 shows an alternative scheme to implement the method to estimate the position and orientation of an object with 6 degrees of freedom.
  • the spatial tracking of the object is performed by means of two separate devices: a camera and a 2D autocollimator, plus the artifact of Figure 4.
  • the camera 40 of Figures 3 (a - c) is replaced by the camera 140 and the 2D autocollimator 150 of FIG. 8.
  • the camera 140 and the autocollimator 150 are disposed on a support 160. FIG. 8 also depicts an extrinsic calibration transformation between the autocollimator and the camera.
  • the measurement procedure is the same as in the previous case.
  • the pitch angle Θ and yaw angle Ψ, which represent the inclination of the plane mirror (generally, a specular surface) that is attached to the artifact and, therefore, the inclination of the object being tracked, are measured by carrying out autocollimation measurements with autocollimator 150.
  • the observed image points (x' 1 , y' 1 ), (x' 2 , y' 2 ), (x' 3 , y' 3 ), (x' 4 , y' 4 ), (x' 5 , y' 5 ) are obtained.
  • the spatial resection algorithm is applied, constrained by the already known orientation angles (pitch angle Θ and yaw angle Ψ), as previously disclosed.
  • the disclosed spatial resection algorithm can be implemented and executed in a processing means, such as a processor, and a data storage means, such as a memory.
  • the processing means can be incorporated in the camera 40, 140, for example in the processing means generally comprised in or associated with the camera sensor.
  • the processing means may be located in a different device with respect to the camera 40, 140, for example in a computer system or a computing device, such as a personal computer. In this case, the algorithm can be run offline.

Landscapes

  • Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method for estimating the position and orientation of an object with 6 degrees of freedom (Tx, Ty, Tz, Φ, Θ, Ψ), which comprises: attaching an artifact (60) to the object whose position and orientation (Tx, Ty, Tz, Φ, Θ, Ψ) are to be estimated, said artifact (60) comprising a specular surface (55) and a set of N reference points (71), each reference point being defined by a position (Xi, Yi, Zi) in a coordinate system defined in the artifact (60); placing a camera (40) facing the artifact (60) that is attached to the object; measuring the pitch angle Θ and yaw angle Ψ that represent the inclination of the specular surface (55) and, therefore, of the object, said measurement being carried out by applying an autocollimation technique; capturing an image of the set of reference points (71; (Xi, Yi, Zi)) comprised in the artifact (60), thereby obtaining, in the plane (61) of the camera sensor (42), observed image points (72; (x'i, y'i)) corresponding to said reference points (71; (Xi, Yi, Zi)); and obtaining the roll angle Φ of the object and a translation vector (Tx, Ty, Tz) between the camera (40) and the object, using said reference points (71; (Xi, Yi, Zi)) and said observed image points (72; (x'i, y'i)) obtained in the camera sensor (42), by applying a spatial resection algorithm restricted by the measured pitch angle Θ and yaw angle Ψ, where (Tx, Ty, Tz, Φ, Θ, Ψ) represent the relative position and orientation between the camera (40) and the object.
PCT/ES2020/070697 2019-11-13 2020-11-11 Procédé et système pour le suivi spatial d'objets WO2021094636A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ESP201930991 2019-11-13
ES201930991A ES2824873A1 (es) 2019-11-13 2019-11-13 Metodo y sistema para el seguimiento espacial de objetos

Publications (1)

Publication Number Publication Date
WO2021094636A1 true WO2021094636A1 (fr) 2021-05-20

Family

ID=75819435

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/ES2020/070697 WO2021094636A1 (fr) 2019-11-13 2020-11-11 Procédé et système pour le suivi spatial d'objets

Country Status (2)

Country Link
ES (1) ES2824873A1 (fr)
WO (1) WO2021094636A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4714339A (en) * 1986-02-28 1987-12-22 The United States Of America As Represented By The Secretary Of Commerce Three and five axis laser tracking systems
US20030206285A1 (en) * 2002-05-06 2003-11-06 Automated Precision, Inc. Nine dimensional laser tracking system and method
US20120262550A1 (en) * 2011-04-15 2012-10-18 Faro Technologies, Inc. Six degree-of-freedom laser tracker that cooperates with a remote structured-light scanner
US20170201738A1 (en) * 2015-06-13 2017-07-13 Alberto Daniel Lacaze Senising on uavs for mapping and obstacle avoidance
US20190006289A1 (en) * 2017-06-30 2019-01-03 Taiwan Semiconductor Manufacturing Company, Ltd. Semiconductor Device with Shielding Structure for Cross-Talk Reduction

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019006289A1 (fr) * 2017-06-30 2019-01-03 Kaarta, Inc. Systems and methods for improvements in scanning and mapping

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4714339A (en) * 1986-02-28 1987-12-22 The United States Of America As Represented By The Secretary Of Commerce Three and five axis laser tracking systems
US4714339B1 (en) * 1986-02-28 1997-03-18 Us Army Three and five axis laser tracking systems
US4714339B2 (en) * 1986-02-28 2000-05-23 Us Commerce Three and five axis laser tracking systems
US20030206285A1 (en) * 2002-05-06 2003-11-06 Automated Precision, Inc. Nine dimensional laser tracking system and method
US20120262550A1 (en) * 2011-04-15 2012-10-18 Faro Technologies, Inc. Six degree-of-freedom laser tracker that cooperates with a remote structured-light scanner
US20170201738A1 (en) * 2015-06-13 2017-07-13 Alberto Daniel Lacaze Sensing on UAVs for mapping and obstacle avoidance
US20190006289A1 (en) * 2017-06-30 2019-01-03 Taiwan Semiconductor Manufacturing Company, Ltd. Semiconductor Device with Shielding Structure for Cross-Talk Reduction

Also Published As

Publication number Publication date
ES2824873A1 (es) 2021-05-13

Similar Documents

Publication Publication Date Title
US10088296B2 (en) Method for optically measuring three-dimensional coordinates and calibration of a three-dimensional measuring device
US10401143B2 (en) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
US10841562B2 (en) Calibration plate and method for calibrating a 3D measurement device
US9612331B2 (en) Laser tracker with functionality for graphical target preparation
US9696140B2 (en) Laser tracker with position-sensitive detectors for searching for a target
US9188430B2 (en) Compensation of a structured light scanner that is tracked in six degrees-of-freedom
ES2801395T3 (es) System and method for the three-dimensional measurement of the shape of material objects
JP5123932B2 (ja) Camera-based six-degree-of-freedom target measuring device and target tracking device provided with a rotating mirror
US20170168160A1 (en) Portable distance measuring device and method for capturing relative positions
JP5127820B2 (ja) Camera-based target coordinate measurement method
ES2340340T3 (es) Method for determining the rotation axis of a vehicle wheel
US20140043622A1 (en) System for measuring the position and movement of an object
JP2014066728A (ja) Six-degree-of-freedom measurement device and method
Zhou et al. A novel laser vision sensor for omnidirectional 3D measurement
US11754386B2 (en) Method and system for capturing and measuring the position of a component with respect to a reference position and the translation and rotation of a component moving relative to a reference system
WO2016040229A1 (fr) Method for optically measuring three-dimensional coordinates, and calibration of a three-dimensional measuring device
EP3495844A1 (fr) Three-dimensional coordinates of two-dimensional edge lines obtained with a tracker camera
WO2016040271A1 (fr) Method for optically measuring three-dimensional coordinates and controlling a three-dimensional measuring device
Barone et al. Structured light stereo catadioptric scanner based on a spherical mirror
Orghidan et al. Modelling and accuracy estimation of a new omnidirectional depth computation sensor
WO2021094636A1 (fr) Method and system for the spatial tracking of objects
CN108458692B (zh) 一种近距离三维姿态测量方法
US8854612B2 (en) Optical system for measuring orientation with cubic wedge and mask
Xu et al. Calibration method of laser plane equation for vision measurement adopting objective function of uniform horizontal height of feature points
Feng et al. A general model and calibration method for spherical stereoscopic vision

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20886604

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20886604

Country of ref document: EP

Kind code of ref document: A1