WO2016102721A1 - Method and system for spatial localisation using luminous markers for any environment - Google Patents

Method and system for spatial localisation using luminous markers for any environment Download PDF

Info

Publication number
WO2016102721A1
WO2016102721A1 (PCT/ES2015/000182), also published as WO 2016/102721 A1
Authority
WO
WIPO (PCT)
Prior art keywords
marker
coordinates
instant
image
target
Prior art date
Application number
PCT/ES2015/000182
Other languages
Spanish (es)
French (fr)
Inventor
Eugenio VILLAR BONEL
PATRICIA Mª MARTINEZ MEDIAVILLA
Francisco José ALCALÁ GALÁN
Pablo Pedro SÁNCHEZ ESPESO
Victor FERNÁNDEZ SOLORZANO
Original Assignee
Universidad De Cantabria
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Universidad De Cantabria filed Critical Universidad De Cantabria
Publication of WO2016102721A1 publication Critical patent/WO2016102721A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras

Definitions

  • the present invention belongs to the fields of electronics and telecommunications. Specifically, it applies to the industrial area covering techniques for the detection of reference points used to locate and position a target (a person, an animal or an object) in controlled environments.
  • the present invention relates to a method and system for obtaining, from the use of light markers, the position and orientation of an object or subject, applicable to any type of environment, interior or exterior.
  • object tracking is closely related to augmented reality, where knowledge of the individual's position is essential.
  • This technology mixes virtual elements with real images, allowing the user to expand their information from the real world or interact with it.
  • Virtual reality replaces physical reality with computer data.
  • Augmented reality systems can use as display device a transparent optical display (for example, Google Glass) or an image-mixing screen (for example, that of a smartphone).
  • they can be based on the use of cameras, optical sensors, accelerometers, gyroscopes, GPS, etc.
  • When only one reference marker can be seen, stereo cameras and epipolar geometry must be used.
  • the characterization of a point in three-dimensional space requires knowledge of its coordinates (x, y, z) within the environment where it is located, with respect to a reference position.
  • the most common technique is based on the use of two or more calibrated cameras, which provide a left and right image of the same scene.
  • stereo correspondences are applied (look for the same point in both images) and projective or epipolar geometry is calculated (it describes the relationship between the image planes of the cameras and the point).
  • WO 2013/120041 A1, "Method and apparatus for 3D spatial localization and tracking of objects using active optical illumination and sensing"
  • Light sources of this type, with variable illuminance or pulsed light, have been proposed, but they can cause synchronization failures.
  • the use of light markers can pose problems, specifically in environments where there are light sources with a luminance much greater than the marker itself (in the worst case, sunlight) or sources that emit radiation in the same direction; in those situations the image sensor is not able to differentiate one light source from another, which, as before, restricts this technology to luminous environments without large light sources.
  • infrared markers located on the wall of a room to locate the user.
  • active markers are formed by a set of three infrared LEDs and a signal emitter, which sends data on its real position to a signal decoder carried by the user, so that once a marker is detected the user knows their absolute position.
  • Passive markers are only an infrared light source, from which the relative position of the user is obtained. Besides relying on the reception of signals from active markers, these systems calculate the relative distance to the marker from stereo vision. The use of this technique, as in the cases explained above, is limited to indoor spaces.
  • Radio-frequency techniques consist of measuring distances to static or mobile objects from the emission of electromagnetic pulses that are reflected back to a receiver. These electromagnetic waves are reflected where there are significant changes in atomic density between the environment and the object, so they work particularly well with conductive materials (metals). They can detect objects at a greater distance than other systems based on light or sound, but they are quite sensitive to interference and noise. It is also difficult to measure objects located at different distances from the emitter, because the pulse frequency varies (slower the farther away, and vice versa).
  • LIDAR systems calculate distance from the time it takes a light pulse to be reflected by an object or surface, using a device with a pulsed laser as light emitter and a photodetector as receiver of the reflected signal.
  • the advantage of these systems is the precision they achieve over long distances (using lasers with a wavelength > 1000 nm) and the possibility of mapping large areas by sweeping light pulses. Their drawbacks are the need to analyse and process each point, as well as the difficulty of automatically reconstructing three-dimensional images.
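The time-of-flight principle behind these LIDAR systems reduces to a single relation; a minimal sketch in Python (the 200 ns round-trip time below is an illustrative figure, not from the source):

```python
# Hedged sketch of the basic LIDAR range equation the text describes:
# distance = (speed of light * round-trip time) / 2
C = 299_792_458.0  # speed of light in vacuum, m/s

def lidar_range(round_trip_s: float) -> float:
    """Distance to the reflecting surface from a pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A pulse returning after 200 ns corresponds to roughly 30 m.
print(round(lidar_range(200e-9), 2))  # -> 29.98
```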
  • the objective technical problem that arises is thus to provide a system for detecting the position and orientation of an individual or object in any type of environment, indoor or outdoor, whatever its lighting conditions.
  • the present invention serves to solve the aforementioned problems presented by the solutions in the state of the art, providing a system that, using one or more reference luminous markers and a single stereo camera, allows objects or individuals to be located in space in a scenario under any environmental condition and with greater distances between the user and the marker.
  • the system relies primarily on light markers to calculate relative positions of the object/individual, a stereo camera to visualize those markers in the scene image, and an electronic angle-measuring device, such as a gyroscope or electronic compass, to provide the turning angles of the target user (object, person or animal).
  • the present invention makes it possible to detect reference luminous markers in any type of environment, regardless of the light sources that determine the environmental conditions.
  • One aspect of the invention relates to a method for positioning or locating a target by using reference markers in any 3D environment that, from a first image frame at the current time instant and a second image frame at a previous time instant captured by a stereo camera, images in which at least one marker is detected, obtains the coordinates (xn, yn) of the target at the current time instant n, performing the following steps:
  • if the first image frame and the second image frame are equal, calculate the coordinates (xn, yn) of the target at the current instant by equating them to the coordinates (xn-1, yn-1) of the target at the previous instant; otherwise, obtain the image coordinates of at least one detected marker, and its radius, to compare the radii at the current time instant n and at the previous time instant n-1, and: if the radii are equal and there are a plurality of markers, the coordinates (xn, yn) of the target at the current instant are obtained by triangulation using the first image frame and the second image frame;
  • if the radii are different and there are a plurality of markers, the coordinates (xn, yn) of the target at the current instant are obtained by triangulation but using a single image frame, the one captured at the current instant;
  • if the radii are different and there is a single marker, the coordinates (xn, yn) of the target at the current instant are obtained by the stereo geometry algorithm known in the prior art;
  • if the radii are equal and there is a single marker, the coordinates (xn, yn) of the target at the current instant are obtained through an algorithm reminiscent of stereo geometry but using the image coordinates of the marker at the current time instant and at the previous time instant, instead of a left and a right image of the same time instant.
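The case analysis above can be sketched as a small dispatcher; the routine names returned here are illustrative labels for the techniques the claims mention, not functions defined in the patent:

```python
def choose_method(frames_equal: bool, radii_equal: bool, num_markers: int) -> str:
    """Select the localisation routine per the claimed case analysis."""
    if frames_equal:
        # identical frames: reuse the previous coordinates (xn-1, yn-1)
        return "reuse_previous_coordinates"
    if num_markers > 1:
        # plurality of markers: triangulation, with both frames when the
        # radii are unchanged, with only the current frame otherwise
        return ("triangulation_both_frames" if radii_equal
                else "triangulation_current_frame")
    # single marker: stereo geometry when the radii differ, otherwise the
    # consecutive-frame variant using marker coordinates at instants n and n-1
    return ("consecutive_frame_geometry" if radii_equal
            else "stereo_geometry")

print(choose_method(False, True, 3))   # -> triangulation_both_frames
print(choose_method(False, False, 1))  # -> stereo_geometry
```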
  • Another aspect of the invention relates to a system for locating an objective, which can be an object or an individual, from at least one reference marker in a 3D space or environment, comprising the following means;
  • a stereo camera to capture image frames in which one or more markers are detected
  • a signal processor with access to a storage device (a memory), configured to perform the steps of the method described above to obtain at its output the coordinates (xn, yn) of the target calculated at the current time instant, using, according to each case indicated above, the data obtained at the previous instant and stored in memory.
  • a light source is used, identifiable in the environment of use.
  • the invention described can be used for Simulated Reality applications.
  • virtual reality glasses can be incorporated into the system. Both the stereo camera and the glasses can be part of a helmet or holding equipment that is placed on the user's head, connecting the camera with the glasses.
  • the system can additionally incorporate an accelerometer, which measures the displacement made in a finite time, which would reduce the cumulative errors.
  • the present invention solves the computation-time problem of existing systems such as that described in US 7,231,063 B2, where contrast-enhancement algorithms and/or specific markers saved in a database are required, because in the present invention light markers are used that work in the visible or infrared spectrum, such as light-emitting diodes (LEDs)
  • one of the differences of the present invention is that it uses fixed light sources and solves the problem that occurs in environments where there are light sources with an intensity much greater than the marker itself.
  • the present invention uses an element that prevents the light conditions of an environment from significantly affecting detection: a contrast background placed behind the light source.
  • FIGURE 1 Shows a schematic block diagram of the spatial location system of individuals or objects, according to a preferred embodiment of the invention.
  • FIGURE 3. Shows an environment of use of the markers of Figure 2 and in which the system of Figure 1 is applicable, according to a possible embodiment.
  • FIGURES 4A-4B.- They show a scheme of the markers and parameters used by the system to locate individuals or objects that move vertically in the environment when only one marker is detected.
  • FIGURES 5A-5B - They show a scheme of the markers and parameters that the system uses to locate in the environment individuals or objects that move vertically and when more than one marker is detected.
  • FIGURE 6A.- It shows a scheme of the markers and parameters that the system uses to locate individuals or objects that move horizontally in the environment when only a single marker is detected.
  • FIGURE 6B.- Shows a scheme of the markers and parameters used by the system to locate individuals or objects that move horizontally in the environment when more than one marker is detected.
  • FIGURE 7.- It shows a scheme of the operation of the method; it is merely an example of data flow.
  • possible embodiments of the system are proposed, based on the use of one or more luminous markers, for obtaining the position and orientation of a user in different possible environments, indoor or outdoor, within a controlled scenario.
  • Figure 1 shows a schematic diagram of the system block architecture for locating in space the objects or individuals that constitute a target (10) in a three-dimensional environment (11) under any environmental condition defined by a number m of light sources (fL1, fL2, fL3, ..., fLm), having one or more light markers (20) as shown in Figures 2-3, 4A-4B, 5A-5B and 6A-6B.
  • the system comprises a stereo camera (12) to detect the luminous markers (20) and an electronic angle-measuring device (13), for example a gyroscope or electronic compass, with which the rotation angles of the target (10) are obtained.
  • the system comprises a digital signal processor (14) that calculates the position coordinates in the space of each luminous marker (20) in time and stores them in a memory or storage device (15).
  • The digital signal processor (14) uses the stored coordinates and the output parameters obtained from the stereo camera (12) and the angle-measuring device (13) to determine at its output (18) the position of the target user (10).
  • a type of reference marker (20) of those used is shown in Figure 2, which is a luminous marker and comprises two main elements: a light source (21) and a contrast surface (22).
  • the preferred light source (21) is an LED emitting in the visible range: 400-700 nm.
  • This type of source is a point light source that achieves ranges greater than 50 m for powers greater than 1 W.
  • an LED can be considered a non-hazardous product at the optical powers at which it works, since in the worst case the exposure time is very low (aversion reaction time, ~250 ms).
  • the system can use luminous markers (20) with other types of light sources (21), because the device detecting the luminous marker (20), that is, the stereo camera (12) used as light receiver, detects light sources (21) working in both the visible and the infrared spectrum.
  • the image sensor of the stereo camera (12) has a spectral curve that, for the wavelength of the LED used, indicates a spectral response with a value greater than 0.005 A/W.
  • Filament bulbs are another example of light sources (21), although they are diffuse sources with emitted optical powers below the range achieved with an LED.
  • Another possible light source (21) is a laser diode, although it is a collimated source capable of focusing the light on a very small point and, in most cases, optical powers greater than 1 mW can be dangerous.
  • the last type of light source (21) that can be used is an infrared LED, although, given its range, the drawback is that the user is not able to perceive it and it could cause eye damage.
  • the contrast surface (22) is of a color (for example, black) and dimensions that allow the light marker (20) to be distinguished from any external light source.
  • the contrast screen or surface (22) allows you to apply the method proposed here in environments with little or a lot of light, and at great distances.
  • the shape of the contrast surface (22) can be any, for example, square as in Figure 2.
  • the dimensions of the contrast surface (22) depend on the light conditions of the environment (11), the luminous flux of the light source (21) and the maximum distance between the target (10) and the light markers (20).
  • the template or contrast surface (22) is located on the outside of the light source (21), specifically at the rear, the light source (21) being left in view of the user. In the case that the environment or the background behind the light source (21) is dark enough, it is not necessary to add the contrast surface (22).
  • the system supports the use of other types of luminous markers (20), such as white printed markers with a black border, although these cannot be used in any type of environment.
  • Figure 3 shows a possible application scenario of the system, showing the distribution of the markers (20).
  • the markers (20) can be located at different distances from each other, which the system must know in advance.
  • the height between each luminous marker (20) and the ground is not preset, but it is recommended to be such that it allows direct vision between the stereo camera (12) and the light sources (fL1, fL2, fL3, ..., fLm) of the luminous markers (20).
  • the markers (20) are placed on vertical supports to achieve the necessary height.
  • the luminous markers (20) can also be placed on vertical supports or attached to the surrounding walls or objects.
  • the scenario where the method is applied does not need to present any predefined characteristics with respect to layout, floor plan or obstacles; the system adapts to it.
  • the type of environment as explained above, can be indoor or outdoor.
  • the only restriction is on the maximum dimensions of the environment, which are limited by the reach of the light sources (fL1, fL2, ..., fLm) chosen.
  • Said reach is measured as a function of the intensity and luminous flux of the light sources (fL1, fL2, ..., fLm) and of the sensitivity of the image sensor of the stereo camera (12).
  • this system allows specific reference points to be located in the image by means of a luminous-marker (20) detection algorithm, as described below.
  • the method to be described is not unique; other variants can be used, returning as output parameters the image coordinates (u, v) and the diameter of the luminous markers (20) detected.
  • the detection of light markers (20) is divided into the following steps:
  • Image conversion to grayscale, to significantly reduce the size of the image, which goes from having three channels (red, green and blue) to only one; that is, each pixel in the image reduces its value from 3 bytes to 1 byte.
  • Noise removal filtering to eliminate erroneous pixels and noise from images captured by cameras.
  • the type of filter depends on how clear the images are desired and the delay time that can be introduced into the system.
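Steps like these can be sketched in a few lines of Python with NumPy; the threshold value, the single-blob assumption, and the area-to-radius conversion below are illustrative choices, not the patent's algorithm:

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Collapse 3 channels to 1, shrinking each pixel from 3 bytes to 1."""
    return rgb.mean(axis=2).astype(np.uint8)

def detect_markers(gray: np.ndarray, thresh: int = 200):
    """Return (u, v, radius) of a bright blob, assuming a single marker.
    Centroid of above-threshold pixels; radius inferred from blob area."""
    vs, us = np.nonzero(gray > thresh)
    if us.size == 0:
        return None                          # no marker visible
    radius = np.sqrt(us.size / np.pi)        # area = pi * r^2
    return float(us.mean()), float(vs.mean()), float(radius)

# Synthetic 100x100 frame with a bright 10x10 square at rows 60-69, cols 40-49:
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[60:70, 40:50] = 255
u, v, r = detect_markers(to_grayscale(img))
print(round(u, 1), round(v, 1), round(r, 2))  # -> 44.5 64.5 5.64
```

A real pipeline would add the noise-removal filter the text mentions before thresholding; the trade-off between image clarity and added latency applies to that filtering stage.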
  • the position of the target user (10) depends on the turns and the type of movement performed (vertical: up or down; horizontal: left or right), or on whether it makes no movement at all.
  • the method returns the same user coordinates as at the previous instant (xn-1, yn-1); otherwise, the position is calculated with all the information, a)-c), mentioned above. In this way, redundant and unnecessary operations are avoided.
  • the marker detection algorithm is applied. By knowing the values of the radii of the markers detected at the current instant, r(n), and those of the previous instant, r(n-1), the type of displacement of the target user (10) is determined:
  • once the type of movement performed by the target (10) is known, it can be located in the environment (11) according to the following methods, which depend on the type of movement and the number m of markers (20) detected.
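One way to read the radius comparison is sketched below. The mapping of radius change to movement type is an assumption inferred from the case analysis in the abstract (a changed radius suggesting the distance to the marker changed), and the tolerance is an illustrative noise margin:

```python
def classify_movement(r_now: float, r_prev: float, tol: float = 0.5) -> str:
    """Guess the displacement type from marker radii in pixels.
    Assumed mapping: radius change -> distance to the marker changed;
    unchanged radius with differing frames -> lateral movement."""
    if abs(r_now - r_prev) <= tol:
        return "horizontal_or_static"
    return "toward_marker" if r_now > r_prev else "away_from_marker"

print(classify_movement(12.0, 9.5))    # -> toward_marker
print(classify_movement(10.1, 10.0))   # -> horizontal_or_static
```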
  • Figures 4A-4B show the case in which it has been determined that there is a vertical movement of the target (10) and only a single marker (20) is detected in the binocular image (40) captured by the stereo camera (12).
  • a triangulation algorithm cannot be used, because the pixels cannot be related to a real distance; therefore, the stereo vision technique must be used and the following parameters are needed:
  • the binocular disparity ('disparity' in stereo vision), given by the rectified and undistorted coordinates uL and uR of the marker (20), obtained from the two image components, left (41) and right (42), captured by the stereo camera (12);
  • the projective geometry is calculated at the current instant n according to the equation: depth_n = (baseline x focal_length) / disparity_n, where disparity_n = uL - uR.
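For this single-marker stereo case, the relation above gives the depth directly; a minimal sketch (the baseline, focal length and disparity values are illustrative numbers, not from the source):

```python
def depth_from_disparity(baseline_m: float, focal_px: float,
                         u_left: float, u_right: float) -> float:
    """Pinhole stereo depth: Z = baseline * focal_length / (uL - uR)."""
    disparity = u_left - u_right
    return baseline_m * focal_px / disparity

# 12 cm baseline, 700 px focal length, 14 px disparity -> 6 m depth
print(depth_from_disparity(0.12, 700.0, 364.0, 350.0))  # -> 6.0
```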
  • Figures 5A-5B show the case in which it has been determined that there is a vertical movement of the target (10) and two or more markers (20, 20', 20'') are detected in the image (50) captured by the stereo camera (12).
  • triangulation can be applied, since more than one marker is available, from the real distance (dm) between markers (20, 20', 20''), the angle of rotation (θ), the aperture angle (2φ) of the camera (12) and the number of pixels (AxB) of the image (50).
  • the metres travelled are calculated as the difference between the two. From that value and the user's previous position, the coordinates of the target (10) at the previous instant (xn-1, yn-1), its new coordinates (xn, yn) at the current instant can be calculated:
  • stereo vision can also be used to obtain the depth of the markers, but it is necessary to apply stereo correspondences, that is, to relate the markers of the left image to their equivalents in the right image. Once the correspondences have been obtained, projective geometry can be applied, as in the single-marker case, to obtain the real distance to each marker.
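A sketch of depth recovery from a marker pair of known separation, using the aperture angle and image width the text lists as parameters. The pinhole, fronto-parallel model used here is an assumption consistent with those parameters, not the patent's exact formula:

```python
import math

def depth_by_triangulation(d_real_m: float, px_between: float,
                           img_width_px: float, phi_rad: float) -> float:
    """Depth to a marker pair separated by d_real_m metres that appears
    px_between pixels apart in an image A = img_width_px pixels wide,
    taken by a camera with aperture angle 2*phi_rad (pinhole model,
    markers assumed roughly facing the camera)."""
    # metres spanned by the full image width at the markers' depth
    field_width_m = d_real_m * img_width_px / px_between
    # field_width = 2 * depth * tan(phi)  =>  depth = field_width / (2 tan phi)
    return field_width_m / (2.0 * math.tan(phi_rad))

# Markers 2 m apart, 200 px apart in a 1000 px wide image, phi = 30 degrees:
print(round(depth_by_triangulation(2.0, 200, 1000, math.radians(30)), 2))  # -> 8.66
```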
  • Figures 6A-6B show the case in which it has been determined that there is a horizontal movement of the target (10).
  • Figure 6A refers to the case in which only a single marker (20) is detected in the image (61, 62).
  • An algorithm reminiscent of stereo geometry is applied, but in this case two images of the same instant taken from two different angles are not used; instead, two images of contiguous instants and the same perspective are used: the image captured at the current instant (61) and the one captured at the immediately previous instant (62). In addition, the horizontal coordinates of the marker at the current instant (un) and those obtained from the previous frame (un-1) are available.
  • Figure 6B refers to the case in which more than one marker (20, 20', 20'') is detected in the image.
  • a technique similar to the triangulation explained for vertical user movement with a plurality of detected markers is used, but in this case two images (63, 64) captured by the same image sensor consecutively in time are used: the current image (63) and the image captured at the previous instant (64). Knowing the real distance between markers and the pixels between them, p pixels at the current instant n and q pixels at the previous instant n-1, the length moved by the user can be extrapolated.
  • the previous case of a single marker detected can also be applied to obtain the displacement (D) performed by the target user (10).
  • It is calculated from the coordinates of the same marker in two contiguous images, disregarding the rest of the detected markers, and, with the previous distance between the marker and the user, the displacement D is obtained.
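The consecutive-frame idea for horizontal displacement can be sketched the same way. Deriving a metres-per-pixel scale from the known marker separation is the modelling assumption here, not the patent's exact algorithm:

```python
def horizontal_displacement(d_real_m: float, px_between: float,
                            u_now: float, u_prev: float) -> float:
    """Displacement D from a marker's horizontal image shift between the
    previous and current frame, scaled by metres-per-pixel derived from
    the known real separation of a marker pair in the current frame."""
    metres_per_px = d_real_m / px_between
    return metres_per_px * abs(u_now - u_prev)

# Markers 2 m apart spanning 400 px; marker shifted 50 px between frames:
print(horizontal_displacement(2.0, 400.0, 250.0, 300.0))  # -> 0.25
```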

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a method and system for spatial localisation of an object (10) in a three-dimensional environment (11) comprising at least one luminous marker. The system comprises a stereo camera (12) for capturing a first image frame at the current instant and a second image frame at a previous instant, an angle-measuring device (13) for obtaining an angle of rotation of the object (10), and a signal processor (14) with access to a memory (15) that stores, inter alia, a radius of the at least one marker detected at the current instant n and at the previous instant n-1, said signal processor being configured to calculate coordinates (xn, yn) of the object (10) at instant n as follows: if the angle of rotation at the current instant and at the previous instant are different, (xn, yn) = (xn-1, yn-1); if the two image frames are the same, (xn, yn) = (xn-1, yn-1); otherwise: if the radii are the same and there are a plurality of markers, (xn, yn) are calculated by means of triangulation using both image frames; if the radii are different and there are a plurality of markers, (xn, yn) are calculated by means of triangulation using a single image frame; if the radii are different and there is a single marker, (xn, yn) are calculated by means of stereo geometry; if the radii are the same and there is a single marker, (xn, yn) are calculated using image coordinates of the marker at the current instant and at the previous instant.

Description

Méto o w Sistema ds Localización espacial mediante Marc dores Lümiriosos para Cis lgguier ambiente  Method o w System ds Spatial location by Marc dores Lümiriosos for Cis lgguier atmosphere
OBJETO DE LA iNVENDO PURPOSE OF THE WINTER
La presente invención pertenece a ios campos de ia electrónica y las telecomunicaciones. Concretamente, la presente invención se aplica al área industria! que recoge ías técnicas de detección de puntos de referencia para la localización y posicionamiento de un objetivo (una persona, animal o un objeto) en entornos controlados. The present invention belongs to the fields of electronics and telecommunications. Specifically, the present invention applies to the industry area! that collects the techniques of detection of reference points for the location and positioning of a target (a person, animal or an object) in controlled environments.
Más particularmente, la presente invención se refiere a un método y sistema para obtener, a partir del uso de marcadores luminosos, la posición y orientación de un objeto o sujeto, aplicable a cualquier tipo de ambiente, interior o exterior. More particularly, the present invention relates to a method and system for obtaining, from the use of light markers, the position and orientation of an object or subject, applicable to any type of environment, interior or exterior.
ANTECEDENTES DE LA INVENCIÓN BACKGROUND OF THE INVENTION
En los últimos años, está habiendo un creciente interés en aquellos sistemas o productos relacionados con la localización de objetos en tres dimensiones (3D). Los sectores que cubren esta tecnología son muy amplios, como rebotica, medicina o videoluegos, entre otros. In recent years, there is a growing interest in those systems or products related to the location of objects in three dimensions (3D). The sectors that cover this technology are very wide, such as reboot, medicine or video games, among others.
Concretamente, el seguimiento de objetos está estrechamente relacionado con la realidad aumentada, donde el conocimiento de la posición del individuo es fundamental. Esta tecnología mezcla elementos virtuales con imágenes reales, permitiendo al usuario ampliar su información del mundo real o Interactuar con él. La realidad virtual sin embargo, sustituye ía realidad física con datos informáticos. Los sistemas de realidad aumentada pueden utilizar como dispositivos de vísuaiizadón ('displays' en inglés) una pantalla óptica transparente (por ejemplo, Google Glass) o una pantalla de mezcla de imágenes (por ejemplo, ía de un teléfono móvil inteligente, 'smartphcne' en inglés). Además para poder conocer la posición del Individuo con precisión, pueden basarse en el uso de cámaras, sensores ópticos, acelerómeíros, giroscopios, GPS, etc. En el caso de que se empleen sistemas de visión, es necesario realizar un preprocesado de ia región de ínteres donde se sitúa el individuo, utilizando algoritmos de imagen que permiten detectar esquinas, bordes o marcadores de referencia; para después, con estos datos obtener las coordenadas 3D reales del entorno. Estos sistemas requieren el uso de una CPU y memoria RAM con suficiente capacidad de cómputo, para poder procesar las ¡mágenes de las cámaras a tiempo rea! y con la menor latericia posible. Specifically, object tracking is closely related to augmented reality, where knowledge of the individual's position is essential. This technology mixes virtual elements with real images, allowing the user to expand their information from the real world or interact with it. Virtual reality, however, replaces physical reality with computer data. Augmented reality systems can use a transparent optical display (for example, Google Glass) or an image mixing screen (for example, using a smart mobile phone, ' smartphcne ') as a visuaiizadón device. in English). 
In addition to knowing the position of the individual with precision, they can be based on the use of cameras, optical sensors, accelerometers, gyroscopes, GPS, etc. In the case that vision systems are used, it is necessary to perform a preprocessing of the region of interest where the individual is located, using image algorithms that allow to detect corners, edges or reference markers; for later, with this data obtain the real 3D coordinates of the environment. These systems require the use of a CPU and RAM with sufficient capacity to computation, to be able to process the images of the cameras in real time! and with the least possible latericia.
Cuando únicamente se pueda visualizar un marcador de referencia, se tiene que recurrir al uso de cámaras estéreo y geometría epípoiar. La caracterización de un punto en el espacio tridimensional requiere el conocimiento de sus coordenadas (y, y, z) dentro del entorno donde éste se encuentre, respecto a una posicián de referencia. La técnica más común está basada en el uso de dos o más cámaras calibradas, que proporcionan una imagen izquierda y derecha de la misma escena. Para obtener las coordenadas 3D del punto y objeto a caracterizar, se aplica correspondencias estéreo (buscar un mismo punto en ambas imágenes) y se calcula !a geometría proyectiva o epipolar (describe la relación existente entre los planos de la imagen de las cámaras y ei punto). When only one reference marker can be displayed, the use of stereo cameras and epipole geometry must be used. The characterization of a point in three-dimensional space requires knowledge of its coordinates (y, y, z) within the environment where it is located, with respect to a reference position. The most common technique is based on the use of two or more calibrated cameras, which provide a left and right image of the same scene. To obtain the 3D coordinates of the point and object to be characterized, stereo correspondences are applied (look for the same point in both images) and projective or epipolar geometry is calculated (it describes the relationship between the image planes of the cameras and the point).
En ei caso de tener más de un marcador o patrón disponible, se pueden aplicar otras técnicas para localizar eí objeto en el escenario. A través de triangulación, conociendo la distancia real entre marcadores, es posible con sólo dos marcadores y una sola cámara obtener ios parámetros para conseguir ia profundidad al objetivo y posicionarlo en ei entorno. Esta práctica simplifica ei coste computacional, al no tener que analizar dos imágenes y sus correspondencias; pero requiere mayor precisión a la hora de detectar los marcadores. A pesar de eso, aígunos autores ("Optica! tracking using projective invariant marker pattern propierties", . van Liere et al., Proceedings of the IEEE Virtual Roality Conference 2003, 2003) consideran que, para poder realizar un mejor seguimiento del objeto, es necesario que el objeto tenga cuatro o más marcadores y que además se utilice visión estéreo; para conseguir un sistema más preciso pero más íento. In the case of having more than one marker or pattern available, other techniques can be applied to locate the object on the stage. Through triangulation, knowing the real distance between markers, it is possible with only two markers and a single camera to obtain the parameters to achieve the depth of the objective and position it in the environment. This practice simplifies the computational cost, by not having to analyze two images and their correspondences; but it requires greater precision when it comes to detecting markers. Despite this, some authors ("Optics! Tracking using projective invariant marker pattern proprietors",. Van Liere et al., Proceedings of the IEEE Virtual Roality Conference 2003, 2003) consider that, in order to better track the object, it is necessary that the object has four or more markers and that stereo vision is also used; to achieve a more precise but more efficient system.
One of the problems found in other studies (US 7,231,063 B2, "Fiducial Detection System", L. Naimark et al.) when using markers is the luminosity of the environment where the images are to be taken. The points to be detected can be lost in the scene for lack of light, which limits such systems to indoor environments or environments with controlled lighting. In addition, contrast-enhancement algorithms and/or specific markers stored in a database become necessary, which substantially increases the computation time of these systems, besides considerably limiting the distance between the printed markers and the user of the system, unless their size is large enough to be captured by the image sensor. In some cases (WO 2013/120041 A1, "Method and apparatus for 3D spatial localization and tracking of objects using active optical illumination and sensing") light sources with variable luminance or pulsed light have been proposed, which can cause synchronization failures. Even so, the use of luminous markers can pose problems, specifically in environments where there are light sources with a luminance much greater than that of the marker itself (in the worst case, sunlight) or sources emitting radiation in the same direction; in those situations the image sensor is not able to differentiate one light source from another, which, as before, restricts this technology to environments without large light sources.
The idea of detecting and positioning the markers serves to characterize the objects or individuals present in the scene, which are thereby located in space. The article "Wide area optical tracking in unconstrained indoor environments" (A. Mossel et al., 23rd International Conference on Artificial Reality and Telexistence (ICAT), 2013) proposes incorporating infrared luminous markers on a band placed on the user's head. To that end, two independent cameras, which require a synchronization process so that they shoot simultaneously, are placed in the scene at a distance equal to the length of the wall of the room where the system is to be tested. The algorithm employed to estimate the position is based on the search for stereo correspondences. One of its disadvantages is that it cannot be implemented in augmented or simulated reality systems, because the cameras do not show what the user sees; it is also restricted to indoor environments of limited dimensions.
Other studies, such as "Tracking of user position and orientation by stereo measurement of infrared markers and orientation sensing" (M. Maeda et al., Proceedings of the 8th International Symposium on Wearable Computers (ISWC'04), 2004), propose the use of infrared markers placed on the wall of a room to locate the user. Specifically, they propose two types of markers: active and passive. The active markers consist of a set of three infrared LEDs and a signal emitter that sends data on their real position to a signal decoder carried by the user, so that once they are detected their absolute position is known. The passive markers are merely an infrared light source, from which the relative position of the user is obtained. Besides relying on the reception of signals from the active markers, they calculate the relative distance to the marker by stereo vision. The use of this technique, as in the cases explained above, is limited to indoor spaces.
There are other methods for user localisation and tracking systems that do not require direct line of sight between one or more cameras and the reference markers. Radiofrequency techniques consist of measuring distances to static or moving objects from the emission of electromagnetic pulses that are reflected back to a receiver. These electromagnetic waves are reflected when there are significant changes in atomic density between the environment and the object, so they work particularly well with conductive materials (metals). They are capable of detecting objects at greater distances than other systems based on light or sound, but they are quite sensitive to interference and noise. It is also difficult to measure objects located at different distances from the emitter, because the pulse frequency varies (the farther away, the slower, and vice versa). Even so, there are experimental studies such as "RADAR: an in-building RF-based user location and tracking system" (P. Bahl et al., Proceedings of IEEE INFOCOM 2000, Tel Aviv, 2000) that demonstrate its use to estimate the user's location with a high level of precision. This technique is not appropriate for augmented reality applications.
Another example of existing solutions are LIDAR systems, which calculate distance from the time it takes a light pulse to be reflected by an object or surface, using a device with a pulsed laser as light emitter and a photodetector as receiver of the reflected signal. The advantage of these systems is the precision they achieve over long distances (using lasers with wavelengths above 1000 nm) and the possibility of mapping large areas by sweeps of light pulses. Their drawbacks are the need to analyse and process each point, as well as the difficulty of automatically reconstructing three-dimensional images.
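The time-of-flight principle behind LIDAR can be stated in one line: distance = c·Δt/2, halving the round trip of the pulse. A minimal sketch (generic physics, not part of the invention):

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(round_trip_s: float) -> float:
    """Distance to the reflecting surface from the round-trip time of a light pulse.

    The division by two accounts for the pulse travelling out and back.
    """
    if round_trip_s < 0:
        raise ValueError("round-trip time cannot be negative")
    return SPEED_OF_LIGHT * round_trip_s / 2.0

# A pulse returning after 100 ns corresponds to a target roughly 15 m away.
d = tof_distance(100e-9)
```

The nanosecond-scale timing this requires per point is one reason each point needs individual analysis and processing, as noted above.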
The objective technical problem is thus to provide a system for detecting the position and orientation of an individual or object in any type of environment, indoor or outdoor, whatever its lighting conditions.

DESCRIPTION OF THE INVENTION
The present invention solves the problems mentioned above, overcoming the drawbacks of the solutions discussed in the state of the art, by providing a system that, from the use of one or more reference luminous markers and a single stereo camera, makes it possible to spatially locate objects or individuals in a scene under any environmental condition and at greater distances between the user and the marker. The system is mainly based on the use of luminous markers to calculate relative positions of the object/individual, a stereo camera to visualize those markers in the image of the scene, and an electronic angle-measuring device, such as a gyroscope or electronic compass, to provide the turning angles of the target user (object, person or animal).
The present invention makes it possible to detect reference luminous markers in any type of environment, regardless of the light sources that determine the environmental conditions.
One aspect of the invention relates to a method for positioning or locating a target by using reference markers in any 3D environment which, from a first image frame at a current time instant and a second image frame at a previous time instant, both captured by a stereo camera and in which at least one marker is detected, obtains the coordinates (xn, yn) of the target at the current time instant n by performing the following steps:
- obtaining a turning angle of the target at the current time instant and at the previous time instant;
- if the turning angle at the current time instant and the turning angle at the previous time instant are different, calculating the coordinates (xn, yn) of the target at the current instant n by making them equal to the coordinates (xn-1, yn-1) of the target at the previous instant n-1;
- if the first image frame and the second image frame are identical, calculating the coordinates (xn, yn) of the target at the current instant by making them equal to the coordinates (xn-1, yn-1) of the target at the previous instant; otherwise, obtaining the image coordinates of at least one detected marker, and its radius, in order to compare the radii at the current time instant n and at the previous time instant n-1, so that:
- if the radii are equal and there is a plurality of markers, the coordinates (xn, yn) of the target at the current instant are obtained by triangulation using the first image frame and the second image frame;
- if the radii are different and there is also more than one marker, the coordinates (xn, yn) of the target at the current instant are obtained by triangulation but using a single image frame, the one captured at the current instant;
- if the radii are different and there is a single detected marker, the coordinates (xn, yn) of the target at the current instant are obtained by the stereo-geometry algorithm known in the state of the art;
- if the radii are equal and there is a single detected marker, the coordinates (xn, yn) of the target at the current instant are obtained by an algorithm analogous to stereo geometry but using the image coordinates of the marker at the current time instant and at the previous time instant, instead of a left and a right image of the same time instant.
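The case analysis above can be sketched as a single decision function. This is a schematic reading of the claimed steps only: the geometric routines (triangulation, stereo geometry) are hypothetical placeholders, since the patent names those techniques without disclosing code, and the "identical frames" test is reduced here to identical marker observations.

```python
# Hypothetical stand-ins for the geometric routines the method names.
def triangulate_two_frames(markers_n, markers_prev):
    return (0.0, 0.0)

def triangulate_one_frame(markers_n):
    return (0.0, 0.0)

def stereo_geometry(marker):
    return (0.0, 0.0)

def stereo_geometry_over_time(marker_n, marker_prev):
    return (0.0, 0.0)

def locate_target(angle_n, angle_prev, markers_n, markers_prev, pos_prev):
    """One step of the claimed decision flow.

    markers_* are lists of (u, v, radius) tuples in image coordinates;
    pos_prev is the (x, y) position computed at the previous instant n-1.
    """
    # Turning angles differ: keep the previous coordinates.
    if angle_n != angle_prev:
        return pos_prev
    # Identical observations between frames: the target did not move.
    if markers_n == markers_prev:
        return pos_prev
    radii_equal = all(abs(m[2] - p[2]) < 1e-6
                      for m, p in zip(markers_n, markers_prev))
    if len(markers_n) > 1:
        if radii_equal:
            return triangulate_two_frames(markers_n, markers_prev)
        return triangulate_one_frame(markers_n)
    if not radii_equal:
        return stereo_geometry(markers_n[0])
    # Single marker, unchanged radius: the current and previous views are
    # treated as the left/right images of a stereo pair.
    return stereo_geometry_over_time(markers_n[0], markers_prev[0])
```

The first two branches make the method cheap when the target only rotates or stands still; full geometry is computed only when the marker observations actually change.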
Another aspect of the invention relates to a system for locating a target, which can be an object or an individual, from at least one reference marker in a 3D space or environment, comprising the following means:
- a stereo camera for capturing image frames in which one or more markers are detected;
- an angle-measuring device for obtaining the turning angle of the target at each time instant;
- a signal processor, with access to a storage device (a memory), configured to carry out the steps of the method described above and deliver at its output the coordinates (xn, yn) of the target computed at the current time instant, using, in each of the cases indicated above, the data obtained at the previous time instant stored in the memory.
A light source identifiable in the environment of use is employed as reference marker.
In one possible field of application, the described invention can be used for Simulated Reality applications. To that end, Virtual Reality glasses are incorporated into the system. Both the stereo camera and the glasses can form part of a helmet or head-mounted harness placed on the user's head, connecting the camera to the glasses. The system can additionally incorporate an accelerometer, which measures the displacement made in a finite time, thereby reducing cumulative errors.
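As a hedged illustration of how such an accelerometer could supply the displacement over a finite window, the acceleration samples can be double-integrated (a simple Euler scheme on one axis; sensor bias, gravity compensation and axis alignment, which a real system would need, are ignored here):

```python
def displacement(accels, dt):
    """Displacement over a finite window by double integration of
    acceleration samples along one axis (Euler integration).

    accels -- acceleration samples in m/s^2, taken every dt seconds
    dt     -- sampling period in seconds
    """
    velocity = 0.0
    dist = 0.0
    for a in accels:
        velocity += a * dt   # integrate acceleration -> velocity
        dist += velocity * dt  # integrate velocity -> displacement
    return dist

# Constant 1 m/s^2 for 1 s sampled at 10 Hz gives ~0.55 m with this scheme.
d = displacement([1.0] * 10, 0.1)
```

Because each step accumulates the previous velocity estimate, errors grow with time, which is why such a sensor is useful only over short windows between marker-based position fixes.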
The present invention has a series of differentiating characteristics with respect to the existing solutions discussed in the prior art, with technical advantages such as the following:
- With respect to US 7,231,063 B2: the present invention solves the computation-time problem of existing systems such as the one described in US 7,231,063 B2, which require contrast-enhancement algorithms and/or specific markers stored in a database, because the present invention uses luminous markers working in the visible or infrared spectrum, such as light-emitting diodes (LEDs).
- With respect to WO 2013/120041 A1: one of the differences of the present invention is that it uses fixed light sources, and it solves the problem that arises in environments where there are light sources with a luminance much greater than that of the marker itself. To solve this problem, the present invention uses an element that prevents the lighting conditions of an environment from having a significant effect, namely a background placed behind the light source.
BRIEF DESCRIPTION OF THE FIGURES
A series of drawings that help to understand the invention better, and that relate expressly to an embodiment of said invention presented as a non-limiting example thereof, is described very briefly below.
FIGURE 1.- Shows a schematic block diagram of the system for the spatial localisation of individuals or objects, according to a preferred embodiment of the invention.
FIGURE 2.- Shows a simplified representation of one type of luminous marker that the system of Figure 1 can use.

FIGURE 3.- Shows an environment of use of the markers of Figure 2, in which the system of Figure 1 is applicable, according to a possible embodiment.
FIGURES 4A-4B.- Show a diagram of the markers and parameters that the system uses to locate, in the environment, individuals or objects that move vertically, when only a single marker is detected.
FIGURES 5A-5B.- Show a diagram of the markers and parameters that the system uses to locate, in the environment, individuals or objects that move vertically, when more than one marker is detected.
FIGURE 6A.- Shows a diagram of the markers and parameters that the system uses to locate, in the environment, individuals or objects that move horizontally, when only a single marker is detected.
FIGURE 6B.- Shows a diagram of the markers and parameters that the system uses to locate, in the environment, individuals or objects that move horizontally, when more than one marker is detected.
FIGURE 7.- Shows a diagram of the operation of the method; it is merely an example of data flow.
PREFERRED EMBODIMENT OF THE INVENTION
Possible embodiments are proposed below of the system for obtaining, from the use of one or more luminous markers, the position and orientation of a user in different possible environments, indoor or outdoor, within a controlled scenario.
Figure 1 shows a schematic diagram of the block architecture of the system for locating in space the objects or individuals that constitute a target (10) in a three-dimensional environment (11) under any environmental condition defined by a number m ≥ 1 of light sources (fL1, fL2, fL3, ..., fLm), with one or more luminous markers (20) such as those shown in Figures 2-3, 4A-4B, 5A-5B and 6A-6B being available. The system comprises a stereo camera (12) for detecting the luminous markers (20) and an electronic angle-measuring device (13), for example a gyroscope or electronic compass, with which the turning angles of the target (10) are obtained. In addition, the system comprises a digital signal processor (14) that calculates the position coordinates in space of each luminous marker (20) over time and stores them in a memory or storage device (15). The digital signal processor (14) uses the stored coordinates and the output parameters it obtains from the stereo camera (12) and from the angle-measuring device (13) to determine at its output (16) the position of the target user (10).
Figure 2 shows one type of reference marker (20) among those used, which is a luminous marker and comprises two main elements: a light source (21) and a contrast surface (22). The preferred light source (21) is an LED emitting in the visible range, 400-700 nm. This type of source is a point light source that achieves ranges greater than 50 m for powers above 1 W. In addition, an LED can be considered a non-hazardous product owing to the optical powers at which it works and because, in the worst case, the exposure time is very low (aversion reaction time ≈ 250 ms). Even so, the system can use luminous markers (20) with other types of light sources (21), because the device that detects the luminous marker (20), that is, the stereo camera (12) used as light receiver, detects light sources (21) working both in the visible and in the infrared spectrum. The image sensor of the stereo camera (12) has a spectral curve which, for the wavelength of the LED used, indicates a spectral response above 0.005 A/W. Filament bulbs are another example of light sources (21), although they are diffuse sources with emitted optical powers lower than those attainable with an LED. Another possible light source (21) is a laser diode, although it is a collimated source capable of focusing the light on a very small spot and, in most cases, optical powers above 1 mW can be dangerous. The last type of light source (21) that can be used is an infrared LED, although, given its range, the drawback is that the user is not able to perceive it and it could cause eye damage.

On the other hand, the contrast surface (22) has a colour (for example, black) and dimensions that make it possible to distinguish between the luminous marker (20) and any external light source. The contrast screen or surface (22) allows the method proposed here to be applied in environments with little or much light, and at great distances. The contrast surface (22) can have any shape, for example square as in Figure 2. The dimensions of the contrast surface (22) depend on the lighting conditions of the environment (11), on the luminous flux of the light source (21) and on the maximum distance there is to be between the target (10) and the luminous markers (20). The template or contrast surface (22) is placed on the outside of the light source (21), specifically behind it, the light source (21) remaining in the user's view. If the ambient or the background behind the light source (21) is dark enough, it is not necessary to add the contrast surface (22). The system admits the use of other types of luminous markers (20), such as white printed markers with a black border, although these cannot be used in every type of environment.
Figure 3 shows a possible application scenario of the system, illustrating the distribution of the markers (20). The markers (20) can be placed at different distances from one another, which the system must know in advance. The height between each luminous marker (20) and the ground is not preset, but it is recommended to be one that allows direct line of sight between the stereo camera (12) and the light sources (fL1, fL2, fL3, ..., fLm) of the luminous markers (20). In outdoor environments, the markers (20) are placed on vertical supports to reach the necessary height. In indoor environments, the luminous markers (20) can also go on vertical supports or be fixed to the walls or objects of the environment. The relation between the number m of markers (20) and the distance between them depends on the aperture angle (2φ) of the stereo camera (12), on the emission angle (2θ) of the light source (21) of the luminous marker (20), for example the LED, and on the minimum distance (L) there must be between the target (10), the user of the system, and the light sources (fL1, fL2, fL3, ..., fLm), i.e. the LEDs, so that at least one pair of sources can be seen, as reflected in the corresponding equation relating m and d.
The scenario where the method is applied does not present any predefined characteristic as regards distribution, floor plan or obstacles, so the system adapts to it. The type of environment, as explained above, can be indoor or outdoor. The only restriction is the maximum dimensions of this environment, which are limited by the range of the chosen light sources (fL1, fL2, fL3, ..., fLm). Said range is measured as a function of the intensity and luminous flux of the light sources (fL1, fL2, ..., fLm) and of the sensitivity of the image sensor of the stereo camera (12).
From the images captured by the stereo camera (12) and from the image coordinates of the luminous markers (20) calculated in a previous capture, as described further on, by the digital signal processor (14) of the system, this system makes it possible to locate specific reference points in the image by means of a luminous-marker (20) detection algorithm, such as the one described next. The method to be described is not unique; other variants can be used, returning as output parameters the image coordinates (u, v) and the diameter of the luminous markers (20) detected. In the 2-dimensional image coordinates (u, v), the first coordinate u denotes the coordinate along a horizontal axis and the second coordinate v denotes the coordinate along a vertical axis, in the 2D plane of the image where the movements are detected. The detection of luminous markers (20) is divided into the following steps:
- Conversion of the image to grayscale, which considerably reduces the size of the image, since it goes from three channels (red, green and blue) to a single black-and-white one. That is, each pixel of the image reduces its value from 3 bytes to 1 byte.
- Noise-removal filtering, to eliminate erroneous pixels and noise from the images captured by the cameras. The type of filter depends on how sharp the images are required to be and on the delay that can be introduced into the system.
- Localisation of neighbouring pixels with strong contrasts, analysing the image by windows and searching for those regions where the contrasts between neighbouring pixels are greatest. This step makes sense because the light sources (21) have pixel values in the image around 255, while the black template (22) has values around 0.
- Obtaining the coordinates of the luminous markers (20): once the regions that may correspond to light sources (21) have been located, it is verified that they really are such. The first thing checked is the shape of the light sources (21), which must approximate a circle or ellipse, and the image coordinates (u, v) of their central point as well as their radius are obtained. In addition, said regions have to be checked against one another, verifying that they all lie in very similar pixel rows and have similar intensity values, since all of them are assumed to be light sources (21) of the same luminance.
- Final verification, comparing the calculated coordinates of the luminous markers (20) with those obtained in a previous capture. Once the regions confirmed to correspond to luminous markers (20) have been obtained, a last check is carried out, comparing the positions of the current markers with those of a previous instant, bearing in mind that, these being consecutive instants, the coordinates do not change very significantly from one place to another.
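The steps above can be sketched in code. The following is a minimal illustration and not part of the patent: the function name, threshold values and the proximity-based grouping of bright pixels are our own simplifications, and the sketch assumes the grayscale conversion and noise filtering have already been applied.

```python
import numpy as np

def detect_markers(gray, bright=200, window=20):
    """Locate bright spots on dark backgrounds; returns (u, v, radius) tuples.

    Crude stand-in for the detection steps: LED pixels sit near 255 and the
    contrast surface near 0, so a strong threshold plus a proximity-based
    grouping of bright pixels approximates the window-contrast search and
    the circle check.
    """
    ys, xs = np.nonzero(gray >= bright)
    markers = []
    used = np.zeros(len(xs), dtype=bool)
    for i in range(len(xs)):
        if used[i]:
            continue
        # Gather all bright pixels within `window` pixels of this seed.
        blob = (np.abs(xs - xs[i]) < window) & (np.abs(ys - ys[i]) < window)
        used |= blob
        u, v = xs[blob].mean(), ys[blob].mean()          # centre (u, v)
        r = max(np.ptp(xs[blob]), np.ptp(ys[blob])) / 2  # approximate radius
        markers.append((u, v, r))
    return markers

# Synthetic grayscale frame: dark contrast surface, one bright circular source.
img = np.zeros((120, 160), dtype=np.uint8)
yy, xx = np.ogrid[:120, :160]
img[(xx - 80) ** 2 + (yy - 60) ** 2 <= 25] = 255

print(detect_markers(img))  # one marker near (80, 60), radius about 5
```

A real implementation would add the temporal check against the previous capture; here only the spatial part of the pipeline is shown.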
To locate in the image of an environment (11) the reference points that give the location of the target user (10), the following information must be known:
a) the image coordinates (u, v) and radius of each marker (20) detected by the marker-detection algorithm described above from the image captured by the stereo camera (12);
b) the value in degrees δ of the turn of the target user, returned by the angle-measuring device (13), at the moment each image is captured by the stereo camera (12); and c) the data stored in the memory (15), namely: previous position, real distance between markers, focal length of the cameras, aperture angle of the camera, distance between cameras ('baseline'), previous image frame ('frame'), previous radius of the markers, previous position vectors of the markers and previous turn angle.
Considering the particular case of a continuous environment (11), without obstacles and square in shape, for example, like the scenario depicted in Figure 3, the position of the target user (10) depends on the turns and on the type of movements performed (vertical: up or down; horizontal: left or right), or on whether no movement is made at all.
The methods for calculating the position described below are summarised in Figure 7 and are implemented in different ways, illustrated in Figures 4A-4B, 5A-5B and 6A-6B, depending on the kind of displacement registered and on the number of markers detected, and they are valid for any type of marker, both luminous and printed. As Figure 7 shows, the first step is to check the value, in degrees, returned by the angle-measuring device (13) to determine whether there is a significant turn, which occurs when the angle obtained at the current instant, δ(n), differs from that of the previous instant, δ(n-1); conversely, δ(n) = δ(n-1) indicates that the target user (10) has not turned. If there is a turn, the user coordinates remain the same even though the images captured by the stereo camera (12) change. When the turn angle obtained by the angle-measuring device (13) is constant in time, the image frame captured at the current instant, frame(n), is compared with the immediately previous one, frame(n-1), and if they coincide it is interpreted as there having been no movement of the user. If there is no displacement, the method returns the same user coordinates as at the previous instant (xn-1, yn-1); otherwise, the position is calculated with all the information a)-c) mentioned above. In this way, redundant and unnecessary operations are avoided. When a change of position is detected, the marker-detection algorithm is applied. Knowing the values of the radii of the markers detected at the current instant, r(n), and those of the previous instant, r(n-1), the type of displacement of the target user (10) can be identified:
- If those radii are different, r(n-1) ≠ r(n), the displacement is upwards or downwards. To know the position of the target (10) it is necessary to know the distance between it and the markers, that is, the displacement made vertically.
- If those radii are equal, r(n-1) = r(n), the displacement is to the right or to the left. To know the position of the target (10) it is necessary to know how much it has moved horizontally.
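The decision flow just described (turn check, frame comparison, radius comparison) can be summarised as follows. This is an illustrative sketch, not the patent's implementation; the function name and the tolerance parameter are our own.

```python
def classify_step(delta_n, delta_prev, frame_same, r_n, r_prev, tol=1e-6):
    """Decision flow of the method: turn, no movement, vertical or horizontal.

    Mirrors the description: a changed turn angle or an identical frame keeps
    the previous coordinates; otherwise the marker radii decide between a
    vertical displacement (radius changed) and a horizontal one (radius equal).
    """
    if abs(delta_n - delta_prev) > tol:
        return "turn"        # coordinates unchanged, only orientation changes
    if frame_same:
        return "none"        # no displacement at all
    if abs(r_n - r_prev) > tol:
        return "vertical"    # up/down: moved toward or away from the marker
    return "horizontal"      # left/right movement

assert classify_step(10.0, 0.0, False, 5, 5) == "turn"
assert classify_step(0.0, 0.0, True, 5, 5) == "none"
assert classify_step(0.0, 0.0, False, 6, 5) == "vertical"
assert classify_step(0.0, 0.0, False, 5, 5) == "horizontal"
```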
Once the type of movement performed by the target (10) has been identified, it can be located in the environment (11) according to the following methods, which depend on the type of movement and on the number m of markers (20) detected.
Figures 4A-4B show the case in which a vertical movement of the target (10) has been determined and only a single marker (20) is detected in the binocular image (40) captured by the stereo camera (12). In this case a triangulation algorithm cannot be used, because the pixels cannot be related to a real distance; the stereo vision technique must therefore be used, and the following parameters are needed:
- the binocular disparity of the stereo vision, given by the coordinates uL and uR, rectified and undistorted respectively, of the marker (20) obtained from the two image components, left (41) and right (42), captured by the stereo camera (12);
- the baseline distance B and focal length f of the stereo camera (12); and
- the turn angle (δ) of the target user (10).
In order to transform the image coordinates into depth, the projective geometry at the current instant n is calculated according to the equation:
Lmarker(n) = (baseline × focal_length) / disparity
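As a small illustration of this projective-geometry relation (variable names are our own; the baseline is in metres, the focal length and disparity in pixels):

```python
def marker_depth(baseline_m, focal_length_px, u_left, u_right):
    """Depth from stereo disparity: L = baseline * focal_length / disparity.

    u_left and u_right are the marker's horizontal image coordinates in the
    rectified left and right views; their difference is the disparity.
    """
    disparity = u_left - u_right  # pixels
    return baseline_m * focal_length_px / disparity

# Example: 12 cm baseline, 700 px focal length, 28 px disparity -> 3.0 m
print(marker_depth(0.12, 700.0, 400.0, 372.0))
```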
Once the distance Lmarker(n) between the target (10) and the marker (20) has been calculated, its position within the scenario can be obtained. As the displacement was vertical, the only coordinate that has apparently changed is y, but the turn angle δ must be taken into account to obtain the absolute coordinates. The coordinates (xn, yn) at the current instant are equal to the coordinates at the previous instant (xn-1, yn-1) plus the displacement made:
If r(n-1) < r(n): xn = xn-1 + sin(δ) · (Lmarker(n-1) - Lmarker(n)); yn = yn-1 + cos(δ) · (Lmarker(n-1) - Lmarker(n))
If r(n-1) > r(n): xn = xn-1 - sin(δ) · (Lmarker(n) - Lmarker(n-1)); yn = yn-1 - cos(δ) · (Lmarker(n) - Lmarker(n-1))
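This coordinate update can be sketched as below. The function name and the sign convention (advance toward the marker taken as positive) are our own illustrative choices, not fixed by the patent.

```python
import math

def update_after_vertical_move(x_prev, y_prev, delta_deg, L_prev, L_now):
    """Add the vertical displacement, resolved through the turn angle delta.

    L_prev - L_now is positive when the user moved toward the marker (its
    radius in the image grew); its sin/cos components update the absolute
    coordinates relative to the previous position.
    """
    d = L_prev - L_now  # metres advanced toward the marker
    rad = math.radians(delta_deg)
    return x_prev + math.sin(rad) * d, y_prev + math.cos(rad) * d

# Facing the marker head-on (delta = 0): only the y coordinate changes.
print(update_after_vertical_move(2.0, 5.0, 0.0, 4.0, 3.0))  # -> (2.0, 6.0)
```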
Figures 5A-5B show the case in which a vertical movement of the target (10) has been determined and two or more markers (20, 20', 20'') are detected in the image (50) captured by the stereo camera (12). In this case triangulation can be applied, since more than one marker is available, together with the real distance (d/m) between markers (20, 20', 20''), the turn angle (δ), the aperture angle (2φ) of the camera (12) and the number of pixels (A×B) of the image (50). Knowing the horizontal image coordinates u of the markers (20, 20', 20''), the distance in pixels q between them is calculated, q = uj - ui, which in the real world equals d/m metres, m being the number of markers. Therefore, the real distance in metres Lmarker(n) between the target (10) and one of the markers, marker (20), at the current instant n is:
Lmarker(n) = (A · (d/m)) / (2 · q · tan(φ))
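A sketch of this triangulation relation follows; the function and parameter names are our own, and it assumes q is measured on an image A pixels wide whose horizontal field of view at distance L spans 2·L·tan(φ) metres.

```python
import math

def distance_by_triangulation(d_over_m, q_px, image_width_px, half_aperture_deg):
    """Distance from the pixel separation of two markers at known real spacing.

    At distance L the camera sees a strip 2*L*tan(phi) metres wide mapped onto
    image_width_px pixels, so q_px pixels correspond to d_over_m metres.
    Solving for L gives the expression below.
    """
    phi = math.radians(half_aperture_deg)
    return image_width_px * d_over_m / (2.0 * q_px * math.tan(phi))

# Markers 2 m apart seen 100 px apart on a 640 px wide image, phi = 45 degrees
print(distance_by_triangulation(2.0, 100.0, 640.0, 45.0))  # about 6.4 m
```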
In Figures 4A-4B and 5A-5B, the angle φ shown refers to half the aperture angle (2φ) of the camera (12).
Knowing the distance to the marker and the distance there was at the previous instant n-1, the metres travelled are calculated as the difference between the two. From that value and from the previous position of the user, the coordinates of the target (10) at the previous instant (xn-1, yn-1), its new coordinates (xn, yn) at the current instant can be calculated:
If r(n-1) < r(n): xn = xn-1 + sin(δ) · (Lmarker(n-1) - Lmarker(n)); yn = yn-1 + cos(δ) · (Lmarker(n-1) - Lmarker(n))
If r(n-1) > r(n): xn = xn-1 - sin(δ) · (Lmarker(n) - Lmarker(n-1)); yn = yn-1 - cos(δ) · (Lmarker(n) - Lmarker(n-1))
In this case stereo vision can also be used to obtain the depth to the markers. However, it is necessary to apply stereo correspondences, that is, to relate the markers of the left image to their equivalents in the right image. Once the correspondences are obtained, projective geometry can be applied, as in the case of a single marker, to obtain the real distance to each marker.
Figures 6A-6B show the case in which a horizontal movement of the target (10) has been determined.
Figure 6A refers to the case in which only a single marker (20) is detected in the image (61, 62). An algorithm reminiscent of stereo geometry is applied, but in this case two images of the same instant taken from two different angles are not used; instead, two images of contiguous instants and the same perspective are used: the image captured at the current instant (61) and the one captured at the immediately previous instant (62). Likewise, the horizontal coordinates of the marker at the current instant (un) and those obtained from the previous frame (un-1) are available, as well as the previous distance between the marker and the user (Lmarker) and the focal length (focal_length) of the camera (12), in order to calculate the horizontal displacement (D) made by the target user (10) according to the following expression:
D = Lmarker · (un - un-1) / focal_length
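This pinhole-based displacement estimate can be sketched as follows; the names are our own, with Lmarker the previous marker distance in metres and the image coordinates and focal length in pixels.

```python
def horizontal_displacement(L_marker_prev, u_now, u_prev, focal_length_px):
    """Lateral displacement from the marker's pixel shift between two frames.

    By the pinhole relation, a lateral move D at depth L shifts the marker's
    image by D * focal_length / L pixels; inverting that relation gives D.
    """
    return L_marker_prev * (u_now - u_prev) / focal_length_px

# Marker 3 m away shifting 70 px, with a 700 px focal length -> 0.3 m
print(horizontal_displacement(3.0, 390.0, 320.0, 700.0))
```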
Once the displacement D, in metres, made by the target user (10) is known, its real coordinates can be obtained; they depend on its position at the previous instant n-1 and on the type of displacement, left or right, performed: the components of D, resolved through the turn angle δ, are added to or subtracted from the previous coordinates (xn-1, yn-1) according to the direction of the movement.
Figure 6B refers to the case in which more than one marker (20, 20', 20'') is detected in the image. A technique similar to the triangulation explained for the case of a vertical movement of the user with a plurality of detected markers is employed, but in this case two images (63, 64) captured by the same image sensor consecutively in time are used: the current image (63) and the image captured at the previous instant (64). Knowing the real distance between markers and the pixels there are between them, p pixels at the current instant n and q pixels at the previous instant n-1, the length the user has moved can be extrapolated. For this it is necessary to know, in addition to the distance between markers (d/m), the turn angle (δ) and the image coordinates (un-1) of the markers (20, 20', 20'') in the previous image (64), from which the displacement D is obtained.
As in the case of a single marker, once the displacement D is known, the real coordinates of the target user (10) can be obtained from the previous coordinates (xn-1, yn-1) and the direction, left or right, of the movement.
In this case the previous approach for a single detected marker can also be applied to obtain the displacement (D) made by the target user (10). That is, from the coordinates of the same marker in two contiguous images, disregarding the rest of the detected markers, and with the previous distance between the marker and the user, the displacement D is calculated.

Claims

CLAIMS
1. Method for the spatial localisation of a target (10) using at least one luminous marker (20) identifiable in the reference environment of use, which at a time instant i calculates coordinates (x_i, y_i) of the target (10), characterized in that it comprises:
- capturing, by means of a stereo camera (12), a first image frame at a current time instant and a second image frame at a previous time instant, detecting at least one marker (20) in the first and second image frames;
- obtaining a radius at the current time instant and a radius at the previous time instant of the at least one marker (20) detected in the first image frame and the second image frame;
- obtaining an angle of rotation of the target (10) by means of an angle measuring device (13) at the current time instant and at the previous time instant;
- if the angle of rotation at the current time instant and the angle of rotation at the previous time instant are different, calculating the coordinates (x_n, y_n) of the target (10) at the current instant by equating them to the coordinates (x_n-1, y_n-1) of the target (10) at the previous instant;
- if the first image frame and the second image frame are equal, calculating the coordinates (x_n, y_n) of the target (10) at the current instant by equating them to the coordinates (x_n-1, y_n-1) of the target (10) at the previous instant;
- otherwise, comparing the radii at the current time instant and at the previous time instant of the at least one detected marker (20), and:
- if the radii are equal and more than one marker (20, 20', 20") is detected, obtaining the coordinates (x_n, y_n) of the target (10) at the current instant by triangulation using the first image frame and the second image frame;
- if the radii are different and more than one marker (20, 20', 20") is detected, obtaining the coordinates (x_n, y_n) of the target (10) at the current instant by triangulation using a single image frame, which is the first image frame;
- if the radii are different and a single marker (20) is detected, obtaining the coordinates (x_n, y_n) of the target (10) at the current instant by stereo geometry;
- if the radii are equal and a single marker (20) is detected, obtaining the coordinates (x_n, y_n) of the target (10) at the current instant by calculating image coordinates of the marker (20) at the current time instant in the first image frame and image coordinates of the marker (20) obtained at the previous time instant in the second image frame.
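The branching of claim 1 can be summarised as dispatch logic. The sketch below is illustrative only: the per-branch solvers are hypothetical stand-ins, not the patented computations, and only the case analysis itself is taken from the claim.

```python
def update_position(prev_pos, angle_now, angle_prev, frame_now, frame_prev,
                    radii_now, radii_prev, solvers):
    """Dispatch to a position solver following the case analysis of claim 1.

    `solvers` is a dict of hypothetical callables, one per branch:
    'two_frame_triangulation', 'one_frame_triangulation',
    'stereo_geometry', 'single_marker_tracking'.
    """
    # Rotation-only change, or identical frames: position is unchanged.
    if angle_now != angle_prev or frame_now == frame_prev:
        return prev_pos
    multiple = len(radii_now) > 1
    if radii_now == radii_prev:
        if multiple:  # equal radii, several markers: two-frame triangulation
            return solvers['two_frame_triangulation'](frame_now, frame_prev)
        # equal radii, one marker: track the marker across both frames
        return solvers['single_marker_tracking'](frame_now, frame_prev)
    if multiple:      # different radii, several markers: one-frame triangulation
        return solvers['one_frame_triangulation'](frame_now)
    return solvers['stereo_geometry'](frame_now)  # different radii, one marker
```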
2. Spatial localisation method, according to claim 1, characterized in that it uses a luminous marker (20) comprising a light source (21) and a contrast surface (22).

3. Spatial localisation method, according to claim 2, characterized in that it uses a luminous marker (20) comprising a light source (21) which is an LED.
4. Spatial localisation method, according to claim 1, characterized in that, if the radii are equal and more than one marker (20, 20', 20") is detected, obtaining the coordinates (x_n, y_n) of the target (10) at the current instant comprises:
- for each marker (20, 20', 20"), obtaining in the first image frame horizontal image coordinates u_n at the current time instant, and obtaining in the second image frame horizontal image coordinates u_n-1 at the previous time instant;
- measuring a displacement D by means of the expression:
[Equation: image imgf000021_0001 in the original]
where p is a number of pixels at the current instant n and q is a number of pixels at the previous instant n-1, m is a total number of markers, d/m is a real distance between markers (20, 20', 20") and δ is the obtained angle of rotation of the target (10);
- calculating the coordinates (x_n, y_n) of the target (10) at the current instant, distinguishing the case u_n ≥ u_n-1 and its converse, by means of the equation:
[Equation: image imgf000021_0002 in the original]
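The principle behind claims 4 and 6, recovering range and displacement from the apparent pixel spacing of markers whose real separation d/m is known, can be sketched with a pinhole model. This is an illustrative approximation: the claimed expression, reproduced only as an image in the source, is stated in terms of the aperture angle 2φ and the image resolution rather than a pixel focal length.

```python
def range_from_marker_spacing(pixel_spacing, real_spacing, focal_length_px):
    """Range to a pair of markers whose real separation is known.

    Pinhole approximation: range = focal_length_px * real_spacing / pixel_spacing.
    """
    return focal_length_px * real_spacing / pixel_spacing

def displacement_between_frames(p_pixels, q_pixels, real_spacing, focal_length_px):
    """Displacement D between two consecutive frames, from the change in the
    markers' apparent spacing: p pixels at the current instant n, q pixels at
    the previous instant n-1 (the claim's p and q)."""
    range_now = range_from_marker_spacing(p_pixels, real_spacing, focal_length_px)
    range_prev = range_from_marker_spacing(q_pixels, real_spacing, focal_length_px)
    return abs(range_now - range_prev)
```

For markers 0.5 m apart seen 100 pixels apart by an 800-pixel focal-length camera, the range is 4 m; if the spacing was 80 pixels one frame earlier, the user moved about 1 m toward the markers.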
5. Spatial localisation method, according to claim 1, characterized in that, if the radii are equal and a single marker (20) is detected, obtaining the coordinates (x_n, y_n) of the target (10) at the current instant comprises:
- for the marker (20), obtaining in the first image frame horizontal image coordinates u_n at the current time instant, and obtaining in the second image frame horizontal image coordinates u_n-1 at the previous time instant;
- measuring a displacement D by means of the expression:
[Equation: image imgf000022_0001 in the original]
where the camera (12) has a focal length focal_length and Lmarker_n-1 is a distance between the marker (20) and the target (10) measured at the previous time instant;
- calculating the coordinates (x_n, y_n) of the target (10) at the current instant by means of the equation:
[Equation: image imgf000022_0002 in the original]
6. Spatial localisation method, according to claim 1, characterized in that, if the radii, which are the radius at the current instant r(n) and the radius at the previous instant r(n-1), are different and more than one marker (20, 20', 20") is detected, obtaining the coordinates (x_n, y_n) of the target (10) at the current instant comprises:
- obtaining in the first image frame first horizontal image coordinates u_i of a first marker (20, 20', 20") and second horizontal image coordinates u_j of a second marker (20');
- measuring at the current time instant a distance Lmarker_n between the target (10) and the first marker (20) by means of the expression:
[Equation: image imgf000022_0003 in the original]
where the camera (12) has an aperture angle 2φ, A×B is a number of pixels of the two-dimensional image at the current instant, m is a total number of markers, d/m is a real distance between the markers (20, 20') and δ is the obtained angle of rotation of the target (10);
- obtaining a distance Lmarker_n-1, measured at the previous time instant, between the target (10) and the first marker (20);
- calculating the coordinates (x_n, y_n) of the target (10) at the current instant by means of the equation:
If r(n-1) < r(n): x_n = x_n-1 − sin(δ)·D ; y_n = y_n-1 − cos(δ)·D
If r(n-1) > r(n): x_n = x_n-1 + sin(δ)·D ; y_n = y_n-1 + cos(δ)·D
where D = |Lmarker_n − Lmarker_n-1| is the variation of the distance to the first marker.
7. Método de localización espacial, de acuerdo con la reivindicación 1 , caracterizado por que, si ios radios, que son el radio en ei instante actual r{n) y el radio en el instante anterior r(n-1 ), son distintos y hay un único marcador (20) detectado, obtener las coordenadas (x„, yn) del objetivo (10) en el instante actual comprende: 7. Spatial location method according to claim 1, characterized in that, if the radios, which are the radius at the current instant r {n) and the radius at the previous instant r (n-1), are different and there is a single marker (20) detected, obtaining the coordinates (x „, y n ) of the target (10) at the current time comprises:
obtener unas coordenadas del marcador (20) rectificadas y SJR respectivamente en una componente de imagen izquierda (41 ) y una componente de imagen derecha (42) captadas por la cámara estéreo ( 12), una disparidad binocular obtain rectified marker coordinates (20) and SJR respectively in a left image component (41) and a right image component (42) captured by the stereo camera (12), a binocular disparity
- medir en eí instante de tiempo actual una distancia Lmarcador,, entre ei objetivo ( 10) y ei marcador (20) mediante la expre
Figure imgf000023_0001
- measure a marker distance at the current time, between the target (10) and the marker (20) by means of the expiration
Figure imgf000023_0001
donde la cámara estéreo (12) tiene una disparidad binocular igual a ui. - u , una distancia de referencia B y una distancia focal f;  where the stereo camera (12) has a binocular disparity equal to ui. - u, a reference distance B and a focal distance f;
■■ obtener una distancia
Figure imgf000023_0002
medida en ei instante de tiempo anterior entre ei objetivo ( 10) y el marcador (20);
■■ get a distance
Figure imgf000023_0002
measured at the previous time between the objective (10) and the marker (20);
- calcular las coordenadas (x,-,, y(1) del objetivo (10) en el instante actual mediante la ecuación, donde 5 es el ángulo de giro dei objetivo (10) obtenido en el instante actual: - calculate the coordinates (x, - ,, y (1 ) of the target (10) at the current time using the equation, where 5 is the angle of rotation of the target (10) obtained at the current time:
Figure imgf000024_0001
Figure imgf000024_0001
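Claim 7's distance measurement is the textbook stereo disparity-to-depth relation, L = B·f / (u_L − u_R), using the reference distance B and focal length f the claim defines. A sketch, with function and parameter names chosen for illustration:

```python
def stereo_marker_distance(u_left, u_right, baseline, focal_length_px):
    """Distance from the stereo camera to the marker via binocular disparity.

    Standard stereo geometry: L = baseline * focal_length / disparity, with
    disparity = u_left - u_right in rectified image coordinates.
    """
    disparity = u_left - u_right
    if disparity <= 0:
        raise ValueError("marker must have positive disparity")
    return baseline * focal_length_px / disparity
```

A 20-pixel disparity with a 0.1 m baseline and an 800-pixel focal length places the marker 4 m away; halving the disparity doubles the estimated distance.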
8. System for the spatial localisation of a target (10) in a three-dimensional environment (11), comprising at least one luminous reference marker (20) and a digital signal processor (14) for calculating coordinates (x_i, y_i) of the target (10) at a time instant i, characterized in that it comprises:
- a stereo camera (12) for capturing a first image frame at a current time instant and a second image frame at a previous time instant;
- an angle measuring device (13) for obtaining an angle of rotation of the target (10) at the current time instant and at the previous time instant;
- the signal processor (14), with access to a memory (15) that stores a radius at the current time instant and a radius at the previous time instant of the at least one marker (20) detected in the first image frame and the second image frame, the signal processor (14) being configured to:
- if the angle of rotation at the current time instant and the angle of rotation at the previous time instant are different, calculate the coordinates (x_n, y_n) of the target (10) at the current instant by equating them to the coordinates (x_n-1, y_n-1) of the target (10) at the previous instant;
- if the first image frame and the second image frame are equal, calculate the coordinates (x_n, y_n) of the target (10) at the current instant by equating them to the coordinates (x_n-1, y_n-1) of the target (10) at the previous instant;
- otherwise, compare the radii at the current time instant and at the previous time instant of the at least one detected marker (20), and:
- if the radii are equal and more than one marker (20, 20', 20") is detected, calculate the coordinates (x_n, y_n) of the target (10) at the current instant by triangulation using the first image frame and the second image frame;
- if the radii are different and more than one marker (20, 20', 20") is detected, calculate the coordinates (x_n, y_n) of the target (10) at the current instant by triangulation using a single image frame, which is the first image frame;
- if the radii are different and a single marker (20) is detected, calculate the coordinates (x_n, y_n) of the target (10) at the current instant by stereo geometry;
- if the radii are equal and a single marker (20) is detected, calculate the coordinates (x_n, y_n) of the target (10) at the current instant using image coordinates of the marker (20) at the current time instant in the first image frame and image coordinates of the marker (20) obtained at the previous time instant in the second image frame.
9. Spatial localisation system according to claim 8, characterized in that the luminous marker (20) comprises a light source (21) which is an LED.

10. Spatial localisation system according to any of claims 8-9, characterized in that the luminous marker (20) includes a contrast surface (22).

11. Spatial localisation system according to any of claims 8-10, characterized in that the environment (11) is an indoor environment.

12. Spatial localisation system according to any of claims 8-10, characterized in that the environment (11) is an outdoor environment.
PCT/ES2015/000182 2014-12-23 2015-12-16 Method and system for spatial localisation using luminous markers for any environment WO2016102721A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
ESP201500011 2014-12-23
ES201500011A ES2543038B2 (en) 2014-12-23 2014-12-23 Spatial location method and system using light markers for any environment

Publications (1)

Publication Number Publication Date
WO2016102721A1 true WO2016102721A1 (en) 2016-06-30

Family

ID=53784084

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/ES2015/000182 WO2016102721A1 (en) 2014-12-23 2015-12-16 Method and system for spatial localisation using luminous markers for any environment

Country Status (2)

Country Link
ES (1) ES2543038B2 (en)
WO (1) WO2016102721A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020089675A1 (en) 2018-10-30 2020-05-07 Общество С Ограниченной Ответственностью "Альт" Method and system for the inside-out optical tracking of a movable object

Citations (4)

Publication number Priority date Publication date Assignee Title
EP1501051A2 (en) * 2003-07-08 2005-01-26 Canon Kabushiki Kaisha Position and orientation detection method and apparatus
US7231063B2 (en) * 2002-08-09 2007-06-12 Intersense, Inc. Fiducial detection system
US20100045701A1 (en) * 2008-08-22 2010-02-25 Cybernet Systems Corporation Automatic mapping of augmented reality fiducials
US8761439B1 (en) * 2011-08-24 2014-06-24 Sri International Method and apparatus for generating three-dimensional pose using monocular visual sensor and inertial measurement unit


Non-Patent Citations (3)

Title
ABABSA F ET AL.: "A robust circular fiducial detection technique and real-time 3D camera tracking.", JOURNAL OF MULTIMEDIA, vol. 3, no. 4, 30 September 2008 (2008-09-30), pages 34 - 41, ISSN: 1796-2048 *
VOGT S ET AL.: "Single camera tracking of marker clusters: multiparameter cluster optimization and experimental verification.", PROCEEDINGS OF THE IEEE AND ACM INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY 2002 IEEE COMPUT., 30 November 2001 (2001-11-30), pages 127 - 136, ISBN: 0-7695-1781-1 *
YOU S ET AL.: "Fusion of vision and gyro tracking for robust augmented reality registration.", PROCEEDINGS IEEE 2001 VIRTUAL REALITY. (VR), 1 January 2001 (2001-01-01), YOKOHAMA, JAPAN, pages 71 - 78, ISBN: 978-0-7695-0948-8 *


Also Published As

Publication number Publication date
ES2543038A1 (en) 2015-08-13
ES2543038B2 (en) 2015-11-26


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15872003

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15872003

Country of ref document: EP

Kind code of ref document: A1