EP3485463A1 - Method and system for the real-time localization and reconstruction of the posture of a moving object using onboard sensors - Google Patents
Method and system for the real-time localization and reconstruction of the posture of a moving object using onboard sensors
- Publication number
- EP3485463A1 (application EP17748526.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- optical sensor
- sensors
- module
- relative motion
- relative
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T13/00—Animation
- G06T13/20—3D [Three Dimensional] animation
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1113—Local tracking of patients, e.g. in a hospital or private home
- A61B5/1114—Tracking parts of the body
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1126—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
- A61B5/1127—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using markers
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/20—Input arrangements for video game devices
- A63F13/21—Input arrangements for video game devices characterised by their sensors, purposes or types
- A63F13/212—Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
-
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/60—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Definitions
- the field of the invention is that of capturing the movement of an object and transferring this movement in real time to a (poly-)articulated mass structure modeling the object.
- the invention finds application in the capture of the movement of a human body via sensors placed on a person, and in the representation of this movement through the animation of an avatar in a numerical simulation.
- the main needs of motion capture are, on the one hand, to locate the person in the environment and, on the other hand, to estimate the person's complete posture.
- in known approaches, cameras are for example placed in the environment and observe the person (or sensors carried by the person) to detect his or her movements.
- with such approaches, the data stream to be transferred is very large.
- the object of the invention is to propose a technique for locating and reconstructing the posture of a moving object in real time using onboard sensors only, which is at least partially free of the above disadvantages.
- the invention proposes a system for reproducing the movement of an object by an articulated structure comprising a plurality of segments, which comprises:
- a unit for processing the measurements made by the sensors fixed on the object, comprising a module for determining a command to be applied to the articulated structure so as to locate it in a reference frame while making it adopt a posture reproducing that of the object; this module outputs the absolute location of each segment of the structure in the reference frame and uses the absolute location, in the reference frame, of each of the sensors attached to the object.
- the sensors fixed in use on the object comprise an optical sensor and relative motion sensors, and the processing unit further comprises:
- an image processing module configured to determine an absolute location of the optical sensor in the reference frame from a sequence of images acquired by the optical sensor;
- a module for determining the absolute location of each of the relative motion sensors in the reference frame, from the absolute location of the optical sensor and the relative location of each of the relative motion sensors (a schematic sketch of this processing chain is given below).
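- by way of illustration only, the overall processing chain can be sketched as follows; the function names and interfaces are hypothetical and not taken from the patent:

```python
import numpy as np

def process_time_step(images, sensor_data, slam, locate_relative, determine_command):
    """One time step of the processing unit (hypothetical interfaces):
    module 22 localizes the optical sensor, module 24 localizes the
    relative motion sensors with respect to it, module 23 expresses
    everything in the reference frame, and module 21 derives the command."""
    R0, p0 = slam(images)                        # absolute pose of the optical sensor C0
    rel_poses = locate_relative(sensor_data)     # [(Ri, pi), ...] in the current frame
    abs_poses = [(R0 @ Ri, R0 @ pi + p0)         # change of frame for each sensor
                 for Ri, pi in rel_poses]
    return determine_command((R0, p0), abs_poses)
```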
- FIG. 1 represents an example of a poly-articulated mass structure used as a physical model (in the sense of the laws of physics) of a human body;
- FIG. 2 is a diagram of a system according to the invention for tracking the movement of an object on which sensors are fixed;
- FIG. 3 illustrates the calculation of the location of the optical sensor with respect to the reference frame;
- FIG. 4 represents the acquisition of the movement data of the limbs in a current coordinate system associated with the optical sensor;
- FIG. 5 is a diagram showing the steps of a method according to the invention.
- the invention relates to the control of an articulated mass structure from data representative of the movement of an object.
- motion sensors are attached to the object, and the articulated mass structure is servo-controlled so as to reproduce the movements of the object.
- the object is typically a human body or the body of an animal, or even a single part of such a body.
- the invention is however not limited to these examples, and thus extends to monitoring the movement of any moving object that can be represented by an articulated structure.
- the articulated mass structure is composed of segments connected by at least one joint. It is used as a physical model of the object whose movement is tracked. It is typically a poly-articulated structure.
- the articulated mass structure can be a numerical model of the object.
- the objective can then be the representation of the movements of the object in a numerical simulation, by means of the animation of the mass structure articulated in the numerical simulation according to a command determined by the invention.
- FIG. 1 gives an example of a poly-articulated mass structure that can be used as a physical model of the human body in an implementation of the invention.
- the poly-articulated mass structure is composed of segments connected by joints.
- a segment refers to a rigid or assumed-rigid object defined by a geometry (that is, a well-defined volumetric shape), a mass and an inertia.
- an articulation denotes a connection between two segments. This connection defines the relative configurations that a segment can adopt with respect to the segment to which it is connected.
- an articulation is defined by one or more representative degrees of freedom, characterized in particular by centers, axes and limit stops.
- the poly-articulated structure 10 is composed of twenty-one segments connected by twenty joints.
- the hand is modeled by a first segment 13 connected by an articulation 16 to a second segment 12 corresponding to the forearm.
- the articulation 16 has for example three degrees of freedom.
- the segment 12 corresponding to the forearm is connected by an articulation 15 with two degrees of freedom to the segment 11 corresponding to the arm.
- the segment 11 is connected to the clavicle by an articulation 14 with three degrees of freedom. A schematic data-structure sketch of such segments and articulations is given below.
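- purely as an illustration of the notions of segment and articulation above (the names and fields below are assumptions, not taken from the patent), such a structure can be represented by simple data structures:

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class Segment:
    """A rigid body of the poly-articulated structure (e.g. forearm)."""
    name: str
    mass: float                 # kg
    inertia: np.ndarray         # 3x3 inertia tensor in the segment frame
    geometry: object = None     # reference to a collision volume

@dataclass
class Joint:
    """A connection between two segments, with 1 to 3 degrees of freedom."""
    parent: str                 # e.g. "forearm" (segment 12)
    child: str                  # e.g. "hand" (segment 13)
    dofs: int                   # e.g. 3 for the articulation 16
    limits: tuple = (-np.pi, np.pi)   # angular stops

# e.g. the hand/forearm connection of the example above:
wrist = Joint(parent="forearm", child="hand", dofs=3)
```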
- This definition of the ground includes the geometry of the ground (reliefs, stairs, etc.) and a reference frame attached to the ground. In particular, it makes it possible to detect potential collisions between the articulated structure and the environment.
- the invention provides a system for reproducing the movement of an object by an articulated structure.
- This system exploits the data from a set of sensors C0, C1-C4 fixed on the object, and does not require instrumentation of the environment.
- the sensors attached to the object include a central element having an optical sensor, and relative motion sensors.
- the optical sensor and the module 22 described below make it possible to calculate the position/orientation of the object in a reference frame linked to the environment.
- the movements of the object, typically the movements of the limbs of a person, are obtained by means of the relative motion sensors and the module 24 described below, which provide a location, in position and orientation, in a frame relative to the central element.
- the relative motion sensors are four in number, with a relative motion sensor C1, C2 on each hand of the person and a relative motion sensor C3, C4 on each foot of the person.
- the optical sensor C0 is placed on the head of the person. This number of four relative motion sensors is given by way of example only, the invention extending to any number of relative motion sensors.
- the optical sensor C0 provides a sequence of images D0, while the relative motion sensors provide motion information Di in a frame relative to the optical sensor, i.e. a current frame corresponding to the current position of the optical sensor in the environment.
- This unit 20 receives the measurements made by the sensors fixed on the object and is configured, by means of the various modules described below, to compute at each time step the configuration to be given to the poly-articulated structure to reproduce the movements of the object.
- the processing unit 20 is in particular provided with a module 21 for determining a command to be applied to the articulated structure so that it reproduces the movement of the object.
- This command more precisely locates the articulated structure in a reference frame ℜa, for example a frame attached to the ground, by making it adopt a posture reproducing that of the object.
- the module 21 for determining the command to be applied to the articulated structure uses the absolute location of each of the sensors attached to the object in the reference frame. The following notations are adopted:
- This set constitutes the outputs of the module 21 for determining the command to be applied to the articulated structure, namely the location and the complete posture of the articulated structure in the reference frame ℜa.
- {R0(t), p0(t)} denotes the set of orientations and positions of the current frame ℜ0, corresponding to the current position of the central element in the environment at times t, expressed in the reference frame ℜa.
- the module 21 for determining the command to be applied to the articulated structure takes into account the locations, in the reference frame, of the sensors C0, C1-Cn. It is also able, in the context of a physical simulation of an articulated structure, to take into account the effect of gravity, to detect collisions between the geometries of the segments of the articulated structure and the geometry of the environment, and to resolve the contacts by generating forces opposing penetrations between the segments of the articulated structure and the environment.
- the collisions between the geometries of the segments of the articulated structure and the geometry of the environment are detected at each time step for a given configuration of the structure.
- This detection can be based on the detection of penetration between the geometries or on the detection of proximity according to a given threshold.
- the collision information obtained is the point and the normal of each contact. This information is used to generate forces preventing penetration between the segment geometries and the environment.
- a model of friction additionally makes it possible to simulate the adhesion of a segment on the environment and to constrain sliding. The resulting forces are called contact forces. A purely illustrative sketch of such a contact model is given below.
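- a minimal sketch of a penalty-based contact model with Coulomb friction, as one common way to generate such contact forces (illustrative gains and coefficients; the patent does not specify this implementation):

```python
import numpy as np

def contact_force(depth, normal, v_rel, k=2e4, c=50.0, mu=0.8):
    """Penalty-based contact force (illustrative gains): a spring-damper
    force along the contact normal opposing penetration, plus Coulomb
    friction opposing tangential sliding of the segment on the environment."""
    if depth <= 0.0:
        return np.zeros(3)
    f_n = max(k * depth - c * (normal @ v_rel), 0.0)   # normal force magnitude
    v_t = v_rel - (normal @ v_rel) * normal            # tangential velocity
    speed = np.linalg.norm(v_t)
    f_t = -mu * f_n * v_t / speed if speed > 1e-9 else np.zeros(3)
    return f_n * normal + f_t
```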
- the geometry of the environment can be obtained either from a priori knowledge of the environment (for example, assuming that the ground is flat), or from the mapping of the environment provided by the SLAM algorithm.
- the module 21 for determining the control to be applied to the articulated structure can thus exploit a real-time physical simulation engine, such as the XDE engine (for the "eXtended Dynamic Engine") of the Applicant, which makes it possible to perform simulations in a virtual environment by assigning physical constraints to an articulated structure, detecting collisions and managing contacts.
- the central element and the relative motion sensors C1-Cn each comprise means for measuring the relative distances between them, for example an Ultra Wide Band (UWB) module.
- the module 21 for determining the command to be applied to the articulated structure can implement an estimation algorithm (exploiting for example a Kalman filter).
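- purely as a sketch of the kind of estimation loop mentioned (a generic linear Kalman filter; the patent's filter design, states and models are not specified):

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    x, P: state estimate and covariance; z: measurement;
    F, H, Q, R: transition, observation, process and measurement noise models."""
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = z - H @ x                       # innovation
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```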
- an analytical method, for example of the MDS ("Multidimensional Scaling") type, makes it possible to recover the initial posture by merging the distance measurements provided by the central element and the relative motion sensors C1-Cn with a priori knowledge of a reference posture serving as an initialization posture.
- the estimation algorithm merges the distance data provided by the central element and the relative motion sensors to determine the position of the joints as a function of time.
- the estimation can also exploit biomechanical constraints: a priori knowledge of the model of the human body, and the number of segments and their sizes, which can be measured during the initialization phase. An illustrative sketch of the MDS step mentioned above is given below.
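- as an illustration of the MDS idea, classical multidimensional scaling recovers point coordinates, up to a rigid transform, from a matrix of pairwise distances (a standard reconstruction, not code from the patent):

```python
import numpy as np

def classical_mds(D, dim=3):
    """Recover point coordinates (up to rotation/translation/reflection)
    from a matrix D of pairwise Euclidean distances."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (D ** 2) @ J                  # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)
    idx = np.argsort(eigvals)[::-1][:dim]        # keep the dim largest eigenvalues
    L = np.sqrt(np.maximum(eigvals[idx], 0.0))
    return eigvecs[:, idx] * L                   # n x dim coordinates

# The reference posture then fixes the remaining rigid-transform ambiguity.
```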
- the command determined by the module 21 can be used to perform a motion analysis. Indeed, from the knowledge of the type of movement, specific information can be extracted for particular application needs. This command can also be used to perform a complete reconstruction of the movement for display on a screen 30, for example that of a computer or a virtual reality headset. The movement can be modified or amplified to achieve the desired effects.
- the processing unit 20 comprises an image processing module 22 and a module 24 for determining the relative location of the relative motion sensors.
- the image processing module 22 is configured to determine, in real time, an absolute location of the optical sensor C0 in the reference frame ℜa from the sequence of images D0 acquired by the optical sensor.
- this module thus provides the set {R0(t), p0(t)} at times t. It can notably implement a visual SLAM algorithm (Simultaneous Localization And Mapping) or an SfM algorithm (Structure from Motion).
- Such an algorithm makes it possible to iteratively reconstruct a map of 3D primitives (generally points) and to locate the optical sensor C0 in this map from the video stream D0.
- the reference frame is then attached either to the first position of the sensor or to the reconstructed map.
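- for illustration, the localization step of such an algorithm is commonly solved as a Perspective-n-Point problem; the sketch below uses OpenCV's PnP solver and assumes 3D map points already matched to their 2D projections in the current image (the patent does not mandate this implementation):

```python
import cv2
import numpy as np

def locate_camera(points_3d, points_2d, K):
    """Estimate the camera pose in the map frame from 2D-3D matches.
    points_3d: Nx3 map points, points_2d: Nx2 pixel coordinates,
    K: 3x3 intrinsic matrix (lens distortion ignored here for brevity)."""
    ok, rvec, tvec, inliers = cv2.solvePnPRansac(
        points_3d.astype(np.float32), points_2d.astype(np.float32), K, None)
    R_cam, _ = cv2.Rodrigues(rvec)    # map -> camera rotation
    R0 = R_cam.T                      # camera orientation in the map frame
    p0 = -R_cam.T @ tvec.ravel()      # camera position in the map frame
    return R0, p0
```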
- FIG. 3 shows the calculation of the orientation and the position of the central element with respect to the reference frame, with R0(t) and p0(t) the orientation and the position of the current frame ℜ0 relative to the reference frame ℜa at successive instants t, t+1 and t+2.
- the module 24 for determining the relative location of the relative motion sensors is in turn configured to determine, in real time, the relative location of each of the relative motion sensors C1-Cn from the measurements made by the sensors fixed on the object.
- FIG. 4 shows the acquisition of the movement data of the limbs in the current frame ℜ0 attached to the central element, with Ri(t) and pi(t) the orientation and the position of each relative motion sensor i with respect to the current frame ℜ0 at successive instants t, t+1 and t+2.
- the central element provided with the optical sensor may be associated with means for completing the measurements from the sensors fixed on the object. This is particularly the case when these measurements are insufficient to provide position information of the relative motion sensors with respect to the optical sensor, for example because they only make it possible to determine a relative distance.
- the relative motion sensors each comprise a UWB transmitter ("Ultra Wide Band") to enable the calculation of a relative distance from a receiver.
- the central element is provided not with a single UWB receiver but with three UWB receivers positioned in a predetermined configuration, each of which makes it possible to recover the relative distance of each UWB transmitter.
- the relative location determination module 24 can then implement a triangulation to determine the exact position of each motion sensor with respect to the optical sensor.
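- a minimal trilateration sketch (illustrative only; the receiver positions q1-q3 in the current frame are assumed known from the predetermined configuration):

```python
import numpy as np

def trilaterate(q1, q2, q3, r1, r2, r3):
    """Three-sphere intersection: position of a UWB transmitter from its
    distances r1-r3 to three non-collinear receivers q1-q3. Returns the two
    mirror-image solutions; the ambiguity is resolved by a priori knowledge
    (e.g. the limbs are below the head-mounted central element)."""
    ex = (q2 - q1) / np.linalg.norm(q2 - q1)
    i = ex @ (q3 - q1)
    ey = q3 - q1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(q2 - q1)
    j = ey @ (q3 - q1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))
    base = q1 + x * ex + y * ey
    return base + z * ez, base - z * ez
```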
- the processing unit further comprises a module 23 for determining the absolute location of each of the relative motion sensors in the reference frame.
- This module 23 exploits the absolute location {R0(t), p0(t)} of the optical sensor and the relative location {Ri(t), pi(t)} of each of the relative motion sensors with respect to the optical sensor to perform a change of frame and express each of the positions and orientations of the relative motion sensors, calculated in the current frame at each instant t, in the reference frame.
- This module thus provides the absolute locations of each of the relative motion sensors in the reference frame ℜa.
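- a minimal sketch of this change of frame, assuming rotation matrices R and position vectors p as above:

```python
import numpy as np

def to_reference_frame(R0, p0, Ri_rel, pi_rel):
    """Change of frame performed by module 23 (illustrative sketch):
    compose the absolute pose (R0, p0) of the optical sensor with the
    pose (Ri_rel, pi_rel) of sensor i expressed in the current frame."""
    Ri_abs = R0 @ Ri_rel          # orientation of sensor i in the reference frame
    pi_abs = R0 @ pi_rel + p0     # position of sensor i in the reference frame
    return Ri_abs, pi_abs
```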
- an additional synchronization step is performed by the module 23 to obtain data synchronized at the output.
- the module 23 can use the frequency FC as output frequency and interpolate the missing locations of the optical sensor.
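- a possible synchronization sketch (an assumption, not the patent's method): orientations can be interpolated by SLERP and positions linearly, resampling the optical sensor's poses at the relative sensors' timestamps:

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def resample_optical_poses(t_cam, R_cam, p_cam, t_out):
    """Interpolate the optical sensor's poses (available at the camera
    frame rate) at the timestamps t_out of the relative motion sensors:
    SLERP for orientations, linear interpolation for positions."""
    slerp = Slerp(t_cam, Rotation.from_matrix(R_cam))   # R_cam: (N, 3, 3)
    R_out = slerp(t_out).as_matrix()
    p_out = np.column_stack([np.interp(t_out, t_cam, p_cam[:, k])
                             for k in range(3)])
    return R_out, p_out
```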
- An exemplary embodiment of the invention uses relative motion sensors each composed of an inertial measurement unit, which provides the orientation of the sensor, and a UWB (Ultra Wide Band) transmitter, which makes it possible to compute a relative distance to receivers.
- the person is equipped with at least four relative motion sensors rigidly fixed on the body, including a sensor on each hand and a sensor on each foot.
- the central element comprises a camera, for example a 360 ° camera, as well as three UWB receivers positioned in a predetermined configuration, each of which makes it possible to recover the relative distance of each UWB transmitter.
- the optical sensor comprises at least two cameras rigidly connected and calibrated with respect to each other, with overlapping fields of view. This is, for example, a stereoscopic sensor. Alternatively, an RGB-D sensor, which provides an RGB image and a depth map of the scene, can be used. The use of such optical sensors makes it possible to obtain a localization at real scale and to minimize the drift inherent in any monocular SLAM or SfM solution.
- with a monocular sensor, the reconstruction and the localization are obtained only up to an arbitrary scale factor. This solution may therefore not be satisfactory for merging the relative movements of the limbs with the displacement of the person.
- with a stereoscopic sensor, the scale is provided by knowledge of the rigid transformation connecting the two cameras. This makes it possible to obtain a reconstruction and a localization at real scale.
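- as a reminder of why a known baseline fixes the scale (standard rectified-stereo geometry, not specific to the patent): depth follows from disparity as Z = f · b / d, with f the focal length in pixels and b the baseline in metres:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Standard rectified-stereo relation: Z = f * b / d.
    The metric baseline b is what anchors the reconstruction to real scale."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. f = 700 px, b = 0.12 m, d = 35 px  ->  Z = 2.4 m
```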
- alternatively, the optical sensor is composed of a single camera and, to obtain a reconstruction at real scale, it is associated with:
- the invention is not limited to the system as previously described, but also extends to the method of reproducing the movement of an object by an articulated structure implemented by such a system.
- the invention also relates to a computer program product comprising program code instructions for executing the method according to the invention when said program is executed on a computer.
- this method comprises a prior "INST" step of placing the relative motion sensors on the object.
- when the object is a human body, relative motion sensors are typically placed on each hand and on each foot of the person.
- the optical sensor is also installed, for example on the person's head.
- the method includes the acquisition "ACQ" of the measurements made by the various sensors fixed on the object (the video stream delivered by the optical sensor and the measurements of the relative motion sensors).
- the image sequence acquired by the optical sensor is subjected to "SLAM" processing to determine the absolute location of the optical sensor in the reference frame.
- the measurements made by the sensors fixed on the object are used during a "LOC-R" step to determine the relative locations of each of the relative motion sensors relative to the optical sensor.
- during a "LOC-A" step, the absolute location of each of the relative motion sensors in the reference frame is determined from the absolute location of the optical sensor and the relative location of each of the relative motion sensors. If necessary, a resynchronization of the outputs is performed during this step.
- a "DET-CDE" step determines the command to be applied to the articulated structure so as to locate it in the reference frame by making it adopt a posture reproducing that of the object.
- This command can be used to perform a complete reconstruction of the movement for display on a screen, for example that of a computer or a virtual reality headset.
- the method then comprises an animation "DISP" step, according to the determined command, of the articulated structure in a 3D digital simulation.
- the invention allows the capture of movement in real time, without instrumenting the environment, by locating the person absolutely in his or her environment while estimating his or her complete posture.
- the advantages of the invention over the solution proposed in the article by T. Shiratori et al. are the following.
- the solution is less expensive in computing time. It uses a single optical sensor and therefore offers a cost reduction.
- the data stream to be transmitted is lighter, and the use of onboard relative sensors (compared to cameras on the limbs) is more robust to abrupt limb movements and occlusions.
- the drift of the system is reduced by the choice of the optical sensor placed on the central element.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Dentistry (AREA)
- Surgery (AREA)
- Medical Informatics (AREA)
- Biomedical Technology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Molecular Biology (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physiology (AREA)
- Computer Graphics (AREA)
- Cardiology (AREA)
- Software Systems (AREA)
- Geometry (AREA)
- Psychiatry (AREA)
- Social Psychology (AREA)
- Length Measuring Devices By Optical Means (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1656738A FR3054061B1 (fr) | 2016-07-13 | 2016-07-13 | Procede et systeme pour la localisation et la reconstruction en temps reel de la posture d'un objet mouvant a l'aide de capteurs embarques |
PCT/FR2017/051878 WO2018011498A1 (fr) | 2016-07-13 | 2017-07-10 | Procédé et système pour la localisation et la reconstruction en temps réel de la posture d'un objet mouvant à l'aide de capteurs embarqués |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3485463A1 (fr) | 2019-05-22 |
Family
ID=56842946
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17748526.5A Withdrawn EP3485463A1 (fr) | 2016-07-13 | 2017-07-10 | Procédé et système pour la localisation et la reconstruction en temps réel de la posture d'un objet mouvant à l'aide de capteurs embarqués |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3485463A1 (fr) |
FR (1) | FR3054061B1 (fr) |
WO (1) | WO2018011498A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114485620B (zh) * | 2022-01-29 | 2023-07-28 | 中国科学院国家空间科学中心 | 融合轨道动力学的小行星探测器自主视觉定位系统及方法 |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8203487B2 (en) * | 2009-08-03 | 2012-06-19 | Xsens Holding, B.V. | Tightly coupled UWB/IMU pose estimation system and method |
US20110181601A1 (en) * | 2010-01-22 | 2011-07-28 | Sony Computer Entertainment America Inc. | Capturing views and movements of actors performing within generated scenes |
US8786680B2 (en) * | 2011-06-21 | 2014-07-22 | Disney Enterprises, Inc. | Motion capture from body mounted cameras |
- 2016-07-13: FR application FR1656738A, patent FR3054061B1 (fr), not_active Expired - Fee Related
- 2017-07-10: EP application EP17748526.5A, patent EP3485463A1 (fr), not_active Withdrawn
- 2017-07-10: WO application PCT/FR2017/051878, publication WO2018011498A1 (fr), unknown
Also Published As
Publication number | Publication date |
---|---|
FR3054061A1 (fr) | 2018-01-19 |
WO2018011498A1 (fr) | 2018-01-18 |
FR3054061B1 (fr) | 2018-08-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP1984696B1 (fr) | Dispositif de capture de mouvement et procede associe | |
US9934587B2 (en) | Deep image localization | |
EP2715662B1 (fr) | Procede de localisation d'une camera et de reconstruction 3d dans un environnement partiellement connu | |
EP2208355B1 (fr) | Procédé de synchronisation de flux vidéo | |
EP2572319B1 (fr) | Procede et systeme pour fusionner des donnees issues de capteurs d'images et de capteurs de mouvement ou de position | |
EP1990138B1 (fr) | Procédé de traitement pour la capture de mouvement d'une structure articulée | |
CN109084746A (zh) | 用于具有辅助传感器的自主平台引导系统的单目模式 | |
JP7448485B2 (ja) | ポイントクラウドの着色において使用される方法及びシステム | |
US20210080259A1 (en) | System and method for enhancing non-inertial tracking system with inertial constraints | |
WO2013176829A1 (fr) | Vidéo augmentée enregistrée spatialement | |
CN106066987A (zh) | Tof成像中的参数在线校准和补偿 | |
EP2851868A1 (fr) | Reconstruction 3D | |
FR3041804A1 (fr) | Systeme de simulation tridimensionnelle virtuelle propre a engendrer un environnement virtuel reunissant une pluralite d'utilisateurs et procede associe | |
EP3126864B1 (fr) | Procédé de géo-localisation de l'environnement d'un porteur | |
WO2005010820A9 (fr) | Procede et dispositif automatise de perception avec determination et caracterisation de bords et de frontieres d'objets d'un espace, construction de contours et applications | |
WO2018011498A1 (fr) | Procédé et système pour la localisation et la reconstruction en temps réel de la posture d'un objet mouvant à l'aide de capteurs embarqués | |
WO2006117374A2 (fr) | Procédé de reconstruction tridimensionnelle d'un membre ou d'un ensemble de membres articulés | |
KR102225321B1 (ko) | 복수 영상 센서로부터 취득한 영상 정보와 위치 정보 간 연계를 통한 도로 공간 정보 구축을 위한 시스템 및 방법 | |
Morimoto et al. | 3D Pose Estimation Using Multiple Asynchronous Cameras | |
EP1714112A1 (fr) | Procede de capture du mouvement d'un solide, utilisant une mesure absolue associee a une mesure par double integration | |
US11605206B2 (en) | Method and apparatus with human body estimation | |
Soulier et al. | Real-time estimation of illumination direction for augmented reality with low-cost sensors | |
FR3054358A1 (fr) | Procede et systeme de reconstruction de posture par suivi spatial du haut d'un corps et suivi du bas du corps au moyen d'un tapis de detection | |
CN118687563A (zh) | 定位数据获取方法、装置、存储介质与电子设备 | |
FR3001072A1 (fr) | Procede et systeme de modelisation 3d absolue en tout ou en partie d'un vehicule passant devant une camera. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20190114 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| GRAP | Despatch of communication of intention to grant a patent | ORIGINAL CODE: EPIDOSNIGR1 |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: GRANT OF PATENT IS INTENDED |
| INTG | Intention to grant announced | Effective date: 20200414 |
| STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20200825 |