CN110782492B - Pose tracking method and device - Google Patents

Pose tracking method and device

Info

Publication number
CN110782492B
CN110782492B (application CN201910950626.1A)
Authority
CN
China
Prior art keywords
pose
freedom
degree
tracked object
pixels
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910950626.1A
Other languages
Chinese (zh)
Other versions
CN110782492A (en)
Inventor
唐创奇
李卓
李宇光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung China Semiconductor Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Samsung China Semiconductor Co Ltd
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Samsung China Semiconductor Co Ltd, Samsung Electronics Co Ltd filed Critical Samsung China Semiconductor Co Ltd
Priority to CN201910950626.1A priority Critical patent/CN110782492B/en
Publication of CN110782492A publication Critical patent/CN110782492A/en
Priority to KR1020200114552A priority patent/KR20210042011A/en
Priority to US17/063,909 priority patent/US11610330B2/en
Application granted granted Critical
Publication of CN110782492B publication Critical patent/CN110782492B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T 7/246: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/0346: Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06T 5/70: Denoising; Smoothing
    • G06T 7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/10012: Stereo images
    • G06T 2207/30204: Marker

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

A pose tracking method and device are provided. The pose tracking method comprises the following steps: acquiring an image of a tracked object, wherein a marker that blinks at a specific frequency is arranged on the tracked object; acquiring, from the acquired image, pixels whose brightness has changed; and calculating the 6-degree-of-freedom pose of the tracked object based on the acquired pixels. The method thereby reduces the dependency of pose tracking on a specific layout of LED markers, reduces the delay of pose tracking, and improves the precision and efficiency of pose tracking.

Description

Pose tracking method and device
Technical Field
The present disclosure relates to the field of computer vision technology. More specifically, the present disclosure relates to a pose tracking method and apparatus.
Background
In recent years, various 6-degree-of-freedom pose estimation methods have been proposed and widely applied in fields such as robotic grasping, virtual reality/augmented reality, and human-computer interaction. Virtual reality and augmented reality place high demands on system latency: if the system responds sluggishly to head movements, the user may experience dizziness and nausea. The latency of the Valve tracking system is 7-15 milliseconds. The lowest latency of currently commercially available virtual reality (VR) tracking products is 15 milliseconds, so users cannot have a fully immersive experience.
Many current optical tracking systems are based on complementary metal-oxide-semiconductor (CMOS) cameras. However, the latency of such consumer-grade CMOS cameras is typically greater than 16.7 milliseconds (60 FPS). This hardware limitation means these methods cannot display the user's motion input on the screen in time and cannot meet the low-latency requirements of VR.
Active light-emitting diode (LED) markers are also used in some schemes to recover the 6-degree-of-freedom pose of an object. However, these methods have limitations: either the number of LED lamps must be exactly four and the lamps must be coplanar, or the amount of computation is too large for a real-time system. Using only 4 LED lamps affects the accuracy of pose tracking and the robustness of the system, because the pose solution fails if any one of the lamps is not detected. In addition, brute-force solution of the 2D/3D point-set correspondence is time-consuming and cannot be applied when a large number of LED lamps are present or in a real-time system.
Disclosure of Invention
An exemplary embodiment of the present disclosure is to provide a pose tracking method and apparatus to reduce dependency of pose tracking on a specific layout of an LED marker, and at the same time, reduce delay of pose tracking, and improve accuracy and efficiency of pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided a pose tracking method including: acquiring an image of a tracked object, wherein a marker that blinks at a specific frequency is arranged on the tracked object; acquiring, from the acquired image, pixels whose brightness has changed; and calculating the 6-degree-of-freedom pose of the tracked object based on the acquired pixels, thereby reducing the dependency of pose tracking on a specific layout of the LED markers, reducing the delay of pose tracking, and improving the precision and efficiency of pose tracking.
Optionally, the step of calculating a 6-degree-of-freedom pose of the tracked object based on the acquired pixels may comprise: acquiring inertial measurement unit data of the tracked object, and estimating a 3-degree-of-freedom pose of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom pose is the rotation about the x, y and z coordinate axes in the body coordinate system of the tracked object; and calculating a 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom pose and the acquired pixels, wherein the 6-degree-of-freedom pose comprises the translation along the x, y and z coordinate axes and the rotation about the x, y and z rectangular coordinate axes in the body coordinate system of the tracked object, thereby improving the precision and efficiency of pose tracking.
Optionally, the step of calculating a 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom pose and the acquired pixels may comprise: solving, based on the 3-degree-of-freedom pose and the acquired pixels, the correspondence between the 2D point set and the 3D point set of the markers to obtain matching pairs between the 2D point set and the 3D point set, wherein the 2D point set comprises the pixel coordinates of the markers and the 3D point set comprises the coordinates of the markers in the body coordinate system of the tracked object; and calculating the 6-degree-of-freedom pose of the tracked object based on the matching pairs, thereby improving the precision and efficiency of pose tracking.
Optionally, the step of calculating the 6-degree-of-freedom pose of the tracked object may comprise: removing, from the matching pairs, pixels whose reprojection deviation exceeds a preset deviation threshold, and calculating a 6-degree-of-freedom pose from the remaining pixels; and performing a reprojection-error minimization operation on the calculated 6-degree-of-freedom pose to obtain the 6-degree-of-freedom pose of the tracked object, thereby improving the precision and efficiency of pose tracking.
Optionally, the step of calculating the 6-degree-of-freedom pose of the tracked object may comprise: removing, from the matching pairs, pixels whose reprojection deviation exceeds a preset deviation threshold, and calculating a 6-degree-of-freedom pose from the remaining pixels; performing a reprojection-error minimization operation on the calculated 6-degree-of-freedom pose; and optimizing, according to the 3-degree-of-freedom pose, the 6-degree-of-freedom pose obtained after the reprojection-error minimization operation to obtain the 6-degree-of-freedom pose of the tracked object, thereby improving the precision and efficiency of pose tracking.
Optionally, after the 6-degree-of-freedom pose of the tracked object is obtained, the pose tracking method may further include: re-matching, according to the 6-degree-of-freedom pose of the tracked object, the pixels in the matching pairs whose reprojection error exceeds the preset deviation threshold, for use in subsequent pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided a pose tracking apparatus including: an image acquisition unit configured to acquire an image of a tracked object on which a marker that blinks at a specific frequency is provided; a pixel acquisition unit configured to acquire, from the acquired image, pixels whose brightness has changed; and a pose calculation unit configured to calculate the 6-degree-of-freedom pose of the tracked object based on the acquired pixels, thereby reducing the dependency of pose tracking on a specific layout of the LED markers while reducing the delay of pose tracking, and improving the precision and efficiency of pose tracking.
Optionally, the pose calculation unit may be configured to: acquire inertial measurement unit data of the tracked object, and estimate a 3-degree-of-freedom pose of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom pose is the rotation about the x, y and z coordinate axes in the body coordinate system of the tracked object; and calculate a 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom pose and the acquired pixels, wherein the 6-degree-of-freedom pose comprises the translation along the x, y and z coordinate axes and the rotation about the x, y and z rectangular coordinate axes in the body coordinate system of the tracked object, thereby improving the precision and efficiency of pose tracking.
Optionally, the pose calculation unit may be further configured to: solve, based on the 3-degree-of-freedom pose and the acquired pixels, the correspondence between the 2D point set and the 3D point set of the markers to obtain matching pairs between the 2D point set and the 3D point set, wherein the 2D point set comprises the pixel coordinates of the markers and the 3D point set comprises the coordinates of the markers in the body coordinate system of the tracked object; and calculate the 6-degree-of-freedom pose of the tracked object based on the matching pairs, thereby improving the precision and efficiency of pose tracking.
Optionally, the pose calculation unit may be further configured to: remove, from the matching pairs, pixels whose reprojection deviation exceeds a preset deviation threshold, and calculate a 6-degree-of-freedom pose from the remaining pixels; and perform a reprojection-error minimization operation on the calculated 6-degree-of-freedom pose to obtain the 6-degree-of-freedom pose of the tracked object, thereby improving the precision and efficiency of pose tracking.
Optionally, the pose calculation unit may be further configured to: remove, from the matching pairs, pixels whose reprojection deviation exceeds a preset deviation threshold, and calculate a 6-degree-of-freedom pose from the remaining pixels; perform a reprojection-error minimization operation on the calculated 6-degree-of-freedom pose; and fuse the 3-degree-of-freedom pose with the 6-degree-of-freedom pose obtained after the reprojection-error minimization operation to obtain the 6-degree-of-freedom pose of the tracked object, thereby improving the precision and efficiency of pose tracking.
Optionally, the pose tracking apparatus may further include a re-matching unit configured to, after the 6-degree-of-freedom pose of the tracked object is obtained, re-match, according to that pose, the pixels whose reprojection error exceeds the preset deviation threshold, for use in subsequent pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided an electronic apparatus including: a camera for acquiring an image of a tracked object on which a marker that blinks at a specific frequency is provided, and for acquiring, from the acquired image, pixels whose brightness has changed; and a processor for calculating the 6-degree-of-freedom pose of the tracked object based on the pixels acquired by the camera, thereby reducing the dependency of pose tracking on a specific layout of LED markers, reducing the delay of pose tracking, and improving the precision and efficiency of pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program that, when executed by a processor, implements a pose tracking method according to an exemplary embodiment of the present disclosure.
According to an exemplary embodiment of the present disclosure, there is provided a computing apparatus including: a processor; a memory storing a computer program that, when executed by the processor, implements a pose tracking method according to an exemplary embodiment of the present disclosure.
According to the pose tracking method and device described above, an image of the tracked object is acquired, pixels whose brightness has changed are acquired from the image, and the 6-degree-of-freedom pose of the tracked object is calculated based on the acquired pixels, thereby reducing the dependency of pose tracking on a specific layout of LED markers, reducing the delay of pose tracking, and improving the precision and efficiency of pose tracking.
Additional aspects and/or advantages of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
Drawings
The above and other objects and features of the exemplary embodiments of the present disclosure will become more apparent from the following description taken in conjunction with the accompanying drawings which illustrate exemplary embodiments, wherein:
FIG. 1 shows a flowchart of a pose tracking method according to an example embodiment of the present disclosure;
FIG. 2 shows 2D active LED marker tracking results of a tracked object;
FIG. 3 shows a block diagram of a pose tracking apparatus according to an example embodiment of the present disclosure;
FIG. 4 shows a schematic view of an electronic device according to an exemplary embodiment of the present disclosure; and
FIG. 5 shows a schematic diagram of a computing device according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present disclosure by referring to the figures.
Fig. 1 shows a flowchart of a pose tracking method according to an exemplary embodiment of the present disclosure. The pose tracking method shown in fig. 1 is applicable to a tracked object provided with a plurality of markers blinking at a specific frequency; such markers may be, for example, active LEDs. In the following description, LED lamps are used as an example of the markers, but it should be understood that the present invention is not limited thereto. Those skilled in the art may use other forms of markers as an embodiment requires.
Referring to fig. 1, in step S101, an image of the tracked object is acquired.
In an exemplary embodiment of the present disclosure, the image of the tracked object may be acquired by a dynamic vision sensor (DVS) camera. The pose tracking method shown in fig. 1 is applicable to an electronic device that has both a camera capable of acquiring brightness-change pixels and a host capable of performing the calculations, or to a system composed of such a camera and such a host.
In step S102, pixels with luminance changes are acquired from the acquired image.
In the exemplary embodiment of the present disclosure, after the image is obtained in step S101, the DVS camera does not transmit the image directly to the host; instead, in step S102, it extracts the pixels whose brightness has changed from the acquired image and transmits only those pixels to the host for pose tracking, thereby reducing the amount of data transmitted and computed and reducing the delay of pose tracking.
In particular, when a marker (e.g., an active LED lamp) blinks, the DVS generates corresponding on and off events. These events can easily be distinguished from other events by the LED blinking frequency. The filtered events can be divided into different clusters, each cluster representing one LED lamp. The filtered events can then be processed using a region-growing-based clustering algorithm and a lightweight voting algorithm that identifies the densest point in each cluster as the center of the marker. In addition, a global-nearest-neighbor tracking method and Kalman filtering based on a constant-velocity model can be used to track multiple markers, reducing the probability of missed and false marker detections.
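The clustering and voting step described above can be sketched as follows. This is a minimal illustration, assuming the events have already been screened by blink frequency; the region-growing radius `max_gap` and the data layout are illustrative assumptions, not details from the patent.

```python
from collections import defaultdict, deque

def cluster_led_events(events, max_gap=2):
    """Group event pixels into clusters (one per blinking LED) by
    region growing, and take each cluster's densest pixel as the
    marker center (the "voting" step).

    events: iterable of (x, y) pixel coordinates of frequency-
            filtered DVS events (coordinates may repeat).
    """
    counts = defaultdict(int)
    for x, y in events:
        counts[(x, y)] += 1
    unvisited = set(counts)
    centers = []
    while unvisited:
        seed = unvisited.pop()
        queue = deque([seed])
        cluster = [seed]
        # region growing: absorb all event pixels within max_gap
        while queue:
            cx, cy = queue.popleft()
            for dx in range(-max_gap, max_gap + 1):
                for dy in range(-max_gap, max_gap + 1):
                    p = (cx + dx, cy + dy)
                    if p in unvisited:
                        unvisited.remove(p)
                        queue.append(p)
                        cluster.append(p)
        # voting: the most frequently firing pixel is the marker center
        centers.append(max(cluster, key=lambda p: counts[p]))
    return centers
```

A real implementation would additionally feed these centers into the global-nearest-neighbor tracker and Kalman filter mentioned above.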
In step S103, the 6-degree-of-freedom pose of the tracked object is calculated based on the acquired pixels. Here, the 6-degree-of-freedom pose comprises the translation along the x, y and z coordinate axes and the rotation about the x, y and z rectangular coordinate axes in the body coordinate system of the tracked object.
In an exemplary embodiment of the present disclosure, only those pixels whose luminance changes due to motion are used to calculate the 6-degree-of-freedom pose of the tracking object, thereby reducing the delay of pose tracking.
In an exemplary embodiment of the present disclosure, when calculating the 6-degree-of-freedom pose of the tracked object based on the acquired pixels, inertial measurement unit (IMU) data of the tracked object may first be acquired, a 3-degree-of-freedom pose of the tracked object (i.e., the rotation about the x, y and z coordinate axes in the body coordinate system of the tracked object) may be estimated from the IMU data, and the 6-degree-of-freedom pose of the tracked object may then be calculated based on the 3-degree-of-freedom pose and the acquired pixels, thereby improving the precision and efficiency of pose tracking.
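At its simplest, the 3-degree-of-freedom attitude estimate from IMU data can be approximated by integrating the gyroscope rate, as in the hedged sketch below; a real attitude and heading reference system would also fuse accelerometer and magnetometer measurements to correct drift. The quaternion representation and function names are illustrative assumptions.

```python
import math

def quat_mult(q, r):
    # Hamilton product of two (w, x, y, z) quaternions
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

def integrate_gyro(q, omega, dt):
    """One attitude-integration step: rotate quaternion q by the body
    angular rate omega = (wx, wy, wz) in rad/s over dt seconds."""
    wx, wy, wz = omega
    angle = math.sqrt(wx*wx + wy*wy + wz*wz) * dt
    if angle == 0.0:
        return q
    ax, ay, az = wx*dt/angle, wy*dt/angle, wz*dt/angle  # rotation axis
    s = math.sin(angle / 2)
    dq = (math.cos(angle / 2), ax*s, ay*s, az*s)
    return quat_mult(q, dq)
```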
In an exemplary embodiment of the present disclosure, when calculating the 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom pose and the acquired pixels, the correspondence between the 2D point set and the 3D point set of the markers may first be solved based on the 3-degree-of-freedom pose and the acquired pixels, yielding matching pairs between the 2D point set and the 3D point set, and the 6-degree-of-freedom pose of the tracked object may then be calculated based on the matching pairs. Here, the 2D point set comprises the pixel coordinates of each marker, and the 3D point set comprises the coordinates of each marker in the body coordinate system of the tracked object.
Specifically, when calculating the 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom pose and the acquired pixels, let $p_{IA}$ and $p_{IB}$ be the undistorted homogeneous pixel coordinates on the image of two LED lamps (LED lamp A and LED lamp B), and let $p_A$ and $p_B$ be the coordinates of LED lamp A and LED lamp B in the object body coordinate system. The 3-degree-of-freedom attitude matrix $R = [r_1, r_2, r_3]^T$ of the tracked object can be estimated by the attitude and heading reference system from the IMU data, where $r_1$, $r_2$ and $r_3$ denote the row vectors of the first, second and third rows of $R$. $t = [t_x, t_y, t_z]^T$ is the unknown translation vector, where $t_x$, $t_y$ and $t_z$ denote the displacements along the x, y and z coordinate axes. By the pinhole imaging principle, the following equations can be obtained:
$$x_A = \frac{r_1 p_A + t_x}{r_3 p_A + t_z}, \qquad y_A = \frac{r_2 p_A + t_y}{r_3 p_A + t_z} \tag{1}$$
$$x_B = \frac{r_1 p_B + t_x}{r_3 p_B + t_z}, \qquad y_B = \frac{r_2 p_B + t_y}{r_3 p_B + t_z} \tag{2}$$
where $x_A$, $y_A$ denote the x- and y-coordinates of LED lamp A on the image, and $x_B$, $y_B$ those of LED lamp B. Only $t$ is unknown in these equations: 4 equations in 3 unknowns. Solving them gives
$$t = A_z\, p_{IA} - R\, p_A \tag{3}$$
where $A_z$, the depth of LED lamp A along the camera optical axis, is obtained by eliminating $t$ between the projections of the two lamps:
$$A_z = \frac{(r_1 - x_B\, r_3)(p_A - p_B)}{x_A - x_B} \tag{4}$$
A point set $O$ of pixel coordinates of the LED lamps in the image and the known point set $L$ of LED lamp coordinates in the object body coordinate system can be obtained through the DVS camera detection and clustering algorithm. Two points $(o, l)$ are arbitrarily selected from the point sets $O$ and $L$ and paired, and the translation $t$ is then calculated by equation (3). Through this operation, a list $T$ of possible translation vectors is obtained. Invalid translation vectors (vector elements too large, or $t_z$ negative) can be deleted, and approximately equal translation vectors can be merged. For any valid translation vector $t_{valid}$ in $T$, the corresponding set $L_v$ of visible LED lamps can be determined by the Möller-Trumbore ray intersection algorithm: an LED lamp is visible if the first point at which the ray determined by the camera and the lamp intersects the object is that LED lamp; otherwise, the lamp is not visible in this pose. Determining the visible set of LED lamps reduces both the amount of computation and the number of mismatches when many LED lamps are present. The visible point set $L_v$ is then projected onto the image plane $P$ using the corresponding 6-degree-of-freedom pose and the camera intrinsic parameters. The Kuhn-Munkres algorithm can then be used to solve for the best match, and the matching error, between the visible point set $L_v$ and the observation point set $O$. Traversing the list $T$ of possible translation vectors, the match set with the smallest matching error yields the correct matching pairs between the 2D point set and the 3D point set.
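The recovery of a candidate translation from a single 2D/3D pairing, per equation (3), can be sketched as follows. This is a hedged illustration assuming normalized, undistorted homogeneous image coordinates; the closed form used for the depth $A_z$ is derived here from the pinhole model and may differ in form from the original equation (4).

```python
def mat_vec(M, v):
    # 3x3 matrix times 3-vector
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def candidate_translation(R, pA, pB, pIA, pIB):
    """Recover t from one pair of 2D/3D correspondences.

    R        : 3x3 rotation (3-DoF attitude from the IMU/AHRS)
    pA, pB   : 3D marker coordinates in the body frame
    pIA, pIB : undistorted homogeneous image coordinates [x, y, 1]
    """
    r1, r3 = R[0], R[2]
    d = [pA[k] - pB[k] for k in range(3)]
    xA, xB = pIA[0], pIB[0]
    # depth of marker A along the camera z-axis (eliminating t
    # between the two projections)
    Az = sum((r1[k] - xB * r3[k]) * d[k] for k in range(3)) / (xA - xB)
    # eq. (3): t = Az * pIA - R * pA
    RpA = mat_vec(R, pA)
    return [Az * pIA[k] - RpA[k] for k in range(3)]
```

In the method described above, this computation would be run for every (o, l) pairing to build the candidate list T before visibility checking and Kuhn-Munkres matching.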
In the exemplary embodiment of the present disclosure, when calculating the 6-degree-of-freedom pose of the tracked object, pixels whose reprojection deviation exceeds a preset deviation threshold may first be removed from the matching pairs, a 6-degree-of-freedom pose may be calculated from the remaining pixels, and a reprojection-error minimization operation may then be performed on the calculated pose to obtain the 6-degree-of-freedom pose of the tracked object, further improving the precision and efficiency of pose tracking.
In another exemplary embodiment, pixels whose reprojection deviation exceeds the preset deviation threshold may first be removed from the matching pairs and a 6-degree-of-freedom pose calculated from the remaining pixels; a reprojection-error minimization operation may then be performed on the calculated pose, and the result may be optimized according to the 3-degree-of-freedom pose to obtain the 6-degree-of-freedom pose of the tracked object, further improving the precision and efficiency of pose tracking.
In an exemplary embodiment of the present disclosure, after the 6-degree-of-freedom pose of the tracked object is obtained, pixels in the matching pair whose reprojection error exceeds a preset deviation threshold may be further re-matched according to the 6-degree-of-freedom pose of the tracked object. In addition, the corresponding relation between the 2D point set and the 3D point set of the newly observed mark (such as an active LED lamp) or the matching pair of the 2D point set and the 3D point set can be updated, so that the pose tracking precision and efficiency are further improved.
Specifically, after the correspondence between the 2D point set and the 3D point set is obtained, a random sample consensus (RANSAC) algorithm may be used to remove points in the matching pairs whose reprojection deviation exceeds the preset deviation threshold, and the Efficient Perspective-n-Point (EPnP) algorithm may then be used to solve for the 6-degree-of-freedom pose. The coarse pose obtained from the EPnP solution can be further refined with a bundle adjustment (BA) algorithm to obtain a more accurate pose. The accurate pose can be used to re-match points in the matching pairs whose reprojection error exceeds the preset deviation threshold and to update the matching relations of newly observed LEDs. Finally, the obtained 6-degree-of-freedom pose and the 3-degree-of-freedom pose from the IMU are fused by a sensor fusion algorithm based on an extended Kalman filter to obtain a smoother and more consistent 6-degree-of-freedom pose (i.e., the fused 6-degree-of-freedom pose). In addition, points whose reprojection error exceeds the preset deviation threshold can be re-matched using the fused 6-degree-of-freedom pose, and the matching relations of newly observed LEDs can be updated.
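The outlier-rejection step, keeping only matches whose reprojection deviation is within the preset threshold before the PnP solve, can be sketched as below. This is an illustrative fragment (the threshold value and pinhole projection in normalized coordinates are assumptions); a real pipeline would use RANSAC with EPnP, for example OpenCV's solvePnPRansac, followed by bundle adjustment as described above.

```python
def project(R, t, p):
    """Pinhole projection of body-frame point p under pose (R, t),
    in normalized image coordinates."""
    c = [sum(R[i][j] * p[j] for j in range(3)) + t[i] for i in range(3)]
    return (c[0] / c[2], c[1] / c[2])

def filter_matches(R, t, matches, threshold=0.01):
    """Keep only 2D/3D matches whose reprojection deviation is
    within the preset deviation threshold.

    matches: list of ((x, y) observed point, (X, Y, Z) body-frame
             marker coordinate) pairs.
    """
    inliers = []
    for obs, p3d in matches:
        u, v = project(R, t, p3d)
        err = ((u - obs[0]) ** 2 + (v - obs[1]) ** 2) ** 0.5
        if err <= threshold:
            inliers.append((obs, p3d))
    return inliers
```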
According to the pose tracking method of the exemplary embodiments of the present disclosure, the dependency of pose tracking on a specific layout of the LED markers is reduced, the delay of pose tracking is reduced, and the precision and efficiency of pose tracking are improved.
The DVS camera used to obtain the pixels whose brightness changes due to the motion of the tracked object may be, for example, a third-generation Samsung VGA device with a resolution of 640 x 480, connected to the host through USB 3.0. The DVS camera generates events for pixels whose relative illumination intensity changes. Each event is represented by a tuple <t, x, y, p>, where t is the timestamp at which the event occurred (with microsecond-order resolution), (x, y) are the pixel coordinates corresponding to the event, and p ∈ {0, 1} is the polarity of the event: an event <t, x, y, 1> occurs when the LED lamp turns on, and an event <t, x, y, 0> occurs when the LED lamp turns off.
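The event tuple <t, x, y, p> can be modeled as in this small sketch; the frequency estimator illustrates how a marker's blink rate could be recovered from its on-event timestamps for frequency-based screening. Field names and the estimator are illustrative assumptions, not part of the patent.

```python
from typing import NamedTuple

class Event(NamedTuple):
    t: int  # timestamp, microseconds
    x: int  # pixel column
    y: int  # pixel row
    p: int  # polarity: 1 = brightness up (LED on), 0 = down (LED off)

def split_by_polarity(events):
    """Separate on-events (LED turning on) from off-events."""
    on = [e for e in events if e.p == 1]
    off = [e for e in events if e.p == 0]
    return on, off

def blink_frequency_hz(on_timestamps_us):
    """Estimate a marker's blink frequency from the timestamps
    (microseconds) of its successive on-events."""
    gaps = [b - a for a, b in zip(on_timestamps_us, on_timestamps_us[1:])]
    return 1e6 / (sum(gaps) / len(gaps))
```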
The pose tracking method relies on some fixed parameters (camera intrinsic parameters) and transfer matrices (IMU to handle body, DVS camera to helmet, etc.). In exemplary embodiments of the present disclosure, an OptiTrack optical motion tracking system is used to calibrate these fixed parameters and transfer matrices, and also to provide ground-truth handle poses for evaluating the accuracy of pose tracking. Once calibration is complete, the OptiTrack optical motion tracking system is not needed during actual use.
Multiple coordinate systems exist in the OptiTrack optical motion tracking system, e.g., the camera (C) coordinate system, helmet (H) coordinate system, world (W) coordinate system, IMU (I) coordinate system, handle body (B) coordinate system, handle model (M) coordinate system, and OptiTrack (O) coordinate system. To simplify the system, the handle body coordinate system may be aligned with the handle model coordinate system, and the world coordinate system may be aligned with the OptiTrack coordinate system. The 6-degree-of-freedom pose of the moving handle to be solved can therefore be expressed as $[{}^{C}R_{M} \mid {}^{C}P_{M}]$, where ${}^{C}R_{M}$ is the rotation matrix from the handle model coordinate system to the camera coordinate system, and ${}^{C}P_{M}$ is the origin of the model coordinate system expressed in the camera coordinate system. The internal parameters of the DVS camera and some fixed transfer matrices must be calibrated in advance.
Since the optical characteristics of the DVS are the same as those of a normal CMOS camera, a standard pinhole imaging model can be used to determine the camera parameters (e.g., focal length, center of projection, and distortion parameters). The DVS camera differs from a normal camera in that it cannot see anything without an illumination change, so a blinking checkerboard can be used to calibrate the DVS.
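The standard pinhole model mentioned above can be sketched as follows (distortion omitted for brevity; the function name and the intrinsic parameter values are illustrative, not calibrated values from the disclosure):

```python
import numpy as np

def project_pinhole(points_cam, fx, fy, cx, cy):
    """Project 3D points given in the camera frame with the standard
    pinhole model (lens distortion omitted for brevity)."""
    P = np.asarray(points_cam, dtype=float)
    u = fx * P[:, 0] / P[:, 2] + cx
    v = fy * P[:, 1] / P[:, 2] + cy
    return np.stack([u, v], axis=1)

# Illustrative intrinsics for a 640 x 480 sensor (not calibrated values).
uv = project_pinhole([[0.1, 0.0, 1.0], [0.0, 0.2, 2.0]],
                     fx=500.0, fy=500.0, cx=320.0, cy=240.0)
# uv -> [[370., 240.], [320., 290.]]
```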
OptiTrack balls can be used to calibrate some fixed transfer matrices. In the 3D model, the origin of the handle model is the center of the great circle at the top end of the handle. The OptiTrack balls can be fixed on the circumference of the great circle so as to correspond to the directions of the X, Y and Z axes of the marker model, so the model coordinate system of the motion handle can be determined in the OptiTrack optical motion tracking system. OptiTrack balls are likewise fixed at the IMU chip, and the readings of the IMU are used to determine the directions of the X, Y and Z axes, thus determining the IMU coordinate system in the OptiTrack optical motion tracking system. The OptiTrack optical motion tracking system can record the poses of the IMU coordinate system and the model coordinate system in real time, from which the conversion relation between the model coordinate system of the motion handle and the IMU coordinate system can be calculated. The helmet-to-camera conversion matrix ${}^{C}T_{H}$ can also be determined from the blinking LEDs of particular patterns:

$${}^{C}T_{H} = {}^{C}T_{M}\,({}^{O}T_{M})^{-1}\,{}^{O}T_{H}$$

where ${}^{O}T_{M}$ and ${}^{O}T_{H}$ are provided by the OptiTrack optical motion tracking system and ${}^{C}T_{M}$ can be calculated by the EPnP algorithm (the point-set matching relationship for a particular pattern is fixed). The two fixed matrices are then obtained by this calculation.
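The helmet-to-camera calibration composes rigid transforms between the OptiTrack, model and camera frames. Below is a hedged sketch assuming 4 x 4 homogeneous matrices and the composition camera ← model ← OptiTrack ← helmet suggested by the surrounding text; all function names are illustrative:

```python
import numpy as np

def inv_T(T):
    """Invert a 4x4 homogeneous rigid transform analytically
    (transpose the rotation, negate the rotated translation)."""
    R, p = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ p
    return Ti

def helmet_to_camera(T_C_M, T_O_M, T_O_H):
    """Compose camera<-model, model<-OptiTrack, OptiTrack<-helmet."""
    return T_C_M @ inv_T(T_O_M) @ T_O_H

def translation(x, y, z):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# Pure translations make the chain easy to check by hand:
# identity @ inv(translate(1,0,0)) @ translate(1,2,0) -> translate(0,2,0).
T_C_H = helmet_to_camera(np.eye(4), translation(1, 0, 0), translation(1, 2, 0))
```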
the active LED marker blinking interval may be determined by an on-off-on, off-on-off event. If the blinking interval of a pixel is in the [800 mus, 1200 mus ] interval, it can be concluded that this event is caused by LED blinking. The center position of the LED is then determined using a region growing algorithm and a voting method. A plurality of LED lamps may be tracked using global nearest neighbor and kalman filtering. Fig. 2 shows the 2D active LED marker tracking result of the tracked object. In fig. 2, each continuous line represents the movement trace of one marker (e.g., LED), the small open circle represents the start point of the movement trace, and the large circle with a filled circle inside represents the end point of the movement trace.
In addition, to improve the performance of the BA algorithm, the optimization window of the BA algorithm may be set to 10, with the optimization performed every 4 frames. The matching relationship is not updated every time; it is updated only when the ratio of the number of matched LED lamps to the number of all observed LED lamps is smaller than a preset value, e.g., 0.6, which improves the efficiency of the BA algorithm. After the initialization of the start-up phase, the entire processing flow takes only 1.23 milliseconds. Adding the screening time for LED flashing events (1 ms), the overall delay is 2.23 ms.
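The scheduling just described (a window of 10, optimization every 4th frame, re-matching only when fewer than 60% of the observed LEDs remain matched) might be expressed as follows; the constants come from the text, while the function names are illustrative:

```python
BA_WINDOW = 10       # frames kept in the bundle-adjustment window (from the text)
BA_PERIOD = 4        # run the optimization every 4th frame
MATCH_RATIO = 0.6    # re-match below this matched/observed ratio

def should_run_ba(frame_index, period=BA_PERIOD):
    """Trigger the windowed BA optimization once every `period` frames."""
    return frame_index % period == 0

def should_update_matching(n_matched, n_observed, ratio=MATCH_RATIO):
    """Update the 2D-3D matching only when too few LEDs remain matched."""
    if n_observed == 0:
        return True
    return n_matched / n_observed < ratio
```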
The pose tracking method according to the exemplary embodiment of the present disclosure has been described above with reference to fig. 1 to 2. Hereinafter, a pose tracking apparatus and units thereof according to an exemplary embodiment of the present disclosure will be described with reference to fig. 3.
Fig. 3 shows a block diagram of a pose tracking apparatus according to an exemplary embodiment of the present disclosure.
Referring to fig. 3, the pose tracking apparatus includes an image acquisition unit 31, a pixel acquisition unit 32, and a pose calculation unit 33.
The image acquisition unit 31 is configured to acquire an image of a tracking object on which a marker that blinks at a specific frequency is provided.
The pixel acquisition unit 32 is configured to acquire pixels of varying brightness from the acquired image.
The pose calculation unit 33 is configured to calculate a 6-degree-of-freedom pose of the tracking object based on the acquired pixels.
In an exemplary embodiment of the present disclosure, the posture calculation unit 33 may be configured to: acquiring inertial measurement unit data of a tracked object, and estimating a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y and z under a body coordinate system of the tracked object; a6-degree-of-freedom attitude of the tracked object is calculated based on the 3-degree-of-freedom attitude and the acquired pixels, where the 6-degree-of-freedom attitude is an attitude in directions of three coordinate axes x, y, and z and an attitude rotated around three rectangular coordinate axes x, y, and z in a body coordinate system of the tracked object.
In an exemplary embodiment of the present disclosure, the posture calculation unit 33 may be further configured to: based on the 3-degree-of-freedom posture and the obtained pixels, solving the corresponding relation between the marked 2D point set and the marked 3D point set to obtain a matched pair of the marked 2D point set and the marked 3D point set, wherein the 2D point set comprises the pixel coordinates of each mark, and the 3D point set comprises the coordinates of each mark in a body coordinate system of the tracked object; based on the matching pairs, a 6-degree-of-freedom pose of the tracked object is calculated.
In an exemplary embodiment of the present disclosure, the posture calculation unit 33 may be further configured to: removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal; and carrying out minimum re-projection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object.
In an exemplary embodiment of the present disclosure, the posture calculation unit 33 may be further configured to: removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal; carrying out minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation; and optimizing the pose of 6 degrees of freedom after the operation of minimizing the reprojection error according to the pose of 3 degrees of freedom to obtain the pose of 6 degrees of freedom of the tracked object.
In an exemplary embodiment of the present disclosure, the pose tracking apparatus may further include: a re-matching unit configured to re-match pixels having a re-projection error exceeding a preset deviation threshold according to the 6 degrees of freedom posture of the tracked object after the 6 degrees of freedom posture of the tracked object is obtained.
Fig. 4 shows a schematic view of an electronic device according to an exemplary embodiment of the present disclosure.
Referring to fig. 4, the electronic device 4 includes: a camera 41 and a processor 42.
The camera 41 is configured to acquire an image of a tracking object on which a marker that flickers at a specific frequency is provided, and acquire a pixel having a luminance change from the acquired image. The processor 42 is used to calculate a 6 degree of freedom pose of the tracked object based on the pixels acquired by the camera.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to acquire inertial measurement unit data of the tracked object, and estimate a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, where the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y, and z in a body coordinate system of the tracked object; a6-degree-of-freedom attitude of the tracked object is calculated based on the 3-degree-of-freedom attitude and the acquired pixels, where the 6-degree-of-freedom attitude is an attitude in directions of three coordinate axes x, y, and z and an attitude rotated around three rectangular coordinate axes x, y, and z in a body coordinate system of the tracked object.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to solve a corresponding relationship between a 2D point set and a 3D point set of a marker based on a 3-degree-of-freedom pose and an acquired pixel, and obtain a matching pair of the marker with respect to the 2D point set and the 3D point set, where the 2D point set includes a pixel coordinate of each marker, and the 3D point set includes a coordinate of each marker in a body coordinate system of a tracked object; based on the matching pairs, a 6 degree-of-freedom pose of the tracked object is calculated.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to remove pixels from the matched pair for which the reprojection bias exceeds a preset bias threshold, and calculate a 6-degree-of-freedom pose from the pixels remaining after the removal; and carrying out minimum reprojection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to remove pixels from the matched pair for which the reprojection bias exceeds a preset bias threshold, and calculate a 6-degree-of-freedom pose from the pixels remaining after the removal; performing minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation; and optimizing the pose of 6 degrees of freedom after the operation of minimizing the reprojection error according to the pose of 3 degrees of freedom to obtain the pose of 6 degrees of freedom of the tracked object.
In an exemplary embodiment of the disclosure, the processor 42 may be configured to, after obtaining the 6-degree-of-freedom pose of the tracked object, re-match pixels having a re-projection error exceeding a preset deviation threshold according to the 6-degree-of-freedom pose of the tracked object.
Further, according to an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed, implements a pose tracking method according to an exemplary embodiment of the present disclosure.
By way of example, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: acquiring an image of a tracking object, wherein a mark which flickers at a specific frequency is arranged on the tracking object; acquiring pixels with changed brightness from the acquired image; a 6 degree-of-freedom pose of the tracked object is calculated based on the acquired pixels.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing. The computer readable storage medium may be embodied in any device; or may be present alone without being assembled into the device.
The pose tracking apparatus and the electronic apparatus according to the exemplary embodiments of the present disclosure have been described above with reference to fig. 3 and 4. Next, a computing device according to an exemplary embodiment of the present disclosure is described with reference to fig. 5.
Fig. 5 shows a schematic diagram of a computing device according to an exemplary embodiment of the present disclosure.
Referring to fig. 5, the computing apparatus 5 according to the exemplary embodiment of the present disclosure includes a memory 51 and a processor 52, the memory 51 having stored thereon a computer program that, when executed by the processor 52, implements a pose tracking method according to the exemplary embodiment of the present disclosure.
As an example, the computer program, when executed by the processor 52, may implement the steps of: acquiring an image of a tracking object, wherein a mark which flickers at a specific frequency is arranged on the tracking object; acquiring pixels with changed brightness from the acquired image; a 6 degree-of-freedom pose of the tracked object is calculated based on the acquired pixels.
The computing device illustrated in fig. 5 is only one example and should not impose any limitations on the functionality or scope of use of embodiments of the disclosure.
The pose tracking method and apparatus according to the exemplary embodiment of the present disclosure have been described above with reference to fig. 1 to 5. However, it should be understood that: the pose tracking apparatus and its elements shown in fig. 3 may each be configured as software, hardware, firmware, or any combination thereof to perform a particular function, and the computing apparatus shown in fig. 5 is not limited to including the components shown above, but some components may be added or deleted as desired, and the above components may also be combined.
According to the pose tracking method and apparatus of the present disclosure, an image of the tracked object is obtained, pixels with changed brightness are obtained from the image, and the 6-degree-of-freedom pose of the tracked object is calculated based on the obtained pixels. In this way, the dependency of pose tracking on a specific layout of the LED markers is reduced, the delay of pose tracking is reduced, and the precision and efficiency of pose tracking are improved.
While the present disclosure has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the following claims.

Claims (11)

1. A pose tracking method, comprising:
acquiring an image of a tracking object, wherein a mark which flickers at a specific frequency is arranged on the tracking object;
acquiring pixels with changed brightness from the acquired image;
calculating a 6 degree-of-freedom pose of the tracked object based on the acquired pixels,
wherein the step of calculating the 6 degree-of-freedom pose of the tracked object based on the acquired pixels comprises:
acquiring inertial measurement unit data of the tracked object, and estimating a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y and z in a body coordinate system of the tracked object;
based on the 3-degree-of-freedom posture and the obtained pixels, solving a corresponding relation between a 2D point set and a 3D point set of the marker to obtain a matching pair of the 2D point set and the 3D point set of the marker, wherein the 2D point set comprises pixel coordinates of the marker, and the 3D point set comprises coordinates of the marker in a body coordinate system of the tracked object;
and calculating the 6-degree-of-freedom posture of the tracked object based on the matching pair, wherein the 6-degree-of-freedom posture is a posture in the directions of three coordinate axes of x, y and z and a posture rotating around three rectangular coordinate axes of x, y and z under the body coordinate system of the tracked object.
2. The pose tracking method of claim 1, wherein the step of calculating a 6 degree of freedom pose of the tracked object comprises:
removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal;
and carrying out minimum reprojection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object.
3. The pose tracking method of claim 1, wherein the step of calculating a 6 degree of freedom pose of the tracked object comprises:
removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal;
performing minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation;
and optimizing the pose of 6 degrees of freedom after the operation of minimizing the reprojection error according to the pose of 3 degrees of freedom to obtain the pose of 6 degrees of freedom of the tracked object.
4. The pose tracking method according to claim 2 or 3, further comprising, after obtaining the 6 degree-of-freedom pose of the tracked object:
and carrying out re-matching on the pixels of which the re-projection errors in the matching pairs exceed a preset deviation threshold according to the 6-degree-of-freedom posture of the tracked object.
5. A pose tracking apparatus, comprising:
an image acquisition unit configured to acquire an image of a tracking object on which a marker that blinks at a specific frequency is provided;
a pixel acquisition unit configured to acquire a pixel of which luminance varies from the acquired image; and
a pose calculation unit configured to calculate a 6-degree-of-freedom pose of the tracking object based on the acquired pixels,
wherein the attitude calculation unit is configured to:
acquiring inertial measurement unit data of the tracked object, and estimating a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y and z under a body coordinate system of the tracked object;
based on the 3-degree-of-freedom posture and the obtained pixels, solving a corresponding relation between a 2D point set and a 3D point set of the marker to obtain a matching pair of the 2D point set and the 3D point set of the marker, wherein the 2D point set comprises pixel coordinates of the marker, and the 3D point set comprises coordinates of the marker in a body coordinate system of the tracked object;
and calculating the 6-degree-of-freedom posture of the tracked object based on the matching pair, wherein the 6-degree-of-freedom posture is a posture in the directions of three coordinate axes of x, y and z and a posture rotating around three rectangular coordinate axes of x, y and z under the body coordinate system of the tracked object.
6. The pose tracking apparatus of claim 5, wherein the pose calculation unit is further configured to:
removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal;
and carrying out minimum reprojection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object.
7. The pose tracking apparatus of claim 6, wherein the pose calculation unit is further configured to:
removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal;
carrying out minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation;
and optimizing the pose of 6 degrees of freedom after the operation of minimizing the reprojection error according to the pose of 3 degrees of freedom to obtain the pose of 6 degrees of freedom of the tracked object.
8. The pose tracking apparatus of claim 6 or 7, further comprising:
a re-matching unit configured to re-match pixels, whose re-projection errors exceed a preset deviation threshold, according to the 6-degree-of-freedom posture of the tracked object after the 6-degree-of-freedom posture of the tracked object is obtained.
9. An electronic device, comprising:
a camera for acquiring an image of a tracking object on which a marker blinking at a specific frequency is provided, and acquiring pixels of which luminance varies from the acquired image, and
a processor for acquiring inertial measurement unit data of the tracked object, and estimating a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y, and z in a body coordinate system of the tracked object; based on the 3-degree-of-freedom posture and the obtained pixels, solving a corresponding relation between a 2D point set and a 3D point set of the marker to obtain a matching pair of the 2D point set and the 3D point set of the marker, wherein the 2D point set comprises pixel coordinates of the marker, and the 3D point set comprises coordinates of the marker in a body coordinate system of the tracked object; and calculating the 6-degree-of-freedom posture of the tracked object based on the matching pair, wherein the 6-degree-of-freedom posture is a posture in the directions of three coordinate axes of x, y and z and a posture rotating around three rectangular coordinate axes of x, y and z under the body coordinate system of the tracked object.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the pose tracking method of any one of claims 1 to 4.
11. A computing device, comprising:
a processor;
a memory storing a computer program that, when executed by the processor, implements the pose tracking method of any one of claims 1 to 4.
CN201910950626.1A 2019-10-08 2019-10-08 Pose tracking method and device Active CN110782492B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910950626.1A CN110782492B (en) 2019-10-08 2019-10-08 Pose tracking method and device
KR1020200114552A KR20210042011A (en) 2019-10-08 2020-09-08 Posture tracking method and apparatus performing the same
US17/063,909 US11610330B2 (en) 2019-10-08 2020-10-06 Method and apparatus with pose tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910950626.1A CN110782492B (en) 2019-10-08 2019-10-08 Pose tracking method and device

Publications (2)

Publication Number Publication Date
CN110782492A CN110782492A (en) 2020-02-11
CN110782492B true CN110782492B (en) 2023-03-28

Family

ID=69384884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910950626.1A Active CN110782492B (en) 2019-10-08 2019-10-08 Pose tracking method and device

Country Status (2)

Country Link
KR (1) KR20210042011A (en)
CN (1) CN110782492B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111462179B (en) * 2020-03-26 2023-06-27 北京百度网讯科技有限公司 Three-dimensional object tracking method and device and electronic equipment
CN111949123B (en) * 2020-07-01 2023-08-08 青岛小鸟看看科技有限公司 Multi-sensor handle controller hybrid tracking method and device
CN112306271B (en) * 2020-10-30 2022-11-25 歌尔光学科技有限公司 Focus calibration method and device of handle controller and related equipment
CN112991556B (en) * 2021-05-12 2022-05-27 航天宏图信息技术股份有限公司 AR data display method and device, electronic equipment and storage medium
CN113370217B (en) * 2021-06-29 2023-06-16 华南理工大学 Object gesture recognition and grabbing intelligent robot method based on deep learning
US20230031480A1 (en) * 2021-07-28 2023-02-02 Htc Corporation System for tracking camera and control method thereof
CN114549578A (en) * 2021-11-05 2022-05-27 北京小米移动软件有限公司 Target tracking method, device and storage medium
CN114565669A (en) * 2021-12-14 2022-05-31 华人运通(上海)自动驾驶科技有限公司 Method for fusion positioning of field-end multi-camera

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103607541A (en) * 2013-12-02 2014-02-26 吴东辉 Method and system for obtaining information by way of camera shooting, camera shooting device and information modulation device
CN105844659A (en) * 2015-01-14 2016-08-10 北京三星通信技术研究有限公司 Moving part tracking method and device
CN106068533A (en) * 2013-10-14 2016-11-02 瑞尔D股份有限公司 The control of directional display
CN108596980A (en) * 2018-03-29 2018-09-28 中国人民解放军63920部队 Circular target vision positioning precision assessment method, device, storage medium and processing equipment
CN109474817A (en) * 2017-09-06 2019-03-15 原相科技股份有限公司 Optical sensing devices, method and optical detecting module
CN110036258A (en) * 2016-12-08 2019-07-19 索尼互动娱乐股份有限公司 Information processing unit and information processing method
CN110120099A (en) * 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 Localization method, device, recognition and tracking system and computer-readable medium

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103930944B (en) * 2011-06-23 2016-08-24 奥布隆工业有限公司 Adaptive tracking system for space input equipment
EP2728375A1 (en) * 2012-10-31 2014-05-07 Leica Geosystems AG Method and device for determining the orientation of an object
FR3013487B1 (en) * 2013-11-18 2017-04-21 Univ De Nice (Uns) METHOD OF ESTIMATING THE SPEED OF MOVING A CAMERA
US20160247293A1 (en) * 2015-02-24 2016-08-25 Brain Biosciences, Inc. Medical imaging systems and methods for performing motion-corrected image reconstruction
US9934592B1 (en) * 2016-11-15 2018-04-03 Carl Zeiss Industrielle Messtechnik Gmbh Method and system for determining a 6-DOF-pose of an object in space
EP3447448B1 (en) * 2017-07-24 2021-01-06 Trifo, Inc. Fault-tolerance to provide robust tracking for autonomous and non-autonomous positional awareness
US10529074B2 (en) * 2017-09-28 2020-01-07 Samsung Electronics Co., Ltd. Camera pose and plane estimation using active markers and a dynamic vision sensor
CN108648215B (en) * 2018-06-22 2022-04-15 南京邮电大学 SLAM motion blur pose tracking algorithm based on IMU

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106068533A (en) * 2013-10-14 2016-11-02 瑞尔D股份有限公司 The control of directional display
CN103607541A (en) * 2013-12-02 2014-02-26 吴东辉 Method and system for obtaining information by way of camera shooting, camera shooting device and information modulation device
CN105844659A (en) * 2015-01-14 2016-08-10 北京三星通信技术研究有限公司 Moving part tracking method and device
CN110036258A (en) * 2016-12-08 2019-07-19 索尼互动娱乐股份有限公司 Information processing unit and information processing method
CN109474817A (en) * 2017-09-06 2019-03-15 原相科技股份有限公司 Optical sensing devices, method and optical detecting module
CN110120099A (en) * 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 Localization method, device, recognition and tracking system and computer-readable medium
CN108596980A (en) * 2018-03-29 2018-09-28 中国人民解放军63920部队 Circular target vision positioning precision assessment method, device, storage medium and processing equipment

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Outside-in monocular IR camera based HMD pose estimation via geometric optimization;Pavel A. Savkin 等;《the 23rd ACM Symposium on Virtual Reality Software and Technology》;20171108;全文 *
Low-light CIS suitable for day and night vision;Pan Jingsheng et al.;Infrared Technology;20161231;Vol. 38, No. 3;full text *

Also Published As

Publication number Publication date
KR20210042011A (en) 2021-04-16
CN110782492A (en) 2020-02-11

Similar Documents

Publication Publication Date Title
CN110782492B (en) Pose tracking method and device
JP6821714B2 (en) Improved camera calibration system, target, and process
CN108022264B (en) Method and equipment for determining camera pose
EP4224424A1 (en) Method and system for determining spatial coordinates of a 3d reconstruction of at least part of a real object at absolute spatial scale
JP6594129B2 (en) Information processing apparatus, information processing method, and program
JP2018522348A (en) Method and system for estimating the three-dimensional posture of a sensor
CN109961523B (en) Method, device, system, equipment and storage medium for updating virtual target
CN108519102B (en) Binocular vision mileage calculation method based on secondary projection
US11108964B2 (en) Information processing apparatus presenting information, information processing method, and storage medium
CN112652016A (en) Point cloud prediction model generation method, pose estimation method and device
US10755422B2 (en) Tracking system and method thereof
CN108257177B (en) Positioning system and method based on space identification
JP6129363B2 (en) Interactive system, remote control and operation method thereof
CN112288825A (en) Camera calibration method and device, electronic equipment, storage medium and road side equipment
US10902610B2 (en) Moving object controller, landmark, and moving object control method
US20200166409A1 (en) Temperature processing apparatus and temperature processing method
CN111427452B (en) Tracking method of controller and VR system
JP2020179441A (en) Control system, information processing device and control method
JP2015031601A (en) Three-dimensional measurement instrument, method, and program
JP6922348B2 (en) Information processing equipment, methods, and programs
McIlroy et al. Kinectrack: 3d pose estimation using a projected dense dot pattern
US11758100B2 (en) Portable projection mapping device and projection mapping system
CN118212257A (en) Training method, pose tracking system, pose tracking device and storage medium
CN114926542A (en) Mixed reality fixed reference system calibration method based on optical positioning system
CN115082520A (en) Positioning tracking method and device, terminal equipment and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant