CN110782492A - Pose tracking method and device - Google Patents

Pose tracking method and device Download PDF

Info

Publication number
CN110782492A
CN110782492A (application CN201910950626.1A)
Authority
CN
China
Prior art keywords
pose
freedom
degree
tracked object
tracking
Prior art date
Legal status
Granted
Application number
CN201910950626.1A
Other languages
Chinese (zh)
Other versions
CN110782492B (en)
Inventor
唐创奇
李卓
李宇光
Current Assignee
Samsung China Semiconductor Co Ltd
Samsung Electronics Co Ltd
Original Assignee
Samsung China Semiconductor Co Ltd
Samsung Electronics Co Ltd
Priority date
Filing date
Publication date
Application filed by Samsung China Semiconductor Co Ltd, Samsung Electronics Co Ltd filed Critical Samsung China Semiconductor Co Ltd
Priority to CN201910950626.1A priority Critical patent/CN110782492B/en
Publication of CN110782492A publication Critical patent/CN110782492A/en
Priority to KR1020200114552A priority patent/KR20210042011A/en
Priority to US17/063,909 priority patent/US11610330B2/en
Application granted granted Critical
Publication of CN110782492B publication Critical patent/CN110782492B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06T5/70
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G06T2207/30204 Marker

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

A pose tracking method and device are provided. The pose tracking method comprises the following steps: acquiring an image of a tracked object, wherein markers that blink at a specific frequency are provided on the tracked object; acquiring, from the acquired image, the pixels whose brightness has changed; and calculating the 6-degree-of-freedom pose of the tracked object based on the acquired pixels. This reduces the dependency of pose tracking on a specific layout of the LED markers, reduces the latency of pose tracking, and improves the accuracy and efficiency of pose tracking.

Description

Pose tracking method and device
Technical Field
The present disclosure relates to the field of computer vision technology. More specifically, the present disclosure relates to a pose tracking method and apparatus.
Background
In recent years, various 6-degree-of-freedom pose estimation methods have been proposed and widely applied in fields such as robotic grasping, virtual reality/augmented reality, and human-computer interaction. Virtual reality and augmented reality place high demands on system latency: if the system responds sluggishly to head movements, it causes dizziness and nausea in the user. The latency of Valve's tracking system is 7-15 milliseconds. The lowest latency of currently commercially available virtual reality (VR) tracking products is 15 milliseconds, so users cannot have a fully immersive experience.
Many current optical tracking systems are based on Complementary Metal-Oxide-Semiconductor (CMOS) cameras, but the latency of such consumer-grade CMOS cameras is typically greater than 16.7 milliseconds (60 FPS). This hardware limitation means these methods cannot display the user's motion input on the screen in a timely manner and cannot meet VR's low-latency requirements.
Active Light-Emitting Diode (LED) markers are also used in some schemes to recover the 6-degree-of-freedom pose of an object. However, these methods have limitations: either the number of LED lamps must be exactly four and the lamps coplanar, or the amount of computation is too large for a real-time system. Using only 4 LED lamps affects the accuracy of pose tracking and the robustness of the system, because the pose solution fails if any one of the lamps is not detected. In addition, brute-force solving of the 2D/3D point-set correspondence is time-consuming and cannot be applied when there are a large number of LED lamps or in a real-time system.
Disclosure of Invention
An exemplary embodiment of the present disclosure is to provide a pose tracking method and apparatus to reduce dependency of pose tracking on a specific layout of an LED marker, and at the same time, reduce delay of pose tracking, and improve accuracy and efficiency of pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided a pose tracking method including: acquiring an image of a tracking object, wherein a mark which flickers at a specific frequency is arranged on the tracking object; acquiring pixels with changed brightness from the acquired image; the 6-degree-of-freedom posture of the tracked object is calculated based on the acquired pixels, so that the dependency of posture tracking on the specific layout of the LED markers is reduced, the delay of posture tracking is reduced, and the precision and the efficiency of posture tracking are improved.
Optionally, the step of calculating a 6-degree-of-freedom pose of the tracked object based on the acquired pixels may comprise: acquiring inertial measurement unit data of the tracked object, and estimating a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y and z under a body coordinate system of the tracked object; and calculating a 6-degree-of-freedom attitude of the tracked object based on the 3-degree-of-freedom attitude and the acquired pixels, wherein the 6-degree-of-freedom attitude is an attitude along three coordinate axis directions of x, y and z and an attitude rotating around three rectangular coordinate axes of x, y and z under a body coordinate system of the tracked object, so that the accuracy and the efficiency of attitude tracking are improved.
Optionally, the step of calculating a 6 degree-of-freedom pose of the tracked object based on the 3 degree-of-freedom pose and the acquired pixels may comprise: based on the 3-degree-of-freedom posture and the obtained pixels, solving a corresponding relation between a 2D point set and a 3D point set of the marker to obtain a matching pair of the 2D point set and the 3D point set of the marker, wherein the 2D point set comprises pixel coordinates of the marker, and the 3D point set comprises coordinates of the marker in a body coordinate system of the tracked object; and calculating the 6-degree-of-freedom posture of the tracked object based on the matching pair, thereby improving the precision and efficiency of posture tracking.
Optionally, the step of calculating the 6-degree-of-freedom pose of the tracked object may comprise: removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal; and performing minimum reprojection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object, so that the precision and the efficiency of pose tracking are improved.
Optionally, the step of calculating the 6-degree-of-freedom pose of the tracked object may comprise: removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal; performing minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation; and optimizing the pose of 6 degrees of freedom after the operation of minimizing the reprojection error according to the pose of 3 degrees of freedom to obtain the pose of 6 degrees of freedom of the tracked object, thereby improving the precision and the efficiency of pose tracking.
Optionally, after the 6-degree-of-freedom pose of the tracked object is obtained, the pose tracking method may further include: re-matching, according to the 6-degree-of-freedom pose of the tracked object, the pixels in the matching pairs whose reprojection error exceeds a preset deviation threshold, for use in subsequent pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided a pose tracking apparatus including: an image acquisition unit configured to acquire an image of a tracking object on which a marker that blinks at a specific frequency is provided; a pixel acquisition unit configured to acquire a pixel of which luminance varies from the acquired image; and a pose calculation unit configured to calculate a 6-degree-of-freedom pose of the tracking object based on the acquired pixels, thereby reducing dependency of pose tracking on a specific layout of the LED markers while reducing delay of the pose tracking, and improving accuracy and efficiency of the pose tracking.
Optionally, the pose calculation unit may be configured to: acquiring inertial measurement unit data of the tracked object, and estimating a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y and z under a body coordinate system of the tracked object; and calculating a 6-degree-of-freedom attitude of the tracked object based on the 3-degree-of-freedom attitude and the acquired pixels, wherein the 6-degree-of-freedom attitude is an attitude along three coordinate axis directions of x, y and z and an attitude rotating around three rectangular coordinate axes of x, y and z under a body coordinate system of the tracked object, so that the accuracy and the efficiency of attitude tracking are improved.
Optionally, the pose calculation unit may be further configured to: based on the 3-degree-of-freedom posture and the obtained pixels, solving a corresponding relation between a 2D point set and a 3D point set of the marker to obtain a matching pair of the 2D point set and the 3D point set of the marker, wherein the 2D point set comprises pixel coordinates of the marker, and the 3D point set comprises coordinates of the marker in a body coordinate system of the tracked object; and calculating the 6-degree-of-freedom posture of the tracked object based on the matching pair, thereby improving the precision and efficiency of posture tracking.
Optionally, the pose calculation unit may be further configured to: removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal; and performing minimum reprojection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object, so that the precision and the efficiency of pose tracking are improved.
Optionally, the pose calculation unit may be further configured to: removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal; performing minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation; and fusing the 3-degree-of-freedom posture with the 6-degree-of-freedom posture after the minimized reprojection error operation to obtain the 6-degree-of-freedom posture of the tracked object, thereby improving the precision and efficiency of posture tracking.
Optionally, the pose tracking apparatus may further include a re-matching unit configured to, after the 6-degree-of-freedom pose of the tracked object is obtained, re-match the pixels whose reprojection error exceeds a preset deviation threshold according to the 6-degree-of-freedom pose of the tracked object, for use in subsequent pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided an electronic apparatus including: a camera for acquiring an image of a tracked object, on which markers blinking at a specific frequency are provided, and for acquiring from the acquired image the pixels whose brightness has changed; and a processor for calculating the 6-degree-of-freedom pose of the tracked object based on the pixels acquired by the camera, thereby reducing the dependency of pose tracking on a specific layout of the LED markers, reducing the latency of pose tracking, and improving the accuracy and efficiency of pose tracking.
According to an exemplary embodiment of the present disclosure, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a pose tracking method according to an exemplary embodiment of the present disclosure.
According to an exemplary embodiment of the present disclosure, there is provided a computing apparatus including: a processor; a memory storing a computer program that, when executed by the processor, implements a pose tracking method according to an exemplary embodiment of the present disclosure.
According to the pose tracking method and device, the image of the tracking object is obtained, the pixels with changed brightness are obtained from the obtained image, and the 6-degree-of-freedom pose of the tracking object is calculated based on the obtained pixels, so that the dependency of pose tracking on the specific layout of the LED markers is reduced, meanwhile, the delay of the pose tracking is reduced, and the precision and the efficiency of the pose tracking are improved.
Additional aspects and/or advantages of the present general inventive concept will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the general inventive concept.
Drawings
The above and other objects and features of the exemplary embodiments of the present disclosure will become more apparent from the following description taken in conjunction with the accompanying drawings which illustrate exemplary embodiments, wherein:
FIG. 1 illustrates a flow diagram of a pose tracking method according to an exemplary embodiment of the present disclosure;
FIG. 2 shows the 2D active LED marker tracking results of a tracked object;
FIG. 3 shows a block diagram of a pose tracking apparatus according to an example embodiment of the present disclosure;
FIG. 4 shows a schematic view of an electronic device according to an exemplary embodiment of the present disclosure; and
fig. 5 shows a schematic diagram of a computing device according to an exemplary embodiment of the present disclosure.
Detailed Description
Reference will now be made in detail to the exemplary embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the like elements throughout. The embodiments are described below in order to explain the present disclosure by referring to the figures.
Fig. 1 shows a flowchart of a pose tracking method according to an exemplary embodiment of the present disclosure. The pose tracking method shown in Fig. 1 is applicable to a tracked object provided with a plurality of markers blinking at a specific frequency; such markers may be, for example, active LEDs. In the following description, an LED lamp is used as an example of a marker, but it should be understood that the present invention is not limited thereto. Those skilled in the art may use other forms of markers as an embodiment requires.
Referring to fig. 1, in step S101, an image of a tracking target is acquired.
In an exemplary embodiment of the present disclosure, the image of the tracked object may be acquired by a Dynamic Vision Sensor (DVS) camera. The pose tracking method illustrated in Fig. 1 is applicable to an electronic apparatus that has both a camera capable of acquiring brightness-change pixels and a host capable of computation, or to a system composed of such a camera and such a host.
In step S102, pixels with luminance changes are acquired from the acquired image.
In the exemplary embodiment of the present disclosure, after obtaining the image in step S101, the DVS camera does not transmit the full image to the host; instead, in step S102 it extracts the pixels whose brightness changed from the acquired image and transmits only those pixels to the host for pose tracking, thereby reducing both the amount of data transmitted and the amount of data to be processed, and reducing the latency of pose tracking.
Specifically, when a marker (e.g., an active LED lamp) blinks, the DVS generates corresponding on and off events. These events can easily be distinguished from other events by the frequency of LED blinking. The screened events can be divided into different clusters, each cluster representing one LED lamp, and then processed with a region-growing-based clustering algorithm and a lightweight voting algorithm to identify the densest point of each cluster as the center of the marker. In addition, a global-nearest-neighbor tracking method and Kalman filtering based on a constant-velocity model can be used to track multiple markers, reducing the probability of missed and false detections of the markers.
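As a rough illustration (not the patent's actual implementation), the event screening and region-growing clustering described above can be sketched in pure Python. The 800-1200 microsecond blink window matches the interval given later in this description; the function names and the Chebyshev cluster radius are illustrative assumptions:

```python
BLINK_MIN_US, BLINK_MAX_US = 800, 1200  # accepted on/off toggle interval

def filter_led_events(events, last_toggle):
    """Keep only events whose toggle interval at a pixel matches the LED blink rate.

    events: iterable of (t_us, x, y, polarity); last_toggle: dict[(x, y)] -> last t_us.
    """
    led_events = []
    for t, x, y, p in events:
        prev = last_toggle.get((x, y))
        last_toggle[(x, y)] = t
        if prev is not None and BLINK_MIN_US <= t - prev <= BLINK_MAX_US:
            led_events.append((t, x, y, p))
    return led_events

def cluster_events(points, radius=3):
    """Greedy region growing: a pixel joins a cluster if it lies within `radius`
    (Chebyshev distance) of any pixel already in that cluster."""
    clusters = []
    for x, y in points:
        for c in clusters:
            if any(max(abs(x - cx), abs(y - cy)) <= radius for cx, cy in c):
                c.append((x, y))
                break
        else:
            clusters.append([(x, y)])
    return clusters
```

Each resulting cluster would then be reduced to a marker center by the voting step described above.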
In step S103, the 6-degree-of-freedom pose of the tracked object is calculated based on the acquired pixels. Here, the 6-degree-of-freedom pose comprises the translations along the x, y, and z coordinate axes and the rotations around the x, y, and z coordinate axes in the body coordinate system of the tracked object.
In an exemplary embodiment of the present disclosure, only those pixels whose luminance changes due to motion are used to calculate the 6-degree-of-freedom pose of the tracking object, thereby reducing the delay of pose tracking.
In an exemplary embodiment of the present disclosure, in calculating the 6-degree-of-freedom pose of the tracking object based on the acquired pixels, Inertial Measurement Unit (IMU) data of the tracking object may be first acquired, a 3-degree-of-freedom pose of the tracking object (i.e., a pose rotated around three coordinate axes of x, y, and z in a body coordinate system of the tracking object) may be estimated based on the acquired Inertial Measurement Unit (IMU) data, and then the 6-degree-of-freedom pose of the tracking object may be calculated based on the 3-degree-of-freedom pose and the acquired pixels, thereby improving accuracy and efficiency of pose tracking.
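For intuition, the 3-degree-of-freedom attitude can be propagated from gyroscope rates by quaternion integration. This is a generic sketch only (the patent does not specify its attitude-and-heading-reference-system filter); quaternions are stored as (w, x, y, z):

```python
import math

def quat_mul(a, b):
    # Hamilton product of two (w, x, y, z) quaternions.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def integrate_gyro(q, omega, dt):
    """One attitude update: rotate q by the body-frame angular rate omega (rad/s) over dt."""
    wx, wy, wz = omega
    n = math.sqrt(wx*wx + wy*wy + wz*wz)
    if n * dt < 1e-12:
        return q  # negligible rotation
    half = n * dt / 2.0
    s = math.sin(half) / n
    return quat_mul(q, (math.cos(half), wx*s, wy*s, wz*s))
```

In practice an AHRS also fuses accelerometer/magnetometer data to correct gyro drift; the sketch shows only the propagation step.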
In an exemplary embodiment of the present disclosure, when calculating the 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom pose and the acquired pixels, the correspondence between the 2D point set and the 3D point set of the marker may be first solved based on the 3-degree-of-freedom pose and the acquired pixels, resulting in a matching pair of the marker with respect to the 2D point set and the 3D point set, and then the 6-degree-of-freedom pose of the tracked object may be calculated based on the matching pair. Here, the 2D point set includes the pixel coordinates of each marker, and the 3D point set includes the coordinates of each marker in the body coordinate system of the tracking target.
Specifically, when calculating the 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom pose and the acquired pixels, let p_IA and p_IB denote the undistorted pixel coordinates of two LED lamps (LED lamp A and LED lamp B) on the image, and let p_A and p_B denote the coordinates of LED lamp A and LED lamp B in the object body coordinate system. The 3-degree-of-freedom attitude matrix R = [r_1, r_2, r_3]^T of the tracked object can be estimated from the IMU data by the attitude and heading reference system, where r_1, r_2, and r_3 denote the row vectors of the first, second, and third rows of R. t = [t_x, t_y, t_z]^T is the unknown translation vector, where t_x, t_y, t_z denote the displacements along the x, y, and z coordinate axes. From the pinhole imaging model, the following equations can be derived:

x_A = (r_1·p_A + t_x) / (r_3·p_A + t_z),    y_A = (r_2·p_A + t_y) / (r_3·p_A + t_z)    (1)
x_B = (r_1·p_B + t_x) / (r_3·p_B + t_z),    y_B = (r_2·p_B + t_y) / (r_3·p_B + t_z)    (2)

where x_A and y_A denote the x- and y-coordinates of LED lamp A on the image, and x_B and y_B denote the x- and y-coordinates of LED lamp B on the image. The only unknown in these equations is t: 4 equations, 3 unknowns. Solving them yields

t = A_z·p_IA − R·p_A    (3)

where A_z, the depth of LED lamp A in the camera coordinate system, satisfies

A_z = (r_1·(p_B − p_A) − x_B·r_3·(p_B − p_A)) / (x_B − x_A)

(an equivalent expression follows from the y-components).
The pixel-coordinate point set O of the LED lamps in the image can be obtained through the DVS camera detection and clustering algorithm, and the coordinate point set L of the LED lamps in the object body coordinate system is known. If two points (o, l) are selected arbitrarily from the point sets O and L and paired, the translation t can be calculated by equation (3). Through this operation, a list T of candidate translation vectors is obtained. Invalid translation vectors (vector elements too large, or t_z negative) can be deleted, and approximately equal translation vectors can be merged. For any valid translation vector t_valid in T, the corresponding set L_v of visible LED lamps can be determined with the Möller-Trumbore ray intersection algorithm: if the first intersection of the camera-to-LED ray with the object is that LED lamp, the lamp is visible; otherwise the lamp is not visible in this pose. Determining the visible LED lamp point set reduces the amount of computation and the number of mismatches when there are many LED lamps. The visible point set L_v is then projected onto the image plane P using the corresponding 6-degree-of-freedom pose and the camera intrinsic parameters. The Kuhn-Munkres algorithm can then be used to find the best match between the visible point set L_v and the observed point set O, together with its matching error. Traversing the list T of candidate translation vectors, the match set with the smallest matching error gives the correct matching pairs between the 2D point set and the 3D point set.
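Under the definitions above, the two-point translation solve of equation (3) can be sketched as follows. This is a pure-Python illustration; the depth expression used is the one derived from the x-components of the pinhole equations, and a degenerate pair with x_B close to x_A would need the y-components instead:

```python
def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def solve_translation(R, pA, pB, pixA, pixB):
    """Recover t from one LED pair given the IMU attitude R (rows r1, r2, r3).

    pA, pB: LED coordinates in the object body frame (3D points).
    pixA, pixB: undistorted normalized pixel coordinates (x, y) of the two LEDs.
    Implements t = A_z * p_IA - R * p_A (equation (3) in the text).
    """
    r1, r2, r3 = R
    (xA, yA), (xB, yB) = pixA, pixB
    d = tuple(b - a for a, b in zip(pA, pB))  # p_B - p_A
    # depth of LED A along the camera z-axis, from the x-components
    zA = (dot(r1, d) - xB * dot(r3, d)) / (xB - xA)
    RpA = (dot(r1, pA), dot(r2, pA), dot(r3, pA))
    return tuple(zA * h - rp for h, rp in zip((xA, yA, 1.0), RpA))
```

Running this over all (o, l) pairings of the point sets O and L would yield the candidate translation list T described above.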
In the exemplary embodiment of the present disclosure, when calculating the 6-degree-of-freedom pose of the tracked object, pixels whose reprojection deviation exceeds a preset deviation threshold may be first removed from the matching pair, and the 6-degree-of-freedom pose may be calculated according to the remaining pixels after removal, and then the 6-degree-of-freedom pose obtained by calculation may be subjected to a minimum reprojection error operation to obtain the 6-degree-of-freedom pose of the tracked object, thereby further improving the accuracy and efficiency of pose tracking.
In an exemplary embodiment of the present disclosure, when calculating the 6-degree-of-freedom pose of the tracked object, pixels whose reprojection deviation exceeds a preset deviation threshold may be first removed from the matching pair, and the 6-degree-of-freedom pose may be calculated according to the remaining pixels after removal, then the minimized reprojection error operation may be performed on the calculated 6-degree-of-freedom pose, and then the 6-degree-of-freedom pose after the minimized reprojection error operation may be optimized according to the 3-degree-of-freedom pose, so as to obtain the 6-degree-of-freedom pose of the tracked object, thereby further improving the accuracy and efficiency of pose tracking.
In an exemplary embodiment of the present disclosure, after the 6-degree-of-freedom pose of the tracked object is obtained, pixels in the matching pair whose reprojection error exceeds a preset deviation threshold may be further re-matched according to the 6-degree-of-freedom pose of the tracked object. In addition, the corresponding relation between the 2D point set and the 3D point set of the newly observed mark (such as an active LED lamp) or the matching pair of the 2D point set and the 3D point set can be updated, so that the pose tracking precision and efficiency are further improved.
Specifically, after the correspondence between the 2D point set and the 3D point set is obtained, a Random Sample Consensus (RANSAC) algorithm can be used to remove the points whose reprojection deviation in a matching pair exceeds a preset deviation threshold, and then the Efficient Perspective-n-Point (EPnP) algorithm is used to solve for the 6-degree-of-freedom pose. The rough pose obtained by the EPnP algorithm is then further refined using a Bundle Adjustment (BA) algorithm to obtain a more accurate pose. The accurate pose can be used to re-match the points in the matching pairs whose reprojection error exceeds the preset deviation threshold and to update the matching relationships of newly observed LEDs. Finally, the obtained 6-degree-of-freedom pose and the 3-degree-of-freedom pose from the IMU are fused by a sensor-fusion algorithm based on an extended Kalman filter to obtain a smoother and more consistent 6-degree-of-freedom pose (i.e., the fused 6-degree-of-freedom pose). In addition, points whose reprojection error exceeds the preset deviation threshold can be re-matched using the fused 6-degree-of-freedom pose, and the matching relationships of newly observed LEDs can be updated.
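The outlier-rejection step above can be illustrated with a minimal reprojection-error check. The RANSAC sampling loop and the EPnP/BA solvers themselves are omitted, and all names are illustrative:

```python
def project(R, t, p):
    """Pinhole projection of body-frame point p under pose (R, t); R given as rows."""
    cam = [sum(r[i] * p[i] for i in range(3)) + ti for r, ti in zip(R, t)]
    return (cam[0] / cam[2], cam[1] / cam[2])

def split_inliers(matches, R, t, thresh):
    """Partition 2D/3D matches by reprojection error, mimicking the outlier-removal
    step: matches with error above `thresh` are handed back for re-matching."""
    inliers, outliers = [], []
    for pix, p3d in matches:
        u, v = project(R, t, p3d)
        err = ((u - pix[0]) ** 2 + (v - pix[1]) ** 2) ** 0.5
        (inliers if err <= thresh else outliers).append((pix, p3d))
    return inliers, outliers
```

In a full pipeline the inlier set would be fed to EPnP/BA and the outliers re-matched under the refined pose.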
According to the pose tracking method of the exemplary embodiment of the disclosure, the dependency of pose tracking on the specific layout of the LED markers is reduced, meanwhile, the delay of the pose tracking is reduced, and the precision and the efficiency of the pose tracking are improved.
The DVS camera used to capture the pixels whose brightness changes due to the tracked object's motion may be a third-generation VGA device (e.g., from Samsung) with a resolution of 640 × 480, connected to the host through USB 3.0. The DVS camera generates events for pixels whose relative illumination intensity changes. Each event is represented by a tuple <t, x, y, p>, where t is the timestamp at which the event occurred (with microsecond-order resolution), (x, y) is the pixel coordinate corresponding to the event, and p ∈ {0, 1} is the polarity of the event: an event <t, x, y, 1> occurs when an LED lamp turns on, and an event <t, x, y, 0> occurs when it turns off.
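The event tuple just described maps naturally onto a small record type. This is only an illustrative container, not an actual camera-driver API:

```python
from typing import NamedTuple

class Event(NamedTuple):
    """DVS event <t, x, y, p> as described above."""
    t: int  # timestamp in microseconds
    x: int  # pixel column
    y: int  # pixel row
    p: int  # 1 = brightness increase (LED turning on), 0 = decrease (turning off)

def split_by_polarity(events):
    """Separate on-events from off-events for downstream blink-interval checks."""
    on = [e for e in events if e.p == 1]
    off = [e for e in events if e.p == 0]
    return on, off
```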
The pose tracking method relies on some fixed parameters (camera intrinsic parameters) and transfer matrices (IMU to handle body, DVS camera to helmet, etc.). In exemplary embodiments of the present disclosure, an OptiTrack optical motion tracking system is used to calibrate these fixed parameters and transfer matrices, and also to provide ground-truth handle poses for evaluating the accuracy of the pose tracking. Once calibration is complete, the OptiTrack optical motion tracking system is not needed during actual use.
In the OptiTrack optical motion tracking system there are several coordinate systems: the camera (C) coordinate system, helmet (H) coordinate system, world (W) coordinate system, IMU (I) coordinate system, handle body (B) coordinate system, handle model (M) coordinate system, and OptiTrack (O) coordinate system. To simplify the system, the handle body coordinate system is aligned with the handle model coordinate system, and the world coordinate system is aligned with the OptiTrack coordinate system. The 6-degree-of-freedom pose of the moving handle to be solved can therefore be expressed as T^C_M = [R^C_M | ^C P_M], where R^C_M is the rotation matrix from the handle model coordinate system to the camera coordinate system and ^C P_M is the position of the origin of the model coordinate system expressed in the camera coordinate system. The intrinsic parameters of the DVS camera and some fixed transfer matrices need to be calibrated in advance.
Since the optical characteristics of a DVS are the same as those of a normal CMOS camera, a standard pinhole imaging model can be used to determine the camera parameters (e.g., focal length, center of projection, and distortion parameters). The DVS camera differs from a normal camera in that it cannot see anything without an illumination change, so a blinking checkerboard can be used to calibrate the DVS.
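Under the pinhole model mentioned above, a camera-frame point (X, Y, Z) projects to the pixel (fx·X/Z + cx, fy·Y/Z + cy); a minimal sketch with illustrative intrinsic values (distortion omitted):

```python
import numpy as np

def project_pinhole(K: np.ndarray, pt_cam: np.ndarray) -> np.ndarray:
    """Project a 3D camera-frame point through pinhole intrinsics
    K = [[fx, 0, cx], [0, fy, cy], [0, 0, 1]]."""
    x, y, z = pt_cam
    u = K[0, 0] * x / z + K[0, 2]  # horizontal pixel coordinate
    v = K[1, 1] * y / z + K[1, 2]  # vertical pixel coordinate
    return np.array([u, v])
```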
OptiTrack marker balls can be used to calibrate the fixed transfer matrices. In the 3D model, the origin of the handle model is the center of the great circle at the top end of the handle. OptiTrack balls can be fixed on the circumference of this great circle, corresponding to the directions of the X, Y, and Z axes of the marker model, so that the model coordinate system of the motion handle can be determined in the OptiTrack optical motion tracking system. An OptiTrack ball is mounted at the IMU chip, and readings from the IMU are used to orient the X, Y, and Z axes, thus defining the IMU coordinate system in the OptiTrack optical motion tracking system. The OptiTrack optical motion tracking system records the poses of the IMU coordinate system and the model coordinate system in real time, from which the conversion relation between the model coordinate system of the motion handle and the IMU coordinate system can be calculated. The conversion matrix from helmet to camera, ${}^{C}T_{H}$, can likewise be determined from the blinking LEDs of particular patterns: ${}^{C}T_{H} = {}^{C}T_{M}\,({}^{O}T_{M})^{-1}\,{}^{O}T_{H}$, where M here denotes the coordinate system of the blinking-LED pattern, ${}^{O}T_{M}$ and ${}^{O}T_{H}$ are provided by the OptiTrack optical motion tracking system, and ${}^{C}T_{M}$ can be obtained by calculation with the EPnP algorithm (the point-set matching relationship for a particular pattern is fixed).
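The calibration above repeatedly composes and inverts rigid transforms between coordinate systems; a small numpy sketch of those two operations (function names illustrative, exploiting the fact that a rotation block satisfies R⁻¹ = Rᵀ):

```python
import numpy as np

def invert_rigid(T: np.ndarray) -> np.ndarray:
    """Invert a 4x4 rigid transform using R^-1 = R^T, avoiding a general inverse."""
    R, p = T[:3, :3], T[:3, 3]
    Ti = np.eye(4)
    Ti[:3, :3] = R.T
    Ti[:3, 3] = -R.T @ p
    return Ti

def chain(*Ts: np.ndarray) -> np.ndarray:
    """Compose rigid transforms left to right: chain(A, B) = A @ B."""
    out = np.eye(4)
    for T in Ts:
        out = out @ T
    return out
```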
the active LED marker blinking interval may be determined by an on-off-on, off-on-off event. If the blinking interval of a pixel is in the [800 mus, 1200 mus ] interval, it can be concluded that this event is caused by LED blinking. The center position of the LED is then determined using a region growing algorithm and a voting method. A plurality of LED lamps may be tracked using global nearest neighbor and kalman filtering. Fig. 2 shows the 2D active LED marker tracking result of the tracked object. In fig. 2, each continuous line represents the movement trace of one marker (e.g., LED), the small open circle represents the start point of the movement trace, and the large circle with a filled circle inside represents the end point of the movement trace.
In addition, to improve the performance of the BA (bundle adjustment) algorithm, its optimization window may be set to 10 frames, with optimization performed every 4 frames. The matching relationship is not updated every time; it is updated only when the ratio of the number of matched LED lamps to the number of all observed LED lamps falls below a preset value, e.g., 0.6, which improves the efficiency of the BA algorithm. After initialization in the start-up phase, the entire processing flow takes only 1.23 ms; adding the screening time for LED blinking events (1 ms), the overall delay is 2.23 ms.
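The scheduling policy above — a window of 10 frames, optimization every 4 frames, and re-matching only when the matched ratio drops below 0.6 — can be sketched as follows (the class structure is hypothetical; only the constants come from the text):

```python
from collections import deque

class BAScheduler:
    """Sliding-window scheduler: keep the last `window` frames, run optimization
    every `stride` frames, and re-match only when the matched-LED ratio drops
    below `ratio_threshold`."""
    def __init__(self, window: int = 10, stride: int = 4, ratio_threshold: float = 0.6):
        self.frames = deque(maxlen=window)  # optimization window
        self.stride = stride
        self.ratio_threshold = ratio_threshold
        self.count = 0

    def add_frame(self, frame) -> bool:
        """Return True when a BA optimization pass should run for this frame."""
        self.frames.append(frame)
        self.count += 1
        return self.count % self.stride == 0

    def needs_rematch(self, n_matched: int, n_observed: int) -> bool:
        """Update the 2D-3D matching only when too few observed LEDs are matched."""
        return n_observed > 0 and n_matched / n_observed < self.ratio_threshold
```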
The pose tracking method according to the exemplary embodiment of the present disclosure has been described above with reference to fig. 1 to 2. Hereinafter, a pose tracking apparatus and units thereof according to an exemplary embodiment of the present disclosure will be described with reference to fig. 3.
Fig. 3 shows a block diagram of a pose tracking apparatus according to an exemplary embodiment of the present disclosure.
Referring to fig. 3, the pose tracking apparatus includes an image acquisition unit 31, a pixel acquisition unit 32, and a pose calculation unit 33.
The image acquisition unit 31 is configured to acquire an image of a tracking object on which a marker that blinks at a specific frequency is provided.
The pixel acquisition unit 32 is configured to acquire pixels of varying brightness from the acquired image.
The pose calculation unit 33 is configured to calculate a 6-degree-of-freedom pose of the tracking object based on the acquired pixels.
In an exemplary embodiment of the present disclosure, the pose calculation unit 33 may be configured to: acquire inertial measurement unit data of the tracked object, and estimate a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, where the 3-degree-of-freedom attitude is the attitude of rotation around the x, y, and z coordinate axes in the body coordinate system of the tracked object; and calculate a 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom attitude and the acquired pixels, where the 6-degree-of-freedom pose comprises the position along the x, y, and z coordinate axes and the attitude of rotation around the x, y, and z coordinate axes in the body coordinate system of the tracked object.
In an exemplary embodiment of the present disclosure, the pose calculation unit 33 may be further configured to: solve, based on the 3-degree-of-freedom attitude and the acquired pixels, the correspondence between the 2D point set and the 3D point set of the markers to obtain matching pairs between the 2D point set and the 3D point set, where the 2D point set comprises the pixel coordinates of each marker and the 3D point set comprises the coordinates of each marker in the body coordinate system of the tracked object; and calculate a 6-degree-of-freedom pose of the tracked object based on the matching pairs.
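One simple way to realize the 2D-3D correspondence step described above is greedy nearest-neighbour matching of the detected 2D points against the 3D markers projected with the estimated attitude (a hedged sketch: the translation guess and the greedy pairing are illustrative simplifications, not the disclosed algorithm):

```python
import numpy as np

def match_2d_3d(pts_2d, pts_3d, R_attitude, K, t_guess):
    """Project each 3D marker using the IMU 3-DoF attitude (plus a translation
    guess), then pair every 2D detection with the closest projection.
    Returns a list of (index_2d, index_3d) pairs."""
    cam = (R_attitude @ pts_3d.T).T + t_guess       # model frame -> camera frame
    proj = (K @ (cam / cam[:, 2:3]).T).T[:, :2]     # pinhole projection to pixels
    pairs = []
    for i, p in enumerate(pts_2d):
        j = int(np.argmin(np.linalg.norm(proj - p, axis=1)))
        pairs.append((i, j))
    return pairs
```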
In an exemplary embodiment of the present disclosure, the pose calculation unit 33 may be further configured to: remove, from the matching pairs, pixels whose reprojection deviation exceeds a preset deviation threshold, and calculate a 6-degree-of-freedom pose from the pixels remaining after the removal; and perform a minimum-reprojection-error operation on the calculated 6-degree-of-freedom pose to obtain the 6-degree-of-freedom pose of the tracked object.
In an exemplary embodiment of the present disclosure, the pose calculation unit 33 may be further configured to: remove, from the matching pairs, pixels whose reprojection deviation exceeds a preset deviation threshold, and calculate a 6-degree-of-freedom pose from the pixels remaining after the removal; perform a minimum-reprojection-error operation on the calculated 6-degree-of-freedom pose; and optimize, according to the 3-degree-of-freedom attitude, the 6-degree-of-freedom pose resulting from the minimum-reprojection-error operation to obtain the 6-degree-of-freedom pose of the tracked object.
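The outlier-removal step described above can be sketched as a simple reprojection-deviation filter (numpy, names illustrative; the subsequent minimum-reprojection-error refinement is omitted):

```python
import numpy as np

def remove_outliers(pts_2d: np.ndarray, proj_2d: np.ndarray, threshold_px: float):
    """Keep only the matches whose reprojection deviation is within the threshold.
    pts_2d: detected pixel coordinates; proj_2d: reprojected pixel coordinates."""
    err = np.linalg.norm(pts_2d - proj_2d, axis=1)  # per-match deviation in pixels
    keep = err <= threshold_px
    return pts_2d[keep], proj_2d[keep], keep
```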
In an exemplary embodiment of the present disclosure, the pose tracking apparatus may further include: a re-matching unit configured to, after the 6-degree-of-freedom pose of the tracked object is obtained, re-match the pixels whose re-projection errors exceed the preset deviation threshold according to the 6-degree-of-freedom pose of the tracked object.
Fig. 4 shows a schematic view of an electronic device according to an exemplary embodiment of the present disclosure.
Referring to fig. 4, the electronic device 4 includes: a camera 41 and a processor 42.
The camera 41 is configured to acquire an image of a tracking object on which a marker that flickers at a specific frequency is provided, and acquire a pixel having a luminance change from the acquired image. The processor 42 is used to calculate a 6 degree of freedom pose of the tracked object based on the pixels acquired by the camera.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to: acquire inertial measurement unit data of the tracked object, and estimate a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, where the 3-degree-of-freedom attitude is the attitude of rotation around the x, y, and z coordinate axes in the body coordinate system of the tracked object; and calculate a 6-degree-of-freedom pose of the tracked object based on the 3-degree-of-freedom attitude and the acquired pixels, where the 6-degree-of-freedom pose comprises the position along the x, y, and z coordinate axes and the attitude of rotation around the x, y, and z coordinate axes in the body coordinate system of the tracked object.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to: solve, based on the 3-degree-of-freedom attitude and the acquired pixels, the correspondence between the 2D point set and the 3D point set of the markers to obtain matching pairs between the 2D point set and the 3D point set, where the 2D point set includes the pixel coordinates of each marker and the 3D point set includes the coordinates of each marker in the body coordinate system of the tracked object; and calculate a 6-degree-of-freedom pose of the tracked object based on the matching pairs.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to remove pixels from the matched pair for which the reprojection bias exceeds a preset bias threshold, and calculate a 6-degree-of-freedom pose from the pixels remaining after the removal; and carrying out minimum reprojection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object.
In an exemplary embodiment of the present disclosure, the processor 42 may be configured to remove pixels from the matched pair for which the reprojection bias exceeds a preset bias threshold, and calculate a 6-degree-of-freedom pose from the pixels remaining after the removal; performing minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation; and optimizing the pose of 6 degrees of freedom after the operation of minimizing the reprojection error according to the pose of 3 degrees of freedom to obtain the pose of 6 degrees of freedom of the tracked object.
In an exemplary embodiment of the disclosure, the processor 42 may be configured to, after obtaining the 6-degree-of-freedom pose of the tracked object, re-match pixels having a re-projection error exceeding a preset deviation threshold according to the 6-degree-of-freedom pose of the tracked object.
Further, according to an exemplary embodiment of the present disclosure, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed, implements a pose tracking method according to an exemplary embodiment of the present disclosure.
By way of example, the computer readable storage medium may carry one or more programs which, when executed, implement the steps of: acquiring an image of a tracking object, wherein a mark which flickers at a specific frequency is arranged on the tracking object; acquiring pixels with changed brightness from the acquired image; a 6 degree-of-freedom pose of the tracked object is calculated based on the acquired pixels.
A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer program embodied on the computer readable storage medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing. The computer readable storage medium may be included in a device; it may also exist separately, without being assembled into the device.
The pose tracking apparatus and the electronic apparatus according to the exemplary embodiments of the present disclosure have been described above with reference to fig. 3 and 4. Next, a computing device according to an exemplary embodiment of the present disclosure is described with reference to fig. 5.
Fig. 5 shows a schematic diagram of a computing device according to an exemplary embodiment of the present disclosure.
Referring to fig. 5, the computing apparatus 5 according to the exemplary embodiment of the present disclosure includes a memory 51 and a processor 52, the memory 51 having stored thereon a computer program that, when executed by the processor 52, implements a pose tracking method according to the exemplary embodiment of the present disclosure.
As an example, the computer program, when executed by the processor 52, may implement the steps of: acquiring an image of a tracking object, wherein a mark which flickers at a specific frequency is arranged on the tracking object; acquiring pixels with changed brightness from the acquired image; a 6 degree-of-freedom pose of the tracked object is calculated based on the acquired pixels.
The computing device illustrated in fig. 5 is only one example and should not impose any limitations on the functionality or scope of use of embodiments of the disclosure.
The pose tracking method and apparatus according to the exemplary embodiment of the present disclosure have been described above with reference to fig. 1 to 5. However, it should be understood that: the pose tracking apparatus and its elements shown in fig. 3 may each be configured as software, hardware, firmware, or any combination thereof to perform a particular function, and the computing apparatus shown in fig. 5 is not limited to including the components shown above, but some components may be added or deleted as needed, and the above components may also be combined.
According to the pose tracking method and apparatus described above, an image of the tracked object is acquired, pixels with changed brightness are obtained from the acquired image, and the 6-degree-of-freedom pose of the tracked object is calculated based on the obtained pixels. This reduces the dependency of pose tracking on a specific layout of the LED markers, reduces the latency of pose tracking, and improves the precision and efficiency of pose tracking.
While the present disclosure has been particularly shown and described with reference to exemplary embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present disclosure as defined by the following claims.

Claims (10)

1. A pose tracking method, comprising:
acquiring an image of a tracking object, wherein a mark which flickers at a specific frequency is arranged on the tracking object;
acquiring pixels with changed brightness from the acquired image;
calculating a 6-degree-of-freedom pose of the tracked object based on the acquired pixels.
2. The pose tracking method of claim 1, wherein the step of calculating a 6 degree of freedom pose of the tracked object based on the acquired pixels comprises:
acquiring inertial measurement unit data of the tracked object, and estimating a 3-degree-of-freedom attitude of the tracked object based on the acquired inertial measurement unit data, wherein the 3-degree-of-freedom attitude is an attitude rotating around three coordinate axes of x, y and z under a body coordinate system of the tracked object;
and calculating a 6-degree-of-freedom attitude of the tracked object based on the 3-degree-of-freedom attitude and the acquired pixels, wherein the 6-degree-of-freedom attitude is an attitude along three coordinate axis directions of x, y and z and an attitude rotating around three rectangular coordinate axes of x, y and z under a body coordinate system of the tracked object.
3. The pose tracking method of claim 2, wherein the step of calculating a 6 degree of freedom pose of the tracked object based on the 3 degree of freedom pose and the acquired pixels comprises:
based on the 3-degree-of-freedom posture and the obtained pixels, solving a corresponding relation between a 2D point set and a 3D point set of the marker to obtain a matching pair of the 2D point set and the 3D point set of the marker, wherein the 2D point set comprises pixel coordinates of the marker, and the 3D point set comprises coordinates of the marker in a body coordinate system of the tracked object;
calculating a 6-degree-of-freedom pose of the tracked object based on the matching pairs.
4. The pose tracking method of claim 3, wherein the step of calculating a 6 degree of freedom pose of the tracked object comprises:
removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal;
and carrying out minimum reprojection error operation on the calculated pose with 6 degrees of freedom to obtain the pose with 6 degrees of freedom of the tracked object.
5. The pose tracking method of claim 3, wherein the step of calculating a 6 degree of freedom pose of the tracked object comprises:
removing pixels with the reprojection deviation exceeding a preset deviation threshold value from the matching pair, and calculating a pose with 6 degrees of freedom according to the remaining pixels after removal;
performing minimum reprojection error operation on the pose with 6 degrees of freedom obtained by calculation;
and optimizing the pose of 6 degrees of freedom after the operation of minimizing the reprojection error according to the pose of 3 degrees of freedom to obtain the pose of 6 degrees of freedom of the tracked object.
6. The pose tracking method according to claim 4 or 5, further comprising, after obtaining the 6 degree-of-freedom pose of the tracked object:
and carrying out re-matching on the pixels of which the re-projection errors in the matching pairs exceed a preset deviation threshold according to the 6-degree-of-freedom posture of the tracked object.
7. A pose tracking apparatus, comprising:
an image acquisition unit configured to acquire an image of a tracking object on which a marker that blinks at a specific frequency is provided;
a pixel acquisition unit configured to acquire a pixel of which luminance varies from the acquired image; and
a pose calculation unit configured to calculate a 6-degree-of-freedom pose of the tracking object based on the acquired pixels.
8. An electronic device, comprising:
a camera for acquiring an image of a tracking object on which a marker blinking at a specific frequency is provided, and acquiring a pixel of which luminance varies from the acquired image;
a processor to calculate a 6 degree of freedom pose of the tracked object based on the pixels acquired by the camera.
9. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the pose tracking method of any one of claims 1 to 6.
10. A computing device, comprising:
a processor;
a memory storing a computer program that, when executed by the processor, implements the pose tracking method of any one of claims 1 to 6.
CN201910950626.1A 2019-10-08 2019-10-08 Pose tracking method and device Active CN110782492B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201910950626.1A CN110782492B (en) 2019-10-08 2019-10-08 Pose tracking method and device
KR1020200114552A KR20210042011A (en) 2019-10-08 2020-09-08 Posture tracking method and apparatus performing the same
US17/063,909 US11610330B2 (en) 2019-10-08 2020-10-06 Method and apparatus with pose tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910950626.1A CN110782492B (en) 2019-10-08 2019-10-08 Pose tracking method and device

Publications (2)

Publication Number Publication Date
CN110782492A true CN110782492A (en) 2020-02-11
CN110782492B CN110782492B (en) 2023-03-28

Family

ID=69384884

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910950626.1A Active CN110782492B (en) 2019-10-08 2019-10-08 Pose tracking method and device

Country Status (2)

Country Link
KR (1) KR20210042011A (en)
CN (1) CN110782492B (en)


Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103607541A (en) * 2013-12-02 2014-02-26 吴东辉 Method and system for obtaining information by way of camera shooting, camera shooting device and information modulation device
CN103930944A (en) * 2011-06-23 2014-07-16 奥布隆工业有限公司 Adaptive tracking system for spatial input devices
CN104769454A (en) * 2012-10-31 2015-07-08 莱卡地球系统公开股份有限公司 Method and device for determining the orientation of an object
CN105844659A (en) * 2015-01-14 2016-08-10 北京三星通信技术研究有限公司 Moving part tracking method and device
US20160247293A1 (en) * 2015-02-24 2016-08-25 Brain Biosciences, Inc. Medical imaging systems and methods for performing motion-corrected image reconstruction
CN106068533A (en) * 2013-10-14 2016-11-02 瑞尔D股份有限公司 The control of directional display
CN108074262A (en) * 2016-11-15 2018-05-25 卡尔蔡司工业测量技术有限公司 For determining the method and system of the six-degree-of-freedom posture of object in space
CN108596980A (en) * 2018-03-29 2018-09-28 中国人民解放军63920部队 Circular target vision positioning precision assessment method, device, storage medium and processing equipment
CN108648215A (en) * 2018-06-22 2018-10-12 南京邮电大学 SLAM motion blur posture tracking algorithms based on IMU
US20180308240A1 (en) * 2013-11-18 2018-10-25 Pixmap Method for estimating the speed of movement of a camera
CN109298629A (en) * 2017-07-24 2019-02-01 来福机器人 For providing the fault-tolerant of robust tracking to realize from non-autonomous position of advocating peace
CN109474817A (en) * 2017-09-06 2019-03-15 原相科技股份有限公司 Optical sensing devices, method and optical detecting module
WO2019066476A1 (en) * 2017-09-28 2019-04-04 Samsung Electronics Co., Ltd. Camera pose and plane estimation using active markers and a dynamic vision sensor
CN110036258A (en) * 2016-12-08 2019-07-19 索尼互动娱乐股份有限公司 Information processing unit and information processing method
CN110120099A (en) * 2018-02-06 2019-08-13 广东虚拟现实科技有限公司 Localization method, device, recognition and tracking system and computer-readable medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
PAVEL A. SAVKIN 等: "Outside-in monocular IR camera based HMD pose estimation via geometric optimization", 《THE 23RD ACM SYMPOSIUM ON VIRTUAL REALITY SOFTWARE AND TECHNOLOGY》 *
PAN Jingsheng et al.: "Low-light-level CIS suitable for day and night vision", Infrared Technology *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111462179A (en) * 2020-03-26 2020-07-28 北京百度网讯科技有限公司 Three-dimensional object tracking method and device and electronic equipment
CN111462179B (en) * 2020-03-26 2023-06-27 北京百度网讯科技有限公司 Three-dimensional object tracking method and device and electronic equipment
CN111949123A (en) * 2020-07-01 2020-11-17 青岛小鸟看看科技有限公司 Hybrid tracking method and device for multi-sensor handle controller
WO2022002132A1 (en) * 2020-07-01 2022-01-06 青岛小鸟看看科技有限公司 Multi-sensor handle controller hybrid tracking method and device
CN111949123B (en) * 2020-07-01 2023-08-08 青岛小鸟看看科技有限公司 Multi-sensor handle controller hybrid tracking method and device
CN112306271A (en) * 2020-10-30 2021-02-02 歌尔光学科技有限公司 Focus calibration method and device of handle controller and related equipment
CN112306271B (en) * 2020-10-30 2022-11-25 歌尔光学科技有限公司 Focus calibration method and device of handle controller and related equipment
CN112991556A (en) * 2021-05-12 2021-06-18 航天宏图信息技术股份有限公司 AR data display method and device, electronic equipment and storage medium
CN112991556B (en) * 2021-05-12 2022-05-27 航天宏图信息技术股份有限公司 AR data display method and device, electronic equipment and storage medium
CN113370217A (en) * 2021-06-29 2021-09-10 华南理工大学 Method for recognizing and grabbing object posture based on deep learning for intelligent robot
CN113370217B (en) * 2021-06-29 2023-06-16 华南理工大学 Object gesture recognition and grabbing intelligent robot method based on deep learning
TWI812369B (en) * 2021-07-28 2023-08-11 宏達國際電子股份有限公司 Control method, tracking system and non-transitory computer-readable storage medium

Also Published As

Publication number Publication date
KR20210042011A (en) 2021-04-16
CN110782492B (en) 2023-03-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant