CN116804894A - Wearable tracking system and wearable tracking method

Info

Publication number
CN116804894A
CN116804894A (application CN202310291133.8A)
Authority
CN
China
Prior art keywords
wearable
movement
body part
movement trajectory
wearable device
Prior art date
Legal status
Pending
Application number
CN202310291133.8A
Other languages
Chinese (zh)
Inventor
萧家尧
王冠勋
Current Assignee
HTC Corp
Original Assignee
HTC Corp
Priority claimed from US 18/178,535 (published as US 12045385 B2)
Application filed by HTC Corp
Publication of CN116804894A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure provides a wearable tracking system and a wearable tracking method. The wearable tracking system includes a wearable device, an external camera, and a processor. The wearable device is adapted to be worn on a body part of a user. The wearable device is configured to detect a body movement of the body part based on a tracking sensor. The external camera is configured to obtain a body image of the body part. The processor is configured to obtain a first movement trajectory of the body part based on the body movement, obtain a second movement trajectory of the body part based on the body image, and determine a body pose of the body part based on the first movement trajectory and the second movement trajectory.

Description

Wearable tracking system and wearable tracking method
Technical Field
The present disclosure relates to a wearable tracking system; in particular, the present disclosure relates to a wearable tracking system and a wearable tracking method.
Background
In order to bring an immersive experience to users, extended reality (XR) technologies such as augmented reality (AR), virtual reality (VR), and mixed reality (MR) are continuously being developed. AR technology allows users to bring virtual elements into the real world. VR technology allows users to enter an entirely new virtual world and experience a different life. MR technology merges the real world with the virtual world.
To provide an immersive experience, a wearable device worn by the user is typically used to present immersive visual content or to track the user's location. Thus, in AR, VR, or MR applications, the convenience and performance of the wearable device strongly affect the user experience.
Disclosure of Invention
The present disclosure relates to a wearable tracking system and a wearable tracking method for easily tracking a user.
In the present disclosure, a wearable tracking system is provided. The wearable tracking system includes a wearable device, an external camera, and a processor. The wearable device is adapted to be worn on a body part of a user. The wearable device is configured to detect a body movement (Movement) of the body part based on a tracking sensor. The external camera is configured to obtain a body image of the body part. The processor is configured to obtain a first movement trajectory of the body part based on the body movement, obtain a second movement trajectory of the body part based on the body image, and determine a body pose (Pose) of the body part based on the first movement trajectory and the second movement trajectory.
In the present disclosure, a wearable tracking method is provided. The wearable tracking method includes the following steps: obtaining, based on a body movement, a first movement trajectory of a body part from a wearable device, wherein the wearable device is adapted to be worn on the body part of a user and is configured to detect the body movement of the body part based on a tracking sensor; obtaining, based on a body image, a second movement trajectory of the body part from an external camera, wherein the external camera is configured to obtain the body image of the body part; and determining, by a processor, a body posture of the body part based on the first movement trajectory and the second movement trajectory.
Based on the above, tracking of a user is easily performed according to the wearable tracking system and the wearable tracking method.
In order to make the foregoing more readily understood, several embodiments are described in detail below with accompanying drawings.
Drawings
The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate exemplary embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a schematic diagram of a wearable tracking system according to an embodiment of the present disclosure.
Fig. 2A is a schematic diagram of a tracking scenario of a wearable tracking system according to an embodiment of the present disclosure.
Fig. 2B is a schematic diagram of a tracking scenario of a first movement trajectory according to an embodiment of the present disclosure.
Fig. 2C is a schematic diagram of a tracking scenario of a second movement trajectory according to an embodiment of the present disclosure.
Fig. 2D is a schematic diagram of a tracking scenario of a coordinate transformation relationship between a wearable device and an external camera according to an embodiment of the present disclosure.
Fig. 3 is a schematic flow chart of a wearable tracking method according to an embodiment of the present disclosure.
Description of the reference numerals
100: a wearable tracking system;
110: a wearable device;
112: tracking the sensor;
120: an external camera;
130: a processor;
200A, 200B, 200C, 200D: tracking a scene;
300: a wearable tracking method;
BD: a body part;
CMR, DV: coordinate systems;
IMG: a body image;
MV: body movement;
PS: body posture;
RLT: a coordinate transformation relationship;
S310, S320, S330: steps;
U: user.
Detailed Description
Reference will now be made in detail to the exemplary embodiments of the present disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers will be used throughout the drawings and the description to refer to the same or like parts.
Throughout the specification and the appended claims of this disclosure, specific terms are used to refer to specific components. Those skilled in the art will appreciate that electronic device manufacturers may refer to the same component by different names. It is not intended herein to distinguish between components that have the same function but different names. In the following description and claims, words such as "include" and "comprising" are open-ended terms and should be interpreted to mean "including, but not limited to".
The term "coupled (or connected)" as used throughout the specification of the present application, including the appended claims, may refer to any direct or indirect connection means. For example, if the text describes a first device coupled (or connected) to a second device, it should be interpreted that the first device may be directly connected to the second device, or the first device may be indirectly connected to connect to the second device through other devices or some connecting means. The terms "first," "second," and the like in the description of the present disclosure (including the appended claims) are used for naming only discrete elements or for distinguishing between different embodiments or ranges. Accordingly, the terms should not be construed as an upper limit or a lower limit of the number of the limiting elements and are not applied to the order in which the limiting elements are arranged. In addition, elements/components/steps using the same reference numerals in the drawings and embodiments denote the same or similar parts, where possible. The relative descriptions of elements/components/steps may be referred to each other using the same reference numerals or using the same terminology in different embodiments.
It should be noted that in the following embodiments, the technical features of several different embodiments may be replaced, rearranged, and mixed to accomplish other embodiments without departing from the spirit of the present disclosure. The features of each embodiment may be arbitrarily mixed and used together as long as they do not violate the spirit of the disclosure or conflict with each other.
In order to bring an immersive experience to users, extended reality (XR) technologies, such as augmented reality (AR), virtual reality (VR), and mixed reality (MR), are continually being developed. AR technology allows users to bring virtual elements into the real world. VR technology allows users to enter an entirely new virtual world and experience a different life. MR technology merges the real world with the virtual world.
To provide an immersive experience, a wearable device worn by the user is typically used to present immersive visual content or to track the user's location. Furthermore, the wearable device may typically cooperate with an external camera to improve the accuracy of tracking the user's location. For example, the wearable device and the external camera may share a feature point map or make themselves trackable by each other. That is, the wearable device and the external camera may need to utilize the same algorithm or the same pre-stored information. In other words, for the camera to track the wearable device, it is often necessary to place a tracker on the wearable device. Alternatively, the features or shape of the wearable device may need to be pre-stored in the external camera for the camera to track the wearable device. Thus, the wearable device and the external camera are typically products of the same manufacturer.
However, when the wearable device (owned by the user) and the external camera are products of different manufacturers, or products using different algorithms, tracking may fail or be difficult to set up. Therefore, how to provide a convenient tracking method that works across products of different manufacturers is a problem that requires effort.
Fig. 1 is a schematic diagram of a wearable tracking system according to an embodiment of the present disclosure. Referring to fig. 1, a wearable tracking system 100 may include a wearable device 110, an external camera 120, and a processor 130. The wearable device 110 is adapted to be worn on a body part of a user. In addition, the wearable device 110 is configured to detect a body movement MV of the body part based on the tracking sensor 112. The external camera 120 is configured to obtain a body image IMG of the body part. The processor 130 is configured to obtain a first movement trajectory of the body part based on the body movement MV. In addition, the processor 130 is configured to obtain a second movement trajectory of the body part based on the body image IMG. Further, the processor 130 is configured to determine a body posture PS of the body part based on the first movement trajectory and the second movement trajectory.
In this way, since the wearable device 110 and the external camera 120 are configured to track the user based on the body movement trajectory of the user's body part, the wearable device 110 and the external camera 120 (which the user owns) may be products of different manufacturers. For example, the wearable device 110 may be a first product of a first manufacturer, while the external camera 120 may be a second product of a second manufacturer. Accordingly, the manufacturer of the wearable device 110 or the external camera 120 does not need to be considered, thereby increasing convenience and improving the user experience. Furthermore, there is no need to place markers on the wearable device 110 or to pre-store information of the wearable device 110 in the external camera 120. That is, a marker-free calibration method between the wearable device 110 and the external camera 120 (from the same manufacturer or from different manufacturers) is implemented. Furthermore, the marker-free calibration method is easy to set up and requires no additional devices. That is, recalibration between the wearable device 110 and the external camera 120 may be performed at any time, and online calibration of the wearable tracking system 100 is achieved.
In one embodiment, the wearable device 110 may be a head-mounted display (HMD), wearable glasses (e.g., AR/VR goggles), wristband device, waistband device, palmband device, graspable device, other similar device, or a combination of these devices. However, the present disclosure is not limited thereto.
In one embodiment, the external camera 120 includes, for example, a complementary metal oxide semiconductor (CMOS) camera or a charge-coupled device (CCD) camera. However, the present disclosure is not limited thereto.
In one embodiment, the processor 130 includes, for example, a microcontroller unit (MCU), a central processing unit (CPU), a microprocessor, a digital signal processor (DSP), a programmable controller, a programmable logic device (PLD), other similar devices, or a combination of these devices. The present disclosure is not limited thereto. In addition, in an embodiment, each of the functions of the processor 130 may be implemented as a plurality of program codes. The program codes are stored in a memory and executed by the processor 130. Alternatively, in an embodiment, each of the functions of the processor 130 may be implemented as one or more circuits. The present disclosure is not limited to the use of software or hardware to implement the functions of the processor 130.
Further, the processor 130 may be integrated in the wearable device 110, in the external camera 120, in an additional device, or in a cloud server. However, the present disclosure is not limited thereto. That is, the present disclosure is not limited to the location in the wearable tracking system 100 where the calculations are performed.
In one embodiment, the wearable device 110 may include a display module (not shown), and the display module includes, for example, an organic light-emitting diode (OLED) display device, a mini-LED display device, a micro-LED display device, a quantum dot (QD) LED display device, a liquid-crystal display (LCD) device, a tiled display device, a foldable display device, or an electronic paper display (EPD). However, the present disclosure is not limited thereto.
In one embodiment, the wearable device 110 or the external camera 120 may include a network module (not shown), and the network module includes, for example, a wired network module, a wireless network module, a Bluetooth module, an infrared module, a radio frequency identification (RFID) module, a Zigbee network module, or a near field communication (NFC) network module, but the disclosure is not limited thereto. That is, the wearable device 110 may be configured to communicate with the external camera 120 through wired or wireless communication.
Fig. 2A is a schematic diagram of a tracking scenario of a wearable tracking system according to an embodiment of the present disclosure. Referring to fig. 1 and 2A, the tracking scenario 200A depicts the wearable device 110 worn on a body part BD of the user U, and the external camera 120 may be disposed facing the user U to capture an image (i.e., the body image IMG) of the user U. It should be noted that while the wearable device 110 is depicted as being worn on the head of the user U for ease of explanation, the wearable device 110 may also be worn on other body parts of the user U.
In one embodiment, the wearable device 110 may be a head-mounted device, and the body part BD of the user U may be a portion of the face (e.g., the nose) or the head of the user U. In another embodiment, the wearable device 110 may be a wristband device and the body part BD of the user U may be the wrist of the user U. In yet another embodiment, the wearable device 110 may be a waistband device and the body part BD of the user U may be the waist of the user U. However, the present disclosure is not limited thereto. That is, the present disclosure is not limited to the body part BD on which the wearable device 110 is worn. Furthermore, in different embodiments, the distance between the wearable device 110 and the body part BD may be different. Thus, the processor 130 may be configured with a local transformation matrix for each embodiment to transform the coordinates of the wearable device 110 into the coordinates of the body part BD.
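For illustration only, the following Python sketch shows one possible way such a local transformation matrix could be built and applied. The 4x4 homogeneous-matrix form, the function names, and the offset values are assumptions made for this sketch and are not part of the disclosed embodiments.

```python
import numpy as np

def make_local_transform(offset_xyz, rotation=np.eye(3)):
    """Build a 4x4 homogeneous matrix mapping device coordinates to body-part coordinates."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = offset_xyz
    return T

# Hypothetical offset: the tracked body part is assumed to sit 5 cm below and
# 2 cm in front of the wearable device's origin (values for illustration only).
T_device_to_body = make_local_transform(offset_xyz=[0.0, -0.05, 0.02])

def device_to_body(p_device, T=T_device_to_body):
    """Map a 3D point expressed in the wearable device frame into the body-part frame."""
    p_homogeneous = np.append(np.asarray(p_device, dtype=float), 1.0)
    return (T @ p_homogeneous)[:3]
```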
As described above, the wearable device 110 may include the tracking sensor 112, and the wearable device 110 may be configured to detect body movements of the body part BD based on the tracking sensor 112.
In one embodiment, the tracking sensor 112 may include an internal camera (not shown), a light detection and ranging (LiDAR) device, a global positioning system (GPS) device, a radar, an infrared sensor, an ultrasonic sensor, other similar devices, or a combination of these devices. The present disclosure is not limited thereto. The tracking sensor 112 may be configured to obtain an environmental image of the environment surrounding the wearable device 110. Further, the processor 130 may be configured to generate the first movement trajectory from the environmental image based on a simultaneous localization and mapping (SLAM) algorithm. That is, a SLAM map is generated from the environmental image based on the SLAM algorithm. In other words, the SLAM map may be obtained by a camera, a LiDAR device, a GPS device, a radar, an infrared sensor, an ultrasonic sensor, other similar devices, or a combination of these devices. The present disclosure is not limited thereto.
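As a minimal, non-limiting sketch of how the first movement trajectory could be assembled once a SLAM front end has estimated per-frame relative camera poses (the SLAM front end itself is out of scope here), the following Python code chains 4x4 relative poses into an absolute trajectory. The function names and the assumption that poses are delivered as 4x4 homogeneous matrices are illustrative.

```python
import numpy as np

def chain_relative_poses(relative_poses):
    """Accumulate 4x4 relative poses (frame k-1 -> frame k) into absolute 4x4 poses."""
    pose = np.eye(4)
    absolute_poses = []
    for T_rel in relative_poses:
        pose = pose @ T_rel          # compose the new relative motion onto the running pose
        absolute_poses.append(pose.copy())
    return absolute_poses

def trajectory_from_poses(absolute_poses):
    """Extract the translation part of each pose; these 3D points form the movement trajectory."""
    return np.array([T[:3, 3] for T in absolute_poses])
```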
In one embodiment, the tracking sensor 112 may include an inertial measurement unit (IMU) sensor (not shown). The IMU sensor 112 may be configured to detect a linear acceleration or an angular velocity of the wearable device 110. Further, the processor 130 may be configured to generate the first movement trajectory based on the linear acceleration or the angular velocity.
In one embodiment, the IMU sensor 112 includes, for example, a gyroscope, an accelerometer, other similar devices, or a combination of these devices. The present disclosure is not limited thereto. In one embodiment, the IMU sensor 112 may be an accelerometer and may be configured to detect at least one of three linear acceleration values in three degrees of freedom (DOF). The three linear acceleration values may include a first acceleration value along the X-axis, a second acceleration value along the Y-axis, and a third acceleration value along the Z-axis. In one embodiment, the IMU sensor 112 may be a gyroscope and may be configured to detect at least one of three angular velocities in three degrees of freedom. The three angular velocities may include a roll angular velocity about the X-axis, a pitch angular velocity about the Y-axis, and a yaw angular velocity about the Z-axis. In one embodiment, the IMU sensor 112 may include an accelerometer and a gyroscope and may be configured to detect changes in six degrees of freedom. The six-degree-of-freedom changes include three linear acceleration values corresponding to three perpendicular axes and three angular velocities corresponding to the three perpendicular axes (e.g., the X, Y, and Z axes).
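Purely as an illustrative sketch of how linear accelerations and angular velocities might be integrated into the first movement trajectory, one possible dead-reckoning routine in Python is shown below. It assumes gravity-compensated accelerations, a constant sample period, and no drift correction; none of these assumptions is specified by the disclosure.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def integrate_imu(accels, gyros, dt=0.01):
    """Dead-reckon a position trajectory from IMU samples.

    accels: (N, 3) linear accelerations in the device frame, gravity already removed.
    gyros:  (N, 3) angular velocities (roll, pitch, yaw rates) in rad/s.
    Returns an (N, 3) array of positions in the starting frame.
    """
    pos, vel = np.zeros(3), np.zeros(3)
    orientation = R.identity()
    trajectory = []
    for a, w in zip(accels, gyros):
        orientation = orientation * R.from_rotvec(w * dt)  # update orientation from angular velocity
        a_world = orientation.apply(a)                     # rotate acceleration into the world frame
        vel = vel + a_world * dt                           # integrate acceleration into velocity
        pos = pos + vel * dt                               # integrate velocity into position
        trajectory.append(pos.copy())
    return np.asarray(trajectory)
```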
In one embodiment, the external camera 120 may be configured to obtain the body image IMG of the body part BD. Further, the processor 130 may be configured to generate the second movement trajectory from the body image IMG based on a simultaneous localization and mapping (SLAM) algorithm.
It should be noted that different techniques have been developed to track the movement of the user U. For example, there are two main categories of tracking techniques: inside-out tracking and outside-in tracking. Inside-out tracking tracks movement from the perspective of the internal device itself (e.g., the wearable device 110) relative to the external environment. Outside-in tracking tracks movement from the perspective of an external device (e.g., the external camera 120), which is disposed separately from the internal device and is configured to observe/track the movement of the internal device.
By utilizing the wearable tracking system 100, the body movement trajectory of the body part BD may be obtained by the wearable device 110 or the external camera 120 in response to the movement of the body part BD of the user U. That is, the wearable device 110 provides an inside-out tracking function for the wearable tracking system 100, and the external camera 120 provides an outside-in tracking function for the wearable tracking system 100. In other words, the processor 130 may be configured to obtain the first movement trajectory based on an inside-out tracking algorithm, and to obtain the second movement trajectory based on an outside-in tracking algorithm.
In this way, since the wearable device 110 and the external camera 120 are configured to track the user U based on the body movement trajectory of the body part BD of the user U, it is not necessary to consider the manufacturer of the wearable device 110 or the external camera 120, thereby increasing convenience and user experience.
Fig. 2B is a schematic diagram of a tracking scenario 200B of a first movement trajectory according to an embodiment of the present disclosure. Fig. 2C is a schematic diagram of a tracking scenario 200C of a second movement trajectory according to an embodiment of the present disclosure. Fig. 2D is a schematic diagram of a tracking scenario 200D of a coordinate transformation relationship between a wearable device and an external camera according to an embodiment of the present disclosure. Referring to fig. 1 to 2D, fig. 2B depicts the first movement trajectory of the body part BD of the user U obtained (tracked) by the wearable device 110, and fig. 2C depicts the second movement trajectory of the body part BD of the user U obtained (tracked) by the external camera 120.
For example, the user U may draw a circle (i.e., a body movement trajectory) in the environment using the body part BD. The wearable device 110 may obtain (record/track) the circle by detecting the body movement MV based on the tracking sensor 112. The external camera 120 may obtain (record/track) the circle by capturing the body image IMG. Further, the processor 130 may obtain the first movement trajectory based on the body movement MV, and may obtain the second movement trajectory based on the body image IMG. The first movement trajectory and the second movement trajectory may have a size and shape similar to or the same as the circle drawn by the user U. That is, each of the first movement trajectory and the second movement trajectory may be a circle. Thus, by comparing the first movement trajectory and the second movement trajectory, the processor 130 may generate a coordinate transformation relationship RLT between the wearable device 110 and the external camera 120. Accordingly, the body posture PS may be determined by the processor 130 based on the coordinate transformation relationship RLT.
In one embodiment, the coordinate transformation relationship RLT between the wearable device 110 and the external camera 120 may be depicted as in fig. 2D. The coordinate system DV of the wearable device 110 may be defined as an inside-out coordinate system, and the coordinate system CMR of the external camera 120 may be defined as an outside-in coordinate system.
After obtaining the first movement trajectory based on the wearable device 110 and the second movement trajectory based on the external camera 120, the processor 130 may be configured to align the first movement trajectory with the second movement trajectory. For example, assume the first movement trajectory and the second movement trajectory are circles. The first movement trajectory may also be referred to as a first circle, and the second movement trajectory may also be referred to as a second circle. The processor 130 may be configured to translate and/or rotate the first circle to align (fit) it with the second circle. That is, the processor 130 may be configured to calculate a position transformation matrix between the inside-out coordinate system of the first circle (i.e., the first movement trajectory) and the outside-in coordinate system of the second circle (i.e., the second movement trajectory). Further, the processor 130 may be configured to calculate a rotation transformation matrix between the inside-out coordinate system of the first circle (i.e., the first movement trajectory) and the outside-in coordinate system of the second circle (i.e., the second movement trajectory). Thus, the coordinate transformation relationship RLT may be determined from the position transformation matrix and/or the rotation transformation matrix. For example, a calibration matrix (e.g., the position transformation matrix or the rotation transformation matrix) may be solved using a standard linear least-squares method based on the first movement trajectory and the second movement trajectory. However, the present disclosure is not limited thereto.
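The disclosure does not prescribe a particular solver; as one hedged example, the rotation and position transformation could be obtained in closed form with a Kabsch/Umeyama-style least-squares alignment of the two sampled trajectories, assuming they are time-synchronized and contain corresponding samples. A minimal Python sketch:

```python
import numpy as np

def align_trajectories(traj_inside_out, traj_outside_in):
    """Solve R, t minimizing ||(R @ p + t) - q||^2 over corresponding trajectory samples.

    traj_inside_out: (N, 3) first movement trajectory in the inside-out (device) coordinate system.
    traj_outside_in: (N, 3) second movement trajectory in the outside-in (camera) coordinate system.
    Returns the 3x3 rotation transformation matrix R and the translation vector t.
    """
    P = np.asarray(traj_inside_out, dtype=float)
    Q = np.asarray(traj_outside_in, dtype=float)
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    H = (P - mu_p).T @ (Q - mu_q)                  # cross-covariance of the centered trajectories
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = mu_q - R @ mu_p
    return R, t
```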
Thus, once the coordinate transformation relationship RLT is determined, a first coordinate in the inside-out coordinate system may be transformed into a second coordinate in the outside-in coordinate system. That is, the first tracking result of the user U obtained by the wearable device 110 may be fused with the second tracking result of the user U obtained by the external camera 120. Therefore, the body posture PS can be determined from the fusion of the first tracking result and the second tracking result. In other words, the processor 130 may be configured to determine the body posture PS based on the position transformation matrix and the rotation transformation matrix.
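Continuing the illustrative sketch above, once R and t are known, a device-frame position can be mapped into the camera frame and blended with the camera estimate. The fixed 50/50 weighting below is an assumption made only for illustration, not a fusion strategy stated in the disclosure.

```python
import numpy as np

def fuse_position(p_device, p_camera, R, t, w=0.5):
    """Transform a device-frame position into the camera frame and blend the two estimates."""
    p_device_in_camera = R @ np.asarray(p_device, dtype=float) + t
    return w * p_device_in_camera + (1.0 - w) * np.asarray(p_camera, dtype=float)
```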
In this way, since the wearable device 110 and the external camera 120 are configured to track the user U based on the body movement trajectory of the body part BD of the user U, it is not necessary to consider the manufacturer of the wearable device 110 or the external camera 120, thereby increasing convenience and user experience.
Fig. 3 is a schematic flow chart of a wearable tracking method according to an embodiment of the present disclosure. Referring to fig. 1 to 3, the wearable tracking method 300 may include step S310, step S320, and step S330.
In step S310, a first movement trajectory of the body part BD of the user U may be obtained by the processor 130 based on the body movement MV detected by the wearable device 110. In step S320, a second movement trajectory of the body part BD of the user U may be obtained by the processor 130 based on the body image IMG obtained by the external camera 120. In step S330, the body posture PS may be determined by the processor 130 based on the first movement trajectory and the second movement trajectory. In addition, for implementation details of the wearable tracking method 300, reference may be made to the descriptions of fig. 1 to 2D, which provide sufficient teaching, suggestions, and implementation examples; the details are not repeated here.
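To tie steps S310 to S330 together, the following sketch reuses the illustrative helpers defined above (align_trajectories and fuse_position) on synthetic, made-up trajectories. The synthetic circle, the 30-degree rotation, and the offset are assumptions used only to make the example runnable; they do not reflect any data from the disclosure.

```python
import numpy as np

# Synthetic stand-ins for the two time-synchronized trajectories (illustration only).
n = 200
phi = np.linspace(0.0, 2.0 * np.pi, n)
first_trajectory = np.stack([np.cos(phi), np.sin(phi), np.zeros(n)], axis=1)   # S310: circle in the device frame

theta = np.deg2rad(30.0)                                                       # assumed ground-truth transform
true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
true_t = np.array([0.5, -0.2, 1.0])
second_trajectory = first_trajectory @ true_R.T + true_t                       # S320: same circle in the camera frame

R, t = align_trajectories(first_trajectory, second_trajectory)                 # coordinate transformation relationship RLT
body_pose = [fuse_position(p_d, p_c, R, t)                                     # S330: fused body pose samples
             for p_d, p_c in zip(first_trajectory, second_trajectory)]
```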
In this way, since the wearable tracking method 300 is configured to track the user U based on the body movement trajectory of the body part BD of the user U, the manufacturer of the wearable device 110 or the external camera 120 does not need to be considered, thereby increasing convenience and user experience.
In summary, according to the wearable tracking system 100 and the wearable tracking method 300, since the wearable device 110 and the external camera 120 are configured to track the user U based on the body movement track of the body part BD of the user U, it is not necessary to consider the manufacturer of the wearable device 110 or the external camera 120, thereby increasing convenience and user experience.
It will be apparent to those skilled in the art that various modifications and variations can be made to the disclosed embodiments without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the present disclosure cover modifications and variations provided they come within the scope of the appended claims and their equivalents.

Claims (10)

1. A wearable tracking system, comprising:
a wearable device adapted to be worn on a body part of a user, wherein the wearable device is configured to detect body movement of the body part based on a tracking sensor;
an external camera configured to obtain a body image of the body part; and
a processor configured to:
obtaining a first movement trajectory of the body part based on the body movement;
obtaining a second movement trajectory of the body part based on the body image; and
determining a body posture of the body part based on the first movement trajectory and the second movement trajectory.
2. The wearable tracking system of claim 1, wherein the processor is further configured to:
obtaining the first movement trajectory based on an inside-out tracking algorithm; and
obtaining the second movement trajectory based on an outside-in tracking algorithm.
3. The wearable tracking system of claim 1, wherein the processor is further configured to:
generating a coordinate transformation relationship between the wearable device and the external camera by comparing the first movement trajectory and the second movement trajectory; and
determining the body posture based on the coordinate transformation relationship.
4. The wearable tracking system of claim 1, wherein the processor is further configured to:
aligning the first movement trajectory with the second movement trajectory;
calculating a position transformation matrix between an inside-out coordinate system of the first movement trajectory and an outside-in coordinate system of the second movement trajectory;
calculating a rotation transformation matrix between the inside-out coordinate system of the first movement trajectory and the outside-in coordinate system of the second movement trajectory; and
determining the body posture based on the position transformation matrix and the rotation transformation matrix.
5. The wearable tracking system of claim 1, wherein the tracking sensor comprises:
an internal camera configured to obtain an environmental image of an environment surrounding the wearable device,
wherein the processor is configured to generate the first movement trajectory from the environmental image based on a simultaneous localization and mapping algorithm.
6. The wearable tracking system of claim 1, wherein the tracking sensor comprises:
an inertial measurement unit sensor configured to detect a linear acceleration or an angular velocity of the wearable device,
wherein the processor is configured to generate the first movement trajectory based on the linear acceleration or the angular velocity.
7. The wearable tracking system of claim 1, wherein the processor is configured to generate the second movement trajectory from the body image based on a simultaneous localization and mapping algorithm.
8. The wearable tracking system of claim 1, wherein the wearable device is a head-mounted device, a wristband device, or a waistband device, and the body part is a portion of the user's face, head, wrist, or waist.
9. The wearable tracking system of claim 1, wherein the wearable device is a first product of a first manufacturer and the external camera is a second product of a second manufacturer.
10. A wearable tracking method, comprising:
obtaining a first movement trajectory of a body part from a wearable device based on body movement, wherein the wearable device is adapted to be worn on the body part of a user, and the wearable device is configured to detect the body movement of the body part based on a tracking sensor;
obtaining a second movement trajectory of the body part from an external camera based on a body image, wherein the external camera is configured to obtain the body image of the body part; and
determining, by a processor, a body posture of the body part based on the first movement trajectory and the second movement trajectory.
CN202310291133.8A | priority date 2022-03-23 | filing date 2023-03-23 | Wearable tracking system and wearable tracking method | status: Pending | published as CN116804894A

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US63/322,642 2022-03-23
US18/178,535 (published as US12045385B2) 2022-03-23 2023-03-06 Wearable tracking system and wearable tracking method
US18/178,535 2023-03-06

Publications (1)

Publication Number | Publication Date
CN116804894A | 2023-09-26

Family ID: 88078956



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination