US20210157394A1 - Motion tracking system and method - Google Patents

Motion tracking system and method

Info

Publication number: US20210157394A1
Authority: US (United States)
Prior art keywords: position information, motion, sensing data, obtaining, information
Legal status: Abandoned (the status listed is an assumption and is not a legal conclusion)
Application number: US16/693,344
Inventors: Ching-Ning Huang, Yi-Kang Hsieh
Current assignee: XRspace Co Ltd
Original assignee: XRspace Co Ltd
Application filed by XRspace Co Ltd
Assigned to XRSpace Co., Ltd. (assignors: Ching-Ning Huang, Yi-Kang Hsieh)
Later applications claiming priority: US17/125,954 (now US11460912B2), US17/125,962 (US20210157397A1)


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126: Measuring movement of the entire body or parts thereof using a particular sensing technique
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01: Head-up displays
    • G02B 27/017: Head mounted
    • G02B 27/0172: Head mounted characterised by optical features
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00: Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16: Constructional details or arrangements
    • G06F 1/1613: Constructional details or arrangements for portable computers
    • G06F 1/1633: Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F 1/1615 - G06F 1/1626
    • G06F 1/1684: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675
    • G06F 1/1694: Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F 1/1635 - G06F 1/1675, the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346: Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00: Optical systems or apparatus not provided for by any of the groups G02B 1/00 - G02B 26/00, G02B 30/00
    • G02B 27/01: Head-up displays
    • G02B 27/0179: Display position adjusting means not related to the information to be displayed
    • G02B 2027/0187: Display position adjusting means not related to the information to be displayed, slaved to motion of at least a part of the body of the user, e.g. head, eye

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Hardware Design (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biomedical Technology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Dentistry (AREA)
  • Optics & Photonics (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A motion tracking system and a motion tracking method are provided. The motion tracking system includes three motion sensing apparatuses wearable on the human body portions of a user. In the method, first sensing data is obtained based on motion sensors disposed on the motion sensing apparatuses, and second sensing data is obtained based on wireless signals transmitted between the motion sensing apparatuses. Motion information of the user is determined by a determining factor, which includes the first sensing data and the second sensing data. Accordingly, the motion of the user can be tracked with improved accuracy.

Description

    BACKGROUND OF THE DISCLOSURE
    1. Field of the Disclosure
  • The present disclosure generally relates to a method for tracking the motion of a user, in particular, to a motion tracking system and a motion tracking method.
  • 2. Description of Related Art
  • To provide intuitive operation of an electronic apparatus (such as a game console, a computer, a smartphone, a smart appliance, etc.), the motion of the user may be detected, so that the electronic apparatus can be operated directly according to that motion.
  • In conventional technology, some electronic apparatuses allow human body portions (such as the hands, legs, head, etc.) of the user to control their operation, and the motion of these human body portions may be tracked. However, these electronic apparatuses merely provide a single way to detect the motion of multiple human body portions at the same time. For example, a virtual reality (VR) product may provide handheld controllers, each of which includes an inertial measurement unit (IMU) to track the motion of the user's hands. A single motion tracking manner may be limited by its hardware or tracking mechanism, resulting in abnormal or inaccurate tracking results.
  • SUMMARY OF THE DISCLOSURE
  • Accordingly, the present disclosure is directed to a motion tracking system and a motion tracking method, in which a human body portion can be tracked with different motion tracking technologies.
  • In one of the exemplary embodiments, a motion tracking method is adapted for a motion tracking system including first, second, and third motion sensing apparatuses wearable on human body portions of a user. The motion tracking method includes, but is not limited to, the following steps. First sensing data is obtained based on the motion sensors disposed on the first, second, and third motion sensing apparatuses. Second sensing data is obtained based on wireless signals transmitted between the first, second, and third motion sensing apparatuses. Motion information of the user is determined according to a determining factor including the first sensing data and the second sensing data.
  • In one of the exemplary embodiments, a motion tracking system includes, but is not limited to, three motion sensing apparatuses and a processor. The motion sensing apparatuses are wearable on the human body portions of a user. Each motion sensing apparatus includes a wireless transceiver and a motion sensor. The wireless transceiver is used for transmitting or receiving wireless signals. The motion sensor is used for sensing the motion of one human body portion of the user. The processor is configured to obtain first sensing data based on the motion sensors of the motion sensing apparatuses and second sensing data based on the wireless signals transmitted between the three motion sensing apparatuses, and to determine motion information of the user according to a determining factor including the first sensing data and the second sensing data.
  • It should be understood, however, that this Summary may not contain all of the aspects and embodiments of the present disclosure, is not meant to be limiting or restrictive in any manner, and that the invention as disclosed herein is and will be understood by those of ordinary skill in the art to encompass obvious improvements and modifications thereto.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings are included to provide a further understanding of the disclosure, and are incorporated in and constitute a part of this specification. The drawings illustrate embodiments of the disclosure and, together with the description, serve to explain the principles of the disclosure.
  • FIG. 1 is a block diagram illustrating a motion tracking system according to one of the exemplary embodiments of the disclosure.
  • FIG. 2 is a schematic diagram illustrating a motion tracking system according to one of the exemplary embodiments of the disclosure.
  • FIG. 3 is a flowchart illustrating a motion tracking method according to one of the exemplary embodiments of the disclosure.
  • FIG. 4 is a schematic diagram illustrating a motion tracking method according to one of the exemplary embodiments of the disclosure.
  • FIG. 5 is a schematic diagram illustrating a motion tracking method according to one of the exemplary embodiments of the disclosure.
  • DESCRIPTION OF THE EMBODIMENTS
  • Reference will now be made in detail to the present preferred embodiments of the disclosure, examples of which are illustrated in the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the description to refer to the same or like parts.
  • FIG. 1 is a block diagram illustrating a motion tracking system 10 according to one of the exemplary embodiments of the disclosure. Referring to FIG. 1, the motion tracking system 10 includes, but is not limited to, three or more motion sensing apparatuses 100 and a computing apparatus 200. The motion tracking system 10 can be adapted for VR, AR, MR, XR, or other reality-related technologies.
  • Each motion sensing apparatus 100 includes, but is not limited to, a wireless transceiver 110 and a motion sensor 130. A motion sensing apparatus 100 could be a handheld controller or a wearable apparatus, such as a wearable controller, a smartwatch, an ankle sensor, a waist belt, a head-mounted display (HMD), or the like. In one embodiment, each motion sensing apparatus 100 is wearable on one human body portion of the user. The human body portion may be a hand, the head, an ankle, a leg, the waist, or another portion.
  • The wireless transceiver 110 could be a communication transceiver compatible with Bluetooth, Wi-Fi, IR, RFID, or other wireless communication technologies. In one embodiment, the wireless transceiver 110 is used for transmitting and/or receiving wireless signals to and from the wireless transceivers 110 of other motion sensing apparatuses 100, and a sequence of second sensing data would be generated based on the wireless signals transmitted between the motion sensing apparatuses 100. The detailed process for the generation of the sequence of second sensing data will be introduced later.
  • The motion sensor 130 may be an accelerometer, a gyroscope, a magnetometer, an inertial measurement unit (IMU), or any combination of the aforementioned sensors. In the embodiment, the motion sensor 130 is used for sensing the motion of the corresponding human body portion of the user, on which a motion sensing apparatus 100 is worn, over a time period, to generate a sequence of first sensing data from the sensing results (such as acceleration, rotation, magnetic force, etc.) of the motion sensor 130 at multiple time points within the time period. For one example, the first sensing data includes 3-degree-of-freedom (3-DoF) data related to the orientation information of the human body portion in three-dimensional (3D) space, such as accelerations in yaw, roll, and pitch.
  • The computing apparatus 200 includes, but is not limited to, a memory 240 and a processor 250. The computing apparatus 200 could be a computer, a server, a smartphone, a tablet, or one of the motion sensing apparatuses 100.
  • The memory 240 may be any type of fixed or movable Random-Access Memory (RAM), Read-Only Memory (ROM), flash memory, a similar device, or a combination of the above devices. The memory 240 can be used to store program codes, device configurations, buffer data, or permanent data (such as sensing data, motion information, distance relationships, etc.), and these data will be introduced later.
  • The processor 250 is coupled to the memory 240, and the processor 250 is configured to load the program codes stored in the memory 240, to perform a procedure of the exemplary embodiments of the disclosure. In one embodiment, functions of the processor 250 may be implemented by using a programmable unit such as a central processing unit (CPU), a microprocessor, a microcontroller, a digital signal processing (DSP) chip, a field-programmable gate array (FPGA), etc. In some embodiments, the functions of the processor 250 may also be implemented by an independent electronic device or an integrated circuit (IC), and the operations of the processor 250 may also be implemented by software.
  • It should be noticed that the processor 250 may or may not be disposed in the same apparatus as one, some, or all of the motion sensing apparatuses 100. However, the apparatuses respectively equipped with the motion sensor 130 and the processor 250 may further include communication transceivers using a compatible communication technology, such as Bluetooth, Wi-Fi, IR, or a physical transmission line, to transmit and receive data with each other.
  • In one embodiment, the motion tracking system 10 may further include a head-mounted display (HMD) 300. The HMD 300 is wearable on the head of the user. The HMD 300 includes, but is not limited to, a wireless transceiver 310 and an image sensor 360.
  • For the description of the wireless transceiver 310, refer to the description of the wireless transceiver 110; it is omitted here. The HMD 300 may communicate with the motion sensing apparatuses 100 through the wireless transceiver 310.
  • The image sensor 360 may be a camera, such as a monochrome camera or a color camera, a depth camera, a video recorder, or any other sensor capable of capturing images.
  • FIG. 2 is a schematic diagram illustrating a motion tracking system 20 according to one of the exemplary embodiments of the disclosure. Referring to FIG. 2, the motion tracking system 20 includes an HMD 300 and four motion sensing apparatuses 100 (two ankle sensors worn on the human body portions B1 and B2 (i.e., the two ankles) and two handheld controllers held at the human body portions B3 and B4 (i.e., the two hands)). In some embodiments, the HMD 300 may further include another motion sensor 130 (not shown) to obtain orientation information of the human body portion B5 (i.e., the head). The processor 250 is embedded in the HMD 300.
  • It should be noticed that the motion tracking system 20 is merely an example to illustrate one disposing manner of the motion sensing apparatuses 100, the HMD 300, and the processor 250. There are still many other implementations of the motion tracking system 10, and the present disclosure is not limited thereto.
  • To better understand the operating process provided in one or more embodiments of the disclosure, several embodiments will be exemplified below to elaborate the operating process of the motion tracking system 10 or 20. The devices and modules in the motion tracking system 10 or 20 are applied in the following embodiments to explain the control method provided herein. Each step of the control method can be adjusted according to actual implementation situations and should not be limited to what is described herein.
  • FIG. 3 is a flowchart illustrating a motion tracking method according to one of the exemplary embodiments of the disclosure. Referring to FIG. 3, the processor 250 may obtain first sensing data based on the motion sensors 130 disposed on the three motion sensing apparatuses 100 (step S310). Specifically, depending on the type of the motion sensor 130, acceleration, rotation, magnetic force, orientation, and/or 3-DoF/6-DoF information for the motion of the corresponding human body portion in 2D/3D space may be obtained, and one or more sensing results of the motion sensor 130 would form a sequence of first sensing data for the human body portion.
  • On the other hand, the processor 250 may obtain second sensing data based on wireless signals transmitted between the three motion sensing apparatuses 100 (step S330). In one embodiment, the processor 250 may obtain the signal strengths of the wireless signals from three or more motion sensing apparatuses 100 at multiple time points, and each signal strength would be recorded in the memory 240 together with its corresponding transmitter and receiver. The signal strength could be a received signal strength indication (RSSI), a received channel power indicator (RCPI), a reference signal received power (RSRP), or the like. In one embodiment, a motion sensing apparatus 100 may monitor the signal strengths of all detectable wireless signals, where each wireless signal includes specific identifier(s) of the transmitter and/or the receiver. The motion sensing apparatus 100 may further feed the signal strengths with the corresponding identifier(s) back to the computing apparatus 200. In another embodiment, the computing apparatus 200 may monitor the signal strengths of all detectable wireless signals, and the processor 250 records the signal strengths with the corresponding identifier of the transmitter in the memory 240. The signal strengths would be recorded over a time period to generate a sequence of second sensing data. This means that the second sensing data includes a sequence of signal strengths arranged by time.
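  • As a rough illustration of the recording step above (a sketch only; the patent prescribes no data structure, and all names here are hypothetical), each strength sample can be keyed by its transmitter/receiver pair and stored in time order:

```python
from collections import defaultdict

# (tx_id, rx_id) -> time-ordered list of (timestamp, signal strength in dBm);
# this mirrors the "sequence of signal strengths arranged by time" above.
second_sensing_data = defaultdict(list)

def record_signal(tx_id, rx_id, timestamp, rssi_dbm):
    # Each sample is stored with the identifiers of its transmitter and
    # receiver, as the recording step above describes.
    second_sensing_data[(tx_id, rx_id)].append((timestamp, rssi_dbm))

record_signal("controller_B3", "hmd_B5", 0.016, -52.4)
```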
  • In some embodiments, the processor 250 may further obtain third sensing data based on the images captured by the image sensor 360. The third sensing data could be a sequence of images and/or the sensing results (such as brightness, color, depth, etc.) of pixels in the images.
  • Then, the processor 250 may determine the motion information of the user by a determining factor including the first sensing data and the second sensing data (step S350). In one embodiment, the motion information may include position information and orientation information. Regarding the position information first, in one embodiment, the processor 250 may determine the position information of the user according to the first sensing data. In this embodiment, the determining factor includes the first sensing data. A displacement of the corresponding human body portion can be estimated through a double integral of the detected acceleration (i.e., the first sensing data) of the human body portion in three axes, to further determine the position information based on the displacement. For example, the position information could be coordinates on two or three axes, a position relative to a reference, etc.
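  • A minimal sketch of this double-integration step is shown below, assuming gravity-compensated acceleration samples at a fixed sampling interval; real systems would also have to correct the drift that double integration accumulates:

```python
import numpy as np

def displacement_from_acceleration(accel, dt):
    """Double-integrate (N, 3) acceleration samples taken every dt seconds.

    Returns the estimated displacement on each of the three axes relative
    to the starting position. Gravity removal and drift correction are
    deliberately omitted from this sketch.
    """
    velocity = np.cumsum(accel * dt, axis=0)     # first integral: velocity
    position = np.cumsum(velocity * dt, axis=0)  # second integral: position
    return position[-1]

# Example: a constant 1 m/s^2 on x for 1 s at 100 Hz gives roughly 0.5 m.
accel = np.tile([1.0, 0.0, 0.0], (100, 1))
print(displacement_from_acceleration(accel, 0.01))
```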
  • In another embodiment, the processor 250 may obtain the position information according to the second sensing data based on the wireless signals between three motion sensing apparatuses 100. In this embodiment, the determining factor includes the second sensing data. It should be noted that the signal strength of a wireless signal is related to the relative distance between two motion sensing apparatuses 100. In addition, based on trilateration, the three distances between three points can be used to determine the relative position information of the three points. Taking three of the motion sensing apparatuses 100 as the aforementioned three points, the processor 250 may determine the relative distance between each pair of motion sensing apparatuses 100 as the distance relationship between the motion sensing apparatuses 100. Then, the processor 250 may generate the position information of the tracked apparatus based on the distance relationship and trilateration.
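  • The paragraph above does not fix a particular strength-to-distance mapping. The sketch below assumes a log-distance path-loss model for that step and then reconstructs the relative layout of three apparatuses from their pairwise distances (the layout is unique up to rotation and reflection); the function names and constants are illustrative assumptions, not the patent's prescribed implementation:

```python
import math

def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exponent=2.0):
    # Assumed log-distance path-loss model; the patent only states that
    # signal strength relates to the relative distance between apparatuses.
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

def relative_positions(d_ab, d_ac, d_bc):
    """Place apparatus A at the origin and B on the x-axis, then solve the
    position of C from the three pairwise distances (2D, law of cosines)."""
    x = (d_ab ** 2 + d_ac ** 2 - d_bc ** 2) / (2 * d_ab)
    y = math.sqrt(max(d_ac ** 2 - x ** 2, 0.0))
    return (0.0, 0.0), (d_ab, 0.0), (x, y)

# Example: strengths between B3-B5, B4-B5, and B3-B4 yield three distances,
# from which the relative layout of the three apparatuses follows.
d_ab = rssi_to_distance(-46.0)   # about 2.0 m under the assumed model
d_ac = rssi_to_distance(-46.0)
d_bc = rssi_to_distance(-40.0)   # about 1.0 m
print(relative_positions(d_ab, d_ac, d_bc))
```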
  • Taking the motion tracking system 20 as an example, the processor 250 may obtain the signal strengths of the wireless signal from the motion sensing apparatus 100 for the human body portion B3 to the HMD 300 (which is one of the motion sensing apparatuses 100 in this embodiment) for the human body portion B5, the wireless signal from the motion sensing apparatus 100 for the human body portion B4 to the HMD 300 for the human body portion B5, and the wireless signal from the motion sensing apparatus 100 for the human body portion B3 to the motion sensing apparatus 100 for the human body portion B4. The processor 250 may determine their distance relationship according to the signal strengths, and then generate the position information of the human body portion B3 based on the distance relationship. The position information may be coordinates or a relative position.
  • It should be noted that the embodiment does not limit which three motion sensing apparatuses 100 are selected. For example, signal strengths of the wireless signal from the motion sensing apparatus 100 for the human body portion B2 to the motion sensing apparatus 100 for the human body portion B3, the wireless signal from the motion sensing apparatus 100 for the human body portion B3 to the motion sensing apparatus 100 for the human body portion B1, and the wireless signal from the motion sensing apparatus 100 for the human body portion B2 to the motion sensing apparatus 100 for the human body portion B1 can be used for estimating the position information of the human body portion B1. The combination of the motion sensing apparatuses 100 can be varied on demand.
  • In still another embodiment, the processor 250 may determine the position information of the user according to the third sensing data. In this embodiment, the determining factor includes the third sensing data. The position and the displacement of the human body portion in the images can be used for determining the position information in the real environment. Taking FIG. 2 as an example, the sensing strength and the pixel position corresponding to the human body portion B4 in the image can be used for estimating the depth information of the human body portion B4 (i.e., its distance relative to the HMD 300) and for estimating the 2D position of the human body portion B4 on a plane parallel to the image sensor 360.
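  • One common way to realize such an image-based estimate is pinhole-camera back-projection. The sketch below assumes known camera intrinsics (fx, fy, cx, cy) and an already-estimated depth, neither of which the patent specifies:

```python
def pixel_to_camera_coords(u, v, depth, fx, fy, cx, cy):
    # Back-project a pixel (u, v) with estimated depth (in meters) into 3D
    # camera coordinates: the x/y components give the 2D position on a
    # plane parallel to the image sensor, and depth gives the distance
    # relative to the HMD 300.
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# Example: a hand detected at pixel (400, 260) with 0.8 m estimated depth.
print(pixel_to_camera_coords(400, 260, 0.8, fx=500.0, fy=500.0, cx=320.0, cy=240.0))
```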
  • It should be noted that the accuracy of position information based on merely one sensing manner (for example, based on only one of the wireless transceiver 110, the motion sensor 130, and the image sensor 360) may differ. Therefore, two or more sensing manners can be used to determine the position information of the corresponding human body portion.
  • In one embodiment, the processor 250 may obtain first position information according to the first sensing data, obtain second position information according to the second sensing data, and obtain adjusted position information according to the first position information and the second position information. In this embodiment, the determining factor includes the first sensing data and the second sensing data. The processor 250 may determine the position information according to a combination of the first position information and the second position information. In some embodiments, the combination is a weighted combination. The adjusted position information is determined according to the sum of weighted first position information and the weighted second position information.
  • In one embodiment, the weights of the weighted combination for the first position information and the second position information may be fixed. In another embodiment, the weights of the weighted combination for the first position information and the second position information may be varied. The weight for the first position information could be a value from 0 to 100%, and the weight for the second position information could be a value from 0 to 100%. However, the weights for the first and second position information may not both be 0 at the same time.
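  • A minimal sketch of such a weighted combination (the function name weighted_position is an assumption; the sample coordinates and the 0.5/0.5 weights are borrowed from the worked example given later in this description):

```python
def weighted_position(p1, p2, w1, w2):
    # Adjusted position information: the sum of the weighted first
    # position information and the weighted second position information.
    assert w1 != 0 or w2 != 0, "the two weights may not both be zero"
    return tuple(w1 * a + w2 * b for a, b in zip(p1, p2))

print(weighted_position((6, 6, 6), (10, 10, 10), 0.5, 0.5))  # -> (8.0, 8.0, 8.0)
```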
  • It should be noted that, in some embodiments, the position information determined based on the third sensing data generated from the images of the image sensor 360 may be more accurate than the position information determined based on the wireless transceiver 110 and/or the motion sensor 130. Therefore, in one embodiment, the determining factor may include the second and third sensing data, and the processor 250 may determine the position information according to a combination of the position information obtained based on the second and the third sensing data.
  • In one embodiment, the processor 250 may obtain a first part of position information according to the second sensing data in a first duration, obtain a second part of position information according to the third sensing data in a second duration, and combine the first and second parts of position information as combined position information. The third sensing data, in which the human body portions are detected, can be used to correct the position information based on the second sensing data across the first and second durations. The processor 250 may determine the combined position information based on the first and second parts of position information in the different durations. For example, a position (1, 1) is determined based on the second sensing data in the first duration, another position (2, 1) is determined based on the third sensing data in the second duration, and the combined position information may be a displacement from the position (1, 1) to the position (2, 1).
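  • A minimal sketch of combining the two parts of position information into a displacement, using the (1, 1) to (2, 1) example above (the function name is an assumption):

```python
def combined_position_as_displacement(pos_first_duration, pos_second_duration):
    # Displacement from the position obtained in the first duration
    # (second sensing data) to the position obtained in the second
    # duration (third sensing data).
    return tuple(b - a for a, b in zip(pos_first_duration, pos_second_duration))

print(combined_position_as_displacement((1, 1), (2, 1)))  # -> (1, 0)
```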
  • In some embodiments, the processor 250 may determine the position information according to a weighted combination of the second and third position information. The weights for the second and third position information may be varied or fixed based on the actual situation. For example, the weight for the third position information may be larger than the weight for the second position information. In another embodiment, the position information is the weighted combination if the human body portions exist in the third sensing data, and the position information is the second position information if the human body portions do not exist in the third sensing data.
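  • A possible sketch of this selection logic (the function name and the 0.3/0.7 weights are assumptions; the description only suggests that the weight for the image-based estimate may be larger):

```python
def position_with_image_fallback(p2, p3, in_third_sensing_data, w2=0.3, w3=0.7):
    # Weighted combination of the second and third position information
    # when the human body portion exists in the third sensing data;
    # otherwise fall back to the second position information alone.
    if in_third_sensing_data and p3 is not None:
        return tuple(w2 * a + w3 * b for a, b in zip(p2, p3))
    return p2

print(position_with_image_fallback((1.0, 2.0), (1.2, 1.8), True))   # blended
print(position_with_image_fallback((1.0, 2.0), None, False))        # fallback
```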
  • In one embodiment, the image sensor 360 may be designed with a specific field of view (FOV). If one human body portion is located outside of the field of view of the image sensor 360, the processor 250 may not be able to determine the motion information of this human body portion merely using the third sensing data, and the first or second sensing data should be considered.
  • In one embodiment, the processor 250 may determine whether one human body portion of the user exists in the sequence of third sensing data, and determine, according to the determined result of the existence of the human body portion, whether to use the distance relationship between three motion sensing apparatuses 100 to determine the position information based on trilateration. The processor 250 may use machine learning technology (such as deep learning, an artificial neural network (ANN), or a support vector machine (SVM), etc.) to identify the target human body portion in the third sensing data.
  • FIG. 4 is a schematic diagram illustrating a motion tracking method according to one of the exemplary embodiments of the disclosure. Referring to FIG. 4, it is assumed that the motion sensing apparatus 100 for the human body portion B4 is the target apparatus. In this figure, the human body portion B4 exists in the field of view FOV of HMD 300 (i.e., the human body portion B4 exists in the third sensing data).
  • FIG. 5 is a schematic diagram illustrating a motion tracking method according to one of the exemplary embodiments of the disclosure. Referring to FIG. 5, it is assumed that the motion sensing apparatus 100 for the human body portion B3 is the target apparatus. In this figure, the human body portion B3 does not exist in the field of view FOV of HMD 300 (i.e., the human body portion B3 does not exist in the third sensing data).
  • It should be noted that the size and the shape of the field of view illustrated in FIGS. 4 and 5 are merely examples and could be modified based on actual requirements.
  • Therefore, the field of view of the image sensor 360 is used to determine whether the human body portions exist in the third sensing data. In one embodiment, it is assumed that the human body portions are located outside of the field of view (i.e., do not exist in the third sensing data) in the first duration, and the human body portions are located inside of the field of view of the image sensor 360 (i.e., exist in the third sensing data) in the second duration. In some embodiments, it is assumed that the human body portions are located inside of the field of view of the image sensor 360 in both the first and second durations.
  • In another embodiment, the processor 250 may obtain first position information according to the first sensing data, obtain second position information according to the second sensing data, obtain third position information according to the third sensing data, and obtain adjusted position information according to the first position information, the second position information, and the third position information. In this embodiment, the determining factor includes the first, second, and third sensing data. The processor 250 may determine the adjusted position information according to a combination of the first position information, the second position information, and the third position information.
  • In one embodiment, the combination is a weighted combination. The processor 250 may determine a first weight for the first position information and a second weight for the second position information according to the third position information. In one embodiment, the first weight and the second weight are varied time after time. In the duration that the human body portions exist in the third sensing data, the third position information would be considered as correct position information, and the weighted combination of the first and second position information with the first weight and the second weight would be adjusted according to the third position information. It should be noted that the processor 250 may obtain a first parameter by multiplying the first weight and the first position information, obtain a second parameter by multiplying the second weight and the second position information, and obtain the adjusted position information by adding the first parameter to the second parameter, so as to obtain the weighted combination.
  • In one embodiment, the first and second weights at a subsequent time point may be determined based on an equation in which the third position information equals the weighted combination of the first and second position information at a previous time point. For example, at the third time point, the first weight is 0.5 and the second weight is 0.5, the first position information is (6, 6, 6) and the second position information is (10, 10, 10) in a 3-dimensional coordinate system, and the adjusted position information would be (8, 8, 8). If the third position information is (7, 7, 7), the first and second weights at the fourth time point would be determined as 0.75 and 0.25, respectively. Then, at the fourth time point, if the first position information is (7, 6, 6) and the second position information is (12, 10, 10) in the 3-dimensional coordinate system, the adjusted position information would be (8.25, 7, 7).
  • In another embodiment, the first and second weights at a current time point may be determined based on an equation in which the third position information equals the weighted combination of the first and second position information at the current time point. For example, at the second time point, the first position information is (6, 6, 6) and the second position information is (10, 10, 10) in a 3-dimensional coordinate system. If the third position information is (7, 7, 7), the first and second weights at the second time point would be determined as 0.75 and 0.25, respectively. Then, the adjusted position information at the second time point would be determined as (7, 7, 7).
  • In some embodiments, the first weight and the second weight are fixed if the human body portions of the user do not exist in the third sensing data. If the human body portions are located outside of the field of view of the image sensor 360, the first and second weights would remain the same as the first and second weights at a previous time point when the human body portions of the user still existed in the third sensing data. For example, the human body portions are located inside the field of view of the image sensor 360 at the first time point, and the first weight is 0.5 and the second weight is 0.5. Then, at the second time point, the human body portions are located outside of the field of view of the image sensor 360. The first weight would be 0.5 and the second weight would be 0.5 at the second time point, the same as the first and second weights at the first time point. The first and second weights would not be varied according to the third sensing data until the human body portions of the user exist in the third sensing data again.
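  • The weight adaptation of the preceding paragraphs can be reproduced with a short sketch. The function names solve_weights and adjusted_position are assumptions, as is the editorial choice that the two weights sum to 1 (which the worked examples above imply) and the per-axis averaging when the two estimates differ on several axes.

```python
def solve_weights(p1, p2, p3):
    # Choose w1 so that w1*p1 + (1 - w1)*p2 reproduces the third position
    # information p3, averaged over the axes where p1 and p2 differ.
    ratios = [(c3 - c2) / (c1 - c2)
              for c1, c2, c3 in zip(p1, p2, p3) if c1 != c2]
    w1 = sum(ratios) / len(ratios) if ratios else 0.5
    return w1, 1.0 - w1

def adjusted_position(p1, p2, w1, w2):
    return tuple(w1 * a + w2 * b for a, b in zip(p1, p2))

# Third time point: weights 0.5/0.5 give (8, 8, 8); the third position
# information (7, 7, 7) re-solves the weights to 0.75/0.25.
w1, w2 = solve_weights((6, 6, 6), (10, 10, 10), (7, 7, 7))
# Fourth time point: the re-solved weights yield (8.25, 7.0, 7.0).
print(adjusted_position((7, 6, 6), (12, 10, 10), w1, w2))
# When the portions leave the field of view, w1 and w2 are simply held
# fixed until they reappear in the third sensing data.
```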
  • In still another embodiment, the processor 250 may determine the adjusted position information according to a weighted combination of the first position information, the second position information, and the third position information. The adjusted position information is determined according to the sum of weighted first position information, the weighted second position information, and the weighted third position information. The weights for the three pieces of position information may be varied or fixed based on the actual situations.
  • On the other hand, regarding the orientation information, in one embodiment, the processor 250 may use the sequence of the first sensing data as the orientation information directly. For example, the orientation information could be the acceleration, the angular velocities in three axes, the orientation, 3-DoF information, and/or 6-DoF information.
  • In another embodiment, the processor 250 may determine the orientation information according to the third sensing data. Taking FIG. 4 as an example, two poses of the human body portion B4 in the images at different time points can be used for estimating the orientation information.
  • In some embodiments, the processor 250 may determine the orientation information according to the first sensing data and the third sensing data. The orientation information may be a weighted combination of the first sensing data and the third sensing data. For example, the orientation information is determined according to the sum of the weighted first orientation information based on the motion sensor 130 and the weighted second orientation information based on the image sensor 360.
  • In one embodiment, the field of view of the image sensor 360 is a condition for determining whether the orientation information is obtained according to the third sensing data. If the human body portions exist in the third sensing data, the orientation information may be determined according to the first sensing data and the third sensing data. If the human body portions do not exist in the third sensing data, the orientation information may be determined merely according to the first sensing data.
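  • A sketch of this field-of-view gating for the orientation information (the function name and the 0.6/0.4 weights are assumptions, and the per-axis blend of Euler angles is a simplification; a production implementation would interpolate quaternions):

```python
def fused_orientation(imu_orientation, image_orientation,
                      in_third_sensing_data, w_imu=0.6, w_img=0.4):
    # Use the orientation from the first sensing data (motion sensor)
    # alone when the portion is outside the field of view; otherwise
    # blend it with the image-based estimate.
    if not in_third_sensing_data or image_orientation is None:
        return imu_orientation
    return tuple(w_imu * a + w_img * b
                 for a, b in zip(imu_orientation, image_orientation))

print(fused_orientation((10.0, 0.0, 5.0), (12.0, 0.0, 4.0), True))  # blended
print(fused_orientation((10.0, 0.0, 5.0), None, False))             # IMU only
```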
  • In one embodiment, the processor 250 may determine the motion information of the user according to both the orientation information and the position information. The orientation information could be generated based on the first sensing data, the third sensing data, or the combination of the first and third sensing data as mentioned above. The position information could be generated based on one of the first, second and third sensing data as mentioned above. Taking the human body portion B1 or B2 in FIG. 2 as an example, the motion information may be related to lifting, pointing, kicking, stepping, or jumping motion.
  • In another embodiment, the processor 250 may determine the motion information of the user according to both the orientation information based on the first sensing data and the adjusted position information based on the first and second position information. No matter whether the human body portions exist in the third sensing data, the processor 250 can predict the motion of the user.
  • In still another embodiment, the processor 250 may determine the motion information of the user according to both the orientation information based on the first sensing data and the combined position information based on the second and third sensing data. That is, the motion information can be determined based on the orientation information and the combined position information across the two durations in which the human body portions do not exist and then exist in the third sensing data.
  • Taking FIGS. 4 and 5 as an example, a hands-up motion for the human body portion B4 is determined in FIG. 4, and a hands-down motion is determined in FIG. 5. Then, a swing motion from up to down for the human body portion B4 is determined.
  • In one embodiment, the processor 250 may determine the motion information of the user merely according to the position information based on the second sensing data. In another embodiment, the processor 250 may determine the motion information of the user merely according to the combined position information based on the second and third sensing data. In some embodiments, the processor 250 may determine the motion information of the user merely according to the position information based on the second sensing data if the human body portions do not exist in the third sensing data, and may determine the motion information of the user merely according to the position information based on the third sensing data or the combined position information if the human body portions exist in the third sensing data.
  • The displacement or trajectory of the human body portion may be tracked, and the motion information can be determined based on the displacement or trajectory. Taking FIGS. 4 and 5 as an example, the human body portion B4 moves from up to down, and a swing motion from up to down for the human body portion B4 is determined.
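  • A minimal sketch of determining such a swing motion from a tracked trajectory (the function name, the 0.2 m threshold, and the sample coordinates are illustrative assumptions):

```python
def classify_vertical_swing(trajectory, threshold=0.2):
    # Classify a swing from a tracked trajectory of (x, y) positions: a
    # drop of the vertical coordinate beyond the threshold between the
    # first and last samples is a swing from up to down; a rise is the
    # opposite.
    dy = trajectory[-1][1] - trajectory[0][1]
    if dy < -threshold:
        return "swing from up to down"
    if dy > threshold:
        return "swing from down to up"
    return "no vertical swing"

# Positions of B4 sampled across FIGS. 4 and 5: hand up, then down.
print(classify_vertical_swing([(0.0, 0.9), (0.1, 0.5), (0.1, 0.1)]))
```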
  • It will be apparent to those skilled in the art that various modifications and variations can be made to the structure of the present disclosure without departing from the scope or spirit of the disclosure. In view of the foregoing, it is intended that the present disclosure cover modifications and variations of this disclosure provided they fall within the scope of the following claims and their equivalents.

Claims (20)

What is claimed is:
1. A motion tracking method, adapted for a motion tracking system, wherein the motion tracking system comprises first, second and third motion sensing apparatuses wearable on human body portions of a user, and the motion tracking method comprises:
obtaining first sensing data based on motion sensors disposed on the first, second and third motion sensing apparatuses;
obtaining second sensing data based on wireless signals transmitted between the first, second and third motion sensing apparatuses; and
determining motion information of the user by a determining factor, wherein the determining factor comprises the first sensing data and the second sensing data.
2. The motion tracking method according to claim 1, wherein the step of determining the motion information of the user comprises:
obtaining orientation information according to the first sensing data;
obtaining position information according to the second sensing data; and
determining the motion information of the user according to both the orientation information and the position information.
3. The motion tracking method according to claim 1, wherein the step of determining the motion information of the user comprises:
obtaining first position information and orientation information according to the first sensing data;
obtaining second position information according to the second sensing data;
obtaining adjusted position information according to the first position information and the second position information; and
determining the motion information of the user according to both the orientation information and the adjusted position information.
4. The motion tracking method according to claim 1, further comprising:
obtaining third sensing data based on images captured from an image sensor,
wherein the human body portions of the user exist in the third sensing data, and the determining factor further comprises the third sensing data.
5. The motion tracking method according to claim 4, wherein the step of determining the motion information of the user comprises:
obtaining orientation information according to the first sensing data;
obtaining a first part of position information according to the second sensing data in a first duration;
obtaining a second part of position information according to the third sensing data in a second duration;
combining the first and second part of position information as combined position information; and
determining the motion information of the user according to both the orientation information and the combined position information.
6. The motion tracking method according to claim 4, wherein the step of determining the motion information of the user comprises:
obtaining orientation information and first position information according to the first sensing data;
obtaining second position information according to the second sensing data;
obtaining third position information according to the third sensing data;
obtaining adjusted position information according to the first position information, the second position information and the third position information; and
determining the motion information of the user according to both the orientation information and the adjusted position information.
7. The motion tracking method according to claim 6, wherein the step of obtaining the adjusted position information according to the first position information, the second position information and the third position information comprises:
determining a first weight and a second weight according to the third position information;
obtaining a first parameter by multiplying the first weight and the first position information;
obtaining a second parameter by multiplying the second weight and the second position information; and
obtaining the adjusted position information by adding the first parameter to the second parameter.
8. The motion tracking method according to claim 7, wherein the first weight and the second weight are varied time after time.
9. The motion tracking method according to claim 7, wherein the first weight and the second weight are fixed in response to the human body portions of the user not existing in the third sensing data.
10. The motion tracking method according to claim 5, wherein the human body portions of the user do not exist in the third sensing data at the first duration, and the human body portions of the user exist in the third sensing data at the second duration.
11. A motion tracking system, comprising:
three motion sensing apparatuses, wearable on human body portions of a user, wherein each of the motion sensing apparatuses comprises:
a wireless transceiver, transmitting or receiving wireless signals; and
a motion sensor, sensing motion of one of the human body portions of the user; and
a processor, configured to perform:
obtaining first sensing data based on the motion sensors of the three motion sensing apparatuses;
obtaining second sensing data based on the wireless signals transmitted between the three motion sensing apparatuses; and
determining motion information of the user by a determining factor, wherein the determining factor comprises the first sensing data and the second sensing data.
12. The motion tracking system according to claim 11, wherein the processor is configured to perform:
obtaining orientation information according to the first sensing data;
obtaining position information according to the second sensing data; and
determining the motion information of the user according to both the orientation information and the position information.
13. The motion tracking system according to claim 11, wherein the processor is configured to perform:
obtaining first position information and orientation information according to the first sensing data;
obtaining second position information according to the second sensing data;
obtaining adjusted position information according to the first position information and the second position information; and
determining the motion information of the user according to both the orientation information and the adjusted position information.
14. The motion tracking system according to claim 11, further comprising:
an image sensor, obtaining images; wherein the processor is configured to perform:
obtaining third sensing data based on the images,
wherein the human body portions of the user exist in the third sensing data, and the determining factor further comprises the third sensing data.
15. The motion tracking system according to claim 14, wherein the processor is configured to perform:
obtaining orientation information according to the first sensing data;
obtaining a first part of position information according to the second sensing data in a first duration;
obtaining a second part of position information according to the third sensing data in a second duration;
combining the first and second part of position information as combined position information; and
determining the motion information of the user according to both the orientation information and the combined position information.
16. The motion tracking system according to claim 14, wherein the processor is configured to perform:
obtaining orientation information and first position information according to the first sensing data;
obtaining second position information according to the second sensing data;
obtaining third position information according to the third sensing data;
obtaining adjusted position information according to the first position information, the second position information and the third position information; and
determining the motion information of the user according to both the orientation information and the adjusted position information.
17. The motion tracking system according to claim 16, wherein the processor is configured to perform:
determining a first weight and a second weight according to the third position information;
obtaining a first parameter by multiplying the first weight and the first position information;
obtaining a second parameter by multiplying the second weight and the second position information; and
obtaining the adjusted position information by adding the first parameter to the second parameter.
18. The motion tracking system according to claim 17, wherein the first weight and the second weight are varied time after time.
19. The motion tracking system according to claim 17, wherein the first weight and the second weight are fixed in response to the human body portions of the user not existing in the third sensing data.
20. The motion tracking system according to claim 15, wherein the human body portions of the user do not exist in the third sensing data at the first duration, and the human body portions of the user exist in the third sensing data at the second duration.
US16/693,344 2019-11-24 2019-11-24 Motion tracking system and method Abandoned US20210157394A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/693,344 US20210157394A1 (en) 2019-11-24 2019-11-24 Motion tracking system and method
US17/125,954 US11460912B2 (en) 2019-11-24 2020-12-17 System and method related to data fusing
US17/125,962 US20210157397A1 (en) 2019-11-24 2020-12-17 System and method related to motion tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US16/693,344 US20210157394A1 (en) 2019-11-24 2019-11-24 Motion tracking system and method

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US17/125,962 Continuation US20210157397A1 (en) 2019-11-24 2020-12-17 System and method related to motion tracking
US17/125,954 Continuation US11460912B2 (en) 2019-11-24 2020-12-17 System and method related to data fusing

Publications (1)

Publication Number Publication Date
US20210157394A1 true US20210157394A1 (en) 2021-05-27

Family

ID=75974056

Family Applications (3)

Application Number Title Priority Date Filing Date
US16/693,344 Abandoned US20210157394A1 (en) 2019-11-24 2019-11-24 Motion tracking system and method
US17/125,954 Active 2040-01-06 US11460912B2 (en) 2019-11-24 2020-12-17 System and method related to data fusing
US17/125,962 Abandoned US20210157397A1 (en) 2019-11-24 2020-12-17 System and method related to motion tracking

Family Applications After (2)

Application Number Title Priority Date Filing Date
US17/125,954 Active 2040-01-06 US11460912B2 (en) 2019-11-24 2020-12-17 System and method related to data fusing
US17/125,962 Abandoned US20210157397A1 (en) 2019-11-24 2020-12-17 System and method related to motion tracking

Country Status (1)

Country Link
US (3) US20210157394A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11507203B1 (en) 2021-06-21 2022-11-22 Meta Platforms Technologies, Llc Body pose estimation using self-tracked controllers

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
MX2012010238A (en) 2010-03-05 2013-01-18 Sony Comp Entertainment Us Maintaining multiple views on a shared stable virtual space.
US9436286B2 (en) * 2011-01-05 2016-09-06 Qualcomm Incorporated Method and apparatus for tracking orientation of a user
US20120220233A1 (en) 2011-02-28 2012-08-30 Qualcomm Incorporated Ranging with body motion capture
US9354709B1 (en) 2014-06-17 2016-05-31 Amazon Technologies, Inc. Tilt gesture detection
US20170115737A1 (en) 2015-10-26 2017-04-27 Lenovo (Singapore) Pte. Ltd. Gesture control using depth data
US10317989B2 (en) 2016-03-13 2019-06-11 Logitech Europe S.A. Transition between virtual and augmented reality
US10078377B2 (en) 2016-06-09 2018-09-18 Microsoft Technology Licensing, Llc Six DOF mixed reality input by fusing inertial handheld controller with hand tracking
KR102439771B1 (en) 2016-08-22 2022-09-02 매직 립, 인코포레이티드 Augmented reality display device with deep learning sensors
US10895628B2 (en) 2016-12-29 2021-01-19 Htc Corporation Tracking system, tracking device and tracking method
TWI646449B (en) 2017-05-12 2019-01-01 華碩電腦股份有限公司 Three-dimensional positioning system and method thereof
US10852847B2 (en) 2017-07-26 2020-12-01 Google Llc Controller tracking for multiple degrees of freedom
US10996742B2 (en) 2017-10-17 2021-05-04 Logitech Europe S.A. Input device for AR/VR applications
TWI636274B (en) 2018-03-06 2018-09-21 宏達國際電子股份有限公司 A positioning system and a positioning method
WO2019226691A1 (en) 2018-05-22 2019-11-28 Magic Leap, Inc. Transmodal input fusion for a wearable system

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20230086268A1 (en) * 2021-09-22 2023-03-23 Apple Inc. Body volume/shape determination using handheld devices
US11796312B2 (en) * 2021-09-22 2023-10-24 Apple Inc. Body volume/shape determination using handheld devices
US20230221433A1 (en) * 2022-01-12 2023-07-13 Freedrum Studio AB System and a method for determining positions of sensor units

Also Published As

Publication number Publication date
US20210157397A1 (en) 2021-05-27
US11460912B2 (en) 2022-10-04
US20210157396A1 (en) 2021-05-27

Similar Documents

Publication Publication Date Title
US11460912B2 (en) System and method related to data fusing
EP3717928B1 (en) Apparatus and method for tracking movement of electronic device
WO2016041088A1 (en) System and method for tracking wearable peripherals in augmented reality and virtual reality applications
CN107923740B (en) Sensor device, sensor system, and information processing device
EP3304953B1 (en) Transmitting athletic data using non-connected state of discovery signal
JP7317399B2 (en) Electronic device and system for guiding ball drop point
US20180343397A1 (en) Systems and methods for position tracking
US11029753B2 (en) Human computer interaction system and human computer interaction method
EP4016253A1 (en) System and method related to data fusing
EP4016252A1 (en) System and method related to motion tracking
EP3832435A1 (en) Motion tracking system and method
TWI737068B (en) Motion tracking system and method
KR20160015674A (en) System for analyzing of human movement using inertial sensor
CN114661143A (en) System and method relating to data fusion
CN113029190A (en) Motion tracking system and method
CN114745010A (en) System and method relating to motion tracking
TW202225916A (en) Motion tracking system and method
TW202225915A (en) System and method related to data fusing
JP2022096723A (en) System and method related to motion tracking
JP2021089691A (en) Action tracking system and method for tracking actions
JP2022096724A (en) System and method related to data fusion
CN107007997B (en) Image processing apparatus, measurement device, image processing system, image processing method and recording medium
US11777626B2 (en) Method and apparatus related to multiple modes
US11369866B2 (en) Position tracking apparatus and method
SE1950724A1 (en) System for analyzing movement in sport

Legal Events

Date Code Title Description
AS Assignment

Owner name: XRSPACE CO., LTD., TAIWAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HUANG, CHING-NING;HSIEH, YI-KANG;REEL/FRAME:051111/0365

Effective date: 20190822

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION