WO2018057044A1 - Dual motion sensing bands for real-time gesture tracking and interactive gaming - Google Patents

Dual motion sensing bands for real-time gesture tracking and interactive gaming

Info

Publication number
WO2018057044A1
WO2018057044A1 (PCT/US2016/057223)
Authority
WO
WIPO (PCT)
Prior art keywords
motion
bands
sensing
sensing bands
data
Prior art date
Application number
PCT/US2016/057223
Other languages
English (en)
Inventor
Rosa Mei-Mei HUANG
Original Assignee
Huang Rosa Mei Mei
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huang Rosa Mei Mei filed Critical Huang Rosa Mei Mei
Publication of WO2018057044A1

Classifications

    • A — HUMAN NECESSITIES
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114 Tracking parts of the body
    • A61B5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B5/7246 Details of waveform analysis using correlation, e.g. template matching or determination of similarity
    • A61B5/7405 Details of notification to user using sound
    • A61B5/742 Details of notification to user using visual displays
    • A61B5/7455 Details of notification to user characterised by tactile indication, e.g. vibration or electrical stimulation
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A63F13/212 Input arrangements for video game devices using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A63F13/428 Processing input control signals of video game devices by mapping the input signals into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • G — PHYSICS
    • G06F3/016 Input arrangements with force or tactile feedback as computer generated output to the user
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/0304 Detection arrangements using opto-electronic means
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F2203/0382 Plural input, i.e. interface arrangements in which a plurality of input devices of the same type are in communication with a PC
    • G06V40/23 Recognition of whole body movements, e.g. for sport training
    • G09B19/0015 Teaching: Dancing
    • G09B19/0038 Teaching: Sports

Definitions

  • fitness trackers on the market (Fitbit, Garmin, Moov, Misfit and smart watches) track activities such as distance, heart rate and pulse, monitored primarily through use of a location tracker, heart monitor, gyroscope, and accelerometer. They are used as single units worn on one wrist.
  • Motion sensing bands, when used in pairs (one placed on the left wrist and one on the right wrist, and/or one on each ankle), permit near field communication between the bands and the streaming of information gathered from gyroscope and accelerometer sensors (and possibly additional sensors).
  • the communication between bands and information streaming by the bands may work in conjunction with sensor input from the front facing camera of a mobile device, a web cam, or an external beacon to track complex human movements occurring in three dimensions (the sagittal, frontal and transverse planes), although information from a camera is not required for the operation of the sensors or for the information streaming from the sensor bands.
  • Application software may provide training to help a user learn a specific dance, martial arts, fitness training, gymnastics or rehabilitation moves by monitoring user movements and by providing feedback via a number of visual, audible, and haptic mechanisms to reward or correct movements during the training process.
  • FIGURE 1 shows the system design architecture for a device where motion capture information can be obtained by gesture recognition bands consistent with certain embodiments of the present invention.
  • FIGURE 2 shows a schematic block diagram of game play integration consistent with certain embodiments of the present invention.
  • FIGURE 3A illustrates the dual band devices on the player's body consistent with certain embodiments of the present invention.
  • FIGURE 3B illustrates dual bands as used for an activity such as fitness training consistent with certain embodiments of the present invention.
  • FIGURE 4 is a schematic block diagram showing the use of the bands in game play consistent with certain embodiments of the present invention.
  • FIGURE 5A is a schematic block diagram illustrating the types of sensor inputs and gesture primitive input gathered through raw data algorithm analysis consistent with certain embodiments of the present invention.
  • FIGURE 5B illustrates primitive angles in raw data accumulation using a device consisting of Left side and Right side consistent with certain embodiments of the present invention.
  • FIGURE 6A shows graphs for raw vs. processed acceleration signals consistent with certain embodiments of the present invention.
  • FIGURE 6B illustrates Angular Displacement Based Trigger from raw data analysis consistent with certain embodiments of the present invention.
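An angular displacement based trigger of the kind shown in FIGURE 6B can be sketched as follows. This is an illustrative assumption, not the patent's implementation: angular velocity from the gyroscope is integrated over time, and the trigger fires once the cumulative displacement crosses a threshold.

```python
def angular_displacement_trigger(gyro_samples, dt, threshold_deg=45.0):
    """Integrate angular velocity (deg/s) over time; fire when the
    cumulative angular displacement about an axis exceeds the threshold.
    The 45-degree default is an illustrative assumption."""
    displacement = 0.0
    for omega in gyro_samples:          # angular velocity in deg/s
        displacement += omega * dt      # simple rectangular integration
        if abs(displacement) >= threshold_deg:
            return True
    return False

# A 90 deg/s rotation sampled at 100 Hz crosses 45 degrees after 0.5 s.
print(angular_displacement_trigger([90.0] * 60, dt=0.01))  # True
```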
  • FIGURE 7A illustrates the system of Basic Data Analysis for motion capture consistent with certain embodiments of the present invention.
  • FIGURE 7B is a schematic block diagram showing pattern recognition work flow consistent with certain embodiments of the present invention.
  • FIGURE 8A illustrates a 3d model of the dual bands showing the Yin Yang configuration consistent with certain embodiments of the present invention.
  • FIGURE 8B illustrates a top view of the dual bands showing the Yin Yang structural configuration consistent with certain embodiments of the present invention.
  • FIGURE 8C illustrates a rear view of the dual bands showing the Yin Yang structural configuration consistent with certain embodiments of the present invention.
  • FIGURE 9A illustrates the central device components early in the process of being returned to the docking station or oracle consistent with certain embodiments of the present invention.
  • FIGURE 9B illustrates the central device components later in the process of being returned to the docking station or oracle consistent with certain embodiments of the present invention.
  • FIGURE 9C illustrates the central device components at the end of the process of being returned to the docking station or oracle consistent with certain embodiments of the present invention.
  • FIGURE 10A illustrates a 3d model of an alternate version of the dual bands showing Yin Yang configuration consistent with certain embodiments of the present invention.
  • FIGURE 10B illustrates a top view of an alternate version of the dual bands showing the Yin Yang structural configuration with interlocking faces consistent with certain embodiments of the present invention.
  • FIGURE 10C illustrates a rear view of an alternate version of the dual bands showing the Yin Yang structural configuration with interlocking faces consistent with certain embodiments of the present invention.
  • FIGURE 11A illustrates three planes of motion (transverse, frontal and sagittal plane) consistent with certain embodiments of the present invention.
  • FIGURE 11B illustrates a motion sensor demonstrating six degrees of freedom consistent with certain embodiments of the present invention.
  • FIGURE 11C illustrates the three planes of motion slicing through one body consistent with certain embodiments of the present invention.
  • FIGURE 12A illustrates an existing gesture recognition technique (Hidden Markov Models, or HMM) which may use simple machine learning on raw data consistent with certain embodiments of the present invention.
  • FIGURE 12B illustrates a state transition diagram associated with gesture recognition using Hidden Markov Models consistent with certain embodiments of the present invention.
  • FIGURE 13 is a schematic block diagram showing the processing sequence used in the Hidden Markov Model sequence consistent with certain embodiments of the present invention.
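Gesture recognition with Hidden Markov Models, as referenced in FIGURES 12A through 13, can be illustrated with a toy Viterbi decoder over quantized motion symbols. The states, probabilities, and observation symbols below are invented for illustration and are not taken from the patent.

```python
import numpy as np

# Toy HMM for recognizing phases of a gesture from quantized motion symbols.
states = ["raise", "hold", "lower"]
start_p = np.array([0.8, 0.1, 0.1])
trans_p = np.array([[0.6, 0.3, 0.1],
                    [0.1, 0.6, 0.3],
                    [0.1, 0.1, 0.8]])
# Observation symbols: 0 = accelerating up, 1 = still, 2 = accelerating down
emit_p = np.array([[0.7, 0.2, 0.1],
                   [0.1, 0.8, 0.1],
                   [0.1, 0.2, 0.7]])

def viterbi(obs):
    """Most likely hidden state sequence for an observation sequence."""
    T = len(obs)
    v = np.zeros((T, len(states)))
    back = np.zeros((T, len(states)), dtype=int)
    v[0] = start_p * emit_p[:, obs[0]]
    for t in range(1, T):
        scores = v[t - 1][:, None] * trans_p   # score of each transition i -> j
        back[t] = scores.argmax(axis=0)        # best predecessor per state
        v[t] = scores.max(axis=0) * emit_p[:, obs[t]]
    path = [int(v[-1].argmax())]
    for t in range(T - 1, 0, -1):              # backtrack through predecessors
        path.append(int(back[t][path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi([0, 0, 1, 1, 2, 2]))
# → ['raise', 'raise', 'hold', 'hold', 'lower', 'lower']
```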
  • FIGURE 14A shows a basic search pattern for histogram data in optical flow motion analysis consistent with certain embodiments of the present invention.
  • FIGURE 14B illustrates feature extraction for nearest neighbor classification or a distribution of direction bias during motion histogram consistent with certain embodiments of the present invention.
  • FIGURE 14C illustrates some types of movements interpreted by optical flow analysis consistent with certain embodiments of the present invention.
  • FIGURE 15A shows a motion trail interpretation of optical flow analysis of clapping hands consistent with certain embodiments of the present invention.
  • FIGURE 15B shows a silhouette of a man performing this clapping hands gesture consistent with certain embodiments of the present invention.
  • FIGURE 16A shows a motion trail interpretation of optical flow analysis of flapping arms consistent with certain embodiments of the present invention.
  • FIGURE 16B shows a silhouette of a man performing this flapping arms gesture consistent with certain embodiments of the present invention.
  • FIGURE 17A illustrates movement in the sagittal plane about the frontal axis consistent with certain embodiments of the present invention.
  • FIGURE 17B illustrates movement in the frontal plane about the sagittal axis consistent with certain embodiments of the present invention.
  • FIGURE 17C illustrates movement in transverse (horizontal) plane about the vertical axis consistent with certain embodiments of the present invention.
  • FIGURE 18A illustrates the tai chi move sometimes referred to as “repulsing the monkey,” consistent with certain embodiments of the present invention.
  • FIGURE 18B illustrates the tai chi move sometimes referred to as “cloud hands,” consistent with certain embodiments of the present invention.
  • FIGURE 18C illustrates the tai chi move sometimes referred to as “leg kick,” consistent with certain embodiments of the present invention.
  • FIGURE 19A illustrates a b-boy move known as toprocking consistent with certain embodiments of the present invention.
  • FIGURE 19B illustrates a basic arm wave consistent with certain embodiments of the present invention.
  • FIGURE 19C illustrates a breakdance hand spin where the body is balanced on one hand and the body rotates around the axis defined by the arm consistent with certain embodiments of the present invention.
  • FIGURE 20 illustrates an alternate view of the system architecture utilizing a cloud based, online system consistent with certain embodiments of the present invention.
  • FIGURE 21 illustrates an alternate view of the system architecture utilizing an offline or local server based system consistent with certain embodiments of the present invention.
  • the present invention teaches an interlocking set of motion tracking electronic sensor bands to monitor movement patterns along multiple planes, especially the sagittal plane of the body which divides left and right sides.
  • Sets of twin motion tracking bands may support the gamification of fitness and can assist in teaching and monitoring movement skills in dance, martial arts, fitness training, gymnastics and rehabilitation, where form of the body and tracking of arm and leg movements are critical.
  • Multiple sets of interlocking motion tracking bands, or additional configurations of motion tracking sensor bands, including configurations where motion tracking sensor bands are not strictly configured in pairs, in communication with one another and one or more system servers may also be contemplated to create the ability to accumulate and stream information from more complex movements.
  • the motion tracking bands may be configured to communicate and share information between the sensor bands through the use of any near field communication methodology, such as Bluetooth, Bluetooth Low Energy, near field radio signals, or any other near field communication capability.
  • the bands may feature elements which make the user aware of correct form, giving the user feedback through use of LED lights, OLED display, and advanced pulse haptic information. These elements may provide the player with feedback on game scoring, points, correct rhythm and may encourage user engagement through positive reinforcement feedback in LED light format.
  • the motion sensing bands may work in sets of 2 to be placed on both wrists and/or both ankles, or may be configured to work in sets of paired sensing bands, such as, in a non-limiting example, two sets of two for placement on both wrists and ankles, or in greater numbers of sets to foster communication between cooperative sets of sensing bands that are affixed to multiple persons all working toward a similar movement or movement goal.
  • Gesture recognition and analysis for 3d human movements is achieved by use of one sensor each on the left and right side of the body.
  • using a mix of raw data algorithm analysis, pattern recognition, and machine learning, as well as additional information optionally gathered through the front facing camera of a mobile device via a low latency optical flow algorithm, the system can understand complex motion and gesture patterns and assist players as they train to learn complex movements in dance, martial arts, gymnastics, yoga, boxing, fitness and other sports where correct form is critical to training and performance.
  • the motion sensing bands collect sensor data from accelerometer sensors (and possibly additional sensors) and can work in conjunction with sensor input from the front facing camera of a mobile device, web cam, or an external beacon to track complex human movements occurring in 3 dimensions (the sagittal, frontal and transverse planes). Input from a camera of a mobile device, web cam, or external beacon is not required but may be used by the system as additional data for analysis of movement.
  • the system is able to analyze movement with six degrees of freedom. The six degrees of freedom arise from linear motion and rotational motion along each of the X, Y, and Z axes. Each axis is perpendicular to one of three planes:
  • Sagittal plane - divides the body into left and right halves; user motion within the plane is forward or backward.
  • Frontal plane - divides the body into two roughly equal-sized portions and separates front from back; user motion within the plane is to the left or to the right.
  • Transverse plane - divides the body into two roughly equal-sized portions at roughly hip level and separates top from bottom; user motion is rotational - turning to the left or turning to the right.
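As an illustrative sketch (not from the patent), the dominant rotation axis in a gyroscope sample can be mapped to a plane of motion, using the axis-to-plane correspondence shown in FIGURES 17A through 17C: rotation about the frontal axis produces sagittal-plane motion, rotation about the sagittal axis produces frontal-plane motion, and rotation about the vertical axis produces transverse-plane motion.

```python
def dominant_plane(gyro_xyz):
    """Classify which anatomical plane a rotation mostly occurs in, by the
    axis with the largest angular-velocity magnitude. The axis labels
    (X = frontal axis, Y = sagittal axis, Z = vertical axis) are an
    illustrative mounting assumption."""
    gx, gy, gz = (abs(v) for v in gyro_xyz)
    # Pair each magnitude with the plane its axis of rotation implies.
    _, plane = max((gx, "sagittal"), (gy, "frontal"), (gz, "transverse"))
    return plane

# A strong rotation about X (e.g. an arm swing forward) is sagittal-plane motion.
print(dominant_plane((120.0, 10.0, 5.0)))  # sagittal
```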
  • motion data captured from more than one set of sensing bands may be analyzed and utilized to assist more than one user to simultaneously synchronize movements in any or all of the sagittal, frontal, or transverse planes.
  • This synchronization may present a user with additional analysis in relation to another user or object, and may present a group of users with feedback on physical movements to provide for analysis and improvement on individual movement, or on movements that depend upon group interaction and placement.
  • the motion data gathered from these various input devices reveal a plurality of metrics which can analyze and determine the direction and level of the movement, the part of the body performing the movement, the duration of movement, and dynamic qualities of the movement. This information may be used to create a scoring mechanism to incentivize fitness activities. All of the data gathered may be captured and analyzed in real time and may enable skeletal tracking of human body movement without the use of infrared devices.
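A minimal sketch of such a metrics-and-scoring step, assuming (ax, ay, az) accelerometer samples in m/s². The function name, the gravity-baseline heuristic, and the scoring formula are illustrative assumptions, not values from the patent.

```python
import math

def movement_metrics(samples, dt):
    """Derive simple metrics from (ax, ay, az) accelerometer samples:
    duration, peak and mean acceleration magnitude, and a toy score."""
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    duration = len(samples) * dt
    peak = max(mags)
    mean = sum(mags) / len(mags)
    # Toy scoring: energy above the ~9.8 m/s^2 gravity baseline, capped at 100.
    score = round(min(100.0, max(0.0, mean - 9.8) * 100))
    return {"duration_s": duration, "peak": peak, "mean": mean, "score": score}

m = movement_metrics([(0.0, 9.8, 0.0), (3.0, 9.8, 0.0), (0.0, 9.8, 4.0)], dt=0.02)
print(m["score"])
```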
  • the unified gesture and pattern recognition occurs as a result of data input from pairs or other groupings of wristbands and/or ankle bands, as well as input from the front facing camera of one or more user mobile devices.
  • an infrastructure may support the collection and analysis of motion data from the bands and optional front facing camera, the selection of a gaming, training, educational, or other application, analysis of the motion data, communication of feedback to the user, presentation of gaming, instructional, training, or other audio/visual displays, collection of statistical, progress, scoring, and other data, and other tasks as will be defined in the following descriptions and figures.
  • the infrastructure for certain embodiments may include the user's smart phone, set-top box, gaming system or other similar device, downloadable applications, front facing cameras, cloud network services, and other elements as described herein.
  • the system may provide communication into and out of the motion sensing bands using one or more communication channels such as Near Field Communication (NFC), Bluetooth, WiFi, cellular telephone links, infrared beaming, and so on.
  • the links may be used to collect motion and gesture data, button or touch inputs, speech, environmental metrics, or input from other components that may be a standard or optional part of the motion sensing bands.
  • the communication channels may be used to send data such as personalization to the motion sensing bands, to send signals that activate LEDs, OLED displays, haptic responses, sound transducers, and other feedback devices, or to download new programs into the memory of the motion sensing bands.
  • downloaded programs may be used to correct software defects, add new features, or to provide special modes of operation such as a 'concert mode'.
  • concert organizers may, with the permission of the owner, download special software applications that enhance the concert-going experience of attendees who are wearing the motion sensing bands.
  • Non-limiting examples of 'concert mode' functions may include assisting with ticketing and admission, providing special lighting effects during the performances by triggering motion sensing band LEDs of all attendees in unison, or measuring audience response (for example, by detecting what portion of the audience is clapping their hands over their head), and so on.
  • 'concert mode' functions may be triggered by allowing concert organizers to send messages to attendees' phones using one form of communication and then having the phones relay appropriate messages to the motion sensing bands using a different or the same communication channel.
  • sensor bands may be used in any combination, whether configured in pairs, multiple pairs, singly or in odd numbers of sensing bands, to permit a user or a group of users to become a part of an augmented reality experience. Integrating the data collected from the multiple sensor bands with one or more three dimensional holographic landscapes or scenarios may permit the user or group of users to control the environment within the augmented reality experience with gestures and motions of the extremities to which the bands are attached.
  • the sensor bands can enable gesture and motion driven holographic UX/UI design and implementation of augmented reality experiences, and be used to create gesture driven environments for augmented reality and virtual reality systems. These systems may be optimized for security, command and control, role playing game immersion, and educational systems, among other advances in augmented reality.
  • the motion sensing bands may have their internal batteries recharged by placing the bands in the vicinity of a wireless recharging device, such as one that operates according to the Wireless Power Consortium Qi® standard.
  • the faces of the motion sensing bands fit together in a sort of interlocking 'yin yang' configuration.
  • the motion sensing bands, utilizing their unique interlocking shapes, can be joined together forming an 'Oracle' which can be charged on a docking station.
  • the motion sensing bands may reduce or remove power to some internal subsystems or components when it is determined that they are not currently required for use.
  • the motion sensing bands may be capable of providing audible feedback to the wearer.
  • the haptic function provided in the motion sensing bands may involve the use of multiple microhammers so that pulse sensations of different frequencies can be delivered to the user simultaneously.
  • one or more monochromatic or color-changing LED indicators may be provided on each motion sensing band.
  • the indicators may, for example, provide user sensory stimulus such as pinball-like flashes during gaming or may be activated during concert performances when the motion sensing bands are used in concert mode.
  • the system may utilize an OPTICAL FLOW ALGORITHM which interprets histogramic data using the front facing camera of any mobile device.
  • An earlier version of this system was extremely memory and CPU intensive and crashed on devices constantly.
  • We developed a low latency solution in which there was no lag time between the player image and data processing.
  • This optical flow approach was able to work on all mobile devices without crashing, even low-end devices targeted at kids.
  • the front facing camera is generally used for video conferencing or when a user wishes to take a picture of himself or herself. It is less often used with this type of visual tracking system which, though not as accurate technically as skeletal tracking, offers gamification options which are well suited to mobile devices.
  • This optical flow motion sensor adds extra visual tracking information about the movement on any device. In theory the system can also work with just this mobile device motion tracking, although the dual bands provide enhanced game play.
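The histogramic direction-bias feature that the optical flow approach extracts (compare FIGURE 14B) can be sketched as follows. The binning scheme and the source of the flow vectors are illustrative assumptions; in practice the vectors would come from frame differencing on the front facing camera input.

```python
import math
from collections import Counter

def direction_histogram(flow_vectors, bins=8):
    """Bin 2-D optical-flow vectors (dx, dy) by direction into a
    fixed number of angular bins, yielding a direction-bias histogram."""
    hist = Counter()
    bin_width = 2 * math.pi / bins
    for dx, dy in flow_vectors:
        if dx == 0 and dy == 0:
            continue                       # ignore zero motion
        angle = math.atan2(dy, dx) % (2 * math.pi)
        hist[int(angle / bin_width) % bins] += 1
    return hist

def dominant_direction(flow_vectors, bins=8):
    """Index of the most populated direction bin, or None if no motion."""
    hist = direction_histogram(flow_vectors, bins)
    return hist.most_common(1)[0][0] if hist else None

# Mostly rightward motion falls into bin 0 of an 8-bin histogram.
vectors = [(1.0, 0.1), (0.9, -0.05), (1.2, 0.0), (0.0, 1.0)]
print(dominant_direction(vectors))  # 0
```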
  • FIGURE 1 shows the system design architecture for this device consistent with certain embodiments of the present invention wherein motion capture information can be obtained by gesture recognition bands (300).
  • Additional motion sensor information (301) can be obtained through visual tracking of the user using the front facing camera of the mobile device.
  • An example of such a system is our optical flow motion tracking algorithm, which analyzes histogram data as a low-latency tracker that does not require calibration.
  • Various types of mathematical modeling (302A) and gesture libraries (302B) are incorporated into machine learning used for such functions as hierarchical feature extraction and information processing. These are incorporated into data type 1 (303), where raw data capture (303A) from an accelerometer and gyroscope is combined with pattern recognition (303B).
  • the camera based motion tracker (301) collects data type 2 (304) via visual tracking (304A) and optimization to delete false positives (304B). Data type 1 (303) and Data type 2 (304) are then routed to feedback on bands (306), the mobile app (307) and dashboard analysis (308).
  • FIGURE 2 shows a schematic block diagram of game play integration consistent with certain embodiments of the present invention.
  • motion sensor information is captured via dual motion sensing bands (401) as a source of primary gesture data and a camera based motion sensor (402) in the mobile device or web cam as a secondary gesture data source based on visual tracking. This information from (401) and (402) passes to a core server.
  • the core server is an engine that retrieves data from the bands through an API and communicates with that API.
  • This core server information goes to a core bridge, device to API (403), which serves as a bridge between the code from the device and the API.
  • the core server also connects to Apps & Games (404), which can contain such items as lessons, channels, communities, etc.
  • Apps & Games (404) and the API (407), where features (such as movement recognition, pattern recognition, heart rate, calorie and accuracy tracking) can be added, can feed into the game base (407), or core code, which can contain such components as a game management system, content engine, lesson engine, progress tracking, social engine, training advisors, ads or IAPs.
  • FIGURE 3A illustrates the dual band devices on the player's body consistent with certain embodiments of the present invention.
  • Device A (100) or the Yin Side is located on the left side of the body (on the left wrist or left ankle or left wrist and ankle) and contains information pertaining to storage (101) and logic (102).
  • Device B (200) or the Yang Side is located on the right side of the body (on the right wrist or right ankle or both the right wrist and right ankle) and contains information pertaining to storage (201) and logic (202).
  • Device A (100) and Device B (200) collect information from movement sensed on each side of the sagittal plane of the body, as indicated by the dotted line.
  • FIGURE 3B illustrates dual bands as used for an activity such as fitness training consistent with certain embodiments of the present invention, where the right side device (01) communicates information with the left side device (02).
  • FIGURE 4 is a schematic block diagram showing the use of the bands in game play consistent with certain embodiments of the present invention.
  • the Device A Data (10A) from the left or yin side is combined with the Device B Data (10B) from the right or yang side as a form of primary gesture recognition.
  • Visual tracking (11) (such as optical flow for the front facing camera of the mobile device) provides a source of secondary gesture recognition data.
  • Primary and secondary gesture recognition data is then processed through Raw Data Capture (12), Pattern Recognition (13), Backend Communication (14) and Optimization of False Positives (15).
  • FIGURE 5A is a schematic block diagram illustrating the types of sensor inputs and gesture primitive inputs gathered through raw data algorithm analysis consistent with certain embodiments of the present invention.
  • 6-axis or 9-axis data collection is recommended.
  • a Sensor Input consisting of a 3-axis accelerometer (001A), a 3-axis gyroscope (001B) and possibly a 3-axis magnetometer (001C) sends information in the following manner.
  • the accelerometer (001A) sends info to Subtract Gravity (003) and 3-axis Linear Acceleration
  • All three sensor inputs (001A), (001B) and (001C) can feed into a processor (005) consisting of a dual complementary filter and an accurate real-time estimate of 3-axis gravity and magnetic field via sensor fusion.
  • The processor yields Gesture Primitive Outputs such as Impulse with Direction (002A), Wrist Angle (002B), Forearm Inclination (002C) and Horizontal Angle (002D).
  • FIGURE 5B illustrates primitive angles in raw data accumulation using a device consisting of a Left side (007A) and a Right side (007B) consistent with certain embodiments of the present invention.
  • Information from these two linked devices can give information (008) relating to the x, y and z axes concerning gravity, the location of the forearm, and data captured from the accelerometer, gyroscope and magnetometer.
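The dual complementary filter and the gesture primitives of FIGURES 5A and 5B are named but not specified; a one-axis sketch of how such a filter and a gravity-based forearm inclination could work (hypothetical function names, illustrative blend constant) is:

```python
import math

def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """One-axis complementary filter: integrate the gyro rate for
    short-term responsiveness and blend in the accelerometer tilt
    angle to cancel long-term gyro drift."""
    angle = accel_angles[0]              # initialise from the accelerometer
    filtered = []
    for rate, acc in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc
        filtered.append(angle)
    return filtered

def forearm_inclination(gx, gy, gz):
    """Inclination (degrees) of the band relative to the horizontal
    plane, taken from an estimated gravity vector (gx, gy, gz)."""
    return math.degrees(math.atan2(gz, math.hypot(gx, gy)))
```

The same blend applied to two or three axes, with the magnetometer correcting heading, gives the "accurate real time estimate of 3 axis gravity and magnetic field" role assigned to processor (005).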
  • FIGURE 6A shows graphs for raw vs. processed acceleration signals (in m/s²) along the a) x and b) y axes defining a straight left-to-right line gesture consistent with certain embodiments of the present invention.
  • the jagged line shows raw data capture (with many extraneous lines of noise which must be subtracted for proper analysis) and the smoother line shows the processed acceleration signal.
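The processed trace in FIGURE 6A could come from any low-pass filter; one common minimal choice (a sketch, not necessarily the patent's actual processing) is an exponential moving average over the raw acceleration samples:

```python
def smooth(signal, alpha=0.2):
    """Single-pole low-pass filter (exponential moving average).
    Small alpha suppresses the jagged sensor noise seen in the raw
    trace while preserving the overall gesture shape."""
    out = [signal[0]]
    for x in signal[1:]:
        out.append(alpha * x + (1 - alpha) * out[-1])
    return out
```

Applied sample-by-sample, this runs in constant memory on the band's microcontroller, which is why such filters are a typical first stage before feature extraction.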
  • FIGURE 6B illustrates an Angular Displacement Based Trigger from raw data analysis where the X, Y and Z axes register different angles (indicated by the circles and theta readings) consistent with certain embodiments of the present invention.
  • FIGURE 7A illustrates the system of Basic Data Analysis for motion capture consistent with certain embodiments of the present invention.
  • the Start (01) of the process proceeds with Data Capture (101). From this, the process proceeds with the Extraction of Features of Gesture (102), comparing these with the Stored Features of Gestures (103). If a Match is found (106), the system gives Output Feedback to the Player via the Bands and UX on the Mobile App (105), and the process is complete: Stop (106).
  • FIGURE 7B is a schematic block diagram showing the pattern recognition work flow of this system consistent with certain embodiments of the present invention.
  • Start (100) proceeds to Data Capture (101), Data Formatting (102) and Gesture Segmentation (103).
  • This forms a recognition system (104) consisting of Feature Generation (104A), Feature Selection (104B) and Classification (104C). If Gestures Are Recognized (105), the system yields Recognition Results (106). If not, the data is resent through Gesture Segmentation (103).
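The Feature Generation and Classification stages of FIGURE 7B are left abstract; a toy sketch using hand-picked statistical features and nearest-neighbour matching against stored gesture templates (the feature set and function names are hypothetical) could be:

```python
import math

def features(segment):
    """Feature generation: mean, energy and range of one axis of a
    segmented gesture window."""
    n = len(segment)
    mean = sum(segment) / n
    energy = sum(x * x for x in segment) / n
    rng = max(segment) - min(segment)
    return (mean, energy, rng)

def classify(segment, templates):
    """Nearest-neighbour classification against stored gesture
    feature templates, given as {label: feature_tuple}."""
    f = features(segment)

    def dist(t):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(f, t)))

    return min(templates, key=lambda label: dist(templates[label]))
```

Unrecognized segments (all distances above some threshold) would be routed back through segmentation, matching the feedback loop drawn in the figure.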
  • FIGURE 8A illustrates a 3D model of the dual bands showing the Yin Yang configuration (04) consistent with certain embodiments of the present invention.
  • the device can consist of such features as an OLED screen (01), LED lights (02 and 06), design elements (03), a body with haptic capabilities (05) and a band to attach to the wrist or ankles (07).
  • FIGURE 8B illustrates a top view of the dual bands showing the Yin Yang structural configuration (04) consistent with certain embodiments of the present invention.
  • FIGURE 8C illustrates a rear view of the dual bands showing the Yin Yang structural configuration (04) consistent with certain embodiments of the present invention.
  • FIGURE 9A illustrates the central device components (20A) in 3D perspective being returned to the docking station or oracle consistent with certain embodiments of the present invention.
  • Platform (21) is the charging component, which is placed on a stand (22).
  • FIGURE 9B illustrates the central device components (20B) in 3D perspective being returned to the docking station or oracle consistent with certain embodiments of the present invention. Here they are closer to the docking station and the Yin Yang configuration is more readily apparent.
  • Platform (21) is the charging component, which is placed on a stand (22).
  • FIGURE 9C illustrates the central device components (20A) in top perspective being returned to the docking station or oracle consistent with certain embodiments of the present invention.
  • Platform (21) is the charging component, which is placed on a stand (22).
  • FIGURE 10A illustrates a 3D model of an alternate version of the dual bands showing the Yin Yang configuration (104) consistent with certain embodiments of the present invention.
  • the device can consist of such features as an OLED screen (102), LED lights (101, 102), a body with haptic capabilities (100) and a band (103) to attach to the wrist or ankles.
  • FIGURE 10B illustrates a top view of an alternate version of the dual bands showing the Yin Yang structural configuration (104) with interlocking faces consistent with certain embodiments of the present invention.
  • FIGURE 10C illustrates a rear view of an alternate version of the dual bands showing the Yin Yang structural configuration (104) with interlocking faces consistent with certain embodiments of the present invention.
  • FIGURE 11A illustrates the three planes of motion (transverse, frontal and sagittal) consistent with certain embodiments of the present invention.
  • FIGURE 11B illustrates a motion sensor demonstrating six degrees of freedom consistent with certain embodiments of the present invention.
  • FIGURE 11C illustrates the three planes of motion slicing through one body consistent with certain embodiments of the present invention.
  • FIGURE 12A illustrates an existing gesture recognition technique which may use simple machine learning on raw data consistent with certain embodiments of the present invention: Hidden Markov Models (HMM). HMM may be integrated into the overall design as another method of interpreting gestures.
  • FIGURE 12B this figure illustrates the state transitions of a number of left-to-right Hidden Markov Model sequences associated with the recognition of several gestures.
  • FIGURE 13 is a schematic block diagram showing the processing sequence used in the Hidden Markov Model, which may be implemented as one means of interpreting data consistent with certain embodiments of the present invention.
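As a concrete illustration of the HMM approach of FIGURES 12-13, a log-space Viterbi scorer that picks the best-matching left-to-right gesture model can be sketched as follows (the models in the usage below are toy examples, not the patent's trained parameters):

```python
import math

def _log(p):
    # log(0) -> -inf, so impossible transitions and emissions drop out
    return math.log(p) if p > 0 else float("-inf")

def viterbi_score(obs, start_p, trans_p, emit_p):
    """Log-probability of the best state path for an observation
    sequence under one HMM; states and symbols are integer indices."""
    n = len(start_p)
    v = [_log(start_p[s]) + _log(emit_p[s][obs[0]]) for s in range(n)]
    for o in obs[1:]:
        v = [max(v[p] + _log(trans_p[p][s]) for p in range(n)) + _log(emit_p[s][o])
             for s in range(n)]
    return max(v)

def recognize(obs, models):
    """Return the gesture label whose model scores the sequence best.
    models maps label -> (start_p, trans_p, emit_p)."""
    return max(models, key=lambda g: viterbi_score(obs, *models[g]))
```

A left-to-right topology is enforced simply by zeroing the backward transition probabilities, which matches the state diagrams of FIGURE 12B.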
  • FIGURE 14A shows a basic search pattern for histogram data in optical flow motion analysis consistent with certain embodiments of the present invention.
  • FIGURE 14B illustrates feature extraction for nearest neighbor classification, or a distribution of direction bias during a motion histogram, consistent with certain embodiments of the present invention.
  • FIGURE 14C illustrates the types of movements interpreted by optical flow analysis consistent with certain embodiments of the present invention, specifically turning (801A), waving (801B), jumping (801C), marching (801D), clapping (801E), drumming (801F), and flapping (801G).
  • FIGURE 15A shows a motion trail interpretation of optical flow analysis of clapping hands consistent with certain embodiments of the present invention.
  • FIGURE 15B shows a silhouette of a man performing this clapping hands gesture.
  • FIGURE 16A shows a motion trail interpretation of optical flow analysis of flapping arms consistent with certain embodiments of the present invention.
  • FIGURE 16B shows a silhouette of a man performing this flapping arms gesture.
  • FIGURE 17A illustrates movement in the sagittal plane (about the frontal axis), which includes the motion of flexion/extension along the frontal axis. Examples include walking, squatting and the overhead press.
  • FIGURE 17B illustrates movement in the frontal plane (about the sagittal axis), which includes the motions of abduction, side flexion and inversion/eversion along the sagittal axis. Examples include the star jump, lateral arm raise and side bending.
  • FIGURE 17C illustrates movement in the transverse (horizontal) plane (about the vertical axis), which includes the motions of internal/external rotation, horizontal flexion/extension and supination/pronation along the vertical axis. Examples include throwing, baseball, martial arts and the golf swing.
  • FIGURE 18A illustrates the tai chi move sometimes referred to as "repulsing the monkey". This move involves a complex interchange of pushing and pulling and a coordination of arms and legs in a manner more complex than simple running or walking.
  • FIGURE 18B illustrates the tai chi move sometimes referred to as "cloud hands". This is a complex move which is generally difficult to teach new students and which can be readily gamified using our devices and motion sensor monitoring.
  • FIGURE 18C illustrates the tai chi move sometimes referred to as "leg kick". This involves a synchronization of arm and leg movement and a projection of force outward from the center of the body.
  • FIGURE 19A illustrates a b-boy (breakdancing) move known as toprocking. In the toprocking move, the legs cross the body in time with the beat of the music.
  • FIGURE 19B illustrates a basic arm wave, specifically showing the sequential movement of a wave across the body.
  • FIGURE 19C illustrates a breakdance hand spin. In this move the body is balanced on one hand and rotates around the axis defined by the arm.
  • FIGURE 20 illustrates an alternate view of the system architecture consistent with certain embodiments of the present invention.
  • this figure highlights that the present invention may use other sources of streaming video and may tie into other cloud services.
  • FIGURE 21 illustrates an alternate view of the system architecture consistent with certain embodiments of the present invention.
  • the present invention may use other sources of streaming video and may be managed through an offline server or an alternate server that is connected to a local network, or other managed network.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Business, Economics & Management (AREA)
  • Medical Informatics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Physiology (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Dentistry (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Psychiatry (AREA)
  • User Interface Of Digital Computer (AREA)
  • Social Psychology (AREA)
  • Cardiology (AREA)
  • Artificial Intelligence (AREA)
  • Signal Processing (AREA)

Abstract

The invention relates to a method and system for recognizing complex motion and gesture patterns. In one embodiment of the invention, a pair of wrist (or ankle) bands detects motion (300) that is analyzed to determine user movements in the sagittal, frontal and transverse planes (Fig. 11A). An optional front-facing camera (402) can be used to augment the data stream collected by motion sensing bands worn by the user. Games and learning applications (Fig. 20) used in conjunction with the motion sensing system provide appropriate feedback, prompts and scores to the user through visual, audible and haptic mechanisms.
PCT/US2016/057223 2016-09-26 2016-10-14 Dual motion sensor bands for real time gesture tracking and interactive gaming WO2018057044A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662399611P 2016-09-26 2016-09-26
US62/399,611 2016-09-26

Publications (1)

Publication Number Publication Date
WO2018057044A1 true WO2018057044A1 (fr) 2018-03-29

Family

ID=61689690

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2016/057223 WO2018057044A1 (fr) 2016-09-26 2016-10-14 Dual motion sensor bands for real time gesture tracking and interactive gaming

Country Status (1)

Country Link
WO (1) WO2018057044A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7848564B2 (en) * 2005-03-16 2010-12-07 Lucasfilm Entertainment Company Ltd. Three-dimensional motion capture
CN103135765A (zh) * 2013-02-20 2013-06-05 Lanzhou Jiaotong University Human body motion information capture system based on micromechanical sensors

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108734104A (zh) * 2018-04-20 2018-11-02 Hangzhou Yiwu Technology Co., Ltd. Fitness movement correction method and system based on deep-learning image recognition
US20220212111A1 (en) * 2019-07-05 2022-07-07 Nintendo Co., Ltd. Storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method
US11771994B2 (en) * 2019-07-05 2023-10-03 Nintendo Co., Ltd. Storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method
US11771995B2 (en) 2019-07-05 2023-10-03 Nintendo Co., Ltd. Storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method
US11865454B2 (en) 2019-07-05 2024-01-09 Nintendo Co., Ltd. Storage medium having information processing program stored therein, information processing system, information processing apparatus, and information processing method
WO2023004445A1 (fr) * 2021-07-26 2023-02-02 Istec Innovative Sport Technologies Gmbh Système pour l'attribution automatisée de scores dans les arts martiaux

Similar Documents

Publication Publication Date Title
US10661148B2 (en) Dual motion sensor bands for real time gesture tracking and interactive gaming
CN111095150B (zh) 机器人作为私人教练
CN108369646B (zh) 多传感器事件检测和标记系统
Buttussi et al. Bringing mobile guides and fitness activities together: a solution based on an embodied virtual trainer
CN107533806B (zh) 被配置为实现对包括具有多个可选择的专家知识变化的内容在内的交互技能训练内容的传送的框架、设备和方法
Baca et al. Ubiquitous computing in sports: A review and analysis
US11341776B2 (en) Method, electronic apparatus and recording medium for automatically configuring sensors
CN107211109B (zh) 视频和运动事件集成系统
CN108028902A (zh) 集成传感器和视频运动分析方法
US20170216675A1 (en) Fitness-based game mechanics
US10441847B2 (en) Framework, devices, and methodologies configured to enable gamification via sensor-based monitoring of physically performed skills, including location-specific gamification
CN106233227A (zh) 具有体积感测的游戏装置
CN106448295A (zh) 基于捕捉的远程教学系统及其方法
CN106061568A (zh) 用于模块化玩具的游戏系统
US11682157B2 (en) Motion-based online interactive platform
KR102376816B1 (ko) 표적 이미지 검출을 통한 증강 현실 경험 언록
CN115768532A (zh) 使用激光雷达传感器的增强现实互动运动装置
WO2018057044A1 (fr) Dual motion sensor bands for real time gesture tracking and interactive gaming
Dabnichki Computers in sport
US20150273321A1 (en) Interactive Module
Poussard et al. 3DLive: A multi-modal sensing platform allowing tele-immersive sports applications
CN207237157U (zh) Panoramic interactive experience device based on walking mode
Krukowski et al. User Interfaces and 3D Environment Scanning for Game-Based Training in Mixed-Reality Spaces
US20240135617A1 (en) Online interactive platform with motion detection
Loviscach Playing with all senses: Human–Computer interface devices for games

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16917010

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 20.08.2019)

122 Ep: pct application non-entry in european phase

Ref document number: 16917010

Country of ref document: EP

Kind code of ref document: A1