US20220155851A1 - System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses - Google Patents
- Publication number
- US20220155851A1
- Authority
- US
- United States
- Prior art keywords
- signal
- user
- signals
- midi
- gait
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/0334 — Foot operated pointing devices
- G06F3/0346 — Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G10H1/0066 — Transmission between separate instruments or between individual components of a musical system using a MIDI interface
- G10H1/0083 — Recording/reproducing or transmission of music for electrophonic musical instruments using wireless transmission, e.g. radio, light, infrared
- G10H2220/145 — Multiplayer musical games, e.g. karaoke-like multiplayer videogames
- G10H2220/321 — Garment sensors, i.e. musical control means with trigger surfaces or joint angle sensors, worn as a garment by the player, e.g. bracelet, intelligent clothing
- G10H2220/336 — Control shoe or boot, i.e. sensor-equipped lower part of lower limb, e.g. shoe, toe ring, sock, ankle bracelet or leg control attachment
- G10H2220/395 — Acceleration sensing or accelerometer use, e.g. 3D movement computation by integration of accelerometer data
Definitions
- the present invention generally relates to a system and method for generating wireless signals generated from physical movement sensors and/or similar devices coupled to a person's body.
- the invention more particularly relates to such a system and method which also enables the selective control of the digital responses to the generated signals, including MIDI data (Musical Instrument Digital Interface), sounds, visuals, and/or interactive responses, for example.
- the present invention relates to a system and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses.
- MIDI controllers are the hardware interface for the control of digital musical instruments.
- a musical keyboard is most typically used to “control” sound banks or synthesizers that are wired to the controller with MIDI cables.
- Percussive MIDI controllers, such as Roland Corporation's OCTAPAD®, contain large rubber pads that, when hit, trigger digital samples.
- MIDI controllers may also contain sliders, knobs or buttons that control recording of MIDI music.
- Novation's Launchpad uses buttons to act as switches for recording, or as drum pads for performing music.
- MIDI musical controllers produced and marketed in the past have included breath controllers offered by Yamaha as optional accessories for their line of keyboard synthesizers produced in the 1980s (the DX7, DX11, CS01, and others). These breath controllers allowed the player to use breath pressure to have the synthesizer send corresponding MIDI continuous control messages to modify the sound output. In the wake of Yamaha's controllers, other manufacturers have made and offered breath controllers that are free-standing and allow the user to add breath control of MIDI continuous controller messages to instruments lacking that control as a built-in feature. For example, the TEControl USB MIDI Breath Controller can be used with a wide range of MIDI compatible musical equipment or computer software that accepts MIDI messages.
- U.S. Pat. No. 5,765,300 entitled Shoe Activated Sound Synthesizer Device is directed to a shoe activated sound synthesizer device that enables movement of a shoe to be translated into audible sounds.
- the sound synthesizer device consists of a shoe in which there is disposed at least one trigger element capable of producing a trigger signal when the shoe is flexed to a predetermined degree. As the shoe is worn and is brought into contact with the floor, the shoe is flexed.
- a sound synthesizer circuit is provided that is coupled to each trigger element contained within the shoe.
- the sound synthesizer circuit produces an audible sound, via a speaker, when a trigger signal is received from the shoe.
- Pressure sensors have also been embedded in floors or floor-mounted surfaces and used as arcade or home game controllers: examples include Nintendo's Wii Fit Balance Board and Konami's Dance Dance Revolution.
- the Nike+ FUELBAND® is a device that tracks a user's movement and activity in order to track progress in fitness training.
- the Nike+ sensor may transmit a data packet to a receiver directly attached to a mobile device.
- wireless remote control of electronics hardware through an application on a mobile device or tablet computer is an expanding field.
- One example is the GoPro App, which allows a user full remote control over a GoPro camera's functions and record button, as well as providing a useful preview image of what the camera is photographing if, for example, the camera is attached to the top of a helmet, rendering the viewfinder not visible.
- Movement sensors attached to the person (e.g., on one foot or both feet) communicate with other system components, such as microprocessors, transceivers and/or tactile interface controls, to wirelessly send signal pulses from the person to a computer or mobile device and allow the person wearing the sensors and/or another person to selectively control the dynamics of the digital responses so as to create a unique sensory output.
- a system for creating a sensory output from a user's physical movements comprises one or more sensors configured to be removably attachable to a user's body.
- the one or more sensors may be adapted to detect movement and trigger a signal containing movement data in real-time.
- a transceiver is operably coupled to the one or more sensors to transmit the real-time signal and a receiver is coupled to a computing device with the receiver configured to receive the transmitted real-time signal.
- the computing device converts the movement data to an output signal wherein the output signal manifests as a sensory output comprising one or more of a visual signal, an interactive effect signal, a Musical Instrument Digital Interface (MIDI) signal or an audio signal.
- the system may further comprise a tactile interface unit coupled to the receiver and computing device wherein the tactile interface unit is operable to selectively control and manipulate the output signal.
- the tactile interface unit may be configured to be removably attached to the user's body and to be operable by the user.
- the tactile interface unit, the receiver and the computing device may be housed within a single mobile device configured to be removably attached to the user's body.
- the tactile interface may be remotely located from the user and may be operated by a second party.
- the tactile interface unit is an application running on a mobile device and is configured for wireless remote control of the output signals produced by the computing device.
- the output signals may be characterized by a note's length, a MIDI channel used, a MIDI continuous controller number sent, or any other parameters that can be modified in the digital output of the audio, visual or interactive effect signals.
- the transceiver may be a radio transceiver configured to generate and transmit wireless pre-MIDI signals, wherein the pre-MIDI data signals from the transceiver are converted by the computing device into MIDI notes, MIDI continuous controller messages or similar data protocol.
- the computing device may convert the wireless pre-MIDI signals into actual MIDI data at a higher data rate than the MIDI protocol to thereby reduce latency in the output of the output signal.
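The conversion described above can be illustrated with a short sketch. This is a hypothetical example only: the patent does not disclose the pre-MIDI packet format, so the `sensor_id`/`velocity` fields and the note mapping below are illustrative assumptions; only the 3-byte MIDI Note On/Note Off message layout follows the MIDI 1.0 standard.

```python
# Hypothetical sketch: turn a "pre-MIDI" sensor event into standard MIDI
# channel voice messages. NOTE_MAP is an assumed sensor-to-note assignment.
NOTE_MAP = {0: 36, 1: 38, 2: 42, 3: 46}  # e.g. kick, snare, closed/open hi-hat

def pre_midi_to_note_on(sensor_id: int, velocity: int, channel: int = 0) -> bytes:
    """Build a MIDI Note On message: status 0x90 | channel, note, velocity."""
    note = NOTE_MAP[sensor_id]
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def pre_midi_to_note_off(sensor_id: int, channel: int = 0) -> bytes:
    """Build a MIDI Note Off message: status 0x80 | channel, note, velocity 0."""
    note = NOTE_MAP[sensor_id]
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])
```

Because the computing device performs this conversion itself, the radio link can carry compact event packets at a rate faster than the 31.25 kbaud MIDI wire protocol, which is consistent with the latency-reduction claim above.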
- At least one of the one or more sensors resides in a shoe configured to be worn by the user.
- a shim may also be placed within the shoe wherein the shim is configured to position the at least one sensor at a transverse arch or heel of the user's foot.
- the transceiver is housed within a transceiver box where the transceiver box further includes a microprocessor programmed to include a peak-detection algorithm.
- the microprocessor utilizes the peak-detection algorithm to convert movement data generated by the one or more sensors into discrete digital signals indicating event onsets to be transmitted by the transceiver.
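The patent does not specify the peak-detection algorithm, but a minimal sketch of the idea (a rising-threshold crossing with a refractory window to debounce re-triggers within a single footfall) might look like this; the threshold and refractory values are illustrative assumptions:

```python
def detect_onsets(samples, threshold=512, refractory=5):
    """Return sample indices where the pressure signal rises through
    `threshold`, ignoring re-triggers within `refractory` samples (debounce).
    Each returned index represents one discrete event onset to transmit."""
    onsets = []
    last = -refractory  # allow a trigger at index 0
    prev = 0
    for i, s in enumerate(samples):
        if prev < threshold <= s and i - last >= refractory:
            onsets.append(i)
            last = i
        prev = s
    return onsets
```

Transmitting only these discrete onsets, rather than the raw analog stream, keeps the wireless payload small.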
- one or both of the real-time signal and the output signal are communicated wirelessly over a local area network (LAN), a wide area network (WAN), a Cloud or the internet.
- a system for creating sensory outputs from physical movements of a plurality of users comprises a respective set of one or more sensors configured to be removably attachable to a respective user's body.
- the one or more sensors may be adapted to detect movement and trigger a signal containing movement data in real-time for the respective user.
- a respective transceiver may be operably coupled to each respective set of one or more sensors to transmit the real-time signal for the respective user.
- a receiver may be coupled to a computing device and be configured to receive the transmitted real-time signal from each user.
- the computing device may then convert the movement data from each user to a series of output signals wherein the output signals manifest as sensory outputs comprising one or more of a visual signal, an interactive effect signal, a Musical Instrument Digital Interface (MIDI) signal or an audio signal.
- a tactile interface unit may be coupled to the receiver and computing device where the tactile interface unit is operable to selectively control and manipulate the output signals. Additionally or alternatively, a respective tactile interface unit may be configured to be removably attached to each respective user where each respective tactile interface unit may be in communication with the receiver and computing device wherein each respective tactile interface unit is operable to selectively control and manipulate the output signals.
- FIG. 1 is a schematic view of a system for generating wireless signals from physical movement in accordance with an embodiment of the present invention
- FIG. 1A contains schematic views of alternative tactile interfaces that may be used within a system of the present invention
- FIG. 2 is a schematic view of a system for generating wireless signals from physical movement in accordance with another embodiment of the present invention
- FIG. 3 is a schematic view of a device used to generate wireless signals from physical movement in the embodiments of the system shown in FIGS. 1 and 2 ;
- FIG. 4 is a schematic view of a pressure sensor layout used within the device shown in FIG. 3 ;
- FIG. 5 is a schematic view of a signal transceiver bracket used within the device shown in FIG. 3 ;
- FIG. 6 is a schematic view of a signal transceiver strap used within the device shown in FIG. 3 ;
- FIG. 7 is a schematic view of the components within a signal transceiver used within the system shown in FIGS. 1 and 2 ;
- FIG. 8 is a schematic view of an alternative device used to generate wireless signals from physical movement in the embodiments of the system shown in FIGS. 1 and 2 ;
- FIG. 9 is a schematic view of an alternative pressure sensor layout used within the device shown in FIG. 3 ;
- FIG. 10 is a schematic view of a system for generating wireless signals from physical movement by more than one performer in accordance with another embodiment of the present invention.
- FIG. 11 is a schematic view of a system for generating wireless signals from physical movement by more than one performer wherein each performer may remotely control system outputs in accordance with another embodiment of the present invention
- FIG. 12 is a schematic view of a system for generating wireless signals from physical movement by more than one performer wherein a non-performer may remotely control system outputs in accordance with another embodiment of the present invention
- FIG. 13 is a plot, along with associated visual aid, showing pressure sensor recordings when analyzing a user's gait in accordance with an aspect of the present invention
- FIG. 13A is a side perspective view of an embodiment of an alternative sensor unit in accordance with a further aspect of the present invention.
- FIG. 13B is a bottom view of the alternative sensor unit shown in FIG. 13A ;
- FIG. 14 is the plot shown in FIG. 13 indicating exemplary diagnostic criteria during a gait analysis
- FIG. 15A is an auxiliary plot of an exemplary left heel pre-MIDI output for the plot shown in FIGS. 13 and 14 ;
- FIG. 15B is an auxiliary plot of an exemplary left toe pre-MIDI output for the plot shown in FIGS. 13 and 14 ;
- FIG. 16A is a diagrammatic plot of the quantization and output of MIDI notes for the plot shown in FIG. 15A ;
- FIG. 16B is a diagrammatic plot of the quantization and output of MIDI notes for the plot shown in FIG. 15B ;
- FIG. 17 is a diagrammatic view of a system for adapting auditory biofeedback cues to adjust a user's gait.
- FIG. 18 is a flow chart of an exemplary method for adapting auditory biofeedback cues to adjust a user's gait in accordance with an aspect of the present invention.
- one or more sensors may be placed on or near a foot or both feet.
- a sensor may be triggered and read by a detector, such as an analog-to-digital converter.
- a microprocessor may be attached to the sensor and may transmit a wireless signal pulse to a computer.
- Computer software resident on the computer may then convert the wireless signal pulses into MIDI data (or a similar data protocol) that may be recorded as interoperable data or may be assigned to digital responses such as, but not limited to audible sounds including musical notes and beats; visual feedback in lighting effects or digital graphics; or interactive responses from a video game or digital display.
- a user may record, loop or modify the MIDI data or the dynamics of the digital responses in real-time by using a suitable interface, such as through tactile finger movements upon an interface in conjunction with movement of his or her legs and feet.
- the dynamics of the responses that may be changed in real-time by coordinating finger and feet movements include, but are not limited to, the modification of the precise timing and length of the digital effects produced or the qualities of the visuals or sounds that are being generated by a person's physical movements.
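One dynamic the description above names explicitly is the timing and length of the produced notes. A minimal sketch of how quantization and note length could be adjusted in real time is shown below; the grid and length values are illustrative assumptions, not parameters from the patent (e.g., 125 ms approximates a 16th-note grid at 120 BPM):

```python
def quantize(onset_ms, grid_ms=125, note_len_ms=100):
    """Snap a detected onset timestamp to the nearest grid line and return
    the (start, end) times of the MIDI note to emit. A tactile interface
    could change grid_ms and note_len_ms on the fly to reshape the rhythm."""
    start = round(onset_ms / grid_ms) * grid_ms
    return start, start + note_len_ms
```

A footfall detected at 130 ms would thus sound at the 125 ms grid line, keeping the output rhythmically aligned while remaining perceivably instantaneous.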
- the term real-time refers to digital signals or responses that occur perceivably instantaneously with the underlying physical movement.
- systems designed in accordance with the teachings of the present invention may have applications as a musical and/or scientific instrument used in such activities as dance performance, music production, athletic activities, art projects, entertainment systems, health diagnostics or medical therapies and the like.
- System 100 may include a signal generation component 102 configured to wirelessly communicate movement related sensor data to a signal receiver component 104 before eventual broadcast (i.e. audio and/or visual responses) via one or more output modalities 106 .
- signal generation component 102 may be worn upon a user 108 and include one or more sensor and radio transceiver units 110 located on or proximate to the user's shoe or foot 112 and, optionally, a tactile interface unit 126 .
- Respective sensor and radio transceiver units 110 may be worn on one or both feet.
- Signals 115 generated by the sensors and transmitted by the transceivers may be received by a radio receiver 114 in communication with a computing device 116 , such as a smart phone, laptop, tablet or PC computer.
- Software resident within computing device 116 may then condition the received signals before eventual output 117 to appropriate output devices, such as via VGA output 118 (for visual signals), HDMI output 120 (for interactive effects), MIDI output 122 (for digital notes and beats) and/or digital output 124 (for sounds).
- a tactile interface unit 126 may also be coupled 127 to receiver component 104 wherein tactile interface 126 may be used for control and manipulation of output 117 . As shown in FIG. 1A , non-limiting examples of possible tactile interface units 126 may include a touch-screen device 126 A attached to the wrist, an interface responsive to an app on a mobile device 126 B held in the hand or a MIDI controller keyboard 126 C manipulated by the fingers.
- an alternative embodiment of a system 130 for generating wireless signals generated from physical movement sensors may utilize a single mobile device 126 B (such as a smart phone or tablet computer) that contains a built-in touch-screen tactile interface 126 and receiver component 104 (radio receiver 114 and computing device 116 ) configured for digital output of sound to headphones 134 .
- Mobile device 126 B may wirelessly receive signals 115 generated by sensors and radio transceiver units 110 .
- user 108 may use system 130 as a personal musical instrument capable of producing controllable digital sounds by virtue of coordinating commands of the tactile interface with physical movements of the feet and body.
- tactile interface unit 126 may be strapped to the wrist ( FIG. 1 ) or held in the hand ( FIG. 2 ). By pressing the interface with the fingers, user 108 may wirelessly change the precise timing and/or length of the digital effects and/or the quality of the digital responses produced, including the rhythmic timing and/or length of MIDI notes, for example. Via tactile interface unit 126 , user 108 may be able to change the assignment of a particular MIDI note to a particular sensor and/or a particular physical movement that is activated by a pressure sensor and/or inertial measurement unit.
- MIDI continuous controller data can be modified to create dynamic changes in digital effects, and/or user 108 may launch presets of different combinations of digital effects in order to transpose or arpeggiate musical notes and/or animate visual patterns.
- Tactile interface unit 126 may also display visual feedback confirming the digital effects being produced, such as a preview of graphics that are being projected on a larger screen and/or the user's current score if the system is being used in a multi-user video game environment, for example.
- the conclusive aesthetic result of system 100 / 130 is a series of coordinated digital responses in the form of sounds and/or visuals that are triggered and controlled by a user's physical movements.
- Sensor and radio transceiver unit 110 A may be configured to mount to a user's shoe 112 A with pressure sensors 140 positioned beneath the user's foot 113 and adhered to the innersole 142 of shoe 112 A.
- utilizing foot movement to create wireless signals 115 does not require specialized footwear.
- one or more pressure sensors 140 may be attached to a shim or raised support 144 that is positioned onto a removable innersole 142 of any suitable shoe.
- At least one pressure sensor 140 is positioned between the transverse arch 146 of foot 113 and innersole 142 when inserting foot 113 into shoe 112 A, and more particularly in the area of the transverse arch of the foot located between the ball of the foot 145 and the smallest toe 147 .
- a pressure sensor 140 (an optional shim 144 ) may also be placed near the heel 149 of foot 113 .
- the size and orientation of each shim 144 is selected so as to ensure contact between the innersole and the foot, while also minimizing user awareness of shim 144 and/or pressure sensor 140 and any discomfort that may result therefrom.
- the shape and size of pressure sensors may be modified or increased/decreased in order to selectively define the zone of sensitivity, that is, where on the foot pressure must be applied before sensor 140 initiates a signal 115 .
- a radio transceiver box 148 may be releasably secured to laces 150 of shoe 112 A.
- radio transceiver box 148 may be secured to laces 150 by way of a bracket 152 slid under laces 150 of shoe 112 A.
- Radio transceiver box 148 may then be releasably mounted to bracket 152 via a releasable fastener (not shown), such as a snap, magnets, hook-and-loop material and the like.
- a strap 154 may be wrapped around the body of the shoe with the radio transceiver box 148 releasably attached to the strap.
- radio transceiver box 148 may include a housing 156 containing a printed circuit board 158 having an analog to digital converter circuit 160 configured to receive analog sensor data from pressure sensors 140 (such as via external jack 162 ) and convert such analog sensor data into digital signals for interrogation by microprocessor 164 . Interrogated digital signals may then be wirelessly transmitted via wireless transceiver 166 (which may also include an antenna 168 configured for digital broadcast).
- radio transceiver box 148 may also include an inertial measurement unit 170 configured to sense and output sensor data regarding movement of radio transceiver box 148 .
- inertial measurement unit 170 may include one or more of an accelerometer, gyroscope and a magnetometer. By way of example, signals outputted by inertial measurement unit 170 may trigger a selected output 117 solely through user movement without requiring footfall and activation of pressure sensor 140 .
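The footfall-free triggering described above can be sketched as a simple acceleration-magnitude threshold. This is an illustrative assumption about how such a trigger might work, not the patent's disclosed method, and the 1.8 g threshold is an arbitrary example value:

```python
import math

def imu_trigger(ax, ay, az, threshold_g=1.8):
    """Fire a trigger when the acceleration magnitude (in g) exceeds a
    threshold, so a kick or gesture in the air can select an output 117
    without a footfall activating pressure sensor 140."""
    magnitude = math.sqrt(ax * ax + ay * ay + az * az)
    return magnitude > threshold_g
```

A practical implementation would likely also fuse gyroscope and magnetometer data to distinguish gestures from ordinary movement.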
- Radio transceiver box 148 may also include a battery 172 configured to provide necessary power to box components. It should be understood by those skilled in the art that any suitable battery may be used, including but not limited to non-rechargeable and rechargeable batteries.
- Light emitting diodes (LEDs) 171 may also be included to provide visual indication that radio transceiver box 148 and its various internal components are operating properly, or to display colors synchronized to musical notes or to the user's settings. It should also be noted that by miniaturizing the electronics of the radio transceiver unit 110 A, the scale, weight and power consumption may be reduced.
- a hard soled dance shoe 112 B may be modified to accommodate one or more sensors 140 and a radio transceiver box 148 .
- a battery 172 (and optional battery recharging port 174 in the case of battery 172 being a rechargeable battery) may be embedded into the heel 176 of the shoe. By situating battery 172 within heel 176 , the size and weight of radio transceiver box 148 may be reduced so that radio transceiver box 148 may more comfortably be attached to a strap 178 on dance shoe 112 B.
- one or more pressure sensors 140 may be directly affixed to a user's foot 113 , such as through an adhesive 180 .
- a radio transceiver box 148 may then be releasably secured to the user's body, such as at or near user's ankle 182 .
- user 108 may generate wireless signals 115 without requiring any shoes, but merely through impact of his or her bare foot upon a surface.
- embodiment 110 B may be suitable for use as a scientific instrument for developing physical therapies to improve a person's foot placement and/or gait, including therapies for foot pronation, walking disorders and/or physical movement disabilities, for example.
- By adhering pressure sensors 140 directly to the skin of the foot and strapping radio transceiver box 148 to ankle 182 , a user can walk barefooted while the system would trigger sounds to encourage proper heel-to-toe foot movement and/or provide interactive responses and/or the recording of data, for example, to assist the person in therapy or diagnosis.
- user 108 may elect to wear shoes while pressure sensors 140 are directly affixed to foot 113 .
- FIGS. 10 through 12 show alternative embodiments of the present invention wherein a respective system generates wireless signals from the physical movements of a plurality of individuals.
- individuals may be members of a dance/music ensemble or may be players in a multi-user video game environment.
- a multi-person system 200 has a group of people 208 wherein each member 208 A- 208 D of the group is equipped with a respective sensor and radio transceiver unit 210 A- 210 D. It should be noted that while shown and described as having four members, multi-person system 200 may be used with any size group of users and such alternative group sizes are to be considered within the teachings of the present invention.
- Each respective sensor and radio transceiver unit 210 A- 210 D may be in wireless communication with common signal receiver component 204 , such as but not limited to a turnkey computer 216 having an external or internal radio receiver 214 .
- Signal receiver component 204 may then be operated by a dedicated DJ or technician 217 , such as via a MIDI controller 226 , to produce sound, lighting or video effects, such as via speaker/lighting unit 228 .
- group 208 may perform as a dance or musical ensemble or may interact with a dance simulation gaming program or app (such as country-line dancing) that responds to each member's physical movement without confining the individual members to a camera-based or environmental motion tracking system.
- Multi-person system 300 is similar to multi-person system 200 described above, with the exception that DJ/technician 217 may be omitted. Rather each member 308 A- 308 D of the group includes a respective sensor and radio transceiver unit 310 A- 310 D wirelessly coupled to a respective tactile interface unit 326 A- 326 D which in turn is in wireless communication with a common signal receiver component 304 on a local area network (LAN) that can be controlled remotely by each member 308 A- 308 D via each respective tactile interface unit 326 A- 326 D.
- Signal receiver component 304 may output sound, lighting or video effects signals similar to system 200 or signal receiver component 304 may comprise an external or internal radio receiver 314 coupled to a turnkey computer 316 having built in speakers 328 and video display 329 .
- the group may perform as a dance or music ensemble or players in a multi-user video game.
- FIG. 12 shows a multi-person system 400 similar to system 300 described above wherein individual members 408 A- 408 E of a group includes a respective sensor and radio transceiver unit 410 A- 410 E wirelessly coupled to a respective tactile interface unit 426 A- 426 E which in turn is in wireless communication with the Internet, Cloud or a wide area network (WAN) 427 .
- Each respective sensor and radio transceiver unit 410 A- 410 E and its digital output may be controlled remotely by each member 408 A- 408 E via each respective tactile interface unit 426 A- 426 E.
- Audio outputs may be heard by each member via respective headphones 434 A- 434 F.
- The group may also include a dedicated DJ or technician 408 F, who may also be an active member of the group and have a respective sensor and radio transceiver unit 410 F and a respective tactile interface unit 426 F.
- system 400 may enable members to work collaboratively in real-time even when one or more of the members is remotely located from the other members of the group.
- system 400 may enable concerted group activities in multi-user video games or assisting a team to move together in a drill or routine.
- When a user steps onto a pressure sensor 140 , it is activated.
- When a user spins the body or moves a leg through the air, inertial measurement unit 170 is activated. Both pressure sensor(s) 140 and inertial measurement unit(s) 170 send electrical signals to the analog-to-digital converter circuit 160 . (See FIG. 7 ).
- Microprocessor 164 detects a peak in the pressure wave or acceleration curve and determines a discrete point that is transmitted as a wireless pulse in the signal from the transceiver 166 to the receiver 114 .
- Computer 116 may then convert the wireless signal of pulses into audible sounds including musical notes and beats and/or visual feedback in lighting effects and/or digital graphics and/or interactive responses from a video game or digital display, for example.
- Microprocessor 164 may be programmed to include a peak-detecting algorithm that converts these continuous pressure waves and acceleration curves to discrete pulses.
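The conversion of a continuous pressure wave into discrete pulses can be sketched as follows. This is a minimal illustration of the peak-detecting idea, not the actual firmware of microprocessor 164; the sample data and threshold are invented for the example.

```python
def detect_peaks(samples, threshold):
    """Return indices of local maxima exceeding `threshold` --
    a minimal sketch of reducing a sampled pressure wave or
    acceleration curve to discrete event points."""
    peaks = []
    for i in range(1, len(samples) - 1):
        if (samples[i] > threshold
                and samples[i] > samples[i - 1]
                and samples[i] >= samples[i + 1]):
            peaks.append(i)
    return peaks

# A synthetic footstep: pressure rises, peaks, then falls away.
wave = [0, 2, 5, 9, 7, 3, 1, 0]
print(detect_peaks(wave, threshold=4))  # → [3]
```

Each reported index would then be signaled as a single low-latency wireless pulse rather than streaming the full curve.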
- the user may require a single note to be generated from a single footstep, so any additional pulses are filtered out by a clock function within the software, thereby conveying a single note per footstep. This could be optionally activated in the system settings. In typical physical movements, such as walking, people alternate footsteps between the left and right feet.
- each signal pulse emanating from the combined left and right radio transceivers is numbered sequentially.
- the receiver and computer may also contain software capable of numbering the sequence of pulses, and translating them into MIDI note numbers or pitches, beats or tones in a musical scale.
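The numbering-and-translation step described above can be sketched as below. The scale intervals and base note are assumptions for illustration; the source does not specify a particular mapping.

```python
# Pitch-class offsets of a major scale within one octave (assumed).
C_MAJOR = [0, 2, 4, 5, 7, 9, 11]

def pulse_to_midi(pulse_index, base_note=60):
    """Translate the Nth sequential pulse (from the combined left
    and right transceivers) into a MIDI note number on a scale."""
    octave, degree = divmod(pulse_index, len(C_MAJOR))
    return base_note + 12 * octave + C_MAJOR[degree]

# Eight alternating footsteps walk up one octave of the scale.
print([pulse_to_midi(n) for n in range(8)])
# → [60, 62, 64, 65, 67, 69, 71, 72]
```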
- a wireless signal 115 of pulses is transmitted via wireless transceiver 166 to receiver component 104 .
- a person can modify the timing and length of the digital effects and/or the qualities of the visual and/or sound responses that are being generated by a person's physical movements.
- signal processing flow begins with pressure waves and acceleration curves generated by one or more sensors 140 /inertial measurement units 170 attached to the body of a user 102 that are triggered by physical movement. (See e.g., FIGS. 1 and 7 ).
- a peak-detecting algorithm stored on the microprocessor 164 identifies the peak of the pressure wave or acceleration curve and signals it as a low latency wireless pulse to the radio receiver 114 .
- Computer 116 (or tactile interface unit 126 ) converts the signal of pulses into operable data, for example MIDI notes and beats.
- the computer software numbers the pulses in a sequence determined by the user, then filters out unwanted pulses, quantizes the notes to a musical grid or modifies them based on the timing of the measures of music by sending the notes through an envelope.
- the resulting digital response can be seen or heard in real-time, and for example MIDI data can be recorded by the computer or controlled and manipulated in real-time by a tactile interface unit either worn by the person generating the physical movement (see FIG. 2 ) or by a DJ/Technician (see FIG. 10 ) controlling the dynamics of multiple people wearing sensors and radio transceivers.
- a user or DJ/technician may modulate all of the notes being generated by multiple users to a different musical scale, chord or key at a specific instant in time or produce a unified aesthetic change in the digital output of the system.
- the term “real-time” refers to digital signals or responses that are perceived as instantaneous with the underlying physical movement. That is, the digital signal or response originates at the sensor or inertial measurement unit within about 50 nanoseconds to about 500 milliseconds, and more preferably between about 1 millisecond and about 100 milliseconds, from the time of peak detection by the algorithm to transmission of the digital signal to the receiver. In this manner, an output signal and the resultant audio, visual and/or other effects are perceived by the user and any spectators as occurring substantially simultaneously with the user's movements. In other words, there is no noticeable delay between a user's movements and the resultant effect.
- a method for synchronizing the MIDI signals of the system may be utilized for generating harmonious and rhythmically unified digital responses from a group of persons (see FIGS. 10-12 ) generating pulses from physical movement.
- the synchronization may involve quantizing the input of the MIDI notes to the beats and measures of the music. This involves modifying the precise time notes are played by shifting them to an established temporal grid.
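The shift to an established temporal grid can be sketched as a simple rounding operation. The grid spacing below (125 ms, a sixteenth note at 120 BPM) is an assumed example, not a value from the source.

```python
def quantize_time(timestamp_ms, grid_ms):
    """Shift a note-on time to the nearest line of a temporal grid,
    modifying the precise time the note is played as described."""
    return round(timestamp_ms / grid_ms) * grid_ms

# A footstep landing at 310 ms snaps back to the 250 ms grid line.
print(quantize_time(310, 125))  # → 250
```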
- the use of a pitch envelope can modify incoming notes (generated by the feet) to trigger specific pitches at set beats of music or instants in time.
- the modifications of MIDI signals may either expand or limit the amount of pitches generated by the feet.
- the synchronization of signals generated by multiple users to a single receiver and computer produces the digital response of dancing with a partner, or choreographed group movements. Synchronization of digital responses transmitted through a LAN computer network (see FIG. 11 ) or the internet (see FIG. 12 ) enables remote users to dance together, or send signals from physical movements generated by multiple users across computer networks in real-time.
- a computer may include an internal radio receiving unit or an external receiving unit coupled to a computer, such as via a USB port.
- the signal receiver component 104 may utilize the IEEE 802.15.4 networking protocol for fast point-to-multipoint or peer-to-peer networking.
- Bluetooth 4.0 and later revisions of the protocol may offer a fast data transfer interval for low latency, real-time, wireless signal transmission and reception.
- Utilizing the Bluetooth 4.0 protocol may allow radio transceiver boxes that are removably attachable to a person's body to communicate directly with Bluetooth 4.0 supported devices.
- Tactile interface unit 126 may provide for real-time control of the dynamics of the digital response effects emanating from the computer.
- the tactile interface may be operated by a medical clinician or the user themselves and may be capable of modifying the MIDI signals produced by the radio receiver and computer.
- a user moves his or her feet thereby activating sensors and generating wireless pulses as described above.
- the user's fingers touch tactile interface unit 126 so as to selectively modify the note pitch, length or any other MIDI parameters.
- tactile interface unit 126 may be a touch-screen device attached to the wrist, be contained in an app on a mobile device held in the hands, or be a MIDI controller keyboard manipulated by the fingers.
- Tactile interface unit may be utilized to alter the system settings such as adjusting the sensitivity of the pressure sensors, changing the preset sound or digital effect in real-time or altering the pitch of the note in real-time. Any digital event onset including MIDI events can be triggered as a digital output from the computer and dynamics such as the length and type of sound or visual effect can be controlled and modified by the tactile interface.
- a first system 502 (analogous to system 100 described above and as will be discussed in greater detail below) may be mounted about the foot 504 .
- first system 502 may be mounted directly onto foot 504 or may be mounted within shoe 506 or may be attached to the external surface of shoe 506 , such as via a strap 507 .
- first system 502 is configured to include a heel sensor 508 adapted to detect heel pressure and a toe sensor 510 adapted to detect toe pressure, and may also include one or more optional inertial measurement units (not shown, such as inertial measurement unit 170 as seen in FIG. 7 ).
- first system 502 may be worn on the left foot as described above.
- a second system 514 may be worn on the right foot and include a respective heel sensor 515 and toe sensor 517 .
- whole-foot sensor 502 a may include a plurality of individually triggered sensor elements 504 a that may be arranged as a grid or array 506 a . As a wearer steps, only those sensor elements 504 a which are impacted will produce a corresponding signal. Thus, whole-foot sensor 502 a is configured to generate a series of sensor data for each sensor element 504 a across the whole of array 506 a.
- the wearer, physical therapist, doctor, clinician or other third party may selectively isolate (such as via microprocessor 164 or computing device 116 , described above) specific sensor elements, such as those that define heel region 508 a (denoted by the letter Y in FIG. 13B ) and toe region 510 a (denoted by the letter X in FIG. 13B ). Heel region 508 a and toe region 510 a may then functionally operate analogous to heel sensor 508 and toe sensor 510 , as described above and further discussed below. Thus, for the sake of clarity, the following discussion will refer to heel sensor 508 and toe sensor 510 , although it should be understood that such teachings may equally apply to heel region 508 a and toe region 510 a.
- both heel sensor 508 and toe sensor 510 may be activated as user 500 takes a step such that microprocessor (e.g. microprocessor 164 on radio transceiver unit 110 , FIGS. 1 and 7 ) of system 502 generates a respective pre-MIDI pressure curve 518 , 520 as a function of time for heel and toe sensors 508 , 510 .
- sensors 508 and 510 measure contact pressure (e.g., heel strikes 522 ) and release (e.g., toe off 524 ) of the user's heels and toes while the user steps.
- respective pre-MIDI pressure curves 518 (left heel), 520 (left toe) may offer diagnostic information regarding the gait of user 500 .
- the user's stride time may be measured as the time difference between successive heel strikes 532 a , 532 b.
- the user's swing time (i.e., the length of time the foot is swinging through the air) and stance time (i.e., the length of time the foot is in contact with the floor) may likewise be derived from the heel-strike and toe-off timings.
- This data may also be coupled with inertial measurement unit data, such as that received from inertial measurement unit 170 , so as to enable measurement of the stride length or step distance, along with other step/gait performance characteristics.
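The timing metrics above can be derived from per-foot timestamps as sketched below; the sample timestamps are invented, and rounding is only to keep the floating-point output readable.

```python
def gait_times(heel_strikes, toe_offs):
    """Derive stride, stance and swing times (seconds) for one foot
    from alternating heel-strike and toe-off timestamps:
    stride = heel strike to next heel strike,
    stance = heel strike to toe off,
    swing  = toe off to next heel strike."""
    stride = [round(b - a, 3) for a, b in zip(heel_strikes, heel_strikes[1:])]
    stance = [round(off - hs, 3) for hs, off in zip(heel_strikes, toe_offs)]
    swing = [round(hs - off, 3) for off, hs in zip(toe_offs, heel_strikes[1:])]
    return stride, stance, swing

hs = [0.0, 1.1, 2.2]   # heel strikes of one foot
to = [0.7, 1.8]        # toe offs between them
print(gait_times(hs, to))
# → ([1.1, 1.1], [0.7, 0.7], [0.4, 0.4])
```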
- the integrated pressure sensors 508 / 510 and inertial measurement unit 170 may be able to detect user state; that is, whether the wearer is sitting, standing or moving (e.g., walking, running, hopping etc.).
- user state data may be important during a physical therapy (PT) session, such as for a patient with Parkinson's disease. For instance, a frequent PT exercise involves walking a short distance.
- User state data may indicate step frequency, stride length, whether the patient has stopped to rest or sit down, or whether the patient is experiencing “frozen feet” which is common for patients with Parkinson's disease.
- System 502 also provides instant auditory biofeedback to encourage gait improvements during the PT session.
- a computing device such as microprocessor 164 or computing device 116 ( FIGS. 1 and 7 ), includes a computer processor that is configured to detect the edge and/or peak of the pre-MIDI data.
- the computer processor may execute an event onset algorithm to detect event onset of the MIDI notes from the pre-MIDI data.
- the event onset algorithm may interrogate the pre-MIDI data to detect event onset through one or both of peak detection or edge detection, and following edge (or peak) detection, microprocessor 164 or computing device 116 may then quantize the pre-MIDI data to generate respective MIDI signals which will ultimately comprise MIDI notes or MIDI continuous controller messages.
- foot motion data may be generated via inertial measurement unit 170 .
- the pressure and inertial measurement unit data may be multiplexed so that information of multiple events can be contained within the same signal.
- the event onset algorithm, via microprocessor 164 or computing device 116 processor, may then interrogate each pre-MIDI pressure curve 528 , 530 (and, optionally, the inertial measurement unit data) to detect an event onset whereby the stepping force meets and exceeds a pre-selected force threshold 536 , 538 respectively, at which time the event onset algorithm initiates quantization of respective pressure curves 528 , 530 to define left heel MIDI signal 540 and left toe MIDI signal 542 . Quantization of pressure curves 528 , 530 continues until the applied force to sensors 515 , 517 falls below a respective pre-selected force threshold, which may be the same as or different from the respective thresholds 536 , 538 .
- Respective MIDI signals 540 , 542 define a respective MIDI note or controller message corresponding to the time domain when the corresponding sensor 508 , 510 was actuated with a force greater than the respective pre-selected force thresholds 536 , 538 .
- Each pre-MIDI pressure curve 528 , 530 continues to be interrogated by the event onset algorithm until the force applied to the sensor(s) 508 , 510 again transitions past its pre-selected force threshold 536 , 538 , at which point the next successive MIDI signals on the left foot 544 (left heel) and 546 (left toe) are detected and quantized.
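The threshold-crossing behavior just described can be sketched as below. The separate onset and release thresholds reflect the statement that the release threshold may differ from the onset threshold; the sample curve is invented.

```python
def detect_onsets(pressure, on_threshold, off_threshold=None):
    """Return (start, end) sample-index pairs for each interval in
    which force rises past `on_threshold` and later falls below
    `off_threshold` -- i.e., the span that would be quantized into
    one MIDI signal."""
    if off_threshold is None:
        off_threshold = on_threshold
    events, start = [], None
    for i, p in enumerate(pressure):
        if start is None and p >= on_threshold:
            start = i
        elif start is not None and p < off_threshold:
            events.append((start, i))
            start = None
    return events

# Two footsteps in one sampled curve.
step = [0, 1, 5, 8, 6, 2, 0, 0, 4, 7, 1]
print(detect_onsets(step, on_threshold=4))  # → [(2, 5), (8, 10)]
```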
- FIG. 15B shows similar MIDI signals 548 , 550 following detection and quantization of pre-MIDI pressure curves 554 , 552 of the right heel and toe sensors 515 and 517 , respectively, once the respective stepping force exceeds pre-selected force thresholds 548 , 550 , respectively.
- the sensitivity of system 500 may be selectively adjusted by defining the event onset (peak and/or edge detection) thresholds for each sensor 508 , 510 , 515 , 517 .
- the threshold can be set so that the event onset algorithm accurately triggers sound with the user's perception/feeling of heel and toe touches, thereby providing a “sensory response” as biofeedback.
- computing device 116 may calibrate the MIDI notes as auditory biofeedback cues. Initially, a user sequentially places maximum weight-bearing pressure on each sensor. A gait analytic algorithm then sets an output value of the maximum weight-bearing pressure at 100%.
- MIDI signals 540 , 542 , 544 , 546 , 548 , 550 may then be outputted as respective square waves 560 , 562 , 564 , 566 , 568 , 570 wherein the magnitude of each square wave is calculated as a function of the maximum weight-bearing pressure and displayed as a respective percentage of that pressure.
- MIDI signals 540 , 542 , 544 , 546 , 548 , 550 may then be ultimately realized as respective musical notes or other sounds.
- each respective outputted note or sound may be correlated to the weight percentage of the MIDI signal square wave so as to provide auditory biofeedback that alerts the user, such as if they are favoring the left or right limb while walking.
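The calibration-and-percentage step can be sketched as below. The raw pressure units are arbitrary and the maximum value is an assumed example of the weight-bearing calibration described above.

```python
def calibrate(max_pressure):
    """Return a function mapping a raw sensor reading to a
    percentage of the user's maximum weight-bearing pressure,
    set to 100% during the calibration step."""
    def percent(reading):
        return round(100.0 * reading / max_pressure, 1)
    return percent

left_heel = calibrate(max_pressure=850)  # raw units are assumed
print(left_heel(425))   # → 50.0  (half weight on the left heel)
print(left_heel(850))   # → 100.0 (full weight-bearing pressure)
```

Comparing the left- and right-foot percentages per step is what would alert a user who is favoring one limb.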
- the system can produce biofeedback that matches each step.
- system 500 may efficiently count the alternating left and right steps, 572 , 574 , 576 .
- the accuracy of the step counting may be improved by utilizing the auditory biofeedback to calibrate the system as described above—that is, by stepping and “tuning” the sensitivity of the auditory sensory response (adjusting the pre-determined threshold for each sensor) until it matches the physical movement of foot touches while walking.
- the timing of the auditory biofeedback cues may be modified and arpeggiated so as to fall on a pre-selected musical grid 580 .
- a user 500 couples one or more sensors 508 , 510 and/or 515 , 517 to the bottom of the user's foot/feet or shoe(s) 504 / 506 .
- force measurements are communicated to transceiver unit 110 where the force data may be compiled as a pressure curve showing applied force over time.
- Computing device 116 via its processor and programmed gait analytic algorithm, may then interrogate the pressure curve data to detect event onset, wherein the event data is quantized to a MIDI signal (e.g., MIDI signal 540 , 542 , 544 , 546 , 548 , 550 ).
- the MIDI signal may be further calculated relative to a preset maximum force setting so as to condition the MIDI signal as a square wave having a magnitude indicative of the measured applied force.
- the quantized MIDI signal (or calibrated square wave) may then be modified and arpeggiated onto a pre-selected musical grid 580 .
- computing device 116 may be selectively configured to arpeggiate notes on a sixteenth note grid.
- computing device 116 via the gait analytic algorithm, may “hold back” or postpone the auditory cue slightly so as to properly place the note on the chromatic scale, temporal grid, or in the correct rhythmic timing.
- the auditory biofeedback cues are temporally adjusted so as to avoid discordant noise while, instead, producing an auditorily pleasing pattern.
- the incoming notes can be modified chromatically by transposing them to the nearest note in a specified key.
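The chromatic modification just described can be sketched as snapping each note to the nearest pitch in a key. The key (C major) and the downward tie-break are assumptions for illustration.

```python
C_MAJOR_PITCHES = {0, 2, 4, 5, 7, 9, 11}  # pitch classes of C major

def snap_to_key(note, key_pitches=C_MAJOR_PITCHES):
    """Transpose a MIDI note to the nearest note in the specified
    key; equidistant candidates resolve downward (arbitrary choice)."""
    return min(
        (n for n in range(note - 6, note + 7) if n % 12 in key_pitches),
        key=lambda n: (abs(n - note), n),
    )

print(snap_to_key(61))  # C#4 is equidistant from C and D → 60
print(snap_to_key(66))  # F#4 → 65 (F)
print(snap_to_key(64))  # E4 is already in key → 64
```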
- This arpeggiated pattern may assist the user during therapy as the user is no longer focused on trying to properly step to an externally-dictated and artificial metronome, but can focus on improving gait mechanics.
- the tempo of arpeggiation may be selectively adjusted (faster or slower) so as to encourage a change in gait or step rate, for example, increased or decreased gait velocity.
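One way the tempo adjustment could be driven from measured cadence is sketched below. The target cadence and the 5% adjustment rule are invented for the example, not values from the source.

```python
def adapt_grid(grid_ms, cue_times_s, target_cadence_spm=110):
    """Tighten the arpeggiation grid slightly when the cadence
    measured from biofeedback cue timestamps falls below a target,
    encouraging increased gait velocity."""
    if len(cue_times_s) < 2:
        return grid_ms
    mean_step_s = (cue_times_s[-1] - cue_times_s[0]) / (len(cue_times_s) - 1)
    cadence = 60.0 / mean_step_s  # steps per minute
    if cadence < target_cadence_spm:
        return int(grid_ms * 0.95)  # speed up the grid slightly
    return grid_ms

# Cues 0.75 s apart → 80 steps/min, below target → grid tightens.
print(adapt_grid(125, [0.0, 0.75, 1.5, 2.25]))  # → 118
```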
- an exemplary method 600 in accordance with the present invention may include: 602 ) receiving, at the computing device, a series of respective first signals for each of the first and second sensors from the radio transceiver unit; 604 ) converting, via the computing device, the series of respective first signals into a series of respective second signals; 606 ) quantizing, via the computing device, each respective second signal within the series of respective second signals; 608 ) modifying, via the computing device, the timing of each respective second signal to a pre-selected rhythmic or musical grid so that each respective second signal manifests as a respective real-time Musical Instrument Digital Interface (MIDI) audio biofeedback cue having a note length determined as a function of the pulse length of the digital pulse of the corresponding respective first signal; 610 ) analyzing, via the computing device, the user's gait as a function of the audio biofeedback cues; and 612 ) adapting, via the computing device based upon the gait analysis, the pre-selected temporal or musical grid to adjust entrainment of the user's gait.
- step 614 may include calibrating the sensitivity of the audio biofeedback cue having a note length determined as a function of the pulse length of the digital pulse of the corresponding respective first signal, while at 616 , one or both sensor regions may be calibrated to the applied force setting by having a user exert partial or full weight-bearing pressure on each of the one or both sensors and adjusting the sensitivity of the respective sensor such that the full weight-bearing pressure is set as 100% applied force.
- each respective second signal may then be calculated as a function of the 100% applied force setting whereby the quantized second signal is displayed as a percentage of applied force as a function of the pressure applied to the respective sensor to produce the associated first signal.
Abstract
A method for adapting auditory biofeedback cues to adjust a user's gait includes receiving a series of respective first signals from each sensor of an integrated sensor system and converting the series of respective first signals into a series of respective second signals. Each respective second signal within the series of respective second signals can be quantized as an audio biofeedback cue and modified so the timing of each respective second signal is aligned to a pre-selected temporal or musical grid. The user's gait is then analyzed as a function of the audio biofeedback cues and the pre-selected temporal or musical grid can be adapted to adjust entrainment of the user's gait.
Description
- The present invention generally relates to a system and method for generating wireless signals generated from physical movement sensors and/or similar devices coupled to a person's body. The invention more particularly relates to such a system and method which also enables the selective control of the digital responses to the generated signals, including MIDI data (Musical Instrument Digital Interface), sounds, visuals, and/or interactive responses, for example. Still more particularly, the present invention relates to a system and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses.
- Electronic auditory and/or visual input/output systems and components such as musical controllers for MIDI compatible equipment, electronic tap dancing shoes, and the use of touch-screen interfaces for the remote control of electronics are well known in today's digital world. Musical or MIDI controllers are the hardware interface for the control of digital musical instruments. MIDI (Musical Instrument Digital Interface) is an industry standard data protocol for digital instruments established in the 1980s that remains in use to the present day. A musical keyboard is most typically used to “control” sound banks or synthesizers that are wired to the controller with MIDI cables. Percussive MIDI controllers, such as the Roland Corporation's OCTAPAD®, contain large rubber pads that when hit, trigger digital samples. MIDI controllers may also contain sliders, knobs or buttons that control recording of MIDI music. Novation's Launchpad uses buttons to act as switches for recording, or as drum pads for performing music.
- Alternative MIDI musical controllers produced and marketed in the past have included breath controllers offered by Yamaha as optional accessories for their line of keyboard synthesizers produced in the 1980s (the DX7, DX11, CS01, and others). These breath controllers allowed the use of breath pressure to have the synthesizer send corresponding MIDI continuous control messages to modify the sound output. In the wake of Yamaha's controller, other manufacturers have made and offered breath controllers that are free-standing and allow the user to add breath control of MIDI continuous controller messages to instruments lacking that control as a built-in feature. For example, the TEControl USB MIDI Breath Controller can be used with a wide range of MIDI compatible musical equipment or computer software that accepts MIDI messages.
- Previous inventors have tried to develop electronic tap-dance shoes that use pressure sensors or other means to detect a dancer's activity and then send corresponding MIDI notes, either through cables or wirelessly. For example, U.S. Pat. No. 5,765,300 entitled Shoe Activated Sound Synthesizer Device is directed to a shoe activated sound synthesizer device that enables movement of a shoe to be translated into audible sounds. The sound synthesizer device consists of a shoe in which there is disposed at least one trigger element capable of producing a trigger signal when the shoe is flexed to a predetermined degree. As the shoe is worn and is brought into contact with the floor, the shoe is flexed. By bringing different parts of the shoe into contact with the floor in a controlled manner, a person can selectively control the production of trigger signals from any trigger element contained within the shoe. A sound synthesizer circuit is provided that is coupled to each trigger element contained within the shoe. The sound synthesizer circuit produces an audible sound, via a speaker, when a trigger signal is received from the shoe. Pressure sensors have also been embedded in floors or floor-mounted surfaces and used as arcade or home game controllers: examples include Nintendo's Wii Fit Balance Board and Konami's Dance Dance Revolution. The Nike+ FUELBAND® is a device that tracks a user's movement and activity in order to track progress in fitness training. The Nike+ sensor may transmit a data packet to a receiver directly attached to a mobile device.
- Additionally, wireless remote control of electronics hardware through an application on a mobile device or tablet computer is an expanding field. One example is the GoPro App, which allows a user full remote control over a GoPro camera's functions and record button, as well as providing a useful preview image of what the camera is photographing if, for example, the camera is attached to the top of a helmet, rendering the viewfinder not visible.
- While the above prior art provides examples of signal generation through physical movement, there remains a need for a system and method which allows the manipulation of the response signals in real-time. The present invention addresses this, and other, needs in the art.
- A system and method for generating wireless signals from the physical movement of a person utilizing a movement detection mechanism attached to the person wherein the system allows a person to manipulate the generated wireless signals to selectively control digital responses which may be in the form of sensory-perceivable outputs such as sounds and/or visual effects, for example, through the person's physical movement. Movement sensors attached to the person (e.g., on one foot or both feet) communicate with other system components such as microprocessors, transceivers and/or tactile interface controls, for example, to wirelessly send signal pulses from a person to a computer or mobile device and allow the person wearing the sensors and/or another person to selectively control the dynamics of the digital responses so to create a unique sensory output.
- In accordance with an aspect of the present invention, a system for creating a sensory output from a user's physical movements comprises one or more sensors configured to be removably attachable to a user's body. The one or more sensors may be adapted to detect movement and trigger a signal containing movement data in real-time. A transceiver is operably coupled to the one or more sensors to transmit the real-time signal and a receiver is coupled to a computing device with the receiver configured to receive the transmitted real-time signal. The computing device converts the movement data to an output signal wherein the output signal manifests as a sensory output comprising one or more of a visual signal, an interactive effect signal, a Musical Instrument Digital Interface (MIDI) signal or an audio signal.
- In a further aspect of the present invention, the system may further comprise a tactile interface unit coupled to the receiver and computing device wherein the tactile interface unit is operable to selectively control and manipulate the output signal. The tactile interface unit may be configured to be removably attached to the user's body and to be operable by the user. Moreover, the tactile interface unit, the receiver and the computing device may be housed within a single mobile device configured to be removably attached to the user's body. Alternatively or additionally, the tactile interface may be remotely located from the user and may be operated by a second party.
- In another aspect of the present invention, the tactile interface unit is an application running on a mobile device and is configured for wireless remote control of the output signals produced by the computing device. The output signals may be characterized by a note's length, a MIDI channel used, a MIDI continuous controller number sent, or any other parameters that can be modified in the digital output of the audio, visual or interactive effect signals.
- In still a further aspect of the present invention, the transceiver may be a radio transceiver configured to generate and transmit wireless pre-MIDI signals, wherein the pre-MIDI data signals from the transceiver are converted by the computing device into MIDI notes, MIDI continuous controller messages or similar data protocol. The computing device may convert the wireless pre-MIDI signals into actual MIDI data at a higher data rate than the MIDI protocol to thereby reduce latency in the output of the output signal.
- In an additional aspect of the present invention, at least one of the one or more sensors resides in a shoe configured to be worn by the user. A shim may also be placed within the shoe wherein the shim is configured to position the at least one sensor at a transverse arch or heel of the user's foot.
- In another aspect of the present invention, the transceiver is housed within a transceiver box where the transceiver box further includes a microprocessor programmed to include a peak-detection algorithm. The microprocessor utilizes the peak-detection algorithm to convert movement data generated by the one or more sensors into discrete digital signals indicating event onsets to be transmitted by the transceiver.
- In a further aspect of the present invention, one or both of the real-time signal and the output signal are communicated wirelessly over a local area network (LAN), a wide area network (WAN), a Cloud or the internet.
- In still another aspect of the present invention, a system for creating sensory outputs from physical movements of a plurality of users comprises a respective set of one or more sensors configured to be removably attachable to a respective user's body. The one or more sensors may be adapted to detect movement and trigger a signal containing movement data in real-time for the respective user. A respective transceiver may be operably coupled to each respective set of one or more sensors to transmit the real-time signal for the respective user. A receiver may be coupled to a computing device and be configured to receive the transmitted real-time signal from each user. The computing device may then convert the movement data from each user to a series of output signals wherein the output signals manifest as sensory outputs comprising one or more of a visual signal, an interactive effect signal, a Musical Instrument Digital Interface (MIDI) signal or an audio signal.
- In a further aspect of the present invention, a tactile interface unit may be coupled to the receiver and computing device where the tactile interface unit is operable to selectively control and manipulate the output signals. Additionally or alternatively, a respective tactile interface unit may be configured to be removably attached to each respective user where each respective tactile interface unit may be in communication with the receiver and computing device wherein each respective tactile interface unit is operable to selectively control and manipulate the output signals.
- The above-mentioned and other features and advantages of this invention, and the manner of attaining them, will become apparent and be better understood by reference to the following description of the invention in conjunction with the accompanying drawing, wherein:
-
FIG. 1 is a schematic view of a system for generating wireless signals from physical movement in accordance with an embodiment of the present invention; -
FIG. 1A are schematic views of alternative tactile interfaces that may be used within a system of the present invention; -
FIG. 2 is a schematic view of a system for generating wireless signals from physical movement in accordance with another embodiment of the present invention; -
FIG. 3 is a schematic view of a device used to generate wireless signals from physical movement in the embodiments of the system shown in FIGS. 1 and 2; -
FIG. 4 is a schematic view of a pressure sensor layout used within the device shown in FIG. 3; -
FIG. 5 is a schematic view of a signal transceiver bracket used within the device shown in FIG. 3; -
FIG. 6 is a schematic view of a signal transceiver strap used within the device shown in FIG. 3; -
FIG. 7 is a schematic view of the components within a signal transceiver used within the system shown in FIGS. 1 and 2; -
FIG. 8 is a schematic view of an alternative device used to generate wireless signals from physical movement in the embodiments of the system shown in FIGS. 1 and 2; -
FIG. 9 is a schematic view of an alternative pressure sensor layout used within the device shown in FIG. 3; -
FIG. 10 is a schematic view of a system for generating wireless signals from physical movement by more than one performer in accordance with another embodiment of the present invention; -
FIG. 11 is a schematic view of a system for generating wireless signals from physical movement by more than one performer wherein each performer may remotely control system outputs in accordance with another embodiment of the present invention; -
FIG. 12 is a schematic view of a system for generating wireless signals from physical movement by more than one performer wherein a non-performer may remotely control system outputs in accordance with another embodiment of the present invention; -
FIG. 13 is a plot, along with associated visual aid, showing pressure sensor recordings when analyzing a user's gait in accordance with an aspect of the present invention; -
FIG. 13A is a side perspective view of an embodiment of an alternative sensor unit in accordance with a further aspect of the present invention; -
FIG. 13B is a bottom view of the alternative sensor unit shown in FIG. 13A; -
FIG. 14 is the plot shown in FIG. 13 indicating exemplary diagnostic criteria during a gait analysis; -
FIG. 15A is an auxiliary plot of an exemplary left heel pre-MIDI output for the plot shown in FIGS. 13 and 14; -
FIG. 15B is an auxiliary plot of an exemplary left toe pre-MIDI output for the plot shown in FIGS. 13 and 14; -
FIG. 16A is a diagrammatic plot of the quantization and output of MIDI notes for the plot shown in FIG. 15A; -
FIG. 16B is a diagrammatic plot of the quantization and output of MIDI notes for the plot shown in FIG. 15B; -
FIG. 17 is a diagrammatic view of a system for adapting auditory biofeedback cues to adjust a user's gait; and -
FIG. 18 is a flow chart of an exemplary method for adapting auditory biofeedback cues to adjust a user's gait in accordance with an aspect of the present invention. - Similar reference characters refer to similar parts throughout the several views of the drawings.
- In operation, one or more sensors (such as, but not limited to, pressure sensors, accelerometers and the like) may be placed on or near one foot or both feet. In the act of walking, running, dancing or other movement, a sensor may be triggered and read by a detector, such as an analog-to-digital converter. A microprocessor may be attached to the sensor and may transmit a wireless signal pulse to a computer. Computer software resident on the computer may then convert the wireless signal pulses into MIDI data (or a similar data protocol) that may be recorded as interoperable data or may be assigned to digital responses such as, but not limited to, audible sounds including musical notes and beats; visual feedback in lighting effects or digital graphics; or interactive responses from a video game or digital display.
- A user may record, loop or modify the MIDI data or the dynamics of the digital responses in real-time by using a suitable interface, such as through tactile finger movements upon an interface in conjunction with movement of his or her legs and feet. By way of example, the dynamics of the responses that may be changed in real-time by coordinating finger and foot movements include, but are not limited to, the modification of the precise timing and length of the digital effects produced or the qualities of the visuals or sounds that are being generated by a person's physical movements. As used herein, the term real-time refers to digital signals or responses that occur perceivably instantaneously with the underlying physical movement. As a result, systems designed in accordance with the teachings of the present invention may have applications as a musical and/or scientific instrument used in such activities as dance performance, music production, athletic activities, art projects, entertainment systems, health diagnostics or medical therapies and the like.
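By way of illustration, the conversion of a wireless signal pulse into MIDI data may be sketched as follows. The three-byte Note On/Note Off layout is standard MIDI; the function names and the assignment of a footstep to middle C are illustrative assumptions, not a definitive implementation of the system described herein.

```python
# Illustrative sketch: turning one detected footstep pulse into MIDI
# channel-voice messages. Note assignments are hypothetical examples.

def note_on(channel, note, velocity=100):
    """Build a 3-byte MIDI Note On message (status 0x90 | channel)."""
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    """Build the matching Note Off message (status 0x80 | channel)."""
    return bytes([0x80 | channel, note, 0])

# One footstep pulse might, for example, be assigned to middle C (note 60):
msg = note_on(0, 60)
```

Software on the computing device could emit such a pair for each incoming pulse, with the note length set by the user's interface settings.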
- Turning now to the drawings, with particular reference to
FIG. 1 thereof, a system for generating wireless signals from physical movement sensors is generally indicated by reference numeral 100. System 100 may include a signal generation component 102 configured to wirelessly communicate movement-related sensor data to a signal receiver component 104 before eventual broadcast (i.e., audio and/or visual responses) via one or more output modalities 106. As discussed in greater detail below, signal generation component 102 may be worn upon a user 108 and include one or more sensor and radio transceiver units 110 located on or proximate to the user's shoe or foot 112 and, optionally, a tactile interface unit 126. Respective sensor and radio transceiver units 110 may be worn on one or both feet. Signals 115 generated by the sensors and transmitted by the transceivers may be received by a radio receiver 114 in communication with a computing device 116, such as a smart phone, laptop, tablet or PC computer. Software resident within computing device 116 may then condition the received signals before eventual output 117 to appropriate output devices, such as via VGA output 118 (for visual signals), HDMI output 120 (for interactive effects), MIDI output 122 (for digital notes and beats) and/or digital output 124 (for sounds). A tactile interface unit 126 may also be coupled 127 to receiver component 104, wherein tactile interface 126 may be used for control and manipulation of output 117. As shown in FIG. 1A, non-limiting examples of possible tactile interface units 126 include a touch-screen device 126A attached to the wrist, an interface responsive to an app on a mobile device 126B held in the hand or a MIDI controller keyboard 126C manipulated by the fingers. - As shown generally in
FIG. 2, an alternative embodiment of a system 130 for generating wireless signals from physical movement sensors may utilize a single mobile device 126B (such as a smart phone or tablet computer) that contains a built-in touch-screen tactile interface 126 and receiver component 104 (radio receiver 114 and computing device 116) configured for digital output of sound to headphones 134. Mobile device 126B may wirelessly receive signals 115 generated by sensor and radio transceiver units 110. In this manner, user 108 may use system 130 as a personal musical instrument capable of producing controllable digital sounds by virtue of coordinating commands of the tactile interface with physical movements of the feet and body. - In the embodiments shown in
FIGS. 1 and 2, tactile interface unit 126 may be strapped to the wrist (FIG. 1) or held in the hand (FIG. 2). By pressing the interface with the fingers, user 108 may wirelessly change the precise timing and/or length of the digital effects and/or the quality of the digital responses produced, including the rhythmic timing and/or length of MIDI notes, for example. Via tactile interface unit 126, user 108 may be able to change the assignment of a particular MIDI note to a particular sensor and/or a particular physical movement that is activated by a pressure sensor and/or inertial measurement unit. In this manner, MIDI continuous controller data can be modified to create dynamic changes in digital effects, and/or user 108 may launch presets of different combinations of digital effects in order to transpose or arpeggiate musical notes and/or animate visual patterns. Tactile interface unit 126 may also display visual feedback confirming the digital effects being produced, such as a preview of graphics that are being projected on a larger screen and/or the user's current score if the system is being used in a multi-user video game environment, for example. As a result, the final aesthetic result of system 100/130 is a series of coordinated digital responses in the form of sounds and/or visuals that are triggered and controlled by a user's physical movements. - With reference to
FIG. 3, an embodiment of a sensor and radio transceiver unit 110A is shown. Sensor and radio transceiver unit 110A may be configured to mount to a user's shoe 112A with pressure sensors 140 positioned beneath the user's foot 113 and adhered to the innersole 142 of shoe 112A. With additional reference to FIG. 4, utilizing foot movement to create wireless signals 115 does not require specialized footwear. For instance, one or more pressure sensors 140 may be attached to a shim or raised support 144 that is positioned onto a removable innersole 142 of any suitable shoe. In accordance with an aspect of the present invention, at least one pressure sensor 140 (and optional shim 144) is positioned between the transverse arch 146 of foot 113 and innersole 142 when inserting foot 113 into shoe 112A, and more particularly in the area of the transverse arch of the foot located between the ball of the foot 145 and the smallest toe 147. A pressure sensor 140 (and optional shim 144) may also be placed near the heel 149 of foot 113. The size and orientation of each shim 144 is selected so as to ensure contact between the innersole and the foot, while also minimizing user awareness of shim 144 and/or pressure sensor 140 and any discomfort that may result therefrom. Further, the shape and size of the pressure sensors may be modified or increased/decreased in order to selectively define the zone of sensitivity, that is, where on the foot pressure is required to be sensed by the sensor 140 before initiating a signal 115. - As shown in
FIG. 3, a radio transceiver box 148 may be releasably secured to laces 150 of shoe 112A. For instance, as shown in FIG. 5, radio transceiver box 148 may be secured to laces 150 by way of a bracket 152 slid under laces 150 of shoe 112A. Radio transceiver box 148 may then be releasably mounted to bracket 152 via a releasable fastener (not shown), such as a snap, magnets, hook-and-loop material and the like. In this manner, radio transceiver boxes 148 may be removed when not needed and/or may be shared between people. Alternatively, as shown in FIG. 6, a strap 154 may be wrapped around the body of the shoe with the radio transceiver box 148 releasably attached to the strap. - As generally shown in
FIG. 7, radio transceiver box 148 may include a housing 156 containing a printed circuit board 158 having an analog-to-digital converter circuit 160 configured to receive analog sensor data from pressure sensors 140 (such as via external jack 162) and convert such analog sensor data into digital signals for interrogation by microprocessor 164. Interrogated digital signals may then be wirelessly transmitted via wireless transceiver 166 (which may also include an antenna 168 configured for digital broadcast). In accordance with an aspect of the present invention, radio transceiver box 148 may also include an inertial measurement unit 170 configured to sense and output sensor data regarding movement of radio transceiver box 148. Without limitation thereto, inertial measurement unit 170 may include one or more of an accelerometer, a gyroscope and a magnetometer. By way of example, signals outputted by inertial measurement unit 170 may trigger a selected output 117 solely through user movement without requiring footfall and activation of pressure sensor 140. Radio transceiver box 148 may also include a battery 172 configured to provide necessary power to box components. It should be understood by those skilled in the art that any suitable battery may be used, including but not limited to non-rechargeable and rechargeable batteries. Light emitting diodes (LEDs) 171 may also be included to provide visual indication that radio transceiver box 148 and its various internal components are operating properly or to display colors synchronized to musical notes or to the user's settings. It should also be noted that by miniaturizing the electronics of the radio transceiver unit 110A, its scale, weight and power consumption may be reduced. - Turning now to
FIG. 8, a hard-soled dance shoe 112B may be modified to accommodate one or more sensors 140 and a radio transceiver box 148. A battery 172 (and optional battery recharging port 174 in the case of battery 172 being a rechargeable battery) may be embedded into the heel 176 of the shoe. By situating battery 172 within heel 176, the size and weight of radio transceiver box 148 may be reduced so that radio transceiver box 148 may more comfortably be attached to a strap 178 on dance shoe 112B. - As shown in
FIG. 9, in an alternative embodiment 110B, one or more pressure sensors 140 may be directly affixed to a user's foot 113, such as through an adhesive 180. A radio transceiver box 148 may then be releasably secured to the user's body, such as at or near the user's ankle 182. In this manner, user 108 may generate wireless signals 115 without requiring any shoes, but merely through impact of his or her bare foot upon a surface. While not limited strictly thereto, embodiment 110B may be suitable for use as a scientific instrument for developing physical therapies to improve a person's foot placement and/or gait, including therapies for foot pronation, walking disorders and/or physical movement disabilities, for example. By adhering pressure sensors 140 directly to the skin of the foot and strapping radio transceiver box 148 to ankle 182, a user can walk barefooted while the system triggers sounds to encourage proper heel-to-toe foot movement and/or provides interactive responses and/or records data, for example, to assist the person in therapy or diagnosis. Alternatively, user 108 may elect to wear shoes while pressure sensors 140 are directly affixed to foot 113. -
FIGS. 10 through 12 show alternative embodiments of the present invention wherein a respective system generates wireless signals from the physical movements of a plurality of individuals. By way of example, such individuals may be members of a dance/music ensemble or may be players in a multi-user video game environment. - With reference to
FIG. 10, a multi-person system 200 has a group of people 208 wherein each member 208A-208D of the group is equipped with a respective sensor and radio transceiver unit 210A-210D. It should be noted that while shown and described as having four members, multi-person system 200 may be used with any size group of users, and such alternative group sizes are to be considered within the teachings of the present invention. Each respective sensor and radio transceiver unit 210A-210D may be in wireless communication with a common signal receiver component 204, such as but not limited to a turnkey computer 216 having an external or internal radio receiver 214. Signal receiver component 204 may then be operated by a dedicated DJ or technician 217, such as via a MIDI controller 226, to produce sound, lighting or video effects, such as via speaker/lighting unit 228. In this manner, group 208 may perform as a dance or musical ensemble or may interact with a dance simulation gaming program or app (such as country line dancing) that responds to each member's physical movement without confining the individual members to a camera-based or environmental motion tracking system. -
Multi-person system 300, as shown in FIG. 11, is similar to multi-person system 200 described above, with the exception that DJ/technician 217 may be omitted. Rather, each member 308A-308D of the group includes a respective sensor and radio transceiver unit 310A-310D wirelessly coupled to a respective tactile interface unit 326A-326D, which in turn is in wireless communication with a common signal receiver component 304 on a local area network (LAN) that can be controlled remotely by each member 308A-308D via each respective tactile interface unit 326A-326D. Signal receiver component 304 may output sound, lighting or video effects signals similar to system 200, or signal receiver component 304 may comprise an external or internal radio receiver 314 coupled to a turnkey computer 316 having built-in speakers 328 and video display 329. In this manner, and by way of example, the group may perform as a dance or music ensemble or as players in a multi-user video game. -
FIG. 12 shows a multi-person system 400 similar to system 300 described above wherein individual members 408A-408E of a group include a respective sensor and radio transceiver unit 410A-410E wirelessly coupled to a respective tactile interface unit 426A-426E, which in turn is in wireless communication with the Internet, Cloud or a wide area network (WAN) 427. Each respective sensor and radio transceiver unit 410A-410E and its digital output may be controlled remotely by each member 408A-408E via each respective tactile interface unit 426A-426E. Audio outputs may be heard by each member via respective headphones 434A-434F. Additionally or alternatively, a dedicated DJ or technician 408F (who may also be an active member of the group and have a respective sensor and radio transceiver unit 410F and respective tactile interface unit 426F) may control signal outputs, such as through a MIDI controller 426G similar to that described above with regard to FIG. 10. In this manner, system 400 may enable members to work collaboratively in real-time even when one or more of the members is remotely located from the other members of the group. By way of example, system 400 may enable concerted group activities in multi-user video games or assist a team in moving together in a drill or routine. - In each of the above embodiments, when a user steps onto a
pressure sensor 140, it is activated. When a user spins the body or moves a leg through the air, inertial measurement unit 170 is activated. Both pressure sensor(s) 140 and inertial measurement unit(s) 170 send electrical signals to the analog-to-digital converter circuit 160. (See FIG. 7). Microprocessor 164 detects a peak in the pressure wave or acceleration curve and determines a discrete point that is transmitted as a wireless pulse in the signal from the transceiver 166 to the receiver 114. Computer 116 (or tactile interface unit 126) may then convert the wireless signal of pulses into audible sounds including musical notes and beats and/or visual feedback in lighting effects and/or digital graphics and/or interactive responses from a video game or digital display, for example. Microprocessor 164 may be programmed to include a peak-detecting algorithm that converts these continuous pressure waves and acceleration curves to discrete pulses. However, in operation, the user may require a single note to be generated from a single footstep, so any additional pulses are filtered out by a clock function within the software to thereby convey a response of a single note from a single footstep. This could optionally be activated in the system settings. In typical physical movements, such as walking, people alternate footsteps between the left and right feet. In order to generate a scale of musical notes from these alternating footsteps, each signal pulse emanating from the combined left and right radio transceivers is numbered sequentially. The receiver and computer may also contain software capable of numbering the sequence of pulses and translating them into MIDI note numbers or pitches, beats or tones in a musical scale.
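A minimal sketch of this peak-detection, clock-filtering and note-numbering logic is given below. The threshold, refractory window and C-major scale are illustrative assumptions; the actual firmware parameters and scale assignments are configurable within the system.

```python
# Hypothetical sketch of the peak-detecting algorithm and sequential
# note numbering described above; constants are illustrative assumptions.

C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]  # MIDI note numbers

class StepDetector:
    """Converts a continuous pressure waveform into discrete pulses,
    filtering extra peaks so one footstep yields one note."""

    def __init__(self, threshold=0.6, refractory_ms=250):
        self.threshold = threshold          # normalized pressure level
        self.refractory_ms = refractory_ms  # clock-function filter window
        self._last_pulse_ms = -1e9
        self._above = False

    def feed(self, t_ms, pressure):
        """Return True when a new discrete pulse (footstep) is detected."""
        rising = pressure >= self.threshold and not self._above
        self._above = pressure >= self.threshold
        if rising and (t_ms - self._last_pulse_ms) >= self.refractory_ms:
            self._last_pulse_ms = t_ms
            return True
        return False

def pulses_to_notes(pulse_count_start, n_pulses, scale=C_MAJOR):
    """Number pulses sequentially and map them onto a musical scale."""
    return [scale[(pulse_count_start + i) % len(scale)]
            for i in range(n_pulses)]
```

Here the refractory window plays the role of the clock function, ensuring that any additional pulses within a single footstep are filtered out.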
pressure sensor 140 and/orinertial measurement unit 170, awireless signal 115 of pulses is transmitted viawireless transceiver 166 toreceiver component 104. By coordinating inputs withintactile interface 126 with physical movement of the feet and body, a person can modify the timing and length of the digital effects and/or the qualities of the visual and/or sound responses that are being generated by a person's physical movements. - In accordance with an aspect of the present invention, signal processing flow begins with pressure waves and acceleration curves generated by one or
more sensors 140/inertial measurement units 170 attached to the body of auser 102 that are triggered by physical movement. (See e.g.,FIGS. 1 and 7 ). A peak-detecting algorithm stored on themicroprocessor 164 identifies the peak of the pressure wave or acceleration curve and signals it as a low latency wireless pulse to theradio receiver 114. Computer 116 (or tactile interface unit 126) converts the signal of pulses into operable data, for example MIDI notes and beats. The computer software numbers the pulses in a sequence determined by the user, then filters out unwanted pulses, quantizes the notes to a musical grid or modifies them based on the timing of the measures of music by sending the notes through an envelope. The resulting digital response can be seen or heard in real-time, and for example MIDI data can be recorded by the computer or controlled and manipulated in real-time by a tactile interface unit either worn by the person generating the physical movement (seeFIG. 2 ) or by a DJ/Technician (seeFIG. 10 ) controlling the dynamics of multiple people wearing sensors and radio transceivers. By way of example, a user or DJ/technician may modulate all of the notes being generated by multiple users to a different musical scale, chord or key at a specific instant in time or produce a unified aesthetic change in the digital output of the system. - As used herein, the term “real-time” refers to digital signals or responses that occur perceivably instantaneous with the underlying physical movement. That is, the digital signal or response originates at the sensor or inertial measurement unit within about 500 milliseconds to about 50 nanoseconds, and more preferably between about 1 millisecond and about 100 milliseconds, from the time of peak detection by the algorithm to transmission of the digital signal to the receiver. 
In this manner, an output signal and the resultant audio, visual and/or other effects are perceived by the user and any spectators as occurring substantially simultaneously with the user's movements. In other words, there is no noticeable delay between a user's movements and the resultant effect.
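The quantization of notes to a musical grid mentioned above can be sketched as follows; the tempo and grid resolution are illustrative assumptions rather than fixed parameters of the system.

```python
# Illustrative sketch: snapping note timestamps to a temporal grid.

def quantize_ms(t_ms, bpm=120, subdivisions_per_beat=4):
    """Snap a note timestamp (in milliseconds) to the nearest grid point.

    At 120 BPM with four subdivisions per beat, grid points fall every
    125 ms (sixteenth notes)."""
    beat_ms = 60_000 / bpm
    grid_ms = beat_ms / subdivisions_per_beat
    return round(t_ms / grid_ms) * grid_ms

# A footstep arriving slightly early at 180 ms is shifted onto the grid:
snapped = quantize_ms(180)  # -> 125.0
```

Shifting each incoming note onto such a grid is what allows pulses generated by imperfectly timed footsteps to sound rhythmically unified.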
- A method for synchronizing the MIDI signals of the system may be utilized for generating harmonious and rhythmically unified digital responses from a group of persons (see
FIGS. 10-12) generating pulses from physical movement. The synchronization may involve quantizing the input of the MIDI notes to the beats and measures of the music. This involves modifying the precise times at which notes are played by shifting them to an established temporal grid. The use of a pitch envelope can modify incoming notes (generated by the feet) to trigger specific pitches at set beats of music or instants in time. The modifications of MIDI signals may either expand or limit the number of pitches generated by the feet. - The synchronization of signals generated by multiple users to a single receiver and computer produces the digital response of dancing with a partner, or of choreographed group movements. Synchronization of digital responses transmitted through a LAN computer network (see
FIG. 11) or the internet (see FIG. 12) enables remote users to dance together, or to send signals from physical movements generated by multiple users across computer networks in real-time. - As shown in the above-referenced embodiments, a computer may include an internal radio receiving unit or an external receiving unit coupled to a computer, such as via a USB port. In accordance with an aspect of the present invention, the
signal receiver component 104 may utilize the IEEE 802.15.4 networking protocol for fast point-to-multipoint or peer-to-peer networking. Bluetooth LE (low energy) and/or Bluetooth 4.0 and later revisions of the protocol may offer a fast data transfer interval for low latency, real-time, wireless signal transmission and reception. Utilizing the Bluetooth 4.0 protocol may allow radio transceiver boxes that are removably attachable to a person's body to communicate directly with Bluetooth 4.0 supported devices. -
Tactile interface unit 126 may provide for real-time control of the dynamics of the digital response effects emanating from the computer. The tactile interface may be operated by a medical clinician or the user themselves and may be capable of modifying the MIDI signals produced by the radio receiver and computer. In operation, a user moves his or her feet, thereby activating sensors and generating wireless pulses as described above. Coordinately, the user's fingers touch tactile interface unit 126 so as to selectively modify the note pitch, length or any other MIDI parameters. - As described above,
tactile interface unit 126 may be a touch-screen device attached to the wrist, be contained in an app on a mobile device held in the hands, or be a MIDI controller keyboard manipulated by the fingers. The tactile interface unit may be utilized to alter system settings, such as adjusting the sensitivity of the pressure sensors, changing the preset sound or digital effect in real-time or altering the pitch of a note in real-time. Any digital event onset, including MIDI events, can be triggered as a digital output from the computer, and dynamics such as the length and type of sound or visual effect can be controlled and modified by the tactile interface. - Turning now to
FIGS. 13-18, and with particular reference to FIG. 13, an exemplary gait analysis study using an embodiment of system 100, described above, is shown. As seen in the top portion of FIG. 13, a user 500 has a first system 502 (analogous to system 100 described above and as will be discussed in greater detail below) mounted about the foot 504. By way of example and without limitation thereto, first system 502 may be mounted directly onto foot 504, may be mounted within shoe 506 or may be attached to the external surface of shoe 506, such as via a strap 507. In any event, first system 502 is configured to include a heel sensor 508 adapted to detect heel pressure and a toe sensor 510 adapted to detect toe pressure, and may also include one or more optional inertial measurement units (not shown; such as inertial measurement unit 170 as seen in FIG. 7). With reference to plot 512 of FIG. 13, first system 502 may be worn on the left foot as described above. With additional reference to FIGS. 15A, 16A and 17, a second system 514 may be worn on the right foot and include a respective heel sensor 515 and toe sensor 517. - While shown and described as distinct heel/
toe sensor units 508/510 (left foot) and 515/517 (right foot), it should be understood by those skilled in the art that one or both heel/toe sensor units may be incorporated within a single, whole-foot sensor 502a mounted onto or within shoe 506a (FIGS. 13A and 13B). In accordance with this aspect of the present invention, whole-foot sensor 502a may include a plurality of individually triggered sensor elements 504a that may be arranged as a grid or array 506a. As a wearer steps, only those sensor elements 504a which are impacted will produce a corresponding signal. Thus, whole-foot sensor 502a is configured to generate a series of sensor data for each sensor element 504a across the whole of array 506a. - The wearer, physical therapist, doctor, clinician or other third party may selectively isolate (such as via
microprocessor 164 or computing device 116, described above) specific sensor elements, such as those that define heel region 508a (denoted by the letter Y in FIG. 13B) and toe region 510a (denoted by the letter X in FIG. 13B). Heel region 508a and toe region 510a may then functionally operate analogously to heel sensor 508 and toe sensor 510, as described above and further discussed below. Thus, for the sake of clarity, the following discussion will refer to heel sensor 508 and toe sensor 510, although it should be understood that such teachings apply equally to heel region 508a and toe region 510a. - As shown in
plot 512, both heel sensor 508 and toe sensor 510 may be activated as user 500 takes a step, such that the microprocessor (e.g., microprocessor 164 on radio transceiver unit 110, FIGS. 1 and 7) of system 502 generates a respective pre-MIDI pressure curve for the heel and toe sensors. As shown in FIG. 13 and FIG. 14, respective pre-MIDI pressure curves 518 (left heel) and 520 (left toe) may offer diagnostic information regarding the gait of user 500. - By way of example, the user's stride time may be measured as the time difference between successive heel strikes 532a, 532b, while the user's swing time (i.e., the length of time the foot is swinging through the air) may be measured as the time difference between toe off 534 and
heel strike 532b, and the stance time (i.e., the length of time the foot is in contact with the floor) may be measured as the time difference between heel strike 532a and toe off 534. This data may also be coupled with inertial measurement unit data, such as that received from inertial measurement unit 170, so as to enable measurement of the stride length or step distance, along with other step/gait performance characteristics. - In addition to the above, the
integrated pressure sensors 508/510 and inertial measurement unit 170 may be able to detect user state; that is, whether the wearer is sitting, standing or moving (e.g., walking, running, hopping, etc.). By way of example, user state data may be important during a physical therapy (PT) session, such as for a patient with Parkinson's disease. For instance, a frequent PT exercise involves walking a short distance. User state data may indicate step frequency, stride length, whether the patient has stopped to rest or sit down, or whether the patient is experiencing "frozen feet," which is common for patients with Parkinson's disease. System 502 also provides instant auditory biofeedback to encourage gait improvements during the PT session. - As shown in
FIGS. 15A and 15B, and 16A and 16B, a computing device, such as microprocessor 164 or computing device 116 (FIGS. 1 and 7), includes a computer processor that is configured to detect the edge and/or peak of the pre-MIDI data. In accordance with an aspect of the present invention, the computer processor may execute an event onset algorithm to detect event onset of the MIDI notes from the pre-MIDI data. As will be described in greater detail below, the event onset algorithm may interrogate the pre-MIDI data to detect event onset through one or both of peak detection or edge detection, and following edge (or peak) detection, microprocessor 164 or computing device 116 may then quantize the pre-MIDI data to generate respective MIDI signals which will ultimately comprise MIDI notes or MIDI continuous controller messages. - With specific reference to
FIGS. 15A and 15B, when a user places pressure (a stepping force) upon, for example, heel sensor 508 of left foot system 502, a pre-MIDI pressure curve 528 is generated. Similarly, a stepping force on toe sensor 510 of left foot system 502 generates pre-MIDI pressure curve 530. It should also be noted that foot motion data may be generated via inertial measurement unit 170. The pressure and inertial measurement unit data may be multiplexed so that information on multiple events can be contained within the same signal. The event onset algorithm, via the microprocessor 164 or computing device 116 processor, may then interrogate each pre-MIDI pressure curve 528, 530 (and, optionally, the inertial measurement unit data) to detect an event onset whereby the stepping force meets and exceeds a pre-selected force threshold, whereupon the pressure curve data is quantized to generate left heel MIDI signal 540 and left toe MIDI signal 542. Quantization of pressure curves 528, 530 continues until the applied force to sensors 508, 510 falls below the respective thresholds.
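By way of illustration only, the event onset and quantization just described can be sketched in code. The function name, the sampled representation of the pressure curve, and the threshold value below are hypothetical and not part of the disclosure:

```python
def quantize_to_midi(samples, threshold):
    """Turn a sampled pre-MIDI pressure curve into (note_on, note_off)
    sample-index pairs: a MIDI note starts when the stepping force meets
    or exceeds the pre-selected threshold (event onset) and ends when the
    force falls back below it."""
    events, onset = [], None
    for i, force in enumerate(samples):
        if onset is None and force >= threshold:
            onset = i                      # rising edge: note-on
        elif onset is not None and force < threshold:
            events.append((onset, i))      # falling edge: note-off
            onset = None
    if onset is not None:                  # still pressed at end of capture
        events.append((onset, len(samples)))
    return events

# Two steps on one sensor: the force rises above the threshold of 5 twice.
curve = [0, 2, 7, 9, 8, 3, 1, 0, 6, 9, 2]
print(quantize_to_midi(curve, 5))  # [(2, 5), (8, 10)]
```

Raising or lowering `threshold` per sensor corresponds to the sensitivity adjustment discussed below.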
corresponding sensor pre-selected force thresholds pre-MIDI pressure curve pre-selected force threshold FIG. 15B showssimilar MIDI signals toe sensors pre-selected force thresholds system 500 may be selectively adjusted by defining the event onset (peak and/or edge detection) thresholds for eachsensor - With reference to
FIGS. 16A and 16B, in accordance with a further aspect of the present invention, once pre-MIDI signals have been quantized to MIDI notes or controller messages, computing device 116 may calibrate the MIDI notes as auditory biofeedback cues. Initially, a user sequentially places maximum weight-bearing pressure on each sensor. A gait analytic algorithm then sets an output value of the maximum weight-bearing pressure at 100%. MIDI signals 540, 542, 544, 546, 548, 550 may then be outputted as respective square waves.
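By way of illustration only, the 100% weight-bearing calibration may be sketched as follows. The raw reading, function names, and the linear mapping onto the 0-127 MIDI velocity range are hypothetical assumptions, not the patented implementation:

```python
def make_percent_scale(full_weight_raw):
    """Calibrate so the user's maximum weight-bearing reading maps to 100%."""
    def percent(raw):
        return min(100.0, 100.0 * raw / full_weight_raw)
    return percent

def weight_to_velocity(pct):
    """Map percent weight-bearing force onto MIDI velocity (0-127), so a
    step bearing more weight plays a louder note (one plausible way to
    correlate volume with the square-wave weight percentage)."""
    return round(pct * 127 / 100.0)

scale = make_percent_scale(800.0)        # 800 = raw reading at full weight
print(scale(400.0))                      # 50.0
print(weight_to_velocity(scale(400.0)))  # 64
```

A half-weight step then sounds at roughly half volume, which is what lets the user hear whether one limb is being favored.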
system 500 may efficiently count the alternating left and right steps, 572, 574, 576. The accuracy of the step counting may be improved by utilizing the auditory biofeedback to calibrate the system as described above—that is, by stepping and “tuning” the sensitivity of the auditory sensory response (adjusting the pre-determined threshold for each sensor) until it matches the physical movement of foot touches while walking. - In accordance with an aspect of the present invention and as shown generally in
FIG. 17, the timing of the auditory biofeedback cues may be modified and arpeggiated so as to fall on a pre-selected musical grid 580. As described above and as summarized in FIG. 17, a user 500 couples one or more sensors to the foot. In use, user 500 applies a force to the one or more sensors, and the resulting force data is transmitted to transceiver unit 110, where it may be compiled as a pressure curve showing applied force over time. Computing device 116, via its processor and programmed gait analytic algorithm, may then interrogate the pressure curve data to detect event onset, wherein the event data is quantized to a MIDI signal (e.g., MIDI signal 540) whose timing is aligned with musical grid 580. - In accordance with an aspect of the present invention,
computing device 116 may be selectively configured to arpeggiate notes on a sixteenth-note grid. Thus, should an outputted MIDI note be mistimed with respect to the sixteenth-note timing, computing device 116, via the gait analytic algorithm, may "hold back" or postpone the auditory cue slightly so as to properly place the note on the chromatic scale, temporal grid, or in the correct rhythmic timing. As a result, the auditory biofeedback cues are temporally adjusted so as to avoid discordant noise while, instead, producing an auditorily pleasing pattern. Furthermore, the incoming notes can be modified chromatically by transposing them to the nearest note in a specified key. This arpeggiated pattern may assist the user during therapy, as the user is no longer focused on trying to step properly to an externally-dictated and artificial metronome, but can focus on improving gait mechanics. In a further example, the tempo of arpeggiation may be selectively adjusted (faster or slower) so as to encourage a change in gait or step rate, for example, increased or decreased gait velocity. - Turning now to
FIG. 18, an exemplary method 600 in accordance with the present invention may include: 602) receiving, at the computing device, a series of respective first signals for each of the first and second sensors from the radio transceiver unit; 604) converting, via the computing device, the series of respective first signals into a series of respective second signals; 606) quantizing, via the computing device, each respective second signal within the series of respective second signals; 608) modifying, via the computing device, the timing of each respective second signal to a pre-selected rhythmic or musical grid so that each respective second signal manifests as a respective real-time Musical Instrument Digital Interface (MIDI) audio biofeedback cue having a note length determined as a function of the pulse length of the digital pulse of the corresponding respective first signal; 610) analyzing, via the computing device, the user's gait as a function of the audio biofeedback cues; and 612) adapting, via the computing device based upon the gait analysis, the pre-selected temporal or musical grid to adjust entrainment of the user's gait. - In a further aspect of the present invention, step 614 may include calibrating the sensitivity of the audio biofeedback cue having a note length determined as a function of the pulse length of the digital pulse of the corresponding respective first signal, while at 616, one or both sensor regions may be calibrated to the applied force setting by having a user exert partial or full weight-bearing pressure on each of the one or both sensors and adjusting the sensitivity of the respective sensor such that the full weight-bearing pressure is set as 100% applied force.
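By way of illustration only, the grid alignment of step 608 and the sixteenth-note arpeggiation and nearest-note transposition described above can be sketched as follows. The function names, tempo, and default C-major key are hypothetical assumptions, not part of the disclosure:

```python
import math

def snap_to_grid(t, bpm, subdivisions=4):
    """Postpone ("hold back") an event time t (seconds) to the next line of
    a sixteenth-note grid (4 subdivisions per beat at the given tempo)."""
    step = 60.0 / bpm / subdivisions
    return math.ceil(t / step) * step

def snap_to_key(note, key_notes=(0, 2, 4, 5, 7, 9, 11)):
    """Transpose a MIDI note to the nearest pitch class in the specified
    key (default: C major); ties resolve downward."""
    pc = note % 12
    nearest = min(key_notes, key=lambda k: (abs(k - pc), k))
    return note - pc + nearest

print(snap_to_grid(0.30, bpm=120))  # 0.375 (next sixteenth at 120 BPM)
print(snap_to_key(61))              # 60 (C#4 transposed down to C4)
```

Delaying each cue to the next grid line rather than the nearest one ensures the cue is only ever postponed, never played early, consistent with the "hold back" behavior described above.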
At 618, each respective second signal may then be calculated as a function of the 100% applied force setting, whereby the quantized second signal is displayed as a percentage of applied force corresponding to the pressure applied to the respective sensor to produce the associated first signal.
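The gait analysis of step 610 may draw on the stride, stance and swing times described above with reference to FIG. 14; each reduces to a time difference between heel-strike and toe-off event onsets. A hypothetical sketch, with illustrative names and timestamps in seconds:

```python
def gait_times(heel_strike_1, toe_off, heel_strike_2):
    """Derive per-cycle gait metrics from event-onset timestamps (seconds):
    stride = time between successive heel strikes; stance = heel strike to
    toe off (foot on the ground); swing = toe off to the next heel strike
    (foot in the air)."""
    return {
        "stride": heel_strike_2 - heel_strike_1,
        "stance": toe_off - heel_strike_1,
        "swing": heel_strike_2 - toe_off,
    }

# Example cycle: heel strike at 0.0 s, toe off at 0.65 s, next heel
# strike at 1.1 s -> stride 1.1 s, stance 0.65 s, swing about 0.45 s.
t = gait_times(0.0, 0.65, 1.1)
```

Combined with inertial measurement unit data, the same timestamps could also anchor stride-length estimates, as noted earlier.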
- While the inventive system and method have been shown and described with reference to certain preferred embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as described.
Claims (22)
1. A method for adapting auditory biofeedback cues to adjust a user's gait, wherein the user is equipped with a system for creating a sensory output from the user's physical movements wherein the system includes a first sensor configured to be located on or proximate to a heel of a first foot of the user and a second sensor configured to be located on or proximate to a toe of the first foot of the user, wherein each of the first and second sensors is adapted to detect the user's movement and generate respective movement data in real-time, wherein the respective movement data includes one or both of pressure wave data and acceleration curve data; a radio transceiver unit having a transceiver and a microprocessor, wherein the microprocessor is coupled to each of the first and second sensors and is programmed with an event onset detection algorithm wherein the microprocessor is operable to detect an event onset of a respective pressure peak in the generated respective movement data, convert the respective movement data into a respective discrete digital pulse having a pulse width and transmit each respective digital pulse as a respective first signal; a receiver configured to receive each transmitted respective first signal; and a computing device including a processor programmed with a gait analytic algorithm configured to convert each respective first signal to a respective second signal, the method comprising:
a) receiving, at the microprocessor, a series of respective first signals for each of the first and second sensors;
b) converting, via the microprocessor, the series of respective first signals into a series of respective second signals;
c) quantizing, via the microprocessor, each respective second signal within the series of respective second signals;
d) calibrating, via the processor, the sensitivity of the audio biofeedback cue whereby a note length of the audio biofeedback cue is determined as a function of the pulse width of the digital pulse of the corresponding respective first signal;
e) calibrating, via the processor, the sensitivity of one or both of the first and second sensors such that the gait analytic algorithm sets full weight-bearing pressure as 100% applied force;
f) calculating, via the processor, each respective second signal as a percent weight bearing force as a function of the pressure applied to the respective sensor relative to the 100% applied force;
g) adapting, via the processor, one or both of a volume and a pitch of the audio biofeedback cue as a function of percent weight bearing force;
h) modifying, via the processor, the timing of each respective second signal to a pre-selected temporal or musical grid so that each respective second signal manifests as a respective real-time Musical Instrument Digital Interface (MIDI) audio biofeedback cue;
i) analyzing, via the processor, the user's gait as a function of the audio biofeedback cues;
j) analyzing, via the processor, a patient state; and
k) adapting, via the processor based upon the gait analysis, the pre-selected temporal grid to adjust entrainment of the user's gait to the tempo of the audio biofeedback cues.
2. A method for adapting auditory biofeedback cues to adjust a user's gait, the method comprising:
a) providing a system for creating a sensory output from a user's physical movements wherein the system includes:
i) a first sensor configured to be located on or proximate to a heel of a first foot of the user and a second sensor configured to be located on or proximate to a toe of the first foot of the user, wherein each of the first and second sensors is adapted to detect the user's movement and generate respective movement data in real-time, wherein the respective movement data includes one or both of pressure wave data and acceleration curve data;
ii) a radio transceiver unit having a transceiver and a microprocessor, wherein the microprocessor is coupled to each of the first and second sensors and the transceiver is programmed with an event onset detection algorithm wherein the microprocessor is operable to detect an event onset of a respective pressure peak in the generated respective movement data, convert the respective movement data into a respective discrete digital pulse having a pulse width and transmit each respective digital pulse and event onset as a respective first signal;
iii) a receiver configured to receive each transmitted respective first signal; and
iv) a computing device having a processor and a gait analytic algorithm configured to convert each respective first signal to a respective second signal,
b) receiving, at the radio transceiver unit, a series of respective first signals for each of the first and second sensors;
c) quantizing, at the radio transceiver unit, each respective first signal within the series of respective first signals;
d) converting, at the computing device, the series of respective first signals into a series of respective second signals;
e) modifying, at the computing device, the timing of each respective second signal to a pre-selected temporal grid so that each respective second signal manifests as a respective real-time Musical Instrument Digital Interface (MIDI) audio biofeedback cue; and
f) analyzing, via the gait analytic algorithm, the user's gait as a function of the audio biofeedback cues.
3. The method of claim 2 wherein the radio transceiver unit includes: i) a memory populated with the event onset detection algorithm, and ii) a first processor to process the first signals, second signals or audio biofeedback cues.
4. The method of claim 2 wherein the series of respective first signals is cached within a memory of the computing device, wherein the gait analytic algorithm compares the cached first signals with the pre-selected temporal grid, and wherein a tempo of the pre-selected temporal grid or MIDI audio biofeedback cue is adapted based upon the comparison.
5. The method of claim 2 wherein the event onset detection algorithm utilizes an edge detection or a peak detection protocol.
6. The method of claim 2 wherein the first sensor and the second sensor are respective first and second sensor regions within a whole-foot sensor.
7. The method of claim 2 wherein one or both of the first sensor and second sensor includes a pressure sensor and an inertial measurement unit.
8. The method of claim 2 further comprising the step of analyzing, via the gait analytic algorithm, a patient state.
9. The method of claim 2 further comprising, calibrating sensitivity of the audio biofeedback cue on the computing device whereby a note length of the audio biofeedback cue is determined as a function of the pulse width of the digital pulse of the corresponding respective first signal.
10. The method of claim 9 further comprising, calibrating sensitivity of one or both of the first and second sensors such that the gait analytic algorithm sets full weight-bearing pressure as 100% applied force when displayed on the computing device.
11. The method of claim 10 further comprising, calculating each respective second signal as a percent weight bearing force as a function of the pressure applied to the respective sensor relative to the 100% applied force when displayed on the computing device.
12. The method of claim 11 further comprising, adapting the audio biofeedback cue output from the computing device as a function of percent weight bearing force.
13. The method of claim 12 wherein one or both of a volume and a pitch of the audio biofeedback cue is adapted.
14. The method of claim 2 wherein the audio biofeedback cue is synchronized with one or more other MIDI clock signals on the computing device.
15. The method of claim 2 further comprising, adapting, via the processor based upon the gait analysis, the pre-selected temporal grid to adjust entrainment of the user's gait.
16. The method of claim 2 wherein the transceiver is a radio transceiver configured to generate and transmit the respective first signals as a respective wireless pre-MIDI signal that is converted by the computing device into the respective MIDI signal comprising MIDI notes or MIDI continuous controller messages.
17. The method of claim 16 wherein the processor converts each respective first signal into the respective second signal at a higher data rate than the MIDI protocol to thereby reduce output latency.
18. The method of claim 2 wherein the first and second sensors reside on or within a shoe.
19. The method of claim 2 wherein the microprocessor further includes a clock function configured to isolate and transmit the respective first signals.
20. The method of claim 2 wherein the microprocessor further includes a multiplexer function to integrate the movement data in the transmission of the respective first signals.
21. The method of claim 2 wherein each respective real-time MIDI audio biofeedback cue is generated, transmitted and converted in less than 50 milliseconds.
22. The method of claim 2 wherein the inertial measurement unit comprises one or more of an accelerometer, a gyroscope and a magnetometer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/590,468 US20220155851A1 (en) | 2015-06-03 | 2022-02-01 | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562170505P | 2015-06-03 | 2015-06-03 | |
US15/172,979 US10248188B2 (en) | 2015-06-03 | 2016-06-03 | System and method for generating wireless signals and controlling digital responses from physical movement |
US16/360,116 US10761598B2 (en) | 2015-06-03 | 2019-03-21 | System and method for generating wireless signals and controlling digital responses from physical movement |
US17/007,372 US11237624B2 (en) | 2015-06-03 | 2020-08-31 | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses |
US17/590,468 US20220155851A1 (en) | 2015-06-03 | 2022-02-01 | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/007,372 Continuation US11237624B2 (en) | 2015-06-03 | 2020-08-31 | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses |
Publications (1)
Publication Number | Publication Date |
---|---|
US20220155851A1 (en) | 2022-05-19 |
Family
ID=76439613
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/007,372 Active US11237624B2 (en) | 2015-06-03 | 2020-08-31 | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses |
US17/590,468 Abandoned US20220155851A1 (en) | 2015-06-03 | 2022-02-01 | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/007,372 Active US11237624B2 (en) | 2015-06-03 | 2020-08-31 | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses |
Country Status (1)
Country | Link |
---|---|
US (2) | US11237624B2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3697511A1 (en) * | 2017-10-16 | 2020-08-26 | Lego A/S | Interactive play apparatus |
JP2022552893A (en) * | 2019-10-23 | 2022-12-20 | キューアールエス ミュージック テクノロジーズ、インコーポレイテッド | wireless midi headset |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020167486A1 (en) * | 2001-04-09 | 2002-11-14 | Tan Hong Z. | Sensing chair as an input device for human-computer interaction |
US20110009241A1 (en) * | 2009-04-10 | 2011-01-13 | Sovoz, Inc. | Virtual locomotion controller apparatus and methods |
US20150091790A1 (en) * | 2013-09-30 | 2015-04-02 | Qualcomm Incorporated | Classification of gesture detection systems through use of known and yet to be worn sensors |
US20190011987A1 (en) * | 2010-10-22 | 2019-01-10 | Joshua Michael Young | Methods, Devices, and Methods for Creating Control Signals |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6414731B2 (en) * | 1990-11-30 | 2002-07-02 | Sun Microsystems, Inc. | Low cost virtual reality system |
US20090046056A1 (en) * | 2007-03-14 | 2009-02-19 | Raydon Corporation | Human motion tracking device |
KR101483713B1 (en) * | 2008-06-30 | 2015-01-16 | 삼성전자 주식회사 | Apparatus and Method for capturing a motion of human |
US8717318B2 (en) * | 2011-03-29 | 2014-05-06 | Intel Corporation | Continued virtual links between gestures and user interface elements |
KR101591579B1 (en) * | 2011-03-29 | 2016-02-18 | 퀄컴 인코포레이티드 | Anchoring virtual images to real world surfaces in augmented reality systems |
US20140368434A1 (en) * | 2013-06-13 | 2014-12-18 | Microsoft Corporation | Generation of text by way of a touchless interface |
TWI534601B (en) * | 2014-03-03 | 2016-05-21 | 廣達電腦股份有限公司 | Dockable device and power method thereof |
US9599821B2 (en) * | 2014-08-08 | 2017-03-21 | Greg Van Curen | Virtual reality system allowing immersion in virtual space to consist with actual movement in actual space |
Also Published As
Publication number | Publication date |
---|---|
US20210191508A1 (en) | 2021-06-24 |
US11237624B2 (en) | 2022-02-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10761598B2 (en) | System and method for generating wireless signals and controlling digital responses from physical movement | |
US10895914B2 (en) | Methods, devices, and methods for creating control signals | |
US20220155851A1 (en) | System and method for adapting auditory biofeedback cues and gait analysis using wireless signals and digital responses | |
US10421002B2 (en) | Equipment, system and method for improving exercise efficiency in a cardio-fitness machine | |
US20210046373A1 (en) | Equipment, system and method for improving exercise efficiency in a cardio-fitness machine | |
Paradiso et al. | Design and implementation of expressive footwear | |
US9358425B2 (en) | Motion detection system | |
Repp et al. | Sensorimotor synchronization: a review of recent research (2006–2012) | |
EP1729711B1 (en) | Rehabilitation with music | |
US11690535B2 (en) | Mobile system allowing adaptation of the runner's cadence | |
Godbout | Corrective Sonic Feedback in Speed Skating | |
US20100075806A1 (en) | Biorhythm feedback system and method | |
US20140357960A1 (en) | Methods and Systems for Synchronizing Repetitive Activity with Biological Factors | |
EP3374041A1 (en) | Exercise treadmill | |
WO2014163976A1 (en) | Equipment, system and method for improving exercise efficiency in a cardio-fitness machine | |
CA2673149A1 (en) | Audio feedback for motor control training | |
US20220218943A1 (en) | Systems and methods of reducing stress with music | |
JP2004141275A (en) | Sole pressure distribution - auditory biofeedback system | |
CN112789092B (en) | Biofeedback for modifying gait | |
JP2008076785A (en) | Sound generation control unit | |
Nown | Creating a real-time movement sonification system for hemiparetic upper limb rehabilitation for survivors of stroke | |
Marvin | A Design Framework for Real-Time Auditory Feedback for Posture Training Using Wearable Sensors | |
EP2630557A1 (en) | Methods devices and systems for creating control signals | |
JP2014504372A (en) | Musical brain health promotion system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |