US20090042695A1 - Interactive rehabilitation method and system for movement of upper and lower extremities - Google Patents
- Publication number
- US20090042695A1 (Application No. US 12/189,068)
- Authority
- US
- United States
- Prior art keywords
- movement
- target image
- extremity
- image
- preset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A63B24/0003 — Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
- A63F13/10—
- A63F13/213 — Input arrangements for video game devices comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/42 — Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
- A63F13/45 — Controlling the progress of the video game
- A63F13/5375 — Controlling the output signals based on the game progress using indicators, e.g. displaying an arrow indicating a turn in a driving game
- G16H20/30 — ICT specially adapted for therapies or health-improving plans relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
- A63B2024/0012 — Comparing movements or motion sequences with a registered reference
- A63B2220/806 — Video cameras
- A63B2244/10 — Combat sports
- A63B2244/102 — Boxing
- A63F2300/1087 — Input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
- A63F2300/305 — Output arrangements for displaying additional data, e.g. simulating a Head Up Display, for providing a graphical or textual hint to the player
- A63F2300/6045 — Methods for processing data by generating or executing the game program for mapping control signals received from the input arrangement into game commands
Definitions
- An exemplary embodiment of an interactive rehabilitation system comprises a hand position monitoring module, a target image movement control module, an image feedback module, and a movement evaluation module.
- The hand position monitoring module detects an identification label of an extracted image to provide an operating position of an image of an extremity.
- The target image movement control module determines a movement mode for a target image according to the identification label, displays the target image in a scene, determines whether identification labels corresponding to movement of an extremity of the target image are being continuously obtained, and, if they are, leads the movement of the extremity based on operational guidance.
- The image feedback module provides a feedback operation according to the movement of the extremity, preset movement paths and velocities, and targeted positions of the target image.
- The movement evaluation module grades the movement of the extremity when the target image has been moved to the preset targeted positions.
- FIG. 1 is a schematic view of a computer system of the present invention
- FIG. 2 is a schematic view of an interactive rehabilitation system 110 shown in FIG. 1 of the present invention
- FIG. 3 is a flowchart of an interactive rehabilitation method of the present invention
- FIG. 4 illustrates human extremities
- FIG. 5 illustrates a behavioral range of the operator detected by an image extraction device
- FIG. 6 illustrates grabbing a sphere in a game scene
- FIG. 7 illustrates feedback states in response to operator movements in the game scene
- FIGS. 8-11 illustrate Chinese shadow boxing motions.
- FIGS. 1 through 3 generally relate to interactive rehabilitation for movement of upper and lower extremities.
- the following disclosure provides various different embodiments as examples for implementing different features of the invention. Specific examples of components and arrangements are described in the following to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting.
- the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various described embodiments and/or configurations.
- the invention discloses an interactive rehabilitation method and system for mobility of upper and lower extremities, assisting extremity rehabilitation and body training (Chinese shadow boxing, for example) for apoplexy patients via virtual computer images of interactive games.
- The invention provides an interactive game that older persons can play indoors, providing brain stimulation and entertainment that facilitate their independence. Additionally, the video system of the game enables older persons to play and interact with their children or other players, which assists social interaction, thus slowing the aging process in mind and soul.
- An embodiment of an interactive rehabilitation method and system for extremities can serve as training equipment for interactive extremity rehabilitation, immediately leading operators to perform extremity rehabilitation or training exercise via the game.
- FIG. 1 is a schematic view of a computer system of the present invention.
- An embodiment of an interactive extremity rehabilitation system 110 is implemented in a computer device 130 .
- The computer device 130 is connected, by wire or wirelessly, to an image extraction device (a Webcam, for example) 150 .
- the image extraction device 150 can be internally installed in the computer device 130 .
- the interactive extremity rehabilitation system 110 extracts real-time images of a person via the image extraction device 150 and transmits the extracted images to the computer device 130 to be displayed in a user interface (not shown) provided by the interactive extremity rehabilitation system 110 .
- The image extraction device 150 comprises an image identification system program for analyzing the image scope of a reaction area, retrieving movements of the extremities from start to end, performing real-time operations on the dynamic images, and returning feedback for flexibility training of the extremities.
- FIG. 2 is a schematic view of an interactive extremity rehabilitation system 110 shown in FIG. 1 of the present invention.
- FIG. 3 is a flowchart of an interactive rehabilitation method of the present invention.
- the exemplary embodiment of an interactive rehabilitation system 110 comprises a hand position monitoring module 210 , a target image movement control module 230 , an image feedback module 250 , and a movement evaluation module 270 .
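Under the module names the patent gives, the four components and one pass of data through them might be wired together as in the following illustrative skeleton; all class and method names, and the example positions, are our assumptions, not the patent's:

```python
# Illustrative skeleton of the four modules; every name and value here is
# an assumption for exposition, not taken from the patent.

class HandPositionMonitor:
    """Detects the identification label in an extracted frame (step S31)."""
    def detect_label(self, frame):
        # Returns the (x, y) operating position of the extremity image,
        # or None when no label is visible in the frame.
        return frame.get("label_position")

class TargetImageController:
    """Selects a movement mode and a target to display (steps S32-S33)."""
    def select_mode(self, label_position):
        return {"mode": "sphere_grab", "target": (100, 80)}

class ImageFeedback:
    """Chooses a feedback operation from movement vs. preset data (step S36)."""
    def feedback(self, extremity_pos, target_pos):
        return "overlap" if extremity_pos == target_pos else "guide"

class MovementEvaluator:
    """Grades the movement once all targets are placed (step S38)."""
    def grade(self, similarity):
        return round(similarity * 100)

monitor, controller = HandPositionMonitor(), TargetImageController()
feedback, evaluator = ImageFeedback(), MovementEvaluator()

pos = monitor.detect_label({"label_position": (100, 80)})
mode = controller.select_mode(pos)
state = feedback.feedback(pos, mode["target"])
score = evaluator.grade(0.87)
```

The point of the skeleton is only the data flow: label position in, movement mode out, feedback per frame, grade at the end.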
- a process for the exemplary embodiment of the interactive rehabilitation system 110 is first described in the following.
- a color mark or recognizable mark (defined as an identification label in this embodiment) for a position is first placed on a detected portion of an operator for extraction by the image extraction device 150 before the rehabilitation process starts.
- the interactive rehabilitation system 110 detects extremity movements (hand movements, for example) of the operator using the image extraction device 150 .
- the hand position monitoring module 210 detects the identification label of an extracted image of the operator extracted by the image extraction device 150 to provide a corresponding position (defined as an extremity position) for extremities in a game scene (step S 31 ).
- the target image movement control module 230 retrieves the identification label corresponding to the movement of the extremities from the hand position monitoring module 210 to determine movement modes and appearance sequences of a target image (step S 32 ).
- The system predefines the required target images and their classifications (Chinese shadow boxing motions or sphere grabbing actions, for example, which are not limiting).
- Each target image and classification comprises plural movements, and movement paths, velocities, and targeted positions are preset for the movements of each target image.
- the preset data is stored in a database (not shown). Extremity movements of the operator correspond to movements of the target image.
- When a movement of an extremity corresponding to the identification label is retrieved, the target image movement control module 230 immediately selects a movement mode and an appearance sequence of the target image corresponding to the movement of the extremity and displays the target image (step S 33 ), a Chinese shadow boxing motion or a sphere grabbing action, for example.
- The target image movement control module 230 determines whether identification labels corresponding to movement of an extremity of the target image are being continuously retrieved from the hand position monitoring module 210 (step S 34 ), i.e. it determines whether the operator performs the Chinese shadow boxing motion or sphere grabbing action. If the identification labels are not continuously retrieved, which indicates that the operator did not completely perform the movement, does not know how to perform the movement, or has forgotten how to perform the movement, the operator is reminded how to perform the movement by arrow guidance or other eye-catching suggestions. If the identification labels are being continuously retrieved, the extremities (the hands, for example) of the operator are led based on the preset movement paths and velocities and the targeted positions via operational guidance (step S 35 ). The operator, for example, is led to grab a target image in a game scene and place the target image at a correct target position, or to perform a Chinese shadow boxing motion.
- the image feedback module 250 provides a feedback operation for the operator according to the movement of the extremity, the preset movement paths and velocities and the targeted positions (step S 36 ). Shapes, emotional expressions and sounds of the target image, for example, are changed or an error message (image) or sound effect is shown.
- When the hand image (i.e. the movement of the extremity) interacts with the target image, the feedback operation is provided. The feedback operation comprises image pattern variation and combinations of sound and power outputs or velocity variation, enabling the operator to experience interactions with the target image.
- the target image movement control module 230 determines whether all of the target images have been moved to the preset targeted positions (step S 37 ). That is, when a game for grabbing spheres is performed, whether each sphere is placed at its individual position, or, when a game for Chinese shadow boxing motions is performed, whether all Chinese shadow boxing motions are completed, is determined. If a target image has not been moved to the preset targeted positions, the process proceeds to step S 33 to repeat the described operations and enable all of the target images to be moved to their preset targeted positions. When all of the target images have been moved to their preset targeted positions, the movement evaluation module 270 grades the movement of the extremities of the operator according to similarity between the movement of the extremities and the target images (step S 38 ), and then the process terminates.
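The loop of steps S33 through S38 described above can be sketched as follows; the data shapes, the guidance counter, and the grading rule are our simplifications, not the patent's:

```python
# Sketch of the S33-S38 control loop. A target is "placed" when the tracked
# label position reaches its preset targeted position; a lost label (None)
# triggers the arrow-guidance reminder of step S35. All details here are
# illustrative assumptions.

def run_session(targets, label_stream):
    """targets: list of (name, goal_position) pairs;
    label_stream: per-frame label positions, None when the label is lost."""
    placed, guided = [], 0
    stream = iter(label_stream)
    for name, goal in targets:            # step S33: display the next target
        for label in stream:
            if label is None:             # step S34 fails: remind the operator
                guided += 1               # e.g. arrow or eye-catching hint
                continue
            if label == goal:             # step S37: target at preset position
                placed.append(name)
                break
    # step S38: grade once every target image is at its targeted position
    score = round(100 * len(placed) / len(targets))
    return placed, guided, score
```

With two targets and a frame stream that loses the label once, the session places both targets and records one guidance reminder before grading.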
- the interactive extremity rehabilitation system 110 enables patients requiring rehabilitation for hand extremities to implement movement training via game interactions. Additionally, the system can provide competition for more than one user at the same game platform via video conference, achieving enjoyable rehabilitation and required training results.
- The hand position monitoring module 210 performs skin color recognition (based on the mark placed on the extremities) using computer vision simulation and tracks dynamic object behaviors according to the recognition results. Further, the hand position monitoring module 210 extracts images from the real-time images retrieved from the image extraction device 150 according to preset skin color definitions, determines which pixels of the extracted image belong to an area matching the preset skin color definitions, marks the center of that area, subtracts the position of the center from that of the center of the actual screen, and transmits a control signal carrying the resulting distance vector to the target image movement control module 230 for tracking.
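The center-marking and center-subtraction step described above might look like the following NumPy sketch; the RGB thresholds standing in for the "preset skin color definitions" are illustrative only:

```python
import numpy as np

def skin_mask(frame_rgb, r_min=95, g_max=200, b_max=180):
    """Rough stand-in for the preset skin color definitions, applied to an
    RGB frame of shape (H, W, 3); the thresholds are illustrative."""
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    return (r >= r_min) & (g <= g_max) & (b <= b_max) & (r > b)

def distance_vector(frame_rgb):
    """Mark the center of the skin-colored area and subtract it from the
    screen center, giving the (dy, dx) control signal sent to module 230."""
    ys, xs = np.nonzero(skin_mask(frame_rgb))
    if ys.size == 0:
        return None                          # no matching area in this frame
    center = np.array([ys.mean(), xs.mean()])
    screen_center = np.array(frame_rgb.shape[:2]) / 2.0
    return center - screen_center
```

A hand patch in the top-left corner of the frame thus yields a negative vector toward that corner, which the controller can use for tracking.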
- a computer/computer game system provides target images for different types of games and movements, movement paths and velocities, targeted positions, and parameters are preset to each target image.
- the movement paths and velocities, targeted positions, and parameters are defined according to medical treatment requirements.
- the target image movement control module 230 leads, controls, and corrects hand movement of the operator to grab and place the target image to a correct target position, correcting and rehabilitating hand function of patients.
- An operational scope for the gesture operating area is first locked, and the available skin color is separately highlighted using skin color detection.
- Since a dragged gesture represents a dynamic process, the gesture operating area provides dynamic signals for each frame, and an available dynamic signal of the gesture operating area is extracted using a frame differential detection method.
- A logical operation (AND, for example) is applied to the skin color area and the dynamic signals of the dragged gesture to generate a skin color differential area (i.e. the area in which the gestures of a frame are performed).
- The skin color differential area is mapped to coordinate positions in the game space, collisions between the skin color differential area and the movement area of the target image are detected, and the collision signals serve as the determination for selecting a game object.
- The average center coordinate data of the skin color differential area, based on the dragged gesture, is mapped to coordinate positions in the game space to generate target coordinates used to lead the target image to move.
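The frame-differential pipeline above (motion signal, AND with the skin color area, collision test, and mapping the average center into game space) can be sketched as follows; the motion threshold and the game-space size are assumptions:

```python
import numpy as np

def skin_color_differential(prev_gray, curr_gray, skin, motion_thresh=20):
    """AND the per-pixel frame-differential motion signal with the skin
    mask, yielding the skin color differential area of the dragged gesture."""
    motion = np.abs(curr_gray.astype(int) - prev_gray.astype(int)) > motion_thresh
    return motion & skin

def collides(diff_area, target_mask):
    """Collision signal between the differential area and the target image's
    movement area, used for selecting a game object."""
    return bool(np.any(diff_area & target_mask))

def to_game_coords(diff_area, frame_shape, game_shape=(480, 640)):
    """Map the average center of the differential area into game space,
    producing the target coordinates that lead the target image to move."""
    ys, xs = np.nonzero(diff_area)
    if ys.size == 0:
        return None
    return (ys.mean() * game_shape[0] / frame_shape[0],
            xs.mean() * game_shape[1] / frame_shape[1])
```

Note how the AND suppresses moving non-skin pixels: a pixel must both change between frames and fall inside the skin color area to count as gesture.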
- When the hand image of the operator overlaps the target image (i.e. the grabbing movement), or a velocity or locus difference between the hand movement and the target image movement is generated, the image feedback module 250 provides a feedback operation with image pattern variation and combinations of sound and power outputs or velocity variation for the operator based on preset parameters.
- The image feedback module 250 leads, controls, and corrects the hands of the operator to grab and place the game object at a correct target position according to preset values.
- Movement paths and velocities of a target image are created, parameters of targeted positions of the target image are defined, and the defined data is stored in a database (not shown). Additionally, it is determined whether the movement values generated by operational behavior correspond to system-defined standard parameters using artificial intelligence (AI).
- When the hand image of the operator overlaps the target image (i.e. the grabbing movement), or the movement velocities or loci of the hand image and the target image in the game scene differ from the system-predefined values (i.e. the predefined parameters), a feedback operation with image pattern variation and combinations of sound and power outputs or velocity variation is provided for the operator based on the preset information stored in the database.
- a real-time feedback mode is available to the operator according to the movement paths and velocities and targeted positions, so that the operator can be immediately corrected.
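Combining the two feedback triggers above (overlap with the target, and velocity or locus difference from the preset values) gives a per-frame decision rule like this sketch; the tolerances are illustrative assumptions, not from the patent:

```python
# Decision rule combining the feedback triggers described above; the
# position and velocity tolerances are illustrative assumptions.

def feedback_state(hand_pos, target_pos, hand_vel, preset_vel,
                   pos_tol=10.0, vel_tol=0.5):
    """Pick the feedback operation to present for the current frame."""
    dist = ((hand_pos[0] - target_pos[0]) ** 2 +
            (hand_pos[1] - target_pos[1]) ** 2) ** 0.5
    if dist <= pos_tol:
        return "interaction"       # overlap: change shape, expression, sound
    if abs(hand_vel - preset_vel) > vel_tol:
        return "correct_velocity"  # velocity differs from the preset path
    return "guide"                 # keep leading along the preset path
```

Each returned state maps to one of the feedback operations the patent lists: interaction effects on overlap, corrective cues on deviation, and continued guidance otherwise.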
- human extremities can be at least classified as a wrist swinging around (as shown in Fig. A), a lateral movement (as shown in Fig. B), a finger winding movement (as shown in Fig. C), and clenching movements (as shown in Figs. D-F).
- Sphere grabbing motions or Chinese shadow boxing actions can be implemented using the described movements.
- Human-machine interactions and image recognition design are applied to achieve accurate and correct movement operation, as the system of the invention provides feedback operations for each movement of the operator. Image pattern variation and combinations of sound and power outputs or velocity variation, for example, enable the operator to experience interaction with the target image.
- the movement evaluation module 270 determines performance grades according to the interaction between the hand image and the target image.
- FIG. 5 illustrates a behavioral range of the operator detected by the image extraction device 150 .
- The extractible range (ER) of the image extraction device 150 is shown by the block; the extremities (Ex.) of the operator can only be detected when performing inside the block and will not be detected outside of it.
- the system selects and sets an identification label for tracking the operator and displays a target image corresponding to a selected movement mode.
- the system detects and displays extremities of the operator in a game scene, wherein when the operator grabs a sphere (the target image) in the game scene, the system leads the operator to place the grabbed sphere at a correct target position according to preset targeted positions and parameters stored in a database (not shown), and provides feedback according to velocity or locus similarity of the movement of the extremity.
- the system displays another sphere in the game scene and leads the operator to place the sphere to a correct target position.
- the system leads grabbing movements of the operator according to preset movement paths and velocities for each target image and, when the hand image of the operator overlaps (i.e. the grabbing movement) the target image or velocity or locus difference between the hand movement and the target image movement is generated, leads and corrects movements of the operator based on image movements, emotional expressions, or moving directions. Additionally, the system provides a feedback pattern (located at any position on the sphere or the operating window) to show feedback states in response to operator movements in the game scene. Referring to FIG. 7 , FIG. A illustrates a normal state where the sphere has not been grabbed, FIG. B illustrates touching the sphere by the extremity image of the operator, and FIG. C illustrates interactions between the extremity image and the sphere, such that the operator can synchronously experience interactions from the target image during the extremity rehabilitation process.
- the system selects and sets an identification label for tracking the operator and displays a target image corresponding to a selected movement mode.
- the system detects and displays extremities of the operator in a game scene, wherein when the operator motions, the system determines movements of the operator as Chinese shadow boxing motions, leads the extremities (both hands in this embodiment) of the operator to move to a correct target position with a correct path using a virtual figure, and provides feedback according to velocity or locus similarity of the movement of the extremities.
- the system generates and locates a virtual figure (VF) at the left side of the frame and enables the left hand (LH) and the right hand (RH) of the virtual figure to perform corresponding movements according to preset targeted positions and parameters stored in the database (not shown), facilitating the operator to imitate the movements of the virtual figure.
- the right side of the frame shows a real figure extracted by an image extraction device. When the operator swings both hands, the real figure in the frame generates corresponding movements.
- the system determines whether a movement of the operator is correct based on the movement of the real figure and that of the virtual figure and provides feedback (performance grading, for example). When the movement is complete, the system shows another virtual figure of the next Chinese shadow boxing motion (as shown in FIG.
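The patent says only that grading is based on the similarity between the real figure's movement and the virtual figure's; one plausible realization, scoring by mean point-to-point distance between the two sampled paths, is sketched below (the distance measure and `max_dist` scale are our assumptions):

```python
import numpy as np

# One plausible grading rule for step S38: score the operator's sampled
# path by its mean distance from the virtual figure's preset path. The
# distance measure and the max_dist scale are our assumptions.

def grade_movement(real_path, preset_path, max_dist=100.0):
    """Both paths are (N, 2) point sequences sampled at the same frames;
    returns an integer grade from 0 to 100."""
    real = np.asarray(real_path, dtype=float)
    preset = np.asarray(preset_path, dtype=float)
    mean_dist = np.linalg.norm(real - preset, axis=1).mean()
    similarity = max(0.0, 1.0 - mean_dist / max_dist)
    return round(100 * similarity)
```

A path that matches the virtual figure exactly grades 100, and the grade falls off linearly as the operator's trajectory drifts from the preset one.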
- An embodiment of the interactive rehabilitation method and system promotes flexibility of older persons and improves the extremity ability of the operator via real extremity training. Additionally, the invention provides human-machine interactions, via a physical touch platform, to counter the degeneration of extremity ability that older persons experience with age. The extremity activities for older persons are thus expanded, and the game platform allows enjoyable entertainment and recreational activities which counter reaction degeneration of older persons. That is, extremity mobility of patients is improved, unobtrusively and imperceptibly, by playing games.
- Methods and systems of the present disclosure may take the form of a program code (i.e., instructions) embodied in media, such as floppy diskettes, CD-ROMS, hard drives, firmware, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing embodiments of the disclosure.
- The methods and apparatus of the present disclosure may also be embodied in the form of a program code transmitted over some transmission medium, such as electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing an embodiment of the disclosure.
- When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates analogously to specific logic circuits.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Physical Education & Sports Medicine (AREA)
- Human Computer Interaction (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biophysics (AREA)
- Epidemiology (AREA)
- Medical Informatics (AREA)
- Primary Health Care (AREA)
- Public Health (AREA)
- Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Rehabilitation Tools (AREA)
- Processing Or Creating Images (AREA)
Abstract
An interactive rehabilitation method for movement of upper and lower extremities is disclosed. An identification label of an extracted image is detected to provide an operating position of an image of an extremity. A movement mode for a target image is determined according to the identification label and the target image is displayed in a scene. It is determined whether identification labels corresponding to movement of an extremity of the target image are being continuously obtained, and, if so, the performance of the movement of the extremity is led based on operational guidance. A feedback operation is provided according to the movement of the extremity, preset movement paths and velocities, and targeted positions of the target image. It is determined whether the target image has been moved to the preset targeted positions, and, if so, the performance of the movement of the extremity is graded.
Description
- 1. Field of the Invention
- This Application claims priority of Taiwan Patent Application No. 96129617, filed on 10 Aug. 2007, the entirety of which is incorporated by reference herein.
- 2. Description of the Related Art
- Given the aging society, it has become more apparent that many everyday care products for older persons do not fully satisfy their needs. This is especially noticeable in medical treatment of the extremities for apoplexy victims, where the demand for more enjoyable and interesting extremity rehabilitation is growing. Current extremity rehabilitation for older persons is commonly regarded as extremely boring, which in part causes poor rehabilitation results.
- One solution to the aforementioned problem is to provide interactive games comprising virtual computer images for apoplexy victims, which are not only enjoyable but also provide extremity rehabilitation, thus assisting in improving the rehabilitation results of apoplexy victims.
- Thus, an interactive rehabilitation method and system for upper and lower extremities is desirable, assisting with extremity rehabilitation and body training (Chinese shadow boxing, for example) for apoplexy victims via virtual computer images of interactive games.
- Interactive rehabilitation methods are provided. An exemplary embodiment of an interactive rehabilitation method comprises the following. An identification label of an extracted image is detected to provide an operating position of an image of an extremity. A movement mode for a target image is determined according to the identification label and the target image is displayed in a scene. It is determined whether identification labels corresponding to movement of an extremity of the target image are being continuously obtained, and, if so, the performance of the movement of the extremity is led based on operational guidance. A feedback operation is provided according to the movement of the extremity, preset movement paths and velocities and targeted positions of the target image. It is determined whether the target image has been moved to the preset targeted positions, and, if so, the performance of the movement of the extremity is graded.
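As a rough illustration of the first step of the method, detecting the identification label to obtain an operating position of the extremity image, the following sketch thresholds a frame against per-channel color bounds and returns the offset of the matching area's center from the screen center. The color bounds, array layout, and screen-center convention are assumptions for this example, not details taken from the disclosure.

```python
import numpy as np

def locate_identification_label(frame, lower, upper, screen_center):
    """Find the pixels matching the identification label's color bounds,
    mark the center of the matching area, and return its offset from the
    screen center as the operating position of the extremity image.
    `lower`/`upper` are hypothetical per-channel RGB bounds."""
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                      # label not visible in this frame
    center = np.array([xs.mean(), ys.mean()])
    return center - np.asarray(screen_center, dtype=float)
```

A real implementation would run this per frame on the images delivered by the image extraction device and forward the resulting vector as a control signal.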
- Interactive rehabilitation systems are provided. An exemplary embodiment of an interactive rehabilitation system comprises a hand position monitoring module, a target image movement control module, an image feedback module, and a movement evaluation module. The hand position monitoring module detects an identification label of an extracted image to provide an operating position of an image of an extremity. The target image movement control module determines a movement mode for a target image according to the identification label, displays the target image in a scene, determines whether identification labels corresponding to movement of an extremity of the target image are being continuously obtained, and, if the identification labels are being continuously obtained, leads the movement of the extremity based on operational guidance. The image feedback module provides a feedback operation according to the movement of the extremity, preset movement paths and velocities and targeted positions of the target image. The movement evaluation module grades the movement of the extremity when the target image has been moved to the preset targeted positions.
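The decision made by the image feedback module can be sketched as a simple rule over overlap and velocity comparisons. The rule names and the 20% tolerance below are invented for illustration; the disclosure only specifies that feedback follows from overlap or from velocity/locus differences against preset values.

```python
def choose_feedback(overlaps_target, hand_velocity, preset_velocity,
                    tolerance=0.2):
    """Select a feedback operation: grabbing feedback when the hand image
    overlaps the target image, a speed-correction cue when the hand
    velocity strays from the preset velocity by more than the tolerance,
    and no feedback otherwise."""
    if overlaps_target:
        return "grab_feedback"           # e.g. change shape/sound of the target
    if abs(hand_velocity - preset_velocity) > tolerance * preset_velocity:
        return "velocity_correction"     # lead the operator faster or slower
    return "none"
```

In practice each returned value would map to an image pattern variation, sound, or power output stored with the preset parameters.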
- A detailed description is given in the following embodiments with reference to the accompanying drawings.
- The invention can be more fully understood by reading the subsequent detailed description and examples with references made to the accompanying drawings, wherein:
-
FIG. 1 is a schematic view of a computer system of the present invention; -
FIG. 2 is a schematic view of an interactive rehabilitation system 110 shown in FIG. 1 of the present invention; -
FIG. 3 is a flowchart of an interactive rehabilitation method of the present invention; -
FIG. 4 illustrates human extremities; -
FIG. 5 illustrates a behavioral range of the operator detected by an image extraction device; -
FIG. 6 illustrates grabbing a sphere in a game scene; -
FIG. 7 illustrates feedback states in response to operator movements in the game scene; and -
FIGS. 8-11 illustrate Chinese shadow boxing motions. - Several exemplary embodiments of the invention are described with reference to
FIGS. 1 through 3 , which generally relate to interactive rehabilitation for movement of upper and lower extremities. It is to be understood that the following disclosure provides various different embodiments as examples for implementing different features of the invention. Specific examples of components and arrangements are described in the following to simplify the present disclosure. These are, of course, merely examples and are not intended to be limiting. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various described embodiments and/or configurations. - The invention discloses an interactive rehabilitation method and system for mobility of upper and lower extremities, assisting extremity rehabilitation and body training (Chinese shadow boxing, for example) for apoplexy patients via virtual computer images of interactive games.
- Given the aging society, the invention provides an interactive game that older persons can play indoors, providing brain stimulation and entertainment to facilitate the independence of older persons. Additionally, the video system of the game enables older persons to play and interact with their children or other players, which assists social interaction, thus slowing the aging process in mind and soul.
- An embodiment of an interactive rehabilitation method and system for extremities can serve as training equipment for interactive extremity rehabilitation, immediately leading operators to perform extremity rehabilitation or training exercise via the game.
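A building block relied on in the detailed description below is frame-differential detection combined by a logical AND with the skin color area; the average center of the resulting area becomes the target coordinate that leads the target image. The sketch below assumes integer RGB frames and an illustrative threshold value not taken from the disclosure.

```python
import numpy as np

def skin_color_differential_area(prev_frame, cur_frame, skin_mask, thresh=30):
    """AND the frame-differential motion signal with the skin color area
    to obtain the skin color differential area, and return its average
    center as the target coordinate used to lead the target image."""
    diff = np.abs(cur_frame.astype(int) - prev_frame.astype(int))
    motion = diff.max(axis=-1) > thresh      # pixels that changed between frames
    area = motion & skin_mask                # logical AND with the skin color area
    ys, xs = np.nonzero(area)
    if xs.size == 0:
        return area, None                    # no gesture detected this frame
    return area, (xs.mean(), ys.mean())
```

Collisions between this area and the movement area of a game object would then serve as the selection signal described later.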
-
FIG. 1 is a schematic view of a computer system of the present invention. - An embodiment of an interactive
extremity rehabilitation system 110 is implemented in a computer device 130. The computer device 130 is connected, by wire or wirelessly, to an image extraction device 150 (a Webcam, for example). The image extraction device 150 can be internally installed in the computer device 130. The interactive extremity rehabilitation system 110 extracts real-time images of a person via the image extraction device 150 and transmits the extracted images to the computer device 130 to be displayed in a user interface (not shown) provided by the interactive extremity rehabilitation system 110. Additionally, the image extraction device 150 comprises an image identification system program for analyzing an image scope of a reaction area, retrieving movements of the extremities from start to end, and performing real-time operations on dynamic images and returning feedback for flexibility training of the extremities. -
FIG. 2 is a schematic view of an interactive extremity rehabilitation system 110 shown in FIG. 1 of the present invention. FIG. 3 is a flowchart of an interactive rehabilitation method of the present invention. - The exemplary embodiment of an
interactive rehabilitation system 110 comprises a hand position monitoring module 210, a target image movement control module 230, an image feedback module 250, and a movement evaluation module 270. A process for the exemplary embodiment of the interactive rehabilitation system 110 is first described in the following. - Referring to
FIGS. 1-3 , a color mark or recognizable mark (defined as an identification label in this embodiment) for a position is first placed on a detected portion of an operator for extraction by the image extraction device 150 before the rehabilitation process starts. When a game provided by the present invention is activated, the interactive rehabilitation system 110 detects extremity movements (hand movements, for example) of the operator using the image extraction device 150. The hand position monitoring module 210 detects the identification label of an extracted image of the operator extracted by the image extraction device 150 to provide a corresponding position (defined as an extremity position) for extremities in a game scene (step S31). - The target image
movement control module 230 retrieves the identification label corresponding to the movement of the extremities from the hand position monitoring module 210 to determine movement modes and appearance sequences of a target image (step S32). The system predefines required target images and classifications (Chinese shadow boxing motions or sphere grabbing actions, for example, which are not limiting). Each target image and classification comprises plural movements, and movement paths, velocities, and targeted positions are preset for the movements of each target image. The preset data is stored in a database (not shown). Extremity movements of the operator correspond to movements of the target image. When a movement of an extremity corresponding to the identification label is retrieved, the target image movement control module 230 immediately selects a movement mode and an appearance sequence of the target image corresponding to the movement of the extremity and displays the target image (step S33), a Chinese shadow boxing motion or a sphere grabbing action, for example. - The target image
movement control module 230 determines whether identification labels corresponding to movement of an extremity of the target image are being continuously retrieved from the hand position monitoring module 210 (step S34), i.e. determining whether the operator performs the Chinese shadow boxing motion or sphere grabbing action. If the identification labels are not continuously retrieved, which indicates that the operator did not completely perform the movement, does not know how to perform the movement, or has forgotten how to perform the movement, the operator is reminded how to perform the movement by arrow guidance or other eye-catching suggestions. If the identification labels are being continuously retrieved, the extremities (the hands, for example) of the operator are led based on the preset movement paths and velocities and the targeted positions via operational guidance (step S35). The operator, for example, is led to grab a target image in a game scene and place the target image at a correct target position or perform a Chinese shadow boxing motion. - The
image feedback module 250 provides a feedback operation for the operator according to the movement of the extremity, the preset movement paths and velocities and the targeted positions (step S36). Shapes, emotional expressions and sounds of the target image, for example, are changed or an error message (image) or sound effect is shown. When the hand image (i.e. the movement of the extremity) of the operator overlaps the target image (while grabbing the target image) or velocity or locus difference (fast and slow motions of the Chinese shadow boxing motions) therebetween is generated, the feedback operation is provided. The feedback operation indicates image pattern variation and combinations of sound and power outputs or velocity variation, enabling the operator to experience interactions with the target image. - The target image
movement control module 230 determines whether all of the target images have been moved to the preset targeted positions (step S37). That is, when a game for grabbing spheres is performed, whether each sphere is placed at its individual position, or, when a game for Chinese shadow boxing motions is performed, whether all Chinese shadow boxing motions are completed, is determined. If a target image has not been moved to the preset targeted positions, the process proceeds to step S33 to repeat the described operations and enable all of the target images to be moved to their preset targeted positions. When all of the target images have been moved to their preset targeted positions, the movement evaluation module 270 grades the movement of the extremities of the operator according to similarity between the movement of the extremities and the target images (step S38), and then the process terminates. - As described, the interactive
extremity rehabilitation system 110 enables patients requiring rehabilitation for hand extremities to implement movement training via game interactions. Additionally, the system can provide competition for more than one user at the same game platform via video conference, achieving enjoyable rehabilitation and required training results. - Processes for components of the interactive
extremity rehabilitation system 110 are described as follows. - The hand
position monitoring module 210 performs skin color recognition (based on the mark placed on the extremities) using computer vision simulation and tracks dynamic object behaviors according to the recognition results. Further, the hand position monitoring module 210 extracts images from the real-time images retrieved from the image extraction device 150 according to preset skin color definitions, determines which pixels of the extracted image form an area matching the preset skin color definitions, marks the center of that area, subtracts the position of the center from that of the center of the actual screen, and transmits a control signal carrying the resulting distance vector to the target image movement control module 230 for tracking. - With respect to the target image
movement control module 230, a computer/computer game system provides target images for different types of games and movements, and movement paths and velocities, targeted positions, and parameters are preset for each target image. The movement paths and velocities, targeted positions, and parameters are defined according to medical treatment requirements. The target image movement control module 230 leads, controls, and corrects hand movement of the operator to grab and place the target image at a correct target position, correcting and rehabilitating hand function of patients. - Processes for the target image
movement control module 230 are described as follows. - To achieve dragging of a target image (a game object) via gestures, an operational scope for a gesture operating area is first locked and an available skin color is separately highlighted using skin color detection. The dragged gesture represents a dynamic process, the gesture operating area provides dynamic signals of a frame, and an available dynamic signal of the gesture operating area is extracted using a frame differential detection method. Next, a logical operation (AND, for example) is implemented to the skin color area with the dynamic signals of the dragged gesture to generate a skin color differential area (i.e. the area in which the gestures of a frame are performed). The skin color differential area corresponds to coordinate positions in the game space, collisions for the skin color differential area and a movement area of the target image are detected, and collision signals serve as determination for selecting a game object. Additionally, to facilitate the target image change, coordinate positions based on the dragged gesture, and the average center coordinate data of the skin color differential area is corresponded to coordinate positions in the game space to generate target coordinates used for leading the target image to move.
- With respect to the
image feedback module 250, when the hand image of the operator overlaps (i.e. the grabbing movement) the target image or a velocity or locus difference between the hand movement and the target image movement is generated, the image feedback module 250 provides a feedback operation with image pattern variation and combinations of sound and power outputs or velocity variation for the operator based on preset parameters. The image feedback module 250 leads, controls, and corrects the hands of the operator to grab and place the game object at a correct target position according to preset values. - Processes for the
image feedback module 250 are described as follows. - Movement paths and velocities of a target image are created and parameters of targeted positions of the target image are defined and the defined data is stored in a database (not shown). Additionally, it is determined whether movement values, generated by operational behavior using artificial intelligence (IA), correspond to system defined standard parameters. When the hand images of the operator overlaps (i.e. the grabbing movement) the target image or movement velocities or loci of the hand image and the target image in the game scene are different from the system predefined values (i.e. the predefined parameters), a feedback operation with image pattern variation and combinations of sound and power outputs or velocity variation is provided for the operator based on preset information stored in the database.
- With respect to the
movement evaluation module 270, a real-time feedback mode is available to the operator according to the movement paths and velocities and targeted positions, so that the operator can be immediately corrected. - Several examples are described to illustrate the process of the interactive
extremity rehabilitation system 110. - Referring to
FIG. 4 , human extremity movements can be classified at least as wrist swinging (as shown in Fig. A), a lateral movement (as shown in Fig. B), a finger winding movement (as shown in Fig. C), and clenching movements (as shown in Figs. D-F). Sphere grabbing motions or Chinese shadow boxing actions can be implemented using the described movements. - For clenching movements, human-machine interactions and image recognition design are applied to achieve accuracy and correctness of movement operation, as the system of the invention provides feedback operations for each movement of the operator. Image pattern variation and combinations of sound and power outputs or velocity variation, for example, enable the operator to experience interaction with the target image. The
movement evaluation module 270 determines performance grades according to the interaction between the hand image and the target image. -
FIG. 5 illustrates a behavioral range of the operator detected by the image extraction device 150. The extractible range (ER) of the image extraction device 150 is shown by the block, wherein the extremities (Ex.) of the operator are detected only inside the block and not outside it. - Referring to sphere grabbing in
FIG. 6 , when the game starts, the system selects and sets an identification label for tracking the operator and displays a target image corresponding to a selected movement mode. The system detects and displays extremities of the operator in a game scene, wherein when the operator grabs a sphere (the target image) in the game scene, the system leads the operator to place the grabbed sphere at a correct target position according to preset targeted positions and parameters stored in a database (not shown), and provides feedback according to velocity or locus similarity of the movement of the extremity. When the current sphere is placed to a correct target position and feedback is provided, the system then displays another sphere in the game scene and leads the operator to place the sphere to a correct target position. - The system leads grabbing movements of the operator according to preset movement paths and velocities for each target image and, when the hand image of the operator overlaps (i.e. the grabbing movement) the target image or velocity or locus difference between the hand movement and the target image movement is generated, leads and corrects movements of the operator based on image movements, emotional expressions, or moving directions. Additionally, the system provides a feedback pattern (located at any position on the sphere or the operating window) to show feedback states in response to operator movements in the game scene. Referring to
FIG. 7 , FIG. A illustrates a normal state where the sphere has not been grabbed, FIG. B illustrates touching the sphere by the extremity image of the operator, and FIG. C illustrates interactions between the extremity image and the sphere, such that the operator can synchronously experience interactions from the target image during the extremity rehabilitation process. - Referring to Chinese shadow boxing motions in
FIGS. 8-11 , when the game starts, the system selects and sets an identification label for tracking the operator and displays a target image corresponding to a selected movement mode. The system detects and displays extremities of the operator in a game scene, wherein when the operator motions, the system determines movements of the operator as Chinese shadow boxing motions, leads the extremities (both hands in this embodiment) of the operator to move to a correct target position with a correct path using a virtual figure, and provides feedback according to velocity or locus similarity of the movement of the extremities. - Referring to
FIG. 8 , the system generates and locates a virtual figure (VF) at the left side of the frame and enables the left hand (LH) and the right hand (RH) of the virtual figure to perform corresponding movements according to preset targeted positions and parameters stored in the database (not shown), facilitating the operator to imitate the movements of the virtual figure. The right side of the frame shows a real figure extracted by an image extraction device. When the operator swings both hands, the real figure in the frame generates corresponding movements. The system determines whether a movement of the operator is correct based on the movement of the real figure and that of the virtual figure and provides feedback (performance grading, for example). When the movement is complete, the system shows another virtual figure of the next Chinese shadow boxing motion (as shown in FIG. 9 ) and leads the operator to imitate the motion. The described process is repeated to enable the operator to complete the subsequent Chinese shadow boxing motions (as shown in FIGS. 10 and 11 ) and feedback (performance grades, for example) is provided based on the completed motions, such that the operator can correct his movements according to the feedback. - An embodiment of the interactive rehabilitation method and system promotes flexibility of older persons and improves the extremity ability of the operator via real extremity training. Additionally, the invention provides human-machine interactions, via a physical touch platform, to counteract the degeneration of extremity ability that older persons experience with age. The extremity activities of older persons are thus expanded, and the game platform allows enjoyable entertainment and recreational activities which counteract reaction degeneration in older persons. That is, extremity mobility of patients is improved unobtrusively and imperceptibly by playing games.
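The performance grading described above is based on similarity between the operator's movement and the target movement. One minimal way to sketch such a score is a mean point-wise distance between the recorded extremity path and the preset path; the 100-point scale and this particular metric are hypothetical choices for illustration, not taken from the disclosure.

```python
import numpy as np

def grade_similarity(extremity_path, preset_path):
    """Grade a completed movement by comparing the recorded extremity
    path with the preset movement path: the closer the two point
    sequences, the higher the score (clamped to a 0-100 scale)."""
    a = np.asarray(extremity_path, dtype=float)
    b = np.asarray(preset_path, dtype=float)
    n = min(len(a), len(b))                  # compare over the common length
    mean_dist = np.linalg.norm(a[:n] - b[:n], axis=1).mean()
    return max(0.0, 100.0 - mean_dist)
```

A production system would likely time-align the two paths first and weight velocity differences as well, per the preset parameters in the database.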
- Methods and systems of the present disclosure, or certain aspects or portions of embodiments thereof, may take the form of a program code (i.e., instructions) embodied in media, such as floppy diskettes, CD-ROMS, hard drives, firmware, or any other machine-readable storage medium, wherein, when the program code is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing embodiments of the disclosure. The methods and apparatus of the present disclosure may also be embodied in the form of a program code transmitted over some transmission medium, such as electrical wiring or cabling, through fiber optics, or via any other form of transmission, wherein, when the program code is received and loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing an embodiment of the disclosure. When implemented on a general-purpose processor, the program code combines with the processor to provide a unique apparatus that operates analogously to specific logic circuits.
- While the invention has been described by way of example and in terms of the preferred embodiments, it is to be understood that the invention is not limited to the disclosed embodiments. To the contrary, it is intended to cover various modifications and similar arrangements (as would be apparent to those skilled in the art). Therefore, the scope of the appended claims should be accorded the broadest interpretation so as to encompass all such modifications and similar arrangements.
Claims (19)
1. An interactive rehabilitation method, comprising:
providing path characteristics of a target image;
extracting a movement of an extremity from an extraction device;
enabling the movement of the extremity to interact with the target image in a scene; and
immediately adjusting interaction states between the movement of the extremity and the target image according to the path characteristics.
2. The interactive rehabilitation method as claimed in claim 1 , further comprising providing a feedback operation according to the interaction states between the movement of the extremity and the target image.
3. The interactive rehabilitation method as claimed in claim 1 , further comprising leading the movement of the extremity based on operational guidance according to preset parameters corresponding to the target image to lead the movement of the extremity to interact with the target image and providing a score according to interactive similarity.
4. An interactive rehabilitation method, comprising:
detecting an identification label of an extracted image to provide an operating position of an image of an extremity;
determining a movement mode for a target image according to the identification label;
displaying the target image in a scene;
determining whether identification labels corresponding to movement of the extremity of the target image are being continuously obtained;
if the identification labels are being continuously obtained, leading the movement of the extremity based on operational guidance;
providing a feedback operation according to the movement of the extremity, preset movement paths and velocities and targeted positions of the target image;
determining whether the target image has been moved to the preset targeted positions; and
if the target image has been moved to the preset targeted positions, grading the performance of the movement of the extremity.
5. The interactive rehabilitation method as claimed in claim 4 , further comprising, when plural target images are provided, determining movement modes of the target images and appearance sequences of each target image according to the identification label.
6. The interactive rehabilitation method as claimed in claim 4 , further comprising, if the identification labels are being continuously obtained, leading the movement of the extremity based on the operational guidance, the preset movement paths and velocities and the targeted positions of the target image corresponding to the target image.
7. The interactive rehabilitation method as claimed in claim 4 , further comprising grading the performance of the movement of the extremity according to similarity between the movement of the extremity and the target image.
8. An interactive rehabilitation system, comprising:
a hand position monitoring module, detecting an identification label of an extracted image to provide an operating position of an image of an extremity;
a target image movement control module, determining a movement mode for a target image according to the identification label, displaying the target image in a scene, determining whether identification labels corresponding to movement of an extremity of the target image are being continuously obtained, and, if the identification labels are being continuously obtained, leading the movement of the extremity based on operational guidance;
an image feedback module, providing a feedback operation according to the movement of the extremity, preset movement paths and velocities and targeted positions of the target image; and
a movement evaluation module, when the target image has been moved to the preset targeted positions, grading the performance of the movement of the extremity.
9. The interactive rehabilitation system as claimed in claim 8 , wherein, when plural target images are provided, the target image movement control module determines movement modes of the target images and appearance sequences of each target image according to the identification label.
10. The interactive rehabilitation system as claimed in claim 8 , wherein, if the identification labels are being continuously obtained, the target image movement control module leads the movement of the extremity based on the operational guidance, the preset movement paths and velocities and targeted positions of the target image corresponding to the target image.
11. The interactive rehabilitation system as claimed in claim 8 , wherein the movement evaluation module grades the movement of the extremity according to similarity between the movement of the extremity and the target image.
12. The interactive rehabilitation system as claimed in claim 9 , wherein the target image movement control module separately highlights an available skin color area for the movement of the extremity using a skin color detection method.
13. The interactive rehabilitation system as claimed in claim 8 , wherein the target image movement control module extracts available dynamic signals for the movement of the extremity using a frame differential detection method.
14. The interactive rehabilitation system as claimed in claim 8 , wherein the target image movement control module implements a logical operation to the parameters of the available skin color area with the dynamic signals to generate a skin color differential area.
15. The interactive rehabilitation system as claimed in claim 8 , wherein the image feedback module provides the feedback operation according to the preset movement paths and velocities and the targeted positions when the movement of the extremity overlaps the target image or velocity or locus difference therebetween is generated.
16. A computer-readable storage medium storing a computer program providing an interactive rehabilitation method, comprising using a computer to perform the steps of:
detecting an identification label of an extracted image to provide an operating position of an image of an extremity;
determining a movement mode for a target image according to the identification label;
displaying the target image in a scene;
determining whether identification labels corresponding to movement of an extremity of the target image are being continuously obtained;
if the identification labels are being continuously obtained, leading the movement of the extremity based on operational guidance;
providing a feedback operation according to the movement of the extremity, preset movement paths and velocities and targeted positions of the target image;
determining whether the target image has been moved to the preset targeted positions; and
if the target image has been moved to the preset targeted positions, grading the performance of the movement of the extremity.
17. The computer-readable storage medium as claimed in claim 16, further comprising, when plural target images are provided, determining movement modes of the target images and appearance sequences of each target image according to the identification label.
18. The computer-readable storage medium as claimed in claim 16, further comprising, if the identification labels are being continuously obtained, leading the movement of the extremity based on the operational guidance, the preset movement paths and velocities, and the targeted positions corresponding to the target image.
19. The computer-readable storage medium as claimed in claim 16, further comprising grading the performance of the movement of the extremity according to similarity between the movement of the extremity and the target image.
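The mechanisms recited in claims 14 and 19 can be sketched in code: a "skin color differential area" obtained by intersecting a skin-color mask with frame-differencing "dynamic signals", and a grade based on similarity between the performed movement and the preset target path. This is a simplified, hypothetical illustration; the thresholds, function names, and scoring formula below are assumptions for exposition and are not taken from the patent.

```python
import numpy as np

def skin_color_mask(frame_rgb):
    """Crude RGB skin-color rule; thresholds are illustrative only."""
    r = frame_rgb[..., 0].astype(int)
    g = frame_rgb[..., 1].astype(int)
    b = frame_rgb[..., 2].astype(int)
    return (r > 95) & (g > 40) & (b > 20) & (r > g) & (r > b) & ((r - g) > 15)

def skin_color_differential_area(prev_frame, curr_frame, motion_threshold=30):
    """Combine the skin-color mask with frame-differencing 'dynamic signals'
    so only skin-colored pixels that actually moved remain (cf. claim 14)."""
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int)).sum(axis=-1)
    return skin_color_mask(curr_frame) & (diff > motion_threshold)

def grade_performance(extremity_path, target_path):
    """Grade similarity between the performed path and the preset target
    path (cf. claim 19): mean point-to-point distance, mapped to a
    0-100 score with a simple linear penalty (illustrative formula)."""
    a = np.asarray(extremity_path, dtype=float)
    b = np.asarray(target_path, dtype=float)
    n = min(len(a), len(b))
    mean_err = np.linalg.norm(a[:n] - b[:n], axis=1).mean()
    return max(0.0, 100.0 - mean_err)
```

In use, the differential area would be computed per captured frame to track the extremity against the target image's preset path, and the grade issued once the target image reaches its preset targeted position.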
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/308,675 US20140295393A1 (en) | 2007-08-10 | 2014-06-18 | Interactive rehabilitation method and system for movement of upper and lower extremities |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
TW096129617A TWI377055B (en) | 2007-08-10 | 2007-08-10 | Interactive rehabilitation method and system for upper and lower extremities |
TWTW96129617 | 2007-08-10 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/308,675 Division US20140295393A1 (en) | 2007-08-10 | 2014-06-18 | Interactive rehabilitation method and system for movement of upper and lower extremities |
Publications (1)
Publication Number | Publication Date |
---|---|
US20090042695A1 true US20090042695A1 (en) | 2009-02-12 |
Family
ID=40347078
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/189,068 Abandoned US20090042695A1 (en) | 2007-08-10 | 2008-08-08 | Interactive rehabilitation method and system for movement of upper and lower extremities |
US14/308,675 Abandoned US20140295393A1 (en) | 2007-08-10 | 2014-06-18 | Interactive rehabilitation method and system for movement of upper and lower extremities |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/308,675 Abandoned US20140295393A1 (en) | 2007-08-10 | 2014-06-18 | Interactive rehabilitation method and system for movement of upper and lower extremities |
Country Status (2)
Country | Link |
---|---|
US (2) | US20090042695A1 (en) |
TW (1) | TWI377055B (en) |
Families Citing this family (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI476632B (en) * | 2009-12-08 | 2015-03-11 | Micro Star Int Co Ltd | Method for moving object detection and application to hand gesture control system |
EP2849438A4 (en) * | 2012-04-23 | 2016-01-27 | Japan Science & Tech Agency | Motion guide presentation method and system therefor, and motion guide presentation device |
TW201412297A (en) * | 2012-09-28 | 2014-04-01 | zhi-zhen Chen | Three-dimensional recording system of upper limb disorder rehabilitation process of and recording method thereof |
TWI559144B (en) * | 2014-06-05 | 2016-11-21 | zhi-zhen Chen | Scale-guided limb rehabilitation system |
KR101711488B1 (en) * | 2015-01-28 | 2017-03-03 | 한국전자통신연구원 | Method and System for Motion Based Interactive Service |
TWI595908B (en) * | 2015-02-04 | 2017-08-21 | zhi-zhen Chen | Electrical stimulation system tocil |
TWI553585B (en) * | 2015-04-07 | 2016-10-11 | 元智大學 | Body rehabilitation sensing method based on a mobile communication device and a system thereof |
TWI603303B (en) * | 2016-07-28 | 2017-10-21 | 南臺科技大學 | Cognitive detection and emotional express system by nostalgic experiences |
TWI749296B (en) * | 2019-02-27 | 2021-12-11 | 康立安智能醫療設備有限公司 | System and method for rehabilitation |
US11439871B2 (en) | 2019-02-27 | 2022-09-13 | Conzian Ltd. | System and method for rehabilitation |
TWI842582B (en) * | 2023-06-29 | 2024-05-11 | 亞東學校財團法人亞東科技大學 | Rehabilitation system and method thereof |
- 2007-08-10 TW TW096129617A patent/TWI377055B/en not_active IP Right Cessation
- 2008-08-08 US US12/189,068 patent/US20090042695A1/en not_active Abandoned
- 2014-06-18 US US14/308,675 patent/US20140295393A1/en not_active Abandoned
Patent Citations (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4508510A (en) * | 1983-12-07 | 1985-04-02 | Mona Clifford | Method for psychomotor training of physical skills |
US5903454A (en) * | 1991-12-23 | 1999-05-11 | Hoffberg; Linda Irene | Human-factored interface incorporating adaptive pattern recognition based controller apparatus |
US5867386A (en) * | 1991-12-23 | 1999-02-02 | Hoffberg; Steven M. | Morphological pattern recognition based controller system |
US7006881B1 (en) * | 1991-12-23 | 2006-02-28 | Steven Hoffberg | Media recording device with remote graphic user interface |
US5875108A (en) * | 1991-12-23 | 1999-02-23 | Hoffberg; Steven M. | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US6418424B1 (en) * | 1991-12-23 | 2002-07-09 | Steven M. Hoffberg | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US7136710B1 (en) * | 1991-12-23 | 2006-11-14 | Hoffberg Steven M | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US5774357A (en) * | 1991-12-23 | 1998-06-30 | Hoffberg; Steven M. | Human factored interface incorporating adaptive pattern recognition based controller apparatus |
US6081750A (en) * | 1991-12-23 | 2000-06-27 | Hoffberg; Steven Mark | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US5429140A (en) * | 1993-06-04 | 1995-07-04 | Greenleaf Medical Systems, Inc. | Integrated virtual reality rehabilitation system |
US5901246A (en) * | 1995-06-06 | 1999-05-04 | Hoffberg; Steven M. | Ergonomic man-machine interface incorporating adaptive pattern recognition based control system |
US6445364B2 (en) * | 1995-11-28 | 2002-09-03 | Vega Vista, Inc. | Portable game display and method for controlling same |
US5920447A (en) * | 1996-03-14 | 1999-07-06 | Kabushiki Kaisha Toshiba | Magnetic disk unit having laminated magnetic heads |
US6244987B1 (en) * | 1996-11-25 | 2001-06-12 | Mitsubishi Denki Kabushiki Kaisha | Physical exercise system having a virtual reality environment controlled by a user's movement |
US6452584B1 (en) * | 1997-04-23 | 2002-09-17 | Modern Cartoon, Ltd. | System for data management based on hand gestures |
US6049327A (en) * | 1997-04-23 | 2000-04-11 | Modern Cartoons, Ltd | System for data management based on hand gestures |
US6149586A (en) * | 1998-01-29 | 2000-11-21 | Elkind; Jim | System and method for diagnosing executive dysfunctions using virtual reality and computer simulation |
US7018211B1 (en) * | 1998-08-31 | 2006-03-28 | Siemens Aktiengesellschaft | System for enabling a moving person to control body movements to be performed by said person |
US6530085B1 (en) * | 1998-09-16 | 2003-03-04 | Webtv Networks, Inc. | Configuration for enhanced entertainment system control |
US6184847B1 (en) * | 1998-09-22 | 2001-02-06 | Vega Vista, Inc. | Intuitive control of portable data displays |
US6774885B1 (en) * | 1999-01-20 | 2004-08-10 | Motek B.V. | System for dynamic registration, evaluation, and correction of functional human behavior |
US6640145B2 (en) * | 1999-02-01 | 2003-10-28 | Steven Hoffberg | Media recording device with packet data interface |
US6400996B1 (en) * | 1999-02-01 | 2002-06-04 | Steven M. Hoffberg | Adaptive pattern recognition based control system and method |
US6695770B1 (en) * | 1999-04-01 | 2004-02-24 | Dominic Kin Leung Choy | Simulated human interaction systems |
US6447464B1 (en) * | 1999-09-24 | 2002-09-10 | Lifespan Therapy Services, Inc. | Therapy device for upper extremity dysfunction |
US6850252B1 (en) * | 1999-10-05 | 2005-02-01 | Steven M. Hoffberg | Intelligent electronic appliance system and method |
US6526395B1 (en) * | 1999-12-31 | 2003-02-25 | Intel Corporation | Application of personality models and interaction with synthetic characters in a computing system |
US7135637B2 (en) * | 2000-01-11 | 2006-11-14 | Yamaha Corporation | Apparatus and method for detecting performer's motion to interactively control performance of music or the like |
US6569066B1 (en) * | 2000-05-31 | 2003-05-27 | Paul Patterson | Upper extremity rehabilitation and training device and method |
US6712692B2 (en) * | 2002-01-03 | 2004-03-30 | International Business Machines Corporation | Using existing videogames for physical training and rehabilitation |
US6659774B1 (en) * | 2002-05-21 | 2003-12-09 | Tri-Sil Llc | Diagnostic game and teaching tool |
US7133535B2 (en) * | 2002-12-21 | 2006-11-07 | Microsoft Corp. | System and method for real time lip synchronization |
US20040155962A1 (en) * | 2003-02-11 | 2004-08-12 | Marks Richard L. | Method and apparatus for real time motion capture |
US20050020409A1 (en) * | 2003-07-22 | 2005-01-27 | Gifu University | Physical rehabilitation training and education device |
US7648441B2 (en) * | 2004-11-10 | 2010-01-19 | Silk Jeffrey E | Self-contained real-time gait therapy device |
CN1731316A (en) * | 2005-08-19 | 2006-02-08 | 北京航空航天大学 | Human-computer interaction method for dummy ape game |
US20090023555A1 (en) * | 2007-04-26 | 2009-01-22 | Heather Raymond | Method and system for developing or tracking a program for medical treatment |
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090286545A1 (en) * | 2008-05-13 | 2009-11-19 | Qualcomm Incorporated | Transmit power selection for user equipment communicating with femto cells |
US20110152033A1 (en) * | 2009-12-22 | 2011-06-23 | Bing-Shiang Yang | Physical training system |
EP2371434A3 (en) * | 2010-03-31 | 2013-06-19 | NAMCO BANDAI Games Inc. | Image generation system, image generation method, and information storage medium |
US8998718B2 (en) | 2010-03-31 | 2015-04-07 | Bandai Namco Games Inc. | Image generation system, image generation method, and information storage medium |
US10583328B2 (en) | 2010-11-05 | 2020-03-10 | Nike, Inc. | Method and system for automated personal training |
US9283429B2 (en) * | 2010-11-05 | 2016-03-15 | Nike, Inc. | Method and system for automated personal training |
US11915814B2 (en) | 2010-11-05 | 2024-02-27 | Nike, Inc. | Method and system for automated personal training |
US11710549B2 (en) | 2010-11-05 | 2023-07-25 | Nike, Inc. | User interface for remote joint workout session |
US20120183940A1 (en) * | 2010-11-05 | 2012-07-19 | Nike, Inc. | Method and system for automated personal training |
US9919186B2 (en) * | 2010-11-05 | 2018-03-20 | Nike, Inc. | Method and system for automated personal training |
US11094410B2 (en) | 2010-11-05 | 2021-08-17 | Nike, Inc. | Method and system for automated personal training |
US9358426B2 (en) * | 2010-11-05 | 2016-06-07 | Nike, Inc. | Method and system for automated personal training |
US20120183939A1 (en) * | 2010-11-05 | 2012-07-19 | Nike, Inc. | Method and system for automated personal training |
US20160101321A1 (en) * | 2010-11-05 | 2016-04-14 | Nike, Inc. | Method and System for Automated Personal Training |
US10420982B2 (en) | 2010-12-13 | 2019-09-24 | Nike, Inc. | Fitness training system with energy expenditure calculation that uses a form factor |
US9852271B2 (en) | 2010-12-13 | 2017-12-26 | Nike, Inc. | Processing data of a user performing an athletic activity to estimate energy expenditure |
US20120190505A1 (en) * | 2011-01-26 | 2012-07-26 | Flow-Motion Research And Development Ltd | Method and system for monitoring and feed-backing on execution of physical exercise routines |
US9011293B2 (en) * | 2011-01-26 | 2015-04-21 | Flow-Motion Research And Development Ltd. | Method and system for monitoring and feed-backing on execution of physical exercise routines |
US9811639B2 (en) | 2011-11-07 | 2017-11-07 | Nike, Inc. | User interface and fitness meters for remote joint workout session |
US10825561B2 (en) | 2011-11-07 | 2020-11-03 | Nike, Inc. | User interface for remote joint workout session |
US9977874B2 (en) | 2011-11-07 | 2018-05-22 | Nike, Inc. | User interface for remote joint workout session |
EP2848094A4 (en) * | 2012-05-07 | 2016-12-21 | Chia Ming Chen | Light control systems and methods |
US9587804B2 (en) | 2012-05-07 | 2017-03-07 | Chia Ming Chen | Light control systems and methods |
US10188930B2 (en) | 2012-06-04 | 2019-01-29 | Nike, Inc. | Combinatory score having a fitness sub-score and an athleticism sub-score |
US10146322B2 (en) | 2012-12-13 | 2018-12-04 | Intel Corporation | Gesture pre-processing of video stream using a markered region |
US10261596B2 (en) | 2012-12-13 | 2019-04-16 | Intel Corporation | Gesture pre-processing of video stream using a markered region |
US9720507B2 (en) * | 2012-12-13 | 2017-08-01 | Intel Corporation | Gesture pre-processing of video stream using a markered region |
US20150015480A1 (en) * | 2012-12-13 | 2015-01-15 | Jeremy Burr | Gesture pre-processing of video stream using a markered region |
US9104240B2 (en) | 2013-01-09 | 2015-08-11 | Intel Corporation | Gesture pre-processing of video stream with hold-off period to reduce platform power |
US9292103B2 (en) | 2013-03-13 | 2016-03-22 | Intel Corporation | Gesture pre-processing of video stream using skintone detection |
US10406967B2 (en) | 2014-04-29 | 2019-09-10 | Chia Ming Chen | Light control systems and methods |
US10953785B2 (en) | 2014-04-29 | 2021-03-23 | Chia Ming Chen | Light control systems and methods |
CN104408775A (en) * | 2014-12-19 | 2015-03-11 | 哈尔滨工业大学 | Depth perception based three-dimensional shadow play production method |
WO2018140802A1 (en) * | 2017-01-27 | 2018-08-02 | The Johns Hopkins University | Rehabilitation and training gaming system to promote cognitive-motor engagement description |
CN107833611A (en) * | 2017-11-06 | 2018-03-23 | 广州优涵信息技术有限公司 | A kind of self-closing disease recovery training method based on virtual reality |
CN110051983A (en) * | 2019-04-12 | 2019-07-26 | 深圳泰山体育科技股份有限公司 | Pushing system is rubbed in intelligent Tai Ji and the intelligent control method of pushing device is rubbed in Tai Ji |
CN113283385A (en) * | 2021-06-17 | 2021-08-20 | 贝塔智能科技(北京)有限公司 | Somatosensory interaction system and method based on limb recognition technology |
Also Published As
Publication number | Publication date |
---|---|
US20140295393A1 (en) | 2014-10-02 |
TWI377055B (en) | 2012-11-21 |
TW200906377A (en) | 2009-02-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20090042695A1 (en) | Interactive rehabilitation method and system for movement of upper and lower extremities | |
CN108463271B (en) | System and method for motor skill analysis and skill enhancement and prompting | |
US20200097081A1 (en) | Neuromuscular control of an augmented reality system | |
CN101564594B (en) | Interactive type limb action recovery method and system | |
Sadihov et al. | Prototype of a VR upper-limb rehabilitation system enhanced with motion-based tactile feedback | |
KR101338043B1 (en) | Cognitive Rehabilitation System and Method Using Tangible Interaction | |
CN106648120A (en) | Training system for escape from fire based on virtual reality and somatosensory technology | |
CN108351701A (en) | Ancillary technique control system and correlation technique | |
Vourvopoulos et al. | Brain-controlled serious games for cultural heritage | |
KR101799980B1 (en) | Apparatus, system and method for controlling virtual reality image and simulator | |
Bannach et al. | Waving real hand gestures recorded by wearable motion sensors to a virtual car and driver in a mixed-reality parking game | |
Saputra et al. | LexiPal: Kinect-based application for dyslexia using multisensory approach and natural user interface | |
WO2022241047A1 (en) | Equipment detection using a wearable device | |
EP4173574A1 (en) | Device for estimating cognitive ability, method therefor, and program | |
WO2018123293A1 (en) | Output control device, output control method, and program | |
KR20150097050A (en) | learning system using clap game for child and developmental disorder child | |
Casas-Ortiz et al. | Intelligent systems for psychomotor learning: A systematic review and two cases of study | |
Spanogianopoulos et al. | Human computer interaction using gestures for mobile devices and serious games: A review | |
Toma et al. | Car driver skills assessment based on driving postures recognition | |
US11823585B2 (en) | Methods and systems for writing skill development | |
US11331551B2 (en) | Augmented extended realm system | |
Leeb et al. | Combining BCI and virtual reality: scouting virtual worlds | |
Babushkin et al. | Sensorimotor Skill Communication: A Literature Review | |
CN112562825B (en) | Autistic children life skill training method based on serious game | |
Codreanu et al. | A home based health-care solution for older adults using Kinect |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment | Owner name: INDUSTRIAL TECHNOLOGY RESEARCH INSTITUTE, TAIWAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: CHIEN, SHIH YING; SHAU, YIO WHA; REEL/FRAME: 021377/0497; Effective date: 20080701 |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |