WO2022145563A1 - User-customized exercise training method and system - Google Patents

User-customized exercise training method and system

Info

Publication number
WO2022145563A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
avatar
motion
data
unit
Prior art date
Application number
PCT/KR2021/000771
Other languages
English (en)
Korean (ko)
Inventor
유제광
Original Assignee
동국대학교 산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 동국대학교 산학협력단
Publication of WO2022145563A1

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0062Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0075Means for generating exercise programs or schemes, e.g. computerized virtual trainer, e.g. using expert databases
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/18Eye characteristics, e.g. of the iris
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012Comparing movements or motion sequences with a registered reference
    • A63B2024/0015Comparing movements or motion sequences with computerised simulations of movements or motion sequences, e.g. for generating an ideal template as reference to be achieved by the user
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0625Emitting sound, noise or music
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0647Visualisation of executed movements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0658Position or arrangement of display
    • A63B2071/0661Position or arrangement of display arranged on the user
    • A63B2071/0666Position or arrangement of display arranged on the user worn on the head or face, e.g. combined with goggles or glasses
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00Indexing scheme relating to image or video recognition or understanding
    • G06V2201/03Recognition of patterns in medical or anatomical images
    • G06V2201/033Recognition of patterns in medical or anatomical images of skeletal patterns

Definitions

  • the present invention relates to a user-customized exercise training method and system, and more particularly, to a user-customized exercise training method and system that provides a user with an exercise training prescription suited to the user's physical characteristics and goals, such as body type, physical strength, and athletic ability.
  • systems are being developed that implement the user's own appearance as an avatar so that the user can check his or her own motion from a third-person point of view, or that compare the user's motion with an expert's motion in real time.
  • in such systems, however, the user's exercise immersion may decrease and it may be difficult for the user to receive accurate correction of the motion.
  • An object of the present invention is to provide a user with a high-precision exercise training prescription according to the user's physical characteristics and purpose.
  • since the user avatar and the training avatar can be compared and analyzed in real time, the user's convenience and exercise training effect are improved.
  • a user-customized exercise training system includes a sensing unit that collects motion data by sensing a user's physical characteristics and motion, a calculation unit that analyzes the motion data and input data entered by the user and generates a training avatar based on the analyzed data, an interface unit that provides images and sound to the user and receives the user's input, and a control unit that controls the driving of each of the calculation unit and the interface unit. The calculation unit includes an analysis unit that analyzes the motion data and the input data, extracts a difference between them, and calculates a conversion factor, and an avatar generator that generates a user avatar based on the motion data, generates a target avatar based on the input data, and generates the training avatar by applying the conversion factor.
  • in the image provided to the user, the user avatar is synchronized to move in response to the user's movement, and the body shape of the training avatar is the same as that of the user avatar.
  • the motion data includes a numerical value of at least one of a plurality of motion elements, and the plurality of motion elements include the user's muscle strength, muscle mass, body type, posture, fat mass, body weight, range of motion, flexibility, agility, balance ability, cardiorespiratory ability, coordination ability, body-motor intelligence, kinesthetic ability, and the like.
  • the input data includes a target value for a specific motion element targeted by the user among the plurality of motion elements, and the avatar generator generates a target avatar that satisfies the target value for the specific motion element.
  • the conversion factor includes an amount of exercise, an operation maintenance time, a rest time, the number of operations, a breathing cycle, weight, calories consumed, an angular velocity of each joint, a linear velocity, and the like.
  • the interface unit may include a display unit that provides an image to the user, a speaker that provides sound to the user, and a controller that receives the user's input, wherein the display unit simultaneously displays the user avatar and the training avatar.
  • the sensing unit includes a motion sensing unit for measuring the user's basic body information, biometric information, and movement information of the joints, and a gaze tracking unit for sensing the user's gaze by measuring the movement of the user's pupil.
  • the motion sensing unit includes at least one of a camera, a gyro sensor, a pressure sensor, a geomagnetic sensor, a skin conductivity sensor, an infrared sensor, a laser sensor, an RF sensor, and an ultrasonic sensor.
  • the display unit enlarges or highlights an area sensed by the eye tracking unit within the image range provided by the display unit, and displays the enlarged or emphasized area.
  • the display unit is provided in the form of a head mounted display.
  • a user-customized exercise training method includes collecting motion data by sensing a user's physical characteristics and movement, generating a user avatar based on the motion data, generating a target avatar through the user's input, analyzing the motion data and the input data to extract a difference between them, analyzing the difference to calculate a conversion factor, generating a training avatar by applying the conversion factor, and prescribing an exercise to the user through the training avatar.
  • the motion data includes a numerical value of at least one of a plurality of motion elements, and the plurality of motion elements include the user's muscle strength, muscle mass, body type, posture, fat mass, body weight, range of motion, flexibility, agility, balance ability, cardiorespiratory ability, coordination ability, body-motor intelligence, and kinesthetic ability.
  • the generating of the target avatar includes the user selecting a specific motion element that the user targets from among the plurality of motion elements, the user selecting a target value for the specific motion element, and generating a target avatar that satisfies the target value for the specific motion element.
  • the difference includes a difference value, between the user avatar and the target avatar, of at least one of muscle strength, muscle mass, body type, posture, fat mass, weight, range of motion, flexibility, agility, balance ability, cardiorespiratory ability, coordination ability, body-motor intelligence, and kinesthetic ability.
  • the conversion factor includes an amount of exercise, an operation maintenance time, a rest time, the number of motions, a breathing cycle, weight, calories consumed, an angular velocity of each joint, a linear velocity, and the like.
  • prescribing the exercise to the user may include synchronizing the user avatar displayed on an image provided through a display so that it moves in response to the user's movement, and requesting the user to follow the movement of the training avatar displayed on the image.
  • the motion data is collected by measuring through at least one of a camera, a gyro sensor, a pressure sensor, a geomagnetic sensor, a skin conductivity sensor, an infrared sensor, a laser sensor, an RF sensor, and an ultrasonic sensor.
  • Prescribing exercise to the user may further include: collecting feedback data of the user during exercise; and resetting exercise prescription content based on the feedback data.
  • the motion data further includes interest information obtained by tracking the user's gaze by measuring the motion of the user's pupil.
  • since a training prescription corresponding to the user's physical characteristics is provided through a display during exercise training, user convenience can be increased and the user's exercise effect can be improved.
  • FIG. 1 is a block diagram schematically illustrating a user-customized exercise training system according to an embodiment of the present invention.
  • FIGS. 2A and 2B are diagrams illustrating a state in which a user exercises through a user-customized exercise training system.
  • FIG. 3 is a diagram exemplarily illustrating a user, a user avatar, a target avatar, and an avatar for training.
  • FIG. 4 is a diagram illustrating a user avatar and a training avatar displayed on the display unit.
  • FIG. 5 is a step diagram of a user-customized exercise training method according to an embodiment of the present invention.
  • a user-customized exercise training system includes a sensing unit that collects the user's motion data, and a calculation unit that analyzes the motion data and input data entered by the user and generates a training avatar based on the analyzed data.
  • the calculation unit includes an analysis unit that extracts a difference between the motion data and the input data and calculates a conversion factor, and an avatar generator that generates a user avatar based on the motion data, generates a target avatar based on the input data, and generates the training avatar by applying the conversion factor.
  • since the calculation unit generates a training avatar that has the same motion elements as those of the user avatar but to which the conversion factor is applied, the user can receive an exercise prescription suited to the user's physical condition and the exercise training effect is improved.
  • FIG. 1 is a block diagram schematically illustrating a user-customized exercise training system according to an embodiment of the present invention.
  • a user-customized exercise training system 1000 may provide a customized exercise prescription to a user through an avatar displayed on an image.
  • the user-customized exercise training system 1000 includes a sensing unit 100 , a calculation unit 200 , a control unit 300 , and an interface unit 400 .
  • the sensing unit 100 collects motion data by sensing the user's physical characteristics and movement.
  • the sensing unit 100 includes a motion sensing unit 110 and an eye tracking unit 120 .
  • the motion sensing unit 110 collects motion data by measuring the user's basic body shape information, biometric information, and joint motion information.
  • the basic body shape information may include body type, height, weight, body composition, muscle mass, body water content, metabolic rate, and the like.
  • the biometric information may be a pulse, blood pressure, body temperature, vital signs, a respiration rate, and the like.
  • the motion information of the joints may be an angle between the joints, an angular velocity, a linear velocity, and the like.
  • the motion sensing unit 110 may calculate motion data based on the measured basic body shape information, biometric information, and motion information of joints.
  • the motion data may be a numerical value of at least one of the plurality of motion elements.
  • the motion elements include the user's muscle strength, muscle mass, body type, posture, fat mass, weight, range of motion, flexibility, agility, balance ability, cardiorespiratory ability, coordination ability, body-motor intelligence, kinesthetic ability, and the like.
  • the motion sensing unit 110 includes a plurality of sensors.
  • the sensors may be installed at a predetermined distance from the user to photograph the user's movement, or may be worn, contacted, or attached to a part of the user's body to sense the user's movement.
  • the motion sensing unit 110 may include at least one of a camera, a gyro sensor, a pressure sensor, a geomagnetic sensor, a skin conductivity sensor, an infrared sensor, a laser sensor, an RF sensor, and an ultrasonic sensor.
  • the eye tracking unit 120 senses the user's gaze by measuring the movement of the user's pupil. Specifically, the eye tracking unit 120 may detect an area where the user gazes. The detected information is defined as interest information, and the interest information may be transmitted to the control unit 300.
  • motion data may be processed according to user data input to the user-customized exercise training system 1000 .
  • User data is data input by a user.
  • the user data may include the user's disease information, past medical diagnosis records, diet information, sleep amount information, and the like.
  • the calculation unit 200 includes an avatar generator 210 and an analysis unit 220.
  • the avatar generator 210 generates a plurality of avatars based on the motion data and the input data. A plurality of avatars are displayed on an image provided to the user.
  • the input data is data input into the system by the user, and may be, for example, information on a motion element of interest or preference and a target value for the corresponding motion element.
  • the analysis unit 220 extracts the difference between the motion data and the input data. A transform factor is calculated based on the difference.
  • the calculated information is transmitted to the avatar generator 210 , and the avatar generator 210 may generate an avatar to which a transformation element is applied.
  • the calculation unit 200 will be described in more detail below.
  • the control unit 300 controls the driving of each of the sensing unit 100, the calculation unit 200, and the interface unit 400, and transmits and receives data between the sensing unit 100, the calculation unit 200, and the interface unit 400.
  • the interface unit 400 provides an image and sound to a user and receives a user's input.
  • the interface unit 400 includes a display unit 410 , a speaker 420 , and a controller 430 .
  • the display unit 410 provides an image to the user.
  • the display unit 410 may be provided in a form mounted on a TV, tablet, PC, mobile electronic device, or the like.
  • the display unit 410 may be applied to a VR device that provides a 3D image to a user.
  • the display unit 410 may be a head mounted display (HMD) worn on the user's head.
  • the display unit 410 may be provided in the form of an AR display such as smart glasses or HoloLens.
  • the display unit 410 may enlarge or emphasize a specific area measured by the eye tracking unit 120 among the images provided by the display unit 410 .
  • for example, when the eye tracking unit 120 senses that the user is gazing at the leg of an avatar, the display unit 410 may enlarge the leg portion of the avatar, render it in a different color for emphasis, or reproduce its movement in slow motion, and provide the processed image to the user (a short sketch of this gaze-based emphasis follows this description).
  • the speaker 420 provides sound to the user.
  • the sound may be a sound effect, music, voice, or the like.
  • the controller 430 is a means for receiving a user's input. The user may express an opinion through the controller 430 .
  • the user may input motion data using the controller 430 .
  • the controller 430 may have a built-in gyro sensor or a geomagnetic sensor, and may sense a user's movement through the sensors.
  • various types of means capable of receiving a user's input may replace the controller 430.
  • the controller 430 may be replaced by a keyboard, a mouse, a button, a remote control, a microphone, or the like.
  • controller 430 may be omitted.
  • the user-customized exercise training system 1000 may further include a storage unit (not shown).
  • the storage unit (not shown) may store the user data, the motion data collected from the sensing unit 100, the input data, the plurality of avatars generated by the avatar generator 210, the conversion factor data calculated by the analysis unit 220, and the like.
  • FIGS. 2A and 2B are diagrams exemplarily illustrating a state in which a user exercises through a user-customized exercise training system.
  • referring to FIGS. 2A and 2B, a state in which the user US practices a golf swing through the corresponding system 1000 (FIG. 1) is exemplarily shown.
  • FIG. 2A illustrates a case in which the display unit 410 provides an image through a monitor or TV.
  • the user US may monitor his/her appearance through the image provided by the display unit 410 .
  • some sensors, such as a camera and an infrared sensor, are disposed on the display unit 410 to sense the appearance of the user US.
  • other sensors, such as a gyro sensor and a geomagnetic sensor, are provided in the form of pads PD attached to parts of the body of the user US, such as the shoulders, knees, and elbows, and sense the movement of the user US.
  • the golf club held by the user may serve as a controller 430 including sensors.
  • the control unit 300 (FIG. 1) is provided in the form of a set-top box (SB) to serve as a central processing unit.
  • FIG. 2B illustrates a case in which the display unit 410 is provided in the form of a head mounted display (HMD).
  • FIG. 3 is a diagram exemplarily illustrating a user, a user avatar, a target avatar, and an avatar for training.
  • the avatar generator 210 generates a user avatar AV1 , a target avatar AV2 , and a training avatar AV3 .
  • in FIG. 3, (a) shows the appearance of the user US, (b) the appearance of the user avatar AV1, (c) the appearance of the target avatar AV2, and (d) the appearance of the training avatar AV3.
  • the avatar generator 210 generates the user avatar AV1 based on the motion data of the user US. Accordingly, the motion data of the user avatar AV1 is the same as the motion data of the user US.
  • the display unit 410 may provide an image by synchronizing the movement of the user avatar AV1 in response to the movement of the user US.
  • the target avatar AV2 is generated based on input data input by the user US.
  • the user US may input a target value for a specific motion element among the plurality of motion elements into the system 1000, and the avatar generator 210 may generate a target avatar AV2 that satisfies the corresponding value.
  • that is, the avatar generator 210 creates the target avatar AV2 so that it reflects the target value input by the user.
  • the analysis unit 220 analyzes the user's motion data and input data, extracts a difference between the motion data and the input data, and calculates a conversion factor. That is, the analysis unit 220 extracts a difference between the user avatar AV1 and the target avatar AV2 to calculate the conversion factor (a sketch of this step follows this description).
  • the conversion factor is defined as an element requiring transformation in order to match the motion elements of the user avatar AV1 and the motion elements of the target avatar AV2.
  • the conversion factor may include an amount of exercise, an operation maintenance time, a rest time, the number of operations, a breathing cycle, a weight, calories consumed, an angular velocity and a linear velocity of each joint, and the like.
  • the user may check at least a part of the difference between the user avatar AV1 and the target avatar AV2 through the display unit 410 .
  • the display unit 410 may provide information such as a ratio of segment lengths of the user avatar AV1 and the target avatar AV2, linear velocity, angular velocity, and the like to the user.
  • the present invention is not particularly limited with respect to how the display unit 410 provides the difference information to the user.
  • the numerical values of the motion elements of the user avatar AV1 differ from the numerical values of the motion elements of the target avatar AV2. That is, human motion is determined by body conditions, physical fitness conditions, and body-motor intelligence conditions, and the values of these conditions for the user avatar AV1 and for the target avatar AV2 may be different.
  • examples of the body conditions may include height and the lengths and weights of body segments, and examples of the physical fitness conditions may include muscle strength, endurance, flexibility, and the like.
  • examples of the body-motor intelligence conditions may include sensory abilities, motor abilities, and the like.
  • the avatar generator 210 generates the training avatar AV3 by applying the conversion factor (a sketch of this step follows this description).
  • the body shape of the training avatar AV3 is the same as that of the user avatar AV1.
  • FIG. 4 is a diagram illustrating a user avatar and a training avatar displayed on the display unit.
  • the training avatar AV3 may be displayed together with the user avatar AV1 through an image provided by the display unit 410 .
  • the training avatar AV3 provides an exercise prescription to the user US.
  • the user US may perform an exercise by following the movement of the training avatar AV3, and may check his/her own movement through the user avatar AV1.
  • the sensing unit 100 may collect feedback data including movement information and biometric information of the user US in real time during exercise.
  • the calculation unit 200 may correct the motion data or reset the contents of the exercise prescription based on the feedback data (an end-to-end loop illustrating this appears in the sketches after this description).
  • the display unit 410 may provide the user with the difference between the motion of the user avatar AV1 and the motion of the training avatar AV3.
  • for example, the display unit 410 may enlarge the screen to display the difference between the motions, reproduce in slow motion a screen on which the difference is displayed, or graph the difference in motion with numerical values, and provide the result to the user.
  • when the user simply follows the target avatar AV2, the movement of the user US cannot be the same as the movement of the target avatar AV2, because the motion elements of the user US differ from those of the target avatar AV2.
  • in that case, the exercise effect may not match the predicted value. That is, an exercise prescription in which the input data of the user US is not properly reflected may be provided to the user US.
  • in contrast, since the calculation unit 200 generates the training avatar AV3, which has the same motion elements as the user avatar AV1 but to which the conversion factor is applied, a customized exercise prescription is provided to the user US and the exercise effect of the user US may be improved.
  • FIG. 5 is a step diagram of a user-customized exercise training method according to an embodiment of the present invention.
  • the same reference numerals are used for the same components as those described above, and redundant descriptions may be omitted.
  • the user-customized exercise training method includes a motion data collection step (S10), a user avatar generation step (S20), a target avatar generation step (S30), a step of comparing the user avatar and the target avatar to extract their difference (S40), a step of generating a training avatar by applying the difference (S50), and a step of prescribing an exercise through the training avatar (S60); the code sketches following this description walk through these steps.
  • motion data may be collected by sensing the user's physical characteristics and motion, and the motion data may include the numerical value of at least one motion element among a plurality of motion elements.
  • the user avatar AV1 may be generated based on motion data.
  • the motion element of the user avatar AV1 may be the same as that of the user.
  • the target avatar AV2 may be generated according to a user input, and the target avatar generating step (S30) includes the user selecting a specific motion element that the user targets from among the plurality of motion elements, the user selecting a target value for the specific motion element, and generating a target avatar AV2 that satisfies the target value for the specific motion element.
  • the difference may include a difference value, between the user avatar AV1 and the target avatar AV2, of at least one of muscle strength, muscle mass, body type, posture, fat mass, weight, range of motion, flexibility, agility, balance ability, cardiorespiratory ability, coordination ability, body-motor intelligence, and kinesthetic ability, and the analysis unit 220 may calculate a conversion factor based on the difference value.
  • the training avatar AV3 has the same operation elements as the user avatar AV1.
  • the body type of the training avatar AV3 is the same as the body type of the user avatar AV1.
  • the display unit 410 may provide the user avatar AV1 and the training avatar AV3 to the user through the displayed image, and the user US may perform an exercise by following the movement of the training avatar AV3 and may check his or her own movement through the user avatar AV1.
  • the present invention extracts a difference between a target avatar having a motion element targeted by the user and a user avatar having the same motion element as the user, and generates a training avatar having the same motion element as the user but to which the difference is applied. Since the user can be prescribed an exercise suitable for the user's physical condition through the training avatar, the present invention has high industrial applicability in terms of improving the user's exercise effect.
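
The sketches below are illustrative only; the publication does not specify data formats, algorithms, or any programming language, so every class, function, and variable name is a hypothetical assumption. This first sketch models the motion data as a dictionary of motion-element values and shows how a user avatar (S20) and a target avatar (S30) could be derived from the motion data and the user's input data.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical representation: each motion element (muscle strength, flexibility,
# agility, ...) is a named numerical value, as described for the motion data above.
MotionElements = Dict[str, float]

@dataclass
class Avatar:
    body_shape: Dict[str, float]      # e.g. segment lengths and body-type parameters
    motion_elements: MotionElements   # numerical values of the motion elements

def generate_user_avatar(motion_data: MotionElements,
                         body_shape: Dict[str, float]) -> Avatar:
    """S20: the user avatar carries the same motion elements as the measured user."""
    return Avatar(body_shape=dict(body_shape), motion_elements=dict(motion_data))

def generate_target_avatar(user_avatar: Avatar,
                           input_data: MotionElements) -> Avatar:
    """S30: the target avatar satisfies the target values the user entered for
    specific motion elements; elements the user did not target stay as measured.
    How the target avatar's body shape is chosen is not fixed by the publication;
    the user's body shape is simply reused here for brevity."""
    targets = dict(user_avatar.motion_elements)
    targets.update(input_data)
    return Avatar(body_shape=dict(user_avatar.body_shape), motion_elements=targets)
```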
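
Continuing the sketch above, the analysis unit's role (S40) is to extract the difference between the user avatar and the target avatar and turn it into a conversion factor. The publication lists what a conversion factor may contain (amount of exercise, motion maintenance time, rest time, number of motions, breathing cycle, weight, calories, joint angular and linear velocity) but not how it is computed, so the mapping below is a deliberately simple placeholder.

```python
def extract_difference(user: Avatar, target: Avatar) -> MotionElements:
    """S40: per-element difference value between the target avatar and the user avatar."""
    return {name: target.motion_elements[name] - user.motion_elements.get(name, 0.0)
            for name in target.motion_elements}

def calculate_conversion_factor(difference: MotionElements,
                                step_ratio: float = 0.1) -> MotionElements:
    """Placeholder rule: move each motion element a fixed fraction of the way toward
    the target per prescription cycle. A real system would derive repetitions, rest
    times, joint velocities, etc. from exercise-science rules the patent leaves open."""
    return {name: delta * step_ratio for name, delta in difference.items()}
```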
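
The training avatar (S50) keeps the user avatar's body shape and motion elements and applies the conversion factor, so the user follows a model matched to his or her own body rather than an expert's. A minimal continuation of the sketch:

```python
def generate_training_avatar(user: Avatar,
                             conversion_factor: MotionElements) -> Avatar:
    """S50: same body shape as the user avatar; motion elements adjusted by the
    conversion factor calculated by the analysis unit."""
    names = set(user.motion_elements) | set(conversion_factor)
    adjusted = {name: user.motion_elements.get(name, 0.0)
                      + conversion_factor.get(name, 0.0)
                for name in names}
    return Avatar(body_shape=dict(user.body_shape), motion_elements=adjusted)
```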
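
Putting the sketches together, the example below runs a few hypothetical prescription cycles (S10 through S60) with made-up numbers and then re-derives the prescription from feedback data, mirroring the feedback collection and prescription reset described above. The numbers and the simulated feedback are purely illustrative.

```python
# Hypothetical measured motion data (S10) and user-entered targets (input data).
measured = {"muscle_strength": 40.0, "flexibility": 30.0, "agility": 55.0}
body_shape = {"height_cm": 175.0, "arm_length_cm": 60.0}
targets = {"muscle_strength": 50.0, "flexibility": 45.0}

user_avatar = generate_user_avatar(measured, body_shape)      # S20
target_avatar = generate_target_avatar(user_avatar, targets)  # S30

for session in range(3):
    difference = extract_difference(user_avatar, target_avatar)      # S40
    factor = calculate_conversion_factor(difference)
    training_avatar = generate_training_avatar(user_avatar, factor)  # S50
    print(f"session {session}: prescribe {training_avatar.motion_elements}")  # S60

    # Feedback data collected during exercise (faked here as partial progress)
    # is used to correct the motion data and reset the prescription contents.
    feedback = {name: value + 0.5 * factor.get(name, 0.0)
                for name, value in user_avatar.motion_elements.items()}
    user_avatar = generate_user_avatar(feedback, body_shape)
```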
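
Finally, the gaze-based emphasis described for the display unit 410 and the eye tracking unit 120 can be sketched as a simple post-processing step. The display and frame interfaces used here (compose, zoom, highlight, slow_motion, present) are assumptions for illustration, not an API defined by the publication.

```python
def render_with_gaze_emphasis(display_unit, avatars, gaze_region=None):
    """When the eye tracking unit reports a region of interest (e.g. an avatar's leg),
    enlarge it, emphasize it in a different color, or slow its motion before display."""
    frame = display_unit.compose(avatars)
    if gaze_region is not None:
        frame = frame.zoom(gaze_region, factor=2.0)         # enlarge the gazed area
        frame = frame.highlight(gaze_region, color="red")   # emphasize with color
        frame = frame.slow_motion(gaze_region, rate=0.5)    # replay the area slowly
    display_unit.present(frame)
```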

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Ophthalmology & Optometry (AREA)
  • Processing Or Creating Images (AREA)
  • Rehabilitation Tools (AREA)

Abstract

A user-customized exercise training system according to one embodiment of the present invention comprises: a sensing unit that collects a user's motion data; a calculation unit that analyzes the motion data and input data entered by the user, and generates a training avatar based on the analyzed data; an interface unit that provides images and sound to the user and receives inputs from the user; and a control unit that transmits and receives data between the sensing unit, the calculation unit, and the interface unit, and controls the operation of each of them. The calculation unit comprises: an analysis unit that extracts differences between the motion data and the input data, and calculates a conversion factor; and an avatar generation unit that generates a user avatar based on the motion data, generates a target avatar based on the input data, and generates the training avatar by applying the conversion factor.
PCT/KR2021/000771 2020-12-31 2021-01-20 Procédé et système d'entraînement à l'exercice personnalisé pour l'utilisateur WO2022145563A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2020-0188963 2020-12-31
KR1020200188963A KR102556863B1 (ko) 2020-12-31 2020-12-31 사용자 맞춤형 운동 훈련 방법 및 시스템

Publications (1)

Publication Number Publication Date
WO2022145563A1 true WO2022145563A1 (fr) 2022-07-07

Family

ID=82259444

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2021/000771 WO2022145563A1 (fr) 2020-12-31 2021-01-20 Procédé et système d'entraînement à l'exercice personnalisé pour l'utilisateur

Country Status (2)

Country Link
KR (1) KR102556863B1 (fr)
WO (1) WO2022145563A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102613638B1 (ko) 2022-12-08 2023-12-14 주식회사 테렌즈랩스 단말에 설치된 어플리케이션 기반의 건강관리 시스템 및 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016527536A (ja) * 2013-06-07 2016-09-08 株式会社ソニー・インタラクティブエンタテインメント ヘッドマウントディスプレイでユーザーの動きに応答する画像レンダリング
US20170103672A1 (en) * 2015-10-09 2017-04-13 The Regents Of The University Of California System and method for gesture capture and real-time cloud based avatar training
US20180207484A1 (en) * 2017-01-26 2018-07-26 International Business Machines Corporation Remote physical training
KR101904889B1 (ko) * 2016-04-21 2018-10-05 주식회사 비주얼캠프 표시 장치와 이를 이용한 입력 처리 방법 및 시스템
KR102125748B1 (ko) * 2018-08-23 2020-06-23 전자부품연구원 4d 아바타를 이용한 동작가이드장치 및 방법

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160049089A1 (en) * 2013-03-13 2016-02-18 James Witt Method and apparatus for teaching repetitive kinesthetic motion
KR101931784B1 (ko) 2018-08-29 2018-12-21 주식회사 큐랩 복수의 사용자 대결 구도의 가상 피트니스 시스템

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016527536A (ja) * 2013-06-07 2016-09-08 株式会社ソニー・インタラクティブエンタテインメント ヘッドマウントディスプレイでユーザーの動きに応答する画像レンダリング
US20170103672A1 (en) * 2015-10-09 2017-04-13 The Regents Of The University Of California System and method for gesture capture and real-time cloud based avatar training
KR101904889B1 (ko) * 2016-04-21 2018-10-05 주식회사 비주얼캠프 표시 장치와 이를 이용한 입력 처리 방법 및 시스템
US20180207484A1 (en) * 2017-01-26 2018-07-26 International Business Machines Corporation Remote physical training
KR102125748B1 (ko) * 2018-08-23 2020-06-23 전자부품연구원 4d 아바타를 이용한 동작가이드장치 및 방법

Also Published As

Publication number Publication date
KR102556863B1 (ko) 2023-07-20
KR20220098064A (ko) 2022-07-11

Similar Documents

Publication Publication Date Title
KR100772497B1 (ko) 골프 클리닉 시스템 및 그것의 운용방법
US8287434B2 (en) Method and apparatus for facilitating strength training
Bleser et al. A personalized exercise trainer for the elderly
KR20160054325A (ko) 맞춤형 개인 트레이닝 관리 시스템 및 방법
KR101999748B1 (ko) IoT 운동기구, 운동지도시스템, 및 이를 이용한 운동지도방법
JP7492722B2 (ja) 運動評価システム
JP2020174910A (ja) 運動支援システム
Gauthier et al. Human movement quantification using Kinect for in-home physical exercise monitoring
Nown et al. A mapping review of real-time movement sonification systems for movement rehabilitation
WO2022145563A1 (fr) Procédé et système d'entraînement à l'exercice personnalisé pour l'utilisateur
WO2022193425A1 (fr) Procédé et système d'affichage de données d'exercice
JP2016035651A (ja) 在宅リハビリテーションシステム
WO2014104463A1 (fr) Dispositif de santé et de réadaptation basé sur une interaction naturelle
Hidayat et al. LOVETT scalling with MYO armband for monitoring finger muscles therapy of post-stroke people
WO2017217567A1 (fr) Système de surveillance de la condition physique
JP7023004B2 (ja) 運動解析システム、運動解析プログラム、及び運動解析方法
JP2023162333A (ja) トレーニング機器の制御方法
WO2016204334A1 (fr) Système d'exercice basé sur des contenus interactifs immersifs et procédé associé
JP2018187284A (ja) 運動状態診断システムおよび運動状態診断プログラム
WO2021261529A1 (fr) Système d'assistance à l'exercice physique
WO2022030708A1 (fr) Système de rétroaction auditive par modulation musicale utilisant un écouteur sans fil et un téléphone intelligent
Gupta et al. shoulder Rehabilitation using Virtual Physiotherapy System
JP2021068069A (ja) 無人トレーニングの提供方法
WO2023075052A1 (fr) Dispositif d'accompagnement d'exercice à base d'intelligence artificielle utilisant la ludification
WO2023127870A1 (fr) Dispositif d'aide aux soins, programme d'aide aux soins et procédé d'aide aux soins

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21915363

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21915363

Country of ref document: EP

Kind code of ref document: A1