WO2018030734A1 - 3D simulation method and apparatus - Google Patents

3D simulation method and apparatus

Info

Publication number
WO2018030734A1
WO2018030734A1 (PCT/KR2017/008506; KR2017008506W)
Authority
WO
WIPO (PCT)
Prior art keywords
motion
parameter
user
trajectory
generating
Prior art date
Application number
PCT/KR2017/008506
Other languages
English (en)
French (fr)
Korean (ko)
Inventor
정주호
정창근
유성재
Original Assignee
주식회사 비플렉스
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020160101491A (KR101830371B1)
Priority claimed from KR1020160101489A (KR101926170B1)
Priority claimed from KR1020170030402A (KR101995484B1)
Priority claimed from KR1020170030394A (KR101995482B1)
Priority claimed from KR1020170079255A (KR101970674B1)
Application filed by 주식회사 비플렉스
Priority to CN201780037461.8A (CN109310913B)
Publication of WO2018030734A1
Priority to US16/272,201 (US11497966B2)

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00 - Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 69/00 - Training appliances or apparatus for special sports
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 71/00 - Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B 71/06 - Indicating or scoring devices for games or players, or for other sports activities

Definitions

  • the present invention relates to a 3D simulation method and apparatus, and more particularly, to a method and apparatus for simulating a user's walking and running motion in three dimensions.
  • the present invention has been made in view of the above-described technical problem, and its object is to substantially obviate the problems caused by the limitations and disadvantages of the prior art by providing a method and apparatus for simulating a user's walking and running motion in three dimensions, and a computer-readable recording medium having recorded thereon a program for executing the method.
  • the 3D simulation method comprises: receiving a model parameter, a pose parameter, a motion space parameter, and a motion time parameter based on an external input; Generating a 3D model of a user based on the model parameter and the pose parameter; Generating a motion trajectory of the user based on the motion space parameter and the motion time parameter; And generating the 3D simulation of the user by applying the motion trajectory to the 3D model.
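For illustration only, the four claimed steps can be sketched as a toy pipeline. All function names, dictionary keys, and numeric choices below are assumptions made for the sketch, not part of the disclosure:

```python
import math

def generate_3d_model(model_params, pose_params):
    """Step 2: build a toy '3D model' -- here just joint rest positions scaled
    by the user's height, plus the posture's step width. A real implementation
    would produce a skeletal mesh."""
    h = model_params["height"]
    return {"neck": (0.0, 0.0, 0.87 * h),
            "waist": (0.0, 0.0, 0.55 * h),
            "step_width": pose_params["step_width"]}

def generate_motion_trajectory(space_params, time_params, n_samples=8):
    """Step 3: a vertical waist trajectory whose amplitude follows the motion
    space parameter and whose cycle period follows the motion time parameter."""
    period = time_params["stance_time"] + time_params["flight_time"]
    amp = space_params["vertical_oscillation"]
    offsets = [amp * math.sin(2 * math.pi * i / n_samples) for i in range(n_samples)]
    return offsets, period

def build_3d_simulation(model_p, pose_p, space_p, time_p):
    """Steps 1-4 of the claimed method, end to end."""
    model = generate_3d_model(model_p, pose_p)
    offsets, period = generate_motion_trajectory(space_p, time_p)
    wx, wy, wz = model["waist"]
    # Step 4: apply the trajectory to the model (offset the waist vertically).
    frames = [{"waist": (wx, wy, wz + dz)} for dz in offsets]
    return {"frames": frames, "cycle_period_s": period}

sim = build_3d_simulation({"height": 1.75}, {"step_width": 0.12},
                          {"vertical_oscillation": 0.08},
                          {"stance_time": 0.25, "flight_time": 0.12})
```

The sketch only animates the waist; the claim's actual per-joint trajectory generation is described further below.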
  • the external input may be input from a user or may be input by recognizing a user's exercise from an exercise recognition device.
  • the model parameter is a parameter related to the appearance of the user;
  • the posture parameter is a parameter relating to the posture of the user;
  • the motion space parameter is a parameter relating to a spatial trajectory of the user's motion;
  • the motion time parameter is a parameter related to a time trajectory of the motion of the user.
  • the model parameter includes at least one of height, weight, foot length, leg length, age, gender, and wearing information.
  • the pose parameter includes at least one of step width, step angle, and head vertical angle.
  • when the user runs, the motion space parameter includes at least one of vertical oscillation during stance, vertical oscillation during flight, instantaneous vertical loading rate, average vertical loading rate, impact, stability, balance, step length, foot strike pattern, pelvic vertical rotation, pelvic lateral rotation, and head lateral angle.
  • when the user runs, the motion time parameter includes at least one of single stance time, single flight time, and cadence (steps per minute).
  • when the user walks, the motion space parameter includes at least one of vertical oscillation during single stance, vertical oscillation during double stance, stability, balance, step length, foot strike pattern, pelvic vertical rotation, pelvic lateral rotation, and head lateral angle.
  • when the user walks, the motion time parameter includes at least one of single stance time, double stance time, and cadence (steps per minute).
  • the generating of the motion trajectory of the user uses exercise motion data that models a predetermined exercise motion, and basic motion data independent of the motion space parameter and the motion time parameter.
  • the exercise motion data and the basic motion data are four-step data including a left foot support section, a left foot float section, a right foot support section, and a right foot float section.
  • the exercise motion data and the basic motion data are four-step data including a left foot support section, a double support section, a right foot support section, and a double support section.
  • the four-step data includes motion trajectory values along the vertical, left-right, and front-back axes for each joint.
  • said joint is at least one of the neck, shoulder, waist, knee, arm, elbow, ankle and toe.
  • the generating of the motion trajectory of the user may include: generating a first adjustment value by applying a gain based on the motion space parameter to the exercise motion data; and generating a second adjustment value by applying a gain based on the motion time parameter to the first adjustment value.
  • the generating of the motion trajectory of the user may include: generating a first adjustment value by applying a gain based on the motion time parameter to the exercise motion data; and generating a second adjustment value by applying a gain based on the motion space parameter to the first adjustment value.
  • generating the motion trajectory of the user further includes merging the basic motion data and the second adjustment value.
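A minimal sketch of the two-stage gain adjustment and the merge with basic motion data described above. The gain forms, the nearest-neighbour resampling, and all names are illustrative assumptions, not the disclosed implementation:

```python
import math

def adjust_space(motion_data, space_gain):
    """First adjustment: scale each axis by a gain derived from the motion
    space parameter (e.g. a gain < 1 on z flattens vertical oscillation)."""
    return {axis: [v * space_gain.get(axis, 1.0) for v in samples]
            for axis, samples in motion_data.items()}

def adjust_time(motion_data, time_gain):
    """Second adjustment: resample each axis so the cycle lasts time_gain
    times as long (nearest-neighbour resampling keeps the sketch short)."""
    out = {}
    for axis, samples in motion_data.items():
        n_new = max(1, round(len(samples) * time_gain))
        out[axis] = [samples[min(len(samples) - 1, int(i / time_gain))]
                     for i in range(n_new)]
    return out

def merge(basic_data, adjusted_data):
    """Merge: parameter-independent basic motions (arm swing etc.) pass
    through; the adjusted joint trajectories are added alongside them."""
    return {**basic_data, **adjusted_data}

# One cycle of a toy waist trajectory: z oscillates, x advances.
waist = {"z": [math.sin(2 * math.pi * i / 8) for i in range(8)],
         "x": [i / 8 for i in range(8)]}
first = adjust_space(waist, {"z": 0.5})   # halve the vertical oscillation
second = adjust_time(first, 1.5)          # slow the cycle by a factor of 1.5
trajectory = merge({"arm_swing": [0.1] * 8}, second)
```

Swapping the two adjustment stages gives the alternative order described in the preceding paragraph.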
  • the present invention includes a computer-readable recording medium on which a program for performing the method is recorded.
  • the 3D simulation apparatus includes a parameter input unit for receiving a model parameter, a posture parameter, a motion space parameter, and a motion time parameter based on an external input; A 3D model generator for generating a 3D model of the user based on the model parameter and the pose parameter; A motion trajectory generation unit generating a motion trajectory of the user based on the motion space parameter and the motion time parameter; And a 3D simulation generator configured to generate the 3D simulation of the user by applying the motion trajectory to the 3D model.
  • a 3D model and a motion trajectory are generated based on a model parameter, a posture parameter, a motion space parameter, and a motion time parameter input from an exercise recognition device or a user, to provide a 3D simulation of the exercise posture and related information.
  • Through the 3D simulation according to the present invention, the user can effectively and accurately recognize, detect, and analyze his or her running and walking state, and can correct his or her posture based on the 3D simulation analysis.
  • FIG. 1 illustrates a 3D simulation result according to an embodiment of the present invention.
  • FIG. 2 is a block diagram of a 3D simulation apparatus according to an embodiment of the present invention.
  • FIG. 3 illustrates a 3D model of a user reflecting model parameters according to an embodiment of the present invention.
  • FIG. 4 illustrates a 3D model of a user reflecting a posture parameter according to an embodiment of the present invention.
  • FIG. 5 illustrates the four-step exercise motion when the exercise motion is running, according to an embodiment of the present invention.
  • FIG. 6 illustrates the four-step exercise motion when the exercise motion is walking, according to an embodiment of the present invention.
  • FIG. 7 is a block diagram of a motion trajectory generation unit according to an embodiment of the present invention.
  • FIG. 8 is an exemplary view of adjusting and reflecting a motion space parameter in the exercise motion data according to an embodiment of the present invention.
  • FIG. 9 is an exemplary view of adjusting and reflecting a motion time parameter in the exercise motion data according to an embodiment of the present invention.
  • FIG. 10 is a flowchart of a 3D simulation method according to an embodiment of the present invention.
  • FIG. 1 illustrates a 3D simulation result according to an embodiment of the present invention.
  • the 3D simulation apparatus 200 receives a model parameter, a posture parameter, a motion space parameter, and a motion time parameter based on an external input, and generates a 3D model and a motion trajectory of the exercising user based on these inputs to create a 3D simulation.
  • Using model parameters such as the user's height, foot length, and leg length; posture parameters such as step width and step angle; motion space parameters such as step length, stability, and balance; and motion time parameters such as flight time, the 3D simulation apparatus 200 simulates the user's walking and running motions in 3D.
  • FIG. 2 is a block diagram of a 3D simulation apparatus according to an embodiment of the present invention.
  • the 3D simulation apparatus 200 includes a parameter input unit 210, a 3D model generator 230, a motion trajectory generator 250, and a 3D simulation generator 270.
  • the parameter input unit 210 receives the model parameter 211, the posture parameter 213, the motion space parameter 215, and the motion time parameter 217 based on the external input.
  • the external input may be input from a user or may be input by recognizing a user's exercise from an exercise recognition device.
  • For the exercise recognition device, refer to the Korean patent applications 'Motion recognition method and apparatus for walking and running monitoring' (Application No. 10-2016-0101489, filed 2016-08-09), 'Method and apparatus for deriving exercise posture based on center-of-pressure path' (Application No. 10-2016-0101491, filed 2016-08-09), and 'Motion recognition method and apparatus for walking and running monitoring' (Application No. 10-2017-0030394).
  • the model parameter 211 is a parameter related to the user's appearance, and includes at least one of height, weight, foot length, leg length, age, gender, and wearing information.
  • the wear information includes the type, name and brand of the product worn by the user. Products worn by the user include accessories such as watches, clothes, shoes, and the like.
  • the posture parameter 213 is a parameter related to the user's posture and includes at least one of step width, step angle, and head vertical angle. Step width is the average distance between the legs, step angle is the average angle between the legs, and head vertical angle is the average up-down angle of the head.
  • the motion space parameter 215 is a parameter related to the spatial trajectory of the user's motion.
  • when the user runs, the motion space parameter 215 includes at least one of vertical oscillation during stance, vertical oscillation during flight, instantaneous vertical loading rate (IVLR), average vertical loading rate (AVLR), impact, stability, balance, step length, foot strike pattern, pelvic vertical rotation, pelvic lateral rotation, and head lateral angle.
  • when the user walks, the motion space parameter 215 includes at least one of vertical oscillation during single stance, vertical oscillation during double stance, stability, balance, step length, foot strike pattern, pelvic vertical rotation, pelvic lateral rotation, and head lateral angle.
  • vertical oscillation during stance means the vertical movement distance (in meters) during the stance phase.
  • vertical oscillation during flight means the vertical movement distance during the flight phase.
  • the instantaneous vertical loading rate (in Newtons per second) is the maximum slope of the ground reaction force during the stance phase.
  • the average vertical loading rate (in Newtons per second) is the average slope of the ground reaction force during the stance phase. Impact refers to the impact force (in Newtons) of the ground reaction force during the stance phase.
  • Stability refers to how consistently the movement is maintained for each of the left and right legs in timing, strength, and so on, and is expressed as a percentage using the coefficient of variation (CV) of each leg.
  • Values that can be used as the evaluation index include peak vertical force, peak vertical acceleration, stance-phase impact, stance time, flight time, average vertical loading rate, and instantaneous vertical loading rate.
  • Balance represents the left-right imbalance (%), computed from the difference between the left and right legs.
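The stability and balance metrics above can be illustrated as follows. Note that the imbalance formula shown is a common symmetric form chosen as an assumption, since the patent's own equation is not reproduced in this text:

```python
import statistics

def stability_cv(per_step_values):
    """Stability for one leg: coefficient of variation (%) of a per-step
    metric such as stance time or peak vertical force."""
    mean = statistics.fmean(per_step_values)
    return 100.0 * statistics.pstdev(per_step_values) / mean

def lr_imbalance(left_value, right_value):
    """Left/right imbalance (%): a common symmetric form, used here as an
    assumption -- not the equation from the patent."""
    return 100.0 * abs(left_value - right_value) / ((left_value + right_value) / 2)

# Toy per-step stance times (seconds) for each leg.
left_stance = [0.250, 0.252, 0.248, 0.251]
right_stance = [0.260, 0.262, 0.258, 0.261]
left_cv = stability_cv(left_stance)       # lower CV -> more stable leg
imbalance = lr_imbalance(statistics.fmean(left_stance),
                         statistics.fmean(right_stance))
```

Any of the evaluation indices listed above (peak vertical force, stance time, flight time, loading rates) could be fed into `stability_cv` in the same way.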
  • Step length refers to the distance traveled forward during one stance phase and the following flight phase, and the foot strike pattern indicates which part of the foot lands first.
  • the foot strike pattern may be one of a forefoot, rearfoot, or midfoot strike.
  • the pelvic vertical rotation and the pelvic lateral rotation mean the degree of vertical and horizontal distortion of the pelvis, respectively.
  • Head lateral angle is an average value of the left and right angles of the head.
  • the motion time parameter 217 is a parameter related to the time trajectory of the user's motion.
  • when the user runs, the motion time parameter 217 includes at least one of single stance time, single flight time, and cadence (steps per minute).
  • when the user walks, the motion time parameter 217 includes at least one of single stance time, double stance time, and cadence (steps per minute).
  • the 3D model generator 230 generates a 3D model of the user based on the model parameter 211 and the posture parameter 213.
  • the motion trajectory generation unit 250 generates a motion trajectory of the user based on the motion space parameter 215 and the motion time parameter 217. The detailed operation of the motion trajectory generation unit 250 will be described later in detail with reference to FIGS. 5 to 9.
  • the 3D simulation generator 270 generates the 3D simulation of the user by applying the motion trajectory to the 3D model.
  • FIG. 3 illustrates a 3D model of a user reflecting model parameters according to an embodiment of the present invention.
  • the 3D model is generated by reflecting the height of the user among the model parameters 211.
  • FIG. 4 illustrates a 3D model of a user reflecting a posture parameter according to an embodiment of the present invention.
  • the 3D model is generated by reflecting the user's step width and step angle among the posture parameters 213.
  • FIG. 5 illustrates the four-step exercise motion when the exercise motion is running, according to an embodiment of the present invention.
  • the exercise motion repeats four phases: the left foot support section, the left foot float section, the right foot support section, and the right foot float section.
  • FIG. 6 illustrates the four-step exercise motion when the exercise motion is walking, according to an embodiment of the present invention.
  • the exercise motion repeats four phases: the left foot support section, the double support section, the right foot support section, and the double support section.
  • FIG. 7 is a block diagram of a motion trajectory generation unit according to an embodiment of the present invention.
  • the motion trajectory generation unit 700 includes the exercise motion data 720 and the basic motion data 730.
  • the exercise motion data 720 is data stored in advance by modeling a predetermined exercise motion.
  • the basic motion data 730 is data stored in advance as motion data independent of the motion space parameter 715 and the motion time parameter 717.
  • the basic motion data 730 may be data about motions that vary little, such as arm swing or upper-body rotation.
  • the exercise motion data 720 and the basic motion data 730 are four-step data including a left foot support section, a left foot float section, a right foot support section, and a right foot float section.
  • the exercise motion data 720 and the basic motion data 730 are four-step data including a left foot support section, a double support section, a right foot support section, and a double support section.
  • Each step includes three-axis motion trajectory values of the vertical axis (z axis), the left and right axis (y axis), and the front and rear axis (x axis) for each joint.
  • the joint is at least one of a neck, shoulder, waist, knee, arm, elbow, ankle and toe.
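A hypothetical layout for the four-step data described above. The phase and joint names follow the text; the sampling resolution and nested-dictionary structure are assumptions for illustration:

```python
# Hypothetical layout of the four-step data: for each phase, each joint stores
# sampled trajectory values along the front-back (x), left-right (y), and
# vertical (z) axes. Phase names follow the running embodiment in the text.
RUN_PHASES = ("left_support", "left_float", "right_support", "right_float")
JOINTS = ("neck", "shoulder", "waist", "knee", "arm", "elbow", "ankle", "toe")

def empty_motion_data(samples_per_phase=16):
    """Allocate an all-zero container for one gait cycle of trajectory data."""
    return {phase: {joint: {"x": [0.0] * samples_per_phase,
                            "y": [0.0] * samples_per_phase,
                            "z": [0.0] * samples_per_phase}
                    for joint in JOINTS}
            for phase in RUN_PHASES}

data = empty_motion_data()
```

For the walking embodiment the two float phases would be replaced by double support phases.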
  • the motion trajectory generation unit 700 further includes a motion space parameter adjusting unit 750 and a motion time parameter adjusting unit 770.
  • the motion space parameter adjusting unit 750 generates a first adjustment value by applying a gain based on the motion space parameter 715 to the exercise motion data 720, and the motion time parameter adjusting unit 770 generates a second adjustment value by applying a gain based on the motion time parameter 717 to the first adjustment value.
  • in another embodiment, the motion time parameter adjusting unit 770 generates the first adjustment value by applying a gain based on the motion time parameter 717 to the exercise motion data 720, and the motion space parameter adjusting unit 750 generates the second adjustment value by applying a gain based on the motion space parameter 715 to the first adjustment value.
  • the motion trajectory generation unit 700 further includes a motion trajectory merger 790.
  • the motion trajectory merging unit 790 merges the basic motion data and the second adjustment value to generate a motion trajectory of the user.
  • FIG. 8 is an exemplary view of adjusting and reflecting a motion space parameter in the exercise motion data according to an embodiment of the present invention.
  • the typical three-axis waist motion trajectory stored in the exercise motion data 720 for the running motion is shown as a solid blue line.
  • the motion space parameter adjusting unit 750 generates a first adjustment value by applying a gain based on the motion space parameter 715 to the three-axis waist motion trajectory.
  • the z-axis component of the first adjustment value has a reduced amplitude compared to the typical three-axis waist motion trajectory stored in the exercise motion data 720.
  • FIG. 9 is an exemplary view of adjusting and reflecting a motion time parameter in the exercise motion data according to another embodiment of the present invention.
  • the typical three-axis waist motion trajectory stored in the exercise motion data 720 for the running motion is shown as a solid blue line.
  • the motion time parameter adjusting unit generates a first adjustment value by applying a gain based on the motion time parameter to the three-axis waist motion trajectory.
  • the z-axis component of the first adjustment value has a shortened right foot support time compared to the typical three-axis waist motion trajectory stored in the exercise motion data.
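The phase-specific effect described for FIG. 9 (a shortened right foot support time while the other phases stay unchanged) can be sketched as resampling a single phase. The helper and the toy data are assumptions for illustration:

```python
def scale_phase(phase_samples, factor):
    """Shrink or stretch one phase's sample list by nearest-neighbour
    resampling (factor < 1 shortens the phase)."""
    n_new = max(1, round(len(phase_samples) * factor))
    step = len(phase_samples) / n_new
    return [phase_samples[min(len(phase_samples) - 1, int(i * step))]
            for i in range(n_new)]

# Toy waist z-values, 8 samples per phase, in the running phase order.
cycle = {"left_support": [0.02] * 8, "left_float": [0.05] * 8,
         "right_support": [0.02] * 8, "right_float": [0.05] * 8}

# Reflecting a shorter right foot support time (gain 0.5) resamples only
# that phase; the other three phases are untouched, as in the FIG. 9 example.
cycle["right_support"] = scale_phase(cycle["right_support"], 0.5)
```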
  • FIG. 10 is a flowchart of a 3D simulation method according to an embodiment of the present invention.
  • the 3D simulation apparatus 200 receives a model parameter, a pose parameter, a motion space parameter, and a motion time parameter based on an external input.
  • the external input may be input from a user or may be input by recognizing a user's exercise from an exercise recognition device.
  • the 3D simulation apparatus 200 generates a 3D model of the user based on the model parameter and the pose parameter.
  • the 3D simulation apparatus 200 generates a motion trajectory of the user based on the motion space parameter and the motion time parameter.
  • the generating of the motion trajectory of the user uses exercise motion data that models a predetermined exercise motion, and basic motion data independent of the motion space parameter and the motion time parameter.
  • the exercise motion data and the basic motion data are four-step data including a left foot support section, a left foot float section, a right foot support section, and a right foot float section.
  • the exercise motion data and the basic motion data are four-step data including a left foot support section, a double support section, a right foot support section, and a double support section.
  • the four-step data includes motion trajectory values along the vertical, left-right, and front-back axes for each joint.
  • the joint is at least one of a neck, shoulder, waist, knee, arm, elbow, ankle and toe.
  • the generating of the motion trajectory of the user may include generating a first adjustment value by applying a gain based on the motion space parameter to the exercise motion data, and generating a second adjustment value by applying a gain based on the motion time parameter to the first adjustment value.
  • the generating of the motion trajectory of the user may include generating a first adjustment value by applying a gain based on the motion time parameter to the exercise motion data, and generating a second adjustment value by applying a gain based on the motion space parameter to the first adjustment value.
  • the generating of the motion trajectory of the user further includes merging the basic motion data and the second adjustment value.
  • the 3D simulation apparatus 200 generates the 3D simulation of the user by applying the motion trajectory to the 3D model.
  • an apparatus may include a bus coupled to each unit of the apparatus as shown, at least one processor coupled to the bus, and a memory coupled to the bus and to the at least one processor for storing commands and received or generated messages, the processor executing the commands as described above.
  • the system according to the present invention can be embodied as computer readable codes on a computer readable recording medium.
  • the computer-readable recording medium includes all kinds of recording devices in which data that can be read by a computer system is stored.
  • the computer-readable recording medium may include a magnetic storage medium (e.g., ROM, floppy disk, hard disk, etc.) and an optical reading medium (e.g., CD-ROM, DVD, etc.).
  • the computer readable recording medium can also be distributed over network coupled computer systems so that the computer readable code is stored and executed in a distributed fashion.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Surgery (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Physiology (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Processing Or Creating Images (AREA)
  • Rehabilitation Tools (AREA)
PCT/KR2017/008506 2016-08-09 2017-08-07 3D simulation method and apparatus WO2018030734A1 (ko)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201780037461.8A CN109310913B (zh) 2016-08-09 2017-08-07 Three-dimensional simulation method and apparatus
US16/272,201 US11497966B2 (en) 2016-08-09 2019-02-11 Automatic coaching system and method for coaching user's exercise

Applications Claiming Priority (10)

Application Number Priority Date Filing Date Title
KR1020160101491A KR101830371B1 (ko) 2016-08-09 2016-08-09 Method and apparatus for deriving exercise posture based on center-of-pressure path
KR1020160101489A KR101926170B1 (ko) 2016-08-09 2016-08-09 Motion recognition method and apparatus for walking and running monitoring
KR10-2016-0101489 2016-08-09
KR10-2016-0101491 2016-08-09
KR1020170030402A KR101995484B1 (ko) 2017-03-10 2017-03-10 Method and apparatus for deriving exercise posture based on center-of-pressure path
KR10-2017-0030402 2017-03-10
KR10-2017-0030394 2017-03-10
KR1020170030394A KR101995482B1 (ko) 2017-03-10 2017-03-10 Motion recognition method and apparatus for walking and running monitoring
KR1020170079255A KR101970674B1 (ko) 2017-06-22 2017-06-22 Method and apparatus for quantifying injury risk during running
KR10-2017-0079255 2017-06-22

Related Child Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/008534 Continuation-In-Part WO2018030743A1 (ko) 2016-08-09 2017-08-08 Motion recognition method and apparatus

Publications (1)

Publication Number Publication Date
WO2018030734A1 true WO2018030734A1 (ko) 2018-02-15

Family

ID=61162849

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/KR2017/008506 WO2018030734A1 (ko) 2016-08-09 2017-08-07 3D simulation method and apparatus
PCT/KR2017/008534 WO2018030743A1 (ko) 2016-08-09 2017-08-08 Motion recognition method and apparatus

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/008534 WO2018030743A1 (ko) 2016-08-09 2017-08-08 Motion recognition method and apparatus

Country Status (3)

Country Link
JP (1) JP6765505B2 (ja)
CN (2) CN109310913B (ja)
WO (2) WO2018030734A1 (ja)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109621331A (zh) * 2018-12-13 2019-04-16 深圳壹账通智能科技有限公司 Fitness assistance method, apparatus, storage medium, and server
CN112130677A (zh) * 2020-09-23 2020-12-25 深圳市爱都科技有限公司 Wearable terminal and hand-raising recognition method therefor
CN116491935A (zh) * 2023-06-29 2023-07-28 深圳市微克科技有限公司 Exercise health monitoring method, system, and medium for a smart wearable device

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102022942B1 (ko) * 2018-05-10 2019-09-19 주식회사 비플렉스 Method for predicting the slope of vertical ground reaction force and apparatus using the same for quantifying injury risk during running
CN111790133B (zh) * 2019-04-03 2021-06-08 杭州乾博科技有限公司 Method and system for recognizing the end of smart boxing-ball training
EP3735900B1 (en) * 2019-05-07 2022-07-27 Bodytone International Sport, S.L. Treadmill for sport training
KR102304300B1 (ko) * 2019-05-08 2021-09-23 주식회사 비플렉스 Method and apparatus for detecting gait factors via a head acceleration sensor
CN110180158B (zh) * 2019-07-02 2021-04-23 乐跑体育互联网(武汉)有限公司 Running state recognition method, system, and terminal device
CN113509173A (zh) * 2020-04-10 2021-10-19 华为技术有限公司 Exercise posture recognition method, terminal device, and storage medium
CN111569397B (zh) * 2020-04-30 2021-06-15 东莞全创光电实业有限公司 Handle-type exercise counting method and terminal
TWI741724B (zh) * 2020-08-05 2021-10-01 美律實業股份有限公司 Body mass index interval estimation device and operation method thereof
JP2022120517A (ja) * 2021-02-05 2022-08-18 パナソニックIpマネジメント株式会社 Acoustic device and acoustic control method
WO2023128511A1 (ko) * 2021-12-28 2023-07-06 주식회사 디랙스 Exercise posture analysis apparatus and smart exercise device including same

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09330424A (ja) * 1996-06-07 1997-12-22 Matsushita Electric Ind Co Ltd Motion conversion apparatus for a three-dimensional skeletal structure
JP2009204568A (ja) * 2008-02-29 2009-09-10 Seiko Instruments Inc Walking simulation apparatus
KR20120059824A (ko) * 2010-12-01 2012-06-11 경희대학교 산학협력단 Method and system for acquiring real-time motion information using a composite sensor
JP2014519947A (ja) * 2011-06-22 2014-08-21 골프존 Virtual golf simulation apparatus providing a customized practice environment to a user, a server connected thereto via a network, and a method of providing a user-customized practice environment using virtual golf simulation
US9307932B2 (en) * 2010-07-14 2016-04-12 Ecole Polytechnique Federale De Lausanne (Epfl) System and method for 3D gait assessment

Family Cites Families (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6430997B1 (en) * 1995-11-06 2002-08-13 Trazer Technologies, Inc. System and method for tracking and assessing movement skills in multidimensional space
JP3443077B2 (ja) * 1999-09-20 2003-09-02 ソニー株式会社 ロボットの運動パターン生成装置及び運動パターン生成方法、並びにロボット
WO2005021107A1 (en) * 2003-08-27 2005-03-10 Steffan Klein Personel training system and method
KR100620118B1 (ko) * 2004-03-31 2006-09-13 학교법인 대양학원 관성센서를 이용한 보행패턴 분석장치 및 그 방법
JP5028751B2 (ja) * 2005-06-09 2012-09-19 ソニー株式会社 行動認識装置
GB0602127D0 (en) * 2006-02-02 2006-03-15 Imp Innovations Ltd Gait analysis
US7561960B2 (en) * 2006-04-20 2009-07-14 Honeywell International Inc. Motion classification methods for personal navigation
US7610166B1 (en) * 2006-07-21 2009-10-27 James Solinsky Geolocation system and method for determining mammal locomotion movement
KR100894895B1 (ko) * 2007-05-21 2009-04-30 연세대학교 산학협력단 운동, 균형 및 보행측정방법 및 치료시스템
KR100962530B1 (ko) * 2007-09-28 2010-06-14 한국전자통신연구원 생체신호 측정 장치 및 방법
CN101881625B (zh) * 2008-08-19 2012-09-12 幻音科技(深圳)有限公司 步幅校正方法、测距方法及计步装置
US9392966B2 (en) * 2008-09-04 2016-07-19 Koninklijke Philips N.V. Fall prevention system
KR101101003B1 (ko) * 2009-12-14 2011-12-29 대구대학교 산학협력단 센서노드를 이용한 신체의 움직임 및 균형 감지 시스템 및 방법
KR101633362B1 (ko) * 2010-01-18 2016-06-28 삼성전자 주식회사 인간형 로봇 및 그 보행 제어방법
JP2012024275A (ja) * 2010-07-22 2012-02-09 Omron Healthcare Co Ltd 歩行姿勢判定装置
EP2422698A1 (en) * 2010-08-26 2012-02-29 Nederlandse Organisatie voor toegepast -natuurwetenschappelijk onderzoek TNO Method and system for determining the walking or running speed of a person
JP2014534659A (ja) * 2011-09-23 2014-12-18 クリエイツ インコーポレイテッドCreatz Inc. 周辺の明るさに応じてカメラを制御して良好なボールのイメージを取得するための仮想スポーツシステム
US9101812B2 (en) * 2011-10-25 2015-08-11 Aquimo, Llc Method and system to analyze sports motions using motion sensors of a mobile device
JP5915285B2 (ja) * 2012-03-15 2016-05-11 Seiko Epson Corporation State detection device, electronic apparatus, measurement system, and program
CN103729614A (zh) * 2012-10-16 2014-04-16 Shanghai Tangli Information Technology Co., Ltd. Person recognition method and person recognition device based on video images
JP2014236774A (ja) * 2013-06-06 2014-12-18 Seiko Epson Corporation Biological information processing device and biological information processing method
JP6358889B2 (ja) * 2013-09-26 2018-07-18 MegaChips Corporation Pedestrian observation system, program, and traveling direction estimation method
JP6134680B2 (ja) * 2014-03-19 2017-05-24 Nippon Telegraph and Telephone Corporation Walking support device, gait measurement device, method, and program
JP2016034481A (ja) * 2014-07-31 2016-03-17 Seiko Epson Corporation Information analysis device, motion analysis system, information analysis method, analysis program, image generation device, image generation method, image generation program, information display device, information display system, information display program, and information display method
JP2016034480A (ja) * 2014-07-31 2016-03-17 Seiko Epson Corporation Notification device, motion analysis system, notification method, notification program, exercise support method, and exercise support device
JP6080078B2 (ja) * 2014-08-18 2017-02-15 Kochi Prefectural University Corporation Posture and walking state estimation device
HK1203120A2 (en) * 2014-08-26 2015-10-16 Gao Ping A gait monitor and a method of monitoring the gait of a person
US10448867B2 (en) * 2014-09-05 2019-10-22 Vision Service Plan Wearable gait monitoring apparatus, systems, and related methods
TWM499888U (zh) * 2014-11-10 2015-05-01 Alexandave Ind Co Ltd Posture stability evaluation and rehabilitation system
CN104382599B (zh) * 2014-12-05 2017-01-18 BOE Technology Group Co., Ltd. Method, device and wearable apparatus for measuring cervical spine movement
JPWO2016092912A1 (ja) * 2014-12-11 2017-09-21 Sony Corporation Program and information processing system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09330424A (ja) * 1996-06-07 1997-12-22 Matsushita Electric Ind Co Ltd Motion conversion device for three-dimensional skeletal structures
JP2009204568A (ja) * 2008-02-29 2009-09-10 Seiko Instruments Inc Walking simulation device
US9307932B2 (en) * 2010-07-14 2016-04-12 Ecole Polytechnique Federale De Lausanne (Epfl) System and method for 3D gait assessment
KR20120059824A (ko) * 2010-12-01 2012-06-11 Industry-Academic Cooperation Foundation, Kyung Hee University Method and system for acquiring real-time motion information using composite sensors
JP2014519947A (ja) * 2011-06-22 2014-08-21 Golfzon Co., Ltd. Virtual golf simulation apparatus providing a customized practice environment for a user, server connected thereto over a network, and method of providing a user-customized practice environment using virtual golf simulation

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109621331A (zh) * 2018-12-13 2019-04-16 OneConnect Smart Technology Co., Ltd. (Shenzhen) Fitness assistance method, apparatus, storage medium, and server
CN112130677A (zh) * 2020-09-23 2020-12-25 Shenzhen Aidu Technology Co., Ltd. Wearable terminal and hand-raising recognition method therefor
CN112130677B (zh) * 2020-09-23 2023-05-12 Shenzhen Aidu Technology Co., Ltd. Wearable terminal and hand-raising recognition method therefor
CN116491935A (zh) * 2023-06-29 2023-07-28 Shenzhen Weike Technology Co., Ltd. Exercise and health monitoring method, system, and medium for a smart wearable device
CN116491935B (zh) * 2023-06-29 2023-08-29 Shenzhen Weike Technology Co., Ltd. Exercise and health monitoring method, system, and medium for a smart wearable device

Also Published As

Publication number Publication date
CN109310913B (zh) 2021-07-06
CN109310913A (zh) 2019-02-05
JP2019531772A (ja) 2019-11-07
CN109414608B (zh) 2021-04-02
JP6765505B2 (ja) 2020-10-07
CN109414608A (zh) 2019-03-01
WO2018030743A1 (ko) 2018-02-15

Similar Documents

Publication Publication Date Title
WO2018030734A1 (ko) 3D simulation method and apparatus
US10105571B2 (en) Systems and methods for sensing balanced-action for improving mammal work-track efficiency
CN107632698B (zh) Image-based motion analysis system and method
Silva et al. The basics of gait analysis
CN107080540A (zh) System and method for analyzing a person's gait and postural balance
US20180085045A1 (en) Method and system for determining postural balance of a person
CN1231753A (zh) Method for tracking and displaying a user's position and orientation in space, method for presenting a virtual environment to the user, and systems for implementing these methods
TW201113005A (en) Method and system for monitoring sport-related fitness by estimating muscle power and joint force of limbs
JP6943294B2 (ja) Technique recognition program, technique recognition method, and technique recognition system
JP2007144107A (ja) Exercise assistance system
Spelmezan et al. Wearable automatic feedback devices for physical activities
King et al. Periods of extreme ankle displacement during one-legged standing
CN117148977B (zh) Virtual-reality-based exercise rehabilitation training method
WO2017217567A1 (ko) Fitness monitoring system
JP7020479B2 (ja) Information processing device, information processing method, and program
WO2022145563A1 (ko) User-customized exercise training method and system
WO2020241738A1 (ja) Training support method and apparatus
KR102418958B1 (ko) 3D simulation method and apparatus
Felton et al. Are planar simulation models affected by the assumption of coincident joint centers at the hip and shoulder?
CN114011043A (zh) Operation method and feedback method of a collection system for motion assistance
JP2021083562A (ja) Information processing device, calculation method, and program
KR20210040671A (ko) Apparatus and method for estimating the dynamically changing center-of-mass trajectory of the human body
KR102351534B1 (ko) Balance ability evaluation apparatus and method
Han et al. Estimation of the center of bodymass during forward stepping using body acceleration
Krombholz et al. The role of anthropometric parameters on single-leg balance performance in young sub-elite soccer players

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application
Ref document number: 17839752
Country of ref document: EP
Kind code of ref document: A1

NENP Non-entry into the national phase
Ref country code: DE

122 Ep: PCT application non-entry in European phase
Ref document number: 17839752
Country of ref document: EP
Kind code of ref document: A1