KR101726041B1 - Ski posture training system based on motion analysis - Google Patents

Ski posture training system based on motion analysis

Info

Publication number
KR101726041B1
Authority
KR
South Korea
Prior art keywords
ski
user
posture
skier
information
Prior art date
Application number
KR1020150056581A
Other languages
Korean (ko)
Other versions
KR20160125737A (en)
Inventor
정승문
윤재홍
김종한
박준형
Original Assignee
동신대학교산학협력단
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 동신대학교산학협력단
Priority to KR1020150056581A
Publication of KR20160125737A publication Critical patent/KR20160125737A/en
Application granted granted Critical
Publication of KR101726041B1 publication Critical patent/KR101726041B1/en

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 Training appliances or apparatus for special sports
    • A63B69/18 Training appliances or apparatus for special sports for skiing
    • G06K9/00342

Abstract

A motion analysis based ski posture training system is disclosed. The motion analysis based ski posture training system according to an embodiment of the present invention comprises: a standard posture database for storing a slide image of a skier skiing on an actual slope, which serves as the reference for correcting a user's ski posture, and the skier's standard posture information; and a ski posture training unit for acquiring the ski posture information of a user who takes a ski posture while watching the slide image, determining how closely the user's ski posture information matches the skier's standard posture information, and outputting the result to the screen.

Description

BACKGROUND OF THE INVENTION — Field of the Invention [0001]

The present invention relates to a motion analysis based ski posture training system and, more particularly, to a motion analysis based ski posture training system that guides a user toward a correct ski posture by comparing the posture of a skier skiing on an actual slope with the posture of the user skiing on a virtual slope simulator.

Skiing is a sport in which the user slides over snow wearing boots fastened to long, flat boards made of materials such as plywood, laminated wood, fiberglass, or metal. Techniques for sliding on snow include the Bogen (snowplow turn), the stem turn, the parallel turn, and the short turn. The Bogen is the foundation of skiing: the inside edges of both skis are set against the snow, and the skier turns by sliding the skis and pressing and twisting the knee on the weighted side. The stem turn is an intermediate technique, while the parallel turn and the short turn are advanced techniques in which both skis are brought side by side through the turn.

To learn these ski techniques, the user must train in advance and, in particular, must visit a ski resort in winter to receive such training. As a result, users are constrained in both time and space when learning ski techniques.

Prior art related to the present invention includes Korean Patent Publication No. 10-2015-0010268 (published on Jan. 28, 2015).

SUMMARY OF THE INVENTION The present invention has been made in view of the above problems, and it is an object of the present invention to provide a motion analysis based ski posture training system that corrects a user's ski posture through a comparative analysis between the ski posture of a skier skiing on an actual slope and the ski posture of the user.

The solutions provided by the present invention are not limited to those mentioned above, and other solutions not mentioned will be clearly understood by those skilled in the art from the following description.

The motion analysis based ski posture training system according to one aspect of the present invention comprises: a standard posture database for storing a slide image of a skier skiing on an actual slope, which serves as the reference for correcting a user's ski posture, and the skier's standard posture information; and a ski posture training unit for acquiring the ski posture information of a user who takes a ski posture while watching the slide image, determining how closely the user's ski posture information matches the skier's standard posture information, and outputting the result to the screen.

The ski posture training unit may comprise: a display device for displaying the skier's slide image from the actual slope on the screen; a matching analysis unit for analyzing a matching rate between the user's ski posture information and the skier's standard posture information; and a control unit for controlling the posture of an avatar displayed on the screen so that it indicates the user's ski posture according to the generated ski posture information and for outputting the analyzed matching rate to the screen.

The control unit may determine which of several matching rate intervals the analyzed matching rate falls into and, according to that interval, change the playback speed of the skier's slide image or stop playback of the slide image.

The skier's standard posture information may be the distance values between three-dimensional space coordinates generated from motion capture sensors attached to the skier's body while the skier slides on the actual slope, and the user's ski posture information may be the distance values between three-dimensional space coordinates received from motion capture sensors attached to the user's body. The matching analysis unit may analyze the matching rate using the differences between these two sets of distance values.

The motion capture sensors attached to the user's body may be attached to the same body parts as the motion capture sensors attached to the skier's body.

To eliminate the effect of the physical difference between the skier and the user, the matching analysis unit may reflect an offset value, which is the difference between the user's body information and the skier's body information, in the matching rate.

The ski posture training unit may further include a body information input unit for receiving body information of the user.

According to the motion analysis based ski posture training system of an embodiment of the present invention, the user takes a ski posture while following the image of a skier skiing on an actual slope, and the system corrects the user's ski posture by comparing the skier's ski posture with the user's. The user can therefore learn or correct a ski posture easily without visiting an actual ski slope.

FIG. 1 is a diagram illustrating the configuration of a motion analysis based ski posture training system according to an embodiment of the present invention.
FIG. 2 is a diagram illustrating the information output on the screen of the display device during a ski posture training process using the motion analysis based ski posture training system according to an embodiment of the present invention.

Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings.

Embodiments of the present invention are provided to describe the present invention more fully to those skilled in the art. The following embodiments may be modified into various other forms, and the scope of the present invention is not limited to them. Rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of the invention to those skilled in the art.

The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the invention. As used herein, the singular forms "a", "an", and "the" include the plural forms unless the context clearly dictates otherwise. The terms "comprise" and/or "comprising", when used herein, specify the presence of stated shapes, numbers, steps, operations, members, elements, and/or groups thereof, and do not preclude the presence or addition of one or more other shapes, numbers, steps, operations, members, elements, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the listed items.

Although the terms first, second, and so on are used herein to describe various members, components, regions, and/or areas, these members, components, regions, and/or areas should not be limited by these terms. These terms do not imply any particular order or position and are used only to distinguish one member, region, or area from another. Thus, a first member, region, or area described below could be termed a second member, region, or area without departing from the teachings of the present invention.

Hereinafter, embodiments of the present invention will be described with reference to the drawings, which schematically show those embodiments. In the figures, variations from the shapes shown may be expected, for example as a result of manufacturing techniques and/or tolerances. Accordingly, embodiments of the present invention should not be construed as limited to the particular shapes of the regions illustrated herein, but should include, for example, variations in shape resulting from manufacturing.

FIG. 1 is a diagram illustrating the configuration of a motion analysis based ski posture training system according to an embodiment of the present invention.

Referring to FIG. 1, a motion analysis based ski posture training system according to an embodiment of the present invention includes a standard posture database 100 and a ski posture training unit 200.

The standard posture database 100 stores a slide image of the skier skiing on an actual slope, which is used for correcting the user's ski posture, and the skier's standard posture information obtained while sliding on the actual slope.

The ski posture training unit 200 acquires the ski posture information of a user who takes a ski posture while watching the skier's slide image displayed on the screen, determines how closely the user's ski posture information matches the skier's standard posture information, and outputs the result to the screen.

The ski posture training unit 200 includes a display device 210, a matching analysis unit 220, and a control unit 230.

The display device 210 displays the skier's slide image from the actual slope on the screen. The display device 210 may be, for example, a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT-LCD), an organic light-emitting diode (OLED) display, a flexible display, or a 3D display.

The matching analysis unit 220 analyzes the matching rate between the user's ski posture information and the skier's standard posture information.

The control unit 230 controls the posture of the avatar displayed on the screen of the display device 210 so that it indicates the user's ski posture according to the generated ski posture information, and outputs the analyzed matching rate to the screen. The control unit 230 also determines which of several matching rate intervals the analyzed matching rate falls into and, according to that interval, changes the playback speed of the skier's slide image or stops playback of the slide image.

In an embodiment, the matching rate intervals may be 0 to 40%, 41 to 70%, and 71 to 100%. When the analyzed matching rate falls in the 0 to 40% interval, the control unit 230 may stop playback of the slide image; when it falls in the 41 to 70% interval, the control unit 230 may slow the playback speed of the slide image below its original speed; and when it falls in the 71 to 100% interval, the control unit 230 may play the slide image normally.
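
As an illustration of this playback control, the Python sketch below maps an analyzed matching rate to a playback action over the three intervals described above. It is a minimal sketch of the described behavior, not the patented implementation; the function name, the return format, and the 0.5 slow-down factor are assumptions introduced here.

```python
def playback_action(matching_rate: float) -> tuple[str, float]:
    """Map an analyzed matching rate (0-100 %) to a slide-image playback action.

    Returns a (mode, speed_factor) pair, where speed_factor scales the
    original playback speed. The interval boundaries follow the embodiment
    described above; the 0.5 slow-down factor is an illustrative assumption.
    """
    if matching_rate <= 40:          # 0-40 %: posture far from the standard
        return ("stop", 0.0)         # stop playback of the slide image
    elif matching_rate <= 70:        # 41-70 %: partially matching posture
        return ("play", 0.5)         # play slower than the original speed
    else:                            # 71-100 %: posture close to the standard
        return ("play", 1.0)         # play at the normal speed


# Example: a 65 % matching rate slows playback to half speed.
print(playback_action(65.0))         # ('play', 0.5)
```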

The skier's standard posture information may be the distance values between the three-dimensional space coordinates generated from the motion capture sensors attached to the skier's body while the skier slides on the actual slope, and the user's ski posture information may be the distance values between the three-dimensional space coordinates received from the motion capture sensors attached to the user's body.

The three-dimensional space coordinates of the motion capture sensors attached to the user's body — for example, the 18 sensors n_u1, n_u2, ..., n_u18 attached to the user's body as shown in FIG. 1 — can be wirelessly transmitted to the posture information generating unit 240. Here n_u1, n_u2, ..., n_u18 denote the IDs of the motion capture sensors attached to the user's body. For example, n_u1 may be the ID of the sensor attached to the user's head, and n_u2 the ID of the sensor attached to the user's neck. The posture information generating unit 240 may calculate the distance values between the three-dimensional space coordinates received from the motion capture sensors and use these distance values as the user's ski posture information.

The distance values calculated by the posture information generating unit 240 represent the distances between the three-dimensional space coordinates of the respective motion capture sensors at a given time (t ≥ 0). For example, for the motion capture sensors n_u1, n_u2, ..., n_u18 shown in FIG. 1, the distances between n_u1, n_u2, ..., n_u18 are calculated 157 times, so the set of distance values between the three-dimensional space coordinates can contain 157 values.
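
As an illustration of this distance calculation, the following Python sketch computes distance values from the three-dimensional sensor coordinates at one time step. It is a minimal sketch under assumptions: the dictionary input format and the example sensor placements are illustrative, and the patent's specific set of 157 distance values is not reproduced here, so the sketch accepts an arbitrary list of sensor pairs and defaults to all pairs.

```python
from itertools import combinations
from math import dist  # Euclidean distance, available since Python 3.8

def distance_values(coords: dict[str, tuple[float, float, float]],
                    pairs=None) -> dict[tuple[str, str], float]:
    """Compute distances between 3D sensor coordinates at one time step.

    coords maps a sensor ID (e.g. "n_u1") to its (x, y, z) coordinate.
    pairs optionally restricts the computation to a chosen set of sensor
    pairs; by default every pair of sensors is used (illustrative choice).
    """
    if pairs is None:
        pairs = combinations(sorted(coords), 2)
    return {(a, b): dist(coords[a], coords[b]) for a, b in pairs}


# Example with three of the user's sensors (coordinates are made up).
user_coords = {"n_u1": (0.0, 0.0, 1.7),   # head
               "n_u2": (0.0, 0.0, 1.5),   # neck
               "n_u3": (0.2, 0.0, 1.4)}   # shoulder (assumed placement)
print(distance_values(user_coords))
```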

The matching analysis unit 220 can analyze the matching rate using the differences between the distance values derived from the three-dimensional space coordinates generated by the motion capture sensors attached to the skier's body and the distance values derived from the three-dimensional space coordinates received from the motion capture sensors attached to the user's body.

In an embodiment, when 18 motion capture sensors n_s1, n_s2, ..., n_s18 are attached to the skier's body, the distances between n_s1, n_s2, ..., n_s18 are likewise calculated 157 times, so the distance values between the three-dimensional space coordinates generated from the sensors attached to the skier's body may also contain 157 values. The skier's standard posture information, that is, these distance values, must be obtained before the user's ski posture training and stored in the standard posture database 100. Likewise, the skier's slide image must be recorded and stored in the standard posture database 100 before the user's ski posture training.

Accordingly, the motion capture sensors attached to the user's body may be attached to the same body parts as the motion capture sensors attached to the skier's body. For example, the motion capture sensor n_u1 attached to the user's body shown in FIG. 1 may be attached to the head, which is also the attachment position of the motion capture sensor n_s1 attached to the skier's body.

However, since the user's and the skier's bodies differ, the matching analysis unit 220 may reflect an offset value, derived from the difference between the user's body information and the skier's body information, in the matching rate in order to eliminate the effect of this physical difference. The body information of the user and of the skier may be their heights, but is not limited thereto. In an embodiment, if the user's height is H_U and the skier's height is H_S, the offset value O_offset is given by

[Equation image 112015039174194-pat00001 — definition of the offset value O_offset in terms of H_U and H_S; not reproduced in this text]
The matching analysis unit 220 can then calculate the matching rate by reflecting the offset value in the difference between the distance values between the three-dimensional space coordinates generated from the motion capture sensors attached to the skier's body and the distance values between the three-dimensional space coordinates generated from the motion capture sensors attached to the user's body.

For example, the matching rate S_1-2 between D_1-2u, the distance between the three-dimensional space coordinates of the motion capture sensors n_u1 and n_u2 attached to the user's body, and D_1-2s, the distance between the three-dimensional space coordinates of the motion capture sensors n_s1 and n_s2 attached to the skier's body, can be calculated by reflecting the offset value. That is,

[Equation image 112015039174194-pat00002 — matching rate S_1-2 expressed in terms of D_1-2u, D_1-2s, and the offset value O_offset; not reproduced in this text]
and the matching rate S_1-2 can be calculated from this relationship.

With 18 motion capture sensors, this process can be repeated 157 times, so there may be 157 matching rates between the offset-adjusted distance values derived from the sensors attached to the skier's body and the distance values received from the sensors attached to the user's body. By averaging these 157 matching rates, the matching analysis unit 220 can analyze the overall matching rate between the user's ski posture and the skier's ski posture.
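
The two equations referenced above are contained in the patent's equation images and are not reproduced in this text, so the Python sketch below only illustrates the described idea: scale the skier's distance values by a height-based offset, compare them pair by pair with the user's distance values, and average the per-pair matching rates. The ratio form of O_offset and the relative-difference matching rate are assumptions introduced here, not the patented equations.

```python
def offset_value(user_height: float, skier_height: float) -> float:
    """Height-based offset; the ratio form is an assumed stand-in for Eq. 1."""
    return user_height / skier_height


def pair_matching_rate(d_user: float, d_skier: float, offset: float) -> float:
    """Matching rate (in %) for one sensor pair; assumed stand-in for Eq. 2.

    The skier's distance is scaled by the offset so that body-size
    differences between user and skier do not penalize the match.
    """
    scaled = d_skier * offset
    if scaled == 0.0:
        return 0.0
    return max(0.0, 1.0 - abs(d_user - scaled) / scaled) * 100.0


def overall_matching_rate(user_dists: dict, skier_dists: dict,
                          offset: float) -> float:
    """Average the per-pair matching rates over the shared sensor pairs."""
    shared = user_dists.keys() & skier_dists.keys()
    rates = [pair_matching_rate(user_dists[p], skier_dists[p], offset)
             for p in shared]
    return sum(rates) / len(rates) if rates else 0.0


# Example: a 1.70 m user imitating a 1.80 m skier (values are made up;
# the skier's distances are keyed the same way as the user's for comparison).
o = offset_value(1.70, 1.80)
user = {("n_u1", "n_u2"): 0.20}
skier = {("n_u1", "n_u2"): 0.22}
print(round(overall_matching_rate(user, skier, o), 1))
```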

Meanwhile, in order to receive the user's body information, the ski posture training unit 200 may further include a body information input unit 250.

The configuration of the motion analysis based ski posture training system shown in FIG. 1 is divided purely from a functional point of view and does not imply an actual implementation or hardware arrangement. It will be apparent to those skilled in the art that one or more of the modules shown in FIG. 1 may be integrated into, or subdivided into, one or more other modules.

FIG. 2 is a diagram illustrating the information output on the screen of the display device during a ski posture training process using the motion analysis based ski posture training system according to an embodiment of the present invention.

Referring to FIG. 2, the screen of the display device 210 includes a slide image playback area 212, a matching rate display area 214 showing the matching rate between the user's ski posture and the skier's standard posture, an avatar display area 216, and a sliding status information display area 218 that changes as the skier's slide image is played back. In accordance with the playback of the skier's slide image, the sliding status information display area 218 may display sliding status information such as the skier's sliding speed (shown through a speedometer), the wind speed (shown through an anemometer), and the altitude.

The user takes a ski posture while watching the skier's slide image played back in the slide image playback area 212 of the display device 210. As the user's posture changes, the three-dimensional space coordinates from the motion capture sensors attached to the user's body are wirelessly transmitted to the posture information generating unit 240 and the control unit 230 of FIG. 1. The control unit 230 controls the avatar posture displayed in the avatar display area 216 using the three-dimensional space coordinates received from the sensors attached to the user's body. The posture information generating unit 240 calculates the distance values between those three-dimensional space coordinates and transmits them to the matching analysis unit 220 as the user's ski posture information.
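
To tie the data flow just described together, the sketch below outlines one iteration of an assumed training loop: sensor coordinates arrive, distance values are generated, the matching rate is analyzed against the stored standard posture information, and the result drives both the on-screen output and the slide-image playback. It reuses the hypothetical helpers sketched earlier (distance_values, overall_matching_rate, playback_action); the stub classes and function names are assumptions, not the patented implementation.

```python
class DisplayStub:
    """Hypothetical stand-in for the control unit's on-screen output."""

    def update_avatar(self, coords):
        print("avatar updated from", len(coords), "sensor coordinates")   # area 216

    def show_matching_rate(self, rate):
        print(f"matching rate: {rate:.1f} %")                             # area 214


class PlayerStub:
    """Hypothetical stand-in for slide-image playback control."""

    def set_playback(self, mode, factor):
        print("playback:", mode, "at", factor, "x speed")                 # area 212


def training_step(user_coords, skier_standard_dists, offset, display, player):
    """One iteration of the assumed training loop.

    Reuses the helpers sketched earlier in this description
    (distance_values, overall_matching_rate, playback_action).
    """
    display.update_avatar(user_coords)                # control unit drives the avatar
    user_dists = distance_values(user_coords)         # posture information generating unit 240
    rate = overall_matching_rate(user_dists, skier_standard_dists, offset)
    display.show_matching_rate(rate)                  # matching analysis result to the screen
    mode, factor = playback_action(rate)              # matching-rate interval -> playback
    player.set_playback(mode, factor)
    return rate
```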

The matching analysis unit 220 analyzes the matching rate between the user's ski posture information received from the posture information generating unit 240 and the skier's standard posture information stored in the standard posture database 100, and sends the result to the control unit 230. The control unit 230 displays the matching rate analyzed by the matching analysis unit 220 in the matching rate display area 214 of the screen of the display device 210.

The present invention has been described above with reference to the embodiments. It will be understood by those skilled in the art that various changes in form and detail may be made therein without departing from the spirit and scope of the invention as defined by the appended claims. The disclosed embodiments should therefore be considered in an illustrative rather than a restrictive sense, and the scope of the present invention is not limited to the above-described embodiments but should be construed to include various embodiments within the scope of the claims and their equivalents.

100: standard posture database
200: ski posture training unit
210: display device
220: matching analysis unit
230: control unit
240: posture information generating unit
250: body information input unit

Claims (7)

A motion analysis based ski posture training system comprising: a standard posture database for storing a slide image of a skier skiing on an actual slope, used for correcting a user's ski posture, and the skier's standard posture information obtained when the skier slides on the actual slope; and a ski posture training unit for acquiring the ski posture information of a user who takes a ski posture while watching the slide image, determining whether the user's ski posture information matches the skier's standard posture information, and outputting the result to the screen,
wherein the ski posture training unit comprises: a display device for displaying the skier's slide image from the actual slope on the screen; a matching analysis unit for analyzing a matching rate between the user's ski posture information and the skier's standard posture information; and a control unit for controlling the posture of an avatar displayed on the screen so that it indicates the user's ski posture according to the generated ski posture information and for outputting the analyzed matching rate to the screen,
wherein the skier's standard posture information is distance values between three-dimensional space coordinates generated from motion capture sensors attached to the skier's body when the skier slides on the actual slope, the user's ski posture information is distance values between three-dimensional space coordinates received from motion capture sensors attached to the user's body, and the matching analysis unit analyzes the matching rate using the differences between the two sets of distance values,
and further comprising a posture information generating unit for calculating the distance values between the three-dimensional space coordinates received from the 18 motion capture sensors attached to the user's body and using the distance values as the user's ski posture information.
delete
The system according to claim 1,
wherein the control unit determines which of several matching rate intervals the analyzed matching rate falls into and, according to the matching rate interval to which the analyzed matching rate belongs, changes the playback speed of the skier's slide image on the actual slope or stops playback of the slide image.
delete
The system according to claim 3,
wherein the motion capture sensors attached to the user's body are attached to the same body parts as the motion capture sensors attached to the skier's body.
The system according to claim 3,
wherein the matching analysis unit reflects, in the matching rate, an offset value that is the difference between the user's body information and the skier's body information, in order to eliminate the effect of the physical difference between the skier and the user.
The system according to claim 3,
wherein the ski posture training unit further comprises a body information input unit for receiving the user's body information.
KR1020150056581A 2015-04-22 2015-04-22 Ski posture training system based on motion analysis KR101726041B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020150056581A KR101726041B1 (en) 2015-04-22 2015-04-22 Ski posture training system based on motion analysis

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020150056581A KR101726041B1 (en) 2015-04-22 2015-04-22 Ski posture training system based on motion analysis

Publications (2)

Publication Number Publication Date
KR20160125737A KR20160125737A (en) 2016-11-01
KR101726041B1 (en) 2017-04-14

Family

ID=57484795

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020150056581A KR101726041B1 (en) 2015-04-22 2015-04-22 Ski posture training system based on motion analysis

Country Status (1)

Country Link
KR (1) KR101726041B1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102050825B1 (en) * 2017-04-13 2019-12-03 한국과학기술원 Method and Apparatus for Analyzing Country Skiing Techniques
KR102262725B1 (en) * 2019-05-20 2021-06-10 엘지전자 주식회사 System for managing personal exercise and method for controlling the same

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100816364B1 (en) * 2007-11-09 2008-03-25 김대봉 Recoder for golf swing motion

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20010095900A (en) * 2000-04-12 2001-11-07 박명수 3D Motion Capture analysis system and its analysis method
KR101364594B1 (en) * 2011-05-26 2014-02-20 한국과학기술연구원 Tangible Snowboard Apparatus based on Bi-directional Interaction between motion of user and Motion Platform
KR101527463B1 (en) 2013-07-19 2015-06-11 주식회사 진 Boarding simulator

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100816364B1 (en) * 2007-11-09 2008-03-25 김대봉 Recoder for golf swing motion

Also Published As

Publication number Publication date
KR20160125737A (en) 2016-11-01

Similar Documents

Publication Publication Date Title
US11132533B2 (en) Systems and methods for creating target motion, capturing motion, analyzing motion, and improving motion
CN105850113B (en) The calibration of virtual reality system
US8854356B2 (en) Storage medium having stored therein image processing program, image processing apparatus, image processing system, and image processing method
US10488659B2 (en) Apparatus, systems and methods for providing motion tracking using a personal viewing device
CN109214231A (en) Physical education auxiliary system and method based on human body attitude identification
US10636185B2 (en) Information processing apparatus and information processing method for guiding a user to a vicinity of a viewpoint
CN110646938B (en) Near-eye display system
US9323055B2 (en) System and method to display maintenance and operational instructions of an apparatus using augmented reality
CN103177269B (en) For estimating the apparatus and method of object gesture
US9779511B2 (en) Method and apparatus for object tracking and 3D display based thereon
US20170086712A1 (en) System and Method for Motion Capture
CN103390174A (en) Physical education assisting system and method based on human body posture recognition
US20070035436A1 (en) Method to Provide Graphical Representation of Sense Through The Wall (STTW) Targets
US8113953B2 (en) Image-linked sound output method and device
US10474342B2 (en) Scrollable user interface control
CN104035557A (en) Kinect action identification method based on joint activeness
CN111985393A (en) Intelligent mirror for correcting motion posture and motion posture correcting method thereof
CN105611267A (en) Depth and chroma information based coalescence of real world and virtual world images
US10714055B1 (en) Systems and methods for display synchronization in head-mounted display devices
KR101726041B1 (en) Ski posture training system based on motion analysis
JP7129839B2 (en) TRAINING APPARATUS, TRAINING SYSTEM, TRAINING METHOD, AND PROGRAM
CN110491226A (en) Learning support system and computer readable recording medium
US11609626B2 (en) Method and control device for operating a virtual reality headset in a vehicle
CN104205007A (en) Electronic device for displaying content of an obscured area of a view
Lyubanenko et al. Multi-camera finger tracking and 3d trajectory reconstruction for hci studies

Legal Events

Date Code Title Description
A201 Request for examination
GRNT Written decision to grant