CN109328094B - Motion recognition method and device - Google Patents

Motion recognition method and device

Info

Publication number
CN109328094B
CN109328094B (application CN201780037462.2A)
Authority
CN
China
Prior art keywords
value
acceleration
motion state
state value
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201780037462.2A
Other languages
Chinese (zh)
Other versions
CN109328094A (en)
Inventor
郑株豪
郑畅根
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beavers Corp
Original Assignee
Beavers Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR1020160101489A (patent KR101926170B1)
Priority claimed from KR1020160101491A (patent KR101830371B1)
Priority claimed from KR1020170030402A (patent KR101995484B1)
Priority claimed from KR1020170030394A (patent KR101995482B1)
Priority claimed from KR1020170079255A (patent KR101970674B1)
Priority claimed from KR1020170099566A (patent KR102043104B1)
Application filed by Beavers Corp filed Critical Beavers Corp
Publication of CN109328094A
Application granted
Publication of CN109328094B
Active legal status
Anticipated expiration legal status

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103: Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11: Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00: Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B 69/00: Training appliances or apparatus for special sports
    • A63B 71/00: Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B 71/06: Indicating or scoring devices for games or players, or for other sports activities

Abstract

The invention discloses a motion recognition method and device. The motion recognition device comprises: an acceleration sensing unit for measuring acceleration values in three axis directions (up-down, left-right, and front-back); an angular velocity sensing unit for measuring angular velocity values in the same three axis directions; a processing unit for generating a first motion state value based on the 3-axis acceleration values and the 3-axis angular velocity values; and a user interface unit for controlling the sleep mode or active mode of the processing unit.

Description

Motion recognition method and device
Technical Field
The present invention relates to a motion recognition method and apparatus, and more particularly, to a motion recognition method and apparatus for collecting and analyzing a user's walking and running data.
Background
Generally, the amount of exercise in modern daily life is insufficient to maintain proper physical health. Interest is therefore growing in systematic exercise methods that are effective in promoting health: exercise that trains the body quickly and efficiently, posture-correcting exercise that promotes health over the long term, and, as life spans extend, exercise suited to the elderly and others with reduced physical ability. One exercise that meets these varied needs and that anyone can easily perform is walking.
Since almost anyone without a physical problem can walk, most people walk unconsciously in whatever posture they are accustomed to. However, the human body is not perfectly symmetrical, so in most cases people walk in an unbalanced, incorrect posture. Continuous walking in such a wrong posture distorts muscles and bones and can eventually cause various systemic pains. A wrong walking posture can degrade physical health for ordinary people, and the problems of bodily distortion or deteriorating health can be even more serious for growing children or for elderly people with reduced physical ability. For professionals such as athletes and dancers, who demand more of their bodies than ordinary people, it also limits improvement of physical ability.
A correct walking posture is thus important for ordinary people and professionals alike, and various studies are therefore under way on how to correct walking posture effectively.
In the prior art, walking is detected using pressure sensors mounted in shoes or foot boards. However, with this approach the pressure sensor can be damaged by long-term, repeated pressure, so the user suffers the time and expense of replacing or repairing the device. Further, since foot sizes differ, shoes with attached pressure sensors must be manufactured in many sizes, which hurts productivity and economy. A family also cannot share one walking posture correction device; each person must purchase a device that fits their own foot, increasing the economic burden.
To correct walking posture, therefore, a technology is needed that recognizes, detects, and analyzes walking efficiently and accurately by means other than a pressure sensor.
Disclosure of Invention
Technical problem
The present invention has been made in view of the above problems, and an object of the present invention is to provide a motion recognition method and apparatus for collecting and analyzing a user's walking and running data that substantially overcome the limitations and disadvantages of the related art, and a computer-readable recording medium on which a program for executing the method is recorded.
Means for solving the problems
According to an embodiment of the present invention, a motion recognition method of a first device includes: measuring, by an acceleration sensing unit, acceleration values in three axis directions (up-down, left-right, and front-back); measuring, by an angular velocity sensing unit, angular velocity values in the same three axis directions; generating, by a processing unit, a first motion state value based on the 3-axis acceleration values and the 3-axis angular velocity values; and controlling, by a user interface unit, a sleep mode or an active mode of the processing unit.
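The patent does not specify at this point how the first motion state value is computed. A minimal illustration of the idea, deriving one such value (a step count) from vertical acceleration samples by threshold peak detection, might look as follows; the function name, threshold, and debounce window are illustrative assumptions, not values from the patent.

```python
# Hypothetical sketch: deriving one "first motion state value" (a step
# count) from vertical acceleration samples by simple peak detection.
# Threshold and debounce window are illustrative, not from the patent.

def count_steps(az_samples, threshold=1.5, refractory=3):
    """Count peaks in vertical acceleration (in g) as steps.

    az_samples: vertical-axis acceleration readings.
    threshold:  minimum acceleration (g) to register a foot strike.
    refractory: samples to skip after a detected step (debounce),
                shorter than one stride period.
    """
    steps = 0
    skip = 0
    for az in az_samples:
        if skip > 0:
            skip -= 1
            continue
        if az > threshold:
            steps += 1
            skip = refractory
    return steps

# A synthetic stride pattern: quiet samples with periodic impact spikes.
samples = [1.0, 1.0, 1.8, 1.2, 1.0] * 4   # four simulated foot strikes
print(count_steps(samples))                # 4
```

A real device would combine the angular velocity stream as well, for example to reject arm swings or head turns that produce similar acceleration spikes.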
According to an embodiment of the present invention, the method further includes the step of transmitting the first motion state value to a second device.
According to an embodiment of the present invention, the step of transmitting the first motion state value to the second device is performed by one of Bluetooth, Wi-Fi, and Near Field Communication (NFC).
According to an embodiment of the present invention, the method further includes the step of transmitting the first motion state value to a server.
According to an embodiment of the present invention, the method further comprises the step of determining the user position value.
According to an embodiment of the present invention, a second motion state value is generated based on at least one of the first motion state value, the user position value, and the user profile.
According to an embodiment of the present invention, the method further includes the step of transmitting at least one of the first motion state value and the second motion state value to a server.
According to an embodiment of the invention, the second motion state value is at least one of distance, speed, energy consumption, height and stride.
According to an embodiment of the present invention, the method further includes the step of comparing the first motion state value and the second motion state value with respective predetermined reference values to generate posture correction information.
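The comparison step above can be sketched as follows: each motion state value is checked against a predetermined reference range, and a correction message is generated when it falls outside. The range values, field names, and message strings are illustrative assumptions, not data from the patent.

```python
# Hypothetical sketch of generating posture correction information by
# comparing motion state values against predetermined reference ranges.
# All ranges, names, and messages are illustrative.

REFERENCE_RANGES = {
    "steps_per_minute": (100, 130),   # illustrative cadence range
    "stride_m":         (0.6, 0.8),   # illustrative stride range, metres
}

def posture_correction_info(state_values):
    """Return a list of correction messages for out-of-range values."""
    corrections = []
    for name, value in state_values.items():
        low, high = REFERENCE_RANGES[name]
        if value < low:
            corrections.append(f"increase {name}")
        elif value > high:
            corrections.append(f"decrease {name}")
    return corrections

print(posture_correction_info({"steps_per_minute": 90, "stride_m": 0.95}))
# ['increase steps_per_minute', 'decrease stride_m']
```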
According to an embodiment of the present invention, the method further includes outputting the posture correction information in at least one of voice, graphic, image and vibration.
According to an embodiment of the present invention, in the step of measuring the 3-axis acceleration values and the step of measuring the 3-axis angular velocity values, when the processing unit is switched to the active mode by the user interface unit, the processing unit establishes a connection to the second device and generates the 3-axis acceleration values and the 3-axis angular velocity values based on a command received from the second device or the user interface unit.
According to an embodiment of the present invention, the acceleration sensing unit and the angular velocity sensing unit store the 3-axis acceleration values and 3-axis angular velocity values in a first-in first-out (FIFO) queue; the processing unit remains in the sleep mode while the data stored in the FIFO queue is below a predetermined threshold and enters the active mode when the stored data reaches or exceeds the threshold.
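A minimal sketch of this FIFO-based mode control might look as follows, assuming the threshold applies to the number of queued samples (the text is ambiguous between fill level and free space); all names are illustrative.

```python
# Hypothetical sketch of FIFO-threshold mode control: the processing unit
# sleeps until the queue holds at least `threshold` samples, then wakes,
# drains the queue, and returns to sleep. All names are illustrative.
from collections import deque

class Processor:
    def __init__(self, threshold=4):
        self.fifo = deque()
        self.threshold = threshold
        self.mode = "sleep"
        self.processed = 0

    def push_sample(self, accel, gyro):
        """Sensor side: store one (acceleration, angular velocity) pair."""
        self.fifo.append((accel, gyro))
        if len(self.fifo) >= self.threshold:
            self.mode = "active"
            self._drain()

    def _drain(self):
        while self.fifo:
            self.fifo.popleft()   # a real device would compute state values here
            self.processed += 1
        self.mode = "sleep"       # queue empty again: back to sleep mode

p = Processor(threshold=4)
for _ in range(10):
    p.push_sample((0, 0, 1), (0, 0, 0))
print(p.processed, p.mode)        # 8 sleep (two samples remain queued)
```

Batching sensor samples this way lets the main processor stay asleep most of the time, which is the usual power-saving rationale for sensor FIFOs in wearables.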
According to an embodiment of the present invention, the first motion state value is at least one of exercise time, step count, steps per minute, step interval, step angle, head angle, ground contact time, flight time, ratio of ground contact time to flight time, maximum vertical force, average vertical force loading rate, maximum vertical force loading rate, left-right balance, and left-right stability.
According to an embodiment of the present invention, the first device is one of a band worn on the head or waist, a clip-type device attached to the head or waist, a hat, a belt, glasses, a helmet, an ear-worn device, and a garment.
According to an embodiment of the present invention, the glasses may be one of augmented reality glasses, an eyeglass frame, and sunglasses; the ear-worn device may be one of a hands-free headset, headphones, and earphones; and the garment may be one of a vest and a shoulder strap.
On the other hand, according to an embodiment of the present invention, a motion recognition method of a second device includes: receiving, from a first device, a first motion state value generated based on 3-axis acceleration values and 3-axis angular velocity values; measuring a user position value; and generating a second motion state value based on at least one of the first motion state value, the user position value, and the user profile.
According to an embodiment of the present invention, the method further includes the step of transmitting at least one of the first motion state value and the second motion state value to a server.
According to an embodiment of the present invention, the first motion state value is at least one of exercise time, step count, steps per minute, step interval, step angle, head angle, ground contact time, flight time, ratio of ground contact time to flight time, maximum vertical force, average vertical force loading rate, maximum vertical force loading rate, left-right balance, and left-right stability.
According to an embodiment of the invention, the second motion state value is at least one of distance, speed, energy consumption, height and stride.
According to an embodiment of the present invention, the method further includes generating the posture correction information by comparing at least one of the first motion state value and the second motion state value with a predetermined reference value.
According to an embodiment of the present invention, the method further includes outputting the posture correction information in at least one of voice, graphic, image and vibration.
According to an embodiment of the present invention, the step of receiving the first motion state value from the first device is performed by at least one of Bluetooth, Wi-Fi, and a short-range wireless communication technology.
According to an embodiment of the present invention, there is also provided a computer-readable recording medium on which a program for executing the above-described method is recorded.
Also, according to an embodiment of the present invention, a motion recognition first device includes: an acceleration sensing unit for measuring acceleration values in three axis directions (up-down, left-right, and front-back); an angular velocity sensing unit for measuring angular velocity values in the same three axis directions; a processing unit for generating a first motion state value based on the 3-axis acceleration values and the 3-axis angular velocity values; and a user interface unit for controlling the sleep mode or active mode of the processing unit.
Also, according to an embodiment of the present invention, a motion recognition second device includes: a first communication unit for receiving, from a first device, a first motion state value generated based on 3-axis acceleration values and 3-axis angular velocity values; a position sensing unit for measuring a user position value; and a processing unit for generating a second motion state value based on at least one of the first motion state value, the user position value, and the user profile.
Advantageous Effects of Invention
According to the present invention, while the acceleration, position, and the like of the user's body (for example, the head or waist) are measured, the invention's specific analysis algorithm converts them into center-of-mass motion state values and the like, so that walking can be recognized, detected, and analyzed efficiently and accurately.
Further, according to the present invention, walking is recognized, detected, and analyzed effectively and accurately by the invention's specific analysis algorithm, which converts the acceleration, position, and the like of the user's body into center-of-mass motion state values and estimates the center-of-pressure path from the converted values. In particular, acceleration and position can be measured more accurately by measuring at the body position whose movement most resembles that of the center of mass (specifically, left-right acceleration at the head; front-back acceleration and position at the waist; and up-down acceleration and position at the head or waist).
Further, according to the present invention, the device needs only sensors that measure the user's dynamic physical quantities, such as an acceleration sensor or a position sensor. Conventionally, a pressure sensor pressed by the user's foot was used to recognize walking, which caused various problems: reduced durability and device life, the need to produce separate devices for each body size, and inconvenience in use. The present invention completely eliminates the technical configuration that causes these problems, namely placing a pressure sensor at the foot, and thereby solves them at the root. User convenience improves, and both users and manufacturers benefit economically.
Drawings
Fig. 1 shows a use state of a motion posture deriving apparatus according to an embodiment of the present invention.
Fig. 2 shows a schematic diagram of a motion posture deriving apparatus according to an embodiment of the present invention.
Fig. 3 shows a flow chart of a motion gesture derivation method of an embodiment of the invention.
FIG. 4 is a graph illustrating the relationship between center of mass and center of pressure according to one embodiment of the present invention.
FIG. 5 illustrates the determination of the center-of-pressure direction and the estimation of the center-of-pressure location in one embodiment of the present invention.
FIG. 6 shows an illustration of an estimated center of pressure path of an embodiment of the present invention.
Fig. 7 shows an example of a vertical acceleration chart with respect to time during walking and running according to an embodiment of the present invention.
Fig. 8 shows an example of the measurement result of the acceleration signal according to the embodiment of the present invention.
Fig. 9 shows a flow chart of a motion recognition method according to still another embodiment of the present invention.
FIG. 10 shows a detailed flow chart of the data collection and motion recognition steps of yet another embodiment of the present invention.
Fig. 11 shows an example of the result of measuring the acceleration signal according to still another embodiment of the present invention.
Fig. 12 is a flowchart showing the detailed motion state value deriving step based on acceleration according to still another embodiment of the present invention.
Fig. 13 shows a use state of a motion posture deriving apparatus of another embodiment of the present invention.
Fig. 14 shows a diagram of a motion posture deriving device of another embodiment of the present invention.
Fig. 15 is a schematic view of an injury risk quantifying apparatus according to another embodiment of the present invention.
Fig. 16 is a flowchart illustrating an injury risk quantifying method according to another embodiment of the present invention.
Fig. 17 shows a vertical acceleration chart during running according to another embodiment of the present invention.
Fig. 18 shows the inclination of the vertical acceleration chart during running according to another embodiment of the present invention.
Fig. 19 shows the impact amount of the vertical acceleration chart during running according to another embodiment of the present invention.
Fig. 20 shows a motion recognition first device according to another embodiment of the present invention.
Fig. 21 shows a motion recognition second device according to another embodiment of the present invention.
Fig. 22 shows a flow chart of a motion recognition method according to another embodiment of the invention.
Detailed Description
Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the accompanying drawings. In the drawings, like reference numerals denote like components, and the sizes of the components are enlarged for clarity of the description.
Fig. 1 shows a use state of a motion posture deriving apparatus according to an embodiment of the present invention.
As shown in fig. 1, a motion posture deriving device 100 according to an embodiment of the present invention is worn on the user's head. It may take the form of a band or a hat worn on the head, or a further miniaturized ear-insertion form such as an earphone.
Fig. 2 shows a schematic diagram of a motion posture deriving apparatus according to an embodiment of the present invention.
As shown in fig. 2, the motion posture deriving device 100 of the present embodiment includes a sensor signal collecting unit 110 and a motion posture deriving unit 121. The motion posture deriving device 100 of the present embodiment further includes a motion correction generating unit 122 and a correction information output unit 130.
The sensor signal collecting unit 110 includes a 3-axis acceleration sensor 111 covering the up-down, left-right, and front-back directions, and a position measurement sensor 112 for measuring the user's position. Any suitable sensor that measures acceleration in three axis directions may be used as the 3-axis acceleration sensor 111, for example a type incorporating a gyroscope. The position measurement sensor 112 measures the user's absolute position, for example using Global Positioning System (GPS) signals; a sensor based on recently developed ultra-precise satellite navigation technology, which is more accurate than GPS, may also be used. Further, as shown in fig. 2, the sensor signal collecting unit 110 may additionally include a 3-axis speed sensor 113 to improve accuracy in the motion recognition and analysis process described below.
As shown in fig. 1, the sensor signal collecting unit 110 of the present embodiment is worn on the head to measure the user's dynamic physical quantities such as acceleration, velocity, and position. Conventionally, walking has been monitored with pressure sensors placed in shoes, foot boards, and the like that the foot steps on directly. This damages the sensor quickly and shortens the durability and life of the device. Accuracy of walking recognition and analysis also falls as the device degrades during use, and frequent replacement reduces convenience and economy. Moreover, when such a device is built into a shoe, a different device is required for each user's foot size, which reduces user convenience and economy and burdens the manufacturer with size-by-size production.
In the present embodiment, by contrast, the concept of using the pressure applied by the foot is removed entirely from the walking recognition process; as shown in fig. 1, dynamic physical quantities of the user such as acceleration, velocity, and position are measured at the user's head, and walking is recognized, detected, and analyzed by applying the specific analysis algorithm of the present invention described below. The present embodiment thus differs from the conventional technique in where and what it measures. Since the root cause of the problems identified in the prior art is the technical configuration of placing a pressure sensor at the foot, this change of configuration alone fundamentally eliminates those problems.
The motion posture deriving unit 121 receives signals from the sensor signal collecting unit 110, derives walking or running motion state values, including the acceleration, speed, and position of the user's center of mass, from the 3-axis acceleration and position signals, and analyzes them to derive the walking or running posture. Specifically, the motion posture deriving unit 121 estimates a center-of-pressure path from the walking or running motion state values and analyzes it to derive the posture. The analysis algorithm used in the motion posture deriving unit 121 is described in detail later, so its description is omitted here.
On the other hand, the motion posture deriving unit 121 takes the form of an integrated circuit capable of performing various calculations and may be formed on the same substrate as the sensor signal collecting unit 110, or take the form of a separate computer or the like. When the motion posture deriving unit 121 and the sensor signal collecting unit 110 are formed separately, a communication unit 114 is provided for signal transmission between them, as shown in fig. 2. The communication unit 114 may use wired or wireless communication; the wireless communication may use Bluetooth, Wi-Fi, or NFC, and one of ordinary skill in the art will recognize that other wireless communication technologies may also be used.
The motion correction generation unit 122 compares the walking posture derived by the motion posture deriving unit 121 with a reference posture to generate posture correction information. As described above, the motion posture deriving unit 121 derives the user's walking or running posture from the signals collected by the sensor signal collecting unit 110; specifically, it derives the user's heading direction, speed, and the like while walking or running, from which the stride, one element of walking posture, is obtained. The motion correction generation unit 122 holds data relating optimal height and stride for given walking and running speeds, compares it with the user's walking posture information, determines whether the stride is too wide or too narrow for the user's height, and, if the stride is outside the optimal range, easily calculates the amount by which the stride should be reduced or increased.
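The stride-correction calculation described above can be sketched as follows: an optimal stride is derived from the user's height, and the correction amount is the difference from the measured stride. The height-to-stride ratio and tolerance are illustrative assumptions; the patent's actual relationship data also depends on walking or running speed.

```python
# Hypothetical sketch of the stride-correction computation. The 0.45
# height ratio and the tolerance are illustrative, not patent values.

def stride_correction(height_m, measured_stride_m, ratio=0.45, tolerance=0.05):
    """Return how much to lengthen (+) or shorten (-) the stride, in metres."""
    optimal = height_m * ratio           # optimal stride for this height
    diff = optimal - measured_stride_m
    if abs(diff) <= tolerance:
        return 0.0                       # within the optimal range: no correction
    return round(diff, 2)

# A 1.8 m user measured with a 1.1 m stride (ratio 0.5 for illustration):
print(stride_correction(1.8, 1.1, ratio=0.5))   # -0.2, i.e. shorten by 0.2 m
```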
The correction information output unit 130 converts the posture correction information generated by the motion correction generation unit 122 into a form the user can recognize, including voice, graphics, and video, and outputs it. For example, when the calculated stride correction amount indicates the stride should be reduced, a voice such as "reduce stride" is output from a speaker in the motion posture deriving device 100, or a warning sound alerts the user that the stride is not optimal so that the user changes walking posture. Alternatively, the device connects to a smartphone, computer, dedicated display, or the like, and outputs precise correction information in various forms such as diagrams or images.
Meanwhile, the motion posture deriving device 100 transmits the walking posture derived by the motion posture deriving unit 121 to an external database 140, where it is stored cumulatively. Users who need such walking or running analysis range from ordinary people who walk or jog daily to promote health to professionals who exercise to improve physical ability; preferably, the analysis data is accumulated so that change over time can be shown. When a large amount of such motion analysis data is stored, it can be used as big data and applied to various systems and analyses.
The motion posture deriving method according to an embodiment of the present invention detects the user's motion with the motion posture deriving device 100 and analyzes it to determine whether the user is walking or running. As described above, the analysis algorithm of the present invention uses dynamic physical quantities measured at the user's head; the motion posture deriving device 100 includes at least the 3-axis acceleration sensor 111 covering the up-down, left-right, and front-back directions and the position measurement sensor 112 that measures the user's position, and the motion posture deriving unit 121 executes the analysis algorithm described below. The motion posture deriving device 100 may include the various additional components described above to improve its functions.
Fig. 3 shows a flow chart of a motion gesture derivation method of an embodiment of the invention. As shown in the figure, the motion posture derivation method of the present embodiment includes a pressure center path estimation step, a motion type determination step, and a motion posture derivation step. Hereinafter, each step will be described in detail.
In the above-described center-of-pressure path estimation step, the 3-axis directional accelerations ax, ay, az collected by the motion posture deriving device 100 are used to calculate the motion state values and position of the user's center of mass, and the center-of-pressure path is estimated along the direction of the acceleration vector projected onto the ground.
In the motion posture deriving device 100, the 3-axis acceleration sensor 111 collects accelerations in the left-right, front-back, and up-down directions and integrates them, or the position sensor 112 collects position information over time, to obtain speed, position, and so on. In general, when analyzing the motion of an object it is preferable to take the motion of its center of mass as the reference; since the motion posture deriving device 100 is worn on the user's head, the values measured there are converted into motion state values of the center of mass. This conversion from head-position values to center-of-mass values is easily performed by applying a previously obtained ratio together with body information such as the user's height.
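The text does not spell out this conversion. One simple model consistent with it treats the body as swaying rigidly about the ground contact (an inverted pendulum), so a displacement measured at the head scales to the center of mass in proportion to their heights. The 0.5527 ratio is the adult-male center-of-mass figure quoted later in this document; the rigid-sway model and all names are assumptions.

```python
# Hypothetical inverted-pendulum sketch of converting a head-level
# measurement into a center-of-mass value via a height-based ratio.
COM_RATIO = 0.5527   # center-of-mass height / body height (adult male)

def head_to_com(displacement_at_head, height_m):
    """Scale a head-level lateral displacement down to the center of mass."""
    head_height = height_m                  # sensor assumed at the top of the head
    com_height = height_m * COM_RATIO
    return displacement_at_head * (com_height / head_height)

# A 10 cm lateral sway measured at the head of a 1.75 m user:
print(round(head_to_com(0.10, 1.75), 4))    # 0.0553 m at the center of mass
```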
Once the motion state values of the center of mass (acceleration, velocity, and position over time in each direction, frequency analysis, and so on) are derived, the center-of-pressure path can be estimated from them. While walking or running, the human body moves by pressing on the supporting foot, which receives a reaction pressure from the ground. The sum of these reaction pressures is called the ground reaction force (GRF), and their center is called the center of pressure (COP). The ground reaction force is directed from the center of pressure toward the center of mass (COM) of the human body.
FIG. 4 is a graph illustrating the relationship between center of mass and center of pressure according to one embodiment of the present invention.
In the present invention, this biomechanical characteristic is used in reverse: the center of pressure is estimated by projecting the direction of the force vector measured at the center of mass onto the ground.
FIG. 5 illustrates the determination of the center of pressure direction and location analogy in one embodiment of the present invention.
The center-of-pressure direction is the direction from the center of mass toward the center of pressure. In the pressure center path estimation step, the pressure center direction is determined first, and the center-of-pressure location is then inferred by projection along that direction. Specifically, in the pressure center direction determination step, as shown in fig. 5, the direction of the center of pressure is determined from the ratio of the left-right acceleration ax to the sum of the vertical acceleration az and the gravitational acceleration g, and from the ratio of the front-rear acceleration ay to that same sum. Once the direction of the center of pressure is determined, then, in the center-of-pressure position analogy step, the center of mass is assumed to lie at a height obtained by multiplying the previously measured height of the user by a predetermined analogy constant, and the center-of-pressure position is inferred by projecting the center of mass onto the ground along the direction determined in the pressure center direction determination step. Here, the analogy constant is the height of the center of mass relative to the height of the user. Typically, the center of mass of a child sits proportionally higher than that of an adult, and that of a male higher than that of a female, and such proportions are published. For example, the center of mass of an adult male is located on average at 55.27% of body height, in which case the analogy constant is 0.5527. Therefore, when the height information of the user is input, child/adult and male/female distinction information are input together, so that an appropriate analogy constant is selected for the calculation.
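The projection described above can be sketched as follows. This is only an illustrative reading of the geometry (the function name, the similar-triangle projection, and the sample values are assumptions, not the patent's implementation):

```python
def estimate_cop(ax, ay, az, height, analogy_constant=0.5527, g=9.81):
    """Project the center of mass onto the ground along the direction of
    the measured force to infer the center of pressure (COP).

    The analogy constant (COM height / body height) is the adult-male
    average cited in the text; the projection uses similar triangles:
    horizontal offset = COM height * (horizontal accel / vertical force).
    """
    com_height = height * analogy_constant
    fz = az + g                     # vertical component includes gravity
    cop_x = com_height * ax / fz    # left-right offset below the COM
    cop_y = com_height * ay / fz    # front-back offset below the COM
    return cop_x, cop_y
```

With zero horizontal acceleration the COP sits directly below the center of mass; as ax or ay grows, the estimated COP shifts away in proportion.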
To further improve the accuracy of the center-of-pressure position obtained in this way, a center-of-pressure position correction step is additionally executed, in which the position inferred in the center-of-pressure position analogy step is multiplied by predetermined front-rear and left-right correction constants. These correction constants are obtained statistically so that the initial center-of-pressure position produced by the above projection method matches the actual center of pressure in the front-rear and left-right directions.
In the motion type determination step, whether the user is walking or running is determined from the pattern of the vertical acceleration az graph.
FIG. 6 is an illustration of a footprint pattern found as an estimated center of pressure path. As shown, the left and right feet alternately support the ground and advance.
On the other hand, the distinction between walking and running is that, in walking, one foot or both feet are always in contact with the ground, whereas in running, one foot or both feet are off the ground at all times.
Fig. 7 shows the vertical acceleration versus time during walking and running. In the walking graph shown in part (A) of fig. 7, a peak occurs at the moment both feet contact the ground, while in the running graph shown in part (B) of fig. 7, an interval in which the vertical acceleration az holds a constant minimum value occurs at the moments both feet leave the ground. Because the vertical acceleration az graph thus shows a different pattern for walking and for running, it can be used to judge whether the user's current exercise is walking or running.
In the motion posture deriving step, posture information including stride length, step interval, step angle, and left-right asymmetry is derived on the basis of the estimated pressure center path value and the 3-axis accelerations ax, ay, az. This is described in more detail below using the center-of-pressure path example of fig. 6 and the vertical acceleration examples for walking and running of fig. 7.
First, although the shapes differ somewhat between the case where the user's exercise is walking and the case where it is running, some features naturally appear in common. As noted above, in walking one foot or both feet are always in contact with the ground, while in running one foot or both feet are always off the ground. That is, both walking and running contain intervals supported by a single foot. In consideration of these points, the motion posture deriving step includes a center support time point determination step, which determines the center support time points, and a section classification determination step, which distinguishes both-feet support sections, single-foot support sections, and floating sections; together these form the basic information for distinguishing walking from running and deriving the posture.
First, walking is described in detail as follows. At the moment the heel of one foot strikes the ground, the toe of the other foot has not yet left the ground; that is, a step starts in a state supported by both feet. Next, the ground is supported by one foot while the other foot leaves the ground, swings forward through the air, and the body moves forward as well. At the moment the heel of the swinging foot strikes the ground, the toe of the supporting foot has not yet left the ground, so both feet are again supporting, and one walking step is complete. During this process, the head does not bob downward at the moment the body moves forward supported by a single foot (the vertical acceleration az shows a minimum), whereas at the moment a foot strikes the ground the head shakes in the vertical direction (the vertical acceleration az forms a peak).
In other words, a walking motion divides into sections in which both feet are on the ground and sections in which a single foot is on the ground, and the vertical shaking is minimal only while a single foot is on the ground. This behavior appears in part (A) of fig. 7, and as this example shows, in the center support time point determination step, when the user's motion is walking, the time at which the vertical acceleration az measured over the time region reaches a minimum is taken as the center support time point. In the section classification determination step, when the user's motion is walking, the sections of the measured vertical acceleration az that form a peak are classified as both-feet support sections, and the remaining sections as single-foot support sections.
Next, running is briefly described as follows. A step begins at the moment the forward foot strikes the ground (at that moment the other foot is in the air). The supporting foot then pushes off the ground, both feet are airborne, and the body moves forward while the feet exchange front and rear positions so that the other foot swings forward. When that foot strikes the ground, another ground-strike moment is formed and one running step is complete. During this process, the head shakes in the vertical direction at the moment a foot strikes the ground (the vertical acceleration az forms a local maximum), whereas it does not shake vertically while the body moves forward in the air (the vertical acceleration az holds a constant value).
That is, a running motion divides into sections in which both feet are airborne and sections in which a single foot is on the ground, and the vertical shaking is minimal while both feet are airborne. This behavior appears in part (B) of fig. 7, and as this example shows, in the center support time point determination step, when the user's motion is running, the time at which the vertical acceleration az measured over the time region reaches a maximum is taken as the center support time point. In the section classification determination step, when the user's motion is running, the sections of the measured vertical acceleration az that hold a constant value are classified as floating sections, and the remaining sections as single-foot support sections. The constant value appearing in the floating section is the signal level when no external force other than gravity acts, and may be set to a value substantially close to 0. That is, this constant is a reference value from which the current phase can be discriminated, and in this sense it is called the posture discrimination constant (state phase constant); simply put, during running, if the vertical acceleration is smaller than the posture discrimination constant the section is judged to be a floating section, and if it is larger, a single-foot support section.
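The threshold rule just stated can be sketched directly. The function name, the labels, and the sample value of the posture discrimination constant k are illustrative assumptions; the patent fixes only that k is close to 0:

```python
def classify_running_sections(az_samples, k=0.5):
    """Label each vertical-acceleration sample during running as 'float'
    (both feet airborne, az below the posture discrimination constant k)
    or 'support' (single-foot support, az above k)."""
    return ["float" if a < k else "support" for a in az_samples]
```

Runs of consecutive 'float' labels correspond to the floating sections, and the maxima inside 'support' runs give candidate center support time points.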
As described above, once the basic information for posture derivation is obtained, walking or running posture elements such as stride length, step interval, step angle, and left-right asymmetry can be derived.
Stride length: first, the user position information is measured at predetermined time intervals to calculate an average speed. Next, the number of center support time points during the time interval is counted to calculate the step frequency. Finally, the average speed is divided by the step frequency, so that the stride length of the user can be accurately calculated.
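The stride calculation above reduces to one division. A minimal sketch, with illustrative parameter names and units (meters, seconds) that the patent does not specify:

```python
def stride_length(avg_speed_mps, n_center_supports, interval_s):
    """Stride length = average speed / step frequency, where step
    frequency is the number of center support time points counted
    during the measurement interval, divided by its duration."""
    step_freq = n_center_supports / interval_s   # steps per second
    return avg_speed_mps / step_freq             # meters per step
```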
Step interval: the step interval in the left-right direction is calculated using the center-of-pressure position value corresponding to the center support time point. That is, by applying the time value of a center support time point shown in part (A) or part (B) of fig. 7 to the center-of-pressure positions shown in fig. 6 and finding the corresponding position, the position where the left foot steps on the ground and the position where the right foot steps on the ground are obtained, and the interval between these positions accurately gives the user's step interval.
Step angle: the step angle is calculated using the center-of-pressure position value corresponding to the start time point of a single-foot support section and the value corresponding to its end time point. Simply put, at the start time point of a single-foot support section the heel contacts the ground, and at the end time point the toe contacts the ground.
That is, as described above, the angle between these center-of-pressure positions is the angle formed between the heel-strike position and the toe position, i.e. the step angle, and thus the user's step angle can be accurately calculated by this method.
Left-right asymmetry: first, the sign of the lateral acceleration ax measured over the time region is used to identify which foot is supporting. Next, the peak values and valley values of the vertical acceleration az measured over the time region, and the differences between them, are compared. That is, by comparing the peak and valley values for left-foot support and right-foot support, the left-right asymmetry of the user can be accurately calculated. The repeatability of walking or running is calculated in the same manner.
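One simple way to turn the peak comparison into a number is a normalized difference. This metric is an assumption for illustration; the patent describes comparing peaks and valleys but does not fix a formula:

```python
def left_right_asymmetry(left_peaks, right_peaks):
    """Normalized asymmetry between vertical-acceleration peaks
    recorded during left-foot and right-foot support (the supporting
    foot is labeled upstream from the sign of the lateral acceleration
    ax). Returns a value in [0, 1]; 0 means perfectly symmetric."""
    ml = sum(left_peaks) / len(left_peaks)
    mr = sum(right_peaks) / len(right_peaks)
    return abs(ml - mr) / max(ml, mr)
```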
FIG. 8 shows an example of a measured acceleration signal; in the bottom graph of FIG. 8, the vertical acceleration az is strongly asymmetric between left and right.
By the above method, walking or running posture elements such as stride length, step interval, step angle, and left-right asymmetry are derived, so that the user can monitor in real time whether he or she is walking or running with the correct posture. Naturally, the stride length, step interval, step angle, and left-right asymmetry values corresponding to the optimal posture can be stored in advance and compared with each currently monitored posture value to calculate a correction amount. By being informed of this in real time, or by storing it for later review, the user can effectively correct his or her posture and walk or run with an accurate posture.
Fig. 9 shows a flow chart of an exercise identification method for walking and running monitoring in accordance with another embodiment of the present invention.
The motion recognition method of the present embodiment divides roughly into two steps: a data collection and motion recognition step, which collects and analyzes the 3-axis accelerations ax, ay, az to judge whether the user is walking or running, and an acceleration-based motion state value derivation step, which uses the collected 3-axis accelerations ax, ay, az to calculate a plurality of motion state values referenced to the person's center of mass. Each step is described in more detail below.
FIG. 10 shows a detailed flow chart of the data collection and motion identification steps of another embodiment of the present invention.
As shown in fig. 10, the data collection and motion recognition step includes a vertical acceleration collection step, a peak detection step, a motion detection step, a 3-axis acceleration collection step, a fourier transform step, and a motion pattern determination step. The data collection and motion recognition step recognizes whether a user is exercising, and if so, whether the exercise corresponds to walking or running.
As shown in fig. 10, the collected data variables are first initialized in preparation for performing motion recognition.
In the vertical acceleration collection step, the 3-axis accelerations ax, ay, az are not all collected at all times; only the vertical acceleration az is collected. The collected vertical acceleration az can be used directly, but is preferably passed through a noise removal step that removes noise with a predetermined band-pass filter. The band-pass filter is typically set to 0.1 to 5 Hz, corresponding to the walking or running frequency of an ordinary person, but it will be apparent to those skilled in the art that this range can be modified as appropriate.
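As a minimal stand-in for such a band-pass stage, the sketch below zeroes spectral components outside the 0.1 to 5 Hz band with an FFT brick-wall filter. This is an illustrative substitute, not the patent's filter design (which could equally be an IIR or FIR band-pass):

```python
import numpy as np

def bandpass(signal, fs, lo=0.1, hi=5.0):
    """Remove components outside [lo, hi] Hz from a real signal
    sampled at fs Hz, via a brick-wall mask in the frequency domain.
    0.1-5 Hz is the walking/running band cited in the text."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum[(freqs < lo) | (freqs > hi)] = 0.0
    return np.fft.irfft(spectrum, n=len(signal))
```

Applied to a vertical-acceleration trace, this keeps the step-frequency content while discarding DC drift and high-frequency sensor noise.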
In the peak detection step, peaks are detected in the collected vertical acceleration az, and in the motion detection step, whether motion has occurred is judged by checking whether the az peak is equal to or greater than a predetermined threshold. If the motion detection step determines that no motion has occurred, the process returns to the initial preparation step and the variables are re-initialized.
Specifically, if the motion recognition device 100 collected and analyzed the 3-axis accelerations ax, ay, az at all times, an unnecessary computational load would arise whenever no exercise is being performed, leading to problems such as power consumption and heat generation. Meanwhile, between a user who is sitting or shifting the body and one who is walking or running, the greatest difference appears in the degree of vertical shaking, i.e. the vertical acceleration az. Therefore, only the vertical acceleration az of the user wearing the motion recognition device 100 is collected at first, and if its value is equal to or greater than a threshold, the user is judged to be walking or running and full motion detection begins, which prevents the unnecessary computational load described above.
In the 3-axis acceleration collection step, as described above, the 3-axis accelerations ax, ay, az are collected once the peak of the vertical acceleration az is equal to or greater than the predetermined threshold. Likewise, the collected 3-axis accelerations ax, ay, az are used directly or, preferably, passed through a noise removal step using a predetermined band-pass filter. The band-pass filter used here may be identical to the one used for removing noise from the vertical acceleration az, or may be modified as appropriate.
In the Fourier transform step, the 3-axis accelerations ax, ay, az are Fourier-transformed to derive a frequency response graph, and in the exercise form determination step, this graph is compared against a predetermined frequency response profile or magnitude reference to determine whether the motion is walking or running or some other motion. If the exercise form determination step concludes that no walking or running motion has occurred, the process returns to the initial preparation step for variable initialization; if walking or running motion is detected, the acceleration-based motion state value derivation step is performed.
As described above, in the exercise form determination step, it is determined whether or not the exercise of the user is a walking or running state.
Fig. 11 shows an example of measured acceleration signals during walking according to another embodiment of the present invention. The left side of fig. 11 shows the 3-axis accelerations ax, ay, az plotted against time, and the right side shows the frequency response graph derived by the Fourier transform step described above, with frequency on the horizontal axis and magnitude on the vertical axis.
When walking or running, the body naturally shakes periodically in the up-down, front-back, and left-right directions; that is, periodic signals are generated, as shown on the left of fig. 11. Because the left and right feet step alternately, the frequency of the left-right periodic signal is 1/2 of the frequency of the up-down and front-back periodic signals, which is easily confirmed in the right graph of fig. 11. Meanwhile, in walking one foot or both feet are always in contact with the ground, and in running the feet leave the ground; in either case, large periodic shaking of the head must occur.
In view of the above, the exercise form determination step judges the user's motion to be walking or running if the following expression is satisfied, and to be some other motion otherwise. Briefly, the relational expression states that if the degree of periodic fluctuation in the up-down and left-right directions is at or above a predetermined level, the motion is judged to be walking or running.
Mz,p/Mz,other>cz and Mx,p/Mx,other>cx
wherein az is the vertical acceleration; fp is the frequency at which the Fourier transform of az has its maximum; Mz,p is, in the Fourier transform of az, the sum of the energies of the frequency components belonging to the vertical reference band, a band narrower than 1 Hz centered at fp; Mz,other is, in the Fourier transform of az, the sum of the energies of the remaining frequency components outside the vertical reference band; cz is a predetermined vertical reference threshold; ax is the left-right acceleration; Mx,p is, in the Fourier transform of ax, the sum of the energies of the frequency components belonging to the left-right reference band, a band narrower than 1 Hz centered at fp/2; Mx,other is, in the Fourier transform of ax, the sum of the energies of the remaining frequency components outside the left-right reference band; and cx is a predetermined left-right reference threshold.
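The relational expression can be evaluated directly from the FFT energies. In this sketch the band width and the thresholds cz, cx are illustrative assumptions (the text fixes only that the reference band is narrower than 1 Hz), and the DC component is excluded so gravity offset does not dominate:

```python
import numpy as np

def is_walking_or_running(ax, az, fs, band=1.0, cz=2.0, cx=2.0):
    """Evaluate Mz,p/Mz,other > cz and Mx,p/Mx,other > cx:
    energy of az concentrated in a narrow band around its dominant
    frequency fp, and energy of ax concentrated around fp/2."""
    freqs = np.fft.rfftfreq(len(az), d=1.0 / fs)
    Ez = np.abs(np.fft.rfft(az)) ** 2
    Ex = np.abs(np.fft.rfft(ax)) ** 2
    Ez[0] = Ex[0] = 0.0                       # ignore the DC component
    fp = freqs[np.argmax(Ez)]                 # dominant vertical frequency
    in_z = np.abs(freqs - fp) < band / 2      # vertical reference band
    in_x = np.abs(freqs - fp / 2) < band / 2  # left-right band at fp/2
    rz = Ez[in_z].sum() / max(Ez[~in_z].sum(), 1e-12)
    rx = Ex[in_x].sum() / max(Ex[~in_x].sum(), 1e-12)
    return rz > cz and rx > cx
```

A vertical signal at 2 Hz paired with a lateral signal at 1 Hz (the half-frequency pattern of alternating steps) passes the test, while a lateral signal far from fp/2 fails it.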
When the above expression is satisfied and the user's motion is judged to be walking or running, it must next be determined which of the two it is. As described above, in walking one foot or both feet are always in contact with the ground, while in running the feet leave the ground. During running, at the moments both feet are off the ground, no upward external force acts on the user, and thus the vertical acceleration az holds a constant minimum value.
In view of the above, in the motion pattern determination step, if the vertical acceleration az contains an interval satisfying the following expression, the user's motion is judged to be running; otherwise, it is judged to be walking:
az<k
wherein k is the posture discrimination constant.
The posture discrimination constant is a predetermined signal-level value corresponding to no external force other than gravity acting on the acceleration, and can appropriately be set to a value close to 0.
In the data collection and motion recognition step, when walking or running motion of the user is detected, the acceleration-based motion state value derivation step is performed using the collected variables.
Fig. 12 is a detailed flowchart of the acceleration-based motion state value derivation step according to another embodiment of the present invention.
As shown in the figure, the acceleration-based motion state value derivation step includes a center-of-mass acceleration derivation step and a center-of-mass velocity and position derivation step.
In the center-of-mass acceleration derivation step, the 3-axis accelerations ax, ay, az are each multiplied by predetermined gain values to derive the acceleration of the user's center of mass. In general, when analyzing the motion of an object, the motion of its center of mass is taken as the reference, whereas all the variable values used in the analysis here are measured at the user's head; they are therefore converted into motion state values of the center of mass. This gain may be represented as a constant vector γ and obtained in advance using body information such as the user's height.
In the center-of-mass velocity and position derivation step, the velocity and position of the user's center of mass are derived using the previously measured height information, the user's position information, and the center-of-mass acceleration. That is, as described above, the velocity and position of the center of mass are obtained either by integrating the derived center-of-mass acceleration (which introduces integration constants), or from the user position information measured over time by the position sensor. An integration-constant error exists between the two calculations, and the two are appropriately compared to obtain accurate center-of-mass velocity and position values.
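The gain multiplication and the double integration above can be sketched together. The function name and the zero integration constants are assumptions for illustration; the text removes the integration-constant ambiguity by fusing with position-sensor data, which is omitted here:

```python
import numpy as np

def com_motion(head_acc, gamma, dt):
    """Scale head-measured 3-axis acceleration (shape (N, 3)) by the
    gain vector gamma obtained in advance from body information, to get
    the center-of-mass acceleration, then integrate twice (rectangle
    rule, zero initial conditions) for velocity and position."""
    a_com = head_acc * gamma                 # per-axis gain
    v_com = np.cumsum(a_com, axis=0) * dt    # velocity
    p_com = np.cumsum(v_com, axis=0) * dt    # position
    return a_com, v_com, p_com
```

For a constant 1 m/s² vertical acceleration over 1 s, this recovers v ≈ 1 m/s and p ≈ 0.5 m, up to the rectangle-rule error.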
As described above, according to the present invention, it is possible to accurately determine whether a user is walking or running using the acceleration, position, and so on measured at the user's head, and to accurately grasp how the user's weight center moves during walking or running (i.e., how the acceleration, velocity, and position of the center of mass evolve). On this basis, various elements of the walking or running posture can be derived and used to correct the posture.
Fig. 13 shows a use state of a motion posture deriving apparatus of another embodiment of the present invention.
As shown in fig. 13, the motion posture deriving device 1300 of the present embodiment is worn on the body of the user, specifically on the head and waist. That is, as shown in the schematic diagram of fig. 13, in the motion posture deriving device 1300 of the present embodiment, the head sensor signal collection unit 1310H worn on the head is inserted into the ear like an earphone, and the waist sensor signal collection unit 1310W is worn at the waist. Of course, the present invention is not limited thereto, and the head sensor signal collection unit 1310H may take various modified forms, such as a head-mounted configuration, a glasses configuration, a configuration inserted into a separate hat, or a helmet configuration.
Fig. 14 shows a diagram of a motion posture deriving device of another embodiment of the present invention.
As shown in fig. 14, the exercise posture deriving device 1300 according to another embodiment of the present invention includes a head sensor signal collection unit 1310H, a lumbar sensor signal collection unit 1310W, and an exercise posture deriving unit 1421. The motion posture deriving device 1300 further includes a motion correction generating unit 1422 and a correction information output unit 1430.
The head sensor signal collection unit 1310H includes a head 3-axis acceleration sensor 1411H covering the up-down, left-right, and front-back directions, and the lumbar sensor signal collection unit 1310W includes a lumbar 3-axis acceleration sensor 1411W covering the same directions, together with a position measurement sensor 1412W for measuring the position of the user.
For the head 3-axis acceleration sensor 1411H and the lumbar 3-axis acceleration sensor 1411W, an appropriate sensor measuring acceleration in the 3 axis directions is selected, commonly in a form with a built-in gyroscope. The position measurement sensor 1412W measures the absolute position of the user, for example using global positioning system signals; more recently, ultra-precise satellite navigation techniques with accuracy higher than the global positioning system have been developed, and a sensor based on such a technique can also be used. In addition, to raise the accuracy of the head sensor signal collection unit 1310H and the lumbar sensor signal collection unit 1310W beyond what the motion recognition and analysis process described in detail below requires, a 3-axis angular velocity sensor 1412H may further be included, as shown in fig. 14.
In the exercise posture deriving device 1300 of the present embodiment, as shown in fig. 13, the head sensor signal collection unit 1310H and the waist sensor signal collection unit 1310W are worn on the head and waist of the user to measure dynamic physical quantities of the user such as acceleration, velocity, and position. In particular, when measuring a dynamic physical quantity for posture derivation, the present embodiment uses the value measured at the position whose motion most resembles that of the center of mass of the user's body. Specifically, in the present invention, the left-right acceleration is measured at the head, the front-back acceleration and the position are measured at the waist, and the up-down acceleration is measured at the head or the waist. As for the up-down acceleration, measurement at either the head or the waist exhibits considerable accuracy, so a value measured at one of them, or the average of values measured at both, is selectively used.
Of course, since each acceleration sensor measures acceleration in the up-down, left-right, and front-back directions, i.e. all 3 axes, the calculations described later could be performed using the accelerations collected by the head sensor signal collection unit 1310H alone or the waist sensor signal collection unit 1310W alone. However, during walking and running, the left-right motion of the head best resembles the left-right motion of the body's center of mass, and the front-back motion at the waist best resembles the front-back motion of the center of mass, while the up-down motion is similar at the head, the waist, and the center of mass. Since the motion recognition method of the present invention described later performs motion recognition, posture derivation, and so on using the dynamic physical quantities of the body's center of mass, the combination above is used: the left-right acceleration is taken from the head, the front-back acceleration from the waist, and the up-down acceleration from the head or the waist, or from the average of the values measured at both.
The exercise posture deriving unit 1421 receives signals from the head sensor signal collecting unit 1310H and the lumbar sensor signal collecting unit 1310W, derives a walking or running exercise state value including the acceleration, speed, and position of the center of mass of the user using the 3-axis direction acceleration and position signal, and analyzes the walking or running exercise state value to derive a walking or running posture. Specifically, the exercise posture derivation unit 1421 estimates a pressure center path from the walking or running exercise state value, and analyzes the pressure center path to derive a walking or running posture. The analysis algorithm of the present invention used in the motion posture deriving part 1421 will be described in detail later, and therefore, it is omitted here.
On the other hand, the exercise posture deriving unit 1421 takes the form of an integrated circuit capable of performing the various calculations, and may be formed integrally with the lumbar sensor signal collection unit 1310W on one substrate, or may take the form of a separate computer or the like. A head communication unit 1413H and a lumbar communication unit 1413W are provided in the head sensor signal collection unit 1310H and the lumbar sensor signal collection unit 1310W, respectively, for signal transmission to the exercise posture deriving unit 1421. When the lumbar sensor signal collection unit 1310W is integrated with the exercise posture deriving unit 1421, the lumbar communication unit 1413W is directly connected to the exercise posture deriving unit 1421 to transmit signals, and also receives signals from the head communication unit 1413H and relays them to the exercise posture deriving unit 1421. The head communication unit 1413H and the lumbar communication unit 1413W may be wired, or may transmit signals over at least one wireless link selected from Bluetooth, Wi-Fi, and short-range wireless communication techniques to improve user convenience.
The motion correction generation unit 1422 compares the walking posture derived by the motion posture derivation unit 1421 with a reference posture to generate posture correction information. As described above, the exercise posture deriving unit 1421 derives the walking or running posture of the user based on the signals collected by the head sensor signal collecting unit 1310H and the waist sensor signal collecting unit 1310W; specifically, for example, the forward direction, speed, and the like of the user during walking or running can be derived, from which the stride length, one of the elements of the walking posture, can be obtained. In this case, the exercise correction generation unit 1422 holds reference data relating height and optimal stride length to walking and running speed, compares this data with the walking posture information of the user, determines whether the stride is too wide or too narrow for the height of the user, and can easily calculate the amount by which the stride should be reduced or increased when it falls outside the optimal range.
The correction information output unit 1430 converts the posture correction information generated by the motion correction generation unit 1422 into information that can be recognized by the user, including voice, graphics, and video, and outputs the information. For example, when the stride length correction amount is calculated and the stride length needs to be reduced, a voice such as "reduce stride length" is output through a speaker provided in the exercise posture deriving apparatus 1300, or a warning sound is emitted, so that the user recognizes that the current stride is not the optimum stride and changes the walking posture accordingly. In particular, in the above case, the correction information output unit 1430 is integrated with the head sensor signal collection unit 1310H, so that information is delivered close to the user's sensory organs, i.e., the eyes, ears, and the like. Specifically, as shown in fig. 13, the head sensor signal collection unit 1310H takes the form of an earphone inserted in the ear, and when the output information is a voice or acoustic signal, the efficiency of correction information transmission can be maximized. Alternatively, the system may be connected to a smartphone, a computer, a dedicated display, or the like, and output precise correction information as illustrations or video.
At the same time, the exercise posture deriving device 1300 transmits the walking posture derived by the exercise posture deriving unit 1421 to the external database 1440, and accumulates and stores the walking posture. The user who needs such walking or running exercise analysis is an ordinary person who performs walking or jogging daily for the purpose of promoting health or a professional who exercises for the purpose of improving physical ability, and preferably, such exercise analysis data is accumulated and needs to exhibit temporal changes. In addition, if such motion analysis data is stored in a large amount, such data is used as big data, and is applicable to various systems, analyses, and the like.
In the motion posture deriving device 1300 of the present embodiment described above, a specific integration from motion recognition through motion posture derivation to correction information output is exemplified as follows. First, as described above, the right-left direction acceleration is collected at the head, the front-rear direction acceleration and position are collected at the waist, and the up-down direction acceleration is collected at the head or waist, in a manner similar to the movement of the center of mass of the body of the user.
The lumbar sensor signal collection unit 1310W may be integrated with the motion posture deriving unit 1421, and therefore, the physical quantity collected by the head sensor signal collection unit 1310H is transmitted to the motion posture deriving unit 1421 side through the head communication unit 1413H. At this time, the lumbar communication unit 1413W provided in the lumbar sensor signal collection unit 1310W receives the signal and transmits the signal to the exercise posture derivation unit 1421.
As described above, the walking and running postures of the user are derived in the exercise posture deriving unit 1421 from the collected physical quantities such as acceleration and position. The motion correction generating unit 1422 compares the actual posture derived in this manner with the reference posture to generate posture correction information. Here, the motion posture deriving unit 1421 and the motion correction generating unit 1422 are also formed integrally, that is, both are formed integrally with the lumbar sensor signal collecting unit 1310W.
In order to efficiently transmit the posture correction information generated in this manner to the user, it is preferable to transmit the information to the head near the eyes, ears, and the like of the information collection organ of the user. As described above, when the head sensor signal collection unit 1310H is integrated with the correction information output unit 1430, the generated information for posture correction is transmitted to the correction information output unit 1430 via the waist communication unit 1413W and the head communication unit 1413H in this order, and effective transmission of the correction information is realized in a form in which voice information or the like is transmitted to the ear of the user.
In the motion posture deriving method according to another embodiment of the present invention, analysis is performed to detect the motion of the user by the motion posture deriving device 1300 and determine whether the user is walking or running. In this case, as described above, the analysis algorithm used in the present invention uses the dynamic physical quantities measured at the head and waist of the user, the motion posture deriving device 1300 includes at least the head 3 axis direction acceleration sensor 1411H having the up-down, left-right, and front-rear directions, the waist 3 axis direction acceleration sensor 1411W having the up-down, left-right, and front-rear directions, and the position measuring sensor 1412W that measures the position of the user, and the analysis algorithm described below is executed in the motion posture deriving unit 1421. The motion posture deriving device 1300 may further include various additional configurations described above in order to improve the functions of the device.
Fig. 15 is a schematic view of an injury risk quantifying apparatus according to another embodiment of the present invention.
The injury risk quantifying apparatus 1500 of the present embodiment notifies a user of the risk of injury that may occur during walking or running. The concrete description is as follows. When walking or running, the ankle, knee, waist, etc. are subjected to stress for various reasons, such as wrong posture or hard ground. In order to prevent such a risk, conventionally, there have been only proposals such as wearing impact-absorbing functional sports shoes, and there is no accurate index of whether or not there is a risk of injury. In the present embodiment, the injury risk is quantified as a determination index, and when the injury risk increases to a predetermined level or more, the degree of risk is notified to the user by an alarm. Thus, the user can appropriately stop walking or running, correct posture, change walking or running course, or the like before injury occurs, and the risk of injury occurring during walking or running can be greatly reduced.
The injury risk quantifying apparatus 1500 of the present embodiment includes a sensor signal collecting unit 1510, a control unit 1520, and an alarm unit 1530. The injury risk quantifying apparatus 1500 may further include a database 1540.
The sensor signal collecting section 1510 includes an acceleration sensor 1511, and is worn on the upper body of the user excluding the arms. There may be one or more sensor signal collecting sections 1510. When there are 2 sensor signal collecting units 1510, they can be worn on the head and the waist of the user, in which case the sensor signal collecting unit worn on the head of the user can be classified as a head sensor signal collecting unit 1510H, and the sensor signal collecting unit worn on the waist of the user can be classified as a waist sensor signal collecting unit 1510W. As a specific example of the wearing state, the head sensor signal collecting unit 1510H may be inserted into the ear like an earphone, and the waist sensor signal collecting unit 1510W may be attached at the waist. Of course, the present invention is not limited to this, and the head sensor signal collecting unit 1510H may take various modified forms, such as a head-mounted form, a glasses form, a form inserted in a separate hat, or a helmet form. Although not shown, the sensor signal collecting unit 1510 may be worn elsewhere on the upper body other than the arms; when worn on the chest, it may be housed in or inserted into a chest pocket of a garment, or worn by means of a separate vest or harness.
As described above, the sensor signal collection section 1510 includes the acceleration sensor 1511. Any suitable sensor that measures acceleration in the 3-axis directions, such as a common gyroscope-integrated type, may be used as the acceleration sensor 1511. On the other hand, the injury risk quantifying apparatus 1500 includes a control unit 1520 that performs calculation, control, and the like using the acceleration data signal collected by the acceleration sensor 1511. Alternatively, the control unit 1520 may be embodied in various modified forms, such as an application on a conventionally used smartphone. When the control unit 1520 and the sensor signal collection unit 1510 are formed separately, the sensor signal collection unit 1510 may further include a communication unit 1512 so that the acceleration data signal collected by the acceleration sensor 1511 is smoothly transmitted to the control unit 1520. Such signal transmission may be performed by wired communication over wiring, or by wireless communication such as Bluetooth, Wi-Fi, or short-range wireless communication techniques, and an appropriate form is adopted according to the required conditions or performance.
Next, the method for quantifying the risk of injury according to the present embodiment will be described in more detail, and in the present embodiment, the vertical acceleration is used in the process of determining the risk of injury.
In the present embodiment, vertical acceleration is used to quantify the risk of injury. Generally, when running, the left-right movement of the head is similar to the left-right movement of the center of mass of the user's body, and the front-back movement of the waist is similar to the front-back movement of the center of mass. Meanwhile, the up-and-down movement of the upper body, from the head to the waist, is similar to that of the center of mass. Within the upper body, the arms move in the front-rear direction independently of the movement of the center of mass, and are therefore excluded. In view of this, the acceleration in the up-down direction can be measured at any position on the upper body except the arms. In detail, since the vertical acceleration can be accurately measured at any such position, a value measured at a single position on the head or waist may be used selectively, or an average of the values measured at both positions may be used.
The control unit 1520 receives the signal from the sensor signal collection unit 1510, calculates at least one injury risk determination index based on the vertical acceleration a_z measured at the relevant position, and uses the index to determine and control whether an alarm is generated. Specifically, the control unit 1520 derives, as the injury risk determination index, at least one selected from the average inclination of the vertical acceleration a_z, the maximum inclination of the vertical acceleration a_z, the maximum impact force, and the impact amount, thereby quantifying the injury risk and determining the risk level. The derivation of the injury risk determination index executed by the control unit 1520 will be described in more detail below together with the method for quantifying an injury risk according to the present embodiment.
The actual embodiment of the control unit 1520 may be variously formed according to needs or purposes. That is, the control unit 1520 is in the form of an integrated circuit capable of executing various calculations, is formed integrally with the sensor signal collecting unit 1510, may be formed as one substrate, may be in the form of an additional dedicated device (i.e., a separate device formed only for the purpose of quantifying the risk of injury), an additional computer, or the like, and as described above, a conventionally used smartphone may be embodied in an application form. As described above, in the case where the control section 1520 is formed integrally with the sensor signal collection section 1510, the integrated signal is directly received from the acceleration sensor 1511. On the other hand, if the control unit 1520 is in the form of a smartphone application or the like and is formed separately from the sensor signal collection unit 1510, it receives a signal from the acceleration sensor 1511 by wired or wireless communication.
The alarm unit 1530 receives the alarm generation control signal from the control unit 1520 and alerts the user to the risk of injury. The control unit 1520 determines, from at least one injury risk determination index calculated from the vertical acceleration a_z, whether the injury risk is equal to or greater than a predetermined reference, and if so, causes the alarm unit 1530 to generate an alarm, thereby notifying the user of the risk.
The alarm unit 1530 outputs an alarm signal as information that can be recognized by the user, including sound, graphics, and video. For example, when the alarm unit 1530 is formed as a speaker that outputs an audio signal, a warning sound is generated if the injury risk is equal to or greater than the reference. Alternatively, when the device of the present embodiment is applied to augmented reality glasses, the alarm unit 1530 outputs a red warning pattern or a blinking image of such a pattern to the augmented reality glasses, or outputs information such as "injury risk is xx%". The alarm unit 1530 may also be embodied as a thermoelectric element in direct contact with the user's skin, which cools or heats when the injury risk is equal to or greater than the reference value, thereby alerting the user. As another example, the alarm portion 1530 may take a changeable braille form recognizable by touch, to accommodate visually impaired users. As described above, the alarm unit may output the alarm signal in any form of information that the user can recognize.
At the same time, the injury risk quantifying apparatus 1500 transfers and cumulatively stores injury risk data including the injury risk determination index value at the time point of occurrence of injury risk information and the corresponding time point to the external database 1540. The user who needs such walking or running exercise analysis is an ordinary person who performs walking or jogging daily for the purpose of promoting health or a professional who exercises for the purpose of improving physical ability, and preferably, such exercise analysis data is accumulated and needs to exhibit temporal changes. In addition, if such motion analysis data is stored in a large amount, such data is used as big data, and is applicable to various systems, analyses, and the like.
Fig. 16 is a flowchart illustrating an injury risk quantifying method according to another embodiment of the present invention.
As described above, the method for quantifying the risk of injury according to the present embodiment derives injury risk determination indexes from the vertical acceleration a_z measured by at least one sensor signal collecting unit 1510, which includes the acceleration sensor 1511 and is attached to the upper body of the user excluding the arms, and thereby quantifies the injury risk. Accordingly, the method for quantifying an injury risk of the present embodiment includes a data collection step, a determination index derivation step, an injury risk determination step, and an injury risk alarm step. In addition, a noise removal step may be included to improve the accuracy of deriving the injury risk determination index. The respective steps shown in fig. 16 are explained in detail as follows.
In the data collection step, the vertical acceleration a_z measured by the sensor signal collection unit 1510 is collected. The collected vertical acceleration a_z may be used directly, but preferably passes through a noise removal step in which noise is removed by a predetermined band-pass filter. In this case, the pass band may be set, for example, to 0.1 to 5 Hz, corresponding to the walking or running frequency of an average person, but the range may be changed as appropriate.
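As one illustration of this noise removal step, the sketch below applies a crude 0.1 to 5 Hz band-pass to a sampled a_z signal by cascading a first-order high-pass and a first-order low-pass stage. The patent does not specify the filter design, so the filter type, sample rate, and function name here are assumptions, not the embodiment's actual filter.

```python
import math

def band_pass(samples, fs, f_lo=0.1, f_hi=5.0):
    """Crude band-pass: first-order high-pass at f_lo cascaded with a
    first-order low-pass at f_hi; fs is the sample rate in Hz."""
    dt = 1.0 / fs
    rc_hp = 1.0 / (2.0 * math.pi * f_lo)   # high-pass time constant
    a_hp = rc_hp / (rc_hp + dt)
    rc_lp = 1.0 / (2.0 * math.pi * f_hi)   # low-pass time constant
    a_lp = dt / (rc_lp + dt)
    out = []
    hp_prev_in, hp_prev_out, lp_state = samples[0], 0.0, 0.0
    for x in samples:
        hp = a_hp * (hp_prev_out + x - hp_prev_in)  # removes slow drift / constant offset
        hp_prev_in, hp_prev_out = x, hp
        lp_state += a_lp * (hp - lp_state)          # removes high-frequency noise
        out.append(lp_state)
    return out
```

A constant (gravity-offset) input is driven to zero by the high-pass stage, while a step-frequency component in the pass band is largely preserved; a production design would more likely use a higher-order filter.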
In the determination index derivation step, at least one injury risk determination index is calculated based on the vertical acceleration a_z. In this case, the injury risk determination index may be at least one selected from the average inclination of the vertical acceleration a_z, the maximum inclination of the vertical acceleration a_z, the maximum impact force, and the impact amount. Each determination index will be described in more detail later.
In the injury risk determination step, it is determined whether the injury risk determination index is larger than a predetermined reference. In this case, as described above, a plurality of injury risk determination indexes may be provided; an alarm may be generated when any one of them is equal to or greater than its reference, or only when all of them are, or information may be generated step by step according to an appropriate priority. If the injury risk determination index is smaller than the predetermined reference, the process returns to the data collection step without generating an alarm. Meanwhile, in the injury risk determination step, it is preferable to determine the injury risk determination index from data calculated by combining at least 2 or more cycles of the periodically appearing vertical acceleration a_z signal.
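The any-index versus all-indexes alarm policy described above can be sketched as follows; the function and key names are illustrative assumptions, since the embodiment does not name them.

```python
def should_alarm(index_values, references, mode="any"):
    """Return True if an alarm should be generated.
    mode "any": one index at or above its reference suffices;
    mode "all": every index must be at or above its reference."""
    hits = [index_values[name] >= references[name] for name in references]
    return any(hits) if mode == "any" else all(hits)
```

A staged-priority variant would simply evaluate the indexes in a fixed order and report the first (or most severe) one that exceeds its reference.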
In the injury risk alarm step, when at least one of the injury risk determination indexes is larger than the predetermined reference, the injury risk is notified to the user. As described above, the alarm may take various forms such as audio, graphics, and video, and upon receiving the alarm, the user can take measures to actively reduce the injury risk (such as ending the exercise, correcting posture, changing shoes, or changing the course), thereby greatly reducing the risk of injury.
Hereinafter, a plurality of examples of the injury risk determination index used in the present invention and a process of deriving each will be described in more detail.
Fig. 17 shows a vertical acceleration chart during running according to another embodiment of the present invention.
As shown in the figure, the vertical acceleration a_z exhibits a periodic form over time (walking and running are periodic movements, so this is inevitable). Taking running as a specific example, one cycle starts at the moment one foot, stepped forward, contacts the ground (with the other foot in the air). The supporting foot then pushes off the ground so that both feet are momentarily airborne while the body moves forward and the feet exchange positions, and the other foot swings forward. When that foot contacts the ground, the next ground-contact moment is formed, completing one running step. In this process, at the moment a foot steps on the ground, the person's head shakes in the up-down direction (the vertical acceleration a_z reaches its maximum), and conversely, while the body is airborne moving forward, there is no vertical shaking (the vertical acceleration a_z stays nearly constant).
As described above, the impact applied to the joints is largest at the moment the foot steps on the ground, and this impact appears as the first peak in the vertical acceleration chart shown in fig. 17. The injury risk changes depending on the degree of impact at this moment, and in the present invention this degree is turned into indexes that serve as the basis for quantitative determination. As such determination indexes, the present invention uses, as described above, the average inclination of the vertical acceleration a_z, the maximum inclination of the vertical acceleration a_z, the maximum impact force, and the impact amount.
Fig. 18 shows the inclination of the vertical acceleration chart during running according to another embodiment of the present invention. With reference to it, the average inclination and the maximum inclination of the vertical acceleration a_z will now be described.
First, when the injury risk determination index is selected as the average inclination of the vertical acceleration a_z, the index is calculated using the following equation.
Average inclination = mean[(a_z(t_i) - a_z(t_{i-1})) / (t_i - t_{i-1})], i = 1, 2, …, n, t_0 = t_c, t_n = t_m
where a_z: vertical acceleration, mean: averaging function, i: index number, t_i: i-th time point, t_{i-1}: (i-1)-th time point, t_c: impact start time, t_m: impact end time.
The impact start time means the moment the foot actually touches the ground. It is the time point at which the vertical acceleration a_z, rising from values of 0 or less, breaks upward through a predetermined reference value close to 0 (for example, 0.3 m/s²). The specific reference value used to determine the impact start time may be chosen appropriately from values of 0.5 m/s² or less, as described above. The impact end time is the time point at which the first peak appears, and can easily be identified intuitively on the graph. The index i numbers the time points obtained by dividing the interval from the impact start time to the impact end time into n equal parts, n being determined as appropriate.
The average inclination value is the average of the n inclination values obtained over the n equal sub-intervals between the impact start time and the impact end time. Fig. 18 shows the vertical acceleration a_z over one cycle, and the average inclination value can be obtained within such a single cycle. On the other hand, as shown in fig. 17, since the graph of the form shown in fig. 18 repeats continuously during running, the average inclination value can be obtained for each cycle (i.e., each step). In this case, the determination index deriving step may also obtain the average vertical loading rate, calculated as the product of the user mass m and the average inclination.
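The average-inclination equation above can be sketched as follows. For illustration, a_z is treated here as a callable of time (a real device would interpolate discrete samples), and the function names are assumptions.

```python
def average_inclination(a_z, t_c, t_m, n):
    """Mean of (a_z(t_i) - a_z(t_{i-1})) / (t_i - t_{i-1})
    over n equal sub-intervals, with t_0 = t_c and t_n = t_m."""
    ts = [t_c + (t_m - t_c) * i / n for i in range(n + 1)]
    slopes = [(a_z(ts[i]) - a_z(ts[i - 1])) / (ts[i] - ts[i - 1])
              for i in range(1, n + 1)]
    return sum(slopes) / n

def average_vertical_loading_rate(m, a_z, t_c, t_m, n):
    """Average inclination scaled by the user mass m."""
    return m * average_inclination(a_z, t_c, t_m, n)
```

For a linearly rising a_z, every sub-interval slope equals the overall slope, so the average inclination reproduces it exactly, as expected.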
On the other hand, when the injury risk determination index is selected as the maximum inclination of the vertical acceleration a_z, the index is calculated using the following equation.
Maximum inclination = max[(a_z(t_i) - a_z(t_{i-1})) / (t_i - t_{i-1})]
i = 1, 2, …, n
t_0 = t_c, t_n = t_m
where a_z: vertical acceleration, max: maximum function, i: index number, t_i: i-th time point, t_{i-1}: (i-1)-th time point, t_c: impact start time, t_m: impact end time.
That is, of the n inclination values obtained between the impact start time and the impact end time within one cycle (one step), as described for the average inclination, the maximum value is taken. In this case, the determination index deriving step may also calculate the instantaneous vertical loading rate, obtained as the product of the user mass m and the maximum inclination.
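The maximum-inclination equation above, and the loading rate derived from it, can be sketched as follows; as before, a_z as a callable and the function names are illustrative assumptions.

```python
def maximum_inclination(a_z, t_c, t_m, n):
    """Max of (a_z(t_i) - a_z(t_{i-1})) / (t_i - t_{i-1})
    over n equal sub-intervals, with t_0 = t_c and t_n = t_m."""
    ts = [t_c + (t_m - t_c) * i / n for i in range(n + 1)]
    return max((a_z(ts[i]) - a_z(ts[i - 1])) / (ts[i] - ts[i - 1])
               for i in range(1, n + 1))

def instantaneous_vertical_loading_rate(m, a_z, t_c, t_m, n):
    """Maximum inclination scaled by the user mass m."""
    return m * maximum_inclination(a_z, t_c, t_m, n)
```

For a convex curve such as a_z(t) = t², the steepest sub-interval is the last one, so the maximum inclination exceeds the average inclination, which is the point of using both indexes.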
Fig. 19 shows the impact amount of the vertical acceleration chart during running according to another embodiment of the present invention. From this, a process of deriving the maximum impact force and the impact amount will be described.
First, when the injury risk determination index is selected as the maximum impact force value, the injury risk determination index is calculated using the following expression.
Maximum impact force = m × a_z(t_m)
where a_z: vertical acceleration, m: user mass, t_m: impact end time.
As described above, the impact end time is the time point at which the first peak appears, which is of course also the time point at which the maximum impact force appears. In fig. 19, the value of the vertical acceleration a_z at the first peak (1st peak) multiplied by the user mass m is the maximum impact force value.
On the other hand, when the injury risk determination index is selected as the impact magnitude, the injury risk determination index is calculated using the following equation.
Impact amount = m × ∫ a_z dt, integrated from t_c to t_m
where a_z: vertical acceleration, m: user mass, t_c: impact start time, t_m: impact end time.
In fig. 19, the area under the vertical acceleration a_z graph from the impact start time to the impact end time, multiplied by the user mass m, is the impact amount value.
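The maximum impact force and the impact amount defined above can be sketched as follows, with the integral approximated by the trapezoidal rule; a_z as a callable and the function names are illustrative assumptions.

```python
def maximum_impact_force(m, a_z, t_m):
    """User mass m times the vertical acceleration at the first peak
    (the impact end time t_m)."""
    return m * a_z(t_m)

def impact_amount(m, a_z, t_c, t_m, n=100):
    """Trapezoidal approximation of m * integral of a_z over [t_c, t_m]."""
    h = (t_m - t_c) / n
    area = 0.5 * (a_z(t_c) + a_z(t_m)) + sum(a_z(t_c + i * h) for i in range(1, n))
    return m * area * h
```

For a constant a_z the trapezoidal sum is exact, which gives a quick sanity check of the implementation.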
Fig. 20 shows a motion recognition first device according to another embodiment of the present invention.
The motion recognition first device 2000 of the present embodiment (hereinafter, referred to as the first device) includes an acceleration sensing part 2010, an angular velocity sensing part 2020, a processing part 2040, and a user interface part 2050. The first device 2000 of the present embodiment is worn on the body of the user to measure dynamic physical quantities of the user such as acceleration and angular velocity, thereby analyzing the exercise state of the user such as walking or running. As shown in fig. 1, the first device 2000 may take the form of a band worn on the head or waist, a clip attached to the head or waist, a cap, a belt, glasses, a helmet, an ear-worn device, or clothing. Specifically, the glasses form includes augmented reality (AR) glasses, spectacle frames, and sunglasses, and the ear-worn form includes hands-free earpieces, headphones, earphones, and the like. Further, it will be apparent to those of ordinary skill in the art that the first device 2000 may take various other forms. The first device 2000 may be an integrated circuit capable of performing various calculations, formed on a substrate.
The acceleration sensing unit 2010 measures 3-axis acceleration values including the up-down, left-right, and front-rear directions.
The angular velocity sensor 2020 measures 3-axis angular velocity values including the up-down, left-right, and front-rear directions.
The processing unit 2040 generates a first motion state value based on the 3-axis direction acceleration values and the 3-axis direction angular velocity values. The first motion state value is at least one of motion time, number of steps, steps per minute, step interval, step angle, head angle, ground support time, flight time, ratio of ground support time to flight time, maximum vertical force, average vertical force loading rate, maximum vertical force loading rate, left-right balance, and left-right stability. The first device 2000 determines the movement state of the user through the first motion state value. As to the meaning of each first motion state value: steps per minute is the number of steps taken per minute, the step interval is the average interval between the legs, the step angle is the average angle of the legs, the head angle is the average up-down angle of the head, the ground support time is the time of support in contact with the ground, the flight time is the average time during which neither leg contacts the ground, the maximum vertical force is the maximum value of the ground reaction force, the average vertical force loading rate is the average inclination of the initial part of the support interval of the left and right ground reaction forces, and the maximum vertical force loading rate is the maximum inclination of the initial part of the support interval of the left and right ground reaction forces.
The left-right Stability indicates how consistently the legs of the left foot and the right foot maintain their motion in terms of time, force, and the like, and is expressed in % through the coefficient of variation (CV) of the indices of each leg, obtained by the following equations.
Stability(Left)=1-std(Left indices)/mean(Left indices)
Stability(Right)=1-std(Right indices)/mean(Right indices)
The values used as evaluation indices (index) include the maximum vertical force, the maximum vertical acceleration, the support-section impact amount, the support time, the flight time, the average vertical force loading rate, and the maximum vertical force loading rate.
The left-right Balance is the left-right balance ratio in %, obtained by the following equation.
Balance=Left index/(Left index+Right index)×100%
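The Stability and Balance equations above can be sketched as follows. The use of the population standard deviation in the coefficient of variation, and the % scaling of Stability, are assumptions where the text leaves the convention open.

```python
def stability(indices):
    """1 - CV of the per-step index values of one leg, expressed in %.
    CV = (population) standard deviation / mean."""
    mean = sum(indices) / len(indices)
    std = (sum((x - mean) ** 2 for x in indices) / len(indices)) ** 0.5
    return (1.0 - std / mean) * 100.0

def balance(left_index, right_index):
    """Left share of the left + right index total, in %."""
    return left_index / (left_index + right_index) * 100.0
```

Perfectly repeatable steps give a stability of 100%, and identical left and right indices give a balance of 50%, matching the intuitive readings of the two measures.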
The user interface section 2050 controls a sleep mode or an active mode of the processing section 2040. The user interface portion 2050 may be in the form of software or hardware. For example, the user interface portion 2050 may be embodied as a push button in a software or hardware form. A flowchart of the motion recognition method from the user input of the user interface part 2050 is described in detail in fig. 22.
On the other hand, the first device 2000 of the embodiment of the present invention may further include a first communication part 2070. The first communication part 2070 transmits the first motion state value to the second device 2100. The first communication part 2070 may transmit the first motion state value to the second device 2100 at a predetermined cycle, and may be embodied in various methods. The second device 2100 of the present embodiment may be a device of various types such as a computer, a mobile terminal, and a watch.
On the other hand, the first device 2000 of the present embodiment may further include a second communication section 2080. The second communication section 2080 transmits the first motion state value to the server 2200.
On the other hand, the first device 2000 of the present embodiment may further include a position sensing portion 2030.
The position sensing section 2030 measures the position of the user. The position sensing unit 2030 may measure a position value of the user using a global positioning system, an ultra-precise satellite navigation technique, or the like, or may use another technique.
In the case where the first device 2000 further includes the position sensing section 2030, the processing section 2040 generates a second motion state value based on at least one of the first motion state value, the user position value, and the user profile. The second motion state value is at least one of exercise distance, exercise speed, energy consumption, height, and stride length. As to the meaning of each second motion state value, the height is the vertical height moved during motion, and the stride length is the distance advanced over one ground support interval and flight interval. The user profile is personal information such as the height and weight of the user.
The processing unit 2040 may additionally generate motion posture correction information by selectively comparing at least one of the first motion state value and the second motion state value with a corresponding predetermined reference value. For example, the processing unit 2040 stores data on the height-to-stride relationship that is optimal for each exercise speed, and determines from the stride length in the second motion state value whether the user's stride is too large or too small for the user's height. In the case where the stride length is out of the optimum range, the processing unit 2040 generates the amount by which the stride should be decreased or increased as the posture correction information.
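The stride comparison described above can be sketched as follows. The numeric optimal-stride band (as a fraction of body height, widened slightly with speed) is an invented placeholder for the height-to-stride relationship data, which the patent does not specify:

```python
def stride_correction(stride_m, height_m, speed_mps):
    """Return a signed stride correction in meters, or 0.0 if within range.

    The band below is an illustrative assumption standing in for the
    stored height-to-stride relationship data.
    """
    # Assumed optimal band: roughly 0.40-0.50 of height, stretched with speed.
    lo = height_m * (0.40 + 0.02 * speed_mps)
    hi = height_m * (0.50 + 0.04 * speed_mps)
    if stride_m < lo:
        return lo - stride_m      # positive: lengthen the stride
    if stride_m > hi:
        return hi - stride_m      # negative: shorten the stride
    return 0.0                    # within the optimal range
```

A negative correction would correspond to the "reduce stride" voice output described below.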
On the other hand, the first device 2000 of the present embodiment may further include an output unit 2060. The output unit 2060 converts the posture correction information into user-recognizable information in at least one form among voice, graphics, video, and vibration, and outputs it. For example, when the stride correction amount indicates that the stride needs to be reduced, a voice such as "reduce stride" is output through a speaker, or a warning sound is emitted, so that the user recognizes that the current stride is not optimal and changes his or her walking posture. Alternatively, the first device 2000 may be connected to an external device such as a mobile terminal, a watch, a computer, or a dedicated display, which outputs the correction information as at least one of voice, graphics, video, and vibration.
In the case where the first device 2000 further includes the position sensing section 2030, the first device 2000 may further include a third communication section that transmits the second motion state value to the server 2200. The server 2200 accumulates and stores the second motion state values in a database, and stores statistical data based on them. The statistical data includes a maximum value, a minimum value, an average value, and the like of the second motion state value for a predetermined motion section. A user who needs exercise analysis receives the statistical data through the server 2200 to improve his or her own exercise habits; such a user may be a general person who walks or jogs daily to promote health, an expert who trains to improve physical ability, or the like. The server 2200 stores the second motion state value for each user, and can provide a big data service that statistically analyzes relationships between the second motion state values of different users.
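The per-section statistics the server stores can be sketched minimally as follows; the data model (a flat list of samples per motion section) is an assumption for illustration:

```python
from dataclasses import dataclass

@dataclass
class SectionStats:
    """Max/min/mean of one second motion state value over a motion section."""
    maximum: float
    minimum: float
    average: float

def section_statistics(values):
    """Aggregate stored second motion state values (e.g. speeds in m/s)
    for one predetermined motion section."""
    if not values:
        raise ValueError("no samples stored for this motion section")
    return SectionStats(max(values), min(values), sum(values) / len(values))
```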
The first communication unit 2070, the second communication unit 2080, and the third communication unit are each configured by at least one of wireless communication, including Bluetooth, Wi-Fi, and NFC technologies, and wired communication through a wire; it is obvious to those skilled in the art that other wired and wireless communication technologies can also be used. These communication units may be physically configured as a single interface or as a plurality of interfaces.
Fig. 21 shows a motion recognition second device according to another embodiment of the present invention.
The motion recognition second device 2100 (hereinafter referred to as a second device) of the present embodiment includes a first communication unit 2110, a processing unit 2150, and a position sensing unit 2170. The second device 2100 of the present embodiment may be a device of various types such as a computer, a mobile terminal, and a watch.
The first communication unit 2110 receives the first motion state value generated based on the 3-axis direction acceleration value and the 3-axis direction angular velocity value from the first device 2000.
The position sensing section 2170 measures a user position value. The position sensing section 2170 may measure the position value of the user based on a global positioning system or an ultra-precise satellite navigation technique, or may use another technique.
The processing unit 2150 generates a second motion state value based on at least one of the first motion state value, the user position value, and the user data. The second motion state value is at least one of distance, speed, energy consumption, height, and stride length. The user data includes personal information such as the height and weight of the user.
On the other hand, the processing unit 2150 of this embodiment may selectively compare at least one of the first motion state value and the second motion state value with a predetermined reference value, and additionally generate motion posture correction information. For example, the processing unit 2150 stores data on the height-to-stride relationship that is optimal for each exercise speed, and determines from the stride length in the second motion state value whether the user's stride is too large or too small for the user's height. In the case where the stride length is out of the optimum range, the processing unit 2150 generates the amount by which the stride should be decreased or increased as the posture correction information.
On the other hand, the second device 2100 of the present embodiment may further include an output unit 2190. The output unit 2190 converts the posture correction information into user-recognizable information in at least one form among voice, graphics, video, and vibration, and outputs it. For example, when the stride correction amount indicates that the stride needs to be reduced, a voice such as "reduce stride" or a warning sound is output through a speaker so that the user recognizes that the current stride is not optimal and changes his or her walking posture.
On the other hand, the second device 2100 of the present embodiment may further include a second communication section 2130. The second communication section 2130 transmits the second motion state value to the server 2200. The server 2200 accumulates and stores the second motion state values in a database, and stores statistical data based on them. The statistical data includes a maximum value, a minimum value, an average value, and the like of the second motion state value for a predetermined motion section. A user who needs exercise analysis receives the statistical data through the server 2200 to improve his or her own exercise habits. The server 2200 stores the second motion state value for each user, and can provide a big data service that statistically analyzes relationships between the second motion state values of different users.
The first communication unit 2110 and the second communication unit 2130 are each configured by at least one of wireless communication, including Bluetooth, Wi-Fi, and NFC technologies, and wired communication through a wire; it is obvious to those skilled in the art that other wired and wireless communication technologies can also be used. The first communication unit 2110 and the second communication unit 2130 may be physically configured as a single interface or as a plurality of interfaces.
Fig. 22 shows a flow of a motion recognition method according to another embodiment of the present invention.
In step 2210, the user interface part 2050 of the first device 2000 changes the processing part 2040 to the active mode.
In step 2220, the first device 2000 sets up a connection with the second device 2100 through the first communication part 2070.
Once connected with the second device 2100, the first device 2000 may receive a command from the user interface part 2050 or from the second device 2100 in step 2230.
In step 2240, the first device 2000 measures a 3-axis direction acceleration value and a 3-axis direction angular velocity value by the acceleration sensing unit 2010 and the angular velocity sensing unit 2020 based on the above command. According to an embodiment of the present invention, the acceleration sensing part 2010 and the angular velocity sensing part 2020 store the 3-axis direction acceleration values and the 3-axis direction angular velocity values in a First-In First-Out (FIFO) queue. The first device 2000 changes the processing unit 2040 to the sleep mode when the used storage space of the first-in first-out queue is smaller than a predetermined threshold value, and changes the processing unit 2040 to the active mode when it is equal to or larger than the predetermined threshold value, thereby allowing the device to be driven with low power.
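The FIFO-driven mode switching can be sketched as follows. This is an illustrative model only: the queue capacity, wake threshold, and the idea of returning the mode on each push are assumptions, not values from the patent:

```python
from collections import deque

class SensorBuffer:
    """FIFO queue of (acceleration, angular velocity) samples that wakes the
    processing unit only once enough data has accumulated, letting it sleep
    (and save power) between bursts."""

    def __init__(self, capacity=64, wake_threshold=32):
        self.queue = deque(maxlen=capacity)  # oldest samples dropped first (FIFO)
        self.wake_threshold = wake_threshold
        self.mode = "sleep"

    def push(self, accel_xyz, gyro_xyz):
        """Store one sensor sample and return the resulting processing mode."""
        self.queue.append((accel_xyz, gyro_xyz))
        # Active while the queue holds at least the threshold number of samples.
        self.mode = ("active" if len(self.queue) >= self.wake_threshold
                     else "sleep")
        return self.mode
```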
At step 2250, the first device 2000 generates a first motion state value based on the 3-axis direction acceleration value and the 3-axis direction angular velocity value. The first motion state value is at least one of motion time, number of motion steps, steps per minute, step interval, step angle, head angle, ground support time, suspension time, ratio of ground support time to suspension time, maximum vertical force, average vertical force load rate, maximum vertical force load rate, left-right balance, and left-right stability.
In step 2260, the first device 2000 transmits the first motion state value to the second device 2100.
In step 2270, the second device 2100 determines a user location value.
In step 2280, the second device 2100 generates a second motion state value based on at least one of the first motion state value, the user position value, and the user data. The second motion state value is at least one of distance, speed, energy consumption, height, and stride length. The user data includes personal information such as the height and weight of the user.
The second device 2100 may selectively compare at least one of the first motion state value and the second motion state value with a corresponding predetermined reference value to additionally generate motion posture correction information. For example, based on the height-to-stride relationship data optimal for the exercise speed, the second device 2100 determines from the stride length in the second motion state value whether the stride is too large or too small for the user's height. In the case where the stride length is out of the optimum range, the second device 2100 generates the amount by which the stride should be decreased or increased as the posture correction information. The second device 2100 converts the posture correction information into user-recognizable information in at least one form among voice, graphics, video, and vibration, and outputs it. For example, when the stride correction amount indicates that the stride needs to be reduced, a voice such as "reduce stride" or a warning sound is output through a speaker so that the user recognizes that the current stride is not optimal and changes his or her walking posture.
In step 2290, the second device 2100 transmits the second motion state value to the server 2200. The server 2200 accumulates and stores the second motion state values in a database, and stores statistical data based on them. The statistical data includes a maximum value, a minimum value, an average value, and the like of the second motion state value for a predetermined motion section. A user who needs exercise analysis receives the statistical data through the server 2200 to improve his or her own exercise habits; such a user may be a general person who walks or jogs daily to promote health, an expert who trains to improve physical ability, or the like. The server 2200 stores the second motion state value for each user, and can provide a big data service that statistically analyzes relationships between the second motion state values of different users.
The preferred embodiments of the present invention have been described above in detail, but the scope of the present invention is not limited thereto; various modifications and equivalent other embodiments are possible. Therefore, the true technical scope of the present invention is defined by the appended claims.
For example, an apparatus according to an exemplary embodiment of the present invention may include: a bus coupled to the respective units of the apparatus as described above; at least one processor coupled to the bus; and a memory coupled to the bus and to the at least one processor, for storing instructions, received information, or generated data, the instructions being executed by the at least one processor.
Also, the system of the present invention can be embodied as computer-readable code on a computer-readable recording medium. The computer-readable recording medium includes all kinds of recording devices that store data readable by a computer system, including magnetic storage media (e.g., ROM, floppy disks, hard disks, etc.) and optical recording media (e.g., CD-ROMs, DVDs, etc.). The computer-readable recording medium may also be distributed over network-connected computer systems so that the computer-readable code is stored and executed in a distributed manner.
Industrial applicability
According to the present invention, walking can be recognized, detected, and analyzed efficiently and accurately by measuring acceleration, position, and the like on the user's body (for example, at the head or the waist) and converting the measured values into center-of-mass motion state values by means of the specific analysis algorithm of the present invention.
Further, according to the present invention, walking is effectively and accurately recognized, detected, and analyzed by measuring acceleration, position, and the like on the user's body and applying the specific analysis algorithm of the present invention, which converts the measurements into center-of-mass motion state values and estimates the center-of-pressure path and the like. In particular, in the present invention, acceleration and position can be measured more accurately by measuring each quantity at the position that moves most similarly to the center of mass of the user's body (specifically, the left-right direction acceleration at the head, the front-back direction acceleration and position at the waist, and the up-down direction acceleration and position at the head or the waist).
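The center-of-pressure estimate referred to in the claims uses the ratio between the horizontal acceleration and the sum of the vertical acceleration and the gravitational acceleration. A minimal sketch under an inverted-pendulum assumption follows; the pendulum height parameter `com_height_m` and the sign conventions are illustrative choices, not values taken from the patent:

```python
G = 9.80665  # standard gravitational acceleration, m/s^2

def cop_offset(a_horizontal, a_vertical, com_height_m=1.0):
    """Estimate the horizontal center-of-pressure offset from below the
    center of mass.

    Under an inverted-pendulum model the ground reaction force points from
    the center of pressure toward the center of mass, so
        offset / height = a_horizontal / (a_vertical + G).
    """
    total_vertical = a_vertical + G
    if abs(total_vertical) < 1e-9:
        raise ValueError("no vertical support force; body is in free fall")
    return com_height_m * a_horizontal / total_vertical

def cop_path(horizontal_series, vertical_series, com_height_m=1.0):
    """Center-of-pressure path for a sequence of acceleration samples."""
    return [cop_offset(ah, av, com_height_m)
            for ah, av in zip(horizontal_series, vertical_series)]
```

With zero body acceleration the offset is zero (the center of pressure sits directly below the center of mass), and a horizontal acceleration equal to G tilts the estimated support line by 45 degrees.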
Further, according to the present invention, in terms of device structure, only sensors that measure dynamic physical quantities of the user, such as an acceleration sensor and a position sensor, are used. Conventionally, recognizing walking with a pressure sensor pressed by the user's foot has caused various problems, such as reduced durability and lifetime of the device. In the present invention, the technical structure that places a pressure sensor at the foot, the cause of these problems, is eliminated entirely, so the above problems are fundamentally solved. Furthermore, user convenience is improved, and economy for both the user and the manufacturer is improved.

Claims (45)

1. A first device, characterized by comprising:
an acceleration sensing part worn on the head for measuring 3-axis acceleration values including up-down, left-right, and front-back directions;
an angular velocity sensing unit for measuring 3-axis angular velocity values in the up-down, left-right, and front-back directions;
a processing unit for generating a first motion state value based on the 3-axis direction acceleration value and the 3-axis direction angular velocity value; and
a user interface part for controlling the sleep mode or the activation mode of the processing part,
the 3-axis direction acceleration values include an up-down direction acceleration, a front-back direction acceleration, and a left-right direction acceleration,
the processing unit calculates a pressure center path based on at least a ratio of the sum of the vertical acceleration and the gravitational acceleration to the horizontal acceleration, and generates the first motion state value based on the pressure center path.
2. The first device according to claim 1, further comprising a first communication unit for transmitting the first motion state value to a second device.
3. The first apparatus according to claim 2, wherein the first communication unit is configured by a short-range wireless communication technology.
4. The first apparatus according to claim 1, further comprising a second communication unit for transmitting the first motion state value to a server.
5. The first apparatus according to claim 1, further comprising a position sensing unit for measuring a position value of the user.
6. The first apparatus according to claim 5, wherein the processing unit generates a second exercise status value based on at least one of the first exercise status value, the user position value, and user data.
7. The first apparatus according to claim 6, further comprising a third communication unit configured to transmit at least one of the first motion state value and the second motion state value to a server.
8. The first device of claim 6, wherein said second motion state value is at least one of distance, speed, energy consumption, height, and stride length.
9. The first device according to claim 6, wherein the processing unit compares at least one of the first motion state value and the second motion state value with a predetermined reference value to generate the posture correction information.
10. The first device according to claim 9, further comprising an output unit that outputs the posture correction information in at least one of a voice, a graphic, an image, and a vibration.
11. The first device according to claim 2, wherein when the processing unit is changed to the active mode by the user interface unit, the processing unit sets a connection to the second device via the first communication unit, and the acceleration sensing unit and the angular velocity sensing unit generate the 3-axis direction acceleration value and the 3-axis direction angular velocity value, respectively, based on a command received from the second device or the user interface unit.
12. The first device according to claim 1, wherein the acceleration sensing unit and the angular velocity sensing unit store the 3-axis direction acceleration value and the 3-axis direction angular velocity value in a first-in first-out queue, and the processing unit is in a sleep mode when a storage space of the first-in first-out queue is smaller than a predetermined threshold value, and is in an active mode when the storage space of the first-in first-out queue is equal to or larger than the predetermined threshold value.
13. The first apparatus according to claim 1, wherein the first motion state value is at least one of motion time, number of motion steps, steps per minute, step interval, step angle, head angle, ground support time, suspension time, ratio of ground support time to suspension time, maximum vertical force, average vertical force load rate, maximum vertical force load rate, left-right balance, and left-right stability.
14. The first device according to claim 1, wherein the first device is one of a band to be worn on a head, a clip-type head-attached form, a hat-mounted form, a glasses form, a helmet form, and an ear-attached form.
15. The first device of claim 14, wherein the glasses form is one of augmented reality glasses, glasses frames, and sunglasses, and the ear-attached form is an earphone.
16. A second apparatus, comprising:
a first communication section for receiving a first motion state value generated based on a 3-axis direction acceleration value and a 3-axis direction angular velocity value from a first device;
a position sensing unit for measuring a user position value; and
the processor generates a second motion state value based on at least one of the first motion state value, the user position value, and the user data.
17. The second apparatus according to claim 16, further comprising a second communication unit configured to transmit at least one of the first motion state value and the second motion state value to a server.
18. The second device according to claim 16, wherein the first motion state value is at least one of motion time, number of motion steps, steps per minute, step interval, step angle, head angle, ground support time, suspension time, ratio of ground support time to suspension time, maximum vertical force, average vertical force load rate, maximum vertical force load rate, left-right balance, and left-right stability.
19. The second device as recited in claim 16 wherein said second motion state value is at least one of distance, speed, energy consumption, height, and stride length.
20. The second device according to claim 16, wherein the processing unit compares at least one of the first motion state value and the second motion state value with a predetermined reference value to generate the posture correction information.
21. The second device according to claim 20, further comprising an output unit for outputting the posture correction information in at least one of a voice, a graphic, an image, and a vibration.
22. The second apparatus according to claim 16, wherein the first communication unit is configured by a short-range wireless communication technology.
23. A method for motion recognition of a first device, comprising:
measuring 3-axis acceleration values including up-down, left-right, and front-back directions by an acceleration sensor unit worn on the head;
measuring 3-axis direction angular velocity values including up-down, left-right, and front-back directions by an angular velocity sensor unit;
generating a first motion state value based on the 3-axis direction acceleration value and the 3-axis direction angular velocity value by a processing unit; and
controlling the sleep mode or the active mode of the processing section through the user interface section,
the 3-axis direction acceleration values include an up-down direction acceleration, a front-back direction acceleration, and a left-right direction acceleration,
the processing unit calculates a pressure center path based on at least a ratio of the sum of the vertical acceleration and the gravitational acceleration to the horizontal acceleration, and generates the first motion state value based on the pressure center path.
24. The method of claim 23, further comprising the step of transmitting the first motion state value to a second device.
25. The method of claim 24, wherein the step of transmitting the first motion state value to the second device is performed by a short-range wireless communication technique.
26. The method of claim 23, further comprising the step of transmitting the first motion state value to a server.
27. The method for recognizing a motion of a first device according to claim 23, further comprising a step of determining a user position value.
28. The method of claim 27, wherein a second exercise status value is generated based on at least one of the first exercise status value, the user location value, and user profile.
29. The method of claim 28, further comprising the step of transmitting at least one of the first motion state value and the second motion state value to a server.
30. The method as claimed in claim 28, wherein the second motion state value is at least one of distance, speed, energy consumption, altitude, and stride length.
31. The motion recognition method for a first device according to claim 28, further comprising the step of comparing the first motion state value and the second motion state value with respective predetermined reference values to generate posture correction information.
32. The method of claim 31, further comprising the step of outputting the gesture correction information in at least one of a voice, a graphic, an image, and a vibration.
33. The method of recognizing motion of a first device according to claim 24, wherein in the step of measuring the 3-axis direction acceleration value and the step of measuring the 3-axis direction angular velocity value, when the processing unit changes to an active mode through the user interface unit, a connection to the second device is set, and the 3-axis direction acceleration value and the 3-axis direction angular velocity value are generated based on a command received from the second device or the user interface unit, respectively.
34. The motion recognition method of the first device according to claim 23, wherein the acceleration sensing unit and the angular velocity sensing unit store the 3-axis acceleration value and the 3-axis angular velocity value in a first-in first-out queue, the processing unit is in a sleep mode when a storage space of the first-in first-out queue is smaller than a predetermined threshold value, and the processing unit is in an active mode when the storage space of the first-in first-out queue is equal to or larger than the predetermined threshold value.
35. The method of claim 23, wherein the first motion state value is at least one of a motion time, a motion step number, a step number per minute, a step interval, a step angle, a head angle, a ground support time, a levitation time, a ground support time ratio with respect to a levitation time, a maximum vertical force, an average vertical force load rate, a maximum vertical force load rate, a left-right balance degree, and a left-right stability degree.
36. The motion recognition method of the first device according to claim 23, wherein the first device is one of a band to be worn on a head, a clip-type head-attached form, a hat-mounted form, a glasses form, a helmet form, and an ear-attached form.
37. The motion recognition method of the first device according to claim 36, wherein the glasses form is one of augmented reality glasses, glasses frames, and sunglasses, and the ear-attached form is an earphone.
38. A method for motion recognition of a second device, comprising:
a step of receiving a first motion state value generated based on a 3-axis direction acceleration value and a 3-axis direction angular velocity value from a first device;
measuring a user position value; and
generating a second motion state value based on at least one of the first motion state value, the user position value and the user data.
39. The method according to claim 38, further comprising a step of transmitting at least one of the first motion state value and the second motion state value to a server.
40. The method of claim 38, wherein the first motion state value is at least one of a motion time, a motion step number, a step number per minute, a step interval, a step angle, a head angle, a ground support time, a levitation time, a ground support time ratio with respect to a levitation time, a maximum vertical force, an average vertical force load rate, a maximum vertical force load rate, a left-right balance degree, and a left-right stability degree.
41. The method as claimed in claim 38, wherein the second motion state value is at least one of distance, speed, energy consumption, altitude, and stride length.
42. The motion recognition method for a second device according to claim 38, further comprising a step of comparing at least one of the first motion state value and the second motion state value with a predetermined reference value to generate posture correction information.
43. The method of claim 42, further comprising the step of outputting the gesture correction information as at least one of a voice, a graphic, an image, and a vibration.
44. The method of claim 38, wherein the step of receiving the first motion state value from the first device is performed by a short-range wireless communication technique.
45. A computer-readable recording medium characterized by recording a program for executing the method of any one of claims 23 to 44.
CN201780037462.2A 2016-08-09 2017-08-08 Motion recognition method and device Active CN109328094B (en)

Applications Claiming Priority (13)

Application Number Priority Date Filing Date Title
KR10-2016-0101489 2016-08-09
KR10-2016-0101491 2016-08-09
KR1020160101489A KR101926170B1 (en) 2016-08-09 2016-08-09 Motion sensing method and apparatus for gait-monitoring
KR1020160101491A KR101830371B1 (en) 2016-08-09 2016-08-09 Motion posture deriving method and apparatus based path of COP
KR10-2017-0030394 2017-03-10
KR1020170030402A KR101995484B1 (en) 2017-03-10 2017-03-10 Motion posture deriving method and apparatus based path of COP
KR10-2017-0030402 2017-03-10
KR1020170030394A KR101995482B1 (en) 2017-03-10 2017-03-10 Motion sensing method and apparatus for gait-monitoring
KR1020170079255A KR101970674B1 (en) 2017-06-22 2017-06-22 Method and apparatus for quantifying risk of gait injury
KR10-2017-0079255 2017-06-22
KR1020170099566A KR102043104B1 (en) 2017-08-07 2017-08-07 Motion sensing method and apparatus
KR10-2017-0099566 2017-08-07
PCT/KR2017/008533 WO2018030742A1 (en) 2016-08-09 2017-08-08 Method and apparatus for recognizing exercise

Publications (2)

Publication Number Publication Date
CN109328094A CN109328094A (en) 2019-02-12
CN109328094B true CN109328094B (en) 2021-06-01

Family

ID=61163142

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780037462.2A Active CN109328094B (en) 2016-08-09 2017-08-08 Motion recognition method and device

Country Status (3)

Country Link
JP (2) JP2019528105A (en)
CN (1) CN109328094B (en)
WO (1) WO2018030742A1 (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111723624B (en) * 2019-03-22 2023-12-05 京东方科技集团股份有限公司 Head movement tracking method and system
US10773123B1 (en) 2019-08-30 2020-09-15 BioMech Sensor LLC Systems and methods for wearable devices that determine balance indices
CN111609865B (en) * 2020-05-25 2022-04-26 广州市建筑科学研究院有限公司 Assembly type automatic navigation blind road system based on wireless network
CN111672087B (en) * 2020-06-13 2021-05-07 曲灏辰 Muscle vibration detection system and detection method suitable for street dance
KR102522964B1 (en) * 2021-05-13 2023-04-20 임영상 Device and method for training techniques based on artificial intelligence
WO2023228088A1 (en) * 2022-05-26 2023-11-30 Cochlear Limited Fall prevention and training

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
KR100620118B1 (en) * 2004-03-31 2006-09-13 학교법인 대양학원 Walking pattern analysis apparatus and method using inertial sensor
US7561960B2 (en) * 2006-04-20 2009-07-14 Honeywell International Inc. Motion classification methods for personal navigation
KR100894895B1 (en) * 2007-05-21 2009-04-30 연세대학교 산학협력단 Movement, Gait, and Posture Assessment and Intervention System and Method, MGPAISM
US8467726B2 (en) 2009-11-06 2013-06-18 Panasonic Corporation Communication device and communication method
KR101101003B1 (en) * 2009-12-14 2011-12-29 대구대학교 산학협력단 Monitoring system and method for moving and balancing of human body using sensor node
JP5733503B2 (en) 2011-02-28 2015-06-10 国立大学法人広島大学 Measuring device, measuring method, and measuring program
EP3003149A4 (en) * 2013-06-03 2017-06-14 Kacyvenski, Isaiah Motion sensor and analysis
JP6511439B2 (en) * 2013-06-04 2019-05-15 プロテウス デジタル ヘルス, インコーポレイテッド Systems, devices, and methods for data collection and outcome assessment
JP6127873B2 (en) * 2013-09-27 2017-05-17 花王株式会社 Analysis method of walking characteristics
JP2015217250A (en) * 2014-05-21 2015-12-07 富士通株式会社 System, program, method, and device for stride measurement
JP2016034480A (en) * 2014-07-31 2016-03-17 セイコーエプソン株式会社 Notification device, exercise analysis system, notification method, notification program, exercise support method, and exercise support device
JP2016034482A (en) * 2014-07-31 2016-03-17 セイコーエプソン株式会社 Exercise analysis device, exercise analysis method, exercise analysis program, and exercise analysis system
JP2016032610A (en) * 2014-07-31 2016-03-10 セイコーエプソン株式会社 Exercise analysis system, exercise analysis device, exercise analysis program and exercise analysis method
US20160038083A1 (en) * 2014-08-08 2016-02-11 Orn, Inc. Garment including integrated sensor components and feedback components
JP6080078B2 (en) * 2014-08-18 2017-02-15 高知県公立大学法人 Posture and walking state estimation device
US9687695B2 (en) * 2014-10-22 2017-06-27 Dalsu Lee Methods and systems for training proper running of a user
JP6583605B2 (en) * 2014-12-12 2019-10-02 カシオ計算機株式会社 Exercise information generation apparatus, exercise information generation method, and exercise information generation program
JP2016116566A (en) * 2014-12-18 2016-06-30 セイコーエプソン株式会社 Motion analysis device, motion analysis method, program, and motion analysis system
JP6696109B2 (en) * 2014-12-22 2020-05-20 セイコーエプソン株式会社 Motion analysis device, motion analysis system, motion analysis method and program

Also Published As

Publication number Publication date
JP2021098027A (en) 2021-07-01
CN109328094A (en) 2019-02-12
JP2019528105A (en) 2019-10-10
JP7101835B2 (en) 2022-07-15
WO2018030742A1 (en) 2018-02-15

Similar Documents

Publication Publication Date Title
CN109414608B (en) Motion recognition method and device
CN109328094B (en) Motion recognition method and device
US11497966B2 (en) Automatic coaching system and method for coaching user's exercise
US10789708B1 (en) Athletic performance and technique monitoring
KR102043104B1 (en) Motion sensing method and apparatus
US10441212B2 (en) Method to determine positions and states of an activity monitoring device
US20170095181A1 (en) System and method for characterizing biomechanical activity
JP2010005033A (en) Walking motion analyzer
WO2016097746A1 (en) Biomechanical analysis
EP2783630A1 (en) Human motion analysis method and device
KR102304300B1 (en) A method and apparatus for detecting walking factor with portion acceleration sensor
JP6781798B2 (en) IVLR prediction method and injury risk quantifier during running using it
KR102081735B1 (en) Motion sensing method and apparatus
KR102055661B1 (en) An Automatic Coaching System And Method For Coaching A User's Exercise
KR101830371B1 (en) Motion posture deriving method and apparatus based path of COP
KR101995482B1 (en) Motion sensing method and apparatus for gait-monitoring
US20220193522A1 (en) Exercise analysis system using sensor worn on user's head
JPWO2018211550A1 (en) Information processing apparatus, information processing system, and information processing method
KR102020796B1 (en) Method and apparatus for evaluating stability during running and walking
KR102039381B1 (en) Method and apparatus for evaluating imbalance during running and walking
KR20230081878A (en) System for analyzing motion using sensor worn on the user's head
KR20210040671A (en) Apparatus for estimating displacement center of gravity trajectories and method thereof
KR102379992B1 (en) An Automatic Coaching System And Method For Coaching A User's Exercise
KR101995484B1 (en) Motion posture deriving method and apparatus based path of COP
KR101970674B1 (en) Method and apparatus for quantifying risk of gait injury

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant