JP5835206B2 - Motion analyzer - Google Patents

Motion analyzer

Info

Publication number
JP5835206B2
Authority
JP
Japan
Prior art keywords
data
observation
reference
acoustic
comparison
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2012279501A
Other languages
Japanese (ja)
Other versions
JP2014121456A (en)
Inventor
谷高 幸司
Original Assignee
ヤマハ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by ヤマハ株式会社
Priority to JP2012279501A
Publication of JP2014121456A
Application granted
Publication of JP5835206B2
Application status: Active
Anticipated expiration


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/74 Details of notification to user or communication with user or patient; user input means
    • A61B 5/7405 Details of notification to user or communication with user or patient using sound
    • A61B 5/7415 Sound rendering of measured values, e.g. by pitch or volume variation
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements of detecting, measuring or recording means mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6895 Sport equipment

Description

  The present invention relates to a technique for analyzing a user's movement.

  Various techniques for analyzing a user's movement have been proposed. For example, Patent Document 1 discloses a technique in which a moving image of a user's swing motion and a pre-recorded moving image of a reference swing motion (for example, the swing of a professional golfer) are displayed on the same screen at the same timing.

JP-A-6-39070

  With the technique of Patent Document 1, a user analyzes his or her own swing motion by visually comparing it with the reference swing motion. In practice, however, it is difficult to grasp the difference between the user's motion and the reference motion accurately and in detail merely by watching moving images on a screen. In view of these circumstances, an object of the present invention is to enable a user to easily grasp the difference between the user's motion and a reference motion.

  To solve the above problem, the motion analysis apparatus of the present invention includes: observation data acquisition means for acquiring observation data capable of specifying the trajectory of an observed point that moves in conjunction with the user's motion; comparison means for comparing reference data capable of specifying a predetermined trajectory of the observed point with the observation data acquired by the observation data acquisition means; and acoustic control means for generating an acoustic signal according to the comparison result of the comparison means. With this configuration, an acoustic signal is generated according to the result of comparing the observation data with the reference data, so the user can easily grasp the difference between the trajectory of the observed point and the predetermined trajectory indicated by the reference data.

  In a preferred aspect of the present invention, the comparison means sequentially compares the observation data with the reference data in parallel with the user's motion, and the acoustic control means generates the acoustic signal in parallel with each comparison by the comparison means. In this configuration, the acoustic signal is generated in real time with respect to the user's motion, so that, compared to a configuration in which sound is reproduced only after the analyzed motion has finished, the user can intuitively grasp how the difference between the trajectory of the observed point and the predetermined trajectory relates to the actual motion.

  In a preferred aspect of the present invention, the comparison means expands or contracts the time series of the reference data on the time axis and compares the expanded or contracted reference data with the observation data. In this configuration, the time series of the reference data is expanded or contracted on the time axis; for example, if it is stretched or shrunk according to the user's motion speed, the difference between the trajectory of the observed point and the predetermined trajectory can be evaluated more appropriately than when the time length of the reference data series is fixed.

  In a preferred aspect of the present invention, delay means for delaying the acoustic signal generated by the acoustic control means is provided. In this configuration, because the acoustic signal is delayed, no sound is reproduced during the period from the start of signal generation until the delay time elapses, so the user's concentration is not disturbed during, for example, the critical early phase of the motion.

  In a preferred aspect of the present invention, the acoustic control means selects, from among a plurality of acoustic data indicating different sounds, acoustic data corresponding to an instruction from the user, and generates the acoustic signal from the selected acoustic data according to the comparison result of the comparison means. With this configuration, the variety of reproduced sounds can be greater than in a configuration in which a single type of acoustic data is modulated to generate the acoustic signal.

  In a preferred aspect of the present invention, the acoustic control means generates an acoustic signal in which a predetermined sound effect is added to the sound corresponding to the comparison result of the comparison means, according to the degree of approximation between the trajectory specified from the observation data and the predetermined trajectory. In this configuration, the sound corresponding to the comparison result and a predetermined sound effect are reproduced according to how closely the trajectory of the observed point approximates the predetermined trajectory, so the user can intuitively recognize that degree of approximation.

  The motion analysis apparatus according to each aspect described above may be realized by hardware (an electronic circuit) such as a DSP (Digital Signal Processor) dedicated to the analysis of the user's motion, or by cooperation between a general-purpose arithmetic processing unit such as a CPU (Central Processing Unit) and a program. The program of the present invention causes a computer to execute: an observation data acquisition process for acquiring observation data capable of specifying the trajectory of an observed point that moves in conjunction with the user's motion; a comparison process for comparing reference data capable of specifying a predetermined trajectory of the observed point with the observation data acquired by the observation data acquisition process; and an acoustic control process for generating an acoustic signal according to the result of the comparison process. The above program is provided in a form stored on a computer-readable recording medium and installed in the computer. The recording medium is, for example, a non-transitory recording medium; an optical recording medium (optical disc) such as a CD-ROM is a good example, but any known recording medium such as a semiconductor recording medium or a magnetic recording medium may be used. The program of the present invention can also be provided from a distribution server device, for example in the form of distribution over a communication network, and installed in a computer.

FIG. 1 is an external view of a motion analysis system according to a first embodiment. FIG. 2 is a block diagram of the motion analysis system. FIG. 3 is an explanatory diagram of the acceleration sensor fixed to the observed point. FIG. 4 is a schematic diagram of observation data. FIG. 5 is a schematic diagram of a reference data series. FIG. 6 is a schematic diagram of comparison data. FIG. 7 is a flowchart of the comparison process performed by the comparison unit. FIG. 8 is an explanatory diagram of the pitch of the reproduced sound according to the difference between an observation trajectory and a reference trajectory. FIG. 9 is an explanatory diagram of the operation of the acoustic control unit in a second embodiment. FIG. 10 is a block diagram of the motion analysis system in a third embodiment.

<First Embodiment>
FIG. 1 is an external view of a motion analysis system 100 according to the first embodiment of the present invention, and FIG. 2 is a block diagram of the motion analysis system 100. As shown in FIGS. 1 and 2, the motion analysis system 100 includes a motion analysis device 10 and an acceleration sensor 20. The motion analysis device 10 analyzes the motion of the user U and notifies the user U of the analysis result, and is suitable for practicing a specific motion in various sports. The motion analysis device 10 of the first embodiment analyzes a motion in which the user U swings a golf club C (hereinafter referred to as a “swing motion”). Specifically, the motion analysis device 10 analyzes the movement of a point P (hereinafter referred to as the “observed point”) that moves in conjunction with the swing motion of the user U. The observed point P of the first embodiment is a specific point on the club C used by the user U. Specifically, as shown in FIG. 3, the tip of the grip Cg fixed to the shaft Cs of the club C (the end on the head Ch side) serves as the observed point P. Note that another point on the club C (for example, a point on the head Ch or the shaft Cs), or a point on the body of the user U that moves in conjunction with the swing motion, can also be set as the observed point P.

  The acceleration sensor 20 in FIGS. 1 and 2 is a detector that detects the movement of the observed point P (the swing motion of the user U) and sequentially generates basic data DA corresponding to the movement of the observed point P at a predetermined cycle. As shown in FIG. 3, the acceleration sensor 20 of this embodiment is a three-axis acceleration sensor that is fixed to the observed point P and detects acceleration in each of three mutually orthogonal directions (X axis, Y axis, Z axis). The Z axis is parallel to the longitudinal direction of the shaft Cs of the club C, and the Y axis and X axis lie in a plane orthogonal to the Z axis. One basic datum DA includes an acceleration ax in the X-axis direction, an acceleration ay in the Y-axis direction, and an acceleration az in the Z-axis direction. The basic data DA sequentially generated by the acceleration sensor 20 are transmitted to the motion analysis device 10 in time series. Communication between the acceleration sensor 20 and the motion analysis device 10 may be either wireless or wired.

  As shown in FIG. 2, the motion analysis device 10 is realized by a computer system including an arithmetic processing device 12, a storage device 14, and a sound emitting device 16. The storage device 14 stores the programs executed by the arithmetic processing device 12 and the various data it uses (for example, the acoustic data W and the reference data series SREF). A known recording medium such as a semiconductor storage medium or a magnetic recording medium, or a combination of plural types of recording media, may be used as the storage device 14. The sound emitting device 16 is an acoustic device (for example, a speaker) that reproduces sound waves corresponding to the acoustic signal S generated by the arithmetic processing device 12.

  The arithmetic processing device 12 executes a program stored in the storage device 14, thereby realizing a plurality of functions (an observation data acquisition unit 32, a comparison unit 34, and an acoustic control unit 36) for analyzing the motion of the user U. Each function of the arithmetic processing device 12 can also be distributed over a plurality of devices.

  The observation data acquisition unit 32 sequentially acquires observation data DB capable of specifying the trajectory OA (hereinafter referred to as the “observation trajectory”) of the observed point P according to the swing motion of the user U. Specifically, the observation data acquisition unit 32 sequentially generates the observation data DB from the basic data DA that the acceleration sensor 20 supplies in time series. As shown in FIG. 4, one observation datum DB includes an observation value Bx, an observation value By, and an observation value Bz. The observation value Bx is the difference (amount of change) between the accelerations ax of two consecutive basic data DA; likewise, the observation value By is the difference between the accelerations ay, and the observation value Bz the difference between the accelerations az, of two consecutive basic data DA. The cycle at which the observation data acquisition unit 32 acquires the observation data DB is set to a time (for example, 1 millisecond) that is sufficiently short compared to the duration of the swing motion of the user U.
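As an illustration, deriving one observation datum DB from two consecutive basic data DA reduces to a per-axis first difference. The following is a minimal sketch, not taken from the patent; the names BasicData, ObservationData, and to_observation are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class BasicData:
    """One sample from the acceleration sensor: accelerations on three axes."""
    ax: float
    ay: float
    az: float

@dataclass
class ObservationData:
    """One observation datum DB: per-axis change in acceleration."""
    bx: float
    by: float
    bz: float

def to_observation(prev: BasicData, curr: BasicData) -> ObservationData:
    # Each observation value is the difference between the accelerations
    # of two consecutive basic data DA.
    return ObservationData(curr.ax - prev.ax,
                           curr.ay - prev.ay,
                           curr.az - prev.az)
```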

  The storage device 14 in FIG. 2 stores the acoustic data W and the reference data series SREF. The acoustic data W of this embodiment indicate a specific acoustic waveform. For example, digital data obtained by recording the wind noise of the club C during a swing motion and sampling it at a predetermined frequency (for example, 44.1 kHz) are stored in advance in the storage device 14 as the acoustic data W.

  FIG. 5 is a schematic diagram of the reference data series SREF. The reference data series SREF is data capable of specifying the trajectory OREF (hereinafter referred to as the “reference trajectory”) of the observed point P over a predetermined time length, and, as shown in FIG. 5, is a time series of a plurality of reference data DREF. Each reference datum DREF is compared with each observation datum DB in order to evaluate the swing motion of the user U, and includes a reference value Rx, a reference value Ry, and a reference value Rz.

  The reference trajectory OREF serves as a standard against which the observation trajectory OA specified by the observation data DB is evaluated. For example, the trajectory of the observed point P when an operator proficient in the swing motion, such as a professional golfer, performs a model swing is suitable as the reference trajectory OREF. Specifically, the time series of observation data DB generated by the observation data acquisition unit 32 while the model operator performs the swing motion is stored in advance in the storage device 14 as the reference data series SREF (the reference data DREF). Accordingly, the reference value Rx of each reference datum DREF corresponds to the amount of change in the acceleration ax during the model swing motion, the reference value Ry to the amount of change in the acceleration ay, and the reference value Rz to the amount of change in the acceleration az.

  The comparison unit 34 in FIG. 2 compares each observation datum DB acquired by the observation data acquisition unit 32 with each reference datum DREF of the reference data series SREF stored in the storage device 14. Specifically, each time an observation datum DB is acquired by the observation data acquisition unit 32, the comparison unit 34 reads a reference datum DREF from the reference data series SREF in the storage device 14 in time-series order, and calculates the difference between the observation datum DB and the reference datum DREF to generate a comparison datum DC. As shown in FIG. 6, one comparison datum DC includes a comparison value ΔTx, a comparison value ΔTy, and a comparison value ΔTz. The comparison value ΔTx is the difference between the observation value Bx of the observation datum DB and the reference value Rx of the reference datum DREF; similarly, the comparison value ΔTy is the difference between the observation value By and the reference value Ry, and the comparison value ΔTz is the difference between the observation value Bz and the reference value Rz. Since the time series of observation data DB corresponds to the observation trajectory OA and the reference data series SREF corresponds to the reference trajectory OREF, the comparison data DC correspond to data indicating the difference between the observation trajectory OA and the reference trajectory OREF.
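The generation of one comparison datum DC described above reduces to per-axis subtraction. A minimal sketch, with each datum represented as an (x, y, z) tuple and the function name being hypothetical:

```python
def compare(db, dref):
    """Comparison datum DC: per-axis differences between an observation
    datum DB = (Bx, By, Bz) and a reference datum DREF = (Rx, Ry, Rz)."""
    bx, by, bz = db
    rx, ry, rz = dref
    return (bx - rx, by - ry, bz - rz)  # (dTx, dTy, dTz)
```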

  The comparison unit 34 of this embodiment compares the observation data DB with the reference data DREF over a predetermined analysis section within the interval from the start to the end of the swing motion by the user U. The analysis section runs from the moment the user U starts the downswing after the takeback (the motion of swinging the club C down; this moment is hereinafter referred to as the “motion start point”) until a predetermined time length T elapses. The time length T of the analysis section is set according to the time from the motion start point until the user U completes the follow-through (the motion of swinging the club C out). The actual time from the motion start point to the end of the motion varies with the swing speed of the user U. In this embodiment, the average swing speed of the user U is calculated from past time-series measurements of the observation data DB, and a time length T of the analysis section corresponding to that average swing speed is selected for each user U and stored in the storage device 14.

  FIG. 7 is a flowchart of the process in which the comparison unit 34 compares each observation datum DB with each reference datum DREF (hereinafter referred to as the “comparison process”). The comparison process of FIG. 7 is executed, for example, when the user U instructs the start of analysis by operating an input device (not shown).

  The comparison unit 34 detects the motion start point using the observation data DB (S1). Considering the tendency of the amount of change in the acceleration of the observed point P to increase immediately after the start of the downswing, the comparison unit 34 of the first embodiment detects the motion start point according to the amount of change A in acceleration indicated by the observation data DB. Specifically, the comparison unit 34 sequentially determines whether the acceleration change amount A indicated by the observation data DB supplied from the observation data acquisition unit 32 exceeds a predetermined threshold ATH. The change amount A is, for example, the sum of the absolute value of the observation value Bx, the absolute value of the observation value By, and the absolute value of the observation value Bz. The comparison unit 34 repeats step S1 until the change amount A exceeds the threshold ATH (S1: NO), detects the time point at which the change amount A exceeds the threshold ATH (S1: YES) as the motion start point, and proceeds to step S2.
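Step S1 can be sketched as a simple threshold test on the change amount A. This is a sketch under the stated assumptions; observations are (Bx, By, Bz) tuples and the function name is hypothetical:

```python
def detect_start(observations, a_th):
    """Return the index of the motion start point: the first observation
    datum whose change amount A = |Bx| + |By| + |Bz| exceeds the
    threshold ATH, or None if the threshold is never exceeded."""
    for i, (bx, by, bz) in enumerate(observations):
        if abs(bx) + abs(by) + abs(bz) > a_th:
            return i
    return None
```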

  The comparison unit 34 expands or contracts the reference data series SREF on the time axis according to the time length T of the analysis section selected and stored in advance from the average swing speed of the user U (S2). Specifically, the time length TREF from the first reference datum DREF to the last reference datum DREF of the reference data series SREF (the time from the start point to the end point of the reference trajectory OREF) is adjusted to the time length T. When the time length T is longer than the time length TREF, the comparison unit 34 increases the number of reference data DREF by interpolation over the reference data series SREF so that it matches the number of observation data DB; a known technique (for example, linear interpolation or spline interpolation) may be used for the interpolation. Conversely, when the time length T is shorter than the time length TREF, the comparison unit 34 reduces the number of reference data DREF by thinning out the reference data series SREF so that it matches the number of observation data DB; a known technique may likewise be used for the thinning. Note that the reference trajectory OREF itself does not change in step S2.
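Step S2 amounts to resampling the reference series to the analysis section's sample count. A minimal linear-interpolation sketch, shown for one scalar channel for brevity (the patent leaves the interpolation and thinning methods open):

```python
def resample(series, n):
    """Stretch or shrink a time series to n samples by linear interpolation
    (n > len(series) interpolates; n < len(series) thins out)."""
    m = len(series)
    if n == 1:
        return [series[0]]
    out = []
    for i in range(n):
        pos = i * (m - 1) / (n - 1)  # fractional position in the original series
        j = int(pos)
        frac = pos - j
        if j + 1 < m:
            out.append(series[j] * (1 - frac) + series[j + 1] * frac)
        else:
            out.append(series[-1])
    return out
```

In practice the same resampling would be applied to each of the Rx, Ry, and Rz channels of the reference data DREF.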

  The comparison unit 34 compares each observation datum DB sequentially generated by the observation data acquisition unit 32 with each of the plurality of reference data DREF of the reference data series SREF adjusted in step S2 (S3). Specifically, each time an observation datum DB is generated by the observation data acquisition unit 32, the comparison unit 34 reads the reference data DREF of the adjusted reference data series SREF in time-series order from the beginning, and sequentially generates comparison data DC from the differences between the reference data DREF and the observation data DB. As shown in FIG. 7, the generation of the comparison data DC (S3) is repeated until it is determined in step S4 that the time length T of the analysis section has elapsed from the motion start point detected in step S1 (S4: NO).

  When it is determined that the time length T has elapsed from the motion start point (S4: YES), the comparison unit 34 ends the comparison process. As understood from the above description, comparison data DC indicating the difference between the observation trajectory OA and the reference trajectory OREF are generated sequentially over the analysis section while the swing motion is being performed.

  The acoustic control unit 36 in FIG. 2 generates the acoustic signal S corresponding to the comparison data DC (the results of comparing each observation datum DB with each reference datum DREF) sequentially generated by the comparison unit 34. Specifically, starting from the motion start point detected by the comparison unit 34, the acoustic control unit 36 reads the samples of the acoustic data W sequentially from the storage device 14 in time-series order, and modulates each sample of the acoustic data W according to the comparison datum DC generated by the comparison unit 34 immediately before that reading. The acoustic signal S generated by this modulation is supplied to the sound emitting device 16 and reproduced as sound waves. The D/A converter that converts the acoustic signal S from digital to analog is omitted from the figures for convenience.

  Specifically, the acoustic control unit 36 changes the pitch of each sample of the acoustic data W according to the comparison data DC. For example, the acoustic control unit 36 modulates the acoustic data W so that the change in pitch of each sample increases as the comparison values (ΔTx, ΔTy, ΔTz) of the comparison data DC increase (that is, as the difference between the observation trajectory OA and the reference trajectory OREF increases). The modulation and output of each sample of the acoustic data W are executed sequentially (in real time) in parallel with the swing motion of the user U. That is, within the analysis section of the swing motion, the pitch of the reproduced sound changes from moment to moment according to the difference between the observation trajectory OA and the reference trajectory OREF.
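One way to realize such a modulation is to convert each comparison datum DC into a playback-rate multiplier, so that a larger deviation from the reference trajectory produces a larger pitch change. This is a hypothetical mapping for illustration only; the patent does not prescribe a particular formula, and the gain k is an assumed parameter:

```python
def playback_rate(dc, k=0.05):
    """Map a comparison datum DC = (dTx, dTy, dTz) to a playback-rate
    multiplier: 1.0 leaves the pitch of the acoustic data W unchanged;
    the signed sum shifts the pitch up or down by k semitones per unit
    of deviation (each semitone is a factor of 2**(1/12) in rate)."""
    semitones = k * (dc[0] + dc[1] + dc[2])
    return 2.0 ** (semitones / 12.0)
```

A signed sum is used so that deviation toward one side of the reference trajectory raises the pitch while deviation toward the other side lowers it, in the spirit of FIG. 8.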

  FIG. 8 is an explanatory diagram of the pitch of the reproduced sound according to the difference between the observation trajectory OA and the reference trajectory OREF. FIG. 8 shows, side by side, observation trajectories OA (OA1, OA2) over part of the analysis section and the time variation of the pitch PA of the reproduced sound (PA1, PA2). The hitting point Q in FIG. 8 corresponds to the observed point P at the moment the head Ch strikes the ball. In FIG. 8, the reference trajectory OREF is shown together with each observation trajectory OA, and each pitch PA is shown as a value relative to the pitch of the acoustic data W taken as the reference pitch PREF.

  As shown in FIG. 8, the acoustic control unit 36 modulates the acoustic data W according to each comparison datum DC so that the pitch PA of the reproduced sound rises as the observation trajectory OA deviates from the reference trajectory OREF toward the user U, and falls as the observation trajectory OA deviates from the reference trajectory OREF away from the user U. Specifically, this works as follows.

  The pitch PA1 in FIG. 8 is the pitch of the reproduced sound when the observed point P moves along the observation trajectory OA1. The observation trajectory OA1 lies on the side opposite the user U relative to the reference trajectory OREF before passing the hitting point Q, and on the user U side of the reference trajectory OREF after passing the hitting point Q (outside-in). Accordingly, the pitch PA1 is higher than the reference pitch PREF before the observed point P passes the hitting point Q, falls as the observed point P approaches the hitting point Q, and is lower than the reference pitch PREF after the observed point P passes the hitting point Q. Conversely, the pitch PA2 in FIG. 8 is the pitch of the reproduced sound when the observed point P moves along the observation trajectory OA2. The observation trajectory OA2 lies on the user U side of the reference trajectory OREF before passing the hitting point Q, and on the side opposite the user U after passing the hitting point Q (inside-out). Accordingly, the pitch PA2 is lower than the reference pitch PREF before the observed point P passes the hitting point Q, rises as the observed point P approaches the hitting point Q, and is higher than the reference pitch PREF after the observed point P passes the hitting point Q. With this configuration, the user U can intuitively grasp, from the change in the pitch PA of the reproduced sound, how the difference between the observation trajectory OA and the reference trajectory OREF changes at each point of the swing motion.

  As described above, in the first embodiment, the acoustic signal S is generated according to the result of comparing the observation data DB with the reference data DREF, so the user U can easily grasp the difference between the observation trajectory OA of the observed point P and the reference trajectory OREF indicated by the reference data DREF.

  Furthermore, since the acoustic signal S is generated in real time with respect to the swing motion, the user U can grasp how the difference between the observation trajectory OA and the reference trajectory OREF relates to the actual swing motion, unlike a configuration in which the sound is reproduced only after the swing motion has been performed. In a configuration in which the time length of the reference data series SREF is fixed, even if the observation trajectory OA itself closely approximates the reference trajectory OREF, the two may be evaluated as different whenever the duration of the swing motion differs from the time length of the reference data series SREF. In this embodiment, because the reference data series SREF is expanded or contracted on the time axis according to the average swing speed of the user U, the difference between the observation trajectory OA of the swing motion of the user U and the reference trajectory OREF can be evaluated appropriately.

<Second Embodiment>
A second embodiment of the present invention is described below. In each configuration illustrated below, elements whose actions and functions are equivalent to those of the first embodiment are denoted by the reference signs used in the above description, and detailed description of them is omitted as appropriate.

  The storage device 14 of the second embodiment stores three types of acoustic data W (Wx, Wy, Wz) indicating waveforms of mutually different sounds (for example, beep-like warning sounds differing in pitch or timbre). The acoustic control unit 36 of this embodiment controls the reproduction and stopping of the acoustic data Wx according to the result of the comparison unit 34 comparing the observation trajectory OA and the reference trajectory OREF in the X-axis direction (the comparison value ΔTx), controls the reproduction and stopping of the acoustic data Wy according to the comparison result in the Y-axis direction (the comparison value ΔTy), and controls the reproduction and stopping of the acoustic data Wz according to the comparison result in the Z-axis direction (the comparison value ΔTz). The acoustic signal S is generated by adding the acoustic data Wx, Wy, and Wz together. Specifically, the acoustic control unit 36 stops the reproduction of the acoustic data W corresponding to an axis when the comparison value ΔT (ΔTx, ΔTy, ΔTz) for that axis is below a predetermined threshold TH (when the difference between the observation value B and the reference value R is small), and reproduces the acoustic data W when the comparison value ΔT exceeds the threshold TH (when the difference between the observation value B and the reference value R is large). The threshold TH can also be set individually for each axis.
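The per-axis gating described above can be sketched as follows. This is a sketch; the function names are hypothetical, the comparison datum is a (ΔTx, ΔTy, ΔTz) tuple, and the per-axis thresholds TH are passed as a tuple to allow individual settings:

```python
def active_axes(dc, th=(1.0, 1.0, 1.0)):
    """For one comparison datum DC, decide which of the three warning
    sounds (Wx, Wy, Wz) should play: an axis's sound plays only while
    the magnitude of its comparison value exceeds that axis's threshold TH."""
    return tuple(abs(d) > t for d, t in zip(dc, th))

def mix(samples, gates):
    """Acoustic signal S: the sum of the current samples of Wx, Wy, Wz,
    with the stopped (gated-off) sounds contributing nothing."""
    return sum(s for s, g in zip(samples, gates) if g)
```

For example, in a period like t1 of FIG. 9 only the Y-axis gate would be open, so only the current sample of Wy reaches the output.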

  FIG. 9 is an explanatory diagram illustrating the reproduction and stopping of the acoustic data W for each section (t1, t2, t3) of the observation trajectory OA. In the period t1, in which the comparison value ΔTx and the comparison value ΔTz are below the threshold TH while the comparison value ΔTy exceeds it, only the acoustic data Wy corresponding to the Y-axis direction is reproduced, and the reproduction of the acoustic data Wx and Wz is stopped. Similarly, none of the acoustic data Wx to Wz is reproduced in the period t2, in which the comparison values ΔTx, ΔTy, and ΔTz are all below the threshold TH; and in the period t3, in which the comparison values ΔTx and ΔTy exceed the threshold TH while the comparison value ΔTz is below it, a mixture of the acoustic data Wx and Wy is reproduced and the acoustic data Wz is not.

  The second embodiment realizes the same effects as the first embodiment. In addition, because the comparison result between the observation trajectory OA and the reference trajectory OREF in each axial direction is individually reflected in the acoustic signal S, the user U can recognize in which of the three axial directions the observation trajectory OA deviates from the reference trajectory OREF.

<Third Embodiment>
FIG. 10 is a block diagram of the motion analysis system 100 in the third embodiment. As shown in FIG. 10, the motion analysis system 100 of the third embodiment adds a delay device 15 to the motion analysis system 100 of the first embodiment. The delay device 15 delays the acoustic signal S by a delay time δ, so the sound emitting device 16 reproduces the acoustic signal S after the delay time δ has elapsed from the time the acoustic control unit 36 starts generating it. Since the generation of the acoustic signal S (the generation of the comparison data DC) starts at the operation start point, the reproduction of the acoustic signal S starts when the delay time δ has elapsed from the operation start point; that is, the acoustic signal S is not reproduced until the delay time δ has elapsed from the operation start point. An element (buffer) that temporarily holds the acoustic signal S before outputting it is used as the delay device 15.
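The delay device 15 described above is, in essence, a first-in first-out buffer. The sketch below illustrates one possible sample-level realization; the class name, the sample rate, and the toy values are assumptions for illustration, since the patent does not prescribe an implementation.

```python
from collections import deque

class DelayBuffer:
    """FIFO buffer that outputs each input sample delay_seconds later."""

    def __init__(self, delay_seconds, sample_rate):
        n = int(delay_seconds * sample_rate)
        # Pre-fill with silence so output lags input by exactly n samples.
        self.buf = deque([0.0] * n, maxlen=n + 1)

    def process(self, sample):
        self.buf.append(sample)
        return self.buf.popleft()

# Toy rate of 4 Hz, so a 0.5 s delay is a 2-sample delay.
delay = DelayBuffer(delay_seconds=0.5, sample_rate=4)
out = [delay.process(s) for s in [1.0, 2.0, 3.0, 4.0]]
# out == [0.0, 0.0, 1.0, 2.0]: each sample emerges two ticks later.
```

In practice the sample rate would be that of the acoustic signal S, and the initial silence corresponds to the interval in which no sound is reproduced after the operation start point.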

  The third embodiment realizes the same effects as the first embodiment. In addition, because the reproduction of the acoustic signal S starts only after the delay time δ has elapsed from the operation start point, the reproduced sound does not disturb the concentration of the user U around the operation start point. For example, since the time from the operation start point to the moment of impact is about 500 milliseconds, setting the delay time δ to about 500 milliseconds prevents the reproduced sound from disturbing the user U during the interval from the operation start point to the impact, where particular concentration is required. Note that the configuration of the third embodiment (delay device 15) can also be applied to the second embodiment.

<Modification>
Each of the aforementioned embodiments can be variously modified. Specific modifications are exemplified below. Two or more aspects arbitrarily selected from the following examples can be appropriately combined.

(1) In each of the above-described embodiments, the amount of change in the acceleration (ax, ay, az) in each axial direction is exemplified as the observation data DB, but the acceleration (ax, ay, az) itself can also be used as the observation data DB. Similarly, the numerical value of the acceleration in each axial direction can be used as the reference data DREF. Further, the element (detection body) that detects the movement of the observed point P (the swing motion of the user U) is not limited to the acceleration sensor 20. For example, a speed sensor that detects the speed of the observed point P or a direction sensor (for example, a gyro sensor) that detects the direction of the observed point P can be used instead of the acceleration sensor 20 (or together with it). It is also possible to specify the observation trajectory OA from a moving image obtained by shooting the swing motion of the user U with a video camera. As understood from the above description, the observation data DB encompasses any time-series data that can specify the observation trajectory OA of the observed point P, and the quantity indicated by the observation data DB is arbitrary. Similarly, the reference data DREF encompasses any time-series data that can specify the reference trajectory OREF, and the quantity indicated by the reference data DREF is arbitrary.
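The two variants named in modification (1), the change amount of the acceleration versus the raw acceleration itself, can be sketched as follows. The function name is illustrative, and the change amount is read here as a first difference of consecutive samples, which is one plausible interpretation rather than the patent's definition.

```python
def to_observation_data(accel, use_change=True):
    """Build observation data DB for one axis from an acceleration series.

    use_change=True  -> first differences (the "change amount" variant)
    use_change=False -> the raw acceleration values themselves
    """
    if not use_change:
        return list(accel)
    # First difference of consecutive samples as the change amount.
    return [b - a for a, b in zip(accel, accel[1:])]

ax = [0.0, 1.0, 3.0, 6.0]
changes = to_observation_data(ax)                  # [1.0, 2.0, 3.0]
raw = to_observation_data(ax, use_change=False)    # [0.0, 1.0, 3.0, 6.0]
```

Either series is time-series data that can specify the observation trajectory OA, which is the only property the patent requires of the observation data DB.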

(2) In the above-described embodiments, the pitch of the reproduced sound is changed according to the difference between the observation trajectory OA and the reference trajectory OREF, but the method of modulating the acoustic data W is arbitrary. For example, the volume of the acoustic data W can be changed according to the difference (each comparison data DC) between the observation trajectory OA and the reference trajectory OREF. In a configuration in which the acoustic control unit 36 imparts various acoustic effects (for example, a reverberation effect) to the acoustic data W, the degree of the acoustic effect imparted to the acoustic data W can also be controlled according to the difference between the observation trajectory OA and the reference trajectory OREF. As understood from the above description, the acoustic control unit 36 encompasses any element that generates the acoustic signal S according to the comparison result (comparison data DC) by the comparison unit 34, and the specific processing content of the acoustic control unit 36 is arbitrary.
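The volume variant mentioned in modification (2) can be sketched as an amplitude scaling driven by the comparison value. The linear mapping clipped to [0, 1] and the names below are assumed examples; the patent leaves the modulation method open.

```python
def volume_gain(delta_t, max_delta=2.0):
    """Map a comparison value to an amplitude gain in [0, 1] (linear, clipped)."""
    return min(max(delta_t / max_delta, 0.0), 1.0)

def modulate(samples, delta_t):
    """Scale the waveform of acoustic data W by the comparison-driven gain."""
    g = volume_gain(delta_t)
    return [s * g for s in samples]

quiet = modulate([0.5, -0.5], delta_t=1.0)  # half gain: [0.25, -0.25]
```

A pitch or reverberation variant would replace the gain multiplication with a resampling step or an effect parameter, but the control flow (comparison data DC in, modulated waveform out) stays the same.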

(3) It is also possible to selectively use a plurality of acoustic data W indicating different sounds. Specifically, a configuration in which the acoustic control unit 36 selects, from the plurality of acoustic data W, the acoustic data W corresponding to an instruction from the user U and modulates it is preferable. For example, a plurality of acoustic data W indicating the wind noise during the swing of different types of clubs C are stored in the storage device 14. The acoustic control unit 36 selects from the storage device 14 the acoustic data W corresponding to the type of the club C used by the user U, and generates the acoustic signal S by modulating that acoustic data W according to the comparison data DC. The type of the club C (for example, driver or iron) is designated by the user U on the motion analysis apparatus 10, for example by operating an input device. According to the above configuration, the types of reproduced sound can be diversified.
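The selection step in modification (3) is a simple lookup from club type to stored acoustic data. The club names and waveform file names below are illustrative placeholders, not data from the patent.

```python
# Hypothetical table standing in for the storage device 14.
ACOUSTIC_DATA = {
    "driver": "wind_noise_driver.wav",
    "iron": "wind_noise_iron.wav",
}

def select_acoustic_data(club_type, table=ACOUSTIC_DATA):
    """Pick the acoustic data W matching the club type the user designated."""
    try:
        return table[club_type]
    except KeyError:
        raise ValueError(f"no acoustic data stored for club type {club_type!r}")

chosen = select_acoustic_data("iron")  # "wind_noise_iron.wav"
```

The selected waveform would then be modulated according to the comparison data DC exactly as in the embodiments.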

(4) It is also possible to reproduce a specific sound (for example, a sound effect) when the observation trajectory OA approximates the reference trajectory OREF. For example, the storage device 14 stores sound effect data indicating the waveform of a sound effect such as the sound of a ball dropping into the cup, cheers, or applause. The acoustic control unit 36 counts the number N of comparison data DC, among the comparison data DC sequentially generated by the comparison unit 34, in which a comparison value ΔT (ΔTx, ΔTy, ΔTz) exceeds the threshold; when the number N falls below a predetermined threshold after the swing motion ends (that is, when the observation trajectory OA approximates the reference trajectory OREF), the acoustic control unit 36 acquires the sound effect data from the storage device 14 and supplies it to the sound emitting device 16 as the acoustic signal S. That is, an acoustic signal S is reproduced in which the sound effect is added immediately after the sound indicated by the acoustic data W (the wind noise of the club C). With the above configuration, since the sound effect is reproduced when the observation trajectory OA approximates the reference trajectory OREF, there is an advantage that the user U can intuitively recognize the quality of the observation trajectory OA of his or her own swing motion. Note that it is also possible to add a sound effect to the acoustic signal S when the observation trajectory OA differs from the reference trajectory OREF (when the above-mentioned number N exceeds the threshold). That is, the acoustic control unit 36 controls whether a sound effect is added to the acoustic signal S according to the degree of approximation between the observation trajectory OA and the reference trajectory OREF.
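The counting decision in modification (4) can be sketched as follows. The patent's wording is ambiguous as to whether a comparison data DC is counted when any axis or every axis exceeds the threshold; the sketch assumes "any axis". The threshold values are likewise assumptions.

```python
TH = 0.5        # assumed per-axis threshold for the comparison values
N_LIMIT = 2     # assumed threshold on the count N of "deviating" samples

def should_play_effect(comparison_series):
    """Decide after the swing whether to append the sound effect.

    comparison_series: list of (dTx, dTy, dTz) tuples, one per comparison
    data DC generated during the swing.
    """
    # Count samples where any axis deviates beyond the threshold.
    n = sum(1 for dc in comparison_series if any(d > TH for d in dc))
    # Small N means the trajectories are approximate: play the effect.
    return n < N_LIMIT

swing = [(0.1, 0.2, 0.1), (0.3, 0.1, 0.2), (0.6, 0.1, 0.1)]
ok = should_play_effect(swing)  # True: only one sample exceeded the threshold
```

Inverting the final comparison would give the variant in which the effect is played when the trajectories differ.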

(5) In the third embodiment, the delay device 15 delays the acoustic signal S by a predetermined delay time δ, but the delay time δ can also be variably controlled. For example, in a configuration in which the comparison unit 34 detects the moment the club C hits the ball from temporal changes in the observation data DB (or the comparison data DC), the delay device 15 may delay the acoustic signal S by the time from the operation start point to the moment of impact (that is, the delay time δ is set to the time from the operation start point to the impact). In this configuration, no sound is reproduced between the operation start point and the impact, and reproduction of the sound starts at the moment of impact.
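Modification (5) requires detecting the moment of impact from temporal changes in the data. The patent does not specify the detection method; the sketch below assumes one simple possibility, a magnitude spike exceeding a fixed threshold, with an assumed threshold value.

```python
IMPACT_TH = 5.0  # assumed spike threshold for the impact shock

def impact_index(observation):
    """Return the index of the first sample whose magnitude exceeds
    IMPACT_TH, or None if no impact is detected in the series."""
    for i, v in enumerate(observation):
        if abs(v) > IMPACT_TH:
            return i
    return None

hit = impact_index([0.2, 0.8, 1.5, 9.3, 2.0])  # 3: impact at the fourth sample
```

Multiplying the detected index by the sampling period would give the delay time δ from the operation start point to the impact.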

(6) Each element illustrated in the above-described embodiments may be omitted as appropriate. For example, the storage device 14 can be omitted by acquiring various data from an external device separate from the motion analysis device 10. In a configuration in which the acoustic signal S generated by the acoustic control unit 36 is transferred to an external device via a communication network or a portable recording medium and reproduced by the sound emitting device 16 of that external device, the sound emitting device 16 can be omitted from the motion analysis device 10.

(7) In the first embodiment, the observation data acquisition unit 32 sequentially generates the observation data DB from the basic data DA supplied from the acceleration sensor 20, but a configuration in which the observation data acquisition unit 32 receives observation data DB sequentially generated by the acceleration sensor 20 may also be employed. That is, the element that acquires the observation data DB (observation data acquisition means) includes both an element that generates the observation data DB from a detection result of the acceleration sensor 20 and an element that receives the observation data DB from an external device (the acceleration sensor 20).

(8) In each of the above-described embodiments, the motion analysis device 10 that analyzes the swing motion of the golf club C is illustrated, but the motions to which the motion analysis device 10 can be applied are not limited to golf. For example, the motion analysis system 100 (the motion analysis device 10) of the above-described embodiments can be used for analyzing the swing motion of a bat in baseball, the swing motion of a racket in tennis, or the casting motion of a fishing rod in surf casting.

(9) It is also possible to vary the number of reference data DREF per unit time (the sampling period) within the analysis section. For example, a configuration in which the number of reference data DREF per unit time is larger in the portion of the analysis section immediately before and after the impact than in other portions is preferable. The larger the number of reference data DREF per unit time, the shorter the interval at which the observation data DB and the reference data DREF are compared and the more closely the observation trajectory OA and the reference trajectory OREF are contrasted, making it possible to analyze in detail the difference between the trajectories in the section immediately before and after the impact. In addition, since the number of reference data DREF is increased only in a part of the analysis section, the data amount can be reduced compared with a configuration in which the number is increased over the entire analysis section. The number of reference data DREF may differ in advance inside and outside a predetermined portion of the analysis section; alternatively, the comparison unit 34 may make the number of reference data DREF per unit time differ inside and outside a predetermined portion by increasing or decreasing the number through interpolation or thinning of the reference data series SREF. It is also possible to reduce the number of reference data DREF per unit time in portions of the analysis section that do not require detailed analysis.
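The interpolation variant of this modification, densifying the reference data DREF in a chosen portion of the analysis section, can be sketched with linear interpolation. The function name and the densification factor are assumed examples; the patent does not fix the interpolation method.

```python
def resample_section(values, factor):
    """Densify a section of the reference data series by an integer factor.

    factor == 1 keeps the section as-is; factor > 1 inserts linearly
    interpolated samples between each pair of neighbors.
    """
    if factor == 1 or len(values) < 2:
        return list(values)
    out = []
    for a, b in zip(values, values[1:]):
        # factor samples per original interval, starting at the left endpoint.
        out.extend(a + (b - a) * k / factor for k in range(factor))
    out.append(values[-1])
    return out

coarse = [0.0, 2.0, 4.0]
fine = resample_section(coarse, factor=2)  # [0.0, 1.0, 2.0, 3.0, 4.0]
```

Applying a factor greater than 1 only to the samples around the impact, and factor 1 (or a thinning step) elsewhere, realizes the non-uniform density described above.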

DESCRIPTION OF SYMBOLS: 100 ... motion analysis system, 10 ... motion analysis device, 12 ... arithmetic processing device, 32 ... observation data acquisition unit, 34 ... comparison unit, 36 ... acoustic control unit, 14 ... storage device, 15 ... delay device, 16 ... sound emitting device, 20 ... acceleration sensor

Claims (2)

  1. A motion analysis apparatus comprising:
    observation data acquisition means for acquiring observation data that includes an observation value in each of three mutually orthogonal axes and that can specify the trajectory of an observed point moving in conjunction with a user's motion;
    comparison means for comparing, for each axis, the reference value and the observation value between the observation data acquired by the observation data acquisition means and each of a plurality of reference data that include a reference value in each of the three axes and that specify a predetermined trajectory of the observed point; and
    acoustic control means for controlling reproduction/stop of each of three types of acoustic data corresponding to different axes in accordance with the comparison result by the comparison means for the axis corresponding to that acoustic data.
  2. The motion analysis apparatus according to claim 1, wherein the number of the reference data per unit time differs between one section of the predetermined trajectory and another section.
JP2012279501A 2012-12-21 2012-12-21 Motion analyzer Active JP5835206B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2012279501A JP5835206B2 (en) 2012-12-21 2012-12-21 Motion analyzer

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2012279501A JP5835206B2 (en) 2012-12-21 2012-12-21 Motion analyzer
US14/132,531 US20140180632A1 (en) 2012-12-21 2013-12-18 Motion Analysis Device
KR1020130157894A KR20140081695A (en) 2012-12-21 2013-12-18 Motion analysis device
CN201310704302.2A CN103877715B (en) 2012-12-21 2013-12-19 Motion analysis equipment and method of motion analysis

Publications (2)

Publication Number Publication Date
JP2014121456A JP2014121456A (en) 2014-07-03
JP5835206B2 true JP5835206B2 (en) 2015-12-24

Family

ID=50947043

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012279501A Active JP5835206B2 (en) 2012-12-21 2012-12-21 Motion analyzer

Country Status (4)

Country Link
US (1) US20140180632A1 (en)
JP (1) JP5835206B2 (en)
KR (1) KR20140081695A (en)
CN (1) CN103877715B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160024477A (en) * 2014-08-26 2016-03-07 김홍채 Golf swing analyzer
WO2017031247A1 (en) * 2015-08-18 2017-02-23 University Of Miami Method and system for adjusting audio signals based on motion deviation
WO2018070232A1 (en) * 2016-10-14 2018-04-19 ソニー株式会社 Signal processing device and signal processing method
CN106512362A (en) * 2016-12-13 2017-03-22 中山市得高行知识产权中心(有限合伙) Table tennis auxiliary training system and method

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5221088A (en) * 1991-01-22 1993-06-22 Mcteigue Michael H Sports training system and method
JPH0947535A (en) * 1995-08-05 1997-02-18 Yoshikazu Nakamura Golf swing practice device
US6514081B1 (en) * 1999-08-06 2003-02-04 Jeffrey L. Mengoli Method and apparatus for automating motion analysis
US20020115047A1 (en) * 2001-02-16 2002-08-22 Golftec, Inc. Method and system for marking content for physical motion analysis
FI20011518A0 (en) * 2001-07-11 2001-07-11 Raimo Olavi Kainulainen The movement
JP2004024627A (en) * 2002-06-26 2004-01-29 Yamaha Corp Device for movement practice
KR100631035B1 (en) * 2004-06-03 2006-10-02 송기무 swing training equipment in ball game sports
US20070135225A1 (en) * 2005-12-12 2007-06-14 Nieminen Heikki V Sport movement analyzer and training device
US8597133B2 (en) * 2006-03-16 2013-12-03 William B. Priester Motion training apparatus and method
JP2009125499A (en) * 2007-11-27 2009-06-11 Panasonic Electric Works Co Ltd Tennis swing improvement supporting system
US8827847B2 (en) * 2009-06-17 2014-09-09 Vernon Ralph Johnson Training aid
JP2011019793A (en) * 2009-07-17 2011-02-03 Ishida Co Ltd Sports technique-improving device
CA2673149A1 (en) * 2009-07-20 2011-01-20 National Research Council Of Canada Audio feedback for motor control training
JP2011062352A (en) * 2009-09-17 2011-03-31 Koki Hashimoto Exercise motion teaching device and play facility
US20110143866A1 (en) * 2009-12-14 2011-06-16 William Dean McConnell Core Tempo Golf Swing Training Tones
JP5948011B2 (en) * 2010-11-19 2016-07-06 セイコーエプソン株式会社 Motion analysis device
JP5704317B2 (en) * 2011-02-02 2015-04-22 セイコーエプソン株式会社 Swing analysis device, swing analysis system, program, and swing analysis method

Also Published As

Publication number Publication date
KR20140081695A (en) 2014-07-01
CN103877715B (en) 2017-08-08
JP2014121456A (en) 2014-07-03
CN103877715A (en) 2014-06-25
US20140180632A1 (en) 2014-06-26


Legal Events

Code | Title | Effective date
A621 | Written request for application examination (JAPANESE INTERMEDIATE CODE: A621) | 2014-09-19
A977 | Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007) | 2015-02-13
A131 | Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131) | 2015-02-17
RD04 | Notification of resignation of power of attorney (JAPANESE INTERMEDIATE CODE: A7424) | 2015-04-10
A521 | Written amendment (JAPANESE INTERMEDIATE CODE: A523) | 2015-04-15
TRDD | Decision of grant or rejection written |
A01 | Written decision to grant a patent or to grant a registration (JAPANESE INTERMEDIATE CODE: A01) | 2015-10-06
A61 | First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61) | 2015-10-19