JP5641222B2 - Arithmetic processing device, motion analysis device, display method and program


Info

Publication number: JP5641222B2
Application number: JP2010271630A
Authority: JP (Japan)
Prior art keywords: model, subject, information, trajectory information, trajectory
Legal status: Active
Other languages: Japanese (ja)
Other versions: JP2012120579A, JP2012120579A5
Inventor: 安俊 滝沢
Original Assignee: セイコーエプソン株式会社 (Seiko Epson Corporation)
Priority to: JP2010271630A
Published as: JP2012120579A, JP2012120579A5
Granted as: JP5641222B2


Description

  The present invention relates to a motion analysis apparatus.

  In sports such as golf, tennis, and baseball, acquiring an exemplary, efficient exercise form is important for improving one's skills. For this reason, many people take lessons from instructors at sports clubs and the like, but constraints of place and time can make such lessons impossible. To address this, Patent Document 1, for example, proposes a method in which a person who wants a golf lesson films his or her swing form with a camera, transmits the footage to a server over a communication line, and an instructor analyzes the footage and provides advice to the subject. With this method, a lesson from an instructor can be taken without restrictions of place and time.

JP 2001-296799 A

  However, with the method of Patent Document 1, the advice tends to reflect the instructor's subjective judgment, and it is difficult to know the quantitative difference from the exemplary form. Alternatively, an exercise form captured with a camera could be compared with an exemplary form prepared in advance; however, because the subject's physique and exercise speed differ from those of the model, it is still difficult to measure the difference from the exemplary form quantitatively.

  The present invention has been made in view of the above problems. According to some aspects of the present invention, it is possible to provide information that allows the difference between the subject's exercise form and a reference exercise form to be analyzed quantitatively.

  (1) The present invention is a motion analysis device including: a plurality of motion sensors attached to a subject; a data acquisition unit that acquires information based on the output data of each motion sensor; a trajectory information generation unit that generates, based on the information acquired by the data acquisition unit, trajectory information of a subject model obtained by modeling the subject; and a trajectory information storage unit that stores trajectory information of a reference model. The trajectory information generation unit reads the trajectory information of the reference model from the trajectory information storage unit, compares the trajectory information of the subject model with the trajectory information of the reference model, and performs, on at least one of the two sets of trajectory information, size normalization that matches the size of the subject model to the size of the reference model.

  The information based on the output data of each motion sensor may be the output data itself, or may be information obtained by applying given arithmetic processing to the output data.

  The trajectory information generation unit may normalize the size of only one of the trajectory information of the subject model and the trajectory information of the reference model, or of both.

  When normalizing the size of the trajectory information of the subject model, the trajectory information generation unit may perform the size normalization while generating the trajectory information, or may perform it after the trajectory information has been generated.

  According to the present invention, by performing size normalization and comparing the trajectory information of the subject model with the trajectory information of the reference model, the differences in exercise form and exercise speed between the subject and the reference can be analyzed quantitatively.

  (2) In this motion analysis device, the trajectory information generation unit may further perform, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, time normalization that matches the operation time of the subject model for the motion to be analyzed with the operation time of the reference model.

  In this way, by normalizing both size and time and comparing the trajectory information of the subject model with that of the reference model, even slight differences between the subject's exercise form and the reference exercise form can be analyzed quantitatively.

  (3) The present invention is also a motion analysis device including: a plurality of motion sensors attached to a subject; a data acquisition unit that acquires information based on the output data of each motion sensor; a trajectory information generation unit that generates, based on the information acquired by the data acquisition unit, trajectory information of a subject model obtained by modeling the subject; and a trajectory information storage unit that stores trajectory information of a reference model. The trajectory information generation unit reads the trajectory information of the reference model from the trajectory information storage unit, compares the trajectory information of the subject model with the trajectory information of the reference model, and performs, on at least one of the two sets of trajectory information, time normalization that matches the operation time of the subject model with the operation time of the reference model.

  The information based on the output data of each motion sensor may be the output data itself, or may be information obtained by applying given arithmetic processing to the output data.

  The trajectory information generation unit may normalize the time of only one of the trajectory information of the subject model and the trajectory information of the reference model, or of both.

  When normalizing the time of the trajectory information of the subject model, the trajectory information generation unit may perform the time normalization while generating the trajectory information, or may perform it after the trajectory information has been generated.

  According to the present invention, by performing time normalization and comparing the trajectory information of the subject model with the trajectory information of the reference model, the difference between the exercise form of the subject and the reference can be quantitatively analyzed.

  (4) In this motion analysis device, in the time normalization, the trajectory information generation unit may divide the motion to be analyzed into a plurality of motions and perform a process of matching the operation time of the subject model with the operation time of the reference model for each of the divided motions.

  In this way, since the exercise forms can be compared for each divided exercise, the difference between the exercise forms can be analyzed in more detail.

  (5) The motion analysis device may further include an image generation unit that generates an image from the trajectory information of the subject model and the trajectory information of the reference model after the normalization and displays the image on the display unit.

  Thus, by displaying the subject's exercise form and the reference exercise form as images, the difference between the two becomes intuitively easy to understand.

  (6) The motion analysis device may further include: a dictionary information storage unit that stores dictionary information; a trajectory information comparison unit that compares the trajectory information of the subject model with the trajectory information of the reference model after the trajectory information generation unit has performed the normalization; and a display selection unit that selects at least one piece of information from the dictionary information based on the comparison result of the trajectory information comparison unit and displays it on the display unit.

  In this way, the quantitative difference between the trajectory of the subject model and the trajectory of the reference model can be translated into another expression and displayed. For example, text pointing out faults in the exercise form, derived from the quantitative difference between the two, may be displayed.

  (7) In this motion analysis device, a motion sensor may further be attached to an exercise tool, and the device may further include a tool information generation unit that generates motion analysis information of the exercise tool based on the output data of that motion sensor.

  In this way, it is possible to analyze the relationship between the exercise form and the exercise tool information (for example, speed and trajectory).

  (8) In this motion analysis apparatus, at least one of the motion sensors may include a multi-axis acceleration sensor and a multi-axis angular velocity sensor.

FIG. 1 is a diagram showing the configuration of the motion analysis device of the present embodiment. FIG. 2 is a flowchart of the overall processing of the processing unit. FIG. 3 is a diagram showing an example of attachment of the motion sensors. FIG. 4 is a diagram showing an example of the human body model. FIG. 5 is a diagram for explaining time normalization of trajectory data. FIG. 6 is a diagram for explaining time normalization of trajectory data. FIG. 7 is a flowchart of time normalization of trajectory data. FIG. 8 is a diagram for explaining time normalization of trajectory data. FIG. 9 is a flowchart of size normalization of trajectory data. FIG. 10 is a diagram for explaining size normalization of trajectory data. FIG. 11 is a diagram for explaining the comparison processing of trajectory data. FIG. 12 is a diagram showing an example of the display screen of the display unit. FIG. 13 is a diagram showing an example of the display screen of the display unit.

  DESCRIPTION OF EMBODIMENTS Hereinafter, preferred embodiments of the present invention will be described in detail with reference to the drawings. The embodiments described below do not unduly limit the contents of the present invention described in the claims. Also, not all of the configurations described below are essential constituent requirements of the present invention.

1. Configuration of Motion Analysis Device

FIG. 1 is a diagram showing the configuration of the motion analysis device of the present embodiment.

  The motion analysis apparatus 1 of the present embodiment includes a human body mounting unit 100 and a processing / control unit 200, and analyzes the motion of a subject. When the subject performs a predetermined exercise, the motion analysis apparatus 1 acquires the subject's exercise form and analyzes the difference from the reference exercise form.

  The human body wearing unit 100 includes a plurality of motion sensors 10 and a synchronization unit 20 and is worn by a subject.

  The motion sensor 10 includes, for example, a three-axis acceleration sensor 12 and a three-axis gyro sensor (angular velocity sensor) 14, and can detect acceleration along three axes (x-axis, y-axis, z-axis) and angular velocity about three axes (x-axis, y-axis, z-axis). The number of detection axes may be three or more. The motion sensors 10 are worn on the subject's body, for example at the shoulders, waist, and elbows. The number of motion sensors 10 worn by the subject and the sites where they are attached can be determined arbitrarily according to the required accuracy of the motion analysis and the content of the analysis. The motion sensor 10 may also incorporate functions for correcting the acceleration and angular velocity (bias correction, temperature correction, and the like) and a Kalman filter for calculating speed and position in the xyz coordinate system.
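  As a rough illustration of that last point, the sketch below derives speed and position from bias-corrected acceleration samples. It uses naive double integration in place of the Kalman filter the embodiment mentions, and it omits orientation tracking and gravity compensation entirely, so it is a simplification under stated assumptions rather than the sensor's actual algorithm.

```python
import numpy as np

def integrate_motion(accel, dt, bias):
    """Derive speed and position in the sensor's xyz frame from its
    acceleration samples by naive double integration.
    accel: array of shape (num_samples, 3); dt: sampling period in seconds;
    bias: per-axis bias estimate to subtract (temperature effects omitted)."""
    a = accel - bias                    # simple bias correction
    vel = np.cumsum(a * dt, axis=0)     # first integral: velocity
    pos = np.cumsum(vel * dt, axis=0)   # second integral: position
    return vel, pos
```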

  Information of acceleration, angular velocity, speed, position, etc. of each motion sensor 10 is transmitted from each motion sensor 10 to the synchronization unit 20 by wireless communication or wired communication. The acceleration, angular velocity, position information, and the like are sent in synchronization with time information sent from the synchronization unit 20 to each motion sensor 10.

  The synchronization unit 20 sends the data received from each motion sensor 10 to the processing / control unit 200 in packets combined with time information and the like. Note that the correction functions (bias correction, temperature correction, and the like) and the Kalman filter for calculating speed and position may be implemented in the synchronization unit 20 instead of in the motion sensors 10.

  When analyzing exercise that uses an exercise tool 300, a motion sensor 10 may additionally be incorporated into the exercise tool 300. The position information of the exercise tool 300 can thereby be sent to the synchronization unit 20 of the human body wearing unit 100. Various exercise tools 300 are conceivable depending on the type of exercise, such as a baseball and bat, a tennis ball and racket, or a golf ball and club.

  The human body mounting unit 100 and the processing / control unit 200 are connected to each other wirelessly or by wire. The processing / control unit 200 can be realized by a general-purpose information processing device such as a personal computer (PC) or a smartphone. When the processing / control unit 200 is a PC, a wireless connection is easier to use; when it is a portable device such as a smartphone, it can also be connected by wire, for example via USB.

  The processing / control unit 200 includes a processing unit 30, a communication unit 40, an operation unit 50, a storage unit 60, and a display unit 70.

  The communication unit 40 performs processing to receive the packet data transmitted from the synchronization unit 20 of the human body wearing unit 100 and send it to the processing unit 30.

  The operation unit 50 performs a process of acquiring operation data from the user and sending it to the processing unit 30. The operation unit 50 is, for example, a touch panel display, buttons, keys, a microphone, and the like.

  The storage unit 60 stores programs for the processing unit 30 to perform various calculation processes and control processes, various programs and data for realizing application functions, and the like. The storage unit 60 is also used as a work area of the processing unit 30, and temporarily stores data input from the operation unit 50, calculation results executed by the processing unit 30 according to various programs, and the like.

  In particular, the storage unit 60 of the present embodiment includes a trajectory information storage unit 62. The trajectory information storage unit 62 stores trajectory information (trajectory data) of the reference model. The trajectory information of the reference model may be, for example, trajectory information of an exercise form as an example collected from a professional sports player, or may be trajectory information of an exercise form when the subject has performed well in the past. Of the trajectory information generated by the processing of the processing unit 30, what is desired to be stored for a long time, such as trajectory data of the exercise form when the subject is in good condition, is stored (recorded) in the trajectory information storage unit 62. If necessary, important trajectory information (for example, trajectory information that involves copyright such as professional athlete data) may be encrypted.

  The storage unit 60 of the present embodiment includes a dictionary information storage unit 64. The dictionary information storage unit 64 stores dictionary information that defines the correspondence between the information on the difference between the trajectory information of the subject model and the trajectory information of the reference model and a plurality of expressions.

  The display unit 70 displays the processing results of the processing unit 30 as characters, graphs, or other images. The display unit 70 is, for example, a CRT, an LCD, a touch panel display, or an HMD (head mounted display). The functions of the operation unit 50 and the display unit 70 may also be realized with a single touch-panel display.

  The processing unit 30 performs various calculation processes on the data received from the synchronization unit 20 via the communication unit 40, as well as various control processes (such as display control of the display unit 70), according to the programs stored in the storage unit 60. The processing unit 30 can be realized by a microprocessor, for example.

  In particular, the processing unit 30 of the present embodiment includes a data acquisition unit 31, a trajectory information generation unit 32, an image generation unit 33, a trajectory information comparison unit 34, a display selection unit 35, and a tool information generation unit 36. Note that some of these may be omitted.

  The data acquisition unit 31 performs a process of continuously acquiring information based on each output data of the motion sensor 10.

  The trajectory information generation unit 32 identifies the position of each motion sensor 10 based on the information acquired by the data acquisition unit 31 and generates trajectory information of a subject model that models the subject. The trajectory information generation unit 32 performs, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, size normalization that matches the size of the subject model to the size of the reference model. Alternatively, it performs, on at least one of the two, time normalization that matches the operation time of the subject model for the motion to be analyzed with the operation time of the reference model. The trajectory information generation unit 32 may also perform both size normalization and time normalization on at least one of the two sets of trajectory information.

  Further, in the time normalization, the trajectory information generation unit 32 may divide the motion to be analyzed into a plurality of motions and match the operation time of the subject model with that of the reference model for each of the divided motions.

  Furthermore, the trajectory information generation unit 32 may generate virtual trajectory information of the subject model from the position information of the remaining motion sensors 10, without using the position information of some of the motion sensors 10.

  The image generation unit 33 generates an image associated with the subject model and the reference model, based on their trajectory information after the trajectory information generation unit 32 has performed at least one of time normalization and size normalization, and displays the image on the display unit.

  The trajectory information comparison unit 34 performs a process of comparing the trajectory information of the subject model and the trajectory information of the reference model after the trajectory information generation unit 32 performs at least one of time normalization and size normalization.

  The display selection unit 35 selects at least one expression from the plurality of expressions defined by the dictionary information stored in the dictionary information storage unit 64, based on the comparison result of the trajectory information comparison unit 34, and displays it on the display unit 70.

  The tool information generation unit 36 generates analysis information on the motion of the exercise tool 300, based on the information that the data acquisition unit 31 acquires from the output data of the motion sensor 10 attached to the exercise tool 300.

2. Processing of the Motion Analysis Device

[Overall processing]
In the present embodiment, the user can select one of four operation modes, mode 1 to mode 4, by operating the operation unit 50, and the processing unit 30 performs the processing corresponding to the selected mode. Specifically, programs for the processing of modes 1 to 4 are stored in the storage unit 60, and the processing unit 30 performs the processing of each mode by executing these programs.

  Mode 1 is a mode in which data from each motion sensor 10 is acquired and trajectory data in a predetermined format is generated and stored in the storage unit 60. Mode 2 is a mode in which the trajectory data stored in the storage unit 60 in mode 1, or trajectory data obtained by converting its format, is compared with the reference trajectory data. Mode 3 is a mode in which a 3D image of the subject's exercise form or the reference exercise form is displayed on the display unit 70. Mode 4 is a mode that provides support information for determining the optimum number of motion sensors 10 to attach to the subject and their optimum attachment sites.

  FIG. 2 is a flowchart of the overall processing of the processing unit 30. When mode 1 is selected by the user (Y in step S10), the processing unit 30 acquires data from the synchronization unit 20 of the human body wearing unit 100 (step S12) and then generates and normalizes the trajectory data of the subject model (step S14).

  When mode 2 is selected by the user (Y in step S20), the processing unit 30 compares the trajectory data of the subject model generated in step S14 with the trajectory data of the reference model (step S22) and, if necessary, selects an appropriate expression from the dictionary information and displays it on the display unit 70 (step S24).

  When mode 3 is selected by the user (Y in step S30), the processing unit 30 generates a 3D image of the exercise form from the trajectory data of the subject model generated in step S14 and the trajectory data of the reference model, and displays it on the display unit 70 (step S32).

  If mode 4 is selected by the user (Y in step S40), the processing unit 30 performs a process of analyzing whether or not the predetermined motion sensor 10 is unnecessary (step S42).

  The processing unit 30 then repeats the processing for the mode selected by the user until the subject's motion analysis is finished (Y in step S50).

[Mode 1 processing]
For example, when analyzing a baseball pitching form, the motion sensors 10 can be attached, as shown in FIG. 3, to the head (cap), the back of the neck, both shoulders, both elbows, both wrists, the back of the waist, the bases of both legs, both knees, both ankles, and so on of the subject 2. A motion sensor 10 may also be built into the ball 3 (an example of the exercise tool 300).

  In mode 1, the processing unit 30 acquires data such as position information from the synchronization unit 20 of the human body mounting unit 100 at a constant period from when the subject 2 starts a motion to be analyzed (for example, pitching) to when it ends. Then, trajectory data representing the trajectory of the subject model is generated from the acquired data.

  In this embodiment, as shown in FIG. 4, a human body model 4 is defined as the subject model: nodes are set in correspondence with the head and joints of the subject 2, and predetermined nodes are connected by lines. The processing unit 30 calculates the position of each node of the subject model (human body model 4) in the XYZ coordinate system from the positions of the motion sensors 10 in the xyz coordinate system, which change as the subject 2 moves, and generates the trajectory data of the subject model (human body model 4). The X, Y, and Z axes have the same directions as the three axes defined for the trajectory data of the reference model.
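  For illustration, one possible encoding of such a node model and of the xyz-to-XYZ conversion is sketched below. The node names, the skeleton edges, and the calibration-derived rotation R and translation t are assumptions; the patent only specifies that nodes correspond to the head and joints and that the two coordinate systems are aligned.

```python
import numpy as np

# Hypothetical node layout for the human body model (cf. FIG. 4).
NODES = ["head", "neck", "r_shoulder", "r_elbow", "r_wrist",
         "l_shoulder", "l_elbow", "l_wrist", "waist",
         "r_hip", "r_knee", "r_ankle", "l_hip", "l_knee", "l_ankle"]

# A plausible set of lines connecting predetermined nodes.
EDGES = [("head", "neck"), ("neck", "r_shoulder"), ("r_shoulder", "r_elbow"),
         ("r_elbow", "r_wrist"), ("neck", "l_shoulder"), ("l_shoulder", "l_elbow"),
         ("l_elbow", "l_wrist"), ("neck", "waist"), ("waist", "r_hip"),
         ("r_hip", "r_knee"), ("r_knee", "r_ankle"), ("waist", "l_hip"),
         ("l_hip", "l_knee"), ("l_knee", "l_ankle")]

def to_model_frame(p_xyz, R, t):
    """Map a sensor position from its xyz frame into the common XYZ frame.
    R (3x3 rotation) and t (translation) would come from calibration; they
    are assumptions here, not something the patent specifies."""
    return R @ np.asarray(p_xyz) + t
```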

  For example, when motion sensors 10 are attached to the head and each joint of the subject 2 as shown in FIG. 3, the motion sensors 10 and the nodes of the human body model 4 correspond one to one. That is, nodes N1 to N15 correspond to the motion sensors 10 attached at 15 points: the head (cap), the back of the neck, the right shoulder, the right elbow, the right wrist, the left shoulder, the left elbow, the left wrist, the back of the waist, the base of the right leg, the right knee, the right ankle, the base of the left leg, the left knee, and the left ankle. In such a case, the position of each node can be calculated directly from the position of the corresponding motion sensor 10.

  On the other hand, when no motion sensor 10 is attached to the head or to some joints, a sensor is assumed to be virtually attached there, and the position of the node corresponding to this virtually attached sensor (hereinafter, "virtual sensor") is calculated from the positions of the nodes corresponding to the other motion sensors 10 using a technique such as inverse kinematics. For example, attaching motion sensors 10 only to the shoulder and wrist and calculating the motion of the elbow from their motion, rather than attaching sensors to the shoulder, wrist, and elbow, improves the usability of the motion analysis apparatus 1. Calculating the position of a virtual sensor requires a model of the movement of the human body. For example, the human body is assumed to be a rigid articulated structure: the lengths between the shoulder, elbow, and wrist are always constant and the bones do not deform; likewise, the distances from the hip joint to the knee and ankle are constant, the bones do not deform, and the range of motion of each joint has an upper limit. Under these assumptions, the movement of the human body is modeled from a plurality of recorded human motions and used to calculate the position of the virtual sensor.
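  As one illustration, the following is a minimal two-bone inverse-kinematics sketch of how a virtual elbow node could be placed from the shoulder and wrist sensor positions under the rigid-link assumption. The patent names inverse kinematics only in general; the bend-direction hint below is an assumption standing in for the motion model learned from recorded movements.

```python
import numpy as np

def estimate_elbow(shoulder, wrist, l_upper, l_fore, bend_hint):
    """Place a virtual elbow node from shoulder and wrist positions,
    assuming rigid bones of known, constant length. bend_hint is an
    assumed direction in which the elbow bends (e.g. learned from
    recorded human motion); all inputs are 3-vectors or scalars."""
    sw = wrist - shoulder
    dist = np.linalg.norm(sw)
    # Clamp to the reachable range so the triangle inequality always holds.
    d = np.clip(dist, abs(l_upper - l_fore) + 1e-6, l_upper + l_fore - 1e-6)
    axis = sw / max(dist, 1e-6)
    # Law of cosines: distance from the shoulder to the elbow's projection
    # onto the shoulder-wrist line, and the elbow's offset from that line.
    a = (l_upper**2 - l_fore**2 + d**2) / (2.0 * d)
    h = np.sqrt(max(l_upper**2 - a**2, 0.0))
    # Bend toward the hint, projected perpendicular to the shoulder-wrist axis.
    perp = bend_hint - np.dot(bend_hint, axis) * axis
    perp /= max(np.linalg.norm(perp), 1e-6)
    return shoulder + a * axis + h * perp
```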

  Further, the processing unit 30 performs processing for saving the generated trajectory data in the storage unit 60 (trajectory information storage unit 62). In the present embodiment, there are three types of trajectory data formats to be saved, and whether or not to save in each format can be selected. The trajectory data of format 1 is trajectory data in which the positions of the XYZ coordinate system calculated from the positions of the xyz coordinate system of each motion sensor 10 acquired from the synchronization unit 20 are arranged in time series. The data of format 2 is trajectory data that has been subjected to size normalization (time normalization is not performed). The data of format 3 is trajectory data obtained by normalizing the size and time.

  A part or all of the trajectory data of format 1, format 2, and format 3 may be generated and stored. Alternatively, the trajectory data may be stored only in the format 1 and the trajectory data of the format 2 or the format 3 may be generated when the trajectory data is displayed or compared with the reference trajectory data.

  The trajectory data of format 1 is used when viewing the actual motion form of the subject 2 with 3D graphic software or the like. By using the trajectory data of format 1, the actual physique and exercise speed of the subject 2 can be confirmed on the screen.

  The format 2 trajectory data (size-normalized trajectory data) is used to make the subject 2's exercise form easier to compare with the reference exercise form. That is, when the physique of the person from whom the reference trajectory data was acquired differs from that of the subject 2, the exercise forms are difficult to compare at their actual sizes, so the size is normalized to facilitate comparison. Since format 2 trajectory data is not time-normalized, it is used especially when information on the time axis of the exercise form (for example, differences in motion speed) is important. However, when size-normalized reference trajectory data is compared with format 2 trajectory data, the two are not synchronized along the way, so this format is not suited to comparing the forms in detail.

  Format 3 trajectory data (size and time normalized trajectory data) is used to compare the exercise form of the subject 2 with the reference exercise form in detail.

(Time normalization of trajectory data)
Next, an example of a method for normalizing the time of trajectory data will be described.

  First, a start point, an end point, and one or more feature points (reference points), such as postures that change markedly during the series of motions, are defined. Defining the start point, reference points, and end point divides the motion to be analyzed into a plurality of motions.

  For example, when analyzing a baseball pitching form, as shown in FIG. 5(A), the start point of the motion is the moment when the subject 2, having come to rest before pitching, starts to draw the throwing arm backward. As shown in FIG. 5(D), the end point is the moment when, after the pitch, the foot on the same side as the dominant arm leaves the ground and touches down again. Further, as shown in FIG. 5(B), the moment when the arm has been taken back to begin the pitching motion and the foot opposite the dominant arm touches down is defined as the first reference point (reference point 1), and, as shown in FIG. 5(C), the moment when the ball 3 leaves the fingertips of the subject 2's dominant arm is defined as the second reference point (reference point 2).

  Similarly, when analyzing a baseball batting form, the start point may be the moment when the subject, at rest, begins the swing; the end point may be the moment when the bat speed during the follow-through reaches zero, that is, when the bat starts moving in the opposite direction; the first reference point may be the moment when the front foot leaves the ground and touches down again; and the second reference point may be the moment when the ball meets the bat.

  The start point, end point, and reference points of the motion may be determined automatically by the processing unit 30 based on the position data of the motion sensors 10, or they may be determined manually: for example, an animation of an object corresponding to the subject model is displayed on the display unit 70, and the user presses a switch at the appropriate moments while watching it.
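  A minimal sketch of the automatic variant: scan one node's position data and report the first time at which its local change exceeds a threshold. Which node to watch, the threshold, and the window length are illustrative assumptions (the embodiment gives examples such as watching node N4 or N5 for the start of a pitch, as described below).

```python
import numpy as np

def find_event(times, positions, threshold, window=5):
    """Scan one node's sampled positions and return the first time at which
    the displacement over the last `window` samples exceeds `threshold`.
    positions: array of shape (num_samples, 3); times: matching 1-D array."""
    for i in range(window, len(times)):
        if np.linalg.norm(positions[i] - positions[i - window]) > threshold:
            return times[i]
    return None  # no event found in this recording
```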

  Once the feature points have been determined, the position coordinates in the XYZ coordinate system of each node of the human body model 4 at the start point, the end point, and each reference point are calculated. For example, FIGS. 6(A) to 6(D) show the states of the human body model 4 corresponding to the states of the subject 2 shown in FIGS. 5(A) to 5(D), respectively; the position coordinates of each node of the human body model 4 in the states of FIGS. 6(A) to 6(D) are calculated in association with the start point, reference point 1, reference point 2, and the end point. When the motion sensors 10 and the nodes of the human body model 4 correspond one to one, the node positions can be calculated by converting the xyz-coordinate position data output by each motion sensor 10 at the start point, end point, and reference points into the XYZ coordinate system.

  In addition, the period from the start point to the first reference point, each period between adjacent reference points, and the period from the last reference point to the end point are each divided into a predetermined number of equal intervals, and the position coordinates of each node of the human body model 4 at each division point are calculated. Since each motion sensor 10 detects its position at a constant cycle, there is not always position data at exactly the time of a division point. If such data exists, the node positions at the division point can be calculated by converting the position data from the xyz coordinate system to the XYZ coordinate system. If not, position data for the division point is first generated, for example by interpolation between the position data at the nearest times before and after it, and the node positions are then calculated by converting the interpolated data from the xyz coordinate system to the XYZ coordinate system.
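  For example, the interpolation step could look like the following sketch, which assumes linear interpolation between the two samples around the division time (the embodiment does not fix the interpolation method):

```python
import numpy as np

def position_at(t, times, positions):
    """Return a node's position at time t (times sorted ascending and
    bracketing t). If no sample falls exactly on t, linearly interpolate
    between the nearest samples before and after it."""
    i = int(np.searchsorted(times, t))
    if i < len(times) and times[i] == t:        # a sample exists at t
        return positions[i]
    t0, t1 = times[i - 1], times[i]             # neighbours around t
    w = (t - t0) / (t1 - t0)
    return (1.0 - w) * positions[i - 1] + w * positions[i]
```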

  FIG. 7 shows an example of a flowchart of the processing for generating time-normalized trajectory data. The flowchart of FIG. 7 is an example of part of the processing of step S14 of the flowchart of FIG. 2; before this flowchart starts, the process of acquiring the position data of each motion sensor 10 in the xyz coordinate system from the synchronization unit 20 (step S12 in FIG. 2) has been completed.

  First, the time t is set to 0 and the variable k is set to 0 (step S100), and whether time t is the start point is determined from the position data in the xyz coordinate system acquired at time t and at nearby times (step S102). In the analysis of a baseball pitching form, whether time t is the start point shown in FIG. 5(A) can be determined, for example, from the amount of change around time t in the position of node N4 (right elbow) or node N5 (right wrist).

If time t is not the start point (N in step S104), t is increased by the position data acquisition period Δt (step S105), and it is determined again whether time t is the start point (step S102). If time t is the start point (Y in step S104), the position coordinates of each node at the start point in the XYZ coordinate system are calculated (step S106). For example, as shown in FIG. 8, when the actual trajectory of a certain node N is A, the coordinates of the position p0 of node N at the start point are calculated.

  Next, t is stored in the array element T(k), time t is increased by Δt, and the variable k is increased by 1 (step S108). T(0) thus holds the time of the start point.

  If a reference point is defined (Y in step S110), whether time t is the k(=1)th reference point is determined from the position data in the xyz coordinate system acquired at time t and at nearby times (step S112). In the pitching form analysis, whether time t is reference point 1 shown in FIG. 5(B) can be determined, for example, from the amount of change around time t in the position of node N15 (left ankle).

If time t is not the k(=1)th reference point (N in step S114), t is increased by Δt (step S115), and the determination is made again (step S112). If time t is the k(=1)th reference point (Y in step S114), the position coordinates of each node at that reference point are calculated (step S116). In the example of FIG. 8, the coordinates of the position p1 of node N at reference point 1 are calculated.

  Next, t is stored in T(k), time t is increased by Δt, and the variable k is increased by 1 (step S118). When reference point 1 is defined, T(1) holds the time of reference point 1.

  The processing of steps S112 to S118 is repeated until all reference points have been processed (Y in step S120). For example, when a second reference point is defined, whether time t is the k(=2)th reference point is determined from the position data in the xyz coordinate system acquired at time t and at nearby times (step S112). In the pitching form analysis, whether time t is reference point 2 shown in FIG. 5(C) can be determined, for example, from the amount of change around time t in the distance between the position of the ball 3 and the position of node N5 (right wrist).

If time t is the k(=2)th reference point (Y in step S114), the position coordinates of each node in the XYZ coordinate system at that reference point are calculated (step S116). In the example of FIG. 8, the coordinates of the position p2 of node N at reference point 2 are calculated.

  Next, t is stored in T(k), time t is increased by Δt, and the variable k is increased by 1 (step S118). When reference point 2 is defined, T(2) holds the time of reference point 2.

  When the processing for all reference points is complete (Y in step S120), whether time t is the end point is next determined from the position data in the xyz coordinate system acquired at time t and at nearby times (step S122). In the pitching form analysis, whether time t is the end point shown in FIG. 5(D) can be determined, for example, from the amount of change around time t in the position of node N11 (right knee) or node N12 (right ankle).

If time t is not the end point (N in step S124), t is increased by Δt (step S125), and the determination is made again (step S122). If time t is the end point (Y in step S124), the position coordinates of each node at the end point in the XYZ coordinate system are calculated (step S126). In the example of FIG. 8, the coordinates of the position p3 of node N at the end point are calculated.

  Next, t is stored in T(k), k is stored in the variable M, and the variable k is reset to 0 (step S128). For example, when reference points 1 and 2 are defined, the end point time is held in T(3); when no reference point is defined, the end point time is held in T(1).

Next, the interval between T(k) and T(k+1) is divided into n(k+1) periods, and the position coordinates of each node in the XYZ coordinate system at the n(k+1)−1 division points are calculated (step S130). In the example of FIG. 8, the interval between the start point and reference point 1 is divided into n1 periods, and the coordinates of the positions p(0,1), p(0,2), p(0,3), ..., p(0,n1−1) of node N at the n1−1 division points are calculated.

Next, the variable k is incremented by 1 (step S132). If k = M does not hold (N in step S134), the processing of steps S130 and S132 is performed again. In the example of FIG. 8, when k = 1, the interval between reference point 1 and reference point 2 is divided into n2 periods, and the coordinates of the positions p(1,1), p(1,2), p(1,3), ..., p(1,n2−1) of node N at the n2−1 division points are calculated. Further, when k = 2, the interval between reference point 2 and the end point is divided into n3 periods, and the coordinates of the positions p(2,1), p(2,2), p(2,3), ..., p(2,n3−1) of node N at the n3−1 division points are calculated.

When k = M (Y in step S134), the node position coordinates calculated so far are arranged in time order to generate the trajectory data (step S136), and the time normalization processing ends. Thus, in the example of FIG. 8, trajectory data consisting of the coordinate sequence p0, p(0,1), p(0,2), p(0,3), ..., p(0,n1−1), p1, p(1,1), p(1,2), p(1,3), ..., p(1,n2−1), p2, p(2,1), p(2,2), p(2,3), ..., p(2,n3−1), p3 is obtained.

  There are two time normalization methods. In the first, the time from the start point to the end point (the time corresponding to the motion to be analyzed) is made equal between the trajectory data of the subject model and that of the reference model. In this case, the interval from the start point to the first reference point, the intervals between reference points, and the interval from the last reference point to the end point generally differ between the two sets of trajectory data. Because the times of the nodes (key postures) of the movement differ between the subject model's and the reference model's trajectory data, this first method allows the transition times from node to node to be compared.

  In the second normalization method, not only the time from the start point to the end point but also the interval from the start point to the first reference point, the intervals between reference points, and the interval from the last reference point to the end point (the times corresponding to each of the motions into which the analyzed motion is divided) are made equal between the trajectory data of the subject model and that of the reference model. Because the node times of the movement then coincide between the two sets of trajectory data, this second method is suited to comparing forms and trajectories.
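  As an illustration of the second method, the following sketch resamples each segment between adjacent feature points to a fixed number of equal intervals, reusing the position_at() helper sketched above; applying it with the same counts to the subject data and the reference data makes corresponding samples refer to the same phase of the motion. The event and count representations are assumptions.

```python
import numpy as np

def normalize_time(times, positions, events, counts):
    """Resample one node's trajectory so that each segment between adjacent
    feature points contains a fixed number of equal intervals.
    events: times of [start, ref1, ..., refM, end]; counts: [n1, ..., nM+1].
    Relies on position_at() from the interpolation sketch above."""
    samples = []
    for (t0, t1), n in zip(zip(events[:-1], events[1:]), counts):
        # segment start plus its n-1 division points (segment end excluded)
        for t in np.linspace(t0, t1, n, endpoint=False):
            samples.append(position_at(t, times, positions))
    samples.append(position_at(events[-1], times, positions))  # end point
    return np.asarray(samples)
```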

(Size normalization of trajectory data)
Next, an example of a method for generating size-normalized trajectory data will be described. Normalizing the size of trajectory data means normalizing the size of the subject model at each time. Since each person's physique differs, the model is converted, for example, to average values of the sizes of the body parts.

  FIG. 9 shows an example of a flowchart of the processing for generating size-normalized trajectory data. The flowchart of FIG. 9 is an example of part of the processing of step S14 of the flowchart of FIG. 2; before this flowchart starts, the process of acquiring the position data of each motion sensor 10 in the xyz coordinate system from the synchronization unit 20 (step S12 in FIG. 2) has been completed.

First, for each node Ni of the subject model and each node Nj connected to it, an actual distance Lij and a normalized distance Kij are set (step S200). Lij may be set by measuring the subject's physique with a tape measure, or by having the subject perform a predetermined exercise and measuring the physique from it (calibration). Lij can also be set by measuring the physique from the subject's actual exercise form, without a separate calibration. Once measured, the physique measurement data (the Lij values) are recorded in the storage unit 60, and the recorded data for each subject can be updated as necessary. Kij may be set to a predetermined value such as the average distance between nodes Ni and Nj, or to the distance between the corresponding nodes of the reference model.

Next, time t is set to 0 (step S202), and from the position data in the xyz coordinate system acquired at time t, the vector Vkm(t) in the XYZ coordinate system from each rotation-center node Nk to each node Nm connected to it is calculated (step S204). For example, as shown in FIG. 10, the vector V34(t) from node N3 (right shoulder) to node N4 (right elbow) at time t is calculated.

Next, the length of each vector Vkm(t) calculated in step S204 is multiplied by Kkm/Lkm, and the position coordinates of each node Nm in the XYZ coordinate system are calculated (step S206). In the example of FIG. 10, if the average length from the right shoulder to the right elbow is 25 cm (= K34) and the length from the subject's right shoulder to the right elbow is 23 cm (= L34), the length of V34(t) at time t is multiplied by 25/23.

  If there is another rotation center node (Y in step S208), the processes in steps S204 and S206 are performed on the next rotation center node.

  When the processing is complete for all rotation-center nodes (N in step S208), t is increased by the position data acquisition period Δt (step S210), and if there is still position data to be processed (Y in step S212), the processing of steps S204 to S210 is similarly performed using the position data in the xyz coordinate system acquired at the new time t.

  If there is no position data to be processed (N in step S212), the position coordinates of each node in the XYZ coordinate system are arranged in time order to generate trajectory data (step S214), and the size normalization process ends.
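  A compact sketch of one time step of this procedure is shown below. The traversal order and the data layout are assumptions; K and L correspond to the normalized and actual segment lengths set in step S200.

```python
import numpy as np

def normalize_size(pose, bones, K, L):
    """Rescale one time step of a pose so the subject's segment lengths
    match the normalized lengths.
    pose: dict node -> XYZ position (np.array of shape (3,))
    bones: (center, connected) node pairs ordered outward from the body root
    K, L: dicts mapping each bone to its normalized / actual length."""
    out = dict(pose)
    for nk, nm in bones:                    # e.g. ("N3", "N4"): shoulder to elbow
        v = pose[nm] - pose[nk]             # vector V_km(t) in the XYZ frame
        out[nm] = out[nk] + v * (K[(nk, nm)] / L[(nk, nm)])
    return out
```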

  The trajectory data obtained by normalizing the size thus obtained is stored in the storage unit 60 as trajectory data of format 2.

  Further, when trajectory data normalized in time according to the flowchart of FIG. 7 has been obtained, the flowchart of FIG. 9 may be applied to that trajectory data. The trajectory data thus normalized in both size and time is stored in the storage unit 60 as format 3 trajectory data.

  Further, based on the data from a motion sensor 10 built into the exercise tool 300, such as the ball 3, the same processing can be performed to generate analysis information on the exercise tool 300, such as trajectory data and speed; if necessary, the time and size of that trajectory data can be normalized, and the data stored in the storage unit 60.

  Instead of storing the original and normalized trajectory data in the storage unit 60, feature amounts may be extracted from the trajectory data and stored. For example, the trajectory data can be subjected to FFT (fast Fourier transform) or DCT (discrete cosine transform) processing, and the coefficients up to order n extracted and stored. Using feature amounts instead of the trajectory data itself can make comparison with the reference easier, simplify the normalization processing, and simplify the calculation of the virtual sensors described later.
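  For instance, a DCT-based variant might look like the following sketch. Keeping only the first n coefficients is the truncation the text describes, while the array shapes and the orthonormal normalization are illustrative choices.

```python
import numpy as np
from scipy.fft import dct, idct

def trajectory_features(coords, n):
    """Compress one node's trajectory: DCT each coordinate sequence and keep
    only the first n coefficients as a compact, comparable feature vector.
    coords: array of shape (num_samples, 3)."""
    return dct(coords, axis=0, norm="ortho")[:n]

def reconstruct(features, num_samples):
    """Approximate the original trajectory from the truncated coefficients."""
    padded = np.zeros((num_samples, features.shape[1]))
    padded[:features.shape[0]] = features
    return idct(padded, axis=0, norm="ortho")
```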

[Mode 2 processing]
In mode 2, the processing unit 30 compares the trajectory data of the subject model with the trajectory data of the reference model. What is compared is normalized trajectory data. The trajectory data of the subject model is generated and normalized in mode 1 and stored in the storage unit 60. The trajectory data of the reference model is normalized in advance and stored in the storage unit 60 (trajectory information storage unit 62) as format 2 trajectory data, format 3 trajectory data, or both. Alternatively, one or both sets of trajectory data may be stored as original trajectory data (format 1) and converted into format 2 or format 3 trajectory data by normalization before the comparison processing.

When comparing format 3 trajectory data, the node positions at corresponding time points are compared and their differences are calculated: for example, start point with start point, reference point with reference point, end point with end point, and division point with division point. FIG. 11 is a diagram showing an example of the trajectory of one node of the subject model and the trajectory of the corresponding node of the reference model; the subject model node's trajectory is drawn as a solid line and the reference model node's trajectory as a broken line. For example, p1 and r1, p2 and r2, p3 and r3, p4 and r4, p5 and r5, and so on are compared, and the difference vectors are calculated. This comparison aligns the time axes and allows a detailed analysis of the exercise form.
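  A minimal sketch of this comparison for one node is given below. Since the data are time-normalized, equal indices correspond to the same phase of the motion; the RMS summary is an illustrative aggregate the patent does not prescribe.

```python
import numpy as np

def compare_format3(subject, reference):
    """Compare one node's time-normalized trajectories point by point.
    subject, reference: arrays of shape (num_points, 3) whose i-th rows
    refer to the same phase of the motion (p_i and r_i in FIG. 11)."""
    diff = subject - reference                        # difference vectors
    rms = np.sqrt((np.linalg.norm(diff, axis=1) ** 2).mean())
    return diff, rms
```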

  When comparing format 2 trajectory data, the node positions at corresponding feature points are compared to calculate positional differences, and the times of corresponding feature points are compared to calculate time differences. For example, the node positions at the start point, at each reference point, at the end point, and at each division point are compared, and, with the start points aligned, the times between reference points, between end points, and, as necessary, between division points are compared. This makes it possible to analyze, along with the exercise form, the time differences from the start point to a reference point, from reference point to reference point, and from the last reference point to the end point.

  The comparison results are displayed on the display unit 70 as a table or graph, and quantitative analysis information is obtained, such as "the time from reference point 1 to reference point 2 is X ms shorter than the reference" or "between reference point 1 and reference point 2, the trajectory passes 5 cm above the reference".

  However, in some cases an intuitive expression may be easier to understand than quantitative analysis information. For example, depending on the quantitative analysis information, an expression characterizing the difference in trajectory from the reference, such as "small swing", "too much take-back", "head up", or "upper swing", may be displayed. For this purpose, in the present embodiment, the storage unit 60 (dictionary information storage unit 64) stores dictionary information that defines the correspondence between the difference information between the trajectory information of the subject model and that of the reference model and a plurality of expressions (intuitive expressions and the like). The processing unit 30 refers to the dictionary information, selects one or more expressions according to the quantitative analysis information, and displays them on the display unit 70. As a result, the user can understand the difference from the reference in easy-to-understand words. Note that this dictionary information can be user-defined, so the expressions can be optimized for each user.
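  One way such dictionary information could be organized is sketched below; the metric names, thresholds, and expressions are purely illustrative assumptions, and in the embodiment the dictionary is also user-definable.

```python
# Hypothetical dictionary information: each entry pairs a predicate over the
# quantitative comparison result with an intuitive expression. Metric names
# and thresholds are illustrative, not values from the patent.
DICTIONARY = [
    (lambda d: d["swing_radius_ratio"] < 0.9,  "small swing"),
    (lambda d: d["takeback_depth_cm"]  > 10.0, "too much take-back"),
    (lambda d: d["head_rise_cm"]       > 5.0,  "head up"),
]

def select_expressions(diff_metrics):
    """Return every dictionary expression whose condition the comparison
    result satisfies; the display unit would then show these texts."""
    return [text for cond, text in DICTIONARY if cond(diff_metrics)]

# Example: a comparison result in which only the head moved upward.
print(select_expressions({"swing_radius_ratio": 0.95,
                          "takeback_depth_cm": 4.0,
                          "head_rise_cm": 6.5}))   # -> ['head up']
```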

  Further, along with the comparison result, analysis information such as a trajectory and speed related to the exercise of the exercise tool 300 such as the ball 3 may be displayed on the display unit 70. In this way, for example, a quantitative relationship between the pitching form of the subject and the trajectory and speed of the ball 3 can be evaluated, and more useful analysis can be performed.

[Mode 3 processing]
In mode 3, the processing unit 30 generates an animation of the motion data from the trajectory data and displays it as a 3D image. Arbitrary motion data can be displayed on the display unit 70 to check the movement of the reference and the movement of the subject. For example, 3D graphics software is prepared in the storage unit 60; the processing unit 30 starts the 3D graphics software, inputs the motion data, and displays a 3D image on the display unit 70 using the software's drawing functions. Further, by changing the position and direction of the virtual camera with the software's functions, the 3D image can be displayed from a different viewpoint. In this way, the exercise form can be viewed from various angles.

  In addition, by using the function of 3D graphics software, the animation of the subject model and the animation of the reference model can be displayed side by side or displayed in a superimposed manner. FIG. 12 is a diagram illustrating an example in which a subject model animation and a reference model animation are displayed side by side. In the display area 400 of the display unit 70, an animation image of the subject model 412 is displayed in the display area 410, and an animation image of the reference model 422 is displayed side by side in another display area 420. The subject model 412 and the reference model 422 are obtained by fleshing out the human body model 4 with simplified part objects corresponding to the respective parts. FIG. 13 is a diagram showing an example in which the subject model animation and the reference model animation are displayed in an overlapping manner. In the display area 400 of the display unit 70, the animation image of the subject model 432 and the animation image of the reference model 434 are displayed in the display area 430 in an overlapping manner. By overlapping and displaying in this way, the difference from the reference can be made clearer.

  In this way, by displaying the subject model animation and the reference model animation side by side or overlapped, the subject's current form can be checked simultaneously against a professional athlete's exemplary form or against the subject's own past good form. In the example of FIG. 13, it can be clearly seen that at the moment of release the reference keeps a lower center of gravity and a wider stance.

[Mode 4 processing]
Before performing the processing of modes 1, 2, and 3, it is necessary to determine to which parts of the subject's body the motion sensors 10 are attached and which parts obtain their position information from virtual sensors. The attachment positions differ depending on the type of exercise, and the required accuracy differs with the level of analysis (for professional players or for beginners), so the optimum conditions must be determined. Mode 4 therefore supports determining the optimal number of motion sensors 10 and their optimal attachment sites.

  Specifically, in mode 4, the processing unit 30 obtains the trajectory of a node from the data of the motion sensor 10 attached to the subject, obtains the trajectory of the same node (the virtual trajectory) by calculation from the data of the other motion sensors 10 alone, and displays information on the difference between the two trajectories on the display unit 70. By analyzing this information, the user can search for motion sensors 10 that could be replaced by virtual sensors. For example, to decide whether a motion sensor 10 needs to be attached to the elbow, motion sensors 10 are attached to the shoulder, elbow, and wrist and data are acquired. The elbow trajectory is then calculated from the information of the sensors attached to the shoulder and wrist, and the difference between this calculated trajectory and the trajectory obtained from the sensor attached to the elbow, together with other quantitative error measures, is calculated and displayed. By checking this trajectory difference and the quantitative error, it can be judged whether the motion sensor 10 attached to the elbow can be removed without problems. In addition, since the human body model is adapted to general-purpose motion, the calculation error of a virtual sensor may become large for a special motion; in that case the error can be reduced by adjusting the parameters of the human body model based on it. In this way, the human body model can be trained and its accuracy increased.
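  The quantitative check could be as simple as the following sketch; the choice of error measures and the centimeter conversion (positions assumed to be in meters) are illustrative assumptions.

```python
import numpy as np

def sensor_removal_error(measured, virtual):
    """Compare a node's measured trajectory with the virtual trajectory
    computed without that sensor (e.g. the elbow estimated from shoulder
    and wrist), and report quantitative error.
    measured, virtual: arrays of shape (num_samples, 3), positions in meters."""
    err = np.linalg.norm(measured - virtual, axis=1)
    return {"max_cm": err.max() * 100.0,
            "rms_cm": float(np.sqrt((err ** 2).mean())) * 100.0}

# If the errors stay below the accuracy the analysis requires, the physical
# sensor (here, the elbow sensor) can be replaced by a virtual sensor.
```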

  As described above, according to the motion analysis apparatus of the present embodiment, by comparing the format 2 trajectory data of the subject model and the reference model, in which the size is normalized, differences in exercise form and exercise speed between the subject and the reference can be analyzed quantitatively. Further, by comparing the format 3 trajectory data, in which both size normalization and time normalization are performed, even slight differences between the exercise forms of the subject and the reference can be analyzed quantitatively.
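  As an illustration of the two normalizations behind format 2 and format 3 (the patent does not prescribe these exact formulas), here is a minimal Python sketch assuming trajectories are (N, 3) NumPy arrays, that size normalization can be approximated by uniform scaling with a height ratio, and that time normalization can be done by linear resampling.

```python
import numpy as np

def normalize_size(trajectory, subject_height, reference_height):
    """Format 2: uniformly scale the subject trajectory so that the
    subject model's size matches the reference model's size."""
    return trajectory * (reference_height / subject_height)

def normalize_time(trajectory, num_samples):
    """Resample an (N, 3) trajectory to num_samples points so that the
    subject's operation time matches the reference's operation time."""
    t_old = np.linspace(0.0, 1.0, len(trajectory))
    t_new = np.linspace(0.0, 1.0, num_samples)
    return np.stack(
        [np.interp(t_new, t_old, trajectory[:, k]) for k in range(3)], axis=1)

# subject_f2 = normalize_size(subject, subject_height, reference_height)
# subject_f3 = normalize_time(subject_f2, len(reference))  # format 3
# diff = subject_f3 - reference                            # quantitative comparison
```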

  In addition, according to the motion analysis apparatus of the present embodiment, by matching the times of the start point, a reference point, and the end point between the trajectory data of the subject model and the trajectory data of the reference model and then comparing the two, the differences in exercise form can be analyzed in more detail.
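  A sketch of that piecewise alignment, under the assumption that the start point, the reference point (for example, the moment of ball release), and the end point are given as sample indices; align_key_points and the linear resampling are illustrative, not the patent's stated method.

```python
import numpy as np

def resample(traj, n):
    """Linearly resample an (N, 3) trajectory to n samples."""
    t_old = np.linspace(0.0, 1.0, len(traj))
    t_new = np.linspace(0.0, 1.0, n)
    return np.stack(
        [np.interp(t_new, t_old, traj[:, k]) for k in range(3)], axis=1)

def align_key_points(traj, keys, ref_keys):
    """Time-normalize segment by segment so that the start point, the
    reference point, and the end point all coincide with the reference
    model's timing. keys / ref_keys: sample indices of those three points."""
    pieces = [resample(traj[a:b + 1], rb - ra + 1)
              for (a, b), (ra, rb) in zip(zip(keys, keys[1:]),
                                          zip(ref_keys, ref_keys[1:]))]
    # consecutive pieces share their boundary sample; drop the duplicates
    return np.concatenate([pieces[0]] + [p[1:] for p in pieces[1:]], axis=0)

# aligned = align_key_points(subject, keys=[0, 40, 99], ref_keys=[0, 55, 120])
# aligned has 121 samples, with the reference point landing at index 55.
```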

  The present invention is not limited to the present embodiment, and various modifications can be made without departing from the gist of the present invention.

  In the present embodiment, the analysis of a baseball pitching form has been described as an example. However, the motion analysis apparatus of the present embodiment can be used for various kinds of motion analysis, such as analysis of a baseball batting form, a tennis or golf swing, a bowling throwing form, or a fashion model's walking posture.

  In the present embodiment, the case of generating format 2 data in which only the size is normalized and format 3 data in which both size and time are normalized, and comparing them with the reference data, has been described as an example. However, data in which only the time is normalized (without size normalization) may be generated and compared with the reference data. Even in this way, differences between the exercise forms of the subject and the reference can be quantitatively analyzed.

  The present invention includes configurations that are substantially the same as the configurations described in the embodiments (for example, configurations that have the same functions, methods, and results, or configurations that have the same objects and effects). In addition, the invention includes a configuration in which a non-essential part of the configuration described in the embodiment is replaced. In addition, the present invention includes a configuration that exhibits the same operational effects as the configuration described in the embodiment or a configuration that can achieve the same object. Further, the invention includes a configuration in which a known technique is added to the configuration described in the embodiment.

1 motion analysis device, 2 subjects, 3 balls, 4 human body model, 10 motion sensor, 12 3-axis acceleration sensor, 14 3-axis gyro sensor, 20 synchronization unit, 30 processing unit, 31 data acquisition unit, 32 locus information generation unit, 33 Image generation unit, 34 Trajectory information comparison unit, 35 Display selection unit, 36 Tool information generation unit, 40 Communication unit, 50 Operation unit, 60 Storage unit, 62 Trajectory information storage unit, 64 Dictionary information storage unit, 70 Display unit, 100 human body mounting unit, 200 processing / control unit, 300 exercise equipment, 400 display area, 410 display area, 412 subject model, 420 display area, 422 reference model, 430 display area, 432 subject model, 434 reference model

Claims (12)

  1. An arithmetic processing device comprising:
    a trajectory information generation unit that generates, based on output data of a motion sensor, trajectory information of a subject model obtained by modeling a subject; and
    a trajectory information storage unit that stores trajectory information of a reference model,
    wherein the trajectory information generation unit
    reads the trajectory information of the reference model from the trajectory information storage unit and performs, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, normalization of size for matching the size of the subject model to the size of the reference model, and normalization of time for matching the operation time of the subject model to the operation time of the reference model.
  2. An arithmetic processing device comprising:
    a data acquisition unit that acquires output data of a motion sensor that detects movement of a subject;
    a trajectory information generation unit that generates, based on the output data, trajectory information of a subject model obtained by modeling the subject; and
    a trajectory information storage unit that stores trajectory information of a reference model,
    wherein the trajectory information generation unit
    reads the trajectory information of the reference model from the trajectory information storage unit and performs, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, normalization of time for matching the operation time of the subject model in the motion to be analyzed to the operation time of the reference model.
  3. The arithmetic processing device according to claim 1 or 2, wherein
    the trajectory information generation unit,
    in the time normalization, divides the motion to be analyzed into a plurality of motions and performs, for each of the motions, a process of matching the operation time of the subject model to the operation time of the reference model.
  4. The arithmetic processing device according to any one of claims 1 to 3, comprising:
    an image generation unit that, after the normalization is performed, generates an image of the trajectory information of the subject model and the trajectory information of the reference model and displays the image on a display unit.
  5. The arithmetic processing device according to any one of claims 1 to 4, comprising:
    a dictionary information storage unit that stores dictionary information;
    a trajectory information comparison unit that, after the trajectory information generation unit performs the normalization, compares the trajectory information of the subject model with the trajectory information of the reference model; and
    a display selection unit that selects information from the dictionary information based on a comparison result of the trajectory information comparison unit and causes a display unit to display the selected information.
  6. The arithmetic processing device according to any one of claims 1 to 5, comprising:
    a tool information generation unit that generates motion analysis information of an exercise tool based on output data of a motion sensor attached to the exercise tool.
  7. The arithmetic processing device according to any one of claims 1 to 6, wherein
    acceleration and angular velocity are detected using the output data of the motion sensor.
  8. A motion analysis apparatus comprising:
      a motion sensor attached to a subject; and
      an arithmetic processing device comprising a trajectory information generation unit that generates, based on output data of the motion sensor, trajectory information of a subject model obtained by modeling the subject, and a trajectory information storage unit that stores trajectory information of a reference model,
      wherein the trajectory information generation unit
      reads the trajectory information of the reference model from the trajectory information storage unit and performs, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, normalization of size for matching the size of the subject model to the size of the reference model, and normalization of time for matching the operation time of the subject model to the operation time of the reference model.
  9. A display method comprising:
    generating, based on output data from a motion sensor, trajectory information of a subject model obtained by modeling a subject;
    reading stored trajectory information of a reference model;
    performing, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, normalization of size for matching the size of the subject model to the size of the reference model, and normalization of time for matching the operation time of the subject model to the operation time of the reference model; and
    displaying the trajectory information of the subject model and the trajectory information of the reference model on which the normalization of size and the normalization of time have been performed.
  10. A display method comprising:
    generating, based on output data from a motion sensor, trajectory information of a subject model obtained by modeling a subject;
    reading stored trajectory information of a reference model;
    performing, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, normalization of time for matching the operation time of the subject model in the motion to be analyzed to the operation time of the reference model; and
    displaying the trajectory information of the subject model and the trajectory information of the reference model on which the normalization of time has been performed.
  11. A program for causing a computer to execute:
    a procedure of generating, based on output data from a motion sensor, trajectory information of a subject model obtained by modeling a subject;
    a procedure of reading stored trajectory information of a reference model; and
    a procedure of performing, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, normalization of size for matching the size of the subject model to the size of the reference model, and normalization of time for matching the operation time of the subject model to the operation time of the reference model.
  12. A program for causing a computer to execute:
    a procedure of generating, based on output data from a motion sensor, trajectory information of a subject model obtained by modeling a subject;
    a procedure of reading stored trajectory information of a reference model; and
    a procedure of performing, on at least one of the trajectory information of the subject model and the trajectory information of the reference model, normalization of time for matching the operation time of the subject model in the motion to be analyzed to the operation time of the reference model.
JP2010271630A 2010-12-06 2010-12-06 Arithmetic processing device, motion analysis device, display method and program Active JP5641222B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010271630A JP5641222B2 (en) 2010-12-06 2010-12-06 Arithmetic processing device, motion analysis device, display method and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2010271630A JP5641222B2 (en) 2010-12-06 2010-12-06 Arithmetic processing device, motion analysis device, display method and program

Publications (3)

Publication Number Publication Date
JP2012120579A JP2012120579A (en) 2012-06-28
JP2012120579A5 JP2012120579A5 (en) 2014-01-16
JP5641222B2 true JP5641222B2 (en) 2014-12-17

Family

ID=46502718

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010271630A Active JP5641222B2 (en) 2010-12-06 2010-12-06 Arithmetic processing device, motion analysis device, display method and program

Country Status (1)

Country Link
JP (1) JP5641222B2 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6051770B2 (en) * 2012-10-24 2016-12-27 セイコーエプソン株式会社 Sensor system and synchronization method
JP6187735B2 (en) * 2012-12-28 2017-08-30 セイコーエプソン株式会社 Analysis control device, motion analysis system, program, recording medium, and motion analysis method
JP6187734B2 (en) * 2012-12-28 2017-08-30 セイコーエプソン株式会社 Analysis control device, motion analysis system, program, recording medium, and motion analysis method
JP5803962B2 (en) 2013-03-22 2015-11-04 ソニー株式会社 Information processing apparatus, sensor apparatus, information processing system, and recording medium
EP3058551A4 (en) * 2013-10-20 2017-07-05 Oahu Group, LLC Method and system for determining object motion
EP3120256A4 (en) * 2014-03-17 2018-01-03 Core Sports Technology Group Method and system for delivering biomechanical feedback to human and object motion
JP2015186531A (en) * 2014-03-26 2015-10-29 国立大学法人 東京大学 Action information processing device and program
WO2016035464A1 (en) * 2014-09-04 2016-03-10 ソニー株式会社 Analysis method, system and analysis device
RU2679533C2 (en) * 2014-09-04 2019-02-11 Леомо, Инк. Information terminal device, motion data collection system and method of motion data collection
TWI530821B (en) * 2014-09-25 2016-04-21 中強光電股份有限公司 Head-mounted display system and operation method thereof
JP6583605B2 (en) 2014-12-12 2019-10-02 カシオ計算機株式会社 Exercise information generation apparatus, exercise information generation method, and exercise information generation program
KR101624595B1 (en) 2014-12-19 2016-05-26 아이디어링크 주식회사 Golf club and golf swing monitoring system
JPWO2016111069A1 (en) * 2015-01-05 2017-10-12 ソニー株式会社 Information processing apparatus, information processing method, and program
JP2018075071A (en) * 2016-11-07 2018-05-17 美津濃株式会社 Swing analysis apparatus, program for making computer analyze swing and swing analysis system
WO2019008657A1 (en) * 2017-07-04 2019-01-10 富士通株式会社 Information processing device, information processing program, and information processing method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3742701B2 (en) * 1997-01-27 2006-02-08 ブラザー工業株式会社 Choreography scoring equipment for karaoke
JP4264368B2 (en) * 2004-02-24 2009-05-13 日本ナレッジ株式会社 Practical skill analysis system and program
JP2006181014A (en) * 2004-12-27 2006-07-13 Fuji Photo Film Co Ltd Image analysis device and movement correction system

Also Published As

Publication number Publication date
JP2012120579A (en) 2012-06-28

Similar Documents

Publication Publication Date Title
JP6185053B2 (en) Combined score including fitness subscore and athletic subscore
US7264554B2 (en) Method and system for athletic motion analysis and instruction
US9046919B2 (en) Wearable user interface device, system, and method of use
US8589114B2 (en) Motion capture and analysis
EP0959444A1 (en) Method for following and imaging a subject's three-dimensional position and orientation, method for presenting a virtual space to a subject, and systems for implementing said methods
US9011293B2 (en) Method and system for monitoring and feed-backing on execution of physical exercise routines
US20060166738A1 (en) Method and system for golf swing analysis and training for putters
KR101549761B1 (en) Method and system for automated personal training
JP2004534583A (en) Automated method and system for golf club selection based on swing type
JP2012110359A (en) Motion analyzing apparatus
JP5870969B2 (en) Motion analysis apparatus and motion analysis program
US9656121B2 (en) Methods for analyzing and providing feedback for improved power generation in a golf swing
JP5704317B2 (en) Swing analysis device, swing analysis system, program, and swing analysis method
KR101817393B1 (en) Method and system to analyze sports motions using motion sensors of a mobile device
US20130018494A1 (en) System and method for motion analysis and feedback with ongoing dynamic training orientation determination
JP3656853B2 (en) Motion measuring device
AU2005201321B2 (en) Golf swing-diagnosing system
US20130171596A1 (en) Augmented reality neurological evaluation method
US8998717B2 (en) Device and method for reconstructing and analyzing motion of a rigid body
US8944939B2 (en) Inertial measurement of sports motion
Coleman et al. A three-dimensional examination of the planar nature of the golf swing
US9529011B2 (en) Flight time
KR20160022940A (en) Fatigue indices and uses therof
KR100772497B1 (en) Golf clinic system and application method thereof
KR20160054325A (en) Management system and the method for customized personal training

Legal Events

Date Code Title Description
2013-11-21 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
2013-11-21 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
2014-05-26 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
2014-06-04 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
2014-06-19 RD07 Notification of extinguishment of power of attorney (JAPANESE INTERMEDIATE CODE: A7427)
2014-08-01 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
2014-10-01 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
2014-10-14 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model (Ref document number: 5641222; Country of ref document: JP; JAPANESE INTERMEDIATE CODE: R150)
S531 Written request for registration of change of domicile (JAPANESE INTERMEDIATE CODE: R313531)
R350 Written notification of registration of transfer (JAPANESE INTERMEDIATE CODE: R350)