WO2023100246A1 - Technique recognition program, technique recognition method, and information processing device - Google Patents

Technique recognition program, technique recognition method, and information processing device

Info

Publication number
WO2023100246A1
Authority
WO
WIPO (PCT)
Prior art keywords
grip
exercise
technique
athlete
hand
Prior art date
Application number
PCT/JP2021/043871
Other languages
French (fr)
Japanese (ja)
Inventor
Takuya Sato (佐藤卓也)
Original Assignee
Fujitsu Limited (富士通株式会社)
Priority date
Filing date
Publication date
Application filed by Fujitsu Limited
Priority to PCT/JP2021/043871
Publication of WO2023100246A1

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities

Definitions

  • the present invention relates to a technique recognition program and the like.
  • there is a conventional technology that uses a learning model such as DL (Deep Learning) to score performances performed by competitors.
  • sensors, cameras, and the like are used to obtain sensing data of an athlete, input the sensing data into a trained learning model, and calculate skeletal information of the athlete.
  • a feature amount indicating the characteristics of the posture corresponding to a "skill" is calculated from the time-series skeleton information, the technique performed by the athlete is automatically recognized based on the time-series skeleton information and the feature amount, and the performance score is output.
  • the performance score is calculated as the sum of the D (Difficulty) score and the E (Execution) score.
  • the D score is a score calculated based on whether or not a technique is established.
  • the E-score is a score calculated by a deduction method according to the degree of perfection of the technique.
  • the computer executes the following processing.
  • the computer obtains sensing data obtained by sensing the athlete. Based on the sensing data, the computer generates time-series three-dimensional skeletal information of the athlete.
  • based on the combination of the athlete's grip on the bar at the end of the first exercise and the type of the second exercise following the first exercise, the computer determines the athlete's grip on the bar at the end of the second exercise.
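The determination described above can be sketched as a simple table lookup. This is a minimal illustration, not the patent's actual implementation; the grip names, exercise names, and table entries are assumptions introduced for the example:

```python
# Hypothetical sketch: the grip at the end of the current (second) exercise
# is looked up from the combination of the grip at the end of the previous
# (first) exercise and the type of the current exercise.
# All entries below are illustrative assumptions.
GRIP_TRANSITIONS = {
    # (grip at end of previous exercise, current exercise) -> grip at end
    ("el-grip", "Adler"): "el-grip",
    ("regular", "forward giant"): "regular",
    ("el-grip", "forward giant"): "el-grip",
}

def determine_grip(prev_grip: str, exercise: str) -> str:
    """Return the athlete's grip on the bar at the end of the current exercise."""
    return GRIP_TRANSITIONS[(prev_grip, exercise)]
```

In practice the disclosure replaces a fixed table with per-combination determination modules, but the input/output contract is the same.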
  • FIG. 1 is a diagram showing an example of a technique recognition system according to this embodiment.
  • FIG. 2 is a diagram (1) for explaining the effect of the information processing apparatus according to the embodiment.
  • FIG. 3 is a diagram (2) for explaining the effect of the information processing apparatus according to the embodiment.
  • FIG. 4 is a functional block diagram showing the configuration of the information processing apparatus according to this embodiment.
  • FIG. 5 is a diagram showing an example of the data structure of joint definition data according to this embodiment.
  • FIG. 6 is a diagram showing an example of the data structure of the joint position DB according to this embodiment.
  • FIG. 7 is a diagram showing an example of the data structure of a skeleton information DB.
  • FIG. 8 is a diagram showing an example of the data structure of motion dictionary data.
  • FIG. 9 is a diagram showing an example of the data structure of a grip determination table.
  • FIG. 10 is a diagram showing an example of the data structure of a grip determination result table.
  • FIG. 11 is a diagram showing an example of the data structure of technique dictionary data.
  • FIG. 12 is a diagram showing an example of the data structure of the skill determination result table.
  • FIG. 13 is a diagram (1) for explaining the processing of the correction unit;
  • FIG. 14 is a diagram (2) for explaining the processing of the correction unit;
  • FIG. 15 is a flow chart showing the processing procedure of the information processing apparatus according to this embodiment.
  • FIG. 16 is a flow chart showing the procedure of grip determination processing.
  • FIG. 17 is a flow chart showing the processing procedure of grip correction processing.
  • FIG. 18 is a diagram for supplementary explanation of the processing of the specifying unit.
  • FIG. 19 is a diagram illustrating an example of a hardware configuration of a computer that implements functions similar to those of an information processing apparatus.
  • FIG. 1 is a diagram showing an example of a technique recognition system according to this embodiment.
  • this technique recognition system has 3D (3-dimension) laser sensors 10a and 10b and an information processing device 100.
  • the information processing device 100 and the 3D laser sensors 10a and 10b are interconnected.
  • the 3D laser sensors 10a and 10b are sensors that perform 3D sensing on the player 5.
  • the 3D laser sensors 10a and 10b output distance image data as sensing results to the information processing device 100.
  • the 3D laser sensors 10a and 10b are collectively referred to as "3D laser sensor 10".
  • the distance image data includes a plurality of distance image frames, and each distance image frame is assigned a frame number that uniquely identifies the frame in ascending order.
  • One distance image frame includes distance information to each point on the athlete 5 sensed by the 3D laser sensor 10 at a certain timing.
  • the contestant 5 performs a predetermined performance to be scored in front of the 3D laser sensor 10 .
  • a case where the athlete 5 performs a horizontal bar routine will be described, but the same applies to other scored competitions in which techniques differ depending on how the bar is gripped.
  • the ways of gripping the bar include the regular grip (junte), the reverse grip (gyakute), and the el-grip (ōgyakute).
  • the regular grip is a normal grip with the back of the hand facing up.
  • the reverse grip is a grip twisted 180° outward from the regular grip.
  • the el-grip is a grip twisted 180° inward from the regular grip.
  • based on the distance image data acquired from the 3D laser sensor 10, the information processing device 100 generates time-series skeletal information of the athlete 5, and, based on the combination of the grip of the athlete 5 at the end of the previous exercise and the type of the current exercise, identifies the grip of the athlete 5 at the end of the current exercise.
  • the information processing device 100 identifies the current technique based on the combination of the type of the current exercise and the grip at the end of the current exercise. For example, the information processing apparatus 100 identifies the technique name as "forward giant" when the current exercise is a "forward-type giant" and the grip for the current exercise is the regular grip or the reverse grip. On the other hand, the information processing apparatus 100 identifies the technique name as "el-grip giant" when the current exercise is a "forward-type giant" and the grip for the current exercise is the el-grip.
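The technique identification described above can be illustrated as a lookup keyed on (exercise, grip). The keys and technique names below are assumptions chosen to match the example in the text, not data taken from the disclosure:

```python
# Illustrative technique dictionary: the final technique name follows from
# the exercise type and the grip at the end of the exercise.
TECHNIQUE_DICT = {
    ("forward-type giant", "regular"): "forward giant",
    ("forward-type giant", "reverse"): "forward giant",
    ("forward-type giant", "el-grip"): "el-grip giant",
}

def identify_technique(exercise: str, grip: str) -> str:
    """Return the technique name for an (exercise, grip) combination."""
    return TECHNIQUE_DICT[(exercise, grip)]
```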
  • FIGS. 2 and 3 are diagrams for explaining the effect of the information processing apparatus according to this embodiment.
  • FIG. 2 will be described.
  • in technique T1, the athlete 5 performs exercises in the order T1-1, T1-2, and T1-3.
  • Technique T1 is a C-difficulty technique called "Endo with full twist to one-arm el-grip".
  • in technique T2, the athlete 5 performs exercises in the order T2-1, T2-2, and T2-3.
  • Technique T2 is a D-difficulty technique called "Endo with full twist to el-grip". Comparing technique T1 and technique T2, the grip A1 in T1-3 differs from the grip A2 in T2-3.
  • a conventional apparatus cannot recognize the grip A1 and the grip A2 explained in FIG. 2, and therefore cannot distinguish technique T1 from technique T2. On the other hand, since the information processing apparatus 100 can recognize the grip A1 and the grip A2, it can accurately identify the "Endo with full twist to one-arm el-grip" and the "Endo with full twist to el-grip", respectively.
  • Technique T3 is an A-difficulty technique called "forward giant".
  • Technique T4 is a B-difficulty technique called "el-grip giant".
  • in FIG. 3, for the sake of convenience, the exercise movements of technique T3 are summarized in one picture, and the exercise movements of technique T4 are arranged in chronological order. The difference between technique T3 and technique T4 is the way the bar is gripped.
  • because the information processing apparatus 100 can recognize the difference in how the bar is gripped, it can correctly distinguish the "forward giant" from the "el-grip giant".
  • FIG. 4 is a functional block diagram showing the configuration of the information processing apparatus according to this embodiment.
  • the information processing apparatus 100 has a communication unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 150.
  • the communication unit 110 is connected to the 3D laser sensor 10.
  • the communication unit 110 acquires distance image data from the 3D laser sensor 10 and outputs the acquired distance image data to the control unit 150 .
  • the communication unit 110 may be connected to a camera (not shown) that captures an image of the competitor 5 and acquire image data from this camera.
  • the input unit 120 is an input device for inputting various types of information to the information processing device 100 .
  • the input unit 120 corresponds to a keyboard, mouse, touch panel, and the like.
  • the display unit 130 is a display device that displays information on the display screen that is output from the control unit 150 .
  • the display unit 130 corresponds to a liquid crystal display, a touch panel, or the like.
  • the storage unit 140 includes a distance image DB (Data Base) 141, joint definition data 142, a joint position DB 143, a skeleton information DB 144, motion dictionary data 145, a grip determination table 146, a grip determination result table 147, technique dictionary data 148, and a technique determination result table 149.
  • the storage unit 140 corresponds to semiconductor memory devices such as RAM (Random Access Memory), ROM (Read Only Memory), flash memory, and storage devices such as HDD (Hard Disk Drive).
  • the distance image DB 141 is a DB that stores distance image data acquired from the 3D laser sensor 10 .
  • the distance image DB 141 associates frame numbers with distance image frames.
  • the frame number is a number that uniquely identifies each distance image frame, and is assigned in ascending order.
  • a distance image frame is a frame included in distance image data sensed by the 3D laser sensor 10 .
  • the joint definition data 142 defines each joint position of the athlete (athlete 5).
  • FIG. 5 is a diagram showing an example of the data structure of joint definition data according to this embodiment.
  • this joint definition data 142 stores information in which each joint identified by a known skeleton model is numbered.
  • the right shoulder joint SHOULDER_RIGHT
  • the left elbow joint ELBOW_LEFT
  • the left knee joint KNEE_LEFT
  • the right hip joint HIP_RIGHT
  • the X coordinate of the right elbow joint No. 8 may be described as X8, the Y coordinate as Y8, and the Z coordinate as Z8.
  • the joint position DB 143 is a DB that stores the position data of each joint of the player 5 generated based on the distance image data of the 3D laser sensor 10.
  • FIG. 6 is a diagram showing an example of the data structure of the joint position DB according to this embodiment. As shown in FIG. 6, this joint position DB 143 associates a frame number with three-dimensional coordinates "X0, Y0, Z0, . . . , X17, Y17, Z17" of each joint.
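The joint position DB layout described above (a frame number associated with X0, Y0, Z0, ..., X17, Y17, Z17) can be sketched as a mapping from frame number to 18 coordinate triples. The type names and helper function are assumptions for illustration:

```python
from typing import Dict, List, Tuple

# One (X, Y, Z) triple per joint, indexed by joint number 0..17
# as defined in the joint definition data.
JointPositions = List[Tuple[float, float, float]]

# frame number -> joint coordinates for that frame
joint_position_db: Dict[int, JointPositions] = {}

def register_frame(frame_no: int, coords: JointPositions) -> None:
    """Register one frame's joint coordinates in the joint position DB."""
    assert len(coords) == 18, "one (X, Y, Z) triple per joint 0..17"
    joint_position_db[frame_no] = coords
```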
  • FIG. 6 shows the time-series changes of each joint in the distance image data.
  • the skeleton information DB 144 is a DB that stores the athlete's skeleton information generated based on the distance image data.
  • FIG. 7 is a diagram showing an example of the data structure of a skeleton information DB. As shown in FIG. 7, the skeleton information DB 144 associates frame numbers with skeleton information. The explanation about the frame number is the same as the explanation given for the distance image DB 141 .
  • the skeletal information is data indicating the skeletal structure of the player 5 estimated by connecting joint positions.
  • the motion dictionary data 145 is dictionary data that defines exercises.
  • an exercise may correspond to, for example, a technique narrowed down based on feature amounts other than the grip.
  • the technique corresponding to an exercise includes a plurality of technique candidates.
  • for example, the exercise "forward-type giant" includes the techniques "forward giant", "el-grip giant", and the like.
  • FIG. 8 is a diagram showing an example of the data structure of the motion dictionary data.
  • the motion dictionary data 145 associates each motion with a plurality of basic motions.
  • a name and a feature amount are set for each basic motion.
  • the feature amount is a value indicating the feature of the movement of the player 5 calculated from the time-series skeleton information.
  • the feature amount is time-series changes in joint positions.
  • an exercise is specified by a combination of basic motions.
  • Exercises are not limited to the above examples, and for example, a specific action may be defined as an exercise regardless of the technique defined by the scoring rules, and a grip may be recognized for each exercise.
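The dictionary-based identification above (a sequence of basic motions resolving to one exercise) can be sketched as follows. The basic-motion names and combinations are invented for the example and are not taken from the disclosure:

```python
# Hypothetical motion dictionary: a tuple of recognized basic motions
# maps to the exercise it constitutes. All entries are illustrative.
MOTION_DICTIONARY = {
    ("swing forward", "full turn", "regrasp"): "Adler",
    ("swing backward", "full circle"): "forward-type giant",
}

def identify_exercise(basic_motions: tuple) -> str:
    """Match a recognized basic-motion sequence against the dictionary."""
    return MOTION_DICTIONARY.get(basic_motions, "unknown")
```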
  • the grip determination table 146 is a table for specifying a module corresponding to the combination of the grip at the end of the previous exercise and the current exercise.
  • the specified module specifies the grip corresponding to the current exercise.
  • FIG. 9 is a diagram showing an example of the data structure of the grip determination table.
  • the grip determination table 146 associates conditions, modules, and priorities.
  • the conditions include the previous grip and exercise.
  • the previous grip indicates the grip at the end of the previous exercise. If the previous gripping method is "any", it indicates that any gripping method is acceptable.
  • the module defines a determination policy for each grip style. For example, if the grip at the end of the previous exercise was "any" and the current exercise was "Adler", the module would be "first module".
  • although FIG. 9 shows the first to fourth modules, the present invention is not limited to these.
  • Priority indicates the priority of the module, and the smaller the number, the higher the priority. Modules with higher priority are more likely to output values than modules with lower priority.
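The table lookup with the "any" wildcard and a numeric priority can be sketched as below. The rows, module names, and the tiebreak-by-smallest-number rule applied here are illustrative assumptions consistent with the description above:

```python
# Sketch of the grip determination table: each row pairs a condition
# (previous grip, exercise) with a module and a priority. "any" matches
# every previous grip; among matching rows, the smaller priority number
# (higher priority) wins. Rows are illustrative.
GRIP_DETERMINATION_TABLE = [
    # (previous grip, exercise, module, priority)
    ("any", "Adler", "first module", 1),
    ("el-grip", "forward-type giant", "second module", 3),
    ("any", "inverted twist", "fourth module", 1),
]

def select_module(prev_grip: str, exercise: str) -> str:
    """Select the determination module for the current exercise."""
    matches = [
        (prio, mod)
        for cond_grip, cond_ex, mod, prio in GRIP_DETERMINATION_TABLE
        if cond_ex == exercise and cond_grip in ("any", prev_grip)
    ]
    # smaller number = higher priority
    return min(matches)[1]
```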
  • the first module outputs preset likelihoods for the left-hand and right-hand grips. For example, due to the characteristics of the movement, the grip after an Adler is always the el-grip. In this case, the first module outputs (0, 0, 1) as the likelihood of the left-hand grip, in the order reverse grip, regular grip, el-grip, and likewise outputs (0, 0, 1) as the likelihood of the right-hand grip.
  • the second module determines whether or not a grip change occurred, based on the movement of the wrists and arms and the balance of the whole body, estimated from the feature amount corresponding to the current exercise.
  • for example, when the second module determines that a grip change occurred, it outputs the grip likelihood (0.7, 0.1, 0.2) in the order reverse grip, regular grip, el-grip; because the previous grip was "(both hands) el-grip", the likelihood of the reverse grip is set high.
  • when the second module determines that no grip change occurred, it outputs the grip likelihood (0.2, 0.1, 0.7) in the order reverse grip, regular grip, el-grip; because the previous grip was "(both hands) el-grip", the likelihood of the el-grip is set high.
  • the third module outputs the same likelihood as that of the previous exercise. For example, if (0.7, 0.1, 0.2) was output by the module for the previous exercise, the likelihood for the current exercise is also (0.7, 0.1, 0.2).
  • the fourth module (inverted twist determination) will be explained.
  • for the hand that serves as the axis, the grip is determined by the motion and the constraints of the structure of the human body, so a definitive likelihood is output.
  • the fourth module outputs (0, 0, 1) as the likelihood of the left-hand grip, in the order reverse grip, regular grip, el-grip.
  • for the hand that is not the axis, the fourth module estimates which grip was taken based on characteristics such as the trajectory of the arm, which side the bar was gripped from, and whether the hand slid along the bar after gripping, and outputs the likelihood.
  • DL may be used to learn the relationship between motion features and the likelihood of the non-axis hand's grip, and the learned model may be used as the fourth module.
  • the grip determination result table 147 is a table that holds determination results of how to grip.
  • FIG. 10 is a diagram showing an example of the data structure of a grip determination result table. As shown in FIG. 10, this grip determination result table 147 has order, motion, left hand likelihood, right hand likelihood, left hand grip, and right hand grip.
  • the order indicates the order of the exercises performed.
  • an exercise, as described above, represents a plurality of technique candidates.
  • the left-hand likelihood includes the likelihood of each of the reverse grip, the regular grip, and the el-grip for the left hand.
  • the right-hand likelihood includes the likelihood of each of the reverse grip, the regular grip, and the el-grip for the right hand.
  • the left hand grip is a way of gripping the left hand identified from the likelihood.
  • the right hand grip is a way of gripping the right hand identified from the likelihood.
  • for example, the likelihood of the left hand at the end of the exercise "Adler" is (0, 0, 1) in the order reverse grip, regular grip, el-grip, and the "el-grip", which has the maximum likelihood, is set as the left hand grip.
  • likewise, the likelihood of the right hand at the end of the exercise "Adler" is (0, 0, 1) in the order reverse grip, regular grip, el-grip, and the "el-grip" is set as the right hand grip.
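The maximum-likelihood selection above reduces to an argmax over the three-element likelihood tuple. A minimal sketch, with the grip ordering taken from the description:

```python
# Likelihoods are ordered (reverse grip, regular grip, el-grip);
# the grip with the largest likelihood is selected.
GRIP_NAMES = ("reverse grip", "regular grip", "el-grip")

def pick_grip(likelihood: tuple) -> str:
    """Return the grip name with the maximum likelihood."""
    return GRIP_NAMES[max(range(3), key=lambda i: likelihood[i])]
```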
  • the left hand grip and right hand grip in the grip determination result table 147 are the grips at the end of the exercise, but both the grip at the start and the grip at the end may be retained.
  • the technique dictionary data 148 is dictionary data that defines techniques corresponding to exercise and how to grip at the end of the exercise.
  • FIG. 11 is a diagram showing an example of the data structure of technique dictionary data. As shown in FIG. 11, this technique dictionary data 148 associates exercises with left hand grips, right hand grips, and techniques. The explanations regarding exercise, left hand grip, and right hand grip are the same as those described with reference to FIG. 10 .
  • the technique shown in FIG. 11 is the name of the technique that is finally specified.
  • the technique determination result table 149 is a table that holds the determination results of techniques performed by the player 5 .
  • FIG. 12 is a diagram showing an example of the data structure of the skill determination result table. As shown in FIG. 12, the technique determination result table 149 associates the order of techniques with the techniques.
  • the control unit 150 has an acquisition unit 151 , a generation unit 152 , a calculation unit 153 , a specification unit 154 , a correction unit 155 , a technique determination unit 156 and an evaluation unit 157 .
  • the control unit 150 can be realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like.
  • the control unit 150 can also be realized by hard-wired logic such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • the acquisition unit 151 acquires distance image data from the 3D laser sensor 10 via the communication unit 110, and registers the acquired distance image data (frame number, distance image frame) in the distance image DB 141.
  • the generation unit 152 generates skeleton information by performing processing for extracting the position data of each joint of the player 5 in time series and processing for generating skeleton information in time series.
  • the generation unit 152 compares the distance image frame of the distance image DB 141 and the joint definition data 142 to identify the type of each joint included in the distance image frame and the three-dimensional coordinates of the joint.
  • the generation unit 152 registers information in which the frame number and the three-dimensional coordinates of each joint type are associated with each other in the joint position DB 143 .
  • the generation unit 152 repeatedly executes the above process for each frame number.
  • the generator 152 generates skeleton information corresponding to each frame number based on the joint position DB 143 .
  • Generation unit 152 stores the generated skeleton information in skeleton information DB 144 in association with the frame number.
  • the generation unit 152 generates skeleton information by connecting the three-dimensional coordinates of each joint stored in the joint position DB 143 based on the connection relationship defined in the joint definition data 142 .
  • the calculation unit 153 calculates feature amounts based on the time-series skeleton information stored in the skeleton information DB 144 .
  • the feature amount calculated by the calculator 153 is a time-series change in each joint position, and indicates the feature of the movement of the player 5 .
  • the calculation unit 153 calculates the feature quantity for each break in the movement at which the basic movement is switched. For example, the calculation unit 153 determines the timing at which the change in each joint position is less than the threshold for a certain period of time as a discontinuity. The calculation unit 153 may identify a break in exercise using the technology disclosed in “International Publication No. 2019/116496”. The calculation unit 153 outputs the feature amount for each break to the identification unit 154 . In the following description, the feature amount for each break is simply referred to as "feature amount”.
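The break detection described above (a break where joint-position change stays below a threshold for a certain period) can be sketched as follows. The threshold and hold-duration values are assumptions for illustration, not parameters from the disclosure:

```python
# A frame is treated as a break when the change in every joint position
# stays below `threshold` for `hold` consecutive frames.
def find_breaks(frames, threshold=0.01, hold=5):
    """frames: list of per-frame joint lists, each joint an (x, y, z) tuple.
    Returns the indices at which a break is detected."""
    breaks, still = [], 0
    for i in range(1, len(frames)):
        # largest per-coordinate movement of any joint between frames
        moved = max(
            abs(a - b)
            for prev_j, cur_j in zip(frames[i - 1], frames[i])
            for a, b in zip(prev_j, cur_j)
        )
        still = still + 1 if moved < threshold else 0
        if still == hold:
            breaks.append(i)
    return breaks
```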
  • the specifying unit 154 is a processing unit that specifies how the athlete 5 grips the bar.
  • the identifying unit 154 executes a process of identifying an exercise and a process of identifying a grip style.
  • the process of identifying exercise performed by the identifying unit 154 will be described.
  • the specifying unit 154 compares each feature quantity with the feature quantity of the basic motion in the motion dictionary data 145 to specify the basic motion corresponding to each feature quantity.
  • the identifying unit 154 compares the specified combinations of basic motions with the basic motions in the motion dictionary data 145 to specify motions corresponding to the basic motions.
  • the specifying unit 154 repeats the above process to specify motions in order.
  • the identifying unit 154 identifies the way of gripping.
  • in the following, the (k-1)-th exercise is referred to as the "previous exercise", and the k-th exercise is referred to as the "current exercise".
  • the identifying unit 154 uses the determined module to identify the likelihood of each gripping style, and identifies the gripping style with the highest likelihood as the gripping style corresponding to the current exercise.
  • the identifying unit 154 registers the k-th entry, containing the current exercise, left-hand likelihood, right-hand likelihood, left hand grip, and right hand grip, in the grip determination result table 147.
  • the specifying unit 154 registers the gripping manner at the start and the gripping manner at the end for left hand grip and right hand grip.
  • the specifying unit 154 sequentially registers each piece of information in the grip determination result table 147 by repeatedly executing the above process for the following k+1st, k+2nd, . . . , k+nth.
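The sequential registration above can be sketched as a loop that carries the grip from one exercise into the module for the next. The helper callback and grip names are hypothetical, standing in for the module machinery described earlier:

```python
# Minimal end-to-end loop: for each exercise, run the (assumed) module to
# get a likelihood triple, take the maximum-likelihood grip, and carry it
# forward as the "previous grip" for the next exercise.
def determine_grips(exercises, initial_grip, run_module):
    """run_module(exercise, prev_grip) -> likelihood triple
    (reverse grip, regular grip, el-grip)."""
    GRIPS = ("reverse grip", "regular grip", "el-grip")
    grip, results = initial_grip, []
    for ex in exercises:
        lh = run_module(ex, grip)
        grip = GRIPS[max(range(3), key=lambda i: lh[i])]
        results.append((ex, lh, grip))  # one record per exercise
    return results
```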
  • the correction unit 155 refers to the grip determination result table 147 registered by the processing of the identification unit 154 described above, determines whether the state transition from the previous grip to the current grip contradicts the structure of the human body, and, if there is a contradiction, calculates the likelihood of each grip again.
  • FIGS. 13 and 14 are diagrams for explaining the processing of the correction unit.
  • FIG. 13 will be described.
  • the grip determination result table 147A shown in FIG. 13 is used (illustration omitted).
  • the correction unit 155 scans two consecutive records in the grip determination result table 147A and identifies records in which the grip at the end of the previous exercise differs from the grip at the start of the subsequent exercise. The correction unit 155 identifies the modules that produced the left-hand and right-hand likelihoods set in the identified records, and selects the module with the larger priority value. The correction unit 155 uses that module to re-specify the likelihood while changing its conditions.
  • the correcting unit 155 identifies the “k+1st” and “k+2nd” records.
  • based on the grip determination table 146, the correction unit 155 identifies the module ("second module") used to identify the (k+1)-th left-hand and right-hand likelihoods, and the module ("fourth module") used to identify the (k+2)-th left-hand and right-hand likelihoods.
  • the correction unit 155 selects the second module because the priority of the "second module” is "3" and the priority of the "fourth module” is "1".
  • the correction unit 155 changes the conditions for the second module and identifies the left-hand and right-hand likelihoods again. For example, suppose the likelihoods "0.2, 0.1, 0.7" were output because the second module's previous condition was "no grip change". In this case, the correction unit 155 changes the condition to "grip change occurred" and specifies the likelihoods as "0.7, 0.1, 0.2". The left hand grip and right hand grip are thereby updated to "reverse grip". Through this processing by the correction unit 155, the grip determination result table 147B is updated to the grip determination result table 147C, and the contradiction is resolved.
  • the correction unit 155 may instead select an output result that eliminates the contradiction. For example, in FIG. 13, the grip at the end of the previous exercise is the "el-grip" while the grip at the start of the current exercise is the "reverse grip", which is a contradiction. In this case, among the output results of the selected module, the output result that maximizes the likelihood of the el-grip is selected.
  • the correcting unit 155 skips the above processing when there is no contradiction in the grip determination result table 147 .
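The contradiction check that triggers the correction can be sketched directly: two consecutive records conflict when the grip at the end of the earlier exercise differs from the grip at the start of the later one. The record field names are assumptions:

```python
# Scan consecutive record pairs and report the index of each earlier
# record whose ending grip does not match the next record's starting grip.
def find_contradictions(records):
    """records: list of dicts with 'start_grip' and 'end_grip' keys."""
    return [
        i
        for i in range(len(records) - 1)
        if records[i]["end_grip"] != records[i + 1]["start_grip"]
    ]
```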
  • the technique determination unit 156 identifies the technique performed by the player 5 based on the combination of the exercise and the grip method (left hand grip, right hand grip) registered in the grip determination result table 147 and the technique dictionary data 148. Then, the identified results are registered in the skill determination result table 149 in order.
  • the evaluation unit 157 grades the score of the competitor 5 based on the skill determination result table 149. For example, the evaluation unit 157 calculates the D score of the competitor 5 by totaling the points corresponding to the difficulty levels of the techniques (successful techniques) registered in the technique determination result table 149 . Further, the evaluation unit 157 may further evaluate the degree of perfection of the technique, etc., based on the skeleton information registered in the skeleton information DB 144, and further calculate the E-score based on the deduction method. The evaluation unit 157 outputs the evaluation result to the display unit 130 for display.
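The D-score tally described above can be sketched as a sum over difficulty values. The letter-to-points mapping below follows the conventional men's Code of Points values and is an assumption, not data from the disclosure:

```python
# Assumed difficulty values (per the men's Code of Points convention).
DIFFICULTY_VALUE = {"A": 0.1, "B": 0.2, "C": 0.3, "D": 0.4, "E": 0.5}

def d_score(recognized: list) -> float:
    """Total the points for the difficulty letters of successful techniques."""
    return round(sum(DIFFICULTY_VALUE[d] for d in recognized), 1)
```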
  • FIG. 15 is a flow chart showing the processing procedure of the information processing apparatus according to this embodiment.
  • the acquisition unit 151 of the information processing device 100 acquires distance image data from the 3D laser sensor 10 and registers it in the distance image DB 141 (step S101).
  • the generation unit 152 of the information processing device 100 generates time-series skeleton information based on the distance image DB 141 and the joint definition data 142, and registers it in the skeleton information DB 144 (step S102).
  • the calculation unit 153 of the information processing apparatus 100 calculates feature amounts based on the skeleton information registered in the skeleton information DB 144 (step S103).
  • the identifying unit 154 of the information processing device 100 identifies exercise based on the feature amount and the exercise dictionary data 145 (step S104).
  • the identifying unit 154 executes grip determination processing (step S105).
  • the correction unit 155 of the information processing device 100 executes grip correction processing (step S106).
  • the technique determination unit 156 of the information processing device 100 determines a technique based on the grip determination result table 147 and the technique dictionary data 148, and registers it in the technique determination result table 149 (step S107).
  • the evaluation unit 157 of the information processing device 100 grades the score based on the skill determination result table 149 (step S108), and causes the display unit 130 to display the score (step S109).
  • FIG. 16 is a flow chart showing the procedure of grip determination processing.
  • based on the grip determination table 146, the identifying unit 154 identifies the module corresponding to the k-th exercise and the previous grip (step S202). The identifying unit 154 reads out the identified module (step S203).
  • the identifying unit 154 identifies the gripping style corresponding to the maximum likelihood among the likelihoods of the gripping styles output by the module as the gripping style of the k-th exercise, and registers it in the grip determination result table 147 (step S204).
  • step S205 If the k-th movement is the last movement (step S205, Yes), the identification unit 154 ends the grip determination process.
  • step S205 If the k-th motion is not the last exercise (step S205, No), the specifying unit 154 adds 1 to k (step S206), and proceeds to step S202.
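The loop of steps S201 to S206 can be sketched as follows, assuming a module is a function returning a likelihood per grip candidate. The table contents, the module, and its likelihood values are illustrative assumptions, not data from the embodiment.

```python
# Hypothetical sketch of the grip determination processing of FIG. 16
# (steps S202-S206). The table keys, the module, and its likelihoods are
# assumptions made for illustration only.

def first_module(_exercise):
    # Stand-in module: returns a likelihood for each grip candidate.
    return {"overhand": 0.7, "reverse": 0.2, "el-grip": 0.1}

GRIP_DETERMINATION_TABLE = {  # (previous grip, exercise) -> module
    ("any", "Adler"): first_module,
}

def lookup_module(previous_grip, exercise):
    table = GRIP_DETERMINATION_TABLE
    return table.get((previous_grip, exercise)) or table.get(("any", exercise))

def determine_grips(exercises, initial_grip="overhand"):
    """Walk the exercises in order, choosing the maximum-likelihood grip."""
    previous_grip, results = initial_grip, []
    for exercise in exercises:                           # k = 1, 2, ...
        module = lookup_module(previous_grip, exercise)  # S202-S203
        likelihoods = module(exercise)
        grip = max(likelihoods, key=likelihoods.get)     # S204
        results.append({"exercise": exercise, "grip": grip,
                        "likelihoods": likelihoods})
        previous_grip = grip                             # feeds the next lookup
    return results

print(determine_grips(["Adler"])[0]["grip"])  # overhand
```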
  • FIG. 17 is a flow chart showing the processing procedure of grip correction processing.
• The correction unit 155 of the information processing device 100 refers to the grip determination result table 147 and identifies records in which the grip at the start and the grip at the end contradict each other (step S301).
• The correction unit 155 extracts, in order of priority value, the modules used to calculate the likelihoods contained in the contradictory records (step S302).
• The correction unit 155 selects the module with the k-th highest priority value from among the extracted modules (step S304).
• The correction unit 155 selects, from among the results that the selected module can output, an output that eliminates the contradiction between the grip at the start and the grip at the end (step S305).
• The correction unit 155 updates the grip determination result table 147 (step S306).
• If the k-th module is the last extracted module (step S307, Yes), the correction unit 155 ends the grip correction processing.
• If the k-th module is not the last extracted module (step S307, No), the correction unit 155 adds 1 to k (step S308) and returns to step S304.
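The grip correction loop can be sketched as follows, under the assumption that a contradiction means the grip at the end of one exercise differing from the grip at the start of the next, and that each record carries the alternative outputs (with priorities) of the modules that produced it. All field names are hypothetical.

```python
# Hypothetical sketch of the grip correction processing of FIG. 17. A
# contradiction is interpreted here as the grip at the end of one exercise
# differing from the grip at the start of the next; each record is assumed
# to carry the alternative module outputs with their priorities.

def correct_grips(records):
    for i in range(len(records) - 1):
        current, nxt = records[i], records[i + 1]
        if current["end_grip"] == nxt["start_grip"]:
            continue                                  # no contradiction (S301)
        # S302-S305: try the alternative outputs in descending priority and
        # keep the first one that removes the contradiction.
        candidates = sorted(current["module_outputs"],
                            key=lambda c: c["priority"], reverse=True)
        for candidate in candidates:
            if candidate["end_grip"] == nxt["start_grip"]:
                current["end_grip"] = candidate["end_grip"]  # S306
                break
    return records

records = [
    {"end_grip": "overhand", "start_grip": "overhand",
     "module_outputs": [{"end_grip": "reverse", "priority": 2},
                        {"end_grip": "el-grip", "priority": 1}]},
    {"end_grip": "reverse", "start_grip": "reverse", "module_outputs": []},
]
corrected = correct_grips(records)
print(corrected[0]["end_grip"])  # reverse
```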
• Based on the distance image data acquired from the 3D laser sensor 10, the information processing device 100 generates time-series skeletal information of the athlete 5, and identifies the grip of the athlete 5 at the end of the current exercise based on the combination of the grip of the athlete 5 at the end of the previous exercise and the type of the current exercise. As described with reference to FIGS. 2 and 3, this makes it possible to distinguish between different ways of gripping the bar and improve the accuracy of technique recognition.
• The information processing device 100 identifies a module based on the combination of the grip of the athlete 5 at the end of the previous exercise and the current exercise, and calculates the likelihoods of a plurality of grips using the identified module.
• The information processing device 100 identifies the grip in the current exercise based on the calculated likelihoods. As a result, the recognition accuracy of the grip of the athlete 5 can be improved.
• The information processing device 100 determines whether the determined grips are inconsistent with each other and, if so, corrects the likelihoods. As a result, the recognition accuracy of the grip of the athlete 5 can be further improved.
• The information processing device 100 identifies the technique performed by the athlete 5 based on the combination of the exercise and the grip, and calculates a score for evaluating the gymnastics performance based on the identified technique. This makes it possible to assist the referees' judgment in gymnastics.
• The embodiment described above is an example, and the information processing device 100 may further execute other processing.
• Other processing 1 and 2 of the information processing device 100 will be described below.
• Although the information processing device 100 acquires distance image data from the 3D laser sensor 10 and generates skeleton information, the embodiment is not limited to this; skeleton information may be generated based on image data captured by an RGB camera. In this case, the information processing device 100 uses a trained learning model that receives image data and outputs skeleton information.
• The identifying unit 154 of the information processing device 100 basically identifies the grip of the athlete 5 by executing the processing described above (the processing according to the procedure described with reference to FIG. 16). Here, a supplementary example of the processing of the identifying unit 154 will be described.
• Elements that change the grip of the athlete 5 include "twisting the body", "twisting the arm by passing the legs between the arms", and "changing the grip".
• Among the body-twisting elements, there are those whose grip can be determined from the amount of twist about the axis, and those whose grip changes depending on how the released hand regrasps the bar.
• For body-twisting elements whose grip can be determined from the amount of twist about the axis, the identifying unit 154 identifies the grip by motion recognition.
• For body-twisting elements whose grip changes depending on how the released hand regrasps the bar, the identifying unit 154 identifies the grip by grip recognition.
• For elements in which the arm is twisted by passing the legs between the arms, the identifying unit 154 identifies the grip by motion recognition.
• For grip-change elements, the identifying unit 154 identifies the grip by motion recognition or grip recognition.
• For example, when the athlete 5 holds the bar and twists the body once (360°) around the gripping hand, the identifying unit 154 identifies that the grip of that hand has changed from the overhand grip to the reverse grip, or from the reverse grip to the overhand grip.
• Similarly, depending on the direction of the twist, the identifying unit 154 identifies a change from the reverse grip to the overhand grip or from the overhand grip to the reverse grip, and a change from the overhand grip to the el-grip or from the el-grip to the overhand grip.
• When the arm is twisted by passing the legs between the arms, the identifying unit 154 identifies that the grip has changed from the reverse grip to the el-grip.
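The grip transitions described in the bullets above can be summarized as a small transition map. The pairing of each transition with a twist direction is an assumption made for the sketch; in the embodiment, the transition is derived from motion recognition.

```python
# Illustrative transition map for the grip changes described above. The
# association of twist direction with each transition is an assumption for
# this sketch, not a rule stated in the embodiment.
TWIST_TRANSITIONS = {
    # (original grip, twist direction) -> grip after a 360 degree twist
    ("overhand", "outward"): "reverse",
    ("reverse", "outward"): "overhand",
    ("overhand", "inward"): "el-grip",
    ("el-grip", "inward"): "overhand",
}

def grip_after_twist(original, direction):
    """Grip of the axis hand after twisting the body once around it."""
    return TWIST_TRANSITIONS.get((original, direction), original)

def grip_after_leg_pass(original):
    """Passing the legs between the arms twists the arm: reverse -> el-grip."""
    return "el-grip" if original == "reverse" else original

print(grip_after_twist("overhand", "outward"))  # reverse
print(grip_after_leg_pass("reverse"))           # el-grip
```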
• Next, an example of grip recognition performed by the identifying unit 154 will be described.
• When the released hand regrasps the bar, the resulting grip depends on how the hand grasps it. For example, since the direction in which the arm is twisted is opposite between the el-grip and the reverse grip, the elbow of the athlete 5 tends to open outward when grasping with the el-grip, and tends to move inward when grasping with the reverse grip.
• Based on such characteristics, the identifying unit 154 may identify the grip. The identifying unit 154 may also perform grip recognition using a learning model trained on these characteristics.
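A minimal heuristic along the lines of the elbow characteristic described above might look as follows. The signed "elbow offset" feature, the threshold, and the function name are assumptions for the sketch, not part of the embodiment.

```python
# Hypothetical heuristic for the grip recognition described above: when the
# free hand regrasps the bar, an elbow opening outward suggests the el-grip
# and an elbow moving inward suggests the reverse grip. The feature and
# threshold are illustrative assumptions.

def recognize_regrasp(elbow_offsets, threshold=0.05):
    """elbow_offsets: signed lateral elbow positions relative to the shoulder
    over the frames around the regrasp (positive = outward)."""
    mean_offset = sum(elbow_offsets) / len(elbow_offsets)
    if mean_offset > threshold:
        return "el-grip"     # elbow tends to open outward
    if mean_offset < -threshold:
        return "reverse"     # elbow tends to move inward
    return "uncertain"       # left to the correction unit 155

print(recognize_regrasp([0.12, 0.10, 0.15]))     # el-grip
print(recognize_regrasp([-0.08, -0.11, -0.09]))  # reverse
```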
• The identifying unit 154 detects that the grip has changed by motion recognition and grip recognition; for example, when a specific hand is released in a specific state, it identifies that the grip has changed (for example, that the grip of both hands has been reversed).
• FIG. 18 is a diagram for supplementary explanation of the processing of the identifying unit.
• The technique shown in FIG. 18 is an "endo with one twist".
• The athlete 5 performs the exercises in the order of T3-1, T3-2, and T3-3.
• When identifying the grip A3, the identifying unit 154 performs the following processing for the "gripping hand" and the "hand in the air".
• The identifying unit 154 identifies the grip after the twist based on "whether the hand is the left or right hand", "in which direction the body is twisted", and "what the original grip is".
• The identifying unit 154 determines "whether the hand is the left or right hand" from the skeleton information.
• The identifying unit 154 identifies "in which direction the body is twisted" by motion recognition.
• For "what the original grip is", the identifying unit 154 uses the determination result of the module described above, but this result may be incorrect.
• The identifying unit 154 identifies the grip after the hand in the air regrasps the bar based on the shape and trajectory of the arm (whether the elbow is inside or outside, from which side of the bar the hand grasps, and so on), but this result may also be incorrect.
• Even in such cases, the correct grip is identified by the processing of the correction unit 155 described above.
  • FIG. 19 is a diagram illustrating an example of a hardware configuration of a computer that implements functions similar to those of an information processing apparatus.
• The computer 200 has a CPU 201 that executes various arithmetic processes, an input device 202 that receives data input from the user, and a display 203.
• The computer 200 also has a communication device 204 and an interface device 205 for exchanging data with other computers via a wired or wireless network.
• The computer 200 also has a RAM 206 that temporarily stores various information, and a hard disk device 207. The devices 201 to 207 are connected to a bus 208.
• The hard disk device 207 has an acquisition program 207a, a generation program 207b, a calculation program 207c, an identification program 207d, a correction program 207e, a technique determination program 207f, and an evaluation program 207g.
• The CPU 201 reads out the acquisition program 207a, the generation program 207b, the calculation program 207c, the identification program 207d, the correction program 207e, the technique determination program 207f, and the evaluation program 207g, and loads them into the RAM 206.
• The acquisition program 207a functions as an acquisition process 206a.
• The generation program 207b functions as a generation process 206b.
• The calculation program 207c functions as a calculation process 206c.
• The identification program 207d functions as an identification process 206d.
• The correction program 207e functions as a correction process 206e.
• The technique determination program 207f functions as a technique determination process 206f.
• The evaluation program 207g functions as an evaluation process 206g.
• The processing of the acquisition process 206a corresponds to the processing of the acquisition unit 151.
• The processing of the generation process 206b corresponds to the processing of the generation unit 152.
• The processing of the calculation process 206c corresponds to the processing of the calculation unit 153.
• The processing of the identification process 206d corresponds to the processing of the identifying unit 154.
• The processing of the correction process 206e corresponds to the processing of the correction unit 155.
• The processing of the technique determination process 206f corresponds to the processing of the technique determination unit 156.
• The processing of the evaluation process 206g corresponds to the processing of the evaluation unit 157.
• The programs 207a to 207g do not necessarily have to be stored in the hard disk device 207 from the beginning.
• For example, each program may be stored in a "portable physical medium" such as a flexible disk (FD), CD-ROM, DVD, magneto-optical disk, or IC card inserted into the computer 200.
• The computer 200 may then read out and execute each of the programs 207a to 207g.

Abstract

This information processing device acquires sensing data obtained by sensing an athlete. The information processing device generates time-series 3D skeletal information of the athlete on the basis of the sensing data. When identifying the exercises performed by the athlete on the basis of the skeletal information, the information processing device identifies the athlete's grip on the bar at the end of a second exercise on the basis of the combination of the athlete's grip on the bar at the end of a first exercise and the second exercise following the first exercise.

Description

Technique recognition program, technique recognition method, and information processing device

The present invention relates to a technique recognition program and the like.

Conventionally, multiple referees have scored gymnastics competitions visually; however, as techniques have become more advanced and movements more complex, it has become difficult in some cases to recognize techniques visually.

To address this, there is a conventional technology that uses a learning model such as DL (Deep Learning) to score performances performed by athletes. For example, in the conventional technology, sensing data of an athlete is acquired using sensors, cameras, and the like, and the sensing data is input to a trained learning model to calculate skeletal information of the athlete. A feature amount indicating the characteristics of the posture corresponding to a "technique" is then calculated from the time-series skeletal information, the technique performed by the athlete is automatically recognized based on the time-series skeletal information and the feature amount, and a performance score is output.

The performance score is calculated as the sum of a D (Difficulty) score and an E (Execution) score. For example, the D score is calculated based on whether each technique is successfully performed. The E score is calculated by a deduction method according to the degree of perfection of each technique.

Japanese Patent Application Laid-Open No. 2020-038440; Japanese Patent Publication No. 2012-518856; International Publication No. WO 2020/121500; U.S. Patent Application Publication No. 2020/0074247
In gymnastics, the same movement may constitute techniques with different scores depending on how the bar is gripped. A difference in grip is a difference in how the arm is twisted; however, with the conventional technology it is difficult to accurately recognize the twist of the arm from the athlete's sensing data, and the technique may not be recognized correctly.

Even when machine-learning a model that can output differences in grip, for the reason described above there are many frames from which features for distinguishing grips cannot be obtained, so accurate recognition by machine learning alone is difficult.

In one aspect, an object of the present invention is to provide a technique recognition program, a technique recognition method, and an information processing apparatus that can improve the accuracy of technique recognition by distinguishing between different ways of gripping the bar.

In a first proposal, a computer executes the following processing. The computer acquires sensing data obtained by sensing an athlete. Based on the sensing data, the computer generates time-series three-dimensional skeletal information of the athlete. When identifying the exercises performed by the athlete based on the skeletal information, the computer identifies the athlete's grip on the bar at the end of a second exercise based on the combination of the athlete's grip on the bar at the end of a first exercise and the second exercise following the first exercise.

This makes it possible to distinguish between different ways of gripping the bar and improve the accuracy of technique recognition.
FIG. 1 is a diagram showing an example of the technique recognition system according to this embodiment.
FIG. 2 is a diagram (1) for explaining the effect of the information processing apparatus according to this embodiment.
FIG. 3 is a diagram (2) for explaining the effect of the information processing apparatus according to this embodiment.
FIG. 4 is a functional block diagram showing the configuration of the information processing apparatus according to this embodiment.
FIG. 5 is a diagram showing an example of the data structure of the joint definition data according to this embodiment.
FIG. 6 is a diagram showing an example of the data structure of the joint position DB according to this embodiment.
FIG. 7 is a diagram showing an example of the data structure of the skeleton information DB.
FIG. 8 is a diagram showing an example of the data structure of the exercise dictionary data.
FIG. 9 is a diagram showing an example of the data structure of the grip determination table.
FIG. 10 is a diagram showing an example of the data structure of the grip determination result table.
FIG. 11 is a diagram showing an example of the data structure of the technique dictionary data.
FIG. 12 is a diagram showing an example of the data structure of the technique determination result table.
FIG. 13 is a diagram (1) for explaining the processing of the correction unit.
FIG. 14 is a diagram (2) for explaining the processing of the correction unit.
FIG. 15 is a flow chart showing the processing procedure of the information processing apparatus according to this embodiment.
FIG. 16 is a flow chart showing the procedure of the grip determination processing.
FIG. 17 is a flow chart showing the procedure of the grip correction processing.
FIG. 18 is a diagram for supplementary explanation of the processing of the identifying unit.
FIG. 19 is a diagram showing an example of the hardware configuration of a computer that implements functions similar to those of the information processing apparatus.
Hereinafter, embodiments of the technique recognition program, the technique recognition method, and the information processing apparatus disclosed in the present application will be described in detail with reference to the drawings. The present invention is not limited by these embodiments.

FIG. 1 is a diagram showing an example of the technique recognition system according to this embodiment. As shown in FIG. 1, this technique recognition system has 3D (three-dimensional) laser sensors 10a and 10b and an information processing device 100. The information processing device 100 and the 3D laser sensors 10a and 10b are connected to each other.

The 3D laser sensors 10a and 10b perform 3D sensing of the athlete 5 and output distance image data, which is the sensing result, to the information processing device 100. In the following description, the 3D laser sensors 10a and 10b are collectively referred to as the "3D laser sensor 10".

For example, the distance image data includes a plurality of distance image frames, and each distance image frame is assigned, in ascending order, a frame number that uniquely identifies the frame. One distance image frame includes the distance information to each point on the athlete 5 sensed by the 3D laser sensor 10 at a certain timing.

The athlete 5 performs a predetermined performance to be scored in front of the 3D laser sensor 10. In this embodiment, a case where the athlete 5 performs a horizontal bar routine will be described as an example, but the embodiment is similarly applicable to other scored competitions in which techniques differ depending on how a bar is gripped.

For example, in a horizontal bar routine, the ways of gripping the bar include the overhand grip (junte), the reverse grip (gyakute), and the el-grip (ogyakute). The overhand grip is the normal grip with the back of the hand facing up. The reverse grip is twisted outward by 180° from the overhand grip. The el-grip is twisted inward by 180° from the overhand grip.
Based on the distance image data acquired from the 3D laser sensor 10, the information processing device 100 generates time-series skeletal information of the athlete 5 and identifies the exercises (techniques) in order. In doing so, it identifies the grip of the athlete 5 at the end of the current exercise based on the combination of the grip of the athlete 5 at the end of the previous exercise and the type of the current exercise.

The information processing device 100 identifies the current exercise (technique) from the combination of the type of the current exercise and the grip at the end of the current exercise. For example, when the current exercise is in the "forward-family wheel" category and the grip is the overhand or reverse grip, the information processing device 100 identifies the technique name as "front wheel". On the other hand, when the current exercise is in the "forward-family wheel" category and the grip is the el-grip, the information processing device 100 identifies the technique name as "el-grip wheel".
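The technique identification from the (exercise, grip) combination in this example can be sketched as a dictionary lookup. The dictionary below is an assumed excerpt for illustration, not the actual technique dictionary data 148.

```python
# Sketch of identifying the technique name from the combination of the
# exercise and the grip at its end. The entries are an assumed excerpt of
# the technique dictionary data 148, made up for illustration.
TECHNIQUE_DICTIONARY = {
    ("forward-family wheel", "overhand"): ("front wheel", "A"),
    ("forward-family wheel", "reverse"): ("front wheel", "A"),
    ("forward-family wheel", "el-grip"): ("el-grip wheel", "B"),
}

def identify_technique(exercise, end_grip):
    """Return (technique name, difficulty) or None for an unknown pair."""
    return TECHNIQUE_DICTIONARY.get((exercise, end_grip))

print(identify_technique("forward-family wheel", "overhand"))  # ('front wheel', 'A')
print(identify_technique("forward-family wheel", "el-grip"))   # ('el-grip wheel', 'B')
```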
Here, the effect of the information processing device 100 in comparison with the conventional technology will be described. FIGS. 2 and 3 are diagrams for explaining the effect of the information processing apparatus according to this embodiment. FIG. 2 will be described first. In technique T1, the athlete 5 performs the exercises in the order of T1-1, T1-2, and T1-3. Technique T1 is a C-difficulty technique called "endo with one twist to one-arm el-grip". In technique T2, the athlete 5 performs the exercises in the order of T2-1, T2-2, and T2-3. Technique T2 is a D-difficulty technique called "endo with one twist to el-grip". Comparing techniques T1 and T2, the grip A1 in T1-3 differs from the grip A2 in T2-3.

The conventional technology cannot recognize the difference between the grips A1 and A2 described in FIG. 2, so it may fail to identify the technique correctly. In contrast, the information processing device 100 can recognize the grips A1 and A2, and can therefore correctly identify "endo with one twist to one-arm el-grip" and "endo with one twist to el-grip".

FIG. 3 will be described next. Technique T3 is an A-difficulty technique called "front wheel", and technique T4 is a B-difficulty technique called "el-grip wheel". In FIG. 3, for convenience, the movement of technique T3 is summarized in one picture, while the movement of technique T4 is arranged in chronological order. The difference between techniques T3 and T4 is the way the bar is gripped.

The conventional technology cannot recognize the difference in grip described in FIG. 3, so it may fail to identify the technique correctly. In contrast, the information processing device 100 can recognize the difference in grip, and can therefore correctly identify the "front wheel" and the "el-grip wheel".
Next, a configuration example of the information processing device 100 will be described. FIG. 4 is a functional block diagram showing the configuration of the information processing apparatus according to this embodiment. As shown in FIG. 4, the information processing device 100 has a communication unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 150.

The communication unit 110 is connected to the 3D laser sensor 10, acquires distance image data from the 3D laser sensor 10, and outputs the acquired distance image data to the control unit 150. The communication unit 110 may also be connected to a camera (not shown) that captures video of the athlete 5 and acquire video data from this camera.

The input unit 120 is an input device for inputting various types of information to the information processing device 100, and corresponds to a keyboard, a mouse, a touch panel, or the like.

The display unit 130 is a display device that displays the display screen information output from the control unit 150, and corresponds to a liquid crystal display, a touch panel, or the like.

The storage unit 140 has a distance image DB (database) 141, joint definition data 142, a joint position DB 143, a skeleton information DB 144, exercise dictionary data 145, a grip determination table 146, a grip determination result table 147, technique dictionary data 148, and a technique determination result table 149. The storage unit 140 corresponds to a semiconductor memory element such as a RAM (Random Access Memory), a ROM (Read Only Memory), or a flash memory, or a storage device such as an HDD (Hard Disk Drive).
The distance image DB 141 stores the distance image data acquired from the 3D laser sensor 10. For example, the distance image DB 141 associates a frame number with a distance image frame. The frame number uniquely identifies each distance image frame and is assigned in ascending order. A distance image frame is a frame included in the distance image data sensed by the 3D laser sensor 10.

The joint definition data 142 defines each joint position of the athlete (athlete 5). FIG. 5 is a diagram showing an example of the data structure of the joint definition data according to this embodiment. As shown in FIG. 5, the joint definition data 142 stores information in which each joint identified by a known skeleton model is numbered. For example, number 7 is assigned to the right shoulder joint (SHOULDER_RIGHT), number 5 to the left elbow joint (ELBOW_LEFT), number 11 to the left knee joint (KNEE_LEFT), and number 14 to the right hip joint (HIP_RIGHT). In this embodiment, the X, Y, and Z coordinates of, for example, the right elbow joint (number 8) may be written as X8, Y8, and Z8.

The joint position DB 143 stores the position data of each joint of the athlete 5 generated based on the distance image data of the 3D laser sensor 10. FIG. 6 is a diagram showing an example of the data structure of the joint position DB according to this embodiment. As shown in FIG. 6, the joint position DB 143 associates a frame number with the three-dimensional coordinates "X0, Y0, Z0, ..., X17, Y17, Z17" of each joint.

FIG. 6 shows the time-series change of each joint in the distance image data. For example, frame number "1" indicates that the joint positions are "X0=100, Y0=20, Z0=0, ..., X17=200, Y17=40, Z17=5". Frame number "2" indicates that the joint positions have moved to "X0=101, Y0=25, Z0=5, ..., X17=202, Y17=39, Z17=15".
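The record structure above can be sketched as follows, together with a helper computing the per-axis displacement of a joint between two frames. The coordinate values are the illustrative ones from the example; the helper function itself is an assumption for the sketch.

```python
# Minimal sketch of joint position DB records: a frame number mapped to
# per-joint coordinates named X<n>, Y<n>, Z<n>. Only the joints shown in
# the example above are included here.
joint_position_db = {
    1: {"X0": 100, "Y0": 20, "Z0": 0, "X17": 200, "Y17": 40, "Z17": 5},
    2: {"X0": 101, "Y0": 25, "Z0": 5, "X17": 202, "Y17": 39, "Z17": 15},
}

def joint_displacement(db, frame_a, frame_b, joint):
    """Per-axis displacement of one joint between two frames."""
    return tuple(db[frame_b][f"{axis}{joint}"] - db[frame_a][f"{axis}{joint}"]
                 for axis in ("X", "Y", "Z"))

print(joint_displacement(joint_position_db, 1, 2, 0))   # (1, 5, 5)
print(joint_displacement(joint_position_db, 1, 2, 17))  # (2, -1, 10)
```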
The skeleton information DB 144 stores the athlete's skeleton information generated based on the distance image data. FIG. 7 is a diagram showing an example of the data structure of the skeleton information DB. As shown in FIG. 7, the skeleton information DB 144 associates a frame number with skeleton information. The frame number is as described for the distance image DB 141. The skeleton information is data indicating the skeleton of the athlete 5 estimated by connecting the joint positions.

The exercise dictionary data 145 is dictionary data that defines exercises. An exercise may correspond, for example, to a technique narrowed down based on feature amounts other than the grip. In that case, the technique corresponding to an exercise includes a plurality of technique candidates. For example, the exercise "forward-family wheel" includes the techniques "front wheel", "el-grip wheel", and the like. By identifying the grip in the processing described later, it is determined whether the exercise "forward-family wheel" is the "front wheel", the "el-grip wheel", or another technique.

FIG. 8 is a diagram showing an example of the data structure of the exercise dictionary data. As shown in FIG. 8, the exercise dictionary data 145 associates each exercise with a plurality of basic motions. A basic motion and a feature amount are set for each basic motion. The feature amount is a value indicating a characteristic of the movement of the athlete 5 calculated from the time-series skeleton information; for example, a time-series change in joint positions. An exercise is identified by a combination of basic motions. Exercises are not limited to this example; for instance, a specific action may be defined as an exercise independently of the techniques defined in the scoring rules, and the grip may be recognized for each such exercise.
The grip determination table 146 is a table for identifying the module corresponding to the combination of the grip at the end of the previous exercise and the current exercise. The identified module then determines the grip corresponding to the current exercise.
Fig. 9 shows an example of the data structure of the grip determination table. As shown in Fig. 9, the grip determination table 146 associates conditions, modules, and priorities. A condition consists of the previous grip and the exercise, where the previous grip is the grip at the end of the previous exercise. A previous grip of "any" means that any grip matches. A module defines the determination policy for each grip. For example, if the grip at the end of the previous exercise is "any" and the current exercise is "Adler", the module is the "first module". Fig. 9 shows the first to fourth modules, but the modules are not limited to these.
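The condition matching in Fig. 9 can be pictured as a small rule lookup. In the sketch below the rule entries, exercise names, and priority values are illustrative assumptions, not the patent's actual table; only the "any" wildcard behavior and the (condition → module, priority) shape follow the text.

```python
# Hypothetical sketch of the grip determination table (Fig. 9): each rule maps
# a (previous grip, exercise) condition to the module used for grip judgment
# and that module's priority (smaller value = higher priority).
GRIP_RULES = [
    # (previous grip, exercise, module, priority)
    ("any",           "adler",               "module1", 2),
    ("large_reverse", "forward_giant_group", "module2", 3),
    ("any",           "forward_giant_group", "module3", 4),
    ("any",           "handstand_turn",      "module4", 1),
]

def find_module(previous_grip, exercise):
    """Return (module, priority) for the first rule whose condition matches.
    A previous grip of "any" in a rule matches every grip."""
    for rule_grip, rule_exercise, module, priority in GRIP_RULES:
        if rule_exercise == exercise and rule_grip in ("any", previous_grip):
            return module, priority
    return None

# The Adler rule matches regardless of the previous grip.
print(find_module("regular", "adler"))  # ('module1', 2)
```

Note how the same exercise can select different modules depending on the previous grip: "forward_giant_group" after a large reverse grip hits the second rule, while any other previous grip falls through to the third.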
The priority indicates the priority of a module: the smaller the value, the higher the priority. The output of a higher-priority module is considered more reliable than that of a lower-priority module.
The first module outputs preset likelihoods for the left-hand and right-hand grips. For example, the grip after an Adler is necessarily the large reverse grip because of the nature of the movement. In this case, the first module outputs (0, 0, 1) as the left-hand grip likelihoods, in the order reverse grip, regular grip, large reverse grip, and likewise outputs (0, 0, 1) as the right-hand grip likelihoods.
The second module (grip-change determination) is as follows. The second module determines whether a grip change has occurred, based on the wrist and arm movements and the whole-body balance estimated from the feature amount corresponding to the current exercise. When the second module determines that the grip changed from the large reverse grip to the reverse grip, it outputs (0.7, 0.1, 0.2) as the grip likelihoods, in the order reverse grip, regular grip, large reverse grip. Because the previous grip was the "(both hands) large reverse grip", the likelihood of the reverse grip is set high.
Conversely, when the second module determines that no grip change occurred and the large reverse grip was maintained, it outputs (0.2, 0.1, 0.7) in the same order. Because the previous grip was the "(both hands) large reverse grip", the likelihood of the large reverse grip is set high.
The third module (likelihood carry-over) outputs the same likelihoods as those of the previous exercise. For example, if the previous exercise's module output (0.7, 0.1, 0.2), the third module outputs (0.7, 0.1, 0.2) as the likelihoods for the current exercise.
 第4モジュール(倒立ひねり判定)について説明する。一回ひねりの場合、軸手は運動と人体構造による制約から握り方が確定するので、確定する尤度を出力する。たとえば、第4モジュールは、左手の握り方の尤度として、逆手、順手、大逆手の順に(0,0,1)を出力する。 The fourth module (inverted twist determination) will be explained. In the case of a single twist, the way of grasping the handle is determined by the motion and constraints of the human body structure, so the likelihood of determination is output. For example, the fourth module outputs (0, 0, 1) as the likelihood of the left hand grip in the order of reverse hand, forward hand, and major reverse hand.
 なお、第4モジュールは、軸でない方の手に関する尤度について、腕の軌跡や、棒をどちら側から掴むのか、掴んだ後、棒を擦らせて掴む動きをするのか、といった特徴からどの握り方で掴んだのかを推測し、尤度を出力する。DLを利用して、動きの特徴と、軸でない方の手に関する尤度との関係を学習させ、第4モジュールとして利用してもよい。 In addition, the fourth module evaluates the likelihood of the hand that is not the axis, based on characteristics such as the trajectory of the arm, which side the stick is gripped from, and whether the grip is made by rubbing the stick after gripping. Guess which way you grabbed it and output the likelihood. The DL may be used to learn the relationship between motion features and the likelihood of the non-axis hand and used as a fourth module.
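The modules above can be pictured as functions that emit one likelihood vector per hand over the three grips. The sketch below covers only the first and second modules, using the example numbers from the text; the function names and dictionary shape are assumptions for illustration.

```python
# Hedged sketch of two of the likelihood modules. Each returns, per hand, a
# likelihood vector over the grips in the order
# (reverse, regular, large_reverse), matching the examples in the text.
def module1_adler():
    # After an Adler the grip is necessarily the large reverse grip,
    # so both hands get the fixed vector (0, 0, 1).
    fixed = (0.0, 0.0, 1.0)
    return {"left": fixed, "right": fixed}

def module2_regrip(regripped):
    # Grip-change judgment starting from a (both hands) large reverse grip:
    # a detected regrip favors the reverse grip; otherwise the large
    # reverse grip stays most likely.
    v = (0.7, 0.1, 0.2) if regripped else (0.2, 0.1, 0.7)
    return {"left": v, "right": v}
```

The third module would simply echo the previous vector, and the fourth would branch on whether the hand is the pivot hand.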
Returning to Fig. 4: the grip determination result table 147 is a table that holds the grip determination results. Fig. 10 shows an example of its data structure. As shown in Fig. 10, the grip determination result table 147 has columns for order, exercise, left-hand likelihood, right-hand likelihood, left-hand grip, and right-hand grip.
The order indicates the order in which the exercises were performed. The exercise is, as described above, a technique representing a plurality of technique candidates. The left-hand likelihood contains the likelihoods of the reverse grip, regular grip, and large reverse grip for the left hand, and the right-hand likelihood contains the corresponding likelihoods for the right hand. The left-hand grip is the left-hand grip identified from the likelihoods, and the right-hand grip is the right-hand grip identified from the likelihoods.
In Fig. 10, for the order "k-th", the left-hand likelihoods at the end of the exercise "Adler" are (0, 0, 1) in the order reverse grip, regular grip, large reverse grip, so the "large reverse grip", which has the maximum likelihood, is set as the left-hand grip. Likewise, the right-hand likelihoods at the end of the exercise "Adler" are (0, 0, 1), so the "large reverse grip" is set as the right-hand grip.
In Fig. 10, the left-hand grip and right-hand grip in the grip determination result table 147 are the grips at the end of the exercise, but the table may hold both the grip at the start and the grip at the end.
Returning to Fig. 4: the technique dictionary data 148 is dictionary data that defines the technique corresponding to an exercise and the grip at the end of that exercise. Fig. 11 shows an example of its data structure. As shown in Fig. 11, the technique dictionary data 148 associates an exercise, a left-hand grip, and a right-hand grip with a technique. The exercise, left-hand grip, and right-hand grip are as described for Fig. 10. The technique in Fig. 11 is the technique name that is finally identified.
In Fig. 11, the technique corresponding to the exercise "forward giant group" with left-hand grip "reverse grip" and right-hand grip "reverse grip" is the "forward giant circle".
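The lookup in Fig. 11 can be sketched as a dictionary keyed by the (exercise, left grip, right grip) triple. The English key and technique names below are hypothetical renderings, not the patent's actual entries.

```python
# Illustrative sketch of the technique dictionary data 148 (Fig. 11): an
# (exercise, left grip, right grip) triple resolves to the final technique.
TECHNIQUE_DICT = {
    ("forward_giant_group", "reverse", "reverse"): "forward_giant",
    ("forward_giant_group", "large_reverse", "large_reverse"): "large_reverse_giant",
}

def decide_technique(exercise, left_grip, right_grip):
    # Returns None when the combination is not defined in the dictionary.
    return TECHNIQUE_DICT.get((exercise, left_grip, right_grip))

print(decide_technique("forward_giant_group", "reverse", "reverse"))  # forward_giant
```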
Returning to Fig. 4: the technique determination result table 149 is a table that holds the determination results of the techniques performed by athlete 5. Fig. 12 shows an example of its data structure. As shown in Fig. 12, the technique determination result table 149 associates the order of the techniques with the techniques.
Returning to Fig. 4: the control unit 150 has an acquisition unit 151, a generation unit 152, a calculation unit 153, an identification unit 154, a correction unit 155, a technique determination unit 156, and an evaluation unit 157. The control unit 150 can be realized by a CPU (Central Processing Unit), an MPU (Micro Processing Unit), or the like. The control unit 150 can also be realized by hard-wired logic such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
The acquisition unit 151 acquires distance image data from the 3D laser sensor 10 via the communication unit 110, and registers the acquired distance image data (frame number and distance image frame) in the distance image DB 141.
The generation unit 152 generates skeleton information by extracting the position data of each joint of athlete 5 in time series and then generating skeleton information in time series.
An example of the process in which the generation unit 152 extracts the position data of each joint of athlete 5 in time series is as follows. The generation unit 152 compares a distance image frame in the distance image DB 141 with the joint definition data 142 to identify the type of each joint contained in the frame and its three-dimensional coordinates. The generation unit 152 registers, in the joint position DB 143, information associating the frame number with the three-dimensional coordinates of each joint type, and repeats this process for every frame number.
An example of the process in which the generation unit 152 generates skeleton information in time series is as follows. Based on the joint position DB 143, the generation unit 152 generates the skeleton information corresponding to each frame number and stores the generated skeleton information in the skeleton information DB 144 in association with the frame number.
For example, the generation unit 152 generates the skeleton information by connecting the three-dimensional coordinates of the joints stored in the joint position DB 143 according to the connection relationships defined in the joint definition data 142.
The calculation unit 153 calculates feature amounts based on the time-series skeleton information stored in the skeleton information DB 144. For example, the feature amount calculated by the calculation unit 153 is the time-series change of each joint position, which indicates a characteristic of the movement of athlete 5.
The calculation unit 153 calculates a feature amount for each break in the movement where one basic motion switches to the next. For example, the calculation unit 153 determines that a break occurs at a timing where the change of each joint position stays below a threshold for a certain period. The calculation unit 153 may also identify breaks using the technique disclosed in International Publication No. WO 2019/116496 or the like. The calculation unit 153 outputs the feature amount for each break to the identification unit 154. In the following description, the feature amount for each break is simply referred to as the "feature amount".
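The break detection just described (each joint's movement staying below a threshold for a certain period) might be sketched as follows. The threshold value, the run length, and the per-frame data layout are assumptions for illustration.

```python
# Minimal sketch of break detection between basic motions: a break is flagged
# once the largest per-joint displacement between consecutive frames stays
# below a threshold for a run of frames.
def find_breaks(frames, threshold=1.0, min_still_frames=3):
    """frames: list of per-frame joint lists, each joint an (x, y, z) tuple."""
    breaks, still = [], 0
    for i in range(1, len(frames)):
        # largest displacement of any joint between consecutive frames
        moved = max(
            sum((a - b) ** 2 for a, b in zip(cur, prev)) ** 0.5
            for prev, cur in zip(frames[i - 1], frames[i])
        )
        still = still + 1 if moved < threshold else 0
        if still == min_still_frames:
            breaks.append(i)  # frame index where the motion has settled
    return breaks

# One joint that moves for five frames and then stays put:
frames = [[(float(x), 0.0, 0.0)] for x in [0, 1, 2, 3, 4, 4, 4, 4, 4, 4]]
print(find_breaks(frames))  # [7]
```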
The identification unit 154 is a processing unit that identifies how athlete 5 grips the bar. For example, the identification unit 154 executes a process of identifying the exercise and a process of identifying the grip.
The process in which the identification unit 154 identifies the exercise is as follows. The identification unit 154 compares each feature amount with the feature amounts of the basic motions in the exercise dictionary data 145 to identify the basic motion corresponding to each feature amount. The identification unit 154 then compares the combination of the identified basic motions with the basic motions of each exercise in the exercise dictionary data 145 to identify the corresponding exercise. The identification unit 154 repeats this process to identify the exercises in order.
The process in which the identification unit 154 identifies the grip is as follows. Here, as an example, the (k-1)-th exercise is referred to as the "previous exercise" and the k-th exercise as the "current exercise". The identification unit 154 compares the combination of the grip of the previous exercise (the grip at its end) and the current exercise with the conditions in the grip determination table 146, and determines the module to use for identifying the grip. For the first exercise (k=1), the identification unit 154 may treat the grip of the previous exercise as a predetermined grip (for example, the regular grip).
Using the determined module, the identification unit 154 obtains the likelihood of each grip and identifies the grip with the maximum likelihood as the grip corresponding to the current exercise. For the k-th entry, the identification unit 154 registers the current exercise, the left-hand likelihood, the right-hand likelihood, the left-hand grip, and the right-hand grip in the grip determination result table 147. For the left-hand grip and right-hand grip, the identification unit 154 registers both the grip at the start and the grip at the end.
The identification unit 154 repeats the above process for the (k+1)-th, (k+2)-th, ..., (k+n)-th exercises, sequentially registering the corresponding information in the grip determination result table 147.
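The per-exercise grip identification above reduces to an argmax over the module's likelihood vector. A small sketch, assuming the vector order (reverse, regular, large reverse) used in the examples:

```python
# Sketch of turning a module's likelihood vector into a grip label:
# likelihoods are ordered (reverse, regular, large_reverse) and the
# maximum-likelihood grip wins.
GRIPS = ("reverse", "regular", "large_reverse")

def pick_grip(likelihood):
    best = max(range(len(GRIPS)), key=lambda i: likelihood[i])
    return GRIPS[best]

# A record like one row of the grip determination result table 147:
record = {"exercise": "adler", "left_likelihood": (0.0, 0.0, 1.0)}
record["left_grip"] = pick_grip(record["left_likelihood"])
print(record["left_grip"])  # large_reverse
```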
The correction unit 155 refers to the grip determination result table 147 populated by the processing of the identification unit 154 described above, determines whether the state transition from the previous grip to the current grip is anatomically contradictory, and, if there is a contradiction, recalculates the likelihood of each grip.
Figs. 13 and 14 are diagrams for explaining the processing of the correction unit. Regarding Fig. 13: for convenience of explanation, the grip determination result table 147A shown in Fig. 13 includes the left-hand and right-hand grips at the start and the left-hand and right-hand grips at the end (these are omitted in Fig. 10).
The correction unit 155 scans pairs of consecutive records in the grip determination result table 147A and identifies records in which the grip at the end of the earlier exercise differs from the grip at the start of the later exercise. The correction unit 155 then identifies the modules that produced the left-hand and right-hand likelihoods set in the identified records, and selects the module with the larger priority value (that is, the less reliable one). Using that module, the correction unit 155 re-derives the likelihoods while changing its conditions.
For example, in Fig. 13, the left-hand grip "large reverse grip" at the end of the "(k+1)-th" record differs from the left-hand grip "reverse grip" at the start of the "(k+2)-th" record. The correction unit 155 therefore identifies the "(k+1)-th" and "(k+2)-th" records.
Based on the grip determination table 146, the correction unit 155 identifies the "second module" as the module used to obtain the "(k+1)-th" left-hand and right-hand likelihoods, and the "fourth module" as the module used to obtain the "(k+2)-th" left-hand and right-hand likelihoods.
Since the priority of the "second module" is "3" and the priority of the "fourth module" is "1", the correction unit 155 selects the second module.
Moving on to Fig. 14: the correction unit 155 changes the condition under which the second module is applied and re-derives the left-hand and right-hand likelihoods. For example, suppose the likelihoods (0.2, 0.1, 0.7) were previously output under the second module's condition "no grip change". In this case, the correction unit 155 changes the condition to "grip change occurred" and obtains the likelihoods (0.7, 0.1, 0.2). As a result, the left-hand and right-hand grips are updated to the "reverse grip". Through the processing of the correction unit 155, the grip determination result table 147B is thus updated to the grip determination result table 147C, and the contradiction is resolved.
When the selected module (the second module in the above example) is a module that can produce multiple outputs, the correction unit 155 may instead select the output that removes the contradiction. For example, in Fig. 13 the grip at the end of the previous exercise is the "large reverse grip" while the grip at the start of the current exercise is the "reverse grip", which is contradictory. In this case, among the outputs of the selected module, the output in which the likelihood of the large reverse grip is maximal is selected.
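One way to picture this selection step: among a module's candidate outputs, keep the one that ranks the required grip highest. A hedged sketch using the example vectors from the text; the function shape is an assumption.

```python
# Sketch of the output-selection variant of the correction step: among the
# candidate likelihood vectors a module can produce, pick the one whose
# likelihood for the grip needed to remove the contradiction is largest.
GRIPS = ("reverse", "regular", "large_reverse")

def resolve(candidates, required_grip):
    idx = GRIPS.index(required_grip)
    return max(candidates, key=lambda v: v[idx])

# The second module's two possible outputs: regripped / not regripped.
outputs = [(0.7, 0.1, 0.2), (0.2, 0.1, 0.7)]
print(resolve(outputs, "large_reverse"))  # (0.2, 0.1, 0.7)
```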
If there is no contradiction in the grip determination result table 147, the correction unit 155 skips the above processing.
Returning to Fig. 4: the technique determination unit 156 identifies the techniques performed by athlete 5 based on the combinations of exercise and grips (left-hand grip, right-hand grip) registered in the grip determination result table 147 and the technique dictionary data 148, and registers the identified results in order in the technique determination result table 149.
The evaluation unit 157 scores the performance of athlete 5 based on the technique determination result table 149. For example, the evaluation unit 157 calculates the D score of athlete 5 by summing the points corresponding to the difficulty of each technique (each successfully performed technique) registered in the technique determination result table 149. The evaluation unit 157 may also further evaluate the execution of the techniques from the skeleton information registered in the skeleton information DB 144 and calculate an E score based on deductions. The evaluation unit 157 outputs the evaluation results to the display unit 130 for display.
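The D-score part of this step amounts to a sum over a difficulty table. In the sketch below, the difficulty values and technique names are placeholders, not actual scoring-rule values.

```python
# Sketch of the D-score calculation: sum the difficulty value of each
# recognized (successfully performed) technique. The table entries are
# made-up illustrative values.
DIFFICULTY = {"forward_giant": 0.1, "adler": 0.4, "large_reverse_giant": 0.4}

def d_score(techniques):
    # Unknown techniques contribute nothing; round to one decimal place.
    return round(sum(DIFFICULTY.get(t, 0.0) for t in techniques), 1)

print(d_score(["adler", "forward_giant"]))  # 0.5
```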
Next, an example of the processing procedure of the information processing device 100 according to this embodiment is described. Fig. 15 is a flowchart showing the processing procedure of the information processing device according to this embodiment. As shown in Fig. 15, the acquisition unit 151 of the information processing device 100 acquires distance image data from the 3D laser sensor 10 and registers it in the distance image DB 141 (step S101).
The generation unit 152 of the information processing device 100 generates time-series skeleton information based on the distance image DB 141 and the joint definition data 142, and registers it in the skeleton information DB 144 (step S102). The calculation unit 153 of the information processing device 100 calculates the feature amounts based on the skeleton information registered in the skeleton information DB 144 (step S103).
The identification unit 154 of the information processing device 100 identifies the exercise based on the feature amounts and the exercise dictionary data 145 (step S104). The identification unit 154 executes the grip determination process (step S105). The correction unit 155 of the information processing device 100 executes the grip correction process (step S106).
The technique determination unit 156 of the information processing device 100 determines the techniques based on the grip determination result table 147 and the technique dictionary data 148, and registers them in the technique determination result table 149 (step S107). The evaluation unit 157 of the information processing device 100 scores the performance based on the technique determination result table 149 (step S108) and displays the score on the display unit 130 (step S109).
Next, an example of the grip determination process shown in step S105 of Fig. 15 is described. Fig. 16 is a flowchart showing the procedure of the grip determination process. As shown in Fig. 16, the identification unit 154 of the information processing device 100 sets k to its initial value (k=1) (step S201).
Based on the grip determination table 146, the identification unit 154 identifies the module corresponding to the k-th exercise and the previous grip (step S202). The identification unit 154 reads out the identified module (step S203).
Among the grip likelihoods output by the module, the identification unit 154 identifies the grip with the maximum likelihood as the grip for the k-th exercise and registers it in the grip determination result table 147 (step S204).
If the k-th exercise is the last one (step S205, Yes), the identification unit 154 ends the grip determination process.
If the k-th exercise is not the last one (step S205, No), the identification unit 154 adds 1 to k (step S206) and returns to step S202.
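Steps S201-S206 amount to a loop that carries each exercise's decided grip into the next module lookup. A simplified sketch, with `module_for` standing in for the grip determination table lookup (a hypothetical helper, not the patent's actual interface):

```python
# Sketch of the grip determination loop (steps S201-S206): walk the exercises
# in order, look up a module from the previous grip and the current exercise,
# and record the maximum-likelihood grip, which feeds the next lookup.
def determine_grips(exercises, module_for, initial_grip="regular"):
    grips = ("reverse", "regular", "large_reverse")
    results, prev = [], initial_grip
    for exercise in exercises:                     # S202/S206 loop
        likelihood = module_for(prev, exercise)()  # S202/S203: look up, read
        grip = max(grips, key=lambda g: likelihood[grips.index(g)])
        results.append((exercise, grip))           # S204: register the result
        prev = grip                                # end grip feeds next lookup
    return results

# Stand-in table: an Adler forces the large reverse grip; afterwards the grip
# is kept if the previous grip was the large reverse grip.
def module_for(prev, exercise):
    if exercise == "adler":
        return lambda: (0.0, 0.0, 1.0)
    return lambda: (0.2, 0.1, 0.7) if prev == "large_reverse" else (0.7, 0.1, 0.2)

print(determine_grips(["adler", "forward_giant_group"], module_for))
```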
Next, an example of the grip correction process shown in step S106 of Fig. 15 is described. Fig. 17 is a flowchart showing the procedure of the grip correction process. As shown in Fig. 17, the correction unit 155 of the information processing device 100 refers to the grip determination result table 147 and identifies records in which the grip at the start contradicts the grip at the end (step S301).
From the modules used to calculate the likelihoods contained in the contradictory records, the correction unit 155 extracts those with the larger priority values (step S302).
The correction unit 155 sets k to its initial value (k=1) (step S303). From the extracted modules, the correction unit 155 selects the module with the k-th largest priority value (step S304).
Among the outputs the selected module can produce, the correction unit 155 selects the output that removes the contradiction between the grip at the start and the grip at the end (step S305). The correction unit 155 updates the grip determination result table 147 (step S306).
If the k-th module is the last extracted module (step S307, Yes), the correction unit 155 ends the grip correction process.
If the k-th module is not the last extracted module (step S307, No), the correction unit 155 adds 1 to k (step S308) and returns to step S304.
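Steps S301-S308 can be condensed into the following sketch, which resolves each contradiction by trusting the record whose module had the smaller priority value and rewriting the other record to agree with it. This deliberately simplifies the condition-changing retry described above; the record layout is an assumption.

```python
# Simplified sketch of the grip correction loop (steps S301-S308).
def correct(records):
    """records: dicts with 'start'/'end' grips and the 'priority' value of the
    module that produced them (smaller value = more reliable output)."""
    for prev, cur in zip(records, records[1:]):
        if prev["end"] != cur["start"]:           # S301: find a contradiction
            # S302-S306: rewrite the record from the less reliable module
            # (larger priority value) to agree with the more reliable one.
            if prev["priority"] <= cur["priority"]:
                cur["start"] = prev["end"]
            else:
                prev["end"] = cur["start"]
    return records
```

Mirroring the Fig. 13 example, the (k+1)-th record (second module, priority 3) yields to the (k+2)-th record (fourth module, priority 1), so its end grip is rewritten from the large reverse grip to the reverse grip.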
Next, the effects of the information processing device 100 according to this embodiment are described. When the information processing device 100 generates the time-series skeleton information of athlete 5 from the distance image data acquired from the 3D laser sensor 10 and identifies the exercises (techniques) in order, it identifies the grip of athlete 5 at the end of the current exercise based on the combination of the grip of athlete 5 at the end of the previous exercise and the type of the current exercise. As explained with reference to Figs. 2 and 3 above, this makes it possible to distinguish different grips on the bar and to improve the accuracy of technique recognition.
The information processing device 100 identifies a module based on the combination of the grip of athlete 5 on the bar at the end of the previous exercise and the current exercise, and obtains the likelihood of each of the plurality of grips. Based on the obtained likelihoods, the information processing device 100 identifies the grip on the bar in the current exercise. This improves the accuracy with which the grip of athlete 5 is recognized.
When there is a contradiction in the transition from the grip of athlete 5 on the bar at the end of the previous exercise to the grip of athlete 5 on the bar in the current exercise, the information processing device 100 corrects the likelihoods of the plurality of grips. This further improves the accuracy with which the grip of athlete 5 is recognized.
The information processing device 100 identifies the techniques performed by athlete 5 based on the combinations of exercise and grip, and, based on the identified techniques, scores the gymnastics performance. This makes it possible to assist the judges' decisions in gymnastics competitions.
The embodiment described above is merely an example, and the information processing apparatus 100 may further execute other processes. Other processes 1 and 2 of the information processing apparatus 100 are described below.
Other process 1 will now be described. In the embodiment, the information processing apparatus 100 acquires distance image data from the 3D laser sensor 10 and generates the skeletal information, but the apparatus is not limited to this; it may generate the skeletal information based on image data captured by an RGB camera. In this case, the information processing apparatus 100 uses a trained learning model that takes image data as input and outputs skeletal information.
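The alternative input path above can be sketched as follows. The model interface is a stand-in assumption (the specification does not name a particular model); the sketch only shows that a trained model is applied frame by frame to build the time-series skeletal information.

```python
# Illustrative sketch of other process 1: skeletal information generated
# from RGB frames by a trained pose-estimation model instead of from
# 3D laser-sensor distance images. `model` is a hypothetical callable.

from typing import Callable, Sequence

Joint = tuple[float, float, float]   # (x, y, z) of one joint
Skeleton = list[Joint]               # one frame of skeletal information

def frames_to_skeletons(frames: Sequence[object],
                        model: Callable[[object], Skeleton]) -> list[Skeleton]:
    """Apply the trained model to each frame to build time-series skeletons."""
    return [model(frame) for frame in frames]
```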
Other process 2 will now be described. The identification unit 154 of the information processing apparatus 100 basically executes the processing described above (the processing procedure described with reference to FIG. 16) to identify the grip of the athlete 5; here, an example of the processing of the identification unit 154 is described from a different perspective.
The elements that change the athlete 5's grip are "twisting the body", "twisting the arms by passing the legs between the arms", and "regripping". Among the body-twisting elements, there are those for which the grip can be determined from the amount by which the axis hand is twisted, and those for which the grip changes depending on how the released hand grasps the bar again.
Among the body-twisting elements, for those whose grip can be determined from the amount by which the axis hand is twisted, the identification unit 154 identifies the grip by motion recognition. For those whose grip changes depending on how the released hand grasps the bar, the identification unit 154 identifies the grip by grasp recognition.
For the element in which the arms are twisted by passing the legs between the arms, the identification unit 154 identifies the grip by motion recognition.
For the regripping element, the identification unit 154 identifies the grip by motion recognition or grasp recognition.
An example of the motion recognition executed by the identification unit 154 will be described. When the athlete 5 twists in a handstand, the grip of the axis hand differs depending on the initial grip, the direction of the twist, and the amount of the twist. For example, when the athlete 5 twists the body one full turn (360°) around the gripped hand while holding the bar, the identification unit 154 identifies that the grip changed from reverse grip to eagle grip, or from eagle grip to reverse grip.
For example, when the athlete 5 twists the body a half turn (180°) around the gripped hand while holding the bar, the identification unit 154 identifies that the grip changed from reverse grip to regular grip, from regular grip to reverse grip, from regular grip to eagle grip, or from eagle grip to regular grip.
When an arm is twisted by passing a leg between the arms, the identification unit 154 identifies that the grip changed from reverse grip to eagle grip.
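The twist rules above can be collected into a small lookup, sketched below. Where the specification leaves the 180° outcome dependent on the twist direction, the direction labels used here ("inward"/"outward") are illustrative assumptions, as is keeping the grip unchanged for combinations the text does not enumerate.

```python
# Illustrative rules for the axis hand: resulting grip from
# (initial grip, twist amount, twist direction).

HALF_TURN = {  # 180° twist; direction assignment is an assumption
    ("reverse", "inward"):  "regular",
    ("regular", "inward"):  "eagle",
    ("regular", "outward"): "reverse",
    ("eagle",   "inward"):  "regular",
}

FULL_TURN = {"reverse": "eagle", "eagle": "reverse"}  # 360° twist

def axis_hand_grip(initial: str, degrees: int, direction: str = "inward") -> str:
    """Grip of the axis hand after a twist of `degrees` around the gripped hand."""
    if degrees == 360:
        return FULL_TURN.get(initial, initial)
    if degrees == 180:
        return HALF_TURN.get((initial, direction), initial)
    return initial  # unlisted amounts: assume grip unchanged

def leg_through_grip(initial: str) -> str:
    """Passing a leg between the arms twists the arm: reverse -> eagle."""
    return "eagle" if initial == "reverse" else initial
```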
An example of the grasp recognition executed by the identification unit 154 will be described. When the athlete 5 twists in a handstand or the like, the grip of the released hand changes depending on how it grasps the bar again. For example, because the arm twists in opposite directions for the eagle grip and the reverse grip, the athlete 5's elbow tends to open outward when grasping with the eagle grip and to turn inward when grasping with the reverse grip. The identification unit 154 may identify the grasp based on a rule derived from this characteristic, or may perform the grasp recognition using a learning model trained on this characteristic.
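A rule based on the elbow tendency above might look like the sketch below. The geometric proxy (comparing the elbow's lateral offset with the shoulder-wrist midline, with the x-axis assumed to point outward from the body's midline) is an illustrative assumption; the specification only states the outward/inward tendency itself.

```python
# Illustrative grasp-recognition rule: elbow opened outward -> eagle grip,
# elbow turned inward -> reverse grip. Coordinates are joint x-positions
# with x increasing outward from the body's midline (assumed convention).

def classify_grasp(shoulder_x: float, elbow_x: float, wrist_x: float) -> str:
    """Classify the regrasp of the released hand as 'eagle' or 'reverse'."""
    midline = (shoulder_x + wrist_x) / 2.0
    # Elbow lying outside the shoulder-wrist midline suggests it opened outward.
    return "eagle" if abs(elbow_x) > abs(midline) else "reverse"
```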
An example of the motion recognition or grasp recognition executed by the identification unit 154 will be described. In the case of regripping, when the athlete 5 makes a movement of releasing a hand from the bar, it is highly likely that the grip was changed. For example, after a technique performed with both hands in eagle grip, in most cases the athlete releases both hands from the bar and regrips in reverse grip. The identification unit 154 therefore detects that a regrip occurred by motion recognition or grasp recognition, and estimates that the grip changed when a specific hand is released in a specific state (for example, when both hands are released from a both-hands eagle grip, the athlete is estimated to have regripped in a both-hands reverse grip).
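The release-based estimation above reduces to a rule table keyed by the state at release, sketched below. Only the single rule given in the text (both hands released from a both-hands eagle grip regrip in reverse grip) is encoded; the table shape and the fallback of keeping the grips unchanged are assumptions.

```python
# Illustrative regrip estimation: (grips before release, which hands
# released) -> grips after regrasp; unmatched cases keep the grips.

REGRIP_RULES = {
    (("eagle", "eagle"), "both"): ("reverse", "reverse"),  # example from the text
}

def estimate_regrip(grips, released: str):
    """Return the estimated (left, right) grips after a detected release."""
    return REGRIP_RULES.get((tuple(grips), released), tuple(grips))
```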
FIG. 18 is a diagram for supplementary explanation of the processing of the identification unit. The technique shown in FIG. 18 is an Endo with a full twist. The athlete 5 performs the movements in the order T3-1, T3-2, T3-3. When the identification unit 154 identifies the grip A3, it performs the following processing for the "axis hand" and the "hand in the air".
For the axis hand, the identification unit 154 identifies the grip after the twist based on "whether it is the left or right hand", "in which direction the body is twisted", and "what the original grip was". The identification unit 154 determines "whether it is the left or right hand" from the skeletal information, and "in which direction the body is twisted" by motion recognition. For "what the original grip was", the identification unit 154 uses the determination result of the module described above, which may be incorrect.
For the hand in the air, the identification unit 154 identifies the grip after grasping based on the shape and trajectory of the arm (whether the elbow turns inward or opens outward, from which side of the bar the hand approaches when grasping, and so on); this identification may also be incorrect.
For "what the original grip was" and the "hand in the air" identified by the identification unit 154, the correct grip is identified by the processing of the correction unit 155 described above.
Next, an example of the hardware configuration of a computer that implements the same functions as the information processing apparatus 100 described in this embodiment will be described. FIG. 19 is a diagram illustrating an example of the hardware configuration of a computer that implements the same functions as the information processing apparatus.
As shown in FIG. 19, the computer 200 has a CPU 201 that executes various kinds of arithmetic processing, an input device 202 that receives data input from a user, and a display 203. The computer 200 also has a communication device 204 and an interface device 205 that exchanges data with other computers via a wired or wireless network. The computer 200 further has a RAM 206 that temporarily stores various kinds of information, and a hard disk device 207. The devices 201 to 207 are connected to a bus 208.
The hard disk device 207 stores an acquisition program 207a, a generation program 207b, a calculation program 207c, an identification program 207d, a correction program 207e, a technique determination program 207f, and an evaluation program 207g. The CPU 201 reads out the programs 207a to 207g and loads them into the RAM 206.
The acquisition program 207a functions as an acquisition process 206a. The generation program 207b functions as a generation process 206b. The calculation program 207c functions as a calculation process 206c. The identification program 207d functions as an identification process 206d. The correction program 207e functions as a correction process 206e. The technique determination program 207f functions as a technique determination process 206f. The evaluation program 207g functions as an evaluation process 206g.
The processing of the acquisition process 206a corresponds to the processing of the acquisition unit 151. The processing of the generation process 206b corresponds to the processing of the generation unit 152. The processing of the calculation process 206c corresponds to the processing of the calculation unit 153. The processing of the identification process 206d corresponds to the processing of the identification unit 154. The processing of the correction process 206e corresponds to the processing of the correction unit 155. The processing of the technique determination process 206f corresponds to the processing of the technique determination unit 156. The processing of the evaluation process 206g corresponds to the processing of the evaluation unit 157.
Note that the programs 207a to 207g do not necessarily have to be stored in the hard disk device 207 from the beginning. For example, each program may be stored in a "portable physical medium" such as a flexible disk (FD), CD-ROM, DVD, magneto-optical disk, or IC card inserted into the computer 200, and the computer 200 may read out and execute each of the programs 207a to 207g.
100 Information processing apparatus
110 Communication unit
120 Input unit
130 Display unit
140 Storage unit
141 Distance image DB
142 Joint definition data
143 Joint position DB
144 Skeletal information DB
145 Exercise dictionary data
146 Grip determination table
147 Grip determination result table
148 Technique dictionary data
149 Technique determination result table
150 Control unit
151 Acquisition unit
152 Generation unit
153 Calculation unit
154 Identification unit
155 Correction unit
156 Technique determination unit
157 Evaluation unit

Claims (15)

  1.  A technique recognition program for causing a computer to execute a process comprising:
      acquiring sensing data obtained by sensing an athlete;
      generating time-series three-dimensional skeletal information of the athlete based on the sensing data; and
      identifying, when an exercise performed by the athlete is identified based on the skeletal information, the athlete's grip on a bar at the end of a second exercise based on a combination of the athlete's grip on the bar at the end of a first exercise and the second exercise that follows the first exercise.
  2.  The technique recognition program according to claim 1, wherein the process of identifying the grip identifies a likelihood for each of a plurality of grips and identifies, based on the identified likelihoods, the grip on the bar at the end of the second exercise.
  3.  The technique recognition program according to claim 2, the program further causing the computer to execute a process of correcting the likelihoods of the plurality of grips when the transition from the athlete's grip on the bar at the end of the first exercise to the athlete's grip on the bar in the second exercise is anatomically contradictory.
  4.  The technique recognition program according to claim 1, wherein the process of identifying the grip identifies one of a regular grip, a reverse grip, and an eagle grip as the athlete's grip on the bar.
  5.  The technique recognition program according to claim 1, wherein the identifying process identifies the exercise performed by the athlete based on the athlete's grip on the bar, the program further causing the computer to execute a process of evaluating the athlete's gymnastics performance based on an identification result of the identifying process.
  6.  A technique recognition method in which a computer executes a process comprising:
      acquiring sensing data obtained by sensing an athlete;
      generating time-series three-dimensional skeletal information of the athlete based on the sensing data; and
      identifying, when an exercise performed by the athlete is identified based on the skeletal information, the athlete's grip on a bar at the end of a second exercise based on a combination of the athlete's grip on the bar at the end of a first exercise and the second exercise that follows the first exercise.
  7.  The technique recognition method according to claim 6, wherein the process of identifying the grip identifies a likelihood for each of a plurality of grips and identifies, based on the identified likelihoods, the grip on the bar at the end of the second exercise.
  8.  The technique recognition method according to claim 7, wherein the computer further executes a process of correcting the likelihoods of the plurality of grips when the transition from the athlete's grip on the bar at the end of the first exercise to the athlete's grip on the bar in the second exercise is anatomically contradictory.
  9.  The technique recognition method according to claim 6, wherein the process of identifying the grip identifies one of a regular grip, a reverse grip, and an eagle grip as the athlete's grip on the bar.
  10.  The technique recognition method according to claim 6, wherein the identifying process identifies the exercise performed by the athlete based on the athlete's grip on the bar, and the computer further executes a process of evaluating the athlete's gymnastics performance based on an identification result of the identifying process.
  11.  An information processing apparatus comprising:
      an acquisition unit that acquires sensing data obtained by sensing an athlete; and
      an identification unit that generates time-series three-dimensional skeletal information of the athlete based on the sensing data and that, when an exercise performed by the athlete is identified based on the skeletal information, identifies the athlete's grip on a bar at the end of a second exercise based on a combination of the athlete's grip on the bar at the end of a first exercise and the second exercise that follows the first exercise.
  12.  The information processing apparatus according to claim 11, wherein the identification unit identifies a likelihood for each of a plurality of grips and identifies, based on the identified likelihoods, the grip on the bar at the end of the second exercise.
  13.  The information processing apparatus according to claim 12, further comprising a correction unit that corrects the likelihoods of the plurality of grips when the transition from the athlete's grip on the bar at the end of the first exercise to the athlete's grip on the bar in the second exercise is anatomically contradictory.
  14.  The information processing apparatus according to claim 11, wherein the identification unit identifies one of a regular grip, a reverse grip, and an eagle grip as the athlete's grip on the bar.
  15.  The information processing apparatus according to claim 11, further comprising: a technique determination unit that identifies the exercise performed by the athlete based on the athlete's grip on the bar; and an evaluation unit that evaluates the athlete's gymnastics performance based on an identification result of the technique determination unit.
PCT/JP2021/043871 2021-11-30 2021-11-30 Technique recognition program, technique recognition method, and information processing device WO2023100246A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2021/043871 WO2023100246A1 (en) 2021-11-30 2021-11-30 Technique recognition program, technique recognition method, and information processing device


Publications (1)

Publication Number Publication Date
WO2023100246A1 true WO2023100246A1 (en) 2023-06-08

Family

ID=86611719

Country Status (1)

Country Link
WO (1) WO2023100246A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018070414A1 (en) * 2016-10-11 2018-04-19 富士通株式会社 Motion recognition device, motion recognition program, and motion recognition method
CN111527520A (en) * 2017-12-27 2020-08-11 富士通株式会社 Extraction program, extraction method, and information processing device


Non-Patent Citations (1)

Title
HIDEKI TOMIMORI, RYO MURAKAMI, TAKUYA SATO, KAZUO SASAKI: "A Judging Support System for Gymnastics Using 3D Sensing", NIHON ROBOTTO GAKKAISHI - JOURNAL OF THE ROBOTICS SOCIETY OF JAPAN, ROBOTICS SOCIETY OF JAPAN, TOKYO, JP, vol. 38, no. 4, 15 May 2020 (2020-05-15), JP , pages 339 - 344, XP009538617, ISSN: 0289-1824, DOI: 10.7210/jrsj.38.339 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21966332

Country of ref document: EP

Kind code of ref document: A1