WO2018069981A1 - Dispositif, programme et procédé de reconnaissance de mouvement - Google Patents

Dispositif, programme et procédé de reconnaissance de mouvement Download PDF

Info

Publication number
WO2018069981A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
frame
group
frames
data
Prior art date
Application number
PCT/JP2016/080150
Other languages
English (en)
Japanese (ja)
Inventor
矢吹 彰彦
Original Assignee
富士通株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 富士通株式会社 filed Critical 富士通株式会社
Priority to PCT/JP2016/080150 priority Critical patent/WO2018069981A1/fr
Priority to CN201780062410.0A priority patent/CN109863535B/zh
Priority to JP2018545018A priority patent/JP6733738B2/ja
Priority to PCT/JP2017/036797 priority patent/WO2018070414A1/fr
Priority to EP17859616.9A priority patent/EP3528207B1/fr
Publication of WO2018069981A1 publication Critical patent/WO2018069981A1/fr
Priority to US16/362,701 priority patent/US11176359B2/en

Links

Images

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1126 - Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B 5/1128 - Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1121 - Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/1122 - Determining geometric values of movement trajectories
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00 - Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B 24/0003 - Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00 - Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B 24/0062 - Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 5/00 - Apparatus for jumping
    • A63B 5/12 - Bolster vaulting apparatus, e.g. horses, bucks, tables
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 - Pattern recognition
    • G06F 18/20 - Analysing
    • G06F 18/24 - Classification techniques
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/215 - Motion-based segmentation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/20 - Analysis of motion
    • G06T 7/246 - Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/70 - Determining position or orientation of objects or cameras
    • G06T 7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/40 - Scenes; Scene-specific elements in video content
    • G06V 20/46 - Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • G06V 20/47 - Detecting features for summarising video content
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 - Scenes; Scene-specific elements
    • G06V 20/40 - Scenes; Scene-specific elements in video content
    • G06V 20/49 - Segmenting video sequences, i.e. computational techniques such as parsing or cutting the sequence, low-level clustering or determining units such as shots or scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V 40/103 - Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V 40/20 - Movements or behaviour, e.g. gesture recognition
    • G06V 40/23 - Recognition of whole body movements, e.g. for sport training
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2503/00 - Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/10 - Athletes
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 24/00 - Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B 24/0062 - Monitoring athletic performances, e.g. for determining the work of a user on an exercise apparatus, the completed jogging or cycling distance
    • A63B 2024/0071 - Distinction between different activities, movements, or kind of sports performed
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 2220/00 - Measuring of physical parameters relating to sporting activity
    • A63B 2220/05 - Image processing for measuring physical parameters
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 2220/00 - Measuring of physical parameters relating to sporting activity
    • A63B 2220/20 - Distances or displacements
    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 2220/00 - Measuring of physical parameters relating to sporting activity
    • A63B 2220/80 - Special sensors, transducers or devices therefor
    • A63B 2220/806 - Video cameras
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30196 - Human being; Person
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 - Subject of image; Context of image processing
    • G06T 2207/30221 - Sports video; Sports image

Definitions

  • The present invention relates to a motion recognition device and the like.
  • Prior art 1 is a technique for evaluating a player performing on a horizontal bar.
  • Prior art 1 detects a key pose from a two-dimensional silhouette image, recognizes a technique from a combination of key poses, and performs automatic scoring.
  • However, since the technique is recognized using a two-dimensional silhouette image, only those competition techniques that can be discriminated from the two-dimensional silhouette alone can be recognized.
  • Prior art 2 is a technique in which a prototype of the player's skeleton movement is generated in advance for each technique, and each generated prototype is compared with the skeleton information of the player performing the action to determine the technique.
  • An object of the present invention is to provide a motion recognition device, a motion recognition program, and a motion recognition method that can efficiently evaluate a subject's performance.
  • The motion recognition device includes a segmentation unit, an identification unit, and a first evaluation unit.
  • The segmentation unit chronologically segments a plurality of frames, each including position information of feature points corresponding to predetermined parts or joints of the subject's body, based on the position of a predetermined part of the subject's body.
  • In other words, the plurality of frames are classified into a plurality of groups in time series.
  • The identification unit identifies, for each group, the type of basic motion corresponding to the group based on the movement of the feature points included in its consecutive frames.
  • The first evaluation unit evaluates the technique and difficulty of the exercise performed by the subject based on the identified types of basic motion.
  • FIG. 1 is a diagram illustrating a configuration of a system according to the present embodiment.
  • FIG. 2 is a diagram for explaining an example of the feature points of the skeleton.
  • FIG. 3 is a diagram illustrating an example of the data structure of the skeleton data.
  • FIG. 4 is a functional block diagram showing the configuration of the motion recognition device.
  • FIG. 5 is a diagram illustrating an example of a data structure of calculation result data.
  • FIG. 6 is a diagram for explaining an example of the body vector.
  • FIG. 7 is a diagram for explaining the angle between the horse and the body.
  • FIG. 8 is a diagram illustrating an example of a data structure of the technique authorization rule data.
  • FIG. 9 is a diagram illustrating an example of an angle region.
  • FIG. 10 is a diagram illustrating an example of a support position.
  • FIG. 11 is a diagram illustrating an example of the data structure of the upgrade rule data.
  • FIG. 12 is a diagram illustrating an example of a data structure of the evaluation result data.
  • FIG. 13 is a diagram for explaining an example of the segment processing.
  • FIG. 14 is a diagram for explaining the evaluation function F (k).
  • FIG. 15 is a diagram illustrating an example of a display screen.
  • FIG. 16 is a flowchart (1) illustrating a processing procedure of the motion recognition apparatus.
  • FIG. 17 is a flowchart (2) illustrating the processing procedure of the motion recognition apparatus.
  • FIG. 18 is a flowchart showing a processing procedure for determining a technique name and a difficulty level.
  • FIG. 19 is a flowchart illustrating a processing procedure for upgrading the difficulty level.
  • FIG. 20 is a diagram illustrating an example of processing of the motion recognition device.
  • FIG. 21 is a diagram illustrating an example of a hardware configuration of a computer that realizes the same function as the motion recognition apparatus.
  • FIG. 1 is a diagram illustrating the configuration of the system according to the present embodiment.
  • As shown in FIG. 1, this system includes a 3D (three-dimensional) sensor 10, a skeleton recognition device 20, and a motion recognition device 100.
  • The subject 5 performs a predetermined gymnastic exercise within the imaging range of the 3D sensor 10.
  • The present invention is not limited to this, and may be applied to other gymnastic exercises.
  • The 3D sensor 10 is a sensor that measures distance information from its installation position to each observation point on the subject 5 within its shooting range. For example, the 3D sensor 10 generates, for each frame, three-dimensional data indicating the three-dimensional coordinates of each observation point, and outputs the generated three-dimensional data to the skeleton recognition device 20.
  • A frame is the set of three-dimensional coordinates of the observation points measured by the 3D sensor 10 at a certain timing; the three-dimensional data is composed of a plurality of frames.
  • The skeleton recognition device 20 is a device that recognizes a plurality of feature points constituting the skeleton of the subject 5 based on the three-dimensional data.
  • FIG. 2 is a diagram for explaining an example of the feature points of the skeleton. As shown in FIG. 2, the feature points of the skeleton include feature points 5a to 5p.
  • Feature point 5a is a point corresponding to the position of the head.
  • Feature points 5b and 5e are points corresponding to the positions of the shoulder joints (the right shoulder joint and the left shoulder joint).
  • Feature points 5c and 5f are points corresponding to the positions of the elbow joints (the right elbow joint and the left elbow joint).
  • Feature points 5d and 5g are points corresponding to the positions of the wrists (the right wrist and the left wrist).
  • Feature point 5h is a point corresponding to the position of the neck.
  • Feature point 5i is a point corresponding to the back (the center of the back).
  • Feature point 5j is a point corresponding to the waist (the center of the waist).
  • Feature points 5k and 5n are points corresponding to the positions of the hip joints (the right hip joint and the left hip joint).
  • Feature points 5l and 5o are points corresponding to the positions of the knee joints (the right knee joint and the left knee joint).
  • Feature points 5m and 5p are points corresponding to the positions of the ankles (the right ankle and the left ankle).
  • The skeleton recognition device 20 recognizes the feature points of the skeleton for each frame based on the three-dimensional data for each frame, and generates skeleton data.
  • FIG. 3 is a diagram illustrating an example of the data structure of the skeleton data. As shown in FIG. 3, the skeleton data has the three-dimensional coordinates of the feature points 5a to 5p for each frame. Each frame is given a frame number that uniquely identifies it.
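As a rough sketch, the per-frame skeleton data described above (a frame number plus the three-dimensional coordinates of the sixteen feature points 5a to 5p) could be represented as follows; the type names and layout are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Point3D = Tuple[float, float, float]

@dataclass
class SkeletonFrame:
    """One frame of skeleton data: a unique frame number plus the
    three-dimensional coordinates of the feature points 5a to 5p."""
    frame_number: int
    feature_points: Dict[str, Point3D]

# The sixteen skeleton feature points from FIG. 2.
FEATURE_NAMES = ["5a", "5b", "5c", "5d", "5e", "5f", "5g", "5h",
                 "5i", "5j", "5k", "5l", "5m", "5n", "5o", "5p"]

# A dummy frame with every point at the origin, for illustration only.
frame = SkeletonFrame(
    frame_number=0,
    feature_points={name: (0.0, 0.0, 0.0) for name in FEATURE_NAMES},
)
```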
  • The motion recognition device 100 is a device that evaluates the technique and difficulty of the gymnastics performed by the subject 5 based on the skeleton data.
  • FIG. 4 is a functional block diagram showing the configuration of the motion recognition device. As illustrated in FIG. 4, the motion recognition device 100 includes an interface unit 110, an input unit 120, a display unit 130, a storage unit 140, and a control unit 150.
  • The interface unit 110 is an interface that transmits and receives data to and from the skeleton recognition device 20 and other external devices.
  • The control unit 150, described later, exchanges data with the skeleton recognition device 20 via the interface unit 110.
  • The input unit 120 is an input device for inputting various information to the motion recognition device 100.
  • The input unit 120 corresponds to a keyboard, a mouse, a touch panel, or the like.
  • The display unit 130 is a display device that displays information output from the control unit 150.
  • The display unit 130 corresponds to a liquid crystal display, a touch panel, or the like.
  • The storage unit 140 stores skeleton data 141, calculation result data 142, technique authorization rule data 143, upgrade rule data 144, and evaluation result data 145.
  • The storage unit 140 corresponds to a semiconductor memory element such as a RAM (Random Access Memory), a ROM (Read Only Memory), or a flash memory, or to a storage device such as an HDD (Hard Disk Drive).
  • The skeleton data 141 is the skeleton data generated by the skeleton recognition device 20.
  • The data structure of the skeleton data 141 is the same as the data structure of the skeleton data described with reference to FIG. 3.
  • The calculation result data 142 is data holding information on the positions of the hands and feet and the body posture of the subject 5.
  • The calculation result data 142 is calculated by the calculation unit 152 described later.
  • FIG. 5 is a diagram illustrating an example of the data structure of the calculation result data. As shown in FIG. 5, the calculation result data 142 associates frame numbers, hand/foot position data, body vector data, and segment numbers.
  • The frame number is information that uniquely identifies the frame.
  • The hand/foot position data is data indicating the three-dimensional coordinates of both hands and both feet.
  • The hand position data corresponds to the three-dimensional coordinates of the wrist feature points 5d and 5g.
  • The foot position data corresponds to the three-dimensional coordinates of the ankle feature points 5m and 5p.
  • The body vector data is data indicating the direction and magnitude of the body vector.
  • FIG. 6 is a diagram for explaining an example of the body vector.
  • The body vector 7 corresponds to the normal vector of the plane 6 passing through the feature point 5b of the right shoulder joint, the feature point 5e of the left shoulder joint, and the feature point 5i of the back.
  • The magnitude of the body vector 7 is "1".
  • The body vector is defined in an orthogonal coordinate system consisting of an X axis, a Y axis, and a Z axis.
  • The direction of the X axis is the longitudinal direction of the horse 8.
  • The direction of the Y axis is the direction orthogonal to the X axis.
  • The direction of the Z axis is the vertically upward direction with respect to the XY plane.
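The body vector computation described above (a unit normal of the plane through feature points 5b, 5e, and 5i) can be sketched as below. The cross-product order, and therefore which side of the plane the vector points to, is an assumption; the text only fixes the magnitude to 1.

```python
import math

def body_vector(p5b, p5e, p5i):
    """Unit normal of the plane through the right shoulder joint (5b),
    the left shoulder joint (5e), and the back (5i)."""
    # Two edge vectors spanning the plane.
    ux, uy, uz = (p5e[k] - p5b[k] for k in range(3))
    vx, vy, vz = (p5i[k] - p5b[k] for k in range(3))
    # Normal = u x v (the sign convention is an assumption).
    nx, ny, nz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    # Normalize so the magnitude of the body vector is 1.
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    return (nx / norm, ny / norm, nz / norm)

# Shoulders along the X axis, back point below the shoulder line:
# the resulting normal is perpendicular to both edge vectors.
v = body_vector((0, 0, 0), (1, 0, 0), (0.5, 0, -0.5))
```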
  • FIG. 7 is a diagram for explaining the angle between the horse and the body.
  • The angle θ between the horse and the body is the angle formed by the line segment 1a and the line segment 1b.
  • The line segment 1b is the line segment obtained by projecting the body vector 7 onto the XY plane and extending it in front of the subject 5.
  • The line segment 1a is the line segment corresponding to the Y axis.
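A minimal sketch of the angle θ follows: project the body vector onto the XY plane and measure its angle against the Y axis (segment 1a). The measuring direction (counter-clockwise from +Y) is an assumption; the patent only defines the two line segments.

```python
import math

def horse_body_angle(body_vec):
    """Angle between line segment 1a (the Y axis) and line segment 1b
    (the body vector projected onto the XY plane), in degrees.
    The rotation direction convention is an assumption."""
    x, y, _ = body_vec  # projecting onto the XY plane drops the Z component
    return math.degrees(math.atan2(x, y)) % 360.0

theta = horse_body_angle((0.0, 1.0, 0.0))  # projection lies on the Y axis
```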
  • The segment number is a number indicating the group to which a frame belongs.
  • When the subject performs a certain basic motion, a group treats the frames from the start frame of the basic motion to its end frame as one set.
  • The same segment number is assigned to frames belonging to the same group.
  • The segment number is set by the segmentation unit 153 described later. For example, among a plurality of frames that are continuous in time series, the frames from one node frame up to the next node frame are included in the same group. In the following description, a plurality of frames belonging to the same group is referred to as "partial data" as appropriate.
  • The identification unit 154, described later, identifies the type of basic motion corresponding to each piece of partial data.
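The grouping described above can be sketched as assigning segment numbers given the frames detected as segment points (nodes). Treating each group as the half-open run from one node up to, but not including, the next node is an assumption; the text only says consecutive frames between nodes form one group.

```python
def assign_segment_numbers(num_frames, node_frames):
    """Assign a segment number to every frame index. Frames from one
    node frame up to (but not including) the next node frame share the
    same group; this boundary convention is an assumption."""
    segments = []
    seg = 0
    nodes = set(node_frames)
    for f in range(num_frames):
        if f in nodes and f != 0:
            seg += 1  # a new basic motion starts at each node frame
        segments.append(seg)
    return segments

# Six frames with node frames 0 and 3 yield two groups (partial data).
segs = assign_segment_numbers(6, [0, 3])
```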
  • The technique authorization rule data 143 is data used when the first evaluation unit 155, described later, determines the technique name and the difficulty level.
  • FIG. 8 is a diagram illustrating an example of the data structure of the technique authorization rule data. As shown in FIG. 8, the technique authorization rule data 143 associates a basic motion type, a start-point body angle region, an end-point body angle region, a start-point left-hand support position, a start-point right-hand support position, an end-point left-hand support position, an end-point right-hand support position, a previous basic motion type, a technique name, a group, and a difficulty level.
  • The basic motion type indicates the type of the basic motion.
  • The start-point body angle region indicates in which angle region the direction of the body vector falls in the start-point frame of the partial data.
  • The end-point body angle region indicates in which angle region the direction of the body vector falls in the end-point frame of the partial data.
  • The notation A (B) indicates that either the angle region A outside the parentheses or the angle region B inside the parentheses is acceptable.
  • FIG. 9 is a diagram illustrating an example of the angle regions.
  • The angle regions include a 0° region, a 90° region, a 180° region, and a 270° region.
  • The line segment 1b is the line segment obtained by projecting the body vector 7 of the subject 5 onto the XY plane and extending it in front of the subject 5.
  • In the example shown, the direction of the body vector 7 falls in the 0° region.
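Quantizing a body angle into one of the four regions above could look like the following sketch. The ±45° region boundaries are an assumption; FIG. 9 shows the four regions without giving numeric bounds.

```python
def angle_region(angle_deg):
    """Map a body angle (degrees, 0-360) to the 0, 90, 180, or 270
    degree region. The assumed boundaries put each region at
    +/- 45 degrees around its center."""
    return int(((angle_deg + 45.0) % 360.0) // 90.0) * 90

region = angle_region(350.0)  # 350 degrees wraps around into the 0 region
```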
  • The start-point left-hand support position indicates which of the predetermined support positions on the horse the left hand (left wrist) of the subject 5 occupies in the start-point frame of the partial data.
  • The start-point right-hand support position indicates which of the predetermined support positions on the horse the right hand (right wrist) of the subject 5 occupies in the start-point frame of the partial data.
  • The end-point left-hand support position indicates which of the predetermined support positions on the horse the left hand (left wrist) of the subject 5 occupies in the end-point frame of the partial data.
  • The end-point right-hand support position indicates which of the predetermined support positions on the horse the right hand (right wrist) of the subject 5 occupies in the end-point frame of the partial data.
  • For the support positions as well, the notation A (B) indicates that either the support position A outside the parentheses or the support position B inside the parentheses is acceptable.
  • FIG. 10 is a diagram illustrating an example of the support positions.
  • Depending on the region of the horse on which the hand is placed, the position of the hand corresponds to one of the support positions "1" to "5".
  • The previous basic motion type indicates the type of basic motion corresponding to the partial data immediately preceding the partial data of interest.
  • The previous basic motion type "any" indicates that the type of the previous basic motion may be anything.
  • In the following, the partial data immediately preceding the partial data of interest is referred to as the "previous partial data".
  • The technique name is the name of the technique specified by the combination of the basic motion type, the start-point body angle region, the end-point body angle region, the start-point left-hand support position, the start-point right-hand support position, the end-point left-hand support position, the end-point right-hand support position, and the previous basic motion type.
  • The group indicates the group into which the technique specified by the technique name is classified.
  • The difficulty level indicates the difficulty of the technique.
  • The difficulty of a technique increases in the order A, B, C, D, E; difficulty A is the least difficult.
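The rule lookup just described amounts to matching the observed partial data against each rule's conditions, with "any" matching everything. The sketch below makes that concrete; the field names and the single sample rule are hypothetical, not taken from the actual contents of FIG. 8.

```python
# Hypothetical rule table; the sample entry is illustrative only.
TECHNIQUE_RULES = [
    {
        "basic_motion": "swing",
        "start_angle_region": 0, "end_angle_region": 180,
        "start_left": 1, "start_right": 2,
        "end_left": 1, "end_right": 2,
        "prev_basic_motion": "any",
        "name": "sample technique", "group": "I", "difficulty": "A",
    },
]

CONDITION_KEYS = ("basic_motion", "start_angle_region", "end_angle_region",
                  "start_left", "start_right", "end_left", "end_right",
                  "prev_basic_motion")

def authorize_technique(obs):
    """Return (technique name, group, difficulty) of the first rule
    whose every condition matches the observed partial data; a rule
    value of 'any' matches any observed value."""
    for rule in TECHNIQUE_RULES:
        if all(rule[k] in ("any", obs[k]) for k in CONDITION_KEYS):
            return rule["name"], rule["group"], rule["difficulty"]
    return None

result = authorize_technique({
    "basic_motion": "swing",
    "start_angle_region": 0, "end_angle_region": 180,
    "start_left": 1, "start_right": 2,
    "end_left": 1, "end_right": 2,
    "prev_basic_motion": "handstand",
})
```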
  • The upgrade rule data 144 is data used when the upgrade unit 156, described later, determines whether or not to upgrade the difficulty level of a technique.
  • FIG. 11 is a diagram illustrating an example of the data structure of the upgrade rule data. As shown in FIG. 11, the upgrade rule data 144 includes the technique name, the body rotation angle in the previous motion, the start-point left-hand support position in the previous motion, the start-point right-hand support position in the previous motion, the end-point left-hand support position in the previous motion, and the end-point right-hand support position in the previous motion. The upgrade rule data 144 further includes the previous basic motion type and the number of difficulty upgrades.
  • The technique name corresponds to the technique name specified by the first evaluation unit 155 for the partial data of interest, and is the target of the decision on whether or not to upgrade the difficulty level.
  • The body rotation angle in the previous motion indicates the total of the rotation angles of the body vector from the start-point frame to the end-point frame of the previous partial data.
  • The rotation angle is the rotation angle when the body vector is projected onto the XY plane, as described with reference to FIG. 7.
  • The total rotation angle is calculated taking the position where the body vector of the start-point frame of the previous partial data is projected onto the XY plane as 0°.
  • The start-point left-hand support position in the previous motion indicates which of the predetermined support positions on the horse the left hand (left wrist) of the subject 5 occupies in the start-point frame of the previous partial data.
  • The start-point right-hand support position in the previous motion indicates which of the predetermined support positions on the horse the right hand (right wrist) of the subject 5 occupies in the start-point frame of the previous partial data.
  • The end-point left-hand support position in the previous motion indicates which of the predetermined support positions on the horse the left hand (left wrist) of the subject 5 occupies in the end-point frame of the previous partial data.
  • The end-point right-hand support position in the previous motion indicates which of the predetermined support positions on the horse the right hand (right wrist) of the subject 5 occupies in the end-point frame of the previous partial data.
  • The previous basic motion type indicates the type of basic motion corresponding to the previous partial data.
  • The number of difficulty upgrades is information indicating by how many levels the difficulty of the technique specified by the first evaluation unit 155 is to be upgraded when the corresponding condition is met.
  • For example, suppose the technique name specified by the first evaluation unit 155 is "front crossing", the body rotation angle in the previous motion is "less than 360°", the start-point and end-point left-hand and right-hand support positions in the previous motion are all "3", and the previous basic motion type is "inverted twist".
  • In this case, the record in the first row of the upgrade rule data 144 is hit, and the number of difficulty upgrades is "1". For example, when the current difficulty level is "A", upgrading the difficulty by one level yields the corrected difficulty level "B".
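The A-to-B example above is a one-level bump along the difficulty scale, which can be sketched as follows. Capping the result at "E" is an assumption, since the text only lists difficulties A through E and gives a single one-level example.

```python
DIFFICULTY_ORDER = "ABCDE"  # A is the least difficult

def upgrade_difficulty(level, n_upgrades):
    """Raise the difficulty by n levels along A..E, capping at the
    highest listed level (the cap is an assumption)."""
    i = DIFFICULTY_ORDER.index(level) + n_upgrades
    return DIFFICULTY_ORDER[min(i, len(DIFFICULTY_ORDER) - 1)]

new_level = upgrade_difficulty("A", 1)  # the one-level example from the text
```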
  • The evaluation result data 145 is data holding information on the results of evaluating the performance of the subject 5.
  • FIG. 12 is a diagram illustrating an example of the data structure of the evaluation result data. As shown in FIG. 12, the evaluation result data 145 associates a frame number, right-hand position, left-hand position, body angle, foot height, body direction, segment flag, basic motion type, technique name, difficulty level, and E score.
  • The frame number is information that uniquely identifies the frame.
  • The right-hand position indicates which of the predetermined support positions on the horse the right hand (right wrist) of the subject 5 occupies.
  • The left-hand position indicates which of the predetermined support positions on the horse the left hand (left wrist) of the subject 5 occupies.
  • The body angle indicates the body angle of the subject 5.
  • The angle formed between the Y axis and the line segment obtained by projecting the body vector of the corresponding frame onto the XY plane is defined as the body angle.
  • The foot height indicates the higher of the position of the right foot (right ankle) of the subject 5 and the position of the left foot (left ankle) of the subject 5.
  • The body direction indicates whether the Z component of the body vector points in the positive or the negative direction.
  • The body direction "down" indicates that the Z component of the body vector is negative.
  • The body direction "up" indicates that the Z component of the body vector is positive.
  • The segment flag is information indicating whether or not the corresponding frame corresponds to a segment point. Whether or not a frame is a node is determined by the segmentation unit 153 described later. If the corresponding frame is a node, the segment flag is "1"; if it is not a node, the segment flag is "0".
  • The descriptions of the basic motion type, the technique name, and the difficulty level are the same as those given with reference to FIG. 8.
  • The E score indicates the E score of the basic motion type corresponding to the partial data.
  • The E score is calculated by the second evaluation unit 157 described later.
  • the control unit 150 includes an acquisition unit 151, a calculation unit 152, a segmentation unit 153, an identification unit 154, a first evaluation unit 155, an upgrade unit 156, a second evaluation unit 157, and a display control unit 158.
  • the control unit 150 can be realized by a CPU (Central Processing Unit) or an MPU (Micro Processing Unit).
  • the control unit 150 can also be realized by hard wired logic such as ASIC (Application Specific Integrated Circuit) or FPGA (Field Programmable Gate Array).
  • the acquisition unit 151 is a processing unit that acquires the skeleton data 141 from the skeleton recognition device 20.
  • the acquisition unit 151 registers the acquired skeleton data 141 in the storage unit 140.
  • the data structure of the skeleton data 141 has been described with reference to FIG.
  • the calculation unit 152 is a processing unit that calculates the calculation result data 142 based on the skeleton data 141.
  • the data structure of the calculation result data 142 has been described with reference to FIG.
  • hand / foot position data and body vector data corresponding to the frame number are registered by the calculation unit 152, and segment numbers are registered by the segment unit 153.
  • the calculation unit 152 calculates hand / foot position data and body vector data for each frame. For example, the calculation unit 152 calculates the three-dimensional coordinates of the feature points 5d and 5g of the skeleton data 141 as the hand position. The calculation unit 152 calculates the three-dimensional coordinates of the feature points 5m and 5p of the skeleton data 141 as the foot position. The calculation unit 152 registers hand / foot position data in the calculation result data 142 in association with the frame number of the corresponding frame.
  • the calculation unit 152 calculates a normal vector of the plane 6 passing through the feature point 5b of the right shoulder joint, the feature point 5e of the left shoulder joint, and the feature point 5i of the back as a body vector.
  • the calculation unit 152 registers the body vector data in the calculation result data 142 in association with the frame number of the corresponding frame.
  • the calculation unit 152 generates the calculation result data 142 by repeatedly executing the above process for each frame of the skeleton data 141.
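As an illustrative sketch of the body vector computation above, the normal of the plane 6 through the right shoulder (5b), left shoulder (5e), and back (5i) feature points can be obtained as a cross product of two edges of that plane. The helper names are hypothetical, not from the original; this is a minimal pure-Python sketch, not the actual implementation of the calculation unit 152:

```python
def sub(a, b):
    # Component-wise difference of two 3D points.
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    # Cross product of two 3D vectors.
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def body_vector(right_shoulder, left_shoulder, back):
    """Normal vector of the plane through the three feature points
    (5b, 5e, 5i in the description): the body vector."""
    return cross(sub(left_shoulder, right_shoulder),
                 sub(back, right_shoulder))
```

For example, three points lying in the XY plane yield a normal along the Z axis, whose sign then drives the "body direction up/down" determination described earlier.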
  • the segment unit 153 is a processing unit that identifies a frame serving as a segment point based on the calculation result data 142 and classifies the plurality of frames into a plurality of groups based on the identified segment point.
  • the segment unit 153 accesses the calculation result data 142 and sets the segment numbers of the frames belonging to the same group to be the same.
  • the segment unit 153 refers to hand / foot position data corresponding to a certain frame number in the calculation result data 142.
  • the segment unit 153 determines that a frame corresponding to a certain frame number is a segment point when all of the following conditions A1, A2, and A3 are satisfied. When all of the conditions A1 to A3 are satisfied, it means that the subject 5 places both hands on the horse 8 and the direction of the body vector is downward.
  • Condition A1 The position of the left hand of the subject 5 is located in any of the regions 8a to 8e shown in FIG.
  • Condition A2 The position of the right hand of the subject 5 is located in any of the areas 8a to 8e shown in FIG.
  • Condition A3 The Z-axis component of the body vector is negative.
  • the segment unit 153 accesses the evaluation result data and sets the segment flag corresponding to the frame number determined to be a segment point to “1”. The initial value of the segment flag is set to “0”.
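Conditions A1 to A3 can be sketched as follows. The numeric bounds for the regions 8a to 8e along the horse are hypothetical placeholders (FIG. 10 defines the real regions); only the structure of the check reflects the description:

```python
# Hypothetical x-axis bounds for regions 8a-8e on the horse (assumed values).
REGIONS = {"8a": (0.0, 0.2), "8b": (0.2, 0.4), "8c": (0.4, 0.6),
           "8d": (0.6, 0.8), "8e": (0.8, 1.0)}

def in_any_region(hand_x):
    # True when the hand position falls inside one of the regions 8a-8e.
    return any(lo <= hand_x < hi for lo, hi in REGIONS.values())

def is_segment_point(left_hand_x, right_hand_x, body_vector_z):
    """Conditions A1-A3: both hands on the horse and body vector downward."""
    return (in_any_region(left_hand_x)        # condition A1
            and in_any_region(right_hand_x)   # condition A2
            and body_vector_z < 0)            # condition A3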
  • FIG. 13 is a diagram for explaining an example of the segment processing.
  • the plurality of frames shown in FIG. 13 are arranged in time series.
  • the frames 30a, 31a, and 32a satisfy the above-described conditions A1 to A3, and are segment point frames.
  • the segment unit 153 classifies each frame included in the range 30 into the same group.
  • the segment unit 153 classifies the frames included in the range 31 into the same group.
  • the segment unit 153 classifies the frames included in the range 32 into the same group.
  • the segment number of each frame included in the range 30 is “n”, the segment number of each frame included in the range 31 is “n + 1”, and the segment number of each frame included in the range 32 is “n + 2”.
  • n is a natural number of 1 or more.
  • In the range 30, the top frame 30a is the start frame and the end frame 30b is the end frame.
  • In the range 31, the first frame 31a is the start frame and the last frame 31b is the end frame.
  • In the range 32, the first frame 32a is the start frame and the last frame 32b is the end frame.
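The grouping above can be sketched as assigning a segment number to every frame, incrementing at each segment point frame (a minimal illustrative helper, assuming each segment point starts a new group as in FIG. 13):

```python
def classify_into_groups(segment_flags, first_segment_number=1):
    """Assign a segment number to every frame; the number increments at
    each frame whose segment flag is 1 (a segment point frame)."""
    numbers, n = [], first_segment_number
    for i, flag in enumerate(segment_flags):
        if flag == 1 and i > 0:
            n += 1          # a new group (range) begins at this frame
        numbers.append(n)
    return numbers
```

Frames sharing a segment number then form one piece of partial data, e.g. segment numbers n, n+1, n+2 for the ranges 30, 31, 32.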
  • partial data and “previous partial data” will be described again.
  • Data obtained by collecting the frames included in each of the ranges 30, 31, and 32 is partial data. Assuming that the partial data of interest is each frame included in the range 31, the previous partial data is each frame included in the range 30.
  • the identification unit 154 is a processing unit that identifies, for each partial data, the basic motion type corresponding to the partial data based on the motion of the feature points of the frames included in the partial data.
  • the identifying unit 154 acquires information on the feature points of the frames included in the partial data from the calculation result data 142.
  • the identification unit 154 registers the identification result in the evaluation result data 145 in association with the frame number.
  • the identification unit 154 uses an HMM (Hidden Markov Model).
  • the HMM is a probability model that takes time-series data as input and determines which category (basic exercise type) the time-series data belongs to.
  • the identification unit 154 has a learning phase and a recognition phase.
  • in the learning phase, the HMM determines, for each basic exercise type, the parameter set λ consisting of four parameters.
  • the parameter set λ is defined by equation (1).
  • λ = {Q, A, B, π} … (1)
  • Q represents a set of states and is defined by equation (2).
  • Q = {q1, q2, …, qn} … (2)
  • A represents a state transition probability matrix that is a set of transition probabilities a ij from state q i to state q j , and is defined by equation (3).
  • A = {aij} … (3)
  • B represents a set of probability distributions bj(x) that output a vector x in the state qj, and is defined by Equation (4): B = {bj(x)} … (4)
  • the probability density function in the state qi is defined by Equation (5). Assuming a Gaussian emission consistent with the definitions below, bi(x) = (2π)^(−d/2) |Σi|^(−1/2) exp(−(x − ui)ᵀ Σi^(−1) (x − ui) / 2) … (5), where d is the dimension of x.
  • ui represents the mean vector of the probability density function.
  • Σi indicates the covariance matrix of the probability density function.
  • the identification unit 154 randomly sets HMM parameters as an initial state when obtaining a parameter set ⁇ of a certain basic exercise type.
  • the identification unit 154 prepares a plurality of partial data corresponding to a certain basic motion type, and calculates the likelihood P(O|λk) of each partial data.
  • the identification unit 154 repeatedly optimizes the parameter set λk, using the partial data as the teacher signal for the HMM, so that the likelihood P is maximized.
  • the identification unit 154 identifies a parameter set ⁇ of a certain basic exercise type through such processing.
  • the identification unit 154 learns the parameter set ⁇ for each basic exercise type by performing the same process for other basic exercise types.
  • in the recognition phase, the identification unit 154 acquires the partial data whose basic exercise type is to be recognized, obtains the likelihood for each parameter set λk learned in the learning phase based on Expression (6), and specifies the parameter set λk0 that gives the maximum value.
  • the identification unit 154 determines the basic exercise type corresponding to the parameter set ⁇ used when ⁇ k0 is calculated as the basic exercise type of the partial data to be recognized.
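As a hedged sketch of this recognition phase, the following toy example computes P(O|λ) with the forward algorithm for a discrete-emission HMM and picks the basic motion type whose parameter set maximizes the likelihood. The device described here uses continuous Gaussian emissions as in Equation (5), and the models below are invented for illustration only:

```python
def forward_likelihood(obs, pi, A, B):
    """Forward algorithm: P(O | lambda) for a discrete-emission HMM.
    pi[i]: initial probability of state i, A[i][j]: transition
    probability i -> j, B[i][o]: probability of emitting symbol o."""
    n = len(pi)
    # Initialization with the first observation.
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    # Induction over the remaining observations.
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)

def classify(obs, models):
    """Pick the basic motion type whose parameter set lambda_k
    gives the maximum likelihood (the role of lambda_k0 above)."""
    return max(models, key=lambda k: forward_likelihood(obs, *models[k]))
```

For instance, with one toy model that tends to emit symbol 0 and one that tends to emit symbol 1, an observation sequence of zeros is classified under the first model.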
  • the first evaluation unit 155 is a processing unit that evaluates the technique name and difficulty of the exercise performed by the subject 5.
  • In the following description of the first evaluation unit 155, the partial data to be evaluated is simply referred to as partial data, and the partial data immediately before the partial data to be evaluated is referred to as previous partial data.
  • the first evaluation unit 155 identifies the basic exercise type of the partial data based on the frame number included in the partial data and the evaluation result data 145.
  • the first evaluation unit 155 identifies the previous basic exercise type of the previous partial data based on the frame number included in the previous partial data and the evaluation result data 145.
  • the first evaluation unit 155 refers to the calculation result data 142 using the frame number of the start point frame included in the partial data as a key, and specifies the start point body angle region, the start point left hand support position, and the start point right hand support position.
  • the first evaluation unit 155 refers to the calculation result data 142 using the frame number of the end point frame included in the partial data as a key, and specifies the end point body angle region, the end point left hand support position, and the end point right hand support position.
  • the first evaluation unit 155 compares the definition of each angle region described in FIG. 9 with the body vector, and specifies the start point body angle region and the end point body angle region.
  • the first evaluation unit 155 compares the regions 8a to 8e described with reference to FIG. 10 with the hand position data, so that the start point left hand support position, the start point right hand support position, the end point left hand support position, and the end point right hand support position are determined. Identify.
  • the first evaluation unit 155 compares the identified information with the technique authorization rule data 143, and identifies the technique name, difficulty level, and group corresponding to the partial data.
  • each specified information includes a basic exercise type and a previous exercise type.
  • Each identified information includes a start point body angle area, a start point left hand support position, a start point right hand support position, an end point body angle area, an end point left hand support position, and an end point right hand support position.
  • the first evaluation unit 155 registers the specified technique name, difficulty level, and group in the evaluation result data 145 in association with the frame number of the frame included in the partial data. Further, the first evaluation unit 155 registers the right hand position, the left hand position, the body angle, the foot height, and the body direction in association with the frame number. The first evaluation unit 155 identifies the technique name, the degree of difficulty, and the group by repeatedly executing the above processing for other partial data, and registers it in the evaluation result data 145.
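The comparison against the technique authorization rule data 143 can be sketched as a table scan over column records. The miniature rule table and its column names below are hypothetical placeholders, not the actual contents of the rule data 143:

```python
# Hypothetical miniature rule table (the real data 143 holds many more
# columns: support positions, previous motion type, etc.).
RULES = [
    {"basic": "turn", "start_angle": "180", "end_angle": "180",
     "name": "lateral turn", "group": 1, "difficulty": "A"},
]

def authorize(identified):
    """Return (technique name, group, difficulty) for the first rule whose
    columns all match the identified information, or 'unknown' on no hit."""
    for rule in RULES:
        if all(rule.get(k) == v for k, v in identified.items()):
            return rule["name"], rule["group"], rule["difficulty"]
    return "unknown", 0, 0
```

A miss returns the technique name "unknown" with group and difficulty 0, mirroring the no-hit branch of the flowchart described later.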
  • the upgrade unit 156 is a processing unit that determines whether or not to upgrade the difficulty level evaluated by the first evaluation unit 155 and upgrades the difficulty level based on the determination result.
  • the partial data to be evaluated is simply referred to as partial data
  • the partial data immediately before the partial data to be evaluated is referred to as previous partial data.
  • the upgrade unit 156 refers to the evaluation result data 145 using the frame number of the frame included in the partial data as a key, and identifies the technical name corresponding to the partial data evaluated by the first evaluation unit 155.
  • the upgrade unit 156 calculates the body rotation angle based on the body vector of each frame included in the partial data.
  • the body rotation angle is the sum of changes in the body angle from the start point frame to the end point frame.
  • the upgrade unit 156 acquires the data of the body vector of each frame from the calculation result data 142.
  • the upgrade unit 156 refers to the calculation result data 142 using the frame number of the start point frame included in the partial data as a key, and specifies the start point left hand support position and the start point right hand support position.
  • the upgrade unit 156 refers to the calculation result data 142 with the frame number of the end point frame included in the partial data as a key, and specifies the end point left hand support position and the end point right hand support position.
  • the upgrade unit 156 specifies the previous basic exercise type of the previous partial data based on the frame number included in the previous partial data and the evaluation result data 145.
  • the upgrade unit 156 compares each identified information with the upgrade rule data 144 to determine whether or not to upgrade the difficulty corresponding to the partial data.
  • each specified information includes a technical name, a body rotation angle, a start point left hand support position, a start point right hand support position, an end point left hand support position, an end point right hand support position, and a previous basic motion type.
  • the upgrade unit 156 determines that the difficulty level is to be upgraded when there is a column record that hits each identified information in the upgrade rule data 144.
  • the upgrade unit 156 determines that the difficulty level is not upgraded when the column record that hits each identified information does not exist in the upgrade rule data 144.
  • the number of steps to be upgraded refers to the number of difficulty upgrades of hit column records.
  • For example, suppose the technique name of the partial data is “Cross-crossing”,
  • the body rotation angle is “less than 360°”,
  • the support positions of the left and right hands at the start and end points are “3”,
  • and the previous basic exercise type is “inverted twist”.
  • In this case, the record in the first column of the upgrade rule data 144 is hit, and the number of difficulty upgrades is “1”.
  • the upgrade unit 156 corrects the difficulty level to “B”.
  • the upgrade unit 156 updates the difficulty level of the evaluation result data 145 by repeatedly executing the above processing for other partial data.
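The two ingredients of the upgrade determination above can be sketched as follows: summing frame-to-frame body angle changes to get the body rotation angle, and raising the difficulty by the number of upgrade steps. The assumption that difficulty levels ascend one letter per step (A, B, C, …) is illustrative and not stated in the original:

```python
def body_rotation_angle(body_angles):
    """Sum of the changes in body angle from the start frame
    to the end frame of the partial data."""
    return sum(abs(b - a) for a, b in zip(body_angles, body_angles[1:]))

# Assumption: difficulty ascends one letter per upgrade step.
DIFFICULTY_ORDER = "ABCDEFG"

def upgrade(difficulty, steps):
    """Raise the difficulty by the number of difficulty upgrades
    of the hit column record, capped at the highest level."""
    i = DIFFICULTY_ORDER.index(difficulty) + steps
    return DIFFICULTY_ORDER[min(i, len(DIFFICULTY_ORDER) - 1)]
```

With one upgrade step, a difficulty of "A" would be corrected to "B", matching the worked example above.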
  • the second evaluation unit 157 is a processing unit that evaluates the E score of the basic exercise for each partial data. It is assumed that a policy for calculating an E score is set in the second evaluation unit 157 in advance for each basic exercise type.
  • the second evaluation unit 157 calculates the E score of the basic exercise type “down”.
  • “down” indicates a basic motion when the subject 5 descends from the horse.
  • the second evaluation unit 157 uses an evaluation function F (k) shown in Expression (7).
  • the evaluation function F (k) is a function for evaluating the kth frame among a plurality of frames included in the partial data to be evaluated.
  • F(k) = arcsin(Tz / Tx) … (7)
  • FIG. 14 is a diagram for explaining the evaluation function F (k).
  • Tx corresponds to the length of the X component from the shoulder joint of the subject 5 to the toe (ankle) in the kth frame.
  • Tz corresponds to the length of the Z component from the shoulder joint of the subject 5 to the toe (ankle) in the kth frame.
  • the position of the shoulder joint corresponds to the position of the feature point 5b or 5e.
  • the tip of the foot corresponds to the position of the feature point 5m or 5p.
  • When evaluating “down”, the second evaluation unit 157 repeatedly executes the above process for each frame included in the target partial data, and calculates the value of F(k) for each frame. The second evaluation unit 157 determines a deduction of “0.3” when a frame whose F(k) value is less than 30° appears among the frames.
  • the second evaluation unit 157 calculates a final E score for the basic exercise based on the equation (8).
  • Σ shown in the equation (8) indicates the sum total of the deductions determined by the evaluation function over all partial data.
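Equations (7) and (8) can be sketched as below. The 10-point base in `e_score` is an assumption about Equation (8), whose body is not reproduced in this text; Equation (7) follows the Tx/Tz definitions above:

```python
import math

def F(shoulder, toe):
    """Evaluation function (7): arcsin(Tz / Tx) in degrees, where Tx and
    Tz are the X and Z components from shoulder to toe in the kth frame."""
    tx = toe[0] - shoulder[0]
    tz = toe[2] - shoulder[2]
    return math.degrees(math.asin(tz / tx))

def down_deduction(frames):
    """Deduction of 0.3 if any frame of the 'down' partial data
    has F(k) below 30 degrees."""
    return 0.3 if any(F(s, t) < 30.0 for s, t in frames) else 0.0

def e_score(deductions, base=10.0):
    """Assumption: the final E score (Equation (8)) is a base score
    minus the sum of the deductions over all partial data."""
    return base - sum(deductions)
```

For example, a shoulder-to-toe offset of Tx = 2 and Tz = 1 gives F(k) = 30°, the boundary of the deduction rule above.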
  • the “down” evaluation function has been described as an example.
  • deductions for the other basic motions are likewise determined by the evaluation function corresponding to each basic motion.
  • for example, the evaluation function for “turning” partial data evaluates how well the left and right feet are aligned; the farther apart the left and right feet are, the larger the deduction.
  • the second evaluation unit 157 registers the calculated E score information in the evaluation result data 145 in association with the frame number.
  • the second evaluation unit 157 may register the E score in the record of each frame included in the same partial data, or may register the E score only in the record of the start frame among the frames included in the same partial data.
  • the second evaluation unit 157 calculates an E score for other partial data in the same manner, and registers it in the evaluation result data 145.
  • the first evaluation unit 155 may re-determine the difficulty level of the technique based on the evaluation result data 145 and a logical expression indicating a rule.
  • a logical expression representing a rule is a logical expression that associates the order of the technical names of each basic movement with a D score (Difficulty score).
  • the display control unit 158 is a processing unit that generates a display screen based on the evaluation result data 145 and causes the display unit 130 to display the display screen.
  • FIG. 15 is a diagram illustrating an example of a display screen. As shown in FIG. 15, the display screen 50 includes areas 50a to 50g.
  • the display control unit 158 may display the evaluation result data 145 on the display unit 130 as it is.
  • the region 50a is a region for displaying the current hand/foot position data and body vector data of the subject 5.
  • the area 50b is an area for displaying hand / foot position data and body vector data when the subject 5 starts acting.
  • the area 50c is an area for displaying hand / foot position data and body vector data of the subject 5 specified by the frame immediately before the current frame.
  • the area 50d is an area for displaying hand / foot position data and body vector data of the subject 5 specified by a frame two frames before the current frame.
  • the area 50e is an area for displaying hand / foot position data and body vector data of the subject 5 specified by the frame three frames before the current frame.
  • the area 50f is an area for displaying a technique name of the basic exercise specified by the partial data including the current frame.
  • the region 50g is a region for displaying an image in which information of the 3D sensor 10 is visualized.
  • FIGS. 16 and 17 are flowcharts showing the processing procedure of the motion recognition apparatus.
  • the motion recognition apparatus 100 updates the processing target frame number with a value obtained by adding 1 to the frame number (step S101).
  • the calculation unit 152 of the motion recognition apparatus 100 reads the skeleton data 141 (step S102).
  • the calculating unit 152 calculates hand / foot position data and body vector data based on the skeleton data 141 (step S103).
  • the segment part 153 of the motion recognition apparatus 100 sets the front support flag (step S104).
  • the segment unit 153 sets the front support flag to “1” when all of the above-described conditions A1, A2, and A3 are satisfied.
  • the segment part 153 sets the front support flag to “0” when any of the conditions A1, A2, and A3 is not satisfied.
  • the front support flag is associated with the frame number to be processed.
  • the segment unit 153 sets a landing flag (step S105).
  • step S105 the segment unit 153 sets the landing flag to “1” when the position of the left foot tip (left ankle) or the position of the right foot (right ankle) of the subject 5 is less than a predetermined threshold.
  • the segment unit 153 sets the landing flag to “0” when the positions of both the left foot (left ankle) and the right foot (right ankle) of the subject 5 are equal to or greater than the predetermined threshold.
  • the landing flag is associated with the frame number to be processed.
  • the segment unit 153 performs a segment point determination process (step S106).
  • In step S106, the segment unit 153 determines that the frame corresponding to the processing target frame number is a segment point when the front support flag corresponding to the previous frame number is “0” and the front support flag corresponding to the processing target frame number is “1”.
  • the segment unit 153 determines that the frame corresponding to the processing target frame number is not a segment point when the front support flag corresponding to the previous frame number is “1” or when the front support flag corresponding to the processing target frame number is “0”.
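The determination of step S106 above is a rising-edge test on the front support flag, which can be sketched as:

```python
def is_segment_point_frame(prev_front_support, cur_front_support):
    """Step S106: a frame is a segment point exactly when the front
    support flag rises from 0 (previous frame) to 1 (current frame)."""
    return prev_front_support == 0 and cur_front_support == 1
```

This way, a run of consecutive front-support frames yields only one segment point, at the frame where support begins.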
  • When the segment unit 153 determines that the frame is not a segment point (step S107, No), it registers the hand/foot position data, the body vector data, and the current segment number in the calculation result data 142 in association with the frame number (step S108), and proceeds to step S101.
  • When the segment unit 153 determines that the frame is a segment point (step S107, Yes), the identification unit 154 of the motion recognition apparatus 100 determines the basic motion type based on the frames of the segment number n (step S110).
  • the first evaluation unit 155 of the motion recognition apparatus 100 determines a technique name and a difficulty level based on the technique authorization rule data 143 (step S111a).
  • the upgrade unit 156 of the motion recognition apparatus 100 upgrades the difficulty level based on the upgrade rule data 144 (step S112), and proceeds to step S113.
  • the motion recognition apparatus 100 calculates the E score for the basic exercise.
  • the motion recognition apparatus 100 registers the technique, the difficulty level, and the E score in the evaluation result data 145 in association with the frame number (step S113).
  • the movement recognition device 100 performs a performance end determination process (step S114).
  • step S114 the motion recognition apparatus 100 determines that the performance is complete when the group to which the technique belongs is a predetermined group.
  • the motion recognition device 100 determines that the performance is complete when the value of the fall count obtained by counting the landing flags is equal to or greater than a predetermined number.
  • If the performance is not finished (step S115, No), the motion recognition apparatus 100 proceeds to step S101 in FIG. 16. If the performance is finished (step S115, Yes), the motion recognition apparatus 100 sets the landing flag, the fall count, and the segment number to initial values (step S116).
  • the first evaluation unit 155 of the motion recognition apparatus 100 re-determines the difficulty level of the technique based on the evaluation result data 145 and a logical expression indicating a rule, and calculates a D score (step S117).
  • the display control unit 158 of the motion recognition apparatus 100 displays the evaluation result (step S118).
  • FIG. 18 is a flowchart showing a processing procedure for determining a technique name and a difficulty level.
  • the first evaluation unit 155 of the motion recognition apparatus 100 acquires the hand/foot position data and body vector data of the frame corresponding to the segment number n from the calculation result data 142 (step S201).
  • the first evaluation unit 155 identifies the starting point body angle region, the starting point left hand supporting position, and the starting point right hand supporting position based on the hand / foot position data corresponding to the starting point frame (step S202).
  • the first evaluation unit 155 identifies the end point body angle region, the end point left hand support position, and the end point right hand support position based on the hand / foot position data corresponding to the end point frame (step S203).
  • the first evaluation unit 155 identifies the basic exercise type and the previous exercise type corresponding to the segment number n (step S204).
  • the first evaluation unit 155 compares each specified information with the technique authorization rule data 143 to determine whether there is a hit column record (step S205).
  • each specified information includes a start point body angle area, a start point left hand support position, a start point right hand support position, an end point body angle area, an end point left hand support position, an end point right hand support position, a basic motion type, and a previous basic motion type.
  • When a hit column record exists, the first evaluation unit 155 specifies the technique name, group, and difficulty included in the hit column record as the technique name, group, and difficulty corresponding to the segment number n (step S207).
  • When no hit column record exists, the first evaluation unit 155 determines that the technique name corresponding to the segment number n is unknown, and sets the group and the difficulty level to 0 (step S208).
  • FIG. 19 is a flowchart illustrating a processing procedure for upgrading the difficulty level.
  • the upgrade unit 156 of the motion recognition apparatus 100 acquires the hand / foot position data and body vector data of the frame corresponding to the segment number n from the calculation result data 142 (step S301).
  • the upgrade unit 156 calculates the body rotation angle based on the body vector from the start frame to the end frame (step S302).
  • the upgrade unit 156 specifies the start point left hand support position and the start point right hand support position based on the hand/foot position data corresponding to the start point frame (step S303), and specifies the end point left hand support position and the end point right hand support position based on the hand/foot position data corresponding to the end point frame (step S304).
  • the upgrade unit 156 identifies the technique name and the previous basic exercise type corresponding to the segment number n (step S305).
  • the upgrade unit 156 compares each identified information with the upgrade rule data 144 to determine whether there is a hit column record (step S306). In the case of a hit (step S307, Yes), the upgrade unit 156 upgrades the difficulty level according to the number of difficulty upgrades (step S308). If the upgrade unit 156 does not hit (No in step S307), the upgrade unit 156 ends the process.
  • FIG. 20 is a diagram illustrating an example of processing of the motion recognition device.
  • the segment unit 153 sets the segment points 60a to 60g based on the calculation result data 142. For example, in the frames of the segment points 60a, 60b, and 60c, the left hand support position of the subject 5 is “2” and the right hand support position is “4”. In the frame of the segment point 60d, the left hand support position of the subject 5 is “4” and the right hand support position is “4”. In the frame of the segment point 60e, the left hand support position of the subject 5 is “4” and the right hand support position is “2”.
  • In the frames of the segment points 60f and 60g, the left hand support position of the subject 5 is “2” and the right hand support position is “2”.
  • In each segment point frame, the body vector is pointing downward.
  • the body angle of the frame of the segment points 60a, 60b, 60c is 180 °.
  • the body angle of the frame of the segment point 60d is 90°.
  • the body angle of the frame of the segment point 60e is 0°.
  • the body angle of the frame of the segment points 60f and 60g is 270 °.
  • the identification unit 154 determines that the basic motion type is “CCW turning” from the motion of the feature points of each frame included in the segment points 60a to 60b.
  • the identification unit 154 determines that the basic motion type is “CCW turning” from the motion of the feature points of each frame included in the segment points 60b to 60c.
  • the identification unit 154 determines that the basic motion type is “CCW downward reverse 1/4 turn” from the motion of the feature points of each frame included in the segment points 60c to 60d.
  • the identification unit 154 determines that the basic motion type is “CCW upward reverse 1/4 turn” from the motion of the feature points of each frame included in the segment points 60d to 60e.
  • the identification unit 154 determines that the basic motion type is “CCW downward reverse 1/4 turn” from the motion of the feature points of each frame included in the segment points 60e to 60f. The identification unit 154 determines that the basic motion type is “CCW turning” from the motion of the feature points of each frame included in the segment points 60f to 60g.
  • Based on the technique authorization rule data 143, the first evaluation unit 155 determines the technique name performed by the subject between the frames included in the segment points 60a to 60b as “lateral turn” and the difficulty as “A”.
  • Based on the technique authorization rule data 143, the first evaluation unit 155 determines the technique name performed by the subject between the frames included in the segment points 60b to 60c as “lateral turn” and the difficulty as “A”.
  • The first evaluation unit 155 determines the technique name performed by the subject between the frames included in the segment points 60c to 60e as “turning the handle up and down” and the difficulty as “B”.
  • Based on the technique authorization rule data 143, the technique name and difficulty performed by the subject between the frames included in the segment points 60e to 60f hit no record, so the first evaluation unit 155 determines the technique name as “unknown” and the difficulty as “−”. Based on the technique authorization rule data 143, the first evaluation unit 155 determines the technique name performed by the subject between the frames included in the segment points 60f to 60g as “one hand up vertically turning” and the difficulty as “B”.
  • the second evaluation unit 157 determines that the E score of the performance performed by the subject between the frames included in the segment points 60a to 60b is “8”.
  • the second evaluation unit 157 determines that the E score of the performance performed by the subject between the frames included in the segment points 60b to 60c is “8.5”.
  • the second evaluation unit 157 determines that the E score of the performance performed by the subject between the frames included in the segment points 60c to 60e is “7.8”.
  • the second evaluation unit 157 determines that the E score of the performance performed by the subject between the frames included in the segment points 60e to 60f is “1”.
  • the second evaluation unit 157 determines that the E score of the performance performed by the subject between the frames included in the segment points 60f to 60g is “8.3”.
  • the motion recognition apparatus 100 identifies frames serving as segment points based on the skeleton data 141 of the subject 5, and classifies the frames into a plurality of partial data.
  • the motion recognition apparatus 100 identifies the basic motion type of each partial data, and evaluates the technique name and difficulty level of the exercise performed by the subject based on the order of the basic motion types of the partial data continuous in time series. Since the technique name and the difficulty level are evaluated based on the order of the basic exercise types, the motion recognition apparatus 100 can efficiently evaluate the performance of the subject 5.
  • the motion recognition apparatus 100 upgrades the difficulty level of the motion based on the order of basic motion types and the body rotation angle of the partial data continuous in time series. For this reason, the exact difficulty corresponding to the complex evaluation criteria of a competition can be specified.
  • the motion recognition apparatus 100 calculates an E score corresponding to the partial data based on an evaluation criterion in which the feature points of each frame included in the partial data are associated with the score. For this reason, the E score can also be calculated appropriately in addition to the technique and the difficulty level.
  • the motion recognition apparatus 100 identifies, as a segment point, a frame in which the body vector is directed downward and both hands of the subject 5 are located in a predetermined area. For this reason, the plurality of frames can be segmented appropriately, and the determination accuracy of the technique and difficulty can be improved.
  • FIG. 21 is a diagram illustrating an example of a hardware configuration of a computer that realizes the same function as the motion recognition apparatus.
  • the computer 200 includes a CPU 201 that executes various arithmetic processes, an input device 202 that receives input of data from a user, and a display 203.
  • the computer 200 also includes a reading device 204 that reads programs and the like from a storage medium, and an interface device 205 that exchanges data with other computers via a network.
  • the computer 200 acquires skeleton data from the skeleton recognition device 20 via the interface device 205.
  • the computer 200 also includes a RAM 206 that temporarily stores various information and a hard disk device 207.
  • the devices 201 to 207 are connected to the bus 208.
  • the hard disk device 207 stores an acquisition program 207a, a calculation program 207b, a segmentation program 207c, an identification program 207d, a first evaluation program 207e, an upgrade program 207f, a second evaluation program 207g, and a display program 207h.
  • the CPU 201 reads the programs 207a to 207h and loads them into the RAM 206.
  • the acquisition program 207a functions as the acquisition process 206a.
  • the calculation program 207b functions as a calculation process 206b.
  • the segment program 207c functions as a segment process 206c.
  • the identification program 207d functions as an identification process 206d.
  • the first evaluation program 207e functions as the first evaluation process 206e.
  • the upgrade program 207f functions as an upgrade process 206f.
  • the second evaluation program 207g functions as the second evaluation process 206g.
  • the display program 207h functions as a display process 206h.
  • the processing of the acquisition process 206a corresponds to the processing of the acquisition unit 151.
  • the process of the calculation process 206b corresponds to the process of the calculation unit 152.
  • the process of the segment process 206c corresponds to the process of the segment unit 153.
  • the process of the identification process 206d corresponds to the process of the identification unit 154.
  • the process of the first evaluation process 206e corresponds to the process of the first evaluation unit 155.
  • the process of the upgrade process 206f corresponds to the process of the upgrade unit 156.
  • the process of the second evaluation process 206g corresponds to the process of the second evaluation unit 157.
  • the process of the display process 206h corresponds to the process of the display control unit 158.
  • the programs 207a to 207h are not necessarily stored in the hard disk device 207 from the beginning.
  • for example, each program may be stored in a “portable physical medium” such as a flexible disk (FD), CD-ROM, DVD, magneto-optical disk, or IC card inserted into the computer 200.
  • the computer 200 may then read and execute each of the programs 207a to 207h.
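The segmentation-and-lookup flow summarized in the bullets above (segment points where the body vector points downward and both hands lie in a predetermined area, followed by a lookup of the ordered basic motion types) can be sketched as follows. All names, the simplified `Frame` fields, and the table entries are illustrative assumptions, not identifiers or rules from the patent.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Frame:
    body_vector_y: float   # vertical component of the body vector (hypothetical field)
    hands_in_region: bool  # both hands inside the predetermined area
    motion_feature: str    # simplified stand-in for the frame's feature points

def is_segment_point(frame: Frame) -> bool:
    """A frame is a segment point when the body vector points downward
    and both hands lie in the predetermined area (e.g., a support position)."""
    return frame.body_vector_y < 0 and frame.hands_in_region

def segment(frames: List[Frame]) -> List[List[Frame]]:
    """Split the time-series of frames into groups (partial data) at segment points."""
    groups, current = [], []
    for f in frames:
        current.append(f)
        if is_segment_point(f) and len(current) > 1:
            groups.append(current)
            current = [f]  # the segment frame also opens the next group
    if len(current) > 1:
        groups.append(current)
    return groups

# An ordered tuple of basic motion types is looked up in a technique table
# to obtain the technique name and difficulty level (entries are invented).
TECHNIQUE_TABLE = {
    ("swing_down", "swing_up"): ("felge", "B"),
    ("swing_up", "turn"): ("stutzkehre", "C"),
}

def classify(group: List[Frame]) -> str:
    """Stand-in for the basic-motion classifier; a real one would use
    the movement of feature points across the group's frames."""
    return group[0].motion_feature

def evaluate(frames: List[Frame]):
    types = tuple(classify(g) for g in segment(frames))
    return TECHNIQUE_TABLE.get(types, ("unknown", None))
```

A sequence whose groups classify as `("swing_down", "swing_up")` would thus be evaluated as the technique `"felge"` with difficulty `"B"` under this toy table; the patent's difficulty-upgrade step would additionally inspect the body rotation angle before finalizing the level.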
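The E-score step described above (an evaluation criterion associating per-frame feature points with scores) can likewise be sketched as a deduction table applied to each frame of the partial data. The deviation bands and deduction values below are purely illustrative; actual gymnastics deductions are defined by the competition's code of points, not by this sketch.

```python
def deduction_for_angle(deviation_deg: float) -> float:
    """Map a joint-angle deviation derived from a frame's feature points
    to a deduction (illustrative bands, not real scoring rules)."""
    if deviation_deg <= 15:
        return 0.0   # no fault
    if deviation_deg <= 30:
        return 0.1   # small fault
    if deviation_deg <= 45:
        return 0.3   # medium fault
    return 0.5       # large fault

def e_score(frame_deviations) -> float:
    """E score for one piece of partial data: start from 10 points,
    accumulate per-frame deductions, and floor the result at 0."""
    total = sum(deduction_for_angle(d) for d in frame_deviations)
    return max(0.0, round(10.0 - total, 1))
```

Under this toy table, a group with per-frame deviations of 10, 20, and 50 degrees would score 10 − (0 + 0.1 + 0.5) = 9.4, while a long run of large faults bottoms out at 0.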

Abstract

A motion recognition device (100) comprising a segmentation unit (153), an identification unit (154), and a first evaluation unit (155). The segmentation unit (153) uses the position of a prescribed region of a subject's body to chronologically segment a plurality of frames that include position information for feature points corresponding to prescribed regions or joints of the subject's body, thereby sorting the plurality of frames chronologically into a plurality of groups. For each group, the identification unit (154) uses the movement of the feature points across successive frames to identify a basic motion type corresponding to the group. The first evaluation unit (155) uses the sequence of basic motion types corresponding to chronologically successive groups to evaluate the technique and difficulty of a motion performed by the subject.
PCT/JP2016/080150 2016-10-11 2016-10-11 Dispositif, programme et procédé de reconnaissance de mouvement WO2018069981A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
PCT/JP2016/080150 WO2018069981A1 (fr) 2016-10-11 2016-10-11 Dispositif, programme et procédé de reconnaissance de mouvement
CN201780062410.0A CN109863535B (zh) 2016-10-11 2017-10-11 运动识别装置、存储介质以及运动识别方法
JP2018545018A JP6733738B2 (ja) 2016-10-11 2017-10-11 運動認識装置、運動認識プログラムおよび運動認識方法
PCT/JP2017/036797 WO2018070414A1 (fr) 2016-10-11 2017-10-11 Dispositif, programme et procédé de reconnaissance de mouvement
EP17859616.9A EP3528207B1 (fr) 2016-10-11 2017-10-11 Dispositif, programme et procédé de reconnaissance de mouvement
US16/362,701 US11176359B2 (en) 2016-10-11 2019-03-25 Motion recognition device and motion recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/080150 WO2018069981A1 (fr) 2016-10-11 2016-10-11 Dispositif, programme et procédé de reconnaissance de mouvement

Publications (1)

Publication Number Publication Date
WO2018069981A1 true WO2018069981A1 (fr) 2018-04-19

Family

ID=61905221

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/JP2016/080150 WO2018069981A1 (fr) 2016-10-11 2016-10-11 Dispositif, programme et procédé de reconnaissance de mouvement
PCT/JP2017/036797 WO2018070414A1 (fr) 2016-10-11 2017-10-11 Dispositif, programme et procédé de reconnaissance de mouvement

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/036797 WO2018070414A1 (fr) 2016-10-11 2017-10-11 Dispositif, programme et procédé de reconnaissance de mouvement

Country Status (5)

Country Link
US (1) US11176359B2 (fr)
EP (1) EP3528207B1 (fr)
JP (1) JP6733738B2 (fr)
CN (1) CN109863535B (fr)
WO (2) WO2018069981A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111382624A (zh) * 2018-12-28 2020-07-07 杭州海康威视数字技术股份有限公司 动作识别方法、装置、设备及可读存储介质
JPWO2021064830A1 (fr) * 2019-09-30 2021-04-08
WO2021149250A1 (fr) * 2020-01-24 2021-07-29 富士通株式会社 Procédé de reconnaissance d'exercice, programme de reconnaissance d'exercice, et dispositif de traitement d'informations
WO2022030439A1 (fr) * 2020-08-07 2022-02-10 ハイパーダイン株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme

Families Citing this family (21)

Publication number Priority date Publication date Assignee Title
CN109789329B (zh) * 2016-10-11 2020-12-01 富士通株式会社 评分辅助程序、评分辅助装置以及评分辅助方法
EP3744408A4 (fr) * 2018-01-24 2021-01-20 Fujitsu Limited Programme de génération d'écran, procédé de génération d'écran et dispositif de traitement d'informations
JP7146247B2 (ja) * 2018-09-03 2022-10-04 国立大学法人 東京大学 動作認識方法及び装置
US10769422B2 (en) * 2018-09-19 2020-09-08 Indus.Ai Inc Neural network-based recognition of trade workers present on industrial sites
US10853934B2 (en) 2018-09-19 2020-12-01 Indus.Ai Inc Patch-based scene segmentation using neural networks
US11093886B2 (en) * 2018-11-27 2021-08-17 Fujifilm Business Innovation Corp. Methods for real-time skill assessment of multi-step tasks performed by hand movements using a video camera
TWI715903B (zh) * 2018-12-24 2021-01-11 財團法人工業技術研究院 動作追蹤系統及方法
CN111460870A (zh) * 2019-01-18 2020-07-28 北京市商汤科技开发有限公司 目标的朝向确定方法及装置、电子设备及存储介质
CN109840917B (zh) * 2019-01-29 2021-01-26 北京市商汤科技开发有限公司 图像处理方法及装置、网络训练方法及装置
CN110516112B (zh) * 2019-08-28 2021-04-27 北京理工大学 一种基于层次模型的人体动作检索方法及设备
WO2021064960A1 (fr) 2019-10-03 2021-04-08 富士通株式会社 Procédé de reconnaissance de mouvement, programme de reconnaissance de mouvement, et dispositif de traitement d'informations
CN114514559A (zh) 2019-10-03 2022-05-17 富士通株式会社 运动识别方法、运动识别程序及信息处理装置
CN112668359A (zh) * 2019-10-15 2021-04-16 富士通株式会社 动作识别方法、动作识别装置和电子设备
EP4066087A1 (fr) 2019-11-03 2022-10-05 ST37 Sport et Technologie Procede et systeme de caracterisation d'un mouvement d'une entite en mouvement
WO2021199392A1 (fr) * 2020-04-02 2021-10-07 日本電信電話株式会社 Dispositif d'apprentissage, procédé d'apprentissage, programme d'apprentissage, dispositif d'estimation de score, procédé d'estimation de score et programme d'estimation de score
JP7459679B2 (ja) * 2020-06-23 2024-04-02 富士通株式会社 行動認識方法、行動認識プログラム及び行動認識装置
CN112288766A (zh) * 2020-10-28 2021-01-29 中国科学院深圳先进技术研究院 运动评估方法、装置、系统及存储介质
GB2602248A (en) * 2020-11-11 2022-06-29 3D4Medical Ltd Motion assessment instrument
EP4258184A4 (fr) * 2021-01-27 2023-12-27 Fujitsu Limited Dispositif de détermination de séquence d'actions, procédé de détermination de séquence d'actions et programme de détermination de séquence d'actions
JPWO2022208859A1 (fr) * 2021-04-01 2022-10-06
WO2023100246A1 (fr) * 2021-11-30 2023-06-08 富士通株式会社 Programme de reconnaissance de technique, procédé de reconnaissance de technique et dispositif de traitement d'informations

Citations (1)

Publication number Priority date Publication date Assignee Title
WO2016056449A1 (fr) * 2014-10-10 2016-04-14 富士通株式会社 Programme, procédé et dispositif de détermination d'habileté et serveur

Family Cites Families (16)

Publication number Priority date Publication date Assignee Title
JP4420512B2 (ja) * 1999-06-01 2010-02-24 富士通マイクロエレクトロニクス株式会社 移動物体間動作分類方法及び装置並びに画像認識装置
JP2005209137A (ja) * 2003-12-26 2005-08-04 Mitsubishi Heavy Ind Ltd 対象物認識方法及び対象物認識装置、並びに顔方向識別装置
JP5186656B2 (ja) * 2008-09-03 2013-04-17 独立行政法人産業技術総合研究所 動作評価装置および動作評価方法
US8755569B2 (en) * 2009-05-29 2014-06-17 University Of Central Florida Research Foundation, Inc. Methods for recognizing pose and action of articulated objects with collection of planes in motion
US9981193B2 (en) * 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
JP2011170856A (ja) * 2010-02-22 2011-09-01 Ailive Inc 複数の検出ストリームを用いたモーション認識用システム及び方法
JP2011215967A (ja) * 2010-03-31 2011-10-27 Namco Bandai Games Inc プログラム、情報記憶媒体及び物体認識システム
US20130218053A1 (en) * 2010-07-09 2013-08-22 The Regents Of The University Of California System comprised of sensors, communications, processing and inference on servers and other devices
US8761437B2 (en) * 2011-02-18 2014-06-24 Microsoft Corporation Motion recognition
JP6011154B2 (ja) * 2012-08-24 2016-10-19 富士通株式会社 画像処理装置、画像処理方法
CN103839040B (zh) * 2012-11-27 2017-08-25 株式会社理光 基于深度图像的手势识别方法和装置
JP6044306B2 (ja) * 2012-12-03 2016-12-14 株式会社リコー 情報処理装置、情報処理システム及びプログラム
CN103310193B (zh) * 2013-06-06 2016-05-25 温州聚创电气科技有限公司 一种记录体操视频中运动员重要技术动作时刻的方法
CN104461000B (zh) * 2014-12-03 2017-07-21 北京航空航天大学 一种基于少量缺失信号的在线连续人体运动识别方法
US10873777B2 (en) * 2014-12-18 2020-12-22 Sony Corporation Information processing device and information processing method to calculate score for evaluation of action
CA3080600C (fr) * 2015-01-06 2022-11-29 David Burton Systemes de surveillance pouvant etre mobiles et portes

Patent Citations (1)

Publication number Priority date Publication date Assignee Title
WO2016056449A1 (fr) * 2014-10-10 2016-04-14 富士通株式会社 Programme, procédé et dispositif de détermination d'habileté et serveur

Non-Patent Citations (1)

Title
JEONGEUN SHIN ET AL.: "A Study on Motion Analysis of an Artistic Gymnastics by using Dynamic Image Processing", IEICE TECHNICAL REPORT, vol. 108, no. 47, 15 May 2008 (2008-05-15), pages 13 - 18, XP031447210, ISSN: 0913-5685 *

Cited By (10)

Publication number Priority date Publication date Assignee Title
CN111382624A (zh) * 2018-12-28 2020-07-07 杭州海康威视数字技术股份有限公司 动作识别方法、装置、设备及可读存储介质
CN111382624B (zh) * 2018-12-28 2023-08-11 杭州海康威视数字技术股份有限公司 动作识别方法、装置、设备及可读存储介质
JPWO2021064830A1 (fr) * 2019-09-30 2021-04-08
WO2021064830A1 (fr) * 2019-09-30 2021-04-08 富士通株式会社 Procédé d'évaluation, programme d'évaluation, et dispositif de traitement d'informations
JP7248137B2 (ja) 2019-09-30 2023-03-29 富士通株式会社 評価方法、評価プログラムおよび情報処理装置
WO2021149250A1 (fr) * 2020-01-24 2021-07-29 富士通株式会社 Procédé de reconnaissance d'exercice, programme de reconnaissance d'exercice, et dispositif de traitement d'informations
JPWO2021149250A1 (fr) * 2020-01-24 2021-07-29
JP7272470B2 (ja) 2020-01-24 2023-05-12 富士通株式会社 運動認識方法、運動認識プログラムおよび情報処理装置
WO2022030439A1 (fr) * 2020-08-07 2022-02-10 ハイパーダイン株式会社 Dispositif de traitement d'informations, procédé de traitement d'informations et programme
JP2022030683A (ja) * 2020-08-07 2022-02-18 ハイパーダイン株式会社 情報処理装置、情報処理方法及びプログラム

Also Published As

Publication number Publication date
JPWO2018070414A1 (ja) 2019-06-24
WO2018070414A1 (fr) 2018-04-19
CN109863535B (zh) 2022-11-25
EP3528207B1 (fr) 2024-04-10
EP3528207A1 (fr) 2019-08-21
US20190220657A1 (en) 2019-07-18
JP6733738B2 (ja) 2020-08-05
US11176359B2 (en) 2021-11-16
EP3528207A4 (fr) 2019-10-30
CN109863535A (zh) 2019-06-07

Similar Documents

Publication Publication Date Title
WO2018069981A1 (fr) Dispositif, programme et procédé de reconnaissance de mouvement
JP6082101B2 (ja) 身体動作採点装置、ダンス採点装置、カラオケ装置及びゲーム装置
WO2021051579A1 (fr) Procédé, système et appareil de reconnaissance de pose corporelle, et support de stockage
CN102693413B (zh) 运动识别
CN105229666B (zh) 3d图像中的运动分析
Hoffman et al. Breaking the status quo: Improving 3D gesture recognition with spatially convenient input devices
US20180021648A1 (en) Swing analysis method using a swing plane reference frame
CN103597515B (zh) 用于识别张开的或闭合的手的系统
Rein et al. Cluster analysis of movement patterns in multiarticular actions: a tutorial
WO2011009302A1 (fr) Procédé d'identification d'actions de corps humain en fonction de multiples points de trace
JP6943294B2 (ja) 技認識プログラム、技認識方法および技認識システム
JP6677319B2 (ja) スポーツ動作解析支援システム、方法およびプログラム
JP7235133B2 (ja) 運動認識方法、運動認識プログラムおよび情報処理装置
KR100907704B1 (ko) 인공지능형 캐디를 이용한 골퍼자세교정시스템 및 이를이용한 골퍼자세교정방법
CN115331314A (zh) 一种基于app筛查功能的运动效果评估方法和系统
CN114093032A (zh) 一种基于动作状态信息的人体动作评估方法
Ting et al. Kinect-based badminton movement recognition and analysis system
JP7409390B2 (ja) 運動認識方法、運動認識プログラムおよび情報処理装置
Anbarsanti et al. Dance modelling, learning and recognition system of aceh traditional dance based on hidden Markov model
US20220276721A1 (en) Methods and systems for performing object detection and object/user interaction to assess user performance
CN111353347A (zh) 动作识别纠错方法、电子设备、存储介质
CN111353345A (zh) 提供训练反馈的方法、装置、系统、电子设备、存储介质
WO2021149250A1 (fr) Procédé de reconnaissance d'exercice, programme de reconnaissance d'exercice, et dispositif de traitement d'informations
Miyashita et al. Analyzing Fine Motion Considering Individual Habit for Appearance-Based Proficiency Evaluation
JP7375829B2 (ja) 生成方法、生成プログラム及び情報処理システム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16918678

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16918678

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP