WO2014042121A1 - Motion evaluation device and program therefor - Google Patents
Motion evaluation device and program therefor
- Publication number
- WO2014042121A1 (PCT/JP2013/074227)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user
- motion
- teacher
- coordinates
- unit
- Prior art date
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B23/00—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
- G09B23/28—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine
- G09B23/288—Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes for medicine for artificial respiration or heart massage
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
- G06F17/10—Complex mathematical operations
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T17/00—Three dimensional [3D] modelling, e.g. data description of 3D objects
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B19/00—Teaching not covered by other main groups of this subclass
- G09B19/003—Repetitive work cycles; Sequence of movements
- G09B19/0038—Sports
Definitions
- The present invention relates to a motion evaluation device, and a program therefor, that teaches an exemplary motion to a user and evaluates the user's motion, supporting motion acquisition in fields where a motion must be learned by the body, such as emergency procedures (e.g., the chest compression motion of so-called cardiac massage), nursing care, and sports.
- Patent Document 1 discloses a technique in which a robot teaches a motion, the user imitates it, the difference between the user's captured motion and the robot's taught motion is determined from the captured images, and advice for correcting that difference is output.
- Patent Document 2 discloses a technology that provides an appropriate teacher image by detecting the user's line-of-sight direction and presenting the teacher image from the user's viewing angle.
- Patent Document 1: JP 2006-320424 A. Patent Document 2: Japanese Patent Application Laid-Open No. 11-249772.
- The techniques of Patent Document 1 and Patent Document 2 have the problem that fine differences in position and tempo between the taught motion and the user's motion cannot be properly conveyed.
- Moreover, because these techniques do not provide teacher images matched to the user's physique, it is difficult for the user to grasp the differences in movement, and the motion cannot be fully learned.
- The present invention was made in view of the above problems of the prior art. Its object is to provide a motion evaluation device that generates a teacher motion image from a model matched to the user's physique, displays it superimposed on the user's motion, evaluates the motion, and thereby teaches an exemplary motion to the user effectively.
- To this end, the present invention is a motion evaluation device having: a part coordinate calculation unit that calculates part coordinates of the user's body from the user's image data; a user model generation unit that generates a shape model of the user from the part coordinates and generates moving image data of the teacher motion based on the user's shape model and on teacher motion parameters; a motion evaluation unit that evaluates the user's motion from the part coordinates; and an output control unit that displays the teacher motion based on the user's shape model superimposed on the user's motion and outputs the evaluation result.
- The part coordinate calculation unit of the present invention calculates the part coordinates of each part by computing the center-of-gravity coordinates of the user's head, both shoulders, both elbows, hands, and both wrists. The part coordinate calculation unit may compute these based on at least the parallax information of pixels within a predetermined range for each part, and may calculate the part coordinates of both shoulders first and derive the part coordinates of the head from them.
- The user model generation unit may include a physique data calculation unit that generates the user's physique data from the part coordinates calculated by the part coordinate calculation unit, and a teacher motion addition unit that generates teacher moving image data based on the user's shape model by adding the teacher motion parameters to the user's physique data at each point in time.
- The teacher motion addition unit corrects the part coordinates of both elbows and both shoulders in the physique data acquired from the physique data calculation unit, based on the part coordinates of both wrists and the initial angle of the teacher motion.
- The motion evaluation unit extracts and evaluates one cycle of the user's motion, and the output control unit outputs the corresponding evaluation result.
- The motion referred to in the present invention may be a chest compression motion, in which case the user's chest compression motion is evaluated.
- The present invention is also a program that causes a computer to function as a motion evaluation device having: a part coordinate calculation unit that calculates part coordinates of the user's body from the user's image data; a user model generation unit that generates a shape model of the user from the part coordinates and generates moving image data of the teacher motion based on the user's shape model and on teacher motion parameters; a motion evaluation unit that evaluates the user's motion from the part coordinates; and an output control unit that displays the teacher motion based on the user's shape model superimposed on the user's motion and outputs the evaluation result.
- According to the present invention, a shape model suited to the user's physique can be generated and the teacher motion can be displayed on that model. The user can therefore easily understand the correct motion and learn it more quickly.
- Because the teacher image is provided as a three-dimensional shape model, differences in posture in the depth direction, which were previously difficult to grasp, become clear, and the motion can be learned more accurately.
- Because the teacher image and the image of the user's motion are superimposed and the evaluation result is output, the user can clearly recognize even small differences from the teacher motion. Accurate motion can therefore be acquired at an early stage.
- FIG. 3 is a functional block diagram illustrating an example of the configuration of the user model generation unit 104.
- FIG. … is a flowchart of the generation process.
- FIG. 1 is a block diagram showing an example of the overall configuration of the motion evaluation apparatus 100 according to the present invention.
- The motion evaluation device 100 is connected to an imaging device 10, a sound output device 20, and a display device 30.
- The imaging device 10 is, for example, a so-called stereo camera having two cameras; a camera that can acquire a three-dimensional distance image may also be used.
- The imaging device 10 captures images of the user performing the motion to be learned, such as a chest compression motion.
- The imaging device 10 acquires image data in time series by shooting and sends the image data to the motion evaluation device 100.
- The sound output device 20 is, for example, a speaker, and outputs audio for instructions to the user and for the evaluation results and advice produced by the motion evaluation device 100.
- The display device 30 is, for example, a display, and shows the moving image data captured by the imaging device 10 as well as the evaluation result of the user's motion.
- The motion evaluation apparatus 100 includes an image capture unit 101, a part coordinate calculation unit 102, a teacher motion storage unit 103, a user model generation unit 104, a motion evaluation unit 105, and an output control unit 106.
- The image capture unit 101 captures a moving image of the user's motion, either input in real time from the imaging device 10 or read from moving image data stored in a database (not shown).
- The captured moving image data is moving image data that includes parallax data or distance data.
- The part coordinate calculation unit 102 calculates the part coordinates of each part of the user's body in each frame of the moving image data captured by the image capture unit 101.
- Each part of the body refers to a part of the human body such as the head or a shoulder.
- The part coordinate calculation unit 102 detects the parts of the user's body based on the parallax and color information in the captured image, and calculates the center of gravity of each part, thereby obtaining the part coordinates of each part of the user's body in the image. A specific calculation method is described later.
- The calculated part coordinates are output to the user model generation unit 104 or the motion evaluation unit 105.
- The teacher motion storage unit 103 is, for example, a memory holding a database, and stores moving image data of an exemplary teacher motion.
- For example, it stores moving image data of a chest compression motion performed by an emergency lifesaving technician.
- The moving image data stored in the teacher motion storage unit 103 includes parallax data or distance data.
- The teacher motion storage unit 103 also stores teacher motion parameters, which are time-series change data of the part coordinates of each part.
- The user model generation unit 104 generates a three-dimensional shape model of the user and generates moving image data of the teacher motion based on that shape model. That is, from the part coordinates of each part of the user calculated from the frame image data by the part coordinate calculation unit 102, it calculates the user's physique data and generates the user's shape model.
- When teacher moving image data is stored in the teacher motion storage unit 103, the teacher motion parameters are generated from that moving image data; when the teacher motion parameters are already stored in the teacher motion storage unit 103, they are simply read out.
- The teacher motion parameters are then added to the user's shape model to generate a teacher motion based on the user's shape model.
- The motion evaluation unit 105 evaluates the user's motion against the teacher motion based on the user's shape model.
- Specifically, the user's motion is analyzed from the time-series change of the part coordinates calculated by the part coordinate calculation unit 102 during the motion captured by the image capture unit 101.
- For example, an evaluation table holding threshold values for predetermined parts, set based on the teacher motion, is provided, and the user's motion is evaluated by comparison with those thresholds.
- Alternatively, the evaluation may be performed by comparison with the moving image data of the teacher motion based on the shape model matched to the user's physique generated by the user model generation unit 104.
- In addition to evaluating the user's motion, the motion evaluation unit 105 may select and output advice, such as points of improvement for performing a more ideal motion.
- The output control unit 106 controls output to the display device 30 and the audio output device 20.
- The output control unit 106 controls the output so that the moving image data captured by the image capture unit 101 from the imaging device 10 and the moving image data of the teacher motion based on the shape model matched to the user's physique, generated by the user model generation unit 104, are displayed superimposed. It also controls output so that the evaluation results and advice from the motion evaluation unit 105 are displayed or output as audio.
- The processing of the motion evaluation apparatus is roughly divided into the following three processes:
- (1) a process of calculating the part coordinates of each part of the user's body, (2) a process of generating the teacher motion based on the user's shape model, and (3) a process of superimposing and displaying the user's motion and the teacher motion and evaluating the motion.
- The part coordinate calculation process (1) is performed for each frame of image data.
- To generate the user's shape model, the motion evaluation apparatus first instructs the user to stand still and, based on the part coordinate calculation results of (1), performs the teacher motion generation process (2) based on the user's shape model (user shape model generation mode).
- It then enters the motion evaluation mode and proceeds to the evaluation process (3), again based on the part coordinate results of (1). Thus, whichever of processes (2) and (3) is performed, process (1) is applied to the moving image data beforehand.
- These processes are described in detail below.
- FIG. 2 is a flowchart showing an example of the flow of the part coordinate calculation process for each part of the user's body executed by the motion evaluation apparatus 100. The description here follows the case of teaching a chest compression motion.
- First, the image capture unit 101 captures the distance image data input from the imaging device, that is, image data and parallax data (step S201). Two images may instead be acquired from the stereo camera and the parallax computed by the image capture unit 101. Parallax data is used here, but when the imaging apparatus is a camera with a range finder rather than a stereo camera, image data and distance data may be captured instead.
- The "image data" here is the image data (one frame image) at each point in time of the moving image data, which is a time series of images.
- The part coordinate calculation process is performed on each piece of captured image data.
- The image capture unit 101 outputs the input image data and parallax data to the part coordinate calculation unit 102.
- Next, the part coordinate calculation unit 102 acquires the parallax and color information of each pixel, in screen coordinates, within the range corresponding to each part of the body from the image data captured by the image capture unit 101 (step S202).
- The part coordinate calculation unit 102 performs the part coordinate calculation process for each part of the user's body.
- The parts are the head, both shoulders (right shoulder, left shoulder), both elbows (right elbow, left elbow), the hands, and both wrists (right wrist, left wrist).
- To make the part coordinate calculation easier, the user may wear markers of specific colors at predetermined positions such as the shoulders, elbows, and wrists.
- The part coordinate calculation unit 102 stores, for each part, a coordinate range (x, y in screen coordinates) in which that part of the body can exist in the image data as the part coordinate calculation range, and acquires the parallax and color information (for example, hue, saturation, brightness) of the screen coordinates within that range.
- Next, the part coordinate calculation unit 102 inputs parameters for each part of the body (step S203).
- A parameter is a threshold range for the parallax and color information, and different parameters are set for each part of the body. For example, values such as parallax: 155 to 255, hue: 79 to 89, saturation: 150 to 200, brightness: 0 to 19 may be set as the parameters for the right elbow.
- The color information is expressed, for example, in the range 0 to 255.
- When markers are used, the parameters are set in consideration of the marker colors.
- When no markers are used, the parameters are set in consideration of skin color. In the chest compression motion the hands are clasped together, so the left and right hands are detected as a single body.
- The part coordinate calculation unit 102 extracts the pixels whose coordinates fall within the range for each part and checks, pixel by pixel, whether the parallax and color information have values within the ranges set by the parameters.
- The part coordinate calculation unit 102 extracts the pixels whose values fall within the set ranges for all of the parallax and color parameters, and acquires the coordinate values of those pixels.
- Next, the part coordinate calculation unit 102 calculates the center-of-gravity coordinates for each part (step S204).
- That is, the average of the coordinates of the pixels extracted in step S203, whose parallax and color information are within the predetermined ranges, is calculated as the center of gravity and used as the part coordinate data.
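The extraction and centroid computation of steps S202 to S204 can be sketched as follows. This is a minimal illustration, not the patent's implementation; it assumes the parallax and HSV channels are available as 8-bit NumPy arrays and reuses the right-elbow parameter values quoted above:

```python
import numpy as np

# Example parameter set for the right elbow (values from the text);
# each entry is an inclusive (min, max) threshold range.
RIGHT_ELBOW_PARAMS = {
    "parallax": (155, 255),
    "hue": (79, 89),
    "saturation": (150, 200),
    "brightness": (0, 19),
}

def part_centroid(channels, params, coord_range):
    """Return the (x, y) centroid of pixels inside `coord_range` whose
    parallax/color values all fall within `params`, or None if none match.

    channels: dict of 2-D uint8 arrays keyed like `params`.
    coord_range: (x1, y1, x2, y2) part coordinate calculation range.
    """
    x1, y1, x2, y2 = coord_range
    mask = np.ones((y2 - y1, x2 - x1), dtype=bool)
    for name, (lo, hi) in params.items():
        roi = channels[name][y1:y2, x1:x2]
        mask &= (roi >= lo) & (roi <= hi)   # keep pixels within every threshold
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                         # no pixel satisfied all parameters
    # Centroid = mean of the matching pixel coordinates (step S204),
    # converted back to full-image screen coordinates.
    return (xs.mean() + x1, ys.mean() + y1)
```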
- The order of the center-of-gravity calculations basically does not matter, but for the hands and the head, more accurate results are obtained by using the previously calculated center-of-gravity coordinates of both wrists and both shoulders.
- For the hands, the coordinate range stored in advance as the extraction range is corrected using the calculated center-of-gravity coordinates of the wrists.
- Specifically, the maximum y coordinate of the stored hand coordinate range is corrected to the y coordinate of the wrists' centers of gravity, and extraction is then performed.
- For the head, the extraction range is corrected using the y coordinate of the shoulders' centers of gravity.
- The center of gravity is then calculated after further restricting the region of coordinate values used for the calculation.
- The method of calculating the center-of-gravity coordinates of the head is described later.
- Next, the part coordinate calculation unit 102 obtains the center-of-gravity coordinates of each part in camera coordinates (step S205). The coordinate values calculated in step S204 are screen coordinates, so they must be converted into camera coordinates.
- Taking the camera as the origin, the plane parallel to the camera plane as the XY plane, and the optical axis extending from the camera plane as the Z axis, the position of each center of gravity in three-dimensional space (its camera coordinates) is obtained from the screen coordinates and the parallax.
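The conversion formula itself is not reproduced in the source; a standard stereo triangulation, which is an assumption here rather than the patent's exact formula, with focal length $f$ (in pixels), stereo baseline $B$, parallax $d$, and screen coordinates $(u, v)$ measured from the image center, would give:

$$Z = \frac{f\,B}{d}, \qquad X = \frac{u\,Z}{f}, \qquad Y = \frac{v\,Z}{f}$$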
- The part coordinate calculation unit 102 calculates the camera coordinates of each part of the body in this way, and outputs the resulting part coordinates to the user model generation unit 104 or the motion evaluation unit 105.
- FIG. 3 is a diagram illustrating an example of a coordinate range designated as a part coordinate calculation range.
- The part coordinate calculation unit 102 stores, for example, two coordinate values (X1, Y1) and (X2, Y2) as the range in which the right elbow can exist in the image data, i.e., the minimum and maximum values of the x and y coordinates.
- When the part coordinate calculation unit 102 acquires the image data, it extracts the rectangle formed by (X1, Y1), (X1, Y2), (X2, Y1), and (X2, Y2) as the part coordinate calculation range.
- Range coordinates corresponding to each of the other parts are likewise stored in the part coordinate calculation unit 102.
- Based on the stored range coordinates, the part coordinate calculation unit 102 acquires the parallax and color information of the pixels located within the corresponding range from the image data.
- FIG. 4 is a diagram for explaining the method of calculating the center of gravity of the right elbow.
- Each square cell represents a pixel whose parallax and color information are within the parameter ranges.
- The part coordinate calculation unit 102 extracts these pixels, calculates the average of their coordinates as the center of gravity, and uses the center-of-gravity coordinates as the part coordinates.
- The round dot in FIG. 4 is the calculated position of the center of gravity of the right elbow.
- FIG. 5 is a diagram for explaining the method of calculating the center-of-gravity coordinates of the head.
- FIG. 5a illustrates the region extracted as the head. Assuming image data and parallax data have been captured in the same way as for the other parts, the method of calculating the part coordinates of the head is as follows.
- When calculating the center-of-gravity coordinates of the head, the part coordinate calculation unit 102 reads the stored range coordinates of the head.
- The part coordinate calculation unit 102 acquires the parallax and color information of the screen coordinates based on the corrected range coordinates; this corresponds to step S202. It then inputs the head parameters, corresponding to step S203.
- A parallax threshold is stored as the head parameter, and the pixels whose parallax values fall within it are extracted.
- The part coordinate calculation unit 102 then obtains, from the coordinate values of the extracted pixels, the maximum value (LeftX) and minimum value (RightX) of the x coordinate and the maximum value (TopY) of the y coordinate.
- Further, the y coordinate (BottomY) of the midpoint between the maximum y coordinate (TopY) and the average y coordinate of the centers of gravity of both shoulders (ShoulderHeight) is calculated.
- The region surrounded by LeftX, RightX, TopY, and BottomY is then used as the corrected region for calculating the center of gravity.
- The pixels used for calculating the center of gravity are further narrowed down by extracting only the pixels of the contour portion.
- Among the pixels in the region surrounded by LeftX, RightX, TopY, and BottomY that have the predetermined parallax values, only the pixels at the contour (edge) of the top and the left and right of the region, and a predetermined number of pixels inward from them, are extracted.
- The region formed by the extracted pixels is generally crescent-shaped.
- FIG. 5b is a schematic diagram showing the pixels used for calculating the center of gravity of the head and the calculated center of gravity.
- The center-of-gravity coordinates of the head are calculated from the coordinate values of the pixels in this contour region (corresponding to step S204).
- The squares in the figure represent the pixels used for the calculation, and the central black dot is the calculated center of gravity of the head.
- If the center of gravity were calculated simply from the pixels with the predetermined parallax values within the predetermined coordinate range, an error would occur, because the central portion of the head is usually hair: its color is uniform, which makes stereo matching difficult.
- Therefore, the pixels used for the calculation are corrected using the center-of-gravity coordinates of the shoulders, and the center of gravity of the parallax pixels in the contour region of the head is calculated as the head's center-of-gravity coordinates, allowing the head region to be extracted. Note that, because of missing parallax when acquiring the pixels' parallax and color information, there may be no pixel information at the center of gravity itself; in that case, the camera coordinates of the 3×3 pixels around the center of gravity are calculated together, and the average of the pixels that have values is taken as the camera coordinates of the center of gravity.
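A sketch of this head extraction follows, under the assumptions that the disparity image is a NumPy array with missing parallax encoded as 0, that standard image coordinates are used (y increasing downward, the opposite of the text's axis convention), that `shoulder_y` is the average y of the two shoulder centroids in full-image coordinates, and that the contour depth is 3 pixels (the text says only "a predetermined number"; all names are illustrative):

```python
import numpy as np

def head_centroid(disparity, head_range, shoulder_y,
                  disp_min=155, disp_max=255, contour_depth=3):
    """Centroid of the head contour region (FIG. 5); returns (x, y) or None."""
    x1, y1, x2, y2 = head_range
    roi = disparity[y1:y2, x1:x2]
    mask = (roi >= disp_min) & (roi <= disp_max)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    left_x, right_x = xs.min(), xs.max()      # horizontal extent of the head
    top_y = ys.min()                          # top of the head (image coords)
    # BottomY: midpoint between the top of the head and the shoulder height.
    bottom_y = int((top_y + (shoulder_y - y1)) / 2)
    region = mask[top_y:bottom_y, left_x:right_x + 1]
    # Keep only the contour: the first `contour_depth` matching pixels from
    # the top of each column, and from the left and right end of each row.
    # The union of these pixels is roughly crescent-shaped.
    contour = np.zeros_like(region)
    for c in range(region.shape[1]):
        rows = np.nonzero(region[:, c])[0]
        contour[rows[:contour_depth], c] = True
    for r in range(region.shape[0]):
        cols = np.nonzero(region[r])[0]
        if cols.size:
            contour[r, cols[:contour_depth]] = True
            contour[r, cols[-contour_depth:]] = True
    cy, cx = np.nonzero(contour)
    if cx.size == 0:
        return None
    # Centroid in full-image screen coordinates.
    return (cx.mean() + left_x + x1, cy.mean() + top_y + y1)
```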
- FIG. 6 is a functional block diagram illustrating an example of the configuration of the user model generation unit 104.
- The user model generation unit 104 includes a teacher motion parameter generation unit 601, a physique data calculation unit 602, and a teacher motion addition unit 603.
- The teacher motion parameter generation unit 601 reads the teacher moving image data stored in the teacher motion storage unit 103 and generates the teacher motion parameters.
- A teacher motion parameter is the change in a part coordinate between adjacent image frames, calculated for each of the x, y, and z coordinate values of each part of the body.
- Specifically, the part coordinates of each part of the body are calculated in each piece of image data of the teacher moving image, and the teacher motion parameters are generated by taking the time-series differences of each part coordinate. Note that when the teacher motion parameters have already been generated and stored in the teacher motion storage unit 103, the teacher motion parameter generation unit 601 is unnecessary.
- So that the physique data can be calculated, the physique data calculation unit 602 instructs the user to stand still for a predetermined time (for example, about 1 to 2 seconds; about 50 frame images at 1/30 second per frame) at a position in front of the camera where the user's upper body fits in the image. As an example, the user is instructed to place the hands on the chest of a cardiopulmonary resuscitation manikin placed on the floor and stand still in a posture in which the straight line connecting the midpoint of the line joining both shoulders and the palms is perpendicular to the floor plane. The sound output device 20 may make a voice announcement telling the user to remain still. Moving image data is acquired while the user is still, and the physique data is calculated from the moving image data for the predetermined time.
- Specifically, based on the part coordinates calculated by the part coordinate calculation unit 102 in the part coordinate calculation process described in (1), the physique data calculation unit 602 calculates the physique data by averaging, for each part, the part coordinates over the predetermined time.
- The teacher motion addition unit 603 adds the teacher motion parameters, which are the time-series changes in the coordinate values of the part coordinates generated by the teacher motion parameter generation unit 601, to the user's physique data generated by the physique data calculation unit 602.
- FIG. 7a is a flowchart of the teacher motion parameter generation process based on the teacher moving image.
- First, the teacher motion parameter generation unit 601 takes in the teacher moving image data stored in the teacher motion storage unit 103 (step S701).
- Next, moving image data for one cycle of the chest compression motion is extracted (step S702).
- For example, moving image data of sternum compression by an emergency lifesaving technician is used as the teacher moving image data, and one cycle is taken as compressing the sternum and letting it return.
- The extraction is performed by tracking, in the teacher moving image data, the change of the hand part coordinate in the direction of pressing the manikin's chest (the y coordinate). Note that this step can be omitted when the stored moving image data is already exactly one cycle long.
- Next, the teacher motion parameter generation unit 601 calculates the part coordinates of each part of the body in each piece of image data of the one-cycle moving image data (step S703).
- The part coordinates are calculated by the same method as described above, for each of the head, both shoulders, both elbows, the hands, and both wrists.
- Next, the teacher motion parameter generation unit 601 smooths the movements of the extracted part coordinates (step S704).
- The calculated part coordinate data is grouped by part coordinate, and the time series of the movement is normalized. For example, suppose there are seven frames of data for the y coordinate of the head; these are normalized in time series.
- Then, the teacher motion parameter generation unit 601 generates the teacher motion parameters, which are time-series motion change data (step S705).
- The generated teacher motion parameters are not three-dimensional coordinate data indicating positions but data indicating the change of the coordinate data in time series, i.e., the difference data of the coordinate values between adjacent frames.
- The teacher motion parameters for the y-coordinate data of the head after polynomial approximation are, for example, as shown in Table 3. Table 3 corresponds to seven frames, but teacher motion parameters are generated for each part coordinate over the time of one full cycle of the chest compression motion.
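A minimal sketch of steps S704 and S705 follows; the polynomial degree and the numeric values are assumptions, since the text mentions polynomial approximation without specifying them:

```python
import numpy as np

def teacher_motion_params(trajectory, degree=3):
    """Smooth a per-frame coordinate trajectory by polynomial approximation
    (step S704) and return the frame-to-frame differences (step S705)."""
    t = np.arange(len(trajectory), dtype=float)
    coeffs = np.polyfit(t, np.asarray(trajectory, dtype=float), degree)
    smoothed = np.polyval(coeffs, t)
    # Teacher motion parameters are not positions but deltas between
    # adjacent frames, so they can later be added onto any physique.
    return np.diff(smoothed)

# Example: seven frames of head y-coordinate data (illustrative values).
params = teacher_motion_params([100, 96, 88, 80, 84, 92, 100])
```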
- The teacher motion parameters, i.e., the time-series change data of the motion, may also be generated in advance.
- In that case, the generation processing of the teacher motion parameter generation unit 601 is performed beforehand, and the teacher motion storage unit 103 stores the teacher motion parameters, i.e., the time-series movement change data for one cycle.
- The teacher motion addition unit 603 then reads the teacher motion parameters from the teacher motion storage unit 103 and performs the addition processing.
- FIG. 7b is a flowchart of the process of generating the teacher motion moving image based on the user's shape model.
- First, the physique data calculation unit 602 instructs the user, by voice or display, to stand still for a predetermined time (for example, about 2 to 3 seconds) at a predetermined position where the user's upper body is captured by the imaging device.
- The part coordinates of each part of the user are then acquired over a plurality of frames (step S711).
- The part coordinates of each part are calculated for each piece of image data by the part coordinate calculation unit 102 and input to the physique data calculation unit 602, which stores the input part coordinate data for the plurality of frames (for example, 50 frames).
- Next, the physique data calculation unit 602 calculates the user's physique data by averaging the stored part coordinate data of each part over the plurality of frames (step S712).
- Next, the teacher motion addition unit 603 acquires the physique data from the physique data calculation unit 602 and sets these part coordinates as the initial values of the physique data (step S713). In principle, the user's initial shape model is generated from the calculated physique data. A correction process for the part coordinates of both elbows and both shoulders is also performed by the teacher motion addition unit 603, as described later.
- The physique data was calculated as the average of the part coordinate data of each part in the stationary state; the part coordinates of the hands, however, are obtained from several trial chest compressions, as follows.
- First, the three-dimensional shape data of the chest of the cardiopulmonary resuscitation manikin is acquired with the user out of the frame.
- Specifically, the part coordinate calculation unit 102 acquires a data string of the coordinates of the ridge line of the manikin's chest.
- Next, the motion evaluation apparatus 100 instructs the user, by voice or the like, to place the hands on the manikin and perform the chest compression motion of pressing into the chest and letting it return several times.
- The physique data calculation unit 602 then obtains the center-of-gravity coordinates (x, y) of the hands at their highest position; this x value becomes the x coordinate of the hand part coordinate data.
- Then the y value of the manikin's chest ridge line at that x coordinate is acquired and compared with the y coordinate of the hands, and the larger y value is set as the y coordinate of the hand part coordinate data. This sets the initial position of the hands to their position in the uncompressed state.
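A sketch of this hand initialization follows; `ridge_y` stands for a lookup into the previously acquired chest ridge line, and larger y is taken to mean higher (both are assumptions):

```python
def initial_hand_position(hand_tops, ridge_y):
    """Pick the initial hand part coordinate from several trial compressions.

    hand_tops: list of (x, y) hand centroids at the top of each trial stroke.
    ridge_y:   function x -> y of the manikin chest ridge line.
    """
    # Highest observed hand position over the trials.
    x, y = max(hand_tops, key=lambda p: p[1])
    # Clamp to the chest surface: the larger y of hand vs. ridge line
    # becomes the initial (uncompressed) hand height.
    return (x, max(y, ridge_y(x)))
```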
- The teacher motion addition unit 603 receives the user's physique data calculated by the physique data calculation unit 602 and the teacher motion parameters generated by the teacher motion parameter generation unit 601.
- The teacher motion addition unit 603 adds the teacher motion parameters at each point in time (each frame image), taking the user's physique data calculated by the physique data calculation unit 602 as the initial values (step S714).
- By adding the teacher motion parameters, which represent the time-series change of the coordinates of each part, to the user's physique data, and then adding the next parameter to the resulting coordinate values, the user's part coordinates change according to the teacher motion. Moving image data of a teacher motion matched to the user's physique can thereby be generated.
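Step S714 amounts to a cumulative sum of the teacher deltas on top of the initial physique coordinates; a minimal per-axis sketch (the numeric values are illustrative):

```python
import numpy as np

def teacher_trajectory(initial_value, deltas):
    """Trajectory of one part coordinate: start at the user's physique value
    and accumulate the teacher motion deltas frame by frame (step S714)."""
    return initial_value + np.concatenate(([0.0], np.cumsum(deltas)))

# Example: apply head y-coordinate deltas to a user whose head sits
# at y = 120 in the initial posture.
frames = teacher_trajectory(120.0, [-4.0, -8.0, -8.0, 4.0, 8.0, 8.0])
```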
- FIG. 8 is a schematic diagram of the correction of the user's physique data.
- In the chest compression motion, the correct initial posture is one in which the arms are relatively straight.
- The user's initial posture, however, may be one in which the arms are bent. For this reason, the part coordinates of both elbows and both shoulders in the physique data are corrected, relative to the physique data of both wrists, by the teacher motion addition unit 603.
- The teacher motion addition unit 603 obtains, from the teacher motion parameter generation unit 601, the initial coordinate data of both wrists, both elbows, and both shoulders at the time the teacher motion parameters were generated, and calculates the initial angle θ formed by the wrist, elbow, and shoulder.
- The initial angle θ may also be calculated and stored in advance.
- Next, the teacher motion addition unit 603 acquires the physique data from the physique data calculation unit 602 and calculates, for each arm, the length from wrist to elbow from the coordinates of the user's wrist and elbow, and likewise the length from elbow to shoulder from the coordinates of the user's elbow and shoulder.
- Then, using the coordinates of both wrists in the physique data as references, together with the initial angle θ of the teacher motion and the calculated wrist-to-elbow and elbow-to-shoulder lengths, the teacher motion addition unit 603 calculates the position coordinates of both elbows and both shoulders when the user's arms form the initial angle θ, and corrects the elbow and shoulder position coordinates in the physique data accordingly.
- The part coordinates of both elbows and both shoulders in the physique data are thus replaced with coordinate data in the correct initial posture, so an initial posture matched to the user's physique can be taught.
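The patent does not spell out the geometric construction; the following is a two-dimensional sketch under the assumptions that the arm lies in the image plane, that y increases upward, and that the forearm of the initial posture points straight up from the fixed wrist:

```python
import math

def correct_arm(wrist, elbow, shoulder, theta):
    """Re-pose one arm about the fixed wrist so the wrist-elbow-shoulder
    angle equals the teacher's initial angle `theta` (radians).
    Returns corrected (elbow, shoulder) coordinates."""
    lower = math.dist(wrist, elbow)       # user's wrist-to-elbow length
    upper = math.dist(elbow, shoulder)    # user's elbow-to-shoulder length
    # Elbow: straight above the wrist, preserving the forearm length.
    ex, ey = wrist[0], wrist[1] + lower
    # Direction from elbow to wrist is straight down (-pi/2); rotating it by
    # the interior angle theta gives the elbow-to-shoulder direction.
    phi = -math.pi / 2 + theta            # theta = pi gives a straight arm
    sx = ex + upper * math.cos(phi)
    sy = ey + upper * math.sin(phi)
    return (ex, ey), (sx, sy)
```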
- FIG. 9 is a flowchart of the process of evaluating the user's motion.
- First, a shape model generated by the user model generation unit 104, performing the teacher motion from the initial posture matched to the corrected user physique, is output to the display device 30 under the control of the output control unit 106, and its superimposed display with the user's image captured by the imaging device is started (step S901).
- The user adjusts his or her posture to match the displayed shape model.
- Next, the motion evaluation unit 105 buffers, for each frame image, the part coordinates of each part of the user's body calculated by the part coordinate calculation unit 102 (step S902). The buffered part coordinate data is used to detect one cycle of the chest compression motion and to calculate feature values for motion evaluation.
- Next, the motion evaluation unit 105 extracts one cycle of the chest compression motion (step S903).
- To do this, it examines the transition of the position coordinates of the user's hands while the chest compression motion is performed on the cardiopulmonary resuscitation manikin.
- Among the part coordinates of the user's hands, the coordinate in the direction of pressing the manikin (the y coordinate) exhibits a movement indicating compression (for example, a group of frames in which the y value keeps decreasing) and a movement indicating the hands returning (for example, a group of frames in which the y value keeps increasing).
- Small movements, in which the change of the coordinate value is below a predetermined value, are treated as invalid.
- When one cycle of the chest compression motion has been extracted, the motion evaluation unit 105 generates feature values for evaluating and advising on the user's motion from the part coordinate data of that cycle (step S904).
- The feature values are, for example, the time required for one cycle of the chest compression motion (the number of frames) and the averages of the minimum left elbow angle and the minimum right elbow angle within the cycle.
- The motion evaluation unit 105 stores the feature values used for evaluation in an evaluation table, for example, in association with their threshold values and evaluations.
- The motion evaluation unit 105 then evaluates the user's motion by referring to the evaluation table and comparing each generated feature value with the stored predetermined threshold (step S905). Each feature value is compared with its threshold, and feature values deviating from the threshold are extracted. The motion evaluation is performed for every cycle; the output of the evaluation results and advice is controlled by the output control unit 106, and audio and screen output is produced for each cycle.
- Next, the motion evaluation unit 105 checks whether a fixed time has elapsed since the start of the evaluation (step S906). If the fixed time has not elapsed (No in step S906), the evaluation continues.
- The fixed time is, for example, 2 minutes, which is the guideline duration for one person to continue chest compressions; however, it is not limited to this and may be set freely.
- When the fixed time has elapsed, the final evaluation result is output (step S907).
- FIG. 10 shows an example in which a shape model image based on the user's physique and the user's image are superimposed and displayed.
- When the teacher motion addition unit 603 generates the user's shape model from the part coordinates of each part of the body, the moving image data of the shape model performing the teacher motion is displayed as a mesh-like model, as shown in FIG. 10.
- Since the shape model is generated from the center-of-gravity part coordinates of the body by existing technology, its description is omitted.
- FIG. 11 is a graph showing the transition of the coordinates of the user's hand part buffered in step S902 of FIG. 9.
- The y coordinate here is the y coordinate in camera coordinates, i.e., the vertical direction. The user's hands move in the direction of pressing and releasing the manikin's sternum, so the change in the y coordinate value is extracted from the part coordinates.
- The motion evaluation unit 105 extracts the change (difference) in the coordinates between pieces of image data.
- FIG. 12 is a table showing the differences between pieces of image data (between image frames) and the sign (positive or negative) of the compression direction.
- When the change between a single adjacent pair of frames is positive but the changes on both sides of it are negative, as between frames 7 and 8, the motion evaluation unit 105 treats that change as negative. In this way, when only the direction of movement over a single frame differs from its neighbors, it is invalidated.
- FIG. 13 is an example of a table in which one cycle of the chest compression motion, detected from the transition of the coordinates of the user's hand part, is calculated.
- A frame group whose coordinate change is "positive" is regarded as the hands moving away from the manikin, i.e., rising, and thus as the return from the chest compression.
- A frame group whose coordinate change is "negative" is regarded as the hands approaching the manikin, i.e., descending, and thus as the movement compressing the sternum.
- The motion evaluation unit 105 extracts one compression movement and the following return movement as a set, and takes that pair as one cycle of the chest compression motion.
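A sketch of the cycle extraction of step S903 and FIGS. 11 to 13 follows, assuming the y value decreases while pressing; the small-motion threshold value is illustrative:

```python
import numpy as np

def extract_cycles(hand_y, min_change=1.0):
    """Split a buffered hand y-coordinate trajectory into chest compression
    cycles: one descending (compression) run plus the following ascending
    (return) run per cycle. Returns (start, end) frame index pairs."""
    diffs = np.diff(np.asarray(hand_y, dtype=float))
    signs = np.sign(diffs)
    signs[np.abs(diffs) < min_change] = 0           # invalidate small motions
    # Invalidate single-frame direction flips (FIG. 12): a sign disagreeing
    # with both of its equal neighbours takes the neighbours' sign.
    for i in range(1, len(signs) - 1):
        if signs[i - 1] == signs[i + 1] != signs[i]:
            signs[i] = signs[i - 1]
    cycles, start, phase = [], None, None
    for i, s in enumerate(signs):
        if s < 0:                                   # descending: compressing
            if phase == "up":
                cycles.append((start, i))           # previous cycle complete
            if phase != "down":
                start = i                           # a new compression begins
            phase = "down"
        elif s > 0 and phase == "down":
            phase = "up"                            # hands returning
    if phase == "up":
        cycles.append((start, len(signs)))
    return cycles
```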
- FIG. 14 is an example of an evaluation table containing the feature values generated in step S904 of FIG. 9 and their thresholds.
- A threshold for the desired motion is stored in association with each feature value, and in step S905 the motion evaluation unit 105 evaluates each feature value against the evaluation table for every cycle of the chest compression motion. For example, if the difference between the initial value and the maximum value of the y coordinate of the right wrist is 40, the motion is judged good.
- The maximum and minimum z coordinates of the head, shoulders, and elbows are also used as feature values for evaluation; since the user faces the camera, the z direction is the depth direction.
- The evaluation table stores the thresholds, and advice may be stored in association with them: for example, "more slowly" as the advice when one cycle takes fewer than 6 frames, and "faster" when it takes 8 frames or more. These evaluations and advice are displayed and output as audio.
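A sketch of such an evaluation-table lookup (step S905); the feature names, ranges, and advice strings echo the examples above but are otherwise illustrative:

```python
# feature name -> (acceptable (min, max) range, advice if below, advice if above)
EVALUATION_TABLE = {
    "frames_per_cycle":   ((6, 7),   "more slowly",  "faster"),
    "right_wrist_y_drop": ((40, 60), "press deeper", "press less deeply"),
}

def evaluate(features):
    """Compare generated feature values against the table (step S905);
    return a list of (feature, advice) for values outside their ranges."""
    advice = []
    for name, value in features.items():
        (lo, hi), below, above = EVALUATION_TABLE[name]
        if value < lo:
            advice.append((name, below))
        elif value > hi:
            advice.append((name, above))
    return advice

# Example: a cycle that took 9 frames -> advice "faster".
print(evaluate({"frames_per_cycle": 9, "right_wrist_y_drop": 42}))
```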
- FIG. 15 shows an example of a display screen in which the moving image data of the shape model performing the teacher motion and the moving image data of the user's chest compression motion are superimposed.
- When a motion is judged exemplary, the number of such cycles may be counted and displayed as "good", as shown in FIG. 15. Also as shown in FIG. 15, the depth to which the user's hands press the manikin, i.e., the transition of the hands' y coordinate, may be displayed together with a threshold line for the depth required for chest compression, so that whether sufficient compression is being performed can be seen at a glance.
- The motion evaluation apparatus 100 may also be configured as a personal computer owned by a system user and a program executed on that personal computer, as shown in FIG. 16.
- The personal computer includes a CPU (Central Processing Unit) 1601, and, connected to the CPU 1601 via a bus, a RAM (Random Access Memory) 1603, a ROM (Read Only Memory) 1605, an external storage device 1607 such as a hard disk drive, an I/O interface 1609, a communication interface 1611 for connecting to a communication network line, and the like.
- A camera 1613, a microphone 1615, and a display 1617 are connected to the I/O interface 1609.
- The functions of the teacher motion storage unit 103 are realized by the external storage device.
- The functions of the imaging device 10, the audio output device 20, and the display device 30 are realized by the camera, the microphone, and the display, respectively.
- The programs realizing the various functions are stored in the external storage device 1607, read out into the RAM 1603, and executed by the CPU 1601.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Educational Technology (AREA)
- Educational Administration (AREA)
- Software Systems (AREA)
- Mathematical Physics (AREA)
- Health & Medical Sciences (AREA)
- Entrepreneurship & Innovation (AREA)
- Computer Graphics (AREA)
- Mathematical Analysis (AREA)
- Mathematical Optimization (AREA)
- Computational Mathematics (AREA)
- Pure & Applied Mathematics (AREA)
- Algebra (AREA)
- General Engineering & Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Chemical & Material Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Heart & Thoracic Surgery (AREA)
- Cardiology (AREA)
- Medical Informatics (AREA)
- Medicinal Chemistry (AREA)
- Geometry (AREA)
- Computer Hardware Design (AREA)
- Databases & Information Systems (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Processing Or Creating Images (AREA)
Abstract
Description
FIG. 2 is a flowchart showing an example of the flow of the part coordinate calculation process for each part of the user's body executed by the motion evaluation apparatus 100. The description here follows the case of teaching the chest compression motion.
FIG. 6 is a functional block diagram showing an example of the configuration of the user model generation unit 104. For example, the user model generation unit 104 comprises a teacher motion parameter generation unit 601, a physique data calculation unit 602, and a teacher motion addition unit 603. The teacher motion parameter generation unit 601 reads the teacher moving image data stored in the teacher motion storage unit 103 and generates the teacher motion parameters. A teacher motion parameter is the change in a part coordinate between adjacent image frames, calculated for each of the x, y, and z coordinate values of each part of the body. Specifically, the part coordinates of each part of the body are calculated in each piece of image data of the teacher moving image, and the teacher motion parameters are generated by taking the time-series differences of each part coordinate. When the teacher motion parameters have already been generated and stored in the teacher motion storage unit 103, the teacher motion parameter generation unit 601 is unnecessary.
For example, the teacher motion parameters for the y-coordinate data of the head after polynomial approximation are as shown in Table 3.
Once the physique data has been calculated, the teacher motion based on the user's model can be displayed, and the process moves to motion learning (motion evaluation mode). While the user performs the motion, the teacher motion based on the user's model is displayed superimposed on it, and the user's motion is evaluated. FIG. 9 is a flowchart of the process for evaluating the user's motion.
20 Sound output device
30 Display device
100 Motion evaluation apparatus
101 Image capture unit
102 Part coordinate calculation unit
103 Teacher motion storage unit
104 User model generation unit
105 Motion evaluation unit
106 Output control unit
Claims (9)
- 1. A motion evaluation device comprising: a part coordinate calculation unit that calculates part coordinates of a user's body based on image data of the user; a user model generation unit that generates a shape model of the user based on the part coordinates and generates moving image data of a teacher motion based on the user's shape model according to teacher motion parameters; a motion evaluation unit that evaluates the user's motion based on the part coordinates; and an output control unit that displays the teacher motion based on the user's shape model superimposed on the user's motion and outputs an evaluation result.
- 2. The motion evaluation device according to claim 1, wherein the part coordinate calculation unit calculates the part coordinates of each part by calculating the center-of-gravity coordinates of each of the user's head, both shoulders, both elbows, hands, and both wrists.
- 3. The motion evaluation device according to claim 1 or 2, wherein the part coordinate calculation unit performs the calculation based on at least parallax information of the pixels within a predetermined range for each part.
- 4. The motion evaluation device according to claim 2 or 3, wherein the part coordinate calculation unit calculates the part coordinates of both shoulders and calculates the part coordinates of the head based on the part coordinates of both shoulders.
- 5. The motion evaluation device according to any one of claims 1 to 4, wherein the user model generation unit comprises: a physique data calculation unit that generates physique data of the user based on the part coordinates calculated by the part coordinate calculation unit; and a teacher motion addition unit that generates teacher moving image data based on the user's shape model by adding the teacher motion parameters to the user's physique data at each point in time.
- 6. The motion evaluation device according to claim 5, wherein the teacher motion addition unit corrects the part coordinates of both elbows and both shoulders in the physique data acquired from the physique data calculation unit, based on the part coordinates of both wrists and an initial angle of the teacher motion.
- 7. The motion evaluation device according to any one of claims 1 to 6, wherein the motion evaluation unit extracts and evaluates one cycle of the user's motion, and the output control unit outputs the corresponding evaluation result.
- 8. The motion evaluation device according to any one of claims 1 to 7, wherein the motion is a chest compression motion and the user's chest compression motion is evaluated.
- 9. A program causing a computer to function as a motion evaluation device comprising: a part coordinate calculation unit that calculates part coordinates of a user's body based on image data of the user; a user model generation unit that generates a shape model of the user based on the part coordinates and generates moving image data of a teacher motion based on the user's shape model according to teacher motion parameters; a motion evaluation unit that evaluates the user's motion based on the part coordinates; and an output control unit that displays the teacher motion based on the user's shape model superimposed on the user's motion and outputs an evaluation result.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/427,801 US20150255005A1 (en) | 2012-09-12 | 2013-09-09 | Movement evaluation device and program therefor |
JP2014535535A JP6124308B2 (ja) | 2012-09-12 | 2013-09-09 | 動作評価装置及びそのプログラム |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-200433 | 2012-09-12 | ||
JP2012200433 | 2012-09-12 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014042121A1 true WO2014042121A1 (ja) | 2014-03-20 |
Family
ID=50278232
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2013/074227 WO2014042121A1 (ja) | 2012-09-12 | 2013-09-09 | 動作評価装置及びそのプログラム |
Country Status (3)
Country | Link |
---|---|
US (1) | US20150255005A1 (ja) |
JP (1) | JP6124308B2 (ja) |
WO (1) | WO2014042121A1 (ja) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017136248A (ja) * | 2016-02-04 | 2017-08-10 | Iwate Prefectural University | Auscultation system |
WO2018124188A1 (ja) * | 2016-12-27 | 2018-07-05 | Coaido Inc. | Measuring device and program |
CN109859537A (zh) * | 2019-03-22 | 2019-06-07 | Jishou University | Brocade-weaving teaching system and method, and information data processing terminal |
JP2020005192A (ja) * | 2018-06-29 | 2020-01-09 | Canon Inc. | Information processing apparatus, information processing method, and program |
WO2022075116A1 (ja) * | 2020-10-06 | 2022-04-14 | National Institute of Advanced Industrial Science and Technology | Information display device and information display method |
JP2023001003A (ja) * | 2021-06-17 | 2023-01-04 | 昭則 皆月 | Cardiopulmonary resuscitation training program, method, device, and system |
Families Citing this family (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2012147961A1 (ja) * | 2011-04-28 | 2012-11-01 | NEC System Technologies, Ltd. | Information processing device, information processing method, and recording medium |
EP2973215B1 (en) * | 2013-03-15 | 2023-05-17 | NIKE Innovate C.V. | Feedback signals from image data of athletic performance |
CN111552416A (zh) * | 2015-04-13 | 2020-08-18 | Huawei Technologies Co., Ltd. | Method, apparatus, and device for starting a task management interface |
US11219575B2 (en) * | 2016-03-23 | 2022-01-11 | Zoll Medical Corporation | Real-time kinematic analysis during cardio-pulmonary resuscitation |
US10737140B2 (en) | 2016-09-01 | 2020-08-11 | Catalyft Labs, Inc. | Multi-functional weight rack and exercise monitoring system for tracking exercise movements |
JP7122090B2 (ja) * | 2017-07-21 | 2022-08-19 | Maeda Corporation | Skilled motion teaching system |
CN111640176A (zh) * | 2018-06-21 | 2020-09-08 | Huawei Technologies Co., Ltd. | Object modeling and movement method, apparatus, and device |
KR102276009B1 (ko) * | 2019-10-15 | 2021-07-13 | 정성호 | Cardiopulmonary resuscitation training simulation apparatus and method |
CN111539245B (zh) * | 2020-02-17 | 2023-04-07 | Jilin University | CPR technique training and evaluation method based on a virtual environment |
EP4288016A4 (en) * | 2021-05-04 | 2024-04-03 | Nutricia Early Life Nutrition (Shanghai) Co.,Ltd. | MASSAGE TRAINING METHOD, DEVICE AND APPLICATION THEREOF |
CN115641646B (zh) * | 2022-12-15 | 2023-05-09 | Xuanwu Hospital, Capital Medical University | Automatic CPR detection and quality control method and system |
CN116012938B (zh) * | 2022-01-27 | 2024-07-09 | Xuanwu Hospital, Capital Medical University | Method and system for constructing an automatic CPR feedback detection model based on the AlphaPose algorithm |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005237494A (ja) * | 2004-02-24 | 2005-09-08 | Nihon Knowledge Kk | Practical skill analysis system and program |
JP2011062352A (ja) * | 2009-09-17 | 2011-03-31 | Koki Hashimoto | Exercise motion teaching device and amusement facility |
JP2011095935A (ja) * | 2009-10-28 | 2011-05-12 | Namco Bandai Games Inc | Program, information storage medium, and image generation system |
Family Cites Families (53)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4448534A (en) * | 1978-03-30 | 1984-05-15 | American Hospital Corporation | Antibiotic susceptibility testing |
US5681223A (en) * | 1993-08-20 | 1997-10-28 | Inventures Inc | Training video method and display |
US5823786A (en) * | 1993-08-24 | 1998-10-20 | Easterbrook; Norman John | System for instruction of a pupil |
US6390996B1 (en) * | 1998-11-09 | 2002-05-21 | The Johns Hopkins University | CPR chest compression monitor |
US6554706B2 (en) * | 2000-05-31 | 2003-04-29 | Gerard Jounghyun Kim | Methods and apparatus of displaying and evaluating motion data in a motion game apparatus |
US20020064764A1 (en) * | 2000-11-29 | 2002-05-30 | Fishman Lewis R. | Multimedia analysis system and method of use therefor |
US20060247070A1 (en) * | 2001-06-11 | 2006-11-02 | Recognition Insight, Llc | Swing position recognition and reinforcement |
US20050272517A1 (en) * | 2001-06-11 | 2005-12-08 | Recognition Insight, Llc | Swing position recognition and reinforcement |
US6702691B2 (en) * | 2002-08-12 | 2004-03-09 | Callaway Golf Company | Static pose fixture |
US7428318B1 (en) * | 2003-12-11 | 2008-09-23 | Motion Reality, Inc. | Method for capturing, measuring and analyzing motion |
US7457439B1 (en) * | 2003-12-11 | 2008-11-25 | Motion Reality, Inc. | System and method for motion capture |
US20060094523A1 (en) * | 2004-11-04 | 2006-05-04 | Hall Carolyn W | Method and apparatus for teaching how to execute a predetermined motion |
US20070291035A1 (en) * | 2004-11-30 | 2007-12-20 | Vesely Michael A | Horizontal Perspective Representation |
WO2007030947A1 (en) * | 2005-09-16 | 2007-03-22 | Anthony Szturm | Mapping motion sensors to standard input devices |
US8323189B2 (en) * | 2006-05-12 | 2012-12-04 | Bao Tran | Health monitoring appliance |
JP2009543649A (ja) * | 2006-07-19 | 2009-12-10 | Koninklijke Philips Electronics N.V. | Health management device |
US8328691B2 (en) * | 2007-02-14 | 2012-12-11 | Koninklijke Philips Electronics N.V. | Feedback device for guiding and supervising physical excercises |
US9149222B1 (en) * | 2008-08-29 | 2015-10-06 | Engineering Acoustics, Inc | Enhanced system and method for assessment of disequilibrium, balance and motion disorders |
US8175326B2 (en) * | 2008-02-29 | 2012-05-08 | Fred Siegel | Automated scoring system for athletics |
US7671802B2 (en) * | 2008-03-17 | 2010-03-02 | Disney Enterprises, Inc. | Active player tracking |
US20100152600A1 (en) * | 2008-04-03 | 2010-06-17 | Kai Sensors, Inc. | Non-contact physiologic motion sensors and methods for use |
KR101483713B1 (ko) * | 2008-06-30 | 2015-01-16 | Samsung Electronics Co., Ltd. | Motion capture apparatus and motion capture method |
US7996793B2 (en) * | 2009-01-30 | 2011-08-09 | Microsoft Corporation | Gesture recognizer system architecture |
US8517834B2 (en) * | 2009-02-17 | 2013-08-27 | Softkinetic Studios Sa | Computer videogame system with body position detector that requires user to assume various body positions |
JP2011078728A (ja) * | 2009-03-10 | 2011-04-21 | Shinsedai Kk | Physical condition evaluation device, state estimation device, stride estimation device, and health management system |
WO2011011633A2 (en) * | 2009-07-22 | 2011-01-27 | Atreo Medical, Inc. | Optical techniques for the measurement of chest compression depth and other parameters during cpr |
US8206266B2 (en) * | 2009-08-05 | 2012-06-26 | David Hall | Sensor, control and virtual reality system for a trampoline |
US8202161B2 (en) * | 2009-10-23 | 2012-06-19 | Disney Enterprises, Inc. | Virtual game instructor |
IT1399855B1 (it) * | 2010-04-28 | 2013-05-09 | Technogym Spa | Apparatus for the assisted execution of a gymnastic exercise. |
US20110269601A1 (en) * | 2010-04-30 | 2011-11-03 | Rennsselaer Polytechnic Institute | Sensor based exercise control system |
US8602887B2 (en) * | 2010-06-03 | 2013-12-10 | Microsoft Corporation | Synthesis of information from multiple audiovisual sources |
US8562403B2 (en) * | 2010-06-11 | 2013-10-22 | Harmonix Music Systems, Inc. | Prompting a player of a dance game |
CN103118647B (zh) * | 2010-09-22 | 2016-04-06 | Panasonic Intellectual Property Management Co., Ltd. | Exercise support system |
US9223936B2 (en) * | 2010-11-24 | 2015-12-29 | Nike, Inc. | Fatigue indices and uses thereof |
WO2012061804A1 (en) * | 2010-11-05 | 2012-05-10 | Nike International Ltd. | Method and system for automated personal training |
US9457256B2 (en) * | 2010-11-05 | 2016-10-04 | Nike, Inc. | Method and system for automated personal training that includes training programs |
US9283429B2 (en) * | 2010-11-05 | 2016-03-15 | Nike, Inc. | Method and system for automated personal training |
US9011293B2 (en) * | 2011-01-26 | 2015-04-21 | Flow-Motion Research And Development Ltd. | Method and system for monitoring and feed-backing on execution of physical exercise routines |
US8718748B2 (en) * | 2011-03-29 | 2014-05-06 | Kaliber Imaging Inc. | System and methods for monitoring and assessing mobility |
US8786680B2 (en) * | 2011-06-21 | 2014-07-22 | Disney Enterprises, Inc. | Motion capture from body mounted cameras |
CN103732297B (zh) * | 2011-06-30 | 2016-11-02 | Orange Labs | Augmented-reality range-of-motion therapy system and method of operation thereof |
US8696450B2 (en) * | 2011-07-27 | 2014-04-15 | The Board Of Trustees Of The Leland Stanford Junior University | Methods for analyzing and providing feedback for improved power generation in a golf swing |
US11133096B2 (en) * | 2011-08-08 | 2021-09-28 | Smith & Nephew, Inc. | Method for non-invasive motion tracking to augment patient administered physical rehabilitation |
US9350951B1 (en) * | 2011-11-22 | 2016-05-24 | Scott Dallas Rowe | Method for interactive training and analysis |
US8951138B2 (en) * | 2012-01-25 | 2015-02-10 | Wawgd, Inc. | Golf club head measurement system |
JP5314224B1 (ja) * | 2012-02-29 | 2013-10-16 | Mizuno Corporation | Running form diagnosis system and method for scoring running form |
KR102025752B1 (ko) * | 2012-07-30 | 2019-11-05 | Samsung Electronics Co., Ltd. | Electronic device for providing content according to a user's posture, and content providing method |
US9579048B2 (en) * | 2012-07-30 | 2017-02-28 | Treefrog Developments, Inc | Activity monitoring system with haptic feedback |
KR20140062892A (ko) * | 2012-11-15 | 2014-05-26 | Samsung Electronics Co., Ltd. | Wearable device and display apparatus for providing an exercise service, exercise service providing system including the same, and method therefor |
US9161708B2 (en) * | 2013-02-14 | 2015-10-20 | P3 Analytics, Inc. | Generation of personalized training regimens from motion capture data |
KR102033077B1 (ko) * | 2013-08-07 | 2019-10-16 | Nike Innovate C.V. | Wrist-worn athletic device with gesture recognition and power management |
US10134307B2 (en) * | 2013-12-12 | 2018-11-20 | Koninklijke Philips N.V. | Software application for a portable device for CPR guidance using augmented reality |
SE539429C2 (en) * | 2015-12-15 | 2017-09-19 | Greater Than S A | Method and system for assessing the trip performance of a driver |
2013
- 2013-09-09 WO PCT/JP2013/074227 patent/WO2014042121A1/ja active Application Filing
- 2013-09-09 JP JP2014535535A patent/JP6124308B2/ja active Active
- 2013-09-09 US US14/427,801 patent/US20150255005A1/en not_active Abandoned
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005237494A (ja) * | 2004-02-24 | 2005-09-08 | Nihon Knowledge Kk | Practical skill analysis system and program |
JP2011062352A (ja) * | 2009-09-17 | 2011-03-31 | Koki Hashimoto | Exercise motion teaching device and amusement facility |
JP2011095935A (ja) * | 2009-10-28 | 2011-05-12 | Namco Bandai Games Inc | Program, information storage medium, and image generation system |
Non-Patent Citations (1)
Title |
---|
"Kinect no Tracking Genri 'Bui Ninshiki ni Motozuku 3D Shisei Suitei'", DERIVE, 5 December 2010 (2010-12-05), Retrieved from the Internet <URL:http://derivecv.tumblr.com/post/2106495200> [retrieved on 20131202] * |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017136248A (ja) * | 2016-02-04 | 2017-08-10 | Iwate Prefectural University | Auscultation system |
WO2018124188A1 (ja) * | 2016-12-27 | 2018-07-05 | Coaido Inc. | Measuring device and program |
JPWO2018124188A1 (ja) * | 2016-12-27 | 2018-12-27 | Coaido Inc. | Measuring device and program |
JP2020005192A (ja) * | 2018-06-29 | 2020-01-09 | Canon Inc. | Information processing apparatus, information processing method, and program |
JP7262937B2 (ja) | 2018-06-29 | 2023-04-24 | Canon Inc. | Information processing apparatus, information processing method, and program |
US11854420B2 (en) | 2018-06-29 | 2023-12-26 | Canon Kabushiki Kaisha | Information processing apparatus having position information of parts of subjects acquired at different timings, information processing method, and storage medium |
CN109859537A (zh) * | 2019-03-22 | 2019-06-07 | Jishou University | Brocade-weaving teaching system and method, and information data processing terminal |
WO2022075116A1 (ja) * | 2020-10-06 | 2022-04-14 | National Institute of Advanced Industrial Science and Technology | Information display device and information display method |
JP2023001003A (ja) * | 2021-06-17 | 2023-01-04 | 昭則 皆月 | Cardiopulmonary resuscitation training program, cardiopulmonary resuscitation training method, cardiopulmonary resuscitation training device, and cardiopulmonary resuscitation training system |
Also Published As
Publication number | Publication date |
---|---|
JPWO2014042121A1 (ja) | 2016-08-18 |
JP6124308B2 (ja) | 2017-05-10 |
US20150255005A1 (en) | 2015-09-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6124308B2 (ja) | Motion evaluation device and program therefor | |
CN109432753B (zh) | Motion correction method and apparatus, storage medium, and electronic device | |
CN111091732B (zh) | Cardiopulmonary resuscitation guidance device and guidance method based on AR technology | |
US11069144B2 (en) | Systems and methods for augmented reality body movement guidance and measurement | |
JP6045139B2 (ja) | Video generation device, video generation method, and program | |
US20160048993A1 (en) | Image processing device, image processing method, and program | |
CN110544301A (zh) | Three-dimensional human motion reconstruction system and method, and motion training system | |
EP2203896B1 (en) | Method and system for selecting the viewing configuration of a rendered figure | |
TW201528225A (zh) | Cardiopulmonary resuscitation teaching system and method | |
CN110544302A (zh) | Human motion reconstruction system and method based on multi-view vision, and motion training system | |
CN110490173B (zh) | Intelligent action scoring system based on a 3D somatosensory model | |
CN112749684A (zh) | Cardiopulmonary resuscitation training and evaluation method and apparatus, device, and storage medium | |
CN111539245B (zh) | CPR technique training and evaluation method based on a virtual environment | |
CN110751100A (zh) | Auxiliary training method and system for sports venues | |
CN114022512A (zh) | Exercise assistance method, apparatus, and medium | |
CN114783001A (zh) | Swimming posture evaluation method, system, apparatus, and computer-readable storage medium | |
CN110477921B (zh) | Height measurement method based on ridge regression of skeleton polylines | |
JP2008065368A (ja) | System for recognizing the position and posture of an object using stereo images, method for recognizing the position and posture of an object, and program for executing the method | |
WO2016107226A1 (zh) | Image processing method and apparatus | |
WO2021039857A1 (ja) | Video generation device | |
CN112818800A (zh) | Sports motion evaluation method and system based on depth images of human skeleton points | |
CN116525061B (zh) | Training monitoring method and system based on remote human posture evaluation | |
JP7482471B2 (ja) | Method for generating a learning model | |
Li et al. | Fitness coach: Design and implementation of a smart mirror based on automatic image recognition and action model comparison | |
KR20200088562A (ko) | Method and system for motion and muscle strength analysis using a deep neural network |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 13837578; Country of ref document: EP; Kind code of ref document: A1 |
 | ENP | Entry into the national phase | Ref document number: 2014535535; Country of ref document: JP; Kind code of ref document: A |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | WWE | Wipo information: entry into national phase | Ref document number: 14427801; Country of ref document: US |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 13837578; Country of ref document: EP; Kind code of ref document: A1 |