WO2022224998A1 - Display device and display method - Google Patents

Display device and display method


Publication number
WO2022224998A1
Authority
WO
WIPO (PCT)
Prior art keywords
moving image, display, control unit, display control, racket
Application number
PCT/JP2022/018361
Other languages
English (en)
Japanese (ja)
Inventor
義之 川口
健汰 西村
貴紀 生田
重雄 青野
Original Assignee
Kyocera Corporation (京セラ株式会社)
Priority date
Filing date
Publication date
Application filed by Kyocera Corporation
Priority to JP2023515504A
Publication of WO2022224998A1

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00: Training appliances or apparatus for special sports
    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00: Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36: Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/37: Details of the operation on graphic patterns
    • G09G5/377: Details of the operation on graphic patterns for mixing or overlaying two or more graphic patterns
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Description

  • The present disclosure relates to a display device and the like for displaying movements of a user's body and equipment.
  • There are known techniques for improving a player's skill by attaching a sensor to a racket and analyzing the movement of the racket (for example, Patent Documents 1 and 2).
  • Patent Document 1 discloses a technique in which an acceleration sensor and a contact sensor are provided on a table tennis racket.
  • Patent Document 2 discloses a technique for displaying both an image of a target person and an index indicating the target person's movement, generated based on the result of measuring that movement with a sensor.
  • Patent Document 1: JP 2009-183455 A; Patent Document 2: JP 2020-205608 A
  • A display device according to one aspect of the present disclosure includes an acquisition unit configured to acquire data of a first moving image showing a user's action using a tool and data of a second moving image showing the movement of the tool according to the action, and a display control unit configured to display the first moving image and the second moving image simultaneously.
  • A display method according to one aspect of the present disclosure includes an acquiring step of acquiring data of a first moving image showing a user's action using a tool and data of a second moving image showing the movement of the tool according to the action, and a displaying step of displaying the first moving image and the second moving image simultaneously.
  • FIG. 1 is a diagram showing an example of a schematic configuration of a motion analysis system according to one aspect of the present disclosure.
  • FIG. 2 is a perspective view showing a configuration example of a racket to which a motion sensor device is attached.
  • FIG. 3 is a block diagram showing an example of a schematic configuration of the motion sensor device and a user terminal.
  • FIG. 4 is a block diagram showing an example of a schematic configuration of the user terminal.
  • FIG. 5 is a sequence diagram showing an example of the flow of main processing in the motion analysis system.
  • FIGS. 6 to 10 are diagrams each showing an example of a display screen on the user terminal.
  • The motion analysis system 100 is, for example, a system that analyzes the motion of sports equipment in use, or the motion of a user who moves a moving object by hitting it with the equipment.
  • "Equipment" here means a tool that the user moves when playing a sport or the like.
  • "Moving object" means an object that is moved in the sport.
  • The equipment includes, for example, rackets for tennis, badminton, table tennis, squash, and lacrosse; baseball and cricket bats; golf clubs; hockey and ice hockey sticks; gym equipment; fishing rods; wear worn by the user; and the like.
  • Moving objects include, for example, balls used in various ball games, badminton shuttlecocks, ice hockey pucks, and the like.
  • A cue in billiards is a tool for hitting balls and can also be included in the equipment.
  • The action of kicking a ball in rugby or soccer can be regarded as the action of hitting the ball with a shoe.
  • The equipment may therefore also include shoes used in rugby and soccer.
  • The equipment may also include clothing, gloves, socks, and the like worn by the user.
  • The equipment may further include tights-like garments worn over the whole or part of the user's body. In this case, analyzing the movement of the garment makes it possible to analyze the movement of the user's body.
  • In the following, a motion analysis system 100 that analyzes a user's motion in order to improve the user's table tennis skill will be described as an example.
  • In this example, the tool is a table tennis racket 6, and the moving object is, for example, a celluloid table tennis ball 7 (also called a ping-pong ball).
  • FIG. 1 is a diagram showing an example of a schematic configuration of the motion analysis system 100.
  • The motion analysis system 100 includes one or more user terminals 2 (display devices).
  • The motion analysis system 100 may also include one or more other user terminals 3 (display devices).
  • The user terminal 2 and the user terminal 3 are communicably connected to each other via a network 4, as shown in FIG. 1.
  • The motion analysis system 100 may include a management server 1 (server) that mediates communication between the user terminals 2 and 3, as shown in FIG. 1. This enables the user terminal 2 and the user terminal 3 to share analysis results and/or confirm analysis results simultaneously.
  • (Racket 6 to which the motion sensor device 5 is attached) A user of the motion analysis system 100 uses a racket 6 to which a motion sensor device 5 (motion sensor) is attached.
  • The motion sensor device 5 measures the motion of the racket 6 held by the user in real time.
  • FIG. 2 is a perspective view showing a configuration example of the racket 6 to which the motion sensor device 5 is attached.
  • Although the shake-hand type racket 6 is illustrated here as an example, the racket is not limited to this.
  • The racket 6 may have another shape, such as a pen-holder type racket.
  • The racket 6 has a hitting part 62 for hitting the ball 7 and a grip 61 for the user to hold.
  • The hitting part 62 has a hitting surface 9a for hitting the ball 7.
  • The racket 6 shown in FIG. 2 is of the shake-hand type and has a pair of front and back hitting surfaces 9a.
  • The hitting surface 9a of the racket 6 is often provided with rubber having various properties (for example, a resilient sheet composed of a sponge sheet and a rubber sheet).
  • The grip 61 is the part that the user grips to manipulate the racket 6.
  • The hitting part 62 and the grip 61 may be formed integrally with each other.
  • The motion sensor device 5 is fixed to the racket 6 and functions as an inertial sensor that detects at least angular velocity.
  • The attachment position, shape, size, and the like of the motion sensor device 5 on the racket 6 may be set as appropriate.
  • In this example, the motion sensor device 5 is attached to the end of the grip 61 of the racket 6 opposite the hitting part 62, although it is not limited to this position.
  • The motion sensor device 5 may be detachable from the racket 6 or may be non-detachably fixed to the racket 6.
  • The motion sensor device 5 can also detect the impact when the ball 7 hits the racket 6.
  • FIG. 3 is a block diagram showing an example of schematic configurations of the motion sensor device 5 and the user terminal 2.
  • The motion sensor device 5 includes a storage unit 52, an angular velocity sensor 53 that detects the angular velocity of the racket 6, an acceleration sensor 54 that detects the acceleration of the racket 6, a communication unit 55 that communicates with the user terminal 2, and a CPU (Central Processing Unit) 50.
  • The angular velocity sensor 53 is, for example, a gyro sensor that detects the angular velocity of the racket 6.
  • The CPU 50 processes signals input to or output from the angular velocity sensor 53, the acceleration sensor 54, and the communication unit 55.
  • The motion sensor device 5 may also include a power supply that supplies power to the components described above; illustration of the power supply is omitted in FIG. 3.
  • The storage unit 52 may include a ROM (Read Only Memory), a RAM (Random Access Memory), an external storage device, and the like.
  • The storage unit 52 stores predetermined application programs and the like executed by the CPU 50.
  • The angular velocity sensor 53 is a triaxial angular velocity sensor capable of detecting the angular velocity about each of the x, y, and z axes.
  • The angular velocity sensor 53 may be a combination of an angular velocity sensor that detects angular velocity about the x-axis, one that detects angular velocity about the y-axis, and one that detects angular velocity about the z-axis.
  • The acceleration sensor 54 is a triaxial acceleration sensor capable of detecting acceleration in the directions along the x, y, and z axes. Any known configuration can be used for the triaxial acceleration sensor.
  • The motion sensor device 5 can detect the moving direction and rotational motion of the racket 6 with the angular velocity sensor 53, and can measure the moving distance and speed of the racket 6 with the acceleration sensor 54. In this way, the motion sensor device 5 enables analysis of the motion of the racket 6.
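The division of labour just described (direction and rotation from the angular velocity sensor 53, distance and speed from the acceleration sensor 54) can be pictured with a minimal numerical-integration sketch. The disclosure does not specify any algorithm; the function and names below are hypothetical, and the sketch assumes bias-free, gravity-compensated samples at a fixed interval.

```python
import numpy as np

def integrate_motion(gyro, accel, dt):
    """Integrate 3-axis angular velocity (rad/s) and acceleration (m/s^2)
    samples into cumulative rotation angles, speed, and distance travelled.

    gyro, accel: arrays of shape (N, 3); dt: sampling interval in seconds.
    """
    gyro = np.asarray(gyro, dtype=float)
    accel = np.asarray(accel, dtype=float)
    # Cumulative rotation about each axis (simple Euler integration).
    angles = np.cumsum(gyro * dt, axis=0)
    # Velocity vector from acceleration, then scalar speed and distance.
    velocity = np.cumsum(accel * dt, axis=0)
    speed = np.linalg.norm(velocity, axis=1)
    distance = float(np.sum(speed * dt))
    return angles, speed, distance
```

A real inertial analysis would also have to handle sensor bias, gravity, and drift; the point here is only that angular velocity yields rotation while acceleration yields speed and distance.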
  • The communication unit 55 is configured to be able to communicate with at least the user terminal 2.
  • This communication may be wireless or wired.
  • For wireless communication, a method using radio waves, a method using infrared rays, or the like can be applied.
  • Among those using radio waves, Bluetooth (registered trademark), WiFi (registered trademark), and the like can be applied.
  • The user terminal 2 may be configured to include a computer; it may be, for example, a smartphone or a tablet as shown in FIG. 1, or a personal computer. Any known computer hardware and OS (Operating System) can be used.
  • The user terminal 2 can be obtained by installing a predetermined application program on a general-purpose computer.
  • The user terminal 2 has a CPU 20, a storage unit 22, an input unit 23 for receiving various user operations, a display unit 24 for displaying images and the like, and a communication unit 25.
  • The user terminal 2 may further include an imaging unit 26.
  • The CPU 20 processes signals input to or output from the input unit 23, the display unit 24, the communication unit 25, and the imaging unit 26.
  • The storage unit 22 may include a ROM (Read Only Memory), a RAM (Random Access Memory), an external storage device, and the like.
  • The storage unit 22 stores analysis programs and the like executed by the CPU 20.
  • The user terminal 2 analyzes the user's motion from various angles based on the user's motion and the movement of the racket 6 during that motion, and presents the analysis results to the user.
  • FIG. 1 shows how the user terminal 2 is used to capture an image of a user's motion while holding the racket 6 to which the motion sensor device 5 is attached.
  • FIG. 4 is a block diagram showing an example of a schematic configuration of the user terminal 2.
  • The control unit 21 shown in FIG. 4 corresponds to the CPU 20 in FIG. 3. That is, each part of the control unit 21 is a functional block realized by the CPU 20 in FIG. 3 executing the analysis program 221.
  • The user terminal 2 includes the control unit 21, the storage unit 22, the imaging unit 26, the input unit 23, the display unit 24, and the communication unit 25.
  • The control unit 21 controls the execution of each function provided in the user terminal 2.
  • The storage unit 22 is a storage device that stores various computer programs read by the control unit 21, data used in the various processes executed by the control unit 21, and the like.
  • The control unit 21 includes an image acquisition unit 211 (acquisition unit), a measurement value acquisition unit 212, a motion analysis unit 213, and a display control unit 214.
  • The image acquisition unit 211 acquires image data (for example, moving image data) from the imaging unit 26.
  • The moving image data captures the user playing table tennis with the racket 6 to which the motion sensor device 5 is attached.
  • The image acquisition unit 211 may also acquire moving image data showing the trajectory of the ball 7 actually hit by the racket 6 held by the user.
  • The image acquisition unit 211 may be configured to acquire, via the communication unit 25, moving image data captured by an external device (not shown) such as a digital camera or a digital video camera.
  • The acquired moving image data is stored as the captured image 222 in the storage unit 22, together with, for example, time information for synchronizing it with the measured values. A method for synchronizing the detection information and the moving image data will be described later.
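The time-information-based synchronization mentioned in this bullet can be pictured as mapping each sensor timestamp onto the nearest video frame on a shared clock. This is only an illustrative sketch, not the method of the disclosure; `align_samples_to_frames` and its parameters are hypothetical names.

```python
import bisect

def align_samples_to_frames(sample_times, frame_times):
    """Map each sensor timestamp to the index of the nearest video frame.

    Both lists are in seconds on a common clock; frame_times must be sorted.
    Returns one frame index per sensor sample.
    """
    indices = []
    for t in sample_times:
        i = bisect.bisect_left(frame_times, t)
        if i == 0:
            indices.append(0)
        elif i == len(frame_times):
            indices.append(len(frame_times) - 1)
        else:
            # Choose the closer of the two neighbouring frames.
            before, after = frame_times[i - 1], frame_times[i]
            indices.append(i if after - t < t - before else i - 1)
    return indices
```

Because an inertial sensor typically samples much faster than a 30 fps camera, several sensor samples map to each frame; an analysis can then attach measured values to the frame shown on screen.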
  • The image acquisition unit 211 also acquires an animation moving image, described later, from the motion analysis unit 213. That is, the image acquisition unit 211 acquires moving image data of a competition video (first moving image) showing the action of the user using the racket 6, and moving image data of a moving image (second moving image) showing the movement of the racket 6 according to the action.
  • The moving image showing the movement of the racket 6 may be, for example, an animation moving image.
  • Below, a case where the image acquisition unit 211 acquires moving image data of a competition video and moving image data of an animation moving image will be described as an example.
  • The measurement value acquisition unit 212 acquires detection information detected by the motion sensor device 5 via the communication unit 25.
  • The detection information is information in any format output from the motion sensor device 5; it may be, for example, a graph representing changes in the position of the racket 6 over time, or measured values indicating changes in the position and orientation of the racket 6 over time.
  • The detection information may include time information indicating the time of measurement.
  • The measurement value acquisition unit 212 may be configured integrally with the image acquisition unit 211.
  • The measurement value acquisition unit 212 may store the acquired detection information in the storage unit 22. Below, a case where the measurement value acquisition unit 212 acquires measured values as the detection information from the motion sensor device 5 will be described as an example.
  • The motion analysis unit 213 analyzes the motion of the racket 6 based on the acquired measured values. In addition, the motion analysis unit 213 synchronizes the moving image data and the measured values, and detects events included in the moving image data.
  • The analysis results may include the movement of the racket, the estimated trajectory and spin rate of the ball 7 hit by the racket 6, the movement of the user's body, and the like.
  • The motion analysis unit 213 stores the analysis results in the storage unit 22 (analysis results 223 in FIG. 4) in association with the measured values and moving image data used for the analysis. Events detected by the motion analysis unit 213 include, for example, the hitting of the ball 7 by the racket 6.
  • The motion analysis unit 213 can analyze the motion of the racket 6 and create an animation moving image by any known method.
  • The motion analysis unit 213 can store analysis results for multiple players (e.g., users) in the storage unit 22.
  • The analysis results may include moving image data of animation moving images.
  • The motion analysis unit 213 may be configured integrally with the measurement value acquisition unit 212 and the image acquisition unit 211.
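The event detection performed by the motion analysis unit 213 (for example, detecting the moment the ball 7 hits the racket 6) is not spelled out in the disclosure. One common approach, shown here purely as an illustrative sketch with hypothetical names, is to look for spikes in the acceleration magnitude measured by the sensor.

```python
def detect_impacts(accel_magnitude, threshold, refractory=5):
    """Return sample indices where the acceleration magnitude first
    exceeds `threshold`, skipping `refractory` samples after each hit
    so that one impact is not counted several times.
    """
    events = []
    i = 0
    n = len(accel_magnitude)
    while i < n:
        if accel_magnitude[i] > threshold:
            events.append(i)
            i += refractory  # ignore the ringing right after the impact
        else:
            i += 1
    return events
```

Combined with the timestamp alignment described earlier, such event indices can be converted into video times, allowing the displayed frame and the detected hit to be matched.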
  • The display control unit 214 reads the analysis results from the storage unit 22 and causes the display unit 24 to display them.
  • The display control unit 214 causes the display unit 24 to display moving image data, analysis results, and the like.
  • The display control unit 214 displays the competition video and the animation moving image simultaneously. In this specification, a case where the display control unit 214 displays the competition video and the animation moving image in separate frames will be described, but this disclosure is not limited to such a layout.
  • The communication unit 25 communicates with the motion sensor device 5, and also with the user terminal 3 and the management server 1.
  • FIG. 5 is a sequence diagram showing an example of the main processing flow in the motion analysis system 100.
  • The user activates the motion sensor device 5 (step S10), and the motion sensor device 5 starts detecting the state of the racket 6 (step S12).
  • The analysis program 221 is activated on the user terminal 2 (step S11).
  • The user terminal 2 (for example, the measurement value acquisition unit 212) receives the measured values from the motion sensor device 5 and analyzes the motion of the racket 6 using them (step S13).
  • The user terminal 2 (image acquisition unit 211) acquires moving image data of the user playing table tennis with the racket 6 to which the motion sensor device 5 is attached.
  • The user terminal 2 analyzes the movement of the racket 6 and the like based on the moving image data and the measured values obtained while the image is captured. Then, the user terminal 2 simultaneously displays on the display unit 24 the competition video showing the user's action as the first moving image and the animation moving image showing the analysis results as the second moving image (step S14: display step).
  • The competition moving image may include game images, which are still images at each point in time (see game images P1 to P10, described later).
  • In Display Examples 1 to 3 below, a moving image of the user playing table tennis (hereinafter sometimes referred to as a "competition moving image") and an animation moving image of the trajectory of the racket 6 are displayed side by side.
  • A common point among them is that mutually corresponding images relating to the players can also be displayed side by side.
  • Display Examples 1 to 3 are described below.
  • FIG. 6 is a diagram showing an example of a display screen on the user terminal 2.
  • The display control unit 214 displays, on the left side of the display unit 24, a competition video R1 of Player X (hereinafter "video R1") and an animation video R2 showing the behavior of Player X's racket (hereinafter "video R2") side by side.
  • The display control unit 214 displays, on the right side of the display unit 24, a competition video R3 of Player Y (hereinafter "video R3") and an animation video R4 showing the behavior of Player Y's racket (hereinafter "video R4") arranged vertically.
  • Video R1 and video R3 are moving image data showing the actions of Players X and Y using the racket 6, respectively.
  • Videos R1 and R3 in particular show actions before and after Players X and Y hit the ball 7, such as serving or rallying.
  • Although the moving images R1 and R3 are shown as still images in the drawing, they can be played back as moving images.
  • Videos R2 and R4 may be animations showing the movement of the racket 6 according to the actions of Players X and Y, respectively. Videos R2 and R4 in particular show the actions before and after Players X and Y hit the ball 7, such as serving or rallying.
  • The display control unit 214 can display afterimages of the racket 6 at the positions through which the racket 6 has passed.
  • Videos R2 and R4 may include information indicating the swing speed at the moment the racket 6 hits the ball 7 (such as "15.0 km/hr" in video R2).
  • Videos R2 and R4 may include graphs showing the swing speed over the course of the player's swing.
  • Videos R2 and R4 may be 3D images showing the racket's posture from various viewpoints, such as the front, back, and sides of the player.
  • Videos R2 and R4 may or may not contain a still image or a moving image of the ball 7.
  • The display control unit 214 can play back the videos R1 to R4 simultaneously or individually according to the user's selection. For example, the display control unit 214 can play back videos R2 and R3 while video R1 is stopped at an arbitrary timing. The display control unit 214 can play back videos R1 and R2 simultaneously according to the user's selection, and can likewise play back videos R3 and R4 simultaneously. The display control unit 214 can also display only one of the videos of Player X and Player Y according to the user's selection.
  • The display control unit 214 can display videos R1 and R2 in a synchronized state. Specifically, the display control unit 214 can play back videos R1 and R2 so that the start of the swing of the racket 6, or the moment the ball 7 is hit by the racket 6, appears on the display unit 24 at the same timing in both videos. The display control unit 214 can likewise display videos R3 and R4 in a synchronized state, and can switch to a state in which only one of videos R1 and R2 and/or only one of videos R3 and R4 is displayed.
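Synchronized playback as described above amounts to starting each video at an offset so that a common event (swing start or ball impact) appears at the same wall-clock moment. A minimal sketch, with hypothetical names:

```python
def playback_offsets(event_time_a, event_time_b):
    """Given the time (s) at which a common event occurs in two videos,
    return the seek offsets that make the event coincide when both
    videos start playing together."""
    lead = min(event_time_a, event_time_b)
    return event_time_a - lead, event_time_b - lead
```

The video in which the event occurs later is seeked forward by the difference, so both videos reach the event at the same moment after playback starts.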
  • The control unit 21 may acquire a plurality of data sets, each including moving image data of a competition video and moving image data of an animation video. These data sets may cover multiple users.
  • The display control unit 214 can simultaneously display the competition video and the animation video included in each of the plurality of data sets. For example, when displaying the competition video and the animation video included in each data set, the display control unit 214 may display them in a mutually synchronized state. In this way, the display control unit 214 can display the videos R1 to R4 on the display unit 24 in a wide variety of ways.
  • The effects of Display Example 1 are as follows. When only the competition video is displayed on the display unit 24 of the user terminal 2, it is difficult for the user to fully grasp the movement of the racket 6 if that movement is fast. Moreover, depending on the position from which the user is imaged, the moving image data may fail to capture part of the user's motion.
  • The inventors of the present application made intensive studies and arrived at the following configuration.
  • The user terminal 2 acquires additional information from the motion sensor device 5 attached to the racket 6, generates an animation video from the additional information, and displays the animation video and the competition video side by side.
  • The user terminal 2 can thereby provide the user with a situation in which the user's movement can be confirmed more objectively.
  • The user can thus correctly understand the features of, and points for improvement in, his or her own motion using the racket 6, and can efficiently improve his or her skill.
  • By displaying the competition videos and animation videos of multiple users side by side, it is possible to easily compare actions, features, or points for improvement between the users.
  • By displaying the competition videos of the user and a professional player side by side together with the animation videos of both, the user can easily understand the difference between his or her actions and the professional player's. In this way, the user can more correctly understand the features of, and points for improvement in, his or her own movement using the racket 6 and efficiently improve his or her skill.
  • The display control unit 214 may display a first still image obtained by freezing video R1 and/or video R3 at a predetermined point in time.
  • The display control unit 214 can display a second still image obtained by freezing video R2 and/or video R4 at a predetermined point in time.
  • The display control unit 214 can simultaneously display a first still image and a second still image obtained by freezing videos R1 and R2, and/or videos R3 and R4, at predetermined points in time.
  • This allows the user to easily compare the competition video and the animation image capturing his or her body movement at a given moment, which leads to efficient skill improvement.
  • The display control unit 214 may generate a plurality of first still images by freezing video R1 and/or video R3 at a plurality of points in time.
  • The display control unit 214 can display the plurality of first still images side by side.
  • The display control unit 214 can generate a plurality of second still images by freezing video R2 and/or video R4 at a plurality of points in time.
  • The display control unit 214 may generate a plurality of second still images based on a plurality of viewpoints at an arbitrary point in time.
  • The display control unit 214 can display the plurality of second still images side by side. This makes it easier for the user to check his or her body movements, features, points for improvement, and the like across a continuous motion, which can lead to efficient skill improvement.
  • The display control unit 214 may generate mutually corresponding pluralities of first and second still images by freezing videos R1 and R2, and/or videos R3 and R4, at a plurality of points in time. The display control unit 214 may then display each corresponding pair of first and second still images side by side at the same time.
  • The user can thus check the movement of his or her body through a series of motions by combining the competition video and the animation images, which can lead to efficient skill improvement. For example, the user can use the animation images to supplement motions that could not be confirmed from the competition video, and vice versa.
  • The display control unit 214 may display video R1 and/or video R3 for a predetermined period including a predetermined point in time. Further, the display control unit 214 may display video R2 and/or video R4 for a predetermined period including that point in time. For example, the video immediately before and after the impact of the ball 7 on the racket 6, which greatly affects the rotation, trajectory, direction, and so on of the ball 7, is an important point for the user's skill improvement. By checking the moving images for the predetermined period including the predetermined point in time, the user can efficiently improve the points that matter most.
  • The display control unit 214 may be configured to display the racket 6 together with a part of the user's body in video R2 and/or video R4.
  • For example, the display control unit 214 may display the racket 6 and the user's hand holding it in video R2 and/or video R4.
  • The shake-hand type racket 6 (see FIG. 2) has a front hitting surface 9a (on the palm side of the gripping hand) and a back hitting surface 9a (on the back side of the gripping hand). When the user holds a shake-hand type racket 6, the ball 7 can be hit with either the front or the back hitting surface 9a.
  • The user terminal 2 can thus allow the user to distinguish the front and back sides of the racket 6 in video R2 and/or video R4. The user can thereby accurately and easily recognize the relationship between the movement of the racket 6 and the movement of his or her body in the animation video.
  • The display control unit 214 may display a still image or a moving image of the ball 7 together with video R2 and/or video R4. For example, in the examples shown in FIGS. 6 and 7, a still image of the ball 7 hit by the racket 6 is displayed together with the racket 6. In this case, based on the measured values received from the motion sensor device 5, the motion analysis unit 213 determines the position of the racket 6 and the time at which the ball 7 hits it, and where on the racket 6 the ball 7 hits. The position at which the ball 7 is to be displayed in video R2 and/or video R4 is thereby determined.
  • The display control unit 214 displays video R2 and/or video R4 with the ball 7 shown at the determined position. The user can thereby easily recognize the attitude of the racket 6 at the moment the ball 7 is hit.
  • The motion analysis unit 213 may analyze the measured values received from the motion sensor device 5 together with the moving image data acquired by the image acquisition unit 211. The motion analysis unit 213 can thereby determine the position of the ball 7 when it separates from the racket 6, and the display control unit 214 can then display a moving image of the ball 7.
  • The user terminal 2 can thus allow the user to easily recognize the movement of the racket 6 with respect to the ball 7 and the attitude of the racket 6.
  • the display control unit 214 may display afterimages of the racket 6 over a predetermined period together with the moving image R2 and/or the moving image R4. Specifically, the display control unit 214 may display afterimages of the racket 6 in the moving image R2 and/or the moving image R4 at the positions the racket 6 has passed through. In this case, the display control unit 214 may display, as afterimages, the positions of the racket 6 in its continuous motion at times separated by a constant time interval. Alternatively, the display control unit 214 may display, as afterimages, the positions of the racket 6 in its continuous motion at times separated by irregular time intervals.
  • the motion analysis unit 213 may acquire time information by the method for synchronizing detection information and moving image data described later, and may specify the hit time by analyzing the measurement values received from the motion sensor device 5.
  • afterimages of the racket 6 are displayed at regular time intervals (for example, 0.001 seconds).
  • the user terminal 2 can allow the user to easily check how the attitude and trajectory of the racket 6 are changing during the swing motion.
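Sampling racket poses at a constant time interval for afterimage display could be sketched as follows (a hypothetical helper; the patent does not prescribe an implementation):

```python
def afterimage_poses(poses, interval):
    """Pick racket poses at (approximately) constant time steps.

    poses: list of (t, pose) samples, sorted by time.
    Returns the subset to draw as afterimages, e.g. one pose
    every 0.001 seconds of the swing.
    """
    selected, next_t = [], poses[0][0]
    for t, pose in poses:
        if t >= next_t:
            selected.append((t, pose))
            next_t = t + interval
    return selected
```

Irregular intervals (the alternative mentioned above) would simply replace the fixed `interval` with a per-step schedule.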
  • the image acquisition unit 211 and/or the measurement value acquisition unit 212 may acquire information indicating the speed of the racket 6 at a predetermined point in time.
  • the display control unit 214 may display information indicating the swing speed of the racket 6 together with the video R2 and/or the video R4.
  • information indicating the swing speed when the racket 6 hits the ball 7 is displayed in the animation R2 and/or the animation R4.
  • the information indicating the swing speed may be an analysis result obtained by the motion analysis unit 213 analyzing the measurement values received from the motion sensor device 5.
  • the user terminal 2 can provide the user with quantified swing speed information in addition to visual swing speed information based on moving images. This allows the user to accurately recognize and evaluate the swing speed of his/her own swing motion.
  • the display control unit 214 may display a numerical value indicating the swing speed on the video R2 and/or the video R4 as information on the swing speed. In the examples shown in FIGS. 6 and 7, values indicating the swing speed at the time of hit are displayed (for example, 15.0 km/hr, 24.1 km/hr, etc.).
  • the motion analysis unit 213 may analyze the swing motion at regular time intervals or continuously. For example, the motion analysis unit 213 may calculate the swing speed at all points in the swing motion of the racket 6. In this case, the display control unit 214 may display the swing speed changing in conjunction with the video playback time, or may display the swing speed at regular time intervals.
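One simple way to obtain a swing-speed value at every sample point is a finite difference over tracked racket positions; the following is an illustrative sketch (the position representation is an assumption, not taken from the patent):

```python
def speed_series(positions):
    """Finite-difference speed of the racket at each sample.

    positions: list of (t, x, y, z) samples of a tracked racket point,
    sorted by time. Returns a list of (t, speed) pairs that could be
    shown alongside the video playback time.
    """
    out = []
    for (t0, *p0), (t1, *p1) in zip(positions, positions[1:]):
        dt = t1 - t0
        dist = sum((a - b) ** 2 for a, b in zip(p0, p1)) ** 0.5
        out.append((t1, dist / dt))
    return out
```

The same series also supplies the data points for the speed-over-time graph described below.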
  • the image acquisition unit 211 and/or the measured value acquisition unit 212 may acquire information indicating the swing speed of the racket 6 at a predetermined point in time.
  • the display control unit 214 may display a graph indicating the speed of the racket 6 together with the video R2 and/or the video R4.
  • the display control unit 214 may display a graph showing temporal changes in the swing speed of the racket 6 in the swing motion on the video R2 and/or the video R4.
  • the graph showing the change in swing speed over time may be generated by the display control unit 214 based on the results of the motion analysis unit 213 analyzing the measurement values received from the motion sensor device 5.
  • the user terminal 2 can provide the user with information not only about the hit time but also about the start time of the swing motion and the swing speed before and after the hit time (for example, during the follow-through). This allows the user to correctly understand the characteristics of his/her own swing motion.
  • the image acquisition unit 211 and/or the measurement value acquisition unit 212 may acquire two or more pieces of information indicating the swing speed of the racket 6 at a predetermined point in time.
  • the display control unit 214 may display two or more graphs indicating the swing speed of the racket 6 overlaid on the video R2 and/or the video R4.
  • the display control unit 214 may display a graph in which the graph for Player X and the graph for Player Y are superimposed.
  • the animation R2 and/or the animation R4 may be animations that can display the movement of the racket 6 from two or more viewpoints.
  • the display control unit 214 may display the moving image R2 and/or the moving image R4 in a state in which two or more viewpoints can be freely switched.
  • the display control unit 214 may generate and display, as the moving image R2 and/or the moving image R4, a three-dimensional image (3D image) in which the racket 6 of each player can be seen from various viewpoints such as the front, the rear, and the side.
  • the moving image R2 and/or the moving image R4 may be a 3D image in which the viewpoint of viewing the racket 6 of each player can be switched by the user's operation.
  • the user terminal 2 can allow the user to confirm the three-dimensional posture of the racket 6. Thereby, the user can confirm the posture of the racket 6 more accurately.
  • the image acquisition unit 211 and/or the measurement value acquisition unit 212 may acquire the movements of two or more rackets 6.
  • the display control unit 214 may superimpose and display two or more animation videos showing movements of the racket 6 as the video R2 and/or the video R4.
  • the display control unit 214 may superimpose and display animation videos showing movements of a plurality of rackets 6 as the video R2 and/or the video R4.
  • the display control unit 214 may superimpose and display animation videos showing movements of the racket 6 of the same player at a plurality of points in time.
  • the display control unit 214 may display animation videos showing movements of the rackets 6 of different players, such as Player X and Player Y, overlapping each other.
  • a configuration may be adopted in which the user appropriately selects a plurality of animation moving images to be superimposed and displayed.
  • the plurality of animated moving images to be superimposed and displayed may be animation moving images corresponding to a certain period before and after the hit time as a reference.
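Superimposing several swings "with the hit time as a reference" amounts to re-timing each swing so its hit occurs at a common origin. A hypothetical sketch (names and the window convention are assumptions):

```python
def align_on_hit(swing, hit_t, window=0.1):
    """Re-time a swing so its hit occurs at t = 0 and clip to a window.

    swing: list of (t, pose); hit_t: hit timestamp within the swing.
    Returns (t - hit_t, pose) pairs with |t - hit_t| <= window, so
    several swings clipped this way can be drawn on a shared time axis
    and superimposed.
    """
    return [(round(t - hit_t, 9), pose)
            for t, pose in swing if abs(t - hit_t) <= window + 1e-12]
```

Two swings (for example, Player X's and Player Y's) aligned this way share the same time origin and can be overlaid frame by frame.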
  • the display control unit 214 may display the table tennis table 8 (see FIG. 1) together with the moving image R2 and/or the moving image R4.
  • the image acquisition unit 211 and/or the measurement value acquisition unit 212 may acquire the relative positions of the racket 6 and the table tennis table 8.
  • the display control unit 214 may display the moving image R2 and/or the moving image R4 reflecting the relative positions of the racket 6 and the table tennis table 8.
  • the user terminal 2 can allow the user to recognize the relative positions of the racket 6 and the table tennis table 8. Thereby, the user can more accurately check the posture of the racket 6 while checking the position of the table tennis table 8 and the hitting direction.
  • (1) Time information is included in each of the detection information and the video data, and the times indicated by the respective pieces of time information are matched with each other.
  • (2) A predetermined motion of the user that is not performed in table tennis (for example, a golf swing) is linked to the start/end of measurement by the motion sensor device 5 and of imaging by the imaging unit 26 (or an external device such as a digital camera).
  • (3) The motion sensor device 5 and the imaging unit 26 start or end recording at the same time.
  • the synchronization method is not limited to the above (1) to (3), and other methods can be adopted.
  • the frame rate of the video data may be any of various frame rates.
  • for example, the frame rate may be 30 fps, 100 fps, or 1000 fps.
  • when the frame rate is high, it becomes easier to synchronize the competition video and the animation video.
  • when a crystal gyro sensor module manufactured by Kyocera Corporation is attached to the racket 6, highly accurate measurement of the movement of the racket 6, which was difficult with conventional gyro sensors, is realized.
  • when the frame rate of the video data is 1000 fps, it is possible to provide the user with an image in which the game video and the animation video are synchronized at the moment the ball 7 is hit with the racket 6.
  • the motion analysis system 100 can simplify the overall system, and can avoid situations where the overall system becomes complicated, such as when using a high-speed camera.
  • moving images R1 and R2, and moving images R3 and R4 can be synchronized by a similar method.
  • the user terminal 2 may compare the time information obtained by (1) to (3) above with the obtained video data.
  • the user terminal 2 may synchronize each video at the time when Player X and Player Y start swinging, or the time when the racket 6 hits the ball 7, or the like.
  • the user terminal 2 may compare the time information obtained by (1) to (3) above with the obtained detection data.
  • the display control unit 214 may synchronize the moving images R1 and R2, and the moving images R3 and R4 with reference to a predetermined point in time. In this case, for example, the display control unit 214 may synchronize each moving image at the end of data acquisition, the time at which the racket 6 hits the ball 7, or the like.
  • the display control unit 214 may skip part of the moving images R1 to R4 during display so that the video playback times of the moving image R1, the moving image R2, the moving image R3, and the moving image R4 match.
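Once a common reference time is established by one of methods (1) to (3) above, each sensor timestamp can be mapped to a video frame index. The sketch below is illustrative only; the offset and frame-rate parameters are assumed names, not terms from the patent:

```python
def sensor_time_to_frame(t_sensor, t_offset, fps):
    """Map a sensor timestamp to the index of the nearest video frame.

    t_offset: the sensor time corresponding to video frame 0, obtained
    by one of the synchronization methods (e.g. a shared start trigger
    or a recognizable calibration motion).
    """
    return round((t_sensor - t_offset) * fps)
```

At 1000 fps the mapping is precise to 1 ms, which is why a high frame rate makes the competition video and the animation video easier to synchronize.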
  • the display control unit 214 may cause the display unit 24 to display the time seek bar SB for operating the reproduction of the first moving image showing the action of the user using the racket 6 . Further, the display control unit 214 may cause the display unit 24 to display a time seek bar SB for operating reproduction of the second moving image showing the movement of the racket 6 according to the user's motion.
  • the time seek bar SB is provided with one or a plurality of marks M capable of starting reproduction from a predetermined point in time.
  • FIG. 7 is a diagram showing an example of a display screen on the user terminal 2. Description of the contents already described with reference to FIG. 6 is omitted.
  • the display control unit 214 causes the time seek bar SB and two marks M to be displayed on the moving image R1, and the time seek bar SB and three marks M to be displayed on the moving image R3.
  • a mark M corresponds to a time at which the racket 6 hits the ball 7.
  • one mark M exists for each time the racket 6 hits the ball 7 within the predetermined period.
  • the display control unit 214 causes the display unit 24 to display the moving image R2 (R4) corresponding to the mark M.
  • the moving image R2 (R4) can be said to be an animation image specialized in hitting the ball 7.
  • the display control unit 214 displays on the display unit 24, as the moving image R2 (R4), the movement of the racket 6 during, for example, 0.1 seconds before and after the hit.
  • the time interval is not limited to 0.1 seconds, and may be changed as appropriate, such as 0.3 seconds.
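Extracting the short clip played when a mark M is selected could be sketched as follows (a hypothetical helper, assuming frames are addressed by index):

```python
def clip_around_hit(frames, fps, hit_frame, half_window=0.1):
    """Return the frames within +/- half_window seconds of a hit.

    frames: the full list of frame payloads; hit_frame: the frame index
    at the hit time (e.g. the frame a mark M points to). Used here to
    build the short animation (moving image R2/R4) played around a
    selected mark.
    """
    n = max(1, int(round(half_window * fps)))
    lo = max(0, hit_frame - n)
    hi = min(len(frames), hit_frame + n + 1)
    return frames[lo:hi]
```

Changing `half_window` to 0.15 would give the 0.3-second variant mentioned above.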
  • the display control unit 214 can display the ball 7 in the video R2 (R4) according to the position of the ball 7.
  • the display control unit 214 can display the moving image R1 (R3) as a still image and repeatedly display the moving image R2 (R4).
  • the display control unit 214 may or may not match the display speeds of the video R1 (R3) and the video R2 (R4).
  • the display control unit 214 can display an afterimage of the racket 6 at the position where the racket 6 has passed.
  • the effects of display example 2 are as follows. When a plurality of hits are performed within a predetermined time, it takes time and effort for the user to find a specific hit among them. Therefore, the display control unit 214 causes the seek bar SB to display the marks M associated with the hits of the ball 7 by the racket 6. By providing such a UI (User Interface) screen, the display control unit 214 allows the user to efficiently improve his or her skill while saving labor and time.
  • the control unit 21 acquires the skeleton data of the skeleton model generated based on the moving image data of the first moving image showing the action of the user using the racket 6.
  • the display control unit 214 causes the display unit 24 to display the skeleton model superimposed on the first moving image.
  • the display control unit 214 can cause the display unit 24 to display the skeletal model instead of the first moving image.
  • FIG. 8 is a diagram showing an example of a display screen on the user terminal 2.
  • the display control unit 214 causes the characters Player X and Player Y to be displayed vertically on the left side of the display unit 24.
  • the display control unit 214 displays the game images P1 to P5 of Player X (hereinafter, "images P1 to P5") on the right side of Player X, and the game images P6 to P10 of Player Y (hereinafter, "images P6 to P10") on the right side of Player Y.
  • the images P4 and P9 each correspond to the time when the racket 6 hits the ball 7.
  • Image P1 and image P6 are images 0.3 seconds before the time when the racket 6 hits the ball 7.
  • Image P2 and image P7 are images 0.2 seconds before the time when the racket 6 hits the ball 7.
  • Image P3 and image P8 are images 0.1 seconds before the time when the racket 6 hits the ball 7.
  • Image P5 and image P10 are images 0.1 seconds after the racket 6 hits the ball 7.
  • Each of the images P1 to P10 includes a skeletal model S generated based on video data in which the user's actions using the racket 6 are shown.
  • the display control unit 214 displays the skeletal model S superimposed on each of the user images in the images P1 to P10.
  • the display control unit 214 may display the skeleton model S instead of the user images of the images P1 to P10.
  • the display control unit 214 can superimpose and display the images p1 to p10 in the respective image areas of the images P1 to P10.
  • the images p1 to p10 are images showing the posture of the racket corresponding to the images P1 to P10, respectively.
  • the display control unit 214 displays consecutive images of the user's skeleton data arranged in chronological order with reference to the time when the racket 6 hit the ball 7 and consecutive images of the animation of the racket 6 side by side.
  • the number of continuous images is not limited to 5, and may be 3, 6, or 10.
  • the display control unit 214 can display an image corresponding to the time when the racket 6 hits the ball 7 with a thick frame or the like so that the user can easily check the image.
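The frame indices for such a strip of images around the hit time could be computed as follows (an illustrative sketch; the offset values simply mirror the P1 to P5 layout described above):

```python
def strip_frame_indices(hit_frame, fps, offsets=(-0.3, -0.2, -0.1, 0.0, 0.1)):
    """Frame indices for a strip of images around the hit time.

    Mirrors the P1..P5 layout: three frames before the hit, the hit
    frame itself, and one frame after, at 0.1-second spacing. Changing
    `offsets` yields strips of 3, 6, or 10 images instead.
    """
    return [hit_frame + round(o * fps) for o in offsets]
```

The index at offset 0.0 identifies the image to emphasize (for example, with a thick frame).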
  • the effects of display example 3 are as follows.
  • the motion of the racket 6, which is difficult to confirm only with the game video, can be visually complemented by the animation video generated by the motion analysis unit 213.
  • the display control unit 214 can display, for comparison, the change in the user's movement and the change in the attitude of the racket before and after the racket 6 hits the ball 7. Furthermore, the display control unit 214 can display, for comparison, changes in the movements of other users and changes in the attitude of the racket before and after the racket 6 hits the ball 7. This makes it easier for the user to understand, for example, the difference in action between the user and a professional player and the change in racket posture.
  • the user can check the images of the body and the racket 6 at the same timing in association with each other. By checking these images, the user can easily understand how to control the posture of the racket and how to control the movement of the body required for that.
  • the display control unit 214 can display the continuous images in which the skeleton data of the user are arranged in chronological order and the continuous images of the animation of the racket 6 side by side. As a result, the user can correctly understand the features and points of improvement in his or her own actions using the racket 6 while making comparisons with other players, and can improve his/her skill.
  • the display control unit 214 can display the images P1 to P5 and/or the images P6 to P10 simultaneously or individually according to the user's selection.
  • the display control unit 214 can display the images p1 to p5 and/or the images p6 to p10 simultaneously or individually according to the user's selection.
  • the display control unit 214 can display only one image of Player X or Player Y according to the user's selection.
  • the display control unit 214 can display a pair of images (image P1 and image p1, etc.) in a synchronized state.
  • the display control unit 214 can switch to a state in which only one of the images P1 to P5 and the images p1 to p5 (or the images P6 to P10 and the images p6 to p10) is displayed.
  • the display control unit 214 can display various images on the display unit 24 with a wide variety of variations. As a result, the user can efficiently improve his skill.
  • the display control unit 214 can also cause the display unit 24 to display an appropriate combination of the display examples 1 to 3.
  • for example, the display control unit 214 may superimpose and display the skeleton model S of display example 3 (FIG. 8) on the video R1 and the video R3 of display example 1 (FIG. 6).
  • the display control unit 214 can simultaneously display the data of three or more players instead of the data of two players (Players X and Y).
  • the data may include at least one of video data of a competition video, video data of an animation video, and skeleton data.
  • the display control unit 214 can simultaneously display current and past data of one user instead of two users (Players X and Y). In this way, the user can efficiently improve his/her skill through images with a wide variety of variations.
  • the user terminal 2 can effectively present body movements and equipment movements, and can improve the skill of the user using the equipment.
  • the display control unit 214 may display the screen of the display example 3 (FIG. 8) when the mark M is selected by the user in the display example 2 (FIG. 7). At this time, the display control unit 214 may select and display appropriate data as a comparison target. Alternatively, the display control unit 214 may display data corresponding to another mark M selected by the user as a comparison target.
  • Display examples 1 to 3 in which data of different players (eg, Player X and Player Y) are displayed at the same time have been described, but the user terminal 2 is not limited to this configuration.
  • the user terminal 2 may simultaneously display data of the same player acquired at different times.
  • the user terminal 2 may read image data of competition videos acquired at different times in the past from the storage unit 22 and display them as videos R1 and R3, respectively.
  • the user terminal 2 may display the video data A1 of the competition video acquired at the past time TA as the video R1, and display the video data B1 of the competition video acquired at the past time TB as the video R3.
  • the user terminal 2 may display, as the moving image R2, the moving image data A2 of the animation moving image generated based on the detection information acquired from the motion sensor device 5 at the past time TA, and may display, as the moving image R4, the moving image data of the animation moving image generated based on the detection information acquired from the motion sensor device 5 at the past time TB.
  • the user terminal 2 can present, for example, the user's own motion and the past motion with the player to the user at the same time. This allows the user to compare his/her own motions with, for example, motions of various famous players who have played an active role in the past. In addition, the user can compare his or her recent actions with his or her past actions to check how much his or her skill has improved.
  • the user terminal 2 may acquire an annotation related to the video data of the competition video in association with the video data of the competition video. Further, the user terminal 2 may acquire an annotation related to the animation moving image generated based on the detection information acquired from the motion sensor device 5 in association with the moving image data of the animation moving image. Both the annotations about the competition video and the annotations about the animation video may include information about the player corresponding to the video data, comments and requests to the player, and the like. Annotations about the competition videos and annotations about the animation videos may be input from the input unit 23, for example. The user terminal 2 may store the acquired moving image data in the storage unit 22 in association with the information about the player and the comment to the player.
  • the user terminal 2 can present the annotations associated with each piece of moving image data to the user when allowing the user to select the moving image data to be displayed as the moving image R1. Thereby, the user can correctly select the moving image data to be displayed as the moving image R1. The user can also appropriately select the video data of the competition video to be displayed as the video R3 (for example, the video data of a competition video to be compared with the video R1), using the annotations associated with each piece of video data as a clue.
  • when the user terminal 2 acquires an annotation about a game video via the input unit 23, it may select and display video data of a suitable game video according to that annotation. Further, when the user terminal 2 acquires an annotation about an animation moving image via the input unit 23, it may select and display moving image data of a suitable animation moving image according to that annotation.
  • the user terminal 2 may receive inputs such as information about the player's body, the type of racket 6 used by the player, and the type of shot as annotations about the game video and annotations about the animation video. Information about the player's body may include information about the player's height, arm length, and the like.
  • the user terminal 2 may select image data of similar competition videos based on the annotations on the competition videos, and display the selected videos. Further, the user terminal 2 may select image data of similar animation videos based on the annotations on the animation videos, and display the selected videos.
  • the user terminal 2 may generate video data showing ideal movements from the video data of a plurality of competition videos stored in the storage unit 22.
  • the user terminal 2 may generate moving image data of an animation moving image showing ideal movements from a plurality of animation moving images stored in the storage unit 22.
  • for example, the user terminal 2 may generate an ideal animation moving image by averaging the detection information used to analyze the movements of the rackets 6 of a plurality of players with similar body information.
  • the user terminal 2 may generate, from the generated ideal animation moving image, video data of a game video in which a simulated player moves.
  • the user terminal 2 can present to the user a moving image in which the individual characteristics (so-called habits, etc.) of the player's actions are reduced.
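Averaging several swings that have already been aligned on a common time grid could be sketched as follows (an illustrative helper; the patent does not specify the averaging procedure):

```python
def average_swings(swings):
    """Average several time-aligned swings sample by sample.

    swings: a list of swings, each a list of (x, y, z) racket positions
    sampled on the same time grid (for example, already aligned on the
    hit time). Returns the point-wise mean trajectory, in which the
    individual characteristics of any one player are reduced.
    """
    n = len(swings)
    return [tuple(sum(vals) / n for vals in zip(*samples))
            for samples in zip(*swings)]
```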
  • the display control unit 214 may compare the moving image R1 and the moving image R3 and display a mark M on the time seek bar SB for periods in which the deviation exceeds a predetermined value. Further, the display control unit 214 may compare the moving image R2 and the moving image R4 and display a mark M on the time seek bar SB for such periods. For example, the display control unit 214 may compare the body posture, the angle of the racket, the position of the racket, and the like between the moving images R1 and R3 and/or the moving images R2 and R4, and display a mark M on the time seek bar SB for periods in which the deviation is greater than the predetermined value.
  • the user terminal 2 can effectively present the difference in the swing motions of the players to the user.
  • the user can, for example, compare his or her own swing motion with the swing motions of other players, and clearly recognize a significantly different motion.
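Finding the periods to mark could be reduced to a threshold test over per-frame features; the sketch below is hypothetical (the scalar-feature representation, e.g. racket angle per frame, is an assumption):

```python
def deviation_marks(series_a, series_b, threshold):
    """Indices where two aligned pose series differ beyond a threshold.

    series_a / series_b: per-frame scalar features on the same time
    grid (e.g. racket angle in degrees for two players). Returns the
    frame indices at which a mark M would be placed on the seek bar.
    """
    return [i for i, (a, b) in enumerate(zip(series_a, series_b))
            if abs(a - b) > threshold]
```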
  • the configuration in which the user terminal 2 has the image acquisition unit 211, the measurement value acquisition unit 212, and the display control unit 214 has been described, but the user terminal 2 according to the present disclosure is not limited to this configuration.
  • a server device different from the user terminal 2, such as the management server 1, may have some or all of the functions of the control unit 21 and/or the storage unit 22 described above.
  • the management server 1 analyzes the movement of the racket 6 and the movement of the user's body, and sends the analysis results to the user terminal 2.
  • the analysis result may include the video data of the competition video and the video data of the animation video.
  • the analysis result may further include information indicating swing speed.
  • the user terminal 2 causes the display unit 24 to display the analysis result received from the management server 1.
  • the management server 1 may be communicably connected to a plurality of user terminals 2. By adopting this configuration, the management server 1 can also distribute the results of analyzing the movement of the racket 6 and the movement of the user's body to the plurality of user terminals 2.
  • the display control unit 214 may display the moving images R1 and R3 and/or the moving images R2 and R4 in an overlapping manner. At this time, there may be a difference in frame rate between the videos R1 and R3, which are competition videos, and the videos R2 and R4, which are animation videos. In this case, the display control unit 214 may perform a process of filling the spaces between frames in a moving image with a low frame rate. For example, the display control unit 214 may have a function of predicting and displaying a frame image at a predetermined point in time based on preceding and subsequent frame images.
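Filling the gaps of the lower-frame-rate stream can be done by interpolating between neighboring frames. A real implementation would predict whole frame images; the sketch below interpolates tracked 2D positions as a simplified stand-in (the representation is an assumption):

```python
def interpolate_frames(frames, factor):
    """Linearly interpolate extra samples into a low-frame-rate stream.

    frames: list of (x, y) tracked positions, one per frame; factor:
    number of output samples per input interval. The output has
    factor * (len(frames) - 1) + 1 samples, so a 100 fps stream with
    factor=10 matches a 1000 fps stream for overlaid display.
    """
    out = []
    for (x0, y0), (x1, y1) in zip(frames, frames[1:]):
        for k in range(factor):
            a = k / factor
            out.append((x0 + a * (x1 - x0), y0 + a * (y1 - y0)))
    out.append(frames[-1])
    return out
```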

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Physical Education & Sports Medicine (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention effectively presents body movement and the movement of a piece of equipment in order to improve the skill of the user who uses the equipment. A user terminal comprises an image acquisition unit that acquires a moving image (R1) showing actions performed by a user using a racket and a moving image (R2) showing the movement of the racket corresponding to those actions, and a display control unit that displays the moving image (R1) and the moving image (R2) simultaneously.
PCT/JP2022/018361 2021-04-22 2022-04-21 Dispositif et procédé d'affichage WO2022224998A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2023515504A JPWO2022224998A1 (fr) 2021-04-22 2022-04-21

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021072802 2021-04-22
JP2021-072802 2021-04-22

Publications (1)

Publication Number Publication Date
WO2022224998A1 true WO2022224998A1 (fr) 2022-10-27

Family

ID=83722314

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/018361 WO2022224998A1 (fr) 2021-04-22 2022-04-21 Dispositif et procédé d'affichage

Country Status (2)

Country Link
JP (1) JPWO2022224998A1 (fr)
WO (1) WO2022224998A1 (fr)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0787430A (ja) * 1993-09-13 1995-03-31 Unicom:Kk 記念撮影装置
JP2003180896A (ja) * 2001-12-17 2003-07-02 Kazuyoshi Tsukamoto バーチャルスポーツシステム
JP2007244443A (ja) * 2006-03-14 2007-09-27 Shizuo Nakayama 人間の動作のレッスンシステム及びレッスン方法
US20140127659A1 (en) * 2005-07-26 2014-05-08 Interactive Sports Direct Incorporated Method and System for Providing Web Based Interactive Lessons with Improved Session Playback
JP2014188146A (ja) * 2013-03-27 2014-10-06 Nippon Telegraph & Telephone East Corp 運動姿勢評価装置、運動姿勢評価方法及びコンピュータプログラム
WO2017119403A1 (fr) * 2016-01-08 2017-07-13 ソニー株式会社 Dispositif de traitement d'informations
JP2018530804A (ja) * 2015-07-16 2018-10-18 ブラスト モーション インコーポレイテッドBlast Motion Inc. 多センサ事象検出およびタグづけシステム
JP2019053392A (ja) * 2017-09-13 2019-04-04 株式会社コロプラ 情報処理方法、コンピュータ、及びプログラム

Also Published As

Publication number Publication date
JPWO2022224998A1 (fr) 2022-10-27

Similar Documents

Publication Publication Date Title
KR101817393B1 (ko) 모바일 디바이스의 모션 센서들을 이용하여 스포츠 모션들을 분석하는 방법 및 시스템
AU2017331639B2 (en) A system and method to analyze and improve sports performance using monitoring devices
EP2973406B1 (fr) Déterminations des attributs athlétiques à partir des données d'images
EP2973215B1 (fr) Signaux de retroaction provenant de donnees d'image de performances athletiques
US20150072797A1 (en) Terminal Device and Display Method
WO2017103674A1 (fr) Système et procédé de génération de rétroaction mobile au moyen d'un traitement vidéo et d'un suivi d'objets
TW201501751A (zh) 運動解析裝置
JP2005110850A (ja) 身体運動の評価方法、スイング動作評価方法および身体運動の計測システム
JP7215515B2 (ja) 解析装置、解析方法及びプログラム
CN113270185A (zh) 用于基于时间的运动活动测量和显示的系统和方法
US10307657B2 (en) Apparatus and method for automatically analyzing a motion in a sport
JP2009050721A (ja) スイング動作評価方法、スイング動作評価装置、スイング動作評価システム及びスイング動作評価プログラム
JP2015156882A (ja) 運動解析装置及び運動解析システム
KR20160106671A (ko) 운동 해석 장치, 운동 해석 시스템, 운동 해석 방법, 운동 해석 정보의 표시 방법 및 프로그램
WO2023085333A1 (fr) Dispositif d'affichage et procédé d'affichage
CN104587662A (zh) 运动分析装置、运动分析方法
WO2022224998A1 (fr) Dispositif et procédé d'affichage
KR20160099501A (ko) 운동 해석 방법, 운동 해석 장치 및 기억 장치
WO2020049596A1 (fr) Système de capteur de mouvement vestimentaire qui mesure des paramètres liés au mouvement dans des sports pour un utilisateur
KR20170086859A (ko) 골프 스윙 표시 방법, 이를 수행하는 모바일 장치 및 이를 포함하는 골프 스윙 분석 시스템
JP2016116745A (ja) 傾き判定装置、傾き判定システム、傾き判定方法及びプログラム
WO2023219049A1 (fr) Système et procédé d'analyse des mouvements
KR20160099502A (ko) 운동 해석 방법, 운동 해석 장치 및 기억 장치
WO2021153573A1 (fr) Système d'analyse de mouvement, serveur, procédé d'analyse de mouvement, programme de commande et support d'enregistrement
KR20160099503A (ko) 운동 해석 방법, 운동 해석 장치 및 기억 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22791769

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023515504

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22791769

Country of ref document: EP

Kind code of ref document: A1