WO2021187193A1 - Ball game footage analysis device, ball game footage analysis system, ball game footage analysis method, and computer program - Google Patents

Ball game footage analysis device, ball game footage analysis system, ball game footage analysis method, and computer program

Info

Publication number
WO2021187193A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
moving image
ball game
action
locus
Prior art date
Application number
PCT/JP2021/008950
Other languages
French (fr)
Japanese (ja)
Inventor
優麻 齋藤
純子 上田
井村 康治
Original Assignee
Panasonic IP Management Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic IP Management Co., Ltd.
Publication of WO2021187193A1 publication Critical patent/WO2021187193A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63B - APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00 - Training appliances or apparatus for special sports
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/20 - Analysis of motion
    • G06T7/292 - Multi-camera tracking
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording
    • H04N5/91 - Television signal processing therefor
    • H04N5/92 - Transformation of the television signal for recording, e.g. modulation, frequency changing; Inverse transformation for playback
    • H04N5/93 - Regeneration of the television signal or of selected parts thereof
    • H04N7/00 - Television systems
    • H04N7/18 - Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • This disclosure relates to a ball game video analysis device, a ball game video analysis system, a ball game video analysis method, and a computer program.
  • In some cases, the practice of an action by a person playing a ball game (for example, a volleyball serve) is shot on video with a readily available camera, and the video is played back later to aid the practice.
  • Conventionally, videos were searched for by relying on their shooting date and time and on thumbnails. It is therefore difficult to identify the target video and a comparative video from among multiple videos in which the same action was shot, and to grasp the proficiency level of the action and the points of the form to be improved.
  • The purpose of the present disclosure is to provide technology for identifying the target video and the comparative video from a plurality of videos of the action of a person playing a ball game, and for easily grasping the proficiency level of the action and the improvement points of the form.
  • The ball game video analysis device includes: a locus analysis unit that generates locus information indicating the movement locus of a moving body used in a ball game, based on a first moving image and a second moving image in which a person's actions with respect to the moving body are shot from different positions; and a display control unit that displays a UI (User Interface) for selecting recorded information based on the locus information, and displays the first moving image or the second moving image having a correspondence relationship with the recorded information selected through the UI.
  • the ball game video analysis system includes a first mobile terminal with a camera, a second mobile terminal with a camera, a ball game video analysis device, and a display terminal.
  • the first mobile terminal captures a person's action on a moving body used in a ball game from a first position to generate a first moving image.
  • the second mobile terminal captures the action of the person from a second position to generate a second moving image.
  • The ball game video analysis device generates locus information indicating the locus of the moving body based on the first moving image and the second moving image, manages a plurality of pieces of recorded information, each indicating the correspondence between the first moving image, the second moving image, and the locus information, causes the display terminal to display a UI for selecting recorded information from the plurality of pieces of recorded information based on the locus information, and causes the display terminal to display the first moving image or the second moving image having a correspondence relationship in the recorded information selected through the UI.
  • In the ball game video analysis method, the apparatus generates locus information indicating the movement locus of the moving body based on the first moving image and the second moving image in which the action of a person with respect to the moving body used in the ball game is shot from different positions, manages a plurality of pieces of recorded information indicating the correspondence between the first and second moving images and the locus information, displays a UI for selecting recorded information from the plurality of pieces of recorded information based on the locus information, and displays the first moving image or the second moving image having a correspondence relationship in the recorded information selected through the UI.
  • The computer program causes a computer to execute processing that: generates locus information indicating the movement trajectory of the moving body based on the first moving image and the second moving image in which the actions of a person with respect to the moving body used in the ball game are shot from different positions; manages a plurality of pieces of recorded information indicating the correspondence between the first and second moving images and the locus information; displays a UI for selecting recorded information from the plurality of pieces of recorded information based on the locus information; and displays the first moving image or the second moving image having a correspondence relationship in the recorded information selected through the UI.
  • According to the present disclosure, it is possible to identify the target video and the comparative video from a plurality of videos of the action of a person playing a ball game, and to easily grasp the proficiency level of the action and the improvement points of the form.
  • Diagram for explaining how to identify the position of the ball
  • Block diagram showing a configuration example of the ball game video analysis device according to the first embodiment
  • Diagram for explaining how to identify the reference frame image
  • Flowchart showing an example of the pre-work
  • Sequence chart showing an example of processing related to the pre-work
  • Diagram showing a detailed example of the UI for changing a reference frame
  • Diagram showing a display example of the analysis results
  • Block diagram showing a configuration example of the ball game video analysis device according to the second embodiment
  • Block diagram showing a configuration example of the display terminal according to the second embodiment
  • Diagram for explaining an example of the posture model
  • Diagram for explaining the sub-actions that make up a jumping serve
  • Diagram showing a display example of the video comparison UI
  • Diagram showing a display example of the UI for sub-action video comparison
  • Diagram showing an example of the UI for sub-action posture comparison
  • Diagram for explaining an example of selecting a posture model in the posture comparison UI
  • Diagram showing the hardware configuration of a computer that realizes the functions of the ball game video analysis device according to the present disclosure by a program
  • Diagram showing the hardware configuration of a computer that realizes the functions of the first mobile terminal, the second mobile terminal, and the display terminal according to the present disclosure by a program
  • FIG. 1 is a diagram showing a configuration example of a ball game video analysis system according to the present embodiment.
  • the ball game video analysis system 10 is a system that analyzes a video of a ball game and analyzes the trajectory of a moving body used in the ball game.
  • In the present embodiment, volleyball, which is one of the ball games, will be described as an example.
  • the ball game video analysis system 10 is not limited to volleyball, and can be applied to various ball games such as soccer, baseball, table tennis, basketball, tennis, rugby, American football, lacrosse, badminton, and ice hockey.
  • the moving body targeted by the ball game video analysis system 10 is typically a ball.
  • the moving body targeted by the ball game video analysis system 10 is not limited to a ball, and may have a shape that does not fit the concept of a "ball" such as a badminton shuttle or an ice hockey puck.
  • the ball game video analysis system 10 includes a first mobile terminal 20A, a second mobile terminal 20B, a display terminal 30, and a ball game video analysis device 100.
  • the first mobile terminal 20A, the second mobile terminal 20B, the display terminal 30, and the ball game video analysis device 100 can transmit and receive information to and from each other through the communication network N.
  • the communication network N is composed of, for example, a wireless communication network (wireless LAN (Local Area Network), 4G, 5G, etc.), an Internet network, or a combination thereof.
  • the first mobile terminal 20A and the second mobile terminal 20B are wireless communication terminals provided with a camera.
  • Examples of the first mobile terminal 20A and the second mobile terminal 20B are mobile phones, smartphones, tablets, mobile PCs, and the like.
  • the first mobile terminal 20A and the second mobile terminal 20B photograph the action of a person performing a ball game (hereinafter referred to as "player").
  • In the present embodiment, a volleyball serve, which is one of the player's actions, will be described as an example.
  • the action of the player is not limited to the serve of volleyball, and may be a dig, a reception, a toss, an attack, a block, or the like.
  • the action of the person performing the ball game may be a specific action in the above sports other than volleyball.
  • the first mobile terminal 20A and the second mobile terminal 20B for example, shoot a player's serve practice as a moving image from different directions, and generate a first moving image 201A and a second moving image 201B, respectively.
  • the first mobile terminal 20A photographs the player from the rear right
  • the second mobile terminal 20B photographs the player from the rear left.
  • the first mobile terminal 20A and the second mobile terminal 20B transmit the generated first moving image 201A and second moving image 201B to the ball game video analysis device 100 through the communication network N, respectively.
  • the display terminal 30 displays the analysis result by the ball game video analysis device 100.
  • a display example of the analysis result will be described later (see FIG. 10).
  • Examples of the display terminal 30 are mobile phones, smartphones, tablets, mobile PCs, and the like.
  • The display terminal 30 may be read as a third mobile terminal. Although a configuration in which the display terminal 30 for displaying the analysis result is provided separately is illustrated here, either the first mobile terminal 20A or the second mobile terminal 20B may instead be used as the terminal for displaying the analysis result.
  • The ball game video analysis device 100 analyzes the movement trajectory of the ball in three-dimensional space based on the first moving image 201A and the second moving image 201B received from the first mobile terminal 20A and the second mobile terminal 20B, respectively.
  • the movement locus of the ball in the three-dimensional space can be generated by arranging the three-dimensional coordinates of the ball in chronological order.
  • The three-dimensional coordinates of the ball at time t can be calculated from the frame image F1 at time t in the first moving image 201A and the frame image F2 at time t in the second moving image 201B. That is, as shown in FIG. 2, the intersection of the vector V1, extending from the position U1 of the first mobile terminal 20A (camera) to the position B1 of the ball in the frame image F1, and the vector V2, extending from the position U2 of the second mobile terminal 20B (camera) to the position B2 of the ball in the frame image F2, is the three-dimensional coordinate Pa of the ball.
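  • For illustration, the following is a minimal numpy sketch of this triangulation idea. In practice the two rays rarely intersect exactly, so the midpoint of the shortest segment between them is used as the intersection point; the function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def triangulate_rays(u1, b1, u2, b2):
    """Approximate the 3D ball position Pa from two camera rays.

    u1, u2: camera positions U1 and U2 (3-vectors)
    b1, b2: points on each ray toward the ball positions B1 and B2
    Returns the midpoint of the shortest segment between the two rays,
    which coincides with the intersection when the rays actually meet.
    """
    d1 = (b1 - u1) / np.linalg.norm(b1 - u1)  # direction of vector V1
    d2 = (b2 - u2) / np.linalg.norm(b2 - u2)  # direction of vector V2
    w0 = u1 - u2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b  # approaches 0 when the rays are parallel
    s = (b * e - c * d) / denom  # parameter of closest point on ray 1
    t = (a * e - b * d) / denom  # parameter of closest point on ray 2
    return ((u1 + s * d1) + (u2 + t * d2)) / 2.0
```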
  • If the time codes (time information) given to each frame image of the first moving image 201A by the first mobile terminal 20A and to each frame image of the second moving image 201B by the second mobile terminal 20B were completely accurate, the three-dimensional coordinates of the ball could be accurately specified by the above method using the frame images F1 and F2 to which the time code of time t is assigned.
  • However, since the first mobile terminal 20A and the second mobile terminal 20B shoot frame images at a high frame rate of 30 fps or more, millisecond-level synchronization accuracy is required between the frame images captured by the two terminals.
  • Even when the first mobile terminal 20A and the second mobile terminal 20B adjust their clocks using NTP (Network Time Protocol), a synchronization shift may occur.
  • For example, if the processing performance of the second mobile terminal 20B is lower than that of the first mobile terminal 20A, the processing load of time code assignment on the second mobile terminal 20B is large, and its time code assignment as a whole is slightly slower than that of the first mobile terminal 20A. In that case, the frame image F2 may actually have been taken slightly earlier than the frame image F1 bearing the same time code.
  • In that situation, the actual position of the ball is B3, and, as shown in FIG. 2, the calculated three-dimensional coordinates of the ball shift to Pb. The calculated movement locus of the ball therefore also deviates.
  • the ball game video analysis device 100 has a function of specifying the synchronization deviation of the frame image between the first mobile terminal 20A and the second mobile terminal 20B. Then, the ball game image analysis device 100 calculates the movement locus of the ball in consideration of the specified synchronization deviation.
  • This makes it possible to calculate the movement locus of the ball using a general-purpose camera, such as a camera mounted on a mobile terminal, instead of a dedicated camera specialized for ball game video analysis as shown in Patent Document 1.
  • FIG. 3 is a block diagram showing a configuration example of the ball game video analysis device 100 according to the present embodiment.
  • the ball game video analysis device 100 includes a moving image receiving unit 101, an image conversion unit 102, a reference frame specifying unit 103, a reference frame changing unit 104, a trajectory analysis unit 105, and a result display unit 106.
  • the video receiving unit 101 receives the first video 201A from the first mobile terminal 20A, and receives the second video 201B from the second mobile terminal 20B.
  • Examples of video formats include MPEG4, TS files, H.264, and H.265.
  • a time code is assigned to each frame image constituting the first moving image 201A by the first mobile terminal 20A.
  • a time code is assigned to each frame image constituting the second moving image 201B by the second mobile terminal 20B.
  • a scene in which the player hits a serve is shot.
  • the image conversion unit 102 converts each frame image of the first moving image 201A into a still image, and generates the first frame image group 202A.
  • the image conversion unit 102 converts each frame image of the second moving image 201B into a still image, and generates a second frame image group 202B.
  • Examples of still image formats are JPEG, GIF, TIFF and the like.
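  • As a rough sketch of what the image conversion unit 102 does, the OpenCV snippet below decodes a moving image into per-frame JPEG still images; the paths and the file-naming scheme are illustrative assumptions.

```python
import cv2

def video_to_frames(video_path, out_dir):
    """Convert each frame of a moving image into a JPEG still image,
    as the image conversion unit 102 does. Returns the frame count."""
    cap = cv2.VideoCapture(video_path)
    count = 0
    while True:
        ok, frame = cap.read()
        if not ok:  # end of the moving image
            break
        cv2.imwrite(f"{out_dir}/frame_{count:06d}.jpg", frame)
        count += 1
    cap.release()
    return count
```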
  • The reference frame specifying unit 103 identifies a frame image to be used as a reference for frame synchronization (hereinafter referred to as a "reference frame image") from each of the first frame image group 202A and the second frame image group 202B.
  • The reference frame image specified from the first frame image group 202A is referred to as the first reference frame image, and the reference frame image specified from the second frame image group 202B is referred to as the second reference frame image.
  • The reference frame specifying unit 103 specifies, as the reference frame image, a frame image in which a specific sub-action constituting the action is captured. For example, from the frame image group in which the series of actions of the serve (the action) is captured, the reference frame specifying unit 103 specifies, as the reference frame image, the frame image capturing the moment (the sub-action) when the hand hitting the serve contacts the ball. Next, an example of a method of specifying the reference frame image will be described with reference to FIG. 4.
  • the reference frame specifying unit 103 sets the ball detection region 211A for the first frame image group 202A.
  • the reference frame specifying unit 103 sets the ball detection area 211B for the second frame image group 202B.
  • By setting the detection areas, the region to be searched for the ball becomes smaller, so the time required for the ball detection process can be shortened. Note that it is not essential to set a ball detection area.
  • The reference frame specifying unit 103 detects the ball in each frame image of the first frame image group 202A by a known method. For example, the reference frame specifying unit 103 detects the ball in a frame image by pattern matching. Alternatively, the reference frame specifying unit 103 detects the ball in the frame image using a deep learning model trained in advance for ball detection. Similarly, the reference frame specifying unit 103 detects the ball in each frame image of the second frame image group 202B.
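  • A minimal sketch of the pattern-matching variant follows, using OpenCV template matching restricted to the ball detection area; the score threshold and the region format are illustrative assumptions, and a deep learning detector could be substituted in the same role.

```python
import cv2

def detect_ball(frame, template, region, score_thresh=0.7):
    """Detect the ball inside a detection area by template matching.

    frame, template: BGR images; region: (x, y, w, h) detection area
    Returns the ball center in full-frame coordinates, or None.
    """
    x, y, w, h = region
    roi = frame[y:y + h, x:x + w]  # search only inside the detection area
    scores = cv2.matchTemplate(roi, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(scores)
    if max_val < score_thresh:  # no confident match in this frame
        return None
    th, tw = template.shape[:2]
    return (x + max_loc[0] + tw // 2, y + max_loc[1] + th // 2)
```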
  • the reference frame specifying unit 103 calculates the two-dimensional movement locus of the ball based on the position of the ball detected from each frame image of the first frame image group 202A. Then, the reference frame specifying unit 103 identifies the first reference frame image based on the change in the two-dimensional movement locus of the ball. For example, the reference frame specifying unit 103 identifies the first reference frame image by the processing of the following steps S11 to S13.
  • (S11) The reference frame specifying unit 103 extracts, from the first frame image group 202A, the first frame image in which the amount of movement of the ball in the X direction is equal to or greater than a predetermined threshold value (that is, the ball has moved significantly in the lateral direction). As a result, it is possible to extract the first frame image in which the ball is flying toward the net immediately after the hand hitting the serve has contacted the ball.
  • (S12) The reference frame specifying unit 103 then identifies, among the plurality of first frame images included in a certain period before the first frame image extracted in step S11, the first frame image that is closest to the one extracted in step S11 and in which the amount of movement of the ball in the Y-axis direction changes by a predetermined threshold value or more. Thereby, among the plurality of first frame images in which the ball tossed for the serve is rising or falling, the first frame image at the moment when the hand hitting the serve contacts the ball can be identified.
  • (S13) The reference frame specifying unit 103 uses the first frame image specified in step S12 as the first reference frame image.
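  • The sketch below restates steps S11 to S13 as code over a per-frame 2D ball trajectory; the two thresholds, the window length, and the handling of frames where the ball was not detected are illustrative assumptions.

```python
def find_reference_frame(ball_xy, x_thresh, y_thresh, window):
    """Identify the reference (hit) frame from the 2D ball trajectory.

    ball_xy: per-frame (x, y) ball positions, None where undetected
    S11: find the first frame with a large lateral (X) movement.
    S12: in a window before it, find the closest earlier frame whose
         vertical (Y) movement still exceeds the threshold.
    S13: that frame is used as the reference frame image.
    """
    s11 = None
    for i in range(1, len(ball_xy)):
        if ball_xy[i] is None or ball_xy[i - 1] is None:
            continue
        if abs(ball_xy[i][0] - ball_xy[i - 1][0]) >= x_thresh:
            s11 = i  # ball flying toward the net just after the hit
            break
    if s11 is None:
        return None
    for i in range(s11 - 1, max(s11 - window, 0), -1):
        if ball_xy[i] is None or ball_xy[i - 1] is None:
            continue
        if abs(ball_xy[i][1] - ball_xy[i - 1][1]) >= y_thresh:
            return i  # moment the hand hitting the serve meets the ball
    return None
```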
  • The reference frame specifying unit 103 specifies the second reference frame image by the same processing as described above. Then, the reference frame specifying unit 103 writes the time code corresponding to the first reference frame image (hereinafter referred to as the "first reference time code") and the time code corresponding to the second reference frame image (hereinafter referred to as the "second reference time code") into the synchronization deviation information 203.
  • the reference frame specifying unit 103 may specify the reference frame image by an action and a sub-action different from the serve.
  • the reference frame specifying unit 103 may use a frame image in which the moment when the hand that made the attack (or dig, reception, toss, block, etc.) of the player hits the ball is captured as the reference frame image.
  • the reference frame specifying unit 103 may write a value obtained by subtracting the second reference time code from the first reference time code (hereinafter referred to as “synchronization time difference”) in the synchronization deviation information 203.
  • The synchronization time difference written in the synchronization deviation information 203 corresponds to the magnitude of the synchronization deviation of the frame images between the first mobile terminal 20A and the second mobile terminal 20B. For example, when the synchronization time difference is almost 0, it indicates that the time codes assigned by the first mobile terminal 20A and by the second mobile terminal 20B tend to be almost the same. When the synchronization time difference is a positive value, it indicates that the time code assigned by the second mobile terminal 20B tends to be later than the time code assigned by the first mobile terminal 20A. When the synchronization time difference is a negative value, it indicates that the time code assigned by the first mobile terminal 20A tends to be later than the time code assigned by the second mobile terminal 20B.
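  • Expressed as code, the synchronization time difference and its sign convention reduce to a one-line subtraction (a trivial sketch, with time codes assumed to be in milliseconds):

```python
def sync_time_difference_ms(first_ref_timecode_ms, second_ref_timecode_ms):
    """Synchronization time difference written to the synchronization
    deviation information 203.

    ~0:       the two terminals' time codes tend to match
    positive: terminal 20B's time codes tend to be later than 20A's
    negative: terminal 20A's time codes tend to be later than 20B's
    """
    return first_ref_timecode_ms - second_ref_timecode_ms
```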
  • The reference frame image automatically identified by the reference frame specifying unit 103 by the method described above is not necessarily the correct reference frame image. Therefore, the reference frame specifying unit 103 provides a UI (User Interface) 310 for the user to confirm the reference frame image (hereinafter referred to as the "reference frame confirmation UI").
  • the details of the reference frame confirmation UI 310 will be described later (see FIG. 8).
  • The reference frame changing unit 104 provides a UI 320 for the user to change the reference frame image (hereinafter referred to as the "reference frame change UI").
  • the details of the reference frame changing UI 320 will be described later (see FIG. 9).
  • the reference frame change unit 104 changes the synchronization shift information 203 based on the changed reference frame image.
  • The locus analysis unit 105 identifies the movement locus of the ball in three-dimensional space related to the action by using the synchronization deviation information 203 from, for example, the first video 201A and the second video 201B in which the player's serve is shot, and generates analysis result information 204 including information indicating the movement trajectory of the ball.
  • The first moving image 201A and the second moving image 201B to be analyzed by the locus analysis unit 105 may be different from the first moving image 201A and the second moving image 201B shot to generate the above-mentioned synchronization deviation information 203.
  • Specifically, the trajectory analysis unit 105 specifies the three-dimensional coordinates of the ball at the timing when the first frame image and the second frame image were taken, based on the camera parameters of the first mobile terminal 20A and the second mobile terminal 20B obtained in advance by calibration processing (perspective projection transformation matrices for converting world coordinates into camera image coordinates), the position of the ball in the first frame image, and the position of the ball in the second frame image synchronized with the first frame image after correction by the synchronization deviation information 203. The locus analysis unit 105 specifies the movement locus of the ball in three-dimensional space by arranging the three-dimensional coordinates of the ball thus specified in chronological order, and generates analysis result information 204 including information indicating the specified movement locus. As a result, the movement trajectory of the ball in three-dimensional space can be analyzed with sufficient accuracy even when a camera mounted on a mobile terminal is used instead of a dedicated camera specialized for ball game video analysis.
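  • As a sketch of this step, OpenCV's linear triangulation can recover the 3D ball coordinates from the calibrated projection matrices and the synchronized 2D ball positions; the array shapes and names are illustrative assumptions.

```python
import cv2
import numpy as np

def ball_3d_positions(P1, P2, pts1, pts2):
    """Triangulate 3D ball coordinates from synchronized frame pairs.

    P1, P2: 3x4 perspective projection matrices of terminals 20A/20B,
            obtained in advance by calibration
    pts1:   Nx2 ball positions in the first frame images
    pts2:   Nx2 ball positions in the sync-corrected second frame images
    Returns an Nx3 array; ordering the rows chronologically gives the
    movement locus of the ball in three-dimensional space.
    """
    pts1 = np.asarray(pts1, dtype=float).T  # OpenCV expects 2xN arrays
    pts2 = np.asarray(pts2, dtype=float).T
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)  # 4xN homogeneous
    return (X_h[:3] / X_h[3]).T  # dehomogenize to Nx3
```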
  • the result display unit 106 displays the analysis result information 204 on, for example, the display terminal 30.
  • the result display unit 106 may display the analysis result information 204 on the first mobile terminal 20A and / or the second mobile terminal 20B.
  • FIG. 5 is a flowchart showing an example of the preliminary work.
  • the pre-work is a work performed before the trajectory analysis unit 105 analyzes the actual action, and is, for example, a work performed to generate the synchronization deviation information 203.
  • the user installs the first mobile terminal 20A and the second mobile terminal 20B at positions where the player and the ball can be photographed from different directions (S101).
  • the user pairs the first mobile terminal 20A and the second mobile terminal 20B (S102).
  • At this time, the pairing information may be transmitted to the ball game video analysis device 100.
  • the user instructs the first mobile terminal 20A and the second mobile terminal 20B to start shooting related to the preliminary work (S103). As a result, the first mobile terminal 20A and the second mobile terminal 20B start shooting.
  • the player hits the serve (S104).
  • the first mobile terminal 20A and the second mobile terminal 20B can shoot the first moving image 201A and the second moving image 201B in which the player hits the serve and the ball flies, respectively.
  • the user and the player may be the same person.
  • the user instructs the first mobile terminal 20A and the second mobile terminal 20B to stop shooting related to the preliminary work (S105).
  • the first mobile terminal 20A and the second mobile terminal 20B receive an instruction to stop shooting, respectively, and transmit the first moving image 201A and the second moving image 201B to the ball game video analysis device 100, respectively.
  • the ball game video analysis device 100 identifies the first reference frame image from the first moving image 201A and the second reference frame image from the second moving image 201B, and displays the reference frame confirmation UI 310 on the display terminal 30.
  • the user confirms the first reference frame image and the second reference frame image from the reference frame confirmation UI 310 displayed on the display terminal 30 (S106). For example, when changing the reference frame image, the user instructs the ball game video analysis device 100 to change the reference frame image through the reference frame confirmation UI 310. In this case, the ball game video analysis device 100 causes the display terminal 30 to display the UI 320 for changing the reference frame.
  • The instructions for starting and stopping shooting in steps S103 and S105 may be given from either mobile terminal. For example, when the instruction is given to the first mobile terminal 20A, the start and stop instructions may be automatically transmitted to the second mobile terminal 20B via the ball game video analysis device 100.
  • the user operates the reference frame change UI 320 displayed on the display terminal 30 to change the reference frame image (S107).
  • the details of steps S106 and S107 will be described later (see FIGS. 7 and 8).
  • the ball game image analysis device 100 can generate the synchronization deviation information 203 based on the first reference frame image and the second reference frame image confirmed and corrected by the user.
  • FIG. 6 is a sequence chart showing an example of processing related to pre-work.
  • FIG. 6 corresponds to a detailed description of steps S106 to S107 in FIG.
  • the video receiving unit 101 receives the first video 201A from the first mobile terminal 20A (S201), and receives the second video 201B from the second mobile terminal 20B (S202).
  • the image conversion unit 102 generates the first frame image group 202A from the first moving image 201A, and generates the second frame image group 202B from the second moving image 201B (S203).
  • The reference frame specifying unit 103 searches the first frame image group 202A for the first reference frame image in which a specific sub-action constituting the action is captured (S204). For example, as described above, the reference frame specifying unit 103 searches for the first reference frame image capturing the moment when the hand hitting the player's serve contacts the ball, based on the change in the two-dimensional (X-direction, Y-direction) movement locus of the ball.
  • Next, the reference frame specifying unit 103 searches the second frame image group 202B for the second reference frame image (S205). At this time, the reference frame specifying unit 103 may limit the search range for the second reference frame image to the portion of the second frame image group 202B at and after the time code obtained by going back a predetermined time from the time code of the first reference frame image. As a result, the time required for the search can be shortened compared with the case where the entire second frame image group 202B is set as the search range.
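  • A small sketch of this search-range narrowing, assuming the time codes are in milliseconds and sorted in ascending order; the 500 ms lookback stands in for the "predetermined time" and is an assumption.

```python
from bisect import bisect_left

def second_search_start(frame_timecodes_ms, first_ref_timecode_ms,
                        lookback_ms=500):
    """Index of the first frame of group 202B to include when searching
    for the second reference frame image: only frames whose time code is
    at or after (first reference time code - lookback) are searched."""
    return bisect_left(frame_timecodes_ms,
                       first_ref_timecode_ms - lookback_ms)
```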
  • The reference frame specifying unit 103 writes the time code of the first reference frame image (the first reference time code) and the time code of the second reference frame image (the second reference time code) into the synchronization deviation information 203 (S206).
  • the reference frame changing unit 104 transmits information for displaying the first reference frame image, the second reference frame image, and the reference frame confirmation UI 310 to the display terminal 30 (S207).
  • the display terminal 30 displays the first reference frame image, the second reference frame image, and the reference frame confirmation UI 310 (S208).
  • the details of the reference frame confirmation UI 310 will be described later (see FIG. 8).
  • When the user performs an operation indicating that the reference frame image is not to be changed on the reference frame confirmation UI 310 displayed in step S208, the display terminal 30 transmits instruction information indicating no change to the ball game video analysis device 100 (S209).
  • The reference frame changing unit 104 then finalizes the content of the synchronization deviation information 203 written in step S206 (S210). Then, the ball game video analysis device 100 ends this process.
  • FIG. 7 is a sequence chart showing an example of processing related to pre-work when an operation to change the reference frame image is performed in FIG.
  • When the user performs an operation to change the reference frame image on the reference frame confirmation UI 310 displayed in step S208, the display terminal 30 transmits instruction information indicating the change to the ball game video analysis device 100 (S221).
  • When the reference frame changing unit 104 receives the instruction information indicating that there is a change, it performs the following processing. That is, the reference frame changing unit 104 extracts, from the first frame image group 202A, a plurality of first frame images taken before and after the first reference frame image found in step S204, and uses the extracted first frame images as candidates for the first reference frame image.
  • Similarly, the reference frame changing unit 104 extracts, from the second frame image group 202B, a plurality of second frame images taken before and after the second reference frame image found in step S205, and uses the extracted second frame images as candidates for the second reference frame image.
  • Then, the reference frame changing unit 104 transmits, to the display terminal 30, the plurality of candidates for the first reference frame image and their time codes, the plurality of candidates for the second reference frame image and their time codes, and information for displaying the reference frame change UI 320 (S222).
  • the display terminal 30 displays the reference frame change UI 320 (S223).
  • the details of the reference frame changing UI 320 will be described later (see FIG. 9).
  • The user operates the reference frame change UI 320 displayed in step S223 to select the changed first reference frame image from the plurality of candidates for the first reference frame image, and to select the changed second reference frame image from the plurality of candidates for the second reference frame image (S224).
  • The display terminal 30 transmits the time code of the changed first reference frame image selected in step S224 (hereinafter referred to as the "changed first reference time code") and/or the time code of the changed second reference frame image (hereinafter referred to as the "changed second reference time code") to the ball game video analysis device 100 (S225).
  • The reference frame changing unit 104 overwrites the synchronization deviation information 203 with the changed first reference time code and/or the changed second reference time code received from the display terminal 30, and finalizes the content of the synchronization deviation information 203 (S226).
  • In this way, the content of the synchronization deviation information 203 automatically generated by the reference frame specifying unit 103 is overwritten, through the reference frame changing unit 104, with the time code of the reference frame image changed by the user. Then, the ball game video analysis device 100 ends this process.
  • FIG. 8 is a diagram showing a detailed example of the reference frame confirmation UI.
  • the reference frame confirmation UI 310 includes a first image display area 210A, a second image display area 210B, an OK button 312, a change button 313, and a redo button 314.
  • the reference frame confirmation UI 310 may not include either the change button 313 or the redo button 314.
  • the display terminal 30 displays the first reference frame image transmitted from the ball game image analysis device 100 in step S207 of FIG. 6 in the first image display area 210A.
  • the display terminal 30 displays the second reference frame image transmitted from the ball game image analysis device 100 in step S207 of FIG. 6 in the second image display area 210B.
  • the first image display area 210A and the second image display area 210B may be arranged side by side.
  • the arrangement mode of the first image display area 210A and the second image display area 210B is not limited to the arrangement arranged side by side.
  • the first image display area 210A and the second image display area 210B may be displayed as separate windows.
  • If both the first reference frame image displayed in the first image display area 210A and the second reference frame image displayed in the second image display area 210B correctly capture the sub-action, the user may press the OK button 312.
  • When the OK button 312 is pressed, the display terminal 30 transmits the instruction information indicating no change in step S209 of FIG. 6 to the ball game video analysis device 100, and the content of the synchronization deviation information 203 is finalized.
  • If at least one of the first reference frame image displayed in the first image display area 210A and the second reference frame image displayed in the second image display area 210B is not an image in which the sub-action is correctly captured, the user may press the change button 313.
  • When the change button 313 is pressed, the display terminal 30 transmits the instruction information indicating that there is a change in step S221 of FIG. 7 to the ball game video analysis device 100, and the reference frame change UI 320 is displayed on the display terminal 30 as shown in step S223 of FIG. 7.
  • If the serve in step S104 fails, or if the first reference frame image and/or the second reference frame image clearly captures a ball other than the one served by the player (for example, a ball lying on the floor or a ball being used by another player in practice), the user may press the redo button 314.
  • When the redo button 314 is pressed, the display terminal 30 transmits instruction information indicating that the pre-work is to be redone to the ball game video analysis device 100, and the ball game video analysis device 100, the first mobile terminal 20A, and the second mobile terminal 20B redo the preliminary work from step S103 of FIG. 5.
  • Note that the first reference frame image may comprise a plurality of images: for example, the three first frame images immediately before the first reference frame image found in step S204 and the three first frame images immediately after it. In this case, the number of first reference frame images is seven in total, including the one found in step S204.
  • Similarly, the second reference frame image may comprise the three second frame images immediately before the second reference frame image found in step S205 and the three second frame images immediately after it. In this case, the number of second reference frame images is seven in total, including the one found in step S205.
  • In this case, the display terminal 30 may display the seven first reference frame images and the seven second reference frame images continuously and in synchronization in the first image display area 210A and the second image display area 210B, respectively.
  • The seven first and second reference frame images are an example; the number of first and second reference frame images may be any number. Further, the plurality of first and second reference frame images are not limited to consecutive frames, and may be sampled at predetermined frame intervals.
  • In this way, the user can easily check not only the synchronization between one first reference frame image and one second reference frame image, but also the synchronization between a plurality of first reference frame images and second reference frame images.
  • The reference frame confirmation UI 310 may also be displayed on either the first mobile terminal 20A or the second mobile terminal 20B.
  • Alternatively, the reference frame confirmation UI 310 for the first reference frame image may be displayed on the first mobile terminal 20A, and the reference frame confirmation UI 310 for the second reference frame image may be displayed on the second mobile terminal 20B.
  • FIG. 9 is a diagram showing a detailed example of the UI for changing the reference frame.
  • the reference frame changing UI 320 includes a first image display area 210A, a second image display area 210B, an OK button 322, and a return button 323. Further, the reference frame change UI 320 includes a 1-frame return button 324A, an N-frame return button 325A, a 1-frame advance button 326A, and an N-frame advance button 327A corresponding to the first image display area 210A.
  • N is a predetermined integer of 2 or more.
  • the reference frame change UI 320 includes a 1-frame return button 324B, an N-frame return button 325B, a 1-frame advance button 326B, and an N-frame advance button 327B corresponding to the second image display area 210B.
  • When the return button 323 is pressed, the display terminal 30 returns to the display of the reference frame confirmation UI 310 shown in FIG. 8.
  • When the 1-frame return button 324A is pressed, the display terminal 30 selects the candidate for the first reference frame image one frame before the candidate currently displayed in the first image display area 210A, and displays it in the first image display area 210A. The same applies to the 1-frame return button 324B corresponding to the second image display area 210B.
  • When the N-frame return button 325A is pressed, the display terminal 30 selects the candidate for the first reference frame image N frames before the candidate currently displayed in the first image display area 210A, and displays it in the first image display area 210A. The same applies to the N-frame return button 325B corresponding to the second image display area 210B.
  • When the 1-frame advance button 326A is pressed, the display terminal 30 selects the candidate for the first reference frame image one frame after the candidate currently displayed in the first image display area 210A, and displays it in the first image display area 210A. The same applies to the 1-frame advance button 326B corresponding to the second image display area 210B.
  • When the N-frame advance button 327A is pressed, the display terminal 30 selects the candidate for the first reference frame image N frames after the candidate currently displayed in the first image display area 210A, and displays it in the first image display area 210A. The same applies to the N-frame advance button 327B corresponding to the second image display area 210B.
  • The user may operate the buttons 324A to 327A and 324B to 327B described above so that an image in which the sub-action is correctly captured is displayed in both the first image display area 210A and the second image display area 210B.
  • When the OK button 322 is pressed, the display terminal 30 transmits the time code of the candidate for the first reference frame image being displayed in the first image display area 210A (the changed first reference time code) and the time code of the candidate for the second reference frame image being displayed in the second image display area 210B (the changed second reference time code) to the ball game video analysis device 100.
  • This process corresponds to step S225 of FIG. 7.
  • That is, the user may press the OK button 322 after operating the buttons so that an image in which the sub-action is correctly captured is displayed in both the first image display area 210A and the second image display area 210B.
  • FIG. 10 is a diagram showing a display example of the analysis result.
  • the first mobile terminal 20A and the second mobile terminal 20B photograph the player's serve practice and generate and transmit the first video 201A and the second video 201B, respectively.
  • the first moving image 201A and the second moving image 201B may be different from the first moving image 201A and the second moving image 201B taken for the preliminary work.
  • The trajectory analysis unit 105 corrects the synchronization deviation of the frame images between the first moving image 201A and the second moving image 201B in which the serve practice was shot, based on the synchronization deviation information 203, and then analyzes the movement trajectory of the ball and generates the analysis result information 204. For example, when the synchronization deviation information 203 indicates that the second reference time code is later than the first reference time code, the locus analysis unit 105 corrects the timing by delaying each frame image of the first moving image 201A by the difference between the second reference time code and the first reference time code, and then analyzes the movement trajectory of the ball with each frame image of the first moving image 201A synchronized with each frame image of the second moving image 201B. As a result, the delay in time code assignment by the second mobile terminal 20B is corrected, and the movement locus of the ball can be calculated with sufficient accuracy.
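  • A minimal sketch of this correction, assuming 30 fps and time codes in milliseconds: the frame shift is derived from the two reference time codes, and each frame of the first moving image is paired with the correspondingly delayed frame of the second moving image. Each synchronized pair can then be fed to the triangulation step shown earlier.

```python
def paired_frame_indices(n_frames_a, n_frames_b,
                         first_ref_ms, second_ref_ms, fps=30.0):
    """Pair frame indices of the two moving images after correcting the
    synchronization deviation. When the second reference time code is
    later than the first, each frame of the first moving image 201A is
    effectively delayed by the difference before pairing."""
    frame_ms = 1000.0 / fps
    delay_frames = round((second_ref_ms - first_ref_ms) / frame_ms)
    pairs = []
    for i in range(n_frames_a):
        j = i + delay_frames  # frame of 201B synchronized with 201A[i]
        if 0 <= j < n_frames_b:
            pairs.append((i, j))
    return pairs
```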
  • the result display unit 106 causes the display terminal 30 to display the analysis result screen 330 based on the analysis result information 204.
  • On the analysis result screen 330, as shown in FIG. 10, the movement locus 331 of the ball indicated by the analysis result information 204 may be superimposed and displayed on the playback of the first moving image 201A or the second moving image 201B.
  • The analysis result screen 330 may also display the height and velocity of the ball at the moment it leaves the hand that hit it.
  • the analysis result screen 330 may display the height and velocity of the ball at the moment when the ball passes over the net.
  • From the analysis result screen 330, the player can confirm in detail the movement trajectory of the ball produced by the serve, in addition to the movement of the body at the time of the serve.
  • In the above description, a video for synchronization adjustment is shot in advance, synchronization adjustment and confirmation are performed, a video for analysis processing is then shot, and the video with the ball trajectory superimposed is analyzed. However, the synchronization adjustment and the analysis processing may be performed continuously.
  • Alternatively, with the detection of the synchronization deviation automated, the synchronization adjustment may be executed at an arbitrary timing while the analysis processing is performed on sequentially shot moving images.
  • Instead of time codes, the frame numbers of the frame images in the first moving image 201A generated by the first mobile terminal 20A (hereinafter referred to as "first frame numbers") and the frame numbers of the frame images in the second moving image 201B generated by the second mobile terminal 20B (hereinafter referred to as "second frame numbers") may be used.
  • In this case, the reference frame specifying unit 103 may write the first frame number of the first reference frame image (hereinafter referred to as the "first reference frame number") and the second frame number of the second reference frame image (hereinafter referred to as the "second reference frame number") into the synchronization deviation information 203.
  • Further, the reference frame changing unit 104 may overwrite the synchronization deviation information 203 with the frame number of the changed first reference frame image (hereinafter referred to as the "changed first reference frame number") and/or the frame number of the changed second reference frame image (hereinafter referred to as the "changed second reference frame number") selected through the reference frame change UI 320.
  • In the second embodiment, a ball game video analysis system will be described that identifies the target video and the comparative video from a plurality of videos of the action of the person (player) playing the ball game, so that the proficiency level of the action and the improvement points of the form can be grasped.
  • the same reference numbers may be assigned to the components common to the first embodiment, and the description may be omitted.
  • In the second embodiment, a case where the action practiced by the player is a volleyball serve will be described as an example.
  • However, the second embodiment is not limited to this example, and can be applied to various practice actions, such as tennis serve or volley practice, badminton serve or smash practice, soccer heading practice, baseball batting practice, and golf swing practice.
  • FIG. 11 is a diagram showing a configuration example of the ball game video analysis system 10 according to the second embodiment.
  • The configuration of the ball game video analysis system 10 according to the second embodiment is the same as that of the ball game video analysis system 10 according to the first embodiment shown in FIG. 1.
  • the first mobile terminal 20A (camera) may be installed near the net on one side of the court toward the serving player.
  • the second mobile terminal 20B (camera) may be installed near the net on the other side of the court towards the serving player.
  • the ball game video analysis system 10 described in the second embodiment may be configured so that the synchronization deviation or the like is corrected by the method described in the first embodiment.
  • FIG. 12 is a block diagram showing a configuration example of the ball game video analysis device 100 according to the second embodiment.
  • the ball game video analysis device 100 includes a moving image receiving unit 101, a trajectory analysis unit 105, a posture analysis unit 131, a sub-action analysis unit 132, a record management unit 133, and a display control unit 134.
  • the video receiving unit 101 receives the first video 201A from the first mobile terminal 20A, and receives the second video 201B from the second mobile terminal 20B.
  • In the first moving image 201A and the second moving image 201B, for example, a scene in which the player serves is shot.
  • the trajectory analysis unit 105 identifies the movement trajectory of the ball from the first moving image 201A and the second moving image 201B by using, for example, the technique disclosed in Patent Document 1, and generates the trajectory information 220.
  • The locus information 220 includes a plurality of pieces of information, each indicating the correspondence between a time code at a certain point in the moving image and the three-dimensional position of the ball at that time code.
  • The posture analysis unit 131 generates posture model information 221 indicating the posture of the player during the period in which the action is performed, based on the first video 201A and the second video 201B. For example, the posture analysis unit 131 uses a known joint estimation technique to specify, from the first moving image 201A and the second moving image 201B, the three-dimensional position of each predetermined joint of the player (hereinafter referred to as a "joint position") and the movement of each joint position. The posture analysis unit 131 generates information indicating each of the specified joint positions and the movement of each joint position as the posture model information 221. The details of the posture model information 221 will be described later (see FIG. 14).
  • The sub-action analysis unit 132 analyzes each sub-action constituting the action captured in the first moving image 201A and the second moving image 201B, based on the locus information 220 and/or the posture model information 221. Then, the sub-action analysis unit 132 generates sub-action information 222 indicating the occurrence timing of each analyzed sub-action. For example, a jumping serve, which is an example of a volleyball action, is composed of a series of sub-actions such as "run-up", "toss", "jump", and "hit", which means the moment the hand hitting the serve contacts the ball. The details of the sub-actions will be described later (see FIG. 15).
  • the record management unit 133 manages a plurality of record information 200.
  • The record information 200 is information indicating the correspondence between the shooting date and time of the first moving image 201A and the second moving image 201B, the first moving image 201A and the second moving image 201B themselves, and the locus information 220, the posture model information 221, and the sub-action information 222 generated from them. That is, the record information 200 can be said to be information indicating the correspondence between the moving images of an action performed by the player at a certain date and time and the locus information 220, the posture model information 221, and the sub-action information 222 generated from those moving images.
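  • One plausible shape for a record information 200 entry, sketched as a Python dataclass; the field names and types are illustrative assumptions, not the patent's data format.

```python
from dataclasses import dataclass

@dataclass
class RecordInformation:
    """One record information 200 entry: the correspondence between a
    pair of moving images and the analysis data generated from them."""
    shooting_datetime: str    # shooting date and time of the two videos
    first_video_path: str     # first moving image 201A
    second_video_path: str    # second moving image 201B
    locus_info: list          # locus information 220: (timecode, (x, y, z))
    posture_model_info: list  # posture model info 221: per-frame joints
    sub_action_info: dict     # sub-action info 222: name -> timing
```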
  • the display control unit 134 causes the display terminal 30 to display a UI for selecting the record information 200 based on the locus information 220 from the plurality of record information 200 managed by the record management unit 133. Further, the display control unit 134 causes the display terminal 30 to display the first moving image 201A or the second moving image 201B having a corresponding relationship in the recorded information 200 selected through the UI.
  • For example, when the first record information 200A (see FIG. 13) and the second record information 200B (see FIG. 13) are selected through the UI displayed on the display terminal 30 from among the plurality of record information 200 managed by the record management unit 133, the display control unit 134 may perform the following processing. That is, the display control unit 134 may cause the display terminal 30 to display a video comparison UI 400A for displaying, side by side, the first moving image 201A or the second moving image 201B having a correspondence relationship in the first record information 200A and the first moving image 201A or the second moving image 201B having a correspondence relationship in the second record information 200B.
  • the details of the video comparison UI 400A will be described later (see FIGS. 16 and 17).
  • Further, when the first record information 200A and the second record information 200B are selected from among the plurality of record information 200 managed by the record management unit 133 through the video comparison UI 400A displayed on the display terminal 30, the display control unit 134 may perform the following processing (see FIG. 14). That is, the display control unit 134 may cause the display terminal 30 to display a UI (hereinafter referred to as the "posture comparison UI") 400C for displaying, side by side or superimposed, the first posture model 231A generated from the posture model information 221 having a correspondence relationship in the first record information 200A and the second posture model 231B generated from the posture model information 221 having a correspondence relationship in the second record information 200B. The details of the posture comparison UI 400C will be described later (see FIGS. 18 to 21).
  • Further, the display control unit 134 may cause the display terminal 30 to display, together with the posture models 231A and 231B in the posture comparison UI 400C, the movement locus 241A of the moving body (ball) generated from the locus information 220 having a correspondence relationship in the first record information 200A and the movement locus 241B of the moving body (ball) generated from the locus information 220 having a correspondence relationship in the second record information 200B (see FIGS. 18 to 21).
  • Further, the display control unit 134 may cause the display terminal 30 to display a sub-action selection UI 447 capable of selecting one of the plurality of sub-actions. Then, the display control unit 134 may cause the display terminal 30 to display, on the posture comparison UI 400D, the posture models 231A and 231B at the occurrence timing of the sub-action selected through the sub-action selection UI 447 (see FIG. 20).
  • FIG. 13 is a block diagram showing a configuration example of the display terminal 30 according to the second embodiment.
  • the display terminal 30 has an information transmission / reception unit 151 and a UI control unit 152.
  • The information transmission/reception unit 151 transmits and receives information to and from the ball game video analysis device 100 through the communication network N.
  • the information transmission / reception unit 151 receives a plurality of recording information 200 (for example, the first recording information 200A and the second recording information 200B) from the ball game video analysis device 100.
  • The UI control unit 152, in cooperation with the display control unit 134 of the ball game video analysis device 100, displays the video comparison UI 400A, the posture comparison UI 400C, and the like on a display, which is an example of the output device 1002 (see FIG. 23). Further, the UI control unit 152 detects input from the user to the UI through a touch panel, which is an example of the input device 1001 (see FIG. 23). When the UI control unit 152 detects an input to the UI, it transmits the input content to the ball game video analysis device 100 through the information transmission/reception unit 151.
  • FIG. 14 is a diagram for explaining an example of the posture model.
  • the posture model information 221 includes information indicating each joint position of the player and the movement of each joint position.
• The display control unit 134 or the UI control unit 152 refers to the posture model information 221 having a correspondence relationship in the first recording information 200A and identifies each joint position (the white and black circles in FIG. 14) at the timing of the sub-action "jump" during the action period. Similarly, the display control unit 134 or the UI control unit 152 refers to the posture model information 221 having a correspondence relationship in the second recording information 200B and identifies each joint position at the timing of the sub-action "jump" during the action period. Then, the display control unit 134 generates the player posture models 231A and 231B as shown in FIG. 14.
• The display control unit 134 or the UI control unit 152 can display the posture models while arbitrarily changing the viewpoint in 3D space through an operation such as 3D rotation.
• For example, the display control unit 134 or the UI control unit 152 can display the first posture model 231A and the second posture model 231B at the timing of the sub-action "jump" in an overlapping manner, aligned based on the neck joint positions 232A and 232B.
  • the first posture model 231A and the second posture model 231B may be generated based on moving images of serves taken by the same player at different dates and times.
• From the superimposed display of the first posture model 231A and the second posture model 231B, the user can visually recognize the difference in posture at the time of the sub-action "jump" performed by the same player at different dates and times. Therefore, the user can easily confirm, for example, the progress of the same player's serve.
  • the first posture model 231A and the second posture model 231B may be generated based on moving images of serves performed by different players.
• In this case, from the superimposed display of the first posture model 231A and the second posture model 231B, the user can visually recognize the difference in posture at the time of the sub-action "jump" performed by different players. Therefore, the user can easily confirm, for example, the difference in posture at the time of serving between a model player and a player who is practicing.
  • the sub-action analysis unit 132 may detect the timing when the joint positions of both ankles of the player are separated from the ground by a certain distance or more as the timing when the player jumps.
  • the sub-action analysis unit 132 may detect the timing when the joint positions 232A and 232B of the player's neck are at the highest point as the timing when the player's jump reaches the highest point.
  • the posture models 231A and 231B are not limited to the model in which each joint position is connected by a line as shown in FIG.
• For example, the posture models 231A and 231B may be texture-mapped three-dimensional CG (Computer Graphics) models.
  • FIG. 15 is a diagram for explaining sub-actions constituting the jumping serve.
• A jumping serve, which is an example of a volleyball action, consists of a series of sub-actions: "run-up", "toss", "jump", and "hit", where "hit" means that the hand hitting the serve meets the ball.
  • the sub-action analysis unit 132 specifies a frame image at the timing when each sub-action occurs based on the ball trajectory information 220 and / or the player's posture model information 221.
  • the sub-action analysis unit 132 refers to the ball locus information 220 to specify a frame image at the timing when the ball's movement locus changes from a stationary state to an upward movement. Then, the sub-action analysis unit 132 writes the timing (for example, time information) at which the specified frame image is taken in the sub-action information 222 as the occurrence timing of the sub-action "toss".
  • the sub-action analysis unit 132 refers to the posture model information 221 of the player and specifies a frame image at the timing when the joint positions of both ankles of the player are separated from the ground by a certain distance or more. Then, the sub-action analysis unit 132 writes the timing at which the specified frame image is taken in the sub-action information 222 as the occurrence timing of the sub-action “jump”.
• Further, the sub-action analysis unit 132 refers to the trajectory information 220 of the ball and identifies the frame image at the timing when the amount of movement of the ball in the direction approaching the net becomes equal to or greater than a predetermined threshold. Then, the sub-action analysis unit 132 writes the timing at which the identified frame image was captured into the sub-action information 222 as the occurrence timing of the sub-action "hit".
• As a result, information (for example, time information) indicating the timing at which each sub-action constituting the action occurs is written into the sub-action information 222. Therefore, by referring to the sub-action information 222, the frame image at the time of a desired sub-action can be identified from the moving image. Further, by referring to the sub-action information 222, the posture model at the time of a desired sub-action can be identified from the posture model information 221.
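• As a rough illustration only, the following Python sketch applies the three detection rules above to per-frame data. The array layout, the coordinate convention (z up, y toward the net), and all threshold values are assumptions of this sketch, not values stated in the disclosure.

```python
import numpy as np

def detect_subactions(ball_xyz, left_ankle_z, right_ankle_z, fps,
                      still_speed=0.5, ground_eps=0.10, hit_speed=4.0):
    """Return assumed frame indices of the "toss", "jump", and "hit" sub-actions.

    ball_xyz  : (N, 3) ball positions per frame; z is up, y points toward the net.
    *_ankle_z : (N,) ankle joint heights above the floor from the posture model.
    All thresholds are illustrative placeholders, not values from the disclosure.
    """
    ball_xyz = np.asarray(ball_xyz, dtype=float)
    v = np.diff(ball_xyz, axis=0) * fps          # per-frame ball velocity (m/s)
    speed = np.linalg.norm(v, axis=1)

    # Toss: the ball changes from a (near) stationary state to upward movement.
    toss = next((i for i in range(1, len(v))
                 if speed[i - 1] < still_speed and v[i, 2] > 0), None)

    # Jump: both ankle joints are separated from the ground by a set distance.
    jump = next((i for i in range(toss or 0, len(left_ankle_z))
                 if min(left_ankle_z[i], right_ankle_z[i]) > ground_eps), None)

    # Hit: movement toward the net reaches a predetermined threshold speed.
    hit = next((i for i in range(jump or 0, len(v)) if v[i, 1] >= hit_speed), None)

    return {"toss": toss, "jump": jump, "hit": hit}
```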
  • FIG. 16 is a diagram showing a display example of the video comparison UI 400A.
• The video comparison UI 400A is an example of a comparison UI, and is a UI for comparing moving images related to a plurality of mutually different pieces of recording information 200.
• The video comparison UI 400A includes selection lists 401A and 401B, display areas 402A and 402B, an add button 403, a seek bar 410, a play button 411, a pause button 412, a slow play button 413, a repeat button 414, a side-by-side button 415, an enlargement button 416, a camera switching button 417, a video inversion button 418, and a 3D switching button 419.
• In the selection lists 401A and 401B, the information specified from the locus information 220 related to each piece of recording information 200 is displayed in a list format.
  • the information specified from the locus information 220 includes, for example, information indicating the height, speed, and angle of the ball when passing through the net in the serve.
• One line of the selection lists 401A and 401B may correspond to one piece of recording information 200.
  • the recorded information 200 may include information indicating the evaluation of the action (for example, serve).
  • the selection lists 401A and 401B may also display the evaluation of the action, as shown in FIG.
  • the evaluation of the action may be manually entered by the user.
  • the evaluation of the action may be automatically determined based on the height, speed and angle of the ball when passing through the net.
  • the evaluation of the action may be determined based on whether or not the angle of the ball when passing through the net is equal to or greater than a predetermined threshold value.
  • the threshold value referred to in the evaluation of the action may be arbitrarily set by the user.
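• As an illustration of such an automatic determination, a Python sketch follows; the two-level grading and the default threshold are assumptions of this sketch, and in practice the threshold would come from the user setting described above.

```python
def evaluate_serve(height_m, speed_kmh, angle_deg, angle_threshold=15.0):
    """Hypothetical automatic evaluation of a serve.

    Follows the idea that the evaluation may be determined by whether the
    ball's angle when passing over the net is at or above a threshold;
    height and speed are carried along for display and filtering.
    """
    verdict = "good" if angle_deg >= angle_threshold else "needs work"
    return {"height_m": height_m, "speed_kmh": speed_kmh,
            "angle_deg": angle_deg, "evaluation": verdict}
```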
• In the display area 402A, the first moving image 201A or the second moving image 201B related to the first recording information 200A selected from the selection list 401A is displayed.
• In the display area 402B, the first moving image 201A or the second moving image 201B related to the second recording information 200B selected from the selection list 401B is displayed.
• A set of the selection list 401A and the display area 402A (hereinafter referred to as the "first set") and a set of the selection list 401B and the display area 402B (hereinafter referred to as the "second set") may be displayed side by side.
• The user selects a line (recording information 200) to be compared from each of the selection list 401A and the selection list 401B, and can watch and compare the moving images related to the selected lines displayed in the display areas 402A and 402B.
• The selection list 401A included in the first set may display, as needed, the recording information 200 generated from the first moving image 201A and the second moving image 201B taken on the same day using the first mobile terminal 20A and the second mobile terminal 20B.
• When the add button 403 is pressed, the display control unit 134 adds a new set including a new selection list and a new display area to the video comparison UI 400A.
• In this case, the UI control unit 152 may inquire of the user about the date of the recording information 200 to be displayed in the new selection list. Then, the UI control unit 152, in cooperation with the display control unit 134, may acquire from the record management unit 133 one or more pieces of recording information 200 matching the date input by the user, and display the selection list of the newly added set based on the acquired recording information 200.
  • the seek bar 410 is a bar indicating the playing time of the moving image displayed in the display areas 402A and 402B.
• When an arbitrary position on the seek bar 410 is selected, the UI control unit 152 may play the moving images from the time corresponding to the selected position.
• When the play button 411 is pressed, the UI control unit 152 starts playing the moving images displayed in the display areas 402A and 402B.
• When the pause button 412 is pressed, the UI control unit 152 pauses the playback of the moving images displayed in the display areas 402A and 402B.
• When the slow play button 413 is pressed, the UI control unit 152 plays the moving images displayed in the display areas 402A and 402B in slow motion (for example, at 1/2 speed).
• When the repeat button 414 is pressed, the UI control unit 152 repeatedly plays the moving images displayed in the display areas 402A and 402B.
  • the slow play button 413 and the repeat button 414 may be used together.
• When a plurality of sets are selected and the side-by-side button 415 is pressed, the UI control unit 152 switches to the display of the sub-action video comparison UI 400B for comparing the sub-actions of the selected sets (see FIG. 17). The details of the sub-action video comparison UI 400B will be described later (see FIG. 17).
• When at least one set is selected and the enlargement button 416 is pressed, the UI control unit 152 displays, in an enlarged manner, the range including the player in the moving image displayed in the display area of the selected set.
• The range to be enlarged may be arbitrarily specified by the user. Further, the magnification of the enlarged display may be arbitrarily specified by the user.
• When at least one set is selected and the camera switching button 417 is pressed, the UI control unit 152 switches the moving image displayed in the display area of the selected set (for example, the first moving image 201A) to the moving image taken by the other mobile terminal (for example, the second moving image 201B). Thereby, for example, the moving images displayed in the display areas 402A and 402B can be aligned to moving images shot from the same direction.
• When at least one set is selected and the video inversion button 418 is pressed, the UI control unit 152 horizontally inverts the moving image displayed in the display area of the selected set. This makes it easy to compare, for example, the actions of a right-handed player with the actions of a left-handed player.
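• A minimal sketch of such an inversion, assuming OpenCV frames, is shown below; `flipCode=1` mirrors the image around the vertical axis.

```python
import cv2

def invert_frame(frame):
    """Mirror a frame left-right so that, e.g., a left-handed player's
    action can be compared against a right-handed player's."""
    return cv2.flip(frame, 1)  # flipCode=1: flip around the vertical axis
```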
• When at least one set in the video comparison UI 400A is selected and the 3D switching button 419 is pressed, the UI control unit 152 displays the posture comparison UI 400C (see FIG. 18) for comparing the posture models of the selected sets. The details of the posture comparison UI 400C will be described later (see FIG. 18).
  • FIG. 17 is a diagram showing a display example of the sub-action video comparison UI 400B.
• When the side-by-side button 415 is pressed in the video comparison UI 400A, the UI control unit 152 displays the sub-action video comparison UI 400B shown in FIG. 17.
  • the sub-action video comparison UI 400B includes display areas 421A and 421B, and a plurality of sub-action selection buttons 420 extended from the side-by-side buttons 415.
  • the sub-action selection button 420 is an example of the sub-action selection UI.
  • the display areas 421A and 421B are arranged side by side as shown in FIG.
• The UI control unit 152 enlarges the range including the player in the moving image displayed in the first set and displays it in the display area 421A. Similarly, the UI control unit 152 enlarges the range including the player in the moving image displayed in the second set and displays it in the display area 421B.
  • the UI control unit 152 may adjust the display of the moving image in the display areas 421A and 421B so that the joint position of the player's neck is located at the center of the display areas 421A and 421B.
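• The following Python sketch illustrates one way this neck-centered, enlarged display could be computed from a frame; the output size and the pixel-coordinate neck position are assumptions of this illustration.

```python
def crop_around_neck(frame, neck_xy, out_w=480, out_h=640):
    """Cut a window centered on the player's neck joint (pixel
    coordinates), clamped so the window stays within the frame."""
    h, w = frame.shape[:2]
    cx, cy = int(neck_xy[0]), int(neck_xy[1])
    x0 = max(0, min(cx - out_w // 2, w - out_w))
    y0 = max(0, min(cy - out_h // 2, h - out_h))
    return frame[y0:y0 + out_h, x0:x0 + out_w]
```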
  • the plurality of sub-action selection buttons 420 are buttons for selecting sub-actions, and correspond to, for example, “toss”, “jump”, and “hit” sub-actions.
• For example, when the user selects the sub-action selection button 420 "hit", the UI control unit 152 performs the following processing. That is, the UI control unit 152 refers to the sub-action information 222 related to the first recording information 200A and identifies, from the moving image displayed in the first set, the frame image at the timing when the sub-action "hit" was captured. Then, the UI control unit 152 displays the identified frame image in the display area 421A. Similarly, the UI control unit 152 refers to the sub-action information 222 related to the second recording information 200B and identifies, from the moving image displayed in the second set, the frame image at the timing when the sub-action "hit" was captured.
• Then, the UI control unit 152 displays the identified frame image in the display area 421B.
• The UI control unit 152 may start playing the moving images from the frame images at the timing of the selected sub-action "hit". This allows the user to easily compare, for example, the player's footage at the time of the sub-action "hit" of serves performed at different dates and times. As long as the timing of the "hit" frame images is aligned, the moving images may also be played back including the "run-up" and "toss".
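• A minimal sketch of such sub-action-anchored playback, assuming OpenCV and a sub-action timestamp in seconds taken from the sub-action information 222, might look as follows.

```python
import cv2

def play_from_subaction(video_path, subaction_time_s):
    """Seek a video to the recorded occurrence timing of a sub-action
    and play from there; the millisecond seek via CAP_PROP_POS_MSEC
    is an assumption of this illustration."""
    cap = cv2.VideoCapture(video_path)
    cap.set(cv2.CAP_PROP_POS_MSEC, subaction_time_s * 1000.0)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        cv2.imshow("sub-action playback", frame)
        if cv2.waitKey(33) & 0xFF == ord("q"):  # ~30 fps display rate
            break
    cap.release()
    cv2.destroyAllWindows()
```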
  • FIG. 18 is a diagram showing an example of a posture comparison UI when the posture models are displayed side by side.
  • FIG. 19 is a diagram showing an example of a posture comparison UI when the posture models are superimposed and displayed.
  • the posture comparison UI 400C is an example of a comparison UI, and is a UI for comparing posture model information 221 related to a plurality of different recorded information 200.
• When the 3D switching button 419 is pressed in the video comparison UI 400A, the UI control unit 152 displays the posture comparison UI 400C shown in FIG. 18 or FIG. 19.
• The posture comparison UI 400C includes a display area 434, a seek bar 410, a play button 411, a pause button 412, a slow play button 413, a repeat button 414, a video switching button 441, a trajectory display switching button 442, a ball display switching button 443, a player display switching button 444, a superposition separation changeover button 445, and a side-by-side button 446.
• The UI control unit 152 displays, in the display area 434, the posture model 231A, the ball 240A, and the ball movement locus 241A corresponding to the first set, and the posture model 231B, the ball 240B, and the ball movement locus 241B corresponding to the second set.
  • the posture model 231A, the ball 240A, and the ball movement locus 241A corresponding to the first set are collectively referred to as the first posture model and the like.
  • the posture model 231B, the ball 240B, and the ball movement locus 241B corresponding to the second set are collectively referred to as a second posture model and the like.
• The user can arbitrarily rotate the first and second posture models and the like displayed in the display area 434 in 3D space, changing the viewpoint from which they are displayed.
  • the user can confirm and compare the postures of the players at the time of serving at different dates and times from various viewpoints.
• The seek bar 410 is a bar indicating the playback time of the first and second posture models and the like displayed in the display area 434. That is, the posture models and the like displayed in the display area 434 represent the player's posture and the like at the playback time indicated on the seek bar 410.
• When the play button 411 is pressed, the UI control unit 152 starts playing the motions of the first and second posture models and the like displayed in the display area 434.
• During playback, the first and second posture models and the like are displayed as an animation.
• At this time, the first moving image 201A and the second moving image 201B, which are the actual videos, may be played in synchronization in a separate window (not shown).
• When the pause button 412 is pressed, the UI control unit 152 pauses the motions of the first and second posture models and the like displayed in the display area 434.
• When the slow play button 413 is pressed, the UI control unit 152 plays the motions of the first and second posture models and the like displayed in the display area 434 in slow motion (for example, at 1/2 speed).
• When the repeat button 414 is pressed, the UI control unit 152 repeatedly plays the motions of the first and second posture models and the like displayed in the display area 434.
  • the slow play button 413 and the repeat button 414 may be used together.
• When the video switching button 441 is pressed, the UI control unit 152 switches to the display of the video comparison UI 400A shown in FIG. 16.
• When the trajectory display switching button 442 is pressed, the UI control unit 152 switches between displaying and hiding the movement loci 241A and 241B in the display area 434.
• When the ball display switching button 443 is pressed, the UI control unit 152 switches between displaying and hiding the balls 240A and 240B in the display area 434.
• When the player display switching button 444 is pressed, the UI control unit 152 switches between displaying and hiding the posture models 231A and 231B in the display area 434.
• When the superposition separation changeover button 445 is pressed, the UI control unit 152 switches the display in the display area 434 between the display in which the first posture model and the like and the second posture model and the like are separated as shown in FIG. 18 and the display in which the first posture model and the like and the second posture model and the like are superimposed as shown in FIG. 19.
  • the UI control unit 152 may display the first posture model 231A and the second posture model 231B on top of each other with reference to the neck joint positions 232A and 232B.
• When displaying the first posture model 231A and the second posture model 231B in an overlapping manner as shown in FIG. 19, the UI control unit 152 moves and displays the positions of the balls 240A and 240B and of the ball movement loci 241A and 241B in accordance with the movement of the positions of the posture models 231A and 231B.
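• A minimal sketch of this neck-anchored superposition is shown below; the NumPy representation and the array shapes are assumptions of this illustration, not part of the disclosure.

```python
import numpy as np

def superimpose_on_neck(joints_b, neck_a, neck_b, ball_b=None, locus_b=None):
    """Translate the second posture model so its neck joint (232B) coincides
    with the first model's neck joint (232A); the ball and its movement locus
    are shifted by the same offset so their relative geometry is preserved.
    Assumed shapes: joints (J, 3), ball (3,), locus (T, 3)."""
    offset = np.asarray(neck_a, float) - np.asarray(neck_b, float)
    moved_joints = np.asarray(joints_b, float) + offset
    moved_ball = None if ball_b is None else np.asarray(ball_b, float) + offset
    moved_locus = None if locus_b is None else np.asarray(locus_b, float) + offset
    return moved_joints, moved_ball, moved_locus
```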
• When the side-by-side button 446 is pressed, the UI control unit 152 displays the sub-action posture comparison UI 400D (see FIG. 20). The details of the sub-action posture comparison UI 400D will be described later (see FIG. 20).
  • the posture comparison UI400C may include a model inversion button (not shown).
• When a posture model is selected and the model inversion button is pressed, the UI control unit 152 displays the selected posture model inverted left to right. Thereby, for example, the posture of a right-handed player and the posture of a left-handed player can be easily compared.
  • FIG. 20 is a diagram showing an example of a UI for sub-action posture comparison.
• When the side-by-side button 446 is pressed in the posture comparison UI 400C, the UI control unit 152 displays the sub-action posture comparison UI 400D shown in FIG. 20.
  • the sub-action posture comparison UI 400D includes display areas 451A and 451B, and a plurality of sub-action selection buttons 447 extended from the side-by-side buttons 446.
  • the sub-action selection button 447 is an example of the sub-action selection UI.
  • the display areas 451A and 451B are arranged side by side as shown in FIG.
  • the UI control unit 152 enlarges the first posture model and the like (231A, 240A, 241A) and displays them in the display area 451A.
  • the UI control unit 152 enlarges the second posture model and the like (231B, 240B, 241B) and displays them in the display area 451B.
  • the UI control unit 152 may adjust the display of the posture models 231A and 231B so that the joint positions 232A and 232B of the neck of the posture model are located at the center of the display areas 451A and 451B.
  • the plurality of sub-action selection buttons 447 are buttons for selecting sub-actions, and correspond to, for example, “toss”, “jump”, and “hit” sub-actions.
• For example, when the user selects the sub-action selection button 447 "jump", the UI control unit 152 performs the following processing. That is, the UI control unit 152 refers to the sub-action information 222 related to the first recording information 200A and identifies the first posture model and the like at the timing of the sub-action "jump". Then, the UI control unit 152 displays the identified first posture model and the like in the display area 451A. Similarly, the UI control unit 152 refers to the sub-action information 222 related to the second recording information 200B and identifies the second posture model and the like at the timing of the sub-action "jump". Then, the UI control unit 152 displays the identified second posture model and the like in the display area 451B.
• The UI control unit 152 may start playing the motions of the posture models and the like from the posture models and the like at the timing of the selected sub-action "jump". This allows the user to easily compare, for example, the player's posture and motion at the time of the sub-action "jump" of serves performed at different dates and times. As long as the timing of the sub-action "jump" is aligned, the motions may also be played back including the "run-up" and "toss".
  • FIG. 21 is a diagram for explaining an example in the case of selecting a posture model in the posture comparison UI 400C.
  • the UI control unit 152 displays the selection list 431 as shown in FIG. 21.
• In the selection list 431, the information specified from the locus information 220 related to each piece of recording information 200 is displayed in a list format, as in the case of FIG. 16.
  • a check box 432 is displayed in each line of the selection list 431. The user sets the check box 432 of the row to be compared from the selection list 431 to ON.
  • the UI control unit 152 identifies the recording information 200 in which the check box 432 is set to ON, and displays the posture model or the like generated from the posture model information 221 related to the specified recording information 200 in the display area 434. As a result, the user can display a desired posture model or the like in the display area 434 and compare them.
  • the ball filtering button 433 may be displayed on the sidebar 430.
• When the ball filtering button 433 is pressed, the UI control unit 152 displays a condition box (not shown) for filtering the recording information 200 to be displayed in the selection list 431.
• In the condition box, at least one of the ball height, speed, angle, and evaluation, each serving as a threshold for filtering, may be entered.
• The UI control unit 152 displays, as the selection list 431, the recording information 200 that matches the conditions entered in the condition box. As a result, the user can efficiently extract candidate recording information 200 for comparison from a large amount of recording information 200.
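• The following Python sketch illustrates such condition-box filtering over recording information; the dictionary keys are hypothetical names introduced for this illustration.

```python
def filter_records(records, min_height=None, min_speed=None,
                   min_angle=None, evaluation=None):
    """Keep only recording information whose trajectory-derived statistics
    meet the entered thresholds; None means the condition is not used."""
    out = []
    for r in records:
        if min_height is not None and r["net_height_m"] < min_height:
            continue
        if min_speed is not None and r["speed_kmh"] < min_speed:
            continue
        if min_angle is not None and r["angle_deg"] < min_angle:
            continue
        if evaluation is not None and r["evaluation"] != evaluation:
            continue
        out.append(r)
    return out
```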
  • FIG. 22 is a diagram showing a hardware configuration of a computer that realizes the functions of the ball game video analysis device 100 by a program.
• The computer 1000A includes an input device 1001 such as a keyboard, mouse, or touch pad; an output device 1002 such as a display or speaker; a CPU (Central Processing Unit) 1003; a GPU (Graphics Processing Unit) 1004; a ROM (Read Only Memory) 1005; a RAM (Random Access Memory) 1006; a storage device 1007 such as an HDD (Hard Disk Drive), SSD (Solid State Drive), or flash memory; a reading device 1008 that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disk Read Only Memory) or USB (Universal Serial Bus) memory; and a transmitting / receiving device 1009 that communicates by wire or wirelessly via the communication network N. These components are connected by a bus 1010.
• The reading device 1008 reads a program for realizing the functions of each of the above devices from a recording medium on which the program is recorded, and stores the program in the storage device 1007. Alternatively, the transmitting / receiving device 1009 communicates with a server device connected to the communication network N, and stores in the storage device 1007 the program for realizing the functions of each device downloaded from the server device.
  • the CPU 1003 copies the program stored in the storage device 1007 to the RAM 1006, and sequentially reads and executes the instructions included in the program from the RAM 1006, whereby the functions of the above devices are realized.
  • FIG. 23 is a diagram showing a hardware configuration of a computer that programmatically realizes the functions of the first mobile terminal 20A, the second mobile terminal 20B, and the display terminal 30.
• The components described with reference to FIG. 22 are designated by the same reference numerals, and their description may be omitted.
  • This computer 1000B includes an input device 1001, an output device 1002, a CPU 1003, a GPU 1004, a ROM 1005, a RAM 1006, a storage device 1007, a reading device 1008, a transmission / reception device 1009, and a camera device 1011.
  • the camera device 1011 includes an image sensor and generates captured images (moving images and still images).
  • the generated captured image is stored in the storage device 1007.
  • the captured image stored in the storage device 1007 may be transmitted to an external server or the like through the transmission / reception device 1009 and the communication network N.
• Each of the above functional units is typically realized as an LSI, which is an integrated circuit. These may be individually integrated into one chip, or may be integrated into one chip so as to include a part or all of them. Although the term LSI is used here, it may also be referred to as an IC, a system LSI, a super LSI, or an ultra LSI depending on the degree of integration.
• As described above, the ball game video analysis device (100) includes: a locus analysis unit (105) that generates locus information (220) indicating the movement locus of a moving body (for example, a ball) used in a ball game, based on a first moving image (201A) and a second moving image (201B) in which an action of a person (for example, a player) with respect to the moving body is captured from mutually different positions; a record management unit (133) that manages a plurality of pieces of record information (200), each being information indicating the correspondence relationship between the first moving image and the second moving image and the locus information; and a display control unit (134) that displays a UI (User Interface) for selecting record information based on the locus information from among the plurality of pieces of record information, and displays the moving images having a correspondence relationship in the record information selected through the UI.
• With this configuration, the actions of a person performing a ball game are recorded together with the trajectory information. Therefore, the user can easily find and watch a video useful for practicing the ball game, such as a video in which the action succeeds or fails, from among a plurality of videos of the person's action, based on the trajectory information.
• When first record information and second record information are selected from the plurality of pieces of record information, the display control unit may display, side by side, the first moving image or the second moving image having a correspondence relationship in the first record information and the first moving image or the second moving image having a correspondence relationship in the second record information.
• Thereby, the moving image related to the selected first record information and the moving image related to the selected second record information are displayed side by side. Therefore, the user can watch the moving images of the actions displayed side by side and use them for practicing the ball game.
• The ball game video analysis device may further include a posture analysis unit (131) that generates posture model information (221) indicating the posture of the person at the time of the action, based on the first moving image and the second moving image.
  • the recorded information may be information indicating the correspondence between the first moving image and the second moving image, the locus information, and the posture model information.
• When the first record information and the second record information are selected from the plurality of pieces of record information, the display control unit may display, side by side or in an overlapping manner, the posture model (231A) generated from the posture model information having a correspondence relationship in the first record information and the posture model (231B) generated from the posture model information having a correspondence relationship in the second record information.
• Thereby, the posture model related to the selected first record information and the posture model related to the selected second record information are displayed side by side or superimposed. Therefore, the user can view the posture models at the time of the action displayed side by side or superimposed, and use them for practicing the ball game.
• The display control unit may display the movement locus (241A) of the moving body generated from the locus information having a correspondence relationship in the first record information and the movement locus (241B) of the moving body generated from the locus information having a correspondence relationship in the second record information, together with the posture models.
• Thereby, the movement loci of the moving body are displayed together with the posture models. Therefore, the user can view the movement of the moving body at the time of the action together with the posture models and use it for practicing the ball game.
• The ball game video analysis device may further include a sub-action analysis unit (132) that generates sub-action information (222) indicating the occurrence timing of each of a plurality of sub-actions constituting the action, based on the trajectory information and the posture model information.
  • the recorded information may be information indicating the correspondence between the first moving image and the second moving image, the trajectory information, the posture model information, and the sub-action information.
• The display control unit may display a sub-action selection UI (420) capable of selecting one of the plurality of sub-actions, and may display the posture model at the occurrence timing of the one sub-action selected through the sub-action selection UI.
• Thereby, the posture model at the occurrence timing of the one sub-action selected through the sub-action selection UI is displayed. Therefore, the user can display the posture model at the time of a desired sub-action through the sub-action selection UI, which is useful for practicing the ball game.
• The technology of the present disclosure can be used for the analysis of sports that use moving bodies.
Reference Signs List

10 Ball game video analysis system
20A First mobile terminal
20B Second mobile terminal
30 Display terminal
100 Ball game video analysis device
101 Video receiving unit
102 Image conversion unit
103 Reference frame identification unit
104 Reference frame change unit
105 Trajectory analysis unit
106 Result display unit
131 Posture analysis unit
132 Sub-action analysis unit
133 Record management unit
134 Display control unit
151 Information transmission / reception unit
152 UI control unit
200 Record information
200A First record information
200B Second record information
201A First moving image
201B Second moving image
202A First frame image group
202B Second frame image group
203 Synchronization deviation information
204 Analysis result information
210A First image display area
210B Second image display area
211A, 211B Ball detection area
220 Trajectory information
221 Posture model information
222 Sub-action information
231A First posture model
231B Second posture model
232A, 232B Neck joint position
240A, 240B Ball
241A, 241B Ball movement trajectory
310 Reference frame confirmation UI
312 OK button
313 Change button
314 Redo button
320

Abstract

This ball game footage analysis device: generates trajectory information of a moving body on the basis of a first video and a second video that capture, from differing positions, an action of a person on a moving body for a ball game; manages a plurality of pieces of record information indicating the correspondence relationship between the trajectory information and the first video and the second video; displays a UI that allows selection of the record information on the basis of the trajectory information from among the plurality of pieces of record information; and displays the first video or the second video that is in a correspondence relationship with the record information selected through the UI.

Description

Ball game video analysis device, ball game video analysis system, ball game video analysis method, and computer program
This disclosure relates to a ball game video analysis device, a ball game video analysis system, a ball game video analysis method, and a computer program.
Conventionally, "Data Volley" has been marketed as volleyball analysis software, and a technique is known in which the situation of a team's players is converted into data based on the subjective judgment of an analyst familiar with this software.
In recent years, a technique is known in which, without relying on an analyst's operation, cameras are installed at the four corners overlooking the court, footage of play captured by these four synchronized cameras is taken in, the movement of the ball is tracked to obtain three-dimensional position information of the ball, and at the same time analysis data on the players' uniform numbers and the types of actions included in the play (serve, reception, dig, toss, attack, block) is output (Patent Document 1).
Patent Document 1: International Publication No. 2019/225415
A person practicing a ball game action (for example, a volleyball serve) often shoots the practice as a video with a readily available camera and plays the video back later to make use of it in practice. Conventionally, however, videos have been searched for by relying on shooting dates and thumbnails, and it is difficult to identify a target video and a comparison video from among a plurality of videos in which the same action is captured and to grasp the proficiency of the action, points of the form to improve, and the like.
An object of the present disclosure is to provide a technique that makes it possible to identify a target video and a comparison video from among a plurality of videos capturing the action of a person performing a ball game, and to easily grasp the proficiency of the action, points of the form to improve, and the like.
The ball game video analysis device according to the present disclosure includes: a locus analysis unit that generates locus information indicating the movement locus of a moving body used in a ball game, based on a first moving image and a second moving image in which an action of a person with respect to the moving body is captured from mutually different positions; a record management unit that manages a plurality of pieces of record information, each being information indicating the correspondence relationship between the first moving image and the second moving image and the locus information; and a display control unit that displays a UI (User Interface) for selecting record information based on the locus information from among the plurality of pieces of record information, and displays the first moving image or the second moving image having a correspondence relationship in the record information selected through the UI.
The ball game video analysis system according to the present disclosure includes a first mobile terminal with a camera, a second mobile terminal with a camera, a ball game video analysis device, and a display terminal. The first mobile terminal captures an action of a person with respect to a moving body used in a ball game from a first position to generate a first moving image. The second mobile terminal captures the action of the person from a second position to generate a second moving image. The ball game video analysis device generates locus information indicating the locus of the moving body based on the first moving image and the second moving image, manages a plurality of pieces of record information, each being information indicating the correspondence relationship between the first moving image and the second moving image and the locus information, causes the display terminal to display a UI for selecting record information based on the locus information from among the plurality of pieces of record information, and causes the display terminal to display the first moving image or the second moving image having a correspondence relationship in the record information selected through the UI.
In the ball game video analysis method according to the present disclosure, a device generates locus information indicating the movement locus of a moving body used in a ball game, based on a first moving image and a second moving image in which an action of a person with respect to the moving body is captured from mutually different positions; manages a plurality of pieces of record information, each being information indicating the correspondence relationship between the first moving image and the second moving image and the locus information; displays a UI for selecting record information based on the locus information from among the plurality of pieces of record information; and displays the first moving image or the second moving image having a correspondence relationship in the record information selected through the UI.
The computer program according to the present disclosure causes a computer to execute processing of: generating locus information indicating the movement locus of a moving body used in a ball game, based on a first moving image and a second moving image in which an action of a person with respect to the moving body is captured from mutually different positions; managing a plurality of pieces of record information, each being information indicating the correspondence relationship between the first moving image and the second moving image and the locus information; displaying a UI for selecting record information based on the locus information from among the plurality of pieces of record information; and displaying the first moving image or the second moving image having a correspondence relationship in the record information selected through the UI.
Note that these comprehensive or specific aspects may be realized by a system, a device, a method, an integrated circuit, a computer program, or a recording medium, or by any combination of a system, a device, a method, an integrated circuit, a computer program, and a recording medium.
According to the present disclosure, it is possible to identify a target video and a comparison video from among a plurality of videos capturing the action of a person performing a ball game, and to easily grasp the proficiency of the action, points of the form to improve, and the like.
Brief Description of Drawings

FIG. 1 is a diagram showing a configuration example of the ball game video analysis system according to Embodiment 1.
FIG. 2 is a diagram for explaining a method of identifying the position of the ball.
FIG. 3 is a block diagram showing a configuration example of the ball game video analysis device according to Embodiment 1.
FIG. 4 is a diagram for explaining a method of identifying the reference frame image.
FIG. 5 is a flowchart showing an example of the preliminary work.
FIG. 6 is a sequence chart showing an example of processing related to the preliminary work.
FIG. 7 is a sequence chart showing an example of processing related to the preliminary work when an operation to change the reference frame image is performed in FIG. 6.
FIG. 8 is a diagram showing a detailed example of the reference frame confirmation UI (User Interface).
FIG. 9 is a diagram showing a detailed example of the reference frame change UI.
FIG. 10 is a diagram showing a display example of the analysis result.
FIG. 11 is a diagram showing a configuration example of the ball game video analysis system according to Embodiment 2.
FIG. 12 is a block diagram showing a configuration example of the ball game video analysis device according to Embodiment 2.
FIG. 13 is a block diagram showing a configuration example of the display terminal according to Embodiment 2.
FIG. 14 is a diagram for explaining an example of the posture model.
FIG. 15 is a diagram for explaining the sub-actions constituting the jumping serve.
FIG. 16 is a diagram showing a display example of the video comparison UI.
FIG. 17 is a diagram showing a display example of the sub-action video comparison UI.
FIG. 18 is a diagram showing an example of the posture comparison UI when the posture models are displayed side by side.
FIG. 19 is a diagram showing an example of the posture comparison UI when the posture models are displayed superimposed.
FIG. 20 is a diagram showing an example of the sub-action posture comparison UI.
FIG. 21 is a diagram for explaining an example of selecting a posture model in the posture comparison UI.
FIG. 22 is a diagram showing the hardware configuration of a computer that realizes the functions of the ball game video analysis device according to the present disclosure by a program.
FIG. 23 is a diagram showing the hardware configuration of a computer that realizes the functions of the first mobile terminal, the second mobile terminal, and the display terminal according to the present disclosure by a program.
Hereinafter, embodiments of the present disclosure will be described in detail with reference to the drawings as appropriate. However, more detailed description than necessary may be omitted. For example, detailed descriptions of already well-known matters and redundant descriptions of substantially the same configurations may be omitted. This is to avoid unnecessary redundancy in the following description and to facilitate the understanding of those skilled in the art. The accompanying drawings and the following description are provided so that those skilled in the art can fully understand the present disclosure, and are not intended to limit the subject matter described in the claims.
(Embodiment 1)
<Structure of ball game video analysis system>
FIG. 1 is a diagram showing a configuration example of a ball game video analysis system according to the present embodiment.
The ball game video analysis system 10 is a system that analyzes video capturing a ball game and analyzes the trajectory of the moving body used in that ball game. In the present embodiment, volleyball, which is one ball game, will be described as an example. However, the ball game video analysis system 10 is applicable not only to volleyball but also to various ball games such as soccer, baseball, table tennis, basketball, tennis, rugby, American football, lacrosse, badminton, and ice hockey. The moving body targeted by the ball game video analysis system 10 is typically a ball. However, the moving body is not limited to a ball, and may have a shape that does not fit the concept of a "ball", such as a badminton shuttlecock or an ice hockey puck.
The ball game video analysis system 10 includes a first mobile terminal 20A, a second mobile terminal 20B, a display terminal 30, and a ball game video analysis device 100. The first mobile terminal 20A, the second mobile terminal 20B, the display terminal 30, and the ball game video analysis device 100 can transmit and receive information to and from one another through a communication network N. The communication network N is composed of, for example, a wireless communication network (wireless LAN (Local Area Network), 4G, 5G, etc.), the Internet, or a combination thereof.
The first mobile terminal 20A and the second mobile terminal 20B are wireless communication terminals equipped with cameras. Examples of the first mobile terminal 20A and the second mobile terminal 20B are mobile phones, smartphones, tablets, and mobile PCs.
The first mobile terminal 20A and the second mobile terminal 20B photograph the action of a person performing a ball game (hereinafter referred to as a "player"). In the present embodiment, a volleyball serve, which is one of the player's actions, will be described as an example. However, the player's action is not limited to a volleyball serve, and may be a dig, reception, toss, attack, block, or the like. Alternatively, the action of the person performing the ball game may be a specific action in any of the sports mentioned above other than volleyball.
The first mobile terminal 20A and the second mobile terminal 20B shoot, for example, the player's serve practice as moving images from mutually different directions, and generate a first moving image 201A and a second moving image 201B, respectively. For example, as shown in FIG. 1, the first mobile terminal 20A photographs the player from the right rear, and the second mobile terminal 20B photographs the player from the left rear.
The first mobile terminal 20A and the second mobile terminal 20B transmit the generated first moving image 201A and second moving image 201B, respectively, to the ball game video analysis device 100 through the communication network N.
The display terminal 30 displays the analysis results of the ball game video analysis device 100. A display example of the analysis results will be described later (see FIG. 10). Examples of the display terminal 30 are mobile phones, smartphones, tablets, and mobile PCs. The display terminal 30 may also be read as a third mobile terminal. Although a configuration in which the display terminal 30 for displaying the analysis results is provided separately is illustrated here, either the first mobile terminal 20A or the second mobile terminal 20B may be used as the terminal for displaying the analysis results.
The ball game video analysis device 100 analyzes the movement trajectory of the ball in three-dimensional space based on the first moving image 201A and the second moving image 201B received from the first mobile terminal 20A and the second mobile terminal 20B, respectively. The movement trajectory of the ball in three-dimensional space can be generated by arranging the three-dimensional coordinates of the ball in chronological order.
As shown in FIG. 2, the three-dimensional coordinates of the ball at time t can be calculated from the frame image F1 at time t in the first moving image 201A and the frame image F2 at time t in the second moving image 201B. That is, as shown in FIG. 2, the intersection of the vector V1 extending from the position U1 of (the camera of) the first mobile terminal 20A to the position B1 of the ball in the frame image F1 and the vector V2 extending from the position U2 of (the camera of) the second mobile terminal 20B to the position B2 of the ball in the frame image F2 gives the three-dimensional coordinate Pa of the ball.
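In practice, two measured rays rarely intersect exactly, so an implementation would typically take the midpoint of the shortest segment between them as the ball position. The following Python sketch shows this standard construction; treating the midpoint as the intersection Pa is an assumption of this illustration, not a method stated in the disclosure.

```python
import numpy as np

def triangulate_ball(u1, d1, u2, d2):
    """Midpoint of the shortest segment between two camera rays.

    u1, u2 : camera positions U1, U2 (3-vectors)
    d1, d2 : unit direction vectors (along V1, V2) toward the detected ball
    Solves for s, t minimising |(u1 + s*d1) - (u2 + t*d2)|.
    """
    u1, d1 = np.asarray(u1, float), np.asarray(d1, float)
    u2, d2 = np.asarray(u2, float), np.asarray(d2, float)
    r = u1 - u2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ r, d2 @ r
    denom = a * c - b * b          # approaches 0 when the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = u1 + s * d1               # closest point on ray 1
    p2 = u2 + t * d2               # closest point on ray 2
    return (p1 + p2) / 2.0
```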
Here, if the time codes (time information) that the first mobile terminal 20A assigns to each frame image of the first moving image 201A and that the second mobile terminal 20B assigns to each frame image of the second moving image 201B were perfectly accurate, the three-dimensional coordinates of the ball could be identified accurately by the above method using the frame images F1 and F2 to which the time code of time t is assigned.
However, when a general-purpose camera such as a camera mounted on a mobile terminal is used instead of a dedicated camera specialized for ball game video analysis as shown in Patent Document 1, the first mobile terminal 20A and the second mobile terminal 20B cannot always assign accurate time codes. The reason is described below.
Since balls in ball game sports move at high speed, identifying the three-dimensional position of the ball requires shooting at a high frame rate of, for example, 30 fps or more, and millisecond-level synchronization accuracy between the frame images shot by the first mobile terminal 20A and those shot by the second mobile terminal 20B.
Here, it is conceivable to synchronize the internal clocks of the first mobile terminal 20A and the second mobile terminal 20B using NTP (Network Time Protocol). However, since the synchronization accuracy of NTP is on the order of seconds, a synchronization deviation on the order of milliseconds may occur between the internal clock of the first mobile terminal 20A and that of the second mobile terminal 20B. Therefore, this method may not be able to identify the three-dimensional coordinates of a ball moving at high speed with sufficient accuracy.
Alternatively, it is conceivable to synchronize the internal clocks of the first mobile terminal 20A and the second mobile terminal 20B using a satellite positioning system such as GPS (Global Positioning System). However, with this method, the camera of a mobile terminal without a GPS function cannot be used. In addition, when GPS signals cannot be received, such as in an indoor facility, a synchronization deviation may occur between the internal clocks of the first mobile terminal 20A and the second mobile terminal 20B. Therefore, this method may also fail to identify the three-dimensional coordinates of a ball moving at high speed with sufficient accuracy.
Alternatively, a synchronization deviation may occur when the first mobile terminal 20A and/or the second mobile terminal 20B assign time codes. For example, when the processing performance of the second mobile terminal 20B is lower than that of the first mobile terminal 20A, the processing load of time code assignment on the mobile terminal 20B is large, and it may assign time codes that are, on the whole, slightly later than those of the first mobile terminal 20A. In this case, even if the time codes of the frame images F1 and F2 are the same, the frame image F2 may actually have been captured slightly earlier than the frame image F1. In the frame image F2 captured slightly earlier, the position of the ball is B3, and as shown in FIG. 2, the three-dimensional coordinate of the ball also deviates to Pb. As a result, the calculated movement trajectory of the ball also deviates.
Therefore, the ball game video analysis device 100 according to the present embodiment has a function of identifying the synchronization deviation of frame images between the first mobile terminal 20A and the second mobile terminal 20B. The ball game video analysis device 100 then calculates the movement trajectory of the ball in consideration of the identified synchronization deviation. As a result, the movement trajectory of the ball can be calculated even using a general-purpose camera such as a camera mounted on a mobile terminal, instead of a dedicated camera specialized for ball game video analysis as shown in Patent Document 1. This is described in detail below.
 <Structure of the ball game video analysis device>
 FIG. 3 is a block diagram showing a configuration example of the ball game video analysis device 100 according to the present embodiment.
 The ball game video analysis device 100 includes a moving image receiving unit 101, an image conversion unit 102, a reference frame specifying unit 103, a reference frame changing unit 104, a trajectory analysis unit 105, and a result display unit 106.
 The moving image receiving unit 101 receives the first video 201A from the first mobile terminal 20A and the second video 201B from the second mobile terminal 20B. Examples of video formats are MPEG-4, TS files, H.264, and H.265. A time code is assigned by the first mobile terminal 20A to each frame image constituting the first video 201A, and a time code is assigned by the second mobile terminal 20B to each frame image constituting the second video 201B. As described with reference to FIG. 1, the first video 201A and the second video 201B capture, for example, a scene in which the player hits a serve.
 The image conversion unit 102 converts each frame image of the first video 201A into a still image to generate a first frame image group 202A, and converts each frame image of the second video 201B into a still image to generate a second frame image group 202B. Examples of still image formats are JPEG, GIF, and TIFF.
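 As an illustration only, the frame-to-still conversion can be pictured with a short sketch. The following is a minimal example using OpenCV, not the embodiment's implementation; the file names and the JPEG output format are assumptions for the example.

```python
import cv2

def video_to_frames(video_path: str, out_prefix: str) -> int:
    """Decode every frame of a video file and save it as a JPEG still.

    Returns the number of frames written. A real implementation would
    also carry each frame's time code alongside the still image.
    """
    capture = cv2.VideoCapture(video_path)
    count = 0
    while True:
        ok, frame = capture.read()  # ok becomes False once the video ends
        if not ok:
            break
        cv2.imwrite(f"{out_prefix}_{count:06d}.jpg", frame)
        count += 1
    capture.release()
    return count

# e.g. video_to_frames("movie_201A.mp4", "frames_A")  # hypothetical file name
```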
 The reference frame specifying unit 103 identifies, from each of the first frame image group 202A and the second frame image group 202B, a frame image serving as a reference for frame synchronization (hereinafter, a "reference frame image"). Hereinafter, the reference frame image identified from the first frame image group 202A is referred to as the first reference frame image, and the reference frame image identified from the second frame image group 202B is referred to as the second reference frame image.
 The reference frame specifying unit 103 identifies, as the reference frame image, a frame image that captures a specific sub-action constituting the action. For example, from the frame image group capturing the series of movements of a serve (the action), the reference frame specifying unit 103 identifies the frame image capturing the moment the serving hand strikes the ball (the sub-action) as the reference frame image. Next, an example of a method of identifying the reference frame image will be described with reference to FIG. 4.
 As shown in FIG. 4, the reference frame specifying unit 103 sets a ball detection region 211A for the first frame image group 202A and a ball detection region 211B for the second frame image group 202B. This reduces the area in which the ball must be detected, shortening the time required for the ball detection processing. Note that these ball detection regions need not be set.
 The reference frame specifying unit 103 detects the ball from each frame image of the first frame image group 202A by a known method. For example, the reference frame specifying unit 103 detects the ball from a frame image by pattern matching. Alternatively, the reference frame specifying unit 103 detects the ball from a frame image by using deep learning trained in advance for ball detection. The reference frame specifying unit 103 likewise detects the ball from each frame image of the second frame image group 202B.
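 As one concrete reading of the pattern matching option, the sketch below locates the ball with normalized cross-correlation template matching. It only illustrates the kind of known method referred to above; the template image and the acceptance threshold are assumptions for the example.

```python
import cv2
import numpy as np

def detect_ball(frame: np.ndarray, template: np.ndarray,
                threshold: float = 0.8):
    """Return the (x, y) center of the best template match, or None.

    frame and template are grayscale images; threshold is an assumed
    tuning parameter below which a match is rejected.
    """
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    if max_val < threshold:
        return None  # no convincing ball candidate in this frame
    h, w = template.shape[:2]
    return (max_loc[0] + w // 2, max_loc[1] + h // 2)
```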
 The reference frame specifying unit 103 calculates the two-dimensional movement locus of the ball based on the ball positions detected from the frame images of the first frame image group 202A, and identifies the first reference frame image based on changes in that two-dimensional movement locus. For example, the reference frame specifying unit 103 identifies the first reference frame image by the processing of the following steps S11 to S13.
 (S11) The reference frame specifying unit 103 extracts, from the first frame image group 202A, a first frame image in which the amount of movement of the ball in the X direction is equal to or greater than a predetermined threshold (that is, the ball has moved significantly in the lateral direction). This extracts a first frame image in which the ball is flying toward the net immediately after the serving hand has struck it.
 (S12) From among the plurality of first frame images contained in a fixed period preceding the first frame image extracted in step S11, the reference frame specifying unit 103 identifies the first frame image in which the amount of movement of the ball in the Y-axis direction changes by a predetermined threshold or more and which is closest to the first frame image extracted in step S11. This identifies, from among the plurality of first frame images in which the ball tossed up for the serve is rising or falling, the first frame image at the moment the serving hand strikes the ball.
 (S13) The reference frame specifying unit 103 takes the first frame image identified in step S12 as the first reference frame image.
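 A minimal sketch of the heuristic of steps S11 to S13 follows, assuming each ball detection is available as a (time_code, x, y) tuple in image coordinates and in chronological order. The two thresholds and the look-back length are assumed tuning parameters; the embodiment does not prescribe concrete values.

```python
def find_reference_frame(detections, x_threshold, y_threshold, lookback):
    """detections: list of (time_code, x, y) ball positions, one per frame,
    in chronological order. Returns the time code of the estimated hit frame,
    or None if no candidate is found.
    """
    # S11: find the first frame whose X movement exceeds the threshold,
    # i.e. the ball is suddenly flying toward the net.
    s11_index = None
    for i in range(1, len(detections)):
        if abs(detections[i][1] - detections[i - 1][1]) >= x_threshold:
            s11_index = i
            break
    if s11_index is None:
        return None

    # S12: within a fixed look-back window before that frame, take the
    # frame closest to it whose Y movement also changed strongly
    # (the tossed ball rising or falling until the hand strikes it).
    for i in range(s11_index - 1, max(s11_index - lookback, 0), -1):
        if abs(detections[i][2] - detections[i - 1][2]) >= y_threshold:
            # S13: use this frame as the reference frame image.
            return detections[i][0]
    return None
```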
 The reference frame specifying unit 103 identifies the second reference frame image by the same processing as above. The reference frame specifying unit 103 then writes the time code corresponding to the first reference frame image (hereinafter, the "first reference time code") and the time code corresponding to the second reference frame image (hereinafter, the "second reference time code") into the synchronization shift information 203.
 Note that the reference frame specifying unit 103 may identify the reference frame image using an action and sub-action other than a serve. For example, the reference frame specifying unit 103 may take, as the reference frame image, a frame image capturing the moment the player's hand strikes the ball in an attack (or a dig, reception, toss, block, or the like).
 The reference frame specifying unit 103 may also write the value obtained by subtracting the second reference time code from the first reference time code (hereinafter, the "synchronization time difference") into the synchronization shift information 203. The synchronization time difference written into the synchronization shift information 203 corresponds to the magnitude of the frame image synchronization shift between the first mobile terminal 20A and the second mobile terminal 20B. For example, a synchronization time difference of approximately 0 indicates that the time codes assigned by the first mobile terminal 20A and those assigned by the second mobile terminal 20B tend to hardly deviate from each other. A positive synchronization time difference indicates that the time codes assigned by the second mobile terminal 20B tend to lag behind those assigned by the first mobile terminal 20A. A negative synchronization time difference indicates that the time codes assigned by the first mobile terminal 20A tend to lag behind those assigned by the second mobile terminal 20B.
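 The subtraction and its sign convention can be restated in a few lines; the time code values below are hypothetical, in milliseconds.

```python
# Time codes of the two reference frame images, in milliseconds (made-up values).
first_reference_time_code = 12_345
second_reference_time_code = 12_312

# Positive: terminal 20B's time codes tend to lag behind terminal 20A's.
# Negative: terminal 20A's time codes tend to lag behind terminal 20B's.
sync_time_difference = first_reference_time_code - second_reference_time_code
print(sync_time_difference)  # 33 -> 20B stamps about 33 ms late
```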
 Returning to the description of FIG. 3.
 The reference frame image automatically identified by the reference frame specifying unit 103 by the method described above is not necessarily the correct reference frame image. Therefore, the reference frame specifying unit 103 provides a UI (User Interface) 310 for the user to confirm the reference frame images (hereinafter, the "reference frame confirmation UI"). Details of the reference frame confirmation UI 310 will be described later (see FIG. 8).
 The reference frame changing unit 104 provides a UI 320 for the user to change a reference frame image (hereinafter, the "reference frame changing UI"). Details of the reference frame changing UI 320 will be described later (see FIG. 9). When a reference frame image is changed through the reference frame changing UI 320, the reference frame changing unit 104 changes the synchronization shift information 203 based on the changed reference frame image.
 Using the synchronization shift information 203, the trajectory analysis unit 105 identifies, from the first video 201A and the second video 201B capturing, for example, a player's serve action, the movement locus of the ball in three-dimensional space involved in that action, and generates analysis result information 204 including information indicating that movement locus. Note that the first video 201A and the second video 201B analyzed by the trajectory analysis unit 105 may be different from the first video 201A and the second video 201B captured to generate the synchronization shift information 203 described above.
 For example, the trajectory analysis unit 105 identifies the three-dimensional coordinates of the ball at the timing when a first frame image and a second frame image were captured, based on the camera parameters of the first mobile terminal 20A and the second mobile terminal 20B obtained in advance by calibration processing (in the form of perspective projection transformation matrices for converting world coordinates into camera images), the position of the ball in the first frame image, and the position of the ball in the second frame image that is synchronized with that first frame image after correction by the synchronization shift information 203. The trajectory analysis unit 105 identifies the movement locus of the ball in three-dimensional space by arranging the three-dimensional coordinates of the ball identified in this way in chronological order, and generates the analysis result information 204 including information indicating the identified movement locus. As a result, the movement locus of the ball in three-dimensional space can be analyzed with sufficient accuracy even with a camera mounted on a mobile terminal rather than a dedicated camera specialized for ball game video analysis.
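 One standard way to realize this computation is linear triangulation over the pair of perspective projection matrices; the sketch below uses OpenCV's triangulatePoints purely as an illustration, not as the embodiment's prescribed solver. Here P_a and P_b stand for the calibrated 3x4 projection matrices of the two terminals, and the pixel coordinates are the ball positions in a synchronization-corrected frame pair.

```python
import cv2
import numpy as np

def ball_3d_position(P_a: np.ndarray, P_b: np.ndarray,
                     xy_a: tuple, xy_b: tuple) -> np.ndarray:
    """Triangulate the ball's world coordinates from one synchronized
    frame pair. P_a, P_b: 3x4 projection matrices (world -> image) of
    terminals 20A and 20B; xy_a, xy_b: the ball's pixel coordinates.
    """
    pts_a = np.asarray(xy_a, dtype=np.float64).reshape(2, 1)
    pts_b = np.asarray(xy_b, dtype=np.float64).reshape(2, 1)
    homogeneous = cv2.triangulatePoints(P_a, P_b, pts_a, pts_b)
    return (homogeneous[:3] / homogeneous[3]).ravel()  # (X, Y, Z)

# Stacking the result for each synchronized frame pair in time code order
# yields the three-dimensional movement locus of the ball.
```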
 The result display unit 106 displays the analysis result information 204 on, for example, the display terminal 30. The result display unit 106 may also display the analysis result information 204 on the first mobile terminal 20A and/or the second mobile terminal 20B.
 <Preliminary work>
 FIG. 5 is a flowchart showing an example of the preliminary work. The preliminary work is work performed before the trajectory analysis unit 105 analyzes an actual action, for example, work performed to generate the synchronization shift information 203.
 The user installs the first mobile terminal 20A and the second mobile terminal 20B at positions from which the player and the ball can be captured from mutually different directions (S101).
 The user pairs the first mobile terminal 20A and the second mobile terminal 20B (S102). The pairing information may be transmitted to the ball game video analysis device 100.
 The user instructs the first mobile terminal 20A and the second mobile terminal 20B to start capturing for the preliminary work (S103). The first mobile terminal 20A and the second mobile terminal 20B thereby start capturing.
 The player hits a serve (S104). The first mobile terminal 20A and the second mobile terminal 20B can thereby capture the first video 201A and the second video 201B, respectively, in which the player hits the serve and the ball flies. Note that the user and the player may be the same person.
 The user instructs the first mobile terminal 20A and the second mobile terminal 20B to stop capturing for the preliminary work (S105). Upon receiving the instruction to stop capturing, the first mobile terminal 20A and the second mobile terminal 20B transmit the first video 201A and the second video 201B, respectively, to the ball game video analysis device 100. The ball game video analysis device 100 identifies the first reference frame image from the first video 201A and the second reference frame image from the second video 201B, and causes the display terminal 30 to display the reference frame confirmation UI 310.
 The user confirms the first reference frame image and the second reference frame image on the reference frame confirmation UI 310 displayed on the display terminal 30 (S106). To change a reference frame image, for example, the user instructs the ball game video analysis device 100 through the reference frame confirmation UI 310 to change the reference frame image. In this case, the ball game video analysis device 100 causes the display terminal 30 to display the reference frame changing UI 320.
 Note that the instructions to start and stop capturing in steps S103 and S105 above may be issued from either one of the mobile terminals. For example, when the instructions to start and stop capturing are issued from the first mobile terminal 20A, these instructions may be automatically transmitted to the second mobile terminal 20B via the ball game video analysis device 100.
 The user operates the reference frame changing UI 320 displayed on the display terminal 30 to change a reference frame image (S107). Details of steps S106 and S107 will be described later (see FIGS. 7 and 8).
 Through the above processing, the ball game video analysis device 100 can generate the synchronization shift information 203 based on the first reference frame image and the second reference frame image confirmed and corrected by the user.
 <Processing related to the preliminary work>
 FIG. 6 is a sequence chart showing an example of the processing related to the preliminary work. FIG. 6 corresponds to a detailed description of steps S106 to S107 in FIG. 5.
 The moving image receiving unit 101 receives the first video 201A from the first mobile terminal 20A (S201) and the second video 201B from the second mobile terminal 20B (S202).
 The image conversion unit 102 generates the first frame image group 202A from the first video 201A and the second frame image group 202B from the second video 201B (S203).
 The reference frame specifying unit 103 searches the first frame image group 202A for the first reference frame image, in which a specific sub-action constituting the action is captured (S204). For example, as described above, the reference frame specifying unit 103 searches, based on changes in the two-dimensional (X-direction, Y-direction) movement locus of the ball, for the first reference frame image capturing the moment the player's serving hand strikes the ball.
 Similarly, the reference frame specifying unit 103 searches the second frame image group 202B for the second reference frame image (S205). At this time, the reference frame specifying unit 103 may use, as the search range for the second reference frame image, the portion of the second frame image group 202B at or after the time code obtained by going back a predetermined time from the time code of the first reference frame image. This shortens the time required for the search compared with using the entire second frame image group 202B as the search range.
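 The restriction of the search range amounts to a simple filter over the time codes; a sketch, where look_back stands for the "predetermined time", which the embodiment leaves unspecified.

```python
def restrict_search_range(frames, t_ref, look_back):
    """frames: list of (time_code, image) from the second frame image group.
    Keep only frames whose time code is at or after t_ref - look_back,
    where t_ref is the time code of the first reference frame image.
    """
    return [f for f in frames if f[0] >= t_ref - look_back]
```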
 The reference frame specifying unit 103 writes the time code of the first reference frame image (the first reference time code) and the time code of the second reference frame image (the second reference time code) into the synchronization shift information 203 (S206).
 The reference frame changing unit 104 transmits the first reference frame image, the second reference frame image, and information for displaying the reference frame confirmation UI 310 to the display terminal 30 (S207).
 The display terminal 30 displays the first reference frame image, the second reference frame image, and the reference frame confirmation UI 310 (S208). Details of the reference frame confirmation UI 310 will be described later (see FIG. 8).
 When the user operates the reference frame confirmation UI 310 displayed in step S208 to indicate that the reference frame images are not to be changed, the display terminal 30 transmits instruction information indicating no change to the ball game video analysis device 100 (S209).
 Upon receiving the instruction information indicating no change, the reference frame changing unit 104 finalizes the content of the synchronization shift information 203 written in step S206 (S210). The ball game video analysis device 100 then ends this processing.
 FIG. 7 is a sequence chart showing an example of the processing related to the preliminary work when an operation to change a reference frame image is performed in FIG. 6.
 When the user operates the reference frame confirmation UI 310 displayed in step S208 to change a reference frame image, the display terminal 30 transmits instruction information indicating a change to the ball game video analysis device 100 (S221).
 Upon receiving the instruction information indicating a change, the reference frame changing unit 104 performs the following processing. That is, the reference frame changing unit 104 extracts, from the first frame image group 202A, a plurality of first frame images captured before and after the first reference frame image found in step S204, and takes the extracted first frame images as candidates for the first reference frame image.
 Similarly, the reference frame changing unit 104 extracts, from the second frame image group 202B, a plurality of second frame images captured before and after the second reference frame image found in step S205, and takes the extracted second frame images as candidates for the second reference frame image. The reference frame changing unit 104 then transmits the candidates for the first reference frame image and their time codes, the candidates for the second reference frame image and their time codes, and information for displaying the reference frame changing UI 320 to the display terminal 30 (S222).
 The display terminal 30 displays the reference frame changing UI 320 (S223). Details of the reference frame changing UI 320 will be described later (see FIG. 9).
 The user operates the reference frame changing UI 320 displayed in step S223 to select the changed first reference frame image from among the candidates for the first reference frame image, and the changed second reference frame image from among the candidates for the second reference frame image (S224).
 The display terminal 30 transmits the time code of the changed first reference frame image selected in step S224 (hereinafter, the "changed first reference time code") and/or the time code of the changed second reference frame image (hereinafter, the "changed second reference time code") to the ball game video analysis device 100 (S225).
 The reference frame changing unit 104 overwrites the synchronization shift information 203 with the changed first reference time code and/or the changed second reference time code received from the display terminal 30, and finalizes the content of the synchronization shift information 203 (S226). The content of the synchronization shift information 203 automatically generated by the reference frame specifying unit 103 is thereby overwritten with the time codes of the reference frame images changed by the user through the reference frame changing unit 104. The ball game video analysis device 100 then ends this processing.
 <Details of the reference frame confirmation UI>
 FIG. 8 is a diagram showing a detailed example of the reference frame confirmation UI.
 As shown in FIG. 8, the reference frame confirmation UI 310 includes a first image display area 210A, a second image display area 210B, an OK button 312, a change button 313, and a redo button 314. Note that the reference frame confirmation UI 310 may omit either the change button 313 or the redo button 314.
 The display terminal 30 displays, in the first image display area 210A, the first reference frame image transmitted from the ball game video analysis device 100 in step S207 of FIG. 6, and, in the second image display area 210B, the second reference frame image transmitted from the ball game video analysis device 100 in step S207 of FIG. 6. As shown in FIG. 8, the first image display area 210A and the second image display area 210B may be arranged side by side. However, the arrangement of the first image display area 210A and the second image display area 210B is not limited to a side-by-side arrangement. For example, the first image display area 210A and the second image display area 210B may be displayed as separate windows.
 When the OK button 312 is pressed, the display terminal 30 transmits the instruction information indicating no change in step S209 of FIG. 6 to the ball game video analysis device 100. The user may press the OK button 312 when both the first reference frame image displayed in the first image display area 210A and the second reference frame image displayed in the second image display area 210B are images in which the sub-action is correctly captured. As shown in step S210 of FIG. 6, the content of the synchronization shift information 203 is thereby finalized.
 When the change button 313 is pressed, the display terminal 30 transmits the instruction information indicating a change in step S221 of FIG. 7 to the ball game video analysis device 100. The user may press the change button 313 when at least one of the first reference frame image displayed in the first image display area 210A and the second reference frame image displayed in the second image display area 210B is not an image in which the sub-action is correctly captured. As shown in step S223 of FIG. 7, the reference frame changing UI 320 is thereby displayed on the display terminal 30.
 When the redo button 314 is pressed, the display terminal 30 transmits instruction information indicating that the preliminary work is to be redone to the ball game video analysis device 100. Upon receiving this instruction, the ball game video analysis device 100, the first mobile terminal 20A, and the second mobile terminal 20B redo the preliminary work from step S103 of FIG. 5. For example, the user may press the redo button 314 when the serve in step S104 fails, or when the first reference frame image and/or the second reference frame image clearly shows a ball other than the one served by the player (for example, a ball rolling on the floor or a ball another player is practicing with).
 Note that the first reference frame image may include the three first frame images immediately preceding the first reference frame image found in step S204 and the three first frame images immediately following it. In this case, there are seven first reference frame images in total, including the first reference frame image found in step S204. Similarly, the second reference frame image may include the three second frame images immediately preceding the second reference frame image found in step S205 and the three second frame images immediately following it. In this case, there are seven second reference frame images in total, including the second reference frame image found in step S205.
 In this case, the display terminal 30 may display the seven first reference frame images and the seven second reference frame images continuously and in synchronization in the first image display area 210A and the second image display area 210B, respectively.
 Note that seven images is merely an example, and any number of first and second reference frame images may be used. Furthermore, the plurality of first and second reference frame images need not be consecutive frame by frame, and may be sampled at predetermined frame intervals.
 This allows the user not only to confirm the synchronization between a single first reference frame image and a single second reference frame image, but also to easily confirm the synchronization between a plurality of first reference frame images and second reference frame images.
 Although the example described shows the reference frame confirmation UI 310 of FIG. 8 displayed on the display terminal 30, the reference frame confirmation UI 310 may instead be displayed on either the first mobile terminal 20A or the second mobile terminal 20B. Alternatively, a reference frame confirmation UI 310 for the first reference frame image may be displayed on the first mobile terminal 20A, and a reference frame confirmation UI 310 for the second reference frame image may be displayed on the second mobile terminal 20B.
 <Details of the reference frame changing UI>
 FIG. 9 is a diagram showing a detailed example of the reference frame changing UI.
 As shown in FIG. 9, the reference frame changing UI 320 includes a first image display area 210A, a second image display area 210B, an OK button 322, and a return button 323. The reference frame changing UI 320 further includes, corresponding to the first image display area 210A, a 1-frame back button 324A, an N-frame back button 325A, a 1-frame forward button 326A, and an N-frame forward button 327A. Here, N is a predetermined integer of 2 or more. The reference frame changing UI 320 further includes, corresponding to the second image display area 210B, a 1-frame back button 324B, an N-frame back button 325B, a 1-frame forward button 326B, and an N-frame forward button 327B.
 When the return button 323 is pressed, the display terminal 30 returns to the display of the reference frame confirmation UI 310 shown in FIG. 8.
 When the 1-frame back button 324A is pressed, the display terminal 30 displays, in the first image display area 210A, the candidate for the first reference frame image one frame before the candidate currently displayed in the first image display area 210A. The same applies to the 1-frame back button 324B corresponding to the second image display area 210B.
 When the N-frame back button 325A is pressed, the display terminal 30 displays, in the first image display area 210A, the candidate for the first reference frame image N frames before the candidate currently displayed in the first image display area 210A. The same applies to the N-frame back button 325B corresponding to the second image display area 210B.
 When the 1-frame forward button 326A is pressed, the display terminal 30 displays, in the first image display area 210A, the candidate for the first reference frame image one frame after the candidate currently displayed in the first image display area 210A. The same applies to the 1-frame forward button 326B corresponding to the second image display area 210B.
 When the N-frame forward button 327A is pressed, the display terminal 30 displays, in the first image display area 210A, the candidate for the first reference frame image N frames after the candidate currently displayed in the first image display area 210A. The same applies to the N-frame forward button 327B corresponding to the second image display area 210B.
 The user may operate the buttons 324A to 327A and 324B to 327B described above so that images in which the sub-action is correctly captured are displayed in both the first image display area 210A and the second image display area 210B.
 When the OK button 322 is pressed, the display terminal 30 transmits the time code of the candidate for the first reference frame image displayed in the first image display area 210A (the changed first reference time code) and the time code of the candidate for the second reference frame image displayed in the second image display area 210B (the changed second reference time code) to the ball game video analysis device 100. This processing corresponds to step S225 of FIG. 7. The user may press the OK button 322 after operating the UI so that images in which the sub-action is correctly captured are displayed in both the first image display area 210A and the second image display area 210B.
 <Analysis result screen>
 FIG. 10 is a diagram showing a display example of the analysis result.
 The first mobile terminal 20A and the second mobile terminal 20B capture the player's serve practice and generate and transmit the first video 201A and the second video 201B, respectively. Note that this first video 201A and second video 201B may be different from the first video 201A and second video 201B captured for the preliminary work.
 The trajectory analysis unit 105 corrects the frame image synchronization shift between the first video 201A and the second video 201B capturing the serve practice based on the synchronization shift information 203, then analyzes the movement locus of the ball and generates the analysis result information 204. For example, when the second reference time code in the synchronization shift information 203 is later than the first reference time code, the trajectory analysis unit 105 applies a correction that delays each frame image of the first video 201A by the difference between the second reference time code and the first reference time code, then synchronizes each frame image of the first video 201A with each frame image of the second video 201B and analyzes the movement locus of the ball. The delay in time code assignment by the second mobile terminal 20B is thereby corrected, and the movement locus of the ball can be calculated with sufficient accuracy.
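 The correction can be pictured as a constant offset applied to one video's time codes before frames are paired; a minimal sketch, assuming millisecond time codes and the synchronization time difference (first reference time code minus second reference time code) of the description above. The tolerance parameter is an assumption for the example.

```python
def pair_frames(frames_a, frames_b, sync_time_difference, tolerance=5):
    """frames_a, frames_b: lists of (time_code, image). Shift the first
    video's time codes by the measured difference, then pair each frame
    of video A with the frame of video B closest in corrected time.
    tolerance (ms) is an assumed limit for accepting a pair.
    """
    pairs = []
    for t_a, img_a in frames_a:
        t_corrected = t_a - sync_time_difference  # align A onto B's clock
        t_b, img_b = min(frames_b, key=lambda f: abs(f[0] - t_corrected))
        if abs(t_b - t_corrected) <= tolerance:
            pairs.append((img_a, img_b))
    return pairs
```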
 The result display unit 106 causes the display terminal 30 to display an analysis result screen 330 based on the analysis result information 204. For example, as shown in FIG. 10, the analysis result screen 330 may superimpose the movement locus 331 of the ball indicated by the analysis result information 204 on the playback of the first video 201A or the second video 201B. In addition, the analysis result screen 330 may display the height and speed of the ball at the moment the struck ball leaves the hand, and the height and speed of the ball at the moment it passes over the net.
 This allows the player to confirm in detail, from the analysis result screen 330, the movement locus of the ball produced by the serve in addition to the body movement during the serve.
 In the above description, as the preliminary work, a video for synchronization adjustment is captured in advance and the synchronization is adjusted and confirmed, after which a video for analysis processing is captured and a video with the ball's locus and the like superimposed is displayed as the analysis result. However, the present embodiment may also be configured, without distinguishing between a video for synchronization adjustment and a video for analysis processing, to perform the synchronization adjustment and the analysis processing successively on, for example, the first video captured. Furthermore, the detection of the synchronization shift may be automated, and the synchronization adjustment may be executed at an arbitrary timing while analysis processing is performed on successively captured videos.
 <Modification>
 In the above description, the first time code assigned by the first mobile terminal 20A to the first frame images and the second time code assigned by the second mobile terminal 20B to the second frame images are used as the information indicating the frame synchronization shift. However, the information indicating the frame synchronization shift is not limited to time codes.
 For example, the frame number of a first frame image in the first video 201A generated by the first mobile terminal 20A (hereinafter, the "first frame number") and the frame number of a second frame image in the second video 201B generated by the second mobile terminal 20B (hereinafter, the "second frame number") may be used as the information indicating the frame synchronization shift. In this case, the reference frame specifying unit 103 may write the first frame number of the first reference frame image (hereinafter, the "first reference frame number") and the second frame number of the second reference frame image (hereinafter, the "second reference frame number") into the synchronization shift information 203. The reference frame changing unit 104 may then overwrite the synchronization shift information 203 with the frame number of the changed first reference frame image selected through the reference frame changing UI 320 (hereinafter, the "changed first reference frame number") and/or the frame number of the changed second reference frame image (hereinafter, the "changed second reference frame number").
 (Embodiment 2)
 In the second embodiment, a ball game video analysis system will be described that identifies a target video and a comparison video from among a plurality of videos capturing the actions of a person (player) playing a ball game, making it possible to grasp the proficiency of the action, points for improving the form, and so on. In the second embodiment, components common to the first embodiment are given the same reference numbers, and their description may be omitted.
 In the second embodiment, a case where the action practiced by the player is volleyball serve practice is described as an example. However, the second embodiment is not limited to this example and is applicable to practice of various actions, such as practicing serves or volleys in tennis, serves or smashes in badminton, heading in soccer, batting in baseball, and swings in golf.
 <Configuration of the ball game video analysis system>
 FIG. 11 is a diagram showing a configuration example of the ball game video analysis system 10 according to the second embodiment.
 The ball game video analysis system 10 according to the second embodiment is the same as the ball game video analysis system 10 according to the first embodiment shown in FIG. 1. However, as shown in FIG. 11, (the camera of) the first mobile terminal 20A may be installed near the net on one side of the court, facing the serving player, and (the camera of) the second mobile terminal 20B may be installed near the net on the other side of the court, facing the serving player.
 The ball game video analysis system 10 described in the second embodiment may also be configured so that synchronization shifts and the like are corrected by the method described in the first embodiment.
 <Configuration of the ball game video analysis device>
 FIG. 12 is a block diagram showing a configuration example of the ball game video analysis device 100 according to the second embodiment.
 As shown in FIG. 12, the ball game video analysis device 100 includes a moving image receiving unit 101, a trajectory analysis unit 105, a posture analysis unit 131, a sub-action analysis unit 132, a record management unit 133, and a display control unit 134.
 The moving image receiving unit 101 receives the first video 201A from the first mobile terminal 20A and the second video 201B from the second mobile terminal 20B. The first video 201A and the second video 201B capture, for example, a scene in which the player serves.
 The trajectory analysis unit 105 identifies the movement locus of the ball from the first video 201A and the second video 201B, for example by using the technique disclosed in Patent Document 1, and generates trajectory information 220. The trajectory information 220 includes a plurality of pieces of information each indicating the correspondence between a time code at a certain point in the video and the three-dimensional position of the ball at that time code.
 The posture analysis unit 131 generates, based on the first video 201A and the second video 201B, posture model information 221 indicating the posture of the player during the period in which the action was performed. For example, the posture analysis unit 131 identifies, from the first video 201A and the second video 201B, the three-dimensional positions of predetermined joints of the player (hereinafter, "joint positions") and the movements of those joint positions by using a known joint estimation technique. The posture analysis unit 131 generates information indicating the identified joint positions and their movements as the posture model information 221. Details of the posture model information 221 will be described later (see FIG. 14).
 The sub-action analysis unit 132 analyzes, based on the trajectory information 220 and/or the posture model information 221, each sub-action constituting the action captured in the first video 201A and the second video 201B, and generates sub-action information 222 indicating the occurrence timing of each analyzed sub-action. For example, a jumping serve, one example of a volleyball action, consists of a series of sub-actions such as the "run-up", the "toss", the "jump", and the "hit", which means that the serving hand has struck the ball. Details of the sub-actions will be described later (see FIG. 15).
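 As an illustration only, one conceivable heuristic derives the timing of the "hit" from the trajectory information 220 as the moment the ball's horizontal speed first jumps. The threshold is an assumed tuning parameter; the embodiment leaves the concrete detection method open.

```python
def detect_hit_timing(trajectory, speed_threshold):
    """trajectory: list of (time_code, (x, y, z)) in chronological order,
    as held in the trajectory information 220. Returns the time code at
    which the horizontal speed first exceeds speed_threshold (the 'hit').
    """
    for (t0, p0), (t1, p1) in zip(trajectory, trajectory[1:]):
        dt = (t1 - t0) or 1  # guard against a zero time step
        horizontal_speed = ((p1[0] - p0[0]) ** 2
                            + (p1[1] - p0[1]) ** 2) ** 0.5 / dt
        if horizontal_speed >= speed_threshold:
            return t1
    return None
```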
 The record management unit 133 manages a plurality of pieces of record information 200. The record information 200 is information indicating the correspondence among the shooting date and time of a first video 201A and a second video 201B, the first video 201A and the second video 201B themselves, and the trajectory information 220, posture model information 221, and sub-action information 222 generated from them. In other words, the record information 200 can be said to be information indicating the correspondence between a video capturing an action performed by the player at a certain date and time and the trajectory information 220, posture model information 221, and sub-action information 222 generated from that video.
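 The associations maintained by the record management unit 133 can be pictured as a simple record type. The field names below are assumptions for the example; the embodiment specifies what is associated, not how it is stored.

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class RecordInformation:
    """One entry of the record information 200: a shooting date and time,
    the pair of videos, and the analysis data generated from them."""
    shot_at: datetime
    first_video_path: str   # first video 201A
    second_video_path: str  # second video 201B
    trajectory: list        # trajectory information 220: (time_code, (x, y, z))
    posture_model: dict     # posture model information 221: joint positions over time
    sub_actions: dict       # sub-action information 222: name -> time code
```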
 The display control unit 134 causes the display terminal 30 to display a UI for selecting, based on the trajectory information 220, a piece of record information 200 from among the plurality of pieces of record information 200 managed by the record management unit 133. The display control unit 134 also causes the display terminal 30 to display the first video 201A or the second video 201B associated in the record information 200 selected through that UI.
 When, for example, first record information 200A (see FIG. 13) and second record information 200B (see FIG. 13) are selected through the UI displayed on the display terminal 30 from among the plurality of pieces of record information 200 managed by the record management unit 133, the display control unit 134 may perform the following processing. That is, the display control unit 134 may cause the display terminal 30 to display a UI 400A (hereinafter, the "video comparison UI") that displays, side by side, the first video 201A or second video 201B associated in the first record information 200A and the first video 201A or second video 201B associated in the second record information 200B. Details of the video comparison UI 400A will be described later (see FIGS. 16 and 17).
 Furthermore, when, for example, the first record information 200A and the second record information 200B are selected through the video comparison UI 400A displayed on the display terminal 30 from among the plurality of pieces of record information 200 managed by the record management unit 133, the display control unit 134 may perform the following processing (see FIG. 14). That is, the display control unit 134 may cause the display terminal 30 to display a UI 400C (hereinafter, the "posture comparison UI") that displays, side by side or superimposed, a first posture model 231A generated from the posture model information 221 associated in the first record information 200A and a second posture model 231B generated from the posture model information 221 associated in the second record information 200B. Details of the posture comparison UI 400C will be described later (see FIGS. 18 to 21).
 The display control unit 134 may also cause the display terminal 30 to display, in the posture comparison UI 400C together with the posture models 231A and 231B, a movement locus 241A of the moving body (ball) generated from the trajectory information 220 associated in the first record information 200A and a movement locus 241B of the moving body (ball) generated from the trajectory information 220 associated in the second record information 200B (see FIGS. 18 to 21).
The display control unit 134 may also cause the display terminal 30 to display a sub-action selection UI (447) with which one of a plurality of sub-actions can be selected. The display control unit 134 may then cause the display terminal 30 to display, in the posture comparison UI (400D), the posture models 231A and 231B at the occurrence timing of the sub-action selected through the sub-action selection UI (447) (see FIG. 20).
<Display terminal configuration>
FIG. 13 is a block diagram showing a configuration example of the display terminal 30 according to the second embodiment.
The display terminal 30 includes an information transmission/reception unit 151 and a UI control unit 152.
The information transmission/reception unit 151 exchanges information with the ball game video analysis device 100 via the communication network N. For example, the information transmission/reception unit 151 receives a plurality of pieces of record information 200 (for example, the first record information 200A and the second record information 200B) from the ball game video analysis device 100.
The UI control unit 152, in cooperation with the display control unit 134 of the ball game video analysis device 100, displays the video comparison UI 400A, the posture comparison UI 400C, and the like on a display, which is an example of the output device 1002 (see FIG. 23). The UI control unit 152 also detects user input to these UIs through a touch panel, which is an example of the input device 1001 (see FIG. 23). When the UI control unit 152 detects an input to a UI, it transmits the input content to the ball game video analysis device 100 through the information transmission/reception unit 151.
<Details of the posture model>
FIG. 14 is a diagram for explaining an example of the posture model.
The posture model information 221 includes information indicating each joint position of the player and the movement of each joint position.
For example, the display control unit 134 or the UI control unit 152 refers to the posture model information 221 associated with the first record information 200A and identifies each joint position (the white and black circles in FIG. 14) at the timing of the sub-action "jump" within the action period. Similarly, the display control unit 134 or the UI control unit 152 refers to the posture model information 221 associated with the second record information 200B and identifies each joint position at the timing of the sub-action "jump" within the action period. The display control unit 134 then generates the player posture models 231A and 231B shown in FIG. 14 by connecting the identified joint positions with lines based on the structure of the human body. Since the posture models 231A and 231B are three-dimensional models, the display control unit 134 or the UI control unit 152 can display them from any viewpoint in 3D space through operations such as 3D rotation.
Further, as shown in FIG. 14, the display control unit 134 or the UI control unit 152 can superimpose the first posture model 231A and the second posture model 231B at the timing of the sub-action "jump", using the neck joint positions 232A and 232B as the reference.
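The skeleton construction and the neck-anchored superimposition amount to a few lines of geometry. The disclosure does not specify concrete data structures, so the following Python sketch is an illustration only: the joint names and the bone list are assumptions, with joints given as named 3D points.

import numpy as np

# Hypothetical joint names and bone list; the format of the posture model
# information 221 is not specified at this level of detail.
BONES = [("neck", "l_shoulder"), ("neck", "r_shoulder"), ("neck", "pelvis"),
         ("pelvis", "l_knee"), ("pelvis", "r_knee"),
         ("l_knee", "l_ankle"), ("r_knee", "r_ankle")]

def build_skeleton(joints):
    """Connect identified joint positions with line segments based on the
    structure of the human body."""
    return [(joints[a], joints[b]) for a, b in BONES if a in joints and b in joints]

def align_to_neck(joints_ref, joints_other):
    """Rigidly translate the second model so its neck joint coincides with the
    reference model's neck joint, as in the superimposed display of 231A/231B."""
    offset = np.asarray(joints_ref["neck"]) - np.asarray(joints_other["neck"])
    return {name: np.asarray(pos) + offset for name, pos in joints_other.items()}

When models are overlaid in this way, the same offset would also be applied to the associated ball positions and movement trajectory, as described later for the posture comparison UI.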
The first posture model 231A and the second posture model 231B may be generated from moving images of serves performed by the same player at different dates and times. In this case, from the superimposed first posture model 231A and second posture model 231B, the user can visually recognize the difference in posture during the sub-action "jump" between serves performed by the same player on different occasions. The user can therefore easily confirm, for example, how the player's serve has improved.
The first posture model 231A and the second posture model 231B may also be generated from moving images of serves performed by different players. In this case, from the superimposed first posture model 231A and second posture model 231B, the user can visually recognize the difference in posture during the sub-action "jump" between the two players. The user can therefore easily confirm, for example, the difference in serving posture between a model player and a player who is practicing.
In addition, by referring to the posture model information 221, the timing at which the player jumps and the highest point the player reaches during the jump can be identified. For example, the sub-action analysis unit 132 may detect the moment at which the joint positions of both of the player's ankles are separated from the ground by a certain distance or more as the timing at which the player jumps. Likewise, the sub-action analysis unit 132 may detect the moment at which the player's neck joint position 232A or 232B is at its highest point as the timing at which the player's jump reaches its peak.
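These two detection rules reduce to simple per-frame checks. A minimal sketch, under assumed conventions (the vertical axis is z, units are metres, and the clearance threshold is illustrative rather than a disclosed value):

def detect_jump_start(frames, ground_z=0.0, min_clearance=0.15):
    """Return the index of the first frame in which both ankle joints are at
    least min_clearance above the ground."""
    for i, joints in enumerate(frames):
        if (joints["l_ankle"][2] - ground_z >= min_clearance and
                joints["r_ankle"][2] - ground_z >= min_clearance):
            return i
    return None

def detect_jump_peak(frames):
    """Return the index of the frame in which the neck joint is highest."""
    return max(range(len(frames)), key=lambda i: frames[i]["neck"][2])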
Note that the posture models 231A and 231B are not limited to models in which the joint positions are connected by lines as shown in FIG. 14. For example, the posture models 231A and 231B may be texture-mapped three-dimensional CG (Computer Graphics) models.
<Analysis of sub-actions>
FIG. 15 is a diagram for explaining the sub-actions constituting a jumping serve.
As shown in FIG. 15, a jumping serve, which is one example of a volleyball action, consists of a series of sub-actions: "run-up", "toss", "jump", and "hit", the last meaning that the serving hand strikes the ball.
The sub-action analysis unit 132 identifies the frame image at the timing at which each sub-action occurs, based on the ball trajectory information 220 and/or the player's posture model information 221.
For example, the sub-action analysis unit 132 refers to the ball trajectory information 220 and identifies the frame image at the timing at which the ball's movement changes from a stationary state to an upward movement. The sub-action analysis unit 132 then writes the timing at which that frame image was captured (for example, time information) into the sub-action information 222 as the occurrence timing of the sub-action "toss".
For example, the sub-action analysis unit 132 refers to the player's posture model information 221 and identifies the frame image at the timing at which the joint positions of both of the player's ankles are separated from the ground by a certain distance or more. The sub-action analysis unit 132 then writes the timing at which that frame image was captured into the sub-action information 222 as the occurrence timing of the sub-action "jump".
For example, the sub-action analysis unit 132 refers to the ball trajectory information 220 and identifies the frame image at the timing at which the ball's movement toward the net changes by a predetermined threshold or more. The sub-action analysis unit 132 then writes the timing at which that frame image was captured into the sub-action information 222 as the occurrence timing of the sub-action "hit".
As a result, information indicating the timing at which each sub-action constituting the action occurred (for example, time information) is written into the sub-action information 222. By referring to the sub-action information 222, the frame image corresponding to a desired sub-action can therefore be identified within a moving image. Likewise, by referring to the sub-action information 222, the posture model at the time of a desired sub-action can be identified within the posture model information 221.
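The "toss" and "hit" rules described above can likewise be expressed as tests on consecutive samples of the trajectory information 220. A sketch assuming per-frame 3D ball positions, the net direction along +y, the vertical axis along z, and illustrative thresholds (none of these conventions or values are taken from the disclosure):

def detect_toss(ball_positions, still_eps=0.01, up_eps=0.05):
    """Return the first frame index at which the ball changes from a (nearly)
    stationary state to an upward movement."""
    for i in range(1, len(ball_positions) - 1):
        was_still = abs(ball_positions[i][2] - ball_positions[i - 1][2]) < still_eps
        rises_next = ball_positions[i + 1][2] - ball_positions[i][2] > up_eps
        if was_still and rises_next:
            return i
    return None

def detect_hit(ball_positions, toward_net_threshold=0.3):
    """Return the first frame index at which the per-frame displacement toward
    the net meets or exceeds the threshold."""
    for i in range(1, len(ball_positions)):
        if ball_positions[i][1] - ball_positions[i - 1][1] >= toward_net_threshold:
            return i
    return None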
<Details of the video comparison UI>
FIG. 16 is a diagram showing a display example of the video comparison UI 400A. The video comparison UI 400A is one example of a comparison UI, and is a UI for comparing moving images related to different pieces of record information 200.
As shown in FIG. 16, the video comparison UI 400A includes selection lists 401A and 401B, display areas 402A and 402B, an add button 403, a seek bar 410, a play button 411, a pause button 412, a slow play button 413, a repeat button 414, an arrange button 415, an enlarge button 416, a camera switching button 417, a video inversion button 418, and a 3D switching button 419.
The selection lists 401A and 401B display, in list form, information derived from the trajectory information 220 of each piece of record information 200. The information derived from the trajectory information 220 includes, for example, the height, speed, and angle of the ball as it passes over the net during a serve. One row of the selection lists 401A and 401B may correspond to one piece of record information 200.
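How the height, speed, and angle at the net are derived from the trajectory information 220 is not spelled out; one plausible reading is to interpolate between the pair of trajectory samples that straddles the net plane. A sketch under assumed axis conventions (net plane at y = net_y, vertical axis z) and an assumed frame rate:

import numpy as np

def net_crossing_metrics(traj, net_y=9.0, fps=60.0):
    """Estimate the ball's height, speed and descent angle at the moment it
    crosses the net plane, from per-frame 3D positions in metres."""
    for i in range(1, len(traj)):
        p0, p1 = np.asarray(traj[i - 1], float), np.asarray(traj[i], float)
        if p0[1] == p1[1]:
            continue
        if (p0[1] - net_y) * (p1[1] - net_y) <= 0:   # this step straddles the net
            t = (net_y - p0[1]) / (p1[1] - p0[1])    # linear interpolation factor
            pos = p0 + t * (p1 - p0)
            vel = (p1 - p0) * fps                    # metres per second
            angle = np.degrees(np.arctan2(-vel[2], abs(vel[1])))
            return {"height": float(pos[2]),
                    "speed": float(np.linalg.norm(vel)),
                    "angle": float(angle)}
    return None                                      # ball never reached the net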
The record information 200 may also include information indicating an evaluation of the action (for example, the serve). In this case, the selection lists 401A and 401B may also display the evaluation of the action, as shown in FIG. 16. The evaluation of the action may be entered manually by the user. Alternatively, the evaluation of the action may be determined automatically based on the height, speed, and angle of the ball as it passes over the net. In this case, the evaluation of the action may be determined based on whether the angle of the ball when passing over the net is equal to or greater than a predetermined threshold. The threshold referred to in evaluating the action may be set arbitrarily by the user.
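As one concrete reading of this automatic evaluation, the net-crossing angle can simply be compared against the user-set threshold. A minimal sketch; the default value and the rating labels are assumptions:

def evaluate_serve(net_angle_deg, threshold_deg=10.0):
    """Rate a serve from the ball's downward angle when crossing the net.
    The threshold is user-configurable; 10 degrees is illustrative only."""
    return "good" if net_angle_deg >= threshold_deg else "poor"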
The display area 402A displays the first moving image 201A or the second moving image 201B related to the first record information 200A selected from the selection list 401A (for example, the dotted row in FIG. 16). The display area 402B displays the first moving image 201A or the second moving image 201B related to the second record information 200B selected from the selection list 401B (for example, the dotted row in FIG. 16).
As shown in FIG. 16, in the video comparison UI 400A, the set of the selection list 401A and the display area 402A (hereinafter the "first set") and the set of the selection list 401B and the display area 402B (hereinafter the "second set") may be displayed side by side. The user can thus select the rows (pieces of record information 200) to be compared from the selection list 401A and the selection list 401B, and watch the moving images related to the selected rows, displayed in the display areas 402A and 402B respectively, while comparing them.
The selection list 401A included in the first set may be updated at any time with record information 200 generated from the first moving image 201A and the second moving image 201B captured that day with the first mobile terminal 20A and the second mobile terminal 20B.
When the add button 403 is pressed, the display control unit 134 adds a new set, consisting of a new selection list and a new display area, to the video comparison UI 400A. In this case, the UI control unit 152 may ask the user for the date of the record information 200 to be displayed in the new selection list. The UI control unit 152, in cooperation with the display control unit 134, may then acquire from the record management unit 133 one or more pieces of record information 200 matching the date entered by the user, and display the selection list of the newly added set based on the acquired record information 200.
The seek bar 410 is a bar indicating the current playback time of the moving images displayed in the display areas 402A and 402B. When the user selects a position on the seek bar 410, the UI control unit 152 may play the moving images from the time corresponding to the selected position.
When the play button 411 is pressed, the UI control unit 152 starts playing the moving images displayed in the display areas 402A and 402B.
When the pause button 412 is pressed, the UI control unit 152 pauses playback of the moving images displayed in the display areas 402A and 402B.
When the slow play button 413 is pressed, the UI control unit 152 plays the moving images displayed in the display areas 402A and 402B in slow motion (for example, at half speed).
When the repeat button 414 is pressed, the UI control unit 152 plays the moving images displayed in the display areas 402A and 402B repeatedly. The slow play button 413 and the repeat button 414 may be used together.
When a plurality of sets in the video comparison UI 400A are selected and the arrange button 415 is pressed, the UI control unit 152 switches to the sub-action video comparison UI 400B (see FIG. 17), a UI for comparing the sub-actions of the selected sets. Details of the sub-action video comparison UI 400B are described later (see FIG. 17).
When at least one set in the video comparison UI 400A is selected and the enlarge button 416 is pressed, the UI control unit 152 enlarges the portion of the moving image displayed in the display area of the selected set that includes the player. The range to be enlarged may be specified arbitrarily by the user. The magnification of the enlarged display may also be specified arbitrarily by the user.
When at least one set in the video comparison UI 400A is selected and the camera switching button 417 is pressed, the UI control unit 152 switches the moving image displayed in the display area of the selected set (for example, the first moving image 201A) to the moving image captured by the other mobile terminal (for example, the second moving image 201B). This makes it possible, for example, to align the moving images displayed in the display areas 402A and 402B so that both were captured from the same direction.
When at least one set in the video comparison UI 400A is selected and the video inversion button 418 is pressed, the UI control unit 152 flips the moving image displayed in the display area of the selected set horizontally. This makes it easy to compare, for example, the action of a right-handed player with that of a left-handed player.
When at least one set in the video comparison UI 400A is selected and the 3D switching button 419 is pressed, the UI control unit 152 switches to the posture comparison UI 400C (see FIG. 18), a UI for comparing the posture models of the selected sets. Details of the posture comparison UI 400C are described later (see FIG. 18).
FIG. 17 is a diagram showing a display example of the sub-action video comparison UI 400B.
For example, when the first set and the second set are selected in the video comparison UI 400A shown in FIG. 16 and the arrange button 415 is pressed, the UI control unit 152 displays the sub-action video comparison UI 400B shown in FIG. 17.
The sub-action video comparison UI 400B includes display areas 421A and 421B, and a plurality of sub-action selection buttons 420 expanded from the arrange button 415. The sub-action selection buttons 420 are one example of the sub-action selection UI.
The display areas 421A and 421B are arranged side by side as shown in FIG. 17. The UI control unit 152 enlarges the portion of the moving image of the first set that includes the player and displays it in the display area 421A. Similarly, the UI control unit 152 enlarges the portion of the moving image of the second set that includes the player and displays it in the display area 421B. At this time, the UI control unit 152 may adjust the display of the moving images in the display areas 421A and 421B so that the player's neck joint position is located at the center of each area.
The plurality of sub-action selection buttons 420 are buttons for selecting a sub-action, and correspond, for example, to the sub-actions "toss", "jump", and "hit".
For example, when the user selects the sub-action selection button 420 for "hit", the UI control unit 152 performs the following processing. Namely, the UI control unit 152 refers to the sub-action information 222 of the first record information 200A and identifies, in the moving image displayed for the first set, the frame image at the timing at which the sub-action "hit" was captured. The UI control unit 152 then displays the identified frame image in the display area 421A. Similarly, the UI control unit 152 refers to the sub-action information 222 of the second record information 200B, identifies, in the moving image displayed for the second set, the frame image at the timing at which the sub-action "hit" was captured, and displays it in the display area 421B. When the play button 411 is pressed in this state, the UI control unit 152 may start playback of the moving images from the frame image at the timing of the identified sub-action "hit". This allows the user to easily compare, for example, the player's appearance at the moment of the "hit" in serves performed at different dates and times. Note that, as long as the frame images of the sub-action "hit" are time-aligned, playback may also include "run-up", "toss", and so on.
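Mapping a recorded occurrence timing back to a frame of a moving image is a small calculation. A sketch, assuming (this format is not disclosed) that the sub-action information 222 stores absolute timestamps in seconds and that each moving image knows its start time and frame rate:

def frame_index_of(subaction_info, name, video_start_sec, fps):
    """Convert the occurrence time of a sub-action (e.g. "hit") recorded in the
    sub-action information into a frame index of the moving image."""
    occurrence_sec = subaction_info[name]
    return round((occurrence_sec - video_start_sec) * fps)

# Usage: seek both display areas to the "hit" frame before starting playback.
# idx_a = frame_index_of(subactions_a, "hit", start_a, fps_a)
# idx_b = frame_index_of(subactions_b, "hit", start_b, fps_b)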
<Details of the posture comparison UI>
FIG. 18 is a diagram showing an example of the posture comparison UI when the posture models are displayed side by side. FIG. 19 is a diagram showing an example of the posture comparison UI when the posture models are displayed superimposed. The posture comparison UI 400C is one example of a comparison UI, and is a UI for comparing the posture model information 221 of different pieces of record information 200.
For example, when the first set and the second set are selected in the video comparison UI 400A shown in FIG. 16 and the 3D switching button 419 is pressed, the UI control unit 152 displays the posture comparison UI 400C shown in FIG. 18 or FIG. 19.
The posture comparison UI 400C includes a display area 434, the seek bar 410, the play button 411, the pause button 412, the slow play button 413, the repeat button 414, a video switching button 441, a trajectory display switching button 442, a ball display switching button 443, a player display switching button 444, a superimposition/separation switching button 445, and an arrange button 446.
As shown in FIGS. 18 and 19, the UI control unit 152 displays, in the display area 434, the posture model 231A, the ball 240A, and the ball movement trajectory 241A corresponding to the first set, together with the posture model 231B, the ball 240B, and the ball movement trajectory 241B corresponding to the second set. Hereinafter, the posture model 231A, the ball 240A, and the ball movement trajectory 241A corresponding to the first set are collectively referred to as the first posture model and the like. Similarly, the posture model 231B, the ball 240B, and the ball movement trajectory 241B corresponding to the second set are collectively referred to as the second posture model and the like.
By performing predetermined operations on the display area 434, the user can display the first and second posture models and the like shown there from any viewpoint in 3D space, for example through 3D rotation. This allows the user to examine and compare, from various viewpoints, the player's posture during serves performed at different dates and times.
The seek bar 410 is a bar indicating the current playback time of the first and second posture models and the like displayed in the display area 434. That is, the posture models and the like displayed in the display area 434 represent the player's posture and so on at the playback time indicated on the seek bar 410.
When the play button 411 is pressed, the UI control unit 152 starts playing back the motion of the first and second posture models and the like displayed in the display area 434. During playback, the first and second posture models and the like are shown in the display area 434 as an animation. While this animation of the first and second posture models is playing, the actual footage, that is, the first moving image 201A and the second moving image 201B, may be played back synchronously in a separate window (not shown).
When the pause button 412 is pressed, the UI control unit 152 pauses the motion of the first and second posture models and the like displayed in the display area 434.
When the slow play button 413 is pressed, the UI control unit 152 plays the motion of the first and second posture models and the like displayed in the display area 434 in slow motion (for example, at half speed).
When the repeat button 414 is pressed, the UI control unit 152 plays the motion of the first and second posture models and the like displayed in the display area 434 repeatedly. The slow play button 413 and the repeat button 414 may be used together.
When the video switching button 441 is pressed, the UI control unit 152 switches to the video comparison UI 400A shown in FIG. 16.
When the trajectory display switching button 442 is pressed, the UI control unit 152 toggles the display of the movement trajectories 241A and 241B in the display area 434.
When the ball display switching button 443 is pressed, the UI control unit 152 toggles the display of the balls 240A and 240B in the display area 434.
When the player display switching button 444 is pressed, the UI control unit 152 toggles the display of the posture models 231A and 231B in the display area 434.
When the superimposition/separation switching button 445 is pressed, the UI control unit 152 switches the display area 434 between a separated display of the first posture model and the like and the second posture model and the like, as shown in FIG. 18, and a superimposed display of the two, as shown in FIG. 19. When superimposing the first posture model 231A and the second posture model 231B, the UI control unit 152 may align them using the neck joint positions 232A and 232B as the reference. Furthermore, when superimposing the first posture model 231A and the second posture model 231B, the UI control unit 152 moves the positions of the balls 240A and 240B and the ball movement trajectories 241A and 241B together with the positions of the posture models 231A and 231B, as shown in FIG. 19.
When the arrange button 446 is pressed, the UI control unit 152 displays the sub-action posture comparison UI 400D (see FIG. 20). Details of the sub-action posture comparison UI 400D are described later (see FIG. 20).
The posture comparison UI 400C may also include a model inversion button (not shown). When at least one posture model in the posture comparison UI 400C is selected and the model inversion button is pressed, the UI control unit 152 displays the selected posture model flipped horizontally. This makes it easy to compare, for example, the posture of a right-handed player with that of a left-handed player.
FIG. 20 is a diagram showing an example of the sub-action posture comparison UI.
For example, when the arrange button 446 is pressed in the posture comparison UI 400C shown in FIGS. 18 and 19, the UI control unit 152 displays the sub-action posture comparison UI 400D shown in FIG. 20.
The sub-action posture comparison UI 400D includes display areas 451A and 451B, and a plurality of sub-action selection buttons 447 expanded from the arrange button 446. The sub-action selection buttons 447 are one example of the sub-action selection UI.
The display areas 451A and 451B are arranged side by side as shown in FIG. 20. The UI control unit 152 enlarges the first posture model and the like (231A, 240A, 241A) and displays them in the display area 451A. Similarly, the UI control unit 152 enlarges the second posture model and the like (231B, 240B, 241B) and displays them in the display area 451B. At this time, the UI control unit 152 may adjust the display of the posture models 231A and 231B so that the neck joint positions 232A and 232B are located at the center of the display areas 451A and 451B.
The plurality of sub-action selection buttons 447 are buttons for selecting a sub-action, and correspond, for example, to the sub-actions "toss", "jump", and "hit".
For example, when the user selects the sub-action selection button 447 for "jump", the UI control unit 152 performs the following processing. Namely, the UI control unit 152 refers to the sub-action information 222 of the first record information 200A and identifies the first posture model and the like at the timing of the sub-action "jump". The UI control unit 152 then displays the identified first posture model and the like in the display area 451A. Similarly, the UI control unit 152 refers to the sub-action information 222 of the second record information 200B, identifies the second posture model and the like at the timing of the sub-action "jump", and displays them in the display area 451B. When the play button 411 is pressed in this state, the UI control unit 152 may start playing back the motion of the posture models and the like from the identified timing of the sub-action "jump". This allows the user to easily compare, for example, the player's posture and movement during the "jump" in serves performed at different dates and times. Note that, as long as the frame images of the sub-action "jump" are time-aligned, playback may also include "run-up", "toss", and so on.
FIG. 21 is a diagram for explaining an example of selecting posture models in the posture comparison UI 400C.
When the user presses the right sidebar 430 (see FIGS. 18 and 19) included in the posture comparison UI 400C, the UI control unit 152 displays a selection list 431 as shown in FIG. 21. As in FIG. 16, the selection list 431 displays, in list form, information derived from the trajectory information 220 of each piece of record information 200.
A check box 432 is displayed in each row of the selection list 431. The user turns on the check boxes 432 of the rows in the selection list 431 to be compared.
The UI control unit 152 identifies the pieces of record information 200 whose check boxes 432 are turned on, and displays in the display area 434 the posture models and the like generated from the posture model information 221 of the identified record information 200. The user can thus display and compare the desired posture models and the like in the display area 434.
A ball filtering button 433 may also be displayed on the sidebar 430. When the ball filtering button 433 is pressed, the UI control unit 152 displays a condition box (not shown) for filtering the record information 200 displayed in the selection list 431. In the condition box, at least one of the ball's height, speed, angle, and evaluation may be entered as a filtering threshold. The UI control unit 152 then displays, as the selection list 431, the pieces of record information 200 that match the conditions entered in the condition box. This allows the user to efficiently extract candidate pieces of record information 200 for comparison from a large amount of record information 200.
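The condition-box filtering reduces to keeping only the records whose net-crossing metrics meet the entered thresholds. A sketch; the field names of the record information 200 are assumptions, and any condition left as None is not applied:

def filter_records(records, min_height=None, min_speed=None,
                   min_angle=None, rating=None):
    """Return the records matching the conditions entered in the condition box."""
    kept = []
    for r in records:
        if min_height is not None and r["net_height"] < min_height:
            continue
        if min_speed is not None and r["net_speed"] < min_speed:
            continue
        if min_angle is not None and r["net_angle"] < min_angle:
            continue
        if rating is not None and r.get("rating") != rating:
            continue
        kept.append(r)
    return kept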
<Hardware configuration>
The embodiments according to the present disclosure have been described above in detail with reference to the drawings. The functions of the first mobile terminal 20A, the second mobile terminal 20B, the display terminal 30, and the ball game video analysis device 100 described above can be realized by a computer program.
FIG. 22 is a diagram showing the hardware configuration of a computer that realizes the functions of the ball game video analysis device 100 by a program.
This computer 1000A includes an input device 1001 such as a keyboard, mouse, or touch pad; an output device 1002 such as a display or speaker; a CPU (Central Processing Unit) 1003; a GPU (Graphics Processing Unit) 1004; a ROM (Read Only Memory) 1005; a RAM (Random Access Memory) 1006; a storage device 1007 such as an HDD (Hard Disk Drive), SSD (Solid State Drive), or flash memory; a reading device 1008 that reads information from a recording medium such as a DVD-ROM (Digital Versatile Disk Read Only Memory) or USB (Universal Serial Bus) memory; and a transmission/reception device 1009 that communicates by wire or wirelessly via the communication network N. The parts are connected by a bus 1010.
The reading device 1008 reads, from a recording medium on which a program for realizing the functions of each of the above devices is recorded, that program, and stores it in the storage device 1007. Alternatively, the transmission/reception device 1009 communicates with a server device connected to the communication network N, downloads from the server device a program for realizing the functions of each of the above devices, and stores it in the storage device 1007.
The CPU 1003 then copies the program stored in the storage device 1007 to the RAM 1006 and sequentially reads and executes the instructions contained in the program from the RAM 1006, whereby the functions of each of the above devices are realized.
FIG. 23 is a diagram showing the hardware configuration of a computer that realizes the functions of the first mobile terminal 20A, the second mobile terminal 20B, and the display terminal 30 by a program. Components already described with reference to FIG. 11 are given the same reference signs, and their description may be omitted.
This computer 1000B includes an input device 1001, an output device 1002, a CPU 1003, a GPU 1004, a ROM 1005, a RAM 1006, a storage device 1007, a reading device 1008, a transmission/reception device 1009, and a camera device 1011, with the parts connected by a bus 1010.
The camera device 1011 includes an image sensor and generates captured images (moving images and still images). The generated captured images are stored in the storage device 1007. The captured images stored in the storage device 1007 may be transmitted to an external server or the like through the transmission/reception device 1009 and the communication network N.
Each functional block used in the description of the above embodiments is typically realized as an LSI, which is an integrated circuit. The functional blocks may be individually integrated into single chips, or some or all of them may be integrated into a single chip. Although the term LSI is used here, such circuits may also be called an IC, system LSI, super LSI, or ultra LSI depending on the degree of integration.
Furthermore, if integrated circuit technology that replaces LSI emerges from advances in semiconductor technology or other derived technologies, the functional blocks may naturally be integrated using that technology.
(Summary of the present disclosure)
A ball game video analysis device (100) according to the present disclosure includes: a trajectory analysis unit (105) that generates trajectory information (220) indicating the movement trajectory of a moving body (for example, a ball) used in a ball game, based on a first moving image (201A) and a second moving image (201B) in which an action of a person (for example, a player) with respect to the moving body is captured from mutually different positions; a record management unit (133) that manages a plurality of pieces of record information (200), each indicating the correspondence between the first and second moving images and the trajectory information; and a display control unit (134) that displays a UI (User Interface) for selecting record information from among the plurality of pieces of record information based on the trajectory information, and displays the first moving image or the second moving image associated with the record information selected through the UI.
With this configuration, record information is selected, based on the trajectory information, from among the plurality of pieces of record information managed by the record management unit, and the moving image associated with the selected record information is displayed. The moving image captures the action of a person playing a ball game. The user can therefore easily find and watch, based on the trajectory information, moving images useful for practicing the ball game, such as those in which the action succeeded or failed, from among many moving images in which the person's actions were captured.
Further, when first record information and second record information are selected from among the plurality of pieces of record information, the display control unit may display, side by side, the first moving image or the second moving image associated with the first record information and the first moving image or the second moving image associated with the second record information.
With this configuration, the moving image related to the selected first record information and the moving image related to the selected second record information are displayed side by side. The user can therefore watch the side-by-side moving images of the captured actions and use them for practicing the ball game.
The ball game video analysis device may further include a posture analysis unit (131) that generates posture model information (221) indicating the posture of the person during the action, based on the first moving image and the second moving image. The record information may be information indicating the correspondence among the first moving image and the second moving image, the trajectory information, and the posture model information. When first record information and second record information are selected from among the plurality of pieces of record information, the display control unit may display, side by side or superimposed, a posture model (231A) generated from the posture model information associated with the first record information and a posture model (231B) generated from the posture model information associated with the second record information.
With this configuration, the posture model related to the selected first record information and the posture model related to the selected second record information are displayed side by side or superimposed. The user can therefore examine the posture models during the action, displayed side by side or superimposed, and use them for practicing the ball game.
Further, the display control unit may display, together with the posture models, a movement trajectory (241A) of the moving body generated from the trajectory information associated with the first record information and a movement trajectory (241B) of the moving body generated from the trajectory information associated with the second record information.
With this configuration, the movement trajectories of the moving body are displayed together with the posture models. The user can therefore observe the movement of the moving body during the action, displayed together with the posture models, and use it for practicing the ball game.
The ball game video analysis device may further include a sub-action analysis unit (132) that generates sub-action information (222) indicating the occurrence timing of each of a plurality of sub-actions constituting the action, based on the trajectory information and the posture model information. The record information may be information indicating the correspondence among the first moving image and the second moving image, the trajectory information, the posture model information, and the sub-action information. The display control unit may display a sub-action selection UI (420) with which one of the plurality of sub-actions can be selected, and may display the posture models at the occurrence timing of the sub-action selected through the sub-action selection UI.
With this configuration, the posture models at the occurrence timing of the sub-action selected through the sub-action selection UI are displayed. The user can therefore display, through the sub-action selection UI, the posture models at a desired sub-action and use them for practicing the ball game.
Although embodiments have been described above with reference to the accompanying drawings, the present disclosure is not limited to these examples. It will be clear that a person skilled in the art can conceive of various changes, modifications, substitutions, additions, deletions, and equivalents within the scope of the claims, and it is understood that these also belong to the technical scope of the present disclosure. Moreover, the components of the embodiments described above may be combined arbitrarily without departing from the spirit of the invention.
This application is based on a Japanese patent application filed on March 16, 2020 (Japanese Patent Application No. 2020-045203), the contents of which are incorporated herein by reference.
The technology of the present disclosure can be used for the analysis of sports that use a moving body.
10 Ball game video analysis system
20A First mobile terminal
20B Second mobile terminal
30 Display terminal
100 Ball game video analysis device
101 Moving image reception unit
102 Image conversion unit
103 Reference frame identification unit
104 Reference frame change unit
105 Trajectory analysis unit
106 Result display unit
131 Posture analysis unit
132 Sub-action analysis unit
133 Record management unit
134 Display control unit
151 Information transmission/reception unit
152 UI control unit
200 Record information
200A First record information
200B Second record information
201A First moving image
201B Second moving image
202A First frame image group
202B Second frame image group
203 Synchronization deviation information
204 Analysis result information
210A First image display area
210B Second image display area
211A, 211B Ball detection areas
220 Trajectory information
221 Posture model information
222 Sub-action information
231A First posture model
231B Second posture model
232A, 232B Neck joint positions
240A, 240B Balls
241A, 241B Ball movement trajectories
310 Reference frame confirmation UI
312 OK button
313 Change button
314 Redo button
320 Reference frame change UI
322 OK button
323 Return button
324A, 324B One-frame return buttons
325A, 325B N-frame return buttons
326A, 326B One-frame advance buttons
327A, 327B N-frame advance buttons
330 Analysis result screen
331 Ball movement trajectory
400A Video comparison UI
400B Sub-action video comparison UI
400C Posture comparison UI
400D Sub-action posture comparison UI
401A, 401B Selection lists
402A, 402B Display areas
403 Add button
410 Seek bar
411 Play button
412 Pause button
413 Slow play button
414 Repeat button
415 Arrange button
416 Enlarge button
417 Camera switching button
418 Video inversion button
419 3D switching button
420 Sub-action selection button
421A, 421B Display areas
430 Sidebar
431 Selection list
432 Check box
433 Ball filtering button
434 Display area
441 Video switching button
442 Trajectory display switching button
443 Ball display switching button
444 Player display switching button
445 Superimposition/separation switching button
446 Arrange button
447 Sub-action selection button
451A, 451B Display areas
1000 Computer
1001 Input device
1002 Output device
1003 CPU
1004 GPU
1005 ROM
1006 RAM
1007 Storage device
1008 Reading device
1009 Transmission/reception device
1010 Bus
1011 Camera device

Claims (8)

  1.  A ball game video analysis device comprising:
     a trajectory analysis unit that generates trajectory information indicating a movement trajectory of a moving body used in a ball game, based on a first video and a second video in which a person's action on the moving body is captured from mutually different positions;
     a recording management unit that manages a plurality of pieces of recording information, each piece indicating a correspondence between the first video, the second video, and the trajectory information; and
     a display control unit that displays a UI (User Interface) for selecting recording information from the plurality of pieces of recording information based on the trajectory information, and that displays the first video or the second video corresponding to the recording information selected through the UI.
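The correspondence claimed here can be pictured as a small record structure plus a trajectory-based filter feeding the selection UI. The Python sketch below is purely illustrative; every name in it (RecordingInfo, peak_height, select_by_peak) is invented for this example, and it assumes the trajectory has already been analyzed into per-frame 3D points:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class RecordingInfo:
    """Illustrative sketch of one piece of recording information:
    two synchronized videos plus the trajectory analyzed from them."""
    first_video_path: str                         # video shot from the first position
    second_video_path: str                        # video shot from the second position
    trajectory: list[tuple[float, float, float]]  # per-frame (x, y, z) of the ball
    recorded_at: datetime = field(default_factory=datetime.now)

def peak_height(rec: RecordingInfo) -> float:
    """Highest point the moving body reached in this recording."""
    return max(p[2] for p in rec.trajectory)

def select_by_peak(records: list[RecordingInfo],
                   min_peak: float) -> list[RecordingInfo]:
    """Trajectory-based filter backing the selection UI: keep only
    recordings whose ball rose above min_peak (same units as trajectory)."""
    return [r for r in records if r.trajectory and peak_height(r) >= min_peak]
```

A selection UI built on such a filter could, for example, list only the serves whose ball cleared net height, then play back the first or second video of whichever recording the user picks.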
  2.  The ball game video analysis device according to claim 1, wherein, when first recording information and second recording information are selected from the plurality of pieces of recording information, the display control unit displays, side by side, the first video or the second video corresponding to the first recording information and the first video or the second video corresponding to the second recording information.
  3.  The ball game video analysis device according to claim 1, further comprising a posture analysis unit that generates, based on the first video and the second video, posture model information indicating the posture of the person at the time of the action,
     wherein the recording information indicates a correspondence between the first video, the second video, the trajectory information, and the posture model information, and
     wherein, when first recording information and second recording information are selected from the plurality of pieces of recording information, the display control unit displays, side by side or superimposed, a posture model generated from the posture model information corresponding to the first recording information and a posture model generated from the posture model information corresponding to the second recording information.
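The patent does not prescribe how the posture analysis unit works, but a conventional realization is to detect 2D joint keypoints in each of the two views and triangulate them using the calibrated camera geometry. A minimal sketch under that assumption follows; cv2.triangulatePoints is a real OpenCV routine, while the function name and the premise that projection matrices and matched keypoints are already available are this example's assumptions:

```python
import cv2
import numpy as np

def triangulate_posture(P1: np.ndarray, P2: np.ndarray,
                        kpts1: np.ndarray, kpts2: np.ndarray) -> np.ndarray:
    """Illustrative posture-model construction from two calibrated views.

    P1, P2       -- 3x4 projection matrices of the first/second camera
    kpts1, kpts2 -- Nx2 pixel coordinates of the same N joints in each view
    Returns an Nx3 array of 3D joint positions (the posture model).
    """
    pts4d = cv2.triangulatePoints(P1, P2,
                                  kpts1.T.astype(np.float64),
                                  kpts2.T.astype(np.float64))  # 4xN homogeneous
    return (pts4d[:3] / pts4d[3]).T  # divide by w -> Nx3 Euclidean joints
```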
  4.  The ball game video analysis device according to claim 3, wherein the display control unit displays, together with the posture models, a movement trajectory of the moving body generated from the trajectory information corresponding to the first recording information and a movement trajectory of the moving body generated from the trajectory information corresponding to the second recording information.
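For the combined display of claim 4, the two ball trajectories and the two posture models can share a single 3D scene so that differences in form and in the resulting ball path are visible at a glance. A hedged matplotlib sketch, whose layout and styling are invented for this example rather than taken from the patent:

```python
import numpy as np
import matplotlib.pyplot as plt

def show_comparison(traj_a, traj_b, pose_a, pose_b):
    """Illustrative combined view: two ball trajectories (lines) and two
    posture models (joint scatter) in one 3D axes; inputs are Nx3 arrays."""
    ax = plt.figure().add_subplot(projection="3d")
    for traj, color, label in ((traj_a, "tab:blue", "recording 1"),
                               (traj_b, "tab:orange", "recording 2")):
        t = np.asarray(traj)
        ax.plot(t[:, 0], t[:, 1], t[:, 2], color=color, label=label)
    for pose, color in ((pose_a, "tab:blue"), (pose_b, "tab:orange")):
        p = np.asarray(pose)
        ax.scatter(p[:, 0], p[:, 1], p[:, 2], color=color, s=12)
    ax.legend()
    plt.show()
```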
  5.  The ball game video analysis device according to claim 3, further comprising a sub-action analysis unit that generates, based on the trajectory information and the posture model information, sub-action information indicating the occurrence timing of each of a plurality of sub-actions constituting the action,
     wherein the recording information indicates a correspondence between the first video, the second video, the trajectory information, the posture model information, and the sub-action information, and
     wherein the display control unit displays a sub-action selection UI for selecting one of the plurality of sub-actions, and displays the posture model at the occurrence timing of the sub-action selected through the sub-action selection UI.
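For an action such as a volleyball serve, plausible sub-actions are the toss, the impact, and the follow-through. One simple heuristic for their timings, sketched below from trajectory information alone (the claimed analysis also uses posture model information; this example and its names are assumptions, not the patented method), takes the impact as the sharpest frame-to-frame velocity change of the ball and the toss peak as the highest point before it:

```python
import numpy as np

def detect_subaction_frames(traj: np.ndarray, fps: float) -> dict[str, int]:
    """Heuristic sub-action timings from the ball trajectory (illustrative).

    traj -- Nx3 array of per-frame (x, y, z) ball positions, z pointing up
    Returns frame indices for the toss peak and the impact ("hit").
    """
    vel = np.diff(traj, axis=0) * fps                    # velocity between frames
    jolt = np.linalg.norm(np.diff(vel, axis=0), axis=1)  # velocity-change magnitude
    hit = int(np.argmax(jolt)) + 1                       # +1: each diff drops a frame
    toss_peak = int(np.argmax(traj[:hit + 1, 2]))        # highest point before impact
    return {"toss_peak": toss_peak, "hit": hit}
```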
  6.  A ball game video analysis system including a first mobile terminal with a camera, a second mobile terminal with a camera, a ball game video analysis device, and a display terminal, wherein:
     the first mobile terminal captures, from a first position, a person's action on a moving body used in a ball game to generate a first video;
     the second mobile terminal captures the action of the person from a second position to generate a second video; and
     the ball game video analysis device generates, based on the first video and the second video, trajectory information indicating a trajectory of the moving body, manages a plurality of pieces of recording information each indicating a correspondence between the first video, the second video, and the trajectory information, causes the display terminal to display a UI for selecting recording information from the plurality of pieces of recording information based on the trajectory information, and causes the display terminal to display the first video or the second video corresponding to the recording information selected through the UI.
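Because the two camera-equipped mobile terminals in this system record independently, the analysis device has to estimate the offset between the first and second videos before it can pair their frames (the reference signs above include "203 Synchronization deviation information" for exactly this). One common, hypothetical way to estimate it is to cross-correlate a per-frame signal visible in both views, such as the ball's apparent height:

```python
import numpy as np

def estimate_sync_offset(h1: np.ndarray, h2: np.ndarray, max_lag: int = 60) -> int:
    """Illustrative synchronization-deviation estimate between two videos,
    from per-frame ball-height signals h1 and h2 (one value per frame).
    A positive result means the second video started that many frames later."""
    h1 = (h1 - h1.mean()) / (h1.std() + 1e-9)  # normalize both signals
    h2 = (h2 - h2.mean()) / (h2.std() + 1e-9)
    best_lag, best_score = 0, -np.inf
    for lag in range(-max_lag, max_lag + 1):
        a, b = (h1[lag:], h2) if lag >= 0 else (h1, h2[-lag:])
        n = min(len(a), len(b))
        if n < 10:                               # too little overlap to score
            continue
        score = float(np.dot(a[:n], b[:n]) / n)  # mean correlation at this lag
        if score > best_score:
            best_score, best_lag = score, lag
    return best_lag
```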
  7.  A ball game video analysis method in which an information processing device:
     generates trajectory information indicating a movement trajectory of a moving body used in a ball game, based on a first video and a second video in which a person's action on the moving body is captured from mutually different positions;
     manages a plurality of pieces of recording information, each indicating a correspondence between the first video, the second video, and the trajectory information; and
     displays a UI for selecting recording information from the plurality of pieces of recording information based on the trajectory information, and displays the first video or the second video corresponding to the recording information selected through the UI.
  8.  A computer program causing a computer to execute processing of:
     generating trajectory information indicating a movement trajectory of a moving body used in a ball game, based on a first video and a second video in which a person's action on the moving body is captured from mutually different positions;
     managing a plurality of pieces of recording information, each indicating a correspondence between the first video, the second video, and the trajectory information; and
     displaying a UI for selecting recording information from the plurality of pieces of recording information based on the trajectory information, and displaying the first video or the second video corresponding to the recording information selected through the UI.
PCT/JP2021/008950 2020-03-16 2021-03-08 Ball game footage analysis device, ball game footage analysis system, ball game footage analysis method, and computer program WO2021187193A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-045203 2020-03-16
JP2020045203A JP7437652B2 (en) 2020-03-16 2020-03-16 Ball game video analysis device, ball game video analysis system, ball game video analysis method, and computer program

Publications (1)

Publication Number Publication Date
WO2021187193A1 true WO2021187193A1 (en) 2021-09-23

Family

ID=77771255

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/008950 WO2021187193A1 (en) 2020-03-16 2021-03-08 Ball game footage analysis device, ball game footage analysis system, ball game footage analysis method, and computer program

Country Status (2)

Country Link
JP (1) JP7437652B2 (en)
WO (1) WO2021187193A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023106201A1 (en) * 2021-12-09 2023-06-15 Necソリューションイノベータ株式会社 Play analysis device, play analysis method, and computer-readable storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2015154890A (en) * 2014-02-21 2015-08-27 株式会社横浜DeNAベイスターズ pitching analysis support system
JP2017091092A (en) * 2015-11-06 2017-05-25 富士通株式会社 Image processing method, image processing program, and image processing apparatus
WO2019225415A1 (en) * 2018-05-21 2019-11-28 パナソニックIpマネジメント株式会社 Ball game video analysis device and ball game video analysis method

Also Published As

Publication number Publication date
JP2021145702A (en) 2021-09-27
JP7437652B2 (en) 2024-02-26

Similar Documents

Publication Publication Date Title
US11783721B2 (en) Virtual team sport trainer
US20210350833A1 (en) Play Sequence Visualization and Analysis
Wilson Development in video technology for coaching
US8848058B2 (en) Method for analyzing the motion of a person during an activity
JP4905474B2 (en) Video processing apparatus, video processing method, and program
US8885979B2 (en) Apparatus and associated methodology for analyzing subject motion in images
JP6136926B2 (en) Information processing apparatus, storage medium, and information processing method
JP6213146B2 (en) Information processing apparatus, recording medium, and information processing method
WO2020235339A1 (en) Play analyzing device, and play analyzing method
WO2021187193A1 (en) Ball game footage analysis device, ball game footage analysis system, ball game footage analysis method, and computer program
JPH06105231A (en) Picture synthesis device
KR101540771B1 (en) Judgment system and method for sports game using augmented-reality glass
JP7113335B2 (en) Play analysis device and play analysis method
JP7113336B2 (en) Play analysis device and play analysis method
KR101164894B1 A device and system producing a ball playing image in player's views for the baseball game broadcasting and the recording medium thereof
KR102454801B1 (en) Apparatus, method and recording medium storing command for determining video for sports broadcasting
WO2021182081A1 (en) Ball game video analysis device, ball game video analysis method, and computer program
WO2020071092A1 (en) Play analysis device and play analysis method
WO2020230677A1 (en) Play analysis apparatus and play analysis method
US20230368471A1 (en) Method and system for converting 2-d video into a 3-d rendering with enhanced functionality
JP7296546B2 (en) Play analysis device and play analysis method
JP2021150668A (en) Analysis information correction device, analysis information correction method, and computer program
JP7141629B2 (en) Player position visualization device
WO2022197932A1 (en) Method and system for training an athletic motion by an individual
JP2022173865A (en) Play analysis device, play analysis method, and, computer program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21772530

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21772530

Country of ref document: EP

Kind code of ref document: A1