WO2022215116A1 - Information processing program, device, and method
- Publication number
- WO2022215116A1 (PCT/JP2021/014499)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- time
- ice
- athlete
- height
- information processing
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/40—Scenes; Scene-specific elements in video content
- G06V20/41—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
- G06V20/42—Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items of sport video content
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/292—Multi-camera tracking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/23—Recognition of whole body movements, e.g. for sport training
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30221—Sports video; Sports image
Definitions
- the disclosed technology relates to an information processing program, an information processing device, and an information processing method.
- predetermined scenes are cut out from video during sports competitions.
- the predetermined scene is, for example, a scene including the moment of impact with a ball in golf, baseball, tennis, or the like, or a scene including jumping or landing in gymnastics.
- an information processing device that identifies a decisive moment from the continuous motion of a subject and extracts it as an image.
- the device receives sensor data from a sensor attached to the user or an object that contacts the user, and time information corresponding to the sensor data. Also, this device identifies the time when a predetermined motion pattern occurred in the user or object based on the sensor data and the time information. Then, according to the specified time, the device selects one or more images from a series of images including the user or the object taken at predetermined time intervals.
- the disclosed technique aims at estimating the ice take-off time and ice landing time of a jump in figure skating.
- the technology disclosed obtains images captured by each of a plurality of cameras that capture an athlete on a skating rink from two intersecting directions.
- the disclosed technique specifies the height of at least a part of the athlete and the equipment worn by the athlete in each of the plurality of frames included in the video. Then, based on the change in height, the disclosed technology calculates the ice takeoff time and the ice landing time of the jump performed by the athlete from the frame corresponding to the ice takeoff time and the ice landing time. presume.
- it has the effect of being able to estimate the ice take-off time and ice landing time of a jump in figure skating.
- FIG. 1 is a schematic diagram of an information processing system according to the first and third embodiments;
- FIG. 2 is a functional block diagram of an information processing device according to the first and third embodiments;
- FIG. 3 is a diagram for explaining selection of the optimum identification camera;
- FIG. 4 is a diagram for explaining calculation of the positions of the tip and the terminal end of a blade as predetermined parts;
- FIG. 5 is a diagram for explaining estimation of ice take-off time and ice landing time in the first embodiment;
- FIG. 6 is a diagram for explaining identification of the reference line;
- FIG. 7 is a block diagram showing a schematic configuration of a computer functioning as an information processing device;
- FIG. 8 is a flow chart showing an example of an information processing routine in the first embodiment;
- FIG. 9 is a functional block diagram of an information processing device according to a second embodiment;
- FIG. 10 is a diagram for explaining estimation of ice take-off time in the second embodiment;
- FIG. 11 is a diagram for explaining estimation of ice landing time in the second embodiment;
- FIG. 12 is a diagram for explaining how to identify the take-off point and the ice landing point;
- FIG. 13 is a diagram for explaining details of identifying the take-off point;
- FIG. 14 is a diagram showing the rotation angle θ calculated from each frame included in a jump section;
- FIG. 15 is an enlarged view of the portion indicated by a dashed frame in FIG. 14;
- FIG. 16 is a flow chart showing an example of an information processing routine in the second embodiment.
- as shown in FIG. 1, an information processing system 100 according to the first embodiment includes an information processing device 10, a tracking camera 20, and an identification camera 22.
- the tracking camera 20 is, for example, an imaging device for motion tracking that captures video from which the position of the athlete 32 on the skating rink 30 can be identified.
- a plurality of (for example, two) tracking cameras 20 are installed on the ceiling, side wall, or the like of the venue.
- the identification camera 22 is a photographing device that photographs the athlete 32 on the skating rink 30 from two intersecting directions.
- the identification camera 22 includes a plurality of first cameras 22A installed along the short side of the skating rink 30 so that the shooting direction is parallel to the long axis direction of the skating rink 30 .
- the identification camera 22 also includes a plurality of second cameras 22B installed along the long side of the skating rink 30 so that the shooting direction is parallel to the short axis direction of the skating rink.
- when the first camera 22A and the second camera 22B are described without distinction, they are simply referred to as the "identification camera 22".
- on the xy plane in which the skating rink 30 is viewed from above, the position coordinates of each position where the shooting direction of a first camera 22A and the shooting direction of a second camera 22B intersect are specified in advance.
- the position coordinates of the position where the imaging direction of the i-th first camera 22A and the imaging direction of the j-th second camera 22B intersect are denoted (x_c_ij, y_c_ij).
- each of the tracking camera 20 and the identification camera 22 outputs video captured at a predetermined frame rate (e.g., 120 fps).
- the output video includes a plurality of frames, and each frame is associated with time information. Each video is temporally synchronized based on the time information. Note that the numbers of the tracking cameras 20 and the identification cameras 22 are not limited to the examples in FIGS.
- the information processing device 10 functionally includes an acquisition unit 12, an identification unit 14, an estimation unit 16, and a calculation unit 18, as shown in FIG.
- the acquisition unit 12 acquires the video output from the tracking camera 20 .
- the acquisition unit 12 acquires the position of the athlete 32 on the skating rink 30 by motion tracking from the acquired video.
- the acquisition unit 12 recognizes the athlete 32 who is the object of motion tracking for each frame of the video captured by each tracking camera 20 .
- the acquisition unit 12 recognizes the athlete 32 from each frame based on the characteristics of the athlete 32 or the clothing worn by the athlete 32, such as the color and shape.
- the acquisition unit 12 may also recognize the moving object indicated by the difference between frames as the player 32 .
- the acquisition unit 12 calculates the position of the player 32 recognized in each frame for each frame.
- the acquisition unit 12 generates trajectory information by tracking the position of the player 32 in each frame.
- the position of the player 32 may be calculated three-dimensionally or may be calculated two-dimensionally.
- here, a case of calculating the position coordinates (x_p, y_p) of the recognized athlete 32 on the xy plane in which the skating rink 30 is viewed from above will be described.
- the acquisition unit 12 selects, from among the identification cameras 22, the identification camera 22 optimum for the position of the athlete 32, and acquires the video captured by the selected identification camera 22. Specifically, the acquisition unit 12 acquires the video captured by each of the first camera 22A and the second camera 22B whose shooting directions intersect at the position closest to the position of the athlete.
- more specifically, the acquisition unit 12 identifies the position coordinates (x_c_ij, y_c_ij) closest to the position coordinates (x_p, y_p) of the athlete 32. Then, from the video captured by each of the i-th first camera 22A and the j-th second camera 22B corresponding to the identified (x_c_ij, y_c_ij), the acquisition unit 12 acquires the frames whose time information is synchronized with the frame n from which the position of the athlete was calculated. For example, as shown in FIG. 3, assume that the position coordinates (x_c_24, y_c_24) are closest to the position coordinates (x_p, y_p) of the athlete 32. In this case, the acquisition unit 12 acquires frames of the video captured by each of the second first camera 22A and the fourth second camera 22B (the cameras indicated by hatching in FIG. 3).
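the nearest-intersection selection above can be sketched as follows; the function and data names are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch: choose the camera pair (i, j) whose shooting
# directions intersect closest to the tracked athlete position on the
# rink's xy plane. All names and coordinates are illustrative.

def nearest_camera_pair(player_pos, intersections):
    """intersections maps (i, j) -> intersection point (x_c_ij, y_c_ij)."""
    px, py = player_pos
    return min(intersections,
               key=lambda ij: (intersections[ij][0] - px) ** 2
                            + (intersections[ij][1] - py) ** 2)

intersections = {(1, 1): (10.0, 5.0), (2, 4): (20.0, 12.0), (3, 2): (40.0, 8.0)}
print(nearest_camera_pair((19.0, 11.0), intersections))
```

with the illustrative coordinates above, the pair (2, 4) is selected, mirroring the FIG. 3 example where the second first camera and the fourth second camera are chosen.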
- the acquisition unit 12 delivers the acquired frames to the identification unit 14 and delivers the generated trajectory information to the calculation unit 18 .
- the identifying unit 14 identifies the height of the athlete 32 and the height of at least a part of the equipment worn by the athlete 32 in each of the frames delivered from the acquiring unit 12 .
- specifically, the specifying unit 14 three-dimensionally analyzes each of the frames delivered from the acquisition unit 12, and calculates the three-dimensional position (x, y, z) of a predetermined part of the athlete 32 or of the equipment worn by the athlete 32.
- z is the height of the predetermined portion.
- the predetermined portions include the leading edge 34 and the trailing edge 36 of the blade of the skate shoe worn by the athlete 32, as shown in FIG.
- the predetermined parts may include each joint of the player 32, the head, and parts of the face such as the eyes, nose, and mouth.
- to recognize the predetermined part, an existing method such as a recognition method using the shape of the predetermined part or a recognition method using a human body skeleton model may be used.
- the identifying unit 14 transfers the height z of the predetermined part calculated for each frame to the estimating unit 16 and transfers the three-dimensional position (x, y, z) of the predetermined part to the calculating unit 18 .
- the estimating unit 16 estimates the take-off time and landing time of the jump performed by the athlete 32 from the frame corresponding to take-off and the frame corresponding to landing, based on the change in the height of the predetermined part specified for each frame. Specifically, as shown in FIG. 5, the estimating unit 16 estimates, as the ice take-off time, the time information of the frame in which the height of the predetermined part exceeds the reference value indicating the height during on-ice skating, or of the frame immediately before it exceeds the reference value. The estimating unit 16 also estimates, as the ice landing time, the time information of the frame in which the height of the predetermined part returns to the reference value after having exceeded it, or of the frame immediately before it returns to the reference value.
- the reference value may be, for example, a value obtained by adding a margin to the average of the heights of the predetermined part specified in advance from video in which the athlete 32 is known to be on the ice.
- in the example of FIG. 5, the height z of the predetermined part identified from frame F_T1 is below the reference value, and the height z identified from the next frame F_T2 exceeds it. Therefore, the time information t_T1 of frame F_T1 or the time information t_T2 of frame F_T2 is estimated as the ice take-off time. Likewise, the height z identified from frame F_L2 exceeds the reference value, and the height z identified from the next frame F_L1 is below it. Therefore, the time information t_L1 of frame F_L1 or the time information t_L2 of frame F_L2 is estimated as the ice landing time. The estimating unit 16 transfers the estimated ice take-off time and ice landing time to the calculation unit 18.
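the reference-value comparison above can be sketched as a simple threshold scan over per-frame heights; this is an illustrative reading of the first embodiment, and all names are assumptions.

```python
# Illustrative sketch: estimate take-off and landing times by comparing
# the per-frame height of the predetermined part with a reference value
# indicating the on-ice height.

def estimate_jump_times(heights, times, ref):
    """Return (take_off_time, landing_time).

    Take-off: time of the first frame whose height exceeds ref (the text
    notes the frame just before could equally be used).
    Landing: time of the first later frame back at or below ref.
    """
    take_off = landing = None
    for h, t in zip(heights, times):
        if take_off is None:
            if h > ref:
                take_off = t
        elif landing is None and h <= ref:
            landing = t
    return take_off, landing

heights = [0.02, 0.03, 0.15, 0.40, 0.35, 0.10, 0.02]  # metres above ice
times = [n / 120 for n in range(len(heights))]        # 120 fps frame times
print(estimate_jump_times(heights, times, ref=0.05))
```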
- the calculation unit 18 calculates the absolute angle of the blade with respect to the shooting direction of the identification camera 22, using the positions of the tip 34 and terminal end 36 of the blade identified from each of the frames at the ice take-off time and the ice landing time. For example, the calculation unit 18 can calculate, as the absolute angle of the blade, the angle between the shooting direction of the identification camera 22 (or a line perpendicular to it) and the line connecting the tip 34 and the terminal end 36 of the blade, that is, the direction of the blade. Frames captured by both the first camera 22A and the second camera 22B are used to identify the tip 34 and the terminal end 36 of the blade. Therefore, the calculation unit 18 may determine one of the first camera 22A and the second camera 22B as the main identification camera 22 and calculate the absolute angle of the blade with reference to the shooting direction of the main identification camera 22.
- the calculation unit 18 also converts the absolute angle of the blade into an angle with respect to the reference line for judging under-rotation of the jump (hereinafter referred to as the "rotation angle θ"). Specifically, the calculation unit 18 identifies the positions of the ice take-off point A and the ice landing point B based on the trajectory information received from the acquisition unit 12 and the ice take-off time and ice landing time received from the estimating unit 16. More specifically, in the trajectory information, the calculation unit 18 identifies the position coordinates of the athlete 32 specified from the frame whose time information is the ice take-off time as the ice take-off point A, and the position coordinates of the athlete 32 specified from the frame whose time information is the ice landing time as the ice landing point B.
- the calculation unit 18 specifies a straight line passing through the ice takeoff point A and the ice landing point B as the reference line (AB) for judging the rotation of the jump.
- the calculation unit 18 calculates the rotation angle θ of the blade by subtracting, from the absolute angle of the blade, the angle difference between the identified reference line (AB) and the line based on the shooting direction of the main identification camera 22.
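as a sketch, this conversion amounts to measuring the blade direction, then subtracting the angular offset of the reference line AB; the 2D geometry and degree convention below are illustrative assumptions, not the patent's implementation.

```python
import math

# Illustrative 2D sketch: absolute blade angle measured against the main
# camera's shooting direction, then converted to the angle against the
# reference line AB through take-off point A and landing point B.

def rotation_angle(tip, end, point_a, point_b, cam_dir_deg):
    blade_deg = math.degrees(math.atan2(end[1] - tip[1], end[0] - tip[0]))
    ref_deg = math.degrees(math.atan2(point_b[1] - point_a[1],
                                      point_b[0] - point_a[0]))
    absolute = blade_deg - cam_dir_deg          # angle vs. shooting direction
    return absolute - (ref_deg - cam_dir_deg)   # subtract AB's angle offset

print(rotation_angle(tip=(0.0, 0.0), end=(1.0, 1.0),
                     point_a=(0.0, 0.0), point_b=(1.0, 0.0),
                     cam_dir_deg=90.0))
```

note that the camera offset cancels algebraically; it is kept explicit to mirror the two-step description in the text (absolute angle first, then conversion to the reference line).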
- the calculation unit 18 outputs the rotation angle θ of the blade calculated from the frame at the ice landing time as the rotation angle θ of the blade at landing.
- the calculation unit 18 may also calculate information other than the rotation angle, based on the three-dimensional positions of predetermined parts determined from each of the frames included in the section from the ice take-off time to the ice landing time (hereinafter also referred to as the "jump section") and a predetermined number of frames before and after that section. For example, the calculation unit 18 may calculate the position of the waist as the predetermined part and calculate, as the height of the jump, the difference between the minimum and maximum values of the waist position calculated from each frame included in the jump section. Further, the calculation unit 18 may calculate the distance from the identified ice take-off point A to the ice landing point B as the flight distance of the jump.
- the calculation unit 18 may calculate the rotational speed from the time from the ice take-off time to the ice landing time and the change in the rotation angle over the jump section. Further, the calculation unit 18 may calculate the take-off speed from the time between the frame corresponding to the ice take-off time and a frame a predetermined number of frames earlier, and the amount of change in the position of the predetermined part during that time.
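the auxiliary statistics above can be sketched from the same per-frame quantities; function and parameter names are illustrative assumptions.

```python
import math

# Illustrative sketch of the auxiliary jump statistics: jump height from
# the waist trajectory, flight distance from points A and B, and mean
# rotational speed over the jump section.

def jump_metrics(waist_heights, point_a, point_b,
                 t_takeoff, t_landing, angles_deg):
    height = max(waist_heights) - min(waist_heights)   # jump height
    distance = math.dist(point_a, point_b)             # flight distance A -> B
    rot_speed = (angles_deg[-1] - angles_deg[0]) / (t_landing - t_takeoff)
    return height, distance, rot_speed                 # m, m, deg/s

h, d, w = jump_metrics([0.9, 1.2, 1.4, 1.1], (0.0, 0.0), (3.0, 4.0),
                       0.0, 0.5, [0.0, 180.0, 360.0, 540.0])
print(h, d, w)
```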
- the calculation unit 18 outputs a calculation result including the calculated rotation angle θ at landing and the other calculated information.
- the output rotation angle θ at landing can be used to judge under-rotation of the jump.
- the output calculation result may be used as stats displayed on the screen of a television broadcast or the like. Further, for example, the calculation unit 18 may generate and output image data of an image 38 (see FIG. 1) indicating a jump section on the trajectory indicated by the trajectory information.
- the information processing device 10 can be realized by, for example, a computer 40 shown in FIG.
- the computer 40 includes a CPU (Central Processing Unit) 41 , a memory 42 as a temporary storage area, and a non-volatile storage section 43 .
- the computer 40 also includes an input/output device 44 such as an input unit and a display unit, and an R/W (Read/Write) unit 45 that controls reading and writing of data to and from a storage medium 49 .
- the computer 40 also has a communication I/F (Interface) 46 connected to a network such as the Internet.
- the CPU 41 , memory 42 , storage unit 43 , input/output device 44 , R/W unit 45 and communication I/F 46 are connected to each other via bus 47 .
- the storage unit 43 can be implemented by a HDD (Hard Disk Drive), SSD (Solid State Drive), flash memory, or the like.
- An information processing program 50 for causing the computer 40 to function as the information processing apparatus 10 is stored in the storage unit 43 as a storage medium.
- the information processing program 50 has an acquisition process 52 , an identification process 54 , an estimation process 56 and a calculation process 58 .
- the CPU 41 reads out the information processing program 50 from the storage unit 43, develops it in the memory 42, and sequentially executes the processes of the information processing program 50.
- the CPU 41 operates as the acquisition unit 12 shown in FIG. 2 by executing the acquisition process 52 . Further, the CPU 41 operates as the specifying unit 14 shown in FIG. 2 by executing the specifying process 54 . Also, the CPU 41 operates as the estimation unit 16 shown in FIG. 2 by executing the estimation process 56 . Further, the CPU 41 operates as the calculator 18 shown in FIG. 2 by executing the calculation process 58 .
- the computer 40 executing the information processing program 50 functions as the information processing device 10 . Note that the CPU 41 that executes the program is hardware.
- the functions realized by the information processing program 50 can also be realized by, for example, a semiconductor integrated circuit, more specifically an ASIC (Application Specific Integrated Circuit) or the like.
- the information processing routine shown in FIG. 8 is executed in the information processing device 10.
- the information processing routine is an example of the information processing method of the disclosed technology.
- the acquisition unit 12 acquires the video output from the tracking camera 20. Then, the acquisition unit 12 calculates the position coordinates of the athlete 32 on the xy plane in which the skating rink 30 is viewed from above by motion tracking from the acquired video.
- in step S12, the acquisition unit 12 acquires frames of the video captured by each of the first camera 22A and the second camera 22B whose shooting directions intersect at the position closest to the position of the athlete.
- in step S14, the specifying unit 14 three-dimensionally analyzes each of the frames acquired in step S12, and calculates the three-dimensional positions of predetermined parts of the athlete 32 and of the equipment worn by the athlete 32. Thereby, the height of the predetermined part is specified.
- in step S16, the estimating unit 16 estimates, as the ice take-off time, the time information of the frame in which the height of the predetermined part specified in step S14 exceeds the reference value indicating the height during on-ice skating, or of the frame immediately before it exceeds the reference value. Also, the estimating unit 16 estimates, as the ice landing time, the time information of the frame in which the height of the predetermined part returns to the reference value after having exceeded it, or of the frame immediately before it returns to the reference value.
- in step S18, the calculation unit 18 calculates the absolute angle of the blade with reference to the shooting direction of the identification camera 22, using the positions of the blade tip 34 and terminal end 36 identified from each of the ice take-off time frame and the ice landing time frame. Further, the calculation unit 18 identifies each of the ice take-off point A and the ice landing point B based on the trajectory information delivered from the acquisition unit 12 and the ice take-off time and ice landing time delivered from the estimating unit 16.
- the calculation unit 18 then identifies the straight line passing through the ice take-off point A and the ice landing point B as the reference line (AB), and calculates the rotation angle θ of the blade by subtracting from the absolute angle of the blade the angle difference between the line based on the shooting direction of the main identification camera 22 and the identified reference line (AB).
- the calculation unit 18 also calculates information such as the height of the jump, the flight distance, and the take-off speed, based on the three-dimensional positions of the predetermined parts identified from each of the frames in the jump section and a predetermined number of frames before and after it.
- the calculation unit 18 outputs the calculation result of the rotation angle ⁇ and other information, and the information processing routine ends.
- the information processing device acquires video captured by each of a plurality of cameras that image the athlete on the skating rink from two intersecting directions.
- the information processing device specifies the height of at least a part of the athlete and of the equipment worn by the athlete in each of the plurality of frames included in the acquired video.
- based on the change in the height of the predetermined part, the information processing device estimates the ice take-off time and the ice landing time of the jump performed by the athlete from the frames corresponding to take-off and landing.
- as a result, the information processing device can estimate the ice take-off time and ice landing time of a jump based on video captured by each of a plurality of cameras that image the athlete from two intersecting directions. Furthermore, information such as the rotation angle of the jump at landing can be calculated with high accuracy based on the accurately estimated ice take-off time and ice landing time.
- an information processing system 200 includes an information processing device 210 and a camera 222 .
- the camera 222 is an imaging device capable of capturing video from which the three-dimensional position of a predetermined part of the athlete 32, or of the clothing worn by the athlete 32, on the skating rink 30 can be identified.
- a plurality of (for example, two) cameras 222 are installed at positions where the three-dimensional position of the above-described predetermined portion can be measured by a stereo camera method.
- the camera 222 outputs video captured at a predetermined frame rate (120 fps, for example).
- the output video includes a plurality of frames, and each frame is associated with time information.
- a single ToF (Time-of-Flight) camera may be used.
- the camera 222 also functions, for example, as an imaging device for motion tracking, which captures an image capable of identifying the position of the athlete 32 on the skating rink 30, like the tracking camera 20 in the first embodiment.
- the information processing device 210 functionally includes an acquisition unit 212, an identification unit 214, an estimation unit 216, and a calculation unit 218, as shown in FIG.
- the acquisition unit 212 acquires the video output from the camera 222 .
- the acquisition unit 212 transfers the acquired video to the identification unit 214 .
- the specifying unit 214 specifies the blade height of the skate shoes worn by the athlete 32 in each of the plurality of frames included in the video transferred from the acquiring unit 212 .
- the identifying unit 214 calculates the three-dimensional positions (x, y, z) of the tip 34 and the terminal end 36 of the blade as shown in FIG. 4, like the identifying unit 14 in the first embodiment.
- from the three-dimensional positions (x, y), the identifying unit 214 calculates trajectory information similar to the trajectory information of the athlete 32 calculated by the acquisition unit 12 in the first embodiment.
- the three-dimensional position z corresponds to the height of the tip 34 or the terminal end 36 of the blade (hereinafter also simply referred to as "blade").
- the identifying unit 214 may also calculate the three-dimensional position (x, y, z) of a predetermined part other than the blade, like the identifying unit 14 in the first embodiment.
- the identifying unit 214 transfers the height z of the blade to the estimating unit 216 , and also transfers the three-dimensional positions (x, y, z) of the blade and other predetermined parts, and trajectory information to the calculating unit 218 .
- the estimating unit 216 estimates the ice take-off time and ice landing time of the jump based on the change in blade height and the time information of each of the plurality of frames received from the identifying unit 214. Specifically, as shown in FIG. 10, the estimating unit 216 estimates, as the ice take-off time, the time corresponding to the change point at which the degree of change in blade height with respect to the time information of each frame changes from within a predetermined value to exceeding the predetermined value.
- the predetermined value may be a value indicating that the height of the blade is almost unchanged, that is, a value from which it can be determined that the athlete 32 is on the ice.
- in the example of FIG. 10, the change in the height z of the blade is within the predetermined value up to time information t_T1, and exceeds the predetermined value from time information t_T2 onward.
- in this case, the estimating unit 216 detects, as the change point, the intersection of a straight line obtained by linearly approximating the height z up to time information t_T1, where the change in the height z of the blade is within the predetermined value, and a straight line obtained by approximating the height z from time information t_T2 onward, where the change in height z exceeds the predetermined value. Then, the estimating unit 216 estimates the time information t_T corresponding to the change point as the ice take-off time.
- similarly, the estimating unit 216 estimates, as the ice landing time, the time corresponding to the change point at which the degree of change in blade height changes from exceeding the predetermined value to within the predetermined value.
- in the example of FIG. 11, the change in the height z of the blade exceeds the predetermined value up to time information t_L2, and is within the predetermined value from time information t_L1 onward.
- the estimating unit 216 approximates with a curve the height z up to the time information tL2, where the change in the height z of the blade exceeds the predetermined value, linearly approximates the height z after the time information tL1, where the change in the height z is within the predetermined value, and detects the intersection of the curve and the straight line as the change point. The estimating unit 216 then estimates the time information tL corresponding to the change point as the ice landing time.
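The landing-time estimate, with a quadratic fit for the in-air segment and a line fit for the on-ice segment, can be sketched analogously. Again this is an illustrative sketch only; the sample data and threshold are invented.

```python
import numpy as np

def estimate_landing_time(times, heights, threshold=0.01):
    """Estimate the ice landing time as the intersection of a quadratic
    curve fitted to the descending in-air heights and a straight line
    fitted to the near-constant heights after landing."""
    dz = np.abs(np.diff(heights))
    # index of the first frame after which changes stay within the threshold
    settle = len(dz) - int(np.argmax(dz[::-1] > threshold))
    c2, c1, c0 = np.polyfit(times[:settle], heights[:settle], 2)  # in-air curve
    a, b = np.polyfit(times[settle:], heights[settle:], 1)        # on-ice line
    # intersection of c2*t^2 + c1*t + c0 and a*t + b
    roots = np.roots([c2, c1 - a, c0 - b])
    roots = roots[np.isreal(roots)].real
    # keep the root closest to the first on-ice frame time
    return roots[np.argmin(np.abs(roots - times[settle]))]

# invented sample: a descending parabola that settles at blade height 0.05 m
times = np.array([0.00, 0.02, 0.04, 0.06, 0.08, 0.10, 0.12, 0.14])
heights = np.array([0.750, 0.562, 0.398, 0.258, 0.142, 0.050, 0.050, 0.050])
t_L = estimate_landing_time(times, heights)  # ≈ 0.10
```

The quadratic-line intersection may have two roots; keeping the root nearest the first on-ice frame selects the physically meaningful one.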
- note that when the time information of a frame is simply estimated as the ice take-off time or ice landing time, the time information of the frame immediately before or after the actual timing is used as the estimate, even though the actual take-off or landing may occur between frames. The change-point approach described above can instead estimate times that fall between frames.
- the estimation unit 216 transfers the estimated ice take-off time and ice accretion time to the calculation unit 218 .
- the calculator 218 calculates the rotation angle θ for each frame, like the calculator 18 in the first embodiment. At this time, as shown in FIG. 12, the calculation unit 218 identifies, based on the trajectory information, the position of the player 32 calculated from each of the frames corresponding to the time information immediately before and after the ice take-off time. Then, the calculation unit 218 specifies, as the ice take-off point A, the position corresponding to the ice take-off time tT between the two identified positions. In FIG. 12, for convenience of explanation, the trajectory of the position of the player 32 is represented by a straight line corresponding to the time axis.
- specifically, let FT1 be the frame corresponding to the time information tT1 immediately before the ice take-off time tT, and FT2 the frame corresponding to the time information tT2 immediately after it.
- the calculation unit 218 sets the position of the player 32 calculated from the frame FT1 to PT1 and the position calculated from the frame FT2 to PT2, and assumes that the position of the player 32 changes linearly between PT1 and PT2.
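Under that linear-motion assumption, the take-off point can be obtained by interpolating between the two frame positions. The positions and times below are invented for illustration; this is a sketch, not the calculation unit's actual implementation.

```python
def interpolate_position(p1, p2, t1, t2, t):
    """Linearly interpolate the athlete's position at time t, assuming the
    position changes linearly between the frames at times t1 and t2."""
    r = (t - t1) / (t2 - t1)
    return tuple(a + r * (b - a) for a, b in zip(p1, p2))

# invented positions on the rink just before/after the estimated take-off time
P_T1, P_T2 = (1.0, 2.0), (1.2, 2.4)
t_T1, t_T2, t_T = 0.00, 0.02, 0.015
A = interpolate_position(P_T1, P_T2, t_T1, t_T2, t_T)  # ≈ (1.15, 2.3)
```

The same interpolation applies to the ice landing point B, using the frames bracketing the landing time.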
- the calculation unit 218 identifies the straight line passing through the identified ice take-off point A and ice landing point B as the reference line (AB) for judging the rotation of the jump, and calculates the rotation angle θ from each of the frames included in the jump section.
- FIG. 14 shows an example of the rotation angle ⁇ calculated from each frame included in the jump section. Furthermore, the calculation unit 218 calculates the rotation angle ⁇ at the ice landing time based on the rotation angle ⁇ calculated from the frames before and after the ice landing time estimated by the estimation unit 216 .
- FIG. 15 is an enlarged view of the portion indicated by the dashed frame in FIG. 14.
- let FL2 be the frame corresponding to the time information tL2 immediately before the ice landing time tL estimated by the estimation unit 216, and FL1 the frame corresponding to the time information tL1 immediately after it. The calculation unit 218 sets the rotation angle of the blade calculated from the frame FL2 to θL2 and the rotation angle calculated from the frame FL1 to θL1, and assumes that the rotation speed during the jump is substantially constant.
- the calculation unit 218 then calculates, as the rotation angle at the ice landing time tL, the angle between θL2 and θL1 corresponding to the ratio of (tL − tL2) to the time (tL1 − tL2) for one frame.
- thereby, the rotation angle at the ice landing time can be calculated with higher accuracy than in the case of the first embodiment.
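Under the constant-rotation-speed assumption, this angle interpolation can be sketched as follows. The angle values and frame times are invented for illustration.

```python
def rotation_angle_at_landing(theta_before, theta_after, t_before, t_after, t_L):
    """Interpolate the blade rotation angle at the estimated landing time t_L
    between the angles computed from the frames just before and just after
    landing, assuming an approximately constant rotation speed."""
    r = (t_L - t_before) / (t_after - t_before)
    return theta_before + r * (theta_after - theta_before)

# invented angles (degrees) from the frames bracketing the landing time
theta_L = rotation_angle_at_landing(350.0, 380.0, 0.50, 0.52, 0.51)  # ≈ 365.0
```

With a landing time midway between the two frames, the interpolated angle is simply the midpoint of the two frame angles.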
- the calculation unit 218 also calculates information other than the rotation angle and outputs the calculation result, like the calculation unit 18 in the first embodiment.
- the information processing device 210 can be realized, for example, by the computer 40 shown in FIG.
- the storage unit 43 of the computer 40 stores an information processing program 250 for causing the computer 40 to function as the information processing device 210 .
- the information processing program 250 has an acquisition process 252 , an identification process 254 , an estimation process 256 and a calculation process 258 .
- the CPU 41 reads out the information processing program 250 from the storage unit 43, develops it in the memory 42, and sequentially executes the processes of the information processing program 250.
- the CPU 41 operates as the acquisition unit 212 shown in FIG. 9 by executing the acquisition process 252 . Further, the CPU 41 operates as the specifying unit 214 shown in FIG. 9 by executing the specifying process 254 . Further, the CPU 41 operates as the estimation unit 216 shown in FIG. 9 by executing the estimation process 256 . Further, the CPU 41 operates as the calculation unit 218 shown in FIG. 9 by executing the calculation process 258 .
- the computer 40 executing the information processing program 250 functions as the information processing device 210 .
- the functions realized by the information processing program 250 can also be realized by, for example, a semiconductor integrated circuit, more specifically an ASIC or the like.
- the information processing routine shown in FIG. 16 is executed in the information processing device 210 .
- the information processing routine is an example of the information processing method of technology disclosed herein.
- in step S210, the acquisition unit 212 acquires the video output from the camera 222 and transfers it to the identifying unit 214. The identifying unit 214 then calculates the three-dimensional position (x, y, z) of predetermined parts of the player 32, including the blade, in each of the plurality of frames included in the video. This three-dimensional position (x, y, z) includes the trajectory information of the player 32 and the height of the blade.
- in step S212, the estimating unit 216 estimates, as the ice take-off time, the time corresponding to the change point at which the degree of change in the blade height with respect to the time information of each of the plurality of frames changes from within a predetermined value to exceeding the predetermined value. Similarly, the estimating unit 216 estimates, as the ice landing time, the time corresponding to the change point at which the degree of change in the blade height z changes from exceeding the predetermined value to within the predetermined value.
- in step S214, the calculation unit 218 identifies, based on the trajectory information of the player 32, the position of the player 32 calculated from each of the frames corresponding to the time information immediately before and after the estimated ice take-off time. Then, the calculation unit 218 specifies, as the ice take-off point A, the position corresponding to the ice take-off time between the two identified positions. Similarly, the calculation unit 218 identifies the position of the player 32 calculated from each of the frames corresponding to the time information immediately before and after the estimated ice landing time, and specifies, as the ice landing point B, the position corresponding to the ice landing time between the two identified positions. Further, the calculation unit 218 identifies the straight line passing through the identified ice take-off point A and ice landing point B as the reference line (AB) for judging the rotation of the jump.
- in step S216, the calculation unit 218 calculates the rotation angle θ from each frame included in the jump section using the identified reference line (AB). Then, the calculation unit 218 calculates the rotation angle corresponding to the ice landing time from the rotation angles calculated from the frames corresponding to the time information immediately before and after the estimated ice landing time.
- in step S218, the calculation unit 218 calculates information other than the rotation angle, outputs the calculation result together with the rotation angle calculated in step S216, and ends the information processing routine.
- the information processing device acquires the video captured by the camera that captures the athlete on the skating rink.
- the information processing device also identifies the height of the blades of the skate shoes worn by the athlete in each of the plurality of frames included in the acquired video.
- the information processing device estimates the ice take-off time and ice landing time of the jump based on the change in the height of the blade and the time information of each of the plurality of frames. As a result, it is possible to estimate the ice take-off time and ice landing time of a jump in figure skating without attaching a sensor or the like to the athlete or the equipment worn by the athlete.
- the information processing device detects the points of change in the height of the blade and estimates the time information corresponding to those points as the ice take-off time and the ice landing time, which makes it possible to estimate the ice take-off and landing times more precisely than the frame interval. Further, based on these detailed estimates, information such as the rotation angle of the jump at the time of landing can be calculated accurately.
- an information processing system 300 includes an information processing device 310, a tracking camera 20, and an identification camera 22.
- the information processing device 310 functionally includes an acquisition unit 12, an identification unit 14, an estimation unit 216, and a calculation unit 218, as shown in FIG. That is, the information processing apparatus 310 according to the third embodiment has a configuration in which the information processing apparatus 10 according to the first embodiment and the information processing apparatus 210 according to the second embodiment are combined.
- the information processing device 310 selects, from among the plurality of identification cameras 22, the optimum identification camera 22 based on the position of the athlete 32 obtained from the video captured by the tracking camera 20. Then, the information processing device 310 calculates the three-dimensional position of the predetermined parts, including the blade, from the video captured by the selected identification camera 22, which also identifies the height of the blade. Further, the information processing device 310 detects the points of change in the height of the blade and estimates the time information corresponding to those points as the ice take-off time and the ice landing time.
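Selecting the camera pair whose shooting-direction intersection is nearest the athlete, as in the first embodiment, can be sketched as a nearest-point search. The camera identifiers and coordinates are hypothetical.

```python
import math

def select_camera_pair(athlete_xy, candidates):
    """Pick the (first camera, second camera) pair whose shooting-direction
    intersection point lies closest to the athlete's position on the rink."""
    return min(candidates, key=lambda c: math.dist(athlete_xy, c[1]))[0]

# hypothetical camera pairs and the points where their shooting directions cross
candidates = [(("first-1", "second-1"), (5.0, 10.0)),
              (("first-2", "second-2"), (15.0, 10.0))]
pair = select_camera_pair((13.0, 9.0), candidates)  # → ("first-2", "second-2")
```

In practice the intersection points are fixed by the camera installation, so they can be precomputed once and reused for every frame.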
- the information processing device 310 can be realized, for example, by the computer 40 shown in FIG.
- the storage unit 43 of the computer 40 stores an information processing program 350 for causing the computer 40 to function as the information processing device 310 .
- the information processing program 350 has an acquisition process 52 , an identification process 54 , an estimation process 256 and a calculation process 258 .
- the CPU 41 reads out the information processing program 350 from the storage unit 43, develops it in the memory 42, and sequentially executes the processes of the information processing program 350.
- the CPU 41 operates as the acquisition unit 12 shown in FIG. 2 by executing the acquisition process 52 . Further, the CPU 41 operates as the specifying unit 14 shown in FIG. 2 by executing the specifying process 54 . Further, the CPU 41 operates as the estimation unit 216 shown in FIG. 2 by executing the estimation process 256 . Further, the CPU 41 operates as the calculation unit 218 shown in FIG. 2 by executing the calculation process 258 .
- the computer 40 executing the information processing program 350 functions as the information processing device 310 .
- the functions realized by the information processing program 350 can also be realized by, for example, a semiconductor integrated circuit, more specifically an ASIC or the like.
- the information processing apparatus 310 executes steps S10 to S14 of the information processing routine shown in FIG. 8 and steps S212 to S218 of the information processing routine shown in FIG. 16.
- as described above, according to the information processing system of the third embodiment, the ice take-off time and ice landing time of a jump in figure skating can be estimated with even higher accuracy.
- in the third embodiment, the case has been described in which the optimum identification camera 22 is selected for each frame from among the plurality of identification cameras 22 based on the position of the athlete 32 obtained from the video captured by the tracking camera 20; however, the technique is not limited to this. Rough ice take-off and landing times may first be obtained from the video captured by the tracking camera 20, and the optimum identification camera 22 may then be selected for the frames corresponding to those times.
- for example, the information processing device acquires, from each frame of the video captured by the tracking camera 20, the three-dimensional position of a predetermined part of the player 32, including the position in the height direction. Then, the information processing device acquires, as the rough ice take-off time, the time information of the frame in which the height of the predetermined part exceeds a reference value indicating the height when on the ice, or of the frame immediately before the reference value is exceeded. Likewise, the information processing device acquires, as the rough ice landing time, the time information of the frame in which the height of the predetermined part returns to the reference value from the state exceeding it, or of the frame immediately before it returns to the reference value.
- the information processing device then identifies the section from a predetermined number of frames before the frame corresponding to the rough ice take-off time to a predetermined number of frames after the frame corresponding to the rough ice landing time. Based on the position of the athlete 32 on the skating rink 30 acquired from each frame included in the identified section, the information processing device selects the optimum identification camera 22 for each such frame, in the same manner as described above. This makes it possible to limit both the selection of the optimum identification camera 22 and the processing of the video captured by the identification camera 22 to the jump section, reducing the amount of processing.
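The rough threshold-based frame detection and the jump-section windowing described above can be sketched as follows. The heights, reference value, and margin are invented for illustration.

```python
def rough_jump_frames(heights, reference):
    """Indices of the rough take-off frame (first frame whose height exceeds
    the on-ice reference value) and the rough landing frame (first frame whose
    height is back at the reference value)."""
    above = [h > reference for h in heights]
    takeoff = above.index(True)
    landing = takeoff + above[takeoff:].index(False)
    return takeoff, landing

heights = [0.05, 0.05, 0.20, 0.40, 0.20, 0.05, 0.05]  # blade height per frame (m)
takeoff, landing = rough_jump_frames(heights, reference=0.10)  # → (2, 5)
margin = 5  # a few frames of margin around the jump
section = range(max(0, takeoff - margin), min(len(heights), landing + margin + 1))
```

Only the frames in `section` then need the per-frame camera selection and identification processing.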
- in each of the above embodiments, the mode in which the information processing program is pre-stored (installed) in the storage unit has been described, but the technique is not limited to this.
- the program according to the technology disclosed herein can also be provided in a form stored in a storage medium such as a CD-ROM, DVD-ROM, USB memory, or the like.
Abstract
Description
As shown in FIG. 1 and FIG. 2, the information processing system 100 according to the first embodiment includes an information processing device 10, a tracking camera 20, and an identification camera 22.
Next, the second embodiment will be described. In the information processing system according to the second embodiment, components similar to those of the information processing system 100 according to the first embodiment are denoted by the same reference numerals, and detailed description thereof is omitted. In the second embodiment, for functional units whose reference numerals share the last two digits with those of the first embodiment, detailed description of the functions common to the first embodiment is likewise omitted.
Next, the third embodiment will be described. In the information processing system according to the third embodiment, components similar to those of the information processing system 100 according to the first embodiment and the information processing system according to the second embodiment are denoted by the same reference numerals, and detailed description thereof is omitted.
10, 210, 310 Information processing device
12, 212 Acquisition unit
14, 214 Identification unit
16, 216 Estimation unit
18, 218 Calculation unit
20 Tracking camera
22 Identification camera
22A First camera
22B Second camera
222 Camera
30 Skating rink
32 Athlete
34 Blade tip
36 Blade end
40 Computer
41 CPU
42 Memory
43 Storage unit
49 Storage medium
50, 250, 350 Information processing program
Claims (20)
- Acquiring video captured by each of a plurality of cameras that capture an athlete on a skating rink from two intersecting directions,
identifying, in each of a plurality of frames included in the video, the height of at least a part of the athlete and of an item worn by the athlete, and
estimating, based on the change in the height, the ice take-off time and ice landing time of a jump performed by the athlete from the frame corresponding to the take-off and the frame corresponding to the landing:
an information processing program for causing a computer to execute processing including the above. - The information processing program according to claim 1, wherein the processing of acquiring the video includes acquiring video captured by each of a first camera and a second camera for which, among a plurality of first cameras whose shooting directions are parallel to the major axis direction of the skating rink and a plurality of second cameras whose shooting directions are parallel to the minor axis direction of the skating rink, the position at which the shooting direction of the first camera and that of the second camera intersect is closest to the position of the athlete.
- The information processing program according to claim 1 or 2, wherein the processing of estimating the ice take-off time and the ice landing time includes estimating, as the ice take-off time, the time information of the frame in which the height of the part exceeds a reference value indicating the height when on the ice, or of the frame immediately before the reference value is exceeded, and estimating, as the ice landing time, the time information of the frame in which the height of the part returns to the reference value from the state exceeding it, or of the frame immediately before it returns to the reference value.
- The information processing program according to any one of claims 1 to 3, wherein the part is a blade of a skate shoe worn by the athlete.
- Acquiring video captured by a camera that captures an athlete on a skating rink,
identifying, in each of a plurality of frames included in the video, the height of a blade of a skate shoe worn by the athlete, and
estimating the ice take-off time and ice landing time of a jump performed by the athlete based on the change in the height and the time information of each of the plurality of frames:
an information processing program for causing a computer to execute processing including the above. - The information processing program according to claim 5, wherein the processing of estimating the ice take-off time includes estimating, as the ice take-off time, the time corresponding to a change point at which the degree of change in the height with respect to the time information of each of the plurality of frames changes from within a predetermined value to exceeding the predetermined value, and the processing of estimating the ice landing time includes estimating, as the ice landing time, the time corresponding to a change point at which the degree of change in the height changes from exceeding the predetermined value to within the predetermined value.
- The information processing program according to claim 5 or 6, wherein the processing of acquiring the video includes acquiring video captured by each of a plurality of cameras that capture the athlete from two intersecting directions.
- The information processing program according to claim 7, wherein the processing of acquiring the video includes acquiring video captured by each of a first camera and a second camera for which, among a plurality of first cameras whose shooting directions are parallel to the major axis direction of the skating rink and a plurality of second cameras whose shooting directions are parallel to the minor axis direction of the skating rink, the position at which the shooting direction of the first camera and that of the second camera intersect is closest to the position of the athlete.
- The information processing program according to any one of claims 4 to 8, causing the computer to execute processing further including calculating, at each of the ice take-off time and the ice landing time, the angle of the blade with respect to a reference line based on the direction of the blade.
- The information processing program according to claim 9, wherein the angle of the blade at the ice landing time is calculated based on the angles calculated from the frames before and after the ice landing time.
- The information processing program according to any one of claims 1 to 10, wherein at least one of take-off speed, jump height, jump distance, and rotation speed is calculated based on the three-dimensional positions of at least a part of the athlete and of an item worn by the athlete in each frame from a predetermined number of frames before the frame corresponding to the ice take-off time to a predetermined number of frames after the frame corresponding to the ice landing time.
- An information processing device comprising:
an acquisition unit that acquires video captured by each of a plurality of cameras that capture an athlete on a skating rink from two intersecting directions;
an identification unit that identifies, in each of a plurality of frames included in the video, the height of at least a part of the athlete and of an item worn by the athlete; and
an estimation unit that estimates, based on the change in the height, the ice take-off time and ice landing time of a jump performed by the athlete from the frame corresponding to the take-off and the frame corresponding to the landing. - The information processing device according to claim 12, wherein the acquisition unit acquires video captured by each of a first camera and a second camera for which, among a plurality of first cameras whose shooting directions are parallel to the major axis direction of the skating rink and a plurality of second cameras whose shooting directions are parallel to the minor axis direction of the skating rink, the position at which the shooting direction of the first camera and that of the second camera intersect is closest to the position of the athlete.
- The information processing device according to claim 12 or 13, wherein the estimation unit estimates, as the ice take-off time, the time information of the frame in which the height of the part exceeds a reference value indicating the height when on the ice, or of the frame immediately before the reference value is exceeded, and estimates, as the ice landing time, the time information of the frame in which the height of the part returns to the reference value from the state exceeding it, or of the frame immediately before it returns to the reference value.
- An information processing device comprising:
an acquisition unit that acquires video captured by a camera that captures an athlete on a skating rink;
an identification unit that identifies, in each of a plurality of frames included in the video, the height of a blade of a skate shoe worn by the athlete; and
an estimation unit that estimates the ice take-off time and ice landing time of a jump performed by the athlete based on the change in the height and the time information of each of the plurality of frames. - The information processing device according to claim 15, wherein the estimation unit estimates, as the ice take-off time, the time corresponding to a change point at which the degree of change in the height with respect to the time information of each of the plurality of frames changes from within a predetermined value to exceeding the predetermined value, and estimates, as the ice landing time, the time corresponding to a change point at which the degree of change in the height changes from exceeding the predetermined value to within the predetermined value.
- An information processing method in which a computer executes processing including:
acquiring video captured by each of a plurality of cameras that capture an athlete on a skating rink from two intersecting directions;
identifying, in each of a plurality of frames included in the video, the height of at least a part of the athlete and of an item worn by the athlete; and
estimating, based on the change in the height, the ice take-off time and ice landing time of a jump performed by the athlete from the frame corresponding to the take-off and the frame corresponding to the landing. - An information processing method in which a computer executes processing including:
acquiring video captured by a camera that captures an athlete on a skating rink;
identifying, in each of a plurality of frames included in the video, the height of a blade of a skate shoe worn by the athlete; and
estimating the ice take-off time and ice landing time of a jump performed by the athlete based on the change in the height and the time information of each of the plurality of frames. - A storage medium storing an information processing program for causing a computer to execute processing including:
acquiring video captured by each of a plurality of cameras that capture an athlete on a skating rink from two intersecting directions;
identifying, in each of a plurality of frames included in the video, the height of at least a part of the athlete and of an item worn by the athlete; and
estimating, based on the change in the height, the ice take-off time and ice landing time of a jump performed by the athlete from the frame corresponding to the take-off and the frame corresponding to the landing. - A storage medium storing an information processing program for causing a computer to execute processing including:
acquiring video captured by a camera that captures an athlete on a skating rink;
identifying, in each of a plurality of frames included in the video, the height of a blade of a skate shoe worn by the athlete; and
estimating the ice take-off time and ice landing time of a jump performed by the athlete based on the change in the height and the time information of each of the plurality of frames.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020237032930A KR20230147733A (ko) | 2021-04-05 | 2021-04-05 | 정보 처리 프로그램, 장치, 및 방법 |
EP21935924.7A EP4300948A1 (en) | 2021-04-05 | 2021-04-05 | Information processing program, device, and method |
PCT/JP2021/014499 WO2022215116A1 (ja) | 2021-04-05 | 2021-04-05 | 情報処理プログラム、装置、及び方法 |
JP2023512507A JPWO2022215116A1 (ja) | 2021-04-05 | 2021-04-05 | |
CN202180096595.3A CN117203956A (zh) | 2021-04-05 | 2021-04-05 | 信息处理程序、装置、以及方法 |
US18/476,925 US20240020976A1 (en) | 2021-04-05 | 2023-09-28 | Information processing program, device, and method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/014499 WO2022215116A1 (ja) | 2021-04-05 | 2021-04-05 | 情報処理プログラム、装置、及び方法 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/476,925 Continuation US20240020976A1 (en) | 2021-04-05 | 2023-09-28 | Information processing program, device, and method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022215116A1 true WO2022215116A1 (ja) | 2022-10-13 |
Family
ID=83545227
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/014499 WO2022215116A1 (ja) | 2021-04-05 | 2021-04-05 | 情報処理プログラム、装置、及び方法 |
Country Status (6)
Country | Link |
---|---|
US (1) | US20240020976A1 (ja) |
EP (1) | EP4300948A1 (ja) |
JP (1) | JPWO2022215116A1 (ja) |
KR (1) | KR20230147733A (ja) |
CN (1) | CN117203956A (ja) |
WO (1) | WO2022215116A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015082817A (ja) | 2013-10-24 | 2015-04-27 | ソニー株式会社 | 情報処理装置、記録媒体、および情報処理方法 |
WO2016092933A1 (ja) * | 2014-12-08 | 2016-06-16 | ソニー株式会社 | 情報処理装置、情報処理方法およびプログラム |
WO2016098415A1 (ja) * | 2014-12-18 | 2016-06-23 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP2018142815A (ja) * | 2017-02-27 | 2018-09-13 | 富士通株式会社 | 3次元データ取得装置及び方法 |
WO2019229818A1 (ja) * | 2018-05-28 | 2019-12-05 | 富士通株式会社 | 表示方法、表示プログラムおよび情報処理装置 |
JP2020030190A (ja) * | 2018-08-24 | 2020-02-27 | 独立行政法人日本スポーツ振興センター | 位置追跡システム、及び位置追跡方法 |
JP2020031406A (ja) * | 2018-08-24 | 2020-02-27 | 独立行政法人日本スポーツ振興センター | 判定システム、及び判定方法 |
Also Published As
Publication number | Publication date |
---|---|
JPWO2022215116A1 (ja) | 2022-10-13 |
KR20230147733A (ko) | 2023-10-23 |
US20240020976A1 (en) | 2024-01-18 |
EP4300948A1 (en) | 2024-01-03 |
CN117203956A (zh) | 2023-12-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7362806B2 (ja) | 情報処理装置、情報処理装置の制御方法、情報処理システム及びプログラム | |
KR102290932B1 (ko) | 레이더 데이터 및 이미지화기 데이터를 사용하여 물체를 추적하기 위한 디바이스, 시스템, 및 방법 | |
US11348255B2 (en) | Techniques for object tracking | |
EP2227299B1 (en) | Methods and processes for detecting a mark on a playing surface and for tracking an object | |
US10769810B2 (en) | Apparatus, systems and methods for shadow assisted object recognition and tracking | |
US10922871B2 (en) | Casting a ray projection from a perspective view | |
KR102226623B1 (ko) | 실내 스포츠를 위한 카메라를 이용한 운동량 산출 시스템 | |
WO2022215116A1 (ja) | 情報処理プログラム、装置、及び方法 | |
US20220392222A1 (en) | Information processing program, device, and method | |
KR101703316B1 (ko) | 영상을 기반으로 속도를 측정하는 방법 및 장치 | |
WO2021186645A1 (ja) | 情報処理プログラム、装置、及び方法 | |
KR101971060B1 (ko) | 모듈형 고속 촬영 장치, 고속 영상 기반의 공 운동 인식 장치 및 방법 | |
US20210343025A1 (en) | Method and controller for tracking moving objects | |
WO2021056552A1 (zh) | 视频的处理方法和装置 | |
CN111288972A (zh) | 应用于虚拟射箭系统的箭的运动参数测量系统和方法 | |
TWI775637B (zh) | 高爾夫揮桿解析系統、高爾夫揮桿解析方法及資訊記憶媒體 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21935924 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2023512507 Country of ref document: JP |
|
ENP | Entry into the national phase |
Ref document number: 20237032930 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 1020237032930 Country of ref document: KR |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021935924 Country of ref document: EP |
|
ENP | Entry into the national phase |
Ref document number: 2021935924 Country of ref document: EP Effective date: 20230926 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |