US20220394322A1 - Information processing program, device, and method - Google Patents

Information processing program, device, and method Download PDF

Info

Publication number
US20220394322A1
US20220394322A1 (application US17/890,024)
Authority
US
United States
Prior art keywords
ice
time
location
video
competitor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US17/890,024
Other languages
English (en)
Inventor
Kouji Nakamura
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Assigned to FUJITSU LIMITED. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: NAKAMURA, KOUJI
Publication of US20220394322A1
Legal status: Abandoned

Links

Images

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00 Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06 Indicating or scoring devices for games or players, or for other sports activities
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302 Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307 Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/215 Motion-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/439 Processing of audio elementary streams
    • H04N21/4394 Processing of audio elementary streams involving operations for analysing the audio stream, e.g. detecting features or characteristics in audio streams
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/81 Monomedia components thereof
    • H04N21/8126 Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80 Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85 Assembly of content; Generation of multimedia applications
    • H04N21/854 Content authoring
    • H04N21/8547 Content authoring involving timestamps for synchronizing content
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/10 Positions
    • A63B2220/13 Relative positions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/806 Video cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/807 Photo cameras
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/808 Microphones
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00 Measuring of physical parameters relating to sporting activity
    • A63B2220/80 Special sensors, transducers or devices therefor
    • A63B2220/83 Special sensors, transducers or devices therefor characterised by the position of the sensor
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63B APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2244/00 Sports without balls
    • A63B2244/18 Skating
    • A63B2244/183 Artistic or figure skating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30221 Sports video; Sports image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30241 Trajectory

Definitions

  • the disclosed technology relates to an information processing program, an information processing device, and an information processing method.
  • a predetermined point or section, such as a location or a posture of a competitor, is specified from a video taken during a sport competition.
  • the predetermined point is, for example, a moment of impact on a ball in golf, baseball, tennis, or the like, or a moment of jumping or landing in gymnastics or the like.
  • an information processing device that specifies a decisive moment from consecutive motions of a subject and extracts the moment as an image has been proposed.
  • the device receives sensor data from a sensor attached to a user or an object in contact with the user, and time information corresponding to the sensor data.
  • the device specifies a time at which a predetermined motion pattern occurs in the user or the object based on the sensor data and the time information. Then, the device selects one or a plurality of images from a series of images including the user or the object photographed at predetermined time intervals according to a specified time.
  • Patent Document 1: Japanese Patent Application Laid-Open (JP-A) No. 2015-82817
  • a non-transitory recording medium storing an information processing program executable by a computer to perform processing, the processing comprising: acquiring a sound signal collected by a microphone provided in a venue including a skating rink, and a first video obtained by photographing a competitor on the skating rink; estimating a takeoff-from-ice time and a landing-on-ice time of a jump performed by the competitor according to a silencing and a return of an ice sound based on the sound signal; and synchronizing time information of the sound signal with time information of the first video, and specifying each of a location corresponding to the takeoff-from-ice time and a location corresponding to the landing-on-ice time in a trajectory of a location of the competitor on the skating rink based on the first video.
  • FIG. 1 is a block diagram illustrating a schematic configuration of an information processing system according to a first embodiment.
  • FIG. 2 is a functional block diagram of an information processing device according to the first embodiment.
  • FIG. 3 is a diagram for explaining estimation of a takeoff-from-ice time and a landing-on-ice time.
  • FIG. 4 is a diagram for explaining a delay time of a sound signal for a video.
  • FIG. 5 is a diagram for explaining specification of a takeoff-from-ice point and a landing-on-ice point.
  • FIG. 6 is a diagram for explaining specific details of the takeoff-from-ice point.
  • FIG. 7 is a block diagram illustrating a schematic configuration of a computer that functions as the information processing device.
  • FIG. 8 is a flowchart illustrating an example of an information processing routine according to the first embodiment.
  • FIG. 9 is a block diagram illustrating a schematic configuration of an information processing system according to a second embodiment.
  • FIG. 10 is a functional block diagram of an information processing device according to the second embodiment.
  • FIG. 11 is a diagram for explaining specification of a reference line.
  • FIG. 12 is a diagram for explaining specification of a frame of a second video indicating a jump section.
  • FIG. 13 is a diagram for explaining calculation of locations of a distal end and a terminal end of a blade as a predetermined part.
  • FIG. 14 is a diagram illustrating a rotation angle θ calculated from each of the frames included in a jump section.
  • FIG. 15 is an enlarged view of a portion indicated by a broken-line frame in FIG. 14.
  • FIG. 16 is a flowchart illustrating an example of an information processing routine according to the second embodiment.
  • an information processing system 100 includes an information processing device 10 , a microphone 20 , and a camera 22 .
  • the information processing system 100 performs information processing on a sound signal output from the microphone 20 and a video output from the camera 22 , and specifies a jump section from a takeoff-from-ice point to a landing-on-ice point on a trajectory of a location of a competitor.
  • the microphone 20 is installed in ice of a skating rink 30 .
  • the microphone 20 can be installed in the ice by being embedded in the skating rink 30 when the ice is laid.
  • the microphone 20 collects sound in the competition venue and outputs a sound signal. Since the microphone 20 is installed in the ice, sound components such as cheers and music are suppressed, and the frictional sound between the surface (ice) of the skating rink 30 and the blade of a skate shoe becomes the dominant component of the collected sound signal. Time information is associated with each sampling point of the output sound signal.
  • the frictional sound between the surface (ice) of the skating rink 30 and the blade of the skate shoe is an example of an ice sound.
  • the camera 22 is, for example, a photographing device for motion tracking that captures video from which the location of the competitor 32 on the skating rink 30 can be specified.
  • a plurality of (for example, three) cameras 22 are installed on a ceiling, a side wall, or the like of the venue. In FIG. 1, only one camera 22 is illustrated.
  • each camera 22 outputs a video photographed at a predetermined frame rate (for example, 120 fps).
  • the output video includes a plurality of frames, and time information is associated with each frame.
  • the information processing device 10 functionally includes an acquisition section 12, an estimation section 14, a specifying section 16, and an output section 18, as illustrated in FIG. 2.
  • the acquisition section 12 acquires the sound signal output from the microphone 20 and the video output from each of the plurality of cameras 22 .
  • the acquisition section 12 delivers the acquired sound signal to the estimation section 14 and delivers the acquired video to the specifying section 16 .
  • the estimation section 14 estimates a takeoff-from-ice time and a landing-on-ice time of a jump performed by the competitor according to a silencing and a return of the ice sound based on the sound signal. For example, the estimation section 14 estimates the takeoff-from-ice time and the landing-on-ice time of the jump performed by the competitor based on a section in which a level of the sound signal is equal to or less than a predetermined threshold value. This is based on the fact that a frictional sound between the blade and the ice disappears at the time of takeoff-from-ice at the start of the jump, and the frictional sound returns at the time of landing-on-ice.
  • the estimation section 14 estimates a time at which the sound signal becomes equal to or less than a threshold value TH as a takeoff-from-ice time tA.
  • the estimation section 14 estimates, as a landing-on-ice time tB, the time at which the sound signal that has been equal to or less than the threshold value TH again exceeds the threshold value TH.
  • the estimation section 14 may estimate the takeoff-from-ice time tA and the landing-on-ice time tB of the jump based on the sound signal from which a predetermined frequency component included in the sound signal has been removed.
  • the predetermined frequency component can be, for example, a frequency component corresponding to a sound other than frictional sound between the blade and ice, such as cheers and music. As described above, in a case where the microphone 20 is installed in ice, sounds other than frictional sounds between the blade and the ice, such as cheers and music, are suppressed.
  • the takeoff-from-ice time tA and the landing-on-ice time tB of the jump can be estimated with higher accuracy by removing frequency components corresponding to sounds other than frictional sounds between the blade and the ice.
  • in a case where the sound signal includes many cheers, music, and the like, removing the predetermined frequency component is particularly effective.
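  • as a rough illustration (not code from the patent), the following Python sketch estimates the takeoff-from-ice time tA and the landing-on-ice time tB from a mono sound signal by band-stop filtering and level thresholding; the band edges, smoothing window, and threshold value are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def estimate_jump_times(signal, fs, threshold, stop_band=(200.0, 4000.0)):
    """Estimate takeoff-from-ice time tA and landing-on-ice time tB [s].

    The band-stop filter stands in for removing the "predetermined
    frequency component" (cheers, music); its edges are assumptions,
    as are the 10 ms smoothing window and the threshold value TH.
    Assumes fs is well above twice the upper band edge.
    """
    # remove frequency components other than the blade-ice friction sound
    sos = butter(4, stop_band, btype="bandstop", fs=fs, output="sos")
    filtered = sosfiltfilt(sos, signal)

    # short-term RMS level of the sound signal
    win = max(1, int(0.01 * fs))
    level = np.sqrt(np.convolve(filtered ** 2, np.ones(win) / win, mode="same"))

    quiet = level <= threshold                 # ice sound silenced
    edges = np.diff(quiet.astype(int))
    silencings = np.where(edges == 1)[0]       # level falls to TH or below
    returns = np.where(edges == -1)[0]         # level exceeds TH again
    if len(silencings) == 0:
        return None
    tA = silencings[0] / fs                    # takeoff-from-ice time
    later = returns[returns > silencings[0]]
    if len(later) == 0:
        return None
    tB = later[0] / fs                         # landing-on-ice time
    return tA, tB
```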
  • the estimation section 14 delivers the estimated takeoff-from-ice time tA and landing-on-ice time tB of the jump to the specifying section 16 .
  • the specifying section 16 acquires a trajectory of the location of the competitor 32 on the skating rink 30 by motion tracking from the video delivered from the acquisition section 12 .
  • the specifying section 16 recognizes the competitor 32 that is a target of motion tracking for each frame of the video photographed by each camera 22 .
  • the specifying section 16 recognizes the competitor 32 in each frame based on characteristics, such as color and shape, of the competitor 32 or of a wearing object worn by the competitor 32.
  • the specifying section 16 may recognize a moving object indicated by a difference between frames as the competitor 32 .
  • the specifying section 16 calculates, for each frame, the location of the competitor 32 recognized in that frame.
  • the specifying section 16 generates trajectory information by tracking the location of the competitor in each frame.
  • the location of the competitor may be calculated in three dimensions or in two dimensions. In the present embodiment, a case of calculating the location coordinates (x, y) of the recognized competitor 32 in an xy plane in a plan view of the skating rink 30 will be described.
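  • a minimal sketch of one of the recognition options above (the moving object indicated by the difference between frames) is shown below using OpenCV; the function name and the area and threshold values are illustrative, and the conversion from image coordinates to rink-plane (x, y) coordinates (e.g., by homography from calibration) is omitted.

```python
import cv2
import numpy as np

def track_competitor(video_path, min_area=500.0):
    """Track the competitor as the largest moving object found by
    inter-frame differencing; returns (time [s], x, y) per frame in
    image coordinates. min_area and the binarization threshold are
    illustrative tuning values."""
    cap = cv2.VideoCapture(video_path)
    trajectory = []
    ok, prev = cap.read()
    if not ok:
        return trajectory
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # moving object indicated by the difference between frames
        diff = cv2.absdiff(gray, prev_gray)
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        mask = cv2.dilate(mask, np.ones((5, 5), np.uint8))
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        moving = [c for c in contours if cv2.contourArea(c) > min_area]
        if moving:
            m = cv2.moments(max(moving, key=cv2.contourArea))
            t = cap.get(cv2.CAP_PROP_POS_MSEC) / 1000.0  # frame time information
            trajectory.append((t, m["m10"] / m["m00"], m["m01"] / m["m00"]))
        prev_gray = gray
    cap.release()
    return trajectory
```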
  • the specifying section 16 synchronizes the time information of the sound signal with the time information of the video.
  • the specifying section 16 may reflect a delay time of the sound signal with respect to the video according to a distance between the competitor 32 and the microphone 20 .
  • the delay time is X/3230, using the sound speed of 3,230 [m/s] in ice, where X [m] is the distance between the competitor 32 and the microphone 20.
  • the specifying section 16 may use the delay time obtained as described above to synchronize time information obtained by subtracting the delay time from the time information of the sound signal with time information of the video.
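  • a sketch of this correction, under the assumption of straight-line propagation through the ice at 3,230 [m/s]; corrected_event_time and the coordinate arguments are illustrative names.

```python
import numpy as np

SOUND_SPEED_IN_ICE = 3230.0  # [m/s], the value used in this embodiment

def corrected_event_time(t_sound, competitor_xy, microphone_xy):
    """Map an event time estimated from the sound signal onto the video
    timeline by subtracting the propagation delay X / 3230."""
    x = np.hypot(competitor_xy[0] - microphone_xy[0],
                 competitor_xy[1] - microphone_xy[1])  # distance X [m]
    return t_sound - x / SOUND_SPEED_IN_ICE            # delay-corrected time [s]
```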
  • the specifying section 16 specifies a section from a location corresponding to the takeoff-from-ice time tA to a location corresponding to the landing-on-ice time tB as a jump section on the trajectory of the location of the competitor 32 .
  • each of the takeoff-from-ice time tA and the landing-on-ice time tB is time information obtained by subtracting the delay time from each of the takeoff-from-ice time tA and the landing-on-ice time tB estimated by the estimation section 14 .
  • the specifying section 16 specifies the location of the competitor 32 acquired from each of the frames corresponding to the time information immediately before and immediately after the takeoff-from-ice time tA in the video whose time information is synchronized with that of the sound signal. Then, the specifying section 16 specifies a location (hereinafter, also referred to as a "takeoff-from-ice point") corresponding to the takeoff-from-ice time tA between the two specified locations.
  • the trajectory of the location of the competitor 32 is represented by a straight line corresponding to a time axis of the sound signal.
  • the specifying section 16 sets a frame corresponding to time information t1 immediately before the takeoff-from-ice time tA to m1, and sets a frame corresponding to time information t2 immediately after the takeoff-from-ice time tA to m2.
  • the specifying section 16 assumes that the location acquired from the frame m1 is P1, the location acquired from the frame m2 is P2, and the location of the competitor 32 changes linearly between P1 and P2.
  • the specifying section 16 also specifies a location (hereinafter, also referred to as a “landing-on-ice point”) corresponding to the landing-on-ice time tB in the same manner as the takeoff-from-ice point.
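  • the interpolation for both points can be sketched as follows, assuming (as above) that the location changes linearly between the frames immediately before and after the event time; interpolate_point is an illustrative name.

```python
def interpolate_point(t_event, t1, p1, t2, p2):
    """Location on the trajectory at the (delay-corrected) event time,
    assuming linear motion between the frame immediately before the
    event (time t1, location P1 = p1) and the frame immediately after
    (time t2, location P2 = p2)."""
    a = (t_event - t1) / (t2 - t1)      # fraction of the frame interval elapsed
    return (p1[0] + a * (p2[0] - p1[0]),
            p1[1] + a * (p2[1] - p1[1]))

# takeoff-from-ice point A and landing-on-ice point B:
# A = interpolate_point(tA, t1, P1, t2, P2)
# B = interpolate_point(tB, ...)  # same procedure around tB
```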
  • the specifying section 16 delivers the trajectory of the location of the competitor 32 acquired by the motion tracking and the specified locations of the takeoff-from-ice point and the landing-on-ice point to the output section 18 .
  • the output section 18 superimposes an image indicating the trajectory of the location of the competitor 32 on the image indicating the skating rink 30 , and generates and outputs image data for displaying the specified jump section in a display mode different from other sections in the trajectory.
  • the output section 18 generates image data indicating an image 38 as illustrated in FIG. 1 .
  • in the image 38, the trajectory is indicated by a broken line, and the jump section is indicated by a solid line.
  • the image 38 indicated by the output image data is displayed on, for example, a display device or the like used by a referee, and can be used for determining a rink cover or the like. It is also possible to insert the image 38 into a screen of television broadcasting or the like.
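  • a sketch of how such image data could be rendered (matplotlib is an assumption; the patent does not name a plotting library), with a broken line for the trajectory and a solid line for the jump section:

```python
import matplotlib.pyplot as plt

def render_trajectory_image(trajectory, jump_section, rink_outline,
                            path="image_38.png"):
    """Render trajectory data (each argument an (N, 2) array of rink-plane
    coordinates) in the style of image 38: broken line for the trajectory,
    solid line for the specified jump section."""
    fig, ax = plt.subplots()
    ax.plot(rink_outline[:, 0], rink_outline[:, 1], color="lightgray")
    ax.plot(trajectory[:, 0], trajectory[:, 1], "--", label="trajectory")
    ax.plot(jump_section[:, 0], jump_section[:, 1], "-", linewidth=2,
            label="jump section")
    ax.set_aspect("equal")
    ax.legend()
    fig.savefig(path)   # image data for a referee display or broadcast insert
```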
  • the information processing device 10 can be realized by, for example, a computer 40 illustrated in FIG. 7 .
  • the computer 40 includes a central processing unit (CPU) 41 , a memory 42 as a temporary storage area, and a nonvolatile storage section 43 .
  • the computer 40 includes an input/output device 44 such as an input section and a display section, and a read/write (R/W) section 45 that controls reading and writing of data with respect to a storage medium 49 .
  • the computer 40 further includes a communication interface (I/F) 46 connected to a network such as the Internet.
  • the CPU 41 , the memory 42 , the storage section 43 , the input/output device 44 , the R/W section 45 , and the communication I/F 46 are connected to each other via a bus 47 .
  • the storage section 43 can be realized by a hard disk drive (HDD), a solid state drive (SSD), a flash memory, or the like.
  • the storage section 43 as a storage medium stores an information processing program 50 for causing the computer 40 to function as the information processing device 10 .
  • the information processing program 50 includes an acquisition process 52 , an estimation process 54 , a specification process 56 , and an output process 58 .
  • the CPU 41 reads the information processing program 50 from the storage section 43 , develops the program in the memory 42 , and sequentially executes the processes included in the information processing program 50 .
  • the CPU 41 operates as the acquisition section 12 illustrated in FIG. 2 by executing the acquisition process 52.
  • the CPU 41 operates as the estimation section 14 illustrated in FIG. 2 by executing the estimation process 54 .
  • the CPU 41 operates as the specifying section 16 illustrated in FIG. 2 by executing the specification process 56 .
  • the CPU 41 operates as the output section 18 illustrated in FIG. 2 by executing the output process 58 .
  • the computer 40 that has executed the information processing program 50 functions as the information processing device 10 .
  • the CPU 41 that executes the program is hardware.
  • Functions implemented by the information processing program 50 can also be implemented by, for example, a semiconductor integrated circuit, more specifically, an application specific integrated circuit (ASIC) or the like.
  • the information processing routine is an example of an information processing method of the disclosed technology.
  • in step S12, the acquisition section 12 acquires the sound signal and the video input to the information processing device 10.
  • the acquisition section 12 delivers the acquired sound signal to the estimation section 14 and delivers the acquired video to the specifying section 16 .
  • in step S14, the estimation section 14 estimates the time when the sound signal becomes equal to or less than the threshold value TH as the takeoff-from-ice time tA, and estimates the time when the sound signal that has become equal to or less than the threshold value TH again exceeds the threshold value TH as the landing-on-ice time tB.
  • the estimation section 14 delivers the estimated takeoff-from-ice time tA and landing-on-ice time tB of the jump to the specifying section 16 .
  • in step S16, the specifying section 16 acquires the trajectory of the location of the competitor 32 on the skating rink 30 by motion tracking from each frame of the video delivered from the acquisition section 12.
  • in step S18, the specifying section 16 synchronizes the time information of the sound signal with the time information of the video. Then, the specifying section 16 specifies the location of the competitor 32 acquired from each of the frames corresponding to the time information immediately before and immediately after the takeoff-from-ice time tA in the synchronized video. Then, the specifying section 16 specifies the location of the takeoff-from-ice point A corresponding to the takeoff-from-ice time tA between the specified locations.
  • the specifying section 16 specifies the location of the competitor 32 acquired from each of the frames corresponding to the time information immediately before and immediately after the landing-on-ice time tB. Then, the specifying section 16 specifies the location of the landing-on-ice point B corresponding to the landing-on-ice time tB between the specified locations. A section from the specified takeoff-from-ice point A to the landing-on-ice point B is specified as the jump section. The specifying section 16 delivers the trajectory of the location of the competitor 32 acquired by the motion tracking and the specified locations of the takeoff-from-ice point and the landing-on-ice point to the output section 18 .
  • in step S20, the output section 18 superimposes the image indicating the trajectory of the location of the competitor 32 on the image indicating the skating rink 30, and generates and outputs image data for displaying the specified jump section in a display mode different from that of the other sections of the trajectory. Then, the information processing routine ends.
  • the information processing device acquires the sound signal collected by the microphone provided in the venue including the skating rink, and the video obtained by identifiably photographing the location of the competitor on the skating rink. Then, the information processing device estimates the takeoff-from-ice time and the landing-on-ice time of the jump performed by the competitor according to the silencing and return of the ice sound based on the sound signal. The information processing device synchronizes the time information of the sound signal with the time information of the video, and specifies each of the location corresponding to the takeoff-from-ice time and the location corresponding to the landing-on-ice time in the trajectory of the location of the competitor on the skating rink acquired from the video by motion tracking. As a result, the takeoff-from-ice point and the landing-on-ice point of a jump in figure skating can be specified without attaching a sensor or the like to the competitor.
  • an information processing system 200 includes an information processing device 210 , a microphone 20 , a first camera 22 , and a second camera 24 .
  • the information processing system 200 performs information processing on a sound signal output from the microphone 20 and videos output from the first camera 22 and the second camera 24 , and specifies a jump section from a takeoff-from-ice point to a landing-on-ice point on a trajectory of a location of a competitor, similarly to the first embodiment.
  • the information processing system 200 calculates information on the jump, such as a rotation angle of the blade at the time of landing-on-ice, using the specified jump section.
  • the first camera 22 is similar to the camera 22 in the first embodiment, and is, for example, a photographing device for motion tracking that photographs a video capable of specifying a location of a competitor 32 on a skating rink 30 .
  • the second camera 24 is a photographing device that identifiably photographs three-dimensional positions of a predetermined part of a wearing object worn by the competitor 32 and the competitor 32 on the skating rink 30 .
  • a plurality of (for example, two) second cameras 24 are installed at locations where the three-dimensional location of the predetermined part can be measured by a stereo camera system. In FIG. 9 , only one second camera 24 is illustrated.
  • Each second camera 24 outputs a video photographed at a predetermined frame rate (for example, 30 fps, 60 fps, and the like).
  • the output video includes a plurality of frames, and time information is associated with each frame.
  • One ToF (Time-of-Flight) type camera may be used as the second camera 24 .
  • hereinafter, the video photographed by the first camera 22 is referred to as a "first video", and the video photographed by the second camera 24 is referred to as a "second video".
  • the information processing device 210 functionally includes an acquisition section 212, an estimation section 14, a specifying section 216, a calculation section 19, and an output section 218, as illustrated in FIG. 10.
  • the acquisition section 212 acquires the sound signal output from the microphone 20 , the first video output from each of the plurality of first cameras 22 , and the second video output from each of the plurality of second cameras 24 .
  • the acquisition section 212 delivers the acquired sound signal to the estimation section 14 , and delivers the acquired first video and second video to the specifying section 216 .
  • the specifying section 216 specifies a location of each of a takeoff-from-ice point A and a landing-on-ice point B based on a takeoff-from-ice time tA and a landing-on-ice time tB estimated by the estimation section 14 and the first video. As illustrated in FIG. 11 , the specifying section 216 specifies a straight line passing through the takeoff-from-ice point A and the landing-on-ice point B as a reference line (A-B) for determining the rotation of the jump.
  • the specifying section 216 delivers the trajectory of the location of the competitor 32 and the locations of the takeoff-from-ice point A and the landing-on-ice point B to the output section 218 , and delivers the locations of the takeoff-from-ice point A and the landing-on-ice point B and the information of the reference line (A-B) to the calculation section 19 .
  • the specifying section 216 synchronizes the time information of the sound signal with the time information of the second video, and specifies, as the jump section, the section from a frame corresponding to the takeoff-from-ice time tA to a frame corresponding to the landing-on-ice time tB in the second video delivered from the acquisition section 212.
  • the specifying section 216 specifies, as a start frame mS corresponding to the takeoff-from-ice time tA, a frame preceding the frame (hereinafter, referred to as a “takeoff-from-ice frame mA”) of the second video of the time information synchronized with the takeoff-from-ice time tA, by a predetermined number of frames.
  • the specifying section 216 specifies, as an end frame mE corresponding to the landing-on-ice time tB, a frame following the frame (hereinafter, referred to as a “landing-on-ice frame mB”) of the second video of the time information synchronized with the landing-on-ice time tB, by a predetermined number of frames.
  • the reason for including frames before the takeoff-from-ice frame mA and after the landing-on-ice frame mB is to choose the start frame mS and the end frame mE so that the section reliably contains everything from the takeoff-from-ice to the landing-on-ice.
  • the predetermined number can be, for example, 1.
  • the specifying section 216 may use a predetermined number based on the delay time when specifying the start frame mS, and may use 1 as the predetermined number when specifying the end frame mE.
  • the specifying section 216 extracts a section from the start frame mS to the end frame mE as a jump section from the second video delivered from the acquisition section 212 , and delivers the section to the calculation section 19 .
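  • frame selection can be sketched as below, assuming each frame of the second video carries monotonically increasing time information already synchronized with the sound signal; jump_section_frames and margin are illustrative names.

```python
import bisect

def jump_section_frames(frame_times, tA, tB, margin=1):
    """Start frame mS and end frame mE of the jump section in the second
    video: the frame synchronized with the takeoff-from-ice time tA minus
    `margin` frames, and the frame synchronized with the landing-on-ice
    time tB plus `margin` frames (margin=1 in the text above)."""
    mA = bisect.bisect_left(frame_times, tA)        # takeoff-from-ice frame
    mB = bisect.bisect_left(frame_times, tB)        # landing-on-ice frame
    mS = max(mA - margin, 0)                        # start frame
    mE = min(mB + margin, len(frame_times) - 1)     # end frame
    return mS, mE
```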
  • the calculation section 19 three-dimensionally analyzes each of the frames of the second video included in the jump section delivered from the specifying section 216, and calculates the three-dimensional location (x, y, z) of a predetermined part of the competitor 32 or of a wearing object worn by the competitor 32.
  • the predetermined part includes a distal end 34 and a terminal end 36 of a blade of a skate shoe worn by the competitor 32 .
  • the predetermined part may include each joint, head, and face parts such as eyes, nose, and mouth of the competitor 32 .
  • an existing method such as a recognition method using a shape of a predetermined part or a recognition method using a human skeleton model can be used.
  • in a case where the information processing system 200 includes three or more second cameras 24, the calculation section 19 may calculate the three-dimensional location of the predetermined part using two videos that capture the competitor 32 at an angle suitable for calculating the three-dimensional location, from among the videos photographed by the plurality of second cameras 24.
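  • assuming calibrated cameras with known 3x4 projection matrices, the stereo calculation of one blade point can be sketched with OpenCV's triangulatePoints (the calibration step itself is omitted; the function name is illustrative):

```python
import cv2
import numpy as np

def triangulate_blade_point(P1, P2, px_cam1, px_cam2):
    """Three-dimensional location (x, y, z) of one blade point (e.g. the
    distal end 34) from its pixel coordinates in two second-camera views;
    P1 and P2 are the cameras' 3x4 projection matrices."""
    a = np.asarray(px_cam1, dtype=np.float64).reshape(2, 1)
    b = np.asarray(px_cam2, dtype=np.float64).reshape(2, 1)
    xh = cv2.triangulatePoints(P1, P2, a, b)   # homogeneous coordinates (4, 1)
    return (xh[:3] / xh[3]).ravel()            # (x, y, z)
```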
  • the calculation section 19 calculates an absolute angle of the blade with a photographing direction of the second camera 24 as a reference using the locations of the distal end 34 and the terminal end 36 of the blade calculated from each of the frames included in the jump section. For example, the calculation section 19 can calculate an angle formed by the photographing direction of the second camera 24 or a line perpendicular to the photographing direction and a line connecting the distal end 34 and the terminal end 36 of the blade as the absolute angle of the blade.
  • the calculation section 19 may designate one of the plurality of second cameras 24 as a main camera, and calculate the absolute angle of the blade based on the photographing direction of the main second camera 24.
  • the calculation section 19 converts the absolute angle of the blade into an angle (hereinafter, referred to as a "rotation angle θ") with respect to the reference line for determining insufficient rotation of the jump. Specifically, the calculation section 19 calculates the rotation angle θ of the blade by subtracting, from the absolute angle of the blade, the angle difference between the line based on the photographing direction of the second camera 24 and the reference line (A-B) specified by the specifying section 216.
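  • a sketch of this conversion using the horizontal (x, y) components; note that the photographing direction used as the absolute reference cancels out when the reference-line angle is subtracted, so any fixed rink coordinate frame gives the same rotation angle θ.

```python
import numpy as np

def blade_rotation_angle(tip_xy, heel_xy, point_a, point_b):
    """Rotation angle theta of the blade measured from the reference line
    (A-B), using the horizontal (x, y) components of the distal end 34
    (tip) and terminal end 36 (heel) of the blade."""
    def angle(v):
        return np.degrees(np.arctan2(v[1], v[0]))
    blade = np.subtract(tip_xy, heel_xy)   # terminal end 36 -> distal end 34
    ref = np.subtract(point_b, point_a)    # reference line (A-B)
    return (angle(blade) - angle(ref)) % 360.0
```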
  • FIG. 14 illustrates the rotation angle θ calculated from each of the frames included in the jump section.
  • the calculation section 19 calculates a delay time Δt of the sound signal with respect to the second video at the time of the landing-on-ice.
  • the delay time Δt is the distance X [m] / 3,230 [m/s] (the sound velocity in ice), where the distance X is the distance between the location of the microphone 20 and the landing-on-ice point B.
  • the calculation section 19 calculates the rotation angle of the blade at the time of the landing-on-ice based on a rotation angle θ(mE) calculated from the end frame mE and a rotation angle θ(mE−1) calculated from a frame mE−1 preceding the end frame mE by one frame.
  • FIG. 15 is an enlarged view of the portion indicated by the broken-line frame in FIG. 14.
  • the corrected landing-on-ice time tB−Δt, obtained by applying the calculated delay time Δt to the landing-on-ice time tB estimated based on the sound signal, falls within the time of one frame from the frame mE−1 to the frame mE.
  • the delay time Δt is a minute time compared to the time for one frame.
  • the calculation section 19 assumes that the rotation speed during the jump is substantially constant, and linearly interpolates the rotation angle between the frame mE−1 and the frame mE using the rotation angle θ(mE−1) and the rotation angle θ(mE). Then, the calculation section 19 calculates the rotation angle corresponding to the corrected landing-on-ice time tB−Δt as a rotation angle θ(tB−Δt) at the time of the landing-on-ice.
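  • this interpolation reduces to the following sketch, where t_prev and t_end are the time information of the frames mE−1 and mE (argument names are illustrative):

```python
def rotation_angle_at_landing(theta_prev, theta_end, t_prev, t_end, tB, delta_t):
    """Linear interpolation of the rotation angle at the corrected
    landing-on-ice time tB - delta_t, which lies between frame mE-1
    (time t_prev, angle theta_prev) and frame mE (time t_end, angle
    theta_end); the rotation speed is assumed substantially constant."""
    a = ((tB - delta_t) - t_prev) / (t_end - t_prev)
    return theta_prev + a * (theta_end - theta_prev)
```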
  • the calculation section 19 can also calculate other information based on the three-dimensional location of the predetermined part corresponding to the jump section. For example, the calculation section 19 can calculate the location of the waist as the predetermined part, and calculate the difference between the minimum value and the maximum value of the waist location calculated from each frame included in the jump section as a jump height. The calculation section 19 can calculate the distance from the takeoff-from-ice point A to the landing-on-ice point B specified by the specifying section 216 as a jump distance. The calculation section 19 can calculate a rotation speed from the time from the takeoff-from-ice time tA to the landing-on-ice time tB and the change in the rotation angle in the jump section.
  • the calculation section 19 can calculate a crossing speed from a time from the start frame mS to the predetermined frame and a change amount of the location of the predetermined part during that time.
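  • illustrative formulas for these quantities are sketched below, assuming the rotation angles are cumulative (unwrapped over full revolutions) so that their difference captures multi-revolution jumps; the function and argument names are assumptions, not from the patent.

```python
import numpy as np

def jump_metrics(waist_z, point_a, point_b, tA, tB, theta_takeoff, theta_landing):
    """Jump height, jump distance, and rotation speed; the formulas are
    illustrative readings of the description above."""
    height = float(np.max(waist_z) - np.min(waist_z))       # waist z-range [m]
    distance = float(np.hypot(point_b[0] - point_a[0],
                              point_b[1] - point_a[1]))     # A to B [m]
    revolutions = (theta_landing - theta_takeoff) / 360.0   # cumulative angle
    rotation_speed = revolutions / (tB - tA)                # [rev/s]
    return height, distance, rotation_speed
```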
  • the calculation section 19 delivers the rotation angle θ(tB−Δt) at the time of the landing-on-ice and the other calculated information to the output section 218.
  • the output section 218 generates and outputs image data of an image 38 indicating the jump section on the trajectory of the location of the competitor 32 .
  • the output section 218 also outputs information such as the rotation angle θ(tB−Δt) at the time of the landing-on-ice delivered from the calculation section 19.
  • the rotation angle θ(tB−Δt) at the time of the landing-on-ice can be used for determination of insufficient rotation of the jump or the like.
  • the output information can also be used as stats to be displayed on a screen of television broadcasting or the like.
  • the information processing device 210 can be realized by, for example, a computer 40 illustrated in FIG. 7 .
  • a storage section 43 of the computer 40 stores an information processing program 250 for causing the computer 40 to function as the information processing device 210 .
  • the information processing program 250 includes an acquisition process 252 , an estimation process 54 , a specification process 256 , a calculation process 59 , and an output process 258 .
  • a CPU 41 reads the information processing program 250 from the storage section 43 , develops the program in a memory 42 , and sequentially executes the processes included in the information processing program 250 .
  • the CPU 41 operates as the acquisition section 212 illustrated in FIG. 10 by executing the acquisition process 252 .
  • the CPU 41 operates as the estimation section 14 illustrated in FIG. 10 by executing the estimation process 54 .
  • the CPU 41 operates as the specifying section 216 illustrated in FIG. 10 by executing the specification process 256 .
  • the CPU 41 operates as the calculation section 19 illustrated in FIG. 10 by executing the calculation process 59 .
  • the CPU 41 operates as the output section 218 illustrated in FIG. 10 by executing the output process 258 .
  • the computer 40 that has executed the information processing program 250 functions as the information processing device 210 .
  • the functions implemented by the information processing program 250 can also be implemented by, for example, a semiconductor integrated circuit, more specifically, an ASIC or the like.
  • the sound signal output from the microphone 20 , the first video photographed by each of the plurality of first cameras 22 , and the second video photographed by each of the plurality of second cameras 24 are input to the information processing device 210 .
  • the information processing device 210 executes the information processing routine illustrated in FIG. 16 .
  • the same step numbers are assigned to the same processes as those of the information processing routine (FIG. 8) in the first embodiment, and detailed description thereof will be omitted.
  • the information processing routine is an example of an information processing method of the disclosed technology.
  • in step S212, the acquisition section 212 acquires the sound signal, the first video, and the second video input to the information processing device 210.
  • the acquisition section 212 delivers the acquired sound signal to the estimation section 14 , and delivers the acquired first video and second video to the specifying section 216 .
  • in step S218, the specifying section 216 specifies the straight line passing through the takeoff-from-ice point A and the landing-on-ice point B as the reference line (A-B) for determining the rotation of the jump, using the locations of the takeoff-from-ice point A and the landing-on-ice point B specified in step S18.
  • the specifying section 216 specifies, as the start frame mS corresponding to the takeoff-from-ice time tA, the frame preceding, by a predetermined number of frames (for example, one frame), the takeoff-from-ice frame mA whose time information is synchronized with the takeoff-from-ice time tA in the second video.
  • the specifying section 216 specifies, as the end frame mE corresponding to the landing-on-ice time tB, the frame following, by a predetermined number of frames (for example, one frame), the landing-on-ice frame mB whose time information is synchronized with the landing-on-ice time tB.
  • the specifying section 216 extracts a section from the start frame mS to the end frame mE as a jump section from the second video delivered from the acquisition section 212 , and delivers the section to the calculation section 19 .
  • in step S222, the calculation section 19 three-dimensionally analyzes each of the frames included in the jump section delivered from the specifying section 216, and calculates the three-dimensional location (x, y, z) of the predetermined part including the distal end 34 and the terminal end 36 of the blade. Then, the calculation section 19 calculates, as the absolute angle of the blade, the angle formed by the line based on the photographing direction of the second camera 24 and the line connecting the distal end 34 and the terminal end 36 of the blade. The calculation section 19 calculates the rotation angle θ of the blade by subtracting, from the absolute angle of the blade, the angle difference between the reference line specified by the specifying section 216 in step S218 and the line based on the photographing direction of the second camera 24.
  • in step S226, the calculation section 19 calculates the other information, such as the jump distance and the rotation speed, based on the three-dimensional location of the predetermined part corresponding to the jump section, the locations of the takeoff-from-ice point A and the landing-on-ice point B specified by the specifying section 216, and the like.
  • in step S228, the output section 218 generates the image data of the image 38 indicating the jump section on the trajectory of the location of the competitor 32, and outputs the image data together with information such as the rotation angle θ(tB−Δt) at the time of the landing-on-ice calculated in step S226. Then, the information processing routine ends.
  • the information processing device estimates the takeoff-from-ice time and the landing-on-ice time from the sound signal, thereby accurately specifying the locations of the takeoff-from-ice point and the landing-on-ice point.
  • the information processing device specifies a reference line for determining the rotation of the jump using the specified takeoff-from-ice point and landing-on-ice point.
  • the present invention is not limited thereto.
  • the program according to the disclosed technology can also be provided in a form stored in a storage medium such as a CD-ROM, a DVD-ROM, or a USB memory.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Physical Education & Sports Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Security & Cryptography (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
  • Studio Devices (AREA)
  • Devices For Executing Special Programs (AREA)
US17/890,024 2020-03-18 2022-08-17 Information processing program, device, and method Abandoned US20220394322A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2020/012122 WO2021186645A1 (ja) 2020-03-18 2020-03-18 Information processing program, device, and method

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2020/012122 Continuation WO2021186645A1 (ja) 2020-03-18 2020-03-18 Information processing program, device, and method

Publications (1)

Publication Number Publication Date
US20220394322A1 true US20220394322A1 (en) 2022-12-08

Family

ID=77771747

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/890,024 Abandoned US20220394322A1 (en) 2020-03-18 2022-08-17 Information processing program, device, and method

Country Status (6)

Country Link
US (1) US20220394322A1 (ja)
EP (1) EP4093024A4 (ja)
JP (1) JP7444238B2 (ja)
KR (1) KR20220128433A (ja)
CN (1) CN115136591A (ja)
WO (1) WO2021186645A1 (ja)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6539336B1 (en) * 1996-12-12 2003-03-25 Phatrat Technologies, Inc. Sport monitoring system for determining airtime, speed, power absorbed and other factors such as drop distance
DE19614253A1 (de) * 1996-03-05 1997-09-11 Karl Leonhardtsberger Elektronisches Aufzeichnungs- und Wiedergabeverfahren für Bewegungsabläufe auf Sportplätzen und/oder in Sport- und Eislaufhallen
JP4541744B2 (ja) * 2004-03-31 2010-09-08 ヤマハ株式会社 音像移動処理装置およびプログラム
RU2399402C2 (ru) * 2008-03-04 2010-09-20 Алексей Николаевич Мишин Система измерения параметров вращательного движения фигуриста
JP5515671B2 (ja) * 2009-11-20 2014-06-11 ソニー株式会社 画像処理装置、その制御方法およびプログラム
JP5924109B2 (ja) * 2012-05-11 2016-05-25 セイコーエプソン株式会社 センサーユニット、運動解析装置
JP6213146B2 (ja) 2013-10-24 2017-10-18 ソニー株式会社 情報処理装置、記録媒体、および情報処理方法
WO2016092933A1 (ja) * 2014-12-08 2016-06-16 ソニー株式会社 情報処理装置、情報処理方法およびプログラム
JP6673221B2 (ja) * 2014-12-18 2020-03-25 ソニー株式会社 情報処理装置、情報処理方法、およびプログラム
CN109074629A (zh) * 2015-10-29 2018-12-21 Oy沃肯视觉有限公司 使用联网照相机对关注的区域进行视频摄像
US10083537B1 (en) * 2016-02-04 2018-09-25 Gopro, Inc. Systems and methods for adding a moving visual element to a video
JP6882057B2 (ja) * 2017-05-11 2021-06-02 キヤノン株式会社 信号処理装置、信号処理方法、およびプログラム
JP2019033869A (ja) * 2017-08-14 2019-03-07 ソニー株式会社 情報処理装置、情報処理方法、及び、プログラム
KR20220128404A (ko) * 2020-02-27 2022-09-20 후지쯔 가부시끼가이샤 정보 처리 프로그램, 장치 및 방법

Also Published As

Publication number Publication date
WO2021186645A1 (ja) 2021-09-23
EP4093024A1 (en) 2022-11-23
CN115136591A (zh) 2022-09-30
JP7444238B2 (ja) 2024-03-06
EP4093024A4 (en) 2023-03-01
KR20220128433A (ko) 2022-09-20
JPWO2021186645A1 (ja) 2021-09-23

Similar Documents

Publication Publication Date Title
US10115020B2 (en) Image processing method, non-transitory computer-readable recording medium, and image processing device
WO2017038541A1 (ja) Video processing device, video processing method, and program
JP6641163B2 (ja) Object tracking device and program therefor
US11227388B2 (en) Control method and device for mobile platform, and computer readable storage medium
WO2013119352A1 (en) Head pose tracking using a depth camera
JP6834590B2 (ja) Three-dimensional data acquisition device and method
US20160065984A1 (en) Systems and methods for providing digital video with data identifying motion
US20220392222A1 (en) Information processing program, device, and method
KR20220066759A (ko) Player tracking method, player tracking device, and player tracking system
CN110490131B (zh) 一种拍摄设备的定位方法、装置、电子设备及存储介质
JP5253227B2 (ja) Image input device, subject detection method, and program
US20220394322A1 (en) Information processing program, device, and method
US20220084244A1 (en) Information processing apparatus, information processing method, and program
US20240020976A1 (en) Information processing program, device, and method
US20230018179A1 (en) Image processing apparatus, image processing method and storage medium
KR20210146265A (ko) Method, device, and non-transitory computer-readable recording medium for estimating information on a golf swing
JP7240258B2 (ja) Image processing device, control method of image processing device, and program
US10853952B2 (en) Image processing apparatus, image processing method, and non-transitory computer-readable storage medium
WO2024004190A1 (ja) Three-dimensional position calculation method, device, and program
KR102544972B1 (ko) Player tracking method, player tracking device, and player tracking system
JP2022160233A (ja) Information processing device, information processing method, and program
KR20220028740A (ko) Method, device, and non-transitory computer-readable recording medium for estimating information on a golf swing
JPWO2023062757A5 (ja)
JP2009239721A (ja) Television receiving device and television receiving method

Legal Events

Date Code Title Description
AS Assignment

Owner name: FUJITSU LIMITED, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:NAKAMURA, KOUJI;REEL/FRAME:060840/0022

Effective date: 20220725

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION