CN113362366B - Sphere rotation speed determining method and device, terminal and storage medium
- Publication number: CN113362366B (application number CN202110556785.0A)
- Authority
- CN
- China
- Prior art keywords
- sphere
- target track
- image
- frame
- track point
- Prior art date
- Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
All codes fall under G—Physics; G06—Computing, calculating or counting; G06T—Image data processing or generation, in general:
- G06T7/207 — Analysis of motion for motion estimation over a hierarchy of resolutions
- G06T7/13 — Edge detection
- G06T7/194 — Segmentation involving foreground-background segmentation
- G06T7/66 — Analysis of geometric attributes of image moments or centre of gravity
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G06T7/90 — Determination of colour characteristics
- G06T2207/10016 — Video; Image sequence (indexing scheme for image analysis; image acquisition modality)
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Geometry (AREA)
- Image Analysis (AREA)
Abstract
The application is applicable to the fields of image processing and data processing, and provides a method, a device, a terminal and a storage medium for determining the rotation speed of a sphere. The method includes: acquiring multiple frames of images captured continuously during the motion of a ball, together with the timestamp corresponding to each frame; determining the spatial coordinates of a target track point of the ball in each frame, where a target track point is a point on the motion trajectory of the ball; determining the movement speed of the ball at each target track point from those spatial coordinates and the timestamps; and calculating the rotational speed of the ball at each target track point based on its movement speed at each target track point. The embodiments of the application thus realize calculation of the rotational speed of a ball.
Description
Technical Field
The application belongs to the field of image processing and data processing, and particularly relates to a method and device for determining a sphere rotation speed, a terminal and a storage medium.
Background
As people attach greater importance to health, ball games have been popularized and promoted. Table tennis, the national ball game, has the advantages of requiring little space, involving little physical confrontation, and suiting both the old and the young, and is popular with the public. Calculating the rotation speed of the ball makes it possible to provide players with personalized training plans; however, existing ball-speed measurement techniques are inconvenient to use and costly, and at present there is no low-cost technical solution for measuring ball rotation speeds and flight trajectories in sports training and competition.
Disclosure of Invention
The embodiment of the application provides a method, a device, a terminal and a storage medium for determining the rotation speed of a sphere, which can realize the calculation of the rotation speed of the sphere.
An embodiment of the present application provides a method for determining a rotational speed of a sphere, including:
acquiring a multi-frame image obtained by continuously acquiring a ball movement process and a time stamp corresponding to each frame of image in the multi-frame image;
determining the space coordinates of target track points of the spheres in each frame of image, wherein the target track points are track points in the movement track of the spheres;
determining the movement speed of the sphere on each target track point according to the space coordinates of the target track points of the sphere in each frame image and the corresponding time stamp of each frame image;
and calculating the rotating speed of the sphere at each target track point based on the moving speed of the sphere at each target track point.
A second aspect of the embodiments of the present application provides a device for determining the rotational speed of a sphere, including:
the image acquisition unit is used for acquiring multi-frame images obtained by continuously acquiring the ball movement process and a time stamp corresponding to each frame of image in the multi-frame images;
a space coordinate determining unit, configured to determine a space coordinate of a target track point of the sphere in each frame of image, where the target track point is a track point in the sphere motion track;
a motion speed determining unit, configured to determine a motion speed of the sphere at each target track point according to a spatial coordinate of the target track point of the sphere in each frame image and a timestamp corresponding to each frame image;
and the rotating speed determining unit is used for calculating the rotating speed of the sphere at each target track point based on the moving speed of the sphere at each target track point.
A third aspect of the embodiments of the present application provides a terminal, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor implements the steps of the above method when executing the computer program.
A fourth aspect of the present embodiments provides a computer readable storage medium storing a computer program which, when executed by a processor, implements the steps of the above method.
A fifth aspect of the embodiments provides a computer program product which, when run on a terminal, causes the terminal to perform the steps of the method.
In the embodiments of the present application, the spatial coordinates of the target track points of the ball are determined in each frame of the multiple frames of images captured continuously during the motion of the ball, and the movement speed of the ball at each target track point is determined from those spatial coordinates and the timestamp corresponding to each frame. The rotational speed of the ball at each target track point can then be calculated from its movement speed at each target track point, realizing the calculation of the rotational speed of the ball.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required for the embodiments or the description of the prior art will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic implementation flow chart of a method for determining a rotational speed of a sphere according to an embodiment of the present application;
fig. 2 is a schematic flowchart of a specific implementation of step S102 provided in the embodiment of the present application;
fig. 3 is a schematic flowchart of a specific implementation of step S104 provided in the embodiment of the present application;
fig. 4 is a schematic structural diagram of a device for determining rotational speed of a sphere according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a terminal provided in an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present application without making any inventive effort, are intended to be protected herein.
As people attach greater importance to health, ball games have been popularized and promoted; among them, table tennis, with its small space requirements, low physical confrontation, and suitability for young and old alike, is popular with the public.
Calculating the rotation speed of the ball makes it possible to provide players with personalized training plans, so a method for determining the rotation speed of a ball is needed to realize calculation of the rotation speed of the ball during its motion.
In order to illustrate the technical solution of the present application, the following description is made by specific examples.
Fig. 1 shows a schematic implementation flow chart of a method for determining a rotational speed of a sphere according to an embodiment of the present application, where the method may be applied to a terminal, and may be applicable to a situation where the rotational speed of the sphere needs to be calculated. The terminal can be a smart phone, a computer and other devices.
Specifically, the method for determining the rotational speed of the ball may include the following steps S101 to S104.
Step S101, acquiring a plurality of frames of images obtained by continuously acquiring a ball movement process and a time stamp corresponding to each frame of image in the plurality of frames of images.
The ball may be, for example, a table tennis ball, a tennis ball, or another spherical ball.
In the embodiment of the application, the terminal can continuously acquire the ball movement process through the camera to obtain multi-frame images, and perform image processing based on the multi-frame images to calculate the actual rotation speed of the ball.
The camera may or may not be a camera carried by the terminal, and if the camera is not a camera carried by the terminal, the terminal may acquire a sphere movement process shot by the camera through bluetooth, a wireless network, or the like.
In order to improve the accuracy and success rate of table tennis ball identification, in some embodiments of the present application the camera may be a binocular camera, which continuously captures the motion of the ball to obtain multiple left images and multiple right images, with the image frame rate maintained above 200 fps. It should be noted that other cameras may also be used to capture images, which is not limited herein.
Because the table tennis ball flies at high speed, "smear" easily appears in the image. "Smear", also known as motion blur, is caused by overly fast relative motion between the target object and the camera, or by a long camera exposure time; it degrades image quality and affects, or even severely interferes with, the viewing and recognition of the image. Therefore, to improve the accuracy of the rotation-speed calculation, in some embodiments of the present application the terminal may remove the smear in each frame of image, obtain each de-smeared frame, and perform the rotation-speed calculation based on the de-smeared frames.
Specifically, the method for eliminating the smear in each frame of image may include any one of color segmentation, background subtraction, or a combination of the above methods, which is not limited herein.
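The background-subtraction option mentioned above can be sketched as follows. This is a minimal illustrative NumPy routine, not the patent's implementation; the function name and the threshold value are our own assumptions. It keeps only pixels that differ noticeably from a static background frame, suppressing blurred background content around the moving ball:

```python
import numpy as np

def suppress_smear(frame, background, thresh=30):
    """Crude background subtraction on a grayscale frame: zero out pixels
    that differ from the static background by at most `thresh`, keeping
    only the moving (foreground) ball region."""
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    mask = (diff > thresh).astype(frame.dtype)
    return frame * mask
```

In practice color segmentation (e.g. thresholding the ball's hue) could be combined with this mask, as the source suggests.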
Step S102, determining the space coordinates of the target track point of the sphere in each frame of image.
Wherein the target track point is a track point in a sphere motion track. The above-mentioned spatial coordinates refer to coordinates of the target track point in the world coordinate system.
Specifically, as shown in fig. 2, the above determination of the spatial coordinates of the target trajectory point of the sphere in each frame image may include the following steps S201 to S202.
Step S201, image recognition is carried out on multiple frames of images, and pixel coordinates of the sphere center of the sphere in each frame of image are obtained.
In some embodiments of the present application, image recognition may be performed on each of the multiple frame images, the edge contour points of the sphere may be recognized, and the pixel coordinates of the sphere center of the sphere in each frame image may be calculated according to the sphere recognition result of each frame image.
Because the sphere flies quickly and appears as an ellipse in the image, in some embodiments of the present application, the coordinates of the center of the ellipse in the image may be obtained based on the coordinate information of the edge contour point of the sphere and the corresponding gray value, and the coordinates of the center of the ellipse may be used as the coordinates of the pixel of the center of the sphere.
Specifically, assume the center coordinates of the ellipse are (x0, y0) and the image patch containing the ellipse is m × n pixels. The center coordinates are then given by the gray-value-weighted centroid:

$$ x_0 = \frac{\sum_{x=1}^{m}\sum_{y=1}^{n} x\, I(x,y)}{\sum_{x=1}^{m}\sum_{y=1}^{n} I(x,y)}, \qquad y_0 = \frac{\sum_{x=1}^{m}\sum_{y=1}^{n} y\, I(x,y)}{\sum_{x=1}^{m}\sum_{y=1}^{n} I(x,y)} $$

where I(x, y) denotes the gray value of the pixel with pixel coordinates (x, y).
It should be noted that, the embodiments of the present application do not limit a specific manner of image recognition, for example, a feature detection algorithm such as hough transform may be used, or image recognition may be performed on a multi-frame image by using a pre-established neural network.
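The gray-weighted centroid computation for the ellipse center can be sketched in NumPy as follows (an illustrative helper, assuming a cropped grayscale patch that contains the ball; the function name is ours):

```python
import numpy as np

def ellipse_center(patch):
    """Gray-value-weighted centroid of an m x n patch containing the
    (elliptical) ball image: x0 = sum(x*I)/sum(I), y0 = sum(y*I)/sum(I),
    with pixel coordinates counted from the patch origin."""
    patch = patch.astype(np.float64)
    total = patch.sum()
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    x0 = (xs * patch).sum() / total
    y0 = (ys * patch).sum() / total
    return x0, y0
```

For a uniformly bright patch this reduces to the geometric center of the patch, as expected of a centroid.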
Step S202, converting pixel coordinates to obtain space coordinates of a target track point of the sphere in each frame of image.
The currently adopted rotational-speed calculation method identifies marks on the sphere to calculate its rotation speed. Taking table tennis as an example, the terminal can identify a mark or logo preset on the ball and calculate the rotation speed of the ball from it. However, this method places high demands on equipment and computation; for a fast-moving ball, the acquisition frequency must be high enough to guarantee the accuracy of the rotation speed, and because of the exposure time the mark or logo in the image becomes blurred and difficult to identify, resulting in low accuracy.
In order to ensure the precision of the calculated rotation speed of the sphere, in some embodiments of the present application, the pixel coordinates of the sphere center of the sphere in each frame of image may be converted into world coordinates under a world coordinate system, and the motion trail of the sphere may be fitted according to the world coordinates of the sphere center of the sphere in each frame of image and the corresponding timestamp of each frame of image; then, according to the world coordinates of the sphere centers of the spheres in each frame of image, the space coordinates of the target track points corresponding to the sphere centers on the motion track can be obtained.
Specifically, the camera may be a binocular camera. In some embodiments of the present application, epipolar correction may be performed on the binocular camera in advance, internal and external parameters of the binocular camera are obtained, and pixel coordinates of a spherical center of a sphere in each frame of image acquired by the binocular camera are converted into world coordinates in a world coordinate system by using the internal and external parameters of the binocular camera.
Assuming that the coordinate system corresponding to the left-eye camera of the binocular camera is taken as the world coordinate system, the intrinsic and extrinsic parameters of the left-eye camera relative to the world coordinate system are K_1 and [R_1, T_1], and the intrinsic and extrinsic parameters of the right-eye camera relative to the world coordinate system are K_2 and [R_2, T_2], the world coordinates of the sphere center can be obtained by conversion according to formula (1) and formula (2):

$$ z_1 \begin{bmatrix} u_1 \\ v_1 \\ 1 \end{bmatrix} = K_1 \left[ R_1, T_1 \right] \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{1} $$

$$ z_2 \begin{bmatrix} u_2 \\ v_2 \\ 1 \end{bmatrix} = K_2 \left[ R_2, T_2 \right] \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{2} $$

wherein (u_1, v_1) are the pixel coordinates in the left image, (u_2, v_2) are the pixel coordinates in the right image, (X_w, Y_w, Z_w) are the world coordinates of the sphere center, and z_1 and z_2 are the projective depth scale factors.

Specifically, z_1 and z_2 can be eliminated from formula (1) and formula (2) by cross-multiplication, which yields formula (3), a system of linear equations in (X_w, Y_w, Z_w).
further, coordinates of at least three pairs of mutually matched pixel points in the left image and the right image are obtained, and world coordinates of the sphere center of the sphere can be solved.
It should be noted that, because formula (3) may not be exactly satisfied due to factors such as image noise or calculation error, in some embodiments of the present application the world coordinates of the sphere center may be obtained by fitting with other methods, such as the least-squares method.
In addition, in embodiments, the coordinate system corresponding to the right-eye camera may equally be used as the world coordinate system for the calculation, which is not limited herein.
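The elimination and least-squares solution described above correspond to standard linear (DLT) triangulation. A sketch follows, assuming each camera is described by its 3×4 projection matrix P = K[R, T] (the scale factors of formulas (1) and (2) are eliminated by cross-multiplication, giving a homogeneous system solved via SVD); names and the single-pair formulation are illustrative:

```python
import numpy as np

def triangulate(uv_left, uv_right, P_left, P_right):
    """DLT triangulation of one matched pixel pair (u1,v1)/(u2,v2).
    P_left, P_right: 3x4 projection matrices K[R|T] of the two cameras.
    Returns the world coordinates (Xw, Yw, Zw) of the sphere center."""
    u1, v1 = uv_left
    u2, v2 = uv_right
    # Each view contributes two homogeneous equations after eliminating
    # the depth scale factor by cross-multiplication.
    A = np.vstack([
        u1 * P_left[2] - P_left[0],
        v1 * P_left[2] - P_left[1],
        u2 * P_right[2] - P_right[0],
        v2 * P_right[2] - P_right[1],
    ])
    # Least-squares solution: right singular vector of the smallest
    # singular value, then de-homogenize.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]
```

With more matched pairs (as the source suggests), additional row pairs are simply stacked into A before the SVD.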
In some embodiments of the present application, the world coordinate of the sphere center of the sphere in each frame of image may be used as the spatial coordinate of the corresponding target track point, that is, the spatial position of the sphere center of the sphere in each frame of image is the spatial position of the target track point on the motion track when the sphere is in the timestamp corresponding to each frame of image.
In the embodiment of the present application, image recognition is performed on the multiple frames of images to obtain the pixel coordinates of the sphere center in each frame, and these pixel coordinates are converted into the spatial coordinates of the target track point of the sphere in each frame. The rotation speed therefore does not need to be determined by recognizing marks on the sphere, which avoids errors introduced by the marks and effectively improves the accuracy of the calculated rotation speed. For a rapidly moving sphere, high-accuracy rotation-speed calculation can be achieved even at a low image-acquisition frequency, without expensive equipment, which reduces cost.
Step S103, determining the movement speed of the sphere on each target track point according to the space coordinates of the target track point of the sphere in each frame image and the corresponding time stamp of each frame image.
Specifically, assume the world coordinates of the track points are {p(0), p(1), …, p(n)}. The velocity of the sphere equals the first derivative of displacement with respect to time, i.e. v = dp/dt, which discretizes to

$$ v(l) = \frac{p(l+1) - p(l)}{T} $$

so the movement velocity at the l-th target track point is v(l) = (p(l+1) − p(l))/T. In the same way the movement velocities {v(0), v(1), …, v(n−1)} at the target track points are obtained, where T is the sampling step, which can be calculated as the difference between the timestamps of adjacent target track points.
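The forward-difference velocity computation above can be sketched as (a straightforward NumPy helper; the fixed-step assumption is a simplification, since the source also allows per-point timestamp differences):

```python
import numpy as np

def track_velocities(points, T):
    """Forward-difference velocities v(l) = (p(l+1) - p(l)) / T for a
    track {p(0) ... p(n)} sampled at a fixed time step T (seconds).
    Returns n velocity vectors for n+1 track points."""
    p = np.asarray(points, dtype=np.float64)
    return (p[1:] - p[:-1]) / T
```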
In practical applications, when the ball's movement is a flight movement, the ball may collide with the surface when it lands, and the collision can change the rotation state of the table tennis ball in flight, and hence its rotation speed. Therefore, in some embodiments of the present application, landing-point determination may be performed on the motion trajectory of the sphere.
Specifically, in some embodiments of the present application, the drop point determination may be performed according to the motion trajectory obtained by fitting.
If the movement track of the sphere comprises a landing point, the movement speed of the sphere on each target track point in the movement track before the sphere moves to the landing point can be determined according to the space coordinates of the target track point of the sphere in each frame of image and the time stamp corresponding to each frame of image, namely, the movement speed of each corresponding target track point before the sphere lands is calculated.
If the movement track of the sphere does not contain the falling point, the movement speed of the sphere on each target track point on the movement track of the sphere can be determined according to the space coordinates of the target track points of the sphere in each frame of image and the time stamp corresponding to each frame of image, namely, the movement speeds of all the target track points are calculated.
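The split between "before the landing point" and "no landing point" above can be sketched with a simple heuristic: take the bounce as the first local minimum of the height coordinate along the fitted trajectory. This heuristic and the function name are ours, not the patent's:

```python
import numpy as np

def pre_bounce_indices(z_coords):
    """Indices of target track points before the first landing, taken as
    the first local minimum of the height (z) coordinate. If no landing
    point is found, all track points are returned."""
    z = np.asarray(z_coords, dtype=float)
    for i in range(1, len(z) - 1):
        if z[i] < z[i - 1] and z[i] < z[i + 1]:
            return list(range(i + 1))
    return list(range(len(z)))
```

Velocities (and later the rotational speed) would then be computed only over the returned indices when a landing point exists.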
Step S104, calculating the rotating speed of the sphere at each target track point based on the moving speed of the sphere at each target track point.
In one embodiment, as shown in fig. 3, when the movement of the ball is a projectile movement, the step S104 may include the following steps S301 to S302.
Step S301, calculating acceleration of the sphere at each target track point based on the movement speed of the sphere at each target track point.
In one embodiment, a three-dimensional coordinate system may be pre-established and rotational speeds of the sphere at each target trajectory point may be calculated based on the three-dimensional coordinate system. It should be noted that, the process of establishing the three-dimensional coordinate system may be adjusted according to the actual situation, and for convenience of subsequent calculation, the gravity direction may be taken as the z axis in the three-dimensional coordinate system, and two lines perpendicular to each other in a plane perpendicular to the z axis may be taken as the x axis and the y axis, which is not limited herein.
Preferably, taking the ball to be a table tennis ball, in practical applications the short side of the table tennis table can be taken as the x axis, the long side as the y axis, and the upward direction perpendicular to the table top as the z axis, establishing a right-handed coordinate system whose origin can be any corner of the table.
Further, a force analysis is performed on the ball: during flight it is subject to gravity G, air resistance F_d, and the Magnus force F_M. Gravity G points vertically downward, and the air resistance F_d is opposite to the direction of motion of the sphere. The Magnus force F_M arises from the viscous force on the ball as it rotates in flight: the air near the upper and lower edges of the ball moves opposite to the direction of the ball's motion, and when the ball carries backspin or topspin, the surface at one edge moves with the air flow while the other moves against it, so the air speeds near the two edges differ, the pressures on the upper and lower edges differ, and a Magnus force is produced. Its direction is determined by the right-hand rule: open the right hand with the thumb extended, let the direction of ball motion pass through the palm and the thumb point in the direction of the angular velocity; the other four fingers then point in the direction of the Magnus force.
Specifically, gravity G, air resistance F_d, and Magnus force F_M are calculated as:

$$ G = [0, 0, -mg]^T $$

$$ F_d = -\frac{1}{2} C_D \rho A \lVert v \rVert v $$

$$ F_M = \frac{1}{2} C_M \rho A r \, (\omega \times v) $$

wherein m is the mass of the sphere, g is the gravitational acceleration, C_D is the drag (viscosity) coefficient, ρ is the air density, A is the cross-sectional area of the sphere, C_M is the Magnus force coefficient, ω is the rotational speed of the sphere, r is the radius of the sphere, and v is the velocity of the sphere.
Assuming that the rotation speed of the table tennis ball is constant before the landing point, the acceleration imparted by the Magnus force at each track point during flight involves the cross product

$$ \omega \times v = \begin{bmatrix} \omega_y v_z - \omega_z v_y \\ \omega_z v_x - \omega_x v_z \\ \omega_x v_y - \omega_y v_x \end{bmatrix} \tag{4} $$

wherein v_x, v_y, and v_z denote the movement speed of the sphere along the x, y, and z axes at the target track point; ω denotes the rotational speed of the sphere, with ω_x, ω_y, and ω_z the rotational speeds about the x, y, and z axes, i.e. ω = [ω_x, ω_y, ω_z]^T.
Based on the forces on the sphere and formula (4), the equation of motion of the sphere can be obtained:

$$ \frac{dv}{dt} = \begin{bmatrix} 0 \\ 0 \\ -g \end{bmatrix} - \frac{C_D \rho A}{2m} \lVert v \rVert v + \frac{C_M \rho A r}{2m} \left( \omega \times v \right) \tag{5} $$

wherein dv/dt is the derivative of the flight velocity v, i.e. the acceleration of the sphere, and the remaining symbols are as defined above.
Thus, in the embodiment of the present application, based on formula (5), the acceleration dv/dt of the sphere at each target track point can be calculated from the movement velocity of the sphere at each target track point and related to the rotational speed ω.
Step S302, calculating the rotating speed of the sphere at each target track point according to the acceleration of the sphere at each target track point and the moving speed of the sphere at each target track point.
Specifically, the acceleration dv/dt can be calculated by finite differences according to formula (6):

$$ \frac{dv}{dt}(k) = \frac{v(k+1) - v(k)}{T} \tag{6} $$

wherein T is the sampling step and k is the index of the target track point.

Substituting the velocities and the accelerations from formula (6) into formula (5) yields an overdetermined linear system Ax = b in the unknown x = ω. Multiplying both sides of Ax = b on the left by the least-squares inverse (pseudoinverse) A⁺, it can be derived that x = A⁺b, i.e. the specific value of ω is obtained according to x = A⁺b.
In some embodiments of the present application, assuming there are L target track points of the sphere, the matrix A has dimensions 3L × 3, i.e. it is overdetermined, and therefore its least-squares inverse A⁺ can be found by the singular value decomposition (SVD) algorithm.
Furthermore, according to the calculation result x = [ω_x, ω_y, ω_z]^T and the formula

$$ \omega = \sqrt{\omega_x^2 + \omega_y^2 + \omega_z^2} $$

the rotational speed ω of the sphere at each target track point is calculated.
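The overdetermined least-squares solve for ω can be sketched as follows. This is a simplification of formula (5): drag is assumed negligible or already subtracted from the measured accelerations, and k_m stands for C_M ρ A r/(2m); the model a(k) = [0, 0, −g] + k_m (ω × v(k)) is linear in ω because ω × v = −[v]_× ω. All names are illustrative:

```python
import numpy as np

def estimate_spin(velocities, accelerations, k_m, g=9.81):
    """Stack one 3x3 block per track point into a 3L x 3 system
    A omega = b and solve it in the least-squares (SVD-based) sense.
    velocities, accelerations: sequences of 3-vectors; k_m: Magnus
    coefficient C_M*rho*A*r/(2m). Returns (omega_vector, |omega|)."""
    rows, rhs = [], []
    for v, a in zip(velocities, accelerations):
        vx, vy, vz = v
        cross = np.array([[0.0, -vz, vy],    # [v]_x, the cross-product
                          [vz, 0.0, -vx],    # matrix of v
                          [-vy, vx, 0.0]])
        rows.append(-k_m * cross)            # k_m*(omega x v) = -k_m*[v]_x omega
        rhs.append(np.asarray(a) - np.array([0.0, 0.0, -g]))
    A = np.vstack(rows)
    b = np.concatenate(rhs)
    omega, *_ = np.linalg.lstsq(A, b, rcond=None)  # SVD-based pseudoinverse
    return omega, float(np.linalg.norm(omega))
```

`np.linalg.lstsq` performs the SVD-based pseudoinverse solve that the source describes for the 3L × 3 overdetermined matrix.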
In the embodiment of the present application, the spatial coordinates of the target track points of the ball are determined in each frame of the multiple frames of images captured continuously during the motion of the ball, and the movement speed of the ball at each target track point is determined from those spatial coordinates and the timestamp corresponding to each frame; the rotational speed of the ball at each target track point can then be calculated from its movement speed at each target track point, realizing the calculation of the rotational speed of the ball.
In some embodiments of the present application, after the rotational speed of the ball at each target track point is calculated, a suitable stroke may be determined according to the rotational speed and suggested to the player; alternatively, based on the stroke used, the fitted motion trajectory of the ball, and the calculated rotational speed at each target track point, stroke data analysis may be performed to evaluate the user's hitting ability.
In the embodiment of the application, the flight track of the ball can be provided for a user, and the batting data analysis is provided by combining the flight track and the rotating speed of the ball so as to meet the analysis and evaluation requirements on the training effect in the complex multi-table tennis training process.
It should be noted that, for the sake of simplicity of description, the foregoing method embodiments are all expressed as a series of combinations of actions, but it should be understood by those skilled in the art that the present application is not limited by the order of actions described, as some steps may be performed in other order according to the present application.
Fig. 4 is a schematic structural diagram of a device 400 for determining a rotational speed of a sphere according to an embodiment of the present application, where the device 400 for determining a rotational speed of a sphere is configured on a terminal.
In one embodiment, the determining device 400 for determining the rotational speed of the sphere may include: an image acquisition unit 401, a spatial coordinate determination unit 402, a movement speed determination unit 403, and a rotation speed determination unit 404.
An image obtaining unit 401, configured to obtain a plurality of frame images obtained by continuously collecting a sphere motion process, and a timestamp corresponding to each frame image in the plurality of frame images;
a spatial coordinate determining unit 402, configured to determine spatial coordinates of a target track point of the sphere in each frame of image, where the target track point is a track point in the sphere motion track;
a motion speed determining unit 403, configured to determine a motion speed of the sphere at each target track point according to the spatial coordinates of the target track point of the sphere in each frame image and the time stamp corresponding to each frame image;
a rotation speed determining unit 404, configured to calculate a rotation speed of the sphere at each target track point based on a movement speed of the sphere at each target track point.
In some embodiments of the present application, the spatial coordinate determining unit 402 may be specifically configured to: performing image recognition on the multi-frame images to obtain pixel coordinates of the sphere center of the sphere in each frame of image; and converting the pixel coordinates to obtain the space coordinates of the target track points of the spheres in each frame of image.
In some embodiments of the present application, the movement of the ball is a projectile movement, and the movement speed determining unit 403 may be further specifically configured to: if the movement track of the sphere comprises a drop point, determining the movement speed of the sphere on each target track point in the movement track before the sphere moves to the drop point according to the space coordinates of the target track point of the sphere in each frame of image and the time stamp corresponding to each frame of image; if the movement track of the sphere does not contain a drop point, determining the movement speed of the sphere on each target track point on the movement track of the sphere according to the space coordinates of the target track points of the sphere in each frame of image and the time stamp corresponding to each frame of image.
In some embodiments of the present application, the rotation speed determining unit 404 may be specifically configured to: calculating the acceleration of the sphere at each target track point based on the movement speed of the sphere at each target track point; and calculating the rotating speed of the sphere at each target track point according to the acceleration of the sphere at each target track point and the moving speed of the sphere at each target track point.
In some embodiments of the present application, the movement of the sphere is a projectile movement, and the rotation speed determining unit 404 may be specifically configured to: according to the calculation result of the formula

$\bar{a}_k = \frac{v_{k+1} - v_k}{T}, \quad \bar{v}_k = \frac{v_k + v_{k+1}}{2}$

and the formula

$\begin{cases} a_x = -\frac{C_D \rho A}{2m}\lVert v\rVert v_x - \frac{C_M \rho A r}{2m}\,\omega_z v_y \\ a_y = -g - \frac{C_D \rho A}{2m}\lVert v\rVert v_y + \frac{C_M \rho A r}{2m}(\omega_z v_x - \omega_x v_z) \\ a_z = -\frac{C_D \rho A}{2m}\lVert v\rVert v_z + \frac{C_M \rho A r}{2m}\,\omega_x v_y \end{cases}$

calculating to obtain the rotating speed $\omega = \sqrt{\omega_x^2 + \omega_z^2}$ of the sphere at each target track point;

wherein $v_x$ represents the movement speed of the sphere on the x axis at the target track point, $v_y$ represents the movement speed of the sphere on the y axis at the target track point, $v_z$ represents the movement speed of the sphere on the z axis at the target track point, $g$ represents the gravitational acceleration, $m$ represents the mass of the sphere, $C_D$ represents the viscosity coefficient, $\rho$ represents the air density, $A$ represents the cross-sectional area of the sphere, $C_M$ represents the Magnus force coefficient, $r$ represents the radius of the sphere, $\omega$ represents the rotational speed of the sphere, $\omega_x$ is the rotational speed on the x axis, $\omega_z$ is the rotational speed on the z axis, $\bar{v}_k$ represents the movement speed at target track point $k$, $T$ is the sampling step length, and $k$ is the index of the target track point.
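A minimal numerical sketch of this spin-recovery step, assuming gravity along the negative y axis, spin only about the x and z axes, and the standard gravity-plus-drag-plus-Magnus force model implied by the symbols above. The default mass, radius, and coefficient values are illustrative (roughly table-tennis-like) assumptions, not values taken from the patent:

```python
import numpy as np

def spin_from_motion(v, v_next, T, m=2.7e-3, r=0.02, rho=1.29,
                     A=None, C_D=0.4, C_M=0.6):
    """Recover spin rates (omega_x, omega_z) and total spin omega from two
    consecutive velocity samples one sampling step T apart, assuming
    gravity acts along -y and the ball spins only about the x and z axes.
    Parameter defaults are illustrative, not from the patent."""
    if A is None:
        A = np.pi * r ** 2                              # cross-sectional area
    a = (np.asarray(v_next) - np.asarray(v)) / T        # finite-difference acceleration
    vb = (np.asarray(v) + np.asarray(v_next)) / 2.0     # mid-point velocity
    kD = C_D * rho * A / (2 * m)                        # drag factor
    kM = C_M * rho * A * r / (2 * m)                    # Magnus factor
    speed = np.linalg.norm(vb)
    # Invert the x- and z-components of the dynamics for the spin rates:
    #   a_x = -kD*|v|*v_x - kM*omega_z*v_y
    #   a_z = -kD*|v|*v_z + kM*omega_x*v_y
    omega_z = -(a[0] + kD * speed * vb[0]) / (kM * vb[1])
    omega_x = (a[2] + kD * speed * vb[2]) / (kM * vb[1])
    return omega_x, omega_z, np.hypot(omega_x, omega_z)
```

The x and z dynamics components are linear in the two spin rates, so they can be inverted directly from one velocity pair; the y component (which also contains gravity) is left available as a consistency check.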
In some embodiments of the present application, the above-mentioned determining device for a rotational speed of a sphere may further include a smear removing unit, specifically configured to: and eliminating the smear in each frame of image to obtain each frame of image after eliminating the smear.
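The patent does not detail how the smear is eliminated; one simple candidate (an assumption here, not the patented method) is temporal-median background subtraction with a threshold, under which the dense ball blob survives while the fainter trailing smear is suppressed:

```python
import numpy as np

def suppress_smear(frames, thresh=50):
    """Candidate smear suppression: the per-pixel temporal median over a
    short window approximates a clean background (the fast-moving ball
    occupies any pixel only briefly); subtracting it and thresholding
    keeps the dense ball blob while fainter trailing smear falls below
    the threshold. frames: (N, H, W) grayscale stack -> (N, H, W) masks."""
    stack = np.asarray(frames, dtype=float)
    background = np.median(stack, axis=0)      # (H, W) background estimate
    diff = np.abs(stack - background)          # per-frame foreground response
    return (diff > thresh).astype(np.uint8)    # binary ball masks
```

The threshold value is an illustrative assumption; in practice it would be tuned to the camera's exposure and the smear intensity.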
In one embodiment, the determining device of the rotation speed of the sphere may further include a depth camera and a sphere reconstruction unit. The depth camera may be a structured-light camera, a TOF camera, a binocular camera, or the like, and is configured to continuously collect multiple frames of images during the flight of the sphere and to further acquire depth information of the sphere. The sphere reconstruction unit is configured to reconstruct a three-dimensional model of the sphere's flight trajectory from the depth information of the sphere; this three-dimensional trajectory model can be provided to users to meet the needs of analyzing and evaluating training effects in complex multi-ball table tennis training.
It should be noted that, for convenience and brevity of description, the specific working process of the above-mentioned sphere rotation speed determining device 400 may refer to the corresponding process of the method described in fig. 1 to 3, and will not be described herein again.
Fig. 5 is a schematic diagram of a terminal according to an embodiment of the present application. The terminal 5 may include: a processor 50, a memory 51 and a computer program 52 stored in the memory 51 and executable on the processor 50, such as a program for determining the rotational speed of a sphere. The processor 50, when executing the computer program 52, implements the steps of the above-described embodiments of the method for determining the rotational speed of each sphere, such as steps S101 to S105 shown in fig. 1. Alternatively, the processor 50 implements the functions of the modules/units in the above-described device embodiments when executing the computer program 52, such as the image acquisition unit 401, the spatial coordinate determination unit 402, the movement speed determination unit 403, and the rotation speed determination unit 404 shown in fig. 4.
The computer program may be divided into one or more modules/units which are stored in the memory 51 and executed by the processor 50 to complete the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing the specified functions, which instruction segments describe the execution of the computer program in the terminal.
For example, the computer program may be split into: an image acquisition unit, a space coordinate determination unit, a movement speed determination unit and a rotation speed determination unit. The specific functions of each unit are as follows: the image acquisition unit is used for acquiring multi-frame images obtained by continuously acquiring the ball movement process and a time stamp corresponding to each frame of image in the multi-frame images; a space coordinate determining unit, configured to determine a space coordinate of a target track point of the sphere in each frame of image, where the target track point is a track point in the sphere motion track; a motion speed determining unit, configured to determine a motion speed of the sphere at each target track point according to a spatial coordinate of the target track point of the sphere in each frame image and a timestamp corresponding to each frame image; and the rotating speed determining unit is used for calculating the rotating speed of the sphere at each target track point based on the moving speed of the sphere at each target track point.
The terminal may include, but is not limited to, a processor 50, a memory 51. It will be appreciated by those skilled in the art that fig. 5 is merely an example of a terminal and is not intended to be limiting, and that more or fewer components than shown may be included, or certain components may be combined, or different components may be included, for example, the terminal may also include input and output devices, network access devices, buses, etc.
The processor 50 may be a central processing unit (Central Processing Unit, CPU), another general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field-programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 51 may be an internal storage unit of the terminal, such as a hard disk or a memory of the terminal. The memory 51 may also be an external storage device of the terminal, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card) or the like, which are provided on the terminal. Further, the memory 51 may also include both an internal storage unit and an external storage device of the terminal. The memory 51 is used for storing the computer program as well as other programs and data required by the terminal. The memory 51 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional units and modules is illustrated, and in practical application, the above-described functional distribution may be performed by different functional units and modules according to needs, i.e. the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-described functions. The functional units and modules in the embodiment may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit, where the integrated units may be implemented in a form of hardware or a form of a software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working process of the units and modules in the above system may refer to the corresponding process in the foregoing method embodiment, which is not described herein again.
In the foregoing embodiments, each embodiment is described with its own emphasis; for parts that are not described or illustrated in detail in a particular embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal and method may be implemented in other manners. For example, the apparatus/terminal embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated modules/units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the present application may implement all or part of the flow of the methods of the above embodiments by instructing related hardware through a computer program, which may be stored in a computer readable storage medium; when the computer program is executed by a processor, it implements the steps of each method embodiment described above. The computer program comprises computer program code, which may be in source code form, object code form, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and so forth. It should be noted that the content contained in the computer readable medium may be adjusted as required by legislation and patent practice in each jurisdiction; for example, in some jurisdictions, in accordance with legislation and patent practice, the computer readable medium does not include electrical carrier signals and telecommunications signals.
The above embodiments are only for illustrating the technical solution of the present application, and are not limiting; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present application, and are intended to be included in the scope of the present application.
Claims (10)
1. A method for determining rotational speed of a sphere, comprising:
acquiring a multi-frame image obtained by continuously acquiring a ball movement process and a time stamp corresponding to each frame of image in the multi-frame image; the movement of the sphere is flying movement;
determining the space coordinates of target track points of the spheres in each frame of image, wherein the target track points are track points in the movement track of the spheres;
determining the movement speed of the sphere on each target track point according to the space coordinates of the target track points of the sphere in each frame image and the corresponding time stamp of each frame image;
calculating the acceleration of the sphere at each target track point based on the movement speed of the sphere at each target track point;
calculating the rotating speed of the sphere on each target track point according to the acceleration of the sphere on each target track point and the moving speed of the sphere on each target track point;
the calculating the rotation speed of the sphere at each target track point according to the acceleration of the sphere at each target track point and the movement speed of the sphere at each target track point comprises the following steps:
according to the calculation result of the formula

$\bar{a}_k = \frac{v_{k+1} - v_k}{T}, \quad \bar{v}_k = \frac{v_k + v_{k+1}}{2}$

and the formula

$\begin{cases} a_x = -\frac{C_D \rho A}{2m}\lVert v\rVert v_x - \frac{C_M \rho A r}{2m}\,\omega_z v_y \\ a_y = -g - \frac{C_D \rho A}{2m}\lVert v\rVert v_y + \frac{C_M \rho A r}{2m}(\omega_z v_x - \omega_x v_z) \\ a_z = -\frac{C_D \rho A}{2m}\lVert v\rVert v_z + \frac{C_M \rho A r}{2m}\,\omega_x v_y \end{cases}$

calculating to obtain the rotating speed $\omega = \sqrt{\omega_x^2 + \omega_z^2}$ of the sphere at each target track point;

wherein $v_x$ represents the movement speed of the sphere on the x axis at the target track point, $v_y$ represents the movement speed of the sphere on the y axis at the target track point, $v_z$ represents the movement speed of the sphere on the z axis at the target track point, $g$ represents the gravitational acceleration, $m$ represents the mass of the sphere, $C_D$ represents the viscosity coefficient, $\rho$ represents the air density, $A$ represents the cross-sectional area of the sphere, $C_M$ represents the Magnus force coefficient, $r$ represents the radius of the sphere, $\omega$ represents the rotational speed of the sphere, $\omega_x$ is the rotational speed on the x axis, $\omega_z$ is the rotational speed on the z axis, $T$ is the sampling step length, $k$ is the index of the target track point, and $\bar{v}_k$ represents the movement speed of the target track point $k$.
2. The method for determining the rotational speed of a sphere according to claim 1, wherein determining the spatial coordinates of the target trajectory point of the sphere in each frame of image comprises:
performing image recognition on the multi-frame images to obtain the pixel coordinates of the center of the sphere in each frame of image;

and converting the pixel coordinates to obtain the space coordinates of the target track point of the sphere in each frame of image.
3. A method of determining rotational speed of a sphere as claimed in claim 2, wherein the movement of the sphere is a flying movement; the determining the movement speed of the sphere on each target track point according to the space coordinates of the target track point of the sphere in each frame image and the time stamp corresponding to each frame image comprises the following steps:
if the movement track of the sphere comprises a drop point, determining the movement speed of the sphere on each target track point in the movement track before the sphere moves to the drop point according to the space coordinates of the target track point of the sphere in each frame of image and the time stamp corresponding to each frame of image;
if the movement track of the sphere does not contain a drop point, determining the movement speed of the sphere on each target track point on the movement track of the sphere according to the space coordinates of the target track points of the sphere in each frame of image and the time stamp corresponding to each frame of image.
4. A method of determining the rotational speed of a sphere as claimed in any one of claims 1 to 3, comprising, prior to said determining the spatial coordinates of the target locus point of said sphere in each frame of image:
and eliminating the smear in each frame of image to obtain each frame of image after eliminating the smear.
5. A device for determining a rotational speed of a sphere, comprising:
the image acquisition unit is used for acquiring multi-frame images obtained by continuously acquiring the ball movement process and a time stamp corresponding to each frame of image in the multi-frame images; the movement of the sphere is flying movement;
a space coordinate determining unit, configured to determine a space coordinate of a target track point of the sphere in each frame of image, where the target track point is a track point in the sphere motion track;
a motion speed determining unit, configured to determine a motion speed of the sphere at each target track point according to a spatial coordinate of the target track point of the sphere in each frame image and a timestamp corresponding to each frame image;
a rotation speed determining unit for calculating acceleration of the sphere at each target track point based on the movement speed of the sphere at each target track point; calculating the rotating speed of the sphere at each target track point according to the acceleration of the sphere at each target track point and the movement speed of the sphere at each target track point; and for obtaining, according to the formula $\bar{a}_k = \frac{v_{k+1} - v_k}{T}, \ \bar{v}_k = \frac{v_k + v_{k+1}}{2}$, the calculation result $\bar{a}_k$; and calculating, according to the calculation result $\bar{a}_k$ and the formula

$\begin{cases} a_x = -\frac{C_D \rho A}{2m}\lVert v\rVert v_x - \frac{C_M \rho A r}{2m}\,\omega_z v_y \\ a_y = -g - \frac{C_D \rho A}{2m}\lVert v\rVert v_y + \frac{C_M \rho A r}{2m}(\omega_z v_x - \omega_x v_z) \\ a_z = -\frac{C_D \rho A}{2m}\lVert v\rVert v_z + \frac{C_M \rho A r}{2m}\,\omega_x v_y \end{cases}$

the rotating speed $\omega = \sqrt{\omega_x^2 + \omega_z^2}$ of the sphere at each target track point;

wherein $v_x$ represents the movement speed of the sphere on the x axis at the target track point, $v_y$ represents the movement speed of the sphere on the y axis at the target track point, $v_z$ represents the movement speed of the sphere on the z axis at the target track point, $g$ represents the gravitational acceleration, $m$ represents the mass of the sphere, $C_D$ represents the viscosity coefficient, $\rho$ represents the air density, $A$ represents the cross-sectional area of the sphere, $C_M$ represents the Magnus force coefficient, $r$ represents the radius of the sphere, $\omega$ represents the rotational speed of the sphere, $\omega_x$ is the rotational speed on the x axis, $\omega_z$ is the rotational speed on the z axis, $T$ is the sampling step length, $k$ is the index of the target track point, and $\bar{v}_k$ represents the movement speed of the target track point $k$.
6. The device for determining the rotational speed of a sphere according to claim 5, wherein the spatial coordinate determining unit is further specifically configured to:
performing image recognition on the multi-frame images to obtain the pixel coordinates of the center of the sphere in each frame of image;

and converting the pixel coordinates to obtain the space coordinates of the target track point of the sphere in each frame of image.
7. The apparatus according to claim 5, further comprising a smear removal unit for removing smear in each of the frame images.
8. The apparatus for determining rotational speed of a sphere according to claim 5, further comprising a depth camera and sphere reconstruction unit, wherein:
the depth camera is used for continuously acquiring multi-frame images in the ball flight process and acquiring depth information of the ball;
the sphere reconstruction unit is used for reconstructing a three-dimensional model of the sphere flight track according to the depth information of the sphere.
9. A terminal comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 4 when the computer program is executed.
10. A computer readable storage medium storing a computer program, characterized in that the computer program when executed by a processor implements the steps of the method according to any one of claims 1 to 4.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110556785.0A CN113362366B (en) | 2021-05-21 | 2021-05-21 | Sphere rotation speed determining method and device, terminal and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113362366A CN113362366A (en) | 2021-09-07 |
CN113362366B (en) | 2023-07-04
Family
ID=77527061
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110556785.0A Active CN113362366B (en) | 2021-05-21 | 2021-05-21 | Sphere rotation speed determining method and device, terminal and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113362366B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103364579A (en) * | 2013-07-02 | 2013-10-23 | 北京理工大学 | Method for predicting ping-pong spin angle velocity of ping-pong robot |
CN105678802A (en) * | 2014-04-21 | 2016-06-15 | 杨祖立 | Method for generating three-dimensional information by identifying two-dimensional image |
CN108731670A (en) * | 2018-05-18 | 2018-11-02 | 南京航空航天大学 | Inertia/visual odometry combined navigation locating method based on measurement model optimization |
CN110135308A (en) * | 2019-04-30 | 2019-08-16 | 天津工业大学 | A kind of direct free kick type identification method based on video analysis |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7020046B1 (en) * | 2003-06-16 | 2006-03-28 | The United States Of America As Represented By The Secretary Of The Navy | System and method for target motion analysis with intelligent parameter evaluation plot |
US8542898B2 (en) * | 2010-12-16 | 2013-09-24 | Massachusetts Institute Of Technology | Bayesian inference of particle motion and dynamics from single particle tracking and fluorescence correlation spectroscopy |
WO2012158432A2 (en) * | 2011-05-09 | 2012-11-22 | Aptima Inc | Systems and methods for scenario generation and monitoring |
CN106780620B (en) * | 2016-11-28 | 2020-01-24 | 长安大学 | Table tennis motion trail identification, positioning and tracking system and method |
CN109001484B (en) * | 2018-04-18 | 2021-04-02 | 广州视源电子科技股份有限公司 | Method and device for detecting rotation speed |
CN111754549B (en) * | 2020-06-29 | 2022-10-04 | 华东师范大学 | Badminton player track extraction method based on deep learning |
Also Published As
Publication number | Publication date |
---|---|
CN113362366A (en) | 2021-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107820593B (en) | Virtual reality interaction method, device and system | |
CN111354042B (en) | Feature extraction method and device of robot visual image, robot and medium | |
US11582426B2 (en) | Method and apparatus for sensing moving ball | |
US11676312B2 (en) | Object trajectory simulation | |
CN105359054B (en) | Equipment is positioned and is orientated in space | |
US20120277890A1 (en) | Method of Ball Game Motion Recognition, Apparatus for the same, and motion assisting device | |
CN105797319B (en) | A kind of badminton data processing method and device | |
CN103916586B (en) | image analysis apparatus and image analysis method | |
CN109001484B (en) | Method and device for detecting rotation speed | |
CN115624735B (en) | Auxiliary training system for ball games and working method | |
CN110567484B (en) | Method and device for calibrating IMU and rigid body posture and readable storage medium | |
CN107370941A (en) | A kind of information processing method and electronic equipment | |
CN104732560B (en) | Virtual video camera image pickup method based on motion capture system | |
CN101879376A (en) | Realization method for gyro sensor in interactive games | |
CN111736190B (en) | Unmanned aerial vehicle airborne target detection system and method | |
CN113362366B (en) | Sphere rotation speed determining method and device, terminal and storage medium | |
CN114519740A (en) | Track calculation device, track calculation method, and computer-readable recording medium | |
CN110910489B (en) | Monocular vision-based intelligent court sports information acquisition system and method | |
CN116012417A (en) | Track determination method and device of target object and electronic equipment | |
CN114241602B (en) | Deep learning-based multi-objective moment of inertia measurement and calculation method | |
CN113724176A (en) | Multi-camera motion capture seamless connection method, device, terminal and medium | |
WO2015146155A1 (en) | Swing data compression method, swing data compression device, swing analysis device, and swing data compression program | |
CN106650659B (en) | Method and device for identifying motion parameters of bat | |
Wang | [Retracted] Parameter Testing and System of Skiing Aerial Skills under the Background of Artificial Intelligence | |
Chen et al. | Prediction of ping-pong ball trajectory based on neural network using player’s body motions |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||