CN114733189A - Control method, device, medium and electronic equipment for somatosensory ball hitting - Google Patents


Info

Publication number
CN114733189A
Authority
CN
China
Prior art keywords
swing
racket
data
time point
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202210422679.8A
Other languages
Chinese (zh)
Inventor
郑悠
安鹏
徐力
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ningbo University of Technology
Original Assignee
Ningbo University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ningbo University of Technology
Priority to CN202210422679.8A
Publication of CN114733189A
Legal status: Withdrawn

Classifications

    • A — HUMAN NECESSITIES
        • A63 — SPORTS; GAMES; AMUSEMENTS
            • A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
                • A63F 13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions
                    • A63F 13/20 — Input arrangements for video game devices
                        • A63F 13/21 — Input arrangements characterised by their sensors, purposes or types
                            • A63F 13/212 — using sensors worn by the player, e.g. for measuring heart beat or leg activity
                    • A63F 13/40 — Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
                        • A63F 13/42 — by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
                            • A63F 13/428 — involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
                    • A63F 13/55 — Controlling game characters or game objects based on the game progress
                        • A63F 13/57 — Simulating properties, behaviour or motion of objects in the game world, e.g. computing tyre load in a car race game
                            • A63F 13/573 — using trajectories of game objects, e.g. of a golf ball according to the point of impact
                    • A63F 13/80 — Special adaptations for executing a specific game genre or game mode
                        • A63F 13/812 — Ball games, e.g. soccer or baseball
                • A63F 2300/00 — Features of games using an electronically generated display having two or more dimensions, showing representations related to the game
                    • A63F 2300/80 — specially adapted for executing a specific type of game
                        • A63F 2300/8011 — Ball

Abstract

The disclosure provides a control method, device, medium and electronic device for somatosensory ball hitting. The method performs standard ball hitting analysis on swing data, using first flight data of a virtual ball and a swing video of the user swinging a racket, to determine virtual ball hitting data; second flight data of the virtual ball after it is hit by the racket are then determined from the virtual ball hitting data and the first flight data. This not only makes the sports scene generated by the motion sensing device more realistic, but also compels the user to adjust his or her action, thereby correcting the user's technical action.

Description

Control method, device, medium and electronic equipment for somatosensory ball hitting
Technical Field
The disclosure relates to the technical field of human-computer interaction, and in particular to a control method, device, medium and electronic device for somatosensory ball hitting.
Background
A motion sensing game is an electronic game operated by transmitting changes in body motion to the game device through sensors, for example a tennis or badminton game.
At present, ball-type motion sensing games neglect the user's hitting experience, so the user experience is relatively poor.
The present disclosure therefore provides a control method for somatosensory ball hitting to solve one of the above technical problems.
Disclosure of Invention
An object of the present disclosure is to provide a control method, device, medium and electronic device for somatosensory ball hitting that can solve at least one of the above technical problems. The specific scheme is as follows:
according to a specific embodiment of the present disclosure, in a first aspect, the present disclosure provides a method for controlling a somatosensory ball, including:
acquiring first flight data of a virtual sphere;
acquiring, in real time through cooperation of a motion sensing device and a racket, swing data and a swing video of the racket with respect to the virtual sphere;
performing standard ball hitting analysis on the swing data based on the first flight data and the swing video, and determining virtual ball hitting data;
determining second flight data of the virtual ball after being struck by the racket based on the virtual ball striking data and the first flight data.
According to a second aspect, the present disclosure provides a control device for somatosensory ball hitting, comprising:
the first acquiring unit is used for acquiring first flight data of the virtual sphere;
the second acquiring unit is used for acquiring, in real time through cooperation of the motion sensing device and the racket, swing data and a swing video of the racket with respect to the virtual sphere;
a first determining unit, configured to perform standard ball hitting analysis on the swing data based on the first flight data and the swing video, and determine virtual ball hitting data;
a second determining unit for determining second flight data of the virtual ball after being hit by the racket based on the virtual hitting data and the first flight data.
According to a third aspect, there is provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements a control method for somatosensory ball hitting as described in any one of the above.
According to a fourth aspect, the present disclosure provides an electronic device, comprising: one or more processors; and a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement a control method for somatosensory ball hitting as described in any one of the above.
Compared with the prior art, the scheme of the embodiment of the disclosure at least has the following beneficial effects:
the disclosure provides a method, a device, a medium and an electronic device for controlling somatosensory hitting. The method comprises the steps of performing standard batting analysis on swing data through first flight data of a virtual ball body and swing videos of a user swinging a racket to determine virtual batting data; and then determining second flight data after the virtual ball body is hit by the racket through the virtual hitting data and the first flight data. The sports scene generated by the motion sensing device is more real, and the user can be forced to change the action, so that the aim of correcting the technical action of the user is fulfilled.
Drawings
Fig. 1 shows a flowchart of a control method for somatosensory ball hitting according to an embodiment of the disclosure;
FIG. 2 shows a device relationship diagram according to an embodiment of the present disclosure;
FIG. 3 shows a schematic diagram of an apparatus composition according to an embodiment of the present disclosure;
FIG. 4 shows a schematic view of a virtual sphere and a racquet in three-dimensional space at any point in time of a swing according to an embodiment of the present disclosure;
fig. 5 shows a block diagram of the units of a control device for somatosensory ball hitting according to an embodiment of the disclosure;
FIG. 6 is a schematic diagram illustrating an electronic device connection structure provided in accordance with an embodiment of the present disclosure;
description of the reference numerals
21-somatosensory device, 22-display device, 23-racket;
211-camera, 212-positioning device, 213-wireless communication device;
231-preset measuring site, 232-gravity sensor.
Detailed Description
To make the objects, technical solutions and advantages of the present disclosure clearer, the present disclosure will be described in further detail with reference to the accompanying drawings, and it is apparent that the described embodiments are only a part of the embodiments of the present disclosure, rather than all embodiments. All other embodiments, which can be derived by one of ordinary skill in the art from the embodiments disclosed herein without making any creative effort, shall fall within the scope of protection of the present disclosure.
The terminology used in the embodiments of the disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in the disclosed embodiments and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise, and "the plural" typically includes at least two.
It should be understood that the term "and/or" as used herein merely describes an association between related objects, meaning that three relationships may exist; for example, "A and/or B" may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the character "/" herein generally indicates that the former and latter related objects are in an "or" relationship.
It should be understood that although the terms first, second, third, etc. may be used in the embodiments of the present disclosure, these descriptions should not be limited to these terms. These terms are only used to distinguish one description from another. For example, a first could also be termed a second, and, similarly, a second could also be termed a first, without departing from the scope of embodiments of the present disclosure.
The words "if", as used herein, may be interpreted as "at … …" or "at … …" or "in response to a determination" or "in response to a detection", depending on the context. Similarly, the phrases "if determined" or "if detected (a stated condition or event)" may be interpreted as "when determined" or "in response to a determination" or "when detected (a stated condition or event)" or "in response to a detection (a stated condition or event)", depending on the context.
It is also noted that the terms "comprises", "comprising", or any other variation thereof, are intended to cover a non-exclusive inclusion, such that an article or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such article or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other like elements in the article or apparatus that includes the element.
It is to be noted that the symbols and/or numbers present in the description are not reference numerals if they are not already marked in the description of the figures.
Alternative embodiments of the present disclosure are described in detail below with reference to the accompanying drawings.
Example 1
This embodiment provided by the present disclosure is an embodiment of a control method for somatosensory ball hitting.
The embodiments of the present disclosure are described in detail below with reference to fig. 1.
Step S101, first flight data of the virtual sphere are obtained.
The method is applied to ball hitting games run on the motion sensing device 21, such as tennis, badminton and table tennis games. As shown in fig. 2 and fig. 3, a virtual competitive scene is generated by the motion sensing device 21 together with the display device 22 and the racket 23. The user is immersed in the virtual competitive scene by viewing the game image on the display device 22, holding the racket 23 and striking a virtual sphere in the virtual competitive scene. The motion sensing device 21 collects the user's hitting information through various sensors and generates return-ball information based on the hitting information, thereby realizing virtual competition. The display device 22 comprises a display screen and/or a VR head-mounted display and is used for displaying the immersive virtual competitive scene; the user judges the flight direction and flight speed of the virtual sphere by watching the virtual competitive scene, so as to hit the virtual sphere back. The motion sensing device 21 comprises at least a camera 211, a wireless communication device 213 and a positioning device 212. The camera 211 is used for collecting motion videos of the user participating in the sport; the positioning device 212 cooperates with the preset measurement point 231 arranged on the racket 23 to locate the preset measurement point 231 in three-dimensional space. The racket 23 provides the competitive function and comprises at least a gravity sensor 232 and the preset measurement point 231. During the swing, the racket 23 collects its acceleration in the gravity direction through the gravity sensor 232 and transmits it wirelessly to the motion sensing device 21, which analyzes the acceleration in the gravity direction. To ensure data consistency between the racket 23 and the motion sensing device 21, the racket 23 needs to be time-synchronized with the motion sensing device 21 when the motion sensing game starts.
The first flight data are the flight data of the virtual sphere as it flies towards the user's side in the virtual competitive scene. The first flight data comprise at least a plurality of first flight time points during the flight of the virtual sphere after a virtual opponent hits it with a virtual racket in the virtual competitive scene, and a first flight position and a first flight speed corresponding to each first flight time point. The more first flight time points there are, the finer the granularity of the flight trajectory, the smoother the trajectory, and the more realistic the user's competitive experience. During a rally between the user and the virtual opponent, the first flight data are generated by the motion sensing device 21 from the second flight data of the virtual sphere flying towards the virtual opponent's side after the user hits the ball; when the virtual opponent serves, the first flight data are generated by the motion sensing device 21 according to the preset ability of the virtual opponent.
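By way of illustration only, such sampled first flight data could be represented as in the following sketch; the class and field names are assumptions introduced here and are not part of the disclosure.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class FlightSample:
    """One sampled point of the virtual sphere's flight (illustrative names)."""
    t: float                               # first flight time point, seconds on the shared game clock
    position: Tuple[float, float, float]   # first flight position in the scene's 3-D frame, metres
    velocity: Tuple[float, float, float]   # first flight speed with its direction, metres per second

# "First flight data" is then an ordered list of such samples;
# a finer sampling step yields a smoother, more realistic trajectory.
first_flight_data: List[FlightSample] = []
```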
Step S102, obtaining, in real time through cooperation of the motion sensing device 21 and the racket 23, swing data and a swing video of the racket 23 with respect to the virtual sphere.
The motion sensing device 21 and the racket 23 exchange information via wireless and/or wired communication.
The swing data are the motion data of the racket 23 generated while the user swings the racket 23 at the flying virtual sphere.
The swing video refers to a video captured by the camera 211 in the motion sensing device 21 when the user swings the racket 23, and a time stamp is set in the swing video.
In some embodiments, the swing data includes a plurality of swing time points of the preset measurement point 231 on the racket 23 during the swing and the swing position, swing speed and acceleration in the gravity direction corresponding to each swing time point.
For example, as shown in fig. 2, the preset measurement point 231 is arranged at the junction between the face and the handle of the racket 23, a position that does not interfere with hitting and is as close to the face as possible, so that the located motion trajectory of the racket 23 is closer to the real one. The more swing time points there are, the closer the located trajectory of the swung racket 23 is to the real trajectory, the more realistic the generated virtual ball hitting data, and the more realistic the user's competitive experience.
The swing position is a position of the preset measurement point 231 in the three-dimensional space at the swing time point.
The swing speed is a moving speed of the preset measurement point 231 in the three-dimensional space at the swing time point. The swing speed has a direction of motion.
The acceleration in the gravity direction is the sum of the components, along the gravity direction, of the racket 23's accelerations in all directions during the swing.
Correspondingly, the step of acquiring the swing data of the racket 23 with respect to the virtual sphere in real time through cooperation of the motion sensing device 21 and the racket 23 includes the following steps:
step S102-1, a positioning device 212 on the motion sensing device 21 is used to perform real-time positioning on the preset measurement point 231 on the racket 23, and obtain a plurality of swing time points of the preset measurement point 231 in the swing and swing positions and swing speeds corresponding to the swing time points.
The positioning device 212 includes: radar rangefinder, laser rangefinder, or binocular camera 211.
The radar range finder includes an electromagnetic wave emitting module and an electromagnetic wave receiving module, the electromagnetic wave emitting module and the electromagnetic wave receiving module form a triangular relationship with a preset measuring point 231 on the racket 23, and triangular ranging is performed in the triangular relationship by using emitted electromagnetic waves at any swing time point, so that the swing position of the racket 23 is obtained.
Similarly, the laser range finder includes a laser emitting module and a laser receiving module, the laser emitting module and the laser receiving module form a triangular relationship with the preset measuring point 231 on the racket 23, and the emitted laser is utilized to perform triangular range finding in the triangular relationship at any swinging time point, so as to obtain the swinging position of the racket 23.
The binocular camera 211 consists of two cameras 211; each camera captures and recognizes the preset measurement point 231, the point is located in each of the two images, and the swing position of the racket 23 is determined from the disparity between the two locations.
The swing speed at the earlier of two adjacent swing time points can be obtained from the swing positions corresponding to those two adjacent swing time points.
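The sketch below illustrates, under simplifying assumptions, how a binocular positioning device could recover the swing position from one rectified stereo image pair and how the swing speed could then be obtained by a finite difference between adjacent swing time points. The function names, the rectified pinhole-camera model and the parameters (focal length f in pixels, baseline b in metres, principal point cx, cy) are assumptions for illustration, not the patented implementation.

```python
import numpy as np

def stereo_position(u_left, v_left, u_right, f, b, cx, cy):
    """Locate the preset measurement point from one rectified stereo pair.
    Assumes pinhole cameras with principal point (cx, cy); disparity in pixels."""
    disparity = u_left - u_right
    z = f * b / disparity                     # depth from the standard triangulation relation
    x = (u_left - cx) * z / f
    y = (v_left - cy) * z / f
    return np.array([x, y, z])                # swing position in the left-camera frame, metres

def swing_speed(p_prev, p_next, t_prev, t_next):
    """Swing speed (a vector, i.e. carrying a direction) at the earlier of two
    adjacent swing time points, by finite difference of their swing positions."""
    return (np.asarray(p_next) - np.asarray(p_prev)) / (t_next - t_prev)
```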
In step S102-2, the wireless communication device 213 of the motion sensing device 21 obtains the acceleration in the gravity direction corresponding to each swing time point collected by the gravity sensor 232 disposed on the racket 23.
The motion sensing device 21 receives the information transmitted by the racket 23 through the wireless communication device 213. Freeing the racket 23 from the constraint of wired communication makes the user's competitive experience more realistic.
The gravity sensor 232 works on the principle of the piezoelectric effect: when an external force is applied to a heteropolar crystal that has no center of symmetry, the crystal not only deforms but its polarization state also changes, establishing an electric field inside the crystal; this polarization of a dielectric under mechanical force is called the direct piezoelectric effect.
The gravity sensor 232 exploits the deformation of the crystal inside it caused by acceleration. Since this deformation generates a voltage, the acceleration can be converted into a voltage output once the relationship between the generated voltage and the applied acceleration is known. There are, of course, many other ways to build an acceleration sensor, for example based on the capacitive effect, the thermal bubble effect or the optical effect, but the basic principle is the same: a medium deforms under acceleration, and the deformation is measured and converted into a voltage output by an associated circuit.
In the foregoing embodiment, there is no precedence relationship between step S102-1 and step S102-2, step S102-2 may also be executed before step S102-1, or step S102-1 and step S102-2 may also be executed simultaneously, which is not limited in this embodiment.
Step S103, performing standard ball hitting analysis on the swing data based on the first flight data and the swing video, and determining virtual ball hitting data.
The disclosed embodiments take the swing video as one of the bases of the standard ball hitting analysis, in order to determine the validity of the virtual ball hitting data.
The standard ball hitting analysis is to analyze whether the ball hitting action of the user meets the action standard.
In some embodiments, the virtual ball striking data includes a first swing angle and a first swing speed at which the racket 23 strikes the virtual sphere.
The first swing angle and the first swing speed are both the angle and speed of the racket 23 when hitting the virtual sphere.
Both the influence of the racket's swing speed on the virtual sphere and the influence of the angle at which the racket hits the virtual sphere are taken into account, which improves the realism of the hit and safeguards the user experience.
Correspondingly, performing standard ball hitting analysis on the swing data based on the first flight data and the swing video and determining the virtual ball hitting data includes the following steps:
in step S103-1, corresponding swing angles are acquired based on the respective accelerations.
Wherein, the swing angle is an included angle between a midline of the racket face of the racket 23 and a horizontal plane.
The midline of the racket face of the racket 23 refers to the extension line of the racket face of the racket 23 along the handle direction of the racket 23.
The way the corresponding swing angle is obtained from each acceleration is not described in detail in this embodiment; reference may be made to the prior art.
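One common prior-art possibility, sketched below, estimates the tilt of the face midline from accelerometer readings. It assumes a full three-axis reading aligned with the racket (whereas the racket 23 above reports only the gravity-direction component) and that motion acceleration is small compared with gravity, so it is only an illustrative stand-in for whatever prior-art method is used.

```python
import math

def swing_angle_from_acceleration(ax, ay, az):
    """Estimate the angle (degrees) between the racket-face midline and the
    horizontal plane, assuming the sensor's x axis lies along the face midline
    and the reading is dominated by gravity (a simplified static-tilt estimate)."""
    g_perpendicular = math.sqrt(ay * ay + az * az)   # gravity seen by the other two axes
    return math.degrees(math.atan2(ax, g_perpendicular))
```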
Step S103-2, determining, based on the swing position, a preset racket face area and the swing angle at each swing time point, the area position of the face of the racket 23 corresponding to that swing time point.
The area position can be understood as a planar region enclosed by a plurality of three-dimensional spatial positions at the corresponding swing time point. The area of this planar region equals the preset racket face area, and the position and angle of the planar region in three-dimensional space are consistent with the swing position and the swing angle, respectively. In other words, the area position represents the posture of the racket 23 in three-dimensional space at the corresponding swing time point.
And step S103-3, determining the posture data of the racket 23 corresponding to the swing time point based on the swing position, the area position and the swing angle of each swing time point.
The racket 23 pose data at least comprises: swing position, area position, and swing angle of the racket 23.
And step S103-4, performing hitting analysis on the first flight data and all posture data of the racket 23 in the swing, and acquiring a first swing time point of the racket 23 hitting the virtual sphere.
In some embodiments, the performing a ball hitting analysis on the first flight data and all posture data of the racket 23 during the swing to obtain a first swing time point when the racket 23 hits the virtual sphere, as shown in fig. 4, comprises the following steps:
and step S103-4-1, acquiring a first flight position and a first flight speed at any swing time point from the first flight data.
The first flight data at least comprises a plurality of first flight time points in the virtual ball flight process after a virtual opponent uses a virtual racket 23 to hit the virtual ball in the virtual competitive scene, and a first flight position and a first flight speed corresponding to each first flight time point.
Step S103-4-2, a first plane is generated based on all the area positions at any swing time point, and a first straight line is generated in a direction corresponding to a first flight speed at any swing time point via a first flight position at any swing time point.
The first plane is a plane formed by all the area positions at any swing time point in the three-dimensional space. That is, all the area positions at each swing time point are on the first plane, and one first plane is generated at each swing time point.
At each swing time point, the extending direction of the first straight line is consistent with the direction of the first flying speed of the virtual sphere, and the first straight line passes through the first flying position of the virtual sphere. A first straight line is generated at each swing time point.
Step S103-4-3, determining a first intersection point of the first straight line and the first plane at any swing time point, as shown in fig. 4.
Step S103-4-4, when the first intersection point at any swing time point lies within the area position of the first plane corresponding to that swing time point, calculating a first distance between the first flight position at that swing time point and the first intersection point at the corresponding swing time point.
The racket 23 can hit the virtual sphere only when the virtual sphere flies into the area position of the first plane. If the first intersection point lies outside the area position of the first plane, the racket 23 is still far away from the virtual sphere, or the racket 23 cannot hit the virtual sphere at all.
And S103-4-5, determining the swing time point corresponding to the minimum first distance as a first swing time point.
The minimum first distance is the distance at which the virtual sphere comes closest to the racket 23.
The swing time point corresponding to the minimum first distance is the first swing time point; it can be understood as the time point at which the racket 23 strikes the virtual sphere. The more swing time points are collected, the smaller the minimum first distance becomes; when the minimum first distance is zero, the racket 23 swung by the user is in full contact with the virtual sphere, and the virtual competitive scene is all the more realistic.
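A minimal sketch of the geometric test of steps S103-4-1 to S103-4-5 follows. It assumes the area position at each swing time point is supplied as a rectangle described by a centre, a unit normal and two mutually orthogonal in-plane half-extent vectors; this representation and the input layout are assumptions made only for illustration.

```python
import numpy as np

def face_intersection(ball_pos, ball_vel, face_center, face_normal, face_u, face_v):
    """Intersect the first straight line (through ball_pos along ball_vel) with the
    first plane of the racket face; return the first intersection point if it lies
    inside the rectangular area position, else None. face_u and face_v are assumed
    to be orthogonal half-extent vectors lying in the face plane."""
    denom = np.dot(ball_vel, face_normal)
    if abs(denom) < 1e-9:                       # line parallel to the face plane
        return None
    s = np.dot(face_center - ball_pos, face_normal) / denom
    hit = ball_pos + s * ball_vel               # first intersection point
    local = hit - face_center
    inside_u = abs(np.dot(local, face_u)) <= np.dot(face_u, face_u)
    inside_v = abs(np.dot(local, face_v)) <= np.dot(face_v, face_v)
    return hit if (inside_u and inside_v) else None

def first_swing_time_point(samples):
    """samples: one dict per swing time point holding the ball's first flight
    position/speed and the racket pose at that time point; returns the swing
    time point with the minimum first distance (None if the ball never enters
    the area position)."""
    best_t, best_d = None, float("inf")
    for s in samples:
        hit = face_intersection(np.asarray(s["ball_pos"]), np.asarray(s["ball_vel"]),
                                np.asarray(s["face_center"]), np.asarray(s["face_normal"]),
                                np.asarray(s["face_u"]), np.asarray(s["face_v"]))
        if hit is None:
            continue
        d = np.linalg.norm(np.asarray(s["ball_pos"]) - hit)   # first distance
        if d < best_d:
            best_t, best_d = s["t"], d
    return best_t
```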
Step S103-5, determining a swing image in the swing video corresponding to the first swing time point.
When the motion sensing game is started, the racket 23 and the motion sensing device 21 perform time synchronization, so that the data consistency between the racket 23 and the motion sensing device 21 is ensured.
In the method described above, the swing image corresponding to the first swing time point in the swing video may be determined as the video image whose time point in the swing video is closest to the first swing time point.
A time stamp is set in the swing video. When the time point represented by a timestamp is identical to the first swing time point, the video image corresponding to that timestamp is taken as the swing image. When no timestamp coincides with the first swing time point, the two timestamps closest to the first swing time point are determined from the swing video, and the swing image corresponding to the first swing time point is then derived from those two time points. For example, suppose the first swing time point is 8:10:10.0, the two closest timestamps in the swing video are 8:10:09.2 and 8:10:10.2, and video images 1 to 28 lie between these two timestamps, one video image every 0.0357 seconds. The time point of the 22nd video image is 8:10:09.9854 and that of the 23rd video image is 8:10:10.0211, so the first swing time point lies between them. The time point closest to the first swing time point is selected from these two: the 22nd video image is 0.0146 seconds from the first swing time point and the 23rd is 0.0211 seconds away, so the 22nd video image is closest and is determined to be the swing image.
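Selecting the swing image therefore reduces to picking the frame whose timestamp is nearest to the first swing time point, as in the short sketch below (all times assumed to be expressed in seconds on the common clock established when the devices are synchronized).

```python
def nearest_swing_image(frame_timestamps, first_swing_time):
    """Return the index of the video image whose timestamp is closest to the
    first swing time point; frame_timestamps must be on the same clock as the
    racket data (the devices are time-synchronized when the game starts)."""
    return min(range(len(frame_timestamps)),
               key=lambda i: abs(frame_timestamps[i] - first_swing_time))

# Example from the text: frames every 0.0357 s between 8:10:09.2 and 8:10:10.2,
# first swing time 8:10:10.0 -> the 22nd frame (8:10:09.9854) is selected.
```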
Step S103-6, when the swing image satisfies a preset standard action condition, determining that the swing angle and swing speed corresponding to the first swing time point are the first swing angle and the first swing speed, respectively.
In this specific embodiment, the user's ball hitting action is checked through the swing image, and the virtual ball hitting data can be determined only if the hitting action meets the standard. The user is thus compelled to change the hitting action, achieving the purpose of correcting technical actions within the virtual competitive scene.
In some embodiments, when the swing image satisfies a preset standard motion condition, determining the swing angle and the swing speed corresponding to the first swing time point as the first swing angle and the first swing speed respectively includes:
and step S103-6-1, performing action characteristic identification on the swing image, and acquiring characteristic information of the batting action.
The characteristic information of the hitting action comprises: leg motion characteristic information, arm motion characteristic information, and/or abdomen motion characteristic information.
And S103-6-2, comparing the characteristic information with the standard hitting characteristic information in the virtual hitting data set to obtain an error result.
The virtual ball hitting data set may be stored in the motion sensing device 21 or in a remote server. The motion sensing device 21 accesses the remote server via the Internet and obtains the required standard ball hitting characteristic information from the virtual ball hitting data set on the remote server.
And S103-6-3, when the error result is smaller than or equal to a preset error threshold, determining the swing angle and the swing speed corresponding to the first swing time point as a first swing angle and a first swing speed respectively.
Since every user has a different body type and different action habits, this embodiment provides a tolerance when checking the ball hitting action; when the error result falls within this tolerance, the user's hitting action is considered to meet the requirement.
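Purely as an illustration, the error comparison could treat the identified leg, arm and abdomen motion features and the standard ball hitting features as numeric vectors and compare them with a simple distance; the vector encoding and the threshold value below are assumptions, since the disclosure does not specify them.

```python
import numpy as np

def hitting_action_meets_standard(user_features, standard_features, error_threshold=0.15):
    """Compare the user's ball hitting action features with the standard ball hitting
    features from the virtual ball hitting data set; the action is accepted when the
    error result does not exceed the preset error threshold, which leaves a tolerance
    for different body types and action habits."""
    error_result = np.linalg.norm(np.asarray(user_features) - np.asarray(standard_features))
    return error_result <= error_threshold
```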
In other embodiments, the method further comprises the steps of:
step S103-6-4, when the error result is greater than a preset error threshold value, determining that the first swing angle is a first flight angle corresponding to the first swing time point in the first flight data, and the first swing speed is equal to a first flight speed on the first swing time point in the first flight data, wherein the first flight angle is an angle between a direction of the first flight speed on the first swing time point and a horizontal plane included angle.
The first flight angle corresponding to the first swing time point can be understood as follows: the first flight time point closest to the first swing time point is determined from the first flight data, and the first flight angle is the one determined at that closest first flight time point.
In this embodiment, making the first swing angle and the first swing speed consistent with the flight state of the virtual sphere means that the generated second flight data will be invalid; in effect, the embodiment determines that the user has missed the shot because the user's hitting action did not meet the standard.
Step S104, determining second flight data after the virtual ball is hit by the racket 23 based on the virtual hitting data and the first flight data.
The second flight data are the flight data of the virtual sphere as it flies towards the virtual opponent's side after the user hits it in the virtual competitive scene. The second flight data comprise at least a plurality of second flight time points during the flight of the virtual sphere after it is hit by the user, and a second flight position and a second flight speed corresponding to each second flight time point. The more second flight time points there are, the finer the granularity of the flight trajectory, the smoother the trajectory, and the more realistic the user's competitive experience.
In some embodiments, the determining second flight data of the virtual ball after being hit by the racket 23 based on the virtual ball hitting data and the first flight data comprises the following steps:
and step S104-1, acquiring a first flight speed corresponding to the first swing time point from the first flight data.
The first flight speed corresponding to the first swing time point is understood as follows: the first flight time point closest to the first swing time point is determined from the first flight data, and the first flight speed is the one determined at that closest first flight time point.
Step S104-2, determining a preset contact characteristic value of the virtual sphere and the racket 23 based on the first swing angle.
The first swing angle is an angle at which the racket 23 hits the virtual sphere.
The preset contact characteristic value is determined by the characteristics of the racket 23 and the real ball body. Presetting contact characteristic values, including: the elastic force value of the real sphere, the elastic force value of the racket 23, and the friction force value of the racket 23 with the real sphere at the first swing angle.
The first swing angles correspond one-to-one to preset contact characteristic values, and different first swing angles yield different preset contact characteristic values. The correspondence may be stored in a data set, and the motion sensing device 21 looks up the first swing angle to obtain the preset contact characteristic value. The data set may be stored in the motion sensing device 21 or in a remote server.
And step S104-3, determining the initial flying speed of the virtual sphere after being hit by the racket 23 based on the first flying speed corresponding to the first swing time point, the first swing speed and the preset contact characteristic value.
The first swing speed is a speed at which the racket 23 hits the virtual sphere.
This specific embodiment takes the characteristics of the real ball and the racket as one of the bases of the user's hitting effect, which improves the realism of the hit and safeguards the user experience.
The process of obtaining the initial flying speed is not described in detail in this embodiment and may be implemented with reference to various implementations in the prior art.
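One simplified prior-art-style possibility is sketched below: the relative velocity at the first swing time point is decomposed against the racket face, the normal component is reflected with a restitution coefficient standing in for the elastic force values of the ball and racket, and the tangential component is damped to stand in for the friction value at the first swing angle. The model, coefficients and function names are assumptions for illustration, not the patented calculation.

```python
import numpy as np

def initial_flight_velocity(ball_vel, racket_vel, face_normal,
                            restitution=0.8, tangential_keep=0.9):
    """Simplified rebound model for the moment of contact: work in the racket's
    frame, reflect the normal component of the ball's velocity with a restitution
    coefficient, damp the tangential component, then return to the scene frame."""
    n = np.asarray(face_normal, dtype=float)
    n = n / np.linalg.norm(n)
    rel = np.asarray(ball_vel, dtype=float) - np.asarray(racket_vel, dtype=float)
    rel_n = np.dot(rel, n) * n                   # component along the face normal
    rel_t = rel - rel_n                          # component in the face plane
    rel_after = -restitution * rel_n + tangential_keep * rel_t
    return rel_after + np.asarray(racket_vel)    # initial flying velocity in the scene frame
```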
Step S104-4, determining the second flight data based on the initial flight speed.
Once the initial flying speed is obtained, the initial flying direction of the virtual sphere is obtained along with it, and the corresponding second flight data can therefore be derived.
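Given the initial flying speed and direction, the second flight data can then be produced by sampling a trajectory; the sketch below uses a plain ballistic model without drag or spin, an assumption made only to keep the example short.

```python
import numpy as np

def second_flight_data(start_pos, initial_velocity, dt=0.01, duration=1.5, g=9.81):
    """Sample second flight time points, positions and speeds for the virtual
    sphere after the hit, under gravity only; a smaller dt gives a finer-grained,
    smoother trajectory. Assumes z is the vertical axis of the scene frame."""
    samples = []
    p = np.asarray(start_pos, dtype=float)
    v = np.asarray(initial_velocity, dtype=float)
    t = 0.0
    while t <= duration:
        samples.append({"t": t, "position": p.copy(), "velocity": v.copy()})
        p = p + v * dt
        v = v + np.array([0.0, 0.0, -g]) * dt
        t += dt
    return samples
```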
In some embodiments, the racket 23 further includes a vibration device.
The method further comprises the steps of:
step S104a, after the virtual hitting data is determined, generating vibration data based on the virtual hitting data and feeding the vibration data back to the racket 23, and triggering the racket 23 to enable the vibration device to generate corresponding hand feeling vibration based on the vibration data.
That is, when the motion sensing device 21 determines the virtual ball hitting data, it generates vibration data positively correlated with the virtual ball hitting data, the vibration data including the amplitude and frequency of the vibration. The amplitude and frequency of the vibration are positively correlated with the first swing angle and the first swing speed in the virtual ball hitting data; in other words, the first swing angle and the first swing speed have a mapping relationship with the amplitude and frequency of the vibration. This mapping may be stored in a vibration data set, and the vibration data corresponding to the virtual ball hitting data can be obtained by a table lookup. The motion sensing device 21 feeds the vibration data back to the racket 23 through the wireless communication device 213; after the racket 23 receives the vibration data, the vibration device makes the racket 23 vibrate accordingly, so that the user feels something similar to a real ball strike.
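The positive correlation between the virtual ball hitting data and the vibration data could, for example, be realised with a simple mapping such as the sketch below; the scaling constants and value ranges are assumptions for illustration and would in practice come from the vibration data set mentioned above.

```python
def vibration_data(first_swing_angle_deg, first_swing_speed,
                   max_speed=30.0, base_freq=50.0, max_extra_freq=150.0):
    """Map the first swing angle and first swing speed to a vibration amplitude
    (0..1) and frequency (Hz) that both grow with the two quantities, so a faster
    and more steeply angled hit yields a stronger, sharper hand-feel vibration."""
    speed_factor = min(first_swing_speed / max_speed, 1.0)
    angle_factor = min(abs(first_swing_angle_deg) / 90.0, 1.0)
    amplitude = 0.3 + 0.7 * speed_factor * (0.5 + 0.5 * angle_factor)
    frequency = base_freq + max_extra_freq * speed_factor
    return {"amplitude": amplitude, "frequency_hz": frequency}
```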
The embodiment of the present disclosure performs standard ball hitting analysis on the swing data using the first flight data of the virtual sphere and the swing video of the user swinging the racket 23, and determines the virtual ball hitting data; it then determines, from the virtual ball hitting data and the first flight data, the second flight data of the virtual sphere after it is hit by the racket 23. Not only is the sports scene generated by the motion sensing device 21 more realistic, but the user can also be compelled to change his or her action, thereby correcting the user's technical action.
Example 2
The present disclosure also provides a device embodiment matching the above embodiment, for implementing the method steps described above. Explanations based on the same names and meanings are the same as in the above embodiment and have the same technical effects, so they are not repeated here.
As shown in fig. 5, the present disclosure provides a control device 500 for a somatosensory ball impact, comprising:
a first obtaining unit 501, configured to obtain first flight data of a virtual sphere;
the second obtaining unit 502 is configured to obtain, in real time through cooperation of the motion sensing device and the racket, swing data and a swing video of the racket with respect to the virtual ball;
a first determining unit 503, configured to perform standard ball hitting analysis on the swing data based on the first flight data and the swing video, and determine virtual ball hitting data;
a second determining unit 504, configured to determine second flight data after the virtual ball is hit by the racket based on the virtual ball hitting data and the first flight data.
Optionally, the swing data includes a plurality of swing time points of a preset measurement point on the racket in the swing, and swing positions, swing speeds and accelerations in the gravity direction corresponding to the swing time points;
accordingly, the second obtaining unit 502 includes:
the first acquisition subunit is used for positioning a preset measurement site on the racket in real time through a positioning device on the motion sensing equipment, and acquiring a plurality of swing time points of the preset measurement site in swing and swing positions and swing speeds corresponding to the swing time points;
and the second acquisition subunit is used for acquiring the acceleration in the gravity direction corresponding to each waving time point acquired by the gravity sensor on the racket through the wireless communication device on the body sensing equipment.
Optionally, the virtual ball striking data comprises a first swing angle and a first swing speed of the racket striking the virtual ball;
accordingly, the first determining unit 503 includes:
a third acquiring subunit, configured to acquire a corresponding swing angle based on each acceleration, where the swing angle is an included angle between a center line of a racket face and a horizontal plane;
a first determining subunit, configured to determine, based on the swing position, a preset racket face area and the swing angle at each swing time point, the area position of the racket face corresponding to that swing time point;
a second determining subunit configured to determine racket pose data corresponding to swing time points based on the swing positions, the area positions, and the swing angles at the respective swing time points;
the fourth obtaining subunit is configured to perform ball hitting analysis on the first flight data and all racket posture data of the racket in the swing, and obtain a first swing time point at which the virtual sphere is hit by the racket;
a third determining subunit, configured to determine a swing image in the swing video corresponding to the first swing time point;
and the fourth determining subunit is configured to determine, when the swing image meets a preset standard action condition, that the swing angle and the swing speed corresponding to the first swing time point are the first swing angle and the first swing speed, respectively.
Optionally, the fourth obtaining subunit includes:
a fifth acquiring subunit, configured to acquire, from the first flight data, a first flight position and a first flight speed at any one of the swing time points;
a first generating subunit, configured to generate a first plane based on all the area positions at any one of the swing time points; and,
a second generating subunit, configured to generate a first straight line, through the first flight position at any one of the swing time points, in the direction corresponding to the first flight speed at that swing time point;
a fifth determining subunit, configured to determine a first intersection point between the first straight line at any one of the swing time points and the first plane;
a calculating subunit, configured to calculate a first distance between a first flight position at the swing time point and a first intersection point at a corresponding swing time point, when the first intersection point at any swing time point is within a region position of a first plane corresponding to the swing time point;
and the sixth determining subunit is configured to determine that the swing time point corresponding to the minimum first distance is the first swing time point.
Optionally, the fourth determining subunit includes:
a sixth acquiring subunit, configured to perform motion feature identification on the swing image, and acquire feature information of a ball hitting motion;
the obtaining subunit is used for carrying out error comparison on the characteristic information and standard ball hitting characteristic information in the virtual ball hitting data set to obtain an error result;
and the seventh determining subunit is configured to determine, when the error result is less than or equal to a preset error threshold, that the swing angle and the swing speed corresponding to the first swing time point are the first swing angle and the first swing speed, respectively.
Optionally, the fourth determining subunit further includes:
an eighth determining subunit, configured to determine, when the error result is greater than the preset error threshold, that the first swing angle is the first flight angle corresponding to the first swing time point in the first flight data and that the first swing speed equals the first flight speed at the first swing time point in the first flight data, wherein the first flight angle is the angle between the direction of the first flight speed at the first swing time point and the horizontal plane.
Optionally, the second determining unit 504 includes:
a seventh acquiring subunit, configured to acquire, from the first flight data, a first flight speed corresponding to the first swing time point;
a ninth determining subunit, configured to determine a preset contact characteristic value of the virtual sphere and the racket based on the first swing angle;
a tenth determining subunit, configured to determine an initial flying speed of the virtual sphere after being hit by the racket based on the first flying speed corresponding to the first swing time point, the first swing speed, and the preset contact characteristic value;
an eleventh determining subunit for determining the second flight data based on the initial flight speed.
The device performs standard ball hitting analysis on the swing data using the first flight data of the virtual ball and the swing video of the user swinging the racket, and determines virtual ball hitting data; it then determines, from the virtual ball hitting data and the first flight data, the second flight data of the virtual ball after it is hit by the racket. This not only makes the sports scene generated by the motion sensing device more realistic, but also compels the user to adjust the action, thereby correcting the user's technical action.
Example 3
As shown in fig. 6, this embodiment provides an electronic device, comprising: at least one processor; and a memory communicatively connected to the at least one processor; wherein the memory stores instructions executable by the at least one processor, so that the at least one processor can perform the method steps of the above embodiments.
Example 4
The disclosed embodiments provide a non-volatile computer storage medium having stored thereon computer-executable instructions that may perform the method steps as described in the embodiments above.
Example 5
Referring now to FIG. 6, shown is a schematic diagram of an electronic device suitable for use in implementing embodiments of the present disclosure. The electronic device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, the electronic device may include a processing device (e.g., a central processing unit, a graphics processor, etc.) 601, which may perform various appropriate actions and processes according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage device 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data necessary for the operation of the electronic device. The processing device 601, the ROM 602 and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, or the like; an output device 605 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, etc.; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided.
In particular, the processes described above with reference to the flow diagrams may be implemented as computer software programs, according to embodiments of the present disclosure. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer-readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of the embodiments of the present disclosure.
It should be noted that the computer readable medium in the present disclosure can be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be embodied in the electronic device; or may exist separately without being assembled into the electronic device.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or hardware. Where the name of an element does not in some cases constitute a limitation on the element itself.

Claims (10)

1. A control method for somatosensory ball hitting, comprising:
acquiring first flight data of a virtual sphere;
acquiring, in real time through cooperation of a motion sensing device and a racket, swing data and a swing video of the racket with respect to the virtual sphere;
performing standard ball hitting analysis on the swing data based on the first flight data and the swing video, and determining virtual ball hitting data;
determining second flight data of the virtual ball after being struck by the racket based on the virtual ball striking data and the first flight data.
2. The method of claim 1,
the swing data comprise a plurality of swing time points of a preset measurement point on the racket during the swing, and a swing position, a swing speed and an acceleration in the gravity direction corresponding to each swing time point;
correspondingly, the acquiring, in real time through cooperation of the motion sensing device and the racket, the swing data of the racket with respect to the virtual sphere comprises:
the preset measuring points on the racket are positioned in real time through a positioning device on the motion sensing equipment, a plurality of swing time points of the preset measuring points in the swing and swing positions and swing speeds corresponding to the swing time points are obtained, and,
the acceleration in the gravity direction corresponding to each swing time point collected by a gravity sensor on the racket is obtained through a wireless communication device on the motion sensing equipment.
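As a concrete illustration of the swing data recited in claim 2, the following minimal Python sketch defines one possible record layout, assuming one sample per swing time point; the class name SwingSample and its field names are illustrative only and do not appear in the disclosure.

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class SwingSample:
        """One sample of swing data at a single swing time point (claim 2)."""
        t: float                               # swing time point, in seconds
        position: Tuple[float, float, float]   # swing position of the preset measuring point
        speed: Tuple[float, float, float]      # swing speed (velocity vector) at this time point
        a_gravity: float                       # gravity-direction acceleration from the racket's gravity sensor

    # A swing is simply the ordered list of samples collected during one stroke.
    SwingData = List[SwingSample]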
3. The method of claim 2, wherein
the virtual hitting data comprises a first swing angle and a first swing speed at which the racket hits the virtual ball;
and correspondingly, the performing standard ball hitting analysis on the swing data based on the first flight data and the swing video to determine the virtual hitting data comprises:
acquiring a corresponding swing angle based on each acceleration, wherein the swing angle is the included angle between a midline of the racket face and a horizontal plane;
determining, based on the swing position, a preset racket region and the swing angle of each swing time point, a region position of the racket face corresponding to that swing time point;
determining racket posture data corresponding to each swing time point based on the swing position, the region position and the swing angle of that swing time point;
performing hitting analysis on the first flight data and all of the racket posture data of the racket during the swing, to obtain a first swing time point at which the racket hits the virtual ball;
determining a swing image corresponding to the first swing time point in the swing video;
and when the swing image satisfies a preset standard action condition, determining the swing angle and the swing speed corresponding to the first swing time point as the first swing angle and the first swing speed, respectively.
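Claim 3 derives each swing angle, defined as the included angle between the racket-face midline and the horizontal plane, from the gravity-direction acceleration. One simple way to realise this, assuming the gravity sensor reports the quasi-static component of gravity projected onto the face midline and swing dynamics are neglected, is the arcsine mapping sketched below; this model and the function name swing_angle_from_gravity are assumptions for illustration, not part of the disclosure.

    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def swing_angle_from_gravity(a_gravity: float) -> float:
        """Estimate the included angle (in degrees) between the racket-face midline
        and the horizontal plane from the gravity-direction acceleration, assuming
        a_gravity is the static projection of gravity onto the face midline."""
        ratio = max(-1.0, min(1.0, a_gravity / G))  # clamp so asin stays defined despite sensor noise
        return math.degrees(math.asin(ratio))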
4. The method of claim 3, wherein the performing hitting analysis on the first flight data and all of the racket posture data of the racket during the swing to obtain a first swing time point at which the racket hits the virtual ball comprises:
acquiring, from the first flight data, a first flight position and a first flight speed at each swing time point;
generating a first plane based on all of the region positions at each swing time point, and
generating, through the first flight position at that swing time point, a first straight line in the direction of the first flight speed at that swing time point;
determining a first intersection point of the first straight line and the first plane at each swing time point;
when the first intersection point at a swing time point is located within the region position of the first plane corresponding to that swing time point, calculating a first distance between the first flight position at that swing time point and the corresponding first intersection point; and
determining the swing time point corresponding to the minimum first distance as the first swing time point.
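The hit detection of claim 4 is essentially a line-plane intersection test repeated over the swing time points. The self-contained sketch below spans the first plane from three region positions per time point, approximates the "located within the region position" test by a radius check around the region centroid (an assumption, since the disclosure does not fix the containment test), and returns the swing time point with the smallest first distance; all names are illustrative.

    import math
    from typing import Optional, Sequence, Tuple

    Vec3 = Tuple[float, float, float]

    def _sub(a: Vec3, b: Vec3) -> Vec3: return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    def _dot(a: Vec3, b: Vec3) -> float: return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    def _cross(a: Vec3, b: Vec3) -> Vec3:
        return (a[1] * b[2] - a[2] * b[1], a[2] * b[0] - a[0] * b[2], a[0] * b[1] - a[1] * b[0])
    def _norm(a: Vec3) -> float: return math.sqrt(_dot(a, a))

    def first_swing_time_point(time_points: Sequence[float],
                               region_positions: Sequence[Sequence[Vec3]],
                               flight_positions: Sequence[Vec3],
                               flight_speeds: Sequence[Vec3],
                               region_radius: float) -> Optional[float]:
        """Return the swing time point at which the racket is taken to hit the ball (claim 4)."""
        best_t, best_d = None, math.inf
        for t, corners, q0, v in zip(time_points, region_positions, flight_positions, flight_speeds):
            p0, p1, p2 = corners[0], corners[1], corners[2]      # three points spanning the first plane
            n = _cross(_sub(p1, p0), _sub(p2, p0))               # plane normal
            denom = _dot(n, v)
            if abs(denom) < 1e-9:                                # first straight line parallel to the plane
                continue
            s = _dot(n, _sub(p0, q0)) / denom                    # line parameter of the first intersection point
            hit = (q0[0] + s * v[0], q0[1] + s * v[1], q0[2] + s * v[2])
            centre = tuple(sum(c[i] for c in corners) / len(corners) for i in range(3))
            if _norm(_sub(hit, centre)) > region_radius:         # intersection falls outside the racket region
                continue
            d = _norm(_sub(hit, q0))                             # first distance
            if d < best_d:
                best_t, best_d = t, d
        return best_t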
5. The method of claim 3, wherein the determining, when the swing image satisfies the preset standard action condition, that the swing angle and the swing speed corresponding to the first swing time point are the first swing angle and the first swing speed, respectively, comprises:
performing action feature recognition on the swing image to acquire feature information of the hitting action;
performing error comparison between the feature information and standard hitting feature information in a virtual hitting data set, to obtain an error result; and
when the error result is less than or equal to a preset error threshold, determining the swing angle and the swing speed corresponding to the first swing time point as the first swing angle and the first swing speed, respectively.
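Claim 5 gates the use of the measured swing angle and speed on how closely the recognised hitting-action features match the standard ones. A minimal sketch of that gate, assuming the features are fixed-length numeric vectors and using mean absolute difference as the error (the disclosure does not fix a particular error metric), is shown below; when it returns False, the fallback of claim 6 applies.

    from typing import Sequence

    def action_matches_standard(features: Sequence[float],
                                standard_features: Sequence[float],
                                error_threshold: float) -> bool:
        """True when the error between the recognised hitting-action features and
        the standard hitting feature information does not exceed the preset error
        threshold (claim 5)."""
        if len(features) != len(standard_features):
            raise ValueError("feature vectors must have the same length")
        error = sum(abs(a - b) for a, b in zip(features, standard_features)) / len(features)
        return error <= error_threshold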
6. The method of claim 5, further comprising:
when the error result is greater than the preset error threshold, determining the first swing angle as a first flight angle corresponding to the first swing time point in the first flight data, and the first swing speed as the first flight speed at the first swing time point in the first flight data, wherein the first flight angle is the included angle between the direction of the first flight speed at the first swing time point and a horizontal plane.
7. The method of claim 3, wherein the determining, based on the virtual hitting data and the first flight data, second flight data of the virtual ball after it is struck by the racket comprises:
acquiring a first flight speed corresponding to the first swing time point from the first flight data;
determining, based on the first swing angle, a preset contact characteristic value between the virtual ball and the racket;
determining an initial flight speed of the virtual ball after it is hit by the racket, based on the first flight speed corresponding to the first swing time point, the first swing speed and the preset contact characteristic value; and
determining the second flight data based on the initial flight speed.
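Claim 7 does not fix how the first flight speed, the first swing speed and the preset contact characteristic value are combined. A common simplification is a one-dimensional collision with an effectively massive racket, in which the contact characteristic value acts as a restitution-like coefficient looked up from the first swing angle; the sketch below follows that assumption, and both the lookup values and the function names are placeholders rather than the claimed rule.

    def contact_characteristic(first_swing_angle_deg: float) -> float:
        """Preset contact characteristic value as a function of the first swing angle.
        The piecewise values are placeholders; the disclosure only states that the
        value is preset based on the swing angle."""
        return 0.85 if abs(first_swing_angle_deg) < 45.0 else 0.70

    def initial_flight_speed(first_flight_speed: float,
                             first_swing_speed: float,
                             k: float) -> float:
        """Speed of the virtual ball right after the hit under a 1-D collision with an
        effectively massive racket: the incoming ball speed is reflected and scaled by k,
        and the racket contributes (1 + k) times its own speed (speeds are magnitudes)."""
        return k * first_flight_speed + (1.0 + k) * first_swing_speed

    # Example: ball arriving at 10 m/s, racket moving at 8 m/s, first swing angle of 30 degrees.
    v0 = initial_flight_speed(10.0, 8.0, contact_characteristic(30.0))  # ~23.3 m/s

The second flight data can then be generated from this initial speed and the chosen rebound direction, for example by sampling a simple ballistic trajectory.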
8. A control device for somatosensory ball hitting, comprising:
a first acquiring unit, configured to acquire first flight data of a virtual ball;
a second acquiring unit, configured to acquire, in real time through cooperation between a somatosensory device and a racket, swing data and a swing video of the racket with respect to the virtual ball;
a first determining unit, configured to perform standard ball hitting analysis on the swing data based on the first flight data and the swing video, to determine virtual hitting data; and
a second determining unit, configured to determine, based on the virtual hitting data and the first flight data, second flight data of the virtual ball after it is hit by the racket.
9. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the method according to any one of claims 1 to 7.
10. An electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
wherein the one or more programs, when executed by the one or more processors, implement the method according to any one of claims 1 to 7.
CN202210422679.8A 2022-04-21 2022-04-21 Control method, device, medium and electronic equipment for somatosensory ball hitting Withdrawn CN114733189A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210422679.8A CN114733189A (en) 2022-04-21 2022-04-21 Control method, device, medium and electronic equipment for somatosensory ball hitting

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210422679.8A CN114733189A (en) 2022-04-21 2022-04-21 Control method, device, medium and electronic equipment for somatosensory ball hitting

Publications (1)

Publication Number Publication Date
CN114733189A true CN114733189A (en) 2022-07-12

Family

ID=82284168

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210422679.8A Withdrawn CN114733189A (en) 2022-04-21 2022-04-21 Control method, device, medium and electronic equipment for somatosensory ball hitting

Country Status (1)

Country Link
CN (1) CN114733189A (en)

Similar Documents

Publication Publication Date Title
US8998717B2 (en) Device and method for reconstructing and analyzing motion of a rigid body
US9405372B2 (en) Self-contained inertial navigation system for interactive control using movable controllers
US20190121451A1 (en) Information processing apparatus and information processing method
US11173362B2 (en) Analysis apparatus, analysis method, and recording medium
CN104225899A (en) Motion analysis method and motion analysis device
CN105007995A (en) Measuring device for detecting hitting movement of hitting implement, training device, and method for training hitting movement
KR20180095588A (en) Method and apparatus for motion analysis of sports apparatus
US10773147B2 (en) Virtual golf simulation apparatus
JP2016067410A (en) Motion analysis device, motion analysis system, and motion analysis method and program
CN111184994B (en) Batting training method, terminal equipment and storage medium
JP2016218758A (en) Information processing apparatus and information processing method
KR20160106671A (en) Movement analysis device, movement analysis system, movement analysis method, display method for movement analysis information, and program
JP5017381B2 (en) Game system and game terminal
CN104587662A (en) motion analyzing apparatus andmotion analyzing method
KR102097033B1 (en) System for estimating motion by sensing interaction of point body
CN114733189A (en) Control method, device, medium and electronic equipment for somatosensory ball hitting
US20180250571A1 (en) Motion analysis device, motion analysis method, motion analysis system, and display method
KR101348419B1 (en) Virtual golf simulation apparatus and method providing video content
US10258851B2 (en) System and method for calculating projected impact generated by sports implements and gaming equipment
JP2016116572A (en) Motion analysis device, motion analysis method, program, and motion analysis system
CN111672089B (en) Electronic scoring system for multi-person confrontation type project and implementation method
CN107754286A (en) A kind of table tennis system based on inertial navigation and sense of touch correction positioning
JP6594254B2 (en) Virtual environment construction device, virtual environment construction method, program
JP2018117885A (en) Motion analysis device, motion analysis method, program, and motion analysis system
CN115068918B (en) Batting win-or-lose judging method and device, wearable equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
WW01 Invention patent application withdrawn after publication

Application publication date: 20220712
