CN111672089B - Electronic scoring system for multi-person confrontation type project and implementation method - Google Patents

Info

Publication number
CN111672089B
Authority
CN
China
Prior art keywords
scoring
motion capture
competitor
collision
human body
Prior art date
Legal status
Active
Application number
CN202010571424.9A
Other languages
Chinese (zh)
Other versions
CN111672089A (en)
Inventor
刘怀良
刘鑫帝
Current Assignee
Liangjiang Industry Hainan Co ltd
Original Assignee
Liangjiang Industry Hainan Co ltd
Priority date
Filing date
Publication date
Application filed by Liangjiang Industry Hainan Co ltd filed Critical Liangjiang Industry Hainan Co ltd
Priority claimed from CN202010571424.9A
Publication of CN111672089A
Application granted
Publication of CN111672089B
Legal status: Active

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63B: APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B 71/00: Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B 71/06: Indicating or scoring devices for games or players, or for other sports activities
    • A63B 71/0605: Decision makers and devices using detection means facilitating arbitration
    • A63B 71/0619: Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B 71/0622: Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B 71/0669: Score-keepers or score display devices
    • A63B 2071/0625: Emitting sound, noise or music
    • A63B 2071/065: Visualisation of specific exercise parameters

Abstract

The invention discloses an electronic scoring system and an implementation method for a multi-person confrontation project. The system comprises a motion capture system for positioning the motion capture devices worn on each body part of the competitors, and a scoring machine for receiving the positioning information, establishing 3D virtual human body models, judging whether a collision occurs, and scoring. The implementation method comprises the steps of having the competitors wear the motion capture devices, establishing a 3D virtual human body model of each competitor, obtaining the positioning information from the motion capture devices, judging collision events in real time, and calculating scores in real time. The invention can be adapted to various competition rules, reduces the cost of the wearable equipment, and increases wearing comfort; the scoring system and method can fully reconstruct the competition situation, increase scoring accuracy, and reduce the occurrence of misjudgments.

Description

Electronic scoring system for multi-person confrontation type project and implementation method
Technical Field
The invention relates to scoring systems for fighting sports competitions, and in particular to an electronic scoring system for multi-person confrontation-type events and an implementation method thereof.
Background
In combat and martial arts competitions, matches are scored by human judges through visual judgment or by the incapacitation of a competitor. For example, the short-weapon (Duanbing) scoring system of the Chinese Wushu Association uses several human judges holding wireless scorers to score in real time, and the final result is obtained by comparing the scores. This scoring method consumes a large amount of labor, and because attacks are difficult to observe with the naked eye, it is hard to obtain accurate results regarding the speed and location of an attack or whether real impact and damage were produced; intentional or unintentional misjudgment by referees also occurs easily.
Some wearable sensing devices are used in athletic competitions (e.g., fencing and taekwondo). These wearable devices, based on sensors such as impact, light, force, and pressure sensors, sense and judge strikes and finally produce a scoring decision.
However, such scoring devices and methods are tied to particular competition rules: different sensor devices must be worn for different competitions, and the scoring systems of different competitions cannot be unified. In addition, depending on the rules of each competition, the wearable device must be fixed to the competitors' limbs, the shape of the sensors must be considered in the design of the competition garments, and because the sensors must directly sense the attacker's strikes, the wearable device must also protect the sensors, which further constrains the material and structure of the garments. This undoubtedly increases the complexity and cost of the wearable equipment and makes it difficult to popularize.
The above-mentioned drawbacks are worth solving.
Disclosure of Invention
In order to overcome the defects of the prior art, the invention provides an electronic scoring system for a multi-person confrontation type project and an implementation method thereof.
The technical scheme of the invention is as follows:
in one aspect, an electronic scoring system for a multi-person confrontation-type project, comprising:
the motion capture system is used for positioning the motion capture devices of all body parts of the competitors;
and the scoring machine receives the positioning information sent by the motion capture system in real time, acquires the 3D virtual human body model corresponding to the competitor through the positioning information, judges whether effective collision occurs or not through the 3D virtual human body model, and scores according to the type of the effective collision and a predefined scoring rule.
In the present invention according to the above aspect, the motion capture system includes a motion capture sensor and the motion capture device, the motion capture sensor is connected to the motion capture device by wire/wireless, and the motion capture device is worn on a different part of the competitor.
The invention according to the above aspect is characterized in that a vibration sensor is provided in the motion capture device.
The present invention according to the above aspect is characterized in that the motion capture device is worn on one or more of a human head, a chest, an arm, a hand, a thigh, a knee joint, a foot, or a weapon.
The present invention according to the above aspect is characterized in that each of the limb parts or weapons wearing the motion capture device is configured with one or more attributes.
The present invention according to the above aspect is characterized in that the motion capture system is configured to detect orientation and coordinate data of six degrees of freedom, nine degrees of freedom, or ten degrees of freedom.
Further, the motion capture system comprises one or more of a motion sensing system based on active or passive optical capture, an inertial sensor system based on inertial motion capture, a structural induction system based on mechanical motion capture, an electromagnetic induction system based on electromagnetic motion capture, a heat generation and/or heat sensing system, an accelerometer.
The present invention according to the above aspect is characterized in that the motion capture device is fixed to a limb or a weapon of the competitor by a wearable device.
Further, the wearable device comprises one or more of a strap-on fastening device, an adhesive fastening device, a mechanical and/or structural fastening device, and a magnetic fastening device.
The invention according to the above aspect is characterized in that the scoring machine includes:
the communication device is used for receiving the positioning information sent by the motion capture system in real time;
and the computing device obtains the 3D virtual human body models corresponding to the competitors according to the positioning information received by the communication device, judges that a suspected collision occurs by judging the volume intersection of the 3D virtual human body models of two competitors, and obtains a scoring result by judging the strength of the suspected collision event against the competition rules.
Further, the communication device is wirelessly connected with the motion capture system.
Further, the computing device includes:
the operation interface is used for controlling the operation of the computing device through interface operation;
the motion capture system calibration module is used for calibrating each motion capture device connected with the scoring machine according to the human body structure, forming a 3D virtual human body model of a competitor and defining the attribute of each motion capture device;
the original parameter and scoring rule setting module is used for setting judgment parameters in the collision event judgment process and setting a rule of collision result integration;
the scoring operation module is used for judging whether collision occurs according to the original parameters and scoring the real collision event;
and the data connection establishing module is used for establishing the connection relationship between the communication device and the motion capture system.
Further, the 3D virtual human body model generated by the motion capture system calibration module has a volume larger than that of the actual competitor.
Further, the score operation module comprises:
the collision event judgment unit is used for obtaining the real-time limb position, posture and motion trail of the competitor according to the 3D virtual human body model of the competitor, and judging and obtaining a virtual collision event;
and the collision force judging unit is used for judging whether the collision reaches a set threshold value according to the force index when the virtual collision event occurs, and obtaining different scores for the virtual collision events reaching different set threshold values.
Furthermore, the collision event determination unit determines whether the motion capture device worn by the competitor vibrates after obtaining the real-time limb position, posture and motion track of the competitor, and determines as a real collision event if the motion capture device worn by the competitor vibrates.
Furthermore, the scoring machine also comprises an output device, the output device comprises a display output, the display output is connected with the display, and the scoring machine displays the scoring result and the scoring data on the display through the display output.
Furthermore, the computing device also comprises an information export setting module which is used for exporting the scoring result and the scoring data counted by the scoring operation module and sending the scoring result and the scoring data to the output device.
Furthermore, the output device further comprises an audio output, the audio output is connected with a loudspeaker, and the scoring machine plays the collected audio data to the loudspeaker through the audio output.
Furthermore, the scoring machine also comprises a sound pickup device, the sound pickup device receives the audio information of the sound pickup and sends the audio information to the computing device, and the computing device sends the audio information to the action capturing device of the competitor, which can play audio, through the communication device.
In another aspect, a method for implementing an electronic scoring system for a multi-person confrontation-type project as described above includes the following steps:
S100, the competitor wears each motion capture device on the corresponding body part according to the device's attributes;
s200, establishing connection between a scoring machine and a motion capture system, and establishing a 3D virtual human body model of a competitor;
s300, the scoring machine acquires positioning information obtained by the motion capture device in real time;
s400, the scoring machine judges a collision event in real time according to the 3D virtual human body models of different competitors and initial set original parameters;
and S500, calculating the scoring condition of the competitor in real time according to the scoring rule by the scoring machine.
The invention according to the above scheme is characterized in that step S400 specifically includes: judging whether a suspected collision occurs according to the volume intersection of the 3D virtual human body models of different competitors, judging whether the collision strength exceeds a preset threshold value on the basis of judging the suspected collision, and judging a real collision event if the collision strength exceeds the preset threshold value.
The present invention according to the above scheme is characterized in that, in step S500, the real-time scoring process includes:
(1) obtaining collision force parameters according to the limb movement speed and the trajectory of the competitor before collision;
(2) identifying the specific limb part of the competitor who collides to obtain an attack part and an attacked part;
(3) and real-time scoring is carried out based on scoring rules through the judgment of the real collision event.
According to the above scheme, the invention has the advantages that dynamic collision detection of the competitors is performed based on the motion capture system and scoring is performed in combination with different competition rules, so that the whole system can be adapted to various competition rules, changes to the competitors' clothing are reduced, and the corresponding cost of the wearable equipment is reduced; in addition, the scoring system and scoring method can fully reconstruct the competition situation, increase the accuracy of competition scoring, and reduce the occurrence of misjudgments.
Drawings
Fig. 1 is a system block diagram of an electronic scoring system for multi-person confrontation-type items.
Fig. 2 is a schematic diagram of an electronic scoring system for a multi-player confrontation-type project.
FIG. 3 is a schematic diagram of a competitor wearing a motion capture device.
Fig. 4 is a schematic structural diagram of a computing device.
Fig. 5 is a schematic diagram of a 3D virtual human body model.
FIG. 6 is a diagram illustrating the relationship between the head and the head motion capture device in the 3D virtual human body model.
FIG. 7 is a diagram illustrating the comparison between the actual actions of the participants and the 3D virtual human body model.
FIG. 8 is a flow chart of an implementation of the present invention.
In the figure, 1, a motion capture device; 11. a head motion capture device; 12. an arm motion capture device; 13. a hand motion capture device; 14. a chest motion capture device; 15. a leg motion capture device; 16. a knee joint motion capture device; 17. a foot motion capture device;
2. a motion capture sensor;
3. a scoring machine; 31. a head geometry label; 32. arm geometry labels; 33. a hand geometry label; 34. a chest geometry label; 35. a leg geometry label; 36. a knee joint geometry label; 37. a foot geometry tag;
4. a display;
5. a competitor;
6. a field of play;
7. geometric data coordinate system.
Detailed Description
The invention is further described with reference to the following figures and embodiments:
as shown in fig. 1 to 7, an electronic scoring system for a multi-player confrontation-type project includes: a motion capture system and a scoring machine. The motion capture system is used for positioning the motion capture devices of all body parts of the competitors; the scoring machine receives the positioning information sent by the motion capture system in real time, acquires the 3D virtual human body model corresponding to the competitor through the positioning information, judges whether effective collision occurs or not through the 3D virtual human body model, and scores according to the type of the effective collision and a predefined scoring rule.
Motion capture system
As shown in fig. 1, 2, and 3, the motion capture system is used to detect orientation and coordinate data in 6 degrees of freedom (3-axis accelerometer and 3-axis gyroscope), 9 degrees of freedom, or 10 degrees of freedom. It comprises one or more of a motion sensing system based on active or passive optical capture, an inertial sensor system based on inertial motion capture, a structural induction system based on mechanical motion capture, an electromagnetic induction system based on electromagnetic motion capture, a heat generation and/or heat sensing system, and an accelerometer.
The motion capture system includes a motion capture device 1 and a motion capture sensor 2.
1. Motion capture sensor
The motion capture sensor 2 is located outside (or at the edge of) the playing field 6 and is connected to the motion capture devices 1 by wire or wirelessly. The motion capture sensor 2 captures the motion of a predetermined number of motion capture devices 1 placed on each part of the body of competitor 5.
2. Motion capture device
(1) The motion capture device 1 is worn on different body parts or weapons of the competitor 5 and serves as the basis for valid scoring by recording whether each body part is attacked and/or delivers an attack.
(2) The motion capture device is secured to a limb or weapon of the competitor by the wearable device. The wearable device comprises one or more of a binding type fixing device, an adhesive type fixing device, a mechanical type and/or structural fixing device and a magnetic fixing device.
In the present embodiment, each limb part or weapon wearing a motion capture device 1 is configured with one or more attributes, including attack, defense, and vulnerability, so that the scoring machine can score according to detailed scoring rules. For example, hands may be given attack and defense attributes, the head may be given a vulnerable attribute, and so on. The present invention is not limited to these attributes; for example, in another embodiment, each limb part may not be given its own attributes but only be defined as a valid or invalid part (the chest is defined as a valid part, the wrist as an invalid or score-reducing part; during the competition, an attack on the chest records a positive score, while an attack on the wrist records zero or a negative score). Other definitions may be used in other embodiments and are not described in detail here.
Taking the example that each limb part or weapon of the human body is respectively configured with one or more attributes:
the motion capture device is worn on an attack site and/or an attacked site of a human body, specifically includes one or more of a head, a chest, an arm, a hand, a thigh, a knee joint, a foot, and a weapon, and further includes a head motion capture device 11, an arm motion capture device 12, a hand motion capture device 13, a chest motion capture device 14, a leg motion capture device 15, a knee joint motion capture device 16, and a foot motion capture device 17.
Preferably, provided the scoring parts can still be measured for motion, the motion capture devices are worn on parts of the competitor that do not directly face attack. This reduces the complexity of the worn equipment and the protection requirements on the devices, accommodates the competition clothing of various martial arts events, removes restrictions on the competitors' clothing, lowers the cost of the whole scheme, and at the same time increases the flexibility of setting scoring standards.
The motion capture devices worn on the different body parts of a competitor must match the initial parameters defined in the scoring machine. For example, if the scoring machine by default assigns the head motion capture device to the back of the head, the competitor wears that motion capture device on the back of the head.
Preferably, in this embodiment a vibration sensor is disposed in the motion capture device; the vibration sensor is used to determine whether the position of the motion capture device has been struck and the type of the collision. In other embodiments, the weapon or the limbs of the competitor additionally carry sensors such as impact, light, force, and pressure sensors, so that the scoring machine obtains extra force parameter data from these sensors. This force parameter data increases the accuracy of collision event judgment (it can replace the estimates calculated by the scoring machine, such as the impact judgment and force judgment), making the score obtained by the scoring machine more accurate and reliable.
Preferably, the competitor also wears impact protection such as armor, whose puncture and tear resistance protects the wearer from injury due to impact. Such injuries result from blows, throws, or other forces applied to the competitor, including forces from the competitor falling onto or against a contact surface (e.g., the ground, or a fence or cage surrounding an arena), puncture (e.g., by a weapon) or tearing forces, and/or a blow, body part, or any other object (e.g., the ground) striking the competitor.
Scoring machine
As shown in fig. 4-7, the scoring machine is connected (preferably wirelessly) to the motion capture system, receives the positioning information transmitted by the motion capture system in real time, performs real-time scoring calculations (including collating, processing, analyzing and reporting force parameter data, calculating one or more results (e.g., scores)), and outputs and displays the generated results.
The scoring machine is one or more of electronic devices such as a desktop computer, a notebook computer, a smartphone, a tablet computer, or an all-in-one virtual reality machine. It includes a computer system or network (including a LAN, WAN, the Internet, or a cloud) or any other device (e.g., embedded hardware) with processing capacity and the ability to send data to a visual display (including, but not limited to, sending data in real time or near real time).
1. The scoring machine includes a communication device.
The communication device is used for receiving the positioning information sent by the motion capture system in real time and for communicating with the motion capture devices worn by each competitor, so that the six-degree-of-freedom data can be communicated from each competitor to the scoring machine.
In one embodiment, the communication device is a one-way communication device such that the motion capture system sends the detected position information of the motion capture device to the scoring machine. In another embodiment, the communication device is a multi-directional communication device, which not only allows the scoring machine to receive the location information from the motion capture system, but also allows the scoring machine to return the data (including cumulative scores or voice data from the trainer) obtained by the scoring machine to the motion capture device of the competitor. Preferably, the communication means is single channel or multi-channel, multi-channel communication enabling simultaneous transmission and/or simultaneous reception of communications.
In a preferred embodiment, the wearable device further comprises headphones and a microphone on the head of the competitor, thereby enabling the competitor to receive an indication of the coach on one channel and to speak to the coach on another channel via the microphone. If multiple participants participate in the race (i.e., a team race), additional channels may be included to allow team participants on the arena to communicate with each other.
Preferably, the communication device employs a wireless communication method, including any suitable wireless communication means, such as bluetooth, local area network, network cloud, etc.
2. The scoring machine includes a computing device.
The computing device establishes a link to the orientation data of the motion capture system and sets the original parameters; it obtains the 3D virtual human body model corresponding to each competitor according to the positioning information received by the communication device, judges that a suspected collision occurs by judging the volume intersection of the 3D virtual human body models of two competitors, and obtains a scoring result by judging the strength of the suspected collision event against the competition rules.
The software system in the computing device is a software module that can run on the scoring machine; its running environment can be any suitable operating system (such as Windows OS, Linux OS, Mac OS, Android OS, and the like), and its running mode can be any mode, such as local installation or cloud computing. The software system may be based on different programming languages (including C, C++, C#, PHP, Java, etc.) according to the different operating environments.
After the computing device is connected to the motion capture system, orientation data (with no fewer than 6 degrees of freedom) can be transmitted in real time; the scoring software in the computing device acquires access rights to the scoring machine's processor and database so as to obtain the real-time orientation data provided by the motion capture system and store it in an in-memory database.
As shown in fig. 4, the computing device includes an operation interface, a motion capture system calibration module, an original parameter and scoring rule setting module, a scoring operation module, and a data connection establishing module.
(1) Operation interface
The operation interface is used for controlling the operation of the computing device through interface operation.
The operation interface is the software user interface started after the computer runs the scoring software. It is a virtual interface, usually operated with a mouse and keyboard as input means, used to modify, create, add, and delete the various parameters of the software and to issue various instructions.
Preferably, the user performs operations related to calibration of the motion capture system, setting of the original parameters and scoring rules, and derivation of the setting information through the operation interface.
(2) Motion capture system calibration module
The motion capture system calibration module is used for calibrating each motion capture device connected with the scoring machine according to the human body structure, forming a 3D virtual human body model of the competitor and defining the attribute of each motion capture device. Specifically, the method comprises the following steps:
as shown in fig. 3, 5, the computing device calibrates the motion capture devices in all connections according to the anatomy, allowing each motion capture device to be assigned a tag whose nature can serve different project rules. Preferably, the computing device identifies the limb part of the competitor corresponding to each motion capture device, and sets the corresponding limb part label and the corresponding attribute.
In this embodiment, limb part tags include, but are not limited to: head geometry label 31, arm geometry label 32, hand geometry label 33, chest geometry label 34, leg geometry label 35, knee geometry label 36, foot geometry label 37, weapon label, etc. The attributes of the limb tag include a site available for attack (attribute), a site vulnerable to weakness (attribute), and the like, depending on the limb tag site.
After the computing device is connected to the motion capture system and calibrated, the geometric data of the limb part corresponding to each motion capture device is obtained, and a structure is generated using three-dimensional model construction principles to produce a corresponding geometric data coordinate system 7, thereby forming the competitor's 3D virtual human body model. The geometric data of the limb parts specifically include head volume geometry (shape and size, as shown in fig. 6), chest volume geometry (shape and size), arm volume geometry (shape and size), hand volume geometry (shape and size), thigh volume geometry (shape and size), calf volume geometry (shape and size), foot volume geometry (shape and size), and weapon volume geometry (shape and size).
Since the relative positions of the participants' limbs and the motion capture devices they wear need to be acquired and then provided to the computing device, the positions in the 3D virtual human model are in the same relative positions as the positional coordinates provided by the motion capture devices.
Preferably, the motion capture system calibration module may establish a corresponding geometric data coordinate system through a setting of the system debugging, and may also refer to the relevant world mean data as a default value. In one embodiment, the relative position is a default value in the software that the competitor needs to calibrate the wearing position when wearing the motion capture device, e.g., the computing device default head motion capture device position is the central position of the back of the brain of the person, so the competitor needs to wear it by this default value; in another embodiment, the relative position can be debugged in the software operation interface, and the calibration is carried out according to the wearing position of the real competitor.
As shown in fig. 7, all the calibrated geometric data are combined according to the human body structure to generate a 3D virtual human body model representing the shape and size of the physical structure of the competitor (the shapes and volumes of the armour and special dresses of the competitor are also simulated and reproduced in the 3D virtual human body model), and the shapes and sizes are stored and managed in a scoring machine in a spatial data structure, so that the scoring software is allowed to call and calculate various information based on the data in real time, such as stroke judgment, score judgment, foul judgment and the like. The result is that the volume and shape of the virtual mannequin are completely or nearly identical to the real body of the competitor.
The position and coordinates of each limb part of the 3D virtual human body model are updated according to the real-time positioning data provided by the motion capture devices, so that the competitor's actual limb motion is transmitted to the scoring software through the motion capture devices and simulated in real time by the 3D virtual human body model. For example, when the motion capture device worn on the competitor's hand transmits its orientation data to the scoring machine in real time, the data is read by the scoring software and used to update the orientation of the virtual body's hand in real time, so that the hand posture of the virtual body is consistent or nearly consistent with the competitor's real hand posture; the same is done for all parts until the virtual body's limb motions can simulate the competitor's limb motions in real time or near real time. The 3D virtual human body model of each competitor is obtained according to this simulation method.
Similarly, the orientation data provided by the motion capture device would calibrate all of the 3D virtual mannequins to the actual relative positions of the participants. For example, in one embodiment, if the distance between the competitor a and the competitor B is 1 meter and the heights of the competitors are consistent, the generated 3D virtual human body model also has the distance of 1 meter and the heights of the competitors are consistent, and the distance between the virtual characters of the competitors truly reflects the actual distance of the competitors in the real world.
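To make the above concrete, the following is a minimal illustrative sketch (not the patent's implementation) of how calibrated per-part geometry and per-frame position updates might be represented in software; all names (BodyPart, VirtualBodyModel, the labels and radii) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class BodyPart:
    label: str                            # e.g. "head", "chest", "hand_R"
    attribute: str                        # e.g. "attack", "vulnerable"
    radius_m: float                       # simplified geometry: bounding-sphere radius
    position: tuple = (0.0, 0.0, 0.0)     # world coordinates from the motion capture device

@dataclass
class VirtualBodyModel:
    competitor_id: str
    parts: dict = field(default_factory=dict)   # label -> BodyPart

    def update_from_frame(self, frame: dict):
        """Apply one frame of positioning data: label -> (x, y, z)."""
        for label, pos in frame.items():
            if label in self.parts:
                self.parts[label].position = pos

# Calibration: one entry per worn motion capture device, sized from measured or
# world-average geometry so the virtual body matches the competitor's real body.
model_a = VirtualBodyModel("A", {
    "head":   BodyPart("head",   "vulnerable", 0.11),
    "chest":  BodyPart("chest",  "vulnerable", 0.18),
    "hand_R": BodyPart("hand_R", "attack",     0.06),
})

# One positioning frame received from the motion capture system
model_a.update_from_frame({"head": (0.0, 1.70, 0.0),
                           "chest": (0.0, 1.40, 0.0),
                           "hand_R": (0.35, 1.30, 0.20)})
```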
In another embodiment, after the competitors have played a match using the scoring system, the scoring machine can replay the whole match by means of CGI using the stored spatial structure data, and allows operations such as pause, fast-forward, and rewind.
(3) Original parameter and scoring rule setting module
The original parameter and scoring rule setting module is used for setting judgment parameters in the collision event judgment process and setting a rule of collision result integration. According to different requirements, in one embodiment, a user can modify, add and delete the original parameters and the set parameters of the scoring rules through an operation interface; in another embodiment, all default values for the original parameters and scoring rules are set in the scoring device and cannot be modified, deleted, or added in the software operation interface.
The original parameters and scoring rules are variable values or constants preset and/or settable in the scoring machine; they provide the parameters referenced by the scoring machine when computing collision events (determining that one competitor hit another) and when scoring in real time. The original parameters and scoring rules include, but are not limited to:
A. a site tag that can be used for attack.
In one embodiment, the site type may be defined, including blunt, sharp, soft, and the like. In another embodiment, such a label may also be a numerical value, such as a numerical value to describe a degree of sharpness or a hardness.
B. A site tag that is not available for attack.
C. A tag for a site that can be attacked.
In one embodiment, the type of site may be defined, including general, fragile, strong, and the like. In another embodiment, such a label may also be a numerical value, such as a value describing the degree of strength or fragility.
D. A tag for a site that cannot be attacked.
E. The suspected collision threshold (distance units).
F. Effective speed threshold (speed unit).
G. Collision judgment sensitivity threshold.
H. A vibration determination threshold (based on a vibration sensing device).
I. Scoring rule settings.
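As an illustration only, the original parameters and scoring rules listed above might be grouped into a single configuration structure consulted by the scoring operation module; the names and numeric values below are hypothetical defaults, not values prescribed by the invention.

```python
# Hypothetical defaults; every value would be adjustable by the system debugger (items A-I above).
SCORING_CONFIG = {
    "attack_site_labels":         {"hand_R": "blunt", "hand_L": "blunt", "weapon": "sharp"},  # A
    "non_attack_site_labels":     ["knee_L", "knee_R"],                                       # B
    "attackable_site_labels":     {"head": "fragile", "chest": "general"},                    # C
    "non_attackable_site_labels": ["foot_L", "foot_R"],                                       # D
    "suspected_collision_threshold_m": 0.02,   # E: distance margin added to each collider
    "effective_speed_threshold_mps":   2.0,    # F: minimum speed for an active attack
    "collision_sensitivity":           0.5,    # G: collision judgment sensitivity
    "vibration_threshold":             0.3,    # H: valid vibration reading (sensor units)
    "scoring_rules": {"head": 3, "chest": 2, "default": 1, "foul_penalty": -1},               # I
}
```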
(4) Scoring operation module
The scoring operation module is used for judging whether a collision occurs according to the original parameters and for scoring real collision events. It comprises a collision event determination unit, a collision force determination unit, a score determination unit, and an information derivation unit.
A. Collision event determination unit
And the collision event judgment unit is used for obtaining the real-time limb position, posture and motion trail of the competitor according to the 3D virtual human body model of the competitor, and judging and obtaining the virtual collision event.
In this embodiment, the computing device performs an operation (each operation of the positioning information is referred to as "one frame") each time based on the positioning information received from the motion capture system, reads the latest positioning parameters of the motion capture device for each frame, updates the latest positioning parameters to the 3D virtual human body models of all the participants to reflect the real-time orientation, posture and motion trajectory of the real limb, and generates all the virtual limbs and the limbs of all the participants in the real world to be in approximately the same orientation, motion trajectory and posture.
Based on the above setting, when the real limb of the competitor a collides with the real limb of the other competitor B in the real world, the virtual limb of the competitor a in the computing device should also collide with the virtual limb of the competitor B in the same way or close to the same way, and the virtual collision event is judged to be obtained. And performing related strength interpretation through the obtained virtual collision event and the action track and speed transmitted by the action capturing device, and achieving the purpose of assisting in scoring of the match.
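Conceptually, this per-frame processing amounts to the loop sketched below (an assumed structure reusing the hypothetical update_from_frame method from the earlier sketch; detect_overlap and score_collision stand in for the collision and scoring logic detailed in the following subsections).

```python
def process_frame(frame_a, frame_b, model_a, model_b, detect_overlap, score_collision):
    """One scoring iteration per positioning frame ("one frame")."""
    model_a.update_from_frame(frame_a)   # mirror competitor A's real limb orientations
    model_b.update_from_frame(frame_b)   # mirror competitor B's real limb orientations
    # Any touching or intersecting pair of virtual limbs is a virtual collision event.
    for part_a, part_b in detect_overlap(model_a, model_b):
        score_collision(part_a, part_b)
```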
The positioning refresh rate of existing motion capture devices does not necessarily satisfy competitive events characterized by high-speed limb motion: when a real collision occurs within a gap between refreshes of the positioning device, the collision cannot be reproduced by the virtual limbs. In one embodiment, the positioning refresh rate of the motion capture device is one frame per 0.02 second; a real collision between competitors occurs within the 0.02 second following a refresh, and by the next positioning refresh the competitors' real limbs have already separated, so the virtual limbs never reflect the collision, causing a scoring error. To reduce the rate of such errors, the computing device of the present invention introduces related error control mechanisms and means.
In a preferred embodiment, a vibration sensor is disposed within the motion capture device, and the vibration sensor is used as a part of the motion capture system to transmit the vibration parameters and the positioning parameters to the scoring machine in real time. And simultaneously, the collision event judgment unit performs real-time collision judgment on each current frame:
(a) the 3D virtual human body models of all the competitors are assigned corresponding colliders (Collider), the colliders completely cover the 3D virtual human body models of the competitors, and a computing device can monitor contact or intersection among the 3D virtual human body models of the competitors.
On this basis, the volume of the collider is made larger than the competitor's actual volume through parameter adjustment; this parameter can be set by the system debugger in the initial parameter setting step.
For example, in one case, competitor A's actual waist circumference is 68 cm, corresponding to a radius of about 10.82 cm. With the initial parameter set to 2 cm, the waist radius is set to 10.82 + 2 = 12.82 cm, so the waist circumference of the virtual limb's collider is about 80.6 cm, and so on for the whole body. In this example the competitor's waist is treated as a circle; when a limb has another shape, different mathematical formulas are used, so that the collider volume is always larger than the competitor's actual volume.
When competitor A's real limb in the real world approaches competitor B's real limb to within 2 cm, competitor A's 3D virtual human body model contacts competitor B's virtual limb collider, and a virtual limb contact signal is obtained.
(b) After a virtual limb contact signal is determined to have occurred, the system further checks whether the vibration sensor on the corresponding limb tag has transmitted a valid vibration parameter.
If a valid vibration parameter appears, a collision is judged to have occurred; if there is no valid vibration parameter, no collision is judged.
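A minimal sketch of this two-stage check, under the simplifying assumption that each limb collider is a sphere inflated by the configured margin (the 2 cm of the example above); the function names and the vibration threshold are illustrative only.

```python
import math

def colliders_touch(pos1, r1, pos2, r2, margin_m=0.02):
    """Stage (a): contact test between two inflated limb colliders (spheres)."""
    return math.dist(pos1, pos2) <= (r1 + margin_m) + (r2 + margin_m)

def real_collision(pos1, r1, pos2, r2, vibration_reading, vibration_threshold=0.3):
    """Stage (b): confirm a virtual contact signal with a valid vibration parameter."""
    if not colliders_touch(pos1, r1, pos2, r2):
        return False
    return vibration_reading >= vibration_threshold

# Waist example from the text: 68 cm circumference -> radius ~10.82 cm, inflated by 2 cm.
waist_radius_m = 0.68 / (2 * math.pi)
print(round(2 * math.pi * (waist_radius_m + 0.02), 3))   # inflated circumference ~0.806 m
```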
In another preferred embodiment, the system debugger may specify an effective speed threshold at which a limb can actively impact other objects. When the speed of a "site available for attack" reaches this threshold, the movement distance of that site in this state is recorded, and ray casting is performed at each frame.
(a) Difference operation between two samplings
The positioning refresh rate of the motion capture device is limited, and when the motion speed is extremely high a collision may occur between two motion capture samples; in this case the collision cannot be determined by means of model volume contact.
To avoid missing a collision because of the limited positioning rate, at each frame the computing device casts a ray from the previous frame's position to the current position for every object that has reached the motion threshold, and checks whether the ray collides with another physical rigid body. If it does, a collision is judged to have occurred.
(b) Prejudgement
The detection rate of the motion capture device also causes lag in detection.
To address this problem, the computing device casts a ray from the object's current position in the most recently detected velocity direction; if an intersection of the ray with a rigid body is detected and the distance is less than a threshold, an impact is judged to have occurred.
It should be noted that the two error control options described above may not be enabled if the accuracy and refresh rate of the selected pointing device meet the project criteria.
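Both optional mechanisms can be approximated with a simple segment test, treating the target as a sphere (a hedged sketch; in practice the ray-casting facilities of the physics engine implied by the Collider terminology would be used, and the look-ahead distance below is an assumed value).

```python
import math

def segment_hits_sphere(p0, p1, center, radius):
    """True if the segment from p0 to p1 passes within `radius` of `center`."""
    seg = [b - a for a, b in zip(p0, p1)]
    to_c = [c - a for a, c in zip(p0, center)]
    seg_len2 = sum(s * s for s in seg)
    t = 0.0 if seg_len2 == 0 else max(0.0, min(1.0, sum(s * c for s, c in zip(seg, to_c)) / seg_len2))
    closest = [a + t * s for a, s in zip(p0, seg)]
    return math.dist(closest, center) <= radius

def predicted_hit(curr_pos, velocity, target_center, target_radius, lookahead_m=0.1):
    """(b) Pre-judgment: cast a short ray from the current position along the latest velocity."""
    speed = math.dist((0.0, 0.0, 0.0), velocity)
    if speed == 0:
        return False
    end = [p + v / speed * lookahead_m for p, v in zip(curr_pos, velocity)]
    return segment_hits_sphere(curr_pos, end, target_center, target_radius)

# (a) Difference check between two samples: the fist travelled 40 cm within one frame and
# passed straight through the head sphere, even though neither sampled position touches it.
head_center, head_radius = (0.0, 1.7, 0.0), 0.13
print(segment_hits_sphere((-0.2, 1.7, 0.0), (0.2, 1.7, 0.0), head_center, head_radius))  # True
print(predicted_hit((-0.2, 1.7, 0.0), (20.0, 0.0, 0.0), head_center, head_radius))       # True
```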
B. Collision force determination unit
The collision force determination unit judges whether the collision reaches a set threshold according to the force index at the time of the virtual collision event; virtual collision events reaching different set thresholds obtain different scores. Under the first optional error control mechanism, the collision event determination unit, after obtaining the competitor's real-time limb orientation, posture, and motion trajectory, determines whether the motion capture device worn by the competitor vibrates, and if it does, determines a real collision event.
In one embodiment, all limbs involved in a collision are included in the calculation. For example, an impact of a hand on the head subjects both the hand and the head to the impact force.
(a) Determination of force index using collision parameters
The collision parameters include: the relative movement speed of the limbs before the collision (the rate at which the distance between the colliding limbs decreases), and the movement distance of the limb in the N seconds before the collision.
In this embodiment, an instantaneous absolute speed alone does not produce an impact; on the other hand, movement does not accumulate energy indefinitely with distance. A movement distance threshold before impact (e.g., 0.5 meter) is therefore set. Before the threshold is reached, the impact strength increases with the movement distance; once the threshold is reached, further movement distance does not continue to increase the impact force.
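A hedged sketch of such a force index, taken here as the relative closing speed weighted by the pre-impact travel distance capped at the 0.5 m threshold mentioned above; the specific weighting is an assumption, not a formula given by the patent.

```python
def force_index(closing_speed_mps, travel_distance_m, distance_cap_m=0.5):
    """Impact strength estimate: closing speed weighted by the capped run-up distance."""
    return closing_speed_mps * min(travel_distance_m, distance_cap_m)

# A fast strike with a long wind-up scores higher than a slow touch or a short jab.
print(force_index(6.0, 0.8))   # run-up capped at 0.5 m -> 3.0
print(force_index(6.0, 0.2))   # short run-up           -> 1.2
print(force_index(0.5, 0.8))   # slow contact            -> 0.25
```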
(b) Assisted determination of force index using assisted collision parameters
The first category is the type label (or value) of the limb subjected to the force calculation, which includes a weak part (or a value representing the degree of weakness), a strong part (or a value representing the degree of strength), and a general part (or values describing the various states of the parts).
The second category is the type label (or value) of the striking limb, which includes blunt (a value representing the bluntness of the object), sharp (a value representing the sharpness of the object), and soft (a value representing how easily the object deforms).
In another particular embodiment, the computing machine distinguishes the "attacker", the "defender", or two "parrying parties"; the factors involved in this distinction include the limb type labels involved at the collision site and the current absolute speed (i.e., speed relative to the world frame) of the involved collision site.
(a) When a "site available for attack" collides with a "site that can be attacked", the judgment is: the side using the site available for attack is the attacker, and the side whose attackable site is hit is the defender.
(b) When a "site available for attack" collides with another "site available for attack": if competitor A's absolute speed reaches the threshold and competitor B's does not, competitor A is judged to be attacking and competitor B defending; if neither absolute speed reaches the threshold, no collision is judged; if both absolute speeds reach the threshold, both are judged to be parrying.
After the attacker, the defender, or the parrying parties have been determined, the force index borne by the defender is calculated.
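The role determination in (a) and (b) above can be written out as follows (an illustrative sketch; the label strings and the speed threshold are assumptions):

```python
def judge_roles(label_a, speed_a, label_b, speed_b, speed_threshold=2.0):
    """Judge the roles of competitors A and B for one collision.

    label_*: "attack" for a site available for attack, "attackable" for a site that
    can be attacked; speed_* is the absolute speed of the involved site.
    """
    if label_a == "attack" and label_b == "attackable":
        return ("attacker", "defender")
    if label_a == "attackable" and label_b == "attack":
        return ("defender", "attacker")
    if label_a == "attack" and label_b == "attack":
        a_fast, b_fast = speed_a >= speed_threshold, speed_b >= speed_threshold
        if a_fast and not b_fast:
            return ("attacker", "defender")
        if b_fast and not a_fast:
            return ("defender", "attacker")
        if a_fast and b_fast:
            return ("parry", "parry")      # both reach the threshold: mutual parry
        return ("no_collision", "no_collision")
    return ("no_collision", "no_collision")

print(judge_roles("attack", 4.0, "attackable", 0.5))   # ('attacker', 'defender')
print(judge_roles("attack", 4.0, "attack", 3.5))       # ('parry', 'parry')
```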
C. Scoring determination unit
After a collision event is triggered and the force parameter is calculated, the computing device further awards points or penalizes fouls according to the preset scoring rules. The relevant initial parameter settings, scoring rule settings, and contact and force judgment results may (but need not) be referenced in the scoring process for real-time calculation, so as to obtain the real-time scores of both competitors.
(5) Data connection establishing module
The data connection establishing module is used for establishing the connection relationship between the communication device and the motion capture system.
In a preferred embodiment, the scoring machine is also provided with a means for securing communications from the wearable devices, so that the data detected by the various motion capture devices and communicated to the scoring machine is secure (e.g., protected from third-party tampering), and secondly for securing other communications (including the viewing of matches and related CGI, and access to associated audio data including commentary, coach-competitor communications, announcements, music, broadcasts, etc.).
(6) Information export setting module
The information export setting module is used for exporting the scoring result and the scoring data counted by the scoring operation module and sending the scoring result and the scoring data to the output device.
3. The scoring machine also includes an output device.
(1) Display output
The output device comprises a display output connected to a visual display; through the display output, the scoring machine displays the scoring result and scoring data on the visual display as CGI images, so that one or more of the competitors, coaches or team leaders, and spectators can see them (whether they are at the local competition venue or watching the competition remotely).
The scoring machine derives relevant information and parameters according to the competition rules, specifically including: valid scores, the competitors' accumulated scores, foul events, point deduction information, derivation of the winner, and display of the competition process through time-sequenced virtual human body orientation parameters.
The CGI images include data visualization and display of every competitor's score and of the force index of each valid hit. Further, the visual display can show a CGI rendering of a competitor's anatomy, showing where force was applied. For example, the competitor's rendering shows where an impact occurred (e.g., the imprint of a stick, weapon, or other object such as a baseball bat striking the lower jaw), adding a multi-dimensional representation of the force and strength of the impact.
The "damage value" of a blow can also be expressed as a score for the player who delivered it, as one or more points against the player who received it, or as a combination of both. In one embodiment, the damage value is also shown as a visual representation of the impact, such as an artistic impression of, say, a stick or baseball bat hitting the mandible, with a corresponding pressure map showing the distribution of forces on the recipient's mandible. The damage value may further be represented as a visual rendering of the blow, again as an artistic impression but recalibrated to simulate an edged-weapon strike (e.g., visually replacing the stick with a sword or spear).
The scoring machine includes capabilities for processing computer graphics (including video). In one arrangement, fighting may be viewed in real time or near real time, with impact "damage" indicating where the competitor was hit and the damage value to the competitor for each hit or accumulated strike, through impact data overlays or other display of the impact data, action playback, and computer-generated graphical display. The CGI device (e.g., software) may additionally include image characters to enable contextual displays, fighting targets, and other visual display elements for fighting playback, modeling, or gameplay.
In one embodiment, the scoring device includes a CGI device (e.g., software) to graphically display force parameter data and for multi-dimensional (e.g., 2D, 3D, 4D) presentation of game-related computer-generated imagery as a graphical overlay on a video recording of the participant, or as a CGI presentation of the participant. This is useful for real and simulated races, and for combinations of real and simulated races, enhancing the viewer experience while viewing the race, including in an interactive manner and for training and/or entertainment (e.g., gaming) purposes, where the viewer can be in real time, near real time or extended into the future, or during action playback.
Preferably, the scoring machine may also reinterpret the result of a blow. For example, it may present the blow as if caused by a sharp weapon (e.g., a sword or spear) rather than by the specific weapon actually used, using CGI to present the result as an artistic impression; the CGI device can provide a simulation of damage so that the viewer or audience can see how a competitor would fare (e.g., under actual combat conditions), or the degree of damage that would have been sustained had edged rather than unedged weapons been used; and multiple blows or forces (including simultaneous ones) may be recorded and viewed at the same time, or alternately, on the visual display.
Preferably, the CGI device may be an integral part of the scoring machine, or connected to the scoring machine by any suitable communication means and using any suitable communication protocol.
(2) Audio output
The output device also comprises an audio output connected to a loudspeaker; through the audio output, the scoring machine plays the collected audio data to the loudspeaker. In this way, audio data (e.g., from the scoring machine or from an externally connected source) and/or audio information derived from the scoring (e.g., scoring instructions or voice) can be received through one or more loudspeakers and heard by one or more of the competitors, coaches or team leaders, and spectators (including those watching video of the match locally and remotely).
4. The scoring machine also comprises a sound pickup device.
The sound pickup device receives the picked-up audio information and sends it to the computing device; the computing device sends the audio information, through the communication device, to those motion capture devices of the competitors that can play audio.
The sound pickup device thus enables audio data (e.g., voice) to be received (e.g., from the scoring machine or from an externally connected source) through one or more speakers, so that it can be heard by one or more of the competitors, coaches or team leaders, and spectators.
In a preferred embodiment, the scoring machine further comprises an ultra-slow motion video playback device enabling viewing on the visual display, for example slowing down the recording of 100 frames per second to 1 frame per second.
As shown in fig. 8, an implementation method of an electronic scoring system for a multi-person confrontation-type project includes the following steps:
S100, the competitor wears each motion capture device on a different body part according to the device's attributes.
The competitor needs to fix the positioning devices on the various parts of the body; the specific fixing positions must match the parameters initially set in the scoring machine. If the initial setting of the scoring machine places a positioning device at the back of the competitor's head by default, the competitor needs to fix that positioning device at the back of the head according to that setting, and so on.
S200, establishing connection (including wired or wireless connection) between the scoring machine and the motion capture system, and establishing a 3D virtual human body model of the competitor.
The physical structure of the competitor is simulated by known information through the positioning recognition of the whole body or partial limbs of the competitor or using an instrument formed by the initial setting. The specific method comprises the following steps: and according to the position of the positioning equipment, carrying out body structure simulation on the competitor and the body part of the competitor which are matched with the initial setting.
The head locator identified in the scoring machine's initial setting generates, based on its position and orientation, a local range representing the actual size and orientation of the competitor's head. The parameters of this local range (the length, width, thickness, etc. of the human head) may be set manually or may use world-average values. By analogy, the scoring machine continues to generate local ranges from the position of the chest positioning device (and optionally the hand, leg, foot, and instrument positioning devices), whose parameters (three-dimensional body data, etc.) may likewise be set manually or use world averages. Finally, the body shape, position, and orientation of all participants are simulated in real time in the scoring system software through recognition of the positioning devices.
Preferably, the size of each object in the virtual environment should be slightly larger than the actual size of the corresponding real object (e.g., the spherical virtual collision volume of the head should be larger than the user's actual head). This ensures the scoring mechanism is triggered for every suspected collision and avoids missing a collision determination.
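A minimal sketch of generating a head collision volume from a locator position, assuming a spherical volume, world-average head dimensions, and a small inflation factor so that all suspected collisions trigger the scoring mechanism; the names and numeric values are illustrative, not taken from the patent:

```python
from dataclasses import dataclass
import math

@dataclass
class Sphere:
    cx: float   # centre x (locator position)
    cy: float   # centre y
    cz: float   # centre z
    r: float    # radius

# Illustrative world-average head dimensions in metres (length, width, thickness).
WORLD_AVG_HEAD = (0.24, 0.16, 0.19)
INFLATION = 1.10   # virtual volume ~10% larger than the real head

def head_collision_volume(locator_xyz, dims=WORLD_AVG_HEAD, inflation=INFLATION):
    """Build a spherical collision volume slightly larger than the real head."""
    x, y, z = locator_xyz
    radius = max(dims) / 2.0 * inflation
    return Sphere(x, y, z, radius)

def volumes_intersect(a: Sphere, b: Sphere) -> bool:
    """Volume intersection test used to flag a suspected collision."""
    d = math.dist((a.cx, a.cy, a.cz), (b.cx, b.cy, b.cz))
    return d <= a.r + b.r
```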
After the scoring machine is connected to the motion capture system, the motion capture devices worn by the competitor need to be calibrated and identified. Calibration and identification may use manual identification and/or intelligent identification:
In one embodiment, the operator manually identifies, through the operation interface of the scoring machine, the limb part of each competitor corresponding to each motion capture device. In another embodiment, the scoring machine intelligently identifies and labels the limb parts according to the position of each motion capture device, assigning labels according to the distribution of the human body structure: for example, the head label is given to the device at the highest position, the chest label to the device below the head device, the hand labels to the devices on either side of the chest device, and so on until every device has been labelled.
In a further embodiment, the scoring machine first labels the devices by intelligent identification and then still allows the user to modify the labels through the operation interface.
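A minimal sketch of the intelligent identification described above, assuming each device reports a 3D position during a calibration pose; the device at the greatest height is labelled the head, the device below it the chest, and the devices to either side of the chest the hands. The thresholds and label names are illustrative assumptions:

```python
def auto_label(devices):
    """devices: {device_id: (x, y, z)} with z = height during the calibration pose.
    Returns {device_id: label} assigned according to the human body structure."""
    labels = {}
    by_height = sorted(devices, key=lambda d: devices[d][2], reverse=True)

    labels[by_height[0]] = "head"    # highest device
    chest = by_height[1]             # next device down
    labels[chest] = "chest"

    cx = devices[chest][0]
    for dev in by_height[2:]:
        dx = devices[dev][0] - cx
        if dx < -0.15:
            labels[dev] = "left_hand"
        elif dx > 0.15:
            labels[dev] = "right_hand"
        else:
            labels[dev] = "unassigned"   # operator can correct via the operation interface
    return labels

print(auto_label({"a": (0.0, 0.0, 1.7), "b": (0.0, 0.0, 1.4),
                  "c": (-0.3, 0.0, 1.2), "d": (0.3, 0.0, 1.2)}))
```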
S300, the scoring machine acquires the positioning information obtained by the motion capture devices in real time.
S400, the scoring machine judges collision events in real time according to the 3D virtual human body models of the different competitors and the initially set original parameters.
To compensate for the limited performance of most civilian positioning equipment (in particular, incomplete limb movement trajectory data caused by an insufficient positioning update rate), the scoring machine uses a trajectory judgment mechanism to predict the limb movement trajectory ahead of time from the limb movement data already acquired, and uses the predicted points as additional data points when detecting impact events.
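A minimal sketch of such a trajectory judgment mechanism, using simple linear extrapolation of the last two successfully acquired samples to generate predicted points between low-frequency positioning updates; the real system may use a more elaborate predictor:

```python
def extrapolate(samples, n_predicted=4):
    """samples: list of (t, x, y, z) tuples from a positioning device, oldest first.
    Returns predicted (t, x, y, z) points ahead of the last sample, which the
    collision detector can treat as additional data points."""
    if len(samples) < 2:
        return []
    (t0, x0, y0, z0), (t1, x1, y1, z1) = samples[-2], samples[-1]
    dt = t1 - t0
    vx, vy, vz = (x1 - x0) / dt, (y1 - y0) / dt, (z1 - z0) / dt
    step = dt / (n_predicted + 1)
    return [(t1 + k * step,
             x1 + vx * k * step,
             y1 + vy * k * step,
             z1 + vz * k * step) for k in range(1, n_predicted + 1)]
```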
(1) Judging whether a suspected collision occurs according to the volume intersection of the 3D virtual human body models of different competitors.
(2) Once a suspected collision is judged to have occurred, the collision is evaluated and scored: if the collision strength exceeds a preset threshold, the collision event is judged to be real; if not, it is judged not to be a collision. The evaluation score is determined by the following factors (a minimal evaluation sketch follows item C):
A. Minimum distance: the smallest distance between the two competitors' intersecting objects at the time of the suspected collision.
B. Relative motion state of the objects at the suspected collision: a decreasing distance scores positively, with the score increasing with the speed at which the distance decreases; if the distance is increasing, the objects are moving apart and the event is directly determined to be a non-collision.
C. Oscillation: the larger the amplitude, the higher the score.
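A minimal sketch combining factors A to C into an evaluation score that is compared against the preset threshold; the weights and threshold are illustrative assumptions, not values from the patent:

```python
def evaluate_suspected_collision(min_distance, closing_speed, oscillation_amplitude,
                                 threshold=1.0):
    """Return (is_real_collision, score).

    min_distance          -- factor A: smallest distance between intersecting objects (m)
    closing_speed         -- factor B: rate at which the distance is shrinking (m/s);
                             zero or negative means the objects are separating
    oscillation_amplitude -- factor C: vibration amplitude measured on the devices
    """
    if closing_speed <= 0:          # distance increasing: directly a non-collision
        return False, 0.0

    score = 0.0
    score += max(0.0, 0.05 - min_distance) * 10.0   # deeper interpenetration scores higher
    score += closing_speed * 0.5                     # faster closing scores higher
    score += oscillation_amplitude * 2.0             # larger amplitude scores higher
    return score >= threshold, score

print(evaluate_suspected_collision(0.01, 4.0, 0.3))   # -> (True, ...)
```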
S500, the scoring machine calculates the competitors' scores in real time according to the scoring rules.
The real-time scoring process comprises the following steps:
(1) obtaining collision force parameters from the limb movement speed and trajectory of the competitor before the collision;
(2) identifying the specific limb parts of the competitors involved in the collision, to obtain the attacking part and the attacked part;
(3) scoring in real time based on the scoring rules once a real collision event has been confirmed.
The scoring standard is set by adjusting software parameters according to the specific competition rules. The implementation method is suitable for competition systems such as short-weapon contests, taekwondo, swordplay, and Western fencing, with the winner and loser of the bout determined according to collision position, collision strength, and other parameters.
A. Determining the attacking party and the hit party
The party whose intersecting volume is a non-attacking part, such as the body or the head, becomes the hit party;
the party whose intersecting volume is an attacking part, such as a hand, instrument, or foot, becomes the attacker;
in theory, both parties may be permitted to act as either attacker or hit party.
B. Calculating the force of collision
The force is calculated from the attacker's limb movement speed and movement distance before the collision, together with the relative movement speed of the struck part of the hit party (a combined sketch of items A and B follows).
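A minimal sketch combining item A (role determination), item B (force calculation), and a rule table of the kind adjusted per competition; the part classification, the force formula, the threshold, and the point values are all illustrative assumptions:

```python
ATTACKING_PARTS = {"hand", "foot", "instrument"}
NON_ATTACKING_PARTS = {"head", "body"}

# Illustrative per-sport rule table: points awarded by struck location.
RULES = {
    "taekwondo": {"head": 3, "body": 2},
    "fencing":   {"head": 1, "body": 1},
}

def classify(part_a, part_b):
    """Item A: given the two intersecting parts, return (attacker_part, hit_part) or None."""
    if part_a in ATTACKING_PARTS and part_b in NON_ATTACKING_PARTS:
        return part_a, part_b
    if part_b in ATTACKING_PARTS and part_a in NON_ATTACKING_PARTS:
        return part_b, part_a
    return None   # e.g. hand-on-hand contact: no score

def collision_force(attacker_speed, attacker_travel, hit_part_speed):
    """Item B: force index from the attacker's limb speed and travel distance
    and the relative speed of the struck part before impact."""
    return attacker_speed * attacker_travel + max(0.0, attacker_speed - hit_part_speed)

def score_hit(sport, part_a, part_b, attacker_speed, attacker_travel, hit_part_speed,
              force_threshold=0.5):
    roles = classify(part_a, part_b)
    if roles is None:
        return 0
    _, hit_part = roles
    if collision_force(attacker_speed, attacker_travel, hit_part_speed) < force_threshold:
        return 0
    return RULES.get(sport, {}).get(hit_part, 0)

print(score_hit("taekwondo", "foot", "head", 6.0, 0.4, 1.0))   # -> 3
```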
It will be understood that modifications and variations can be made by persons skilled in the art in light of the above teachings and all such modifications and variations are intended to be included within the scope of the invention as defined in the appended claims.
The invention is described above with reference to the accompanying drawings, which are illustrative; it is evident that implementation of the invention is not limited to the manner described above. Adopting various modifications of the inventive method concept and technical solution, or applying the inventive concept and technical solution directly to other fields without modification, falls within the scope of the invention.

Claims (6)

1. An electronic scoring system for multi-player confrontation-type items, comprising:
the motion capture system is used for positioning the motion capture devices of all body parts of the competitors;
and the scoring machine receives the positioning information sent by the motion capture system in real time, acquires the 3D virtual human body model corresponding to the competitor through the positioning information, judges whether effective collision occurs or not through the 3D virtual human body model, scores according to the type of the effective collision and a predefined scoring rule, and comprises:
the communication device is used for receiving the positioning information sent by the motion capture system in real time;
and the computing device obtains the 3D virtual human body models corresponding to the competitors according to the positioning information received by the communication device, judges that suspected collision occurs by judging the volume cross of the 3D virtual human body models of the two competitors, and obtains a scoring result by judging the strength of a suspected collision event and a competition rule, and comprises:
the operation interface is used for controlling the operation of the computing device through interface operation;
the motion capture system calibration module is used for calibrating each motion capture device connected with the scoring machine according to the human body structure, forming a 3D virtual human body model of a competitor and defining the attribute of each motion capture device;
the original parameter and scoring rule setting module is used for setting judgment parameters in the collision event judgment process and setting a rule of collision result integration;
the scoring operation module is used for judging whether collision occurs according to the original parameters and scoring the real collision event, and comprises:
the collision event judgment unit is used for obtaining the real-time limb position, posture and motion track of the competitor according to the 3D virtual human body model of the competitor, judging and obtaining a virtual collision event, judging whether the motion capture device worn by the competitor vibrates or not after obtaining the real-time limb position, posture and motion track of the competitor, and judging as a real collision event if the motion capture device worn by the competitor vibrates;
the collision force judging unit judges whether the collision reaches a set threshold value according to the force index when the virtual collision event occurs, and the virtual collision events reaching different set threshold values obtain different scores; and the data connection establishing module is used for establishing the connection relationship between the communication device and the motion capture system.
2. The electronic scoring system for multi-player confrontational items according to claim 1, wherein the motion capture system comprises a motion capture sensor and the motion capture device, the motion capture sensor is connected with the motion capture device in a wired/wireless manner, and the motion capture device is worn on different parts of the competitor or on a weapon.
3. An electronic scoring system for multi-person confrontation-type items according to claim 1, characterised in that each limb part or weapon wearing the motion capture device is individually configured with one or more attributes.
4. A method of implementing an electronic scoring system for multi-person confrontation type items according to any one of claims 1 to 3, comprising the steps of:
s100, the competitor wears each motion capture device at a different limb position according to the attributes of that device;
s200, establishing connection between a scoring machine and a motion capture system, and establishing a 3D virtual human body model of a competitor;
s300, the scoring machine acquires positioning information obtained by the motion capture device in real time;
s400, the scoring machine judges a collision event in real time according to the 3D virtual human body models of different competitors and initial set original parameters;
and S500, calculating the scoring condition of the competitor in real time according to the scoring rule by the scoring machine.
5. The method of claim 4, wherein the step S400 comprises: judging whether a suspected collision occurs according to the volume intersection of the 3D virtual human body models of different competitors, judging whether the collision strength exceeds a preset threshold value on the basis of judging the suspected collision, and judging a real collision event if the collision strength exceeds the preset threshold value.
6. The method of claim 4, wherein in step S500, the real-time scoring process comprises:
(1) obtaining collision force parameters according to the limb movement speed and the trajectory of the competitor before collision;
(2) identifying the specific limb part of the competitor who collides to obtain an attack part and an attacked part;
(3) and real-time scoring is carried out based on scoring rules through the judgment of the real collision event.
CN202010571424.9A 2020-06-22 2020-06-22 Electronic scoring system for multi-person confrontation type project and implementation method Active CN111672089B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010571424.9A CN111672089B (en) 2020-06-22 2020-06-22 Electronic scoring system for multi-person confrontation type project and implementation method

Publications (2)

Publication Number Publication Date
CN111672089A CN111672089A (en) 2020-09-18
CN111672089B true CN111672089B (en) 2021-09-07

Family

ID=72455978


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112237731A (en) * 2020-10-19 2021-01-19 上海名图软件有限公司 Automatic scoring system and scoring method for badminton match

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant