CN117392892A - XR-based simulated grenade training method, system, equipment and storage medium - Google Patents


Info

Publication number
CN117392892A
Authority
CN
China
Prior art keywords
gesture
grenade
simulated
simulated grenade
preset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311383090.2A
Other languages
Chinese (zh)
Inventor
宋殿义
李松
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
National University of Defense Technology
Original Assignee
National University of Defense Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by National University of Defense Technology
Priority to CN202311383090.2A
Publication of CN117392892A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B 9/00 - Simulators for teaching or training purposes
    • G09B 9/003 - Simulators for teaching or training purposes for military purposes and tactics
    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F42 - AMMUNITION; BLASTING
    • F42B - EXPLOSIVE CHARGES, e.g. FOR BLASTING, FIREWORKS, AMMUNITION
    • F42B 8/00 - Practice or training ammunition
    • F42B 8/12 - Projectiles or missiles
    • F42B 8/26 - Hand grenades
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 - Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 - Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 - Computer-aided design [CAD]
    • G06F 30/20 - Design optimisation, verification or simulation
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 20/00 - Machine learning

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Educational Administration (AREA)
  • Geometry (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Medical Informatics (AREA)
  • Business, Economics & Management (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

An XR-based simulated grenade training method, system, equipment and storage medium. The method constructs a virtual reality scene, obtains the state of a simulated grenade, and obtains the target gesture set corresponding to that state from a preset gesture database, so that each stage of grenade use is accurately simulated and the trainee's operating gestures at each stage can be compared with the standard gestures to obtain an accurate evaluation result. The trajectory of the simulated grenade is predicted by a pre-trained tracking model to obtain the throwing result of the simulated grenade in the virtual reality scene, and a training result is generated from the gesture matching result and the throwing result. Because the throwing result in the virtual reality scene is calculated by the tracking model, it can be evaluated accurately, a realistic striking effect can be displayed in the virtual scene, and the realistic scene simulated on the basis of virtual reality improves the sense of reality.

Description

XR-based simulated grenade training method, system, equipment and storage medium
Technical Field
The invention belongs to the technical field of military simulation training, and in particular relates to an XR-based simulated grenade training method, system, equipment and storage medium.
Background
The grenade is a small hand-thrown munition that can be used for both attack and defense. It is also one of the most widely used and most heavily consumed munitions and, owing to its small size, low mass and ease of carrying and use, it has played an important role in modern warfare.
At present, grenade throwing training usually proceeds in stages: the throwing action is first practiced with an inert simulated grenade, then training continues with a simulated grenade that has lethal power, and finally live grenades are thrown. Because the training effect of the simulated grenade is not ideal, and the trainees are often new recruits who have never thrown live ammunition, accidents during training with lethal simulated grenades and during live throwing occur and often cause casualties.
In the prior art, the simulated grenade used for traditional training mainly reproduces the appearance of a real grenade and is used on an ordinary training field, so realism is poor and the feedback is insufficient to motivate the trainee. In addition, the training process is limited by personnel resources: one instructor usually trains many trainees at once, and when judging training results the instructor can only assess throwing accuracy from where the simulated grenade lands, or judge from experience whether the trainee's posture is wrong, so finer details of the process, such as the grip on the grenade and the movement of the arm, cannot be taken into account.
In summary, existing simulated grenade training places high demands on the training site, requiring a large throwing field; moreover, during the whole training process it is impossible to judge accurately and thoroughly whether every action of the trainee is standard, the throwing trajectory of the simulated grenade cannot be analyzed, and a scientific training plan cannot be formulated for the trainee.
Disclosure of Invention
Therefore, the invention provides an XR-based simulated grenade training method, system, equipment and storage medium, which solve the prior-art problems that the sense of realism during throwing is poor and the operating details of the trainee are difficult to monitor in real time.
In order to achieve the above object, the present invention provides the following technical solutions: in a first aspect, an XR-based simulated grenade training method is provided, comprising:
constructing a virtual reality scene, acquiring state information and position information of a simulated grenade in the virtual reality scene, and displaying a real-time image of the simulated grenade in the virtual reality scene according to the state information and the position information;
acquiring a target gesture set corresponding to the state of the simulated grenade from a preset gesture database, wherein the preset gesture database stores preset standard human body gestures, and the gesture set for each state of the simulated grenade is generated according to a preset mapping relation between the standard human body gestures and the states of the simulated grenade;
acquiring first body posture data of the trained object, and matching the first body posture data with the target gesture set to generate a gesture matching result;
responding to a signal of switching the simulated grenade to a simulated excitation state, acquiring a coordinate parameter of the simulated grenade in a preset space coordinate system and a throwing parameter of the simulated grenade, and calculating a motion state of the simulated grenade in the preset space coordinate system through the throwing parameter;
predicting the track of the simulated grenade by using a tracking model, obtaining a throwing result of the simulated grenade in the virtual reality scene, and generating a striking effect of the simulated grenade in the virtual reality scene;
and generating a training result of the simulated grenade by using the gesture matching result and the throwing result.
As a preferred scheme of the simulation grenade training method based on XR, the state information of the simulated grenade comprises a safety-locked state, a safety-released state, a to-be-excited state and a simulated excitation state;
and acquiring the vertical acceleration and the transverse acceleration of the simulated grenade in the safety-locked state and the safety-released state, and judging whether the simulated grenade has been dropped.
As a preferred scheme of the simulation grenade training method based on XR, the gesture set for a simulated grenade state comprises the standard human body gestures of the actions the trained object performs when the simulated grenade is in that state; the types of the standard human body gestures include a trunk gesture, a left hand gesture and a right hand gesture;
matching the first body pose data with the target pose set, comprising:
taking the standard human body posture corresponding to the first human body posture data as a target human body posture;
judging whether the first approximation degree of the first body posture data and the target body posture is smaller than a first preset threshold value or not;
if the first approximation degree of the first body posture data and the target body posture is smaller than a first preset threshold value, the posture matching result is qualified;
if the first approximation degree of the first body posture data and the target body posture is not smaller than a first preset threshold value, the posture matching result is unqualified.
As a preferred scheme of the simulation grenade training method based on XR, the method for generating the gesture matching result comprises the following steps:
splitting the first body posture data into a current torso posture, a current left hand posture and a current right hand posture;
calculating respectively a second approximation degree between the current trunk gesture and the trunk gesture of the target human body gesture, a third approximation degree between the current left hand gesture and the left hand gesture of the target human body gesture, and a fourth approximation degree between the current right hand gesture and the right hand gesture of the target human body gesture;
if the second approximation degree, the third approximation degree and the fourth approximation degree all accord with corresponding preset conditions, judging the gesture matching result as qualified;
and if at least one of the second approximation degree, the third approximation degree and the fourth approximation degree does not accord with the corresponding preset condition, judging the gesture matching result as unqualified.
As a preferred scheme of the simulation grenade training method based on XR, the body joint points of the trained object are captured by using the OpenPose human body posture detection algorithm, and the human body posture of the trained object is generated from the skeleton connecting lines between the body joint points.
As a preferred scheme of the simulation grenade training method based on XR, the simulation grenade training method based on XR further comprises the following steps:
detecting the squat gesture of the trained object, and acquiring second body gesture data, wherein the second body gesture data comprises a first height of the trained object when the simulated grenade is in the safety-locked state and/or the safety-released state, and a second height of the trained object when the simulated grenade is switched to the simulated excitation state;
and the gesture matching result further comprises a squat result; if the difference between the first height and the second height exceeds a preset height difference, the squat result is judged to be qualified.
As a preferred scheme of the simulation grenade training method based on XR, the throwing parameters comprise acceleration, angular velocity and movement direction;
predicting the track of the simulated grenade by using a tracking model to obtain a throwing result of the simulated grenade in the virtual reality scene, wherein the method comprises the following steps:
adding virtual wall parameters and virtual ground parameters in the preset space coordinate system based on a preset collision parameter library, constructing a virtual wall in the virtual reality scene through the virtual wall parameters, constructing a virtual ground in the virtual reality scene through the virtual ground parameters, and storing the virtual wall parameters and the virtual ground parameters through the collision parameter library;
calculating expected position coordinates of each time point of the simulated grenade in a preset time period in the preset space coordinate system based on the coordinate parameters, the throwing parameters, the virtual wall parameters and the virtual ground parameters;
and according to the expected position coordinates, confirming the explosion position of the simulated grenade in the virtual reality scene based on the preset explosion time delay, generating a throwing result of the simulated grenade based on the explosion position, and displaying a striking effect in the virtual reality scene.
In a second aspect, the present invention provides an XR-based simulated grenade training system, comprising:
the simulated grenade display module is used for constructing a virtual reality scene, acquiring state information and position information of a simulated grenade in the virtual reality scene, and displaying a real-time image of the simulated grenade in the virtual reality scene according to the state information and the position information;
the target gesture set acquisition module is used for acquiring a target gesture set corresponding to the state of the simulated grenade from a preset gesture database, wherein the target gesture set of the preset gesture database stores preset standard human body gestures;
the gesture set generation module is used for generating the gesture set for each simulated grenade state according to a preset mapping relation between the standard human body gestures and the states of the simulated grenade;
the gesture matching module is used for acquiring first body gesture data of the trained object, matching the first body gesture data with the target gesture set and generating a gesture matching result;
the motion state analysis module is used for responding to the signal of switching the simulation grenade to the simulation excitation state, acquiring the coordinate parameters of the simulation grenade in a preset space coordinate system and the throwing parameters of the simulation grenade, and calculating the motion state of the simulation grenade in the preset space coordinate system through the throwing parameters;
The throwing tracking module is used for predicting the track of the simulated grenade by utilizing a tracking model, obtaining the throwing result of the simulated grenade in the virtual reality scene and generating the striking effect of the simulated grenade in the virtual reality scene;
and the training result generation module is used for generating the training result of the simulated grenade by utilizing the gesture matching result and the throwing result.
As a preferred scheme of the simulation grenade training system based on XR, in the simulated grenade display module, the state information of the simulated grenade comprises a safety-locked state, a safety-released state, a to-be-excited state and a simulated excitation state;
and acquiring the vertical acceleration and the transverse acceleration of the simulated grenade in the safety-locked state and the safety-released state, and judging whether the simulated grenade has been dropped.
As a preferred scheme of the simulation grenade training system based on XR, in the gesture set generation module, the gesture set for a simulated grenade state comprises the standard human body gestures of the actions the trained object performs when the simulated grenade is in that state; the types of the standard human body gestures include a trunk gesture, a left hand gesture and a right hand gesture;
the gesture matching module comprises:
The target human body posture extraction sub-module is used for taking the standard human body posture corresponding to the first human body posture data as a target human body posture;
the approximation degree judging sub-module is used for judging whether the first approximation degree of the first body posture data and the target body posture is smaller than a first preset threshold value or not; if the first approximation degree of the first body posture data and the target body posture is smaller than a first preset threshold value, the posture matching result is qualified; if the first approximation degree of the first body posture data and the target body posture is not smaller than a first preset threshold value, the posture matching result is unqualified.
As a preferred scheme of the simulation grenade training system based on XR, the gesture matching module further comprises:
the gesture data splitting module is used for splitting the first body gesture data into a current trunk gesture, a current left hand gesture and a current right hand gesture;
the approximation degree calculation sub-module is used for calculating respectively a second approximation degree between the current trunk gesture and the trunk gesture of the target human body gesture, a third approximation degree between the current left hand gesture and the left hand gesture of the target human body gesture, and a fourth approximation degree between the current right hand gesture and the right hand gesture of the target human body gesture;
The approximation degree judging sub-module is further used for judging that the gesture matching result is qualified if the second approximation degree, the third approximation degree and the fourth approximation degree all accord with corresponding preset conditions;
and if at least one of the second approximation degree, the third approximation degree and the fourth approximation degree does not accord with the corresponding preset condition, judging the gesture matching result as unqualified.
As a preferred scheme of the simulation grenade training system based on XR, the body joint points of the trained object are captured by using the OpenPose human body posture detection algorithm, and the human body posture of the trained object is generated from the skeleton connecting lines between the body joint points.
As a preferred embodiment of the XR-based simulated grenade training system, the system further comprises:
the squat gesture analysis module is used for detecting the squat gesture of the trained object and acquiring second body gesture data, wherein the second body gesture data comprises a first height of the trained object when the simulated grenade is in the safety-locked state and/or the safety-released state, and a second height of the trained object when the simulated grenade is switched to the simulated excitation state;
and the gesture matching result in the gesture matching module further comprises a squat result, and if the difference between the first height and the second height exceeds the preset height difference, the squat result is judged to be qualified.
As a preferred scheme of the simulation grenade training system based on XR, in the motion state analysis module, the throwing parameters comprise acceleration, angular velocity and motion direction;
the throwing tracking module includes:
the virtual wall and ground processing sub-module is used for adding virtual wall parameters and virtual ground parameters in the preset space coordinate system based on a preset collision parameter library, constructing a virtual wall in the virtual reality scene through the virtual wall parameters, constructing a virtual ground in the virtual reality scene through the virtual ground parameters, and storing the virtual wall parameters and the virtual ground parameters through the collision parameter library;
the expected position coordinate analysis submodule is used for calculating expected position coordinates of each time point of the simulated grenade in a preset time period in the preset space coordinate system based on the coordinate parameters, the throwing parameters, the virtual wall parameters and the virtual ground parameters;
and the throwing tracking simulation sub-module is used for confirming the explosion position of the simulated grenade in the virtual reality scene based on the preset explosion time delay according to the expected position coordinates, generating a throwing result of the simulated grenade based on the explosion position, and displaying the striking effect in the virtual reality scene.
In a third aspect, there is provided an electronic device comprising: a processor, and a memory communicatively coupled to the processor; the memory stores computer-executable instructions; the processor executes computer-executable instructions stored by the memory to implement the XR-based simulated grenade training method of the first aspect or any possible implementation thereof.
In a fourth aspect, there is provided a computer readable storage medium having stored therein computer executable instructions for implementing the XR-based simulated grenade training method of the first aspect, or any possible implementation thereof, when executed by a processor.
The invention has the following advantages: a virtual reality scene is constructed, state information and position information of a simulated grenade in the virtual reality scene are acquired, and a real-time image of the simulated grenade is displayed in the virtual reality scene according to the state information and the position information; a target gesture set corresponding to the state of the simulated grenade is acquired from a preset gesture database, wherein the preset gesture database stores preset standard human body gestures, and the gesture set for each state of the simulated grenade is generated according to a preset mapping relation between the standard human body gestures and the states of the simulated grenade; first body posture data of the trained object are acquired and matched with the target gesture set to generate a gesture matching result; in response to a signal that the simulated grenade is switched to the simulated excitation state, coordinate parameters of the simulated grenade in a preset space coordinate system and throwing parameters of the simulated grenade are acquired, and the motion state of the simulated grenade in the preset space coordinate system is calculated from the throwing parameters; the trajectory of the simulated grenade is predicted by a tracking model to obtain the throwing result of the simulated grenade in the virtual reality scene, and the striking effect of the simulated grenade is generated in the virtual reality scene; finally, the training result of the simulated grenade is generated from the gesture matching result and the throwing result. The invention can not only evaluate the throwing result accurately, but also display a realistic striking effect in the virtual scene; by simulating a realistic scene based on virtual reality, the sense of realism is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It will be apparent to those skilled in the art from this disclosure that the drawings described below are merely exemplary and that other embodiments may be derived from the drawings provided without undue effort.
FIG. 1 is a schematic diagram of an application scenario of an XR-based simulated grenade training system provided in an embodiment of the present invention;
FIG. 2 is a schematic flow chart of an XR-based simulated grenade training method provided in an embodiment of the present invention;
FIG. 3 is a flowchart of a throwing result of a simulated grenade in a virtual reality scene obtained in the XR-based simulated grenade training method provided in the embodiment of the present invention;
FIG. 4 is a schematic diagram of an XR-based simulated grenade training system architecture provided in an embodiment of the present invention;
fig. 5 is a schematic diagram of an electronic device employing an XR-based simulated grenade training method according to an embodiment of the present invention.
Detailed Description
Other advantages and effects of the present invention will become readily apparent to those skilled in the art from the following description, in which the invention is described by way of specific embodiments. The described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the invention without inventive effort fall within the scope of protection of the invention.
First, the terms involved in the present invention will be explained:
Extended Reality (XR) refers to a human-computer-interactive environment that combines the real and the virtual by means of a computer; it is also a general term for technologies such as Augmented Reality (AR), Virtual Reality (VR) and Mixed Reality (MR). By integrating the visual interaction technologies of the three, XR brings the experiencer a sense of "immersion", with seamless transitions between the virtual world and the real world.
Virtual Reality (VR) is based mainly on computer technology. It integrates the latest achievements of three-dimensional graphics, multimedia, simulation, display, servo and other high technologies and, by means of computers and related equipment, generates a realistic virtual world with three-dimensional visual, tactile, olfactory and other sensory experiences, so that a person in the virtual world has an immersive feeling.
OpenPose (human body posture recognition) is an open-source library based on convolutional neural networks and supervised learning. It can track a person's facial expression, trunk, limbs and even fingers, is applicable to single or multiple persons, and has good robustness.
In order to better understand the scheme of the embodiment of the present invention, an application scenario related to the embodiment of the present invention is described below.
Referring to fig. 1, an application scenario diagram of an XR-based simulated grenade training system according to an embodiment of the present invention includes a simulated grenade 100, a camera device 200, a positioning device 300, a server 400, and a wearable device 500.
The simulated grenade 100 comprises a first detection unit, a second detection unit and a tracker. The first detection unit is used for detecting switching of the simulated grenade between the safety-locked state and the safety-released state, the second detection unit is used for detecting switching between the safety-released state and the simulated excitation state, and the tracker is used for positioning the simulated grenade and judging whether it has been dropped;
the camera device 200 is used for shooting and identifying the body posture of the trained object and acquiring first body posture data of the trained object;
the positioning device 300 is used for tracking the motion track of the simulated grenade and predicting the track when the simulated grenade is in a simulated excitation state;
The server 400 is configured to construct a virtual reality scene, receive signals of the first detection unit, the second detection unit and the tracker, obtain a state and a position of the simulated grenade, and generate a real-time image of the simulated grenade in the virtual reality scene; the system is also used for receiving first body posture data of the trained personnel sent by the camera device and generating a posture evaluation result; the method is also used for receiving track prediction data sent by the positioning device, obtaining a throwing evaluation result, generating a striking effect of the simulated grenade in the virtual reality scene, and generating a training result based on the posture evaluation result and the throwing evaluation result;
the wearing device 500 is used for generating sensory stimulation to trained personnel according to the real-time image of the simulated grenade and the striking effect of the simulated grenade.
It should be noted that the "grenade" in the simulated grenade of the invention refers not only to hand grenades but also to other throwable munitions, such as tear-gas and smoke grenades, that can produce a certain striking effect.
It should be noted that, considering that in VR the trainee cannot see his own hand or the physical simulated grenade held in it, which hinders operations such as pulling the pin, the wearable device 500 may further include a static-posture detection module for detecting the static posture of the trainee: when the trainee keeps a head-down posture for a preset time (for example, three seconds), the VR projection in front of the trainee's eyes is cancelled so that the trainee can see the physical grenade.
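A minimal sketch of how such a head-down dwell check might work is given below; the pitch threshold is an assumed value, and only the three-second dwell time comes from the example above.

```python
import time

HEAD_DOWN_PITCH_DEG = 45.0   # assumed pitch threshold for a "head-down" posture
DWELL_SECONDS = 3.0          # dwell time taken from the example above

class HeadDownDetector:
    """Signals that the VR projection should be cancelled once the head has stayed
    down for the dwell time, so the trainee can see the physical grenade."""

    def __init__(self, pitch_threshold=HEAD_DOWN_PITCH_DEG, dwell=DWELL_SECONDS):
        self.pitch_threshold = pitch_threshold
        self.dwell = dwell
        self._down_since = None

    def update(self, head_pitch_deg: float, now: float | None = None) -> bool:
        now = time.monotonic() if now is None else now
        if head_pitch_deg >= self.pitch_threshold:
            if self._down_since is None:
                self._down_since = now            # head-down posture just started
            return (now - self._down_since) >= self.dwell
        self._down_since = None                   # posture broken, reset the timer
        return False

# usage sketch: feed one (timestamp, pitch) sample per rendered frame
detector = HeadDownDetector()
for t, pitch in [(0.0, 10.0), (0.5, 50.0), (2.0, 55.0), (4.0, 60.0)]:
    if detector.update(pitch, now=t):
        print("cancel the VR projection and show the physical grenade")
```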
The following describes the technical scheme of the present invention and how the technical scheme of the present invention solves the above technical problems in detail with specific embodiments. The following embodiments may be combined with each other, and the same or similar concepts or processes may not be described in detail in some embodiments. Embodiments of the present invention will be described below with reference to the accompanying drawings.
Example 1
Referring to fig. 2, an embodiment of the present invention provides an XR-based simulated grenade training method, including the following steps:
S201: constructing a virtual reality scene, acquiring the state and position of a simulated grenade, and displaying a real-time image of the corresponding simulated grenade in the virtual reality scene according to that state and position, wherein the simulated grenade state comprises a safety-locked state, a safety-released state, a to-be-excited state and a simulated excitation state; and obtaining a target gesture set corresponding to the simulated grenade state from a preset gesture database, wherein the preset gesture database stores a plurality of standard human gestures and the gesture set for each simulated grenade state is generated according to the mapping relation between the standard human gestures and the simulated grenade states.
It will be appreciated that, corresponding to the actual operation and use of a grenade, the trainee has different standard actions for each state of the simulated grenade. For example, when the simulated grenade is in the safety-locked state, the trainee should hold it and pull out the pin; when its safety has been released, the trainee should hold and throw it; and when the simulated grenade is in the simulated excitation state, that is, after it has been thrown, the trainee should squat down immediately. The standard human body postures corresponding to each state of the simulated grenade are therefore generated from these standard actions, so that the trainee's posture can be compared and analyzed against them; an illustrative mapping is sketched below.
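By way of illustration only, the mapping from simulated-grenade states to target gesture sets could be organized as follows; the state names and gesture labels are placeholders rather than the actual contents of the preset gesture database.

```python
# Illustrative mapping from simulated-grenade state to the target gesture set.
# The gesture labels stand in for stored standard human body postures.
GESTURE_DATABASE = {
    "safety_locked":        ["hold_grenade", "pull_pin"],
    "safety_released":      ["hold_grenade", "throw"],
    "simulated_excitation": ["squat_immediately"],
}

def target_gesture_set(grenade_state: str) -> list[str]:
    """Return the target gesture set for the current simulated-grenade state."""
    return GESTURE_DATABASE.get(grenade_state, [])

print(target_gesture_set("safety_released"))  # ['hold_grenade', 'throw']
```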
S202: and acquiring first body posture data of the trained personnel, and matching the first body posture data with the target posture set to generate a posture matching result.
Specifically, in step S202, the body joint points of the trainee may be captured by a human body posture detection algorithm, the body posture of the trainee is generated from the skeleton connecting lines between the joint points, and the first body posture data are then matched with the target posture set to generate a posture matching result.
S203: in response to the signal that the simulated grenade is switched to the simulated excitation state, acquiring the coordinate parameters of the simulated grenade in a preset space coordinate system and the throwing parameters of the simulated grenade, predicting the trajectory of the simulated grenade by using a pre-trained tracking model, obtaining the throwing result of the simulated grenade in the virtual reality scene, generating the striking effect of the simulated grenade in the virtual reality scene, and calculating the motion state of the simulated grenade in the preset space coordinate system from the throwing parameters.
It will be appreciated that, for the simulated grenade, the safety pin controls whether it is in the safety-locked or the safety-released state, and its safety lever (grip) has a safety-locked position and a simulated-excitation position, which is what produces the signal for switching to the simulated excitation state. When the signal that the simulated grenade has switched to the simulated excitation state is received, it means that the safety pin has been pulled out and the grenade has left the hand; under the influence of the throwing force, gravity and so on, the safety lever moves from its locked position to the simulated-excitation position, and from this moment the trajectory of the simulated grenade is tracked.
In addition, in step S203, the throwing parameters of the simulated grenade may be obtained to predict the motion trail thereof, and the motion trail and the image of the explosion result thereof may be simulated in the virtual reality scene, so as to improve the sense of reality experienced by the trained personnel.
S204: and generating a training result based on the gesture matching result and the throwing result.
Specifically, the training result covers both the trainee's postures and the throwing result of the simulated grenade, so the assessment is comprehensive.
In summary, the simulated grenade training method provided by this embodiment constructs a virtual reality scene and obtains the state of the simulated grenade, which includes the safety-locked state, the safety-released state and the simulated excitation state; the target gesture set corresponding to the simulated grenade state is obtained from the preset gesture database, the trainee's first body gesture data are obtained and matched with the target gesture set, and a gesture matching result is generated. In this way the state of the grenade at each stage of use is accurately simulated, and the trainee's operating gestures at each stage are compared with the standard gestures to obtain an accurate evaluation. In response to the signal that the simulated grenade is switched to the simulated excitation state, the coordinate parameters of the simulated grenade in the preset space coordinate system and its throwing parameters are obtained, the motion state of the simulated grenade in the preset space coordinate system is calculated from the throwing parameters, its trajectory is predicted by the pre-trained tracking model to obtain the throwing result in the virtual reality scene, and the training result is generated from the gesture matching result and the throwing result. Because the throwing result is calculated by the tracking model, it can be evaluated accurately, a realistic striking effect can be displayed in the virtual scene, and simulating a realistic scene based on virtual reality improves the sense of reality.
In one possible embodiment, it is considered that during training the trainee may drop the simulated grenade because of a faulty grip. In that case, even if the safety pin has not been pulled out, or has only just been pulled out, so that the simulated grenade has not entered the simulated excitation state, it should still be judged that the simulated grenade has been dropped and the training result is unqualified. Therefore, the simulated grenade training method provided by this embodiment further includes: acquiring the vertical acceleration and the lateral acceleration of the simulated grenade in the safety-locked state and the safety-released state, and judging whether the simulated grenade has been dropped.
Specifically, a tracker can be built into the simulated grenade. When the simulated grenade is in the safety-locked state or the safety-released state, if the tracker detects that the vertical acceleration of the simulated grenade exceeds a first preset acceleration value while the lateral acceleration does not exceed a second preset acceleration value, it is judged that the simulated grenade has been dropped; at this moment the simulated grenade can simulate an explosion, corresponding to what would happen with a real grenade.
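The drop check can be sketched as follows, assuming the built-in tracker reports vertical and lateral acceleration in m/s²; the two numeric thresholds are placeholders, since the text only refers to a first and a second preset acceleration value.

```python
# Hypothetical thresholds; the description only names a "first" and "second"
# preset acceleration value without giving numbers.
FIRST_PRESET_VERTICAL_ACC = 8.0   # m/s^2, close to free fall
SECOND_PRESET_LATERAL_ACC = 3.0   # m/s^2, rules out a deliberate throw

def grenade_dropped(state: str, vertical_acc: float, lateral_acc: float) -> bool:
    """Judge a drop only while the grenade is safety-locked or safety-released."""
    if state not in ("safety_locked", "safety_released"):
        return False
    return (vertical_acc > FIRST_PRESET_VERTICAL_ACC
            and lateral_acc <= SECOND_PRESET_LATERAL_ACC)

# A detected drop triggers a simulated explosion, mirroring real grenade handling.
print(grenade_dropped("safety_released", vertical_acc=9.5, lateral_acc=1.2))  # True
```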
In one possible implementation, the set of poses for a simulated grenade state includes the standard human body pose of each action performed by the trained person in that state, and the types of standard human body poses include a torso pose, a left hand pose and a right hand pose. Thus, matching the first body pose data with the target pose set to generate a pose matching result comprises: confirming the standard human body posture corresponding to the first body posture data as the target human body posture; judging whether the first approximation degree between the first body posture data and the target human body posture is smaller than a first preset threshold; if so, the posture matching result is qualified; otherwise, the posture matching result is unqualified.
It can be appreciated that, since standard human body gestures are collected and stored in advance for each simulated grenade state, every action gesture of the trainee during training has a standard against which it can be compared; that is, whether the trainee's action meets the standard can be judged by matching the first body gesture data with the target gesture set.
In this embodiment, for each standard human body posture and for the trainee's first body posture data, the OpenPose human body posture detection algorithm may be used to obtain the skeleton connecting lines between the body joint points, and an image representing the human body posture, that is, the generated human body posture, can be constructed from these connecting lines, so that the posture matching result can be obtained through image recognition and similar techniques. It should be noted that the trunk posture in the invention includes a head posture, a left leg posture, a right leg posture, a left arm posture and a right arm posture.
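By way of illustration, once OpenPose (for example through its Python wrapper) has produced 2-D keypoints in the BODY_25 format, a whole-body approximation degree could be computed as a normalized cosine similarity between the two skeletons; the normalization step and the similarity measure below are assumptions, since the description does not specify how the approximation degree is calculated.

```python
import numpy as np

def normalize_skeleton(keypoints: np.ndarray) -> np.ndarray:
    """Translate to the neck (index 1) and scale by the neck-to-mid-hip (index 8)
    distance so that body size and image position cancel out (BODY_25 indexing assumed)."""
    pts = keypoints - keypoints[1]
    scale = np.linalg.norm(keypoints[1] - keypoints[8]) + 1e-6
    return pts / scale

def first_approximation(current: np.ndarray, standard: np.ndarray) -> float:
    """Whole-body approximation degree in [0, 1] based on cosine similarity."""
    a = normalize_skeleton(current).ravel()
    b = normalize_skeleton(standard).ravel()
    cos = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-6))
    return (cos + 1.0) / 2.0

# the 25 x 2 keypoint arrays would normally come from OpenPose; random data stands in here
current_pose = np.random.rand(25, 2)
standard_pose = np.random.rand(25, 2)
print(f"first approximation degree: {first_approximation(current_pose, standard_pose):.2f}")
```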
In this embodiment, the gesture matching result may be further graded according to the value of the first approximation degree: for example, when the first approximation degree is 70%-80%, the gesture matching result is qualified; when it is greater than 80% and not greater than 90%, the gesture matching result is good; and when it is greater than 90%, the gesture matching result is excellent.
In one possible embodiment, different standard human body poses emphasize different parts of the body: for example, when the pin is pulled out the torso hardly changes and the left hand and right hand poses are what matter, whereas when throwing, the torso, left hand and right hand poses all change markedly. In such cases, a first approximation degree obtained by matching the posture data of the whole body alone is not accurate enough.
Thus, generating the gesture matching result may include: splitting the first body posture data into a current torso posture, a current left hand posture and a current right hand posture; calculating respectively a second approximation degree between the current torso posture and the torso posture of the target human body posture, a third approximation degree between the current left hand posture and the left hand posture of the target human body posture, and a fourth approximation degree between the current right hand posture and the right hand posture of the target human body posture; if the second, third and fourth approximation degrees all meet their corresponding preset conditions, the gesture matching result is qualified; otherwise, the gesture matching result is unqualified.
It can be understood that judging the second, third and fourth approximation degrees makes it possible to check whether every part of the body meets the requirements; the judging standard is stricter, and the result is more accurate and more reasonable. Accordingly, the second, third and fourth approximation degrees may also be further refined and evaluated as different grades according to different numerical ranges, as in the sketch below.
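Continuing the sketch above, the per-part check of the second, third and fourth approximation degrees, together with the grade bands mentioned earlier, might look as follows; the BODY_25 index groups, thresholds and bands are illustrative assumptions (the shoulder-elbow-wrist chain stands in for the hand gestures, although OpenPose's dedicated hand keypoints could equally be used).

```python
import numpy as np

# assumed BODY_25 index groups for the split into torso, left hand and right hand
PART_INDICES = {
    "torso":      [0, 1, 8, 9, 10, 11, 12, 13, 14],   # head, trunk and legs
    "left_hand":  [5, 6, 7],                           # left shoulder, elbow, wrist
    "right_hand": [2, 3, 4],                           # right shoulder, elbow, wrist
}
PART_THRESHOLDS = {"torso": 0.70, "left_hand": 0.70, "right_hand": 0.70}  # assumed

def part_approximation(current, standard, indices) -> float:
    """Approximation degree of one body part, as a cosine similarity mapped to [0, 1]."""
    a = np.asarray(current)[indices].ravel()
    b = np.asarray(standard)[indices].ravel()
    cos = float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-6))
    return (cos + 1.0) / 2.0

def grade(approx: float) -> str:
    """Grade bands following the 70% / 80% / 90% example in the description."""
    if approx > 0.90:
        return "excellent"
    if approx > 0.80:
        return "good"
    if approx >= 0.70:
        return "qualified"
    return "unqualified"

def match_pose(current, standard):
    """Qualified only if the torso, left-hand and right-hand approximations all pass."""
    parts = {p: part_approximation(current, standard, idx) for p, idx in PART_INDICES.items()}
    qualified = all(parts[p] >= PART_THRESHOLDS[p] for p in parts)
    return {
        "parts": {p: (round(a, 2), grade(a)) for p, a in parts.items()},
        "overall": "qualified" if qualified else "unqualified",
    }

print(match_pose(np.random.rand(25, 2), np.random.rand(25, 2)))
```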
In one possible implementation, corresponding to the actual operation and use of a grenade, the thrower must squat down immediately after throwing, and the depth of the squat also matters. Therefore, the simulated grenade training system provided by this embodiment judges whether this step of the operation is qualified by detecting the trainee's squat posture.
Specifically, the squat posture of the trainee is detected and second body posture data are acquired, wherein the second body posture data comprise a first height of the trainee when the simulated grenade is in the safety-locked state and/or the safety-released state, and a second height of the trainee when the simulated grenade is switched to the simulated excitation state; the posture matching result further comprises a squat result, and if the difference between the first height and the second height exceeds a preset height difference, the squat result is qualified.
It will be appreciated that in general, the trained person is standing during the training process, so that the first height and the second height will have a difference, and when the difference exceeds the preset height difference, it can be determined that the trained person has performed a squat action.
It should be noted that, in the training process, the squatting gesture and timeliness of the trained personnel can be detected through the wearing device, and the squatting gesture and timeliness are combined with the first body gesture data of the trained personnel shot by the camera device to form double judgment, so that the detection accuracy of the squatting gesture of the trained personnel is enhanced.
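As a minimal sketch of the squat check: the two heights could be read from the headset's tracked position or from the camera-derived skeleton, and the preset height difference used here is an assumed value.

```python
PRESET_HEIGHT_DIFF_M = 0.40   # assumed preset height difference, in meters

def squat_result(first_height_m: float, second_height_m: float,
                 preset_diff_m: float = PRESET_HEIGHT_DIFF_M) -> str:
    """first_height: trainee height while the grenade is safety-locked or safety-released;
    second_height: trainee height when the grenade switches to the simulated excitation state."""
    drop = first_height_m - second_height_m
    return "qualified" if drop > preset_diff_m else "unqualified"

# e.g. standing at 1.72 m before the throw and ducking to 1.05 m after it
print(squat_result(1.72, 1.05))   # qualified
```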
In a possible implementation manner, the throwing parameters include acceleration, angular velocity and movement direction. The simulated grenade training provided by this embodiment first constructs collision parameters such as a virtual wall and a virtual ground, and then predicts the trajectory of the simulated grenade with the tracking model to obtain the throwing result of the simulated grenade in the virtual reality scene.
Referring to fig. 3, which is a flowchart of obtaining the throwing result of the simulated grenade in the virtual reality scene in this embodiment, predicting the trajectory of the simulated grenade by using the pre-trained tracking model to obtain the throwing result of the simulated grenade in the virtual reality scene includes:
s301: based on a preset collision parameter library, virtual wall parameters and virtual ground parameters are added in the space coordinate system, wherein the virtual wall parameters are used for constructing a virtual wall in the virtual reality scene, the virtual ground parameters are used for constructing a virtual ground in the virtual reality scene, and the collision parameter library is used for storing the virtual wall parameters and the virtual ground parameters.
It can be understood that in an actual throwing scene, more or less obstacles exist, so that in the simulated grenade training system provided by the invention, the obstacles, such as walls and floors with different terrains, can be constructed in a virtual reality scene according to a preset collision parameter library.
S302: and calculating expected position coordinates of each time point of the simulated grenade in the preset time period in the space coordinate system based on the coordinate parameters, the throwing parameters, the virtual wall parameters and the virtual ground parameters.
It will be appreciated that the expected position coordinates described in this embodiment are projected in a virtual reality scene, and may be outside the wall, inside the wall, or hit on the wall or the ground.
Specifically, when the simulated grenade clears the wall in the virtual reality scene, it may fall to the ground beyond it; when the simulated grenade hits a wall, a collision occurs and the grenade rebounds; when the simulated grenade does not clear the wall, it may fall to the ground on the near side of the wall; after it falls to the ground it may also roll; and when the random delay is too short or the throwing distance is too great, the simulated grenade may explode in the air. For each of these situations, the estimation can be carried out with a preset physical model based on the throwing parameters of the simulated grenade, including acceleration, angular velocity and movement direction.
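The description leaves the physical model open; the sketch below shows one plausible form of it: simple projectile integration against a flat virtual ground and a single vertical virtual wall, with restitution coefficients standing in for the collision parameter library. All numeric values are assumptions.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # z axis points up in this assumed coordinate system
GROUND_RESTITUTION = 0.3               # assumed entries of the collision parameter library
WALL_RESTITUTION = 0.4
WALL_X = 15.0                          # assumed: vertical virtual wall in the plane x = 15 m
WALL_HEIGHT = 2.0                      # assumed wall height in meters

def predict_positions(p0, v0, dt=0.01, duration=6.0):
    """Expected positions of the simulated grenade at each time point of a preset period,
    rebounding off the virtual ground (z = 0) and the virtual wall (x = WALL_X)."""
    p, v = np.asarray(p0, dtype=float), np.asarray(v0, dtype=float)
    positions = []
    for _ in range(int(duration / dt)):
        prev = p.copy()
        v = v + GRAVITY * dt
        p = p + v * dt
        if p[2] < 0.0:                              # collision with the virtual ground
            p[2] = 0.0
            v[2] = -v[2] * GROUND_RESTITUTION
            v[:2] *= 0.8                            # crude rolling / friction loss
        if prev[0] <= WALL_X < p[0] and p[2] < WALL_HEIGHT:  # crossed the wall below its top
            p[0] = WALL_X
            v[0] = -v[0] * WALL_RESTITUTION         # rebound off the virtual wall
        positions.append(p.copy())
    return np.array(positions)

# example: throw from shoulder height at roughly 14 m/s, about 35 degrees upward
trajectory = predict_positions(p0=[0.0, 0.0, 1.6], v0=[11.5, 0.5, 8.0])
print("final resting point:", trajectory[-1].round(2))
```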
S303: and according to the expected position coordinates, confirming the explosion position of the simulated grenade in the virtual reality scene based on the preset explosion time delay, generating a throwing result based on the explosion position, and displaying the striking effect in the virtual reality scene.
It will be appreciated that, when a real grenade is thrown, the thrower pulls the safety pin and throws the grenade, and the grenade explodes after a certain time, which may be called the explosion time delay. The simulated grenade likewise realizes an explosion time delay, namely the preset explosion time delay, through its controller. For example, at the moment the simulated grenade switches from the safety-locked state to the safety-released state, it emits a first-stage excitation signal; after receiving this signal, the controller controls the simulated grenade to be excited, that is, to simulate an explosion, within a preset time range. It should be noted that the preset time range may be random within a certain interval, for example between 3 and 5 seconds, so that the simulated grenade is excited at a random moment; correspondingly, the preset explosion time delay may be any duration between 3 and 5 seconds. Based on the preset explosion time delay, the specific position of the simulated grenade in the space coordinate system at the moment of explosion can be determined, and the corresponding striking effect is generated, as sketched below.
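The explosion timing can be layered on top of the trajectory prediction: the sketch below draws a random preset explosion delay in the 3-5 second window mentioned above and reads off the expected position at that moment. For simplicity the delay is measured from the start of the predicted trajectory, and the sampling step and stand-in trajectory data are illustrative.

```python
import random

def explosion_position(expected_positions, dt=0.01, delay_range=(3.0, 5.0), seed=None):
    """Draw a preset explosion time delay in the given window and return the expected
    coordinates of the simulated grenade at that moment."""
    rng = random.Random(seed)
    delay = rng.uniform(*delay_range)                    # preset explosion time delay
    index = min(int(delay / dt), len(expected_positions) - 1)
    return delay, expected_positions[index]

# stand-in trajectory: one expected (x, y, z) coordinate every 10 ms for 6 s;
# in the full system this list would come from the tracking model
dummy_trajectory = [
    (2.5 * t, 0.0, 1.6 + 8.0 * t - 4.9 * t * t) if t <= 1.8 else (2.5 * t, 0.0, 0.0)
    for t in (i * 0.01 for i in range(600))
]
delay, where = explosion_position(dummy_trajectory, seed=42)
print(f"simulated explosion after {delay:.1f} s at x = {where[0]:.1f} m")
```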
It should be noted that, when the simulated grenade hits a wall or fails to clear it, the trainee must still take emergency cover; the trainee's posture also needs to be detected at that moment, and the standard human body posture for emergency cover can be stored in the gesture database in advance.
Specifically, the conditions that require emergency cover, that is, the conditions that trigger it, can be preset; they may include the grenade falling on the trainee's side of the shelter trench, the grenade falling on the ground within a threshold distance of the trainee, and so on. Correspondingly, when the grenade falls on one side of the shelter trench, the standard human body posture is to jump immediately into the other side of the trench, away from the grenade, to take cover; when the grenade falls on the ground within a threshold distance of the trainee, the standard human body posture is to jump immediately into a trench away from the grenade to take cover.
It should be noted that the method of the embodiments of the present disclosure may be performed by a single device, such as a computer or a server. The method of the embodiment can also be applied to a distributed scene, and is completed by mutually matching a plurality of devices. In the case of such a distributed scenario, one of the devices may perform only one or more steps of the methods of embodiments of the present disclosure, the devices interacting with each other to accomplish the methods.
It should be noted that the foregoing describes some embodiments of the present disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
Example 2
Referring to fig. 4, the embodiment of the present invention further provides an XR-based simulated grenade training system, including:
the simulated grenade display module 401 is configured to construct a virtual reality scene, obtain state information and position information of a simulated grenade in the virtual reality scene, and display a real-time image of the simulated grenade in the virtual reality scene according to the state information and the position information;
a target gesture set obtaining module 402, configured to obtain a target gesture set corresponding to a state of the simulated grenade from a preset gesture database, where a preset standard human body gesture is stored in the target gesture set of the preset gesture database;
the gesture set generating module 403 is configured to generate the gesture set for each simulated grenade state according to a preset mapping relationship between the standard human gestures and the states of the simulated grenade;
the gesture matching module 404 is configured to obtain first body gesture data of the trained object, match the first body gesture data with the target gesture set, and generate a gesture matching result;
the motion state analysis module 405 is configured to obtain a coordinate parameter of the simulated grenade in a preset space coordinate system and a throwing parameter of the simulated grenade in response to a signal that the simulated grenade is switched to a simulated excitation state, and calculate a motion state of the simulated grenade in the preset space coordinate system according to the throwing parameter;
The throwing tracking module 406 is configured to predict a trajectory of the simulated grenade by using a tracking model, obtain a throwing result of the simulated grenade in the virtual reality scene, and generate a striking effect of the simulated grenade in the virtual reality scene;
and the training result generating module 407 is configured to generate a training result of the simulated grenade by using the gesture matching result and the throwing result.
In one possible embodiment, in the simulated grenade display module 401, the state information of the simulated grenade includes a safety-locked state, a safety-released state, a to-be-excited state, and a simulated excitation state;
and the vertical acceleration and the lateral acceleration of the simulated grenade in the safety-locked state and the safety-released state are acquired to judge whether the simulated grenade has been dropped.
In a possible embodiment, in the gesture set generating module 403, the gesture set for a simulated grenade state includes the standard human gestures of the actions the trained object performs when the simulated grenade is in that state; the types of the standard human body gestures include a trunk gesture, a left hand gesture and a right hand gesture;
the gesture matching module 404 includes:
a target human body posture extraction submodule 4041, configured to use a standard human body posture corresponding to the first human body posture data as a target human body posture;
An approximation judging submodule 4042, configured to judge whether a first approximation degree of the first body posture data and the target body posture is smaller than a first preset threshold; if the first approximation degree of the first body posture data and the target body posture is smaller than a first preset threshold value, the posture matching result is qualified; if the first approximation degree of the first body posture data and the target body posture is not smaller than a first preset threshold value, the posture matching result is unqualified.
In one possible embodiment, the gesture matching module 404 further includes:
a gesture data splitting submodule 4043, configured to split the first body gesture data into a current torso gesture, a current left hand gesture, and a current right hand gesture;
an approximation calculation submodule 4044 configured to calculate, respectively, a second approximation degree between the current torso pose and the torso pose of the target body pose, a third approximation degree between the current left hand pose and the left hand pose of the target body pose, and a fourth approximation degree between the current right hand pose and the right hand pose of the target body pose;
the approximation judging submodule 4042 is further configured to judge the gesture matching result as qualified if the second approximation degree, the third approximation degree and the fourth approximation degree all meet their corresponding preset conditions;
and to judge the gesture matching result as unqualified if at least one of the second approximation degree, the third approximation degree and the fourth approximation degree does not meet its corresponding preset condition.
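As a minimal, non-limiting sketch of the part-wise approximation check described above, the Python fragment below compares a captured gesture against a target gesture per body part; the joint index groups, the mean-distance metric and the threshold parameters are assumptions chosen for illustration, not requirements of the embodiment.

```python
import numpy as np

# Hypothetical joint index groups; the real indices depend on the skeleton
# layout produced by the pose detector and are assumptions for illustration.
TRUNK_JOINTS = [0, 1, 8, 11]
LEFT_HAND_JOINTS = [5, 6, 7]
RIGHT_HAND_JOINTS = [2, 3, 4]


def approximation_degree(current: np.ndarray, target: np.ndarray, joints: list[int]) -> float:
    """Mean joint-to-joint distance over one body part (smaller = closer)."""
    return float(np.linalg.norm(current[joints] - target[joints], axis=1).mean())


def gesture_matching_result(current: np.ndarray, target: np.ndarray,
                            trunk_thr: float, left_thr: float, right_thr: float) -> bool:
    """Qualified only if the trunk, left-hand and right-hand parts all meet
    their preset conditions, mirroring the approximation judging submodule."""
    second = approximation_degree(current, target, TRUNK_JOINTS)
    third = approximation_degree(current, target, LEFT_HAND_JOINTS)
    fourth = approximation_degree(current, target, RIGHT_HAND_JOINTS)
    return second < trunk_thr and third < left_thr and fourth < right_thr
```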
In one possible embodiment, the body joints of the trained object are captured by using the OpenPose human body posture detection algorithm, and the human body gesture of the trained object is generated by using the skeleton connection lines between the body joints.
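The fragment below is only an illustrative sketch of drawing skeleton connection lines between detected joints with OpenCV; how the keypoints are obtained (for example, from an OpenPose-style detector) is outside the sketch, and the skeleton edge list is a hypothetical example rather than the actual keypoint layout.

```python
import cv2
import numpy as np

# Hypothetical pairs of joint indices to connect; the real pairs depend on the
# keypoint layout of the pose detector that is used.
SKELETON_EDGES = [(0, 1), (1, 2), (2, 3), (1, 5), (5, 6)]


def draw_skeleton(frame: np.ndarray, keypoints: np.ndarray) -> np.ndarray:
    """Draw skeleton connection lines between body joints on a video frame.

    `keypoints` is an (N, 2) array of pixel coordinates of detected joints.
    """
    out = frame.copy()
    for a, b in SKELETON_EDGES:
        pt_a = tuple(int(v) for v in keypoints[a])
        pt_b = tuple(int(v) for v in keypoints[b])
        cv2.line(out, pt_a, pt_b, color=(0, 255, 0), thickness=2)
    return out
```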
In one possible embodiment, the method further comprises:
a squat gesture analysis module 408, configured to detect a squat gesture of the trained object and obtain second body gesture data, where the second body gesture data includes a first height of the trained object when the simulated grenade is in the installation locked state and/or the disarmed state, and a second height of the trained object when the simulated grenade is switched to the simulated excitation state;
the gesture matching result in the gesture matching module 404 further includes a squat result; if the difference between the first height and the second height exceeds a preset height difference, the squat result is judged to be qualified.
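A one-line check suffices for the squat judgment just described; in the sketch below the 0.35 m default height difference is an assumed, illustrative value rather than one specified by the embodiment.

```python
def squat_result_qualified(first_height: float, second_height: float,
                           preset_height_difference: float = 0.35) -> bool:
    """Qualified when the trained object's measured height drops by more than
    the preset height difference between the two sampled grenade states."""
    return (first_height - second_height) > preset_height_difference
```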
In one possible embodiment, in the motion state analysis module 405, the throwing parameters include acceleration, angular velocity, and motion direction;
The pitch tracking module 406 includes:
a virtual wall and ground processing submodule 4061, configured to add a virtual wall parameter and a virtual ground parameter in the preset space coordinate system based on a preset collision parameter library, construct a virtual wall in the virtual reality scene by using the virtual wall parameter, construct a virtual ground in the virtual reality scene by using the virtual ground parameter, and store the virtual wall parameter and the virtual ground parameter by using the collision parameter library;
an expected position coordinate analysis submodule 4062, configured to calculate an expected position coordinate of the simulated grenade at each time point in the preset space coordinate system within a preset time period based on the coordinate parameter, the throwing parameter, the virtual wall parameter, and the virtual ground parameter;
and the throwing tracking simulation submodule 4063 is used for confirming the explosion position of the simulated grenade in the virtual reality scene based on the preset explosion time delay according to the expected position coordinate, generating a throwing result of the simulated grenade based on the explosion position, and displaying the striking effect in the virtual reality scene.
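As a simplified, non-authoritative sketch of the expected-position calculation and explosion-delay logic above, the following Python point-mass integration steps the simulated grenade through the preset space coordinate system; air drag, wall rebound and the full collision parameter library are omitted, and the fuse delay, time step and ground height are illustrative assumptions.

```python
import numpy as np


def predict_throw(p0: np.ndarray, v0: np.ndarray,
                  ground_z: float = 0.0,
                  fuse_delay: float = 3.5,
                  dt: float = 0.02,
                  horizon: float = 6.0) -> tuple[np.ndarray, np.ndarray]:
    """Return (explosion position, sampled expected positions) for one throw.

    p0 and v0 are the release position and velocity in the preset space
    coordinate system; gravity is the only force modelled in this sketch.
    """
    g = np.array([0.0, 0.0, -9.81])
    pos, vel = p0.astype(float).copy(), v0.astype(float).copy()
    samples = [pos.copy()]
    t = 0.0
    while t < horizon:
        vel = vel + g * dt
        pos = pos + vel * dt
        if pos[2] <= ground_z:      # virtual ground contact: rest on the ground
            pos[2] = ground_z
            vel[:] = 0.0
        samples.append(pos.copy())
        t += dt
        if t >= fuse_delay:         # burst at the preset explosion time delay
            break
    return pos, np.asarray(samples)
```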
In one possible embodiment, the simulated grenade training system may further include a health monitoring module, which may be used to monitor the blood pressure and heart rate of the trained object in real time, feed back the psychological and physiological condition of the trained object in time, and prevent accidents.
It should be noted that, because the information interaction and execution processes between the modules of the above system are based on the same concept as the method embodiments of the present invention, the technical effects are the same as those of the method embodiments; for details, reference may be made to the description of the foregoing method embodiments, which is not repeated herein.
Example 3
Referring to fig. 5, a schematic structural diagram of an electronic device corresponding to an XR-based simulated grenade training method according to an embodiment of the present invention is shown. As shown in fig. 5, the electronic device of this embodiment includes: at least one processor 50 (only one shown in fig. 5), a memory 51, and a computer program stored in the memory 51 and executable on the at least one processor 50, the processor 50 implementing the steps in any of the various method embodiments described above when executing the computer program.
The electronic device may include, but is not limited to, the processor 50 and the memory 51. It will be appreciated by those skilled in the art that fig. 5 is merely an example of an electronic device and does not constitute a limitation; the electronic device may include more or fewer components than shown, a combination of certain components, or different components, and may further include, for example, input/output devices and network access devices.
The processor 50 may be a central processing unit (Central Processing Unit, CPU), and may also be another general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field Programmable Gate Array, FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
For the specific implementation process of the processor 50, reference may be made to the above method embodiments; the implementation principle and technical effects are similar and are not described herein again.
In some embodiments, the memory 51 may be an internal storage unit of the electronic device, such as an internal memory of the electronic device. In other embodiments, the memory 51 may also be an external storage device of the electronic device, such as a plug-in hard disk, a smart media card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, or a flash card (Flash Card). Further, the memory 51 may include both an internal storage unit and an external storage device of the electronic device. The memory 51 is used to store an operating system, application programs, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 51 may also be used to temporarily store data that has been output or is to be output.
Example 4
The embodiments of the present invention also provide a computer readable storage medium storing a computer program, which when executed by a processor, implements the steps of the above-described method embodiments.
The computer readable storage medium described above may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, or optical disk. A readable storage medium can be any available medium that can be accessed by a general purpose or special purpose computer.
An exemplary readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the readable storage medium. Alternatively, the readable storage medium may be integral to the processor. The processor and the readable storage medium may reside in an application-specific integrated circuit (Application Specific Integrated Circuits, ASIC for short). The processor and the readable storage medium may also reside as discrete components in the electronic device described above.
Those of ordinary skill in the art will appreciate that all or part of the steps for implementing the above method embodiments may be completed by program instructions controlling the relevant hardware. The foregoing program may be stored in a computer-readable storage medium; when executed, the program performs the steps of the above method embodiments; and the aforementioned storage medium includes various media that can store program code, such as a ROM, a RAM, a magnetic disk, or an optical disk.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/network device and method may be implemented in other manners. For example, the apparatus/network device embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions in actual implementation, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection via interfaces, devices or units, which may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.

Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This invention is intended to cover any variations, uses, or adaptations of the invention that follow, in general, the principles of the invention and that include such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the invention being indicated by the following claims.
It is to be understood that the invention is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the invention is limited only by the appended claims.

Claims (10)

1. The XR-based simulated grenade training method is characterized by comprising the following steps of:
constructing a virtual reality scene, acquiring state information and position information of a simulated grenade in the virtual reality scene, and displaying a real-time image corresponding to the simulated grenade in the virtual reality scene according to the state information and the position information;
acquiring a target gesture set corresponding to the state of the simulated grenade from a preset gesture database, wherein the preset gesture database stores preset standard human body postures, and the gesture set of the simulated grenade state is generated according to a preset mapping relationship between the standard human body posture and the state of the simulated grenade;
acquiring first body posture data of a trained object, and matching the first body posture data with the target gesture set to generate a gesture matching result;
responding to a signal of switching the simulated grenade to a simulated excitation state, acquiring a coordinate parameter of the simulated grenade in a preset space coordinate system and a throwing parameter of the simulated grenade, and calculating a motion state of the simulated grenade in the preset space coordinate system through the throwing parameter;
Predicting the track of the simulated grenade by using a tracking model, obtaining a throwing result of the simulated grenade in the virtual reality scene, and generating a striking effect of the simulated grenade in the virtual reality scene;
and generating a training result of the simulated grenade by using the gesture matching result and the throwing result.
2. The XR-based simulated grenade training method of claim 1, wherein the state information of the simulated grenade comprises an installation locked state, a disarmed state, a to-be-excited state, and a simulated excitation state;
and the vertical acceleration and the lateral acceleration of the simulated grenade are acquired in the installation locked state and the disarmed state, so as to judge whether the simulated grenade has been dropped.
3. The XR-based simulated grenade training method of claim 1, wherein the gesture set of the simulated grenade state comprises standard human body postures for the trained object to perform actions when the simulated grenade is in a set state; the types of the standard human body postures comprise a trunk posture, a left-hand posture and a right-hand posture;
and matching the first body posture data with the target gesture set comprises:
taking the standard human body posture corresponding to the first body posture data as a target human body posture;
judging whether a first approximation degree between the first body posture data and the target human body posture is smaller than a first preset threshold;
if the first approximation degree is smaller than the first preset threshold, the gesture matching result is qualified;
if the first approximation degree is not smaller than the first preset threshold, the gesture matching result is unqualified.
4. The XR-based simulated grenade training method of claim 3, wherein generating the gesture matching result comprises:
splitting the first body posture data into a current trunk posture, a current left-hand posture and a current right-hand posture;
respectively calculating a second approximation degree between the current trunk posture and the trunk posture of the target human body posture, a third approximation degree between the current left-hand posture and the left-hand posture of the target human body posture, and a fourth approximation degree between the current right-hand posture and the right-hand posture of the target human body posture;
if the second approximation degree, the third approximation degree and the fourth approximation degree all accord with corresponding preset conditions, judging the gesture matching result as qualified;
And if at least one of the second approximation degree, the third approximation degree and the fourth approximation degree does not accord with the corresponding preset condition, judging the gesture matching result as unqualified.
5. The XR-based simulated grenade training method of claim 1, wherein the body joints of the trained object are captured by using the OpenPose human body posture detection algorithm; and the human body posture of the trained object is generated by using the skeleton connection lines between the body joints.
6. The XR-based simulated grenade training method of claim 2, further comprising:
detecting a squat posture of the trained object, and acquiring second body posture data, wherein the second body posture data comprises a first height of the trained object when the simulated grenade is in the installation locked state and/or the disarmed state, and a second height of the trained object when the simulated grenade is switched to the simulated excitation state;
and the gesture matching result further comprises a squat result; if the difference between the first height and the second height exceeds a preset height difference, the squat result is judged to be qualified.
7. The XR-based simulated grenade training method of claim 2, wherein the throwing parameters comprise acceleration, angular velocity, and direction of motion;
Predicting the track of the simulated grenade by using a tracking model to obtain a throwing result of the simulated grenade in the virtual reality scene, wherein the method comprises the following steps:
adding virtual wall parameters and virtual ground parameters in the preset space coordinate system based on a preset collision parameter library, constructing a virtual wall in the virtual reality scene through the virtual wall parameters, constructing a virtual ground in the virtual reality scene through the virtual ground parameters, and storing the virtual wall parameters and the virtual ground parameters through the collision parameter library;
calculating expected position coordinates of each time point of the simulated grenade in a preset time period in the preset space coordinate system based on the coordinate parameters, the throwing parameters, the virtual wall parameters and the virtual ground parameters;
and according to the expected position coordinates, confirming the explosion position of the simulated grenade in the virtual reality scene based on the preset explosion time delay, generating a throwing result of the simulated grenade based on the explosion position, and displaying a striking effect in the virtual reality scene.
8. An XR-based simulated grenade training system, comprising:
the simulated grenade display module is used for constructing a virtual reality scene, acquiring state information and position information of a simulated grenade in the virtual reality scene, and displaying a real-time image corresponding to the simulated grenade in the virtual reality scene according to the state information and the position information;
the target gesture set acquisition module is used for acquiring a target gesture set corresponding to the state of the simulated grenade from a preset gesture database, wherein the preset gesture database stores preset standard human body gestures;
the gesture set generation module is used for generating the gesture set of the simulated grenade state according to a preset mapping relationship between the standard human body gesture and the state of the simulated grenade;
the gesture matching module is used for acquiring first body gesture data of the trained object, matching the first body gesture data with the target gesture set and generating a gesture matching result;
the motion state analysis module is used for responding to the signal of switching the simulation grenade to the simulation excitation state, acquiring the coordinate parameters of the simulation grenade in a preset space coordinate system and the throwing parameters of the simulation grenade, and calculating the motion state of the simulation grenade in the preset space coordinate system through the throwing parameters;
The throwing tracking module is used for predicting the track of the simulated grenade by utilizing a tracking model, obtaining the throwing result of the simulated grenade in the virtual reality scene and generating the striking effect of the simulated grenade in the virtual reality scene;
and the training result generation module is used for generating the training result of the simulated grenade by utilizing the gesture matching result and the throwing result.
9. An electronic device, comprising: a processor, and a memory communicatively coupled to the processor; the memory stores computer-executable instructions; the computer-executable instructions stored in the memory are executed by the processor to implement the XR-based simulated grenade training method of any one of claims 1 to 7.
10. A computer readable storage medium having stored therein computer executable instructions for implementing the XR-based simulated grenade training method of any one of claims 1 to 7 when executed by a processor.
CN202311383090.2A 2023-10-24 2023-10-24 XR-based simulated grenade training method, system, equipment and storage medium Pending CN117392892A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311383090.2A CN117392892A (en) 2023-10-24 2023-10-24 XR-based simulated grenade training method, system, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311383090.2A CN117392892A (en) 2023-10-24 2023-10-24 XR-based simulated grenade training method, system, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117392892A true CN117392892A (en) 2024-01-12

Family

ID=89438697

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311383090.2A Pending CN117392892A (en) 2023-10-24 2023-10-24 XR-based simulated grenade training method, system, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117392892A (en)


Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20160035718A (en) * 2014-09-23 2016-04-01 에이알비전 (주) Training system for grenade mock throwing
KR20180115147A (en) * 2017-04-12 2018-10-22 주식회사 케이에스아이티 Simulation method for throwing hand grenade by using Virtual Reality technology
CN107806795A (en) * 2017-11-30 2018-03-16 中国人民解放军国防科技大学 Controllable safety training grenade
CN108180782A (en) * 2017-12-28 2018-06-19 郑州巍瀚信息科技有限公司 A kind of individual training system based on virtual reality
CN111473687A (en) * 2020-04-29 2020-07-31 武汉和盾防务装备科技有限公司 Torpedo training system and running method thereof
CN111833439A (en) * 2020-07-13 2020-10-27 郑州胜龙信息技术股份有限公司 Artificial intelligence-based ammunition throwing analysis and mobile simulation training method
CN112402946A (en) * 2020-11-20 2021-02-26 腾讯科技(深圳)有限公司 Position acquisition method, device, equipment and storage medium in virtual scene
CN112464919A (en) * 2021-01-28 2021-03-09 长沙鹏阳信息技术有限公司 Smart safety monitoring method for grenade throwing training
CN219285852U (en) * 2022-11-29 2023-06-30 燧光科技(北京)有限公司 Mixed reality's of virtual construction scene grenade training analog device of throwing
CN117316016A (en) * 2023-10-24 2023-12-29 中国人民解放军国防科技大学 Simulation grenade safety training system, method, equipment and storage medium
CN117437820A (en) * 2023-10-24 2024-01-23 中国人民解放军国防科技大学 Digital twinning-based simulated grenade training method, system, equipment and storage medium

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117934782A (en) * 2024-03-20 2024-04-26 虚拟现实(深圳)智能科技有限公司 Construction method, device, equipment and storage medium of XR (X-ray) augmented reality scene
CN117934782B (en) * 2024-03-20 2024-05-31 虚拟现实(深圳)智能科技有限公司 Construction method, device, equipment and storage medium of XR (X-ray) augmented reality scene

Similar Documents

Publication Publication Date Title
US10388053B1 (en) System for seamless animation transition
CN102331840B (en) User selection and navigation based on looped motions
CN117392892A (en) XR-based simulated grenade training method, system, equipment and storage medium
CN101438121A (en) Instructor-lead training environment and interfaces therewith
US20210366183A1 (en) Glitch detection system
KR102231586B1 (en) Situation Coping Training Simulator and Driving Method Thereof, and Computer Readable Recording Medium
CN109063845B (en) Deep learning method based on generated samples and robot system
Paduraru et al. Rivergame-a game testing tool using artificial intelligence
CN117437820A (en) Digital twinning-based simulated grenade training method, system, equipment and storage medium
CN117316016A (en) Simulation grenade safety training system, method, equipment and storage medium
Pu et al. Orientation and decision-making for soccer based on sports analytics and AI: A systematic review
WO2022060241A1 (en) Interactive training device for carrying out training with the aid of virtual reality
US20090088246A1 (en) Interactive sound synthesis
Mahmoud et al. Believable NPCs in serious games: HTN planning approach based on visual perception
US20230214007A1 (en) Virtual reality de-escalation tool for delivering electronic impulses to targets
EP4370862A1 (en) Personalized combat simulation equipment
Wallace et al. Realism in modeling and simulation with implications for virtual reality, augmented reality, and immersive environments
Fountas Spiking neural networks for human-like avatar control in a simulated environment
Chi et al. Simulated casualties and medics for emergency training
Miene et al. Interpretation of spatio-temporal relations in real-time and dynamic environments
Rubak Imitation Learning with the Unity Machine Learning Agents Toolkit
US11645932B2 (en) Machine learning-aided mixed reality training experience identification, prediction, generation, and optimization system
KR102421092B1 (en) Virtual training apparatus for recognizing training movement and virtual training system
Sabilirrasyad et al. Jamarat ritual simulation with Myo armband for precise throws speed
Egly et al. Decision making for MiroSOT soccer playing robots

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination