CN106548675A - Virtual military training method and device - Google Patents
- Publication number
- CN106548675A CN201610981497.9A CN201610981497A
- Authority
- CN
- China
- Prior art keywords
- training
- data
- military
- dimensional data
- personnel
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
- G09B9/003—Simulators for teaching or training purposes for military purposes and tactics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/50—Information retrieval; Database structures therefor; File system structures therefor of still image data
- G06F16/58—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
- G06F16/583—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content
- G06F16/5838—Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually using metadata automatically derived from the content using colour
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Library & Information Science (AREA)
- Business, Economics & Management (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- General Engineering & Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
The embodiments of the present invention provide a virtual military training method and device. The method includes: acquiring three-dimensional data of the trainee's body, of the training ground, and of the training equipment, together with RGB image data and depth image data of the trainee; building a corresponding human body model, training ground model, and training equipment model from the three kinds of three-dimensional data; performing posture analysis on the acquired RGB image data and depth image data to obtain kinematic data of the human body; performing virtual reality motion simulation from the kinematic data, based on the human body model, training ground model, and training equipment model; and comparing the training actions in the virtual reality motion simulation with the standard template actions in a preset motion standard template database to obtain a training analysis result.
Description
Technical field
The present invention relates to computer vision technology, and more particularly to a virtual military training method and device.
Background technology
With the development of science and technology, military training is gradually entering the information age. At present, trainees sometimes wear special equipment such as sensing helmets, data gloves, and operating sticks to interact with a training system and improve their level of training.
However, existing military training methods have the following shortcomings: special equipment such as the sensing helmet and data gloves is expensive, heavy, and inconvenient, and it is difficult for trainees to experience a true, natural sensation. As a result, training efficiency is low, the training effect is limited, and such methods are difficult to apply widely.
Summary of the invention
It is an object of the present invention to provide a virtual military training method and device, so as to realize a virtual reality military training process, improve training efficiency and training effect, and allow wide application.
According to an aspect of the present invention, a virtual military training method is provided. The method includes: acquiring three-dimensional data of the trainee's body, of the training ground, and of the training equipment, together with RGB image data and depth image data of the trainee; building a corresponding human body model, training ground model, and training equipment model from the three kinds of three-dimensional data; performing posture analysis on the acquired RGB image data and depth image data to obtain kinematic data of the human body; performing virtual reality motion simulation from the kinematic data, based on the human body model, training ground model, and training equipment model; and comparing the training actions in the virtual reality motion simulation with the standard template actions in a preset motion standard template database to obtain a training analysis result.
Preferably, acquiring the three-dimensional data of the trainee, of the training ground, and of the training equipment includes:
acquiring the three-dimensional data of the trainee, of the training ground, and of the training equipment through a Kinect somatosensory device.
Preferably, acquiring the RGB image data and depth image data of the trainee includes:
acquiring the RGB image data and depth image data of the trainee through a Kinect somatosensory device.
Preferably, comparing the training actions in the virtual reality motion simulation with the standard template actions in the preset motion standard template database to obtain the training analysis result includes:
with the observation viewpoint and observation angle kept consistent, synchronously superimposing the training actions and the standard template actions to obtain the training analysis result.
Preferably, the method further includes:
obtaining training data of the trainee according to the training analysis result; and
storing the obtained training data in a military trainee database.
According to a further aspect of the present invention, a virtual military training device is provided. The device includes: a data acquisition module for acquiring three-dimensional data of the trainee's body, of the training ground, and of the training equipment, together with RGB image data and depth image data of the trainee; a model building module for building a corresponding human body model, training ground model, and training equipment model from the three kinds of three-dimensional data; a posture analysis module for performing posture analysis on the acquired RGB image data and depth image data to obtain kinematic data of the human body; a motion simulation module for performing virtual reality motion simulation from the kinematic data, based on the human body model, training ground model, and training equipment model; and a training analysis module for comparing the training actions in the virtual reality motion simulation with the standard template actions in a preset motion standard template database to obtain a training analysis result.
Preferably, the data acquisition module acquires the three-dimensional data of the trainee, of the training ground, and of the training equipment through a Kinect somatosensory device.
Preferably, the data acquisition module acquires the RGB image data and depth image data of the trainee through a Kinect somatosensory device.
Preferably, with the observation viewpoint and observation angle kept consistent, the training analysis module synchronously superimposes the training actions and the standard template actions to obtain the training analysis result.
Preferably, the device further includes:
a training data acquisition module for obtaining training data of the trainee according to the training analysis result; and
a data storage module for storing the obtained training data in a military trainee database.
According to the virtual military training method and device provided by the embodiments of the present invention, three-dimensional data of the trainee's body, of the training ground, and of the training equipment are acquired, and on this basis a corresponding human body model, training ground model, and training equipment model are built. Posture analysis is performed on the acquired RGB image data and depth image data of the trainee to obtain kinematic data of the human body. Based on the human body model, training ground model, and training equipment model, virtual reality motion simulation is then performed from the kinematic data, and the training actions in the simulation are compared with the standard template actions in a preset motion standard template database to obtain a training analysis result. A virtual military training process is thereby realized, improving training efficiency and training effect and allowing wide application.
Description of the drawings
Fig. 1 is a flow chart illustrating a virtual military training method according to Embodiment one of the present invention;
Fig. 2 is a flow chart illustrating a virtual military training method according to Embodiment two of the present invention;
Fig. 3 is a logic block diagram illustrating a virtual military training device according to Embodiment three of the present invention;
Fig. 4 is a logic block diagram illustrating a virtual military training device according to Embodiment four of the present invention.
Specific embodiments
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings (in the drawings, identical labels denote identical elements) and the embodiments. The following embodiments are intended to illustrate the present invention, not to limit its scope.
It will be understood by those skilled in the art that terms such as "first" and "second" in the present invention are used only to distinguish different steps, devices, or modules; they neither carry any particular technical meaning nor indicate a necessary logical order between them.
Embodiment one
Fig. 1 is a flow chart illustrating a virtual military training method according to Embodiment one of the present invention. The method may be performed by a device as shown in Fig. 3.
Referring to Fig. 1, in step S110, three-dimensional data of the trainee's body, of the training ground, and of the training equipment are acquired, together with RGB image data and depth image data of the trainee.
Specifically, a somatosensory interaction device such as a Kinect is used to capture information about the trainee, the training ground, and the training equipment, completing the input of three-dimensional data and providing a data source for the subsequent modeling process. At the same time, the Kinect somatosensory device captures RGB image data and depth image data of the trainee, providing the data basis for the subsequent posture analysis process.
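The patent does not detail how depth pixels are turned into three-dimensional coordinates. As an illustrative sketch, the standard pinhole back-projection below uses assumed intrinsics roughly matching a Kinect v1 depth camera; the `fx`, `fy`, `cx`, `cy` values are placeholders, not taken from the patent, and a real system would use calibrated values.

```python
def depth_to_point(u, v, depth_mm, fx=571.0, fy=571.0, cx=320.0, cy=240.0):
    """Back-project a depth pixel (u, v) with depth in millimetres into
    camera-space coordinates (metres), using a pinhole camera model.
    The intrinsics are assumed, approximating a Kinect v1 depth camera."""
    z = depth_mm / 1000.0   # millimetres -> metres
    x = (u - cx) * z / fx   # horizontal pixel offset scaled by depth
    y = (v - cy) * z / fy   # vertical pixel offset scaled by depth
    return (x, y, z)
```

Applied per pixel over a depth frame, this yields the point cloud from which the scene's three-dimensional data are assembled.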
In step S120, a corresponding human body model, training ground model, and training equipment model are built from the three-dimensional data of the body, of the training ground, and of the training equipment, respectively.
In practical applications, software such as 3ds Max can be used to build models of the human body, training ground, and training equipment in the scene, and textures can be added to increase the models' sense of depth and realism.
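As an illustration of what this modeling step produces, the minimal structure below stands in for a textured model exported from a tool such as 3ds Max; the field layout (vertices, faces, per-vertex UV coordinates) is an assumption for exposition, not part of the patent.

```python
class TexturedMesh:
    """Minimal stand-in for a textured 3-D model: vertex positions,
    triangle indices, and per-vertex texture (UV) coordinates.
    A real pipeline would load these from an exported model file."""

    def __init__(self):
        self.vertices = []  # (x, y, z) positions
        self.uvs = []       # (u, v) texture coordinates, one per vertex
        self.faces = []     # (i, j, k) indices into self.vertices

    def add_vertex(self, xyz, uv):
        self.vertices.append(xyz)
        self.uvs.append(uv)
        return len(self.vertices) - 1  # index for building faces

    def add_face(self, i, j, k):
        self.faces.append((i, j, k))
```

The human body, training ground, and training equipment models would each be represented by such a mesh (or a hierarchy of them) inside the virtual scene.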
In step S130, posture analysis is performed on the acquired RGB image data and depth image data to obtain kinematic data of the human body.
Since the RGB image data and depth image data of the trainee are captured with the Kinect somatosensory device as described above, real-time full-body and skeleton tracking can be supported, a series of trainee actions can be recognized, and the kinematic data of the human body, including posture parameters and various kinematic parameters, can be obtained. These data then drive the motion of the human body model (i.e. the virtual human) and control its motion posture.
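Among the kinematic parameters mentioned here are joint angles. One standard way to derive an angle from three tracked joint positions (for instance shoulder, elbow, wrist) is the vector angle at the middle joint; the sketch below assumes 3-D joint tuples of the kind produced by skeleton tracking.

```python
import math

def joint_angle(a, b, c):
    """Angle (degrees) at joint b, formed by segments b->a and b->c,
    given three 3-D joint positions such as shoulder, elbow, wrist."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(v1[i] * v2[i] for i in range(3))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))
```

A fully extended arm gives an angle near 180 degrees; a right-angle bend gives 90.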
In step S140, virtual reality motion simulation is performed from the kinematic data, based on the human body model, training ground model, and training equipment model.
The virtual human is driven by the kinematic data obtained through the posture analysis in the step above. Because the captured kinematic data reproduce the real motion of the human body, the virtual reality motion simulation can be carried out using the established human body, training ground, and training equipment models together with the obtained kinematic data. The motion of the virtual human is thus a copy of the human motion, and the simulation effect is highly lifelike.
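Driving the virtual human from kinematic data amounts to forward kinematics: joint angles determine limb positions. A minimal planar sketch for one avatar arm follows; the link lengths and the two-angle parameterization are illustrative assumptions, not values from the patent.

```python
import math

def arm_positions(shoulder_deg, elbow_deg, upper_len=0.3, fore_len=0.25):
    """Planar forward kinematics for one avatar arm: given the shoulder
    angle (measured from the x-axis) and the elbow flexion angle
    (relative to the upper arm), return elbow and wrist positions."""
    t1 = math.radians(shoulder_deg)
    elbow = (upper_len * math.cos(t1), upper_len * math.sin(t1))
    t2 = t1 + math.radians(elbow_deg)  # forearm direction accumulates
    wrist = (elbow[0] + fore_len * math.cos(t2),
             elbow[1] + fore_len * math.sin(t2))
    return elbow, wrist
```

Feeding the per-frame joint angles recovered by posture analysis into such a mapping, joint by joint, is what makes the virtual human mirror the trainee's real motion.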
In step S150, the training actions in the virtual reality motion simulation are compared with the standard template actions in a preset motion standard template database to obtain a training analysis result.
It should be noted that, for different training subjects, motion data can be gathered and re-modeled according to the physical characteristics of each trainee, so that a standard template action is built specifically for the virtual human corresponding to each trainee. The motion standard template database mentioned above is established in advance in this way.
In a concrete implementation, the training action of each virtual human and the standard template action are kept at the same observation viewpoint and angle and displayed in synchronized, superimposed contrast, showing action differences intuitively so that technical defects can be found and improvement plans formulated, thereby raising the trainee's technical level. Moreover, according to each trainee's individual characteristics and motor capacity, an optimal technique can be derived through comparative analysis, providing guidance for improving the motion standard template database and achieving personalized, optimized military training.
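The comparison against the standard template can be sketched as a per-frame joint-angle deviation check. The dict-of-angles frame format and the 15-degree tolerance below are illustrative assumptions; the patent does not specify a metric.

```python
def score_against_template(trainee_frames, template_frames, tol_deg=15.0):
    """Compare per-frame joint angles (dicts of joint name -> degrees)
    of a trainee against a standard template captured at the same frame
    rate; report mean deviation per joint and the joints whose mean
    deviation exceeds the tolerance."""
    totals, counts = {}, {}
    for got, want in zip(trainee_frames, template_frames):
        for joint, angle in want.items():
            if joint in got:
                totals[joint] = totals.get(joint, 0.0) + abs(got[joint] - angle)
                counts[joint] = counts.get(joint, 0) + 1
    mean_dev = {j: totals[j] / counts[j] for j in totals}
    flagged = sorted(j for j, d in mean_dev.items() if d > tol_deg)
    return mean_dev, flagged
```

The flagged joints would point the trainee at the specific limb positions that deviate most from the standard action.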
A specific processing example is described below to illustrate the application of the embodiment of the present invention more intuitively.
The 400-metre obstacle course is one of the standard basic physical training subjects for soldiers. Because the obstacles are difficult, trainees' skill levels improve slowly and injuries often occur during training. Military training according to the embodiment of the present invention provides scientific guidance for training, effectively avoids needless injuries, and improves the training effect. The details are as follows.
First, a virtual 400-metre obstacle training scene is built; that is, a human body model, a 3D training ground model, and obstacle models are established. Specifically, the virtual obstacle training scene represents the obstacle training environment using physical and mathematical representations, providing the basic data and dynamic data of the training environment. This includes geometric modeling of the obstacle course terrain, geometric modeling of the obstacles, and modeling of the trainees. For example, a Kinect sensor is used to survey the 400-metre obstacle course and the individual obstacles and to capture the trainees' depth data and RGB data; models of the human body, the ground, and the equipment in the scene are then built with 3ds Max, and textures are added to increase the models' sense of depth and realism.
Secondly, posture analysis is performed on the RGB image data and depth image data acquired with the Kinect somatosensory device to obtain the kinematic data of the human body. Specifically, from the field data gathered by the Kinect as a trainee clears a single obstacle, the trainee's skeleton image is obtained and real-time full-body and skeleton tracking is performed. A series of actions is recognized, and information such as the three-dimensional coordinates and angles of multiple joints at the head, arms, legs, and feet of the moving body is obtained to drive the motion of the virtual human.
Thirdly, virtual reality motion simulation is performed using the human body model, training ground model, and training equipment model together with the kinematic data obtained by posture analysis. Specifically, after the posture information of each of the trainee's joints is obtained, the motion of the human body in the virtual scene is controlled from that posture information so that the action of the virtual human in the virtual scene is consistent with that of the trainee.
Finally, the training actions in the virtual reality motion simulation are compared with the standard template actions in the preset motion standard template database to obtain a training analysis result. Specifically, the virtual training picture and the standard training picture are given the same viewpoint, the virtual human's training action is compared with the standard training action, i.e. each obstacle-clearing action is contrasted synchronously, and the analysis result is displayed as a three-dimensional animation. Trainees can thus find their defective actions more effectively, resolve the technical defects, and improve their training results.
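Before the synchronized same-viewpoint overlay described above, the trainee's avatar and the standard-template avatar need a common reference frame. One simple approach, sketched below under the assumption that a skeleton is a dict of joint positions (joint names are illustrative), is to translate both skeletons so that a chosen root joint sits at the origin.

```python
def root_align(skeleton, root='hip'):
    """Translate a skeleton (dict: joint name -> (x, y, z)) so that the
    chosen root joint sits at the origin. Aligning both the trainee's
    skeleton and the standard-template skeleton this way puts them in
    the same frame before the superimposed, synchronized display."""
    rx, ry, rz = skeleton[root]
    return {j: (x - rx, y - ry, z - rz)
            for j, (x, y, z) in skeleton.items()}
```

After alignment, rendering both skeletons from the same camera makes per-joint differences directly visible frame by frame.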
According to the virtual military training method provided by the embodiment of the present invention, three-dimensional data of the trainee's body, of the training ground, and of the training equipment are acquired, and on this basis a corresponding human body model, training ground model, and training equipment model are built. Posture analysis is performed on the acquired RGB image data and depth image data of the trainee to obtain kinematic data of the human body. Based on the models, virtual reality motion simulation is then performed from the kinematic data, and the training actions in the simulation are compared with the standard template actions in a preset motion standard template database to obtain a training analysis result. A virtual reality military training process is thereby realized, improving training efficiency and training effect and allowing wide application.
Embodiment two
Fig. 2 is a flow chart illustrating a virtual military training method according to Embodiment two of the present invention; this embodiment can be regarded as another concrete implementation of Fig. 1. The method may be performed by a device as shown in Fig. 4.
Referring to Fig. 2, in step S210, three-dimensional data of the trainee's body, of the training ground, and of the training equipment are acquired, together with RGB image data and depth image data of the trainee.
According to an exemplary embodiment of the present invention, acquiring the three-dimensional data of the trainee's body, of the training ground, and of the training equipment in step S210 may include: acquiring those three-dimensional data through a Kinect somatosensory device.
According to another exemplary embodiment of the present invention, acquiring the RGB image data and depth image data of the trainee in step S210 may include: acquiring the RGB image data and depth image data of the trainee through a Kinect somatosensory device.
In step S220, a corresponding human body model, training ground model, and training equipment model are built from the three-dimensional data of the body, of the training ground, and of the training equipment, respectively.
The content of step S220 is identical to that of step S120 in Embodiment one above and is not repeated here.
In step S230, posture analysis is performed on the acquired RGB image data and depth image data to obtain kinematic data of the human body.
The content of step S230 is identical to that of step S130 in Embodiment one above and is not repeated here.
In step S240, virtual reality motion simulation is performed from the kinematic data, based on the human body model, training ground model, and training equipment model.
The content of step S240 is identical to that of step S140 in Embodiment one above and is not repeated here.
In step S250, the training actions in the virtual reality motion simulation are compared with the standard template actions in a preset motion standard template database to obtain a training analysis result.
According to an exemplary embodiment of the present invention, step S250 may include: with the observation viewpoint and observation angle kept consistent, synchronously superimposing the training actions and the standard template actions to obtain the training analysis result.
In this embodiment, the preset motion standard template database can be established separately for each trainee to meet personalized training needs. Different training actions can be decomposed and designed for different trainees, breaking through an individual trainee's training bottleneck, reducing operational errors and training risk, and improving training quality and level.
In step S260, training data of the trainee are obtained according to the training analysis result.
In step S270, the obtained training data are stored in a military trainee database.
In practical applications, after each training session ends, the training data of every trainee are stored in the military trainee database, and each trainee's results, the actions that need improvement, and the specialized training that needs to be added are recorded.
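The trainee database of steps S260 and S270 could be backed by any store; as a minimal sketch, SQLite with an assumed schema (the table and column names are not specified by the patent) might look like this:

```python
import sqlite3

def store_result(conn, trainee_id, subject, score, notes):
    """Append one training record to the trainee database. The schema
    is an assumed sketch; the patent only says that per-trainee results
    and improvement notes are stored after each session."""
    conn.execute("""CREATE TABLE IF NOT EXISTS training_records (
        trainee_id TEXT, subject TEXT, score REAL, notes TEXT)""")
    conn.execute("INSERT INTO training_records VALUES (?, ?, ?, ?)",
                 (trainee_id, subject, score, notes))
    conn.commit()

def history(conn, trainee_id):
    """Return all stored (subject, score, notes) rows for one trainee."""
    cur = conn.execute(
        "SELECT subject, score, notes FROM training_records "
        "WHERE trainee_id = ?", (trainee_id,))
    return cur.fetchall()
```

Querying a trainee's history across sessions is what would let instructors track results and the actions still needing specialized training.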
The virtual military training method provided by the embodiment of the present invention has the following technical effects.
On the one hand, a virtual military training process based on somatosensory interaction is realized with a Kinect somatosensory device. The trainee posture information extracted by the Kinect has good real-time performance and high accuracy, so the system can respond quickly and accurately to the trainee's actions, further decompose and design complicated training actions, and analyze, monitor, and evaluate the actions in real time. This breaks through trainees' own training bottlenecks, reduces the risk of highly difficult routines, and improves the training effect. In addition, using a Kinect somatosensory device reduces training cost and avoids the inconvenience of wearing input equipment during training.
On the other hand, introducing virtual reality technology into military training stimulates trainees' interest and enthusiasm during training and improves training efficiency. At the same time, personalized training needs are met: trainees are provided with accurate, stable, personalized training modes and quantified evaluation indices of the training effect, making military training more diversified and personalized.
Furthermore, in this embodiment, the observation viewpoint and angle of the virtual training action and the standard template action are kept consistent, and the training action and the standard template action are displayed synchronously, so that training results are observed more accurately, realistically, and intuitively, yielding a training analysis result of high accuracy. This benefits the subsequent systematic analysis of the trainee's technical actions and helps trainees find and understand their technical defects in time, greatly promoting overall training quality and results.
Embodiment three
Based on the same technical concept, Fig. 3 is a logic block diagram illustrating a virtual military training device according to Embodiment three of the present invention. The device may be used to perform the virtual military training method flow described in Embodiment one.
Referring to Fig. 3, the virtual military training device includes a data acquisition module 310, a model building module 320, a posture analysis module 330, a motion simulation module 340, and a training analysis module 350.
The data acquisition module 310 acquires three-dimensional data of the trainee's body, of the training ground, and of the training equipment, together with RGB image data and depth image data of the trainee.
The model building module 320 builds a corresponding human body model, training ground model, and training equipment model from the three-dimensional data of the body, of the training ground, and of the training equipment, respectively.
The posture analysis module 330 performs posture analysis on the acquired RGB image data and depth image data to obtain kinematic data of the human body.
The motion simulation module 340 performs virtual reality motion simulation from the kinematic data, based on the human body model, training ground model, and training equipment model.
The training analysis module 350 compares the training actions in the virtual reality motion simulation with the standard template actions in a preset motion standard template database to obtain a training analysis result.
According to the virtual military training device provided by the embodiment of the present invention, three-dimensional data of the trainee's body, of the training ground, and of the training equipment are acquired, and on this basis a corresponding human body model, training ground model, and training equipment model are built. Posture analysis is performed on the acquired RGB image data and depth image data of the trainee to obtain kinematic data of the human body. Based on the models, virtual reality motion simulation is then performed from the kinematic data, and the training actions in the simulation are compared with the standard template actions in a preset motion standard template database to obtain a training analysis result. A virtual reality military training process is thereby realized, improving training efficiency and training effect and allowing wide application.
Example IV
Based on identical technology design, Fig. 4 is to illustrate that according to embodiments of the present invention four virtual military training device is patrolled
Collect block diagram.May be used to perform the virtual military training method flow as described in embodiment two.
With reference to Fig. 4, data acquisition module 310 engages in military training personnel's specifically for obtaining by Kinect somatosensory device
The three-dimensional data of three-dimensional data, the three-dimensional data of training court and training device.
Further, data acquisition module 310 can be used to engage in military training personnel's by the acquisition of Kinect somatosensory device
Rgb image data and depth image data.
Preferably, training analysiss module 350 can be used in the case where observation viewpoint and observation visual angle are consistent, will instruction
Practice action superposition synchronous with standard form action, obtain training analysiss result.
Alternatively, the virtual military training device also includes:
Training number of the training data acquisition module 360 for the personnel that engaged in military training according to the acquisition of training analysiss result
According to.
Data memory module 370 arrives military training demographic data storehouse for the training data storage that will be obtained.
The virtual military training device provided by the embodiment of the present invention has the following technical effects. First, a virtual military training process based on somatosensory interaction is realized with a Kinect somatosensory device. The trainee posture information extracted by the Kinect has good real-time performance and high accuracy, so the system can respond quickly and accurately to the trainee's actions, further decompose and design complicated training actions, and analyze, monitor, and evaluate the actions in real time; this breaks through trainees' own training bottlenecks, reduces the risk of highly difficult routines, and improves the training effect. In addition, using a Kinect somatosensory device reduces training cost and avoids the inconvenience of wearing input equipment during training. Second, introducing virtual reality technology into military training stimulates trainees' interest and enthusiasm and improves training efficiency; at the same time, personalized training needs are met, and trainees are provided with accurate, stable, personalized training modes and quantified evaluation indices of the training effect, making military training more diversified and personalized. Third, in this embodiment, the observation viewpoint and angle of the virtual training action and the standard template action are kept consistent and the two are displayed synchronously, so that training results are observed more accurately, realistically, and intuitively, yielding a training analysis result of high accuracy; this benefits the subsequent systematic analysis of the trainee's technical actions and helps trainees find and understand their technical defects in time, greatly promoting overall training quality and results.
It should be noted that, according to implementation needs, each step/component described in this application can be split into more steps/components, and two or more steps/components, or partial operations of steps/components, can be combined into new steps/components to achieve the purpose of the present invention.
The above method according to the present invention can be realized in hardware or firmware, or implemented as software or computer code storable in a recording medium (such as a CD-ROM, RAM, floppy disk, hard disk, or magneto-optical disk), or implemented as computer code originally stored in a remote recording medium or non-volatile machine-readable medium, downloaded over a network, and stored in a local recording medium, so that the method described here can be processed by such software stored on a recording medium using a general-purpose computer, a special-purpose processor, or programmable or dedicated hardware (such as an ASIC or FPGA). It will be appreciated that a computer, processor, microprocessor controller, or programmable hardware includes a storage component (for example, RAM, ROM, or flash memory) that can store or receive software or computer code; when the software or computer code is accessed and executed by the computer, processor, or hardware, the processing methods described here are realized. In addition, when a general-purpose computer accesses code for realizing the processes shown here, execution of that code converts the general-purpose computer into a special-purpose computer for executing the processes shown here.
The above are only specific embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any person familiar with the art can readily conceive of changes or substitutions within the technical scope disclosed by the present invention, and all such changes or substitutions shall be covered within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be defined by the scope of the claims.
Claims (10)
1. A virtual military training method, characterized in that the method comprises:
obtaining three-dimensional body data of a person undergoing military training, three-dimensional data of a training field, and three-dimensional data of training equipment, and obtaining RGB image data and depth image data of said person;
establishing, respectively, a corresponding human body model, training field model, and training equipment model according to the three-dimensional body data, the three-dimensional data of the training field, and the three-dimensional data of the training equipment;
performing posture analysis according to the obtained RGB image data and depth image data to obtain kinematic data of the human body;
performing virtual reality motion simulation according to the kinematic data, based on the human body model, the training field model, and the training equipment model;
comparing a training action in the virtual reality motion simulation with a standard template action in a preset motion standard template database to obtain a training analysis result.
2. The method according to claim 1, characterized in that obtaining the three-dimensional data of the person, the three-dimensional data of the training field, and the three-dimensional data of the training equipment comprises:
obtaining, through a Kinect somatosensory device, the three-dimensional data of the person undergoing military training, the three-dimensional data of the training field, and the three-dimensional data of the training equipment.
3. The method according to claim 1, characterized in that obtaining the RGB image data and depth image data of said person comprises:
obtaining, through a Kinect somatosensory device, the RGB image data and depth image data of said person undergoing military training.
4. The method according to claim 1, characterized in that comparing the training action in the virtual reality motion simulation with the standard template action in the preset motion standard template database to obtain the training analysis result comprises:
with the observation viewpoint and observation angle kept consistent, synchronously superimposing the training action on the standard template action to obtain the training analysis result.
5. The method according to any one of claims 1-4, characterized in that the method further comprises:
obtaining training data of the person undergoing military training according to the training analysis result;
storing the obtained training data into a military training personnel database.
6. A virtual military training device, characterized in that the device comprises:
a data acquisition module, configured to obtain three-dimensional body data of a person undergoing military training, three-dimensional data of a training field, and three-dimensional data of training equipment, and to obtain RGB image data and depth image data of said person;
a model building module, configured to establish, respectively, a corresponding human body model, training field model, and training equipment model according to the three-dimensional body data, the three-dimensional data of the training field, and the three-dimensional data of the training equipment;
a posture analysis module, configured to perform posture analysis according to the obtained RGB image data and depth image data to obtain kinematic data of the human body;
a motion simulation module, configured to perform virtual reality motion simulation according to the kinematic data, based on the human body model, the training field model, and the training equipment model;
a training analysis module, configured to compare a training action in the virtual reality motion simulation with a standard template action in a preset motion standard template database to obtain a training analysis result.
7. The device according to claim 6, characterized in that the data acquisition module is configured to obtain, through a Kinect somatosensory device, the three-dimensional data of the person undergoing military training, the three-dimensional data of the training field, and the three-dimensional data of the training equipment.
8. The device according to claim 6, characterized in that the data acquisition module is configured to obtain, through a Kinect somatosensory device, the RGB image data and depth image data of said person undergoing military training.
9. The device according to claim 6, characterized in that the training analysis module is configured to, with the observation viewpoint and observation angle kept consistent, synchronously superimpose the training action on the standard template action to obtain the training analysis result.
10. The device according to any one of claims 6-9, characterized in that the device further comprises:
a training data acquisition module, configured to obtain training data of the person undergoing military training according to the training analysis result;
a data storage module, configured to store the obtained training data into a military training personnel database.
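The claimed method steps can be chained end to end as sketched below. Every helper is a hypothetical stub standing in for a real subsystem (Kinect capture, model building, posture analysis, VR simulation); only the data flow between the claimed steps is illustrated, with stub values chosen for the sketch.

```python
# End-to-end sketch of the claimed method (claims 1-5). All helpers are
# hypothetical stubs; only the flow of data between steps is real to the
# claims, not the stub values inside each step.
def acquire():                      # claim 1, step 1 (cf. claims 2-3)
    return {"body3d": None, "field3d": None, "equip3d": None,
            "rgb": None, "depth": None}

def build_models(data):             # claim 1, step 2
    return {"human": "model", "field": "model", "equipment": "model"}

def posture_analysis(rgb, depth):   # claim 1, step 3
    return {"joint_angles": [0.1, 0.2]}   # stub kinematic data

def simulate(models, kinematics):   # claim 1, step 4
    return {"training_action": kinematics["joint_angles"]}

def compare(action, template_db):   # claim 1, step 5 (cf. claim 4)
    dev = sum(abs(a - b) for a, b in
              zip(action["training_action"], template_db["std"]))
    return {"deviation": dev}

def run_method(template_db):
    data = acquire()
    models = build_models(data)
    kin = posture_analysis(data["rgb"], data["depth"])
    action = simulate(models, kin)
    return compare(action, template_db)   # claim 5 would then store this

analysis = run_method({"std": [0.1, 0.25]})
```

The training analysis result returned at the end is what claim 5 persists to the military training personnel database.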
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610981497.9A CN106548675A (en) | 2016-11-08 | 2016-11-08 | Virtual military training method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610981497.9A CN106548675A (en) | 2016-11-08 | 2016-11-08 | Virtual military training method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106548675A true CN106548675A (en) | 2017-03-29 |
Family
ID=58395389
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610981497.9A Pending CN106548675A (en) | 2016-11-08 | 2016-11-08 | Virtual military training method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106548675A (en) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107423527A (en) * | 2017-08-23 | 2017-12-01 | 太仓苏易信息科技有限公司 | A kind of Virtual Modeling System for military exercises |
CN108376487A (en) * | 2018-02-09 | 2018-08-07 | 冯侃 | Based on the limbs training system and method in virtual reality |
CN108664121A (en) * | 2018-03-31 | 2018-10-16 | 中国人民解放军海军航空大学 | A kind of emulation combat system-of-systems drilling system |
CN109460396A (en) * | 2018-10-12 | 2019-03-12 | 中国平安人寿保险股份有限公司 | Model treatment method and device, storage medium and electronic equipment |
CN109559585A (en) * | 2018-12-07 | 2019-04-02 | 湖北安心智能科技有限公司 | A kind of simulated training simulation control subsystem and method |
CN109635925A (en) * | 2018-11-30 | 2019-04-16 | 北京首钢自动化信息技术有限公司 | A kind of sportsman's supplemental training data capture method, device and electronic equipment |
CN110781777A (en) * | 2019-10-10 | 2020-02-11 | 深圳市牧爵电子科技有限公司 | Method, system and storage medium for judging human body action in sports training |
CN111124125A (en) * | 2019-12-25 | 2020-05-08 | 南昌市小核桃科技有限公司 | Police affair training method and system based on virtual reality |
CN114259721A (en) * | 2022-01-13 | 2022-04-01 | 王东华 | Training evaluation system and method based on Beidou positioning |
CN115064026A (en) * | 2022-06-17 | 2022-09-16 | 陆校松 | Training method and device for crew service site |
CN115830229A (en) * | 2022-11-24 | 2023-03-21 | 江苏奥格视特信息科技有限公司 | Digital virtual human 3D model acquisition device |
CN115937894A (en) * | 2022-08-11 | 2023-04-07 | 北京中微盛鼎科技有限公司 | Military training method and system based on human body posture recognition |
CN116843196A (en) * | 2023-06-26 | 2023-10-03 | 西安速度时空大数据科技有限公司 | Intelligent training method and system applied to military training |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103198297A (en) * | 2013-03-15 | 2013-07-10 | 浙江大学 | Kinematic similarity assessment method based on correlation geometrical characteristics |
CN103455657A (en) * | 2013-06-21 | 2013-12-18 | 浙江理工大学 | Kinect based field operation simulation method and Kinect based field operation simulation system |
CN103646425A (en) * | 2013-11-20 | 2014-03-19 | 深圳先进技术研究院 | A method and a system for body feeling interaction |
CN103816654A (en) * | 2012-11-19 | 2014-05-28 | 大连鑫奇辉科技有限公司 | Dance training system |
CN205198906U (en) * | 2015-10-30 | 2016-05-04 | 山西睿智健科技有限公司 | A motion pattern corrects system for assisting body -building |
CN105847987A (en) * | 2016-03-24 | 2016-08-10 | 乐视控股(北京)有限公司 | Method and system for correcting human body actions through television and body feeling accessory component |
CN105999670A (en) * | 2016-05-31 | 2016-10-12 | 山东科技大学 | Shadow-boxing movement judging and guiding system based on kinect and guiding method adopted by same |
- 2016-11-08: CN application CN201610981497.9A filed (published as CN106548675A (en)), status Pending
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103816654A (en) * | 2012-11-19 | 2014-05-28 | 大连鑫奇辉科技有限公司 | Dance training system |
CN103198297A (en) * | 2013-03-15 | 2013-07-10 | 浙江大学 | Kinematic similarity assessment method based on correlation geometrical characteristics |
CN103455657A (en) * | 2013-06-21 | 2013-12-18 | 浙江理工大学 | Kinect based field operation simulation method and Kinect based field operation simulation system |
CN103646425A (en) * | 2013-11-20 | 2014-03-19 | 深圳先进技术研究院 | A method and a system for body feeling interaction |
CN205198906U (en) * | 2015-10-30 | 2016-05-04 | 山西睿智健科技有限公司 | A motion pattern corrects system for assisting body -building |
CN105847987A (en) * | 2016-03-24 | 2016-08-10 | 乐视控股(北京)有限公司 | Method and system for correcting human body actions through television and body feeling accessory component |
CN105999670A (en) * | 2016-05-31 | 2016-10-12 | 山东科技大学 | Shadow-boxing movement judging and guiding system based on kinect and guiding method adopted by same |
Non-Patent Citations (1)
Title |
---|
ZHANG, Lijie et al.: "Military Training Site Layout and Simulation Based on Virtual Reality Technology", Journal of Sichuan Ordnance *
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107423527A (en) * | 2017-08-23 | 2017-12-01 | 太仓苏易信息科技有限公司 | A kind of Virtual Modeling System for military exercises |
CN108376487A (en) * | 2018-02-09 | 2018-08-07 | 冯侃 | Based on the limbs training system and method in virtual reality |
CN108664121A (en) * | 2018-03-31 | 2018-10-16 | 中国人民解放军海军航空大学 | A kind of emulation combat system-of-systems drilling system |
CN109460396A (en) * | 2018-10-12 | 2019-03-12 | 中国平安人寿保险股份有限公司 | Model treatment method and device, storage medium and electronic equipment |
CN109460396B (en) * | 2018-10-12 | 2024-06-04 | 中国平安人寿保险股份有限公司 | Model processing method and device, storage medium and electronic equipment |
CN109635925A (en) * | 2018-11-30 | 2019-04-16 | 北京首钢自动化信息技术有限公司 | A kind of sportsman's supplemental training data capture method, device and electronic equipment |
CN109559585A (en) * | 2018-12-07 | 2019-04-02 | 湖北安心智能科技有限公司 | A kind of simulated training simulation control subsystem and method |
CN110781777A (en) * | 2019-10-10 | 2020-02-11 | 深圳市牧爵电子科技有限公司 | Method, system and storage medium for judging human body action in sports training |
CN111124125B (en) * | 2019-12-25 | 2023-06-20 | 南昌市小核桃科技有限公司 | Police service training method and system based on virtual reality |
CN111124125A (en) * | 2019-12-25 | 2020-05-08 | 南昌市小核桃科技有限公司 | Police affair training method and system based on virtual reality |
CN114259721A (en) * | 2022-01-13 | 2022-04-01 | 王东华 | Training evaluation system and method based on Beidou positioning |
CN115064026A (en) * | 2022-06-17 | 2022-09-16 | 陆校松 | Training method and device for crew service site |
CN115064026B (en) * | 2022-06-17 | 2023-12-19 | 陆校松 | Method and device for training on site of service |
CN115937894A (en) * | 2022-08-11 | 2023-04-07 | 北京中微盛鼎科技有限公司 | Military training method and system based on human body posture recognition |
CN115830229A (en) * | 2022-11-24 | 2023-03-21 | 江苏奥格视特信息科技有限公司 | Digital virtual human 3D model acquisition device |
CN115830229B (en) * | 2022-11-24 | 2023-10-13 | 江苏奥格视特信息科技有限公司 | Digital virtual human 3D model acquisition device |
CN116843196A (en) * | 2023-06-26 | 2023-10-03 | 西安速度时空大数据科技有限公司 | Intelligent training method and system applied to military training |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106548675A (en) | Virtual military training method and device | |
Ueda et al. | A hand-pose estimation for vision-based human interfaces | |
Gill et al. | An analysis of usage of different types of visualisation media within a collaborative planning workshop environment | |
CN111240476B (en) | Interaction method and device based on augmented reality, storage medium and computer equipment | |
CN106598229A (en) | Virtual reality scene generation method and equipment, and virtual reality system | |
CN109035415B (en) | Virtual model processing method, device, equipment and computer readable storage medium | |
Narang et al. | Simulating movement interactions between avatars & agents in virtual worlds using human motion constraints | |
JP7164045B2 (en) | Skeleton Recognition Method, Skeleton Recognition Program and Skeleton Recognition System | |
CN108389249A (en) | A kind of spaces the VR/AR classroom of multiple compatibility and its construction method | |
George et al. | Using virtual reality as a design input: Impacts on collaboration in a university design studio setting | |
KR20170018529A (en) | Simulator based on healthcare unit and simulation method using the same | |
Tao et al. | Manufacturing assembly simulations in virtual and augmented reality | |
Draganov et al. | Investigating Oculus Rift virtual reality display applicability to medical assistive system for motor disabled patients | |
CN114022512A (en) | Exercise assisting method, apparatus and medium | |
CN107122043A (en) | The analogy method and device of human body in virtual reality | |
Borrero et al. | Interaction of real robots with virtual scenarios through augmented reality: Application to robotics teaching/learning by means of remote labs | |
CN108629121B (en) | Virtual reality crowd simulation method and system based on terrorist valley effect avoidance | |
CN115496911A (en) | Target point detection method, device, equipment and storage medium | |
Basori et al. | Telerobotic 3D articulated arm-assisted surgery tools with augmented reality for surgery training | |
Hussain et al. | Modeling and simulation with augmented reality | |
Nelson et al. | A Virtual Reality Framework for Human-Virtual Crowd Interaction Studies | |
Ohmoto et al. | Design of Immersive Environment for Social Interaction Based on Socio-Spatial Information and the Applications. | |
CN110070777B (en) | Huchizhui fish skin painting simulation training system and implementation method | |
Capin et al. | Integration of avatars and autonomous virtual humans in networked virtual environments | |
Biro et al. | The SFU-Store-Nav 3D Virtual Human Platform for Human-Aware Robotics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
RJ01 | Rejection of invention patent application after publication |
Application publication date: 20170329 |