CN117610794A - Scene simulation training evaluation system and method for emergency - Google Patents

Scene simulation training evaluation system and method for emergency

Info

Publication number
CN117610794A
Authority
CN
China
Prior art keywords
simulation
attribute
training
scene
evaluation
Prior art date
Legal status
Granted
Application number
CN202410087035.7A
Other languages
Chinese (zh)
Other versions
CN117610794B (en)
Inventor
苏横军
徐芳萍
战飚
龙飞
Current Assignee
Nanchang Lingxing Information Technology Co ltd
Original Assignee
Nanchang Lingxing Information Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Nanchang Lingxing Information Technology Co ltd
Priority to CN202410087035.7A
Publication of CN117610794A
Application granted
Publication of CN117610794B
Legal status: Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/26Government or public services
    • G06Q50/265Personal security, identity or safety
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/41Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Tourism & Hospitality (AREA)
  • Theoretical Computer Science (AREA)
  • Economics (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Strategic Management (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Business, Economics & Management (AREA)
  • Development Economics (AREA)
  • Marketing (AREA)
  • Educational Administration (AREA)
  • Game Theory and Decision Science (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Computer Security & Cryptography (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Computational Linguistics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a scene simulation training evaluation system and method for emergencies. Wearing detection of the simulation equipment is performed on the simulation training personnel; a simulated training scene is selected and imported; the simulated training scene is loaded, operation monitoring is performed, and fragment shooting is performed when the simulation training personnel press the simulated shooting button, so that a target fragment video is obtained; a target evaluation picture is extracted from the target fragment video; and evaluation analysis is performed on the target evaluation picture to generate a training evaluation result. Different simulated training scenes can be imported according to different training requirements, operation monitoring and fragment shooting are performed through the simulation equipment, target evaluation pictures are extracted, evaluation analysis is performed, and training evaluation results are generated. The invention realizes standardized simulation training evaluation, saves simulation training cost and meets practical application requirements.

Description

Scene simulation training evaluation system and method for emergency
Technical Field
The invention belongs to the technical field of simulation training, and particularly relates to a scene simulation training evaluation system and method for emergency events.
Background
Emergency events are natural disasters, accident disasters, public health events and social security events that occur suddenly, cause or may cause serious social harm, and require emergency response measures.
Professional emergency-handling personnel must respond to a wide variety of emergency events with skilled operational capability so as to minimize public losses, and therefore need intensive training. However, the cost of a real emergency drill is very high, and its data recording, storage and subsequent evaluation are strongly affected by human factors, so standardized simulation training evaluation cannot be achieved. It is therefore necessary to provide a scene simulation training evaluation system and method for emergency events to solve the above technical problems.
Disclosure of Invention
The embodiment of the invention aims to provide a scene simulation training evaluation system and a scene simulation training evaluation method for emergencies, which aim to solve the problems in the background technology.
In order to achieve the above object, the embodiment of the present invention provides the following technical solutions:
a scene simulation training evaluation method for emergency events specifically comprises the following steps:
performing simulation equipment wearing detection on simulation training personnel, and generating a simulation start signal when the simulation equipment wearing is detected;
selecting and importing a simulation training scene according to the simulation starting signal;
loading the simulated training scene, performing operation monitoring through simulation equipment, and performing fragment shooting when the simulation training personnel press the simulated shooting button to obtain a target fragment video;
extracting a target evaluation picture in the target fragment video;
and carrying out evaluation analysis on the target evaluation picture to generate a training evaluation result.
As a further limitation of the technical solution of the embodiment of the present invention, the step of performing wearing detection of the simulation device by the simulation training person, and generating the simulation start signal when the wearing of the simulation device is detected, specifically includes the following steps:
performing simulation equipment wearing detection on simulation training personnel to generate wearing detection data;
real-time analysis is carried out on the wearing detection data, and a real-time analysis result is generated;
judging whether the wearing of the simulation equipment is completed or not according to the real-time analysis result;
if the wearing of the simulation equipment is completed, generating a simulation start signal.
As a further limitation of the technical solution of the embodiment of the present invention, the selecting and importing the simulated training scenario according to the simulated start signal specifically includes the following steps:
according to the simulation start signal, running a preset scene selection program;
selecting a simulated training scene from a scene database through the scene selection program;
and importing the simulated training scene into simulation equipment.
As a further limitation of the technical solution of the embodiment of the present invention, loading the simulated training scene, and performing operation monitoring through a simulation device, when a simulated training person pulls a simulated shooting button, performing fragment shooting, and obtaining a target fragment video specifically includes the following steps:
loading the simulated training scene in simulation equipment to perform scene follow-up display;
performing operation monitoring through the simulation equipment to judge whether the simulation training personnel press the simulated shooting button;
when the simulation training personnel press the simulated shooting button, generating a shooting signal and a filming signal simultaneously;
and performing laser simulated shooting and fragment shooting according to the shooting signal and the filming signal to obtain a target fragment video.
As further defined by the technical solution of the embodiment of the present invention, the extracting the target evaluation picture in the target segment video specifically includes the following steps:
performing frame-by-frame processing on the target fragment video to obtain a plurality of frame-by-frame fragment pictures;
performing target recognition comparison on a plurality of the frame-by-frame fragment pictures to generate recognition comparison results;
and screening and marking target evaluation pictures from a plurality of frame-by-frame fragment pictures according to the identification comparison result.
As a further limitation of the technical solution of the embodiment of the present invention, the performing evaluation analysis on the target evaluation picture, and generating a training evaluation result specifically includes the following steps:
identifying a laser position and a target position in the target evaluation picture;
calculating a relative distance between the laser position and the target position;
and performing evaluation analysis according to the relative distance to generate a training evaluation result.
The embodiment of the invention further provides a scene simulation training evaluation system for emergency events, which comprises a wearing detection processing unit, a scene selection importing unit, an operation monitoring shooting unit, an evaluation picture extracting unit and a picture evaluation analyzing unit, wherein:
the wearing detection processing unit is used for carrying out simulation equipment wearing detection on simulation training personnel and generating a simulation starting signal when the simulation equipment wearing is detected;
the scene selection and import unit is used for selecting and importing a simulation training scene according to the simulation start signal;
the operation monitoring shooting unit is used for loading the simulated training scene, performing operation monitoring through simulation equipment, and shooting fragments when a simulated training person pulls a simulated shooting button to acquire a target fragment video;
the evaluation picture extraction unit is used for extracting a target evaluation picture in the target fragment video;
and the picture evaluation analysis unit is used for performing evaluation analysis on the target evaluation picture and generating a training evaluation result.
As further defined by the technical solution of the embodiment of the present invention, the scenario selection and import unit specifically includes:
the selection operation module is used for operating a preset scene selection program according to the simulation start signal;
the scene selection module is used for selecting a simulated training scene from a scene database through the scene selection program;
and the scene importing module is used for importing the simulated training scene into the simulation equipment.
As a further limitation of the technical solution of the embodiment of the present invention, the operation monitoring shooting unit specifically includes:
the follow-up display module is used for loading the simulated training scene in the simulation equipment and carrying out scene follow-up display;
the operation monitoring module is used for performing operation monitoring through the simulation equipment and judging whether the simulation training personnel press the simulated shooting button;
the signal generation module is used for generating a shooting signal and a filming signal simultaneously when the simulation training personnel press the simulated shooting button;
and the shooting module is used for performing laser simulated shooting and fragment shooting according to the shooting signal and the filming signal to acquire a target fragment video.
As further defined by the technical solution of the embodiment of the present invention, the evaluation picture extraction unit specifically includes:
the frame-by-frame processing module is used for carrying out frame-by-frame processing on the target fragment video to obtain a plurality of frame-by-frame fragment pictures;
the identification comparison module is used for carrying out target identification comparison on a plurality of the frame-by-frame fragment pictures and generating an identification comparison result;
and the screening marking module is used for screening and marking target evaluation pictures from a plurality of frame-by-frame fragment pictures according to the identification comparison result.
Compared with the prior art, the invention has the beneficial effects that:
according to the invention, simulation equipment wearing detection is carried out on simulation training personnel; selecting and importing a simulated training scene; loading a simulated training scene, performing operation monitoring, and performing fragment shooting when a simulated training person withholds a simulated shooting button to obtain a target fragment video; extracting a target evaluation picture in a target fragment video; performing evaluation analysis on the target evaluation picture to generate a training evaluation result; different simulated training scenes can be imported according to different training requirements, operation monitoring and fragment shooting are carried out through simulation equipment, target evaluation pictures are extracted, evaluation analysis is carried out, and training evaluation results are generated. The invention can realize standardized simulation training evaluation, saves simulation training cost and meets the actual application requirements.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the embodiments or in the description of the prior art are briefly introduced below; it is apparent that the drawings in the following description show only some embodiments of the present invention.
Fig. 1 shows a flowchart of a method provided by an embodiment of the present invention.
Fig. 2 shows a flowchart of detecting wearing of an analog device in the method provided by the embodiment of the invention.
Fig. 3 shows a flowchart of training scene selection import in the method according to the embodiment of the present invention.
Fig. 4 shows a flowchart of capturing an operation monitoring segment in the method according to the embodiment of the present invention.
Fig. 5 shows a flowchart of target evaluation picture extraction in the method provided by the embodiment of the invention.
Fig. 6 shows a flowchart of generating a training evaluation result in the method provided by the embodiment of the invention.
Fig. 7 shows a block diagram of a system provided by an embodiment of the present invention.
Fig. 8 is a block diagram illustrating a configuration of a scene selection import unit in the system according to the embodiment of the present invention.
Fig. 9 shows a block diagram of a system for operating a monitoring camera unit according to an embodiment of the present invention.
Fig. 10 shows a block diagram of the structure of an evaluation picture extraction unit in the system according to the embodiment of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
In order to solve the above problems, the invention performs wearing detection of the simulation equipment on the simulation training personnel; selects and imports a simulated training scene; loads the simulated training scene, performs operation monitoring, and performs fragment shooting when the simulation training personnel press the simulated shooting button to obtain a target fragment video; extracts a target evaluation picture from the target fragment video; and performs evaluation analysis on the target evaluation picture to generate a training evaluation result. Different simulated training scenes can be imported according to different training requirements, operation monitoring and fragment shooting are performed through the simulation equipment, target evaluation pictures are extracted, evaluation analysis is performed, and training evaluation results are generated. The invention realizes standardized simulation training evaluation, saves simulation training cost and meets practical application requirements.
Fig. 1 shows a flowchart of a method provided by an embodiment of the present invention.
Specifically, in a preferred embodiment of the present invention, a method for evaluating scene simulation training for an emergency, the method specifically includes the following steps:
step S100, performing simulation equipment wearing detection on simulation training personnel, and generating a simulation start signal when the simulation equipment wearing is detected.
In the embodiment of the invention, the simulation equipment performs wearing detection on the simulation training personnel through infrared sensing technology and generates wearing detection data in real time; a real-time analysis result is generated by analyzing the wearing detection data, and whether the simulation training personnel have correctly worn the simulation equipment is judged according to the real-time analysis result; after the simulation equipment is correctly worn, the simulation training is ready to start and a simulation start signal is generated.
It can be understood that a plurality of infrared detection positions are provided on the simulation equipment; only when every infrared detection position detects a wearing signal is the simulation training person considered to have correctly worn the simulation equipment, and the simulation start signal is generated at this time.
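As a minimal illustration of this all-positions check, the following Python sketch assumes a hypothetical read_infrared_positions() call supplied by the simulation-device SDK; the function name and return format are assumptions for illustration and are not part of the disclosure.

```python
from typing import List

def read_infrared_positions(device_id: str) -> List[bool]:
    """Hypothetical SDK call: one boolean per infrared detection position,
    True when that position currently senses the wearer."""
    raise NotImplementedError  # provided by the simulation-device driver

def wearing_detection(device_id: str) -> bool:
    """Generate the simulation start signal only when every infrared
    detection position reports a wearing signal (steps S101 to S104)."""
    detections = read_infrared_positions(device_id)
    if detections and all(detections):
        print("simulation start signal generated")
        return True
    return False
```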
Specifically, fig. 2 shows a flowchart of detecting wearing of an analog device in the method provided by the embodiment of the present invention.
In the preferred embodiment of the present invention, the detecting the wearing of the simulation device by the simulation training person, when detecting that the wearing of the simulation device is completed, generating the simulation start signal specifically includes the following steps:
step S101, performing simulation equipment wearing detection on simulation training personnel to generate wearing detection data.
Step S102, analyzing the wearing detection data in real time to generate a real-time analysis result.
And step S103, judging whether the wearing of the simulation equipment is completed or not according to the real-time analysis result.
Step S104, if the wearing of the simulation equipment is completed, generating a simulation start signal.
Further, the method for evaluating the scene simulation training of the emergency event further comprises the following steps:
step 200, selecting and importing a simulation training scene according to the simulation start signal.
In the embodiment of the invention, a pre-imported scene selection program is run according to the simulation start signal, a corresponding scene is selected from a preset scene database according to the scene selection program, and the selected simulated training scene is imported into the simulation equipment.
It will be appreciated that different scene selection programs use different scene selection methods, which may include random selection, theme selection, sequential selection and the like; the scene database stores a variety of scene data, and the scene data gradually increases as new scenes are developed.
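For illustration only, the following Python sketch shows one way the three selection methods named above (random, theme and sequential selection) could draw from a scene database; the database schema and function names are assumptions, not taken from the disclosure.

```python
import random

# Assumed toy schema for the scene database: a name plus a theme tag.
SCENE_DATABASE = [
    {"name": "subway_fire_drill", "theme": "fire"},
    {"name": "street_assault_response", "theme": "public_security"},
    {"name": "flood_rescue", "theme": "natural_disaster"},
]

def select_random(database):
    """Random selection."""
    return random.choice(database)

def select_by_theme(database, theme):
    """Theme selection: first scene whose theme tag matches."""
    matches = [scene for scene in database if scene["theme"] == theme]
    return matches[0] if matches else None

def select_sequential(database, last_index):
    """Sequential selection: the scene after the one used last time."""
    return database[(last_index + 1) % len(database)]
```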
Specifically, fig. 3 shows a flowchart of training scene selection import in the method provided by the embodiment of the present invention.
In a preferred embodiment of the present invention, the selecting and importing the simulated training scenario according to the simulation start signal specifically includes the following steps:
step S201, running a preset scene selection program according to the simulation start signal.
The step S201 specifically includes the following sub-steps:
step S2011, when a simulation start signal is received, acquiring personnel attribute information of a simulation training personnel, wherein the personnel attribute information comprises a gender attribute, an age attribute, a occupation attribute and a character attribute;
step S2012, calculating according to the gender attribute, the age attribute, the occupation attribute and the character attribute to obtain the personnel attribute category of the simulated training personnel and the corresponding comprehensive attribute score;
wherein the calculation formula of the comprehensive attribute score is expressed as:
$$F = F_0 + \mu\left(\omega_1 f_1 + \omega_2 f_2 + \omega_3 f_3 + \omega_4 f_4\right)$$
wherein $F$ represents the comprehensive attribute score, $F_0$ represents the reference value of the comprehensive attribute score, $\mu$ represents the correction factor of the attribute score, $\omega_1$ and $f_1$ represent the weight factor and attribute score of the gender attribute item, $\omega_2$ and $f_2$ represent the weight factor and attribute score of the age attribute item, $\omega_3$ and $f_3$ represent the weight factor and attribute score of the occupation attribute item, and $\omega_4$ and $f_4$ represent the weight factor and attribute score of the character attribute item.
In this embodiment, the method for determining the person attribute category of the simulation training person includes the following sub-steps:
step S2012a, the gender attribute, the age attribute, the occupation attribute and the character attribute of the current simulation training personnel are obtained;
step 2012b, searching a preset attribute category mapping table according to the gender attribute, the age attribute, the occupation attribute and the character attribute of the current simulated training personnel to obtain the association degree between each attribute of the simulated training personnel and the personnel attribute category;
step 2012c, calculating to obtain a comprehensive association value according to the association degree between each attribute of the simulation training personnel and the personnel attribute category, and determining the personnel attribute category corresponding to the maximum comprehensive association degree value as the personnel attribute category of the current simulation training personnel;
in this embodiment, the calculation formula of the integrated association value is expressed as:
wherein,representing attributes of simulated training personnelAnd->Comprehensive association between species attribute categories, +.>Reference value representing the degree of comprehensive association, +.>Representing the +.>Item attribute and->Association degree between personnel attribute categories, +.>Representing the +.>Item attribute and->A weight factor for the degree of association between category attributes of the species,
step 2012d, after the comprehensive relevance value corresponding to each personnel attribute category is obtained by calculation, determining the personnel attribute category corresponding to the maximum comprehensive relevance value as the personnel attribute category of the current simulation training personnel.
Step S2013, searching a plurality of sub-scene selection programs of the category corresponding to the personnel attribute category in a preset scene selection program library according to the personnel attribute category of the simulated training personnel;
step S2014, confirming the corresponding score grade in a preset attribute score mapping table according to the comprehensive attribute scores of the simulation training personnel, and confirming and operating the sub-scene selection program corresponding to the score grade in the selected multiple sub-scene selection programs of the current category.
Step S202, selecting a simulated training scene from a scene database through the scene selection program.
Step S203, the simulated training scene is imported into simulation equipment.
Further, the method for evaluating the scene simulation training of the emergency event further comprises the following steps:
and step S300, loading the simulated training scene, performing operation monitoring through simulation equipment, and performing fragment shooting when a simulated training person withholds a simulated shooting button to obtain a target fragment video.
In the embodiment of the invention, the imported simulated training scene is loaded in the VR glasses of the simulation equipment, and the simulated training scene is displayed in the VR glasses in a follow-up manner as the simulation training personnel move; the shooting target is displayed in the simulated scene and in the real environment at the same time and at the same spatial position; operation monitoring is performed on the simulated shooting prop of the simulation equipment operated by the simulation training personnel, and whether the simulation training personnel press the simulated shooting button is judged; when the simulation training personnel press the simulated shooting button, a shooting signal and a filming signal are generated simultaneously, and the simulated shooting prop then performs laser simulated shooting and fragment shooting simultaneously according to the shooting signal and the filming signal, so that the target fragment video shot by the simulated shooting prop is obtained.
It can be understood that the target fragment video is a video shot by the simulated shooting prop in the real environment; when a simulated target is provided in the simulated scene, a real target, which may be a moving target, is also provided in the real environment at the corresponding time and spatial position. In the simulated training scene, the simulation training personnel shoot the simulated target through the simulated shooting prop; at the same time, in the real environment, the simulated shooting prop emits laser toward the real target and performs fragment shooting to obtain the target fragment video.
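The simultaneous generation of the two signals can be pictured with the short Python sketch below; the fire_laser and record_clip callbacks are placeholders for the device-side actions and are assumptions, not an implementation of the disclosed equipment.

```python
import threading

def on_shooting_button_pressed(fire_laser, record_clip, clip_seconds=2.0):
    """When the simulated shooting button is pressed, emit the shooting signal
    and the filming signal at the same time: one thread performs the laser
    simulated shot, the other records the target fragment video."""
    shooter = threading.Thread(target=fire_laser)
    recorder = threading.Thread(target=record_clip, args=(clip_seconds,))
    shooter.start()
    recorder.start()
    shooter.join()
    recorder.join()
```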
Specifically, fig. 4 shows a flowchart of capturing an operation monitoring segment in the method provided by the embodiment of the present invention.
In the preferred embodiment of the present invention, the loading the simulated training scene, and performing operation monitoring through a simulation device, when a simulated training person pulls a simulated shooting button, performing fragment shooting, and obtaining a target fragment video specifically includes the following steps:
step S301, loading the simulated training scene in simulation equipment to perform scene follow-up display.
Step S302, performing operation monitoring through the simulation equipment, and judging whether the simulation training personnel press the simulated shooting button.
Step S303, when the simulation training personnel press the simulated shooting button, generating a shooting signal and a filming signal simultaneously.
Step S304, performing laser simulated shooting and fragment shooting according to the shooting signal and the filming signal to obtain a target fragment video.
Further, the method for evaluating the scene simulation training of the emergency event further comprises the following steps:
and step S400, extracting a target evaluation picture in the target fragment video.
In the embodiment of the invention, a plurality of frame-by-frame fragment pictures are obtained by performing frame-by-frame processing on the target fragment video; the shooting target in each frame-by-frame fragment picture is identified, and the sharpness of the shooting target in the frame-by-frame fragment pictures is compared to generate an identification comparison result; the frame-by-frame fragment picture in which the shooting target is clearest is then screened out from the plurality of frame-by-frame fragment pictures according to the identification comparison result and marked as the target evaluation picture.
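As one way to realize this frame-by-frame sharpness comparison, the OpenCV-based Python sketch below uses the variance of the Laplacian as the sharpness measure; the measure and the whole-frame (rather than target-region) comparison are simplifying assumptions.

```python
import cv2

def extract_target_evaluation_picture(video_path: str):
    """Split the target fragment video frame by frame and keep the frame with
    the highest sharpness, as a stand-in for the clearest shot target."""
    capture = cv2.VideoCapture(video_path)
    best_frame, best_sharpness = None, -1.0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        sharpness = cv2.Laplacian(gray, cv2.CV_64F).var()
        if sharpness > best_sharpness:
            best_frame, best_sharpness = frame, sharpness
    capture.release()
    return best_frame  # marked as the target evaluation picture
```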
Specifically, fig. 5 shows a flowchart of extracting a target evaluation picture in the method provided by the embodiment of the invention.
In a preferred embodiment of the present invention, the extracting the target evaluation picture in the target segment video specifically includes the following steps:
step S401, performing frame-by-frame processing on the target segment video to obtain a plurality of frame-by-frame segment pictures.
And step S402, performing target recognition comparison on the plurality of frame-by-frame fragment pictures to generate recognition comparison results.
Step S403, screening and marking a target evaluation picture from a plurality of the frame-by-frame fragment pictures according to the identification comparison result.
Further, the method for evaluating the scene simulation training of the emergency event further comprises the following steps:
and S500, performing evaluation analysis on the target evaluation picture to generate a training evaluation result.
In the embodiment of the invention, the position of the shooting target is identified in the target evaluation picture, the central position of the shooting target is marked as the coordinate origin, and a plane coordinate system is constructed based on the coordinate origin; the laser position is identified and its point in the plane coordinate system is determined; the relative distance between the laser position and the coordinate origin is calculated, and evaluation analysis of shooting accuracy is performed according to the relative distance, so that a training evaluation result is generated.
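A minimal sketch of this distance step, assuming the laser point and target centre have already been located in pixel coordinates; the grade_table mapping is an illustrative assumption.

```python
import math

def relative_distance(target_center, laser_point):
    """Coordinate origin at the centre of the shot target; return the relative
    distance of the laser point from the origin (in pixels)."""
    dx = laser_point[0] - target_center[0]
    dy = laser_point[1] - target_center[1]
    return math.hypot(dx, dy)

def accuracy_grade(distance, grade_table):
    """grade_table: assumed mapping of distance intervals to evaluation grades."""
    for grade, (low, high) in grade_table.items():
        if low <= distance < high:
            return grade
    return "miss"
```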
Specifically, fig. 6 shows a flowchart of generating a training evaluation result in the method provided by the embodiment of the present invention.
In the preferred embodiment of the present invention, the evaluation analysis is performed on the target evaluation picture, and the generation of the training evaluation result specifically includes the following steps:
step S501, identifying a laser position and a target position in the target evaluation picture.
Step S502, calculating a relative distance between the laser position and the target position.
And step S503, performing evaluation analysis according to the relative distance to generate a training evaluation result.
In the present embodiment, step S503 includes the following sub-steps:
step S5031, searching in a preset distance score mapping table according to the relative distance to obtain a corresponding distance score;
step S5032, determining a ring number deviation value between the laser position and the target position according to the laser position and the target position in the target evaluation picture;
step S5033, calculating to obtain a comprehensive score according to the distance scores and the ring number deviation values, and determining a corresponding evaluation grade in a preset score grade evaluation table according to the comprehensive score. In the present embodiment, the calculation formula of the integrated score value is expressed as:
wherein,representing the composite score value, < >>Represents the maximum value of the integrated score,/-, for example>Correction factor representing distance score term, +.>Score scaling factor representing distance score term, +.>Indicate->Distance score corresponding to secondary shooting,/->Indicating the total number of shots, +.>Correction factor representing the deviation term of the number of loops, +.>Score conversion factor representing the deviation term of the number of loops, +.>Indicate->The number of rings corresponding to the shot is deviated.
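One possible reading of this comprehensive score calculation, averaging corrected per-shot distance scores against corrected ring-number deviations and capping the result at a maximum, is sketched below; the combination form and all coefficient values are assumptions.

```python
def comprehensive_score(distance_scores, ring_deviations,
                        p_max=100.0, a1=1.0, b1=1.0, a2=1.0, b2=1.0):
    """Average the corrected distance scores minus the corrected ring-number
    deviations over all shots, capped at the maximum comprehensive score."""
    if not distance_scores:
        return 0.0
    n = len(distance_scores)
    raw = sum(a1 * b1 * d - a2 * b2 * e
              for d, e in zip(distance_scores, ring_deviations)) / n
    return min(p_max, raw)
```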
Further, fig. 7 shows a block diagram of a system provided by an embodiment of the present invention.
In another preferred embodiment of the present invention, a scenario simulation training evaluation system for an emergency event includes:
the wearing detection processing unit 10 is configured to perform simulation device wearing detection on a simulation training person, and generate a simulation start signal when the simulation device wearing is detected.
In the embodiment of the present invention, on the simulation equipment, the wearing detection processing unit 10 performs wearing detection on the simulation training personnel through infrared sensing technology and generates wearing detection data in real time; a real-time analysis result is generated by analyzing the wearing detection data, whether the simulation training personnel have correctly worn the simulation equipment is judged according to the real-time analysis result, and after the simulation equipment is correctly worn, the simulation training is ready to start and a simulation start signal is generated.
The scene selection import unit 20 is configured to select and import a simulated training scene according to the simulation start signal.
In the embodiment of the present invention, the scene selection and import unit 20 runs a scene selection program that is pre-imported according to the simulation start signal, and further, according to the scene selection program, the scene selection and import unit 20 performs corresponding scene selection from a preset scene database, thereby selecting a simulation training scene, and importing the simulation training scene into the simulation device.
Specifically, fig. 8 shows a block diagram of a scene selection import unit 20 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the scenario selection import unit 20 specifically includes:
the selection operation module 21 is configured to operate a preset scene selection program according to the simulation start signal.
The scene selection module 22 is configured to select a simulated training scene from a scene database according to the scene selection program.
A scene importing module 23, configured to import the simulated training scene into a simulation device.
Further, the scene simulation training evaluation system for emergency events further comprises:
the operation monitoring shooting unit 30 is configured to load the simulated training scene, perform operation monitoring through a simulation device, and perform segment shooting when a simulated training person pulls a simulated shooting button, so as to obtain a target segment video.
In the embodiment of the present invention, the operation monitoring shooting unit 30 loads the imported simulated training scene in the VR glasses of the simulation equipment, and the simulated training scene is displayed in the VR glasses in a follow-up manner as the simulation training personnel move; the shooting target is displayed in the simulated scene and in the real environment at the same time and at the same spatial position; operation monitoring is performed on the simulation training personnel through the simulated shooting prop of the simulation equipment to judge whether the simulation training personnel press the simulated shooting button; when the simulation training personnel press the simulated shooting button, a shooting signal and a filming signal are generated simultaneously, and laser simulated shooting and fragment shooting are then performed simultaneously according to the shooting signal and the filming signal, so that the target fragment video shot by the simulated shooting prop is obtained.
Specifically, fig. 9 shows a block diagram of the operation monitoring photographing unit 30 in the system according to the embodiment of the present invention.
In a preferred embodiment of the present invention, the operation monitoring photographing unit 30 specifically includes:
and the follow-up display module 31 is used for loading the simulated training scene in the simulation equipment and carrying out scene follow-up display.
The operation monitoring module 32 is used for performing operation monitoring through the simulation equipment and judging whether the simulation training personnel press the simulated shooting button.
The signal generating module 33 is configured to generate a shooting signal and a filming signal simultaneously when the simulation training personnel press the simulated shooting button.
The shooting module 34 is configured to perform laser simulated shooting and fragment shooting according to the shooting signal and the filming signal, and acquire a target fragment video.
Further, the scene simulation training evaluation system for emergency events further comprises:
and an evaluation picture extraction unit 40, configured to extract a target evaluation picture in the target clip video.
In the embodiment of the present invention, the evaluation picture extraction unit 40 obtains a plurality of frame-by-frame fragment pictures by performing frame-by-frame processing on the target fragment video, identifies the shooting target in each frame-by-frame fragment picture, compares the sharpness of the shooting target in the frame-by-frame fragment pictures to generate an identification comparison result, and then screens out, according to the identification comparison result, the frame-by-frame fragment picture in which the shooting target is clearest and marks it as the target evaluation picture.
Specifically, fig. 10 shows a block diagram of the structure of the evaluation picture extraction unit 40 in the system according to the embodiment of the present invention.
In a preferred embodiment provided by the present invention, the evaluation picture extraction unit 40 specifically includes:
and the frame-by-frame processing module 41 is used for performing frame-by-frame processing on the target fragment video to obtain a plurality of frame-by-frame fragment pictures.
The recognition comparison module 42 is configured to perform object recognition comparison on the plurality of frame-by-frame segment pictures, and generate a recognition comparison result.
And the screening and marking module 43 is used for screening and marking target evaluation pictures from a plurality of the frame-by-frame fragment pictures according to the identification comparison result.
Further, the scene simulation training evaluation system for emergency events further comprises:
and a picture evaluation analysis unit 50, configured to perform evaluation analysis on the target evaluation picture, and generate a training evaluation result.
In the embodiment of the present invention, the picture evaluation analysis unit 50 identifies the position of the shooting target in the target evaluation picture, marks the central position of the shooting target as the origin of coordinates, constructs a planar coordinate system based on the origin of coordinates, identifies the laser position, determines the laser point of the laser position in the planar coordinate system, calculates the relative distance between the laser point and the origin of coordinates, and performs evaluation analysis of shooting accuracy according to the magnitude of the relative distance, thereby generating a training evaluation result.
It should be understood that, although the steps in the flowcharts of the embodiments of the present invention are shown in an order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the steps are not strictly limited to this order of execution and may be performed in other orders. Moreover, at least some of the steps in the various embodiments may include multiple sub-steps or stages, which are not necessarily performed at the same time but may be performed at different times, and these sub-steps or stages are not necessarily performed in sequence but may be performed in turn or alternately with at least a portion of the sub-steps or stages of other steps.
The technical features of the above-described embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered to fall within the scope of this description.
The foregoing examples illustrate only a few embodiments of the invention and are described in detail herein without thereby limiting the scope of the invention. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the invention, which are all within the scope of the invention. Accordingly, the scope of protection of the present invention is to be determined by the appended claims.
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (10)

1. The scene simulation training evaluation method for the emergency is characterized by comprising the following steps of:
performing simulation equipment wearing detection on simulation training personnel, and generating a simulation start signal when the simulation equipment wearing is detected;
selecting and importing a simulation training scene according to the simulation starting signal;
loading the simulated training scene, performing operation monitoring through simulation equipment, and performing fragment shooting when the simulation training personnel press the simulated shooting button to obtain a target fragment video;
extracting a target evaluation picture in the target fragment video;
performing evaluation analysis on the target evaluation picture to generate a training evaluation result;
the step of selecting and importing the simulation training scene according to the simulation start signal specifically comprises the following steps:
according to the simulation start signal, running a preset scene selection program;
selecting a simulated training scene from a scene database through the scene selection program;
importing the simulated training scene into simulation equipment;
according to the simulation start signal, the method for running the preset scene selection program comprises the following steps:
when a simulation start signal is received, acquiring personnel attribute information of the simulation training personnel, wherein the personnel attribute information comprises a gender attribute, an age attribute, an occupation attribute and a character attribute;
according to the gender attribute, the age attribute, the occupation attribute and the character attribute, calculating to obtain personnel attribute categories of simulation training personnel and corresponding comprehensive attribute scores;
searching a plurality of sub-scene selection programs of the category corresponding to the personnel attribute category in a preset scene selection program library according to the personnel attribute category of the simulated training personnel;
and confirming the corresponding score grade in a preset attribute score mapping table according to the comprehensive attribute scores of the simulation training personnel, and confirming and operating the sub-scene selection program corresponding to the score grade in the selected multiple sub-scene selection programs of the current category.
2. The method for evaluating the scene simulation training of the emergency according to claim 1, wherein the step of performing the simulation equipment wearing detection by the simulation training personnel, and generating the simulation start signal when the completion of the simulation equipment wearing is detected, specifically comprises the steps of:
performing simulation equipment wearing detection on simulation training personnel to generate wearing detection data;
real-time analysis is carried out on the wearing detection data, and a real-time analysis result is generated;
judging whether the wearing of the simulation equipment is completed or not according to the real-time analysis result;
if the wearing of the simulation equipment is completed, generating a simulation start signal.
3. The method for scenario simulation training evaluation of emergency event according to claim 2, wherein the calculation formula of the comprehensive attribute score is expressed as:
$$F = F_0 + \mu\left(\omega_1 f_1 + \omega_2 f_2 + \omega_3 f_3 + \omega_4 f_4\right)$$
wherein $F$ represents the comprehensive attribute score, $F_0$ represents the reference value of the comprehensive attribute score, $\mu$ represents the correction factor of the attribute score, $\omega_1$ and $f_1$ represent the weight factor and attribute score of the gender attribute item, $\omega_2$ and $f_2$ represent the weight factor and attribute score of the age attribute item, $\omega_3$ and $f_3$ represent the weight factor and attribute score of the occupation attribute item, and $\omega_4$ and $f_4$ represent the weight factor and attribute score of the character attribute item.
4. A scene simulation training evaluation method for emergency events according to claim 3, wherein the method for determining the person attribute class of the simulation training person comprises the sub-steps of:
acquiring sex attribute, age attribute, occupation attribute and personality attribute of the current simulation training personnel;
searching and obtaining the association degree between each attribute of the simulated training personnel and the personnel attribute category in a preset attribute category mapping table according to the sex attribute, the age attribute, the occupation attribute and the character attribute of the current simulated training personnel;
according to the association degree between each attribute of the simulation training personnel and the attribute category of the personnel, calculating to obtain a comprehensive association degree value, and determining the attribute category of the personnel corresponding to the maximum comprehensive association degree value as the attribute category of the personnel of the current simulation training personnel;
the calculation formula of the comprehensive association value is expressed as:
$$R_k = R_0 + \sum_{i=1}^{4} \lambda_{i,k}\, r_{i,k}$$
wherein $R_k$ represents the comprehensive association between the attributes of the simulation training personnel and the $k$-th personnel attribute category, $R_0$ represents the reference value of the comprehensive association degree, $r_{i,k}$ represents the association degree between the $i$-th attribute of the simulation training personnel and the $k$-th personnel attribute category, and $\lambda_{i,k}$ represents the weight factor of the association degree between the $i$-th attribute and the $k$-th personnel attribute category;
after the comprehensive association degree value corresponding to each personnel attribute type is obtained through calculation, the personnel attribute type corresponding to the maximum comprehensive association degree value is determined as the personnel attribute type of the current simulation training personnel.
5. The method for simulating training and evaluating a scene for an emergency according to claim 4, wherein said loading the simulated training scene and performing operation monitoring by a simulation device, performing fragment shooting when a simulated training person pulls a simulated shooting button, and acquiring a target fragment video specifically comprises the steps of:
loading the simulated training scene in simulation equipment to perform scene follow-up display;
performing operation monitoring through the simulation equipment to judge whether the simulation training personnel press the simulated shooting button;
when the simulation training personnel press the simulated shooting button, generating a shooting signal and a filming signal simultaneously;
and performing laser simulated shooting and fragment shooting according to the shooting signal and the filming signal to obtain a target fragment video.
6. The method for training and evaluating a scene for an emergency according to claim 5, wherein the extracting the target evaluation picture in the target clip video specifically comprises the following steps:
performing frame-by-frame processing on the target fragment video to obtain a plurality of frame-by-frame fragment pictures;
performing target recognition comparison on a plurality of the frame-by-frame fragment pictures to generate recognition comparison results;
and screening and marking target evaluation pictures from a plurality of frame-by-frame fragment pictures according to the identification comparison result.
7. The method for simulating training and evaluating a scene for an emergency according to claim 6, wherein said performing evaluation analysis on said target evaluation picture, generating a training evaluation result specifically comprises the steps of:
identifying a laser position and a target position in the target evaluation picture;
calculating a relative distance between the laser position and the target position;
and performing evaluation analysis according to the relative distance to generate a training evaluation result.
8. The method for scene modeling training evaluation of emergency events according to claim 7, wherein the method for performing evaluation analysis based on the relative distance, generating training evaluation results, comprises the steps of:
searching in a preset distance score mapping table according to the relative distance to obtain a corresponding distance score;
determining a ring number deviation value between the laser position and the target position according to the laser position and the target position in the target evaluation picture;
and calculating according to the distance scores and the ring number deviation values to obtain comprehensive score values, and determining corresponding evaluation grades in a preset score grade evaluation table according to the comprehensive score values.
9. The scene modeling training evaluation method for emergency events according to claim 8, wherein the calculation formula of the comprehensive score value is expressed as:
$$P = \min\left\{P_{\max},\ \frac{1}{n}\sum_{i=1}^{n}\left(\alpha_1 \beta_1\, d_i - \alpha_2 \beta_2\, e_i\right)\right\}$$
wherein $P$ represents the comprehensive score value, $P_{\max}$ represents the maximum value of the comprehensive score, $\alpha_1$ represents the correction factor of the distance score term, $\beta_1$ represents the score scaling factor of the distance score term, $d_i$ represents the distance score corresponding to the $i$-th shot, $n$ represents the total number of shots, $\alpha_2$ represents the correction factor of the ring number deviation term, $\beta_2$ represents the score conversion factor of the ring number deviation term, and $e_i$ represents the ring number deviation corresponding to the $i$-th shot.
10. A scene simulation training evaluation system for an emergency event, characterized in that the scene simulation training evaluation method for an emergency event according to any one of claims 1 to 9 is performed, the system comprising a wearing detection processing unit, a scene selection import unit, an operation monitoring shooting unit, an evaluation picture extraction unit, and a picture evaluation analysis unit, wherein:
the wearing detection processing unit is used for carrying out simulation equipment wearing detection on simulation training personnel and generating a simulation starting signal when the simulation equipment wearing is detected;
the scene selection and import unit is used for selecting and importing a simulation training scene according to the simulation start signal;
the operation monitoring shooting unit is used for loading the simulated training scene, performing operation monitoring through simulation equipment, and shooting fragments when a simulated training person pulls a simulated shooting button to acquire a target fragment video;
the evaluation picture extraction unit is used for extracting a target evaluation picture in the target fragment video;
the picture evaluation analysis unit is used for performing evaluation analysis on the target evaluation picture and generating a training evaluation result;
the scene selection import unit specifically comprises:
the selection operation module is used for operating a preset scene selection program according to the simulation start signal;
the scene selection module is used for selecting a simulated training scene from a scene database through the scene selection program;
the scene importing module is used for importing the simulated training scene into simulation equipment;
the operation monitoring shooting unit specifically comprises:
the follow-up display module is used for loading the simulated training scene in the simulation equipment and carrying out scene follow-up display;
the operation monitoring module is used for performing operation monitoring through the simulation equipment and judging whether the simulation training personnel press the simulated shooting button;
the signal generation module is used for generating a shooting signal and a filming signal simultaneously when the simulation training personnel press the simulated shooting button;
the shooting module is used for performing laser simulated shooting and fragment shooting according to the shooting signal and the filming signal to acquire a target fragment video;
the evaluation picture extraction unit specifically includes:
the frame-by-frame processing module is used for carrying out frame-by-frame processing on the target fragment video to obtain a plurality of frame-by-frame fragment pictures;
the identification comparison module is used for carrying out target identification comparison on a plurality of the frame-by-frame fragment pictures and generating an identification comparison result;
and the screening marking module is used for screening and marking target evaluation pictures from a plurality of frame-by-frame fragment pictures according to the identification comparison result.
CN202410087035.7A 2024-01-22 2024-01-22 Scene simulation training evaluation system and method for emergency Active CN117610794B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410087035.7A CN117610794B (en) 2024-01-22 2024-01-22 Scene simulation training evaluation system and method for emergency

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410087035.7A CN117610794B (en) 2024-01-22 2024-01-22 Scene simulation training evaluation system and method for emergency

Publications (2)

Publication Number Publication Date
CN117610794A 2024-02-27
CN117610794B 2024-04-19

Family

ID=89956496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410087035.7A Active CN117610794B (en) 2024-01-22 2024-01-22 Scene simulation training evaluation system and method for emergency

Country Status (1)

Country Link
CN (1) CN117610794B (en)


Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108489330A (en) * 2018-02-08 2018-09-04 乌鲁木齐涅墨西斯网络科技有限公司 Police more people's interactive virtual reality qualification course training systems and application method
US20200316470A1 (en) * 2018-04-27 2020-10-08 Tencent Technology (Shenzhen) Company Limited Method and terminal for displaying distance information in virtual scene
CN109171772A (en) * 2018-08-13 2019-01-11 李丰 A kind of psychological quality training system and training method based on VR technology
CN109064821A (en) * 2018-08-31 2018-12-21 苏州竹原信息科技有限公司 A kind of security comprehensive training system and method based on virtual reality
US20210192967A1 (en) * 2019-12-09 2021-06-24 Bob Ferris System and method for virtual target simulation
CN113720202A (en) * 2020-05-12 2021-11-30 广东仁光科技有限公司 Immersive 3D image shooting training target range software system and method
CN113205281A (en) * 2021-05-28 2021-08-03 中国建设银行股份有限公司 Scene simulation-based personnel ability evaluation method and related equipment
CN116608726A (en) * 2023-04-25 2023-08-18 江苏星辰数字科技有限公司 Training system for simulating shooting and countermeasure in real environment
CN116558360A (en) * 2023-06-20 2023-08-08 中国人民解放军91976部队 Shooting simulation training method and system based on moving carrier
CN116850567A (en) * 2023-06-21 2023-10-10 中国人民解放军陆军军医大学第二附属医院 VR-based simulated reality environment high-risk professional crowd contusion resistant quality enhancement system
CN116843196A (en) * 2023-06-26 2023-10-03 西安速度时空大数据科技有限公司 Intelligent training method and system applied to military training
CN116665841A (en) * 2023-07-28 2023-08-29 山东大学 Directional shooting athlete reaction training device and real-time evaluation system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
MA Feifei et al., "Research on the Comprehensive Evaluation of Simulation Training" (模拟训练综合评价问题的研究), Microcomputer Information (微计算机信息), vol. 26, no. 7, 31 December 2010 (2010-12-31), pages 155-156 *

Also Published As

Publication number Publication date
CN117610794B (en) 2024-04-19

Similar Documents

Publication Publication Date Title
CN110399905B (en) Method for detecting and describing wearing condition of safety helmet in construction scene
US9141184B2 (en) Person detection system
US9183431B2 (en) Apparatus and method for providing activity recognition based application service
EP1530157B1 (en) Image matching system using 3-dimensional object model, image matching method, and image matching program
JP6210650B2 (en) Image search system and image search method
JP2013232181A (en) Image processing apparatus, and image processing method
CN110648352A (en) Abnormal event detection method and device and electronic equipment
CN111228821B (en) Method, device and equipment for intelligently detecting wall-penetrating plug-in and storage medium thereof
CN111062303A (en) Image processing method, system and computer storage medium
CN108121961A (en) Inspection Activity recognition method, apparatus, computer equipment and storage medium
US20210042935A1 (en) Object tracker, object tracking method, and computer program
CN113011280A (en) Method and device for detecting person contact distance, computer equipment and storage medium
US7545983B2 (en) Person image retrieval apparatus
US20220300774A1 (en) Methods, apparatuses, devices and storage media for detecting correlated objects involved in image
KR101124560B1 (en) Automatic object processing method in movie and authoring apparatus for object service
CN117610794B (en) Scene simulation training evaluation system and method for emergency
US11527091B2 (en) Analyzing apparatus, control method, and program
CN112579907A (en) Abnormal task detection method and device, electronic equipment and storage medium
JP2021026744A (en) Information processing device, image recognition method, and learning model generation method
CN115311601A (en) Fire detection analysis method based on video analysis technology
Sebastian et al. Performance evaluation metrics for video tracking
CN110348295B (en) Target detection method, somatosensory interaction device and storage medium
CN113544700A (en) Neural network training method and device, and associated object detection method and device
Chida et al. Enhanced Encoding with Improved Fuzzy Decision Tree Testing Using CASP Templates
CN116630752B (en) Construction site target object identification method and device based on AI algorithm

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant