CN112684884A - Flight training action automatic identification and evaluation system - Google Patents
- Publication number
- CN112684884A (Application CN202011532200.3A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- sequence
- flight
- preset
- actions
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
The invention discloses an automatic recognition and evaluation system for flight training actions, comprising a gesture ultrasonic acquisition device, a data processing system and a gesture database in which a model of each gesture action is preset. Each gesture action is assigned a value in turn according to the correct operation sequence of the aircraft. The gesture ultrasonic acquisition device acquires the pilot's gesture actions; the data processing system compares each acquired gesture with the gesture model parameters in the database, determines the gesture action type and assigns it the corresponding value; and the recognized gesture actions are arranged in chronological order. The invention recognizes the pilot's gestures, records their order, and finally scores and evaluates the whole flight operation according to the operation sequence and the proportion of missing actions, so that the evaluation process is entirely objective and the large amount of human effort required for manual evaluation is reduced.
Description
Technical Field
The invention relates to the technical field of flight teaching, in particular to an automatic identification and evaluation system for flight training actions.
Background
Flight performance is the most intuitive and effective measure of a pilot's level in daily training (including simulated flight training). At present, flight performance is judged mainly by an assessor who scores subjectively with the flight training syllabus as reference, so the existing evaluation mode suffers from strong subjectivity, a lack of systematic method, and slow, incomplete scoring. In addition, the assessor must know the flight training syllabus in detail in order to evaluate flight performance, and the assessor's workload is heavy.
Disclosure of Invention
The technical problem to be solved by the invention is to overcome the above defects and provide an automatic evaluation system for simulated or real flight training.
In order to solve the above technical problems, the invention adopts the following technical scheme: a flight training action automatic identification and evaluation system, comprising a gesture ultrasonic acquisition device and a data processing system, characterized in that:
the system further comprises a gesture database in which a model of each gesture action is preset;
each gesture action is assigned a value in turn according to the correct operation sequence of the aircraft, the values being 1, 2, 3, …, n, to obtain a sequence Q;
the gesture ultrasonic acquisition device is used for acquiring the pilot's gesture actions;
the data processing system compares each acquired gesture with the gesture model parameters in the database, determines the gesture action type and assigns it the corresponding value;
the recognized gesture actions are arranged in chronological order to obtain a sequence Xi;
the sequence Q is screened according to the sequence Xi: the intersection of the sequence Xi and the sequence Q is taken and sorted by value to obtain a sequence Si;
if the degree of mixing is ρ, then:
wherein m is the number of items in the sequence Si, l is the difference in the number of items between the sequence Xi and the sequence Q, and a and b are set parameters;
and the action is evaluated and scored according to the magnitude of ρ.
Further, a gesture probability vector for the gesture features is obtained using a pre-trained gesture classification model, the gesture probability vector being composed of the probabilities that the gesture features belong to each preset gesture. A context probability vector related to the gesture features is determined from the acquired context information and a predetermined context feature matrix, where the context feature matrix is composed of the probability of each preset gesture occurring under each context factor, and the context probability vector is composed of the probability of each preset gesture occurring in the current context. The probability that the gesture features belong to each preset gesture in the current context is then determined from the gesture probability vector and the context probability vector, and the gesture with the maximum probability is identified as the gesture corresponding to the acquired ultrasonic signals.
Further, the position of the aircraft is obtained in real time from GPS, the flight track of the aircraft is obtained from these positions, and the correlation r between the flight track and the preset track is determined.
Further, a coordinate system is established with longitude as the x axis and latitude as the y axis, and the flight track and the preset track are displayed in this coordinate system,
wherein X is the latitude values corresponding to the flight track and Y is the latitude values corresponding to the preset track;
and the flight is scored and evaluated according to the value of r.
It can be seen from the above technical scheme that the invention has the following advantages: the invention recognizes the pilot's gestures, records their order, and finally scores and evaluates the whole flight operation according to the operation sequence and the proportion of missing actions, so that the evaluation process is entirely objective and the large amount of human effort required for manual evaluation is reduced.
Detailed Description
The invention discloses an automatic identification and evaluation system for flight training actions, which comprises a gesture ultrasonic acquisition device and a data processing system.
First, a gesture database must be established, with a model of each gesture action preset in the database; the acquired gestures are then identified by a trained gesture classification model.
The specific identification method is as follows: a gesture probability vector for the gesture features is obtained using the pre-trained gesture classification model, the gesture probability vector being composed of the probabilities that the gesture features belong to each preset gesture. A context probability vector related to the gesture features is determined from the acquired context information and a predetermined context feature matrix, where the context feature matrix is composed of the probability of each preset gesture occurring under each context factor, and the context probability vector is composed of the probability of each preset gesture occurring in the current context. The probability that the gesture features belong to each preset gesture in the current context is then determined from the gesture probability vector and the context probability vector, and the gesture with the maximum probability is identified as the gesture corresponding to the acquired ultrasonic signals.
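As a minimal sketch of this identification step, assuming element-wise multiplication is the rule for combining the two probability vectors (the text does not specify the exact operation) and that the context feature matrix is indexed as [context factor, preset gesture]:

```python
import numpy as np

def fuse_gesture_and_context(gesture_probs, context_matrix, context_id):
    """Combine the classifier's gesture probability vector with the
    context probability vector of the current context factor.

    gesture_probs  : (n_gestures,) array from the pre-trained classifier
    context_matrix : (n_context_factors, n_gestures) array; entry [c, g]
                     is the probability of preset gesture g under
                     context factor c
    context_id     : index of the current context factor
    """
    context_probs = context_matrix[context_id]   # context probability vector
    combined = gesture_probs * context_probs     # assumed combination rule
    combined = combined / combined.sum()         # renormalize to probabilities
    return int(np.argmax(combined))              # index of the most likely gesture
```

The gesture whose combined probability is maximal is then taken as the gesture corresponding to the acquired ultrasonic signals.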
The evaluation process of the flight action is as follows: each gesture action is assigned a value in turn according to the correct operation sequence of the aircraft, the values being 1, 2, 3, …, n, to obtain a sequence Q. For example, if the operation flow has one hundred steps, then n = 100.
The data processing system compares each acquired gesture with the gesture model parameters in the database, determines the gesture action type and assigns it the corresponding value; the recognized gesture actions are arranged in chronological order to obtain a sequence Xi.
The sequence Q is screened according to the sequence Xi: the intersection of the sequence Xi and the sequence Q is taken and sorted by value to obtain a sequence Si.
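A short sketch of this sequence construction under the stated definitions; `recognized_values` stands in for the assigned values of the gestures recognized in time order:

```python
def build_sequences(n_steps, recognized_values):
    """Build the sequences Q, Xi and Si described above.

    n_steps           : n, the number of steps in the correct operation flow
    recognized_values : assigned values of the recognized gestures, in time order
    """
    Q = list(range(1, n_steps + 1))      # sequence Q: 1, 2, ..., n
    Xi = list(recognized_values)         # sequence Xi: recognized actions in time order
    Si = sorted(set(Xi) & set(Q))        # intersection of Xi and Q, sorted by size
    return Q, Xi, Si
```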
If the degree of mixing is ρ, then:
wherein m is the number of items in the sequence Si, l is the difference in the number of items between the sequence Xi and the sequence Q, and a and b are set parameters. The operator can set these parameters from experience; parameter b mainly reflects the effect of missing actions.
The action is evaluated and scored according to the magnitude of ρ. With this formula, scoring of the operation flow in flight training can be completed automatically, reducing the human effort of the manual judging process.
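Since the formula for ρ is not reproduced in this text, the sketch below uses one plausible form built only from the stated quantities (m, l, a, b); both the helper `mixing_degree` and its exact expression are assumptions for illustration, not the patent's formula:

```python
def mixing_degree(Q, Xi, Si, a=1.0, b=1.0):
    """Illustrative mixing degree rho (assumed form, not the patent's formula).

    a weights out-of-order actions; b weights missing actions, as the
    description suggests.
    """
    m = len(Si)                          # number of items in the sequence Si
    l = abs(len(Xi) - len(Q))            # item-count difference between Xi and Q
    # Crude out-of-order measure: adjacent inversions in Xi (an assumption).
    inversions = sum(1 for i in range(len(Xi) - 1) if Xi[i] > Xi[i + 1])
    return a * inversions / max(m, 1) + b * l / max(len(Q), 1)
```

Under this reading, a larger ρ means a more disordered or less complete operation flow, and the score assigned to the flight action falls accordingly.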
In addition, the invention also calculates the deviation of the flight track, so as to evaluate the overall flight training effect. The specific steps are as follows: the position of the aircraft is acquired in real time from the GPS positioning system, the flight track of the aircraft is obtained from these positions, and the correlation r between the flight track and the preset track is determined. A coordinate system is established with longitude as the x axis and latitude as the y axis, and the flight track and the preset track are displayed in this coordinate system,
wherein X is the latitude values corresponding to the flight track and Y is the latitude values corresponding to the preset track;
and the flight is scored and evaluated according to the value of r.
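The formula for r is likewise not reproduced here; given that X and Y are the latitude values of the two tracks, a Pearson-style correlation is one natural reading, sketched below under that assumption (the two tracks are assumed to be sampled at matching points):

```python
import math

def trajectory_correlation(X, Y):
    """Assumed Pearson correlation between the flown track's latitude
    values X and the preset track's latitude values Y, sampled at the
    same points. The patent's own expression for r may differ.
    """
    n = len(X)
    mean_x, mean_y = sum(X) / n, sum(Y) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(X, Y))
    sx = math.sqrt(sum((x - mean_x) ** 2 for x in X))
    sy = math.sqrt(sum((y - mean_y) ** 2 for y in Y))
    return cov / (sx * sy) if sx and sy else 0.0
```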
Claims (4)
1. A flight training action automatic identification and evaluation system, comprising a gesture ultrasonic acquisition device and a data processing system, characterized in that:
the system further comprises a gesture database in which a model of each gesture action is preset;
each gesture action is assigned a value in turn according to the correct operation sequence of the aircraft, the values being 1, 2, 3, …, n, to obtain a sequence Q;
the gesture ultrasonic acquisition device is used for acquiring the pilot's gesture actions;
the data processing system compares each acquired gesture with the gesture model parameters in the database, determines the gesture action type and assigns it the corresponding value;
the recognized gesture actions are arranged in chronological order to obtain a sequence Xi;
the sequence Q is screened according to the sequence Xi: the intersection of the sequence Xi and the sequence Q is taken and sorted by value to obtain a sequence Si;
if the degree of mixing is ρ, then:
wherein m is the number of items in the sequence Si, l is the difference in the number of items between the sequence Xi and the sequence Q, and a and b are set parameters; and the action is evaluated and scored according to the magnitude of ρ.
2. The flight training action automatic identification and evaluation system according to claim 1, wherein: a gesture probability vector for the gesture features is obtained using a pre-trained gesture classification model, the gesture probability vector being composed of the probabilities that the gesture features belong to each preset gesture; a context probability vector related to the gesture features is determined from the acquired context information and a predetermined context feature matrix, where the context feature matrix is composed of the probability of each preset gesture occurring under each context factor, and the context probability vector is composed of the probability of each preset gesture occurring in the current context; and the probability that the gesture features belong to each preset gesture in the current context is determined from the gesture probability vector and the context probability vector, and the gesture with the maximum probability is identified as the gesture corresponding to the acquired ultrasonic signals.
3. The flight training action automatic identification and evaluation system according to claim 1, wherein: the position of the aircraft is acquired in real time from GPS, the flight track of the aircraft is obtained from these positions, and the correlation r between the flight track and the preset track is determined.
4. The flight training action automatic identification and evaluation system according to claim 3, wherein: a coordinate system is established with longitude as the x axis and latitude as the y axis, and the flight track and the preset track are displayed in this coordinate system,
wherein X is the latitude values corresponding to the flight track and Y is the latitude values corresponding to the preset track;
and the flight is scored and evaluated according to the value of r.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011532200.3A CN112684884A (en) | 2020-12-22 | 2020-12-22 | Flight training action automatic identification and evaluation system |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112684884A | 2021-04-20 |
Family
ID=75450837
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011532200.3A (Pending) | Flight training action automatic identification and evaluation system | 2020-12-22 | 2020-12-22 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112684884A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114529177A (en) * | 2022-02-10 | 2022-05-24 | 北方天途航空技术发展(北京)有限公司 | Training result intelligent judgment method and system for training machine |
Legal Events
Date | Code | Title | Description |
---|---|---|---
| PB01 | Publication | |
| WD01 | Invention patent application deemed withdrawn after publication | Application publication date: 20210420 |