CN116704401A - Grading verification method and device for operation type examination, electronic equipment and storage medium - Google Patents


Info

Publication number
CN116704401A
CN116704401A (application number CN202310487992.4A)
Authority
CN
China
Prior art keywords
score
examination
examinee
scoring
determining
Prior art date
Legal status
Pending
Application number
CN202310487992.4A
Other languages
Chinese (zh)
Inventor
陈浩 (Chen Hao)
Current Assignee
Hubei Yikangsi Technology Co ltd
Original Assignee
Hubei Yikangsi Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hubei Yikangsi Technology Co ltd filed Critical Hubei Yikangsi Technology Co ltd
Priority to CN202310487992.4A
Publication of CN116704401A
Legal status: Pending

Classifications

    • G06V 20/41 — Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G06Q 10/06393 — Score-carding, benchmarking or key performance indicator [KPI] analysis
    • G06Q 50/205 — Education administration or guidance
    • G06T 17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects

Abstract

The embodiment of the application discloses a score verification method and apparatus for an operation type examination, an electronic device, and a storage medium. The method includes: obtaining an operation type examination video and a score of a target examinee; obtaining examinee operation information from the operation type examination video; determining an expected score of the target examinee based on the examinee operation information; and determining a verification result of the score based on a comparison between the score and the expected score. By extracting the examinee operation information from the operation type examination video, giving a corresponding expected score, and verifying the score against the expected score, the embodiment of the application realizes score verification for the operation type examination and avoids large scoring errors.

Description

Grading verification method and device for operation type examination, electronic equipment and storage medium
Technical Field
The present application relates to the field of score verification technologies, and in particular, to a score verification method and apparatus for an operation examination, an electronic device, and a storage medium.
Background
For practical operation type examinations, such as nursing operation examinations and machine tool maintenance examinations, scoring is generally performed manually by a proctor based on the examinee's actual on-site operation. However, because the examinee's operations are transient, the proctor often cannot observe all of the actual operations in time, so the proctor's score may contain a large error, and the score cannot be verified after the examination ends.
Therefore, existing operation type examinations have the technical problem that scores cannot be verified.
Disclosure of Invention
In view of the defects of the prior art, the application provides a score verification method and apparatus for an operation type examination, an electronic device, and a storage medium, which aim to realize score verification for the operation type examination and thereby avoid large scoring errors.
In a first aspect, an embodiment of the present application provides a score verification method for an operation examination, including:
obtaining operation examination videos and scores of target examinees;
obtaining examinee operation information in the operation type examination video;
determining an expected score of the target examinee based on the examinee operation information;
a verification result of the score is determined based on a comparison between the score and the expected score.
In some embodiments of the present application, the obtaining the test taker operation information in the operation type test video includes:
performing three-dimensional modeling on the target examinee in the operation type examination video to obtain a three-dimensional model of the target examinee;
acquiring a plurality of pieces of action information of the three-dimensional model in the operation type examination video and the sequence of execution of the plurality of pieces of action information in the operation type examination video;
And generating examinee operation information in the operation type examination video based on the action information and the sequence of execution.
In some embodiments of the application, the determining the expected score of the target candidate based on the candidate operation information includes:
acquiring preset standard operation information of the operation type examination video;
determining the operation matching degree between the examinee operation information and the preset standard operation information;
and determining the expected score of the target examinee based on the operation matching degree.
In some embodiments of the present application, the determining the operation matching degree between the test taker operation information and the preset standard operation information includes:
inputting the examinee operation information and the preset standard operation information into a preset operation matching degree model;
and acquiring the operation matching degree output by the operation matching degree model.
In some embodiments of the application, the determining a verification result of the score based on a comparison between the score and the expected score comprises:
determining a difference between the score and the expected score;
when the difference value is smaller than or equal to a preset threshold value, judging that the grading verification result is verification passing;
And when the difference value is larger than the preset threshold value, judging that the check result of the score is that the check fails.
In some embodiments of the application, the determining the expected score of the target candidate based on the operation matching degree includes:
acquiring a plurality of historical scores of scoring personnel of the operation type examination video;
determining the scoring verification passing rate of the scoring person based on the historical verification results of the plurality of historical scores;
and correcting the operation matching degree by adopting the scoring verification passing rate to obtain the expected score of the target examinee.
In some embodiments of the present application, the obtaining an operation class examination video of a target examinee includes:
acquiring video data of a plurality of angles when the target examinee performs an operation examination;
and taking the video data of the angles as the operation examination video of the target examinee.
In a second aspect, an embodiment of the present application further provides a scoring and verifying apparatus for an operation test, including:
the first acquisition module is used for acquiring operation examination videos and scores of the target examinees;
the second acquisition module is used for acquiring the operation information of the examinee in the operation type examination video;
The first determining module is used for determining the expected score of the target examinee based on the examinee operation information;
and a second determination module for determining a verification result of the score based on a comparison between the score and the expected score.
In a third aspect, an embodiment of the present application further provides an electronic device, including a memory, a processor, and a computer program stored in the memory and capable of running on the processor, where the processor implements the score checking method for the operation test according to the first aspect when executing the computer program.
In a fourth aspect, an embodiment of the present application further provides a computer readable storage medium, where the computer readable storage medium stores a computer program, where the computer program when executed by a processor causes the processor to execute the score checking method for the operation class examination described in the first aspect.
The embodiment of the application provides a score verification method and apparatus for an operation type examination, an electronic device, and a storage medium. The method includes: obtaining an operation type examination video and a score of a target examinee; obtaining examinee operation information from the operation type examination video; determining an expected score of the target examinee based on the examinee operation information; and determining a verification result of the score based on a comparison between the score and the expected score. By extracting the examinee operation information from the operation type examination video, giving a corresponding expected score, and verifying the score against the expected score, the embodiment of the application realizes score verification for the operation type examination and avoids large scoring errors.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic view of a scoring and checking system for an operation examination according to an embodiment of the present application;
FIG. 2 is a flowchart of one embodiment of a method for verifying scores of an operation class test according to an embodiment of the present application;
FIG. 3 is a flowchart of another embodiment of a score verification method for an operation class examination according to the present application;
FIG. 4 is a flowchart of still another embodiment of a score verification method for an operation class examination according to the present application;
FIG. 5 is a schematic structural diagram of an embodiment of a scoring and verifying device for an operation-type examination according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of an embodiment of an electronic device provided in an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to fall within the scope of the application.
In the description of the present application, the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more of the described features. In the description of the present application, the meaning of "a plurality" is two or more, unless explicitly defined otherwise.
In the present application, the term "exemplary" is used to mean "serving as an example, instance, or illustration. Any embodiment described as "exemplary" in this disclosure is not necessarily to be construed as preferred or advantageous over other embodiments. The following description is presented to enable any person skilled in the art to make and use the application. In the following description, details are set forth for purposes of explanation. It will be apparent to one of ordinary skill in the art that the present application may be practiced without these specific details. In other instances, well-known structures and processes have not been described in detail so as not to obscure the description of the application with unnecessary detail. Thus, the present application is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features disclosed herein.
The embodiment of the application provides a scoring and verifying method and device for operation type examination, electronic equipment and a storage medium, and the scoring and verifying method, the device, the electronic equipment and the storage medium are respectively described in detail below.
Referring to fig. 1, fig. 1 is a schematic view of a scoring and verifying system for an operation type examination according to an embodiment of the present application, where the scoring and verifying system for an operation type examination may include an electronic device 100, and a scoring and verifying apparatus for an operation type examination is integrated in the electronic device 100.
In the embodiment of the present application, the electronic device 100 may be a terminal or a server. When the electronic device 100 is a server, it may be an independent server, or a server network or server cluster composed of servers. For example, the electronic device 100 described in the embodiment of the present application includes, but is not limited to, a computer, a network host, a single network server, a set of multiple network servers, or a cloud server constructed from multiple servers, wherein the cloud server is built from a large number of computers or web servers based on cloud computing.
It will be appreciated that when the electronic device 100 is a terminal in the embodiment of the present application, the terminal used may be a device that includes both receiving and transmitting hardware, i.e., a device having receiving and transmitting hardware capable of performing two-way communication over a two-way communication link. Such a device may include: a cellular or other communication device having a single-line display or a multi-line display or a cellular or other communication device without a multi-line display. The specific electronic device 100 may be a desktop terminal or a mobile terminal, and the electronic device 100 may be one of a mobile phone, a tablet computer, a notebook computer, a medical auxiliary instrument, and the like.
It will be understood by those skilled in the art that the application environment shown in fig. 1 is merely an application scenario of the present application, and is not limited to the application scenario of the present application, and other application environments may further include more or fewer electronic devices than those shown in fig. 1, for example, only 1 electronic device is shown in fig. 1, and it will be understood that the scoring verification system for an operation type examination may further include one or more other electronic devices, which is not limited herein.
In addition, as shown in fig. 1, the score verification system for the operation type examination may further include a memory 200 for storing data, such as the operation type examination videos and scores of target examinees, and the score verification data generated when the score verification system for the operation type examination runs.
It should be noted that the scenario of the score verification system for the operation type examination shown in fig. 1 is only an example. The score verification system and scenario described in the embodiments of the present application are intended to explain the technical solutions of the embodiments more clearly and do not constitute a limitation on those technical solutions. As a person of ordinary skill in the art can appreciate, with the evolution of the score verification system for the operation type examination and the appearance of new service scenarios, the technical solutions provided by the embodiments of the present application remain applicable to similar technical problems.
Next, a scoring and verifying method for an operation examination provided by the embodiment of the application is described.
In the embodiments of the score verification method for the operation type examination, the score verification apparatus for the operation type examination serves as the execution subject. To simplify the description, the execution subject is omitted in the subsequent method embodiments; the score verification apparatus for the operation type examination is applied to an electronic device.
Referring to fig. 2 to fig. 4, fig. 2 is a flowchart illustrating an embodiment of a scoring and checking method for an operation type test according to an embodiment of the present application, where the scoring and checking method for an operation type test includes:
201. obtaining operation examination videos and scores of target examinees;
in the embodiment of the application, the operation type examination video is a video of the target examinee taking the operation type examination. The operation type examination may be a nursing operation examination, a machine tool maintenance examination, or the like. The score is a manual score given for the target examinee's operation in the operation type examination, and therefore needs to be verified.
In some embodiments of the present application, obtaining operation class examination videos and scores of the target examinee may include: acquiring video data of multiple angles when a target examinee performs an operation examination; and taking the video data of the plurality of angles as an operation examination video of the target examinee, wherein the operation examination video of the target examinee comprises the video data of the plurality of angles.
In some embodiments of the present application, the camera device for capturing the examination video of the operation type is typically a fisheye camera. The fish-eye camera is widely applied to the field with the requirement of open area monitoring due to the advantages of wide vision, large visual angle and the like. The image shot by the fisheye camera is called a fisheye image, and because of the imaging principle of the fisheye camera, the fisheye image has larger image distortion, and distortion correction is often required to be performed on the fisheye image before direct watching or downstream tasks are performed.
At present, in the process of correcting fisheye image distortion, the calibrated camera intrinsic parameters and distortion coefficients are generally used to correct the fisheye image. However, the edges of an image corrected in this way are greatly deformed, so the corrected image must be edge-cropped; it therefore loses the edge information of the original image, the field of view shrinks, and this contradicts the original purpose of using a fisheye camera. Meanwhile, the method has high computational complexity, so independent, real-time correction cannot be achieved on terminals such as mobile phones and USB cameras. In order to solve this technical problem, the application also provides a fine tuning method of a model, which comprises the following steps:
The fine tuning method of the model includes the following steps S110 to S130.
S110, acquiring a pre-trained generative adversarial network; the generative adversarial network comprises an image discrimination model and a first image correction model;
s120, constructing a second image correction model according to the first image correction model; the first image correction model is a teacher model, and the second image correction model is a student model;
and S130, performing knowledge distillation on the second image correction model according to the first image correction model to obtain a distilled second image correction model.
In particular, a generative adversarial network generally includes a generator and a discriminator; the idea of adversarial training is adopted, so that the generator and the discriminator learn by playing against each other until the generator outputs the required data.
In this embodiment, the generative adversarial network is obtained in advance through training on image samples and is used for correcting distorted images. The discriminator in the pre-trained generative adversarial network is the image discrimination model, and the generator is the first image correction model, which can be used to correct a distorted image. However, the parameter count of the first image correction model is large, and independent, real-time correction cannot be achieved on terminals such as mobile phones and USB cameras. Therefore, after the pre-trained generative adversarial network is acquired, the first image correction model in the generative adversarial network is taken as a teacher model, and a student model of the first image correction model, that is, the second image correction model, is constructed.
The teacher model is a single complex network or an ensemble of several networks with good performance and generalization ability, while the student model has a small network scale and limited expressive power. The teacher model, with its stronger learning ability, can transfer the knowledge it has learned to the student model with relatively weak learning ability, thereby enhancing the generalization ability of the student model. In the application, the teacher model is used to assist the training of the student model, so that the student model achieves performance comparable to that of the teacher model with a greatly reduced number of parameters, realizing model compression and acceleration.
In the process of constructing the second image correction model from the first image correction model, the second image correction model can be built by pruning, parameter sharing, and similar means applied to the first image correction model; the first image correction model is then used to assist the training of the second image correction model, so that the knowledge learned by the first image correction model is transferred into the second image correction model, and the distilled second image correction model has the same image correction function as the first image correction model.
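As a purely illustrative aid (the application does not specify a training framework, network architecture or loss), a single knowledge-distillation step for such a teacher-student pair might look like the following PyTorch sketch, in which the model classes and the loss weighting are assumptions:

```python
# Illustrative sketch only: one distillation step for an image correction student model.
import torch
import torch.nn as nn

def distill_step(teacher: nn.Module, student: nn.Module,
                 distorted: torch.Tensor, label: torch.Tensor,
                 optimizer: torch.optim.Optimizer, alpha: float = 0.5) -> float:
    """The student mimics the teacher's corrected output while also fitting the undistorted label image."""
    teacher.eval()
    student.train()
    with torch.no_grad():
        teacher_out = teacher(distorted)        # output of the first image correction model
    student_out = student(distorted)            # output of the second image correction model
    loss = alpha * nn.functional.l1_loss(student_out, teacher_out) \
         + (1 - alpha) * nn.functional.l1_loss(student_out, label)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```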
In other embodiments, steps S210 and S220 are further included before step S110.
S210, constructing a first image sample set according to internal parameters and distortion coefficients of a camera;
S220, training the generative adversarial network according to the first image sample set to obtain the trained generative adversarial network.
In this embodiment, the first image sample set is an image data set used for training the generative adversarial network. It consists of distorted images and undistorted images, where a distorted image is the image obtained after distortion is applied to an undistorted image. The distorted image can be obtained by processing an undistorted image based on the internal parameters and distortion coefficients of the camera. The camera is preferably, but not limited to, a fisheye camera.
In other embodiments, step S210 includes steps S211, S212, and S213.
S211, calibrating the camera according to a chessboard calibration method to obtain internal parameters and distortion coefficients of the camera;
s212, processing the first image according to the internal parameters and the distortion coefficients to obtain a second image; wherein the second image is a distorted image of the first image;
s213, constructing the first image sample set according to the first image and the second image.
In this embodiment, the internal parameters and distortion coefficients of the camera may be obtained by a checkerboard calibration method. Specifically, a calibration checkerboard may be photographed from multiple angles and positions with the fisheye camera, and the internal parameters and distortion coefficients are calculated with a fisheye calibration algorithm. The matrix K of internal parameters may be:
K = [[f_x, 0, c_x], [0, f_y, c_y], [0, 0, 1]]
The vector D of distortion coefficients may be:
D = (k_1, k_2, k_3, k_4)
where f_x and f_y are the focal length parameters, c_x and c_y are the longitudinal and transverse offsets of the image origin relative to the imaging point of the optical center, and k_1, k_2, k_3 and k_4 are the radial and lateral distortion coefficients of the camera.
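As an illustrative aid only (not part of the application's disclosure), the checkerboard calibration of step S211 can be sketched with OpenCV's fisheye module as follows; the board size, image paths and calibration flags are assumptions:

```python
# Illustrative sketch of fisheye checkerboard calibration (paths and board size assumed).
import glob
import cv2
import numpy as np

board = (9, 6)                                        # inner corners of the checkerboard
objp = np.zeros((1, board[0] * board[1], 3), np.float64)
objp[0, :, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2)

obj_points, img_points, img_size = [], [], None
for path in glob.glob("calib/*.jpg"):                 # checkerboard shots from multiple angles
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    img_size = gray.shape[::-1]
    found, corners = cv2.findChessboardCorners(gray, board)
    if found:
        obj_points.append(objp)
        img_points.append(corners.reshape(1, -1, 2).astype(np.float64))

K = np.zeros((3, 3))                                  # intrinsic matrix to be estimated
D = np.zeros((4, 1))                                  # distortion coefficients (k1, k2, k3, k4)
rms, K, D, _, _ = cv2.fisheye.calibrate(
    obj_points, img_points, img_size, K, D,
    flags=cv2.fisheye.CALIB_RECOMPUTE_EXTRINSIC + cv2.fisheye.CALIB_FIX_SKEW)
print("reprojection error:", rms, "\nK:\n", K, "\nD:", D.ravel())
```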
Specifically, the first image is a label sample and the second image is an input sample; the first image is an undistorted high-resolution image, and the second image is a distorted high-resolution image obtained by transforming the first image through the distortion mapping defined by the internal parameters and distortion coefficients. The specific process for generating the second image comprises the following steps:
calculating a corrected camera intrinsic matrix R from K and D, where R = f_e(K, D);
decomposing the camera intrinsic matrix R by singular value decomposition to obtain the inverse matrix iR of R, where iR = SVD(R);
converting the two-dimensional coordinates (u, v) of the first image into the camera coordinate system (x, y, z) using the inverse matrix iR, where (x, y, z) = (u, v, 1)·iR;
normalizing along the z-axis, i.e. x = x/z, y = y/z;
calculating the radius r of the cross section of the fisheye hemisphere, where r = sqrt(x² + y²);
calculating the incidence angle θ between the ray and the optical axis, where θ = atan(r);
correcting the incidence angle θ to obtain the corrected incidence angle θ_d, where θ_d = θ(1 + k_1·θ² + k_2·θ⁴ + k_3·θ⁶ + k_4·θ⁸);
generating the corrected camera coordinate system coordinates (x′, y′) from the corrected incidence angle, where x′ = (θ_d/r)·x and y′ = (θ_d/r)·y;
converting from the camera coordinate system to the pixel coordinate system (u′, v′), i.e. (u′, v′) are the two-dimensional coordinates of the second image, where u′ = f_x·x′ + c_x and v′ = f_y·y′ + c_y.
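For illustration only, the per-pixel mapping listed above can be sketched with NumPy/OpenCV as follows. As simplifications, the corrected matrix R is approximated here by K itself and its inverse iR is computed directly rather than via singular value decomposition:

```python
# Illustrative reconstruction of the mapping steps above (not code from the application).
import cv2
import numpy as np

def apply_fisheye_mapping(first_image, K, D):
    h, w = first_image.shape[:2]
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    k1, k2, k3, k4 = np.asarray(D).ravel()[:4]

    u, v = np.meshgrid(np.arange(w, dtype=np.float64),
                       np.arange(h, dtype=np.float64))
    ones = np.ones_like(u)
    iR = np.linalg.inv(K)                                # simplified inverse intrinsic matrix
    pts = np.stack([u, v, ones], axis=-1) @ iR.T         # (u, v, 1) -> camera coordinates
    x = pts[..., 0] / pts[..., 2]                        # z-axis normalisation
    y = pts[..., 1] / pts[..., 2]

    r = np.sqrt(x * x + y * y)                           # radius of the hemisphere cross section
    theta = np.arctan(r)                                 # incidence angle
    theta_d = theta * (1 + k1 * theta**2 + k2 * theta**4
                         + k3 * theta**6 + k4 * theta**8)
    scale = np.ones_like(r)
    mask = r > 1e-8
    scale[mask] = theta_d[mask] / r[mask]
    x_d, y_d = x * scale, y * scale                      # corrected coordinates (x', y')

    map_u = (fx * x_d + cx).astype(np.float32)           # u' = fx * x' + cx
    map_v = (fy * y_d + cy).astype(np.float32)           # v' = fy * y' + cy
    return cv2.remap(first_image, map_u, map_v, cv2.INTER_LINEAR)
```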
In other embodiments, step S220 includes steps S221, S222, S223, and S224.
S221, inputting the second image into the first image correction model to obtain a third image;
s222, constructing a second image sample set according to the first image and the third image;
S223, training the image discrimination model according to the second image sample set to obtain a trained image discrimination model;
and S224, training the first image correction model according to the second image to obtain a trained first image correction model.
In this embodiment, the third image is a pseudo-undistorted image obtained by correcting the second image with the first image correction model. The second image sample set consists of the first images and the third images and is used for training the image discrimination model, so that the image discrimination model can better distinguish pseudo images from real images.
After the image discrimination model is trained, it is fixed, and the second image is input into the first image correction model to train the first image correction model, so that the images generated by the first image correction model can no longer be distinguished as pseudo images by the image discrimination model; the trained first image correction model is thereby obtained.
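A compressed sketch of this alternating training is given below for illustration; the architectures and losses are assumptions and are not specified by the application. The image discrimination model is first fitted on real first images versus generated third images, then held fixed while the first image correction model is updated against it:

```python
# Illustrative adversarial training sketch (assumed models and losses).
import torch
import torch.nn as nn

bce = nn.BCEWithLogitsLoss()

def train_discriminator(disc, gen, first_batch, second_batch, d_opt):
    disc.train(); gen.eval()
    with torch.no_grad():
        third_batch = gen(second_batch)              # pseudo-undistorted third images
    logits_real = disc(first_batch)                  # real undistorted first images
    logits_fake = disc(third_batch)
    d_loss = bce(logits_real, torch.ones_like(logits_real)) \
           + bce(logits_fake, torch.zeros_like(logits_fake))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()
    return d_loss.item()

def train_generator(disc, gen, first_batch, second_batch, g_opt, lam=10.0):
    gen.train(); disc.eval()                         # discriminator is fixed in this phase
    corrected = gen(second_batch)
    logits = disc(corrected)
    # fool the discriminator while staying close to the undistorted first image
    g_loss = bce(logits, torch.ones_like(logits)) \
           + lam * nn.functional.l1_loss(corrected, first_batch)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
    return g_loss.item()
```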
In other embodiments, step S120 includes steps S121, S122, and S123.
S121, performing network parameter pruning on the first image correction model to obtain a third image correction model;
s122, performing knowledge distillation on the third image correction model to obtain a distilled third image correction model;
and S123, performing network parameter pruning on the distilled third image correction model to obtain the second image correction model.
In this embodiment, the third image correction model is also a student model of the first image correction model, but its network parameters are still relatively large, and independent, real-time correction cannot be achieved on terminals such as mobile phones and USB cameras. Therefore, after knowledge distillation is performed on the third image correction model, at least one further round of parameter pruning needs to be performed on it, until the finally pruned image correction model can perform independent, real-time correction on terminals such as mobile phones and USB cameras.
It should be noted that, in the process of pruning the network parameters of the distilled third image correction model in step S123, a fourth image correction model may also be obtained; steps S122 and S123 are then performed again on the fourth image correction model until the finally pruned image correction model can perform independent, real-time correction on terminals such as mobile phones and USB cameras.
Before pruning the parameters of the first image correction model, the network structure of the first image correction model needs to be constructed independently and its weights loaded, so that the first image correction model can be stripped from the generative adversarial network. The number of parameter pruning rounds of the first image correction model is related to the smallest functional unit of the first image correction model. Meanwhile, when pruning the parameters of the first image correction model, pruning may be performed in units of the basic unit of the model. For example, when the network structure of the first image correction model is ResNet, pruning may be performed in units of ResBlock.
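A hypothetical sketch of such structured pruning is given below; it assumes the correction model keeps its residual blocks in an nn.Sequential attribute named blocks, which a real model may not:

```python
# Hypothetical sketch of pruning in units of ResBlock.
import copy
import torch.nn as nn

def prune_resblocks(model: nn.Module, keep_every: int = 2) -> nn.Module:
    """Drop whole residual blocks (rather than individual weights) from a copy of the model."""
    pruned = copy.deepcopy(model)
    kept = [blk for i, blk in enumerate(pruned.blocks) if i % keep_every == 0]
    pruned.blocks = nn.Sequential(*kept)
    return pruned
```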
In still other embodiments, after the fourth image correction model M′_g is trained with the first image sample set, a preset test sample set is used to test the trained fourth image correction model M′_g and calculate its first accuracy acc′_g on the test sample set; the same test sample set is used to test the first image correction model M_g and calculate its second accuracy acc_g on the test sample set; the difference Δacc between the first accuracy acc′_g and the second accuracy acc_g is then calculated. If the difference Δacc is greater than a preset first threshold Thr, the third image correction model can be directly used as the second image correction model. The judgment formula is: s = 1 if Δacc > Thr, otherwise s = 0.
When s = 1, pruning and training may be stopped, and the image correction model obtained after the last pruning is taken as the second image correction model.
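For illustration, the stopping rule just described can be sketched as follows; the threshold value and the direction of the accuracy difference are assumptions:

```python
# Minimal sketch of the pruning stop criterion (threshold value assumed).
def pruning_stop_flag(first_accuracy: float, second_accuracy: float, thr: float = 0.02) -> int:
    """Return s = 1 when the accuracy lost by pruning exceeds the first threshold Thr."""
    delta_acc = second_accuracy - first_accuracy     # accuracy of M_g minus accuracy of M'_g
    return 1 if delta_acc > thr else 0

# e.g. pruning_stop_flag(0.91, 0.95, thr=0.02) returns 1, so pruning stops and the model
# from the previous pruning round is kept as the second image correction model.
```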
In other embodiments, step S130 includes steps S131 and S132.
S131, acquiring a first predicted value output by the first image correction model and a second predicted value output by the second image correction model;
and S132, determining a distilled second image correction model according to the first predicted value and the second predicted value.
In this embodiment, when knowledge distillation is performed on the second image correction model using the first image correction model, the first image sample set may be input to both models: the second image correction model is trained to output the second predicted value, while the corresponding predicted value, that is, the first predicted value, is obtained from the first image correction model. Whether the knowledge distillation of the second image correction model is completed is then determined from the deviation between the first predicted value and the second predicted value.
Specifically, the first predicted value and the second predicted value at time t are obtained, the deviation between the two predicted values is calculated, and whether the deviation is greater than a preset second threshold is judged; if it is greater, the second image correction model at time t-1 is taken as the final second image correction model after knowledge distillation.
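An illustrative sketch of this stopping rule is given below; the deviation metric and threshold are assumptions:

```python
# Illustrative sketch of the distillation stop criterion (metric and threshold assumed).
import copy
import torch

def check_distillation(teacher, student, prev_student, batch, second_threshold=0.05):
    with torch.no_grad():
        first_pred = teacher(batch)                  # first predicted value
        second_pred = student(batch)                 # second predicted value
    deviation = torch.mean(torch.abs(first_pred - second_pred)).item()
    if deviation > second_threshold:
        return prev_student, True                    # take the model at time t-1 and stop
    return copy.deepcopy(student), False             # keep distilling; snapshot for the next step
```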
In other embodiments, after step S130, steps S140 and S150 are further included.
S140, deploying the distilled second image correction model into terminal equipment;
and S150, correcting the image to be corrected according to the distilled second image correction model in the terminal equipment, and obtaining a corrected image.
In this embodiment, the distilled second image correction model may be deployed on a mobile phone, a USB camera, or another device, that is, the terminal device may be a lightweight terminal device. The terminal device may obtain the video stream captured by the fisheye camera, decode the video stream and extract frames to obtain an image to be corrected, and correct the image to be corrected with the distilled second image correction model to obtain a corrected image.
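A deployment sketch is given below for illustration only; the model path, capture device and pre/post-processing are placeholders, and the real terminal pipeline (colour ordering, input size) depends on how the distilled model was trained:

```python
# Placeholder deployment sketch: frame-by-frame correction on the terminal.
import cv2
import torch

model = torch.jit.load("second_image_correction_model.pt").eval()  # distilled student model
cap = cv2.VideoCapture(0)                            # fisheye video stream on the terminal
while True:
    ok, frame = cap.read()                           # decode the stream and take a frame
    if not ok:
        break
    inp = torch.from_numpy(frame).permute(2, 0, 1).float().unsqueeze(0) / 255.0
    with torch.no_grad():
        corrected = model(inp)                       # distortion-corrected frame
    out = (corrected.squeeze(0).permute(1, 2, 0).clamp(0, 1) * 255).byte().numpy()
    cv2.imshow("corrected", out)
    if cv2.waitKey(1) == 27:                         # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```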
In the model fine tuning method provided by the embodiment of the application, a pre-trained generative adversarial network is obtained, the generative adversarial network comprising an image discrimination model and a first image correction model; a second image correction model is constructed from the first image correction model, the first image correction model being a teacher model and the second image correction model being a student model; and knowledge distillation is performed on the second image correction model according to the first image correction model to obtain the distilled second image correction model. After the distilled second image correction model provided by the application corrects an image on the terminal device, all information of the original image is retained, distortion correction and detail supplementation of the whole image are realized, and the running speed of the model on the terminal device is ensured.
202. Obtaining examinee operation information in the operation type examination video;
in some embodiments of the present application, the examinee operation information in the operation class examination video may include at least one of an action of the target examinee, a sequence of execution of the actions, a gesture, and a tag.
In some embodiments of the present application, obtaining the examinee operation information in the operation type examination video may include: performing three-dimensional modeling on the target examinee in the operation type examination video to obtain a three-dimensional model of the target examinee; acquiring a plurality of pieces of action information of the three-dimensional model in the operation type examination video and the order in which the pieces of action information are executed; and generating the examinee operation information in the operation type examination video based on the plurality of pieces of action information and the execution order, that is, the examinee operation information comprises the plurality of pieces of action information of the three-dimensional model and the order in which they are executed. It can be seen that determining the examinee operation information from a three-dimensional model of the target examinee makes the acquired examinee operation information more accurate. In addition, when the operation type examination video of the target examinee comprises video data from multiple angles, that is, when the three-dimensional model is generated by integrating the video data from multiple angles, the resulting three-dimensional model is more accurate, and the plurality of pieces of action information acquired from it are more accurate and comprehensive.
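For illustration, the examinee operation information described above can be represented by a simple data structure such as the following sketch, in which all names are hypothetical:

```python
# Data-structure sketch only (hypothetical names): examinee operation information as
# an ordered list of action records extracted from the three-dimensional model.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ActionRecord:
    name: str            # e.g. "disinfect hands"
    start_frame: int     # first video frame of the action
    end_frame: int       # last video frame of the action
    order: int           # position in the executed sequence

@dataclass
class ExamineeOperationInfo:
    examinee_id: str
    actions: List[ActionRecord] = field(default_factory=list)

    def ordered_actions(self) -> List[ActionRecord]:
        return sorted(self.actions, key=lambda a: a.order)
```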
203. Determining an expected score of the target examinee based on the examinee operation information;
in some embodiments of the application, the expected score is a score calculated automatically by the machine, as opposed to the score obtained by manual scoring. Generally, when the examinee operation information is different, the expected score is also different.
204. A verification result of the score is determined based on a comparison between the score and the expected score.
In some embodiments of the application, the scored verification result may be verification failed, verification passed, or the like.
According to the score verification method for the operation type examination provided by the embodiment of the application, score verification is realized by extracting the examinee operation information from the operation type examination video, giving a corresponding expected score, and verifying the score against the expected score, so that large scoring errors are avoided.
In some embodiments of the present application, as shown in fig. 3, determining the expected score of the target candidate based on the candidate operation information may include:
301. acquiring preset standard operation information of the operation type examination video;
in the embodiment of the application, the preset standard operation information of the operation type examination video is preset, for example, the action information in the preset standard operation information and the sequence of execution of the action information in the preset standard operation information are preset based on actual conditions.
302. Determining the operation matching degree between the examinee operation information and the preset standard operation information;
in embodiments of the present application, the degree of operational matching may be calculated by way of a neural network model. Specifically, determining the operation matching degree between the examinee operation information and the preset standard operation information may include: inputting the operation information of the examinee and the preset standard operation information into a preset operation matching degree model; and obtaining the operation matching degree output by the operation matching degree model, thereby realizing the calculation of the operation matching degree.
The operation matching degree model is a neural network model, and its generation process is as follows: an initial neural network model is obtained, which may be a convolutional neural network (Convolutional Neural Networks, CNN), a recurrent neural network (Recurrent Neural Network, RNN), or the like; the initial neural network model is then trained with preset first operation information, preset second operation information, and a preset operation matching degree between the first operation information and the second operation information to obtain the operation matching degree model. The first operation information, the second operation information, and the preset operation matching degree between them are prepared manually in advance; the specific process of model training is not repeated here.
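As an illustration only (the application does not disclose a concrete architecture), an operation matching degree model that encodes the two operation sequences and outputs a matching degree in [0, 1] might be sketched as follows:

```python
# Illustrative sketch of an operation matching degree model (architecture assumed).
import torch
import torch.nn as nn

class OperationMatchModel(nn.Module):
    def __init__(self, feat_dim: int = 32, hidden: int = 64):
        super().__init__()
        self.encoder = nn.GRU(feat_dim, hidden, batch_first=True)
        self.head = nn.Sequential(nn.Linear(hidden * 2, 64), nn.ReLU(),
                                  nn.Linear(64, 1), nn.Sigmoid())

    def forward(self, examinee_seq: torch.Tensor, standard_seq: torch.Tensor) -> torch.Tensor:
        _, h_exam = self.encoder(examinee_seq)       # encode the examinee operation information
        _, h_std = self.encoder(standard_seq)        # encode the preset standard operation information
        pair = torch.cat([h_exam[-1], h_std[-1]], dim=-1)
        return self.head(pair).squeeze(-1)           # operation matching degree

# e.g. OperationMatchModel()(torch.randn(1, 20, 32), torch.randn(1, 20, 32)) -> a value in (0, 1)
```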
303. And determining the expected score of the target examinee based on the operation matching degree.
In the embodiment of the application, in general, the higher the operation matching degree, the closer the examinee's operation is to the standard operation, and therefore the higher the expected score of the target examinee.
According to the grading verification method for the operation examination, which is provided by the embodiment of the application, the automatic calculation of the expected score of the target examinee is realized through the matching between the examinee operation information and the preset standard operation information.
In some embodiments of the present application, as shown in fig. 4, determining a verification result of the score based on a comparison between the score and the expected score may include:
401. determining a difference between the score and the expected score;
402. when the difference value is smaller than or equal to a preset threshold value, judging that the grading verification result is verification passing;
in the embodiment of the application, when the difference value is smaller than or equal to the preset threshold value, the difference between the score and the expected score is smaller, and the score is not wrong, so that the verification result of the score is verification passing.
403. And when the difference value is larger than the preset threshold value, judging that the check result of the score is that the check fails.
In the embodiment of the application, when the difference is greater than the preset threshold, the gap between the score and the expected score is large and the score may be wrong, so the verification result of the score is that verification fails. In addition, the verification result can be sent to a preset rechecking terminal so that the rechecking terminal can further check the score.
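For illustration, the comparison described above can be sketched as follows; the score scale and the preset threshold are assumptions:

```python
# Minimal sketch of the score comparison (scale and threshold assumed).
def verify_score(score: float, expected_score: float, threshold: float = 5.0) -> str:
    difference = abs(score - expected_score)
    return "verification passed" if difference <= threshold else "verification failed"

# e.g. verify_score(86, 90) -> "verification passed"; verify_score(70, 90) -> "verification failed"
```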
In some embodiments of the present application, determining the expected score of the target examinee based on the operation matching degree may include: acquiring a plurality of historical scores given by the scoring person of the operation type examination video, where the plurality of historical scores may be scores the scoring person has given for other operation type examination videos; determining the score verification pass rate of the scoring person based on the historical verification results of the plurality of historical scores, where the score verification pass rate may be the proportion, among the historical verification results of the plurality of historical scores, of results indicating that verification passed; and correcting the operation matching degree with the score verification pass rate to obtain the expected score of the target examinee, where the higher the pass rate, the higher the expected score. It can be seen that correcting the operation matching degree with the scoring person's verification pass rate makes the corrected expected score more accurate.
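For illustration, one possible form of this correction is sketched below; the blending formula is an assumption, since the application only states that a higher score verification pass rate yields a higher expected score:

```python
# Sketch of one possible pass-rate correction (formula assumed).
def expected_score_with_scorer_history(match_degree: float,
                                       history_results: list,
                                       full_mark: float = 100.0) -> float:
    passed = sum(1 for r in history_results if r == "pass")
    pass_rate = passed / len(history_results) if history_results else 1.0
    corrected = match_degree * (0.5 + 0.5 * pass_rate)   # blend matching degree with scorer reliability
    return corrected * full_mark

# e.g. expected_score_with_scorer_history(0.9, ["pass", "pass", "fail"]) ≈ 75.0
```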
According to the scoring verification method for the operation type examination, provided by the embodiment of the application, the automatic verification of the scoring of the operation type examination is realized through the comparison between the scoring and the expected score.
In order to better implement the scoring and checking method for the operation type examination in the embodiment of the present application, on the basis of the scoring and checking method for the operation type examination, the embodiment of the present application further provides a scoring and checking device for the operation type examination, as shown in fig. 5, the scoring and checking device 500 for the operation type examination includes:
the first obtaining module 501 is configured to obtain an operation examination video and score of a target examinee;
the second obtaining module 502 is configured to obtain operation information of an examinee in the operation type examination video;
a first determining module 503, configured to determine an expected score of the target examinee based on the examinee operation information;
a second determination module 504 is configured to determine a verification result of the score based on a comparison between the score and the expected score.
In the scoring and verifying device for the operation type examination provided by the embodiment of the application, the first acquisition module 501 acquires the operation type examination video and score of the target examinee, the second acquisition module 502 acquires the examinee operation information in the operation type examination video, the first determination module 503 determines the expected score of the target examinee based on the examinee operation information, and the second determination module 504 determines the verification result of the score based on a comparison between the score and the expected score. Compared with the traditional method, the embodiment of the application realizes score verification for the operation type examination by extracting the examinee operation information from the operation type examination video, giving a corresponding expected score, and verifying the score against the expected score, so as to avoid large scoring errors.
In some embodiments of the present application, the second obtaining module 502 is specifically configured to:
performing three-dimensional modeling on a target examinee in the operation examination video to obtain a three-dimensional model of the target examinee;
acquiring a plurality of pieces of action information of a three-dimensional model in an operation type examination video and the sequence of execution of the plurality of pieces of action information in the operation type examination video;
and generating examinee operation information in the operation type examination video based on the plurality of action information and the sequence of execution.
In some embodiments of the present application, the first determining module 503 is specifically configured to:
acquiring preset standard operation information of operation examination videos;
determining the operation matching degree between the operation information of the examinee and the preset standard operation information;
based on the operational match, an expected score for the target candidate is determined.
In some embodiments of the present application, the first determining module 503 is specifically configured to:
inputting the operation information of the examinee and the preset standard operation information into a preset operation matching degree model;
and obtaining the operation matching degree output by the operation matching degree model.
In some embodiments of the present application, the second determining module 504 is specifically configured to:
determining a difference between the score and the expected score;
When the difference value is smaller than or equal to a preset threshold value, judging that the scored verification result is verification passing;
and when the difference value is larger than a preset threshold value, judging that the check result of the score is that the check fails.
In some embodiments of the present application, the first determining module 503 is specifically configured to:
acquiring a plurality of historical scores of scoring person histories of operation type examination videos;
determining a scoring verification pass rate of the scoring person based on historical verification results of the plurality of historical scores;
and correcting the operation matching degree by adopting the scoring verification passing rate to obtain the expected score of the target examinee.
In some embodiments of the present application, the first obtaining module 501 is specifically configured to:
acquiring video data of multiple angles when a target examinee performs an operation examination;
and taking the video data of a plurality of angles as an operation examination video of a target examinee.
In addition to the above description of the scoring and verifying method and device for operation type examination, the embodiment of the present application further provides an electronic device, which integrates any one of the scoring and verifying devices for operation type examination provided by the embodiment of the present application, where the electronic device includes:
one or more processors;
a memory; and
one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the processor to perform the steps of any of the scoring verification method embodiments of the class of operation exams described above.
The embodiment of the application also provides electronic equipment which integrates the grading verification device for any operation examination provided by the embodiment of the application. As shown in fig. 6, a schematic structural diagram of an electronic device according to an embodiment of the present application is shown, specifically:
the electronic device may include one or more processing cores 'processors 601, one or more computer-readable storage media's memory units 602, power supplies 603, and input units 604, among other components. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 6 is not limiting of the electronic device and may include more or fewer components than shown, or may combine certain components, or a different arrangement of components. Wherein:
the processor 601 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the storage unit 602 and calling data stored in the storage unit 602, thereby performing overall monitoring of the electronic device. Optionally, the processor 601 may include one or more processing cores; preferably, the processor 601 may integrate an application processor and a modem processor, wherein the application processor primarily handles operating systems, user interfaces, applications, etc., and the modem processor primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 601.
The storage unit 602 may be used to store software programs and modules, and the processor 601 performs various functional applications and data processing by running the software programs and modules stored in the storage unit 602. The storage unit 602 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required for at least one function, and the like; the storage data area may store data created according to the use of the electronic device, etc. In addition, the storage unit 602 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device. Accordingly, the memory unit 602 may also include a memory controller to provide access to the memory unit 602 by the processor 601.
The electronic device further comprises a power supply 603 for supplying power to the various components, preferably the power supply 603 may be logically connected to the processor 601 by a power management system, so that functions of managing charging, discharging, power consumption management and the like are achieved by the power management system. The power supply 603 may also include one or more of any components, such as a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
The electronic device may further comprise an input unit 604, which input unit 604 may be used for receiving input digital or character information and for generating keyboard, mouse, joystick, optical or trackball signal inputs in connection with user settings and function control.
Although not shown, the electronic device may further include a display unit or the like, which is not described herein. In particular, in the embodiment of the present application, the processor 601 in the electronic device loads executable files corresponding to the processes of one or more application programs into the storage unit 602 according to the following instructions, and the processor 601 runs the application programs stored in the storage unit 602, so as to implement various functions as follows:
obtaining operation examination videos and scores of target examinees; obtaining examinee operation information in an operation type examination video; determining an expected score of the target examinee based on the examinee operation information; a verification result of the score is determined based on a comparison between the score and the expected score.
According to the score verification method for the operation type examination provided by the embodiment of the application, score verification is realized by extracting the examinee operation information from the operation type examination video, giving a corresponding expected score, and verifying the score against the expected score, so that large scoring errors are avoided.
To this end, embodiments of the present application provide a computer-readable storage medium, which may include: read Only Memory (ROM), random access Memory (RAM, random Access Memory), magnetic or optical disk, and the like. The computer readable storage medium stores a plurality of instructions that can be loaded by a processor to perform the steps in any of the scoring verification methods for operation-type tests provided by the embodiments of the present application. For example, the instructions may perform the steps of:
obtaining an operation type examination video and a score of a target examinee; obtaining examinee operation information in the operation type examination video; determining an expected score of the target examinee based on the examinee operation information; and determining a verification result of the score based on a comparison between the score and the expected score.
In the foregoing embodiments, the description of each embodiment has its own emphasis; for a part that is not described in detail in one embodiment, reference may be made to the related descriptions of other embodiments.
The scoring verification method and device for an operation type examination, the electronic device, and the storage medium provided by the embodiments of the application are described in detail above. Specific examples are used herein to explain the principles and implementations of the application, and the description of the above embodiments is only intended to help understand the method and core idea of the application. Meanwhile, those skilled in the art may make changes to the specific implementations and the application scope in light of the ideas of the application. In summary, the contents of this description should not be construed as limiting the application.

Claims (10)

1. A scoring verification method for an operation type examination, comprising:
obtaining an operation type examination video and a score of a target examinee;
obtaining examinee operation information in the operation type examination video;
determining an expected score of the target examinee based on the examinee operation information;
and determining a verification result of the score based on a comparison between the score and the expected score.
2. The scoring verification method for an operation type examination according to claim 1, wherein the obtaining examinee operation information in the operation type examination video comprises:
performing three-dimensional modeling on the target examinee in the operation type examination video to obtain a three-dimensional model of the target examinee;
acquiring a plurality of pieces of action information of the three-dimensional model in the operation type examination video and the sequence of execution of the plurality of pieces of action information in the operation type examination video;
and generating examinee operation information in the operation type examination video based on the action information and the sequence of execution.
3. The scoring verification method for an operation type examination according to claim 1, wherein the determining an expected score of the target examinee based on the examinee operation information comprises:
acquiring preset standard operation information of the operation type examination video;
determining the operation matching degree between the examinee operation information and the preset standard operation information;
and determining the expected score of the target examinee based on the operation matching degree.
4. The scoring verification method for an operation type examination according to claim 3, wherein the determining the operation matching degree between the examinee operation information and the preset standard operation information comprises:
inputting the examinee operation information and the preset standard operation information into a preset operation matching degree model;
and acquiring the operation matching degree output by the operation matching degree model.
5. The scoring verification method for an operation type examination according to claim 3, wherein the determining a verification result of the score based on a comparison between the score and the expected score comprises:
determining a difference between the score and the expected score;
when the difference is smaller than or equal to a preset threshold, determining that the verification result of the score is that the verification passes;
and when the difference is larger than the preset threshold, determining that the verification result of the score is that the verification fails.
6. The scoring verification method for an operation type examination according to claim 5, wherein the determining the expected score of the target examinee based on the operation matching degree comprises:
acquiring a plurality of historical scores of a scoring person of the operation type examination video;
determining the scoring verification passing rate of the scoring person based on the historical verification results of the plurality of historical scores;
and correcting the operation matching degree by adopting the scoring verification passing rate to obtain the expected score of the target examinee.
7. The scoring verification method for an operation type examination according to claim 1, wherein the obtaining an operation type examination video of the target examinee comprises:
acquiring video data of a plurality of angles while the target examinee performs the operation type examination;
and taking the video data of the plurality of angles as the operation type examination video of the target examinee.
8. A scoring verification device for an operation type examination, comprising:
a first acquisition module, used for acquiring an operation type examination video and a score of a target examinee;
a second acquisition module, used for acquiring examinee operation information in the operation type examination video;
a first determination module, used for determining an expected score of the target examinee based on the examinee operation information;
and a second determination module, used for determining a verification result of the score based on a comparison between the score and the expected score.
9. An electronic device, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, implements the scoring verification method for an operation type examination according to any one of claims 1 to 7.
10. A computer-readable storage medium, wherein the computer-readable storage medium stores a computer program which, when executed by a processor, causes the processor to perform the scoring verification method for an operation type examination according to any one of claims 1 to 7.
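
As an illustration of the ordered operation information described in claim 2, the following Python sketch assumes that the three-dimensional modeling and action recognition stages have already labeled every video frame with an action, and merely merges consecutive identical labels into ordered action records; the action labels and record fields are invented for the example.

from dataclasses import dataclass
from typing import List

@dataclass
class ActionInfo:
    label: str          # recognized action, e.g. "disinfect" (illustrative label)
    start_frame: int    # first frame in which the action is observed
    end_frame: int      # last frame in which the action is observed

def classify_pose_sequence(frame_labels) -> List[ActionInfo]:
    # Placeholder for the 3D-model / action-recognition stage of claim 2:
    # consecutive frames carrying the same label are merged into one record.
    actions: List[ActionInfo] = []
    for i, label in enumerate(frame_labels):
        if actions and actions[-1].label == label:
            actions[-1].end_frame = i
        else:
            actions.append(ActionInfo(label, i, i))
    return actions

def build_operation_info(frame_labels) -> List[str]:
    # The examinee operation information is the action labels in the order
    # they were performed (the last step of claim 2).
    return [a.label for a in classify_pose_sequence(frame_labels)]

frames = ["disinfect", "disinfect", "inject", "inject", "bandage"]
print(build_operation_info(frames))   # ['disinfect', 'inject', 'bandage']
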
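Claim 3 determines the expected score from the operation matching degree but does not fix the mapping; a simple linear scaling against the full marks of the examination item is one plausible assumption, sketched below.

def expected_score_from_match(matching_degree: float, full_score: float = 100.0) -> float:
    # matching_degree is assumed to lie in [0, 1]; the linear scaling is an
    # illustrative choice, not a formula taken from this application.
    matching_degree = max(0.0, min(1.0, matching_degree))
    return round(matching_degree * full_score, 1)

print(expected_score_from_match(0.87))   # 87.0
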
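Claim 4 leaves the preset operation matching degree model unspecified; as a stand-in only, the sketch below scores how much of the preset standard action sequence the examinee performed in order, using a normalized longest common subsequence. A trained sequence model could equally fill this role.

def operation_matching_degree(examinee_ops, standard_ops) -> float:
    # Normalized longest common subsequence between the two ordered action
    # lists: 1.0 means every standard action was performed in order.
    m, n = len(examinee_ops), len(standard_ops)
    if n == 0:
        return 0.0
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if examinee_ops[i - 1] == standard_ops[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[m][n] / n

standard = ["disinfect", "inject", "bandage", "dispose"]
performed = ["disinfect", "bandage", "dispose"]
print(operation_matching_degree(performed, standard))   # 0.75
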
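The comparison in claim 5 reduces to a threshold test on the difference between the given score and the expected score; in the sketch below, the threshold of 5.0 points is an assumed example value.

def check_score(score: float, expected_score: float, threshold: float = 5.0) -> str:
    # threshold is the preset tolerance; 5.0 points is only an assumed example.
    if abs(score - expected_score) <= threshold:
        return "verification passed"
    return "verification failed"

print(check_score(92.0, 88.5))   # verification passed (difference 3.5 <= 5.0)
print(check_score(92.0, 70.0))   # verification failed (difference 22.0 > 5.0)
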
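Claim 6 corrects the operation matching degree with the scorer's historical verification passing rate but does not publish the correction formula; the weighted adjustment below is purely an illustrative assumption.

def scoring_pass_rate(historical_results) -> float:
    # Fraction of the scorer's past scores whose verification passed.
    if not historical_results:
        return 1.0   # assumed default when no history exists
    return sum(1 for passed in historical_results if passed) / len(historical_results)

def corrected_expected_score(matching_degree: float,
                             scorer_history,
                             full_score: float = 100.0,
                             weight: float = 0.1) -> float:
    # Illustrative correction only: the matching degree is reduced in
    # proportion to how often the scorer's past scores failed verification,
    # scaled by a small weight.  The actual correction formula is not given.
    rate = scoring_pass_rate(scorer_history)
    corrected = matching_degree * (1.0 - weight + weight * rate)
    return round(min(corrected, 1.0) * full_score, 1)

history = [True, True, False, True]              # past verification results
print(corrected_expected_score(0.8, history))    # 78.0
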
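Claim 7 composes the operation type examination video from several camera angles; a minimal container for such multi-angle footage could look like the sketch below, in which the angle names and frame representation are assumptions.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class MultiAngleExamVideo:
    examinee_id: str
    views: Dict[str, List]   # angle name (e.g. "front", "side") -> frame sequence

    def all_frames(self):
        # Iterate every view's frames in turn so the downstream
        # operation-extraction step can consume all angles.
        for angle, frames in self.views.items():
            for frame in frames:
                yield angle, frame

video = MultiAngleExamVideo("examinee_001", {"front": ["f0", "f1"], "side": ["s0"]})
print(list(video.all_frames()))   # [('front', 'f0'), ('front', 'f1'), ('side', 's0')]
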
CN202310487992.4A 2023-04-24 2023-04-24 Grading verification method and device for operation type examination, electronic equipment and storage medium Pending CN116704401A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310487992.4A CN116704401A (en) 2023-04-24 2023-04-24 Grading verification method and device for operation type examination, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310487992.4A CN116704401A (en) 2023-04-24 2023-04-24 Grading verification method and device for operation type examination, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116704401A true CN116704401A (en) 2023-09-05

Family

ID=87830167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310487992.4A Pending CN116704401A (en) 2023-04-24 2023-04-24 Grading verification method and device for operation type examination, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116704401A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117241097A (en) * 2023-09-15 2023-12-15 杭州亦闲信息科技有限公司 Offline recorded video scoring method, device, system and storage medium

Similar Documents

Publication Publication Date Title
CN109432753B (en) Action correcting method, device, storage medium and electronic equipment
CN110909693B (en) 3D face living body detection method, device, computer equipment and storage medium
CN110074813B (en) Ultrasonic image reconstruction method and system
CN111709409A (en) Face living body detection method, device, equipment and medium
CN109799073B (en) Optical distortion measuring device and method, image processing system, electronic equipment and display equipment
JP2021531601A (en) Neural network training, line-of-sight detection methods and devices, and electronic devices
WO2022156061A1 (en) Image model training method and apparatus, electronic device, and storage medium
CN116704401A (en) Grading verification method and device for operation type examination, electronic equipment and storage medium
CN114120432A (en) Online learning attention tracking method based on sight estimation and application thereof
EP4053736A1 (en) System and method for matching a test frame sequence with a reference frame sequence
CN111589138B (en) Action prediction method, device, equipment and storage medium
CN114494347A (en) Single-camera multi-mode sight tracking method and device and electronic equipment
CN113252701A (en) Cloud edge cooperation-based power transmission line insulator self-explosion defect detection system and method
CN112861809B (en) Classroom head-up detection system based on multi-target video analysis and working method thereof
CN112633113A (en) Cross-camera human face living body detection method and system
CN110717441B (en) Video target detection method, device, equipment and medium
US20220392246A1 (en) Posture evaluating apparatus, method and system
EP4318314A1 (en) Image acquisition model training method and apparatus, image detection method and apparatus, and device
CN116597246A (en) Model training method, target detection method, electronic device and storage medium
CN114913086B (en) Face image quality enhancement method based on generation countermeasure network
CN116502798A (en) Subjective question score verification method and device, electronic equipment and storage medium
CN115019396A (en) Learning state monitoring method, device, equipment and medium
CN116309538B (en) Drawing examination evaluation method, device, computer equipment and storage medium
CN112598728A (en) Projector attitude estimation and trapezoidal correction method and device, projector and medium
CN116486344A (en) Cheating identification method and device for drawing examination, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination