CN111753624B - Examination point analysis method and device for chemical experiment - Google Patents


Info

Publication number
CN111753624B
CN111753624B (application CN202010171865.XA)
Authority
CN
China
Prior art keywords
target video
rule
video frame
examination point
chemical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010171865.XA
Other languages
Chinese (zh)
Other versions
CN111753624A (en)
Inventor
赵帅帅
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN202010171865.XA
Publication of CN111753624A
Application granted
Publication of CN111753624B


Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 — Scenes; Scene-specific elements
    • G06V20/40 — Scenes; Scene-specific elements in video content
    • G06V20/41 — Higher-level, semantic clustering, classification or understanding of video scenes, e.g. detection, labelling or Markovian modelling of sport events or news items
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 — Pattern recognition
    • G06F18/20 — Analysing
    • G06F18/21 — Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 — Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06Q — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 — Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 — Services
    • G06Q50/20 — Education
    • G06Q50/205 — Education administration or guidance
    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06V — IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 — Arrangements for image or video recognition or understanding
    • G06V10/40 — Extraction of image or video features
    • G06V10/44 — Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P — CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 — Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 — Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Educational Administration (AREA)
  • Educational Technology (AREA)
  • Strategic Management (AREA)
  • Tourism & Hospitality (AREA)
  • General Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Software Systems (AREA)
  • Computational Linguistics (AREA)
  • Economics (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present application provide an examination point analysis method and device for a chemical experiment. The method comprises: for each target video frame in an operation video, identifying the chemical vessels in the target video frame, the operation video being shot by a camera device positioned in front of and/or above the chemical experiment table; analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meets an examination point judgment rule, to obtain an analysis result of the target video frame based on the examination point judgment rule; and obtaining an analysis result for each examination point according to the analysis results of the target video frames based on the examination point judgment rules. The method enables standardized and accurate intelligent examination point analysis of chemical experiments.

Description

Examination point analysis method and device for chemical experiment
Technical Field
The present application relates to the technical field of education informatization, and in particular to an examination point analysis method and device for chemical experiments.
Background
In current chemical experiment examinations, the assessment of each experiment is carried out by an invigilator. The invigilator supervises many students performing experiments simultaneously in a classroom and cannot attend to the actual operations of every examinee, so incorrect operations during the experimental procedure are easily missed; moreover, the judging standards of different invigilators for the same chemical experiment can hardly exclude subjective factors and be completely unified.
Disclosure of Invention
The present application provides an examination point analysis method and device for chemical experiments, which enable standardized and accurate intelligent examination point analysis of chemical experiments.
In a first aspect, the present application provides an examination point analysis method for a chemical experiment, comprising:
for each target video frame in the operation video, identifying a chemical vessel from the target video frame; the operation video is shot by a camera device positioned in front of and/or above the chemical experiment table;
analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the examination point judging rule or not to obtain an analysis result of the target video frame based on the examination point judging rule;
and obtaining the analysis result of each examination point according to the analysis result of the target video frame based on the examination point judgment rule.
Wherein the identifying of the chemical vessel from the target video frame comprises:
detecting the region of the chemical vessel from the target video frame;
identifying boundary keypoints of the chemical vessel from the region;
and connecting the boundary key points to obtain the boundary line of the chemical vessel.
The detecting the region of the chemical vessel from the target video frame comprises the following steps:
pre-training a model for detecting the region where the chemical vessel is located; the model is obtained by inputting video images annotated with the regions of various chemical vessels as samples into a deep learning framework and a target detection network for training;
and inputting the target video frame into the model to obtain the region of each chemical vessel in the target video frame.
The analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the examination point judging rule or not, and obtaining the analysis result of the target video frame based on the examination point judging rule comprises the following steps:
obtaining a test point judging rule which needs to be met by a target video frame;
and analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the obtained examination point judging rule or not to obtain an analysis result of the target video frame based on the examination point judging rule.
Wherein obtaining the examination point judgment rule to be satisfied by the target video frame comprises:
obtaining the analysis result, based on the examination point judgment rule, of the target video frame preceding the current target video frame;
if the analysis result is that the examination point judgment rule is satisfied, taking the next examination point judgment rule in the sequence as the examination point judgment rule to be satisfied by the current target video frame;
and if the analysis result is that the examination point judgment rule is not satisfied, taking the examination point judgment rule of the preceding target video frame as the examination point judgment rule to be satisfied by the current target video frame.
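The rule-advancement logic above can be sketched as a small state machine: the rule a frame must satisfy depends only on whether the preceding frame satisfied its rule. This is a minimal sketch; the function and variable names are illustrative, not from the patent text.

```python
# Sketch of per-frame rule advancement: advance to the next rule in the
# sequence only when the previous frame satisfied the current rule.

def next_rule_index(prev_rule_index, prev_frame_satisfied):
    """Return the index of the judgment rule the current frame must satisfy."""
    if prev_frame_satisfied:
        return prev_rule_index + 1  # previous rule met: move to the next rule
    return prev_rule_index          # previous rule not met: keep checking it

# simulate four frames' analysis results against the rule sequence
idx = 0
for satisfied in [False, True, True, False]:
    idx = next_rule_index(idx, satisfied)
# after these frames, the analysis is waiting on the third rule (index 2)
```

The per-frame results accumulated this way are what step 203 later aggregates into per-examination-point results.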
Alternatively, obtaining the examination point judgment rule to be satisfied by the target video frame comprises:
obtaining the analysis results, based on the examination point judgment rule, of the previous n target video frames, where n is greater than 1;
judging whether the number of consecutive frames whose analysis result satisfies the examination point judgment rule is equal to n;
if the consecutive number is greater than 0 and less than n, taking both the examination point judgment rule and the next judgment rule in the sequence as the examination point judgment rules to be satisfied by the target video frame;
if the consecutive number is equal to 0, taking the examination point judgment rule as the examination point judgment rule to be satisfied by the target video frame;
and if the consecutive number is equal to n, taking the next examination point judgment rule in the sequence as the examination point judgment rule to be satisfied by the target video frame.
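The three-way branch above can be written compactly as a function from the consecutive-hit count to the set of rules to check. This is a sketch under the stated assumptions; names are illustrative.

```python
# Sketch of rule selection when a rule must hold for n consecutive frames.

def rules_to_check(rule_idx, consecutive_hits, n):
    """Which judgment rules must the current target frame be checked against,
    given that `consecutive_hits` of the previous frames (out of n required)
    satisfied rule `rule_idx`?"""
    if consecutive_hits == 0:
        return [rule_idx]                # keep checking the current rule
    if consecutive_hits < n:
        return [rule_idx, rule_idx + 1]  # current rule plus the next in sequence
    return [rule_idx + 1]                # n consecutive hits: move to the next rule
```

Checking both the current and the next rule in the middle branch lets the analysis catch the next operation starting before the required streak for the current rule is complete.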
Alternatively, obtaining the examination point judgment rule to be satisfied by the target video frame comprises:
obtaining an examination point judgment rule for which the number of consecutive target video frames satisfying it has not yet reached the required continuous frame number, and taking the obtained rule as the examination point judgment rule to be satisfied by the target video frame.
In a second aspect, an embodiment of the present application provides an examination point analysis device for a chemical experiment, comprising:
an identification unit for identifying a chemical vessel from the target video frames for each of the target video frames in the operation video; the operation video is shot by a camera device positioned in front of and/or above the chemical experiment table;
the analysis unit is used for analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the examination point judgment rule or not to obtain an analysis result of the target video frame based on the examination point judgment rule;
and an obtaining unit, configured to obtain the analysis result of each examination point according to the analysis results of the target video frames based on the examination point judgment rules.
In a third aspect, an embodiment of the present application provides an electronic device, including:
a display screen; one or more processors; a memory; a plurality of applications; and one or more computer programs, wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions, which when executed by the device, cause the device to perform the steps of:
for each target video frame in the operation video, identifying a chemical vessel from the target video frame; the operation video is shot by a camera device positioned in front of and/or above the chemical experiment table;
analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the examination point judging rule or not to obtain an analysis result of the target video frame based on the examination point judging rule;
and obtaining the analysis result of each examination point according to the analysis result of the target video frame based on the examination point judgment rule.
In a fourth aspect, embodiments of the present application provide a computer readable storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the method of the first aspect.
In a fifth aspect, the present application provides a computer program which, when executed by a computer, performs the method of the first aspect.
In one possible design, the program in the fifth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
In the examination point analysis method for a chemical experiment of the present application, for each target video frame in the operation video, the chemical vessels are identified from the target video frame, the operation video being shot by a camera device positioned in front of and/or above the chemical experiment table. Whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meets the examination point judgment rule is analyzed to obtain an analysis result of the target video frame based on the examination point judgment rule, and the analysis result of each examination point is obtained according to the analysis results of the target video frames based on the examination point judgment rules. Standardized and accurate intelligent examination point analysis can thereby be performed on the chemical experiment.
Drawings
FIG. 1 is a diagram showing an example of the installation position of an image pickup apparatus according to the present application;
FIG. 2 is a flow chart of one embodiment of a method of point of care analysis for a chemical experiment of the present application;
FIG. 3 is a flow chart of another embodiment of a method for analyzing a test point of a chemical experiment according to the present application;
FIG. 4 is a flow chart of another embodiment of a method for analyzing a test point of a chemical experiment according to the present application;
FIG. 5 is a block diagram of one embodiment of a point of care analysis device for chemical experiments of the present application;
FIG. 6 is a block diagram of another embodiment of a point of care analysis device for chemical experiments according to the present application;
FIG. 7 is a block diagram of yet another embodiment of an examination point analysis device for chemical experiments of the present application;
fig. 8 is a schematic structural diagram of an embodiment of the electronic device of the present application.
Detailed Description
The terminology used in the description of the embodiments of the application herein is for the purpose of describing particular embodiments of the application only and is not intended to be limiting of the application.
In the existing implementation, the assessment of a chemical experiment examination is carried out by an invigilator: the invigilator supervises many students performing chemical experiments simultaneously in a classroom and assesses the experimental procedure of each student. However, because the invigilator supervises multiple students, it is easy to miss erroneous operations during a student's experimental procedure; moreover, the judging standards of different invigilators for the same chemical experiment can hardly exclude subjective factors and be completely unified.
Therefore, the application provides a test point analysis method for chemical experiments, which can perform standard and accurate intelligent test point analysis on the chemical experiments.
Fig. 1 shows an application scenario of the examination point analysis method for a chemical experiment of the present application. Referring to Fig. 1, a camera device may be arranged in front of the chemical experiment table to shoot the student's operation video from the front. To better capture the operation video, the camera device is preferably mounted higher than the chemical experiment table; the exact height may be determined by the heights of the apparatus set up in the experiment. For example, in the filtering experiment, the camera device may be 20-40 cm higher than the experiment table. Referring to Fig. 1, a camera device may also be arranged above the chemical experiment table to shoot the student's operation video from a top-down view; its position need not be fixed, as long as the operation video can be captured clearly.
The image pickup device in the embodiment of the application can be as follows: a camera or an electronic device with a camera function.
In the following embodiments, the examination point analysis method of the present application is described by taking a filtering experiment as an example, so the student's operating procedure in the filtering experiment is described first. In the filtering experiment, the student must follow this procedure: wet the filter paper and fit it onto the conical funnel to assemble the filter; set up the filtering apparatus from bottom to top, specifically, first place the beaker on the stand of the iron support, then place the filter on the iron ring so that the tip at the lower end of the funnel rests against the inner wall of the beaker; transfer the liquid, specifically, hold the glass rod against the three-layer side of the filter paper and pour the liquid with the spout of the beaker against the glass rod; finally, the filtered liquid should be clear and not turbid. In this procedure, the key questions of the filtering experiment are: whether the student presses the wetted filter paper against the inner wall of the funnel; whether the filtering apparatus is set up in the correct bottom-to-top order; whether the funnel tip rests against the inner wall of the beaker; whether the glass rod rests against the three-layer side of the filter paper; whether the spout of the beaker rests against the glass rod; and whether the obtained filtrate is clear and free of turbidity. Based on this, the key questions of the filtering experiment are divided into 5 key points, which are, in order:
1. Correctly manufacturing a filter, wetting filter paper with water and then tightly attaching the filter paper to the inner wall of the funnel;
2. constructing a filtering device from bottom to top;
3. the tip at the lower end of the funnel rests against the inner wall of the beaker;
4. the glass rod is correctly used in the liquid transferring process;
5. the obtained filtrate is clear and transparent, and does not contain solid impurities.
It should be noted that the filtration experiment is only an example, and the examination point analysis method of the chemical experiment in the embodiment of the present application is not limited to the filtration experiment, and may be applicable to other chemical experiments, such as the use of solid medicines, the use of liquid medicines, and the like, which are not repeated here.
FIG. 2 is a flow chart of one embodiment of a method for point of care analysis for chemical experiments according to the present application, as shown in FIG. 2, the method may include:
step 201: for each target video frame in the operation video, identifying a chemical vessel from the target video frame; the operation video is shot by a camera device positioned in front of and/or above the chemical experiment table;
the target video frames are selected from the operation video and are subjected to chemical vessel identification processing, and the target video frames are preferably distributed in the operation video approximately uniformly so that the examination point analysis result of the chemical experiment is more accurate. Preferably, the target video frames may be video frames uniformly distributed in the operation video.
Step 202: analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the examination point judging rule or not to obtain an analysis result of the target video frame based on the examination point judging rule;
step 203: and obtaining the analysis result of each examination point according to the analysis result of the target video frame based on the examination point judgment rule.
According to the method shown in Fig. 2, for each target video frame in the operation video, the chemical vessels are identified from the target video frame, the operation video being shot by a camera device positioned in front of and/or above the chemical experiment table; whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meets the examination point judgment rule is analyzed to obtain an analysis result of the target video frame based on the examination point judgment rule; and the analysis result of each examination point is obtained according to the analysis results of the target video frames. Intelligent examination point analysis of the chemical experiment is thereby realized with a unified standard, making the analysis results more consistent; and because the analysis is performed per experiment table, whether the experimental operations of the student using that table are correct can be analyzed more accurately.
The implementation of step 201 is explained below.
Based on the method shown in fig. 2, referring to fig. 3, identifying a chemical vessel from a target video frame as described in step 201 may include:
step 301: detecting the region of the chemical vessel from the target video frame;
step 302: identifying boundary keypoints of the chemical vessel from the region;
step 303: and connecting the boundary key points to obtain the boundary line of the chemical vessel.
Wherein, the detecting the region of the chemical vessel from the target video frame may include:
inputting the target video frame into a pre-trained model for detecting the regions where chemical vessels are located, to obtain the region of each chemical vessel in the target video frame; the model is trained by inputting video images annotated with the regions of various chemical vessels as samples into a deep learning framework and a target detection network.
Specifically, a model for detecting an area where a chemical vessel is located may be trained in advance, and the training method may include:
obtaining video images of the areas marked with the chemical vessels as samples;
and inputting the sample into a deep learning frame and training the target detection network to obtain a model for detecting the region where the chemical vessel is located.
The region of a chemical vessel may be a rectangular region; boundary keypoints are then identified within the region of each chemical vessel.
The boundary keypoints of the chemical vessels can be detected using existing keypoint detection methods.
The deep learning framework may be Caffe, and the target detection network may be YOLOv2 or YOLOv3.
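Post-processing the detector's output for step 301 amounts to keeping confident boxes, one rectangular region per vessel. The sketch below assumes a detector (e.g. a YOLOv3-style network) has already produced labeled, scored boxes; the data shapes and names are illustrative, not the patent's actual interface.

```python
# Sketch: keep confident vessel detections, one (label, box) per vessel.
# Each detection is assumed to carry a label, a (x, y, w, h) box, and a score.

def select_vessel_regions(detections, conf_threshold=0.5):
    """Filter raw detections down to the regions of chemical vessels."""
    return [
        (d["label"], d["box"])
        for d in detections
        if d["score"] >= conf_threshold  # discard low-confidence boxes
    ]

dets = [
    {"label": "beaker", "box": (12, 40, 80, 120), "score": 0.92},
    {"label": "funnel", "box": (30, 10, 50, 60), "score": 0.88},
    {"label": "funnel", "box": (200, 5, 40, 40), "score": 0.21},  # discarded
]
regions = select_vessel_regions(dets)
```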
The implementation of step 202 is explained below.
The examination points in the embodiment of the present application correspond to the key points of the chemical experiment, and each comprises the parameter: examination point judgment rule. Optionally, an examination point may further comprise the parameters: examination point sequence, and/or operation video, and/or examination point judgment rule identifier, and/or execution sequence, and/or rule sequence, and/or continuous frame number. The sequence of the examination points is consistent with the execution order of the key points in the chemical experiment. Each examination point may have one or more judgment rules.
the examination point sequence parameter records the order of the examination points;
the examination point judgment rule parameter records the operating requirement of the examination point;
the examination point judgment rule identifier distinguishes different judgment rules;
the operation video parameter records which video source the analyzed target video frames come from: the camera device in front of the chemical experiment table, or the one above it;
the execution sequence parameter records the order among the different judgment rules of the same examination point;
the rule sequence parameter records the overall order among the judgment rules of all examination points;
the continuous frame number parameter records the minimum number of consecutive target video frames that must satisfy the judgment rule.
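The parameters above can be held in a plain data structure, one record per examination point. This is a hedged sketch; the field names and values are illustrative (here, examination point 2 of the filtering experiment), not a format defined by the patent.

```python
# Illustrative representation of one examination point and its parameters.
exam_point_2 = {
    "sequence": 2,            # examination point order within the experiment
    "video_source": "front",  # which camera's video is analyzed for this point
    "rules": [
        {"id": "2-1", "text": "beaker placed on the iron stand",
         "execution_order": 1, "rule_sequence": 2, "min_frames": 1},
        {"id": "2-2", "text": "filter placed on the iron ring",
         "execution_order": 2, "rule_sequence": 3, "min_frames": 1},
    ],
}
```

Keeping the per-rule `rule_sequence` alongside the per-point `sequence` mirrors the distinction the text draws between ordering within one examination point and ordering across all of them.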
The examination point judgment rules are derived from the key points of the chemical experiment, for example the 5 key points listed for the filtering experiment. For example: the judgment rule of examination point 1 may be set, according to key point 1, as: there is no gap between the filter paper and the funnel; the judgment rules of examination point 2 may be set, according to key point 2, as: rule 1, the beaker is placed on the iron stand, and rule 2, the filter is placed on the iron ring; the judgment rule of examination point 3 may be set, according to key point 3, as: there is no gap between the tip at the lower end of the funnel and the inner wall of the beaker; the judgment rules of examination point 4 may be set, according to key point 4, as: rule 1, the glass rod points at the three-layer side of the filter paper, and rule 2, there is no gap between the spout of the beaker and the glass rod and no gap between the glass rod and the three-layer side of the filter paper; the judgment rule of examination point 5 may be set, according to key point 5, as: the texture and/or color difference of the image inside the beaker does not exceed a preset threshold.
The execution sequence parameter is optional. If the examination point judgment rules of a chemical experiment have no restriction on execution order, the examination points need not include this parameter; likewise, if an examination point contains only one judgment rule, or contains several judgment rules with no required order among them, its execution sequence parameter may be left at its default.
The continuous frame number is related to the duration of the experimental operation itself. For some operations that only require an action to be completed, a single target video frame satisfying the corresponding judgment rule is enough, so the continuous frame number may be 1 or left at its default. For example, in a chemical experiment on taking solid medicine, the test tube must be stood upright so that the medicine falls evenly to the bottom; the corresponding judgment rule is that the test tube stands upright, and one target video frame detecting this suffices. For operations that must last a certain time, the required number of consecutive frames satisfying the judgment rule can be preset, and its specific value can be set in practice according to the duration the operation requires. For example, for examination point 4 of the filtering experiment, the shortest time for a student to pour the liquid can be measured, and the continuous frame number for "no gap between the spout of the beaker and the glass rod" can be set accordingly.
In addition, the continuous frame number is related to the scoring criteria of the chemical experiment. For example, for examination point 3 of the filtering experiment, the judgment rule is that there is no gap between the tip at the lower end of the funnel and the inner wall of the beaker. If the scoring criterion is that the operation is correct as long as there is no gap when the student first places the filter, the continuous frame number can be set to one frame or a few frames; if the criterion is that no gap may appear from the placement of the filter until the end of the experiment, the continuous frame number can be set to a very large value or left at its default.
Based on the above settings, within one chemical experiment the examination points are ordered, and the multiple judgment rules of one examination point may also be ordered, so all examination point judgment rules in the experiment can be arranged in sequence. For example, for the 5 examination points of the filtering experiment above, the sequence of the judgment rules is as follows:
1. there is no gap between the filter paper and the funnel; 2. the beaker is placed on the iron stand; 3. the filter (funnel) is placed on the iron ring; 4. there is no gap between the tip at the lower end of the funnel and the inner wall of the beaker; 5. the glass rod points at the three-layer side of the filter paper; 6. there is no gap between the spout of the beaker and the glass rod; 7. there is no gap between the glass rod and the three-layer side of the filter paper; 8. the texture and/or color difference of the image in the beaker does not exceed the preset threshold.
The examination point determination rules are used to judge whether a student's experimental operation meets the experimental requirements. Some rules can only be judged from footage captured by the camera located in front of the chemical experiment table, while others can only be judged from footage captured by the camera located above it; the operation-video parameter is therefore set to record which video source the analyzed target video frame must come from for each examination point determination rule. Taking the examination point determination rules of the filtering experiment as an example, for the rules "the glass rod points to the three-layer side of the filter paper" and "the texture and/or color difference of the image in the beaker does not exceed the preset threshold", only the operation video captured by the camera above the chemical experiment table can show the glass rod pointing at the three-layer side of the filter paper and the color of the liquid in the beaker after filtering; the operations to be judged by the other examination point determination rules can only be captured by the camera in front of the chemical experiment table.
In other chemical experiments, multiple examination point determination rules may share the same position in the sequence. In that case, when analyzing whether a target video frame satisfies the rules, it suffices to analyze the frame against each examination point determination rule and obtain an analysis result based on each of them, which completes this step.
Based on the above description, taking the filtering experiment as an example, information similar to the following table 1 may be preset in an electronic device performing the method according to the present application.
TABLE 1
Based on the above description, referring to fig. 4, step 202 may include:
step 401: obtaining a test point judging rule which needs to be met by a target video frame;
step 402: and analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the examination point judging rule or not to obtain an analysis result of the target video frame based on the examination point judging rule.
The number of examination point determination rules that a target video frame needs to satisfy may be one or more; the present application does not limit this. If the target video frame needs to satisfy multiple rules, then when analyzing whether the image state information of the chemical vessels and/or the positional relationships between the chemical vessels satisfy the rules, the analysis is performed against each examination point determination rule in turn, yielding an analysis result per rule.
If the target video frames in the video are analyzed according to the sequence of the examination point judging rules:
in a first possible implementation, if the duration frame count of the examination point determination rule is 1, step 401 may include:
obtaining the analysis result of the target video frame immediately preceding the current target video frame, based on the examination point determination rule;
if the analysis result satisfies the examination point determination rule, taking the rule next in sequence after that rule as the examination point determination rule that the current target video frame needs to satisfy;
and if the analysis result does not satisfy the examination point determination rule, taking the rule applied to the preceding target video frame as the examination point determination rule that the current target video frame needs to satisfy.
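The first implementation amounts to a cursor that advances through the ordered rule sequence one position each time a frame satisfies the current rule. A hypothetical Python sketch (not the application's actual code):

```python
def rule_to_check(ordered_rules, results_so_far):
    """First implementation (duration frame count of each rule is 1):
    results_so_far[i] records whether target video frame i satisfied the rule
    it was checked against. The rule to check advances one position in the
    ordered sequence whenever a frame satisfies the current rule; otherwise
    the same rule is checked again on the next frame."""
    index = 0
    for satisfied in results_so_far:
        if satisfied and index < len(ordered_rules) - 1:
            index += 1
    return ordered_rules[index]
```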
In a second possible implementation manner, if the duration frame count of the examination point determination rule is n, where n is greater than 1 and the value of n may differ between rules, step 401 may include:
obtaining the analysis results of the previous n target video frames based on the examination point determination rule;
determining whether the number of consecutive frames among them whose analysis result satisfies the examination point determination rule equals n;
if the continuous number is more than 0 and less than n, taking the examination point judgment rule and the examination point judgment rule of the next sequence of the examination point judgment rule as the examination point judgment rule which needs to be met by the target video frame;
if the continuous number is equal to 0, taking the examination point judgment rule as an examination point judgment rule which needs to be met by the target video frame;
And if the continuous number is equal to n, taking the next sequential examination point judgment rule of the examination point judgment rules as the examination point judgment rule required to be met by the target video frame.
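The three branches above can be expressed directly. A sketch, assuming `consecutive_satisfied` counts how many immediately preceding frames satisfied the current rule (the function signature and data layout are assumptions):

```python
def rules_to_check(ordered_rules, durations, index, consecutive_satisfied):
    """Second implementation: n = durations[index] > 1. Returns the list of
    examination point determination rules the next target video frame must
    be checked against, following the three branches described above."""
    n = durations[index]
    current = ordered_rules[index]
    nxt = ordered_rules[index + 1] if index + 1 < len(ordered_rules) else None
    if consecutive_satisfied == 0:
        return [current]                       # keep checking the current rule
    if consecutive_satisfied < n:
        # Partially satisfied: check the current rule and the next one.
        return [current, nxt] if nxt is not None else [current]
    # Satisfied for n consecutive frames: move on to the next rule in sequence.
    return [nxt] if nxt is not None else []
```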
If the sequence of the examination point judging rule is not limited when analyzing the target video frame in the video:
in a third possible implementation manner, each examination point determination rule has a duration frame count m (m ≥ 1), and the value of m may differ between rules. In this case, a list of the number of consecutive target video frames satisfying each examination point determination rule can be preset, and the list is updated each time an analysis result of a target video frame based on a rule is obtained. Step 401 may then include: obtaining the examination point determination rules whose consecutive target-video-frame count in the list has not yet met the duration frame count requirement, and taking the obtained rules as the examination point determination rules that the target video frame needs to satisfy.
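A sketch of this third implementation, keeping one consecutive-frame counter per rule; the dictionary-based data structures are assumptions for illustration:

```python
def update_counts(consecutive_counts, frame_results):
    """Update the preset list of consecutive satisfied-frame counts from one
    target video frame's analysis results: a satisfied rule's count grows by
    one, an unsatisfied rule's count resets to zero."""
    for rule, satisfied in frame_results.items():
        consecutive_counts[rule] = consecutive_counts.get(rule, 0) + 1 if satisfied else 0
    return consecutive_counts

def pending_rules(required_frames, consecutive_counts):
    """Rules whose consecutive count has not yet reached their duration frame
    count m: these are the rules the next target video frame must satisfy."""
    return [rule for rule, m in required_frames.items()
            if consecutive_counts.get(rule, 0) < m]
```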
The two operation videos processed in step 201 include one captured by the camera located in front of the chemical experiment table and one captured by the camera located above it. The preceding target video frame (or the preceding n target video frames) of a given target video frame, and the next examination point determination rule in sequence, are all defined with respect to the same operation video. Taking Table 1 as an example, if the operation video source of the target video frame is the front camera, the examination point determination rules the frame needs to satisfy are those identified as 1-4 or 6, and the rule next in sequence after the rule identified as 4 is the rule identified as 6; if the operation video source is the camera above, the rules the frame needs to satisfy are those identified as 5 or 7, and the rule next in sequence after the rule identified as 5 is the rule identified as 7.
The implementation of step 402 is described below by way of example.
Assume the examination point determination rule obtained in step 401 that the target video frame needs to satisfy is: no gap between the filter paper and the funnel. This step can then analyze whether a gap exists between the filter paper and the funnel according to the positional relationship between the filter paper and the funnel in the target video frame;
assume the examination point determination rule obtained in step 401 that the target video frame needs to satisfy is: the texture and/or color difference of the image in the beaker does not exceed the preset threshold. This step can then analyze, according to the image state information in the target video frame, whether the image in the beaker satisfies the rule — specifically, whether its texture and/or color difference exceeds the preset threshold — thereby obtaining the analysis result.
The other examination point determination rules are processed similarly in this step and are not illustrated here.
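For the texture and/or color rule, one possible realization of the comparison is to measure the spread of pixel values inside the beaker region. A minimal sketch, assuming the per-channel standard deviation as the difference statistic and an arbitrary default threshold (the application itself only specifies comparison against a preset threshold):

```python
def beaker_color_uniform(region_pixels, threshold=20.0):
    """Return True when the color difference of the image in the beaker is
    within the preset threshold. `region_pixels` is a list of (R, G, B)
    tuples sampled inside the beaker region; the standard-deviation statistic
    and the default threshold value are illustrative assumptions."""
    channels = list(zip(*region_pixels))  # group pixel values per channel
    for values in channels:
        mean = sum(values) / len(values)
        variance = sum((v - mean) ** 2 for v in values) / len(values)
        if variance ** 0.5 >= threshold:  # spread too large: not uniform
            return False
    return True
```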
The implementation of step 203 will be described below.
The purpose of this step is to determine whether the student has completed the key points required by the experiment during the experimental operation. Its implementation is therefore related to the examination point determination rules, the sequence among examination points, the sequence among different rules of the same examination point, the duration frame counts of the rules, and so on; these requirements can be realized by presetting the sequence of examination points and the parameters within each examination point. In this step, it is then only necessary to analyze whether the preset requirements are met, according to the analysis results of the target video frames based on the examination point determination rules.
Taking the filtering experiment as an example, assuming that the requirements are stored in the electronic device in advance in the manner of table 1, in this step, it is only required to sequentially determine whether the target video frame meets the preset requirements according to the analysis result of the target video frame based on the examination point determination rule.
For example, determine whether there are 6 consecutive target video frames whose analysis result satisfies examination point determination rule 1. If not, the analysis result of examination point 1 is unqualified. If so, determine whether any target video frame before those 6 consecutive frames has an analysis result satisfying any of examination point determination rules 2-7: if such a frame exists, the analysis result of examination point 1 is that the duration frame count is qualified but the execution order is unqualified; if not, the analysis result of examination point 1 is qualified. The analysis of the other examination points is similar and is not described in detail.
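The worked example for examination point 1 can be sketched as follows; the run length of 6 frames and the three result labels follow the text, while the per-frame data layout is an assumption:

```python
def score_point_one(per_frame_results, run_length=6):
    """per_frame_results[i] maps a rule identifier to whether target video
    frame i satisfied that rule. Implements the three-way outcome described
    above for examination point 1."""
    # Find the first run of `run_length` consecutive frames satisfying rule 1.
    streak, run_end = 0, None
    for i, frame in enumerate(per_frame_results):
        streak = streak + 1 if frame.get(1) else 0
        if streak >= run_length:
            run_end = i
            break
    if run_end is None:
        return "unqualified"  # never 6 consecutive satisfying frames
    run_start = run_end - run_length + 1
    for frame in per_frame_results[:run_start]:
        if any(frame.get(r) for r in range(2, 8)):
            # An earlier frame already satisfied one of rules 2-7: wrong order.
            return "frame count qualified, execution order unqualified"
    return "qualified"
```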
It is to be understood that some or all of the steps or operations in the above embodiments are merely examples, and embodiments of the present application may perform other operations or variations of the various operations. Furthermore, the steps may be performed in an order different from that presented in the above embodiments, and it is possible that not all of the operations in the above embodiments need to be performed.
FIG. 5 is a block diagram of one embodiment of a point of care analysis device for chemical experiments in accordance with the present application, and referring to FIG. 5, the device 500 may comprise:
an identification unit 510, configured to identify, for each target video frame in the operation video, the chemical vessels from the target video frame; the operation video is captured by a camera device located in front of and/or above the chemical experiment table;
the analysis unit 520 is configured to analyze whether the image state information of the chemical vessels and/or the positional relationship between the chemical vessels meet the test point determination rule, so as to obtain an analysis result of the target video frame based on the test point determination rule;
an obtaining unit 530, configured to obtain an analysis result of each test point based on the analysis result of the test point determination rule according to the target video frame.
As shown in fig. 6, the identifying unit 510 may include:
a detection subunit 610, configured to detect, from the target video frame, a region where the chemical vessel is located;
an identification subunit 620 for identifying boundary keypoints of the chemical vessel from the region;
and a connection subunit 630, configured to connect the boundary key points together to obtain a boundary line of the chemical vessel.
Wherein the detection subunit 610 may include:
The training module is used for pre-training a model for detecting the area where the chemical vessel is located;
and the input module is used for inputting the target video frame into the model to obtain the area of each chemical vessel in the target video frame.
Wherein, the training module may specifically be used for:
obtaining, as samples, video images in which the regions of the chemical vessels are annotated;
and inputting the samples into a deep learning framework to train a target detection network, thereby obtaining a model for detecting the regions where the chemical vessels are located.
Referring to fig. 7, the analysis unit 520 may include:
an obtaining subunit 710, configured to obtain a point determination rule that needs to be met by the target video frame;
and an analysis subunit 720, configured to analyze whether the image state information of the chemical vessels and/or the positional relationship between the chemical vessels meet the obtained test point determination rule, and obtain an analysis result of the target video frame based on the test point determination rule.
In one possible implementation, the obtaining subunit 710 may specifically be configured to:
obtaining an analysis result of a target video frame of a previous frame of the target video frame based on a test point judgment rule;
If the analysis result is that the examination point judging rule is met, taking the examination point judging rule of the next sequence of the examination point judging rule as the examination point judging rule which needs to be met by the target video frame;
and if the analysis result is that the examination point judging rule is not satisfied, taking the examination point judging rule of the target video frame of the previous frame as the examination point judging rule which is required to be satisfied by the target video frame.
In another possible implementation manner, the obtaining subunit 710 may specifically be configured to:
obtaining an analysis result of the first n frames of target video frames based on the examination point judgment rule; n is greater than 1;
judging whether the analysis result is that the continuous number meeting the examination point judgment rule is equal to n;
if the continuous number is more than 0 and less than n, taking the examination point judgment rule and the examination point judgment rule of the next sequence of the examination point judgment rule as the examination point judgment rule which needs to be met by the target video frame;
if the continuous number is equal to 0, taking the examination point judgment rule as an examination point judgment rule which needs to be met by the target video frame;
and if the continuous number is equal to n, taking the next sequential examination point judgment rule of the examination point judgment rules as the examination point judgment rule required to be met by the target video frame.
In yet another possible implementation manner, the obtaining subunit 710 may specifically be configured to:
And obtaining a test point judging rule that the continuous target video frame number does not meet the continuous frame number requirement, and taking the obtained test point judging rule as the test point judging rule required to be met by the target video frame.
In the apparatus shown in fig. 5 to 7, for each target video frame in the operation video, the identifying unit identifies the chemical vessels from the target video frame; the analyzing unit analyzes whether the image state information of the chemical vessels and/or the positional relationships between the chemical vessels satisfy the examination point determination rules, obtaining the analysis result of the target video frame based on the rules; and the obtaining unit obtains the analysis result of each examination point from the frame-level analysis results. Intelligent examination point analysis of chemical experiments is thereby realized, with unified analysis criteria and more standardized results. Moreover, because the intelligent analysis targets the experimental operations on a single chemical experiment table, whether the experimental operation of the student using that table is correct can be analyzed more accurately.
The apparatus shown in fig. 5 to 7 of the present application may be provided in an image pickup apparatus such as a camera, or may be provided in an electronic device, and further, may be provided in an electronic device having an image pickup function.
The apparatus provided by the embodiments shown in fig. 5 to 7 may be used to implement the technical solutions of the method embodiments shown in fig. 2 to 4 according to the present application, and the implementation principle and technical effects thereof may be further referred to the relevant descriptions in the method embodiments.
It should be understood that the division of the units or modules of the apparatus of the embodiments shown in fig. 5 to 7 is merely a division by logical function; in actual implementation they may be fully or partially integrated into one physical entity or physically separated. These units or modules may all be implemented in the form of software invoked by a processing element, or all in hardware, or partly in software invoked by a processing element and partly in hardware. For example, the identification unit may be a separately established processing element, or may be integrated in a chip of the electronic device. The implementation of the other units or modules is similar. Furthermore, all or part of these units or modules may be integrated together or implemented independently. In implementation, each step of the above method, or each unit or module above, may be completed by an integrated logic circuit of hardware in a processor element or by instructions in the form of software.
For example, the above units or modules may be one or more integrated circuits configured to implement the above methods, such as one or more application-specific integrated circuits (Application Specific Integrated Circuit; hereinafter ASIC), one or more digital signal processors (Digital Signal Processor; hereinafter DSP), or one or more field programmable gate arrays (Field Programmable Gate Array; hereinafter FPGA), etc. For another example, these units or modules may be integrated together and implemented in the form of a system-on-a-chip (System-On-a-Chip; hereinafter SOC).
Fig. 8 is a schematic structural diagram of an embodiment of an electronic device according to the present application, as shown in fig. 8, where the electronic device may include: one or more processors; a memory; and one or more computer programs.
The display screen may include the display screen of a vehicle-mounted computer (Mobile Data Center); the electronic device may be a mobile terminal (mobile phone), a camera device such as a camera, a computer, a PAD, a smart screen, an unmanned aerial vehicle, an intelligent connected vehicle (Intelligent Connected Vehicle; hereinafter ICV), a smart or intelligent car, or a vehicle-mounted device.
Wherein the one or more computer programs are stored in the memory, the one or more computer programs comprising instructions that, when executed by the device, cause the device to perform the steps of:
for each target video frame in the operation video, identifying a chemical vessel from the target video frame; the operation video is shot by a camera device positioned in front of and/or above the chemical experiment table;
analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the examination point judging rule or not to obtain an analysis result of the target video frame based on the examination point judging rule;
and obtaining the analysis result of each examination point according to the analysis result of the target video frame based on the examination point judgment rule.
Wherein the instructions, when executed by the apparatus, cause the apparatus to perform the step of identifying a chemical vessel from a target video frame, comprise:
detecting the region of the chemical vessel from the target video frame;
identifying boundary keypoints of the chemical vessel from the region;
and connecting the boundary key points to obtain the boundary line of the chemical vessel.
Wherein when the instructions are executed by the apparatus, the step of causing the apparatus to perform the detecting the region of the chemical vessel from the target video frame comprises:
pre-training a model for detecting the region where the chemical vessel is located;
and inputting the target video frame into the model to obtain the region of each chemical vessel in the target video frame.
Wherein the instructions, when executed by the apparatus, cause the apparatus to perform the pre-training of the model for detecting the region of the chemical vessel, comprise:
obtaining, as samples, video images in which the regions of the chemical vessels are annotated;
and inputting the samples into a deep learning framework to train a target detection network, thereby obtaining a model for detecting the regions where the chemical vessels are located.
When the instructions are executed by the device, the step of enabling the device to execute the analysis of the image state information of the chemical vessels and/or whether the position relationship among the chemical vessels meets the examination point judgment rule, and obtaining the analysis result of the target video frame based on the examination point judgment rule comprises the following steps:
obtaining a test point judging rule which needs to be met by a target video frame;
And analyzing whether the image state information of the chemical vessels and/or the position relation among the chemical vessels meet the obtained examination point judging rule or not to obtain an analysis result of the target video frame based on the examination point judging rule.
Wherein when the instructions are executed by the device, the step of causing the device to execute the examination point determination rule that needs to be satisfied to obtain the target video frame includes:
obtaining an analysis result of a target video frame of a previous frame of the target video frame based on a test point judgment rule;
if the analysis result is that the examination point judging rule is met, taking the examination point judging rule of the next sequence of the examination point judging rule as the examination point judging rule which needs to be met by the target video frame;
and if the analysis result is that the examination point judging rule is not satisfied, taking the examination point judging rule of the target video frame of the previous frame as the examination point judging rule which is required to be satisfied by the target video frame.
Wherein when the instructions are executed by the device, the step of causing the device to execute the examination point determination rule that needs to be satisfied to obtain the target video frame includes:
obtaining an analysis result of the first n frames of target video frames based on the examination point judgment rule; n is greater than 1;
Judging whether the analysis result is that the continuous number meeting the examination point judgment rule is equal to n;
if the continuous number is more than 0 and less than n, taking the examination point judgment rule and the examination point judgment rule of the next sequence of the examination point judgment rule as the examination point judgment rule which needs to be met by the target video frame;
if the continuous number is equal to 0, taking the examination point judgment rule as an examination point judgment rule which needs to be met by the target video frame;
and if the continuous number is equal to n, taking the next sequential examination point judgment rule of the examination point judgment rules as the examination point judgment rule required to be met by the target video frame.
Wherein when the instructions are executed by the device, the step of causing the device to execute the examination point determination rule that needs to be satisfied to obtain the target video frame includes:
and obtaining a test point judging rule that the continuous target video frame number does not meet the continuous frame number requirement, and taking the obtained test point judging rule as the test point judging rule required to be met by the target video frame.
The electronic device shown in fig. 8 may be a terminal device or a circuit device built in the terminal device. The apparatus may be used to perform the functions/steps of the methods provided by the embodiments of the present application shown in fig. 2-4.
As shown in fig. 8, the electronic device 800 includes a processor 810 and a transceiver 820. Optionally, the electronic device 800 may also include a memory 830. Wherein the processor 810, the transceiver 820 and the memory 830 can communicate with each other via an internal connection path for transferring control and/or data signals, the memory 830 is used for storing a computer program, and the processor 810 is used for calling and running the computer program from the memory 830.
The memory 830 may be a read-only memory (read-only memory, ROM) or another type of static storage device capable of storing static information and instructions, a random access memory (random access memory, RAM) or another type of dynamic storage device capable of storing information and instructions, an electrically erasable programmable read-only memory (electrically erasable programmable read-only memory, EEPROM), a compact disc read-only memory (compact disc read-only memory, CD-ROM) or other optical disk storage (including compact discs, laser discs, optical discs, digital versatile discs, blu-ray discs, etc.), a magnetic disk storage medium or other magnetic storage device, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Optionally, the electronic device 800 may further include an antenna 840 for transmitting wireless signals output by the transceiver 820.
The processor 810 and the memory 830 may be combined into a single processing device, but are more commonly separate components; the processor 810 is configured to execute the program code stored in the memory 830 to realize the functions described above. In a specific implementation, the memory 830 may be integrated into the processor 810 or may be independent of the processor 810.
In addition, to further improve the functionality of the electronic device 800, the electronic device 800 may further comprise one or more of an input unit 860, a display unit 870, an audio circuit 880, a camera device 890, a sensor 801, etc., where the audio circuit may further comprise a speaker 882, a microphone 884, etc. The display unit 870 may comprise a display screen.
Optionally, the electronic device 800 described above may also include a power supply 850 for providing power to various devices or circuits in the terminal device.
It should be appreciated that the electronic device 800 shown in fig. 8 is capable of implementing the various processes of the methods provided by the embodiments of the present application shown in fig. 2-4. The operations and/or functions of the respective modules in the electronic device 800 are respectively for implementing the corresponding flows in the above-described method embodiments. Reference is made in particular to the description of the embodiments of the method according to the application shown in fig. 2 to 4, and a detailed description is omitted here as appropriate to avoid repetition.
It should be appreciated that the processor 810 in the electronic device 800 shown in fig. 8 may be a system on a chip SOC, and the processor 810 may include a central processing unit (Central Processing Unit; hereinafter referred to as "CPU") and may further include other types of processors, such as: an image processor (Graphics Processing Unit; hereinafter referred to as GPU) and the like.
In general, portions of the processors or processing units within the processor 810 may cooperate to implement the preceding method flows, and corresponding software programs for the portions of the processors or processing units may be stored in the memory 830.
The present application also provides an electronic device, where the device includes a storage medium and a central processing unit, where the storage medium may be a nonvolatile storage medium, where a computer executable program is stored in the storage medium, and where the central processing unit is connected to the nonvolatile storage medium and executes the computer executable program to implement the methods provided by the embodiments shown in fig. 1 to 5 of the present application.
In the above embodiments, the processor may include, for example, a CPU, a DSP, or a microcontroller, and may further include a GPU, an embedded neural network processor (Neural-network Processing Unit; hereinafter NPU) and an image signal processor (Image Signal Processor; hereinafter ISP). The processor may further include necessary hardware accelerators or logic-processing hardware circuits, such as an ASIC, or one or more integrated circuits for controlling the execution of the programs of the present application. Further, the processor may have the function of running one or more software programs, which may be stored in a storage medium.
Embodiments of the present application also provide a computer readable storage medium having a computer program stored therein, which when run on a computer causes the computer to perform the method provided by the embodiments of the present application shown in fig. 2 to 4.
Embodiments of the present application also provide a computer program product comprising a computer program which, when run on a computer, causes the computer to perform the method provided by the embodiments of the present application shown in fig. 2-4.
In the embodiments of the present application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships are possible; for example, "A and/or B" may mean that A exists alone, that both A and B exist, or that B exists alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship. "At least one of the following" and similar expressions mean any combination of the listed items, including any combination of single or plural items. For example, "at least one of a, b and c" may represent: a; b; c; a and b; a and c; b and c; or a, b and c; where a, b and c may each be single or multiple.
Those of ordinary skill in the art will appreciate that the various elements and algorithm steps described in the embodiments disclosed herein can be implemented as a combination of electronic hardware, computer software, and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, and are not repeated herein.
In the several embodiments provided by the present application, any of the functions, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium and comprising several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present application. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or any other medium capable of storing program code.
The foregoing is merely exemplary embodiments of the present application. Any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope of the present application shall be covered by the present application. The protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A method for analyzing an examination point of a chemical experiment, comprising:
for each target video frame in the operation video, identifying a chemical vessel from the target video frame; the operation video is shot by a camera device positioned in front of and/or above the chemical experiment table;
analyzing whether image state information of the chemical vessels and/or a positional relationship among the chemical vessels satisfies an examination point judgment rule, to obtain an analysis result of the target video frame under the examination point judgment rule; and
obtaining an analysis result of each examination point according to the analysis results of the target video frames under the examination point judgment rules;
wherein the examination point judgment rule comprises: a texture and/or color difference of the image within a beaker does not exceed a preset threshold; and the analyzing whether the image state information of the chemical vessel satisfies the examination point judgment rule comprises: analyzing whether the texture and/or color difference of the image within the beaker exceeds the preset threshold.
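For illustration only, the beaker-uniformity rule recited in claim 1 can be sketched as a per-channel colour-spread check over pixels sampled inside the beaker region. The function name, the pixel-tuple representation, and the threshold value are hypothetical; the patent does not fix a particular texture or colour metric:

```python
def beaker_content_uniform(pixels, threshold=30):
    """Return True when the colour spread inside the beaker region stays
    within `threshold`, i.e. the rule 'texture and/or color difference of
    the image within the beaker does not exceed a preset threshold' holds.

    pixels: list of (r, g, b) tuples sampled inside the beaker region.
    The max-minus-min spread per channel is a stand-in metric, and the
    default threshold of 30 is a hypothetical value.
    """
    for channel in range(3):
        values = [p[channel] for p in pixels]
        if max(values) - min(values) > threshold:
            return False  # colour difference exceeds the preset threshold
    return True
```

A near-uniform region (e.g. a fully dissolved solution) passes the check, while a region containing sharply contrasting pixels fails it.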
2. The method of claim 1, wherein identifying a chemical vessel from the target video frame comprises:
detecting the region of the chemical vessel from the target video frame;
identifying boundary key points of the chemical vessel from the region; and
connecting the boundary key points to obtain a boundary line of the chemical vessel.
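The key-point connection step of claim 2 can be sketched by ordering the detected boundary key points around their centroid and joining consecutive points into a closed polyline. The angular-sort strategy and function name are assumptions for illustration; the patent does not specify how the points are connected:

```python
import math

def boundary_from_keypoints(keypoints):
    """Connect boundary key points into a closed boundary line.

    keypoints: list of (x, y) points detected on the vessel boundary.
    Points are sorted by angle around the centroid, then consecutive
    points are joined; the result is a list of line segments forming
    a closed loop around the vessel.
    """
    cx = sum(x for x, _ in keypoints) / len(keypoints)
    cy = sum(y for _, y in keypoints) / len(keypoints)
    # Order points counter-clockwise around the centroid.
    ordered = sorted(keypoints, key=lambda p: math.atan2(p[1] - cy, p[0] - cx))
    # Join consecutive points; the last segment closes the loop.
    return [(ordered[i], ordered[(i + 1) % len(ordered)])
            for i in range(len(ordered))]
```

For convex vessel outlines (beakers, flasks seen from the front) an angular sort recovers the correct ordering; concave outlines would need a more careful traversal.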
3. The method of claim 2, wherein detecting the region of the chemical vessel from the target video frame comprises:
inputting the target video frame into a pre-trained model for detecting the region where a chemical vessel is located, to obtain the region where each chemical vessel is located in the target video frame; wherein the model is obtained by inputting, as samples, video images in which the regions of the chemical vessels are annotated into a target detection network under a deep learning framework for training.
4. The method according to any one of claims 1 to 3, wherein analyzing whether the image state information of the chemical vessels and/or the positional relationship among the chemical vessels satisfies the examination point judgment rule, to obtain the analysis result of the target video frame under the examination point judgment rule, comprises:
obtaining an examination point judgment rule that the target video frame needs to satisfy; and
analyzing whether the image state information of the chemical vessels and/or the positional relationship among the chemical vessels satisfies the obtained examination point judgment rule, to obtain the analysis result of the target video frame under the examination point judgment rule.
5. The method of claim 4, wherein obtaining the examination point judgment rule that the target video frame needs to satisfy comprises:
obtaining an analysis result, under an examination point judgment rule, of the target video frame preceding the current target video frame;
if the analysis result is that the examination point judgment rule is satisfied, taking the next examination point judgment rule in sequence as the examination point judgment rule that the target video frame needs to satisfy; and
if the analysis result is that the examination point judgment rule is not satisfied, taking the examination point judgment rule of the preceding target video frame as the examination point judgment rule that the target video frame needs to satisfy.
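The rule sequencing of claim 5 amounts to a small state machine over an ordered list of rules: advance to the next rule when the previous frame satisfied the current one, otherwise stay. A minimal sketch, with rules represented as indices (the function name and the clamp at the last rule are added assumptions):

```python
def rule_for_next_frame(current_rule, previous_satisfied, total_rules):
    """Claim 5's rule sequencing, sketched with rules as list indices.

    If the preceding frame satisfied its examination point judgment
    rule, the next frame is checked against the next rule in sequence;
    otherwise it keeps the same rule.  Clamping at the last rule is an
    assumption for when the final rule has been satisfied.
    """
    if previous_satisfied:
        return min(current_rule + 1, total_rules - 1)
    return current_rule
```

Iterating this function over per-frame results walks the video through the examination points in their prescribed order.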
6. The method of claim 4, wherein obtaining the examination point judgment rule that the target video frame needs to satisfy comprises:
obtaining analysis results, under an examination point judgment rule, of the previous n target video frames, where n is greater than 1;
determining the number of consecutive frames whose analysis results satisfy the examination point judgment rule;
if the number of consecutive frames is greater than 0 and less than n, taking both the examination point judgment rule and the next examination point judgment rule in sequence as the examination point judgment rules that the target video frame needs to satisfy;
if the number of consecutive frames is equal to 0, taking the examination point judgment rule as the examination point judgment rule that the target video frame needs to satisfy; and
if the number of consecutive frames is equal to n, taking the next examination point judgment rule in sequence as the examination point judgment rule that the target video frame needs to satisfy.
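The three-way branch of claim 6 can be sketched as follows, assuming "consecutive number" means the run of satisfied frames ending at the most recent frame. Rules are again represented as indices; the function name and the trailing-run interpretation are assumptions:

```python
def rules_to_check(rule_idx, last_n_results):
    """Claim 6's rule selection over the previous n frames' results.

    last_n_results: per-frame booleans (True = rule satisfied) for rule
    `rule_idx` over the previous n frames.  Returns the rule indices the
    next frame must be checked against:
      run == 0  -> keep checking the current rule only;
      0 < run < n -> check both the current rule and the next in sequence;
      run == n  -> the rule held for n frames, move to the next rule.
    """
    n = len(last_n_results)
    run = 0
    for satisfied in reversed(last_n_results):  # trailing consecutive run
        if not satisfied:
            break
        run += 1
    if run == n:
        return [rule_idx + 1]
    if run > 0:
        return [rule_idx, rule_idx + 1]
    return [rule_idx]
```

Requiring n consecutive satisfied frames before advancing makes the sequencing robust to single-frame recognition glitches.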
7. The method of claim 4, wherein obtaining the examination point judgment rule that the target video frame needs to satisfy comprises:
obtaining an examination point judgment rule for which the number of consecutive target video frames satisfying it has not yet reached a required number of consecutive frames, and taking the obtained examination point judgment rule as the examination point judgment rule that the target video frame needs to satisfy.
8. An examination point analysis device for a chemical experiment, comprising:
an identification unit, configured to identify, for each target video frame in an operation video, a chemical vessel from the target video frame, wherein the operation video is shot by a camera device positioned in front of and/or above a chemical experiment table;
an analysis unit, configured to analyze whether image state information of the chemical vessels and/or a positional relationship among the chemical vessels satisfies an examination point judgment rule, to obtain an analysis result of the target video frame under the examination point judgment rule; and
an obtaining unit, configured to obtain an analysis result of each examination point according to the analysis results of the target video frames under the examination point judgment rules;
wherein the examination point judgment rule comprises: a texture and/or color difference of the image within a beaker does not exceed a preset threshold; and, to analyze whether the image state information of the chemical vessel satisfies the examination point judgment rule, the analysis unit is specifically configured to: analyze whether the texture and/or color difference of the image within the beaker exceeds the preset threshold.
9. An electronic device, comprising:
one or more processors; a memory; wherein one or more computer programs are stored in the memory, the one or more computer programs comprising instructions, which when executed by the processor, cause the electronic device to perform the steps of:
for each target video frame in the operation video, identifying a chemical vessel from the target video frame; the operation video is shot by a camera device positioned in front of and/or above the chemical experiment table;
analyzing whether image state information of the chemical vessels and/or a positional relationship among the chemical vessels satisfies an examination point judgment rule, to obtain an analysis result of the target video frame under the examination point judgment rule; and
obtaining an analysis result of each examination point according to the analysis results of the target video frames under the examination point judgment rules;
wherein the examination point judgment rule comprises: a texture and/or color difference of the image within a beaker does not exceed a preset threshold; and the analyzing whether the image state information of the chemical vessel satisfies the examination point judgment rule comprises: analyzing whether the texture and/or color difference of the image within the beaker exceeds the preset threshold.
10. A computer readable storage medium, characterized in that the computer readable storage medium has stored therein a computer program which, when run on a computer, causes the computer to perform the method according to any of claims 1-7.
CN202010171865.XA 2020-03-12 2020-03-12 Examination point analysis method and device for chemical experiment Active CN111753624B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010171865.XA CN111753624B (en) 2020-03-12 2020-03-12 Examination point analysis method and device for chemical experiment


Publications (2)

Publication Number Publication Date
CN111753624A CN111753624A (en) 2020-10-09
CN111753624B true CN111753624B (en) 2023-08-22

Family

ID=72672984

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010171865.XA Active CN111753624B (en) 2020-03-12 2020-03-12 Examination point analysis method and device for chemical experiment


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106920429A * 2015-12-24 2017-07-04 China Mobile Communications Group Co., Ltd. Information processing method and device
CN109035091A * 2018-07-25 2018-12-18 Shenzhen Yidu Information Industry Co., Ltd. Scoring method, device and equipment for student experiments
CN109271886A * 2018-08-29 2019-01-25 Wuhan University Human behavior analysis method and system for educational examination surveillance video
CN109727172A * 2019-03-18 2019-05-07 Shanghai Zhongke Education Equipment Group Co., Ltd. Artificial intelligence machine-learning experimental skill scoring system
CN110090424A * 2019-04-28 2019-08-06 Fujian Tongtongfa Technology Development Co., Ltd. Judgment system for a football slalom examination
CN110765814A * 2018-07-26 2020-02-07 Hangzhou Hikvision Digital Technology Co., Ltd. Blackboard-writing behavior recognition method and device, and camera
CN110765967A * 2019-10-30 2020-02-07 Tencent Technology (Shenzhen) Co., Ltd. Action recognition method based on artificial intelligence and related device

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1987505A2 (en) * 2005-11-21 2008-11-05 Software Secure, Inc. Systems, methods and apparatus for monitoring exams
US20180225982A1 (en) * 2010-01-15 2018-08-09 ProctorU, INC. System for online automated exam proctoring


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Mitzy A. Erdmann et al., "Video reports as a novel alternate assessment in the undergraduate chemistry laboratory", Chemistry Education Research and Practice, vol. 15, 2014, pp. 650-657 *


Similar Documents

Publication Publication Date Title
CN109284733B Shopping guide negative behavior monitoring method based on YOLO and a multitask convolutional neural network
CN103582697A (en) Image processing apparatus, image processing method and image processing system
KR20200066617A (en) Description description positioning method and apparatus of image, electronic device and storage medium
CN109664820A (en) Driving reminding method, device, equipment and storage medium based on automobile data recorder
CN107305635A (en) Object identifying method, object recognition equipment and classifier training method
CN103793719A (en) Monocular distance-measuring method and system based on human eye positioning
CN110516514B (en) Modeling method and device of target detection model
CN107292318B (en) Image significance object detection method based on center dark channel prior information
CN104281839A (en) Body posture identification method and device
CN107798314A (en) Skin color detection method and device
CN104079929A (en) Mosaic detection method and device
CN110688883A (en) Vehicle and pedestrian detection method and device
CN112560649A (en) Behavior action detection method, system, equipment and medium
CN110363111B (en) Face living body detection method, device and storage medium based on lens distortion principle
CN106454411A (en) Station caption processing method and device
CN106327531A (en) Panorama video identification method and device, and video playing method and device
EP4064113A1 (en) User information detection method and system, and electronic device
CN106372663A (en) Method and device for constructing classification model
CN111753624B (en) Examination point analysis method and device for chemical experiment
US9785829B2 (en) Information processing apparatus, information processing method, and non-transitory computer readable medium
CN112200230B (en) Training board identification method and device and robot
US10535154B2 (en) System, method, and program for image analysis
CN112287790A (en) Image processing method, image processing device, storage medium and electronic equipment
CN109901716B (en) Sight point prediction model establishing method and device and sight point prediction method
CN106530286A (en) Method and device for determining definition level

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant