CN114863734A - Evaluation method of organic polymer material synthesis experiment

Evaluation method of organic polymer material synthesis experiment

Info

Publication number
CN114863734A
CN114863734A (application CN202210789293.0A)
Authority
CN
China
Prior art keywords: experiment, demonstration, video, experimental, image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202210789293.0A
Other languages
Chinese (zh)
Other versions
CN114863734B (en)
Inventor
王丽霞
付会凯
王广龙
聂冬冬
王益普
张杰
张兴娟
马晓雨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xinxiang University
Original Assignee
Xinxiang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xinxiang University
Priority to CN202210789293.0A
Publication of CN114863734A
Application granted
Publication of CN114863734B
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00: Electrically-operated educational appliances
    • G09B5/02: Electrically-operated educational appliances with visual presentation of the material to be studied, e.g. using film strip
    • G09B23/00: Models for scientific, medical, or mathematical purposes, e.g. full-sized devices for demonstration purposes
    • G09B23/24: Models for scientific, medical, or mathematical purposes for chemistry
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30: Computing systems specially adapted for manufacturing

Abstract

An evaluation method for organic polymer material synthesis experiments that integrates demonstration and supervision: the experiment is divided into key steps, and a corresponding demonstration video is played for each step. While the operator performs the experiment with reference to the video, a camera monitors how the operator handles the experimental raw materials; an optimized neural network model identifies risk behaviors, and each such event is reported. The method effectively improves the standardization and safety of organic polymer material synthesis experiments.

Description

Evaluation method of organic polymer material synthesis experiment
Technical Field
The invention belongs to the field of organic polymer synthesis, and particularly relates to an evaluation method for an organic polymer material synthesis experiment.
Background
Synthesis experiments on organic polymer materials involve large quantities of flammable and volatile raw materials and therefore carry inherent risk. During an experiment, the operator must follow the experimental procedure exactly, maintain good safety awareness and habits, use the raw materials according to the experimental rules, and strictly observe the prescribed order of operations. Protecting the operator's safety is the basic precondition for running the experiment, so a sound method is needed both to demonstrate the organic polymer material synthesis experiment to the operator and to supervise and evaluate how the experimental raw materials are used.
In the prior art, the use of materials during an experiment is mostly evaluated (i.e., supervised) manually, which is time-consuming and labor-intensive. With multiple operators, supervision may fail to be timely and accurate, introducing risk into the experiment. Using cameras to collect images for monitoring has also been proposed, but no suitable algorithm dedicated to supervising synthesis experiments exists, so supervision accuracy is low and such systems may never reach practical use. Moreover, there is currently no quantifiable target of experimental risk that suits computer-vision supervision. Supervising the operator's motions has been proposed, but this involves complex kinematics, the algorithms remain immature, and the results are poor. Monitoring temperature rise, smoke, explosions, and the like has also been proposed; however, some of these phenomena are normal in experiments, and others happen too quickly for monitoring to be useful. What object to supervise with a computer, and under what supervision strategy, therefore remains an urgent problem for controlling experimental risk.
Disclosure of Invention
The invention provides a demonstration-and-supervision evaluation method for teaching organic polymer material synthesis experiments. It integrates demonstration with supervision: the experiment is divided into key steps, and a corresponding demonstration video is played for each. While the operator performs the experiment with reference to the video, a camera monitors how the operator uses the experimental raw materials; an optimized neural network model identifies risk behaviors, and each such event is reported. The method effectively improves the standardization and safety of organic polymer material synthesis experiments.
An evaluation method for an organic polymer material synthesis experiment:
A complete experiment is divided into several key steps, and the experimental reagents to be used in each key step are specified. For each key step, a corresponding demonstration video is played to the operator, who then performs the operation according to the video. During the demonstration and experiment processes, the experimental process is supervised by an automatic computer algorithm, which identifies and reports risk behaviors involving reagents that should not be used.
During the demonstration and experiment processes, a camera collects video of the operator, and the video is preprocessed.
The preprocessed video is decomposed into frames of images, and each frame is processed with a scale factor and a displacement factor.
Each processed frame image is input into a neural network model NN for identification, so as to identify risk behaviors in the demonstration process and in the experiment process, specifically as follows. A1, after the demonstration process starts, one frame of the video is acquired and its sub-images are traversed. A2, the traversal intercepts sub-images at different displacements and scales as input to the neural network model NN and obtains the output vector O; when a particular dimension O(z) of the output takes its maximum value, the sub-image is recognized as an image of the corresponding type of experimental reagent. A3, according to the traversal result, the position of each reagent in the image, i.e., the position and size of its sub-image, is obtained. A4, if not all kinds of reagents can be found in the first frame image, the next frames of the video continue to be collected and A2 and A3 are repeated until the positions of all kinds of reagents in the image are obtained. A5, after A1-A4 are finished, video collection continues and, according to the positions of the various reagents in the image, the sub-image at each corresponding position in the next frame is input into the neural network model NN for detection. A6, if the output of the model NN is no longer the corresponding category for a sub-image, the reagent is considered to have been moved or used in the demonstration stage, and a risk warning comprising the laboratory bench number and the raw-material type is sent to the control software. B1, after the experiment process starts, a frame of the video is acquired; the reagents allowed by the key step are excluded, and the sub-images at the positions of the various reagents that should not be used are input into the neural network model NN for detection. B2, if the output of the neural network model NN is no longer the corresponding category, the reagent is considered to have been moved or used in the experiment stage, and a risk warning comprising the laboratory bench number and the raw-material type is sent to the control software.
The output layer of the model NN is given by equations (12)-(13) of the detailed description [rendered as images in the original]: O represents the output of the neural network model, w5(i, z) the connection between element C5(i) of the 5th hidden layer and output element O(z), b the linear bias, and u the intermediate (process) vector.
Identifying risk behaviors during the demonstration process and during the experiment process further comprises: the risk warning is issued only after the output of the model NN has failed to be the corresponding category over several consecutive frames, which reduces the false-alarm rate.
After the demonstration process of a key step is finished, entry into the experimental process of that key step is controlled manually.
The evaluation method is carried out simultaneously in several experimental areas, each of which comprises a camera, a display, an experiment bench, and reagents.
A display is arranged in each experimental area to play the demonstration videos of the course experiment.
The camera collects video of the operator's behavior during the experiment.
The system further comprises a control computer that implements 1) switching between key steps, 2) switching between the demonstration process and the experimental process within a key step, and 3) monitoring of risk behaviors.
When a key step is switched to, its demonstration process begins; after the demonstration is finished, control switches to the experimental process of that key step.
A supervision system comprises a camera, a display, and a control computer for implementing the supervision method described above.
The control computer stores a pre-trained neural network model NN.
The invention has the advantages that:
1. A method for demonstrating and evaluating (supervising) organic polymer material synthesis experiments is innovatively provided: the complete experiment is divided into several key steps; the experimental raw materials to be used in each key step are specified, with the steps controlled by the teacher; and in each key step a corresponding demonstration video is played for the operator, who carries out the experiment according to it. In particular, supervision is divided into a demonstration process and an experimental process, and the risk cue is expressed as movement of the experimental raw materials, which makes supervision of the experimental process by an automatic computer algorithm feasible. Extensive experimental data show that the method quickly and accurately identifies and reports risk behaviors involving experimental raw materials that should not be used.
2. The neural network model structure (including the excitation function) is optimized for the characteristics of organic polymer synthesis experiments. A multi-layer, multi-scale network structure identifies image features at different scales, so the different kinds of experimental raw materials in the collected video can be recognized automatically with higher efficiency, speed, and accuracy. In addition, the collected video data are preprocessed for the convenience of the neural network, which reduces the network's computational load while preserving the integrity of the information, further improving efficiency and accuracy.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
FIG. 1 is a schematic diagram of a system deployment for use with the present invention.
Detailed Description
1. System structure
Each experimental area comprises a camera 1, a display 2, an experiment bench 3, and reagents 4.
The camera collects video of the operator's behavior during the experiment for analysis. The display plays the demonstration videos of the course experiment. The experiment bench carries the various reagents and provides the experimental work site. The reagents may be any of the reagents used in the experiment; preferably, each reagent bottle is marked with a label or texture that aids identification.
2. Demonstration and supervision method
Step 1: overall method for demonstrating, operating, and supervising the experiment
A complete course experiment is divided into several key steps, and the experimental raw materials to be used in each key step are specified. In each key step a corresponding demonstration video is played for the operator, who carries out the experiment according to it. During the demonstration and experiment processes, the experimental process is supervised by an automatic computer algorithm, which identifies and reports risk behaviors involving experimental raw materials that should not be used.
A display is arranged in each experimental area to play the demonstration videos of the course experiment.
A control computer is deployed at the teacher's station and control software is installed on it. Through the control software, the teacher implements 1) key-step switching, 2) switching between the demonstration and experimental processes within a key step, and 3) monitoring of risk behaviors.
When the teacher switches the control software to a key step of the experiment, the software selects the corresponding demonstration video and plays it on the display of each experimental area. After the key step is switched, the control software starts the camera of each experimental area over the wireless network, captures video, analyzes the acquired video with the automatic intelligent algorithm, and reports to the teacher in the control software whenever a risk behavior is identified.
When a key step is switched to, its demonstration process begins; after the demonstration is finished, the teacher sets the control software to enter the experimental process of that key step.
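For illustration, the key-step bookkeeping and the teacher-driven switching between the demonstration and experimental processes can be sketched as follows in Python; the class names, fields, and example values are assumptions made for this sketch, not part of the patented method.

from dataclasses import dataclass

@dataclass
class KeyStep:
    name: str                     # e.g. "monomer purification" (hypothetical)
    allowed_reagents: set         # reagents this key step is allowed to use
    demo_video: str               # demonstration video played on the bench displays

@dataclass
class ExperimentSession:
    steps: list
    current: int = 0
    phase: str = "demonstration"  # switched to "experiment" by the teacher

    def switch_to_step(self, index):
        """Teacher switches the key step; every bench replays that step's demo."""
        self.current = index
        self.phase = "demonstration"
        return self.steps[index]

    def start_experiment_phase(self):
        """Teacher manually moves the current key step from demo to experiment."""
        self.phase = "experiment"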
Step 2: video acquisition and preprocessing in the experimental areas
A camera deployed in each experimental area collects video of the operator's behavior during the demonstration and experiment processes for analysis. While the video is collected, compression preprocessing is performed on the video data at the camera end to reduce the data flow. The method is described in detail below.
The setup comprises a camera for acquiring video, supporting equipment for fixing the camera, and background processing equipment for receiving the camera's video data. The camera is mounted on the supporting equipment and switches on once powered; it films the experimental area, with its mounting height and orientation adjusted so that the experimental raw materials lie within its field of view. The camera acquires the video, preprocesses it, and transmits the video data to the background processing equipment over the wireless local area network.
The camera preprocesses the video data to reduce the data flow, so that a single background processing device can process video data from several experimental areas in real time and supervise them synchronously. The preprocessing method is as follows.
Let the size of each frame of video collected directly by the camera be M x N. A time-sequence window W is set, and the W frames inside the window form a three-dimensional matrix of dimension M x N x W, denoted V(m, n, w).

Equations (1)-(3) are then defined [rendered as images in the original]; they give the partial derivatives of the matrix V along the n, m, and w directions, taken with unit increments in the three directions:

V_n = dV/dn, V_m = dV/dm, V_w = dV/dw.

In a neighborhood Omega of image coordinate (m, n), equation (4) is defined [rendered as an image in the original; it aggregates the partial derivatives over the neighborhood and the time window and compares the result with a threshold]. Omega denotes the size of the spatial neighborhood of an image pixel and W the size of the time window; the preferred neighborhood size appears as an image in the original, and W = 6. epsilon denotes an empirical threshold whose preferred value was determined by experiment [also an image in the original].

According to equation (4), the spatial size of the video is reduced and the video length is shortened [both reduction factors appear as images in the original]. Data traffic is thus reduced while video quality is maintained.
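A minimal NumPy sketch of this preprocessing idea follows. Because equations (1)-(4) and the preferred reduction factors appear only as images in the original, the aggregation rule, the 2 x 2 spatial reduction, and the threshold value below are assumptions.

import numpy as np

def preprocess_window(V, eps=0.05):
    """V: float array of shape (M, N, W) holding W consecutive grayscale frames."""
    dV_m, dV_n, dV_w = np.gradient(V)          # partial derivatives along m, n, w
    activity = np.abs(dV_m) + np.abs(dV_n) + np.abs(dV_w)

    # Reduce the spatial size: keep one pixel per 2 x 2 block (assumed factor).
    M, N, W = V.shape
    V_small = V[:M - M % 2:2, :N - N % 2:2, :]
    act_small = activity[:M - M % 2:2, :N - N % 2:2, :]

    # Shorten the video: drop frames whose mean gradient activity falls below
    # the empirical threshold eps, since near-static frames add little information.
    keep = act_small.mean(axis=(0, 1)) >= eps
    if not keep.any():
        keep[0] = True                         # always retain at least one frame
    return V_small[:, :, keep]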
Step 3: method for identifying experimental risk behaviors
Risk behaviors are identified from the videos filmed in the experimental areas: the videos are processed with the automatic intelligent algorithm, and the control software outputs a signal for any risk behavior found in a video.
According to the characteristics and basic requirements of organic polymer material synthesis experiments, experimental risk behaviors are divided into two types: using experimental raw materials during the demonstration process, and using unnecessary experimental raw materials during the experimental process.
For these two types of behavior, recognition algorithms are proposed respectively:
S31, identifying risk behaviors in the demonstration process;
S32, identifying risk behaviors in the experimental process.
A neural network model NN is created and used in methods S31 and S32.
The neural network model NN is composed of 1 input layer, 5 hidden layers, and 1 output layer.
The input layer of NN is a sub-image of one frame of the experimental-area video preprocessed in step 2, and is expressed as

X(u, v) = V(s_m * u + d_m, s_n * v + d_n, w)   (5)

[equation (5) is rendered as an image in the original; the form above is reconstructed from the surrounding definitions]. In the formula, V denotes the preprocessed experimental-area video, (m, n) the spatial coordinates of a pixel in one frame, and w the frame number of the video. (u, v) denote the coordinates within the sub-image, i.e., the coordinates of the input layer X. s_m and s_n denote the scale factors of the sub-image, i.e., its relative scaling with respect to the original video frame; d_m and d_n denote its displacement factors, i.e., its relative position within the original frame. Through the scale factors and displacement factors, any sub-area (sub-image) of the original frame can be mapped onto the fixed-size input layer X. The object to be identified can therefore be positioned accurately and occupy a suitable proportion of the image, avoiding targets that are too small to identify.
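As a sketch, the scale-factor/displacement-factor mapping of equation (5) can be implemented as follows; the 64 x 64 input size and nearest-neighbour sampling are illustrative assumptions.

import numpy as np

def subimage(frame, s_m, s_n, d_m, d_n, size=64):
    """X(u, v) = frame(s_m*u + d_m, s_n*v + d_n) for u, v in [0, size)."""
    u = np.arange(size)
    rows = np.clip((s_m * u + d_m).astype(int), 0, frame.shape[0] - 1)
    cols = np.clip((s_n * u + d_n).astype(int), 0, frame.shape[1] - 1)
    return frame[np.ix_(rows, cols)]           # fixed-size input layer X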
The 1st hidden layer of NN is the following mapping of the input layer:

C1(m, n, j) = g( sum over (p, q) in Phi_j of k_j(p, q) * X(m + p, n + q) + b_j )   (6)

[equation (6) is rendered as an image in the original; the convolutional form above is reconstructed from the surrounding definitions]. Phi denotes a matrix convolution window and (p, q) the spatial coordinates within it; the size of the convolution window is 11 x 11. j numbers the matrix convolution windows, of which there are 32. b_j denotes a linear bias (offset). C1 denotes the 1st hidden layer, and (m, n, j) are its node coordinates.

The function g is defined by equation (7) [rendered as an image in the original]. It is a nonlinear excitation function that gives the neural network model its nonlinear fitting capacity. As an improvement, a parameter alpha is introduced to control the convergence speed of the excitation function, which improves the model's adaptability to different kinds of recognition objects; the preferred value of alpha was adjusted by experiment [and appears as an image in the original].
The 2nd hidden layer H2 of NN is the following mapping of hidden layer C1:

H2(m, n, j) = g( max over (p, q) in a 4 x 4 window of C1(4m + p, 4n + q, j) + b )   (8)

[equation (8) is rendered as an image in the original; the max-pooling form above is reconstructed from the surrounding description]. max denotes taking the maximum within the 4 x 4 window, and b denotes a linear bias. The formula maps the maximum of 4 x 4 nodes of C1 onto one node of H2, so the spatial size of H2 is reduced to 1/16 of that of C1. g is defined as in (7).
The 3rd hidden layer C3 of NN is the following mapping of hidden layer H2:

C3(m, n, j) = g( sum over (p, q) in Phi'_j of k'_j(p, q) * H2(m + p, n + q) + b_j )   (9)

[equation (9) is rendered as an image in the original; the convolutional form above mirrors equation (6)]. Phi' denotes a matrix convolution window and (p, q) the spatial coordinates within it; the size of the convolution window is 7 x 7. j numbers the 32 matrix convolution windows, and b_j denotes a linear bias. C3 denotes the 3rd hidden layer, and (m, n, j) are its node coordinates.
The 4th hidden layer H4 of NN is the following mapping of hidden layer C3:

H4(m, n, j) = g( max over (p, q) in a 4 x 4 window of C3(4m + p, 4n + q, j) + b )   (10)

[equation (10) is rendered as an image in the original; the form above mirrors equation (8)]. max denotes taking the maximum within the 4 x 4 window, and b denotes a linear bias. The formula maps the maximum of 4 x 4 nodes of C3 onto one node of H4, so the spatial size of H4 is reduced to 1/16 of that of C3. g is defined as in (7).
The 1st to 4th hidden layers of the neural network establish a multi-scale network structure that identifies image features at different scales, so that both the features of the containers and the features of the raw materials can be detected, improving detection performance. At the same time, the number of nodes decreases in the downstream layers of the network, improving the model's computational efficiency.
The 5th hidden layer C5 of NN is the following mapping of hidden layer H4:

C5(i) = g( sum over (m, n, j) of w4(m, n, j, i) * H4(m, n, j) + b_i )   (11)

[equation (11) is rendered as an image in the original; the fully connected form above is reconstructed from the surrounding definitions]. C5 denotes a vector and i its coordinate, i.e., the vector dimension, which is 256. w4 denotes the connection, i.e., the linear relationship, between the 32 nodes sharing the same spatial coordinates in the 4th hidden layer and the i-th element of the vector C5, and b_i denotes a linear bias. g is defined as in (7).
The output layer of NN is the following mapping of hidden layer C5:

u(z) = sum over i of w5(i, z) * C5(i) + b_z,   O(z) = g(u(z))   (12)-(13)

[equations (12)-(13) are rendered as images in the original; the form above is reconstructed from the surrounding definitions]. O, a vector, represents the output layer of the neural network model, and z indexes its elements; w5(i, z) denotes the connection between element C5(i) of the 5th hidden layer and output element O(z); b_z denotes a linear bias; u is the intermediate (process) vector; and g is defined as in (7).

Each element of O represents one kind of experimental raw material: O(z) = 1 indicates that the input layer is an image of that kind of raw material, and O(z) = 0 indicates that it is not. For experimental raw materials with similar appearances, distinctive textures can be marked on the outer packaging to create a visible difference in the images. The dimension of O equals the number of kinds of experimental raw material that need supervision.
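Under the reconstructions above, the seven-layer structure (conv 11 x 11 with 32 windows, 4 x 4 max-pooling, conv 7 x 7 with 32 windows, 4 x 4 max-pooling, a 256-element fully connected layer, and a fully connected output layer) could be sketched in PyTorch as below. Equation (7) is an image in the original, so the alpha-scaled sigmoid used for g is an assumption consistent with the surrounding description.

import torch
import torch.nn as nn

class ScaledSigmoid(nn.Module):
    """Assumed form of g in equation (7): g(x) = 1 / (1 + exp(-alpha * x))."""
    def __init__(self, alpha=1.0):
        super().__init__()
        self.alpha = alpha
    def forward(self, x):
        return torch.sigmoid(self.alpha * x)

class ReagentNN(nn.Module):
    def __init__(self, n_reagents, in_size=64, alpha=1.0):
        super().__init__()
        g = ScaledSigmoid(alpha)
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=11, padding=5), g,  # 1st hidden layer C1
            nn.MaxPool2d(4),                                 # 2nd: 4x4 max, area to 1/16
            nn.Conv2d(32, 32, kernel_size=7, padding=3), g,  # 3rd hidden layer C3
            nn.MaxPool2d(4),                                 # 4th: 4x4 max, area to 1/16
        )
        flat = 32 * (in_size // 16) ** 2
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(flat, 256), g,                         # 5th hidden layer C5
            nn.Linear(256, n_reagents), g,                   # output layer O
        )
    def forward(self, x):                                    # x: (B, 1, 64, 64)
        return self.classifier(self.features(x))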
Images of the different kinds of experimental raw materials are prepared and labeled 0 or 1 according to the value rule for O above, forming the training sample set for the model NN. For a sample labeled T, the sample's input image is propagated through equations (5)-(13) to obtain the output O, and a cost function E is computed:

E = sum over z of (O(z) - T(z))^2   (14)

[equation (14) is rendered as an image in the original; the squared-error form above is inferred from the 0/1 labeling]. The control parameters alpha corresponding to the elements of O are tuned appropriately on the training sample set; as a preferred configuration, the ratio of the mean of each class of images to the maximum of the class means is taken as the control parameter for that class. The cost function E is minimized by the backward propagation method, which completes the learning of all parameters of the neural network models (5)-(13).
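A hedged sketch of this training procedure follows, assuming the squared-error cost inferred above; the optimizer choice and learning rate are illustrative assumptions.

import torch

def train(model, loader, epochs=20, lr=1e-3):
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    cost = torch.nn.MSELoss()                  # E = mean of (O - label)^2
    for _ in range(epochs):
        for images, onehot_labels in loader:   # images: (B, 1, 64, 64) sub-images
            opt.zero_grad()
            loss = cost(model(images), onehot_labels)
            loss.backward()                    # backward propagation of the cost
            opt.step()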
S31: identifying risk behaviors in the demonstration process.
Following step 1, the teacher switches to a specific key step, which enters its demonstration process.
During the demonstration process, the control software collects video from each experimental area and analyzes and identifies all of it frame by frame. The identification process is as follows.
A1: after the demonstration process starts, one frame of the video is acquired and its sub-images are traversed.
A2: the traversal intercepts sub-images at different displacements and scales as input to the neural network model NN and obtains the output O. When a particular dimension O(z) of the output takes its maximum value, the sub-image is recognized as an image of the corresponding kind of experimental raw material.
A3: according to the traversal result, the position of each kind of experimental raw material in the image, i.e., the position and size of its sub-image, is obtained.
A4: if not all kinds of experimental raw materials can be found in the first frame image, the next frames of the video continue to be collected and A2 and A3 are repeated until the positions of all kinds of experimental raw materials in the image are obtained.
A5: after A1-A4 are finished, the next frames of the video continue to be acquired and, according to the positions of the various experimental raw materials in the image, the sub-image at each corresponding position is input into the neural network model NN for detection.
A6: if the output of the model NN is no longer the corresponding category for a sub-image, the experimental raw material is considered to have been moved or used in the demonstration stage, and a risk warning comprising the laboratory bench number and the raw-material type is sent to the control software for the teacher to view.
As an improvement, the risk warning is issued only after the output of the model NN has failed to be the corresponding category over several consecutive frames, which reduces the false-alarm rate.
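The A1-A6 loop could be sketched as follows, reusing the subimage helper and the ReagentNN model sketched earlier; the stride, the scale set, and the 0.5 acceptance threshold are illustrative assumptions.

import torch

def localize_reagents(frame, model, scales=(1.0, 2.0, 4.0), stride=16):
    """A1-A4: return {reagent index z: (s, r, c)} for the best-scoring sub-image."""
    best = {}
    for s in scales:
        for r in range(0, frame.shape[0], stride):
            for c in range(0, frame.shape[1], stride):
                x = torch.as_tensor(subimage(frame, s, s, r, c),
                                    dtype=torch.float32)[None, None]
                out = model(x)[0]               # output vector O
                z = int(out.argmax())           # dimension taking the maximum
                score = float(out[z])
                if score > best.get(z, (0.0,))[0]:
                    best[z] = (score, s, r, c)
    return {z: v[1:] for z, v in best.items() if v[0] > 0.5}

def reagent_still_present(frame, model, z, pos):
    """A5/A6: does the sub-image at the stored position still score as type z?"""
    s, r, c = pos
    x = torch.as_tensor(subimage(frame, s, s, r, c), dtype=torch.float32)[None, None]
    return int(model(x)[0].argmax()) == z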
S32: identifying risk behaviors in the experimental process.
Following step 1, after the demonstration process of a key step is finished, the teacher sets the software to enter the experimental process of that key step.
During the experiment, the control software continues to capture video from each experimental area and performs monitoring based on the position of each kind of experimental raw material obtained in step A4 above.
B1: a frame of the video is acquired; the experimental raw materials allowed by the key step are excluded, and the sub-images at the positions of the various raw materials that should not be used are input into the neural network model NN for detection.
B2: if the output of the model NN is no longer the corresponding category, the experimental raw material is considered to have been moved or used in the experiment stage, and a risk warning comprising the laboratory bench number and the raw-material type is sent to the control software for the teacher to view.
Similarly, as an improvement, the risk warning is issued only after the output of the model NN has failed to be the corresponding category over several consecutive frames, reducing the false-alarm rate.
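The B1-B2 check with the consecutive-frame improvement could be sketched as below; the streak length of 5 frames and the reporting mechanism are illustrative assumptions.

def monitor_experiment(frames, model, positions, allowed, bench_id, n_consecutive=5):
    """Warn only after a disallowed reagent is missing for several frames in a row."""
    streak = {z: 0 for z in positions if z not in allowed}  # B1: exclude allowed reagents
    for frame in frames:
        for z in streak:
            if reagent_still_present(frame, model, z, positions[z]):
                streak[z] = 0
            else:
                streak[z] += 1
                if streak[z] == n_consecutive:              # B2 + debounce improvement
                    print(f"RISK: bench {bench_id}, reagent type {z} moved or used")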
In this way, the risk of the experiment is evaluated from the alarm information, and the supervision of the experimental process is completed.
The invention provides a demonstration and supervision method for teaching organic polymer material synthesis experiments that integrates demonstration with supervision: the key steps of the experiment are demonstrated, and the standard use of experimental raw materials is supervised during both the demonstration process and the experimental process of each key step. Tests of the method show that it intelligently identifies unnecessary use of different kinds of experimental raw materials with high accuracy and speed, can assist in supervising the organic polymer material synthesis process, and improves the standardization and safety of organic polymer material synthesis experiments.
It will be appreciated by those skilled in the art that while a number of exemplary embodiments of the invention have been shown and described in detail herein, many other variations or modifications can be made, which are consistent with the principles of this invention, and which are directly determined or derived from the disclosure herein, without departing from the spirit and scope of the invention. Accordingly, the scope of the invention should be understood and interpreted to cover all such other variations or modifications.

Claims (10)

1. An evaluation method for an organic polymer material synthesis experiment, characterized by comprising:
dividing a complete experiment into several key steps; specifying the experimental reagents to be used in each key step; playing a corresponding demonstration video to the operator in each key step, the operator performing the operation according to the demonstration video; and, during the demonstration and experiment processes, supervising the experimental process with an automatic computer algorithm that identifies and reports risk behaviors involving reagents that should not be used;
during the demonstration and experiment processes, collecting video of the operator with a camera and preprocessing the video;
decomposing the preprocessed video into frames of images and processing each frame with a scale factor and a displacement factor;
inputting each processed frame image into a neural network model NN for identification, so as to identify risk behaviors in the demonstration process and in the experiment process, specifically: A1, after the demonstration process starts, acquiring one frame of the video and traversing its sub-images; A2, the traversal intercepting sub-images at different displacements and scales as input to the neural network model NN and obtaining the output vector O, wherein, when a particular dimension O(z) of the output takes its maximum value, the sub-image is recognized as an image of the corresponding type of experimental reagent; A3, obtaining, according to the traversal result, the position of each reagent in the image, i.e., the position and size of its sub-image; A4, if not all kinds of reagents can be found in the first frame image, continuing to collect the next frames of the video and repeating A2 and A3 until the positions of all kinds of reagents in the image are obtained; A5, after A1-A4 are finished, continuing to acquire the next frames of the video and, according to the positions of the various reagents in the image, inputting the sub-image at each corresponding position into the neural network model NN for detection; A6, if the output of the model NN is no longer the corresponding category, considering the reagent to have been moved or used in the demonstration stage and sending a risk warning comprising the laboratory bench number and the raw-material type to the control software; B1, after the experiment process starts, acquiring a frame of the video, excluding the reagents allowed by the key step, and inputting the sub-images at the positions of the various reagents that should not be used into the neural network model NN for detection; B2, if the output of the neural network model NN is no longer the corresponding category, considering the reagent to have been moved or used and sending a risk warning comprising the laboratory bench number and the raw-material type to the control software;
wherein the output layer of the model NN is given by equations (12)-(13) of the description [rendered as images in the original]: O represents the output of the neural network model, w5(i, z) the connection between element C5(i) of the 5th hidden layer and output element O(z), b the linear bias, and u the intermediate (process) vector.
2. The method of claim 1, wherein identifying risk behaviors during the demonstration process and during the experiment process comprises: issuing the risk warning only after the output of the model NN has failed to be the corresponding category over several consecutive frames, thereby reducing the false-alarm rate.
3. The method of claim 1, wherein, after the demonstration process of a key step is finished, entry into the experimental process of that key step is controlled manually.
4. The method of claim 1, wherein the evaluation method is carried out simultaneously in several experimental areas, each of which comprises a camera, a display, an experiment bench, and reagents.
5. The method of claim 4, wherein a display is arranged in each experimental area to play demonstration videos of the course experiment.
6. The method of claim 4, wherein the camera collects video of the operator's behavior during the experiment.
7. The method of claim 4, further comprising a control computer that implements 1) switching between key steps, 2) switching between the demonstration process and the experimental process within a key step, and 3) monitoring of risk behaviors.
8. The method of claim 7, wherein, when a key step is switched to, its demonstration process begins, and after the demonstration is finished, control switches to the experimental process of that key step.
9. A supervision system for organic polymer material synthesis experiments, characterized by comprising a camera, a display, and a control computer for implementing the supervision method of any one of claims 1-8.
10. The system of claim 9, wherein the control computer stores a pre-trained neural network model NN.
CN202210789293.0A 2022-07-06 2022-07-06 Evaluation method for organic polymer material synthesis experiment Active CN114863734B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210789293.0A CN114863734B (en) 2022-07-06 2022-07-06 Evaluation method for organic polymer material synthesis experiment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210789293.0A CN114863734B (en) 2022-07-06 2022-07-06 Evaluation method for organic polymer material synthesis experiment

Publications (2)

Publication Number Publication Date
CN114863734A true CN114863734A (en) 2022-08-05
CN114863734B CN114863734B (en) 2022-09-30

Family

ID=82626201

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210789293.0A Active CN114863734B (en) 2022-07-06 2022-07-06 Evaluation method for organic polymer material synthesis experiment

Country Status (1)

Country Link
CN (1) CN114863734B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107507114A (en) * 2017-09-08 2017-12-22 赵宇航 A kind of Internet of Things teaching platform method of controlling security and device
CN111915460A (en) * 2020-05-07 2020-11-10 同济大学 AI vision-based intelligent scoring system for experimental examination
US20210390313A1 (en) * 2020-06-11 2021-12-16 Tata Consultancy Services Limited Method and system for video analysis
CN114005054A (en) * 2021-10-09 2022-02-01 上海锡鼎智能科技有限公司 AI intelligence system of grading
CN114640752A (en) * 2022-03-28 2022-06-17 杭州海康威视系统技术有限公司 Auxiliary method and device for experimental learning
CN114663834A (en) * 2022-03-22 2022-06-24 天目爱视(北京)科技有限公司 Express storage site monitoring method


Also Published As

Publication number Publication date
CN114863734B (en) 2022-09-30

Similar Documents

Publication Publication Date Title
US11468538B2 (en) Segmentation and prediction of low-level temporal plume patterns
US10937144B2 (en) Pipe feature identification using pipe inspection data analysis
KR102035592B1 (en) A supporting system and method that assist partial inspections of suspicious objects in cctv video streams by using multi-level object recognition technology to reduce workload of human-eye based inspectors
CN111047568A (en) Steam leakage defect detection and identification method and system
CN115294117B (en) Defect detection method and related device for LED lamp beads
CN115169855B (en) Unsafe state detection method based on digital twin workshop mixed data set
CN106034202A (en) Adjusting method and adjusting device for video splicing camera
JP2023553443A (en) Plant detection and display system
CA3081967C (en) Method and system for connected advanced flare analytics
CN111178424A (en) Petrochemical production site safety compliance real-time detection system and method
US20220281177A1 (en) Ai-powered autonomous 3d printer
CN114863734B (en) Evaluation method for organic polymer material synthesis experiment
CN115083229B (en) Intelligent recognition and warning system of flight training equipment based on AI visual recognition
CN116419059A (en) Automatic monitoring method, device, equipment and medium based on behavior label
Skladchykov et al. Application of YOLOX deep learning model for automated object detection on thermograms
CN115438945A (en) Risk identification method, device, equipment and medium based on power equipment inspection
KR102281100B1 (en) System and method for providing heat transporting pipe status information
Danajitha et al. Detection of Cracks in High Rise Buildings using Drones
EP3819817A1 (en) A method and system of evaluating the valid analysis region of a specific scene
WO2023007535A1 (en) Sewage pipe interior abnormality diagnosis assistance system, client machine and server machine for sewage pipe interior abnormality diagnosis assistance system, and related method
US20210142481A1 (en) Method and System of Evaluating the Valid Analysis Region of a Specific Scene
CN113297910B (en) Distribution network field operation safety belt identification method
EP3748444B1 (en) Method and system for connected advanced flare analytics
CN116051824A (en) Multi-source heterogeneous identification algorithm for intelligent management of power station
CN118038489A (en) Visual algorithm testing process and data optimizing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant