CN112102126A - Student manual ability culture method and device based on neural network

Student manual ability culture method and device based on neural network

Info

Publication number
CN112102126A
Authority
CN
China
Prior art keywords
picture
neural network
added
trained
learning item
Prior art date
Legal status
Pending
Application number
CN202010901002.3A
Other languages
Chinese (zh)
Inventor
海克洪 (Hai Kehong)
张磊 (Zhang Lei)
吴少美 (Wu Shaomei)
Current Assignee
Hubei Meihe Yisi Education Technology Co ltd
Original Assignee
Hubei Meihe Yisi Education Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Hubei Meihe Yisi Education Technology Co ltd filed Critical Hubei Meihe Yisi Education Technology Co ltd
Priority to CN202010901002.3A priority Critical patent/CN112102126A/en
Publication of CN112102126A publication Critical patent/CN112102126A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G06Q50/205 Education administration or guidance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Business, Economics & Management (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Educational Administration (AREA)
  • Molecular Biology (AREA)
  • Mathematical Physics (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Computing Systems (AREA)
  • Tourism & Hospitality (AREA)
  • Strategic Management (AREA)
  • Educational Technology (AREA)
  • Biomedical Technology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Economics (AREA)
  • Human Resources & Organizations (AREA)
  • Marketing (AREA)
  • Primary Health Care (AREA)
  • General Business, Economics & Management (AREA)
  • Image Analysis (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a neural network-based method and device for training students' hands-on ability. The method comprises the following steps: acquiring the picture category of a learning item to be added, acquiring corresponding pictures from a network according to that category as pictures to be selected, preprocessing the pictures to be selected, and taking the processed pictures as pictures to be trained; establishing a neural network model, training the pictures of the learning items to be added and the pictures to be trained with the model, and acquiring a first characteristic value and a second characteristic value; and updating the pictures of the learning items to be added according to the first and second characteristic values while synchronously updating the learning items. By training both sets of pictures with the neural network model and using the first and second characteristic values to update the learning items to be added automatically, the method cultivates students' hands-on ability more effectively.

Description

Student manual ability culture method and device based on neural network
Technical Field
The invention relates to the technical field of computer software, and in particular to a neural network-based method and device for training students' hands-on ability.
Background
Distance education, also called modern distance education or network education in some of the literature already in use, is one form of adult education. It is a teaching mode that uses transmission media such as television and the Internet; it breaks through the limits of time and space and differs from the traditional mode in which students must attend classes at a school. Students who use this teaching mode are typically part-time learners. Because they do not need to attend classes at a fixed place, they can take lessons anytime and anywhere, and can also study through channels such as television broadcasts, the Internet, tutoring hotlines, course-study groups and face-to-face (correspondence) sessions. Distance education is a new concept that emerged after modern information technology was applied to education, namely education carried out with network technology and a networked environment.
Network education is not limited to the learning of knowledge; existing network education can also cultivate students' hands-on ability. However, it usually does so only through a fixed set of courses that are not updated in real time, so students may end up repeating the same courses many times. The neural network-based method for training students' hands-on ability is proposed to address this.
The above is only for the purpose of assisting understanding of the technical aspects of the present invention, and does not represent an admission that the above is prior art.
Disclosure of Invention
In view of the above, the invention provides a neural network-based method and device for training students' hands-on ability, aiming to solve the technical problem in the prior art that hands-on ability training courses cannot be updated autonomously by means of a convolutional neural network model.
The technical scheme of the invention is realized as follows:
In one aspect, the invention provides a neural network-based student hands-on ability training method, which comprises the following steps:
s1, acquiring the picture category of the learning item to be added, acquiring a corresponding picture from a network according to the picture category of the learning item to be added as a picture to be selected, preprocessing the picture to be selected, and acquiring the processed picture to be selected as a picture to be trained;
s2, establishing a neural network model, training the picture of the learning item to be added according to the neural network model to obtain a first characteristic value of the picture of the learning item to be added, and training the picture to be trained according to the neural network model to obtain a second characteristic value of the picture to be trained;
and S3, updating the picture of the learning item to be added according to the first characteristic value and the second characteristic value, and synchronously updating the learning item.
On the basis of the foregoing technical solution, preferably, step S1 further includes the following steps: the categories of the learning-item pictures to be added include calligraphy, painting, physics, biology and chemistry; picture sets of different categories are established according to the categories of the added learning-item pictures and the corresponding pictures are stored; corresponding pictures are obtained from the network according to the learning-item category as pictures to be selected; the pictures to be selected are preprocessed; and the processed pictures to be selected are taken as the pictures to be trained.
On the basis of the above technical solution, preferably, corresponding pictures and the corresponding picture parameters are obtained from the network according to the learning-item category; a local picture-screening parameter range is obtained and used to screen the picture parameters; the pictures that pass the screening are taken as the pictures to be selected; and the pictures to be selected are unified in size and taken as the pictures to be trained.
On the basis of the above technical solution, preferably, step S2 further includes the following steps: establishing the neural network model and a tuple conversion rule, the tuple being of a floating-point type, and the neural network model comprising a convolutional layer, an activation function, a pooling layer and a fully connected layer; converting the pictures of the learning items to be added and the pictures to be trained into tuple format according to the tuple conversion rule; training the pictures of the learning items to be added with the neural network model to obtain their first characteristic values; and training the pictures to be trained with the neural network model to obtain their second characteristic values.
On the basis of the above technical solution, preferably, training the pictures with the neural network model further includes the following steps: performing convolution calculation on the pictures of the learning items to be added and the pictures to be trained through the convolutional layer and extracting characteristic parameters; activating the characteristic parameters through the activation function; compressing the activated characteristic parameters through the pooling layer; and connecting the compressed activated characteristic parameters through the fully connected layer to obtain the first characteristic values of the pictures of the learning items to be added and the second characteristic values of the pictures to be trained.
On the basis of the above technical solution, preferably, step S3 further includes the following steps: comparing the first characteristic values with the second characteristic values; screening out the second characteristic values that differ from the first characteristic values; adding the pictures to be trained that correspond to those second characteristic values into the learning items to be added; and updating the learning items according to the categories of those pictures to be trained.
On the basis of the above technical solution, preferably, the method further includes the following steps: acquiring the student's hands-on ability data, the hands-on ability data including the learning items the student has learned and the corresponding added learning-item pictures; comparing the hands-on ability data with the learning items; and pushing the corresponding learning items to the student according to the comparison result.
In another aspect, the invention provides a neural network-based student hands-on ability training device, which includes:
the processing module is used for acquiring the picture category of the learning item to be added, acquiring a corresponding picture from a network according to the picture category of the learning item to be added to serve as a picture to be selected, preprocessing the picture to be selected, and acquiring the processed picture to be selected as a picture to be trained;
the calculation module is used for establishing a neural network model, training the pictures of the learning items to be added according to the neural network model, acquiring a first characteristic value of the pictures of the learning items to be added, training the pictures to be trained according to the neural network model, and acquiring a second characteristic value of the pictures to be trained;
and the updating module is used for updating the picture of the learning item to be added according to the first characteristic value and the second characteristic value and synchronously updating the learning item.
In a further aspect, the invention provides a terminal device, the terminal device including: a memory, a processor, and a neural network-based student hands-on ability training program stored on the memory and executable on the processor, the program being configured to implement the steps of the neural network-based student hands-on ability training method described above.
In a further aspect, the invention provides a storage medium, the storage medium being a computer storage medium that stores a neural network-based student hands-on ability training program; when the program is executed by a processor, the steps of the neural network-based student hands-on ability training method described above are implemented.
Compared with the prior art, the neural network-based student hands-on ability training method has the following beneficial effects:
(1) the first and second characteristic values can be calculated accurately by the neural network model, and the learning items to be added are then updated by means of these characteristic values, so that students' hands-on ability can be cultivated more effectively;
(2) by acquiring the student's hands-on ability data, that is, by checking the progress of the student's learning items through the student's learning-item pictures and recommending unlearned learning items to the student in a targeted manner, the student's hands-on ability is further improved.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a terminal device in a hardware operating environment according to an embodiment of the present invention;
FIG. 2 is a schematic flow chart of a first embodiment of the neural network-based student manual ability training method according to the present invention;
FIG. 3 is a functional block diagram of a first embodiment of the neural network-based student manual ability training method according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention.
As shown in fig. 1, the terminal device may include: a processor 1001, such as a Central Processing Unit (CPU), a communication bus 1002, a user interface 1003, a network interface 1004, and a memory 1005. The communication bus 1002 is used to enable connection and communication between these components. The user interface 1003 may include a display screen (Display) and an input unit such as a keyboard (Keyboard); optionally, the user interface 1003 may also include a standard wired interface and a wireless interface. The network interface 1004 may optionally include a standard wired interface and a wireless interface (e.g., a Wireless Fidelity (Wi-Fi) interface). The memory 1005 may be a Random Access Memory (RAM) or a Non-Volatile Memory (NVM), such as a disk memory. The memory 1005 may alternatively be a storage device separate from the processor 1001.
Those skilled in the art will appreciate that the configuration shown in fig. 1 does not constitute a limitation of the terminal device, and that in actual implementations the terminal device may include more or fewer components than those shown, may combine some components, or may arrange the components differently.
As shown in fig. 1, a memory 1005 as a storage medium may include therein an operating system, a network communication module, a user interface module, and a neural network-based student-manipulative ability training method program.
In the terminal device shown in fig. 1, the network interface 1004 is mainly used to establish a communication connection between the terminal device and a server storing all of the data required by the neural network-based student hands-on ability training system; the user interface 1003 is mainly used for data interaction with the user; the processor 1001 and the memory 1005 are arranged in the terminal device, and the terminal device calls the neural network-based student hands-on ability training program stored in the memory 1005 through the processor 1001 and executes the neural network-based student hands-on ability training method provided by the invention.
With reference to fig. 2, fig. 2 is a schematic flow chart of a first embodiment of the neural network-based student manual ability training method according to the present invention.
In this embodiment, the method for training the practical ability of students based on the neural network includes the following steps:
s10: the method comprises the steps of obtaining the picture category of a learning item to be added, obtaining a corresponding picture from a network according to the picture category of the learning item to be added to serve as a picture to be selected, preprocessing the picture to be selected, and obtaining the processed picture to be selected to serve as a picture to be trained.
It should be understood that, in this embodiment, the categories of the learning-item pictures to be added include calligraphy, painting, physics, biology, chemistry and other learning items that can cultivate students' hands-on ability. The system establishes picture sets of different categories according to the categories of the added learning-item pictures and stores the corresponding pictures, for example calligraphy-work pictures in the calligraphy category and painting pictures in the painting category; the system then obtains corresponding pictures from the network according to the learning-item category as the pictures to be selected.
It should be understood that after the system acquires the pictures to be selected, it preprocesses them: the acquired pictures are screened against a local picture-screening parameter range preset by an administrator, so that part of the pictures are filtered out, and the remaining pictures are converted to a uniform format and size to serve as the pictures to be trained.
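The preprocessing described above can be sketched in code. The following is a minimal illustrative example and is not part of the patent disclosure; the screening bounds, the target size and the use of the requests and Pillow libraries are assumptions made only for illustration:

```python
from io import BytesIO

import requests          # fetching candidate pictures from the network
from PIL import Image    # converting to a uniform format and size

# Hypothetical local picture-screening parameter range preset by an administrator:
# keep only pictures whose width and height fall inside these bounds.
MIN_SIDE, MAX_SIDE = 100, 4000
TARGET_SIZE = (224, 224)  # assumed uniform training size

def preprocess_candidates(urls):
    """Download the pictures to be selected, screen out those outside the
    parameter range, and convert the rest to a uniform RGB format and size."""
    pictures_to_train = []
    for url in urls:
        img = Image.open(BytesIO(requests.get(url, timeout=10).content))
        w, h = img.size
        if not (MIN_SIDE <= w <= MAX_SIDE and MIN_SIDE <= h <= MAX_SIDE):
            continue  # screened out by the local parameter range
        pictures_to_train.append(img.convert("RGB").resize(TARGET_SIZE))
    return pictures_to_train
```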
S20: establishing a neural network model, training the picture of the learning item to be added according to the neural network model, acquiring a first characteristic value of the picture of the learning item to be added, training the picture to be trained according to the neural network model, and acquiring a second characteristic value of the picture to be trained.
It should be understood that, in this embodiment, the system also establishes the neural network model and a tuple conversion rule, the tuple being of a floating-point type, and the neural network model comprising a convolutional layer, an activation function, a pooling layer and a fully connected layer. The system then converts the pictures of the learning items to be added and the pictures to be trained into tuple form through the tuple conversion rule and feeds the converted pictures into the neural network model for training.
It should be understood that, in this embodiment, the neural network model performs convolution calculation on the pictures of the learning items to be added and the pictures to be trained through the convolutional layer and extracts the characteristic parameters; the activation function then increases the non-linear expressive capability of the network; the pooling layer then compresses the activated characteristic parameters, which has the following advantages: it reduces the feature map and simplifies the computational complexity of the network, it compresses the features and extracts the main features, and it can prevent over-fitting to a certain extent. Finally, the fully connected layer connects the compressed activated characteristic parameters, thereby obtaining the first characteristic values of the pictures of the learning items to be added and the second characteristic values of the pictures to be trained.
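As a rough sketch of such a model, and only as an assumption for illustration rather than the patent's actual network, the four components and the floating-point "tuple" conversion could look like this in PyTorch (all layer sizes are hypothetical):

```python
import numpy as np
import torch
import torch.nn as nn

def to_float_tensor(img):
    """Assumed reading of the tuple conversion rule: a picture becomes a
    floating-point tensor of shape (1, C, H, W) with values in [0, 1]."""
    arr = np.asarray(img, dtype="float32") / 255.0           # H, W, C
    return torch.from_numpy(arr).permute(2, 0, 1).unsqueeze(0)

class FeatureNet(nn.Module):
    """Illustrative network with the four parts named in the description:
    convolutional layer, activation function, pooling layer, fully connected layer."""
    def __init__(self, feature_dim=128):
        super().__init__()
        self.conv = nn.Conv2d(3, 16, kernel_size=3, stride=1, padding=1)
        self.act = nn.ReLU()         # increases non-linear expressive capability
        self.pool = nn.MaxPool2d(2)  # compresses the activated feature parameters
        self.fc = nn.Linear(16 * 112 * 112, feature_dim)  # connects them

    def forward(self, x):            # x: (N, 3, 224, 224) float tensor
        x = self.pool(self.act(self.conv(x)))
        return self.fc(torch.flatten(x, 1))   # the characteristic-value vector
```

A characteristic value would then be obtained with, for example, `FeatureNet()(to_float_tensor(img))`.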
It should be understood that the convolutional layer output is computed as:

w' = (w - k + 2p) / s + 1

where w' represents the characteristic value (the output size), w represents the size of the input matrix, namely the picture to be trained in tuple format, p represents the number of zero-padding layers, k represents the size of the convolution kernel, and s represents the stride.
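For example, with the assumed 224-pixel input, a 3x3 kernel, one layer of zero padding and a stride of 1, the formula gives an output size of 224. A small helper (illustrative only, not from the patent) makes the arithmetic explicit:

```python
def conv_output_size(w, k, p, s):
    """Output size of a convolutional layer: w' = (w - k + 2p) / s + 1."""
    return (w - k + 2 * p) // s + 1

print(conv_output_size(224, 3, 1, 1))  # -> 224
print(conv_output_size(224, 3, 0, 2))  # -> 111 (no padding, stride 2)
```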
S30: and updating the pictures of the learning items to be added according to the first characteristic value and the second characteristic value, and synchronously updating the learning items.
It should be understood that, in this embodiment, after the neural network has finished computing, the system compares the first characteristic values with the second characteristic values, screens out the second characteristic values that differ from the first characteristic values, adds the pictures to be trained corresponding to those second characteristic values into the learning items to be added, and updates the learning items according to the categories of those pictures, so that the update of the learning items is completed accurately and students' hands-on ability is improved.
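A minimal sketch of this comparison step follows, assuming the characteristic values are vectors compared by cosine similarity; the patent only says "different", so the similarity measure and the threshold are hypothetical choices:

```python
import torch
import torch.nn.functional as F

def screen_new_pictures(first_feats, second_feats, candidate_pictures, threshold=0.9):
    """Keep the candidate pictures whose second characteristic value differs
    from every existing first characteristic value.

    first_feats:  (M, D) tensor of characteristic values of existing item pictures
    second_feats: (N, D) tensor of characteristic values of pictures to be trained
    """
    new_pictures = []
    for feat, pic in zip(second_feats, candidate_pictures):
        sims = F.cosine_similarity(feat.unsqueeze(0), first_feats, dim=1)
        if sims.max() < threshold:   # unlike any existing picture, so add it
            new_pictures.append(pic)
    return new_pictures
```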
It should be appreciated that after the update of the learning items is completed, the system also obtains the student's hands-on ability data, which include the learning items the student has learned and the corresponding added learning-item pictures; the hands-on ability data are compared with the learning items, and the corresponding learning items are pushed to the student according to the comparison result.
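Read in the simplest way, comparing the hands-on ability data with the learning items and pushing the corresponding items amounts to recommending the items the student has not yet covered. A hypothetical sketch of that reading (the data structures are assumptions, not the patent's):

```python
def recommend_items(learned_items, all_learning_items):
    """Return the learning items the student has not yet learned."""
    return sorted(set(all_learning_items) - set(learned_items))

# Example:
# recommend_items({"calligraphy", "painting"},
#                 {"calligraphy", "painting", "physics", "chemistry"})
# -> ['chemistry', 'physics']
```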
The above description is only for illustrative purposes and does not limit the technical solutions of the present application in any way.
As can easily be seen from the above description, in this embodiment the picture category of the learning item to be added is obtained; corresponding pictures are obtained from the network according to that category as pictures to be selected, preprocessed, and taken as pictures to be trained; a neural network model is established, the pictures of the learning items to be added are trained with the model to obtain their first characteristic values, and the pictures to be trained are trained with the model to obtain their second characteristic values; and the pictures of the learning items to be added are updated according to the first and second characteristic values while the learning items are updated synchronously. By training both sets of pictures with the neural network model and using the first and second characteristic values to update the learning items to be added automatically, students' hands-on ability can be cultivated more effectively.
In addition, the embodiment of the invention also provides a neural network-based student hands-on ability training device. As shown in fig. 3, the device includes: a processing module 10, a calculation module 20 and an update module 30.
The processing module 10 is configured to acquire a picture category of a learning item to be added, acquire a corresponding picture from a network according to the picture category of the learning item to be added, use the picture as a picture to be selected, pre-process the picture to be selected, and acquire a processed picture to be selected as a picture to be trained;
the calculation module 20 is configured to establish a neural network model, train the picture of the learning item to be added according to the neural network model, acquire a first feature value of the picture of the learning item to be added, train the picture to be trained according to the neural network model, and acquire a second feature value of the picture to be trained;
and the updating module 30 is configured to update the picture of the learning item to be added according to the first characteristic value and the second characteristic value, and update the learning item synchronously.
In addition, it should be noted that the above-described embodiments of the apparatus are merely illustrative, and do not limit the scope of the present invention, and in practical applications, a person skilled in the art may select some or all of the modules to implement the purpose of the embodiments according to actual needs, and the present invention is not limited herein.
In addition, for technical details not described in detail in this embodiment, reference may be made to the neural network-based student hands-on ability training method provided in any embodiment of the present invention; they are not repeated here.
In addition, an embodiment of the present invention further provides a storage medium, where the storage medium is a computer storage medium, and the computer storage medium stores a program of a neural network-based student manual ability training method, where the program of the neural network-based student manual ability training method is executed by a processor to implement the following operations:
s1, acquiring the picture category of the learning item to be added, acquiring a corresponding picture from a network according to the picture category of the learning item to be added as a picture to be selected, preprocessing the picture to be selected, and acquiring the processed picture to be selected as a picture to be trained;
s2, establishing a neural network model, training the picture of the learning item to be added according to the neural network model to obtain a first characteristic value of the picture of the learning item to be added, and training the picture to be trained according to the neural network model to obtain a second characteristic value of the picture to be trained;
and S3, updating the picture of the learning item to be added according to the first characteristic value and the second characteristic value, and synchronously updating the learning item.
Further, when executed by the processor, the neural network-based student manual ability training method further realizes the following operations:
the categories of the learning item pictures to be added comprise: the method comprises the steps of calligraphy, drawing, physics, biology and chemistry, establishing different types of picture sets according to the types of pictures added with learning items, storing corresponding pictures, obtaining the corresponding pictures from a network according to the types of the learning items to be used as pictures to be selected, preprocessing the pictures to be selected, and obtaining the processed pictures to be selected to be used as pictures to be trained.
Further, when executed by the processor, the neural network-based student manual ability training method further realizes the following operations:
and acquiring a corresponding picture and a corresponding picture parameter from a network according to the category of the learning item, acquiring a local picture screening parameter range, screening the corresponding picture parameter according to the local picture screening parameter range, taking the screened picture as a picture to be selected, and unifying the size of the picture to be selected as a picture to be trained.
Further, when executed by the processor, the neural network-based student manual ability training method further realizes the following operations:
establishing the neural network model and a tuple conversion rule, the tuple being of a floating-point type, and the neural network model comprising a convolutional layer, an activation function, a pooling layer and a fully connected layer; converting the pictures of the learning items to be added and the pictures to be trained into tuple format according to the tuple conversion rule; training the pictures of the learning items to be added with the neural network model to obtain their first characteristic values; and training the pictures to be trained with the neural network model to obtain their second characteristic values.
Further, when executed by the processor, the neural network-based student manual ability training method further realizes the following operations:
convolution calculation is carried out on the pictures of the learning items to be added and the pictures to be trained through the convolution layer, characteristic parameters are extracted, the characteristic parameters are activated through an activation function, the activated characteristic parameters are compressed through the pooling layer, the compressed activated characteristic parameters are connected through the full connection layer, and a first characteristic value of the pictures of the learning items to be added and a second characteristic value of the pictures to be trained are obtained.
Further, when executed by the processor, the neural network-based student manual ability training method further realizes the following operations:
comparing the first characteristic values with the second characteristic values, screening out the second characteristic values that differ from the first characteristic values, adding the pictures to be trained corresponding to those second characteristic values into the learning items to be added, and updating the learning items according to the categories of those pictures to be trained.
Further, when executed by the processor, the neural network-based student manual ability training method further realizes the following operations:
acquiring the student's hands-on ability data, the hands-on ability data including the learning items the student has learned and the corresponding added learning-item pictures; comparing the hands-on ability data with the learning items; and pushing the corresponding learning items to the student according to the comparison result.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A neural network-based student hands-on ability training method, characterized by comprising the following steps:
s1, acquiring the picture category of the learning item to be added, acquiring a corresponding picture from a network according to the picture category of the learning item to be added as a picture to be selected, preprocessing the picture to be selected, and acquiring the processed picture to be selected as a picture to be trained;
s2, establishing a neural network model, training the picture of the learning item to be added according to the neural network model to obtain a first characteristic value of the picture of the learning item to be added, and training the picture to be trained according to the neural network model to obtain a second characteristic value of the picture to be trained;
and S3, updating the picture of the learning item to be added according to the first characteristic value and the second characteristic value, and synchronously updating the learning item.
2. The neural network-based student hands-on ability training method according to claim 1, wherein step S1 further includes the following steps: the categories of the learning-item pictures to be added include calligraphy, painting, physics, biology and chemistry; picture sets of different categories are established according to the categories of the added learning-item pictures and the corresponding pictures are stored; corresponding pictures are obtained from the network according to the learning-item category as pictures to be selected; the pictures to be selected are preprocessed; and the processed pictures to be selected are taken as the pictures to be trained.
3. The neural network-based student hands-on ability training method according to claim 2, wherein corresponding pictures and the corresponding picture parameters are obtained from the network according to the learning-item category; a local picture-screening parameter range is obtained and used to screen the picture parameters; the pictures that pass the screening are taken as the pictures to be selected; and the pictures to be selected are unified in size and taken as the pictures to be trained.
4. The neural network-based student hands-on ability training method according to claim 3, wherein step S2 further comprises the following steps: establishing the neural network model and a tuple conversion rule, the tuple being of a floating-point type, and the neural network model comprising a convolutional layer, an activation function, a pooling layer and a fully connected layer; converting the pictures of the learning items to be added and the pictures to be trained into tuple format according to the tuple conversion rule; training the pictures of the learning items to be added with the neural network model to obtain their first characteristic values; and training the pictures to be trained with the neural network model to obtain their second characteristic values.
5. The neural network-based student hands-on ability training method according to claim 4, wherein convolution calculation is performed on the pictures of the learning items to be added and the pictures to be trained through the convolutional layer and characteristic parameters are extracted; the characteristic parameters are activated through the activation function; the activated characteristic parameters are compressed through the pooling layer; and the compressed activated characteristic parameters are connected through the fully connected layer to obtain the first characteristic values of the pictures of the learning items to be added and the second characteristic values of the pictures to be trained.
6. The neural network-based student hands-on ability training method according to claim 5, wherein step S3 further comprises the following steps: comparing the first characteristic values with the second characteristic values, screening out the second characteristic values that differ from the first characteristic values, adding the pictures to be trained corresponding to those second characteristic values into the learning items to be added, and updating the learning items according to the categories of those pictures to be trained.
7. The neural network-based student hands-on ability training method according to claim 6, further comprising the following steps: acquiring the student's hands-on ability data, the hands-on ability data including the learning items the student has learned and the corresponding added learning-item pictures; comparing the hands-on ability data with the learning items; and pushing the corresponding learning items to the student according to the comparison result.
8. A neural network-based student hands-on ability training device, characterized by comprising:
the processing module is used for acquiring the picture category of the learning item to be added, acquiring a corresponding picture from a network according to the picture category of the learning item to be added to serve as a picture to be selected, preprocessing the picture to be selected, and acquiring the processed picture to be selected as a picture to be trained;
the calculation module is used for establishing a neural network model, training the pictures of the learning items to be added according to the neural network model, acquiring a first characteristic value of the pictures of the learning items to be added, training the pictures to be trained according to the neural network model, and acquiring a second characteristic value of the pictures to be trained;
and the updating module is used for updating the picture of the learning item to be added according to the first characteristic value and the second characteristic value and synchronously updating the learning item.
9. A terminal device, characterized in that the terminal device comprises: a memory, a processor, and a neural network based student hands-on ability training method program stored on the memory and executable on the processor, the neural network based student hands-on ability training method program configured to implement the steps of the neural network based student hands-on ability training method of any one of claims 1 to 7.
10. A storage medium, characterized in that the storage medium is a computer storage medium, the computer storage medium is stored with a program of a neural network based student manual ability training method, and the program of the neural network based student manual ability training method is executed by a processor to realize the steps of the neural network based student manual ability training method according to any one of claims 1 to 7.
CN202010901002.3A 2020-08-31 2020-08-31 Student manual ability culture method and device based on neural network Pending CN112102126A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010901002.3A CN112102126A (en) 2020-08-31 2020-08-31 Student manual ability culture method and device based on neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010901002.3A CN112102126A (en) 2020-08-31 2020-08-31 Student manual ability culture method and device based on neural network

Publications (1)

Publication Number Publication Date
CN112102126A true CN112102126A (en) 2020-12-18

Family

ID=73756959

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010901002.3A Pending CN112102126A (en) 2020-08-31 2020-08-31 Student manual ability culture method and device based on neural network

Country Status (1)

Country Link
CN (1) CN112102126A (en)

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180204111A1 (en) * 2013-02-28 2018-07-19 Z Advanced Computing, Inc. System and Method for Extremely Efficient Image and Pattern Recognition and Artificial Intelligence Platform
CN108932508A (en) * 2018-08-13 2018-12-04 杭州大拿科技股份有限公司 A kind of topic intelligent recognition, the method and system corrected
CN109685137A (en) * 2018-12-24 2019-04-26 上海仁静信息技术有限公司 A kind of topic classification method, device, electronic equipment and storage medium
CN110990614A (en) * 2019-11-08 2020-04-10 武汉东湖大数据交易中心股份有限公司 Image self-learning method, device, equipment and medium based on engine big data
CN111191119A (en) * 2019-12-16 2020-05-22 绍兴市上虞区理工高等研究院 Neural network-based scientific and technological achievement self-learning method and device
CN111553423A (en) * 2020-04-29 2020-08-18 河北地质大学 Handwriting recognition method based on deep convolutional neural network image processing technology

Similar Documents

Publication Publication Date Title
CN110033659B (en) Remote teaching interaction method, server, terminal and system
CN110291538B (en) Method and system for managing image recognition
CN111666416B (en) Method and device for generating semantic matching model
CN109637221B (en) Electronic courseware for online learning and generation method
US20200051451A1 (en) Short answer grade prediction
CN112417158A (en) Training method, classification method, device and equipment of text data classification model
US10541884B2 (en) Simulating a user score from input objectives
CN113761153A (en) Question and answer processing method and device based on picture, readable medium and electronic equipment
CN113253836A (en) Teaching method and system based on artificial intelligence and virtual reality
CN112101231A (en) Learning behavior monitoring method, terminal, small program and server
CN114969460A (en) Resource recommendation method, device and equipment based on knowledge graph and storage medium
CN115081965B (en) Big data analysis system of condition of learning and condition of learning server
CN112102126A (en) Student manual ability culture method and device based on neural network
Tan et al. Development of a mobile spreadsheet-based PID control simulation system
CN113822521A (en) Method and device for detecting quality of question library questions and storage medium
KR20220123168A (en) How to automatically classify the unit and difficulty of math problems
CN112328894A (en) Behavior guiding method and device, computer equipment and storage medium
CN114817488A (en) Information processing method and device in live broadcast, electronic equipment and storage medium
CN117557240B (en) Method, system, device and storage medium for reading jobs
Dai et al. Design of MOOC Response System Based on Intelligent Algorithms
Yuanfei A Personalized Recommendation System for English Teaching Resources Based on Learning Behavior Detection
CN117891902A (en) Test question generation method, device, equipment and readable storage medium
Lu Research on Computer Artificial Intelligence-assisted College Public English Teaching System
CN117519483A (en) Media dynamic interaction method, system and medium based on digital virtual
CN117370543A (en) Object classification method and related equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20201218)