CN109934156A - User experience evaluation method and system based on an ELMAN neural network - Google Patents

User experience evaluation method and system based on an ELMAN neural network

Info

Publication number
CN109934156A
CN109934156A
Authority
CN
China
Prior art keywords
user
matrix
user experience
expression vector
neural network
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910178660.1A
Other languages
Chinese (zh)
Inventor
李太福
廖志强
尹蝶
段棠少
张志亮
黄星耀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing University of Science and Technology
Original Assignee
Chongqing University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing University of Science and Technology filed Critical Chongqing University of Science and Technology
Priority to CN201910178660.1A priority Critical patent/CN109934156A/en
Publication of CN109934156A publication Critical patent/CN109934156A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention provides a user experience evaluation method and system based on an ELMAN neural network. A mobile phone App is developed to capture video of the user operating a newly developed APP (recorded live or read from a video file by the phone App) and upload it to the cloud. In the cloud, the video is decomposed into a continuous series of photographs, and face recognition identifies the facial expression type in each photograph, yielding a code vector of the expression over time. On the cloud platform, an ELMAN neural network models the complex nonlinear relationship between the expression data and the user's score for the corresponding experience session. Thereafter, simply uploading a video automatically produces a user experience evaluation for that session, which an enterprise can use as a basis for upgrading and optimizing its APP products.

Description

User experience evaluation method and system based on an ELMAN neural network
Technical field
The present invention relates to the field of big data, and in particular to a user experience evaluation method and system based on an ELMAN neural network.
Background technique
Nowadays, new APP software products emerge one after another, and user experience has gradually become a key factor in whether an APP product succeeds. Big data has become an important reference tool for improving user experience: through effective data mining and analysis, an enterprise can improve the user experience of existing products and use the results to develop new products and services. Targeted user experience measures can give users a psychologically good experience, but the outcome of a user experience is difficult to express in an intuitive, truthful way. Facial expression, however, is the most intuitive and truthful way humans express their emotional states, and is a highly important means of nonverbal communication.
In the prior art, APP development relies on traditional user surveys, which cannot quickly and accurately obtain user experience data for a newly developed APP, so development efficiency is low.
Summary of the invention
In order to solve the problem that, in complex product development, developers cannot quickly obtain user experience data for a newly developed APP, this application provides a user experience evaluation method based on an ELMAN neural network, comprising the following steps:
S1: capture a first-process video of the user using a test APP, decompose the first-process video into a first series of photographs, perform face recognition on the first series of photographs to obtain the user's facial expression vectors, and derive an input matrix from the facial expression vectors;
S2: collect user survey data through the test APP, derive a result matrix Y from the survey data, construct an ELMAN model, and train the ELMAN model with the input matrix and the result matrix;
S3: capture a second-process video of the user using the target APP, and analyze the second-process video with the trained ELMAN model to obtain user experience data.
Further, step S1 comprises:
S11: with the abscissa as time and the ordinate as the expression type code, generate the two-dimensional expression spectrum of the user's facial expression vectors over time, where the expression vector for "anger" is [0,0,0,0,0,0,1]^T, for "disgust" [0,0,0,0,0,2,0]^T, for "fear" [0,0,0,0,3,0,0]^T, for "happiness" [0,0,0,4,0,0,0]^T, for "sadness" [0,0,5,0,0,0,0]^T, for "surprise" [0,6,0,0,0,0,0]^T and for "no emotion" [7,0,0,0,0,0,0]^T; from the expression spectrum obtain the matrix A=[e1,e2,e3,…,en]7×n;
S12: transpose matrix A to obtain A^T=[e1,e2,e3,…,en]n×7;
S13: construct the matrix M=AA^T;
S14: compute the eigenvalues of M, giving the eigenvalue matrix λ=[λ1,λ2,λ3,…,λ7]1×7;
S15: generate the input matrix X=[λ,N,B]1×9, where N is the age and B the gender.
Further, in step S2, let Xk=[xk1,xk2,…,xkM] (k=1,2,…,S) be the input vectors, S the number of training samples, WMI(g) the weight vector between input layer M and hidden layer I at the g-th iteration, WJP(g) the weight vector between hidden layer J and output layer P at the g-th iteration, WJC(g) the weight vector between hidden layer J and context layer C at the g-th iteration, Yk(g)=[yk1(g),yk2(g),…,ykP(g)] (k=1,2,…,S) the actual network output at the g-th iteration, and dk=[dk1,dk2,…,dkP] (k=1,2,…,S) the desired output.
Further, step S2 comprises:
S21: initialization: set the iteration count g to 0 and assign WMI(0), WJP(0) and WJC(0) random values in the interval (0,1);
S22: input a sample Xk at random;
S23: for the input sample Xk, forward-compute the input and output signals of every layer of neurons in the network;
S24: compute the error E(g) from the desired output dk and the actual output Yk(g);
S25: judge whether the error E(g) meets the requirement; if not, go to step S26; if so, go to step S29;
S26: judge whether the iteration count g+1 exceeds the maximum number of iterations; if so, go to step S29; otherwise, go to step S27;
S27: for the input sample Xk, back-compute the local gradient δ of every layer of neurons;
S28: compute the weight correction ΔW and correct the weights; set g=g+1 and go to step S23;
S29: judge whether all training samples have been processed; if so, modeling is complete; otherwise, go to step S22.
Further, step S3 also comprises:
sending the user experience data to an administrator's mobile terminal for display.
In order to guarantee implementation of the above method, the present invention also provides a user experience evaluation system based on an ELMAN neural network, characterized by comprising the following modules:
an acquisition module, for capturing a first-process video of the user using a test APP, decomposing the first-process video into a first series of photographs, performing face recognition on the first series of photographs to obtain the user's facial expression vectors, and deriving an input matrix from the facial expression vectors;
a training module, for collecting user survey data through the test APP, deriving a result matrix Y from the survey data, constructing an ELMAN model, and training the ELMAN model with the input matrix and the result matrix;
a result output module, for capturing a second-process video of the user using the target APP and analyzing it with the trained ELMAN model to obtain user experience data.
Further, the acquisition module derives the input matrix with the following steps:
S11: with the abscissa as time and the ordinate as the expression type code, generate the two-dimensional expression spectrum of the user's facial expression vectors over time, where the expression vector for "anger" is [0,0,0,0,0,0,1]^T, for "disgust" [0,0,0,0,0,2,0]^T, for "fear" [0,0,0,0,3,0,0]^T, for "happiness" [0,0,0,4,0,0,0]^T, for "sadness" [0,0,5,0,0,0,0]^T, for "surprise" [0,6,0,0,0,0,0]^T and for "no emotion" [7,0,0,0,0,0,0]^T; from the expression spectrum obtain the matrix A=[e1,e2,e3,…,en]7×n;
S12: transpose matrix A to obtain A^T=[e1,e2,e3,…,en]n×7;
S13: construct the matrix M=AA^T;
S14: compute the eigenvalues of M, giving the eigenvalue matrix λ=[λ1,λ2,λ3,…,λ7]1×7;
S15: generate the input matrix X=[λ,N,B]1×9, where N is the age and B the gender.
Further, the training module sets Xk=[xk1,xk2,…,xkM] (k=1,2,…,S) as the input vectors, S as the number of training samples, WMI(g) as the weight vector between input layer M and hidden layer I at the g-th iteration, WJP(g) as the weight vector between hidden layer J and output layer P at the g-th iteration, WJC(g) as the weight vector between hidden layer J and context layer C at the g-th iteration, Yk(g)=[yk1(g),yk2(g),…,ykP(g)] (k=1,2,…,S) as the actual network output at the g-th iteration, and dk=[dk1,dk2,…,dkP] (k=1,2,…,S) as the desired output.
Further, the training module trains the ELMAN model with the following steps:
S21: initialization: set the iteration count g to 0 and assign WMI(0), WJP(0) and WJC(0) random values in the interval (0,1);
S22: input a sample Xk at random;
S23: for the input sample Xk, forward-compute the input and output signals of every layer of neurons in the network;
S24: compute the error E(g) from the desired output dk and the actual output Yk(g);
S25: judge whether the error E(g) meets the requirement; if not, go to step S26; if so, go to step S29;
S26: judge whether the iteration count g+1 exceeds the maximum number of iterations; if so, go to step S29; otherwise, go to step S27;
S27: for the input sample Xk, back-compute the local gradient δ of every layer of neurons;
S28: compute the weight correction ΔW and correct the weights; set g=g+1 and go to step S23;
S29: judge whether all training samples have been processed; if so, modeling is complete; otherwise, go to step S22.
Further, the result output module is also used to send the user experience data to an administrator's mobile terminal for display.
The invention has the following advantages:
1. Facial expressions follow the anatomy of nerves and muscles and share common traits across people; expression recognition captures data while the user is in an unconscious, free state, which guarantees the reliability and objectivity of the data.
2. The system is easily integrated into a data analysis system for analysis and visualization.
3. Other software can access the data collected by the facial expression analysis system in real time.
4. The system can analyze the facial expressions of all races, including the facial expressions of children.
5. The invention analyzes video of the user operating an APP with the trained neural network model to obtain user experience data quickly, so that developers can rapidly evaluate a newly developed APP, improving APP development efficiency.
Brief description of the drawings
Fig. 1 is a flowchart of the user experience evaluation method based on an ELMAN neural network of the present invention.
Fig. 2 is a structural schematic diagram of the user experience evaluation system based on an ELMAN neural network of the present invention.
Fig. 3 is a schematic diagram of the two-dimensional expression spectrum of one embodiment of the invention.
Fig. 4 is a schematic diagram of the ELMAN neural network of one embodiment of the invention.
Detailed description of the embodiments
In the following description, many specific details are set out for purposes of illustration, to provide a comprehensive understanding of one or more embodiments. It will be evident, however, that the embodiments can also be realized without these specific details.
To address the problem that, in complex development, developers cannot quickly obtain user experience data for a newly developed APP, the invention provides a user experience evaluation method and system based on an ELMAN neural network.
The present invention trains an ELMAN model from captured user video and user survey data, and uses the trained ELMAN model to recognize video of the user operating the newly developed APP and quickly obtain the user's experience data.
It should be noted that an ELMAN network can be regarded as a recurrent neural network with local memory units and local feedback connections.
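As an informal illustration only (not the patentee's implementation), the recurrence of such a network, a hidden layer fed by the current input and by a context layer holding the previous hidden state, can be sketched in Python with NumPy. The layer sizes, sigmoid activation and linear output are assumptions:

```python
import numpy as np

class ElmanSketch:
    """Minimal Elman recurrent network: the context layer stores the
    previous hidden state and feeds it back into the hidden layer."""

    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        # Weights drawn from (0, 1), as in step S21 of the description.
        self.W_ih = rng.random((n_hidden, n_in))      # input   -> hidden
        self.W_ch = rng.random((n_hidden, n_hidden))  # context -> hidden
        self.W_ho = rng.random((n_out, n_hidden))     # hidden  -> output
        self.context = np.zeros(n_hidden)             # local memory unit

    @staticmethod
    def _sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def step(self, x):
        # Hidden state depends on the current input AND the previous hidden state.
        h = self._sigmoid(self.W_ih @ x + self.W_ch @ self.context)
        self.context = h          # copy hidden state into the context layer
        return self.W_ho @ h      # linear output layer

net = ElmanSketch(n_in=9, n_hidden=6, n_out=1)
y1 = net.step(np.ones(9))
y2 = net.step(np.ones(9))  # same input, different output: the memory at work
```

Feeding the same input twice produces different outputs because the context layer changed between steps; that feedback is what distinguishes an Elman network from a plain feed-forward network.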
Hereinafter, specific embodiments of the present invention are described in detail with reference to the accompanying drawings.
To illustrate the user experience evaluation method based on an ELMAN neural network provided by the invention, Fig. 1 shows its flowchart.
As shown in Fig. 1, the user experience evaluation method based on an ELMAN neural network provided by the invention comprises the following steps.
S1: capture a first-process video of the user using a test APP, decompose the first-process video into a first series of photographs, perform face recognition on the first series of photographs to obtain the user's facial expression vectors, and derive an input matrix from the facial expression vectors;
S2: collect user survey data through the test APP, derive a result matrix Y from the survey data, construct an ELMAN model, and train the ELMAN model with the input matrix and the result matrix;
S3: capture a second-process video of the user using the target APP, and analyze the second-process video with the trained ELMAN model to obtain user experience data.
The first-process video and the first series of photographs are the training data for the neural network model; the second-process video is the data to be tested, and the trained neural network analyzes the second-process video to obtain the user experience data corresponding to it.
The test APP is installed on the user's mobile phone to obtain the first-process video of the user operating the test APP and the survey result data. The test APP records the first-process video through the phone's front camera and, after the test is completed, obtains the survey result data directly from the user's input.
The target APP is the newly developed APP to be evaluated. While the user operates the target APP, the test APP records the second-process video of the user through the phone's front camera.
The test content of the test APP matches the type of the target APP. For example, if the target APP is a game APP, the test APP obtains the first-process video by simulating a segment of gameplay; if the target APP is a music APP, the test APP plays a segment of music and obtains the first-process video while the user listens. Using test content of the same type as the target APP makes the neural network training more targeted and improves the accuracy of the user experience data obtained through the network.
The invention collects a user's first-process video and survey results to train the neural network; once the network is trained, inputting a second-process video of the same user yields that user's experience data. Compared with the traditional approach of training one neural network on data from many users and then testing it on many different users, the invention trains one neural network per user, each with its own specific parameters, and therefore achieves higher detection accuracy than the generic neural network products of the prior art.
In an implementation of the invention, step S1 comprises: obtain the video of the user operating the test APP (recorded live or read from a video file by the phone App) and upload it to the cloud; decompose the video into a continuous series of photographs in the cloud; use face recognition to identify the facial expression in each photograph and obtain the code vector of the expression over time (the 7 expression types anger, disgust, fear, happiness, sadness, surprise and no emotion correspond to the codes 1, 2, 3, 4, 5, 6, 7), together with age N (years) and gender B (male/female coded 1/0); then process this data matrix as follows to obtain the input matrix X.
Specifically, in an embodiment of the invention, step S1 comprises:
S11: draw the two-dimensional expression spectrum of the expression code vector over time, where the abscissa is time and the ordinate is the expression type code 1-7; the expression vector for "anger" is [0,0,0,0,0,0,1]^T, for "disgust" [0,0,0,0,0,2,0]^T, for "fear" [0,0,0,0,3,0,0]^T, for "happiness" [0,0,0,4,0,0,0]^T, for "sadness" [0,0,5,0,0,0,0]^T, for "surprise" [0,6,0,0,0,0,0]^T and for "no emotion" [7,0,0,0,0,0,0]^T; from the expression spectrum obtain the matrix A=[e1,e2,e3,…,en]7×n (each en is one of the seven expression vectors). For example, when n=10, E=[5,7,6,6,4,4,4,4,6,7]; the expression spectrum of the code matrix over time is drawn as shown in Fig. 3, and from it the expression spectrum matrix A is obtained;
S12: transpose matrix A to obtain A^T=[e1,e2,e3,…,en]n×7;
S13: construct the new matrix M=AA^T;
S14: compute the eigenvalues of M, giving the eigenvalue matrix λ=[λ1,λ2,λ3,…,λ7]1×7;
S15: the input parameter matrix consists of the matrix eigenvalues, age and gender: X=[λ,N,B]1×9.
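As a hedged sketch of steps S11-S15 (an illustrative reading of the description, not the patentee's code; the helper names are hypothetical), each frame's expression code c in 1-7 is placed at row 8-c of a 7-vector with value c, the vectors are stacked into the spectrum matrix A, and the eigenvalues of M = AA^T together with age and gender form the 9-element input:

```python
import numpy as np

def expression_vector(code):
    """Map an expression code 1-7 (anger .. no emotion) to the 7-vector of
    S11: value `code` at row 7-code (0-based), zeros elsewhere."""
    e = np.zeros(7)
    e[7 - code] = code
    return e

def input_matrix(codes, age, gender):
    """S11-S15: spectrum matrix A (7 x n), M = A @ A.T (7 x 7, symmetric),
    its 7 eigenvalues lambda, then X = [lambda, age, gender] (length 9)."""
    A = np.column_stack([expression_vector(c) for c in codes])  # S11: 7 x n
    M = A @ A.T                                                 # S12-S13
    lam = np.linalg.eigvalsh(M)   # S14: symmetric M, real eigenvalues
    return np.concatenate([lam, [age, gender]])                 # S15

# The n = 10 example from the description: E = [5,7,6,6,4,4,4,4,6,7]
X = input_matrix([5, 7, 6, 6, 4, 4, 4, 4, 6, 7], age=25, gender=1)
```

Because each frame vector has a single nonzero entry, M comes out diagonal and its eigenvalues are just code-weighted counts of each expression type, a compact summary of how often each expression occurred in the video.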
In an implementation of the invention, step S2 comprises surveying the user's real experience of the process shown in the video: the user selects a score of 1, 2, 3, 4 or 5 (corresponding to a very poor, poor, average, good or very good experience) as the experience test result, which serves as the output result y. Modeling is then performed with the ELMAN neural network on a large number of input matrices X and the corresponding output result matrices Y. Structurally the network consists of four layers: an input layer, a pattern layer, a summation layer and an output layer. The number of input layer neurons equals the dimension of the input vectors in the learning samples; each neuron is a simple distribution unit that passes the input variables directly to the pattern layer. The number of pattern layer neurons equals the number n of learning samples, each neuron corresponding to a different sample. The summation layer sums with two types of neurons. The number of neurons in the output layer equals the dimension k of the output vectors in the learning samples; each neuron divides the output of the summation layer, and the output of neuron j corresponds to the j-th element of the estimated result Y(X).
In an implementation of the invention, in step S2, let Xk=[xk1,xk2,…,xkM] (k=1,2,…,S) be the input vectors, S the number of training samples, WMI(g) the weight vector between input layer M and hidden layer I at the g-th iteration, WJP(g) the weight vector between hidden layer J and output layer P at the g-th iteration, WJC(g) the weight vector between hidden layer J and context layer C at the g-th iteration, Yk(g)=[yk1(g),yk2(g),…,ykP(g)] (k=1,2,…,S) the actual network output at the g-th iteration, and dk=[dk1,dk2,…,dkP] (k=1,2,…,S) the desired output.
In an implementation of the invention, step S2 comprises the following steps:
S21: initialization: set the iteration count g to 0 and assign WMI(0), WJP(0) and WJC(0) random values in the interval (0,1);
S22: input a sample Xk at random;
S23: for the input sample Xk, forward-compute the input and output signals of every layer of neurons in the network;
S24: compute the error E(g) from the desired output dk and the actual output Yk(g);
S25: judge whether the error E(g) meets the requirement; if not, go to step S26; if so, go to step S29;
S26: judge whether the iteration count g+1 exceeds the maximum number of iterations; if so, go to step S29; otherwise, go to step S27;
S27: for the input sample Xk, back-compute the local gradient δ of every layer of neurons;
S28: compute the weight correction ΔW and correct the weights; set g=g+1 and go to step S23;
S29: judge whether all training samples have been processed; if so, modeling is complete; otherwise, go to step S22.
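The control flow of S21-S29 can be sketched as follows. This is a simplified, assumed version: a single linear layer stands in for the full ELMAN forward and backward pass, and the squared-error loss, learning rate and function names are illustrative assumptions, not the patent's specification:

```python
import numpy as np

def train_s21_s29(samples, targets, lr=0.05, tol=1e-3, max_iter=5000, seed=0):
    """Control flow of S21-S29 around a deliberately simple model y = W @ x
    (a stand-in for the full Elman forward/backward computation)."""
    rng = np.random.default_rng(seed)
    n_out, n_in = targets.shape[1], samples.shape[1]
    W = rng.random((n_out, n_in))            # S21: weights random in (0, 1)
    g = 0                                    # S21: iteration counter
    done = np.zeros(len(samples), dtype=bool)
    while not done.all():
        k = rng.integers(len(samples))       # S22: pick a sample at random
        x, d = samples[k], targets[k]
        while True:
            y = W @ x                        # S23: forward pass
            err = d - y
            E = 0.5 * float(err @ err)       # S24: error E(g)
            if E <= tol:                     # S25: error small enough
                break                        #      -> S29 for this sample
            if g + 1 > max_iter:             # S26: iteration cap reached
                break                        #      -> S29
            grad = -np.outer(err, x)         # S27: gradient of E w.r.t. W
            W -= lr * grad                   # S28: correct the weights ...
            g += 1                           #      ... set g = g + 1, back to S23
        done[k] = True                       # S29: loop until all samples done
    return W, g

X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
Y = np.array([[2.0], [3.0], [5.0]])
W, iters = train_s21_s29(X, Y)
```

The two exits of the inner loop mirror S25 (error below tolerance) and S26 (iteration cap), while the outer loop realizes S29's "repeat until all training samples are handled".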
In an implementation of the invention, step S3 comprises deploying the trained ELMAN model to the cloud and developing it into software. For a newly developed APP, simply uploading a video automatically yields the user's experience evaluation for the session with the new APP, on which the company can base product upgrade and optimization.
It should be pointed out that the above description does not limit the present invention, and the present invention is not limited to the above examples. Variations, modifications, additions or replacements made by those skilled in the art within the essential scope of the present invention also fall within the protection scope of the present invention.

Claims (10)

1. A user experience evaluation method based on an ELMAN neural network, characterized by comprising the following steps:
S1: capturing a first-process video of a user using a test APP, decomposing the first-process video into a first series of photographs, performing face recognition on the first series of photographs to obtain the user's facial expression vectors, and deriving an input matrix from the facial expression vectors;
S2: collecting user survey data through the test APP, deriving a result matrix Y from the survey data, constructing an ELMAN model, and training the ELMAN model with the input matrix and the result matrix;
S3: capturing a second-process video of the user using a target APP, and analyzing the second-process video with the trained ELMAN model to obtain user experience data.
2. The user experience evaluation method based on an ELMAN neural network as claimed in claim 1, characterized in that step S1 comprises:
S11: with the abscissa as time and the ordinate as the expression type code, generating the two-dimensional expression spectrum of the user's facial expression vectors over time, where the expression vector for "anger" is [0,0,0,0,0,0,1]^T, for "disgust" [0,0,0,0,0,2,0]^T, for "fear" [0,0,0,0,3,0,0]^T, for "happiness" [0,0,0,4,0,0,0]^T, for "sadness" [0,0,5,0,0,0,0]^T, for "surprise" [0,6,0,0,0,0,0]^T and for "no emotion" [7,0,0,0,0,0,0]^T, and obtaining from the expression spectrum the matrix A=[e1,e2,e3,…,en]7×n;
S12: transposing matrix A to obtain A^T=[e1,e2,e3,…,en]n×7;
S13: constructing the matrix M=AA^T;
S14: computing the eigenvalues of M, giving the eigenvalue matrix λ=[λ1,λ2,λ3,…,λ7]1×7;
S15: generating the input matrix X=[λ,N,B]1×9, where N is the age and B the gender.
3. The user experience evaluation method based on an ELMAN neural network as claimed in claim 2, characterized in that in step S2, Xk=[xk1,xk2,…,xkM] (k=1,2,…,S) are the input vectors, S is the number of training samples, WMI(g) is the weight vector between input layer M and hidden layer I at the g-th iteration, WJP(g) the weight vector between hidden layer J and output layer P at the g-th iteration, WJC(g) the weight vector between hidden layer J and context layer C at the g-th iteration, Yk(g)=[yk1(g),yk2(g),…,ykP(g)] (k=1,2,…,S) the actual network output at the g-th iteration, and dk=[dk1,dk2,…,dkP] (k=1,2,…,S) the desired output.
4. The user experience evaluation method based on an ELMAN neural network as claimed in claim 3, characterized in that step S2 comprises the following steps:
S21: initialization: set the iteration count g to 0 and assign WMI(0), WJP(0) and WJC(0) random values in the interval (0,1);
S22: input a sample Xk at random;
S23: for the input sample Xk, forward-compute the input and output signals of every layer of neurons in the network;
S24: compute the error E(g) from the desired output dk and the actual output Yk(g);
S25: judge whether the error E(g) meets the requirement; if not, go to step S26; if so, go to step S29;
S26: judge whether the iteration count g+1 exceeds the maximum number of iterations; if so, go to step S29; otherwise, go to step S27;
S27: for the input sample Xk, back-compute the local gradient δ of every layer of neurons;
S28: compute the weight correction ΔW and correct the weights; set g=g+1 and go to step S23;
S29: judge whether all training samples have been processed; if so, modeling is complete; otherwise, go to step S22.
5. The user experience evaluation method based on an ELMAN neural network as claimed in claim 4, characterized in that step S3 further comprises:
sending the user experience data to an administrator's mobile terminal for display.
6. A user experience evaluation system based on an ELMAN neural network, characterized by comprising the following modules:
an acquisition module, for capturing a first-process video of a user using a test APP, decomposing the first-process video into a first series of photographs, performing face recognition on the first series of photographs to obtain the user's facial expression vectors, and deriving an input matrix from the facial expression vectors;
a training module, for collecting user survey data through the test APP, deriving a result matrix Y from the survey data, constructing an ELMAN model, and training the ELMAN model with the input matrix and the result matrix;
a result output module, for capturing a second-process video of the user using a target APP and analyzing it with the trained ELMAN model to obtain user experience data.
7. The user experience evaluation system based on an ELMAN neural network as claimed in claim 6, characterized in that the acquisition module derives the input matrix with the following steps:
S11: with the abscissa as time and the ordinate as the expression type code, generating the two-dimensional expression spectrum of the user's facial expression vectors over time, where the expression vector for "anger" is [0,0,0,0,0,0,1]^T, for "disgust" [0,0,0,0,0,2,0]^T, for "fear" [0,0,0,0,3,0,0]^T, for "happiness" [0,0,0,4,0,0,0]^T, for "sadness" [0,0,5,0,0,0,0]^T, for "surprise" [0,6,0,0,0,0,0]^T and for "no emotion" [7,0,0,0,0,0,0]^T, and obtaining from the expression spectrum the matrix A=[e1,e2,e3,…,en]7×n;
S12: transposing matrix A to obtain A^T=[e1,e2,e3,…,en]n×7;
S13: constructing the matrix M=AA^T;
S14: computing the eigenvalues of M, giving the eigenvalue matrix λ=[λ1,λ2,λ3,…,λ7]1×7;
S15: generating the input matrix X=[λ,N,B]1×9, where N is the age and B the gender.
8. The user experience evaluation system based on an ELMAN neural network as claimed in claim 7, characterized in that the training module sets Xk=[xk1,xk2,…,xkM] (k=1,2,…,S) as the input vectors, S as the number of training samples, WMI(g) as the weight vector between input layer M and hidden layer I at the g-th iteration, WJP(g) as the weight vector between hidden layer J and output layer P at the g-th iteration, WJC(g) as the weight vector between hidden layer J and context layer C at the g-th iteration, Yk(g)=[yk1(g),yk2(g),…,ykP(g)] (k=1,2,…,S) as the actual network output at the g-th iteration, and dk=[dk1,dk2,…,dkP] (k=1,2,…,S) as the desired output.
9. The user experience evaluation system based on an ELMAN neural network as claimed in claim 8, characterized in that the training module trains the ELMAN model using the following steps:
S21: initialization: set the iteration counter g to an initial value of 0, and assign WMI(0), WJP(0) and WJC(0) random values in the interval (0, 1);
S22: randomly select an input sample Xk;
S23: for the input sample Xk, forward-compute the input and output signals of each neuron layer of the neural network;
S24: compute the error E(g) from the desired output dk and the actual output Yk(g);
S25: judge whether the error E(g) meets the requirement; if not, go to step S26; if so, go to step S29;
S26: judge whether the iteration count g+1 exceeds the maximum number of iterations; if so, go to step S29; otherwise, go to step S27;
S27: for the input sample Xk, back-propagate to compute the local gradient δ of each neuron layer;
S28: compute the weight correction ΔW and update the weights; set g = g+1 and return to step S23;
S29: judge whether all training samples have been completed; if so, the modeling is complete; otherwise, return to step S22.
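As a rough, non-authoritative sketch of the training loop in steps S21 to S29, the following NumPy code implements a minimal Elman network with one hidden layer and a context layer. The layer sizes, learning rate, squared-error criterion, and the simplified (truncated) gradient for the context weights are illustrative assumptions, not the claimed procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
M_in, I_hid, P_out = 9, 6, 1            # assumed sizes (input = the 1 x 9 matrix X)

# S21: initialise the weight matrices with random values in (0, 1)
W_MI = rng.random((I_hid, M_in))        # input   -> hidden
W_JC = rng.random((I_hid, I_hid))       # context -> hidden
W_JP = rng.random((P_out, I_hid))       # hidden  -> output

X = rng.random((20, M_in))              # placeholder training samples Xk
d = rng.random((20, P_out))             # placeholder desired outputs dk
lr, g, g_max, tol = 0.05, 0, 500, 1e-3

for k in rng.permutation(len(X)):       # S22: take samples in random order
    c = np.zeros(I_hid)                 # context layer starts at zero
    while True:
        h = np.tanh(W_MI @ X[k] + W_JC @ c)   # S23: forward pass
        y = W_JP @ h
        e = d[k] - y
        E = 0.5 * float(e @ e)                # S24: error E(g)
        if E < tol or g + 1 > g_max:          # S25 / S26: stopping tests
            break
        delta_out = e                         # S27: local gradients (delta)
        delta_hid = (W_JP.T @ delta_out) * (1.0 - h**2)
        W_JP += lr * np.outer(delta_out, h)   # S28: weight corrections dW
        W_MI += lr * np.outer(delta_hid, X[k])
        W_JC += lr * np.outer(delta_hid, c)
        c = h                                 # context stores last hidden state
        g += 1                                # then return to S23
# S29: all samples processed -> modeling complete
```

The inner loop mirrors the claimed flow: the same sample is re-presented (S23) until its error is acceptable (S25) or the iteration budget is exhausted (S26), while the context layer carries the previous hidden state forward.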
10. The user experience evaluation system based on an ELMAN neural network as claimed in claim 9, characterized in that the result output module is further configured to send the user experience data to an administrator's mobile terminal for display.
CN201910178660.1A 2019-03-11 2019-03-11 A kind of user experience evaluation method and system based on ELMAN neural network Pending CN109934156A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910178660.1A CN109934156A (en) 2019-03-11 2019-03-11 A kind of user experience evaluation method and system based on ELMAN neural network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910178660.1A CN109934156A (en) 2019-03-11 2019-03-11 A kind of user experience evaluation method and system based on ELMAN neural network

Publications (1)

Publication Number Publication Date
CN109934156A true CN109934156A (en) 2019-06-25

Family

ID=66986858

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910178660.1A Pending CN109934156A (en) 2019-03-11 2019-03-11 A kind of user experience evaluation method and system based on ELMAN neural network

Country Status (1)

Country Link
CN (1) CN109934156A (en)


Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101777116A (en) * 2009-12-23 2010-07-14 中国科学院自动化研究所 Method for analyzing facial expressions on basis of motion tracking
CN104794444A (en) * 2015-04-16 2015-07-22 美国掌赢信息科技有限公司 Facial expression recognition method in instant video and electronic equipment
US20160307094A1 (en) * 2015-04-16 2016-10-20 Cylance Inc. Recurrent Neural Networks for Malware Analysis
CN106096557A (en) * 2016-06-15 2016-11-09 浙江大学 A kind of semi-supervised learning facial expression recognizing method based on fuzzy training sample
CN107067429A (en) * 2017-03-17 2017-08-18 徐迪 Video editing system and method that face three-dimensional reconstruction and face based on deep learning are replaced
CN107292289A (en) * 2017-07-17 2017-10-24 东北大学 Facial expression recognizing method based on video time sequence
CN107967455A (en) * 2017-11-24 2018-04-27 中南大学 A kind of transparent learning method of intelligent human-body multidimensional physical feature big data and system
CN107977708A (en) * 2017-11-24 2018-05-01 重庆科技学院 The student's DNA identity informations recommended towards individualized learning scheme define method
CN109117731A (en) * 2018-07-13 2019-01-01 华中师范大学 A kind of classroom instruction cognitive load measuring system
CN109145754A (en) * 2018-07-23 2019-01-04 上海电力学院 Merge the Emotion identification method of facial expression and limb action three-dimensional feature
CN109243562A (en) * 2018-09-03 2019-01-18 陈怡 A kind of image makings method for improving based on Elman artificial neural network and genetic algorithms
CN109447305A (en) * 2018-06-23 2019-03-08 四川大学 A kind of trend forecasting method based on quantum-weighted long short-term memory neural networks


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
WANG SONG et al.: "Identification of PMSM Based on EKF and Elman Neural Network", published online at HTTPS://IEEEXPLORE.IEEE.ORG/STAMP/STAMP.JSP?TP=&ARNUMBER=5262728 *
付晓峰 et al.: "Expression recognition in video sequences based on histogram mapping of multi-scale spatio-temporal local direction angle patterns", Journal of Computer-Aided Design & Computer Graphics *
杨柳 et al.: "An improved least-squares support vector machine algorithm for electricity consumption forecasting", Power System and Clean Energy *
王得胜: "Research and application of odor-based user experience test and evaluation technology", China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111126297A (en) * 2019-12-25 2020-05-08 淮南师范学院 Experience analysis method based on learner expression
CN111126297B (en) * 2019-12-25 2023-10-31 淮南师范学院 Experience analysis method based on learner expression

Similar Documents

Publication Publication Date Title
CN110000787A (en) A kind of control method of super redundant mechanical arm
CN107044976A (en) Heavy metal content in soil analyzing and predicting method based on LIBS Yu stack RBM depth learning technologies
CN106527757A (en) Input error correction method and apparatus
CN110245080A (en) Generate the method and device of scrnario testing use-case
CN110110754A (en) Classification method based on the local imbalance problem of extensive error of cost
CN109979541A (en) Medicament molecule pharmacokinetic property and toxicity prediction method based on capsule network
CN104536881A (en) Public testing error report priority sorting method based on natural language analysis
CN108596274A (en) Image classification method based on convolutional neural networks
CN109978870A (en) Method and apparatus for output information
CN105005693B (en) One kind is based on the specific tumour cell drug susceptibility appraisal procedure of inhereditary material
CN109919099A (en) A kind of user experience evaluation method and system based on Expression Recognition
CN108334943A (en) The semi-supervised soft-measuring modeling method of industrial process based on Active Learning neural network model
CN110222940A (en) A kind of crowdsourcing test platform tester's proposed algorithm
CN109919102A (en) A kind of self-closing disease based on Expression Recognition embraces body and tests evaluation method and system
Lepers et al. Inference with selection, varying population size, and evolving population structure: application of ABC to a forward–backward coalescent process with interactions
CN109919101A (en) A kind of user experience evaluation method and system based on cell phone client
CN109934156A (en) A kind of user experience evaluation method and system based on ELMAN neural network
CN109918791A (en) A kind of nuclear plant digital master control room operator human reliability analysis method
CN110726813B (en) Electronic nose prediction method based on double-layer integrated neural network
Henderson et al. A comparison of variety metrics in engineering design
Wei et al. (Retracted) Image analysis and pattern recognition method of three-dimensional process in physical education teaching based on big data
Hu et al. An algorithm J-SC of detecting communities in complex networks
CN115101149B (en) Method for predicting total energy of microstructure of material
CN109359190A (en) A kind of position analysis model construction method based on evaluation object camp
Cheng et al. Neural cognitive modeling based on the importance of knowledge point for student performance prediction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20190625