CN111126297A - Experience analysis method based on learner expression - Google Patents

Experience analysis method based on learner expression Download PDF

Info

Publication number
CN111126297A
CN111126297A CN201911360147.0A CN201911360147A CN111126297A CN 111126297 A CN111126297 A CN 111126297A CN 201911360147 A CN201911360147 A CN 201911360147A CN 111126297 A CN111126297 A CN 111126297A
Authority
CN
China
Prior art keywords
matrix
learner
expression
sample
experience
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911360147.0A
Other languages
Chinese (zh)
Other versions
CN111126297B (en
Inventor
王刚
谭嵩
孙方
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Beike Haiteng Technology Co.,Ltd.
Original Assignee
Huainan Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huainan Normal University filed Critical Huainan Normal University
Priority to CN201911360147.0A priority Critical patent/CN111126297B/en
Publication of CN111126297A publication Critical patent/CN111126297A/en
Application granted granted Critical
Publication of CN111126297B publication Critical patent/CN111126297B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/24Classification techniques
    • G06F18/241Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention relates to an experience analysis method based on learner expression, which comprises the steps of data acquisition and initialization, random generation of input weight vectors and input biases of hidden layer mapping functions, hidden layer output function generation, hidden layer output matrix generation, output weight matrix initialization, tag approximation matrix updating, output weight matrix updating, training stop judgment, online experience score prediction and the like. The invention has the advantages of high prediction precision, no need of a large number of learners to experience scoring, high operation speed and the like.

Description

Experience analysis method based on learner expression
Technical Field
The invention belongs to the field of data analysis, and particularly relates to an experience analysis method based on learner expressions.
Background
Currently, more and more learners abandon the traditional learner method and choose learners on intelligent terminals. In order to really know the experience of the learner on the current learner, the camera on the intelligent terminal can be used for capturing the face image of the learner so as to acquire the expression information of the learner. However, the expression of a learner is variable and complex in a learner process. When the learner is laughing, the experience is not good, and similarly, when the learner shows aversive expression, the experience is not good. After each learner is finished, the system may request that the learner evaluate the experience. Of course, not every learner can complete learning, nor is every learner willing to give an assessment. Therefore, it is necessary to establish an experience analysis method based on the expression of the learner, so as to predict the experience of each learning experience and further provide data support for the improvement of the system.
Disclosure of Invention
The invention provides an experience analysis method based on learner expression, which comprises the following steps:
step 1, data acquisition and initialization:
collecting facial videos of a learner in each learning, analyzing the expression of each frame, dividing the expression into 8 types including aversion, anger, fear, happiness, sadness, surprise, photophobia and no expression, and forming a feature vector
Figure BDA0002336971930000011
x(1),...,x(8)The proportion of aversion, anger, fear, happiness, sadness, surprise, photophobia and no expression in the whole video is respectively x(1),...,x(8)The sum is 1, and the auxiliary characteristics are used for expanding x according to actual conditions to obtain
Figure BDA0002336971930000012
Is NiA sample of dimensions; order sample collection
Figure BDA0002336971930000013
Label for scoring learner experience after each learning as sample
Figure BDA0002336971930000014
To pair
Figure BDA0002336971930000015
Marking to obtain corresponding category label
Figure BDA0002336971930000016
Wherein l is the number of labeled samples, n is the number of all samples, and u-n-l is the number of unlabeled samples;
Figure BDA0002336971930000017
a set of real numbers is represented as,
Figure BDA0002336971930000018
representing a positive real number set;
initialization: the following parameters were manually set: lambda [ alpha ]1,λ2Theta, sigma > 0, number of hidden layer nodes NhThe maximum iteration time E is more than 0, and the iteration time t is 0;
step 2, randomly generating an input weight vector of a hidden layer mapping function
Figure BDA0002336971930000019
Is offset from the input; b ∈ R, as follows:
randomly generating NhA, obtaining
Figure BDA00023369719300000110
Randomly generating NhB, obtaining
Figure BDA00023369719300000111
Step 3, generating a hidden layer output function:
Figure BDA0002336971930000021
wherein G (a, b, x) is an activation function, x represents a sample, and superscript T represents matrix transfer;
step 4, generating a hidden layer output matrix:
H=[h(x1),...,h(xn)]T
step 5, initializing an output weight matrix:
Figure BDA0002336971930000022
wherein ,W0An output weight matrix W with t equal to 0, pinv (H) represents a pseudo-inverse matrix of H,
Figure BDA0002336971930000023
a matrix composed of the first l rows of H;
and 6, updating the label approximation matrix as follows:
Figure BDA0002336971930000024
wherein ,Yt+1Tag approximation matrix for t +1 iterations, InIs a unit matrix of n dimensions, J ═ Il,Ol×u;Ou×l,Ou×u],IlIs a unit matrix in the dimension of l,
Figure BDA0002336971930000025
is v is1×v2Zero matrix of dimensions, v1,v2It is possible to take u or l,
Figure BDA0002336971930000026
Ou×1a zero matrix of u x 1 dimensions; l is graph Laplace matrix L ═ D-A, A is similarity matrix, its ith row and jth column element AijComprises the following steps:
Figure BDA0002336971930000027
wherein ,xiAnd xjFor the sample, i, j ∈ {1, …, n }, σ > 0 is the Gaussian kernel width, D is the degree matrix of A, D is the diagonal matrix, the ith diagonal element D of Dii=∑jAij
And 7: the output weight matrix is updated as follows:
Wt+1=(HTH+θUt)-1HTYt+1
wherein ,
Figure BDA0002336971930000028
wherein ,Wt+1Denotes W at the time t +1,
Figure BDA0002336971930000029
is Wt+1Line 1 to line N ofhThe number of the row vectors is,
Figure BDA00023369719300000210
is line 1 to line N of WhA row vector;
and 8: the iteration number t is increased by 1, if t is more than E, W is retainedt+1And jumping to the step 9, otherwise jumping to the step 6;
and step 9: for the new sample x, its experience score is predicted using h (x) W.
Wherein, the activation function G (a, b, x) involved in step 3 is:
Figure BDA00023369719300000211
or ,
Figure BDA00023369719300000212
or ,
Figure BDA0002336971930000031
wherein ,l>Nh
The invention has the advantages of high prediction precision, stable performance, no need of a large amount of learner experience scoring, high operation speed and the like.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
Detailed Description
The invention is further described below with reference to examples, but the scope of the invention is not limited thereto.
As shown in fig. 1, the present invention is specifically implemented as follows:
step 1, data acquisition and initialization:
collecting facial videos of a learner in each learning process, analyzing the expression of each frame, and classifying the expression into aversion, anger, fear and height8 categories of Xingxing, sadness, surprise, shy and blankness constitute the feature vector
Figure BDA0002336971930000032
x(1),...,x(8)The proportion of aversion, anger, fear, happiness, sadness, surprise, photophobia and no expression in the whole video is respectively x(1),...,x(8)The sum is 1, and the auxiliary characteristics are used for expanding x according to actual conditions to obtain
Figure BDA0002336971930000033
Is NiA sample of dimensions; order sample collection
Figure BDA0002336971930000034
Label for scoring learner experience after each learning as sample
Figure BDA0002336971930000035
To pair
Figure BDA0002336971930000036
Marking to obtain corresponding category label
Figure BDA0002336971930000037
Wherein l is the number of labeled samples, n is the number of all samples, and u-n-l is the number of unlabeled samples;
Figure BDA0002336971930000038
a set of real numbers is represented as,
Figure BDA0002336971930000039
representing a positive real number set;
initialization: the following parameters were manually set: lambda [ alpha ]1,λ2Theta, sigma > 0, number of hidden layer nodes NhThe maximum iteration time E is more than 0, and the iteration time t is 0;
step 2, randomly generating an input weight vector of a hidden layer mapping function
Figure BDA00023369719300000310
Is offset from the input; b ∈ R, as follows:
randomly generating NhA, obtaining
Figure BDA00023369719300000311
Randomly generating NhB, obtaining
Figure BDA00023369719300000312
Step 3, generating a hidden layer output function:
Figure BDA00023369719300000313
wherein G (a, b, x) is an activation function, x represents a sample, and superscript T represents matrix transfer;
step 4, generating a hidden layer output matrix:
H=[h(x1),...,h(xn)]T
step 5, initializing an output weight matrix:
Figure BDA00023369719300000314
wherein ,W0An output weight matrix W with t equal to 0, pinv (H) represents a pseudo-inverse matrix of H,
Figure BDA0002336971930000041
Hla matrix composed of the first l rows of H;
and 6, updating the label approximation matrix as follows:
Figure BDA0002336971930000042
wherein ,Yt+1Tag approximation matrix for t +1 iterations, InIs a unit matrix of n dimensions, J ═ Il,Ol×u;Ou×l,Ou×u],IlIs a unit matrix in the dimension of l,
Figure BDA0002336971930000043
is v is1×v2Zero matrix of dimensions, v1,v2It is possible to take u or l,
Figure BDA0002336971930000044
Ou×1a zero matrix of u x 1 dimensions; l is graph Laplace matrix L ═ D-A, A is similarity matrix, its ith row and jth column element AijComprises the following steps:
Figure BDA0002336971930000045
wherein ,xiAnd xjFor the sample, i, j ∈ { 1.,. n }, σ > 0 is the Gaussian kernel width, D is the degree matrix of A, D is the diagonal matrix, the ith diagonal element D of D isii=∑jAij
And 7: the output weight matrix is updated as follows:
Wt+1=(HTH+θUt)-1HTYt+1
wherein ,
Figure BDA0002336971930000046
wherein ,Wt+1Denotes W at the time t +1,
Figure BDA0002336971930000047
is Wt+1Line 1 to line N ofhThe number of the row vectors is,
Figure BDA0002336971930000048
is line 1 to line N of WhA row vector;
and 8: the iteration number t is increased by 1, if t is more than E, W is retainedt+1And jumping to the step 9, otherwise jumping to the step 6;
and step 9: for the new sample x, its experience score is predicted using h (x) W.
Preferably, the activation function G (a, b, x) involved in step 3 is:
Figure BDA0002336971930000049
preferably, the activation function G (a, b, x) involved in step 3 is:
Figure BDA00023369719300000410
preferably, the activation function G (a, b, x) involved in step 3 is:
Figure BDA00023369719300000411
further preferably, l > Nh
In step 1, when the auxiliary features are used to expand x according to actual conditions, the features such as reading category, target learner, plot expansion mode, whether three-dimensional image, whether auxiliary means other than vision exists, main language of text, drawing style, average word number per page, etc. can be adopted.
The Gaussian kernel width may be 0.01, lambda1,λ2And θ may be taken as: lambda [ alpha ]1=0.3,λ2=0.7,θ=0.2。NhAn integer between 100 and 1000 may be taken, and E an integer between 3 and 20 may be taken.
The above examples are provided only for the purpose of describing the present invention, and are not intended to limit the scope of the present invention. The scope of the invention is defined by the appended claims. Various equivalent substitutions and modifications can be made without departing from the spirit and principles of the invention, and are intended to be within the scope of the invention.

Claims (5)

1. An experience analysis method based on learner expression is characterized by comprising the following steps:
step 1, data acquisition and initialization:
collecting facial videos of a learner in each learning process, analyzing the expression of each frame, and dividing the expression into aversion, anger, smell and the like,8 categories of fear, happiness, sadness, surprise, photophobia and blankness are formed into a feature vector
Figure FDA0002336971920000011
x(1),...,x(8)The proportion of aversion, anger, fear, happiness, sadness, surprise, photophobia and no expression in the whole video is respectively x(1),...,x(8)The sum is 1, and the auxiliary characteristics are used for expanding x according to actual conditions to obtain
Figure FDA0002336971920000012
Is NiA sample of dimensions; order sample collection
Figure FDA0002336971920000013
Label for scoring learner experience after each learning as sample
Figure FDA0002336971920000014
To pair
Figure FDA0002336971920000015
Marking to obtain corresponding category label
Figure FDA0002336971920000016
Wherein l is the number of labeled samples, n is the number of all samples, and u-n-l is the number of unlabeled samples;
Figure FDA0002336971920000017
a set of real numbers is represented as,
Figure FDA0002336971920000018
representing a positive real number set;
initialization: the following parameters were manually set: lambda [ alpha ]1,λ2Theta, sigma > 0, number of hidden layer nodes NhThe maximum iteration time E is more than 0, and the iteration time t is 0;
step 2, randomly generating hidingInput weight vector of layer mapping function
Figure FDA0002336971920000019
Is offset from the input; b ∈ R, as follows:
randomly generating NhA, obtaining
Figure FDA00023369719200000110
Randomly generating NhB, obtaining
Figure FDA00023369719200000111
Step 3, generating a hidden layer output function:
Figure FDA00023369719200000112
wherein G (a, b, x) is an activation function, x represents a sample, and superscript T represents matrix transfer;
step 4, generating a hidden layer output matrix:
H=[h(x1),...,h(xn)]T
step 5, initializing an output weight matrix:
Figure FDA00023369719200000113
wherein ,W0An output weight matrix W with t equal to 0, pinv (H) represents a pseudo-inverse matrix of H,
Figure FDA00023369719200000114
Hla matrix composed of the first l rows of H;
and 6, updating the label approximation matrix as follows:
Figure FDA00023369719200000115
wherein ,Yt+1For t +1 iterationsOf the label approximation matrix, InIs a unit matrix of n dimensions, J ═ Il,Ol×u;Ou×l,Ou×u],IlIs a unit matrix in the dimension of l,
Figure FDA00023369719200000116
is v is1×v2Zero matrix of dimensions, v1,v2It is possible to take u or l,
Figure FDA00023369719200000117
Ou×1a zero matrix of u x 1 dimensions; l is graph Laplace matrix L ═ D-A, A is similarity matrix, its ith row and jth column element AijComprises the following steps:
Figure FDA00023369719200000118
wherein ,xiAnd xjFor the sample, i, j ∈ { 1.,. n }, σ > 0 is the Gaussian kernel width, D is the degree matrix of A, D is the diagonal matrix, the ith diagonal element D of D isii=∑jAij
And 7: the output weight matrix is updated as follows:
Wt+1=(HTH+θUt)-1HTYt+1
wherein ,
Figure FDA0002336971920000021
wherein ,Wt+1Denotes W at the time t +1,
Figure FDA0002336971920000022
is Wt+1Line 1 to line N ofhThe number of the row vectors is,
Figure FDA0002336971920000023
is line 1 to line N of WhA row vector;
and 8: the iteration number t is increased by 1, if t > E, the iteration number is kept
Figure FDA0002336971920000027
Jumping to the step 9, otherwise jumping to the step 6;
and step 9: for the new sample x, its experience score is predicted using h (x) W.
2. The method of claim 1, wherein the activation function G (a, b, x) in step 3 is:
Figure FDA0002336971920000024
3. the method of claim 1, wherein the activation function G (a, b, x) in step 3 is:
Figure FDA0002336971920000025
4. the method of claim 1, wherein the activation function G (a, b, x) in step 3 is:
Figure FDA0002336971920000026
5. the method as claimed in any one of claims 1, 2, 3 and 4, wherein l > Nh
CN201911360147.0A 2019-12-25 2019-12-25 Experience analysis method based on learner expression Active CN111126297B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911360147.0A CN111126297B (en) 2019-12-25 2019-12-25 Experience analysis method based on learner expression

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911360147.0A CN111126297B (en) 2019-12-25 2019-12-25 Experience analysis method based on learner expression

Publications (2)

Publication Number Publication Date
CN111126297A true CN111126297A (en) 2020-05-08
CN111126297B CN111126297B (en) 2023-10-31

Family

ID=70502568

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911360147.0A Active CN111126297B (en) 2019-12-25 2019-12-25 Experience analysis method based on learner expression

Country Status (1)

Country Link
CN (1) CN111126297B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112001223A (en) * 2020-07-01 2020-11-27 安徽新知数媒信息科技有限公司 Rapid virtualization construction method of real environment map
CN115506783A (en) * 2021-06-21 2022-12-23 中国石油化工股份有限公司 Lithology identification method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107085704A (en) * 2017-03-27 2017-08-22 杭州电子科技大学 Fast face expression recognition method based on ELM own coding algorithms
CN107392230A (en) * 2017-06-22 2017-11-24 江南大学 A kind of semi-supervision image classification method for possessing maximization knowledge utilization ability
US20180165554A1 (en) * 2016-12-09 2018-06-14 The Research Foundation For The State University Of New York Semisupervised autoencoder for sentiment analysis
CN109359521A (en) * 2018-09-05 2019-02-19 浙江工业大学 The two-way assessment system of Classroom instruction quality based on deep learning
CN109919102A (en) * 2019-03-11 2019-06-21 重庆科技学院 A kind of self-closing disease based on Expression Recognition embraces body and tests evaluation method and system
CN109919099A (en) * 2019-03-11 2019-06-21 重庆科技学院 A kind of user experience evaluation method and system based on Expression Recognition
CN109934156A (en) * 2019-03-11 2019-06-25 重庆科技学院 A kind of user experience evaluation method and system based on ELMAN neural network
CN110390307A (en) * 2019-07-25 2019-10-29 首都师范大学 Expression recognition method, Expression Recognition model training method and device

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180165554A1 (en) * 2016-12-09 2018-06-14 The Research Foundation For The State University Of New York Semisupervised autoencoder for sentiment analysis
CN107085704A (en) * 2017-03-27 2017-08-22 杭州电子科技大学 Fast face expression recognition method based on ELM own coding algorithms
CN107392230A (en) * 2017-06-22 2017-11-24 江南大学 A kind of semi-supervision image classification method for possessing maximization knowledge utilization ability
CN109359521A (en) * 2018-09-05 2019-02-19 浙江工业大学 The two-way assessment system of Classroom instruction quality based on deep learning
CN109919102A (en) * 2019-03-11 2019-06-21 重庆科技学院 A kind of self-closing disease based on Expression Recognition embraces body and tests evaluation method and system
CN109919099A (en) * 2019-03-11 2019-06-21 重庆科技学院 A kind of user experience evaluation method and system based on Expression Recognition
CN109934156A (en) * 2019-03-11 2019-06-25 重庆科技学院 A kind of user experience evaluation method and system based on ELMAN neural network
CN110390307A (en) * 2019-07-25 2019-10-29 首都师范大学 Expression recognition method, Expression Recognition model training method and device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
MIN WANG ET AL.: "Look-up Table Unit Activation Function for Deep Convolutional Neural Networks", 《2018 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION》, pages 1225 - 1233 *
雒晓卓: "基于联合稀疏和局部线性的极限学习机及应用", 《中国博士学位论文全文数据库 信息科技辑》, no. 2017, pages 140 - 45 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112001223A (en) * 2020-07-01 2020-11-27 安徽新知数媒信息科技有限公司 Rapid virtualization construction method of real environment map
CN112001223B (en) * 2020-07-01 2023-11-24 安徽新知数字科技有限公司 Rapid virtualization construction method for real environment map
CN115506783A (en) * 2021-06-21 2022-12-23 中国石油化工股份有限公司 Lithology identification method

Also Published As

Publication number Publication date
CN111126297B (en) 2023-10-31

Similar Documents

Publication Publication Date Title
CN109376242B (en) Text classification method based on cyclic neural network variant and convolutional neural network
CN110334705B (en) Language identification method of scene text image combining global and local information
CN108647742B (en) Rapid target detection method based on lightweight neural network
CN113486981B (en) RGB image classification method based on multi-scale feature attention fusion network
CN108537119B (en) Small sample video identification method
Taylor et al. Learning invariance through imitation
CN108765383B (en) Video description method based on deep migration learning
CN110717526A (en) Unsupervised transfer learning method based on graph convolution network
CN106951911A (en) A kind of quick multi-tag picture retrieval system and implementation method
CN110837846A (en) Image recognition model construction method, image recognition method and device
CN110705490B (en) Visual emotion recognition method
CN105701225B (en) A kind of cross-media retrieval method based on unified association hypergraph specification
CN110175657B (en) Image multi-label marking method, device, equipment and readable storage medium
CN113222011A (en) Small sample remote sensing image classification method based on prototype correction
CN103020658B (en) Recognition method for objects in two-dimensional images
CN117992805B (en) Zero sample cross-modal retrieval method and system based on tensor product graph fusion diffusion
CN108470025A (en) Partial-Topic probability generates regularization own coding text and is embedded in representation method
CN111126297A (en) Experience analysis method based on learner expression
CN112364791A (en) Pedestrian re-identification method and system based on generation of confrontation network
CN114898136B (en) Small sample image classification method based on characteristic self-adaption
CN112784921A (en) Task attention guided small sample image complementary learning classification algorithm
CN118036555B (en) Low-sample font generation method based on skeleton transfer and structure contrast learning
CN114742014B (en) Few-sample text style migration method based on associated attention
CN110442736B (en) Semantic enhancer spatial cross-media retrieval method based on secondary discriminant analysis
CN116883746A (en) Graph node classification method based on partition pooling hypergraph neural network

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240411

Address after: Building 24, 4th Floor, No. 68 Beiqing Road, Haidian District, Beijing, 100000, 0446

Patentee after: Beijing Beike Haiteng Technology Co.,Ltd.

Country or region after: China

Address before: 232001 cave West Road, Huainan, Anhui

Patentee before: HUAINAN NORMAL University

Country or region before: China

TR01 Transfer of patent right