CN114399827A - College graduate career personality testing method and system based on facial micro-expression - Google Patents
- Publication number: CN114399827A
- Application number: CN202210244199.7A
- Authority: CN (China)
- Prior art keywords: micro-expression, facial, target object, acquiring
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications

- A—HUMAN NECESSITIES
  - A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
    - A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
      - A61B5/00—Measuring for diagnostic purposes; Identification of persons
        - A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
          - A61B5/167—Personality evaluation
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F18/00—Pattern recognition
        - G06F18/20—Analysing
          - G06F18/22—Matching criteria, e.g. proximity measures
    - G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N3/00—Computing arrangements based on biological models
        - G06N3/02—Neural networks
          - G06N3/04—Architecture, e.g. interconnection topology
            - G06N3/045—Combinations of networks
          - G06N3/08—Learning methods
    - G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
      - G06Q10/00—Administration; Management
        - G06Q10/10—Office automation; Time management
          - G06Q10/105—Human resources
      - G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
        - G06Q50/10—Services
          - G06Q50/20—Education
            - G06Q50/205—Education administration or guidance
Abstract
The invention discloses a college graduate career personality testing method and system based on facial micro-expressions. Frame image information of a target object is acquired during the career personality test, the frame image information is preprocessed, and the facial features of the target object are obtained. The attention and micro-expression changes of the target user are then detected and identified from the facial features, and an attention weight and a micro-expression weight are generated from those changes according to a preset judgment standard. The credibility of the choices made by the target user during the career personality test is judged from the attention weight and the micro-expression weight, and a career personality test report for the target user is generated by combining the credibility with the test result. By combining the facial micro-expressions and attention changes of the target user with the career personality test, the invention improves the accuracy of the career personality test of the target object.
Description
Technical Field
The invention relates to the technical field of psychological testing, and in particular to a college graduate career personality testing method and system based on facial micro-expressions.
Background
With the popularization of higher education, the number of college graduates increases year by year, and the problem of graduate employment has grown correspondingly acute; finding work has become a genuine difficulty for today's college students. Meanwhile, with the development of psychology, mechanisms for recognizing and discriminating human character and behavior have matured, so some enterprises draw on the research results of psychology and link their human-resource management with career personality testing, aiming to find the employees best suited to the enterprise. Colleges and universities therefore need to keep pace when carrying out graduate employment guidance. At the present stage, however, such guidance starts late and the guidance system is immature, so it cannot meet college students' demand for employment guidance. Moreover, because of human subjective factors, the traditional career personality test cannot truly reflect the career personality of college graduates, and employment guidance cannot be carried out in a targeted manner.
To improve the accuracy of the college graduate career personality test, a supporting system needs to be developed. The system acquires frame image information of a target object during the career personality test, preprocesses the frame image information, and obtains the facial features of the target object; it detects and identifies the attention and micro-expression changes of the target user from the facial features, and generates an attention weight and a micro-expression weight from those changes according to a preset judgment standard; it judges the credibility of the choices made by the target user during the test from the attention weight and the micro-expression weight; and it generates a career personality test report for the target user by combining the credibility with the test result. In implementing such a system, how to derive the credibility of the choices made during the test from the attention and micro-expression changes of the target user is one of the problems that urgently needs to be solved.
Disclosure of Invention
In order to solve the technical problems, the invention provides a college graduate career personality testing method and system based on facial micro-expression.
The invention provides a college graduate career personality testing method based on facial micro-expressions, which comprises the following steps:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and micro-expression changes of the target user according to the facial features, and generating an attention weight and a micro-expression weight from the attention and micro-expression changes according to a preset judgment standard;
judging the credibility of the choices made by the target user during the occupational character test according to the attention weight and the micro-expression weight;
and generating an occupational character test report for the target user by combining the credibility with the result of the occupational character test.
In this scheme, the acquiring of frame image information of the target object during the occupational character test, the preprocessing of the frame image information, and the acquiring of the facial features of the target object specifically include:
acquiring an original video of a target object during occupational character test, and decoding the original video to acquire frame image information;
positioning the facial image data of the target object from the frame image information, obtaining the facial region of interest of the target object from the facial image, and extracting facial region-of-interest features by locating the face key points of the facial region of interest;
performing identity verification against a college graduate information database using the facial region-of-interest features, and judging the similarity between the facial region-of-interest features and the photo information corresponding to the target object in the college graduate information database;
and if the similarity is greater than a preset similarity threshold, the identity verification of the target object passes and the occupational character test proceeds.
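The identity-verification step above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the feature vectors, the cosine measure, and the 0.8 threshold are all assumptions chosen for demonstration.

```python
import math

def cosine_similarity(a, b):
    # Similarity between the face region-of-interest feature vector extracted
    # from the frame images and the stored photo feature from the college
    # graduate information database.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def verify_identity(live_feature, db_feature, threshold=0.8):
    # The career personality test proceeds only when the similarity exceeds
    # the preset similarity threshold (0.8 is a placeholder value).
    return cosine_similarity(live_feature, db_feature) > threshold
```

In practice the feature vectors would come from a face-recognition embedding model rather than raw pixels; the threshold comparison itself matches the scheme's "greater than a preset similarity threshold" rule.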
In this scheme, the detection and identification of the target user's micro-expressions according to the facial features specifically includes:
constructing a micro-expression detection model based on a 3D-CNN, acquiring sample data from the CASME database, extracting the facial features of the sample data, normalizing the facial features, and dividing the sample data into a training set and a verification set;
training the micro-expression detection model on the training set, and adjusting the relevant parameters of the micro-expression detection model through multiple training iterations;
obtaining and calculating the prediction probability of each micro-expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the emotion finally predicted by the micro-expression detection model;
testing the finally predicted emotions of the micro-expression detection model on the verification set, and calculating the deviation rate between the model's predictions and the labels of the sample data in the verification set;
judging whether the deviation rate is smaller than a preset deviation-rate threshold; if so, the precision of the micro-expression detection model meets the preset standard, and the micro-expression detection model is output;
and acquiring, through the micro-expression detection model, the micro-expression changes and emotion changes of the target object during the answering of the nth occupational character test according to the facial features of the target object.
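The Softmax prediction step and the deviation-rate check against the verification set can be illustrated as below. The 3D-CNN itself is omitted; the eight emotion labels follow the CASME description later in this document, and everything else is an illustrative assumption.

```python
import math

# Eight emotion classes, following the CASME labelling described in this patent.
EMOTIONS = ["happiness", "sadness", "disgust", "surprise",
            "contempt", "fear", "repression", "tension"]

def softmax(logits):
    # Convert the raw outputs of the (omitted) 3D-CNN into prediction
    # probabilities over the emotion classes.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def predict_emotion(logits):
    # The emotion with the maximum prediction probability is taken as the
    # model's final prediction.
    probs = softmax(logits)
    return EMOTIONS[probs.index(max(probs))]

def deviation_rate(predicted, labels):
    # Fraction of verification-set samples whose predicted emotion deviates
    # from the labelled emotion; the model is accepted when this rate falls
    # below the preset deviation-rate threshold.
    wrong = sum(1 for p, y in zip(predicted, labels) if p != y)
    return wrong / len(labels)
```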
In this scheme, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring the facial contour features of the target object at a target moment through the facial region of interest, and taking the position coordinates of the facial contour features as the initial facial contour position coordinates;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
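One way to turn the abnormality counts and durations into an attention score and weight is sketched below. The per-event and per-second deductions and the 100-point base are hypothetical stand-ins for the patent's "preset scoring standard".

```python
def attention_score(deflection_count, deflection_seconds,
                    gaze_count, gaze_seconds, base=100.0):
    # Deduct points for each facial-deflection anomaly and each sight-line
    # drop-point anomaly, and additionally for the time spent in the
    # anomalous state (deduction rates are assumptions).
    score = base
    score -= 5.0 * deflection_count + 1.0 * deflection_seconds
    score -= 5.0 * gaze_count + 1.0 * gaze_seconds
    return max(score, 0.0)

def attention_weight(score, base=100.0):
    # Normalise the attention score of the nth test question into [0, 1].
    return score / base
```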
In the scheme, the micro expression weight is obtained according to the micro expression change of the target user, and the method specifically comprises the following steps:
acquiring the micro-expression changes of the target object during the answering of the nth occupational character test, acquiring, through big data, the association information between expressions and negative emotions in psychological tests, and extracting the micro-expression fragments corresponding to negative emotions according to the association information;
and acquiring, from the micro-expression changes of the target object, the similarity to the micro-expression fragments corresponding to negative emotions, and obtaining the micro-expression weight according to the similarity.
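A minimal sketch of deriving the micro-expression weight from the similarity to negative-emotion fragments. The rule that a stronger match to a negative-emotion fragment lowers the weight, and the 0.5 penalty factor, are assumptions, not the patent's stated formula.

```python
def microexpression_weight(negative_similarities, penalty=0.5):
    # negative_similarities: similarity scores (each in [0, 1]) between the
    # observed micro-expression changes and the stored micro-expression
    # fragments associated with negative emotions.
    if not negative_similarities:
        return 1.0
    strongest = max(negative_similarities)
    # The closer the match to a negative-emotion fragment, the less weight
    # is given to the answer on that question (hypothetical rule).
    return max(0.0, 1.0 - penalty * strongest)
```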
In this scheme, the generating of the career personality test report of the target user by combining the credibility with the result of the career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting to the college employment platform database, and providing recruitment information recommendations to the target user according to the recommended occupation type and the development suggestion.
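How the two weights might be combined into a per-question credibility and a credibility-weighted final score is sketched here. The multiplicative combination is one plausible reading of the scheme, not a formula stated in the patent.

```python
def credibility(attention_weight, micro_weight):
    # Credibility of the choice made on one test question, judged from the
    # attention weight and the micro-expression weight (product rule assumed).
    return attention_weight * micro_weight

def final_scores(choice_scores, attention_weights, micro_weights):
    # Weight each question's raw choice score by its credibility; the report
    # then compares the summed result against the occupational character
    # test scoring standard.
    return [s * credibility(a, m)
            for s, a, m in zip(choice_scores, attention_weights, micro_weights)]
```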
The invention also provides a college graduate career personality testing system based on facial micro-expressions, which comprises a memory and a processor. The memory stores a college graduate career personality testing method program based on facial micro-expressions, and when the program is executed by the processor, the following steps are implemented:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and micro-expression changes of the target user according to the facial features, and generating an attention weight and a micro-expression weight from the attention and micro-expression changes according to a preset judgment standard;
judging the credibility of the choices made by the target user during the occupational character test according to the attention weight and the micro-expression weight;
and generating an occupational character test report for the target user by combining the credibility with the result of the occupational character test.
In this scheme, the acquiring of frame image information of the target object during the occupational character test, the preprocessing of the frame image information, and the acquiring of the facial features of the target object specifically include:
acquiring an original video of a target object during occupational character test, and decoding the original video to acquire frame image information;
positioning the facial image data of the target object from the frame image information, obtaining the facial region of interest of the target object from the facial image, and extracting facial region-of-interest features by locating the face key points of the facial region of interest;
performing identity verification against a college graduate information database using the facial region-of-interest features, and judging the similarity between the facial region-of-interest features and the photo information corresponding to the target object in the college graduate information database;
and if the similarity is greater than a preset similarity threshold, the identity verification of the target object passes and the occupational character test proceeds.
In this scheme, the detection and identification of the target user's micro-expressions according to the facial features specifically includes:
constructing a micro-expression detection model based on a 3D-CNN, acquiring sample data from the CASME database, extracting the facial features of the sample data, normalizing the facial features, and dividing the sample data into a training set and a verification set;
training the micro-expression detection model on the training set, and adjusting the relevant parameters of the micro-expression detection model through multiple training iterations;
obtaining and calculating the prediction probability of each micro-expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the emotion finally predicted by the micro-expression detection model;
testing the finally predicted emotions of the micro-expression detection model on the verification set, and calculating the deviation rate between the model's predictions and the labels of the sample data in the verification set;
judging whether the deviation rate is smaller than a preset deviation-rate threshold; if so, the precision of the micro-expression detection model meets the preset standard, and the micro-expression detection model is output;
and acquiring, through the micro-expression detection model, the micro-expression changes and emotion changes of the target object during the answering of the nth occupational character test according to the facial features of the target object.
In this scheme, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring the facial contour features of the target object at a target moment through the facial region of interest, and taking the position coordinates of the facial contour features as the initial facial contour position coordinates;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
In the scheme, the micro expression weight is obtained according to the micro expression change of the target user, and the method specifically comprises the following steps:
acquiring the micro-expression changes of the target object during the answering of the nth occupational character test, acquiring, through big data, the association information between expressions and negative emotions in psychological tests, and extracting the micro-expression fragments corresponding to negative emotions according to the association information;
and acquiring, from the micro-expression changes of the target object, the similarity to the micro-expression fragments corresponding to negative emotions, and obtaining the micro-expression weight according to the similarity.
In this scheme, the generating of the career personality test report of the target user by combining the credibility with the result of the career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
The invention discloses a college graduate career personality testing method and system based on facial micro-expressions. Frame image information of a target object is acquired during the career personality test, the frame image information is preprocessed, and the facial features of the target object are obtained; the attention and micro-expression changes of the target user are detected and identified from the facial features, and an attention weight and a micro-expression weight are generated from those changes according to a preset judgment standard; the credibility of the choices made by the target user during the test is judged from the attention weight and the micro-expression weight; and a career personality test report for the target user is generated by combining the credibility with the test result. By combining the facial micro-expressions and attention changes of the target user with the career personality test, the invention improves the accuracy of the career personality test of the target object; by introducing micro-expression detection, the true thoughts of college graduates are obtained, a reasonable employment guidance scheme can be formulated, and targeted employment recommendations can be provided for college graduates.
Drawings
FIG. 1 shows a flow chart of the college graduate career personality testing method based on facial micro-expression of the present invention;
FIG. 2 shows a block diagram of the college graduate career personality testing system based on facial micro-expression of the present invention.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a more particular description of the invention will be rendered by reference to the appended drawings. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those specifically described herein, and therefore the scope of the present invention is not limited by the specific embodiments disclosed below.
FIG. 1 shows a flow chart of a college graduate career personality testing method based on facial microexpression of the present invention.
As shown in fig. 1, a first aspect of the present invention provides a graduate professional character testing method for colleges and universities based on facial microexpression, comprising:
s102, obtaining frame image information of a target object during occupational character testing, preprocessing the frame image information, and obtaining facial features of the target object;
s104, detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
s106, judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
and S108, generating a target user career character test report according to the credibility and the result of the career character test.
It should be noted that the acquiring of frame image information of the target object during the occupational character test, the preprocessing of the frame image information, and the acquiring of the facial features of the target object specifically include: acquiring an original video of the target object during the occupational character test, and decoding the original video to acquire the frame image information; positioning the facial image data of the target object from the frame image information, obtaining the facial region of interest of the target object from the facial image, and extracting facial region-of-interest features by locating the face key points of the facial region of interest, wherein the region-of-interest features include optical flow features and texture features, and the optical flow features reflect the direction and amplitude of movement of the facial muscles in the region of interest; performing identity verification against a college graduate information database using the facial region-of-interest features, and judging the similarity between the facial region-of-interest features and the photo information corresponding to the target object in the database; and if the similarity is greater than a preset similarity threshold, the identity verification of the target object passes and the occupational character test proceeds.
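The optical-flow features mentioned above (direction and amplitude of facial-muscle movement) can be approximated, for a single key point across two frames, by a simple displacement vector. This sparse key-point version is an illustration only; a real implementation would typically use a dense optical-flow method over the whole region of interest.

```python
import math

def flow_vector(point_prev, point_curr):
    # Displacement of one face key point between consecutive frames:
    # the amplitude approximates the strength of the muscle movement in the
    # region of interest, and the angle approximates its direction.
    dx = point_curr[0] - point_prev[0]
    dy = point_curr[1] - point_prev[1]
    amplitude = math.hypot(dx, dy)
    direction = math.atan2(dy, dx)  # radians, measured from the x-axis
    return amplitude, direction
```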
It should be noted that the detection and identification of the target user's micro-expressions according to the facial features specifically includes:
constructing a micro-expression detection model based on a 3D-CNN, acquiring sample data from the CASME database, extracting the facial features of the sample data, normalizing the facial features, and dividing the sample data into a training set and a verification set; training the micro-expression detection model on the training set, and adjusting its relevant parameters through multiple training iterations; obtaining and calculating the prediction probability of each micro-expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the emotion finally predicted by the model; testing the finally predicted emotions on the verification set, and calculating the deviation rate between the model's predictions and the labels of the sample data in the verification set; judging whether the deviation rate is smaller than a preset deviation-rate threshold, and if so, the precision of the micro-expression detection model meets the preset standard and the model is output; and acquiring, through the micro-expression detection model, the micro-expression changes and emotion changes of the target object during the answering of the nth occupational character test according to the facial features of the target object.
The CASME database was acquired with a camera at a frame rate of 60 fps and contains 195 micro-expression samples in total. The database is divided into part A and part B: the class-A samples were collected under natural illumination at a sample image resolution of 1280 × 720, while the class-B samples were collected under LED lighting at a sample image resolution of 640 × 480. Sample labeling in the CASME database was completed jointly by the participants and psychologists, and covers eight emotion categories: happiness, sadness, disgust, surprise, contempt, fear, repression and tension.
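The Softmax selection and the deviation-rate acceptance check described above can be sketched as follows. The emotion ordering, logit values, and the 0.2 threshold are assumptions; the 3D-CNN itself is omitted, only the output-layer logic is shown:

```python
import numpy as np

# The eight emotion categories labeled in the CASME database.
EMOTIONS = ["happiness", "sadness", "disgust", "surprise",
            "contempt", "fear", "repression", "tension"]

def softmax(logits):
    # Softmax layer: turn the model's raw scores into prediction
    # probabilities over the emotion categories.
    z = np.exp(np.asarray(logits, dtype=float) - np.max(logits))
    return z / z.sum()

def predict_emotion(logits):
    # The emotion with the maximum prediction probability is taken
    # as the model's final predicted emotion.
    return EMOTIONS[int(np.argmax(softmax(logits)))]

def deviation_rate(predicted, labels):
    # Fraction of verification-set samples the model predicts wrongly.
    wrong = sum(p != y for p, y in zip(predicted, labels))
    return wrong / len(labels)

def model_meets_standard(predicted, labels, threshold=0.2):
    # The model is output once its deviation rate falls below the
    # preset deviation-rate threshold (0.2 is a placeholder).
    return deviation_rate(predicted, labels) < threshold
```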
It should be noted that, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring a face contour feature of a target object at a target moment through a face interesting region, and acquiring a position coordinate of the face contour feature as a face contour initial position coordinate;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
It should be noted that the acquisition of the sight line drop point of the target object can be realized by establishing a sight line detection model through a machine learning method such as a support vector machine or a neural network, acquiring a face image of the target object, and extracting feature points of the eye region of the target object and the head posture of the target object; and acquiring a sight angle and a sight confidence parameter according to the characteristic points of the eye region, and determining the sight falling point position of the target object according to the head posture, the sight angle, the sight confidence parameter and the distance from the target object to the display screen.
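The gaze projection described above can be sketched geometrically. Additively combining head-pose and eye-gaze angles, and the screen dimensions used for the preset range, are simplifying assumptions about what a trained sight line detection model would output:

```python
import math

def gaze_landing_point(head_yaw_deg, head_pitch_deg,
                       gaze_yaw_deg, gaze_pitch_deg, distance_cm):
    # Combine head pose and eye-region gaze angles, then project the
    # gaze ray onto the display plane at the measured viewing distance.
    yaw = math.radians(head_yaw_deg + gaze_yaw_deg)
    pitch = math.radians(head_pitch_deg + gaze_pitch_deg)
    return distance_cm * math.tan(yaw), distance_cm * math.tan(pitch)

def landing_point_in_range(point, half_width_cm=30.0, half_height_cm=20.0):
    # A landing point outside the preset screen range counts toward
    # the sight-line-drop-point anomaly statistics.
    x, y = point
    return abs(x) <= half_width_cm and abs(y) <= half_height_cm
```

The sight confidence parameter mentioned in the text would typically gate whether a landing-point estimate is trusted at all; it is omitted from this sketch.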
A set of threshold intervals is established by presetting several count and duration thresholds; the intervals into which the facial deflection abnormal information and the sight line drop point abnormal information fall are determined, and the attention score of the target object during the answering process of the nth occupational character test item is obtained according to the scoring standard corresponding to each interval. The more frequent and the longer the facial deflection anomalies and sight line drop point anomalies of the target object, the lower the attention score during the test answering process, and the smaller the attention weight in the occupational character test.
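A minimal sketch of the interval-based scoring: the count and duration thresholds, the 20-point penalties, and the score-to-weight mapping are placeholder values standing in for the preset scoring standard:

```python
def attention_score(deflection_stats, gaze_stats,
                    count_threshold=5, duration_threshold=10.0):
    # Each stats tuple is (anomaly_count, total_duration_seconds).
    # More and longer anomalies pull the score down, matching the
    # monotonic rule stated in the text.
    score = 100.0
    for count, duration in (deflection_stats, gaze_stats):
        if count > count_threshold:
            score -= 20.0
        if duration > duration_threshold:
            score -= 20.0
    return max(score, 0.0)

def attention_weight(score):
    # Map the 0-100 attention score onto a 0-1 weight for the nth item.
    return score / 100.0
```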
It should be noted that the obtaining of the micro expression weight according to the micro expression change of the target user specifically includes:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
and acquiring micro-expression fragment similarity corresponding to the negative emotion through micro-expression change of the target object, and acquiring micro-expression weight according to the similarity.
Micro-expression segments corresponding to negative emotion mainly include frowning, pursed lips, slightly flared nostrils, raised eyebrows and the like. When the micro-expression detection model judges that the micro-expression change of the target object reflects a negative emotion, the similarity between the target object's micro-expression change and the micro-expression segments corresponding to negative emotion in the psychological test is calculated, and the micro-expression weight of the target object in the occupational character test is obtained according to the similarity; the similarity calculation can be realized through methods such as Euclidean distance or cosine comparison. The credibility of the occupational character test result is then calculated from the initial credibility combined with the attention weight and the micro-expression weight of the target object in the occupational character test.
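The Euclidean-distance variant of the similarity calculation might look like this. The inverse mapping from similarity to weight is an assumption, since the text says only that the weight is obtained "according to the similarity" without fixing the direction:

```python
import math

def euclidean_similarity(a, b):
    # Map the Euclidean distance between two micro-expression feature
    # sequences into a similarity in (0, 1]; identical sequences give 1.
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return 1.0 / (1.0 + dist)

def micro_expression_weight(observed_change, negative_segments):
    # Assumption: the closer the observed change matches a stored
    # negative-emotion fragment, the lower the credibility weight.
    best = max(euclidean_similarity(observed_change, seg)
               for seg in negative_segments)
    return 1.0 - best
```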
It should be noted that the generating of the target user career personality test report by combining the credibility with the result of the career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
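The credibility-weighted scoring and report generation in the steps above could be sketched as follows; `build_report`, the scoring-standard dictionary, and the tendency names are hypothetical placeholders:

```python
def final_scores(item_scores, credibilities):
    # Each item's selected score is weighted by the credibility of
    # the answer given on that item.
    return [s * c for s, c in zip(item_scores, credibilities)]

def build_report(item_scores, credibilities, scoring_standard):
    # `scoring_standard` maps a minimum total score to a
    # work-environment tendency; band names are placeholders.
    total = sum(final_scores(item_scores, credibilities))
    for minimum, tendency in sorted(scoring_standard.items(), reverse=True):
        if total >= minimum:
            return {"total": total, "tendency": tendency}
    return {"total": total, "tendency": "undetermined"}
```

The resulting tendency would then index into the recommended occupation types used for recruitment-information matching.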
According to the embodiment of the invention, the invention further comprises prompting the target object about inattention during the occupational character test, which specifically comprises the following steps:
acquiring the facial deflection abnormal information and the sight line drop point abnormal information of the target object in the occupational character test, and counting the number of test items for which the target object showed facial deflection abnormal information or sight line drop point abnormal information during the occupational character test answering process;
presetting abnormal test item quantity threshold information, and judging whether the test item quantity information is greater than the abnormal test item quantity threshold information;
if so, sending prompt information to the target object, prompting that the accuracy of the test result of the target object is reduced due to the abnormal conditions, and displaying the prompt information in a preset mode.
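A minimal sketch of this prompting logic, assuming a placeholder abnormal-item-count threshold of 3 and an illustrative message:

```python
def should_prompt(abnormal_item_count, item_count_threshold=3):
    # A prompt is triggered once the number of test items showing
    # face-deflection or gaze anomalies exceeds the preset threshold.
    return abnormal_item_count > item_count_threshold

def prompt_message(abnormal_item_count, item_count_threshold=3):
    # Returns None when no prompt is needed; the wording of the
    # message is hypothetical.
    if not should_prompt(abnormal_item_count, item_count_threshold):
        return None
    return (f"Anomalies were detected on {abnormal_item_count} test "
            "items; the accuracy of your test result may be reduced.")
```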
According to the embodiment of the invention, the invention further comprises the step of detecting the keyword information matched with the micro expression change of the target user through the sight line drop point, which specifically comprises the following steps:
acquiring the timestamp of the micro-expression change of the target object in the occupational character test, and acquiring the keyword information at the sight line landing point of the target object according to the timestamp;
matching the keyword information with the micro-expression changes to obtain the test subject information of the occupational character test items corresponding to the keywords;
obtaining the rest test items of the keywords through the test subject information, and performing semantic replacement on the keywords in the rest test items;
comparing the test results of the rest test items with the test results of the test items containing the micro-expression changes to obtain the deviation between the test results of the test items containing the micro-expression changes and the test results of the rest test items;
and acquiring correction information of the micro-expression weight according to the deviation, and correcting the micro-expression weight according to the correction information.
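One way to sketch the deviation-based weight correction described in the steps above. The text says only that the weight is corrected according to the deviation, so the linear correction formula, the `max_deviation` scale, and the direction of the adjustment are assumptions:

```python
def result_deviation(flagged_result, replaced_results):
    # Mean absolute difference between the answer on the item that
    # triggered the micro-expression change and the answers on its
    # semantically-replaced variants (answers on a numeric scale).
    diffs = [abs(flagged_result - r) for r in replaced_results]
    return sum(diffs) / len(diffs)

def corrected_weight(weight, deviation, max_deviation=4.0):
    # Assumption: a small deviation means the flagged item was
    # answered consistently despite the micro-expression, so the
    # weight is scaled back up toward 1.
    factor = 1.0 - min(deviation / max_deviation, 1.0)
    corrected = weight + (1.0 - weight) * 0.5 * factor
    return min(max(corrected, 0.0), 1.0)
```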
FIG. 2 shows a block diagram of a college graduate professional character testing system based on facial microexpression of the present invention.
The second aspect of the present invention also provides a college graduate career personality testing system 2 based on facial microexpression, comprising: a memory 21 and a processor 22, wherein the memory includes a facial micro-expression-based graduate vocational test method program, and when the processor executes the facial micro-expression-based graduate vocational test method program, the following steps are implemented:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
and generating a target user professional character test report by combining the credibility with the result of the professional character test.
It should be noted that acquiring the frame image information of the target object during the occupational character test, preprocessing the frame image information, and acquiring the facial features of the target object specifically includes: acquiring an original video of the target object during the occupational character test, and decoding the original video to acquire frame image information; locating the facial image data of the target object according to the frame image information, acquiring a facial region of interest of the target object from the facial image, and extracting facial region-of-interest features by locating face key points in the facial region of interest, wherein the region-of-interest features comprise optical flow features and texture features, and the optical flow features reflect the direction and amplitude of facial muscle movement in the region of interest; performing identity verification in a college graduate information database according to the facial region-of-interest features, by judging the similarity between the facial region-of-interest features and the photo information corresponding to the target object in the college graduate information database; and if the similarity is greater than a preset similarity threshold, the identity verification of the target object passes and the occupational character test proceeds.
It should be noted that, the detecting and identifying of the target user micro expression according to the facial features specifically includes:
constructing a micro-expression detection model based on a 3D-CNN, acquiring sample data from the CASME database, extracting and normalizing the facial features of the sample data, and dividing the sample data into a training set and a verification set; training the micro-expression detection model on the training set, and adjusting the relevant parameters of the model through multiple training iterations; calculating the prediction probability of each micro-expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the model's final predicted emotion; testing the final predicted emotion of the micro-expression detection model on the verification set, and calculating the deviation rate between the model's final predicted emotion and the sample labels in the verification set; judging whether the deviation rate is smaller than a preset deviation-rate threshold, and if so, the precision of the micro-expression detection model meets the preset standard and the model is output; and acquiring the micro-expression change and emotion change of the target object during the answering process of the nth occupational character test item according to the facial features of the target object through the micro-expression detection model.
The CASME database was acquired with a camera at a frame rate of 60 fps and contains 195 micro-expression samples in total. The database is divided into part A and part B: the class-A samples were collected under natural illumination at a sample image resolution of 1280 × 720, while the class-B samples were collected under LED lighting at a sample image resolution of 640 × 480. Sample labeling in the CASME database was completed jointly by the participants and psychologists, and covers eight emotion categories: happiness, sadness, disgust, surprise, contempt, fear, repression and tension.
It should be noted that, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring a face contour feature of a target object at a target moment through a face interesting region, and acquiring a position coordinate of the face contour feature as a face contour initial position coordinate;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
It should be noted that the acquisition of the sight line drop point of the target object can be realized by establishing a sight line detection model through a machine learning method such as a support vector machine or a neural network, acquiring a face image of the target object, and extracting feature points of the eye region of the target object and the head posture of the target object; and acquiring a sight angle and a sight confidence parameter according to the characteristic points of the eye region, and determining the sight falling point position of the target object according to the head posture, the sight angle, the sight confidence parameter and the distance from the target object to the display screen.
A set of threshold intervals is established by presetting several count and duration thresholds; the intervals into which the facial deflection abnormal information and the sight line drop point abnormal information fall are determined, and the attention score of the target object during the answering process of the nth occupational character test item is obtained according to the scoring standard corresponding to each interval. The more frequent and the longer the facial deflection anomalies and sight line drop point anomalies of the target object, the lower the attention score during the test answering process, and the smaller the attention weight in the occupational character test.
It should be noted that the obtaining of the micro expression weight according to the micro expression change of the target user specifically includes:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
and acquiring micro-expression fragment similarity corresponding to the negative emotion through micro-expression change of the target object, and acquiring micro-expression weight according to the similarity.
Micro-expression segments corresponding to negative emotion mainly include frowning, pursed lips, slightly flared nostrils, raised eyebrows and the like. When the micro-expression detection model judges that the micro-expression change of the target object reflects a negative emotion, the similarity between the target object's micro-expression change and the micro-expression segments corresponding to negative emotion in the psychological test is calculated, and the micro-expression weight of the target object in the occupational character test is obtained according to the similarity; the similarity calculation can be realized through methods such as Euclidean distance or cosine comparison. The credibility of the occupational character test result is then calculated from the initial credibility combined with the attention weight and the micro-expression weight of the target object in the occupational character test.
It should be noted that the generating of the target user career personality test report by combining the credibility with the result of the career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
According to the embodiment of the invention, the invention further comprises prompting the target object about inattention during the occupational character test, which specifically comprises the following steps:
acquiring the facial deflection abnormal information and the sight line drop point abnormal information of the target object in the occupational character test, and counting the number of test items for which the target object showed facial deflection abnormal information or sight line drop point abnormal information during the occupational character test answering process;
presetting abnormal test item quantity threshold information, and judging whether the test item quantity information is greater than the abnormal test item quantity threshold information;
if so, sending prompt information to the target object, prompting that the accuracy of the test result of the target object is reduced due to the abnormal conditions, and displaying the prompt information in a preset mode.
According to the embodiment of the invention, the invention further comprises the step of detecting the keyword information matched with the micro expression change of the target user through the sight line drop point, which specifically comprises the following steps:
acquiring the timestamp of the micro-expression change of the target object in the occupational character test, and acquiring the keyword information at the sight line landing point of the target object according to the timestamp;
matching the keyword information with the micro-expression changes to obtain the test subject information of the occupational character test items corresponding to the keywords;
obtaining the rest test items of the keywords through the test subject information, and performing semantic replacement on the keywords in the rest test items;
comparing the test results of the rest test items with the test results of the test items containing the micro-expression changes to obtain the deviation between the test results of the test items containing the micro-expression changes and the test results of the rest test items;
and acquiring correction information of the micro-expression weight according to the deviation, and correcting the micro-expression weight according to the correction information.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.
Claims (10)
1. A college graduate career personality testing method based on facial micro-expression is characterized by comprising the following steps:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
and generating a target user professional character test report by combining the credibility with the result of the professional character test.
2. The college graduate occupational character testing method based on facial micro-expression as claimed in claim 1, wherein the acquiring of frame image information of the target object during the occupational character test and the preprocessing of the frame image information to acquire the facial features of the target object specifically comprise:
acquiring an original video of a target object during occupational character test, and decoding the original video to acquire frame image information;
positioning according to the frame image information to acquire facial image data of a target object, acquiring a facial interesting region of the target object according to the facial image, and extracting facial interesting region characteristics by positioning face key points of the facial interesting region;
identity verification is carried out in a college graduate information database according to the face interesting features, and similarity judgment is carried out on the face interesting features and photo information corresponding to a target object in the college graduate information database;
and if the similarity is greater than a preset similarity threshold, the identity verification of the target object is passed, and the professional character test is carried out.
3. The college graduate career personality testing method based on facial microexpression of claim 1, wherein the detection and recognition of the microexpression of the target user is performed according to the facial features, and specifically comprises:
constructing a micro-expression detection model based on 3D-CNN, acquiring sample data through a CASME database, extracting facial features of the sample data, normalizing the facial features, generating and dividing the sample data into a training set and a verification set;
training the micro expression detection model through the training set, and adjusting relevant parameters of the micro expression detection model through multiple iterative training;
obtaining and calculating the prediction probability of the micro expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the finally predicted emotion of the micro expression detection model;
testing the finally predicted emotion of the micro-expression detection model according to the verification set, and calculating the deviation rate of the finally predicted emotion of the model and the sample data in the verification set;
judging whether the deviation rate is smaller than a preset deviation rate threshold value or not, if so, proving that the precision of the micro-expression detection model meets a preset standard, and outputting the micro-expression detection model;
and acquiring the micro-expression change and the emotion change of the target object in the nth occupational character test answering process according to the facial features of the target object through the micro-expression detection model.
4. The college graduate career personality testing method based on facial microexpression of claim 1, wherein the attention weight is obtained according to the attention change of the target user, and specifically comprises:
acquiring a face contour feature of a target object at a target moment through a face interesting region, and acquiring a position coordinate of the face contour feature as a face contour initial position coordinate;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
5. The college graduate career personality testing method based on facial micro-expression of claim 1, characterized in that micro-expression weights are obtained according to target user micro-expression changes, specifically:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
and acquiring micro-expression fragment similarity corresponding to the negative emotion through micro-expression change of the target object, and acquiring micro-expression weight according to the similarity.
6. The method of claim 1, wherein the credibility is combined with the result of the occupational character test to generate a target user occupational character test report, specifically:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
7. A college graduate career personality testing system based on facial micro-expression, the system comprising a memory and a processor, wherein the memory stores a facial-micro-expression-based college graduate occupational character testing program which, when executed by the processor, implements the following steps:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and micro-expression changes of the target user according to the facial features, and generating an attention weight and a micro-expression weight from the attention and micro-expression changes according to a preset judgment standard;
judging, according to the attention weight and the micro-expression weight, the credibility of the selections made by the target user during the occupational character test;
and combining the credibility with the results of the occupational character test to generate the occupational character test report of the target user.
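The four processor steps above can be strung together as a small pipeline. The per-frame detection stage is stubbed out here, and combining the two weights by multiplication into a single credibility factor is an assumption, since the text does not fix a fusion formula:

```python
from dataclasses import dataclass

@dataclass
class ItemObservation:
    """Per-item outputs of the detection stage (stubbed here)."""
    raw_score: float                # score implied by the selected answer
    attention_weight: float         # from facial deflection and gaze
    micro_expression_weight: float  # from negative-emotion similarity

def credibility(obs):
    """Fuse the two weights into one credibility factor (assumed product)."""
    return obs.attention_weight * obs.micro_expression_weight

def run_test(observations):
    """Credibility-adjusted total over all test items."""
    return sum(obs.raw_score * credibility(obs) for obs in observations)
```

A product fusion means either weight alone can discount an item: a fully attentive answer with strong negative-emotion leakage is discounted just like a distracted one.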
8. The system of claim 7, wherein the attention weight is obtained according to the attention changes of the target user, specifically:
acquiring the facial contour features of the target object at a target moment from the facial region of interest, and taking their position coordinates as the initial facial contour position coordinates;
judging the facial deflection changes of the target object while answering the nth occupational character test, and comparing the real-time facial contour position coordinates with the initial facial contour position coordinates to generate a facial deflection deviation;
acquiring the number of times the facial deflection deviation exceeds a deviation threshold during answering, and the duration for which it exceeds that threshold, and generating facial deflection anomaly information;
meanwhile, acquiring the gaze point of the target object, determining the number of times and the duration for which the gaze point falls outside a preset range, and generating gaze point anomaly information;
evaluating, against a preset scoring standard, an attention score for the target object while answering the nth occupational character test, according to the facial deflection anomaly information and the gaze point anomaly information;
and acquiring the attention weight of the nth occupational character test according to the attention score.
9. The system of claim 7, wherein the micro-expression weight is obtained according to the micro-expression changes of the target user, specifically:
acquiring the micro-expression changes of the target object while answering the nth occupational character test, acquiring, through big data, association information between expressions and negative emotions in psychological testing, and extracting micro-expression segments corresponding to negative emotions according to the association information;
and computing the similarity between the target object's micro-expression changes and the micro-expression segments corresponding to negative emotions, and acquiring the micro-expression weight according to the similarity.
10. The system of claim 7, wherein the credibility and the results of the occupational character tests are combined to generate the occupational character test report of the target user, specifically:
matching the target user's selection results in each occupational character test with the credibility to generate a final score for each test;
summarizing the final scores of the tests and comparing them against the occupational character test scoring standard to generate the occupational character test report of the target object;
acquiring the work environment tendency and development suggestions from the occupational character test report, and acquiring a recommended occupation type for the target object according to the work environment tendency;
and connecting to the college employment platform database, and recommending recruitment information to the target user according to the recommended occupation type and the development suggestions.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210244199.7A CN114399827B (en) | 2022-03-14 | 2022-03-14 | College graduate career character testing method and system based on facial micro-expression |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114399827A true CN114399827A (en) | 2022-04-26 |
CN114399827B (en) | 2022-08-09 |
Family
ID=81234014
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210244199.7A Active CN114399827B (en) | 2022-03-14 | 2022-03-14 | College graduate career character testing method and system based on facial micro-expression |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114399827B (en) |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2010072203A (en) * | 2008-09-17 | 2010-04-02 | Fuji Xerox Co Ltd | Problem creating device, problem creating program, and learning system |
US20140376772A1 (en) * | 2013-06-24 | 2014-12-25 | Utechzone Co., Ltd | Device, operating method and computer-readable recording medium for generating a signal by detecting facial movement |
CN108304745A (en) * | 2017-01-10 | 2018-07-20 | Potevio Information Technology Co., Ltd. | A kind of driver's driving behavior detection method, device |
CN109222888A (en) * | 2018-11-05 | 2019-01-18 | Wenzhou Polytechnic | A method of psychological test reliability is judged based on eye movement technique |
CN109767321A (en) * | 2018-12-18 | 2019-05-17 | OneConnect Smart Technology Co., Ltd. (Shenzhen) | Question answering process optimization method, device, computer equipment and storage medium |
CN109805943A (en) * | 2017-11-20 | 2019-05-28 | Xu Yi | A kind of intelligent evaluating system and method based on micro-expression recognition |
CN109872026A (en) * | 2018-12-14 | 2019-06-11 | OneConnect Smart Technology Co., Ltd. (Shenzhen) | Evaluation result generation method, device, equipment and computer readable storage medium |
CN109977903A (en) * | 2019-04-03 | 2019-07-05 | Zhuhai Readboy Network Education Co., Ltd. | The method, apparatus and computer storage medium of a kind of wisdom classroom student-directed |
CN110362648A (en) * | 2019-05-31 | 2019-10-22 | OneConnect Smart Technology Co., Ltd. (Shenzhen) | Update method and device, storage medium, the computer equipment of questionnaire survey topic |
CN110569347A (en) * | 2019-09-10 | 2019-12-13 | Mobvoi Information Technology Co., Ltd. | Data processing method and device, storage medium and electronic equipment |
CN110728182A (en) * | 2019-09-06 | 2020-01-24 | Ping An Technology (Shenzhen) Co., Ltd. | Interviewing method and device based on AI interviewing system and computer equipment |
CN111260517A (en) * | 2020-02-23 | 2020-06-09 | Xu Yonggui | Intelligent teaching and management platform system and method for mobile phone |
CN111476178A (en) * | 2020-04-10 | 2020-07-31 | Dalian Maritime University | Micro-expression recognition method based on 2D-3D CNN |
CN111507592A (en) * | 2020-04-08 | 2020-08-07 | Shandong University | Evaluation method for active modification behaviors of prisoners |
CN111887867A (en) * | 2020-07-10 | 2020-11-06 | Hengyang Normal University | Method and system for analyzing character formation based on expression recognition and psychological test |
CN111918133A (en) * | 2020-07-27 | 2020-11-10 | Shenzhen Skyworth-RGB Electronic Co., Ltd. | Method for tutoring and supervising student writing homework, television and storage medium |
CN112613440A (en) * | 2020-12-29 | 2021-04-06 | Beijing SenseTime Technology Development Co., Ltd. | Attitude detection method and apparatus, electronic device and storage medium |
CN112613444A (en) * | 2020-12-29 | 2021-04-06 | Beijing SenseTime Technology Development Co., Ltd. | Behavior detection method and device, electronic equipment and storage medium |
US20210357316A1 (en) * | 2020-05-13 | 2021-11-18 | Data-Core Systems, Inc. | Synthesizing Data based on Topic Modeling for Training and Testing Machine Learning Systems |
CN113837010A (en) * | 2021-08-30 | 2021-12-24 | Zibo Normal College | Education assessment system and method |
CN114129165A (en) * | 2021-12-10 | 2022-03-04 | Beijing University of Posts and Telecommunications | Psychological assessment method, system and storage medium based on credible assessment scale |
Non-Patent Citations (3)
Title |
---|
PABLO BARROS et al.: "A deep neural model of emotion appraisal", arXiv *
TAN Hongye et al.: "Automatic scoring based on representative answer selection and attention mechanism", Journal of Chinese Information Processing *
ZHONG Machi et al.: "Research on online education concentration based on face detection and fuzzy comprehensive evaluation", Computer Science *
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115358605A (en) * | 2022-08-26 | 2022-11-18 | Shandong Xinfa Technology Co., Ltd. | Multi-mode fusion-based career planning auxiliary method, equipment and medium |
CN115358605B (en) * | 2022-08-26 | 2023-05-05 | Shandong Xinfa Technology Co., Ltd. | Professional planning auxiliary method, device and medium based on multi-mode fusion |
Also Published As
Publication number | Publication date |
---|---|
CN114399827B (en) | 2022-08-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108765131B (en) | Micro-expression-based credit auditing method, device, terminal and readable storage medium | |
CN107680019B (en) | Examination scheme implementation method, device, equipment and storage medium | |
US10019653B2 (en) | Method and system for predicting personality traits, capabilities and suggested interactions from images of a person | |
CN112533051B (en) | Barrage information display method, barrage information display device, computer equipment and storage medium | |
CN109948447B (en) | Character network relation discovery and evolution presentation method based on video image recognition | |
CN107203953A (en) | It is a kind of based on internet, Expression Recognition and the tutoring system of speech recognition and its implementation | |
CN111126553A (en) | Intelligent robot interviewing method, equipment, storage medium and device | |
CN105518708A (en) | Method and equipment for verifying living human face, and computer program product | |
CN109766412B (en) | Learning content acquisition method based on image recognition and electronic equipment | |
US20210125149A1 (en) | Adaptability job vacancies matching system and method | |
CN115205764B (en) | Online learning concentration monitoring method, system and medium based on machine vision | |
CN108175425B (en) | Analysis processing device and cognitive index analysis method for potential value test | |
US10997609B1 (en) | Biometric based user identity verification | |
CN114399827B (en) | College graduate career character testing method and system based on facial micro-expression | |
CN113010557A (en) | Method and system for randomly answering questions by using psychological evaluation system scale | |
CN112053205A (en) | Product recommendation method and device through robot emotion recognition | |
Boychuk et al. | An exploratory sentiment and facial expressions analysis of data from photo-sharing on social media: The case of football violence | |
CN110852073A (en) | Language learning system and learning method for customizing learning content for user | |
CN110443122B (en) | Information processing method and related product | |
CN111753168A (en) | Method and device for searching questions, electronic equipment and storage medium | |
CN111046293A (en) | Method and system for recommending content according to evaluation result | |
CN114971425B (en) | Database information monitoring method, device, equipment and storage medium | |
CN115147067A (en) | Intelligent recruiter talent recruitment method based on deep learning | |
CN113920575A (en) | Facial expression recognition method and device and storage medium | |
CN113901418A (en) | Video-based identity authentication method and device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||