CN114399827B - College graduate career character testing method and system based on facial micro-expression - Google Patents


Info

Publication number
CN114399827B
CN114399827B (application CN202210244199.7A)
Authority
CN
China
Prior art keywords: micro-expression, test, acquiring, target object
Prior art date
Legal status
Active
Application number
CN202210244199.7A
Other languages
Chinese (zh)
Other versions
CN114399827A (en)
Inventor
杜福杰
李世元
杜子明
Current Assignee
Weifang Nursing Vocational College
Original Assignee
Weifang Nursing Vocational College
Priority date
Filing date
Publication date
Application filed by Weifang Nursing Vocational College filed Critical Weifang Nursing Vocational College
Priority to CN202210244199.7A
Publication of CN114399827A
Application granted
Publication of CN114399827B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/167 Personality evaluation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00 Administration; Management
    • G06Q10/10 Office automation; Time management
    • G06Q10/105 Human resources
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • G06Q50/20 Education
    • G06Q50/205 Education administration or guidance

Abstract

The invention discloses a facial micro-expression-based career personality testing method and system for college graduates. Frame image information of a target object taking a career personality test is acquired and preprocessed to obtain the target object's facial features; changes in the target user's attention and micro-expressions are detected and recognized from the facial features, and an attention weight and a micro-expression weight are generated from those changes according to a preset judgment standard; the credibility of the target user's choices during the career personality test is judged from the attention weight and the micro-expression weight; and a career personality test report for the target user is generated by combining the credibility with the test results. By combining the target user's facial micro-expressions and attention changes with the career personality test, the invention improves the accuracy of the target object's career personality test.

Description

College graduate career personality testing method and system based on facial micro-expression
Technical Field
The invention relates to the technical field of psychological tests, in particular to a college graduate career personality test method and system based on facial micro-expression.
Background
With the popularization of higher education, the number of college graduates increases year by year, and the difficulty graduates face in finding employment has become a pressing problem for universities. Meanwhile, as psychology has developed, mechanisms for recognizing and discriminating human personality and behavior have matured, and some enterprises have drawn on this research to link human resource management with career personality testing, aiming to find the employees best suited to the enterprise. Universities therefore need to keep pace when providing career guidance to graduates. At present, however, graduate career guidance started late and its system is immature, so it cannot meet students' needs for career guidance; moreover, because of subjective human factors, traditional career personality tests for college graduates cannot truly reflect graduates' career personalities, and career guidance cannot be carried out in a targeted way.
To improve the accuracy of career personality testing for college graduates, a supporting system needs to be developed. The system acquires frame image information of a target object taking the career personality test and preprocesses it to obtain the target object's facial features; detects and recognizes changes in the target user's attention and micro-expressions from the facial features, and generates an attention weight and a micro-expression weight from those changes according to a preset judgment standard; judges the credibility of the target user's choices during the test from the attention weight and the micro-expression weight; and generates a career personality test report for the target user by combining the credibility with the test results. In implementing such a system, how to derive the validity of the choices made during the test from the target user's attention and micro-expression changes is one of the problems that urgently needs to be solved.
Disclosure of Invention
In order to solve the technical problems, the invention provides a college graduate career personality testing method and system based on facial micro-expression.
The invention provides a college graduate career personality testing method based on facial micro-expressions, which comprises the following steps:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
judging the credibility selected by the target user in the process of carrying out the professional character test according to the attention weight and the micro expression weight;
and generating a target user professional character test report by combining the credibility with the result of the professional character test.
In this scheme, the acquiring of frame image information of the target object during the occupational character test, preprocessing the frame image information, and acquiring the facial features of the target object specifically includes:
acquiring an original video of a target object during occupational character test, and decoding the original video to acquire frame image information;
positioning according to the frame image information to acquire facial image data of the target object, acquiring a facial region of interest of the target object from the facial image, and extracting facial region-of-interest features by locating facial key points within the region of interest;
performing identity verification in a college graduate information database according to the facial region-of-interest features, and judging the similarity between those features and the photo information corresponding to the target object in the database;
and if the similarity is greater than a preset similarity threshold, the target object passes identity verification and the occupational character test proceeds.
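The patent does not specify how the similarity between the extracted region-of-interest features and the stored photo information is computed. A minimal sketch, assuming both faces are already encoded as fixed-length feature vectors and using cosine similarity against a preset threshold (the encoding and the 0.8 threshold are illustrative assumptions, not the patent's method):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two face feature vectors."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def verify_identity(probe_features, enrolled_features, threshold=0.8):
    """Pass verification only when similarity exceeds the preset threshold."""
    return cosine_similarity(probe_features, enrolled_features) > threshold
```

With such a setup, identical (or parallel) feature vectors verify successfully, while dissimilar vectors fall below the threshold and are rejected.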
In this scheme, detecting and recognizing the target user's micro-expressions according to the facial features specifically includes:
constructing a micro-expression detection model based on a 3D-CNN, acquiring sample data from the CASME database, extracting the facial features of the sample data, normalizing them, and dividing the sample data into a training set and a verification set;
training the micro expression detection model through the training set, and adjusting relevant parameters of the micro expression detection model through multiple iterative training;
obtaining and calculating the prediction probability of the micro expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the finally predicted emotion of the micro expression detection model;
testing the finally predicted emotion of the micro-expression detection model according to the verification set, and calculating the deviation rate of the finally predicted emotion of the model and the sample data in the verification set;
judging whether the deviation rate is smaller than a preset deviation rate threshold value or not, if so, proving that the precision of the micro-expression detection model meets a preset standard, and outputting the micro-expression detection model;
and acquiring the micro-expression change and the emotion change of the target object in the nth occupational character test answering process according to the facial features of the target object through the micro-expression detection model.
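The Softmax selection and deviation-rate check described above can be sketched in a few lines. The 3D-CNN itself is omitted here; the logits, the class-name ordering, and the deviation-rate definition (fraction of mislabelled verification samples) are assumptions for illustration:

```python
import numpy as np

# CASME's eight emotion classes (the ordering here is an assumption)
EMOTIONS = ["happiness", "sadness", "disgust", "surprise",
            "contempt", "fear", "repression", "tenseness"]

def softmax(logits):
    """Numerically stabilised softmax over the model's raw outputs."""
    z = np.asarray(logits, dtype=float)
    e = np.exp(z - z.max())
    return e / e.sum()

def predict_emotion(logits):
    """The emotion with the largest softmax probability is the final prediction."""
    return EMOTIONS[int(np.argmax(softmax(logits)))]

def deviation_rate(predicted, labels):
    """Fraction of verification samples whose prediction deviates from the label."""
    predicted, labels = np.asarray(predicted), np.asarray(labels)
    return float(np.mean(predicted != labels))
```

The model would be accepted and output once `deviation_rate` on the verification set falls below the preset threshold.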
In this scheme, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring facial contour features of the target object at a target moment from the facial region of interest, and taking the position coordinates of those features as the initial facial contour position coordinates;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
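The patent gives no formula for turning deflection anomalies into an attention score. The sketch below assumes per-frame contour coordinates, counts excursions of the deviation above the threshold along with their total duration, and applies a hypothetical linear penalty (the penalty constants are invented for illustration):

```python
import numpy as np

def deflection_deviation(realtime_coords, initial_coords):
    """Mean Euclidean distance between real-time and initial contour key points."""
    r = np.asarray(realtime_coords, dtype=float)
    i = np.asarray(initial_coords, dtype=float)
    return float(np.mean(np.linalg.norm(r - i, axis=1)))

def anomaly_stats(deviations, threshold, frame_dt):
    """Number of excursions above `threshold` and their total duration (seconds)."""
    above = np.asarray(deviations) > threshold
    # an excursion starts wherever `above` flips from False to True
    count = int(above[0]) + int(np.sum(above[1:] & ~above[:-1]))
    return count, float(above.sum() * frame_dt)

def attention_score(count, duration, count_penalty=5.0, time_penalty=2.0):
    """Hypothetical rule: more and longer anomalies yield a lower score (max 100)."""
    return max(0.0, 100.0 - count_penalty * count - time_penalty * duration)
```

The attention weight for the nth test item would then be derived from this score, e.g. by normalising it into [0, 1].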
In this scheme, obtaining the micro-expression weight according to the micro-expression change of the target user specifically includes:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
and acquiring micro-expression fragment similarity corresponding to the negative emotion through micro-expression change of the target object, and acquiring micro-expression weight according to the similarity.
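As a sketch of this fragment-matching step, assume each observed micro-expression clip is reduced to a feature vector and compared, by cosine similarity, with template vectors for negative-emotion fragments; the template values and the weight formula (the weight shrinks as similarity grows) are illustrative assumptions:

```python
import numpy as np

# Hypothetical template vectors for negative-emotion micro-expression fragments
NEGATIVE_FRAGMENTS = {
    "frown":       [1.0, 0.2, 0.0],
    "pursed_lips": [0.1, 1.0, 0.3],
}

def cosine(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def micro_expression_weight(observed, base=1.0, scale=0.5):
    """Weight decreases as the observed clip resembles any negative fragment."""
    best = max(cosine(observed, t) for t in NEGATIVE_FRAGMENTS.values())
    return base - scale * best
```

A clip closely matching a negative-emotion fragment thus pulls the micro-expression weight down, lowering the credibility of the corresponding answer.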
In this scheme, the generating of the target user career personality test report by combining the results of the credibility and career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
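A minimal sketch of the scoring step above: each question's raw score is weighted by the credibility of that answer, the weighted scores are summed, and the total is compared against scoring bands. The band boundaries and labels are invented placeholders, not the patent's actual scoring standard:

```python
def final_scores(raw_scores, credibilities):
    """Per-question final score = raw answer score weighted by its credibility."""
    return [s * c for s, c in zip(raw_scores, credibilities)]

def personality_report(raw_scores, credibilities, bands):
    """Sum the weighted scores and map the total onto descending (bound, label) bands."""
    total = sum(final_scores(raw_scores, credibilities))
    for bound, label in bands:
        if total >= bound:
            return total, label
    return total, "undetermined"
```

The resulting label would feed the work-environment tendency and development suggestions used for recruitment recommendations.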
The invention also provides a college graduate career personality testing system based on facial micro-expressions, comprising a memory and a processor, wherein the memory stores a facial micro-expression-based college graduate career personality testing program which, when executed by the processor, implements the following steps:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
and generating a target user professional character test report by combining the credibility with the result of the professional character test.
In this scheme, the acquiring of frame image information of the target object during the occupational character test, preprocessing the frame image information, and acquiring the facial features of the target object specifically includes:
acquiring an original video of a target object during occupational character test, and decoding the original video to acquire frame image information;
positioning according to the frame image information to acquire facial image data of the target object, acquiring a facial region of interest of the target object from the facial image, and extracting facial region-of-interest features by locating facial key points within the region of interest;
performing identity verification in a college graduate information database according to the facial region-of-interest features, and judging the similarity between those features and the photo information corresponding to the target object in the database;
and if the similarity is greater than a preset similarity threshold, the target object passes identity verification and the occupational character test proceeds.
In this scheme, detecting and recognizing the target user's micro-expressions according to the facial features specifically includes:
constructing a micro-expression detection model based on a 3D-CNN, acquiring sample data from the CASME database, extracting the facial features of the sample data, normalizing them, and dividing the sample data into a training set and a verification set;
training the micro expression detection model through the training set, and adjusting relevant parameters of the micro expression detection model through multiple iterative training;
obtaining and calculating the prediction probability of the micro expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the finally predicted emotion of the micro expression detection model;
testing the finally predicted emotion of the micro-expression detection model according to the verification set, and calculating the deviation rate of the finally predicted emotion of the model and the sample data in the verification set;
judging whether the deviation rate is smaller than a preset deviation rate threshold value or not, if so, proving that the precision of the micro-expression detection model meets a preset standard, and outputting the micro-expression detection model;
and acquiring the micro-expression change and the emotion change of the target object in the nth occupational character test answering process according to the facial features of the target object through the micro-expression detection model.
In this scheme, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring facial contour features of the target object at a target moment from the facial region of interest, and taking the position coordinates of those features as the initial facial contour position coordinates;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
In this scheme, obtaining the micro-expression weight according to the micro-expression change of the target user specifically includes:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
and acquiring micro-expression fragment similarity corresponding to the negative emotion through micro-expression change of the target object, and acquiring micro-expression weight according to the similarity.
In this scheme, the generating of the target user career personality test report by combining the results of the credibility and career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
The invention discloses a facial micro-expression-based career personality testing method and system for college graduates. Frame image information of a target object taking the career personality test is acquired and preprocessed to obtain the target object's facial features; changes in the target user's attention and micro-expressions are detected and recognized from the facial features, and an attention weight and a micro-expression weight are generated from those changes according to a preset judgment standard; the credibility of the target user's choices during the test is judged from the attention weight and the micro-expression weight; and a career personality test report for the target user is generated by combining the credibility with the test results. By combining the target user's facial micro-expressions and attention changes with the career personality test, the invention improves the accuracy of the target object's career personality test, obtains college graduates' real thoughts through micro-expression detection, and on that basis formulates a reasonable employment guidance scheme and provides targeted employment recommendations for college graduates.
Drawings
FIG. 1 shows a flow chart of the college graduate career personality testing method based on facial micro-expressions of the present invention;
FIG. 2 shows a block diagram of the college graduate career personality testing system based on facial micro-expressions of the present invention.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a more particular description of the invention will be rendered by reference to the appended drawings. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, however, the present invention may be practiced in other ways than those specifically described herein, and therefore the scope of the present invention is not limited by the specific embodiments disclosed below.
FIG. 1 shows a flow chart of the college graduate career personality testing method based on facial micro-expressions of the present invention.
As shown in FIG. 1, a first aspect of the present invention provides a college graduate career personality testing method based on facial micro-expressions, comprising:
s102, obtaining frame image information of a target object during occupational character testing, preprocessing the frame image information, and obtaining facial features of the target object;
s104, detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
s106, judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
and S108, generating a target user career character test report according to the credibility and the result of the career character test.
It should be noted that acquiring the frame image information of the target object during the occupational character test, preprocessing the frame image information, and acquiring the facial features of the target object specifically includes: acquiring an original video of the target object during the test and decoding it to obtain frame image information; locating the target object's facial image data from the frame image information, obtaining the target object's facial region of interest from the facial image, and extracting region-of-interest features by locating facial key points within the region, wherein the region-of-interest features comprise optical-flow features and texture features, the optical-flow features reflecting the direction and amplitude of facial muscle movement in the region of interest; performing identity verification in a college graduate information database according to the region-of-interest features, judging their similarity to the photo information corresponding to the target object in the database; and, if the similarity is greater than a preset similarity threshold, passing identity verification and proceeding with the occupational character test.
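The optical-flow features mentioned above (direction and amplitude of facial muscle movement in the region of interest) can be illustrated with a single-window Lucas-Kanade estimate. This numpy-only sketch assumes grayscale ROI frames and small motion between them, and is deliberately simpler than what a production pipeline (e.g. dense optical flow) would use:

```python
import numpy as np

def roi_optical_flow(prev, curr):
    """Least-squares Lucas-Kanade flow over one ROI window.
    Returns (amplitude in pixels, direction in degrees) of the dominant motion."""
    prev = np.asarray(prev, dtype=float)
    curr = np.asarray(curr, dtype=float)
    ix = np.gradient(prev, axis=1)   # spatial gradient along x
    iy = np.gradient(prev, axis=0)   # spatial gradient along y
    it = curr - prev                 # temporal gradient between the two frames
    a = np.stack([ix.ravel(), iy.ravel()], axis=1)
    b = -it.ravel()
    # solve ix*u + iy*v = -it in the least-squares sense for one (u, v)
    (u, v), *_ = np.linalg.lstsq(a, b, rcond=None)
    return float(np.hypot(u, v)), float(np.degrees(np.arctan2(v, u)))
```

For a pattern shifted horizontally by one pixel, the estimate recovers an amplitude near 1 px and a direction near 0 degrees.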
It should be noted that detecting and recognizing the target user's micro-expressions according to the facial features specifically includes:
constructing a micro-expression detection model based on 3D-CNN, acquiring sample data through a CASME database, extracting facial features of the sample data, normalizing the facial features, generating and dividing the sample data into a training set and a verification set; training the micro expression detection model through the training set, and adjusting relevant parameters of the micro expression detection model through multiple iterative training; obtaining and calculating the prediction probability of the micro expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the finally predicted emotion of the micro expression detection model; testing the finally predicted emotion of the micro-expression detection model according to the verification set, and calculating the deviation rate of the finally predicted emotion of the model and the sample data in the verification set; judging whether the deviation rate is smaller than a preset deviation rate threshold value or not, if so, proving that the precision of the micro-expression detection model meets a preset standard, and outputting the micro-expression detection model; and acquiring the micro-expression change and the emotion change of the target object in the nth occupational character test answering process according to the facial features of the target object through the micro-expression detection model.
The CASME database was captured with a camera at a frame rate of 60 fps and contains 195 micro-expression samples in total. It is divided into parts A and B: class-A samples were collected under natural illumination at a sample image resolution of 1280 × 720, while class-B samples were collected under LED lighting at 640 × 480. Sample labeling in the CASME database was completed jointly by the participants and psychologists and covers eight emotion classes: happiness, sadness, disgust, surprise, contempt, fear, repression, and tenseness.
It should be noted that, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring a face contour feature of a target object at a target moment through a face interesting region, and acquiring a position coordinate of the face contour feature as a face contour initial position coordinate;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
It should be noted that the sight line drop point of the target object can be acquired by building a sight line detection model with a machine learning method such as a support vector machine or a neural network: a facial image of the target object is acquired, feature points of the eye region and the head posture of the target object are extracted, the sight angle and a sight confidence parameter are obtained from the eye-region feature points, and the sight line drop point is determined from the head posture, the sight angle, the sight confidence parameter, and the distance from the target object to the display screen.
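The drop-point determination from head posture, sight angle, and screen distance can be sketched with simple projection geometry. The additive combination of head and eye angles and the centimetre units are assumptions, since the patent gives no formula:

```python
import math

def sight_drop_point(head_yaw_deg, head_pitch_deg,
                     gaze_yaw_deg, gaze_pitch_deg, distance_cm):
    """Project the combined head-plus-eye direction onto the screen plane.
    Returns the (x, y) offset in cm from the point directly ahead of the face."""
    yaw = math.radians(head_yaw_deg + gaze_yaw_deg)
    pitch = math.radians(head_pitch_deg + gaze_pitch_deg)
    return distance_cm * math.tan(yaw), distance_cm * math.tan(pitch)

def within_preset_range(point, half_width_cm, half_height_cm):
    """True when the drop point lies inside the preset on-screen region."""
    x, y = point
    return abs(x) <= half_width_cm and abs(y) <= half_height_cm
```

Counting the frames where `within_preset_range` is False yields the off-screen counts and durations used for the sight line drop point anomaly information.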
A threshold interval is constructed by presetting a plurality of count and duration threshold values. The attention score of the target object in the nth occupational character test answering process is obtained by judging which threshold interval the facial deflection abnormal information and the sight line drop point abnormal information fall into, and applying the scoring standard corresponding to that interval. That is, the more frequent and the longer the facial deflection abnormalities and sight line drop point abnormalities of the target object, the lower the attention score during the test answering process, and the smaller the attention weight in the occupational character test.
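The interval-based scoring can be illustrated as below. The concrete threshold intervals, the per-interval scores, and the averaging rule are assumed placeholders, since the patent leaves the preset scoring standard unspecified.

```python
# Hypothetical threshold intervals and scores; the patent leaves the
# concrete scoring standard to the implementer.
COUNT_BINS = [(0, 1), (2, 4), (5, float("inf"))]               # anomaly-count intervals
DURATION_BINS = [(0.0, 2.0), (2.0, 6.0), (6.0, float("inf"))]  # seconds
INTERVAL_SCORES = [100, 70, 40]  # fewer / shorter anomalies -> higher score

def interval_score(value, bins, scores):
    """Return the score of the first interval the value falls into."""
    for (lo, hi), s in zip(bins, scores):
        if lo <= value <= hi:
            return s
    return scores[-1]

def attention_weight(face_count, face_duration, gaze_count, gaze_duration):
    """Average the interval scores of both anomaly sources and
    normalise to a 0..1 attention weight for test item n."""
    parts = [
        interval_score(face_count, COUNT_BINS, INTERVAL_SCORES),
        interval_score(face_duration, DURATION_BINS, INTERVAL_SCORES),
        interval_score(gaze_count, COUNT_BINS, INTERVAL_SCORES),
        interval_score(gaze_duration, DURATION_BINS, INTERVAL_SCORES),
    ]
    score = sum(parts) / len(parts)
    return score / 100.0
```

With these placeholder bins, a subject with no anomalies gets weight 1.0, while one with many long anomalies drops toward 0.4.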
It should be noted that the obtaining of the micro expression weight according to the micro expression change of the target user specifically includes:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
and acquiring micro-expression fragment similarity corresponding to the negative emotion through micro-expression change of the target object, and acquiring micro-expression weight according to the similarity.
The negative-emotion micro-expression segments mainly comprise frowning, pursed lips, slightly flared nostrils, raised eyebrows and the like. When the micro-expression change of the target object is judged by the micro-expression detection model to be a negative emotion change, the similarity between the micro-expression change of the target object and the micro-expression segment corresponding to the negative emotion in the psychological test is calculated, and the micro-expression weight of the target object in the occupational character test is obtained from the similarity; the similarity calculation can be realized through methods such as Euclidean distance or cosine comparison. The credibility of the occupational character test result is then calculated from the initial credibility together with the attention weight and the micro-expression weight of the target object in the occupational character test.
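The similarity computation and the weight/credibility combination can be sketched as follows, using the cosine-comparison option mentioned above. The mapping from similarity to weight (1 minus the best match) and the multiplicative fusion of credibility are illustrative choices, not mandated by the patent; feature vectors stand in for the micro-expression fragments.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def micro_expression_weight(observed, negative_templates):
    """Higher similarity to a negative-emotion template -> lower weight.

    `observed` and each template are feature vectors of the micro-expression
    change (e.g. optical-flow descriptors); `1 - max similarity` is an
    assumed mapping.
    """
    sim = max(cosine_similarity(observed, t) for t in negative_templates)
    return max(0.0, 1.0 - sim)

def item_credibility(initial_credibility, attention_w, micro_expr_w):
    """Combine initial credibility with the two weights; a simple
    multiplicative fusion is assumed here."""
    return initial_credibility * attention_w * micro_expr_w
```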
It should be noted that the generating of the target user career personality test report by combining the credibility with the result of the career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
According to the embodiment of the invention, the method further comprises prompting the target object about inattention during the occupational character test, which specifically comprises the following steps:
acquiring face deflection abnormal information and sight line drop point abnormal information of a target object in a professional character test, and counting test item quantity information of the target object with the face abnormal information and the sight line drop point abnormal information in the professional character test answering process;
presetting abnormal test item quantity threshold information, and judging whether the test item quantity information is greater than the abnormal test item quantity threshold information;
and if the test item quantity information is greater, sending prompt information to the target object, prompting that the accuracy of the target object's test result is reduced due to the abnormal conditions, and displaying the prompt information in a preset manner.
According to the embodiment of the invention, the invention further comprises the step of detecting the keyword information matched with the micro expression change of the target user through the sight line drop point, which specifically comprises the following steps:
acquiring a timestamp of the micro expression change of the target object in the occupational character test, and acquiring the keyword information of the sight line drop point of the target object according to the timestamp;
matching the keyword information with the micro expression changes to obtain test subject information of the occupational character test items corresponding to the keywords;
obtaining the rest test items of the keywords through the test subject information, and performing semantic replacement on the keywords in the rest test items;
comparing the test results of the rest test items with the test results of the test items containing the micro-expression changes to obtain the deviation between the test results of the test items containing the micro-expression changes and the test results of the rest test items;
and acquiring correction information of the micro-expression weight according to the deviation, and correcting the micro-expression weight according to the correction information.
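A hedged sketch of the correction step — comparing the answer on the item that triggered the micro-expression with the answers on its keyword-replaced variants — might look like this. Numeric Likert-style answers, the linear correction rule, and the `sensitivity` constant are assumptions introduced for illustration.

```python
def corrected_micro_expression_weight(weight, original_answer, replaced_answers,
                                      sensitivity=0.1):
    """Correct the micro-expression weight using answer deviation.

    If the answer on the item that triggered the micro-expression differs
    from the answers on the keyword-replaced variants, the reaction was
    likely tied to the keyword rather than the subject matter, so the
    weight is adjusted accordingly. Answers are numeric Likert-style
    scores; `sensitivity` is an assumed scaling constant.
    """
    mean_replaced = sum(replaced_answers) / len(replaced_answers)
    deviation = abs(original_answer - mean_replaced)
    correction = 1.0 - sensitivity * deviation
    # Clamp the corrected weight to the valid 0..1 range.
    return max(0.0, min(1.0, weight * correction))
```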
FIG. 2 shows a block diagram of a college graduate professional character testing system based on facial microexpression of the present invention.
The second aspect of the present invention also provides a college graduate career character testing system 2 based on facial micro-expression, comprising: a memory 21 and a processor 22, wherein the memory stores a facial-micro-expression-based college graduate career character test method program, and when the processor executes the program, the following steps are implemented:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
and generating a target user professional character test report by combining the credibility with the result of the professional character test.
It should be noted that, the obtaining of the frame image information of the target object during the occupational character test, the preprocessing of the frame image information, and the obtaining of the facial features of the target object specifically include: acquiring an original video of the target object during the occupational character test, and decoding the original video to acquire frame image information; positioning according to the frame image information to acquire facial image data of the target object, acquiring a facial region of interest of the target object according to the facial image data, and extracting facial region-of-interest features by positioning face key points of the facial region of interest, wherein the region-of-interest features comprise optical flow features and texture features, and the optical flow features reflect the direction and amplitude of motion of the facial muscles in the region of interest; identity verification is carried out in a college graduate information database according to the facial region-of-interest features, and similarity judgment is carried out between the facial region-of-interest features and the photo information corresponding to the target object in the college graduate information database; and if the similarity is greater than a preset similarity threshold, the identity verification of the target object is passed, and the occupational character test is carried out.
It should be noted that, the detecting and identifying of the target user micro expression according to the facial features specifically includes:
constructing a micro-expression detection model based on 3D-CNN, acquiring sample data through a CASME database, extracting facial features of the sample data, normalizing the facial features, generating and dividing the sample data into a training set and a verification set; training the micro expression detection model through the training set, and adjusting relevant parameters of the micro expression detection model through multiple iterative training; obtaining and calculating the prediction probability of the micro expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the finally predicted emotion of the micro expression detection model; testing the finally predicted emotion of the micro-expression detection model according to the verification set, and calculating the deviation rate of the finally predicted emotion of the model and the sample data in the verification set; judging whether the deviation rate is smaller than a preset deviation rate threshold value or not, if so, proving that the precision of the micro-expression detection model meets a preset standard, and outputting the micro-expression detection model; and acquiring the micro-expression change and the emotion change of the target object in the nth occupational character test answering process according to the facial features of the target object through the micro-expression detection model.
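The full 3D-CNN is framework-dependent and omitted here, but the Softmax prediction step and the deviation-rate acceptance check described above can be sketched as follows. The eight emotion labels follow the CASME categories listed below; the deviation-rate threshold of 0.15 is an assumed placeholder.

```python
import numpy as np

EMOTIONS = ["happiness", "sadness", "disgust", "surprise",
            "contempt", "fear", "repression", "tension"]

def softmax(logits):
    """Numerically stable softmax over the model's output logits."""
    z = np.asarray(logits, float) - np.max(logits)
    e = np.exp(z)
    return e / e.sum()

def predict_emotion(logits):
    """Final predicted emotion = class with the highest softmax probability."""
    probs = softmax(logits)
    return EMOTIONS[int(np.argmax(probs))], float(np.max(probs))

def deviation_rate(predictions, labels):
    """Fraction of validation samples the model predicts incorrectly."""
    wrong = sum(p != y for p, y in zip(predictions, labels))
    return wrong / len(labels)

def model_acceptable(predictions, labels, threshold=0.15):
    """Output the model only if the deviation rate is below the preset threshold."""
    return deviation_rate(predictions, labels) < threshold
```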
The CASME database was acquired with a camera at a frame rate of 60 fps and contains 195 micro-expression samples in total. It is divided into part A and part B: class-A samples were collected under natural illumination with a sample image resolution of 1280 × 720, and class-B samples were collected under LED lighting with a sample image resolution of 640 × 480. Sample labeling in the CASME database was completed jointly by the participants and psychologists, and covers eight emotion categories: happiness, sadness, disgust, surprise, contempt, fear, repression, and tension.
It should be noted that, obtaining the attention weight according to the attention change of the target user specifically includes:
acquiring a face contour feature of a target object at a target moment through a facial region of interest, and acquiring the position coordinates of the face contour feature as the face contour initial position coordinates;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
and acquiring the attention weight of the nth occupational character test according to the attention score.
It should be noted that the acquisition of the sight line drop point of the target object can be realized by establishing a sight line detection model through a machine learning method such as a support vector machine or a neural network, acquiring a face image of the target object, and extracting feature points of the eye region of the target object and the head posture of the target object; and acquiring a sight angle and a sight confidence parameter according to the characteristic points of the eye region, and determining the sight falling point position of the target object according to the head posture, the sight angle, the sight confidence parameter and the distance from the target object to the display screen.
A threshold interval is established by presetting a plurality of count and duration threshold values. The attention score of the target object in the answering process of the nth occupational character test is obtained by judging which threshold interval the face deflection abnormal information and the sight line drop point abnormal information fall into, and applying the scoring standard corresponding to that interval. That is, the more frequent and the longer the face deflection abnormalities and sight line drop point abnormalities of the target object, the lower the attention score during the test answering process, and the smaller the attention weight in the occupational character test.
It should be noted that the obtaining of the micro expression weight according to the micro expression change of the target user specifically includes:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
and acquiring micro-expression fragment similarity corresponding to the negative emotion through micro-expression change of the target object, and acquiring micro-expression weight according to the similarity.
The negative-emotion micro-expression segments mainly comprise frowning, pursed lips, slightly flared nostrils, raised eyebrows and the like. When the micro-expression change of the target object is judged by the micro-expression detection model to be a negative emotion change, the similarity between the micro-expression change of the target object and the micro-expression segment corresponding to the negative emotion in the psychological test is calculated, and the micro-expression weight of the target object in the occupational character test is obtained from the similarity; the similarity calculation can be realized through methods such as Euclidean distance or cosine comparison. The credibility of the occupational character test result is then calculated from the initial credibility together with the attention weight and the micro-expression weight of the target object in the occupational character test.
It should be noted that the generating of the target user career personality test report by combining the credibility with the result of the career personality test specifically includes:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
and connecting the college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion.
According to the embodiment of the invention, the system further comprises prompting the target object about inattention during the occupational character test, which specifically comprises the following steps:
acquiring face deflection abnormal information and sight line drop point abnormal information of a target object in a professional character test, and counting test item quantity information of the target object with the face abnormal information and the sight line drop point abnormal information in the professional character test answering process;
presetting abnormal test item quantity threshold information, and judging whether the test item quantity information is greater than the abnormal test item quantity threshold information;
if the value is larger than the preset value, sending prompt information to the target object, prompting that the accuracy of the test result of the target object is reduced due to abnormal conditions, and displaying the prompt information in a preset manner.
According to the embodiment of the invention, the invention further comprises the step of detecting the keyword information matched with the micro expression change of the target user through the sight line drop point, which specifically comprises the following steps:
acquiring a timestamp of the micro expression change of the target object in the occupational character test, and acquiring the keyword information of the sight line drop point of the target object according to the timestamp;
matching the keyword information with the micro expression changes to obtain test subject information of the occupational character test items corresponding to the keywords;
obtaining the rest test items of the keywords through the test subject information, and performing semantic replacement on the keywords in the rest test items;
comparing the test results of the rest test items with the test results of the test items containing the micro-expression changes to obtain the deviation between the test results of the test items containing the micro-expression changes and the test results of the rest test items;
and acquiring correction information of the micro-expression weight according to the deviation, and correcting the micro-expression weight according to the correction information.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The above-described device embodiments are merely illustrative, for example, the division of the unit is only a logical functional division, and there may be other division ways in actual implementation, such as: multiple units or components may be combined, or may be integrated into another system, or some features may be omitted, or not implemented. In addition, the coupling, direct coupling or communication connection between the components shown or discussed may be through some interfaces, and the indirect coupling or communication connection between the devices or units may be electrical, mechanical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units; can be located in one place or distributed on a plurality of network units; some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, all the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may be separately regarded as one unit, or two or more units may be integrated into one unit; the integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
Those of ordinary skill in the art will understand that: all or part of the steps for realizing the method embodiments can be completed by hardware related to program instructions, the program can be stored in a computer readable storage medium, and the program executes the steps comprising the method embodiments when executed; and the aforementioned storage medium includes: a mobile storage device, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
Alternatively, the integrated unit of the present invention may be stored in a computer-readable storage medium if it is implemented in the form of a software functional module and sold or used as a separate product. Based on such understanding, the technical solutions of the embodiments of the present invention may be essentially implemented or a part contributing to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the methods described in the embodiments of the present invention. And the aforementioned storage medium includes: a removable storage device, a ROM, a RAM, a magnetic or optical disk, or various other media that can store program code.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the appended claims.

Claims (4)

1. A method for providing recruitment information recommendations for a target user, comprising the steps of:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
generating a target user career character test report by combining the credibility with the result of the career character test;
acquiring an attention weight according to the attention change of the target user, specifically:
acquiring a face contour feature of a target object at a target moment through a face interesting region, and acquiring a position coordinate of the face contour feature as a face contour initial position coordinate;
judging the facial deflection change of the target object in the nth occupational character test answering process, comparing the real-time position coordinates of the facial contour with the initial position coordinates of the facial contour to generate facial deflection deviation;
acquiring the times that the facial deflection deviation is greater than a deviation threshold value and the duration time that the facial deflection deviation is greater than the deviation threshold value in the answering process, and generating facial deflection abnormal information;
meanwhile, the sight line drop point of the target object is obtained, the times and duration that the sight line drop point of the target object is not in the preset range are judged, and sight line drop point abnormal information is generated;
according to the facial deflection abnormal information and the sight line drop point abnormal information, the attention score of the target object in the nth occupational character test answering process is evaluated through a preset scoring standard;
acquiring the attention weight of the nth occupational character test according to the attention score;
acquiring micro expression weight according to micro expression change of a target user, which specifically comprises the following steps:
acquiring the micro-expression change of a target object in the n-th occupational character test answering process, acquiring the association information of the expressions and the negative emotions in the psychological test through big data, and extracting micro-expression fragments corresponding to the negative emotions according to the association information;
acquiring micro-expression fragment similarity corresponding to negative emotions through micro-expression change of a target object, and acquiring micro-expression weight according to the similarity;
the step of generating a target user career personality test report by combining the credibility with the result of the career personality test specifically comprises the following steps:
matching the selection result of the target user in each occupational character test with the credibility to generate a final score of each test;
summarizing the final scores of each test, and comparing and judging the final scores with occupational character test scoring standards to generate an occupational character test report of the target object;
acquiring working environment tendency and development suggestion in the occupational character test report, and acquiring a recommended occupational type of a target object according to the working environment tendency;
connecting a college employment platform database, and providing recruitment information recommendation for the target user according to the recommended occupation type and the development suggestion;
the method comprises the following steps of detecting keyword information matched with the micro expression change of a target user through sight falling points, specifically:
acquiring a timestamp of the micro expression change of a target object in the occupational character test, and acquiring keyword information of the target object's sight line drop point according to the timestamp;
matching the keyword information with the micro expression changes to obtain test subject information of the occupational character test items corresponding to the keywords;
obtaining the rest test items of the keywords through the test subject information, and performing semantic replacement on the keywords in the rest test items;
comparing the test results of the rest test items with the test results of the test items containing the micro-expression changes to obtain the deviation between the test results of the test items containing the micro-expression changes and the test results of the rest test items;
and acquiring correction information of the micro-expression weight according to the deviation, and correcting the micro-expression weight according to the correction information.
2. The method for providing recruitment information recommendation for a target user according to claim 1, wherein the obtaining of the frame image information of the target object during the occupational character test, the preprocessing of the frame image information, and the obtaining of the facial features of the target object are specifically:
acquiring an original video of a target object during occupational character test, and decoding the original video to acquire frame image information;
positioning according to the frame image information to acquire facial image data of a target object, acquiring a facial interesting region of the target object according to the facial image data, and extracting facial interesting region characteristics through positioning face key points of the facial interesting region;
identity verification is carried out in a college graduate information database according to the facial interesting region characteristics, and similarity judgment is carried out on the facial interesting characteristics and photo information corresponding to a target object in the college graduate information database;
and if the similarity is greater than a preset similarity threshold, the identity verification of the target object is passed, and the professional character test is carried out.
3. The method for providing recruitment information recommendation for a target user according to claim 1, wherein the detection and identification of the target user micro expression is performed according to the facial features, specifically:
constructing a micro-expression detection model based on 3D-CNN, acquiring sample data through a CASME database, extracting facial features of the sample data, normalizing the facial features, generating and dividing the sample data into a training set and a verification set;
training the micro expression detection model through the training set, and adjusting relevant parameters of the micro expression detection model through multiple iterative training;
obtaining and calculating the prediction probability of the micro expression through a Softmax layer, and taking the emotion with the maximum prediction probability as the finally predicted emotion of the micro expression detection model;
testing the finally predicted emotion of the micro-expression detection model according to the verification set, and calculating the deviation rate of the finally predicted emotion of the model and the sample data in the verification set;
judging whether the deviation rate is smaller than a preset deviation rate threshold value or not, if so, proving that the precision of the micro-expression detection model meets a preset standard, and outputting the micro-expression detection model;
and acquiring the micro-expression change and the emotion change of the target object in the nth occupational character test answering process according to the facial features of the target object through the micro-expression detection model.
4. A system for providing recruitment information recommendations for a target user, the system comprising: the recruitment information recommendation system comprises a memory and a processor, wherein the memory comprises a method program for providing recruitment information recommendation for a target user, and the method program for providing the recruitment information recommendation for the target user realizes the following steps when being executed by the processor:
acquiring frame image information of a target object during occupational character testing, preprocessing the frame image information, and acquiring facial features of the target object;
detecting and identifying the attention and the micro expression change of the target user according to the facial features, and generating an attention weight and a micro expression weight according to a preset judgment standard through the attention and the micro expression change;
judging the credibility selected by the target user in the process of performing occupational character testing according to the attention weight and the micro-expression weight;
generating a target user career character test report by combining the credibility with the result of the career character test;
wherein acquiring the attention weight according to the attention change of the target user specifically comprises:
acquiring a facial contour feature of the target object at a target moment from the facial region of interest, and taking its position coordinates as the initial facial-contour position coordinates;
judging the facial deflection of the target object during the nth occupational character test by comparing the real-time facial-contour position coordinates with the initial facial-contour position coordinates to generate a facial deflection deviation;
acquiring the number of times the facial deflection deviation exceeds a deviation threshold during answering, together with the duration for which it exceeds the threshold, and generating facial-deflection abnormality information;
meanwhile, acquiring the gaze point of the target object, judging the number of times and the duration for which the gaze point falls outside a preset range, and generating gaze-point abnormality information;
evaluating, according to the facial-deflection abnormality information and the gaze-point abnormality information, the attention score of the target object during the nth occupational character test against a preset scoring standard;
and acquiring the attention weight of the nth occupational character test from the attention score;
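The attention sub-block above counts deflection and gaze anomalies (with their durations) and maps them to a score, then to a weight, but the preset scoring standard is not given. A sketch under assumed penalty constants (`base`, `per_event`, `per_second` are all illustrative):

```python
def attention_score(deflection_durations, gaze_durations,
                    base=100.0, per_event=5.0, per_second=1.0):
    """Score one answering session from the durations (in seconds) of
    facial-deflection and gaze-point anomalies. Each anomaly event
    costs a fixed penalty plus a per-second penalty; the constants
    stand in for the patent's unspecified preset scoring standard."""
    events = deflection_durations + gaze_durations
    penalty = per_event * len(events) + per_second * sum(events)
    return max(0.0, base - penalty)

def attention_weight(score, full_score=100.0):
    """Map the attention score onto a weight in [0, 1]."""
    return score / full_score
```

Counting events and integrating their durations matches the claim's "number of times and duration" wording; only the numeric mapping is assumed.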
wherein acquiring the micro-expression weight according to the micro-expression change of the target user specifically comprises:
acquiring the micro-expression changes of the target object during the nth occupational character test, acquiring, through big data, the association information between expressions and negative emotions in psychological tests, and extracting the micro-expression segments corresponding to negative emotions according to the association information;
acquiring the similarity between the target object's micro-expression changes and the micro-expression segments corresponding to negative emotions, and acquiring the micro-expression weight from the similarity;
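The patent states only that the micro-expression weight follows from the similarity between the subject's clips and reference clips for negative emotions. A sketch, assuming that a stronger best match to a negative-emotion segment lowers the weight (the threshold and the linear mapping are assumptions):

```python
def micro_expression_weight(similarities, threshold=0.7):
    """Given similarity scores in [0, 1] between the subject's
    micro-expression clips and negative-emotion reference segments,
    keep full weight while the best match stays below a threshold and
    reduce the weight linearly above it. The mapping is a sketch; the
    patent does not define the similarity-to-weight rule."""
    if not similarities:
        return 1.0
    best = max(similarities)
    return 1.0 if best < threshold else 1.0 - best
```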
wherein generating the occupational character test report for the target user by combining the credibility with the result of the occupational character test specifically comprises:
matching the target user's selection in each occupational character test item with the credibility to generate a final score for each test;
summarizing the final scores of all tests and comparing them against the occupational character test scoring standard to generate the occupational character test report of the target object;
acquiring the work-environment tendency and development suggestions from the occupational character test report, and acquiring a recommended occupation type for the target object according to the work-environment tendency;
and connecting to a college employment platform database to provide recruitment information recommendations for the target user according to the recommended occupation type and the development suggestions;
wherein detecting the keyword information matched with the micro-expression change of the target user through the gaze point specifically comprises:
acquiring the timestamp of the target object's micro-expression change during the career planning test, and acquiring the keyword information at the target object's gaze point according to the timestamp;
matching the keyword information with the micro-expression change to obtain the test-subject information of the occupational character test item corresponding to the keyword;
obtaining, from the test-subject information, the remaining test items that contain the keyword, and performing semantic replacement on the keyword in those remaining items;
comparing the test results of the remaining items with the test result of the item in which the micro-expression change occurred, to obtain the deviation between the two;
and acquiring correction information for the micro-expression weight according to the deviation, and correcting the micro-expression weight according to the correction information.
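The correction sub-block compares the answer to the flagged item (where a micro-expression fired on a keyword) with answers to paraphrased variants of the same item, and converts the deviation into a correction of the micro-expression weight. The patent gives no formula; a sketch under an assumed linear deviation-to-correction mapping (`scale` and the additive correction are hypothetical):

```python
def correction_factor(flagged_result, paraphrased_results, scale=0.1):
    """Deviation between the flagged item's result and the mean result
    of its semantically paraphrased variants, scaled into a correction
    in [0, 1]. A small deviation suggests the answer was consistent
    despite the micro-expression; the linear form is an assumption."""
    mean_rest = sum(paraphrased_results) / len(paraphrased_results)
    deviation = abs(flagged_result - mean_rest)
    return min(1.0, scale * deviation)

def corrected_weight(weight, correction):
    """Apply the correction to the micro-expression weight, capped at 1.
    Adding the correction (rather than, say, multiplying) is assumed."""
    return min(1.0, weight + correction)
```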
CN202210244199.7A 2022-03-14 2022-03-14 College graduate career character testing method and system based on facial micro-expression Active CN114399827B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210244199.7A CN114399827B (en) 2022-03-14 2022-03-14 College graduate career character testing method and system based on facial micro-expression

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210244199.7A CN114399827B (en) 2022-03-14 2022-03-14 College graduate career character testing method and system based on facial micro-expression

Publications (2)

Publication Number Publication Date
CN114399827A CN114399827A (en) 2022-04-26
CN114399827B (en) 2022-08-09

Family

ID=81234014

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210244199.7A Active CN114399827B (en) 2022-03-14 2022-03-14 College graduate career character testing method and system based on facial micro-expression

Country Status (1)

Country Link
CN (1) CN114399827B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115358605B (en) * 2022-08-26 2023-05-05 山东心法科技有限公司 Professional planning auxiliary method, device and medium based on multi-mode fusion

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010072203A (en) * 2008-09-17 2010-04-02 Fuji Xerox Co Ltd Problem creating device, problem creating program, and learning system
TW201501044A (en) * 2013-06-24 2015-01-01 Utechzone Co Ltd Apparatus, method and computer readable recording medium of generating signal by detecting facial action
CN108304745A (en) * 2017-01-10 2018-07-20 普天信息技术有限公司 A kind of driver's driving behavior detection method, device
CN109805943A (en) * 2017-11-20 2019-05-28 徐熠 A kind of intelligent evaluating system and method based on micro- Expression Recognition
CN109222888B (en) * 2018-11-05 2021-03-23 温州职业技术学院 Method for judging reliability of psychological test based on eye movement technology
CN109872026A (en) * 2018-12-14 2019-06-11 深圳壹账通智能科技有限公司 Evaluation result generation method, device, equipment and computer readable storage medium
CN109767321A (en) * 2018-12-18 2019-05-17 深圳壹账通智能科技有限公司 Question answering process optimization method, device, computer equipment and storage medium
CN109977903B (en) * 2019-04-03 2020-03-17 珠海读书郎网络教育有限公司 Method and device for intelligent classroom student management and computer storage medium
CN110362648A (en) * 2019-05-31 2019-10-22 深圳壹账通智能科技有限公司 Update method and device, storage medium, the computer equipment of questionnaire survey topic
CN110728182B (en) * 2019-09-06 2023-12-26 平安科技(深圳)有限公司 Interview method and device based on AI interview system and computer equipment
CN110569347A (en) * 2019-09-10 2019-12-13 出门问问信息科技有限公司 Data processing method and device, storage medium and electronic equipment
CN111260517A (en) * 2020-02-23 2020-06-09 徐永贵 Intelligent teaching and management platform system and method for mobile phone
CN111507592B (en) * 2020-04-08 2022-03-15 山东大学 Evaluation method for active modification behaviors of prisoners
CN111476178A (en) * 2020-04-10 2020-07-31 大连海事大学 Micro-expression recognition method based on 2D-3D CNN
US11681610B2 (en) * 2020-05-13 2023-06-20 Data-Core Systems, Inc. Synthesizing data based on topic modeling for training and testing machine learning systems
CN111887867A (en) * 2020-07-10 2020-11-06 衡阳师范学院 Method and system for analyzing character formation based on expression recognition and psychological test
CN111918133A (en) * 2020-07-27 2020-11-10 深圳创维-Rgb电子有限公司 Method for tutoring and supervising student writing homework, television and storage medium
CN112613444A (en) * 2020-12-29 2021-04-06 北京市商汤科技开发有限公司 Behavior detection method and device, electronic equipment and storage medium
CN112613440A (en) * 2020-12-29 2021-04-06 北京市商汤科技开发有限公司 Attitude detection method and apparatus, electronic device and storage medium
CN113837010A (en) * 2021-08-30 2021-12-24 淄博师范高等专科学校 Education assessment system and method
CN114129165A (en) * 2021-12-10 2022-03-04 北京邮电大学 Psychological assessment method, system and storage medium based on credible assessment scale

Also Published As

Publication number Publication date
CN114399827A (en) 2022-04-26

Similar Documents

Publication Publication Date Title
CN107680019B (en) Examination scheme implementation method, device, equipment and storage medium
CN108765131B (en) Micro-expression-based credit auditing method, device, terminal and readable storage medium
US10019653B2 (en) Method and system for predicting personality traits, capabilities and suggested interactions from images of a person
US7860347B2 (en) Image-based face search
CN112533051B (en) Barrage information display method, barrage information display device, computer equipment and storage medium
CN107203953A (en) It is a kind of based on internet, Expression Recognition and the tutoring system of speech recognition and its implementation
CN111126553A (en) Intelligent robot interviewing method, equipment, storage medium and device
CN109766412B (en) Learning content acquisition method based on image recognition and electronic equipment
CN105518708A (en) Method and equipment for verifying living human face, and computer program product
CN115205764B (en) Online learning concentration monitoring method, system and medium based on machine vision
CN113903469A (en) Psychological assessment method, device, electronic device and medium based on artificial intelligence
US10997609B1 (en) Biometric based user identity verification
CN114399827B (en) College graduate career character testing method and system based on facial micro-expression
CN112132030A (en) Video processing method and device, storage medium and electronic equipment
CN113010557A (en) Method and system for randomly answering questions by using psychological evaluation system scale
CN111581623A (en) Intelligent data interaction method and device, electronic equipment and storage medium
Ceneda et al. Show me your face: Towards an automated method to provide timely guidance in visual analytics
Boychuk et al. An exploratory sentiment and facial expressions analysis of data from photo-sharing on social media: The case of football violence
CN110852073A (en) Language learning system and learning method for customizing learning content for user
CN114943549A (en) Advertisement delivery method and device
CN111753168A (en) Method and device for searching questions, electronic equipment and storage medium
CN110443122B (en) Information processing method and related product
CN112053205A (en) Product recommendation method and device through robot emotion recognition
CN114971425A (en) Database information monitoring method, device, equipment and storage medium
CN115147067A (en) Intelligent recruiter talent recruitment method based on deep learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant