CN109919001A - Customer service monitoring method, device, equipment and storage medium based on Emotion identification - Google Patents
- Publication number
- CN109919001A CN109919001A CN201910061884.4A CN201910061884A CN109919001A CN 109919001 A CN109919001 A CN 109919001A CN 201910061884 A CN201910061884 A CN 201910061884A CN 109919001 A CN109919001 A CN 109919001A
- Authority
- CN
- China
- Prior art keywords
- face image
- expression
- recognition result
- customer service
- emotion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Abstract
This application relates to the technical field of micro-expression recognition, and in particular to a customer service monitoring method, device, equipment and storage medium based on emotion recognition, comprising: acquiring a face image to be detected and preprocessing the face image; inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing, to obtain an expression recognition result of the face image; calculating, according to the expression recognition result of the face image, a preset correspondence between expression and emotion, and a calculation rule, an emotion recognition result corresponding to the face image; and sending a prompt message corresponding to the emotion recognition result to the customer service staff. By monitoring the emotion of customer service staff in real time, the application provides methods for relieving emotion, improves the service quality of customer service staff, and increases user satisfaction.
Description
Technical Field
The application relates to the technical field of micro-expression recognition, in particular to a customer service monitoring method, a customer service monitoring device, customer service monitoring equipment and a storage medium based on emotion recognition.
Background
Customer service mainly embodies a value orientation toward customer satisfaction; in a broad sense, anything that can improve customer satisfaction falls within the scope of customer service. Customer service can be divided into manual customer service and electronic customer service, and manual customer service can further be divided into three types: text customer service, video customer service and voice customer service. Text customer service is conducted mainly through typed chat; video customer service is conducted mainly through voice and video calls; and voice customer service is conducted mainly through telephone calls.
At present, customer service businesses are common; however, during the customer service process there are essentially no measures for monitoring the emotion of customer service staff, and managers evaluate the working attitude of a customer service staff member only through satisfaction surveys and scoring mechanisms. Mood fluctuations of customer service staff during chats cannot be detected in time and remain unknown to management, which greatly reduces customers' satisfaction with the service.
Disclosure of Invention
In view of this, it is necessary to provide a customer service monitoring method, apparatus, device and storage medium based on emotion recognition, to address the prior-art problem that there are no measures for monitoring the emotion of customer service staff, and that managers evaluate the working attitude of customer service staff only through satisfaction surveys and scoring mechanisms.
A customer service monitoring method based on emotion recognition comprises the following steps:
acquiring a face image to be detected, and preprocessing the face image;
inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image;
calculating according to the expression recognition result of the face image and a preset corresponding relation between the expression and the emotion and a calculation rule to obtain an emotion recognition result corresponding to the face image;
and sending a prompt message corresponding to the emotion recognition result to the customer service staff according to the emotion recognition result.
In one possible embodiment, the acquiring a face image to be detected and preprocessing the face image includes:
acquiring video data of a person to be detected through monitoring equipment, and extracting an image containing the face of the person to be detected from the video data to obtain a face image;
carrying out gray level conversion on the face image, and then correcting the length and width of the face image after the gray level conversion to adjust the length and width of the face image to a preset size;
and then segmenting the face image according to a preset face part rule, and marking the segmented parts.
In one possible embodiment, the inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image includes:
inputting the segmented and marked face image into a trained expression classification model based on a convolutional neural network;
calculating the marked parts one by one, and synthesizing the calculation results of all the parts to obtain the score of the face image under each expression in the expression classification model based on the convolutional neural network;
and sorting the scores of each type of expression of the face image from high to low, and taking the top-ranked expressions and their corresponding scores as the expression recognition result of the face image.
In one possible embodiment, the obtaining, according to the expression recognition result of the face image, an emotion recognition result corresponding to the face image by calculation according to a preset correspondence between an expression and an emotion and a calculation rule includes:
inputting the expression recognition result into an emotion recognition model, and respectively calculating the scores of all emotions corresponding to each type of expression in the expression recognition result;
and summing the scores of the same emotion, wherein the emotion with the highest score is the emotion recognition result corresponding to the face image.
In one possible embodiment, the sending, to the customer service person, a prompt message corresponding to the emotion recognition result according to the emotion recognition result includes:
acquiring the emotion recognition result;
calling a corresponding relation table of a preset emotion recognition result and a prompt message;
in the comparison relation table, carrying out keyword query on the emotion recognition result to obtain a prompt message corresponding to the emotion recognition result;
and sending the prompt message to customer service staff.
In one possible embodiment, the inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image further includes a process of training the expression classification model based on the convolutional neural network, which specifically includes:
the method comprises the steps of collecting a plurality of public face images containing various expressions, adjusting the sizes of the face images to preset sizes, converting the face images into gray level images, and converting each gray level image into a pixel matrix X = [x_ij]_{M×N}, wherein x_ij represents the pixel value in the ith row and jth column of the image, M is the height of the image in pixels, and N is the width of the image in pixels;
the pixel matrix of every gray level image is subjected to mean-removal processing, with the calculation formula: x'_ij = x_ij − (1/(M·N)) · Σ_{i=1}^{M} Σ_{j=1}^{N} x_ij;
and inputting the face image subjected to mean value removing processing into a convolutional neural network model for training to obtain the expression classification model based on the convolutional neural network.
In one possible embodiment, the inputting the face image after the mean value removing process into a convolutional neural network model for training to obtain an expression classification model based on a convolutional neural network includes:
performing convolution calculation on the pixel matrix of the gray level image input to the convolution layer, with the calculation formula:

x_j^l = f( Σ_{i=1}^{N_in} x_i^{l−1} * k_ij^l + b_j^l )

wherein l is the index of the network layer, j is the index of the output feature map, i is the index of the input feature map, N_in is the number of input feature maps, k_ij^l is the convolution kernel applied to the ith input feature map at layer l, b_j^l is an offset, and f is the activation function.
A customer service monitoring device based on emotion recognition comprises the following modules:
the image acquisition module is used for acquiring a face image to be detected and preprocessing the face image;
the expression recognition module is used for inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image;
the emotion recognition module is used for calculating to obtain an emotion recognition result corresponding to the face image according to an expression recognition result of the face image and a preset corresponding relation between the expression and the emotion and a calculation rule;
and the result processing module is used for sending a prompt message corresponding to the emotion recognition result to the customer service staff according to the emotion recognition result.
Based on the same concept, the present application proposes a computer device comprising a memory and a processor, the memory having stored therein computer-readable instructions, which, when executed by one or more of the processors, cause the one or more processors to perform the steps of the above-mentioned customer service monitoring method based on emotion recognition.
Based on the same concept, the present application proposes a storage medium readable and writable by a processor, the storage medium storing computer readable instructions, which, when executed by one or more processors, cause the one or more processors to perform the steps of the above-mentioned customer service monitoring method based on emotion recognition.
The customer service monitoring method, device, equipment and storage medium based on emotion recognition comprise the following steps: acquiring a face image to be detected, and preprocessing the face image; inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing, to obtain an expression recognition result of the face image; calculating, according to the expression recognition result of the face image, a preset correspondence between expression and emotion, and a calculation rule, an emotion recognition result corresponding to the face image; and sending a prompt message corresponding to the emotion recognition result to the customer service staff. This technical scheme can monitor the emotion of customer service staff in real time and provide methods for relieving emotion, which improves the service quality of the customer service staff and increases user satisfaction; it also lets the management level pay attention to the emotional changes of customer service staff in real time, which provides a certain auxiliary effect for management work.
Drawings
Various other advantages and benefits will become apparent to those of ordinary skill in the art upon reading the following detailed description of the preferred embodiments. The drawings are only for purposes of illustrating the preferred embodiments and are not to be construed as limiting the invention.
FIG. 1 is a general flow diagram of a method for monitoring customer service based on emotion recognition in one embodiment of the present application;
FIG. 2 is a schematic diagram illustrating an expression recognition process in a customer service monitoring method based on emotion recognition according to an embodiment of the present application;
FIG. 3 is a diagram illustrating an emotion recognition process in a method for monitoring customer service based on emotion recognition according to an embodiment of the present application;
fig. 4 is a block diagram of a customer service monitoring device based on emotion recognition in one embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
Fig. 1 is an overall flowchart of a customer service monitoring method based on emotion recognition according to the present invention, and as shown in the figure, the customer service monitoring method based on emotion recognition includes the following steps:
and step S1, acquiring a face image to be detected, and preprocessing the face image.
When this step is executed, the face image to be detected can be obtained by acquiring video data of the customer service staff through monitoring equipment and extracting, in real time, images containing the face of the person to be detected from the video data. Alternatively, images of the customer service staff during the customer service process can be captured in real time through camera equipment and the face image extracted from them, or the face image of the customer service staff can be captured directly by the camera equipment. Specifically, the monitoring equipment and camera equipment may be placed in front of the customer service staff member (directly in front, obliquely in front, above or below the front, and so on), for example on the computer device used by the customer service staff.
The preprocessing of the face image includes processing the image color, adjusting the image size, and image rotation, flipping and noise reduction. Processing the image color specifically means carrying out gray level conversion on the face image so that it becomes a gray level image. Adjusting the image size means correcting the length and width of the face image so that they are adjusted to a preset size. The preset size is set according to historical data so that subsequent processing achieves a better effect.
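The color and size preprocessing described above can be sketched in a few lines; the 48×48 target size, the BT.601 luma weights, and the nearest-neighbour resize are illustrative assumptions standing in for whatever the embodiment actually uses.

```python
import numpy as np

TARGET_H, TARGET_W = 48, 48  # assumed preset size

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an HxWx3 RGB image to grayscale via ITU-R BT.601 luma weights."""
    return rgb[..., 0] * 0.299 + rgb[..., 1] * 0.587 + rgb[..., 2] * 0.114

def resize_nearest(img: np.ndarray, h: int, w: int) -> np.ndarray:
    """Minimal nearest-neighbour resize standing in for a real library call."""
    rows = np.arange(h) * img.shape[0] // h
    cols = np.arange(w) * img.shape[1] // w
    return img[rows][:, cols]

def preprocess(rgb: np.ndarray) -> np.ndarray:
    """Gray level conversion followed by correction to the preset size."""
    return resize_nearest(to_grayscale(rgb), TARGET_H, TARGET_W)
```

In practice a library such as OpenCV or Pillow would perform the resize; the sketch only shows the order of the two operations.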
The preprocessing of the face image also includes segmenting the face image according to a preset face part rule and marking the segmented parts. The preset face part rule is a preset segmentation rule, for example segmenting the eyes, lips, mandible and other parts of the face; the segmented parts are then labelled, for example the segmented eyes are labelled a, the lips are labelled b, and the mandible is labelled c.
And step S2, inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image.
When this step is executed, the face image, having undergone color processing, resizing, rotation, flipping, noise reduction, segmentation and marking, is fed into the trained expression classification model based on a convolutional neural network, which outputs the expression recognition result of the face image. The processing inside the trained expression classification model includes the following steps: calculating the marked parts one by one and synthesizing the calculation results of all parts to obtain the score of the face image under each expression in the model; then sorting the scores of each type of expression from high to low and taking the top-ranked expressions and their corresponding scores as the expression recognition result of the face image.
And step S3, calculating according to the expression recognition result of the face image and the corresponding relation between the preset expression and emotion and the calculation rule to obtain the emotion recognition result corresponding to the face image.
When the steps are executed, after the expression recognition result is output, the expression recognition result is input into an emotion recognition model, and scores of all emotions corresponding to each category of expression in the expression recognition result are respectively calculated; and summing the scores of the same emotion, wherein the emotion with the highest score is the emotion recognition result corresponding to the face image.
And step S4, sending a prompt message corresponding to the emotion recognition result to the customer service staff according to the emotion recognition result.
When the steps are executed, firstly, the emotion recognition result is obtained, then a preset corresponding relation table of the emotion recognition result and the prompt message is called, keyword query is carried out on the emotion recognition result in the comparison relation table, the prompt message corresponding to the emotion recognition result is obtained, and then the prompt message is sent to customer service staff.
The prompt messages include, according to the different emotions, messages related to care, messages related to praise, messages related to encouragement, and the like.
Specifically, if the emotion recognition result is anger or angry, a prompt message related to care is sent; if the emotion recognition result is happy or excited, sending a prompt message related to praise; and if the emotion recognition result is neutral, sending a prompt message related to encouragement.
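The correspondence just described can be sketched as a simple lookup table; the emotion categories follow the examples in the text, while the wording of each message is invented for illustration.

```python
# Hypothetical correspondence table between emotion recognition results and
# prompt messages (the message texts are illustrative, not from the patent).
PROMPT_TABLE = {
    "anger":   "care: We noticed you may be upset. Feel free to take a short break.",
    "angry":   "care: We noticed you may be upset. Feel free to take a short break.",
    "happy":   "praise: Great energy! Keep up the excellent service.",
    "excited": "praise: Great energy! Keep up the excellent service.",
    "neutral": "encouragement: You're doing well. Keep it going!",
}

def prompt_for(emotion: str) -> str:
    # Keyword lookup on the recognition result, with a neutral fallback.
    return PROMPT_TABLE.get(emotion.lower(), PROMPT_TABLE["neutral"])
```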
According to this embodiment, the emotion of customer service staff can be monitored in real time, and a corresponding prompt message is sent according to the detected emotion to help relieve it. This improves the service quality of the customer service staff and increases user satisfaction, while the management level can pay attention to the emotional changes of the customer service staff in real time, which provides a certain auxiliary effect for management work.
In an embodiment, the step S1 of acquiring a face image to be detected, and preprocessing the face image includes the following steps:
acquiring video data of a person to be detected through monitoring equipment, and extracting an image containing the face of the person to be detected from the video data to obtain a face image;
When this step is executed, video key frame data are extracted from the video data, and an image containing the face of the person to be detected is obtained from each key frame. Specifically, an execution cycle may be set and frames extracted from the video data according to that cycle, that is, at a predetermined time interval. ffmpeg can be used as the frame extraction tool: the extraction interval is set, the video data is passed as the input parameter, and pictures meeting the requirement are obtained.
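The ffmpeg invocation described above might look like the following sketch, using ffmpeg's `fps` video filter to sample one frame per interval; the file names and the sampling rate (the "execution cycle") are assumptions.

```python
import subprocess

def build_ffmpeg_cmd(video_path: str, out_pattern: str, fps: float = 1.0) -> list:
    """Build an ffmpeg command that extracts one frame every 1/fps seconds.
    out_pattern is an image sequence pattern, e.g. 'frames/frame_%04d.png'."""
    return [
        "ffmpeg", "-i", video_path,  # input video data
        "-vf", f"fps={fps}",         # sample frames at the configured rate
        out_pattern,
    ]

def extract_frames(video_path: str, out_pattern: str, fps: float = 1.0) -> None:
    """Run the frame extraction (requires ffmpeg installed on the host)."""
    subprocess.run(build_ffmpeg_cmd(video_path, out_pattern, fps), check=True)
```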
Carrying out gray level conversion on the face image, and then correcting the length and width of the face image after the gray level conversion to adjust the length and width of the face image to a preset size;
and then segmenting the face image according to a preset face part rule, and marking the segmented parts.
When the above steps are executed, the preset face part rule is a preset segmentation rule, for example, a partial image corresponding to the eyes, lips, mandible and the like in the face image is segmented; the segmented partial images are then labeled, for example, the segmented eye-corresponding partial image is labeled a and may be named as eye a, the lip-corresponding partial image is labeled b and may be named as lip b, and the mandible-corresponding partial image is labeled c and may be named as mandible c.
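The segmentation-and-labelling step can be sketched as follows. The bounding boxes are placeholder fractions of a 48×48 face crop; a real system would locate the eyes, lips and mandible with facial-landmark detection rather than fixed coordinates.

```python
# label -> (top, bottom, left, right) in pixels of an assumed 48x48 face crop
PART_BOXES = {
    "a_eyes":     (8, 20, 4, 44),
    "b_lips":     (30, 42, 12, 36),
    "c_mandible": (40, 48, 8, 40),
}

def segment_face(gray48):
    """Return {label: sub-image} for each labelled part of a 48x48 2-D list."""
    return {label: [row[left:right] for row in gray48[top:bottom]]
            for label, (top, bottom, left, right) in PART_BOXES.items()}
```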
In this embodiment, flexibly setting the frame extraction interval can serve different monitoring effects and requirements, and preprocessing the face image makes subsequent operations more accurate and efficient.
In an embodiment, fig. 2 is a schematic diagram of an expression recognition process in an emotion recognition-based customer service monitoring method according to an embodiment of the present application, and as shown in fig. 2, the step S2 is to input the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing, so as to obtain an expression recognition result of the face image, and includes the following steps:
step S201, inputting the segmented and marked facial image into a trained expression classification model based on a convolutional neural network;
step S202, calculating the marked parts one by one, and synthesizing the calculation results of all the parts to obtain the score of the face image under each expression in the expression classification model based on the convolutional neural network;
the expression classification model based on the convolutional neural network includes 54 types of facial expressions, which is equivalent to classifying the facial expressions of human into 54 types, such as: happy face, feeling and pulse. And inputting the segmented and marked face image into a trained expression classification model based on a convolutional neural network, respectively calculating the scores of each marked part in 54 facial expressions, and then integrating the calculation results of all the parts. For example, eye a scores 20% for facial expression "smiley, lip b scores 30% for facial expression" smiley, and jaw c scores 50% for facial expression "smiley; calculating the score M of the facial image in the expression "smiling face" according to the weights given to the eyes, lips and jaw in the facial expression judgment, for example, the weight given to the eye a is P1The weight given to the lips b is P2Mandible c is given weight P3Then the person isThe score of the face image in the facial expression "smiling face" is M ═ 20%. P1+30%*P2+50%*P3. By analogy, the score of the facial image under any one of the 54 facial expressions can be calculated and obtained. For example, the score of the facial image in the facial expression "smiling face" is calculated to be 15%, the score of the facial expression "emotion and pulse" is calculated to be 30%, and so on, the scores of the facial image in all other types of facial expressions are obtained.
Step S203, sorting the scores of each type of expression of the face image from high to low, and taking the top-ranked expressions and their corresponding scores as the expression recognition result of the face image.
In a preferred embodiment, the scores of each type of expressions of the facial image are sorted from high to low according to the scores, and the expressions with the scores in the top five positions and the corresponding scores are output as the expression recognition results of the facial image.
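The top-five selection in this preferred embodiment amounts to a sort-and-slice, which can be sketched as:

```python
def top_expressions(expr_scores: dict, k: int = 5) -> list:
    """Return the k highest-scoring (expression, score) pairs, best first."""
    return sorted(expr_scores.items(), key=lambda kv: kv[1], reverse=True)[:k]
```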
In the embodiment, the facial image is divided and marked according to the parts, and each marked facial part is given weight, so that the expression recognition effect is improved.
In an embodiment, fig. 3 is a schematic diagram of an emotion recognition process in an emotion recognition-based customer service monitoring method according to an embodiment of the present application, and as shown in fig. 3, the step S3 is to calculate and obtain an emotion recognition result corresponding to the face image according to a preset corresponding relationship between an expression and an emotion and a calculation rule according to an expression recognition result of the face image, and includes the following steps:
step S301, inputting the expression recognition result into an emotion recognition model, and respectively calculating scores of all emotions corresponding to each category of expression in the expression recognition result;
and step S302, summing the scores of the same emotion, wherein the emotion with the highest score is the emotion recognition result corresponding to the face image.
In a preferred embodiment, when the above steps are performed, the expression recognition result corresponding to the five highest-scoring expressions is taken as the input to an emotion recognition model. The emotion recognition model is pre-trained and covers seven emotions, together with the score of each of the seven emotions for each of the 54 expressions. For example, the expression "smiling face" corresponds to the emotion "happy" with a score of M1, to the emotion "anger" with a score of M2, and to the other five emotions with scores M3, M4, M5, M6 and M7 respectively; the expression "affectionate" corresponds to the emotion "happy" with a score of N1. By analogy, the scores of the seven emotions corresponding to the other four expressions are retrieved. The score of each emotion is then calculated; for example, the contribution of the expression "smiling face" to the emotion "happy" is the score M1 multiplied by the score M of "smiling face". The contributions of the other four expressions to the emotion "happy" are calculated in the same way, and the contributions of the five expressions to the emotion "happy" are added, thereby obtaining the total score of the emotion "happy". By analogy, the total scores of the other six emotions are calculated. The emotion with the highest total score is the emotion recognition result corresponding to the face image.
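The aggregation just described can be sketched as follows. The expression-to-emotion weight table is a made-up stand-in for the pre-trained emotion recognition model, and the expression names are illustrative.

```python
# Hypothetical per-expression emotion weights (the M1..M7 / N1 values above).
EMOTION_WEIGHTS = {
    "smiling face": {"happy": 0.8, "neutral": 0.2},
    "affectionate": {"happy": 0.6, "neutral": 0.4},
    "frown":        {"anger": 0.7, "sad": 0.3},
}

def recognize_emotion(top_expressions: list) -> str:
    """top_expressions: list of (expression, score) pairs from the classifier.
    Each expression contributes score * weight to each of its emotions; the
    contributions for the same emotion are summed and the best emotion wins."""
    totals = {}
    for expr, score in top_expressions:
        for emotion, weight in EMOTION_WEIGHTS.get(expr, {}).items():
            totals[emotion] = totals.get(emotion, 0.0) + score * weight
    return max(totals, key=totals.get)
```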
In one embodiment, the step S4, sending a prompt message corresponding to the emotion recognition result to the customer service person according to the emotion recognition result, includes:
acquiring the emotion recognition result;
calling a corresponding relation table of a preset emotion recognition result and a prompt message;
in the comparison relation table, carrying out keyword query on the emotion recognition result to obtain a prompt message corresponding to the emotion recognition result;
and sending the prompt message to customer service staff.
When the steps are executed, when the emotion of the current customer service staff is detected, the corresponding prompt message is sent to the customer service staff according to the corresponding relation between the preset emotion recognition result and the prompt message, and the customer service staff is reminded in real time.
Specifically, for example, when it is detected that a customer service staff member is angry during a call, a prompt message can be sent and displayed on the page, such as: "Dear XX, we detect that you are not in a good mood right now; you may take a short rest." Through such warm sentences, and by recommending a short story or a piece of music after the call ends, the music relieves the emotion of the customer service staff, helping them control their emotions and keep a stable and pleasant mood, so that the quality of the next call can be ensured.
According to the embodiment, the corresponding prompt message is sent according to the detected emotion to play a role in relieving the emotion of the customer service staff, so that the service quality of the customer service staff is improved, and meanwhile, the satisfaction degree of the user is increased.
In an embodiment, the inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image, and the method further includes a process of training the expression classification model based on the convolutional neural network, and specifically includes:
collecting a plurality of public face images containing various expressions, adjusting the size of each face image to a preset size, converting the face images into gray-level images, and converting each gray-level image into a pixel matrix X = [x_ij]_{M×N}, where x_ij represents the pixel value in the i-th row and j-th column of the image, M is the height of the image (in pixels), and N is the width of the image (in pixels);
performing mean-removal processing on the pixel matrices of all gray-level images, where the calculation formula is x'_ij = x_ij − μ, with μ = (1/(M·N)) Σ_{i=1..M} Σ_{j=1..N} x_ij, that is, subtracting the mean pixel value of the image from each pixel;
and inputting the face image subjected to mean value removing processing into a convolutional neural network model for training to obtain the expression classification model based on the convolutional neural network.
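Assuming the gray-level images are already at the preset size and stacked into a NumPy array, the mean-removal step might look like this (a sketch; the text does not specify whether the mean is taken per image or over the whole set, so per-image de-meaning is shown):

```python
import numpy as np

def demean(images):
    """Subtract each image's own mean pixel value.

    images: float array of shape (num_images, M, N), one gray-level
    pixel matrix X = [x_ij] per face image.
    """
    means = images.mean(axis=(1, 2), keepdims=True)  # one mean per image
    return images - means
```

After this step every image has zero mean, which keeps the inputs to the convolutional network centered.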
In an embodiment, the inputting the face image after the mean value removing process into a convolutional neural network model for training to obtain an expression classification model based on a convolutional neural network includes:
performing convolution calculation on the pixel matrix of the gray-level image input to the convolutional layer, where the calculation formula is:

x_j^l = f( Σ_{i=1..N_in} x_i^{l−1} * k_ij^l + b_j^l )

where l is the index of the network layer, j is the index of the output feature map, i is the index of the input feature map, N_in is the number of input feature maps, k_ij^l is the convolution kernel corresponding to the i-th input feature map of the layer-l network, and b_j^l is the bias.
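A naive reading of this per-layer formula can be written out directly (an illustrative sketch, not the patent's implementation; the activation f, the cross-correlation orientation, and 'valid' padding are assumptions):

```python
import numpy as np

def conv_layer(inputs, kernels, biases, f=np.tanh):
    """Compute x_j = f(sum_i x_i * k_ij + b_j) for one layer, naively.

    inputs:  list of N_in 2-D arrays (input feature maps x_i)
    kernels: kernels[i][j] is the 2-D kernel k_ij
    biases:  biases[j] is the scalar bias b_j
    """
    kh, kw = kernels[0][0].shape
    h, w = inputs[0].shape
    oh, ow = h - kh + 1, w - kw + 1          # 'valid' output size
    outputs = []
    for j in range(len(biases)):
        acc = np.full((oh, ow), float(biases[j]))
        for i, x in enumerate(inputs):        # sum over all input maps
            for r in range(oh):
                for c in range(ow):
                    acc[r, c] += np.sum(x[r:r+kh, c:c+kw] * kernels[i][j])
        outputs.append(f(acc))                # apply the activation f
    return outputs
```

In practice this triple loop would be replaced by an optimized convolution primitive; the sketch only makes the summation over input maps and the bias term explicit.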
In an embodiment, an emotion recognition-based customer service monitoring device is provided, as shown in fig. 4, including an image acquisition module, an expression recognition module, an emotion recognition module, and a result processing module, specifically:
the image acquisition module is used for acquiring a face image to be detected and preprocessing the face image;
the expression recognition module is used for inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image;
the emotion recognition module is used for calculating an emotion recognition result corresponding to the face image according to the expression recognition result of the face image, a preset correspondence between expressions and emotions, and a calculation rule;
and the result processing module is used for sending a prompt message corresponding to the emotion recognition result to the customer service staff according to the emotion recognition result.
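Wired together, the four modules could be sketched as a small pipeline class (class and method names are assumptions for illustration; each module is injected as a callable):

```python
class CustomerServiceMonitor:
    """Illustrative wiring of the four modules described above."""

    def __init__(self, preprocess, classify_expression, infer_emotion, prompt_table):
        self.preprocess = preprocess                    # image acquisition module
        self.classify_expression = classify_expression  # expression recognition module
        self.infer_emotion = infer_emotion              # emotion recognition module
        self.prompt_table = prompt_table                # result processing module

    def handle(self, face_image):
        """Run one image through the pipeline; return the prompt message."""
        image = self.preprocess(face_image)
        expression = self.classify_expression(image)
        emotion = self.infer_emotion(expression)
        return self.prompt_table.get(emotion)           # None if no entry
```

Injecting the stages as callables keeps the sketch independent of any particular model library.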
In one embodiment, a computer device is provided, which includes a memory and one or more processors, the memory storing computer readable instructions which, when executed by the one or more processors, cause the one or more processors to implement the steps of the emotion recognition-based customer service monitoring method in the above embodiments.
In one embodiment, a storage medium readable and writable by a processor is provided, storing computer readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the emotion recognition-based customer service monitoring method described in the above embodiments. The storage medium may be a non-volatile storage medium.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: a Read Only Memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, or the like.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above embodiments express only some exemplary embodiments of the present invention, and although their description is specific and detailed, they are not to be construed as limiting the scope of the present invention. It should be noted that a person skilled in the art can make several variations and modifications without departing from the inventive concept, all of which fall within the protection scope of the present invention. Therefore, the protection scope of this patent shall be subject to the appended claims.
Claims (10)
1. A customer service monitoring method based on emotion recognition is characterized by comprising the following steps:
acquiring a face image to be detected, and preprocessing the face image;
inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image;
calculating an emotion recognition result corresponding to the face image according to the expression recognition result of the face image, a preset correspondence between expressions and emotions, and a calculation rule;
and sending a prompt message corresponding to the emotion recognition result to the customer service staff according to the emotion recognition result.
2. The customer service monitoring method based on emotion recognition as recited in claim 1, wherein said obtaining a face image to be detected and preprocessing said face image comprises:
acquiring video data of a person to be detected through monitoring equipment, and extracting an image containing the face of the person to be detected from the video data to obtain a face image;
carrying out gray level conversion on the face image, and then correcting the length and width of the face image after the gray level conversion to adjust the length and width of the face image to a preset size;
and then segmenting the face image according to a preset face part rule, and marking the segmented parts.
3. The emotion recognition-based customer service monitoring method of claim 2, wherein the step of inputting the preprocessed face image into a trained convolutional neural network-based expression classification model for processing to obtain an expression recognition result of the face image comprises:
inputting the segmented and marked face image into a trained expression classification model based on a convolutional neural network;
calculating the marked parts one by one, and synthesizing the calculation results of all the parts to obtain the score of the face image under each expression in the expression classification model based on the convolutional neural network;
and sorting the expressions of the facial image by their scores from high to low, and taking the top-ranked expressions and their corresponding scores as the expression recognition result of the facial image.
4. The customer service monitoring method based on emotion recognition as recited in claim 3, wherein the obtaining of the emotion recognition result corresponding to the face image by calculation according to the expression recognition result of the face image and the preset correspondence between expression and emotion and the calculation rule comprises:
inputting the expression recognition result into an emotion recognition model, and respectively calculating the scores of all emotions corresponding to each type of expression in the expression recognition result;
and summing the scores of the same emotion, wherein the emotion with the highest score is the emotion recognition result corresponding to the face image.
5. The customer service monitoring method based on emotion recognition as recited in claim 1, wherein said sending a prompt message corresponding to the emotion recognition result to the customer service person according to the emotion recognition result comprises:
acquiring the emotion recognition result;
calling a preset correspondence table between emotion recognition results and prompt messages;
performing a keyword query on the emotion recognition result in the correspondence table to obtain the prompt message corresponding to the emotion recognition result;
and sending the prompt message to the customer service staff.
6. The emotion recognition-based customer service monitoring method of claim 1, wherein before the preprocessed face image is input into the trained convolutional neural network-based expression classification model for processing to obtain the expression recognition result of the face image, the method further comprises a process of training the convolutional neural network-based expression classification model, specifically comprising:
collecting a plurality of public face images containing various expressions, adjusting the size of each face image to a preset size, converting the face images into gray-level images, and converting each gray-level image into a pixel matrix X = [x_ij]_{M×N}, where x_ij represents the pixel value in the i-th row and j-th column of the image, M is the height of the image (in pixels), and N is the width of the image (in pixels);
performing mean-removal processing on the pixel matrices of all gray-level images, where the calculation formula is x'_ij = x_ij − μ, with μ = (1/(M·N)) Σ_{i=1..M} Σ_{j=1..N} x_ij, that is, subtracting the mean pixel value of the image from each pixel;
and inputting the face image subjected to mean value removing processing into a convolutional neural network model for training to obtain the expression classification model based on the convolutional neural network.
7. The emotion recognition-based customer service monitoring method of claim 6, wherein the step of inputting the face image subjected to the mean value removal processing into a convolutional neural network model for training to obtain an expression classification model based on the convolutional neural network comprises:
performing convolution calculation on the pixel matrix of the gray-level image input to the convolutional layer, where the calculation formula is:

x_j^l = f( Σ_{i=1..N_in} x_i^{l−1} * k_ij^l + b_j^l )

where l is the index of the network layer, j is the index of the output feature map, i is the index of the input feature map, N_in is the number of input feature maps, k_ij^l is the convolution kernel corresponding to the i-th input feature map of the layer-l network, and b_j^l is the bias.
8. An emotion recognition-based customer service monitoring device, characterized in that it comprises the following modules:
the image acquisition module is used for acquiring a face image to be detected and preprocessing the face image;
the expression recognition module is used for inputting the preprocessed face image into a trained expression classification model based on a convolutional neural network for processing to obtain an expression recognition result of the face image;
the emotion recognition module is used for calculating an emotion recognition result corresponding to the face image according to the expression recognition result of the face image, a preset correspondence between expressions and emotions, and a calculation rule;
and the result processing module is used for sending a prompt message corresponding to the emotion recognition result to the customer service staff according to the emotion recognition result.
9. A computer device comprising a memory and one or more processors, the memory having stored therein computer-readable instructions which, when executed by the one or more processors, cause the one or more processors to carry out the steps of the emotion recognition-based customer service monitoring method as claimed in any one of claims 1 to 7.
10. A storage medium readable by a processor, the storage medium storing computer readable instructions which, when executed by one or more processors, cause the one or more processors to perform the steps of the method for emotion recognition based customer service monitoring as claimed in any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910061884.4A CN109919001A (en) | 2019-01-23 | 2019-01-23 | Customer service monitoring method, device, equipment and storage medium based on Emotion identification |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910061884.4A CN109919001A (en) | 2019-01-23 | 2019-01-23 | Customer service monitoring method, device, equipment and storage medium based on Emotion identification |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109919001A true CN109919001A (en) | 2019-06-21 |
Family
ID=66960509
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910061884.4A Pending CN109919001A (en) | 2019-01-23 | 2019-01-23 | Customer service monitoring method, device, equipment and storage medium based on Emotion identification |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109919001A (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0676058A (en) * | 1992-06-22 | 1994-03-18 | Masashige Furukawa | Expression encoding device and emotion discriminating device |
CN103186774A (en) * | 2013-03-21 | 2013-07-03 | 北京工业大学 | Semi-supervised learning-based multi-gesture facial expression recognition method |
CN105354527A (en) * | 2014-08-20 | 2016-02-24 | 南京普爱射线影像设备有限公司 | Negative expression recognizing and encouraging system |
GB201713829D0 (en) * | 2017-08-29 | 2017-10-11 | We Are Human Ltd | Image data processing system and method |
CN107358169A (en) * | 2017-06-21 | 2017-11-17 | 厦门中控智慧信息技术有限公司 | A kind of facial expression recognizing method and expression recognition device |
CN107862598A (en) * | 2017-09-30 | 2018-03-30 | 平安普惠企业管理有限公司 | Long-range the interview measures and procedures for the examination and approval, server and readable storage medium storing program for executing |
CN107943449A (en) * | 2017-12-23 | 2018-04-20 | 河南智盈电子技术有限公司 | A kind of intelligent sound system based on human facial expression recognition |
CN108564007A (en) * | 2018-03-27 | 2018-09-21 | 深圳市智能机器人研究院 | A kind of Emotion identification method and apparatus based on Expression Recognition |
CN108734570A (en) * | 2018-05-22 | 2018-11-02 | 深圳壹账通智能科技有限公司 | A kind of Risk Forecast Method, storage medium and server |
CN108960201A (en) * | 2018-08-01 | 2018-12-07 | 西南石油大学 | A kind of expression recognition method extracted based on face key point and sparse expression is classified |
CN109190487A (en) * | 2018-08-07 | 2019-01-11 | 平安科技(深圳)有限公司 | Face Emotion identification method, apparatus, computer equipment and storage medium |
- 2019-01-23: CN CN201910061884.4A patent/CN109919001A/en active Pending
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111259729A (en) * | 2019-12-30 | 2020-06-09 | 视联动力信息技术股份有限公司 | Expression recognition method and device |
CN111932056A (en) * | 2020-06-19 | 2020-11-13 | 北京文思海辉金信软件有限公司 | Customer service quality scoring method and device, computer equipment and storage medium |
CN111885343B (en) * | 2020-07-31 | 2022-06-14 | 中国工商银行股份有限公司 | Feature processing method and device, electronic equipment and readable storage medium |
CN111885343A (en) * | 2020-07-31 | 2020-11-03 | 中国工商银行股份有限公司 | Feature processing method and device, electronic equipment and readable storage medium |
CN111967380A (en) * | 2020-08-16 | 2020-11-20 | 云知声智能科技股份有限公司 | Content recommendation method and system |
CN112418059A (en) * | 2020-11-19 | 2021-02-26 | 平安普惠企业管理有限公司 | Emotion recognition method and device, computer equipment and storage medium |
CN112418059B (en) * | 2020-11-19 | 2024-01-05 | 哈尔滨华晟泛亚人力资源服务有限公司 | Emotion recognition method and device, computer equipment and storage medium |
CN112699774A (en) * | 2020-12-28 | 2021-04-23 | 深延科技(北京)有限公司 | Method and device for recognizing emotion of person in video, computer equipment and medium |
CN112700255A (en) * | 2020-12-28 | 2021-04-23 | 科讯嘉联信息技术有限公司 | Multi-mode monitoring service system and method |
CN112699774B (en) * | 2020-12-28 | 2024-05-24 | 深延科技(北京)有限公司 | Emotion recognition method and device for characters in video, computer equipment and medium |
CN114140865A (en) * | 2022-01-29 | 2022-03-04 | 深圳市中讯网联科技有限公司 | Intelligent early warning method and device, storage medium and electronic equipment |
CN116112630A (en) * | 2023-04-04 | 2023-05-12 | 成都新希望金融信息有限公司 | Intelligent video face tag switching method |
CN116112630B (en) * | 2023-04-04 | 2023-06-23 | 成都新希望金融信息有限公司 | Intelligent video face tag switching method |
CN117079324A (en) * | 2023-08-17 | 2023-11-17 | 厚德明心(北京)科技有限公司 | Face emotion recognition method and device, electronic equipment and storage medium |
CN117079324B (en) * | 2023-08-17 | 2024-03-12 | 厚德明心(北京)科技有限公司 | Face emotion recognition method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109919001A (en) | Customer service monitoring method, device, equipment and storage medium based on Emotion identification | |
CN108564007B (en) | Emotion recognition method and device based on expression recognition | |
CN104361316B (en) | Dimension emotion recognition method based on multi-scale time sequence modeling | |
CN110321805B (en) | Dynamic expression recognition method based on time sequence relation reasoning | |
CN110363084A (en) | A kind of class state detection method, device, storage medium and electronics | |
CN110719525A (en) | Bullet screen expression package generation method, electronic equipment and readable storage medium | |
JP2018055470A (en) | Facial expression recognition method, facial expression recognition apparatus, computer program, and advertisement management system | |
CN111292262B (en) | Image processing method, device, electronic equipment and storage medium | |
CN116363261B (en) | Training method of image editing model, image editing method and device | |
CN107025678A (en) | A kind of driving method and device of 3D dummy models | |
CN111177386B (en) | Proposal classification method and system | |
CN105956570B (en) | Smiling face's recognition methods based on lip feature and deep learning | |
CN111737576B (en) | Application function personalized recommendation method and device | |
CA3050456C (en) | Facial modelling and matching systems and methods | |
CN111127309A (en) | Portrait style transfer model training method, portrait style transfer method and device | |
CN112883867A (en) | Student online learning evaluation method and system based on image emotion analysis | |
CN111126254A (en) | Image recognition method, device, equipment and storage medium | |
CN111339940B (en) | Video risk identification method and device | |
CN113422876A (en) | AI-based auxiliary management method, system and medium for power customer service center | |
CN115620384A (en) | Model training method, fundus image prediction method and device | |
CN111144407A (en) | Target detection method, system, device and readable storage medium | |
CN113868472A (en) | Method for generating digital human video and related equipment | |
CN110135391A (en) | System is matched using the program and spectacle-frame of computer apolegamy spectacle-frame | |
CN116643675B (en) | Intelligent interaction system based on AI virtual character | |
CN111126177B (en) | Method and device for counting number of people |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||