CN115953724A - User data analysis and management method, device, equipment and storage medium - Google Patents

User data analysis and management method, device, equipment and storage medium

Info

Publication number
CN115953724A
Authority
CN
China
Prior art keywords
user
emotion
analysis
video data
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202310237508.2A
Other languages
Chinese (zh)
Other versions
CN115953724B (en)
Inventor
匡奕胤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Silver Bullet Technology Co ltd
Original Assignee
Smartin Technology Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smartin Technology Shenzhen Co ltd filed Critical Smartin Technology Shenzhen Co ltd
Priority to CN202310237508.2A
Publication of CN115953724A
Application granted
Publication of CN115953724B
Legal status: Active
Anticipated expiration

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The invention relates to the field of artificial intelligence, and discloses a user data analysis and management method, device, equipment, and storage medium, which are used to realize intelligent monitoring of user emotion and improve the accuracy of user emotion analysis. The method comprises the following steps: respectively inputting a plurality of continuous video data into a user emotion analysis model for user emotion analysis to obtain a user emotion analysis result corresponding to each continuous video data; performing emotion fluctuation analysis on the target user according to the user emotion analysis result corresponding to each continuous video data to construct an emotion fluctuation distribution map of the target user; obtaining a judgment result according to the emotion fluctuation distribution map; if the judgment result indicates an emotion fluctuation anomaly, acquiring an emotion anomaly record table and identifying whether the target user has a record in the emotion anomaly record table; and if a record exists, performing an emotion anomaly alarm on the target user, performing data integration analysis on the plurality of continuous video data and the judgment result, and generating a user management scheme.

Description

User data analysis and management method, device, equipment and storage medium
Technical Field
The present invention relates to the field of artificial intelligence, and in particular, to a method, an apparatus, a device, and a storage medium for analyzing and managing user data.
Background
With the rapid development of artificial intelligence technology, increasing attention has been paid to the study of human emotional expression. Existing schemes simply use a single modality, either speech or vision, to recognize a person's current emotion; such a single recognition mode is often inaccurate and leads to misidentification.
In existing schemes that recognize emotion by visual means alone, the facial expressions different people show when feeling happiness, anger, sadness, and other emotions vary from person to person; in schemes that use only speech, the speech to be recognized is often in dialect. For both reasons, the accuracy of existing schemes is low.
Disclosure of Invention
The invention provides a user data analysis and management method, device, equipment, and storage medium, which are used to realize intelligent monitoring of user emotion and improve the accuracy of user emotion analysis.
A first aspect of the present invention provides a user data analysis and management method, including:
acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and performing video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
acquiring user basic information of the target user, and setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user;
respectively inputting the plurality of continuous video data into the user emotion analysis model for user emotion analysis to obtain a user emotion analysis result corresponding to each continuous video data;
performing emotion fluctuation analysis on the target user according to a user emotion analysis result corresponding to each continuous video data to construct an emotion fluctuation distribution map of the target user;
judging whether the target user has abnormal emotional fluctuation according to the emotional fluctuation distribution map to obtain a judgment result;
if the judgment result indicates an emotion fluctuation anomaly, acquiring an emotion anomaly record table, and identifying whether the target user has a record in the emotion anomaly record table;
and if the record exists, performing an emotion anomaly alarm on the target user, performing data integration analysis on the plurality of continuous video data and the judgment result, and generating a user management scheme.
With reference to the first aspect, in a first implementation manner of the first aspect of the present invention, the obtaining user basic information of the target user, and setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user includes:
acquiring user basic information of the target user, wherein the user basic information comprises: user gender and user age;
matching corresponding target parameter information from a preset parameter set according to the user gender and the user age;
setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information;
and taking the set original video processing model as a user emotion analysis model corresponding to the target user.
With reference to the first aspect, in a second implementation manner of the first aspect of the present invention, the respectively inputting the multiple continuous video data into the user emotion analysis model to perform user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data, includes:
inputting the plurality of continuous video data into the user emotion analysis model respectively;
performing emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain the emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each continuous video data according to the emotion stability.
With reference to the first aspect, in a third implementation manner of the first aspect of the present invention, the performing emotion fluctuation analysis on the target user according to the user emotion analysis result corresponding to each piece of continuous video data to construct an emotion fluctuation distribution map of the target user includes:
acquiring video acquisition intervals corresponding to the plurality of continuous video data;
matching the user emotion analysis result corresponding to each continuous video data with its video acquisition interval, and performing numerical mapping on the user emotion analysis results to generate a target numerical value corresponding to each user emotion analysis result;
and constructing the emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
With reference to the first aspect, in a fourth implementation manner of the first aspect of the present invention, the determining, according to the emotion fluctuation distribution map, whether the target user has an emotion fluctuation anomaly to obtain a determination result includes:
extracting feature points of the emotion fluctuation distribution map to obtain target distribution probability;
comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
if the target distribution probability is greater than or equal to the preset probability threshold, determining that the judgment result is that an emotion fluctuation anomaly exists;
and if the target distribution probability is smaller than the preset probability threshold, determining that the judgment result is that no emotion fluctuation anomaly exists.
With reference to the first aspect, in a fifth implementation manner of the first aspect of the present invention, if there is a record, performing an emotion anomaly alarm on the target user, and performing data integration analysis on the multiple continuous video data and the determination result to generate a user management scheme, includes:
if the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and performing an emotion anomaly alarm on the target user;
and performing data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
With reference to the first aspect, in a sixth implementation manner of the first aspect of the present invention, the user data analysis and management method further includes:
collecting real-time video data of the target user;
performing emotion analysis on the real-time video data to obtain a target analysis result;
and adjusting the user management scheme according to the target analysis result to obtain the adjusted user management scheme.
A second aspect of the present invention provides a user data analysis and management apparatus, including:
the acquisition module is used for acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and performing video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
the setting module is used for acquiring the user basic information of the target user, and setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user;
the analysis module is used for inputting the plurality of continuous video data into the user emotion analysis model respectively to perform user emotion analysis so as to obtain a user emotion analysis result corresponding to each continuous video data;
the construction module is used for carrying out emotion fluctuation analysis on the target user according to the emotion analysis result of the user corresponding to each continuous video data to construct an emotion fluctuation distribution map of the target user;
the judging module is used for judging whether the target user has emotion fluctuation abnormity according to the emotion fluctuation distribution map to obtain a judging result;
the identification module is used for acquiring an emotion anomaly record table if the judgment result indicates an emotion fluctuation anomaly, and identifying whether the target user has a record in the emotion anomaly record table;
and the generating module is used for performing an emotion anomaly alarm on the target user if the record exists, performing data integration analysis on the plurality of continuous video data and the judgment result, and generating a user management scheme.
With reference to the second aspect, in a first implementation manner of the second aspect of the present invention, the setting module is specifically configured to:
acquiring user basic information of the target user, wherein the user basic information comprises: user gender and user age;
matching corresponding target parameter information from a preset parameter set according to the gender and the age of the user;
setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information;
and taking the set original video processing model as a user emotion analysis model corresponding to the target user.
With reference to the second aspect, in a second implementation manner of the second aspect of the present invention, the analysis module is specifically configured to:
inputting the plurality of continuous video data into the user emotion analysis model respectively;
performing emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain the emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each continuous video data according to the emotion stability.
With reference to the second aspect, in a third implementation manner of the second aspect of the present invention, the building module is specifically configured to:
acquiring video acquisition intervals corresponding to the plurality of continuous video data;
matching the user emotion analysis result corresponding to each continuous video data with its video acquisition interval, and performing numerical mapping on the user emotion analysis results to generate a target numerical value corresponding to each user emotion analysis result;
and constructing the emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
With reference to the second aspect, in a fourth implementation manner of the second aspect of the present invention, the determining module is specifically configured to:
extracting feature points of the emotion fluctuation distribution map to obtain target distribution probability;
comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
if the target distribution probability is greater than or equal to the preset probability threshold, determining that the judgment result is that an emotion fluctuation anomaly exists;
and if the target distribution probability is smaller than the preset probability threshold, determining that the judgment result is that no emotion fluctuation anomaly exists.
With reference to the second aspect, in a fifth implementation manner of the second aspect of the present invention, the generating module is specifically configured to:
if the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and performing an emotion anomaly alarm on the target user;
and performing data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
With reference to the second aspect, in a sixth implementation manner of the second aspect of the present invention, the user data analysis and management apparatus further includes:
the adjusting module is used for acquiring real-time video data of the target user; performing emotion analysis on the real-time video data to obtain a target analysis result; and adjusting the user management scheme according to the target analysis result to obtain the adjusted user management scheme.
A third aspect of the present invention provides a user data analysis and management apparatus, including: a memory and at least one processor, the memory having instructions stored therein; the at least one processor invokes the instructions in the memory to cause the user data analysis and management device to perform the user data analysis and management method described above.
A fourth aspect of the present invention provides a computer-readable storage medium having stored therein instructions, which, when run on a computer, cause the computer to perform the user data analysis and management method described above.
In the technical scheme provided by the invention, a plurality of continuous video data are respectively input into a user emotion analysis model for user emotion analysis, and a user emotion analysis result corresponding to each continuous video data is obtained; emotion fluctuation analysis is performed on the target user according to the user emotion analysis result corresponding to each continuous video data to construct an emotion fluctuation distribution map of the target user; a judgment result is obtained according to the emotion fluctuation distribution map; if the judgment result indicates an emotion fluctuation anomaly, an emotion anomaly record table is acquired, and it is identified whether the target user has a record in the emotion anomaly record table; and if a record exists, an emotion anomaly alarm is performed on the target user, data integration analysis is performed on the plurality of continuous video data and the judgment result, and a user management scheme is generated, thereby realizing intelligent monitoring of user emotion and improving the accuracy of user emotion analysis.
Drawings
FIG. 1 is a diagram of an embodiment of a method for analyzing and managing user data according to an embodiment of the present invention;
FIG. 2 is a flow chart of a user emotion analysis in an embodiment of the present invention;
FIG. 3 is a flow chart of emotion fluctuation analysis in an embodiment of the present invention;
FIG. 4 is a flowchart illustrating a method for determining whether an emotional fluctuation abnormality exists in a target user according to an embodiment of the present invention;
FIG. 5 is a diagram of an embodiment of a user data analysis and management apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of another embodiment of a user data analysis and management apparatus according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an embodiment of a user data analysis and management device according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a method, a device, equipment and a storage medium for analyzing and managing user data, which are used to realize intelligent monitoring of user emotion and improve the accuracy of user emotion analysis. The terms "first," "second," "third," "fourth," and the like in the description, the claims, and the drawings, if any, are used to distinguish between similar elements and not necessarily to describe a particular sequence or chronological order. It should be appreciated that data so used may be interchanged under appropriate circumstances, so that the embodiments described herein can be practiced in orders other than those illustrated or described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such a process, method, article, or apparatus.
For ease of understanding, a specific flow of an embodiment of the present invention is described below. Referring to fig. 1, an embodiment of the user data analysis and management method according to an embodiment of the present invention includes:
S101, acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and performing video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
It should be understood that the execution body of the present invention may be a user data analysis and management apparatus, or may be a terminal or a server, which is not limited herein. The embodiment of the present invention is described with a server as the execution body.
Specifically, the server acquires a plurality of initial video data of the target user based on a preset video acquisition interval. The server then performs time-sequence feature analysis on each initial video data to determine the time-sequence feature corresponding to it, and sorts the plurality of initial videos according to these time-sequence features to generate a plurality of continuous video data.
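As a minimal sketch of this sequencing step (the patent does not specify a data layout, so the clip record and its timestamp field below are assumptions), the initial clips can be ordered by their capture timestamps to form the continuous sequence:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List

@dataclass
class VideoClip:
    path: str                # storage location of the raw clip (assumed field)
    captured_at: datetime    # capture timestamp used as the time-sequence feature

def order_clips(initial_clips: List[VideoClip]) -> List[VideoClip]:
    """Sort the initial clips by capture time so they form a continuous sequence."""
    return sorted(initial_clips, key=lambda clip: clip.captured_at)

clips = [
    VideoClip("clip_b.mp4", datetime(2023, 3, 14, 9, 5)),
    VideoClip("clip_a.mp4", datetime(2023, 3, 14, 9, 0)),
]
continuous = order_clips(clips)  # clip_a.mp4 now precedes clip_b.mp4
```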
S102, obtaining user basic information of a target user, and setting video detection parameters of an original video processing model according to the user basic information and a plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user;
Specifically, the server acquires the user basic information and processes it to obtain time-series data. The server then sets the hyperparameters of the original video processing model in sequence within a preset hyperparameter range, inputs each hyperparameter setting together with the time-series data into the model, and calculates the prediction error corresponding to each setting. The N groups of hyperparameters with the smallest prediction errors are selected and used to set the video detection parameters of the original video processing model, yielding the user emotion analysis model corresponding to the target user.
S103, respectively inputting the plurality of continuous video data into a user emotion analysis model for user emotion analysis to obtain a user emotion analysis result corresponding to each continuous video data;
It should be noted that the plurality of continuous video data are respectively input into the user emotion analysis model; emotion stability analysis is performed on the plurality of continuous video data through the user emotion analysis model to obtain the emotion stability of each continuous video data, and a user emotion analysis result corresponding to each continuous video data is generated according to the emotion stability.
S104, performing emotion fluctuation analysis on the target user according to the emotion analysis result of the user corresponding to each continuous video data, and constructing an emotion fluctuation distribution map of the target user;
specifically, for each continuous video data, dividing dimensions and indexes according to the emotion fluctuation indexes to perform real-time statistics to obtain specific numerical values of the indexes, building a calculation model of the emotion fluctuation indexes by using a user emotion analysis model, inputting each continuous video data, performing machine learning, matching weights of the dimensions and performing comprehensive calculation on the indexes to determine the emotion fluctuation indexes, and further performing emotion fluctuation analysis on a target user through the emotion fluctuation indexes by using a server to build an emotion fluctuation distribution map of the target user.
S105, judging whether the target user has emotion fluctuation abnormity according to the emotion fluctuation distribution map to obtain a judgment result;
specifically, an emotion fluctuation distribution map is obtained, emotion recognition is carried out on the emotion fluctuation distribution map, the emotion type to which the emotion fluctuation distribution map belongs is determined, an emotion recognition result of the emotion fluctuation distribution map is obtained, whether emotion fluctuation abnormality exists in a target user is judged, and a judgment result is obtained.
S106, if the judgment result indicates an emotion fluctuation anomaly, acquiring an emotion anomaly record table, and identifying whether the target user has a record in the emotion anomaly record table;
and S107, if the record exists, performing an emotion anomaly alarm on the target user, performing data integration analysis on the plurality of continuous video data and the judgment result, and generating a user management scheme.
In the embodiment of the invention, if the judgment result indicates an emotion fluctuation anomaly, the emotion anomaly record table is obtained and it is identified whether the target user has a record in it; if a record exists, an emotion anomaly alarm is performed on the target user, data integration analysis is performed on the plurality of continuous video data and the judgment result, and a user management scheme is generated.
In a specific embodiment, the process of executing step S102 may specifically include the following steps:
(1) Acquiring user basic information of a target user, wherein the user basic information comprises: user gender and user age;
(2) Matching corresponding target parameter information from a preset parameter set according to the gender and the age of the user;
(3) Setting detection times according to a plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and target parameter information;
(4) And taking the set original video processing model as a user emotion analysis model corresponding to the target user.
Specifically, the server obtains user basic information of the target user, the user basic information including the user gender and the user age, and matches corresponding target parameter information from a preset parameter set according to the user gender and the user age. To do so, the server extracts text feature information and theme feature information from the user basic information, fuses the theme feature information with the text feature information to obtain fused feature information, extracts user behavior feature information from the fused feature information, and determines time-series data from the user behavior feature information; the corresponding target parameter information is then matched from the preset parameter set according to the time-series data. Next, the server sets the detection times according to the plurality of continuous video data and sets the video detection parameters of the original video processing model according to the detection times and the target parameter information. Concretely, the server sets the hyperparameters of the original video processing model in sequence within a preset hyperparameter range, inputs each hyperparameter setting together with the time-series data into the model, calculates the prediction error corresponding to each setting, selects the N groups of hyperparameters with the smallest prediction errors, and sets the video detection parameters through these N groups of hyperparameters, obtaining the user emotion analysis model corresponding to the target user.
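A minimal sketch of this top-N hyperparameter selection under stated assumptions (the candidate grid, parameter names, and the error function standing in for a real model evaluation are all illustrative):

```python
from typing import Callable, Dict, List, Tuple

def select_top_n_hyperparams(
    candidates: List[Dict[str, float]],
    evaluate: Callable[[Dict[str, float]], float],
    n: int,
) -> List[Dict[str, float]]:
    """Score every candidate hyperparameter set and keep the N sets with the
    smallest prediction error, as in the parameter-setting step above."""
    scored: List[Tuple[float, Dict[str, float]]] = [
        (evaluate(params), params) for params in candidates
    ]
    scored.sort(key=lambda item: item[0])  # smallest error first
    return [params for _, params in scored[:n]]

# Stand-in for running the original video processing model on the time-series
# data and measuring its prediction error; a real system would train and evaluate.
def mock_prediction_error(params: Dict[str, float]) -> float:
    return abs(params["learning_rate"] - 0.01) + 0.001 * abs(params["window"] - 16)

grid = [
    {"learning_rate": lr, "window": w}
    for lr in (0.001, 0.01, 0.1)
    for w in (8.0, 16.0, 32.0)
]
best_two = select_top_n_hyperparams(grid, mock_prediction_error, n=2)
```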
In a specific embodiment, as shown in fig. 2, the process of executing step S103 may specifically include the following steps:
S201, respectively inputting the plurality of continuous video data into the user emotion analysis model;
S202, performing emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain the emotion stability of each continuous video data;
and S203, generating a user emotion analysis result corresponding to each continuous video data according to the emotion stability.
Specifically, the server inputs the plurality of continuous video data into the user emotion analysis model respectively and performs emotion stability analysis on them through the model to obtain the emotion stability of each continuous video data. To this end, the server collects video material related to emotion themes, screens and classifies it to form a basic emotion analysis library, and performs one-dimensional emotion-analysis depth scoring on all analysis material multiple times, taking the average score of each segment of analysis material as its one-dimensional emotion-analysis depth score; this yields the emotion stability of each continuous video data. Finally, the server generates the user emotion analysis result corresponding to each continuous video data according to the emotion stability.
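The averaging step can be sketched as follows; the clip names, score values, and the simple stability-to-result rule are assumptions added for illustration:

```python
from statistics import mean
from typing import Dict, List

# Repeated one-dimensional emotion-analysis depth scores per continuous clip
# (the scoring model itself is out of scope here; values are illustrative).
depth_scores: Dict[str, List[float]] = {
    "clip_a.mp4": [0.71, 0.69, 0.74],
    "clip_b.mp4": [0.40, 0.55, 0.33],
}

# The average score of each segment serves as its emotion stability.
stability = {clip: mean(scores) for clip, scores in depth_scores.items()}

# Illustrative rule for deriving a user emotion analysis result from stability.
results = {clip: ("stable" if s >= 0.6 else "unstable")
           for clip, s in stability.items()}
```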
In a specific embodiment, as shown in fig. 3, the process of executing step S104 may specifically include the following steps:
S301, acquiring video acquisition intervals corresponding to the plurality of continuous video data;
S302, matching the user emotion analysis result corresponding to each continuous video data with its video acquisition interval, and performing numerical mapping on the user emotion analysis results to generate a target numerical value corresponding to each user emotion analysis result;
S303, constructing the emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
Specifically, the server acquires the video acquisition intervals corresponding to the plurality of continuous video data and obtains the mapping values that have a mapping relation with those intervals. According to this mapping relation, the server matches the user emotion analysis result corresponding to each continuous video data with its video acquisition interval and performs numerical mapping on the user emotion analysis results, generating a target numerical value corresponding to each user emotion analysis result. Finally, the server constructs the emotion fluctuation distribution map of the target user from the target numerical values corresponding to the user emotion analysis results.
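A minimal sketch of the numerical mapping, assuming an illustrative label set, value map, and interval format that the patent does not specify:

```python
# Assumed mapping from categorical emotion analysis results to target values.
VALUE_MAP = {"calm": 0.0, "happy": 0.5, "agitated": 1.0}

# One (video acquisition interval, user emotion analysis result) pair per clip.
interval_results = [
    ("09:00-09:05", "calm"),
    ("09:05-09:10", "happy"),
    ("09:10-09:15", "agitated"),
]

# The emotion fluctuation distribution map: target numerical values laid out
# along the video acquisition timeline.
fluctuation_profile = [(interval, VALUE_MAP[label])
                       for interval, label in interval_results]
```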
In a specific embodiment, as shown in fig. 4, the process of executing step S105 may specifically include the following steps:
S401, extracting feature points of the emotion fluctuation distribution map to obtain a target distribution probability;
S402, comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
S403, if the target distribution probability is greater than or equal to the preset probability threshold, determining that the judgment result is that an emotion fluctuation anomaly exists;
S404, if the target distribution probability is smaller than the preset probability threshold, determining that the judgment result is that no emotion fluctuation anomaly exists.
Specifically, the server extracts the feature points of the emotion fluctuation distribution map to obtain the target distribution probability. The server pre-constructs an initial feature point extraction model comprising a feature extraction module, which in turn comprises a multi-scale convolutional layer and a deformable convolutional layer. The initial feature point extraction model is trained with a training data set to obtain a trained feature point extraction model. The image to be processed is input into the trained feature point extraction model to obtain the target distribution probability, which is then compared with the preset probability threshold to obtain the judgment result: if the target distribution probability is greater than or equal to the preset threshold, the judgment result is that an emotion fluctuation anomaly exists; if it is smaller than the preset threshold, the judgment result is that no emotion fluctuation anomaly exists.
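The threshold comparison itself reduces to a single predicate; a sketch follows, with the 0.8 threshold chosen purely for illustration:

```python
def judge_fluctuation(target_distribution_probability: float,
                      probability_threshold: float = 0.8) -> bool:
    """Return True when the probability extracted from the emotion fluctuation
    distribution map reaches the preset threshold, i.e. an anomaly exists."""
    return target_distribution_probability >= probability_threshold

assert judge_fluctuation(0.87) is True    # anomaly present
assert judge_fluctuation(0.42) is False   # no anomaly
```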
In a specific embodiment, the process of executing step S107 may specifically include the following steps:
(1) If the record exists, generating alarm information of the target user;
(2) Transmitting the alarm information to a preset monitoring terminal, and performing an emotion anomaly alarm on the target user;
(3) And performing data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
In a specific embodiment, the user data analysis and management method further includes the following steps:
(1) Collecting real-time video data of a target user;
(2) Performing emotion analysis on the real-time video data to obtain a target analysis result;
(3) And adjusting the user management scheme according to the target analysis result to obtain the adjusted user management scheme.
Specifically, if the record exists, the server generates alarm information for the target user, transmits the alarm information to a preset monitoring terminal, performs an emotion anomaly alarm on the target user, performs data integration analysis on the plurality of continuous video data and the judgment result, and generates a user management scheme.
Further, the server collects real-time video data of the target user: it refreshes the user's current real-time video data at every first interval duration, and, from the obtained real-time video data sequence and the position and duration information of the corresponding preset monitoring interval in the management scheme, determines the real-time video data of the monitoring interval during execution of the management scheme. The server compares the real-time video data of the corresponding monitoring interval with the feature values of a preset standard template, performs emotion analysis according to the comparison result to obtain a target analysis result, and adjusts the user management scheme according to the target analysis result to obtain the adjusted user management scheme.
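A minimal sketch of this periodic monitor-and-adjust loop; every callable below is a placeholder for the patent's unspecified capture, analysis, and scheme-adjustment logic:

```python
import time
from typing import Callable

def monitor_and_adjust(
    fetch_realtime_video: Callable[[], bytes],
    analyze_emotion: Callable[[bytes], float],
    adjust_scheme: Callable[[float], None],
    interval_seconds: float,
    rounds: int,
) -> None:
    """Periodically pull real-time video, analyze it, and adjust the user
    management scheme according to the target analysis result."""
    for _ in range(rounds):
        video = fetch_realtime_video()
        target_result = analyze_emotion(video)
        adjust_scheme(target_result)
        time.sleep(interval_seconds)

# Usage with stubbed components:
monitor_and_adjust(
    fetch_realtime_video=lambda: b"frame-bytes",            # stub camera feed
    analyze_emotion=lambda video: 0.42,                     # stub analysis result
    adjust_scheme=lambda r: print(f"adjust scheme to {r}"),
    interval_seconds=0.0,
    rounds=2,
)
```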
The user data analysis and management method in the embodiment of the present invention is described above; the following describes the user data analysis and management apparatus in the embodiment of the present invention. Referring to fig. 5, an embodiment of the user data analysis and management apparatus includes:
an obtaining module 501, configured to obtain multiple pieces of initial video data of a target user based on a preset video acquisition interval, and perform video sequencing on the multiple pieces of initial video data to generate multiple pieces of continuous video data;
a setting module 502, configured to obtain user basic information of the target user, and set video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data, so as to obtain a user emotion analysis model corresponding to the target user;
the analysis module 503 is configured to input the multiple continuous video data into the user emotion analysis model for user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data;
a constructing module 504, configured to perform emotion fluctuation analysis on the target user according to a user emotion analysis result corresponding to each piece of continuous video data, and construct an emotion fluctuation distribution map of the target user;
the judging module 505 is configured to judge whether the target user has abnormal emotional fluctuation according to the emotional fluctuation distribution map, so as to obtain a judgment result;
an identifying module 506, configured to, if the determination result indicates an emotion fluctuation anomaly, obtain an emotion anomaly record table, and identify whether the target user has a record in the emotion anomaly record table;
and a generating module 507, configured to perform an emotion anomaly alarm on the target user if there is a record, perform data integration analysis on the multiple continuous video data and the determination result, and generate a user management scheme.
Through the cooperation of these components, the plurality of continuous video data are respectively input into the user emotion analysis model for user emotion analysis, and the user emotion analysis result corresponding to each continuous video data is obtained; emotion fluctuation analysis is performed on the target user according to the user emotion analysis result corresponding to each continuous video data to construct the emotion fluctuation distribution map of the target user; a judgment result is obtained according to the emotion fluctuation distribution map; if the judgment result indicates an emotion fluctuation anomaly, the emotion anomaly record table is acquired, and it is identified whether the target user has a record in it; and if a record exists, an emotion anomaly alarm is performed on the target user, data integration analysis is performed on the plurality of continuous video data and the judgment result, and a user management scheme is generated, thereby realizing intelligent monitoring of user emotion and improving the accuracy of user emotion analysis.
Referring to fig. 6, another embodiment of the apparatus for analyzing and managing user data according to the embodiment of the present invention includes:
an obtaining module 501, configured to obtain multiple pieces of initial video data of a target user based on a preset video acquisition interval, perform video sequencing on the multiple pieces of initial video data, and generate multiple pieces of continuous video data;
a setting module 502, configured to obtain user basic information of the target user, and set video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data, so as to obtain a user emotion analysis model corresponding to the target user;
the analysis module 503 is configured to input the multiple continuous video data into the user emotion analysis model for user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data;
a constructing module 504, configured to perform emotion fluctuation analysis on the target user according to a user emotion analysis result corresponding to each piece of continuous video data, and construct an emotion fluctuation distribution map of the target user;
the judging module 505 is configured to judge whether the target user has an abnormal emotion fluctuation according to the emotion fluctuation distribution map, so as to obtain a judgment result;
an identifying module 506, configured to, if the determination result indicates an emotion fluctuation anomaly, obtain an emotion anomaly record table, and identify whether the target user has a record in the emotion anomaly record table;
and a generating module 507, configured to perform an emotion anomaly alarm on the target user if there is a record, perform data integration analysis on the multiple continuous video data and the determination result, and generate a user management scheme.
Optionally, the setting module 502 is specifically configured to:
acquiring user basic information of the target user, wherein the user basic information comprises: user gender and user age;
matching corresponding target parameter information from a preset parameter set according to the gender and the age of the user;
setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information;
and taking the set original video processing model as a user emotion analysis model corresponding to the target user.
Optionally, the analysis module 503 is specifically configured to:
inputting the plurality of continuous video data into the user emotion analysis model respectively;
performing emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain the emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each continuous video data according to the emotion stability.
Optionally, the building module 504 is specifically configured to:
acquiring video acquisition intervals corresponding to the plurality of continuous video data;
matching the user emotion analysis result corresponding to each continuous video data with its video acquisition interval, and performing numerical mapping on the user emotion analysis results to generate a target numerical value corresponding to each user emotion analysis result;
and constructing the emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
Optionally, the determining module 505 is specifically configured to:
extracting feature points of the emotion fluctuation distribution map to obtain target distribution probability;
comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
if the target distribution probability is greater than or equal to the preset probability threshold, determining that the judgment result is that an emotion fluctuation anomaly exists;
and if the target distribution probability is smaller than the preset probability threshold, determining that the judgment result is that no emotion fluctuation anomaly exists.
Optionally, the generating module 507 is specifically configured to:
if the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and performing an emotion anomaly alarm on the target user;
and performing data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
Optionally, the user data analyzing and managing apparatus further includes:
an adjusting module 508, configured to collect real-time video data of the target user; performing emotion analysis on the real-time video data to obtain a target analysis result; and adjusting the user management scheme according to the target analysis result to obtain the adjusted user management scheme.
In the embodiment of the invention, a plurality of continuous video data are respectively input into the user emotion analysis model for user emotion analysis, and a user emotion analysis result corresponding to each continuous video data is obtained; emotion fluctuation analysis is performed on the target user according to the user emotion analysis result corresponding to each continuous video data to construct the emotion fluctuation distribution map of the target user; a judgment result is obtained according to the emotion fluctuation distribution map; if the judgment result indicates an emotion fluctuation anomaly, the emotion anomaly record table is acquired, and it is identified whether the target user has a record in it; and if a record exists, an emotion anomaly alarm is performed on the target user, data integration analysis is performed on the plurality of continuous video data and the judgment result, and a user management scheme is generated, thereby realizing intelligent monitoring of user emotion and improving the accuracy of user emotion analysis.
Fig. 5 and fig. 6 describe the user data analysis and management apparatus in the embodiment of the present invention in detail from the perspective of modular functional entities; the following describes the user data analysis and management device in the embodiment of the present invention in detail from the perspective of hardware processing.
Fig. 7 is a schematic structural diagram of a user data analysis and management device 600 according to an embodiment of the present invention. The device may vary considerably in configuration or performance, and may include one or more processors (CPUs) 610 (e.g., one or more processors), a memory 620, and one or more storage media 630 (e.g., one or more mass storage devices) storing applications 633 or data 632. The memory 620 and the storage medium 630 may be transient or persistent storage. The program stored in the storage medium 630 may include one or more modules (not shown), each of which may include a series of instruction operations for the user data analysis and management device 600. Further, the processor 610 may be configured to communicate with the storage medium 630 and execute the series of instruction operations in the storage medium 630 on the user data analysis and management device 600.
The user data analysis and management device 600 may also include one or more power supplies 640, one or more wired or wireless network interfaces 650, one or more input-output interfaces 660, and/or one or more operating systems 631, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, and the like. It will be appreciated by those skilled in the art that the device structure shown in fig. 7 does not constitute a limitation of the user data analysis and management device, which may include more or fewer components than shown, combine some components, or arrange the components differently.
The present invention further provides a user data analysis and management device, which includes a memory and a processor, where the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, cause the processor to execute the steps of the user data analysis and management method in the above embodiments.
The present invention also provides a computer-readable storage medium, which may be a non-volatile computer-readable storage medium, and which may also be a volatile computer-readable storage medium, having stored therein instructions, which, when run on a computer, cause the computer to perform the steps of the user data analysis and management method.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: a U disk, a removable hard disk, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (10)

1. A user data analysis and management method is characterized by comprising the following steps:
acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and performing video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
acquiring user basic information of the target user, and setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user;
respectively inputting the plurality of continuous video data into the user emotion analysis model for user emotion analysis to obtain a user emotion analysis result corresponding to each continuous video data;
performing emotion fluctuation analysis on the target user according to a user emotion analysis result corresponding to each continuous video data to construct an emotion fluctuation distribution map of the target user;
judging whether the target user has emotion fluctuation abnormity according to the emotion fluctuation distribution map to obtain a judgment result;
if the judgment result indicates an emotion fluctuation anomaly, acquiring an emotion anomaly record table, and identifying whether the target user has a record in the emotion anomaly record table;
and if the record exists, performing an emotion anomaly alarm on the target user, performing data integration analysis on the plurality of continuous video data and the judgment result, and generating a user management scheme.
2. The method for analyzing and managing user data according to claim 1, wherein the obtaining of the user basic information of the target user and the setting of the video detection parameters of the original video processing model according to the user basic information and the plurality of continuous video data to obtain the user emotion analysis model corresponding to the target user comprises:
acquiring user basic information of the target user, wherein the user basic information comprises: user gender and user age;
matching corresponding target parameter information from a preset parameter set according to the user gender and the user age;
setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information;
and taking the set original video processing model as a user emotion analysis model corresponding to the target user.
3. The method for analyzing and managing user data according to claim 1, wherein the step of inputting the plurality of continuous video data into the user emotion analysis model for user emotion analysis to obtain a user emotion analysis result corresponding to each continuous video data comprises:
inputting the plurality of continuous video data into the user emotion analysis model respectively;
performing emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain the emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each continuous video data according to the emotion stability.
4. The method for analyzing and managing user data according to claim 1, wherein the performing emotion fluctuation analysis on the target user according to the emotion analysis result of the user corresponding to each continuous video data to construct the emotion fluctuation distribution map of the target user comprises:
acquiring video acquisition intervals corresponding to the plurality of continuous video data;
matching the user emotion analysis result corresponding to each continuous video data with its video acquisition interval, and performing numerical mapping on the user emotion analysis results to generate a target numerical value corresponding to each user emotion analysis result;
and constructing the emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
5. The method for analyzing and managing user data according to claim 1, wherein the determining whether the target user has abnormal emotional fluctuation according to the emotional fluctuation distribution map to obtain a determination result comprises:
extracting feature points of the emotion fluctuation distribution map to obtain target distribution probability;
comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
if the target distribution probability is greater than or equal to the preset probability threshold, determining that the judgment result is that an emotion fluctuation anomaly exists;
and if the target distribution probability is smaller than the preset probability threshold, determining that the judgment result is that no emotion fluctuation anomaly exists.
6. The method for analyzing and managing user data according to claim 1, wherein if there is a record, performing an emotion anomaly alarm on the target user, and performing data integration analysis on the plurality of continuous video data and the determination result to generate a user management scheme, comprises:
if the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and performing an emotion anomaly alarm on the target user;
and performing data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
7. The user data analysis and management method according to claim 1, further comprising:
collecting real-time video data of the target user;
performing emotion analysis on the real-time video data to obtain a target analysis result;
and adjusting the user management scheme according to the target analysis result to obtain the adjusted user management scheme.
8. A user data analysis and management apparatus, characterized in that the user data analysis and management apparatus comprises:
an acquisition module configured to acquire a plurality of initial video data of a target user based on a preset video acquisition interval, and to perform video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
a setting module configured to acquire user basic information of the target user, and to set video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user;
an analysis module configured to input the plurality of continuous video data into the user emotion analysis model respectively to perform user emotion analysis, obtaining a user emotion analysis result corresponding to each piece of continuous video data;
a construction module configured to perform emotion fluctuation analysis on the target user according to the user emotion analysis result corresponding to each piece of continuous video data to construct an emotion fluctuation distribution map of the target user;
a judgment module configured to judge whether the target user has abnormal emotional fluctuation according to the emotion fluctuation distribution map to obtain a judgment result;
an identification module configured to acquire an emotion abnormality record table if the judgment result indicates that emotion fluctuation abnormality exists, and to identify whether the target user has a record in the emotion abnormality record table;
and a generation module configured to perform an emotion abnormality alarm on the target user if the record exists, and to perform data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
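To show how the seven modules of claim 8 could compose, here is a skeletal sketch with injected callables; it is a reading aid under assumed signatures, not the patented implementation:

```python
class UserDataAnalysisApparatus:
    """Skeleton mirroring the seven modules of claim 8; every signature
    here is an illustrative assumption."""

    def __init__(self, acquisition, setting, analysis, construction,
                 judgment, identification, generation):
        self.acquisition = acquisition        # video capture and sequencing
        self.setting = setting                # builds the user emotion analysis model
        self.analysis = analysis              # per-video user emotion analysis
        self.construction = construction      # emotion fluctuation distribution map
        self.judgment = judgment              # abnormality judgment
        self.identification = identification  # emotion abnormality record lookup
        self.generation = generation          # alarm and user management scheme

    def run(self, user_id):
        """Chain the modules in the order the claims describe."""
        videos = self.acquisition(user_id)
        model = self.setting(user_id, videos)
        results = [self.analysis(model, video) for video in videos]
        distribution = self.construction(results)
        abnormal = self.judgment(distribution)
        if abnormal and self.identification(user_id):
            return self.generation(user_id, videos, abnormal)
        return None
```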
9. A user data analysis and management device, characterized in that the user data analysis and management device comprises: a memory and at least one processor, the memory having instructions stored therein;
wherein the at least one processor invokes the instructions in the memory to cause the user data analysis and management device to perform the user data analysis and management method of any one of claims 1-7.
10. A computer-readable storage medium having instructions stored thereon, wherein the instructions, when executed by a processor, implement the user data analysis and management method of any one of claims 1-7.
CN202310237508.2A 2023-03-14 2023-03-14 User data analysis and management method, device, equipment and storage medium Active CN115953724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310237508.2A CN115953724B (en) 2023-03-14 2023-03-14 User data analysis and management method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115953724A 2023-04-11
CN115953724B CN115953724B (en) 2023-06-16

Family

ID=85893028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310237508.2A Active CN115953724B (en) 2023-03-14 2023-03-14 User data analysis and management method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115953724B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110390048A (en) * 2019-06-19 2019-10-29 深圳壹账通智能科技有限公司 Information-pushing method, device, equipment and storage medium based on big data analysis
CN112667075A (en) * 2020-12-23 2021-04-16 珠海市魅族科技有限公司 Terminal control method, terminal control device, electronic equipment and storage medium
CN113822164A (en) * 2021-08-25 2021-12-21 深圳市安视宝科技有限公司 Dynamic emotion recognition method and device, computer equipment and storage medium
CN114124724A (en) * 2020-08-11 2022-03-01 中国电信股份有限公司 User behavior analysis method, device, NWDAF and storage medium
CN114596619A (en) * 2022-05-09 2022-06-07 深圳市鹰瞳智能技术有限公司 Emotion analysis method, device and equipment based on video stream and storage medium
CN114639150A (en) * 2022-03-16 2022-06-17 平安科技(深圳)有限公司 Emotion recognition method and device, computer equipment and storage medium
CN115376559A (en) * 2022-08-22 2022-11-22 中国工商银行股份有限公司 Emotion recognition method, device and equipment based on audio and video

Also Published As

Publication number Publication date
CN115953724B (en) 2023-06-16

Similar Documents

Publication Publication Date Title
CN104765733B (en) A kind of method and apparatus of social networks event analysis
CN111555921B (en) Method and device for positioning alarm root cause, computer equipment and storage medium
CN110688454A (en) Method, device, equipment and storage medium for processing consultation conversation
US20210255613A1 (en) Abnormality predicting system and abnormality predicting method
CN115688760B (en) Intelligent diagnosis guiding method, device, equipment and storage medium
CN112116168B (en) User behavior prediction method and device and electronic equipment
CN115440196A (en) Voice recognition method, device, medium and equipment based on user facial expression
JP2019105871A (en) Abnormality candidate extraction program, abnormality candidate extraction method and abnormality candidate extraction apparatus
CN114238764A (en) Course recommendation method, device and equipment based on recurrent neural network
CN114238033A (en) Board card running state early warning method, device, equipment and readable storage medium
CN111061394B (en) Touch force identification method, training method and device of model thereof and electronic system
CN116485020B (en) Supply chain risk identification early warning method, system and medium based on big data
CN115953724B (en) User data analysis and management method, device, equipment and storage medium
CN116777692A (en) Online learning method, device, equipment and storage medium based on data analysis
CN113570070B (en) Streaming data sampling and model updating method, device, system and storage medium
CN111814523A (en) Human body activity recognition method and device
CN111680572B (en) Dynamic judgment method and system for power grid operation scene
CN115082041A (en) User information management method, device, equipment and storage medium
CN114936600A (en) Document abnormity monitoring method, device, equipment and storage medium
CN114118306A (en) Method and device for analyzing SDS (sodium dodecyl sulfate) gel electrophoresis experimental data and SDS gel reagent
CN114530163A (en) Method and system for recognizing life cycle of equipment by adopting voice based on density clustering
CN113435753A (en) Enterprise risk judgment method, device, equipment and medium in high-risk industry
KR102072894B1 (en) Abnormal sequence identification method based on intron and exon
Amayri et al. A statistical process control chart approach for occupancy estimation in smart buildings
CN111832815A (en) Scientific research hotspot prediction method and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230524

Address after: Room 410, Building 8, Xinyi Lingyu R&D Center, No. 26 Honglang North 2nd Road, Xingdong Community, Xin'an Street, Bao'an District, Shenzhen City, Guangdong Province, 518000

Applicant after: Shenzhen Silver Bullet Technology Co.,Ltd.

Address before: 713, Building 10, Shenzhen Bay Science and Technology Ecological Park, No. 10 Gaoxin South 9th Road, Gaoxin Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518000

Applicant before: SmartIn Technology (Shenzhen) Co.,Ltd.

GR01 Patent grant
GR01 Patent grant