CN115953724B - User data analysis and management method, device, equipment and storage medium - Google Patents


Info

Publication number
CN115953724B
CN115953724B (application CN202310237508.2A)
Authority
CN
China
Prior art keywords
user
emotion
target
video data
analysis
Prior art date
Legal status
Active
Application number
CN202310237508.2A
Other languages
Chinese (zh)
Other versions
CN115953724A
Inventor
匡奕胤
Current Assignee
Shenzhen Silver Bullet Technology Co ltd
Original Assignee
Shenzhen Silver Bullet Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Silver Bullet Technology Co ltd
Priority to CN202310237508.2A
Publication of CN115953724A
Application granted
Publication of CN115953724B
Legal status: Active


Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Image Analysis (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to the field of artificial intelligence and discloses a user data analysis and management method, device, equipment, and storage medium for intelligently monitoring user emotion and improving the accuracy of user emotion analysis. The method comprises the following steps: inputting a plurality of continuous video data into a user emotion analysis model to perform user emotion analysis, obtaining a user emotion analysis result corresponding to each piece of continuous video data; performing emotion fluctuation analysis on the target user according to these results and constructing an emotion fluctuation distribution map of the target user; obtaining a judgment result from the emotion fluctuation distribution map; if the judgment result indicates abnormal emotion fluctuation, acquiring an emotion abnormality record table and identifying whether the target user has a record in it; and, if a record exists, issuing an emotion abnormality warning for the target user and performing data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.

Description

User data analysis and management method, device, equipment and storage medium
Technical Field
The present invention relates to the field of artificial intelligence, and in particular, to a method, apparatus, device, and storage medium for user data analysis and management.
Background
With the rapid development of artificial intelligence technology, increasing attention has been paid to research on human emotion expression. Existing schemes recognize a person's current emotion using voice or vision alone; such single-modality recognition is often inaccurate and prone to false identification.
Existing schemes that rely on vision alone suffer because facial expressions differ from person to person when expressing happiness, anger, sadness, and the like; schemes that rely on voice alone are affected by dialects in the recognition result. The accuracy of existing schemes is therefore low.
Disclosure of Invention
The invention provides a user data analysis and management method, device, equipment and storage medium, which are used for realizing intelligent monitoring of user emotion and improving accuracy of user emotion analysis.
The first aspect of the present invention provides a user data analysis and management method, the user data analysis and management method comprising:
Acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and performing video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
acquiring user basic information of the target user, and setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user;
inputting the plurality of continuous video data into the user emotion analysis model to perform user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data;
according to the user emotion analysis result corresponding to each continuous video data, carrying out emotion fluctuation analysis on the target user, and constructing an emotion fluctuation distribution map of the target user;
judging whether the target user has abnormal emotion fluctuation according to the emotion fluctuation distribution diagram, and obtaining a judgment result;
if the judgment result shows that the emotion fluctuation is abnormal, acquiring an emotion abnormal record table, and identifying whether the target user has records in the emotion abnormal record table;
and if the record exists, carrying out emotion abnormal warning on the target user, and carrying out data integration analysis on the plurality of continuous video data and the judging result to generate a user management scheme.
With reference to the first aspect, in a first implementation manner of the first aspect of the present invention, the obtaining user basic information of the target user, and setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data, to obtain a user emotion analysis model corresponding to the target user includes:
obtaining user basic information of the target user, wherein the user basic information comprises: user gender and user age;
matching corresponding target parameter information from a preset parameter set according to the gender of the user and the age of the user;
setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information;
and taking the set original video processing model as a user emotion analysis model corresponding to the target user.
With reference to the first aspect, in a second implementation manner of the first aspect of the present invention, the inputting the plurality of continuous video data into the user emotion analysis model to perform user emotion analysis, to obtain a user emotion analysis result corresponding to each continuous video data includes:
Inputting the plurality of continuous video data into the user emotion analysis model, respectively;
carrying out emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each piece of continuous video data according to the emotion stability.
With reference to the first aspect, in a third implementation manner of the first aspect of the present invention, according to a result of emotion analysis of a user corresponding to each continuous video data, performing emotion fluctuation analysis on the target user, and constructing an emotion fluctuation distribution map of the target user, including:
acquiring video acquisition intervals corresponding to the plurality of continuous video data;
performing corresponding matching on a user emotion analysis result corresponding to each continuous video data and the video acquisition interval, and performing numerical mapping on the user emotion analysis result to generate a target numerical value corresponding to each user emotion analysis result;
and constructing an emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
With reference to the first aspect, in a fourth implementation manner of the first aspect of the present invention, the judging, according to the emotion fluctuation distribution diagram, whether the target user has abnormal emotion fluctuation, to obtain a judgment result includes:
Extracting feature points of the emotion fluctuation distribution map to obtain target distribution probability;
comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
if the target distribution probability is greater than or equal to a preset threshold value, determining that the judgment result is that emotion fluctuation abnormality exists;
and if the target distribution probability is smaller than a preset threshold value, determining that the judgment result is that the emotion fluctuation abnormality does not exist.
With reference to the first aspect, in a fifth implementation manner of the first aspect of the present invention, if there is a record, performing an emotion abnormal alert on the target user, and performing data integration analysis on the plurality of continuous video data and the determination result, to generate a user management scheme, where the method includes:
if the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and carrying out emotion abnormal alarm on the target user;
and carrying out data integration analysis on the plurality of continuous video data and the judging result to generate a user management scheme.
With reference to the first aspect, in a sixth implementation manner of the first aspect of the present invention, the user data analysis and management method further includes:
Collecting real-time video data of the target user;
carrying out emotion analysis on the real-time video data to obtain a target analysis result;
and adjusting the user management scheme according to the target analysis result to obtain an adjusted user management scheme.
A second aspect of the present invention provides a user data analysis and management apparatus, comprising:
the acquisition module is used for acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and carrying out video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
the setting module is used for acquiring user basic information of the target user, setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data, and obtaining a user emotion analysis model corresponding to the target user;
the analysis module is used for respectively inputting the plurality of continuous video data into the user emotion analysis model to carry out user emotion analysis so as to obtain a user emotion analysis result corresponding to each continuous video data;
the construction module is used for carrying out emotion fluctuation analysis on the target user according to the emotion analysis result of the user corresponding to each piece of continuous video data and constructing an emotion fluctuation distribution diagram of the target user;
The judging module is used for judging whether the target user has abnormal emotion fluctuation according to the emotion fluctuation distribution diagram to obtain a judging result;
the identification module is used for acquiring an emotion abnormal record table if the judgment result is that emotion fluctuation is abnormal, and identifying whether the target user has records in the emotion abnormal record table or not;
and the generation module is used for carrying out emotion abnormal warning on the target user if records exist, carrying out data integration analysis on the plurality of continuous video data and the judgment result, and generating a user management scheme.
With reference to the second aspect, in a first implementation manner of the second aspect of the present invention, the setting module is specifically configured to:
obtaining user basic information of the target user, wherein the user basic information comprises: user gender and user age;
matching corresponding target parameter information from a preset parameter set according to the gender of the user and the age of the user;
setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information;
and taking the set original video processing model as a user emotion analysis model corresponding to the target user.
With reference to the second aspect, in a second implementation manner of the second aspect of the present invention, the analysis module is specifically configured to:
inputting the plurality of continuous video data into the user emotion analysis model, respectively;
carrying out emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each piece of continuous video data according to the emotion stability.
With reference to the second aspect, in a third implementation manner of the second aspect of the present invention, the building block is specifically configured to:
acquiring video acquisition intervals corresponding to the plurality of continuous video data;
performing corresponding matching on a user emotion analysis result corresponding to each continuous video data and the video acquisition interval, and performing numerical mapping on the user emotion analysis result to generate a target numerical value corresponding to each user emotion analysis result;
and constructing an emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
With reference to the second aspect, in a fourth implementation manner of the second aspect of the present invention, the determining module is specifically configured to:
Extracting feature points of the emotion fluctuation distribution map to obtain target distribution probability;
comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
if the target distribution probability is greater than or equal to a preset threshold value, determining that the judgment result is that emotion fluctuation abnormality exists;
and if the target distribution probability is smaller than a preset threshold value, determining that the judgment result is that the emotion fluctuation abnormality does not exist.
With reference to the second aspect, in a fifth implementation manner of the second aspect of the present invention, the generating module is specifically configured to:
if the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and carrying out emotion abnormal alarm on the target user;
and carrying out data integration analysis on the plurality of continuous video data and the judging result to generate a user management scheme.
With reference to the second aspect, in a sixth implementation manner of the second aspect of the present invention, the user data analysis and management apparatus further includes:
the adjusting module is used for collecting real-time video data of the target user; carrying out emotion analysis on the real-time video data to obtain a target analysis result; and adjusting the user management scheme according to the target analysis result to obtain an adjusted user management scheme.
A third aspect of the present invention provides a user data analysis and management apparatus comprising: a memory and at least one processor, the memory having instructions stored therein; the at least one processor invokes the instructions in the memory to cause the user data analysis and management device to perform the user data analysis and management method described above.
A fourth aspect of the present invention provides a computer readable storage medium having instructions stored therein which, when run on a computer, cause the computer to perform the user data analysis and management method described above.
In the technical scheme provided by the invention, a plurality of continuous video data are respectively input into a user emotion analysis model to carry out user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data; according to the user emotion analysis result corresponding to each continuous video data, carrying out emotion fluctuation analysis on the target user, and constructing an emotion fluctuation distribution diagram of the target user; obtaining a judgment result according to the emotion fluctuation distribution diagram; if the judgment result shows that the emotion fluctuation is abnormal, acquiring an emotion abnormal record table, and identifying whether the target user has records in the emotion abnormal record table; if the record exists, carrying out emotion abnormal warning on the target user, carrying out data integration analysis on a plurality of continuous video data and judgment results, and generating a user management scheme.
Drawings
FIG. 1 is a diagram illustrating an embodiment of a user data analysis and management method according to an embodiment of the present invention;
FIG. 2 is a flow chart of user emotion analysis in an embodiment of the present invention;
FIG. 3 is a flow chart of emotion fluctuation analysis in an embodiment of the present invention;
FIG. 4 is a flowchart of determining whether a target user has abnormal emotion fluctuation in an embodiment of the present invention;
FIG. 5 is a schematic diagram of an embodiment of a user data analysis and management apparatus according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of another embodiment of a user data analysis and management apparatus according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of an embodiment of a user data analysis and management apparatus according to an embodiment of the present invention.
Detailed Description
The embodiment of the invention provides a user data analysis and management method, device, equipment and storage medium, which are used for realizing intelligent monitoring of user emotion and improving the accuracy of user emotion analysis. The terms "first," "second," "third," "fourth" and the like in the description and in the claims and in the above drawings, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus.
For ease of understanding, a specific flow of an embodiment of the present invention is described below with reference to fig. 1, where an embodiment of a user data analysis and management method includes:
s101, acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and performing video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
It will be appreciated that the execution subject of the present invention may be a user data analysis and management device, and may also be a terminal or a server, which is not limited herein. The embodiment of the present invention is described by taking a server as the execution subject as an example.
Specifically, the server acquires a plurality of initial video data of the target user based on a preset video acquisition interval, further, the server respectively performs time sequence feature analysis on each initial video data, determines time sequence features corresponding to each initial video data, and further, the server performs video sequencing on the plurality of initial videos according to the time sequence features corresponding to each initial video to generate a plurality of continuous video data.
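The patent describes the sequencing of step S101 only at this level of detail. As a minimal sketch, assuming each clip's time sequence feature is simply its capture timestamp (an assumption; the patent does not fix the feature), the ordering could look like:

```python
from dataclasses import dataclass

@dataclass
class VideoClip:
    """A captured video segment; `start` is the capture timestamp in seconds."""
    clip_id: str
    start: float

def order_clips(clips: list[VideoClip]) -> list[VideoClip]:
    # Sort initial video data by its time sequence feature so the
    # result forms a continuous sequence of video data.
    return sorted(clips, key=lambda c: c.start)

clips = [VideoClip("c3", 20.0), VideoClip("c1", 0.0), VideoClip("c2", 10.0)]
ordered = order_clips(clips)  # c1, c2, c3
```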
S102, acquiring user basic information of a target user, and setting video detection parameters of an original video processing model according to the user basic information and a plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user;
Specifically, the server acquires the user basic information and processes it to obtain time sequence data. The server then sets the hyperparameters of the original video processing model in turn according to a preset hyperparameter range, inputs each hyperparameter setting together with the time sequence data into the original video processing model, and calculates the prediction error corresponding to each setting. The server selects the first N groups of hyperparameters with the smallest prediction errors and sets the video detection parameters of the original video processing model according to those N groups, obtaining the user emotion analysis model corresponding to the target user.
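The selection step above (evaluate each candidate setting, keep the N groups with the smallest prediction error) can be sketched as follows. The candidate set and error function are toy stand-ins, not the patent's model:

```python
def select_top_hyperparams(candidates, error_fn, n=3):
    # Evaluate each hyperparameter setting and return the N settings
    # with the smallest prediction error.
    scored = [(error_fn(params), params) for params in candidates]
    scored.sort(key=lambda pair: pair[0])
    return [params for _, params in scored[:n]]

# Toy stand-in: one 'threshold' knob, error is its distance from 0.4.
candidates = [{"threshold": t / 10} for t in range(10)]
best = select_top_hyperparams(
    candidates, lambda p: abs(p["threshold"] - 0.4), n=2
)
```

The selected groups would then be used to set the model's video detection parameters.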
S103, respectively inputting a plurality of continuous video data into a user emotion analysis model to perform user emotion analysis, and obtaining a user emotion analysis result corresponding to each continuous video data;
It should be noted that the plurality of continuous video data are respectively input into the user emotion analysis model; emotion stability analysis is performed on them through the model to obtain the emotion stability of each piece of continuous video data, and a user emotion analysis result corresponding to each piece of continuous video data is generated according to the emotion stability.
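The final mapping from per-clip emotion stability to an analysis result can be sketched like this; the score range and thresholds are illustrative assumptions, since the patent does not specify them:

```python
def emotion_result_from_stability(stability: float) -> str:
    # Map a per-clip emotion stability score in [0, 1] to a coarse
    # user emotion analysis result (thresholds are illustrative).
    if stability >= 0.8:
        return "stable"
    if stability >= 0.5:
        return "mildly fluctuating"
    return "unstable"

results = [emotion_result_from_stability(s) for s in (0.9, 0.6, 0.2)]
```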
S104, carrying out emotion fluctuation analysis on the target user according to the emotion analysis result of the user corresponding to each piece of continuous video data, and constructing an emotion fluctuation distribution diagram of the target user;
Specifically, real-time statistics are computed for each piece of continuous video data according to the dimensions and indexes into which the emotion fluctuation index is divided, yielding concrete values for each index. A calculation model of the emotion fluctuation index is built with the user emotion analysis model: each piece of continuous video data is input, and the emotion fluctuation index is determined through machine learning, matching of the weight of each dimension, and comprehensive calculation of the indexes. The server then performs emotion fluctuation analysis on the target user using the emotion fluctuation index and constructs the emotion fluctuation distribution map of the target user.
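The "matching of each dimension weight and comprehensive calculation" step reads as a weighted combination of per-dimension statistics. A minimal sketch under that assumption (dimension names and weights are hypothetical):

```python
def fluctuation_index(dimension_scores: dict[str, float],
                      weights: dict[str, float]) -> float:
    # Combine per-dimension statistics into one emotion fluctuation
    # index as a weighted average; weights are assumed to sum to 1.
    return sum(weights[d] * dimension_scores[d] for d in dimension_scores)

scores = {"facial": 0.6, "voice": 0.8, "motion": 0.4}
weights = {"facial": 0.5, "voice": 0.3, "motion": 0.2}
index = fluctuation_index(scores, weights)  # 0.62
```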
S105, judging whether the target user has abnormal emotion fluctuation according to the emotion fluctuation distribution diagram, and obtaining a judgment result;
Specifically, the server obtains the emotion fluctuation distribution map, performs emotion recognition on it to determine its emotion category, obtains the emotion recognition result of the distribution map, and judges from that result whether the target user has abnormal emotion fluctuation, obtaining the judgment result.
S106, if the judgment result shows that the emotion fluctuation is abnormal, acquiring an emotion abnormal record table, and identifying whether the target user has records in the emotion abnormal record table;
and S107, if the record exists, carrying out emotion abnormal warning on the target user, and carrying out data integration analysis on a plurality of continuous video data and judgment results to generate a user management scheme.
Specifically, in the embodiment of the present invention, if the judgment result indicates abnormal emotion fluctuation, the server acquires the emotion abnormality record table and identifies whether the target user has a record in it. If a record exists, the server issues an emotion abnormality warning for the target user and performs data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
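The overall flow of steps S103 through S107 can be sketched as a single driver function. All the callables and data below are toy stand-ins (the patent does not define these interfaces):

```python
def manage_user(clips, analyze, build_profile, is_abnormal,
                abnormal_records, user_id, alert, integrate):
    # S103: per-clip emotion analysis; S104: fluctuation profile;
    # S105: abnormality judgment; S106: record-table lookup;
    # S107: warning plus data integration into a management scheme.
    results = [analyze(c) for c in clips]
    profile = build_profile(results)
    abnormal = is_abnormal(profile)
    if abnormal and user_id in abnormal_records:
        alert(user_id)
        return integrate(clips, abnormal)
    return None

alerts = []
scheme = manage_user(
    clips=[0.9, 0.7],                      # stand-ins for video clips
    analyze=lambda c: c,                   # toy: clip is its own score
    build_profile=lambda rs: sum(rs) / len(rs),
    is_abnormal=lambda p: p > 0.75,
    abnormal_records={"user-42"},
    user_id="user-42",
    alert=alerts.append,
    integrate=lambda cs, flag: {"clips": len(cs), "abnormal": flag},
)
```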
In a specific embodiment, the process of executing step S102 may specifically include the following steps:
(1) Obtaining user basic information of a target user, wherein the user basic information comprises the following steps: user gender and user age;
(2) Matching corresponding target parameter information from a preset parameter set according to the gender and age of the user;
(3) Setting detection times according to a plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and target parameter information;
(4) And taking the set original video processing model as a user emotion analysis model corresponding to the target user.
Specifically, the server acquires user basic information of the target user, wherein the user basic information comprises: the user gender and the user age, and the server matches corresponding target parameter information from a preset parameter set according to the user gender and the user age, wherein,
The server extracts text feature information and topic feature information from the user basic information, fuses the two to obtain fusion feature information, extracts user behavior feature information from the fusion feature information, and determines time sequence data from the user behavior feature information. According to the time sequence data and the target parameter information matched from the preset parameter set, the server sets the detection times from the plurality of continuous video data and sets the video detection parameters of the original video processing model according to the detection times and the target parameter information. Here the server sets the hyperparameters of the original video processing model in turn according to a preset hyperparameter range, inputs each hyperparameter setting together with the time sequence data into the original video processing model, calculates the prediction error corresponding to each setting, selects the first N groups of hyperparameters with the smallest prediction errors, and sets the video detection parameters accordingly, obtaining the user emotion analysis model corresponding to the target user.
In a specific embodiment, as shown in fig. 2, the process of performing step S103 may specifically include the following steps:
S201, respectively inputting a plurality of continuous video data into a user emotion analysis model;
s202, performing emotion stability analysis on a plurality of continuous video data through a user emotion analysis model to obtain emotion stability of each continuous video data;
s203, generating a user emotion analysis result corresponding to each continuous video data according to the emotion stability.
Specifically, the server respectively inputs the plurality of continuous video data into the user emotion analysis model and performs emotion stability analysis through the model to obtain the emotion stability of each piece of continuous video data. To do so, the server acquires video materials related to emotion subjects and screens and classifies them to form a basic emotion analysis library; single-dimensional emotion analysis depth scoring is performed multiple times on all the analysis materials, and the average score of each section of material is taken as its single-dimensional emotion analysis depth score, yielding the emotion stability of each piece of continuous video data. Finally, the server generates the user emotion analysis result corresponding to each piece of continuous video data according to the emotion stability.
In a specific embodiment, as shown in fig. 3, the process of executing step S104 may specifically include the following steps:
S301, acquiring video acquisition intervals corresponding to a plurality of continuous video data;
s302, carrying out corresponding matching on a user emotion analysis result corresponding to each continuous video data and a video acquisition interval, and carrying out numerical mapping on the user emotion analysis result to generate a target numerical value corresponding to each user emotion analysis result;
s303, constructing an emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
Specifically, the server acquires the video acquisition intervals corresponding to the plurality of continuous video data, together with the mapping values that have a mapping relation with those intervals. According to the mapping relation, the server matches the user emotion analysis result corresponding to each piece of continuous video data with its video acquisition interval, performs numerical mapping on the user emotion analysis result, and generates the target numerical value corresponding to each user emotion analysis result. Finally, the server constructs the emotion fluctuation distribution map of the target user from the target numerical value corresponding to each user emotion analysis result.
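The numerical mapping above pairs each analysis result with its acquisition time; a minimal sketch, with an assumed (illustrative) result-to-value mapping:

```python
def build_fluctuation_map(results, interval_seconds, value_map):
    # Map each per-clip emotion analysis result to a numeric value and
    # pair it with its acquisition time, yielding (time, value) points
    # that can be plotted as the emotion fluctuation distribution map.
    return [(i * interval_seconds, value_map[r])
            for i, r in enumerate(results)]

value_map = {"calm": 0, "agitated": 2, "angry": 3}  # illustrative mapping
points = build_fluctuation_map(["calm", "agitated", "angry"], 10, value_map)
```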
In a specific embodiment, as shown in fig. 4, the process of performing step S105 may specifically include the following steps:
S401, extracting feature points of the emotion fluctuation distribution map to obtain a target distribution probability;
S402, comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
S403, if the target distribution probability is greater than or equal to the preset threshold value, determining that the judgment result is that an emotion fluctuation abnormality exists;
S404, if the target distribution probability is smaller than the preset threshold value, determining that the judgment result is that no emotion fluctuation abnormality exists.
Specifically, the server extracts feature points of the emotion fluctuation distribution map to obtain the target distribution probability. To this end, the server builds an initial feature point extraction model in advance, wherein the initial feature point extraction model comprises a feature extraction module, and the feature extraction module comprises a multi-scale convolution layer and a deformable convolution layer. The initial feature point extraction model is trained with a training data set to obtain a trained feature point extraction model, and the emotion fluctuation distribution map is input into the trained feature point extraction model to obtain the target distribution probability. The server then compares the target distribution probability with the preset probability threshold to obtain a judgment result: if the target distribution probability is greater than or equal to the preset threshold, the judgment result is determined to be that an emotion fluctuation abnormality exists; if the target distribution probability is smaller than the preset threshold, the judgment result is determined to be that no emotion fluctuation abnormality exists.
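The threshold comparison in steps S402 to S404 reduces to a simple decision rule; the default threshold of 0.5 below is an illustrative assumption, as the patent does not specify a value:

```python
def judge_emotion_fluctuation(target_probability, threshold=0.5):
    """Compare the target distribution probability produced by the feature
    point extraction model with a preset probability threshold and return
    the judgment result. The 0.5 default is illustrative only."""
    if target_probability >= threshold:
        return "emotion fluctuation abnormality exists"
    return "no emotion fluctuation abnormality"

result_high = judge_emotion_fluctuation(0.7)
result_low = judge_emotion_fluctuation(0.3)
```

Only the "abnormality exists" branch triggers the record-table lookup and alarm flow described in step S107.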
In a specific embodiment, the process of executing step S107 may specifically include the following steps:
(1) If the record exists, generating alarm information of the target user;
(2) Transmitting the alarm information to a preset monitoring terminal, and carrying out emotion abnormal alarm on a target user;
(3) And carrying out data integration analysis on the plurality of continuous video data and the judging result to generate a user management scheme.
In a specific embodiment, the user data analysis and management method further includes the following steps:
(1) Collecting real-time video data of a target user;
(2) Carrying out emotion analysis on the real-time video data to obtain a target analysis result;
(3) And adjusting the user management scheme according to the target analysis result to obtain an adjusted user management scheme.
Specifically, if the record exists, the server generates alarm information of the target user, transmits the alarm information to a preset monitoring terminal to issue an emotion abnormality alarm for the target user, and performs data integration analysis on the plurality of continuous video data and the judgment result to generate a user management scheme.
Further, the server collects the real-time video data of the target user, updating the current real-time video data of the user every first interval duration. According to the obtained sequence of user real-time video data and the position and duration information of the preset monitoring interval in the management scheme, the server determines the real-time video data of the monitoring interval while the management scheme is being executed. The server then compares the feature values of that real-time video data with those of a preset standard template, performs emotion analysis according to the comparison result to obtain a target analysis result, and adjusts the user management scheme according to the target analysis result to obtain an adjusted user management scheme.
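A hedged sketch of the monitoring-and-adjustment step described above; the mean-absolute-difference feature comparison and the policy of flagging the scheme for review are assumptions for illustration, as the patent does not define either:

```python
def compare_with_template(segment_features, template_features, tolerance=0.2):
    """Compare feature values of a monitored segment against a preset
    standard template; mean absolute difference is an assumed metric."""
    diffs = [abs(a - b) for a, b in zip(segment_features, template_features)]
    return sum(diffs) / len(diffs) <= tolerance

def adjust_scheme(scheme, within_template):
    # Keep the scheme unchanged when the user tracks the template;
    # otherwise flag it -- the patent leaves the adjustment policy open.
    return scheme if within_template else {**scheme, "needs_review": True}

# Hypothetical feature vectors for one monitoring interval.
ok = compare_with_template([0.5, 0.6], [0.55, 0.65])
scheme = adjust_scheme({"plan": "baseline"}, ok)
```

In the described flow this comparison would run once per first interval duration, each time the real-time video data is refreshed.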
The method for analyzing and managing user data in the embodiment of the present invention is described above, and the device for analyzing and managing user data in the embodiment of the present invention is described below, referring to fig. 5, one embodiment of the device for analyzing and managing user data in the embodiment of the present invention includes:
the acquiring module 501 is configured to acquire a plurality of initial video data of a target user based on a preset video acquisition interval, and perform video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
the setting module 502 is configured to obtain user basic information of the target user, and set video detection parameters of an original video processing model according to the user basic information and the multiple continuous video data, so as to obtain a user emotion analysis model corresponding to the target user;
the analysis module 503 is configured to input the plurality of continuous video data into the user emotion analysis model to perform user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data;
a construction module 504, configured to perform emotion fluctuation analysis on the target user according to a user emotion analysis result corresponding to each continuous video data, and construct an emotion fluctuation distribution map of the target user;
The judging module 505 is configured to judge whether the target user has abnormal emotion fluctuation according to the emotion fluctuation distribution diagram, so as to obtain a judgment result;
the identifying module 506 is configured to obtain an emotion abnormal recording table if the judgment result indicates that there is an emotion fluctuation abnormality, and identify whether the target user has a record in the emotion abnormal recording table;
and the generating module 507 is configured to perform emotion abnormality warning on the target user if there is a record, and perform data integration analysis on the multiple continuous video data and the judgment result, so as to generate a user management scheme.
Through the cooperation of the above components, a plurality of continuous video data are respectively input into a user emotion analysis model to carry out user emotion analysis, and a user emotion analysis result corresponding to each continuous video data is obtained; according to the user emotion analysis result corresponding to each continuous video data, emotion fluctuation analysis is carried out on the target user, and an emotion fluctuation distribution diagram of the target user is constructed; a judgment result is obtained according to the emotion fluctuation distribution diagram; if the judgment result shows that the emotion fluctuation is abnormal, an emotion abnormality record table is acquired, and whether the target user has records in the emotion abnormality record table is identified; if the record exists, an emotion abnormality warning is issued to the target user, and data integration analysis is carried out on the plurality of continuous video data and the judgment result to generate a user management scheme.
Referring to fig. 6, another embodiment of the user data analysis and management apparatus according to the present invention includes:
the acquiring module 501 is configured to acquire a plurality of initial video data of a target user based on a preset video acquisition interval, and perform video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
the setting module 502 is configured to obtain user basic information of the target user, and set video detection parameters of an original video processing model according to the user basic information and the multiple continuous video data, so as to obtain a user emotion analysis model corresponding to the target user;
the analysis module 503 is configured to input the plurality of continuous video data into the user emotion analysis model to perform user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data;
a construction module 504, configured to perform emotion fluctuation analysis on the target user according to a user emotion analysis result corresponding to each continuous video data, and construct an emotion fluctuation distribution map of the target user;
the judging module 505 is configured to judge whether the target user has abnormal emotion fluctuation according to the emotion fluctuation distribution diagram, so as to obtain a judgment result;
The identifying module 506 is configured to obtain an emotion abnormal recording table if the judgment result indicates that there is an emotion fluctuation abnormality, and identify whether the target user has a record in the emotion abnormal recording table;
and the generating module 507 is configured to perform emotion abnormality warning on the target user if there is a record, and perform data integration analysis on the multiple continuous video data and the judgment result, so as to generate a user management scheme.
Optionally, the setting module 502 is specifically configured to:
obtaining user basic information of the target user, wherein the user basic information comprises: user gender and user age;
matching corresponding target parameter information from a preset parameter set according to the gender of the user and the age of the user;
setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information;
and taking the set original video processing model as a user emotion analysis model corresponding to the target user.
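The gender-and-age matching performed by the setting module can be sketched as follows; the parameter set keys and values are hypothetical, since the patent does not enumerate the preset parameter set:

```python
# Hypothetical preset parameter set keyed by (gender, age band).
PRESET_PARAMETERS = {
    ("female", "18-35"): {"frame_rate": 10, "sensitivity": 0.8},
    ("male", "18-35"): {"frame_rate": 10, "sensitivity": 0.7},
}

def match_target_parameters(gender, age):
    """Match target parameter information from the preset parameter set
    according to user gender and user age; the age banding is assumed."""
    band = "18-35" if 18 <= age <= 35 else "other"
    return PRESET_PARAMETERS.get((gender, band))

params = match_target_parameters("female", 30)
```

The matched `params` would then be combined with the detection times derived from the continuous video data to set the video detection parameters of the original video processing model.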
Optionally, the analysis module 503 is specifically configured to:
inputting the plurality of continuous video data into the user emotion analysis model, respectively;
Carrying out emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each piece of continuous video data according to the emotion stability.
Optionally, the building module 504 is specifically configured to:
acquiring video acquisition intervals corresponding to the plurality of continuous video data;
performing corresponding matching on a user emotion analysis result corresponding to each continuous video data and the video acquisition interval, and performing numerical mapping on the user emotion analysis result to generate a target numerical value corresponding to each user emotion analysis result;
and constructing an emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result.
Optionally, the determining module 505 is specifically configured to:
extracting feature points of the emotion fluctuation distribution map to obtain target distribution probability;
comparing the target distribution probability with a preset probability threshold to obtain a judgment result;
if the target distribution probability is greater than or equal to a preset threshold value, determining that the judgment result is that emotion fluctuation abnormality exists;
And if the target distribution probability is smaller than a preset threshold value, determining that the judgment result is that the emotion fluctuation abnormality does not exist.
Optionally, the generating module 507 is specifically configured to:
if the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and carrying out emotion abnormal alarm on the target user;
and carrying out data integration analysis on the plurality of continuous video data and the judging result to generate a user management scheme.
Optionally, the user data analysis and management device further includes:
an adjustment module 508, configured to collect real-time video data of the target user; carrying out emotion analysis on the real-time video data to obtain a target analysis result; and adjusting the user management scheme according to the target analysis result to obtain an adjusted user management scheme.
In the embodiment of the invention, a plurality of continuous video data are respectively input into a user emotion analysis model for user emotion analysis, and a user emotion analysis result corresponding to each continuous video data is obtained; according to the user emotion analysis result corresponding to each continuous video data, carrying out emotion fluctuation analysis on the target user, and constructing an emotion fluctuation distribution diagram of the target user; obtaining a judgment result according to the emotion fluctuation distribution diagram; if the judgment result shows that the emotion fluctuation is abnormal, acquiring an emotion abnormal record table, and identifying whether the target user has records in the emotion abnormal record table; if the record exists, carrying out emotion abnormal warning on the target user, carrying out data integration analysis on a plurality of continuous video data and judgment results, and generating a user management scheme.
The user data analysis and management apparatus in the embodiment of the present invention is described in detail above in terms of the modularized functional entity in fig. 5 and 6, and the user data analysis and management device in the embodiment of the present invention is described in detail below in terms of hardware processing.
Fig. 7 is a schematic structural diagram of a user data analysis and management device according to an embodiment of the present invention. The user data analysis and management device 600 may vary considerably in configuration or performance, and may include one or more processors (central processing units, CPU) 610 (e.g., one or more processors), a memory 620, and one or more storage media 630 (e.g., one or more mass storage devices) storing application programs 633 or data 632. The memory 620 and the storage medium 630 may be transitory or persistent storage. The program stored on the storage medium 630 may include one or more modules (not shown), each of which may include a series of instruction operations for the user data analysis and management device 600. Still further, the processor 610 may be configured to communicate with the storage medium 630 and execute the series of instruction operations in the storage medium 630 on the user data analysis and management device 600.
The user data analysis and management device 600 may also include one or more power supplies 640, one or more wired or wireless network interfaces 650, one or more input/output interfaces 660, and/or one or more operating systems 631, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, and the like. It will be appreciated by those skilled in the art that the user data analysis and management device structure shown in fig. 7 does not constitute a limitation of the user data analysis and management device, which may include more or fewer components than illustrated, may combine certain components, or may use a different arrangement of components.
The present invention also provides a user data analysis and management apparatus including a memory and a processor, the memory storing computer readable instructions which, when executed by the processor, cause the processor to perform the steps of the user data analysis and management method in the above embodiments.
The present invention also provides a computer readable storage medium, which may be a non-volatile computer readable storage medium, and which may also be a volatile computer readable storage medium, the computer readable storage medium having stored therein instructions which, when executed on a computer, cause the computer to perform the steps of the user data analysis and management method.
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied essentially, or in the part contributing to the prior art, or in whole or in part, in the form of a software product stored in a storage medium, including instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or other various media capable of storing program codes.
The above embodiments are only for illustrating the technical solution of the present invention, and not for limiting the same; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some technical features thereof can be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (7)

1. A user data analysis and management method, characterized in that the user data analysis and management method comprises:
acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and performing video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
acquiring user basic information of the target user, and setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data to obtain a user emotion analysis model corresponding to the target user; wherein the user basic information includes: user gender and user age; matching corresponding target parameter information from a preset parameter set according to the gender of the user and the age of the user; setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information; taking the set original video processing model as a user emotion analysis model corresponding to the target user; extracting text feature information in basic user information, extracting the topic feature information in the basic user information, carrying out feature fusion on the topic feature information and the text feature information to obtain fusion feature information, extracting user behavior feature information from the fusion feature information, determining time sequence data according to the user behavior feature information, matching corresponding target parameter information from a preset parameter set according to the time sequence data, sequentially setting super parameters of an original video processing model according to a preset super parameter range, inputting the super parameters set each time and the time sequence data into the original video processing model, calculating to obtain prediction errors corresponding to the super parameters, selecting front N groups of super parameters with minimum prediction errors, setting video detection parameters of the original video processing model through the front N groups of super parameters, and obtaining a user emotion analysis model corresponding to the target user;
Inputting the plurality of continuous video data into the user emotion analysis model to perform user emotion analysis, so as to obtain a user emotion analysis result corresponding to each continuous video data;
according to the user emotion analysis result corresponding to each continuous video data, carrying out emotion fluctuation analysis on the target user, and constructing an emotion fluctuation distribution map of the target user; the video acquisition intervals corresponding to the plurality of continuous video data are acquired; performing corresponding matching on a user emotion analysis result corresponding to each continuous video data and the video acquisition interval, and performing numerical mapping on the user emotion analysis result to generate a target numerical value corresponding to each user emotion analysis result; constructing an emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result; the method comprises the steps of obtaining a mapping value with a mapping relation with a video acquisition interval, carrying out corresponding matching on a user emotion analysis result corresponding to each continuous video data and the video acquisition interval, carrying out value mapping on the user emotion analysis result, and generating a target value corresponding to each user emotion analysis result; the method comprises the steps of inputting a mapping relation between a video acquisition interval and a mapping numerical value in each piece of continuous video data, and carrying out corresponding matching on a user emotion analysis result corresponding to each piece of continuous video data and the video acquisition interval according to the mapping relation;
Judging whether the target user has abnormal emotion fluctuation according to the emotion fluctuation distribution diagram, and obtaining a judgment result; extracting characteristic points of the emotion fluctuation distribution map to obtain target distribution probability; comparing the target distribution probability with a preset probability threshold to obtain a judgment result; if the target distribution probability is greater than or equal to a preset threshold value, determining that the judgment result is that emotion fluctuation abnormality exists; if the target distribution probability is smaller than a preset threshold, determining that the judgment result is that the emotion fluctuation abnormality does not exist; extracting feature points of the emotion fluctuation distribution graph to obtain target distribution probability, pre-constructing an initial feature point extraction model, wherein the initial feature point extraction model comprises a feature extraction module, the feature extraction module comprises a multi-scale convolution layer and a deformable convolution layer, training the initial feature point extraction model by utilizing a training data set to obtain a trained feature point extraction model, and inputting the emotion fluctuation distribution graph into the trained feature point extraction model to obtain target distribution probability;
if the judgment result shows that the emotion fluctuation is abnormal, acquiring an emotion abnormal record table, and identifying whether the target user has records in the emotion abnormal record table;
And if the record exists, carrying out emotion abnormal warning on the target user, and carrying out data integration analysis on the plurality of continuous video data and the judging result to generate a user management scheme.
2. The method for analyzing and managing user data according to claim 1, wherein the inputting the plurality of continuous video data into the user emotion analysis model for user emotion analysis respectively, to obtain a user emotion analysis result corresponding to each continuous video data, comprises:
inputting the plurality of continuous video data into the user emotion analysis model, respectively;
carrying out emotion stability analysis on the plurality of continuous video data through the user emotion analysis model to obtain emotion stability of each continuous video data;
and generating a user emotion analysis result corresponding to each piece of continuous video data according to the emotion stability.
3. The method for analyzing and managing user data according to claim 1, wherein if there is a record, performing an emotion abnormality warning on the target user, and performing data integration analysis on the plurality of continuous video data and the determination result, generating a user management scheme, comprising:
If the record exists, generating alarm information of the target user;
transmitting the alarm information to a preset monitoring terminal, and carrying out emotion abnormal alarm on the target user;
and carrying out data integration analysis on the plurality of continuous video data and the judging result to generate a user management scheme.
4. The user data analysis and management method according to claim 1, characterized in that the user data analysis and management method further comprises:
collecting real-time video data of the target user;
carrying out emotion analysis on the real-time video data to obtain a target analysis result;
and adjusting the user management scheme according to the target analysis result to obtain an adjusted user management scheme.
5. A user data analysis and management apparatus, characterized in that the user data analysis and management apparatus comprises:
the acquisition module is used for acquiring a plurality of initial video data of a target user based on a preset video acquisition interval, and carrying out video sequencing on the plurality of initial video data to generate a plurality of continuous video data;
the setting module is used for acquiring user basic information of the target user, setting video detection parameters of an original video processing model according to the user basic information and the plurality of continuous video data, and obtaining a user emotion analysis model corresponding to the target user; wherein the user basic information includes: user gender and user age; matching corresponding target parameter information from a preset parameter set according to the gender of the user and the age of the user; setting detection times according to the plurality of continuous video data, and setting video detection parameters of an original video processing model according to the detection times and the target parameter information; taking the set original video processing model as a user emotion analysis model corresponding to the target user; extracting text feature information in basic user information, extracting the topic feature information in the basic user information, carrying out feature fusion on the topic feature information and the text feature information to obtain fusion feature information, extracting user behavior feature information from the fusion feature information, determining time sequence data according to the user behavior feature information, matching corresponding target parameter information from a preset parameter set according to the time sequence data, sequentially setting super parameters of an original video processing model according to a preset super parameter range, inputting the super parameters set each time and the time sequence data into the original video processing model, calculating to obtain prediction errors corresponding to the super parameters, selecting front N groups of super parameters with minimum prediction errors, setting video detection parameters of the original video processing model through the front N groups of super parameters, and obtaining a user emotion analysis model corresponding to the target user;
The analysis module is used for respectively inputting the plurality of continuous video data into the user emotion analysis model to carry out user emotion analysis so as to obtain a user emotion analysis result corresponding to each continuous video data;
the construction module is used for carrying out emotion fluctuation analysis on the target user according to the emotion analysis result of the user corresponding to each piece of continuous video data and constructing an emotion fluctuation distribution diagram of the target user; the video acquisition intervals corresponding to the plurality of continuous video data are acquired; performing corresponding matching on a user emotion analysis result corresponding to each continuous video data and the video acquisition interval, and performing numerical mapping on the user emotion analysis result to generate a target numerical value corresponding to each user emotion analysis result; constructing an emotion fluctuation distribution map of the target user according to the target numerical value corresponding to each user emotion analysis result; the method comprises the steps of obtaining a mapping value with a mapping relation with a video acquisition interval, carrying out corresponding matching on a user emotion analysis result corresponding to each continuous video data and the video acquisition interval, carrying out value mapping on the user emotion analysis result, and generating a target value corresponding to each user emotion analysis result; the method comprises the steps of inputting a mapping relation between a video acquisition interval and a mapping numerical value in each piece of continuous video data, and carrying out corresponding matching on a user emotion analysis result corresponding to each piece of continuous video data and the video acquisition interval according to the mapping relation;
The judging module is used for judging whether the target user has abnormal emotion fluctuation according to the emotion fluctuation distribution diagram to obtain a judging result; extracting characteristic points of the emotion fluctuation distribution map to obtain target distribution probability; comparing the target distribution probability with a preset probability threshold to obtain a judgment result; if the target distribution probability is greater than or equal to a preset threshold value, determining that the judgment result is that emotion fluctuation abnormality exists; if the target distribution probability is smaller than a preset threshold, determining that the judgment result is that the emotion fluctuation abnormality does not exist; extracting feature points of the emotion fluctuation distribution graph to obtain target distribution probability, pre-constructing an initial feature point extraction model, wherein the initial feature point extraction model comprises a feature extraction module, the feature extraction module comprises a multi-scale convolution layer and a deformable convolution layer, training the initial feature point extraction model by utilizing a training data set to obtain a trained feature point extraction model, and inputting the emotion fluctuation distribution graph into the trained feature point extraction model to obtain target distribution probability;
the identification module is used for acquiring an emotion abnormal record table if the judgment result is that emotion fluctuation is abnormal, and identifying whether the target user has records in the emotion abnormal record table or not;
And the generation module is used for carrying out emotion abnormal warning on the target user if records exist, carrying out data integration analysis on the plurality of continuous video data and the judgment result, and generating a user management scheme.
6. A user data analysis and management device, characterized in that the user data analysis and management device comprises: a memory and at least one processor, the memory having instructions stored therein;
the at least one processor invokes the instructions in the memory to cause the user data analysis and management device to perform the user data analysis and management method of any of claims 1-4.
7. A computer readable storage medium having instructions stored thereon, which when executed by a processor, implement the user data analysis and management method of any of claims 1-4.
CN202310237508.2A 2023-03-14 2023-03-14 User data analysis and management method, device, equipment and storage medium Active CN115953724B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310237508.2A CN115953724B (en) 2023-03-14 2023-03-14 User data analysis and management method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310237508.2A CN115953724B (en) 2023-03-14 2023-03-14 User data analysis and management method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN115953724A CN115953724A (en) 2023-04-11
CN115953724B true CN115953724B (en) 2023-06-16

Family

ID=85893028

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310237508.2A Active CN115953724B (en) 2023-03-14 2023-03-14 User data analysis and management method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115953724B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115376559A (en) * 2022-08-22 2022-11-22 中国工商银行股份有限公司 Emotion recognition method, device and equipment based on audio and video

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110390048A (en) * 2019-06-19 2019-10-29 深圳壹账通智能科技有限公司 Information-pushing method, device, equipment and storage medium based on big data analysis
CN114124724A (en) * 2020-08-11 2022-03-01 中国电信股份有限公司 User behavior analysis method, device, NWDAF and storage medium
CN112667075A (en) * 2020-12-23 2021-04-16 珠海市魅族科技有限公司 Terminal control method, terminal control device, electronic equipment and storage medium
CN113822164A (en) * 2021-08-25 2021-12-21 深圳市安视宝科技有限公司 Dynamic emotion recognition method and device, computer equipment and storage medium
CN114639150A (en) * 2022-03-16 2022-06-17 平安科技(深圳)有限公司 Emotion recognition method and device, computer equipment and storage medium
CN114596619B (en) * 2022-05-09 2022-07-12 深圳市鹰瞳智能技术有限公司 Emotion analysis method, device and equipment based on video stream and storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115376559A (en) * 2022-08-22 2022-11-22 中国工商银行股份有限公司 Emotion recognition method, device and equipment based on audio and video

Also Published As

Publication number Publication date
CN115953724A (en) 2023-04-11

Similar Documents

Publication Publication Date Title
CN109117380B (en) Software quality evaluation method, device, equipment and readable storage medium
CN111555921B (en) Method and device for positioning alarm root cause, computer equipment and storage medium
CN113792453A (en) Partial discharge monitoring system, method and device based on digital twins
US20150269195A1 (en) Model updating apparatus and method
CN111401339B (en) Method and device for identifying age of person in face image and electronic equipment
CN113222149B (en) Model training method, device, equipment and storage medium
CN115688760B (en) Intelligent diagnosis guiding method, device, equipment and storage medium
CN113270197A (en) Health prediction method, system and storage medium based on artificial intelligence
CN111708890B (en) Search term determining method and related device
CN113704389A (en) Data evaluation method and device, computer equipment and storage medium
CN111317458A (en) Blood pressure detection system based on deep learning
CN111061394B (en) Touch force identification method, training method and device of model thereof and electronic system
CN117763126A (en) Knowledge retrieval method, device, storage medium and apparatus
CN115953724B (en) User data analysis and management method, device, equipment and storage medium
CN116485020B (en) Supply chain risk identification early warning method, system and medium based on big data
CN113408210B (en) Deep learning-based non-invasive load decomposition method, system, medium and equipment
CN115662595A (en) User information management method and system based on online diagnosis and treatment system
CN111814523A (en) Human body activity recognition method and device
CN111680572B (en) Dynamic judgment method and system for power grid operation scene
CN115082041A (en) User information management method, device, equipment and storage medium
CN114186646A (en) Block chain abnormal transaction identification method and device, storage medium and electronic equipment
CN113643283A (en) Method, device, equipment and storage medium for detecting aging condition of human body
Luca et al. Anomaly detection using the Poisson process limit for extremes
CN111832815A (en) Scientific research hotspot prediction method and system
CN112612844A (en) Data processing method, device, equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20230524

Address after: Room 410, Building 8, Xinyi Lingyu R&D Center, No. 26 Honglang North 2nd Road, Xingdong Community, Xin'an Street, Bao'an District, Shenzhen City, Guangdong Province, 518000

Applicant after: Shenzhen Silver Bullet Technology Co.,Ltd.

Address before: 713, Building 10, Shenzhen Bay Science and Technology Ecological Park, No. 10 Gaoxin South 9th Road, Gaoxin Community, Yuehai Street, Nanshan District, Shenzhen City, Guangdong Province, 518000

Applicant before: SmartIn Technology (Shenzhen) Co.,Ltd.

GR01 Patent grant