CN117573742B - User behavior pattern mining method, device and storage medium - Google Patents


Info

Publication number
CN117573742B
CN117573742B (application CN202410056221.4A)
Authority
CN
China
Prior art keywords
user
user behavior
data
behavior
mining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410056221.4A
Other languages
Chinese (zh)
Other versions
CN117573742A (en)
Inventor
吕少卿
沈亚军
俞鸣园
王克彦
曹亚曦
孙俊伟
费敏健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Huachuang Video Signal Technology Co Ltd
Original Assignee
Zhejiang Huachuang Video Signal Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Huachuang Video Signal Technology Co Ltd filed Critical Zhejiang Huachuang Video Signal Technology Co Ltd
Priority to CN202410056221.4A priority Critical patent/CN117573742B/en
Publication of CN117573742A publication Critical patent/CN117573742A/en
Application granted granted Critical
Publication of CN117573742B publication Critical patent/CN117573742B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2458Special types of queries, e.g. statistical queries, fuzzy queries or distributed queries
    • G06F16/2465Query processing support for facilitating data mining operations in structured databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/24Querying
    • G06F16/245Query processing
    • G06F16/2457Query processing with adaptation to user needs

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Fuzzy Systems (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Mathematical Physics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The application discloses a user behavior pattern mining method, device, and storage medium, wherein the method comprises the following steps: acquiring user behavior data and device condition data; adjusting the user behavior data based on the device condition data; and mining user behavior patterns based on the adjusted user behavior data. With this method, user behavior data can be analyzed more accurately, the user's behavior patterns can be described more precisely, and more effective decision results can ultimately be obtained.

Description

User behavior pattern mining method, device and storage medium
Technical Field
The present application relates to the field of data mining and user behavior analysis, and in particular, to a method, apparatus, and storage medium for mining a user behavior pattern.
Background
With the popularity of the internet and mobile devices, the large amount of behavior data generated by users while using products or services is recorded, which provides a basis for mining user behavior patterns. User behavior pattern mining is an important data analysis technology: by analyzing the data generated when users use a product or service, it discovers users' behavioral characteristics and regularities, making it possible to understand user needs in depth, provide personalized services, optimize product design, and improve user experience and satisfaction.
However, the volume of user behavior data is often very large, and various anomalies and noise may be present, so that the obtained user behavior data is of low quality or inaccurate. This affects the results of behavior pattern mining, and may ultimately lead to deviations in the decisions generated from the analysis of that data.
Disclosure of Invention
In order to solve the above technical problems, the application adopts the following technical scheme: a user behavior pattern mining method, device, and computer-readable storage medium are provided, to at least solve the problem in the related art that user behavior data of low quality or accuracy affects the results of behavior pattern mining and may ultimately bias the decisions generated from the analysis of that data.
According to an embodiment of the present invention, there is provided a user behavior pattern mining method including:
Acquiring user behavior data and device condition data;
adjusting the user behavior data based on the device condition data;
and mining user behavior patterns based on the adjusted user behavior data.
In order to solve the technical problems, the application adopts a technical scheme that: there is provided an electronic device comprising a memory and a processor, wherein the memory is configured to store a computer program, and the computer program, when executed by the processor, is configured to implement the user behavior pattern mining method in the above technical solution.
In order to solve the technical problems, the application adopts a technical scheme that: there is provided a computer readable storage medium for storing a computer program which, when executed by a processor, is adapted to carry out the user behavior pattern mining method of the above-mentioned technical solution.
Through the above schemes, the application has the following beneficial effects: the user behavior pattern mining method provided by the application takes into account the influence of device condition data on user behavior data, adjusts the user behavior data based on the device condition data, and mines user behavior patterns from the adjusted data. This yields more accurate user behavior data and improves its quality, so that the data can be analyzed more accurately, the user's behavior patterns can be described more precisely, and more effective decision results are ultimately obtained.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art. Wherein:
FIG. 1 is a flowchart illustrating an embodiment of a method for mining a user behavior pattern according to the present application;
FIG. 2 is a flowchart illustrating another embodiment of a method for mining a user behavior pattern according to the present application;
FIG. 3 is a schematic diagram of an embodiment of an electronic device according to the present application;
fig. 4 is a schematic structural diagram of an embodiment of a computer readable storage medium provided by the present application.
Detailed Description
The application is described in further detail below with reference to the drawings and examples. It is specifically noted that the following examples are only for illustrating the present application, but do not limit the scope of the present application. Likewise, the following examples are only some, but not all, of the examples of the present application, and all other examples, which a person of ordinary skill in the art would obtain without making any inventive effort, are within the scope of the present application.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Those of skill in the art will explicitly and implicitly appreciate that the embodiments described herein may be combined with other embodiments.
In scenarios such as video conferencing, online learning, or live interaction, the acquired user behavior data needs to be analyzed so that a decision maker can adjust the decision scheme according to the analysis results and optimize the user experience. Environmental factors such as network status, background noise, device type, and operating system are important to consider in this analysis, because these indirect factors have a significant impact on user behavior and experience. Even for the same user's behavior pattern, differences in the device conditions used may cause the collected behavior data to differ considerably, so that judgments about that behavior pattern become biased.
The application provides a user behavior pattern mining method in which user behavior data and device condition data are obtained, the user behavior data is adjusted using the device condition data, and user behavior patterns are mined from the adjusted data. With this scheme, more accurate user behavior data can be obtained and its quality improved, so that the data can be analyzed more accurately, the user's behavior patterns can be described more precisely, and more effective decision results are ultimately obtained.
Referring to fig. 1, fig. 1 is a flowchart illustrating an embodiment of a method for mining a user behavior pattern according to the present application. It should be noted that, if there are substantially the same results, the present embodiment is not limited to the flow sequence shown in fig. 1. As shown in fig. 1, the present embodiment includes:
S110: user behavior data and device condition data are obtained.
Data related to the study object can be collected, including the user's behavior records, transaction history, social media activity, and the like. Comprehensive user behavior data can be obtained through trackers or recorders embedded in a product or service, log file records, collection of public data, third-party data sources, or user surveys and questionnaires. The quality and diversity of the obtained data are critical for mining accurate behavior patterns and for understanding user needs and behavioral characteristics in depth.
In an example, taking a user as an example in a video conference scenario, the acquired user behavior data includes:
(1) Camera on-rate: indicates whether the user is genuinely engaged in the meeting;
(2) Microphone usage count: reflects the user's degree of participation;
(3) User login and exit times: indicate the user's participation time;
(4) User click rate: e.g., the number of clicks on PPT, video, etc.;
(5) Number of questions asked in the meeting: reflects the user's focus;
(6) Number of mutes in the meeting: indicates whether the user is subject to other disturbances;
(7) User mute duration: the total time the user is muted during the meeting;
(8) User video-off duration: the total time the user keeps video turned off;
(9) Number of user interactions: e.g., the total number of clicks on "raise hand", "like", and similar functions;
(10) Number of questions submitted by the user: the number of times the user raised a question in the meeting;
(11) Screen-gaze duration of the user: whether the user is looking at the screen, which can be captured via the camera;
(12) Number of background switches by the user: whether the user frequently switches virtual backgrounds;
(13) Number of meeting interruptions for the user: whether the meeting is often interrupted, e.g., by network drops or application crashes;
(14) Number of meeting reconnections for the user: whether the user often needs to reconnect to the meeting;
(15) Number of feedback submissions by the user: the user's satisfaction with and suggestions for the meeting;
(16) Number of screen touches by the user: whether the user frequently interacts with the meeting interface;
(17) Number of screen shares by the user: may reflect the importance of the topic.
Of course, the user behavior data to be obtained includes, but is not limited to, the items above; the user's typing speed, number of screen-brightness adjustments, and/or number of volume adjustments during the conference can also be collected, and data analysts can select which user behavior data to collect according to decision requirements.
Likewise, various means may be employed to obtain device condition data, including establishing a device information base, using an application programming interface (API), analyzing operating system logs, obtaining sensor data, and analyzing mobile application logs and data. The obtained device condition data may include basic device information, network state, and/or operating state, and helps in understanding the device's state and usage conditions in the user's environment.
In an example, in any user usage scenario requiring decision, the device condition data to be acquired may include:
(1) Device type: whether the user uses a mobile phone, a computer, or another device;
(2) Operating system of the device: provides a reference for technical support;
(3) Network status of the device: whether the user's network environment is stable;
(4) Background noise: the quality of the user's environment.
Of course, the device condition data to be acquired includes, but is not limited to, the items above; device condition data such as network state and device performance can also be obtained indirectly by collecting the user's video or audio quality. Likewise, data analysts can select which device condition data to collect according to decision requirements.
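As an illustrative sketch (not part of the patent), the device condition data described above might be gathered into one record per user; the class and field names here are assumptions, not terms from the application:

```python
from dataclasses import dataclass

@dataclass
class DeviceCondition:
    """Per-user device condition record (hypothetical field names)."""
    device_type: str         # e.g. "phone", "pc", "tablet"
    operating_system: str    # e.g. "Windows", "Android"
    network_delay_ms: float  # network status, quantified by delay
    bandwidth_mbps: float    # network status, quantified by bandwidth
    noise_db: float          # background noise level in decibels

# One collected sample for a single conference participant.
cond = DeviceCondition("pc", "Windows", 45.0, 20.0, 38.5)
print(cond.device_type, cond.noise_db)
```

Such a record would then feed the quantization and adjustment steps described in S120 below.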
S120: the user behavior data is adjusted based on the device condition data.
To ensure the accuracy and reliability of the analysis, the collected data is cleaned and preprocessed, including cleaning (eliminating erroneous and incomplete data), conversion (normalizing or standardizing the data so it can be compared), and/or quantification, possibly including discretization (e.g., converting continuous variables into categorical variables).
In a specific embodiment, before the user behavior data is adjusted based on the device condition data, the device condition data and user behavior data acquired in step S110 need to be preprocessed, since different environmental factors such as device conditions have different quantization scales. In one example, the environmental condition indicators such as device conditions, together with the user behavior data, are quantified: network status may be quantified by delay and/or bandwidth; background noise by decibel level; and device type and operating system identified by an identifier (e.g., a string). Likewise, user behavior is quantified as measurable indicators such as page browsing time, click rate, or number of erroneous clicks. Quantization of further user behavior data and device condition data is shown below as items a–t.
a. Camera on-rate: quantified as the proportion of the meeting duration during which the camera is on; for example, if a meeting lasts 60 minutes and the user's camera is on for 30 minutes, the camera on-rate is 50%.
b. Microphone usage count: the number of times the user turns on the microphone in the meeting.
c. Login and exit times: recording when the user joins and leaves the meeting allows analysis of the user's engagement and the meeting's appeal.
d. Click rate: the number of times the user clicks on the screen or on particular functions in the meeting.
e. Number of questions asked: the number of times the user raises a question in the meeting.
f. Number of mutes: the number of times the user clicks the mute button in the meeting.
g. Mute duration: the total time the user is muted over the whole meeting.
h. Video-off duration: the total time the user keeps video turned off.
i. Interaction count: the frequency with which the user participates in interactions such as speaking, answering questions, or voting.
j. Questions submitted: the number of questions the user submits.
k. Screen-gaze duration: usually requires eye-tracking techniques to determine when the user is looking at the screen.
l. Background switches: the number of times the user changes the virtual background in the meeting.
m. Meeting interruptions: the number of times the user drops out of or actively leaves the meeting.
n. Meeting reconnections: the number of times the user attempts to rejoin the meeting.
o. Feedback submissions: the number of times the user submits feedback in the meeting.
p. Screen touches: the frequency with which the user touches the screen, particularly on touch-screen devices.
q. Background noise detection: typically identified by running an audio analysis algorithm, possibly also computing the frequency and duration of the noise.
r. Network status: assessed by monitoring the quality of the user's network connection, such as delay, bandwidth, or packet loss rate.
s. Device type: recorded by identifying the device (e.g., smartphone, tablet, or personal computer) with which the user joins the meeting.
t. Operating system: recorded by detecting the operating system of the user's device (e.g., Windows, macOS, Linux, or Android).
Because different environmental conditions and user behavior data have different quantization scales, preprocessing may also include data normalization before user behavior pattern mining, to ensure that all data follow the same metric. For example, to ensure consistency and integrity of the analysis and to avoid information islands or analysis errors caused by scattered data, data from different channels and formats are integrated into a common platform or database, and metrics such as the data's format, units, and timestamps are unified. The user behavior data can then be normalized using formula (1), where x' is the normalized value and x is the original value.
x' = (x - min(x)) / (max(x) - min(x))    (1)
For example, in a scenario where users participate in video conferences, user activity data in different conferences is measured and recorded according to the same criteria. If data is collected during a specific time period or event (such as one conference), data integration should also consider the consistency of the time factor: data should be analyzed over the same time period to ensure fairness and accuracy, e.g., by expressing the number of utterances, participation, and so on per unit of time. Data normalization also ensures that meaningful comparisons can be made even when data comes from different scenes, e.g., different video conferences, because the metrics and recording criteria have been unified and the data is therefore comparable.
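A minimal sketch (not from the patent text) of the min-max normalization in formula (1), applied to one raw behavior indicator across users:

```python
def min_max_normalize(values):
    """Scale values to [0, 1] via formula (1): x' = (x - min) / (max - min)."""
    lo, hi = min(values), max(values)
    if hi == lo:                       # constant series: map everything to 0.0
        return [0.0 for _ in values]
    return [(x - lo) / (hi - lo) for x in values]

# e.g. number of utterances per meeting for five users
counts = [2, 5, 8, 11, 14]
print(min_max_normalize(counts))  # [0.0, 0.25, 0.5, 0.75, 1.0]
```

After this step every indicator lies on the same [0, 1] scale, so indicators with different native units (seconds, counts, decibels) become comparable.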
When a decision maker evaluates the quality or efficiency of the user behavior itself, the influence of environmental variables such as device conditions must be eliminated. In one implementation, the user behavior data is adjusted based on the device condition data before pattern mining, so that environmental influences are removed before further analysis. For example, if a decision maker wants to evaluate the teaching effect of an online learning platform, they may first need to adjust students' engagement data according to their network quality, to ensure the data reflects the effect of the teaching content rather than the influence of network conditions.
In a specific embodiment, before mining the user behavior patterns, a statistical or machine learning method may be used to analyze the relationship between environmental conditions such as device conditions and user behavior, and the device condition data is used to adjust the user behavior data through an environment-behavior association model. For example, a regression model may be established in which the device condition data are the independent variables and the user behavior score is the predicted dependent variable, representing the user's overall behavior under specific environmental conditions, i.e., the user behavior data obtained by adjusting the original data based on the device condition data. The calculation is shown in formula (2):
user behavior score = β0 + β1 × network delay + β2 × background noise level + β3 × device type score + β4 × operating system score + ε    (2)
where β0 is an intercept term representing the expected value of the user behavior score when all independent variables are 0, determined from some original user behavior data; β1, β2, β3, β4 are regression coefficients representing the effect of each independent variable on the user behavior score, e.g., β1 the effect of network delay and β2 the effect of background noise level; and ε is an error term representing the part the model cannot explain, which may include unconsidered factors or random variation.
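A hedged sketch of how formula (2) might be fit and then used to strip out the environment's contribution to an observed behavior score. The least-squares fit and the residual-based adjustment are one plausible reading of this embodiment, not the patent's prescribed implementation, and all numbers are made up:

```python
import numpy as np

# Rows: users. Columns: network delay (ms), noise level (dB),
# device type score, operating system score.
X = np.array([[50.,  30., 1., 1.],
              [120., 55., 0., 1.],
              [80.,  40., 1., 0.],
              [200., 60., 0., 0.],
              [30.,  25., 1., 1.],
              [90.,  45., 0., 1.]])
y = np.array([0.9, 0.5, 0.7, 0.3, 0.95, 0.6])  # observed behavior scores

# Fit formula (2): y = b0 + b1*delay + b2*noise + b3*device + b4*os + eps
A = np.column_stack([np.ones(len(X)), X])       # prepend intercept column
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Environment-adjusted score: subtract the part of the score explained by
# device conditions, keeping the intercept plus the user-specific residual.
adjusted = y - A[:, 1:] @ coef[1:]
print(np.round(adjusted, 3))
```

Users whose low raw scores are fully explained by poor device conditions end up with similar adjusted scores to users on good devices, which is the stated goal of the adjustment step.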
In another implementation, the overall impact of the environment on user experience or satisfaction may be evaluated in the calculation of the final behavior-pattern composite score, by using an environmental impact index determined from the device condition data as an adjustment factor. In this implementation the decision maker focuses on the overall quality of the user experience, including the influence of environmental factors, so the composite score of the user behavior pattern is adjusted by treating environmental factors such as device conditions as a whole, yielding a more accurate decision result. For example, when assessing user satisfaction with video conferencing software, factors such as network quality and device performance may be considered in the final scoring, since these factors directly affect the user experience and would otherwise bias the decision.
In a specific embodiment, an environmental impact index may be calculated for each user directly based on the following equation (3), which comprehensively considers the impact of all environmental conditions on user behavior.
environmental impact index = w1 × normalized network status score + w2 × normalized background noise score + w3 × normalized device type score + w4 × normalized operating system score    (3)
Where w1, w2, w3, w4 are weight parameters, representing the relative importance of each condition to the overall user experience.
The environmental impact index is used as an adjustment factor: it is fused with the final behavior-pattern composite score by weighting, or added as an adjustment value, so that the composite score calculated from all of each user's behavior data is adjusted, yielding a more accurate indicator of user satisfaction or efficiency.
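An illustrative sketch of formula (3) and the weighted-fusion adjustment just described. The weights w1–w4 and the fusion weight `gamma` are made-up values, not from the patent:

```python
def environmental_impact_index(net, noise, device, os_, w=(0.4, 0.2, 0.2, 0.2)):
    """Formula (3): weighted sum of normalized condition scores, all in [0, 1]."""
    return w[0] * net + w[1] * noise + w[2] * device + w[3] * os_

def adjusted_composite(behavior_score, impact, gamma=0.3):
    """Weighted fusion of the behavior-pattern composite score with the
    environmental impact index (gamma is an assumed fusion weight)."""
    return (1 - gamma) * behavior_score + gamma * impact

idx = environmental_impact_index(0.8, 0.5, 1.0, 1.0)
print(idx)                           # ≈ 0.82
print(adjusted_composite(0.6, idx))  # ≈ 0.666
```

Raising `gamma` gives the environment more weight in the final score, which matches the embodiment where the decision maker cares about overall experience quality rather than behavior in isolation.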
By adjusting the user behavior data for environmental factors such as device conditions in this way, user behavior can be understood more comprehensively and behavior changes caused by specific environmental conditions can be identified, so that targeted measures can be taken to improve the user experience; for example, if network delay is found to have a large impact on user experience, investment in network infrastructure can be prioritized.
S130: and mining the user behavior mode based on the adjusted user behavior data.
Behavior pattern mining is a data analysis method that aims to discover and understand the behavior patterns of people or the operation patterns of a system, and can be applied to fields such as marketing and online learning. Common methods include association rule mining, cluster analysis, classification, and sequential pattern mining, which identify patterns or associations hidden in data; an appropriate behavior pattern mining method should be selected according to the nature of the problem and the characteristics of the data.
The user behavior data adjusted based on the device condition data may be mined by association rule mining and/or sequence pattern mining, as will be described in detail below in connection with examples.
1. Association rule mining is a common behavior pattern mining method, mainly used to find frequent, meaningful, or unexpected associations between variables, such as "when user A speaks, does user B tend to turn on the camera?" or "when the discussion turns to topic X, does the number of questions increase?". In the context of video conferencing behavior analysis, association rules can help reveal specific patterns and associations between user behaviors. For example, the association between "camera on-rate" and "microphone usage count" is typical: if most users who turn on the camera also tend to use the microphone, this rule may indicate that in a video conference, users prefer voice communication while communicating visually.
Common association rule mining algorithms, such as the Apriori algorithm, the FP-Growth (Frequent Pattern Growth) algorithm, and the Eclat algorithm, are used to find association relationships between user behaviors in the dataset. In one example, association rules are generated using the Apriori algorithm, whose core idea is to form (k+1)-item candidate item sets by joining k-item frequent item sets. Taking the video conference scenario as an example, to determine whether the microphone is used frequently when the camera is frequently turned on, the following steps are performed:
1) The support of each item set is calculated separately and frequent item sets are identified.
A frequent item set is a set of items that occurs frequently in the data set, with a support exceeding a predetermined minimum support threshold. The support indicates how frequently an item set occurs across all transactions; for example, a support threshold of 0.5 means an item set must occur in at least 50% of transactions. The calculation is shown in formula (4). For example, to know how frequently "camera on" and "microphone on" occur simultaneously across all video conferences, the support is computed as in formula (5).
support(X) = (number of transactions containing X) / (total number of transactions)    (4)
support({camera on, microphone on}) = (number of conferences in which both occur) / (total number of conferences)    (5)
2) Item sets meeting the minimum support threshold are joined into (k+1)-item candidate item sets.
Candidate item sets are generated from the frequent item sets identified in step 1). If the camera being on and the microphone being used frequently occur together, they form a frequent item set; that is, if the user turns on the camera and uses the microphone in most conferences, "camera on" and "microphone use" form a frequent item set, so {"camera on", "microphone on"} is a 2-item candidate set.
3) The support of the (k+1)-item candidate item sets is calculated, and the item sets meeting the minimum support threshold are retained.
For the candidate item sets generated in step 2), the data set is scanned again to calculate their support, as shown in formula (6), and the candidate item sets above the support threshold are kept as new frequent item sets. For example, if the user turns on the camera and uses the microphone in 80 of 100 conferences, the support of this item set is 0.8, or 80%.
support(candidate item set) = (number of transactions containing the candidate item set) / (total number of transactions)    (6)
The process of steps 1) -3) above is repeated until no new frequent item sets can be found.
4) Based on the frequent item set, association rules are constructed.
The confidence represents the conditional probability that the consequent occurs given that the antecedent occurs, reflecting the accuracy of the association rule. It is calculated as: confidence(A => B) = (number of transactions containing both A and B) / (number of transactions containing A), i.e., the probability that rule conclusion B holds when rule precondition A is satisfied.
For example, suppose we want to know the probability that the microphone is on when the camera is on. If the "camera on-rate" is found to be high and the "microphone usage count" is also high, the association rule "camera on-rate is high => microphone usage count is high" may be generated, with confidence as shown in formula (7).
confidence(camera on-rate high => microphone usage high) = support({camera on, microphone on}) / support({camera on})    (7)
5) Association rules are evaluated.
The generated rules need to be evaluated for reliability and importance. This is typically done by calculating each association rule's confidence, which reflects its accuracy, and its lift, which reflects its strength, i.e., the ratio of the probability that the consequent occurs under the rule's conditions to the probability that it occurs in the whole data set. The lift is calculated as: lift(A => B) = confidence(A => B) / (probability of B occurring in the data set). If the lift is greater than 1, A and B are positively correlated; if it is less than 1, they are negatively correlated; if it is equal to 1, there is no association between A and B.
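The support, confidence, and lift computations in steps 1)–5) can be sketched as follows on toy conference data. For brevity the item sets are limited to pairs; a full Apriori implementation would iterate the join-and-prune loop to larger k:

```python
# Each transaction is the set of behaviors observed in one meeting.
meetings = [
    {"camera_on", "mic_used", "question_asked"},
    {"camera_on", "mic_used"},
    {"camera_on"},
    {"camera_on", "mic_used"},
    {"mic_used"},
]

def support(itemset):
    """Formulas (4)/(6): fraction of meetings containing every item."""
    return sum(itemset <= m for m in meetings) / len(meetings)

def confidence(a, b):
    """Confidence of rule a => b: support(a ∪ b) / support(a)."""
    return support(a | b) / support(a)

def lift(a, b):
    """Lift of rule a => b: confidence(a => b) / support(b)."""
    return confidence(a, b) / support(b)

a, b = {"camera_on"}, {"mic_used"}
print(support(a | b))    # ≈ 0.6   (3 of 5 meetings)
print(confidence(a, b))  # ≈ 0.75  (3 of the 4 camera-on meetings)
print(lift(a, b))        # ≈ 0.94, i.e. slightly below 1 in this toy data
```

In this small sample the lift is just under 1, so "camera on" and "microphone used" would be judged weakly negatively associated; on real data the thresholds in steps 1)–3) would prune such weak rules before evaluation.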
Behavior patterns refer to unique patterns of behavior identified from data, and may refer to a group or sequence of behaviors that typically occur in a particular order or in a particular context. In order to more accurately integrate all the behavior patterns of the user, the confidence and/or support degree can be calculated based on the original or adjusted user behavior data, and a behavior pattern score is defined for quantifying the user behavior patterns so as to reflect the influence of the patterns on the user experience or satisfaction degree.
Specifically, in the user behavior pattern mining, two candidate item sets in the frequent item set may be regarded as one behavior pair, and the behavior pair includes at least two different user behaviors, and may also include at least one of user behavior and device condition data. And further screening out the behavior pairs with the support degree larger than a preset threshold value by calculating the support degree of each behavior pair, determining the confidence degree of the association rule corresponding to each screened out behavior pair, and finally calculating the behavior pattern score based on the confidence degree and/or the support degree corresponding to the behavior pair.
In one example, the support of the filtered user behaviors, calculated based on the adjusted user behavior data, serves as the pattern score of each user behavior; the degree of behavior association between behavior pairs is determined from the confidence of each pair of behaviors; and the user's pattern score is then determined from the behavior pattern scores and the association degrees, as shown in formula (8).
Pattern score = Σ (i = 1 to n) [α_i × pattern score_i] + β × behavior association degree (8)
Wherein n is the number of screened behavior patterns; α_i is the weight of the i-th behavior pattern; pattern score_i is the pattern score of the i-th behavior (i.e., the support of that user behavior), which relates to factors such as the frequency and duration of the behavior pattern and can be represented by the support of the filtered user behaviors calculated from the adjusted user behavior data; β is the weight of the behavior association degree; the behavior association degree measures the correlation between behaviors, relates to factors such as behavior similarity and time interval, and can be represented by the confidence of each behavior pair, as shown in formula (9).
Behavior association degree = Σ (j = 1 to m) [γ_j × confidence_j] (9)
Where m is the number of behavior pairs considered; γ_j is the weight of the j-th behavior pair; confidence_j is the confidence of the j-th behavior pair, which may relate to factors such as the time interval and frequency of the two behaviors. For example, if a user is found to frequently turn on the camera after hearing a certain keyword, the confidence between these two behaviors will be high.
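Formulas (8) and (9) reduce to two weighted sums; a minimal sketch, with all weights, supports, and confidences assumed for illustration:

```python
def behavior_association(pair_weights, pair_confidences):
    # formula (9): weighted sum of the confidences of the m behavior pairs
    return sum(g * c for g, c in zip(pair_weights, pair_confidences))

def user_pattern_score(pattern_weights, pattern_scores, beta, association):
    # formula (8): weighted behavior-pattern scores (supports) plus the
    # weighted behavior association degree
    return sum(a * s for a, s in zip(pattern_weights, pattern_scores)) + beta * association

# Hypothetical numbers: two patterns with supports 0.8 and 0.5,
# one behavior pair with confidence 0.75 and weight 1.0.
assoc = behavior_association([1.0], [0.75])                     # 0.75
score = user_pattern_score([0.6, 0.4], [0.8, 0.5], 0.3, assoc)  # 0.68 + 0.225
```

The pattern weights and β control how much each frequent behavior versus the inter-behavior associations contribute to the final user score.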
6) The association rules are visualized and applied.
The association rule may be visualized using a graph or chart before the association rule is applied, for example, a scatter diagram may be used to represent the relationship of "camera on rate" and "microphone use number". Applying the mined association rules to the actual meeting scenario, for example, if it is found that when a certain issue is mentioned, the number of questions of the user increases, then the host may be reminded to encourage the user to ask questions when the issue begins; or find that the user has interacted particularly frequently with a certain topic, it may be considered to allocate more time for this topic.
According to the calculated behavior pattern scores, users can be classified, given recommendations, and so on. For example, if a user's behavior pattern score is particularly high, the user can be considered an active user, and more conference content can be recommended to that user.
Association rule mining can help us understand complex relationships between variables in data and provide data support for decisions, and it can also be combined with other data mining techniques, such as classification, clustering, and regression, to gain more comprehensive and deep insight. Moreover, association rules may change over time and as user behavior evolves, so association rule mining algorithms need to be re-run periodically to ensure that rules are always up-to-date and most relevant. In the video conference scene, association rule mining can help us understand the patterns of user behavior, so as to optimize product design and improve user satisfaction.
2. Sequence pattern mining aims at finding out that there is a certain pattern or rule from time-series data or ordered data, for example, whether the user has a higher tendency to turn on the camera after hearing a certain keyword. These patterns can be used for tasks such as prediction, classification, clustering, etc., and common sequence pattern mining algorithms include: GSP (Generalized Sequential Pattern) algorithm, prefixSpan algorithm, SPAM algorithm, etc.
In one embodiment, the sequence pattern mining steps are as follows:
1) Data preprocessing: the user behavior data is converted into a format suitable for sequence pattern mining. The behavior sequence of each user may be represented as a list of time stamps, where each time stamp corresponds to one or more behaviors.
2) Defining a time window: to capture the short-term behavior pattern of the user, a time window is defined, for example 5 minutes. Within this time window, all the behaviors of the user are considered as one sequence, e.g., time window sequence = { behavior 1, behavior 2, …, behavior n }.
3) Calculating the sequence support degree: the sequence support represents the frequency of occurrence of a sequence in all time windows, and the calculation formula is: sequence support = number of times a particular sequence occurs/total time window number; for example, if one wants to know the support of turning on the camera within 5 minutes after hearing a certain keyword: support (keyword→camera on) =number of times the camera is turned on within 5 minutes after the keyword/total number of times the keyword appears.
4) Setting a support degree and a confidence degree threshold value: similar to association rule mining, support and confidence thresholds may be set to screen out meaningful sequence patterns.
5) Using a sequence pattern mining algorithm: if the GSP algorithm is used, starting from single behaviors, the sequence length is gradually increased until all sequence patterns meeting the support and confidence thresholds are found.
6) Evaluating the sequence pattern: not all sequence patterns are useful and the decision maker needs to evaluate the actual meaning and application value of these patterns. This mode may be valuable to an organizer if, for example, the user is found to ask questions frequently after hearing a certain keyword.
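The time-window sequence support from step 3) can be sketched as follows (timestamps and behavior names are invented for the example; the 5-minute window follows the text):

```python
from datetime import datetime, timedelta

def sequence_support(events, trigger, follow, window=timedelta(minutes=5)):
    """Fraction of `trigger` occurrences followed by `follow` within the window,
    matching: support(keyword -> camera on) = follow-ups in window / trigger count."""
    trigger_times = [t for t, b in events if b == trigger]
    hits = sum(1 for t0 in trigger_times
               if any(b == follow and t0 < t <= t0 + window for t, b in events))
    return hits / len(trigger_times)

# Hypothetical event log for one user: (timestamp, behavior) pairs.
day = datetime(2024, 1, 1)
events = [
    (day.replace(hour=10, minute=0), "keyword"),
    (day.replace(hour=10, minute=3), "camera_on"),   # within 5 minutes
    (day.replace(hour=10, minute=20), "keyword"),
    (day.replace(hour=10, minute=30), "camera_on"),  # outside the window
]
support = sequence_support(events, "keyword", "camera_on")  # 1 of 2 triggers = 0.5
```

Only the first keyword occurrence is followed by a camera-on event inside the window, so the sequence support is 0.5 and would be compared against the threshold from step 4).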
To integrate all behavior sequences more accurately, confidence and/or support may be calculated based on the original or adjusted user behavior data, and a sequence pattern scoring formula may be defined to quantify the user behavior patterns, as shown in formula (10):
Sequence pattern score = Σ (q = 1 to p) [σ_q × pattern score_q] + λ × sequence association degree (10)
Where p is the number of behavior sequence patterns; σ_q is the weight of the q-th behavior sequence pattern; pattern score_q is the score of the q-th behavior sequence pattern, related to factors such as its frequency and duration; λ is the weight of the behavior sequence association degree.
7) Application sequence pattern: the mined sequence mode can be applied to actual conference scenes, and the user can be classified, recommended and the like. For example, if a user is found to frequently ask questions after hearing a certain keyword, the meeting organizer may encourage the user to ask questions when referring to the keyword.
3. In order to analyze the user behavior patterns more accurately, the user behavior patterns can be mined by a method combining association rule mining and sequence pattern mining, and comprehensive behavior scores of the user can be obtained by a method of weighting based on the behavior pattern scores and the sequence pattern scores. The calculation formula of the comprehensive score is shown in formula (11):
Composite score Stotal = Ws × Ss + Wb × Sb (11)
Where Ws + Wb = 1 (ensuring that the weights sum to 1, keeping the scales consistent); Ss represents the sequence pattern score obtained by sequence pattern mining; Sb represents the behavior pattern score obtained by association rule mining; Ws represents the weight of the sequence pattern score; Wb represents the weight of the behavior pattern score.
It should be noted that, to ensure that the scores are measured under the same criteria, they may be normalized to a common scale (e.g., between 0 and 1) by a normalization function norm(x); the normalized scores are then Ss' = norm(Ss) and Sb' = norm(Sb), giving the composite scoring formula (12):
Composite score Stotal' = Ws × Ss' + Wb × Sb' (12)
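Formula (12) can be sketched in a few lines of Python; the min-max normalization to a 0-100 input range and the equal weights are assumptions for the example:

```python
def norm(x, lo=0.0, hi=100.0):
    # min-max normalization onto a common 0-1 scale (assumed 0-100 raw range)
    return (x - lo) / (hi - lo)

def composite_score(ss, sb, ws=0.5, wb=0.5):
    # formula (12): Stotal' = Ws * norm(Ss) + Wb * norm(Sb), with Ws + Wb = 1
    assert abs(ws + wb - 1.0) < 1e-9, "weights must sum to 1"
    return ws * norm(ss) + wb * norm(sb)

s = composite_score(80, 70)  # raw scores Ss = 80, Sb = 70
```

With equal weights, raw scores of 80 and 70 normalize to 0.8 and 0.7 and combine to a composite score of 0.75.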
In one embodiment, it is assumed that the decision maker operates an online learning platform, and sequence pattern mining and association rule mining are performed simultaneously to understand how users interact and use the online learning platform.
(1) Through sequential pattern mining, the decision maker finds:
a) Most users tend to watch video lessons between 3 pm and 5 pm;
b) The user will typically try the relevant quiz immediately after completing the video lesson.
Calculation was performed by the above method to obtain a sequence pattern score ss=80.
(2) Through association rule mining, the decision maker finds:
a) Users frequently ask questions in video courses, indicating that they have encountered challenges in the course;
b) Users have a higher engagement with interactive content (e.g., they prefer to participate in a quiz rather than merely watching video).
Calculation by the above method gives a behavior pattern score sb=70.
(3) To optimize the learning experience of the user, the decision maker may combine these findings, possibly leading to such policy decisions:
a) In view of the user being more active in the afternoon and tending to take a quiz immediately after watching the video, we decided to post new video content every afternoon and ensure that each video post was accompanied by a quiz.
B) Since users frequently ask questions during a course, we can infer that video content may be somewhat difficult, and therefore decide to add more interactive elements and coaching resources, such as real-time questions, forums, or additional instructional material.
The sequence pattern score and behavior pattern score above are normalized to the range 0 to 1. Assuming a simple linear transformation, the normalized scores are Ss' = 0.8 and Sb' = 0.7. If both modes are considered equally important, the weights can be set to Ws = 0.5 and Wb = 0.5, and the composite score is Stotal = 0.5 × 0.8 + 0.5 × 0.7 = 0.75.
This composite score indicates that the overall active engagement of the user with the platform is high. Thus, the platform may decide to continue with the current user engagement policy or consider slight optimization adjustments to further improve user satisfaction and engagement.
Through the combination of the sequence pattern mining and the association rule mining, the user behavior can be more comprehensively understood, so that more intelligent and effective decisions can be made, and the user experience is optimized.
4. When determining the composite score, an adjustment factor may be added to modify the composite score to reflect the influence of environmental factors, such as device conditions, in consideration of the influence of the environmental factors on the user's behavior. At this time, the final comprehensive scoring formula is shown as formula (13):
Composite score = w1 × behavior pattern score + w2 × sequence pattern score + adjustment factor (13)
Wherein w1 and w2 are weight parameters; the adjustment factor is calculated according to the environmental factors, and the specific calculation mode is detailed in the aforementioned step S120, which is not described herein.
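Since the concrete calculation of the adjustment factor is deferred to step S120, the sketch below assumes one plausible form — a weighted sum of normalized device-condition indicators — purely for illustration; the condition names, scores, and weights are invented:

```python
def environmental_adjustment(condition_scores, condition_weights):
    """Hypothetical adjustment factor: weighted sum of normalized device-condition
    indicators (network state, background noise, device type, operating system)."""
    return sum(condition_weights[k] * v for k, v in condition_scores.items())

def final_composite(behavior_score, sequence_score, w1, w2, adjustment):
    # formula (13): weighted pattern scores plus the environmental adjustment factor
    return w1 * behavior_score + w2 * sequence_score + adjustment

adj = environmental_adjustment(
    {"network": 0.9, "noise": 0.6, "device": 0.8, "os": 1.0},
    {"network": 0.05, "noise": 0.02, "device": 0.02, "os": 0.01},
)
total = final_composite(0.7, 0.8, 0.5, 0.5, adj)  # 0.35 + 0.40 + adjustment
```

Small condition weights keep the adjustment a correction term rather than letting the environment dominate the behavior and sequence scores.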
In one embodiment, such composite scores may be used to analyze the learning behavior of students, again taking the online educational platform scenario as an example. The behavioral pattern score may be based on the student's activities on the platform (e.g., watching video, participating in discussions); and the sequence pattern score may be based on a time sequence of student activities (e.g., regularity of learning time); the composite score can help teachers to learn learning habits and demands of students, thereby providing more personalized teaching support. Meanwhile, the scores can be correspondingly adjusted to reflect the learning effect of the students more accurately by considering the influence of the network condition and the equipment type on the learning experience.
After the user behavior pattern mining method discovers meaningful patterns and associations from the user behavior data, such as a user's typical behavior patterns, abnormal behaviors, and frequently occurring behavior sequences, intervention of domain expertise is usually required to interpret the mined behavior patterns and understand the meaning behind them, so as to ensure that the interpretation of the patterns is reasonable and feasible and can guide decision and policy making.
Referring to fig. 2, fig. 2 is a flowchart of another embodiment of the user behavior pattern mining method of the present application. It should be noted that, provided substantially the same results are obtained, this embodiment is not limited to the flow sequence shown in fig. 2. The embodiment includes:
s201: raw user behavior data and device condition data are collected.
Taking the example of a user in a video conference scenario, the user behavior data and device condition data that may need to be collected are shown in the following table, and may further include click streams, user generated content, interaction time stamps, and other user behavior data.
S202: data cleaning and preprocessing.
After the original user behavior data and device condition data are collected, outliers and noise are removed to ensure the quality and consistency of the data. For example, outliers and noise may be identified using statistical methods or visualization tools (histograms, box plots, etc.) combined with domain expertise, while outliers, missing values, and duplicate data are handled.
Converting the collected raw data into a format which can be interpreted by pattern mining, standardizing the numerical data to eliminate dimension differences among different features, for example, the following a-q shows a method for quantifying and standardizing part of user behavior data, so that the metering rules of the data are under the same time period or unified preset conditions.
a) Camera on rate = camera on duration / conference total duration × 100%.
b) Video switching rate = video switching times / conference total duration × 100%; the number of video switches indicates whether the user frequently switches the video source or view.
c) Volume adjustment rate = volume adjustment times / conference total duration × 100%; the number of volume adjustments reflects whether the user adjusts the volume frequently, which may relate to the audio quality of the conference.
d) Typing speed = total number of words / typing duration; analyzing a user's typing speed in the meeting indicates whether the user is busy with other things.
e) Video quality index = high-definition video duration / conference total duration × 100%; indicates the user's video definition, related to the user's network status and device capabilities.
f) Audio quality index = clear audio duration / conference total duration × 100%; indicates the user's audio clarity, related to the user's network status and device performance.
g) Feedback rate = feedback count / conference participant count × 100%; indicates user satisfaction and suggestions for the meeting.
h) Brightness adjustment rate = brightness adjustment times / conference total duration × 100%; indicates whether the user frequently adjusts screen brightness, related to the visual content of the conference.
i) Interrupt rate = interrupt count / conference total duration × 100%; indicates whether the conference is often interrupted, e.g., by network drops or application crashes.
j) Reconnection rate = reconnection count / conference total duration × 100%; indicates whether the user often needs to reconnect to the meeting.
k) Touch interaction rate = touch count / conference total duration × 100%; indicates whether the user frequently interacts with the meeting interface.
l) Silence rate = user silence duration / conference total duration × 100%.
m) Video closure rate = user video closure duration / conference total duration × 100%.
n) Interaction rate = interaction count / conference total duration × 100%; e.g., the total count of "hand-raising", "like", and similar functions.
o) Question rate = questions submitted / conference total duration × 100%.
p) Fixation rate = screen fixation duration / conference total duration × 100%; whether the user is looking at the screen can be captured by the camera.
q) Background switching rate = background switching times / conference total duration × 100%; indicates whether the user frequently switches virtual backgrounds.
S203: Screening the user behavior feature data.
Features with high correlation and representativeness are selected from the original user behavior data; different user behavior features can be selected according to the decision maker's purposes, and the screened features should represent the user's basic behavior patterns and preferences. Statistical analysis and visualization tools can help in understanding the relationship between each feature and user behavior. For example, if the decision maker wants to know a user's topic interest, it can be reflected by the user's question count, click rate, and so on.
These features may be from the user behavior data list in step S201, or any data selected according to the actual requirements and the background. Feature extraction may also be performed by Principal Component Analysis (PCA), time series analysis, frequency domain features, etc., and the most relevant features may be further selected from the data set collected in step S201 based on statistical analysis, correlation analysis, or other feature selection techniques, to facilitate model learning.
Information gain or chi-square tests may also be used to evaluate the importance of each user behavior feature during feature selection. Information gain is an index used in decision tree algorithms to evaluate feature importance; it measures how much the uncertainty of the target variable decreases once a feature is known. Specifically, for each feature, the entropy (uncertainty) of the target variable given the feature is calculated; the information gain is the entropy of the target variable without the feature minus the entropy with it, i.e., the reduction in entropy. The chi-square test evaluates the correlation between two categorical variables; in feature selection, it tests the independence between each feature and the target variable. Specifically, for each feature, a contingency table is constructed to show the relationship between two or more categorical variables, and chi-square statistics, which reflect the deviation between observed and expected values, are calculated to determine whether there is a significant correlation between the feature and the target variable.
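The information-gain calculation described above can be sketched with a toy data set (the feature and target values are invented; a perfectly predictive feature yields the maximum gain):

```python
from collections import Counter
from math import log2

def entropy(labels):
    # Shannon entropy of a label distribution
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, target):
    # entropy of the target minus its conditional entropy given the feature
    base, n, conditional = entropy(target), len(target), 0.0
    for value in set(feature_values):
        subset = [t for f, t in zip(feature_values, target) if f == value]
        conditional += len(subset) / n * entropy(subset)
    return base - conditional

# Hypothetical: does "question count" predict "high topic interest"?
question_count = ["high", "high", "low", "low"]
topic_interest = [1, 1, 0, 0]
gain = information_gain(question_count, topic_interest)
```

Here the feature splits the target perfectly, so the conditional entropy is zero and the gain equals the full entropy of the binary target, 1.0 bit; less informative features would score lower and could be dropped.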
The purpose of feature selection is to reduce data dimensionality, highlight important information, and eliminate noise and redundant information. The steps and step S202 do not limit the operation sequence, and the data may be cleaned and preprocessed after feature selection.
S204: user behavior data is adjusted based on the device condition data.
Considering the influence of environmental factors such as equipment conditions on the validity of user behavior data, the user behavior data can be adjusted before pattern mining, so as to eliminate the influence of environmental variables prior to further analysis; alternatively, the environmental impact index determined from the device condition data can serve as an adjustment factor in the calculation of the final behavior pattern composite score, to evaluate the overall influence of the environment on user experience or satisfaction.
In practical applications, the relationship between environmental factors and the adjustment factor is complex and variable, and may be affected by users' personal preferences, the specific application scenario, and other external factors. Some environmental factors adjust the user behavior data in a positively correlated way: a better network connection or higher-quality equipment directly improves user experience and satisfaction. In other cases, however, improved environmental factors may expose other problems or reduce the need for certain functions; for example, in video conferencing software, if the network connection is very stable, the user may rely less on text chat and more on direct voice and video communication.
S205: pattern recognition is performed based on a pattern mining method.
Various data mining algorithms such as association rule learning, sequence pattern mining, cluster analysis and the like are applied to identify patterns or associations hidden in data, and a combination of one or more mining algorithms is selected according to different requirements to achieve mining of user behavior patterns.
Illustratively, in video conference analysis, user behavior and preferences are known in depth through association rule learning and cluster analysis, and the specific steps are as follows:
mining frequent item sets using association rule analysis: combinations of frequently occurring behaviors or events in video conferences are found, for example, by analyzing the following parameters: a) The topic interest (reflected by the number of questions, click rate, etc. of the user);
b) User activity periods (by analyzing user login and logout time, etc. data);
c) Content-induced distraction (by background noise detection, number of user silence, etc.);
d) The frequency of technical problems for the user (by the user's network status, number of conference breaks and reconnections, etc.).
The operation steps are as follows: a) Data preparation: user behavior data including the above parameters is collected.
B) Setting a threshold value: an appropriate support threshold is determined to identify frequent item sets.
C) The algorithm is operated: all possible combinations of item sets that exceed the threshold are found using an algorithm such as Apriori.
D) Analysis results: frequent behavior patterns are identified, such as "frequent silence due to network instability" or "high issue focus and frequent screen sharing behavior".
And (3) cluster analysis: the cluster analysis groups users based on behavioral similarity, the following user behaviors can be considered: a) The participation mode of the user (such as speaking, chatting, screen sharing, etc.);
b) User attention to meeting material (by clicking PPT, number of videos, etc.);
c) Emotional response of the user (by analyzing speech and expressions);
d) Conference satisfaction of the user (via questionnaires or feedback);
e) The user's interaction partner (by analyzing chat logs, interaction of shared content).
The operation steps of cluster analysis are as follows: a) Feature selection: based on the parameters, a feature vector is created for each user.
B) Selection algorithm: a suitable clustering algorithm, such as K-means, is selected.
C) Determining the clustering number: the number of clusters is determined using a method such as an elbow rule.
D) And (3) operation clustering: an algorithm is applied to divide the users into a plurality of groups.
E) Interpretation results: analyzing the common characteristics of each group, such as "highly interactive user groups" may be those users who speak frequently, use chat functionality, and have positive emotional response to the meeting content.
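The clustering steps a)-e) can be sketched with a minimal pure-Python K-means (deterministic initialization from the first k points is an assumption for reproducibility; in practice the number of clusters would come from the elbow rule and a library implementation such as scikit-learn's KMeans would be used):

```python
def kmeans(points, k, iters=10):
    """Minimal K-means: assign each point to its nearest center, then move each
    center to the mean of its cluster; repeat for a fixed number of iterations."""
    centers = [tuple(p) for p in points[:k]]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: sum((a - b) ** 2
                                                      for a, b in zip(p, centers[c])))
            clusters[nearest].append(p)
        centers = [tuple(sum(xs) / len(xs) for xs in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Hypothetical per-user feature vectors: (speaking rate, chat rate).
users = [(0.9, 0.8), (0.85, 0.9), (0.1, 0.1), (0.15, 0.05)]
centers, clusters = kmeans(users, k=2)
# One cluster gathers the highly interactive users, the other the passive ones.
```

On this toy data the algorithm separates the two obvious user groups within a couple of iterations, mirroring the "highly interactive user group" interpretation in step e).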
The user behaviors are deeply analyzed by utilizing the results of different mining algorithms, so that not only can the commonality of the user behaviors be known, but also the factors affecting the conference experience of the user can be identified, and further measures are taken to improve future video conferences. For example, if users are found to be highly involved in a particular topic, the meeting organizer may allocate more time for these topics in future meetings; if certain technical problems frequently occur, the technical team can optimize the problems, thereby improving the satisfaction of users.
By calculating the support and confidence between user behaviors, association rule mining or sequence pattern mining quantifies the relationships between user behaviors, making the analysis results more intuitive and convenient to obtain. The following demonstrates how the support of some behavior patterns is calculated:
1) Support for increased click rate after screen sharing: support = (number of conferences in which a "click rate increase" was recorded after "screen sharing" / total number of conferences) × 100;
2) Support for background noise detection after silence: support = (number of conferences in which "background noise detection" occurred after the user "muted" / total number of conferences) × 100;
3) Support for immediate questions after login: support = (number of conferences in which the user "asked a question" immediately after "logging in" / total number of conferences) × 100;
4) Support for immediate exit after hearing a keyword: support = (number of conferences in which the user "immediately exited" after a "keyword" was mentioned / total number of conferences) × 100;
5) Support for a stable network state after the camera is closed: support = (number of conferences in which the "network state was stable" after the user "closed the camera" / total number of conferences) × 100;
6) Support for a reduced click rate after microphone use: support = (number of conferences in which the "click rate decreased" after the user "used the microphone" / total number of conferences) × 100.
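All six formulas share one shape — the percentage of conferences in which one event follows another — which can be sketched generically (the event names and sequences are invented for the example):

```python
def pair_support_pct(conferences, antecedent, consequent):
    """Percentage of conferences where `consequent` was observed after `antecedent`;
    each conference is an ordered list of recorded events."""
    hits = sum(1 for seq in conferences
               if antecedent in seq and consequent in seq[seq.index(antecedent) + 1:])
    return hits / len(conferences) * 100

# Hypothetical event sequences for four conferences.
logs = [
    ["screen_share", "click_rate_up"],
    ["screen_share"],
    ["click_rate_up"],
    ["screen_share", "chat", "click_rate_up"],
]
support = pair_support_pct(logs, "screen_share", "click_rate_up")  # 2/4 × 100
```

The ordering check matters: the third conference contains "click_rate_up" but no preceding "screen_share", so it does not count toward the support.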
S206: and (5) establishing a comprehensive scoring system.
Based on the calculated support, confidence, lift, and similar measures among user behavior patterns, a comprehensive scoring system is established, and a score is assigned to each identified pattern or behavior to reflect its influence on user experience or satisfaction. For example, a weighted or other synthesis strategy may combine the scores of sequence patterns and association rules into a composite user behavior score; by interpreting the meaning of the score, it may be related to user satisfaction, retention, or other key performance indicators, for example as described above with reference to step S130.
S207: and performing evaluation and decision making.
In a real scenario, the mining results of the user behavior patterns may prompt the decision maker to analyze the results at the time of use or after use.
Illustratively, when the liveness of a topic under discussion needs to be evaluated, the analysis may use: topic liveness = number of user interactions during the topic presentation / total duration of the topic presentation, where user interactions include camera on, microphone use, questions, and so on. A reminder threshold is set; when a topic's liveness exceeds the threshold, the system automatically reminds the meeting organizer. The threshold is calculated as: alert threshold = average of all topic liveness values + k × standard deviation, where k is a constant that can be adjusted as needed.
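The liveness and threshold formulas above can be sketched directly (the interaction counts and durations are invented; k = 1 is assumed):

```python
from statistics import mean, stdev

def topic_liveness(interaction_count, duration_minutes):
    # topic liveness = user interactions during the topic / topic duration
    return interaction_count / duration_minutes

def alert_threshold(liveness_values, k=1.0):
    # reminder threshold = mean of all topic liveness values + k × standard deviation
    return mean(liveness_values) + k * stdev(liveness_values)

# Hypothetical liveness values for three 10-minute topics.
values = [topic_liveness(20, 10), topic_liveness(40, 10), topic_liveness(60, 10)]
threshold = alert_threshold(values)  # mean 4.0 + 1.0 × stdev 2.0
```

Only the third topic (liveness 6.0) reaches the threshold, so the system would remind the organizer about that topic alone.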
In association therewith, the probability that a user asks a question when a certain topic is mentioned can be calculated as: question probability = number of questions during the topic / total number of questions. The degree of association between different topics is evaluated as: topic association degree = number of times two topics are mentioned together / number of times either topic is mentioned; if two topics are found to be frequently mentioned together and both have high liveness, the two topics may be combined.
Based on the above calculation, feedback can be provided to the organizer of the meeting in real time, for example, if the activity of a certain topic is particularly high, the system can remind the host: the current issue has attracted a great deal of attention, suggesting that some more time is left for discussion.
In other embodiments, past user behavior patterns may be provided to the organizer as a reference before the next session. For example, if, from the calculation "question prediction probability = number of questions after the camera is turned on / total number of times the camera is turned on", a user is found to frequently ask questions after turning on the camera, the system may grant that user question rights when the user turns on the camera this time.
The above environmental and technical assessment based on environmental factors such as device conditions focuses on the impact of "external" factors on the user experience, helping us understand why users behave differently in certain environments. While behavior pattern mining focuses on intrinsic patterns and trends identified from user behavior, the impact of external environments is not directly evaluated.
Thus, when the two analyses are combined, a 360-degree view of the user is obtained: not only how the user interacts, but also what motivates the user to do so. Based on the combined results of the environmental and technical assessment and behavior pattern mining, policy makers can understand users more fully and formulate comprehensive policies covering technical optimization, content strategy, and interface design.
Referring to fig. 3, fig. 3 is a schematic structural diagram of an embodiment of an electronic device according to the present application, where the electronic device 60 includes a memory 61 and a processor 62 connected to each other, and the memory 61 is used for storing a computer program, and the computer program is used for implementing the user behavior pattern mining method in the above embodiment when executed by the processor 62.
Since the method of the above embodiment may take the form of a computer program, the present application proposes a computer-readable storage medium. Referring to fig. 4, fig. 4 is a schematic structural diagram of an embodiment of the computer-readable storage medium provided by the present application; the computer-readable storage medium 80 is used to store a computer program 81, which, when executed, implements the user behavior pattern mining method of the above embodiment.
The computer readable storage medium 80 may be a server, a usb disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (RAM, random Access Memory), a magnetic disk, or an optical disk, etc. various media capable of storing program codes.
The foregoing description is only illustrative of the present application and is not intended to limit the scope of the application, and all equivalent structures or equivalent processes or direct or indirect application in other related technical fields are included in the scope of the present application.

Claims (9)

1. A method for mining a user behavior pattern, the method comprising:
acquiring user behavior data and device condition data, wherein the device condition data comprises a network state, background noise, a device type, and an operating system;
adjusting the user behavior data based on the device condition data; and
mining a user behavior pattern based on the adjusted user behavior data;
wherein said adjusting the user behavior data based on the device condition data comprises:
adjusting the user behavior data using an environment-behavior association model to obtain a user behavior score, wherein the user behavior score is jointly determined by an expected value of the user behavior score, a sum of weighted scores of the device condition data, and an error term, and the expected value of the user behavior score is determined based on certain original user behavior data.
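The environment-behavior association model in claim 1 is a linear decomposition: user behavior score = expected value + sum of weighted device-condition scores + error term. A minimal illustrative sketch follows; the specific weights, per-factor scores, and numeric values are assumptions for demonstration only:

```python
def user_behavior_score(expected_value: float,
                        condition_scores: list[float],
                        weights: list[float],
                        error: float = 0.0) -> float:
    """Environment-behavior association model (claim 1):
    score = expected value (from original user behavior data)
          + sum of weighted device-condition scores
          + error term."""
    weighted_sum = sum(w * s for w, s in zip(weights, condition_scores))
    return expected_value + weighted_sum + error

# Hypothetical per-factor scores for network state, background noise,
# device type, and operating system, with assumed weight parameters.
score = user_behavior_score(
    expected_value=70.0,
    condition_scores=[0.9, -0.3, 0.5, 0.2],
    weights=[5.0, 4.0, 2.0, 1.0],
)
```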
2. The user behavior pattern mining method of claim 1, wherein an environmental impact index is determined based on the network state, the background noise, the device type, the operating system, and weight parameters; and
a comprehensive score of the behavior pattern, calculated based on the user behavior data, is adjusted based on the environmental impact index.
3. The user behavior pattern mining method of claim 1, wherein the mining of a user behavior pattern based on the adjusted user behavior data comprises:
mining a user behavior pattern based on the adjusted user behavior data to obtain a pattern score of the user behavior; and
adjusting the pattern score based on the device condition data of the user.
4. The user behavior pattern mining method of claim 3, wherein the mining of a user behavior pattern based on the adjusted user behavior data to obtain a pattern score of the user behavior comprises:
performing association rule learning and mining a user behavior pattern based on the adjusted user behavior data to obtain a behavior pattern score of the user; and
the adjusting of the pattern score based on the device condition data of the user comprises: adjusting the behavior pattern score based on the device condition data of the user.
5. The user behavior pattern mining method of claim 1, wherein the acquiring of the user behavior data and the device condition data comprises: acquiring device condition data and a plurality of types of behavior data of a user;
and wherein performing association rule learning and mining a user behavior pattern based on the adjusted user behavior data comprises:
calculating a support for each behavior pair, the behavior pair comprising two different user behaviors, or the behavior pair comprising at least one of the user behaviors and at least one item of the device condition data;
screening out behavior pairs whose support is greater than a preset threshold;
determining a confidence of the association rule corresponding to each screened behavior pair; and
calculating a behavior pattern score based on the confidence and/or the support of the behavior pair.
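The support and confidence computation in claim 5 follows standard association rule mining. The sketch below is a minimal illustration; the session data, item names, and the 0.5 minimum support threshold are hypothetical:

```python
from itertools import combinations

# Hypothetical session logs: each set mixes user behaviors and device conditions,
# so a behavior pair may combine a behavior with a device condition, as in claim 5.
sessions = [
    {"camera_on", "ask_question", "wifi"},
    {"camera_on", "ask_question", "wired"},
    {"camera_on", "mute", "wifi"},
    {"screen_share", "mute", "wifi"},
]

def support(items: frozenset) -> float:
    """Fraction of sessions containing all items in the pair."""
    return sum(items <= s for s in sessions) / len(sessions)

def confidence(a: str, b: str) -> float:
    """Confidence of rule a -> b: support({a, b}) / support({a})."""
    sa = support(frozenset({a}))
    return support(frozenset({a, b})) / sa if sa else 0.0

MIN_SUPPORT = 0.5  # assumed preset threshold
all_items = set().union(*sessions)
pairs = [frozenset(p) for p in combinations(sorted(all_items), 2)]
frequent = [p for p in pairs if support(p) >= MIN_SUPPORT]  # screened behavior pairs
```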
6. The user behavior pattern mining method of claim 4, wherein the mining of a user behavior pattern based on the adjusted user behavior data to obtain a behavior pattern score of the user comprises:
performing pattern mining of a user behavior sequence based on the adjusted user behavior data to obtain a sequence pattern score of the user; and
the adjusting of the pattern score based on the device condition data of the user comprises: calculating an adjustment factor based on the device condition data, and weighting the adjustment factor, the behavior pattern score, and the sequence pattern score to obtain a comprehensive behavior score of the user.
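The weighting step in claim 6 can be sketched as a simple weighted combination. The adjustment-factor formula, the weight values, and the inputs below are all illustrative assumptions; the claim does not fix a particular formula:

```python
def adjustment_factor(network_quality: float, noise_level: float) -> float:
    """Assumed example: better network quality raises the factor,
    higher background noise lowers it; floored at zero."""
    return max(0.0, 1.0 + 0.2 * network_quality - 0.3 * noise_level)

def composite_score(behavior_score: float, sequence_score: float, factor: float,
                    w_behavior: float = 0.5, w_sequence: float = 0.3,
                    w_factor: float = 0.2) -> float:
    """Comprehensive behavior score: weighted sum of the behavior pattern score,
    the sequence pattern score, and the device-condition adjustment factor."""
    return w_behavior * behavior_score + w_sequence * sequence_score + w_factor * factor

factor = adjustment_factor(0.9, 0.2)          # hypothetical device conditions
total = composite_score(0.8, 0.6, factor)     # hypothetical pattern scores
```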
7. The user behavior pattern mining method of claim 1, wherein before the adjusting of the user behavior data based on the device condition data, the method comprises:
evaluating the importance of each item of user behavior data using an information gain or a chi-square test;
and wherein the adjusting of the user behavior data based on the device condition data comprises:
adjusting the user behavior data whose importance meets a preset condition based on the device condition data.
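The importance screening in claim 7 can use information gain over the behavior features. A minimal sketch follows; the feature names, labels, and the 0.1 gain threshold standing in for the "preset condition" are hypothetical:

```python
from math import log2
from collections import Counter

def entropy(labels: list) -> float:
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values: list, labels: list) -> float:
    """Entropy reduction in the labels from splitting on the feature."""
    n = len(labels)
    gain = entropy(labels)
    for v in set(feature_values):
        subset = [l for f, l in zip(feature_values, labels) if f == v]
        gain -= len(subset) / n * entropy(subset)
    return gain

# Hypothetical data: camera state predicts the activity label; chat use does not.
labels = ["active", "active", "idle", "idle"]
camera = ["on", "on", "off", "off"]   # perfectly predictive -> gain = 1 bit
chat   = ["yes", "no", "yes", "no"]   # uninformative -> gain = 0

important = [name for name, feat in [("camera", camera), ("chat", chat)]
             if information_gain(feat, labels) > 0.1]  # assumed preset condition
```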
8. An electronic device comprising a processor, a memory, the processor coupled to the memory, the processor configured to perform one or more steps of the user behavior pattern mining method of any one of claims 1-7 based on instructions stored in the memory.
9. A computer readable storage medium storing a computer program for execution by a processor to implement the steps of the user behavior pattern mining method of any one of claims 1-7.
CN202410056221.4A 2024-01-15 2024-01-15 User behavior pattern mining method, device and storage medium Active CN117573742B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410056221.4A CN117573742B (en) 2024-01-15 2024-01-15 User behavior pattern mining method, device and storage medium


Publications (2)

Publication Number Publication Date
CN117573742A CN117573742A (en) 2024-02-20
CN117573742B true CN117573742B (en) 2024-05-07

Family

ID=89886530

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410056221.4A Active CN117573742B (en) 2024-01-15 2024-01-15 User behavior pattern mining method, device and storage medium

Country Status (1)

Country Link
CN (1) CN117573742B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108415926A (en) * 2018-01-15 2018-08-17 大连理工大学 A collaborative filtering recommendation method for eliminating rating noise in original score data
CN108830315A (en) * 2018-05-31 2018-11-16 大连理工大学 A method for removing noise from rating data
CN112488065A (en) * 2020-12-19 2021-03-12 智粤云(广州)数字信息科技有限公司 Remote education system based on cloud platform
CN116109121A (en) * 2023-04-17 2023-05-12 西昌学院 User demand mining method and system based on big data analysis
CN117314573A (en) * 2023-09-27 2023-12-29 深圳市斯堡霖科技有限公司 Network marketing system based on user matching

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180053197A1 (en) * 2016-08-18 2018-02-22 International Business Machines Corporation Normalizing user responses to events
US20220138866A1 (en) * 2020-11-02 2022-05-05 Relationship Capital Technologies Inc. Systems and Methods for Managing Social Networks Based Upon Predetermined Objectives


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Prediction analysis of product performance satisfaction based on BP neural network; Shao Hongyu et al.; Journal of Tianjin University; 2019-09-30; Vol. 52, No. 9; 932-940 *


Similar Documents

Publication Publication Date Title
CN110781321B (en) Multimedia content recommendation method and device
CN112364234B (en) Automatic grouping system for online discussion
US10691896B2 (en) Conversational system user behavior identification
EP4297030A2 (en) Polling questions for a conference call discussion
US10832153B2 (en) Analyzing behavior in light of social time
US20180239824A1 (en) Targeted feedback systems and methods
DE102021125184A1 (en) PERSONAL TALK RECOMMENDATIONS USING LISTENER RESPONSES
Oleson et al. Statistical considerations for analyzing ecological momentary assessment data
US20240119074A1 (en) Recognizing polling questions from a conference call discussion
CN117573742B (en) User behavior pattern mining method, device and storage medium
Keenan et al. Introduction to analytics
Zhang et al. Can Large Language Models Assess Personality from Asynchronous Video Interviews? A Comprehensive Evaluation of Validity, Reliability, Fairness, and Rating Patterns
WO2022168185A1 (en) Video session evaluation terminal, video session evaluation system, and video session evaluation program
WO2022168180A1 (en) Video session evaluation terminal, video session evaluation system, and video session evaluation program
US10795896B2 (en) Systems and methods for automatically identifying specific teams of users to solve specific and real-time problems
CN113886674A (en) Resource recommendation method and device, electronic equipment and storage medium
Raffensperger et al. A simple metric for turn-taking in emergent communication
Barthet et al. Knowing your annotator: Rapidly testing the reliability of affect annotation
TWI725535B (en) Voice interaction method to detect user behavior and attribute characteristics
CN117909379A (en) User behavior pattern mining method, device and storage medium
WO2022168184A1 (en) Video session evaluation terminal, video session evaluation system, and video session evaluation program
WO2022168183A1 (en) Video session evaluation terminal, video session evaluation system, and video session evaluation program
WO2022168176A1 (en) Video session evaluation terminal, video session evaluation system, and video session evaluation program
WO2022168182A1 (en) Video session evaluation terminal, video session evaluation system, and video session evaluation program
WO2022168178A1 (en) Video session evaluation terminal, video session evaluation system, and video session evaluation program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant