CN114742154A - Method, device, electronic device and storage medium for generating user portrait - Google Patents


Info

Publication number
CN114742154A
CN114742154A CN202210374651.1A
Authority
CN
China
Prior art keywords
touch
user
operation data
interval
time period
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210374651.1A
Other languages
Chinese (zh)
Inventor
张世泽
陶建容
范长杰
胡志鹏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Netease Hangzhou Network Co Ltd
Original Assignee
Netease Hangzhou Network Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Netease Hangzhou Network Co Ltd
Priority to CN202210374651.1A
Publication of CN114742154A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2415 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on parametric or probabilistic models, e.g. based on likelihood ratio or false acceptance rate versus a false rejection rate
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/90 Details of database functions independent of the retrieved data types
    • G06F16/95 Retrieval from the web
    • G06F16/953 Querying, e.g. by the use of web search engines
    • G06F16/9535 Search customisation based on user profiles and personalisation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F17/00 Digital computing or data processing equipment or methods, specially adapted for specific functions
    • G06F17/10 Complex mathematical operations
    • G06F17/18 Complex mathematical operations for evaluating statistical data, e.g. average values, frequency distributions, probability functions, regression analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2413 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on distances to training or reference patterns
    • G06F18/24133 Distances to prototypes
    • G06F18/24137 Distances to cluster centroïds
    • G06F18/2414 Smoothing the distance, e.g. radial basis function networks [RBFN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]

Abstract

The application provides a method, an apparatus, an electronic device and a storage medium for generating a user portrait. The method comprises: acquiring first touch operation data and second touch operation data, wherein the first touch operation data is generated by a user operating a first touch area of a terminal screen and the second touch operation data is generated by the user operating a second touch area of the terminal screen; obtaining a score for the user's touch characterization parameters from the first and second touch operation data; and obtaining the user portrait of the user from that score and a pre-trained user portrait prediction model. By measuring parameters such as the user's touch frequency, touch interval and touch displacement within an application, the method evaluates the user's operation capability and level more accurately, so that users can be classified precisely and given better-targeted services in fields such as information screening, application recommendation and team matching, improving overall user experience.

Description

Method, device, electronic device and storage medium for generating user portrait
Technical Field
The present application relates to the field of electronic information technologies, and in particular, to a method, an apparatus, an electronic apparatus, and a storage medium for generating a user portrait.
Background
With the popularization of smart phones and the development of the mobile internet, mobile games have gradually come to occupy a major share of the game market. To give users a better game experience, user portrait technology has attracted wide attention. User portrait technology analyzes and models a user's interest preferences, skill level and so on from a series of logs, and is the basis of services such as personalized recommendation and matching.
User portrait techniques generally classify users by analyzing and modeling their interest preferences from a series of logs, and traditional user capability assessment is usually based on match settlement data. However, that evaluation mode is limited to specific scenarios and is not intuitive: because it is directly influenced by the user's match results, the evaluation of the user's operation capability is not accurate enough, which ultimately leads to a poor experience when services such as recommendation and matching are performed.
Disclosure of Invention
In view of this, the present application provides a method, an apparatus, an electronic device and a storage medium for generating a user portrait, so as to improve the accuracy of user operation capability evaluation, classify users accurately, and ultimately improve user experience.
In view of the above, the present application provides a method of generating a user representation, comprising:
acquiring first touch operation data and second touch operation data, wherein the first touch operation data is data generated by a user aiming at a first touch area operation of a terminal screen, and the second touch operation data is data generated by the user aiming at a second touch area operation of the terminal screen;
obtaining the score of the touch representation parameter of the user according to the first touch operation data and the second touch operation data;
and obtaining the user portrait of the user according to the score of the touch characterization parameter and a pre-trained user portrait prediction model.
In some embodiments, the touch characterization parameter includes at least one of:
touch frequency, touch interval, touch displacement.
In some embodiments, the method comprises:
calculating a corresponding first touch frequency, first touch interval and first touch displacement from the first touch operation data;
calculating a corresponding second touch frequency, second touch interval and second touch displacement from the second touch operation data;
and, combining a preset first weight and a preset second weight, calculating the touch frequency from the first and second touch frequencies, the touch interval from the first and second touch intervals, and the touch displacement from the first and second touch displacements.
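The weighted combination described above can be illustrated with a minimal Python sketch. The weights 0.6/0.4, the sample values and all names are assumptions for illustration; the application only says the weights are preset.

```python
# Assumed region weights; the application only states that they are preset.
FIRST_WEIGHT, SECOND_WEIGHT = 0.6, 0.4

def combine(first_value, second_value, w1=FIRST_WEIGHT, w2=SECOND_WEIGHT):
    """Weighted combination of one touch metric computed per screen region."""
    return w1 * first_value + w2 * second_value

# Hypothetical per-region metrics for one session:
touch_frequency = combine(3.2, 5.0)        # touches per second in each region
touch_interval = combine(0.31, 0.20)       # seconds between consecutive touches
touch_displacement = combine(120.0, 85.0)  # pixels between consecutive touches
print(touch_frequency, touch_interval, touch_displacement)
```

The same `combine` helper serves all three metrics because the claim applies one pair of weights per metric.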
In some embodiments, the obtaining of the first touch operation data and the second touch operation data includes: acquiring first and second touch operation data for both the operation whole time period and the operation intensive time period, where the operation intensive time period is a time period of specific length selected within the operation whole time period such that the number of touch operation data falling within the selected period exceeds a set threshold;
the method comprises the following steps:
and calculating the touch frequency according to the touch operation data corresponding to the operation whole time period, and calculating the touch interval and the touch displacement according to the touch operation data corresponding to the operation intensive time period.
In some embodiments, the calculating the touch frequency according to the touch operation data corresponding to the operation full time period includes:
determining the number of touches in the touch operation data corresponding to the operation whole time period, and determining the touch frequency as the ratio of that number to the total duration of the operation whole time period.
In some embodiments, the calculating the touch interval and the touch displacement according to the touch operation data corresponding to the operation-intensive time period includes:
and determining the starting time and the ending time of touch operation data corresponding to the operation intensive time period, and calculating the touch interval according to the starting time and the ending time.
In some embodiments, the calculating the touch interval according to the starting time and the ending time includes:
Score_{touch interval} = (1 / (n − 1)) · Σ_{i=1}^{n−1} (TimeStart_{i+1} − TimeEnd_i)
wherein Score_{touch interval} is the touch interval, n is the total number of touch operation data in the operation intensive time period, TimeStart_{i+1} is the start time of the (i+1)-th touch operation data, and TimeEnd_i is the end time of the i-th touch operation data.
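The touch-interval average above can be sketched in Python as follows. The `(start_time, end_time)` tuple representation is an assumption for illustration, not a format from the application.

```python
def touch_interval_score(events):
    """Average gap between the end of touch i and the start of touch i+1.

    events: list of (start_time, end_time) pairs, sorted by start_time.
    """
    n = len(events)
    if n < 2:
        return 0.0
    gaps = (events[i + 1][0] - events[i][1] for i in range(n - 1))
    return sum(gaps) / (n - 1)

# Three touches with gaps of 0.2 s and 0.3 s -> average 0.25 s
print(touch_interval_score([(0.0, 0.1), (0.3, 0.5), (0.8, 1.0)]))
```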
In some embodiments, the calculating the touch interval and the touch displacement according to the touch operation data corresponding to the operation-intensive time period includes:
and determining the initial position and the end position of the touch operation data corresponding to the operation intensive time period, and calculating the touch displacement according to the initial position and the end position.
In some embodiments, the calculating the touch displacement according to the starting position and the ending position includes:
Score_{touch displacement} = (1 / (n − 1)) · Σ_{i=1}^{n−1} Dist((x_i^{end}, y_i^{end}), (x_{i+1}^{start}, y_{i+1}^{start}))
wherein Score_{touch displacement} is the touch displacement, n is the total number of touch operation data in the operation intensive time period, (x_i^{end}, y_i^{end}) is the end position coordinate of the i-th touch operation data, (x_{i+1}^{start}, y_{i+1}^{start}) is the start position coordinate of the (i+1)-th touch operation data, and Dist(·, ·) is a function computing the distance between the two points.
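The displacement average can be sketched the same way. The nested-tuple layout is an illustrative assumption, and `math.dist` stands in for the Dist function.

```python
import math

def touch_displacement_score(touches):
    """Average distance from the end position of touch i to the start
    position of touch i+1.

    touches: list of ((x_start, y_start), (x_end, y_end)) tuples in time order.
    """
    n = len(touches)
    if n < 2:
        return 0.0
    total = sum(
        math.dist(touches[i][1], touches[i + 1][0])  # end of i -> start of i+1
        for i in range(n - 1)
    )
    return total / (n - 1)

# End of the first touch at (0, 0), start of the second at (3, 4): distance 5
print(touch_displacement_score([((0, 0), (0, 0)), ((3, 4), (5, 5))]))
```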
In some embodiments, the obtaining of the score of the user's touch characterization parameters according to the first touch operation data and the second touch operation data includes:
determining adjustment weights corresponding respectively to the touch frequency, the touch interval and the touch displacement, and determining an adjustment parameter;
and combining the touch frequency, the touch interval and the touch displacement with their corresponding adjustment weights, and summing the results together with the adjustment parameter, to obtain the score of the user's touch characterization parameters.
In some embodiments, the determining adjustment weights corresponding to the touch frequency, the touch interval, and the touch displacement, and determining adjustment parameters includes:
acquiring a historical data test set, and establishing a linear regression model to calculate the adjustment weight and the adjustment parameter;
the adjustment weight is, in particular,
k = Σ_{i=1}^{m} y_i (x_i − x̄) / ( Σ_{i=1}^{m} x_i² − (1/m) (Σ_{i=1}^{m} x_i)² )
wherein k is the adjustment weight, m is the total number of test samples in the historical data test set, y_i is the operating level of the i-th test sample, x_i is the vector of touch frequency, touch interval and touch displacement of the i-th test sample, and x̄ is the mean of all x_i;
the adjustment parameter is, in particular,
b = (1/m) · Σ_{i=1}^{m} (y_i − k·x_i)
wherein b is the adjustment parameter.
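The closed-form least-squares fit described above can be reproduced with a short Python sketch. For simplicity x is treated here as a single scalar score rather than a vector of three metrics, and all names and sample data are illustrative assumptions.

```python
def fit_adjustment(xs, ys):
    """Closed-form least-squares fit of y ≈ k·x + b.

    xs: scalar touch scores per test sample; ys: operating levels.
    """
    m = len(xs)
    x_mean = sum(xs) / m
    k = sum(y * (x - x_mean) for x, y in zip(xs, ys)) / (
        sum(x * x for x in xs) - sum(xs) ** 2 / m
    )
    b = sum(y - k * x for x, y in zip(xs, ys)) / m
    return k, b

# Synthetic test set generated from y = 2x + 1:
k, b = fit_adjustment([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
print(k, b)
```

Fitting the synthetic set recovers the generating slope and intercept, which is a quick sanity check on the formulas.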
Based on the same concept, the present application also provides a user portrayal device, comprising:
the terminal comprises an acquisition module, a processing module and a display module, wherein the acquisition module is used for acquiring first touch operation data and second touch operation data, the first touch operation data is data generated by a user aiming at a first touch area operation of a terminal screen, and the second touch operation data is data generated by the user aiming at a second touch area operation of the terminal screen;
the calculating module is used for obtaining the score of the touch representation parameter of the user according to the first touch operation data and the second touch operation data;
and the determining module is used for obtaining the user portrait of the user according to the score of the touch representation parameter and a pre-trained user portrait prediction model.
Based on the same concept, the present application also provides an electronic device, comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor executes the program to implement the method according to any one of the above.
Based on the same concept, the present application also provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to implement the method as described in any one of the above.
From the foregoing, the present application provides a method, an apparatus, an electronic device and a storage medium for generating a user representation, comprising: acquiring first touch operation data and second touch operation data, wherein the first touch operation data is generated by a user operating a first touch area of a terminal screen and the second touch operation data is generated by the user operating a second touch area of the terminal screen; obtaining a score for the user's touch characterization parameters from the first and second touch operation data; and obtaining the user portrait of the user from that score and a pre-trained user portrait prediction model. By determining parameters such as the user's touch frequency, touch interval and touch displacement within an application, the user's operation capability can be evaluated more accurately and the user's level more effectively, so that users are classified precisely and given better-targeted services in fields such as information screening, application recommendation and team matching, improving overall user experience.
Drawings
To illustrate the technical solutions in the embodiments or the related art more clearly, the drawings needed in the description of the embodiments or the related art are briefly introduced below. Obviously, the drawings described below show only embodiments of the present application, and those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a flow chart illustrating a method for generating a user representation according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a user portrait apparatus according to an embodiment of the present disclosure;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
To make the objects, technical solutions and advantages of the present specification more apparent, the present specification is further described in detail below with reference to the accompanying drawings in combination with specific embodiments.
It should be noted that, unless otherwise defined, technical or scientific terms used in the embodiments of the present application have the ordinary meaning understood by a person of ordinary skill in the art to which this application belongs. The words "first", "second" and similar terms used in the embodiments of the present application do not denote any order, quantity or importance, but merely distinguish one element from another. A word such as "comprising" or "comprises" means that the element, article or method step preceding the word encompasses the elements, articles or method steps listed after the word and their equivalents, without excluding other elements, articles or method steps. Terms such as "connected" or "coupled" are not restricted to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper", "lower", "left", "right" and the like merely indicate relative positional relationships, which may change accordingly when the absolute position of the described object changes.
As described in the background section, user portrait technology analyzes and models a user's interest preferences, skill level and so on from a series of logs, and is the basis of services such as personalized recommendation. Modeling the user's operation level is particularly important in mobile-game scenarios, where it directly supports services such as matchmaking and information recommendation based on user level. Traditional user level estimation is usually based on match settlement data, such as wins and net wins; but such data is not intuitive enough to describe the user level. Conventional user capability assessment likewise relies on match settlement data, such as the user's win count, rank, and in-match statistics (for example the number of kills). Capability evaluation results are commonly fed into matching and recommendation systems, and typical matching algorithms based on settlement data include ELO and TrueSkill. The general flow is: 1) match teammates and opponents for the user according to statistics such as the user's current rank, and start a match; 2) the user plays the match; 3) update the user's capability score with an algorithm such as ELO or TrueSkill according to the match settlement (win or loss, user performance, and so on). However, capability evaluation based only on settlement data is limited in scope: it can be performed only in scenarios (game modes) where the user plays matches that produce match results.
Capability evaluation based on match settlement data is also not intuitive. Match results such as wins or kill counts cannot directly depict the user's operation level; even a user with excellent operation may, for various reasons (weak teammates, targeted opponents, and so on), end up with poor settlement data, so that the capability evaluation does not match the user's actual ability, ultimately resulting in a poor gaming experience.
In view of this practical situation, the embodiments of the present application provide a user classification scheme: parameters such as the user's touch frequency, touch interval and touch displacement within an application are determined, and through these parameters the user's operation capability can be evaluated more accurately and the user's level more effectively, so that users are classified precisely and given better-targeted services in fields such as information screening, application recommendation and team matching, improving overall user experience.
As shown in fig. 1, a flow chart of a method for generating a user portrait according to the present application is shown, where the method specifically includes:
step 101, acquiring first touch operation data and second touch operation data, wherein the first touch operation data is data generated by a user operating a first touch area of a terminal screen, and the second touch operation data is data generated by the user operating a second touch area of the terminal screen.
For most existing touch operation screens, a user often controls a left screen area and a right screen area with a left hand and a right hand respectively, such as a general left screen "control direction area" and a right screen "control skill area". Therefore, in this step, the acquisition of the touch operation data of the user is also performed for different touch areas. In a specific embodiment, the first touch area and the second touch area are left and right portions of the screen, respectively. In an alternative embodiment, the first touch area and the second touch area may also be different areas that are preset.
The touch operation data is related information of the touch operation of the user, such as start time, end time, touch start position, touch end position, touch point dwell time, touch movement related data, and the like of the touch operation. Each piece of touch operation data corresponds to one touch operation behavior of the user, and all relevant characteristic data of the touch behavior or part of relevant characteristic data of the touch behavior which is preset and needed is recorded. Of course, the touch operation data may also include touch overall data of the user in the characteristic time period, such as: the total number of touch operations performed by the user in the characteristic time period, the total touch contact time, the touch movement length and other related data.
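One possible in-memory representation of a single touch operation record, matching the fields listed above, is sketched below. The patent does not prescribe any data layout; every field name here is illustrative.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    """One touch operation record; all field names are illustrative."""
    region: str        # "first" (e.g. left half of screen) or "second" (right half)
    start_time: float  # seconds since session start
    end_time: float
    start_pos: tuple   # (x, y) where the touch began
    end_pos: tuple     # (x, y) where the touch ended

    @property
    def dwell_time(self) -> float:
        """Touch point dwell time, one of the features mentioned above."""
        return self.end_time - self.start_time

e = TouchEvent("first", 1.0, 1.25, (100, 200), (140, 230))
print(e.dwell_time)
```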
And 102, obtaining the score of the touch representation parameter of the user according to the first touch operation data and the second touch operation data.
In this step, the touch frequency, the touch interval, and the touch displacement in the corresponding characteristic time period may be calculated from the touch operation data obtained in step 101. The characteristic time period is a period of time or an entire period of time when the user uses a specific application or an unspecified application. For example: when a user uses a game application, the characteristic time period may be the whole time period of the game application used by the user, the time period of the game played by the user, a specific time period of the game played by the user (such as an operation-intensive time period, a game-defined key time period, and the like), and the like. The specific characteristic time period may be specifically set according to a specific application scenario.
The touch frequency is the frequency of the user's valid touches in a certain time period, and can generally be computed as the ratio of the number of touches to the elapsed time. On touch devices such as mobile phones and tablets the two hands usually operate simultaneously, so the touch frequency of each hand can be counted separately; to represent the touch frequency intuitively, each hand's frequency can then be given its own weight to obtain a combined touch frequency. The touch interval is the average interval between two adjacent valid touches in a certain time period, generally expressed as an interval time; as with the touch frequency, the intervals of the two hands can be counted separately and combined with corresponding weights. The touch displacement is the average displacement between two adjacent valid touches in a certain time period, generally expressed as a distance, for example the distance between the end position of the previous touch and the start position of the next touch; it too can be counted per hand and combined with corresponding weights.
In a specific embodiment, the touch frequency is usually a global statistic, whose corresponding time period may be the total time the user spends in the application or in a specific function of it. In games, for example, it may cover one whole match, such as one round of an FPS game: the touch frequencies of the two hands are counted over the whole match and then weighted into a combined frequency. Touch intervals and touch displacements, by contrast, are unevenly distributed over the time the user spends in the application or in a specific function of it; in an FPS game, for instance, the user operates densely only during engagements and sparsely while traveling or hiding. To reflect the user's operation level truthfully, the touch interval and touch displacement therefore generally need to be counted over the operation-intensive time periods, omitting the sparse ones.
The operation-intensive time period can be set according to the specific application scenario. For example, it may be defined in advance as any period during which two or more characters operated by different users stay within a certain distance of each other in a formation or queue for a certain length of time. Alternatively, a time window of specific duration can be slid along the time axis over all of the user's operations, each window position determining one group of touch data; the windows whose groups rank among the top N by touch count, or the windows corresponding to the top 10% of groups by touch count, can then be taken as operation-intensive time periods. Since the touch frequency may not correspond to the same period as the touch interval and touch displacement (the frequency may cover one whole match, while the interval and displacement cover several operation-intensive periods within it), one can determine how many operation-intensive periods the match contains and average the touch interval and displacement over them, so as to finally reflect the touch frequency, touch interval and touch displacement of the whole match.
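The sliding-window selection of operation-intensive periods could be sketched as follows. The window length and threshold are assumed values; the patent leaves both scenario-specific.

```python
def intensive_windows(times, window=2.0, threshold=3):
    """Return [t, t + window] spans, anchored at touch times, that contain
    more than `threshold` touches; window and threshold are assumptions."""
    times = sorted(times)
    result = []
    for t in times:
        count = sum(1 for u in times if t <= u <= t + window)
        if count > threshold:
            result.append((t, t + window))
    return result

# A dense burst of touches around t = 10 s, sparse touches elsewhere:
times = [1.0, 5.0, 10.0, 10.2, 10.5, 10.8, 11.0, 20.0]
print(intensive_windows(times))
```

Only windows anchored inside the burst exceed the threshold, so the sparse periods around t = 1 s, 5 s and 20 s are excluded, as the text above requires.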
In a particular embodiment, the frequency of touches may be calculated by the ratio of the number of touches to time. The touch interval can be calculated by counting the time interval between all two adjacent touches in the time period and comparing the number of touches in the time period. The touch displacement can be calculated by counting the displacement distance between all two adjacent touches in a time period and comparing the number of touches in the time period.
In this step, the touch frequency, the touch interval, and the touch displacement may be directly weighted, and the final operation level of the user may be obtained by combining the adjustment parameter, where the adjustment parameter is a specific number. The operational level may be a score, a corresponding rating, and so on. The final classification of the user can then be performed based on this operational level.
In a specific embodiment, the touch frequency, touch interval and touch displacement may each be given a different weight, or used in the calculation directly. When weights are used, they can be set manually in advance together with the adjustment parameter, or determined by a mathematical model: for example, a training set of known operation levels with their corresponding touch frequencies, intervals and displacements is assembled, a linear regression model is fitted, and the model outputs the adjustment parameter and the weights for the three metrics.
And 103, acquiring the user portrait of the user according to the score of the touch characterization parameter and a pre-trained user portrait prediction model.
In this step, the score of the touch characterization parameters obtained in step 102 is matched against a pre-trained user portrait model to obtain the user portrait of the user. The portrait is ultimately output for storage, presentation, use or further processing; the specific output mode can be chosen flexibly according to the application scenario and implementation requirements.
For example, for an application scenario in which the method of the present embodiment is executed on a single device, the representation may be directly output in a display manner on a display part (a display, a projector, etc.) of the current device, so that an operator of the current device can directly see the content of the representation from the display part.
For another example, for an application scenario executed on a system composed of multiple devices by the method of this embodiment, the representation may be sent to other preset devices serving as a receiving party in the system, that is, a synchronization terminal, through any data communication manner (e.g., wired connection, NFC, bluetooth, wifi, cellular mobile network, etc.), so that the synchronization terminal may perform subsequent processing on the representation. Optionally, the synchronization terminal may be a preset server, and the server is generally arranged at a cloud end and used as a data processing and storage center, which is capable of storing and distributing the user portrait; the distribution receiver is a terminal device, and the holder or operator of the terminal device may be a current user, an administrator of the application program, a promoter of the application program, and the like.
For another example, for an application scenario in which the method of this embodiment is executed on a system composed of multiple devices, the user portrait may be sent directly, through any data communication manner, to a preset terminal device, where the terminal device may be any one or more of the devices described in the preceding paragraph.
In particular embodiments, the user portrait may be sent to a downstream application recommendation system or an in-application team matching system, so as to make application recommendations, advertisement recommendations, or in-application function recommendations suited to the user's operation level, or to perform level-matched team formation within an application or across queues. A specific user or user group may also be identified on the basis of the user portrait.
From the above, it can be seen that a method for generating a user representation of an embodiment of the present application includes: acquiring first touch operation data and second touch operation data, wherein the first touch operation data is data generated by a user aiming at a first touch area operation of a terminal screen, and the second touch operation data is data generated by the user aiming at a second touch area operation of the terminal screen; obtaining the score of the touch representation parameter of the user according to the first touch operation data and the second touch operation data; and obtaining the user portrait of the user according to the score of the touch characterization parameter and a pre-trained user portrait prediction model. According to the method and the device, parameters such as the touch frequency, the touch interval and the touch displacement of the user in the application are determined, the user operation capacity can be evaluated more accurately through the parameters, the user level can be evaluated more effectively, the user can be classified accurately, more accurate service is provided for the user in the fields of information screening, application recommendation, team matching and the like, and the overall user experience is improved.
It should be noted that the method of the embodiment of the present application may be executed by a single device, such as a computer or a server. The method of the embodiment of the application can also be applied to a distributed scene and is completed by the mutual cooperation of a plurality of devices. In such a distributed scenario, one of the multiple devices may only perform one or more steps of the method of the embodiment, and the multiple devices interact with each other to complete the method.
It should be noted that the above-mentioned description describes specific embodiments of the present application. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments described above and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In an optional exemplary embodiment, the touch operation data includes left-hand touch operation data and right-hand touch operation data, and calculating the touch frequency, the touch interval, and the touch displacement according to the touch operation data includes: calculating the corresponding left-hand touch frequency, left-hand touch interval, and left-hand touch displacement according to the left-hand touch operation data; calculating the corresponding right-hand touch frequency, right-hand touch interval, and right-hand touch displacement according to the right-hand touch operation data; and, combining a preset left-hand weight and right-hand weight, calculating the touch frequency from the left-hand and right-hand touch frequencies, the touch interval from the left-hand and right-hand touch intervals, and the touch displacement from the left-hand and right-hand touch displacements. The user's touch operation data is thereby counted accurately.
In this embodiment, since touch operations generally involve both hands, all the touch operation data can be divided into left-hand and right-hand data, and the touch frequency, touch interval, and touch displacement of each hand counted separately. Because the function key positions assigned to each hand differ, the user's overall operation level is better reflected by giving the left-hand and right-hand touch operation data adapted weights; the two hands' data are therefore weighted by the preset left-hand weight and right-hand weight to finally obtain the combined touch frequency, touch interval, and touch displacement.
The weighted combination can be expressed as:

Score_touch-frequency = ω^f_left × Score_left-frequency + ω^f_right × Score_right-frequency
Score_touch-interval = ω^i_left × Score_left-interval + ω^i_right × Score_right-interval
Score_touch-displacement = ω^d_left × Score_left-displacement + ω^d_right × Score_right-displacement

where Score_touch-frequency, Score_touch-interval, and Score_touch-displacement are the combined touch frequency, touch interval, and touch displacement; ω^f_left and ω^f_right are the weights corresponding to the left-hand and right-hand touch frequencies; ω^i_left and ω^i_right are the weights corresponding to the left-hand and right-hand touch intervals; ω^d_left and ω^d_right are the weights corresponding to the left-hand and right-hand touch displacements; Score_left-frequency and Score_right-frequency are the left-hand and right-hand touch frequencies; Score_left-interval and Score_right-interval are the left-hand and right-hand touch intervals; and Score_left-displacement and Score_right-displacement are the left-hand and right-hand touch displacements.
In a specific application scenario, the left-hand weight and/or the right-hand weight corresponding to the touch frequency, the touch interval, and the touch displacement may be the same or different, and may be adjusted for the specific application scenario. Each weight value can be set manually in advance, or calculated by training a corresponding mathematical model.
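A minimal sketch of the left/right weighting described above, assuming the same weighted-sum form applies to each statistic; the weight and score values are hypothetical.

```python
def combine_hands(left_score, right_score, w_left, w_right):
    """Weighted combination of a left-hand and a right-hand statistic.

    The same form applies to touch frequency, touch interval, and
    touch displacement, each with its own pair of hand weights.
    """
    return w_left * left_score + w_right * right_score

# e.g. combined touch frequency with hypothetical hand weights
freq = combine_hands(120.0, 180.0, w_left=0.4, w_right=0.6)
```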
In an alternative exemplary embodiment, the characteristic time period includes a whole operation time period and one or more operation-intensive time periods, where an operation-intensive time period is a sub-period of a specific length selected within the whole operation time period in which the amount of touch operation data exceeds a set threshold. Calculating the touch frequency, the touch interval, and the touch displacement according to the touch operation data then includes: calculating the touch frequency from the touch operation data corresponding to the whole operation time period, and calculating the touch interval and the touch displacement from the touch operation data corresponding to the operation-intensive time periods. The operation level of the user is thereby reflected more truthfully.
In this embodiment, the touch interval and the touch displacement reflect the user's real operation level (such as reaction speed and manipulation skill) only during intense confrontation or frequent operation, so they are not computed over the whole period. One or more operation-intensive time periods are therefore selected within the whole operation time period. A typical selection method is to slide a time window of a specific duration over the whole period, framing out multiple candidate sub-periods, which may or may not overlap one another. The number of user operations in each sub-period is counted, and the sub-periods whose operation count exceeds a certain threshold, or those ranked in a specified top number or top proportion by operation count, are selected as the operation-intensive time periods.
And then, calculating the touch frequency through the operation whole time period, and calculating the touch interval and the touch displacement through the operation intensive time period.
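The sliding-window selection of operation-intensive periods can be sketched as follows; the window length, step, threshold, and timestamps are assumptions for illustration.

```python
def dense_periods(timestamps, window, threshold, step=None):
    """Slide a fixed-length window over the whole operation period and keep
    every window whose touch count exceeds `threshold`.

    timestamps: sorted touch times (seconds); window and step in seconds.
    Windows may overlap when step < window, matching the embodiment.
    """
    if not timestamps:
        return []
    step = step if step is not None else window
    periods = []
    start, last = timestamps[0], timestamps[-1]
    while start <= last:
        end = start + window
        count = sum(start <= t < end for t in timestamps)
        if count > threshold:
            periods.append((start, end))
        start += step
    return periods

# a burst at the beginning and another at the end of a session
taps = [0.1, 0.2, 0.3, 0.4, 5.0, 9.1, 9.2, 9.3, 9.4, 9.5]
dense = dense_periods(taps, window=1.0, threshold=3)
```

A top-N or top-percentage selection would simply sort the windows by count instead of thresholding.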
In an optional exemplary embodiment, the calculating the touch frequency according to the touch operation data includes: and determining the touch times of the corresponding touch operation data in the operation whole time period, and determining the touch frequency according to the ratio of the touch times to the total time length of the operation whole time period. Thereby accurately calculating the touch frequency.
In this embodiment, the touch frequency is determined by counting all valid touch times in the operation full-time period and by the ratio of the touch times to the duration of the operation full-time period.
Score_touch-frequency = Count_touch / Time

where Score_touch-frequency is the touch frequency, Count_touch is the number of valid touches, and Time is the total duration corresponding to the whole operation time period.
In a specific embodiment, the touch frequency of the left hand and the touch frequency of the right hand can be counted separately and then combined with the weights corresponding to the two hands to obtain an overall touch frequency, reflecting the user's actual overall touch frequency, for example during a single battle or a full play session. Generally, the more operations a user performs, the more complex the strategy being executed can be considered, and the higher the corresponding level.
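The per-period frequency computation itself reduces to a simple ratio; a minimal sketch, with hypothetical counts and durations (any consistent units work):

```python
def touch_frequency(touch_count, total_seconds):
    """Touch frequency as valid touches per unit time over the whole
    operation period."""
    return touch_count / total_seconds

score = touch_frequency(540, 180.0)  # e.g. 540 valid touches in 3 minutes
```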
In an optional exemplary embodiment, the calculating a touch interval according to the touch operation data includes: and determining the starting time and the ending time of touch operation data corresponding to the operation intensive time period, and calculating the touch interval according to the starting time and the ending time. Thereby accurately calculating the touch interval.
In this embodiment, since the touch interval is a time interval between two touches, the overall touch interval can be calculated according to the time interval between every two adjacent touches. It may specifically be:
Score_touch-interval = [ Σ_{i=1}^{n-1} (TimeStart_{i+1} - TimeEnd_i) ] / (n - 1)

where Score_touch-interval is the touch interval, n is the total number of touch operation data items in the operation-intensive time period, TimeStart_{i+1} is the start time of the (i+1)-th touch operation, and TimeEnd_i is the end time of the i-th touch operation.

In this embodiment, n touches occur within the operation-intensive time period, producing n - 1 intervals. The interval time is determined from the end time (TimeEnd_i) of the previous touch and the start time (TimeStart_{i+1}) of the next touch; all interval times are summed and divided by the number of intervals, n - 1, to produce Score_touch-interval.
In a specific embodiment, the touch intervals of the left hand and the right hand can be respectively counted, and then the touch intervals are combined with the weights corresponding to the touch intervals to obtain an overall touch interval in a comprehensive mode, so that the actual overall touch interval of the user is reflected.
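A minimal sketch of the interval statistic, using hypothetical (start, end) timestamps per touch:

```python
def touch_interval(events):
    """Mean gap between consecutive touches in the dense period.

    events: chronologically sorted list of (start_time, end_time) per
    touch; the gap is next touch's start minus previous touch's end.
    """
    n = len(events)
    gaps = [events[i + 1][0] - events[i][1] for i in range(n - 1)]
    return sum(gaps) / (n - 1)

score = touch_interval([(0.0, 0.2), (0.5, 0.6), (1.0, 1.3)])
```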
In an optional exemplary embodiment, the calculating the touch displacement according to the touch operation data includes: and determining the initial position and the end position of the touch operation data corresponding to the operation intensive time period, and calculating the touch displacement according to the initial position and the end position. Thereby accurately calculating the touch displacement.
In this embodiment, since the touch displacement is the distance the user moves between two touches, the overall touch displacement can be calculated from the end position of the previous touch and the start position of the next touch in each pair of adjacent touches. It may specifically be:

Score_touch-displacement = [ Σ_{i=1}^{n-1} Distance((x_i^end, y_i^end), (x_{i+1}^start, y_{i+1}^start)) ] / (n - 1)

where Score_touch-displacement is the touch displacement; n is the total number of touch operation data items in the operation-intensive time period; (x_i^end, y_i^end) is the end position coordinate of the i-th touch operation; (x_{i+1}^start, y_{i+1}^start) is the start position coordinate of the (i+1)-th touch operation; and Distance(·, ·) is the function computing the distance between the two points.
In this embodiment, n touches occur within the operation-intensive time period, producing n - 1 intervals. The offset between the end position (x_i^end, y_i^end) of the previous touch operation and the start position (x_{i+1}^start, y_{i+1}^start) of the next touch operation is determined for each interval; all offsets are summed and divided by the number of intervals, n - 1, to produce Score_touch-displacement.
In a specific embodiment, the touch displacements of the left hand and the right hand can be respectively counted, and then combined with the respective corresponding weights, an overall touch displacement is obtained comprehensively, so that the real overall touch displacement of the user is reflected.
By calculating the touch interval and the touch displacement, the response capability and the operation flexibility of the user in the operation intensive time period can be well reflected, so that the operation level of the user can be reflected in the two dimensions, and the finally calculated operation level can be closer to the real operation level of the user.
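A minimal sketch of the displacement statistic, using hypothetical per-touch start/end coordinates and Euclidean distance (the embodiment leaves the distance function open):

```python
import math

def touch_displacement(events):
    """Mean distance between the end point of one touch and the start
    point of the next, over the dense period.

    events: list of ((sx, sy), (ex, ey)) per touch, i.e. start and end
    coordinates, in chronological order.
    """
    n = len(events)
    dists = [
        math.dist(events[i][1], events[i + 1][0])  # end_i -> start_{i+1}
        for i in range(n - 1)
    ]
    return sum(dists) / (n - 1)

score = touch_displacement([((0, 0), (3, 4)), ((6, 8), (6, 8)), ((9, 12), (9, 12))])
```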
In an alternative exemplary embodiment, the determining the operation level of the user according to the touch frequency, the touch interval and the touch displacement includes: determining adjustment weights respectively corresponding to the touch frequency, the touch interval and the touch displacement, and determining adjustment parameters; and respectively combining the corresponding adjustment weights according to the touch frequency, the touch interval and the touch displacement, and summing the adjustment weights and the adjustment parameters to obtain the operation level.
In this embodiment, the adjustment weight reflects the proportion of the corresponding parameter when performing the operation level calculation. The tuning parameter is a specific number for tuning. Both of them may be set in advance or calculated from a model. The calculation of the operating level may be specifically expressed as:
y=kx+b
where y denotes the operation level; k is the vector of adjustment weights, k = [ω_freq, ω_interval, ω_disp], in which ω_freq denotes the adjustment weight of the touch frequency, ω_interval the adjustment weight of the touch interval, and ω_disp the adjustment weight of the touch displacement; x is the vector of the specific touch frequency, touch interval, and touch displacement, x = [Score_touch-frequency, Score_touch-interval, Score_touch-displacement]; and b is the adjustment parameter. Expanding the formula gives: y = ω_freq × Score_touch-frequency + ω_interval × Score_touch-interval + ω_disp × Score_touch-displacement + b.
In a specific embodiment, the values of k and b may be set in advance, for example manually based on experience. They may also be computed by building a regression model once a large amount of data has accumulated: based on the data of many users (touch interval, touch displacement, touch frequency, and the corresponding user operation level), a regression model is fitted with a machine learning method; the operation level of a new user can then be evaluated by feeding that user's touch interval, touch displacement, and touch frequency over a period of time into the regression model, which outputs the operation-level value. The regression model may be a multi-layer perceptron (MLP), linear regression, polynomial regression, decision tree regression, and the like. That is, the determining of the adjustment weights corresponding to the touch frequency, the touch interval, and the touch displacement, and the determining of the adjustment parameter, includes: acquiring a historical data test set and establishing a linear regression model to calculate the adjustment weights and the adjustment parameter.
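The linear scoring y = kx + b described above can be sketched directly; the weight and bias values below are hypothetical placeholders for fitted or manually set values.

```python
def operation_level(freq, interval, disp, weights, bias):
    """y = k*x + b expanded: weighted sum of the three touch statistics
    plus an adjustment parameter."""
    w_f, w_i, w_d = weights
    return w_f * freq + w_i * interval + w_d * disp + bias

# hypothetical values: reward high frequency, penalize long intervals
level = operation_level(3.0, 0.35, 5.0, weights=(10.0, -20.0, 2.0), bias=5.0)
```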
In the specific embodiment, taking linear regression as an example, the established linear regression model is:
y = ω_freq × Score_touch-frequency + ω_interval × Score_touch-interval + ω_disp × Score_touch-displacement + b
That is, y = kx + b: given a large amount of data x and y, the linear regression model must derive the values of k and b that minimize the error function. In this embodiment, a batch of data is first acquired, namely the touch intervals, touch displacements, and touch frequencies of a batch of users, together with the operation levels of those users. From these data, the values of ω_freq, ω_interval, ω_disp, and b are derived.
In a specific application scenario, if the least squares method is used for solving, the loss function is

L(k, b) = Σ_{i=1}^{m} (y_i - (k·x_i + b))²

where (x_i, y_i) represents the i-th item in the data set, namely its (touch interval, touch displacement, touch frequency) and operation level. The equation expresses that the regression model to be learned should have the smallest total squared error on the known data set:

(k*, b*) = argmin_{(k, b)} Σ_{i=1}^{m} (y_i - k·x_i - b)²
The resulting adjustment weight and adjustment parameter are, specifically:

k = [ Σ_{i=1}^{m} y_i (x_i - x_mean) ] / [ Σ_{i=1}^{m} x_i² - (1/m)(Σ_{i=1}^{m} x_i)² ]

where k is the adjustment weight, m is the total number of test samples in the historical data test set, y_i is the operation level of the i-th test sample, x_i is the vector of the touch frequency, touch interval, and touch displacement of the i-th test sample, and x_mean is the mean of all x_i.

b = (1/m) Σ_{i=1}^{m} (y_i - k·x_i)

where b is the adjustment parameter.
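A sketch of the closed-form least squares solution in the one-variable case (the embodiment's x is a vector of three statistics; the multivariate case is usually solved with matrix methods instead). The data points are invented and generated from y = 2x + 1.

```python
def least_squares_1d(xs, ys):
    """Closed-form least squares for y = k*x + b in one variable:
    k = sum(y_i*(x_i - x_mean)) / (sum(x_i^2) - (sum(x_i))^2 / m)
    b = mean(y_i - k*x_i)
    """
    m = len(xs)
    x_mean = sum(xs) / m
    k = sum(y * (x - x_mean) for x, y in zip(xs, ys)) / (
        sum(x * x for x in xs) - sum(xs) ** 2 / m
    )
    b = sum(y - k * x for x, y in zip(xs, ys)) / m
    return k, b

k, b = least_squares_1d([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```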
Based on the same concept, the application also provides user classification equipment corresponding to the method of any embodiment.
Referring to fig. 2, the user classifying device includes:
the obtaining module 210 is configured to obtain touch operation data of a user in a characteristic time period.
The calculating module 220 is configured to calculate a touch frequency, a touch interval, and a touch displacement according to the touch operation data.
A determining module 230, configured to determine an operation level of the user according to the touch frequency, the touch interval, and the touch displacement, and determine a user portrait of the user according to the operation level.
For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, the functions of the modules may be implemented in the same or multiple software and/or hardware when implementing the embodiments of the present application.
The device of the foregoing embodiment is used to implement the corresponding user classification method in the foregoing embodiment, and has the beneficial effects of the corresponding user classification method embodiment, which are not described herein again.
In an optional exemplary embodiment, the touch operation data includes: left hand touch operation data and right hand touch operation data;
the calculating module 220 is further configured to:
calculating corresponding left hand touch frequency, left hand touch interval and left hand touch displacement according to the left hand touch operation data;
calculating corresponding right hand touch frequency, right hand touch interval and right hand touch displacement according to the right hand touch operation data;
combining a preset left-hand weight and right-hand weight, calculating the touch frequency from the left-hand touch frequency and the right-hand touch frequency, calculating the touch interval from the left-hand touch interval and the right-hand touch interval, and calculating the touch displacement from the left-hand touch displacement and the right-hand touch displacement.
In an alternative exemplary embodiment, the characteristic time period includes a whole operation time period and one or more operation-intensive time periods, where an operation-intensive time period is a sub-period of a specific length selected within the whole operation time period in which the amount of touch operation data exceeds a set threshold;
the calculating module 220 is further configured to:
and calculating the touch frequency according to the touch operation data corresponding to the operation whole time period, and calculating the touch interval and the touch displacement according to the touch operation data corresponding to the operation intensive time period.
In an alternative exemplary embodiment, the calculating module 220 is further configured to:
and determining the touch times of the corresponding touch operation data in the operation whole time period, and determining the touch frequency according to the ratio of the touch times to the total time length of the operation whole time period.
In an alternative exemplary embodiment, the calculating module 220 is further configured to:
and determining the starting time and the ending time of touch operation data corresponding to the operation intensive time period, and calculating the touch interval according to the starting time and the ending time.
In an optional exemplary embodiment, the calculating module 220 calculates the touch interval according to the touch operation data, specifically:
Score_touch-interval = [ Σ_{i=1}^{n-1} (TimeStart_{i+1} - TimeEnd_i) ] / (n - 1)

where Score_touch-interval is the touch interval, n is the total number of touch operation data items in the operation-intensive time period, TimeStart_{i+1} is the start time of the (i+1)-th touch operation, and TimeEnd_i is the end time of the i-th touch operation.
In an alternative exemplary embodiment, the calculating module 220 is further configured to:
and determining the initial position and the end position of the touch operation data corresponding to the operation intensive time period, and calculating the touch displacement according to the initial position and the end position.
In an optional exemplary embodiment, the calculating module 220 calculates the touch displacement according to the touch operation data, specifically:
Score_touch-displacement = [ Σ_{i=1}^{n-1} Distance((x_i^end, y_i^end), (x_{i+1}^start, y_{i+1}^start)) ] / (n - 1)

where Score_touch-displacement is the touch displacement; n is the total number of touch operation data items in the operation-intensive time period; (x_i^end, y_i^end) is the end position coordinate of the i-th touch operation; (x_{i+1}^start, y_{i+1}^start) is the start position coordinate of the (i+1)-th touch operation; and Distance(·, ·) is the function computing the distance between the two points.
In an optional exemplary embodiment, the determining module 230 is further configured to:
determining adjustment weights respectively corresponding to the touch frequency, the touch interval and the touch displacement, and determining adjustment parameters;
and respectively combining the corresponding adjustment weights according to the touch frequency, the touch interval and the touch displacement, and summing the adjustment weights and the adjustment parameters to obtain the operation level.
In an optional exemplary embodiment, the determining module 230 is further configured to:
acquiring a historical data test set, and establishing a linear regression model to calculate the adjustment weight and the adjustment parameter;
the adjustment weight is, in particular,
k = [ Σ_{i=1}^{m} y_i (x_i - x_mean) ] / [ Σ_{i=1}^{m} x_i² - (1/m)(Σ_{i=1}^{m} x_i)² ]

where k is the adjustment weight, m is the total number of test samples in the historical data test set, y_i is the operation level of the i-th test sample, x_i is the vector of the touch frequency, touch interval, and touch displacement of the i-th test sample, and x_mean is the mean of all x_i;

the adjustment parameter is, specifically,

b = (1/m) Σ_{i=1}^{m} (y_i - k·x_i)

where b is the adjustment parameter.
Based on the same concept, corresponding to the method of any embodiment, the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the user classification method according to any embodiment is implemented.
Fig. 3 is a schematic diagram illustrating a more specific hardware structure of an electronic device according to this embodiment, where the electronic device may include: a processor 1010, a memory 1020, an input/output interface 1030, a communication interface 1040, and a bus 1050. Wherein the processor 1010, memory 1020, input/output interface 1030, and communication interface 1040 are communicatively coupled to each other within the device via a bus 1050.
The processor 1010 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an Application Specific Integrated Circuit (ASIC), or one or more Integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present disclosure.
The Memory 1020 may be implemented in the form of a ROM (Read Only Memory), a RAM (Random Access Memory), a static storage device, a dynamic storage device, or the like. The memory 1020 may store an operating system and other application programs, and when the technical solution provided by the embodiments of the present specification is implemented by software or firmware, the relevant program codes are stored in the memory 1020 and called to be executed by the processor 1010.
The input/output interface 1030 is used for connecting an input/output module to input and output information. The i/o module may be configured as a component in a device (not shown) or may be external to the device to provide a corresponding function. The input devices may include a keyboard, a mouse, a touch screen, a microphone, various sensors, etc., and the output devices may include a display, a speaker, a vibrator, an indicator light, etc.
The communication interface 1040 is used for connecting a communication module (not shown in the drawings) to implement communication interaction between the present apparatus and other apparatuses. The communication module can realize communication in a wired mode (such as USB, network cable and the like) and also can realize communication in a wireless mode (such as mobile network, WIFI, Bluetooth and the like).
The bus 1050 includes a path to transfer information between various components of the device, such as the processor 1010, memory 1020, input/output interface 1030, and communication interface 1040.
It should be noted that although the above-mentioned device only shows the processor 1010, the memory 1020, the input/output interface 1030, the communication interface 1040 and the bus 1050, in a specific implementation, the device may also include other components necessary for normal operation. In addition, those skilled in the art will appreciate that the above-described apparatus may also include only those components necessary to implement the embodiments of the present description, and not necessarily all of the components shown in the figures.
The electronic device of the foregoing embodiment is used to implement the corresponding user classification method in any of the foregoing embodiments, and has the beneficial effects of the corresponding method embodiment, which are not described herein again.
Based on the same idea, corresponding to any of the above embodiments, the present application further provides a non-transitory computer-readable storage medium storing computer instructions for causing the computer to execute the user classification method according to any of the above embodiments.
Computer-readable media of the present embodiments, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read only memory (ROM), electrically erasable programmable read only memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.
The computer instructions stored in the storage medium of the foregoing embodiment are used to enable the computer to execute the user classification method according to any one of the foregoing embodiments, and have the beneficial effects of the corresponding method embodiment, which are not described herein again.
Those of ordinary skill in the art will understand that the discussion of any embodiment above is meant to be exemplary only, and is not intended to imply that the scope of the disclosure, including the claims, is limited to these examples; within the context of the present application, features from the above embodiments or from different embodiments may also be combined, steps may be implemented in any order, and many other variations of the different aspects of the embodiments of the present application exist as described above, which are not provided in detail for the sake of brevity.
In addition, well-known power/ground connections to integrated circuit (IC) chips and other components may or may not be shown in the provided figures, for simplicity of illustration and discussion and so as not to obscure the embodiments of the application. Furthermore, devices may be shown in block diagram form in order to avoid obscuring embodiments of the application, and also because specifics of the implementation of such block diagram devices are highly dependent upon the platform within which the embodiments of the application are to be implemented (i.e., such specifics should be well within the purview of one skilled in the art). Where specific details (e.g., circuits) are set forth in order to describe example embodiments of the application, it should be apparent to one skilled in the art that the embodiments of the application can be practiced without, or with variation of, these specific details. Accordingly, the description is to be regarded as illustrative instead of restrictive.
While the present application has been described in conjunction with specific embodiments thereof, many alternatives, modifications, and variations of these embodiments will be apparent to those of ordinary skill in the art in light of the foregoing description. For example, other memory architectures (e.g., dynamic RAM (DRAM)) may use the discussed embodiments.
The present embodiments are intended to embrace all such alternatives, modifications, and variations that fall within the broad scope of the appended claims. Therefore, any omissions, modifications, substitutions, improvements, and the like that may be made without departing from the spirit and principles of the embodiments of the present application are intended to be included within the scope of the present application.

Claims (14)

1. A method of generating a user portrait, the method comprising:
acquiring first touch operation data and second touch operation data, wherein the first touch operation data is data generated by a user aiming at a first touch area operation of a terminal screen, and the second touch operation data is data generated by the user aiming at a second touch area operation of the terminal screen;
obtaining the score of the touch characterization parameter of the user according to the first touch operation data and the second touch operation data;
and obtaining the user portrait of the user according to the score of the touch characterization parameter and a pre-trained user portrait prediction model.
2. The method of claim 1, wherein the touch characterization parameters comprise at least one of:
touch frequency, touch interval, touch displacement.
3. The method of claim 2, wherein the method comprises:
calculating corresponding first touch frequency, first touch interval and first touch displacement according to the first touch operation data;
calculating a corresponding second touch frequency, a second touch interval and a second touch displacement according to the second touch operation data;
and calculating the touch frequency through the first touch frequency and the second touch frequency by combining a preset first weight and a preset second weight, calculating the touch interval through the first touch interval and the second touch interval, and calculating the touch displacement through the first touch displacement and the second touch displacement.
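The per-area combination in claim 3 can be sketched as a weighted sum; the claim fixes neither the weight values nor any function names, so everything below is illustrative:

```python
def combine_metrics(first, second, w1=0.5, w2=0.5):
    """Combine a metric computed on the first and second touch areas
    (e.g. frequency, interval, or displacement) using a preset first
    weight w1 and second weight w2, per claim 3 (assumed form)."""
    return w1 * first + w2 * second

# e.g. per-area touch frequencies of 2.0 and 4.0 touches/second
touch_frequency = combine_metrics(2.0, 4.0, w1=0.6, w2=0.4)
```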
4. The method of claim 2, wherein the obtaining the first touch operation data and the second touch operation data comprises: acquiring first touch operation data and second touch operation data in an operation whole time period and an operation-intensive time period, wherein the operation-intensive time period is a time period of a specific length selected within the operation whole time period, a selected time period being determined as the operation-intensive time period when the quantity of corresponding touch operation data within it exceeds a set threshold value;
the method comprises the following steps:
and calculating the touch frequency according to the touch operation data corresponding to the operation whole time period, and calculating the touch interval and the touch displacement according to the touch operation data corresponding to the operation intensive time period.
5. The method according to claim 4, wherein the calculating the touch frequency according to the touch operation data corresponding to the operation full time period comprises:
and determining the touch times of the corresponding touch operation data in the operation whole time period, and determining the touch frequency according to the ratio of the touch times to the total duration of the operation whole time period.
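Under the reading of claim 5 above — touch frequency as the count of touch events divided by the total duration of the operation whole time period — a minimal sketch (function and variable names are hypothetical, not from the patent):

```python
def touch_frequency(touch_events, total_duration_s):
    """Touch frequency per claim 5 (assumed interpretation):
    number of touch operations divided by the total duration
    of the operation whole time period, in seconds."""
    return len(touch_events) / total_duration_s

# 30 touch events over a 60-second whole time period -> 0.5 touches/second
freq = touch_frequency(list(range(30)), 60.0)
```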
6. The method of claim 4, wherein the calculating the touch interval and the touch displacement according to the touch operation data corresponding to the operation-intensive time period comprises:
and determining the starting time and the ending time of touch operation data corresponding to the operation intensive time period, and calculating the touch interval according to the starting time and the ending time.
7. The method of claim 6, wherein the calculating the touch interval according to the start time and the end time comprises:
Score_touch_interval = (1/(n−1)) · Σ_{i=1}^{n−1} (TimeStart_{i+1} − TimeEnd_i)
wherein Score_touch_interval is the touch interval, n is the total number of touch operation data in the operation-intensive time period, TimeStart_{i+1} is the start time of the (i+1)-th touch operation data, and TimeEnd_i is the end time of the i-th touch operation data.
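Assuming the claim-7 formula averages the gaps between consecutive touches (end of the i-th touch to start of the (i+1)-th) over the n−1 gaps, it can be sketched as:

```python
def touch_interval(events):
    """events: list of (start_time, end_time) tuples ordered in time.
    Mean gap between the end of one touch and the start of the next,
    per the reconstructed claim-7 formula (normalization assumed)."""
    n = len(events)
    gaps = [events[i + 1][0] - events[i][1] for i in range(n - 1)]
    return sum(gaps) / (n - 1)

# three touches with gaps of 0.2 s and 0.4 s -> mean interval 0.3 s
interval = touch_interval([(0.0, 1.0), (1.2, 2.0), (2.4, 3.0)])
```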
8. The method of claim 4, wherein the calculating the touch interval and the touch displacement according to the touch operation data corresponding to the operation-intensive time period comprises:
and determining the initial position and the end position of the touch operation data corresponding to the operation intensive time period, and calculating the touch displacement according to the initial position and the end position.
9. The method of claim 8, wherein the calculating the touch displacement according to the start position and the end position comprises:
Score_touch_displacement = (1/(n−1)) · Σ_{i=1}^{n−1} dist((x_i^end, y_i^end), (x_{i+1}^start, y_{i+1}^start))
wherein Score_touch_displacement is the touch displacement, n is the total number of touch operation data in the operation-intensive time period, (x_i^end, y_i^end) are the end position coordinates of the i-th touch operation data, (x_{i+1}^start, y_{i+1}^start) are the start position coordinates of the (i+1)-th touch operation data, and dist(·, ·) is a function that computes the distance between the two points.
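Under the same reading, the claim-9 touch displacement averages the distance from each touch's end position to the next touch's start position. The claims do not fix the distance function; the sketch below assumes Euclidean distance:

```python
import math

def touch_displacement(events):
    """events: list of ((x_start, y_start), (x_end, y_end)) per touch,
    ordered in time. Mean distance from the end position of the i-th
    touch to the start position of the (i+1)-th (claim 9, reconstructed;
    Euclidean distance assumed)."""
    n = len(events)
    dists = [
        math.dist(events[i][1], events[i + 1][0])  # end_i -> start_{i+1}
        for i in range(n - 1)
    ]
    return sum(dists) / (n - 1)

# end of touch 1 at (0, 0), start of touch 2 at (3, 4) -> distance 5
disp = touch_displacement([((0, 0), (0, 0)), ((3, 4), (5, 5))])
```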
10. The method of claim 2, wherein obtaining the score for the touch characterization parameter of the user from the first touch operation data and the second touch operation data comprises:
determining adjustment weights respectively corresponding to the touch frequency, the touch interval and the touch displacement, and determining adjustment parameters;
and respectively combining the corresponding adjustment weights according to the touch frequency, the touch interval and the touch displacement, and summing the adjustment weights and the adjustment parameters to obtain the score of the touch characterization parameters of the user.
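The weighted sum in claim 10 can be sketched as follows; the weight and parameter values shown are purely illustrative, since the claims leave them to the regression of claim 11:

```python
def touch_score(freq, interval, disp, weights, b):
    """Score of the touch characterization parameters per claim 10
    (assumed form): weighted sum of touch frequency, touch interval,
    and touch displacement, plus an adjustment parameter b."""
    k_f, k_i, k_d = weights  # adjustment weights (hypothetical names)
    return k_f * freq + k_i * interval + k_d * disp + b

score = touch_score(0.5, 0.3, 5.0, weights=(1.0, -2.0, 0.1), b=0.5)
```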
11. The method of claim 10, wherein determining the adjustment weights corresponding to the touch frequency, the touch interval and the touch displacement respectively, and determining an adjustment parameter comprises:
acquiring a historical data test set, and establishing a linear regression model to calculate the adjustment weight and the adjustment parameter;
the adjustment weight is, specifically,
k = (Σ_{i=1}^{m} y_i · (x_i − x̄)) / (Σ_{i=1}^{m} x_i² − (1/m) · (Σ_{i=1}^{m} x_i)²)
wherein k is the adjustment weight, m is the total number of test samples in the historical data test set, y_i is the operating level of the i-th test sample, x_i is the vector of the touch frequency, touch interval, and touch displacement of the i-th test sample, and x̄ is the mean of all x_i;
the adjustment parameter is, specifically,
b = (1/m) · Σ_{i=1}^{m} (y_i − k · x_i)
wherein b is the adjustment parameter.
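The closed-form expressions in claim 11 match the ordinary least-squares estimates for simple linear regression. Assuming each x_i is treated as a scalar (or the fit is run per component of the vector), a minimal sketch:

```python
def fit_linear(xs, ys):
    """Least-squares slope k and intercept b, matching the
    reconstructed claim-11 formulas (scalar x assumed)."""
    m = len(xs)
    x_mean = sum(xs) / m
    # k = sum(y_i * (x_i - mean(x))) / (sum(x_i^2) - (sum(x_i))^2 / m)
    k = sum(y * (x - x_mean) for x, y in zip(xs, ys)) / (
        sum(x * x for x in xs) - (sum(xs) ** 2) / m
    )
    # b = mean of the residuals y_i - k * x_i
    b = sum(y - k * x for x, y in zip(xs, ys)) / m
    return k, b

# perfectly linear data y = 2x + 1 recovers k = 2, b = 1
k, b = fit_linear([1.0, 2.0, 3.0], [3.0, 5.0, 7.0])
```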
12. An apparatus for generating a user portrait, comprising:
an acquisition module, configured to acquire first touch operation data and second touch operation data, wherein the first touch operation data is data generated by a user operating on a first touch area of a terminal screen, and the second touch operation data is data generated by the user operating on a second touch area of the terminal screen;
a calculation module, configured to obtain a score of a touch characterization parameter of the user according to the first touch operation data and the second touch operation data; and
a determination module, configured to obtain a user portrait of the user according to the score of the touch characterization parameter and a pre-trained user portrait prediction model.
13. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any of claims 1 to 11 when executing the program.
14. A non-transitory computer-readable storage medium storing computer instructions for causing a computer to implement the method of any one of claims 1 to 11.
CN202210374651.1A 2022-04-11 2022-04-11 Method, device, electronic device and storage medium for generating user portrait Pending CN114742154A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210374651.1A CN114742154A (en) 2022-04-11 2022-04-11 Method, device, electronic device and storage medium for generating user portrait

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210374651.1A CN114742154A (en) 2022-04-11 2022-04-11 Method, device, electronic device and storage medium for generating user portrait

Publications (1)

Publication Number Publication Date
CN114742154A true CN114742154A (en) 2022-07-12

Family

ID=82281220

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210374651.1A Pending CN114742154A (en) 2022-04-11 2022-04-11 Method, device, electronic device and storage medium for generating user portrait

Country Status (1)

Country Link
CN (1) CN114742154A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117180730A (en) * 2023-09-08 2023-12-08 广州火石传娱科技有限公司 Toy gun system processing method and system applied to image positioning
CN117180730B (en) * 2023-09-08 2024-03-19 广州火石传娱科技有限公司 Toy gun system processing method and system applied to image positioning

Similar Documents

Publication Publication Date Title
US20190288918A1 (en) Forecasting computer resources demand
JP6145387B2 (en) User matching method and system
CN107335220B (en) Negative user identification method and device and server
US11058957B2 (en) Method, device and system for game difficulty assessment
CN106022505A (en) Method and device of predicting user off-grid
CN109784959B (en) Target user prediction method and device, background server and storage medium
CN108322317A (en) A kind of account identification correlating method and server
CN111260449B (en) Model training method, commodity recommendation device and storage medium
KR20160117097A (en) Education performance predict method and system using estimation filter
WO2019148587A1 (en) Competition object matching method in learning competition and apparatus
CN110461430A (en) The matching of game on line is carried out with streaming player
CN113101655A (en) Virtual prop recommendation method, device, equipment and storage medium
CN108512883A (en) A kind of information-pushing method, device and readable medium
JP2017153783A (en) Matching method and matching system of user in game
CN114742154A (en) Method, device, electronic device and storage medium for generating user portrait
US20200192930A1 (en) Method and device for assessing quality of multimedia resource
CN110276404B (en) Model training method, device and storage medium
US11179631B2 (en) Providing video game content to an online connected game
CN114342411B (en) Method for providing one or more sets of graphics parameters, computer executing program for implementing method for providing one or more sets of graphics parameters
KR20150114169A (en) The Learning Type system and method For Sports fixture
KR101288476B1 (en) Method and server for providing reward item
CN113780415B (en) User portrait generating method, device, equipment and medium based on applet game
US20150170035A1 (en) Real time personalization and categorization of entities
CN110851724B (en) Article recommendation method based on self-media number grade and related products
CN111694753A (en) Application program testing method and device and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination