GB2575236A - User and/or application profiles - Google Patents

User and/or application profiles

Info

Publication number: GB2575236A
Authority: GB (United Kingdom)
Prior art keywords: user, application, usage, data, profiles
Legal status: Withdrawn (the legal status is an assumption and is not a legal conclusion)
Application number: GB1807658.8A
Other versions: GB201807658D0 (en)
Inventors: Ansorregui Lobete Daniel; Saà-Garriga Albert; Munikrishnappa Ramesh; Palavedu Saravanan Karthikeyan
Current assignee: Samsung Electronics Co Ltd
Original assignee: Samsung Electronics Co Ltd
Application filed by Samsung Electronics Co Ltd
Priority application: GB1807658.8A, published as GB201807658D0 and GB2575236A
Related application: PCT/KR2019/005574, published as WO2019216671A1


Classifications

    • G06F21/6254: Protecting personal data, e.g. for financial or medical purposes, by anonymising data, e.g. decorrelating personal data from the owner's identification
    • G06F21/316: User authentication by observing the pattern of computer usage, e.g. typical user behaviour
    • G06F3/038: Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F2203/011: Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • G06F2203/0382: Plural input, i.e. interface arrangements in which a plurality of input devices of the same type are in communication with a PC
    • G16H20/70: ICT specially adapted for therapies or health-improving plans relating to mental therapies, e.g. psychological therapy or autogenous training
    • G16H50/20: ICT specially adapted for medical diagnosis, medical simulation or medical data mining, for computer-aided diagnosis, e.g. based on medical expert systems

Abstract

An apparatus or method including: obtaining user input (e.g. swipes, touches, pressure applied, accelerometer data, etc.) from one or more user devices (e.g. a mobile phone or games controller) while the user is accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature; and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server). A usage signature may be a vector, and the input data may be converted into the vector using sparse coding and/or principal component analysis. Usage signatures may be anonymized. A user profile may be based on usage signatures from a plurality of applications. User profiles may be used to identify a change in a user and restrict or block access to the device. An application profile may also be based on usage signatures from a plurality of users. User profiles may also identify user performance feedback over time, a potential health problem or an emotional status of the user. Assessing the similarity of application profiles may be used for targeted advertising.

Description

The present specification relates to user and/or application profiles and relates, for example, to the use of usage signatures for generating user and/or application profiles.
Background
Application profiles can be generated based on user experiences of using such applications. Such data tends to be unreliable since only a small number of users are typically willing to provide such data. There is a need for improved methods for generating application profiles and user profiles.
Summary
In a first aspect, this specification describes a method comprising: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server). In some embodiments, each usage signature is anonymized (in order, for example, to provide for user privacy).
Each application profile may be based on usage signatures of a plurality of users of the respective application (e.g. an average, or some other combination, of the relevant usage signatures). Thus, one method may characterise an application by how multiple users interact with an application.
Each user profile may be based on usage signatures of a single user for a plurality of applications (e.g. an average, or some other combination, of the relevant usage signatures). Thus, one method may characterise a user by how they interact with multiple applications.
The user inputs can take many forms. By way of example, the user inputs may include one or more of: swipes of one or more of the input devices; touches of one or more of the input devices; pressure applied to one or more of the input devices; pressure size applied to one or more of the input devices; buttons (or other input devices) pressed; pressure applied to buttons (or other input devices); joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data; and other external data. For example, one or more of the user inputs may be labelled with external information.
Each user input may be converted into a usage signature using sparse coding and/or principal component analysis. Alternatively, or in addition, each user input may be converted into a usage signature using a function that generates a unique fingerprint (e.g. without storing the user input data). Thus, in some embodiments, the usage signature may be generated by any suitable one-way process (from which the relevant user input cannot be recovered). The use of such a one-way process may have security or privacy advantages.
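The one-way fingerprint mentioned above can be illustrated with a cryptographic hash, which is one possible (but not the only) one-way process; the event tuples and salt below are hypothetical stand-ins, not part of the specification:

```python
import hashlib

def usage_fingerprint(events, salt=b"per-device-salt"):
    """Derive a fixed-length, one-way fingerprint from raw input events.

    The raw events (e.g. swipe directions and timestamps) cannot be
    recovered from the digest, which may help with user privacy.
    """
    h = hashlib.sha256(salt)
    for event in events:
        h.update(repr(event).encode("utf-8"))
    return h.hexdigest()

# Two different input streams yield different fingerprints, but
# neither stream can be reconstructed from its digest.
sig_a = usage_fingerprint([("swipe_left", 0.12), ("tap", 0.50)])
sig_b = usage_fingerprint([("swipe_right", 0.30)])
```

Note that a hash like this only supports exact matching; signatures that must be compared for similarity would instead come from a distance-preserving reduction such as sparse coding or principal component analysis.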
One or more of the user profile(s) may be used to identify a likely change in a user of a user device. In one embodiment, access to some or all functions of a user device is blocked on identifying the likely change in the user of a user device.
One or more of the user profile(s) may be used to provide user performance feedback over time.
One embodiment may include identifying changes in user profile data indicative of a potential health problem. The potential health problem(s) may include one or more short-term health problems (such as tiredness). Alternatively, or in addition, the potential health problem(s) may include one or more long-term health problems (which may, for example, be indicative of a more serious health concern).
One or more of the user profile(s) may be used to predict an emotional status of a user. The emotional status may be predicted based on one or more usage signatures and/or a change in one or more usage signatures. Alternatively, or in addition, external information (such as biometric data) regarding a user may be collected, for example for use in emotional status prediction.
One or more of the user profile(s) may be used for user behaviour prediction and/or user profile prediction.
One or more of said application profiles may be used to generate user feedback for the respective application. In some embodiments, user feedback (e.g. anonymised user feedback) may be provided, for example, to a game developer and/or an application developer.
Similar applications may be identified on the basis of applications having similar application profiles. Alternatively, or in addition, similar users may be identified on the basis of users having a similar usage signature. Alternatively, or in addition, similar user groups may be identified on the basis of users having similar usage signatures for a given application.
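One way to sketch "similar application profiles" is with cosine similarity between profile vectors; the profile values below are invented purely for illustration and do not come from the specification:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two profile vectors (1.0 means identical direction)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical aggregated application profiles (usage-signature averages).
puzzle_game = [0.9, 0.1, 0.2]
word_game = [0.8, 0.2, 0.1]
racing_game = [0.1, 0.9, 0.8]

# The puzzle game's profile points in nearly the same direction as the
# word game's, so that pair could be flagged as "similar" applications.
sim_word = cosine_similarity(puzzle_game, word_game)
sim_racing = cosine_similarity(puzzle_game, racing_game)
```

The same comparison applied to user profiles (rather than application profiles) would identify similar users or user groups.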
Applications may be suggested to a user based on an identified, determined or predicted emotional status of the user. Such emotions may be identified, determined or predicted based on one or more usage signatures. Biometric data for a user may be used during the identification, determination or prediction of emotions.
In a second aspect, this specification describes an apparatus configured to perform any method as described with reference to the first aspect.
In a third aspect, this specification describes computer-readable instructions which, when executed by computing apparatus, cause the computing apparatus to perform any method as described with reference to the first aspect.
In a fourth aspect, this specification describes an apparatus comprising: an input for receiving a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); a converter for converting each of a plurality of user inputs into a usage signature (which may, for example, be a vector); and an aggregator for aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server). In some embodiments, each usage signature is anonymized (in order, for example, to provide for user privacy).
The aggregator may be configured to generate each application profile based on usage signatures of a plurality of users of the respective application. The aggregator may generate the application profile(s) from an average of the relevant usage signatures (although other methods, such as the use of alternating least squares, are possible).
The aggregator may be configured to generate each user profile based on usage signatures of a single user for a plurality of applications. The aggregator may generate the user profile(s) from an average of the relevant usage signatures (although other methods, such as the use of alternating least squares, are possible).
The user inputs may include one or more of: swipes of one or more of the input devices; touches of one or more of the input devices; pressure applied to one or more of the input devices; pressure size applied to one or more of the input devices; buttons (or other input devices) pressed; pressure applied to buttons (or other input devices); joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data; and other external data.
For example, one or more of the user inputs may be labelled with external information.
The converter may comprise a sparse coding module and/or a principal component analysis module. The converter may comprise a module for converting each user input into a usage signature using a function that generates a unique fingerprint (e.g. without storing the user input data). Thus, in some embodiments, the usage signature may be generated by any suitable one-way process (from which the relevant user input cannot be recovered).
The apparatus of the fourth aspect may further comprise an output module. The output module may be configured to identify one or more of: a likely change in a user of a user device; user performance feedback over time; changes in user profile data indicative of a potential health problem; an emotional status of a user; user feedback for the respective application; similar applications, on the basis of applications having similar application profiles; similar users, on the basis of users having a similar usage signature; similar user groups, on the basis of users having similar usage signatures for a given application; user behaviour prediction; user profile prediction; and applications based on an identified, determined or predicted emotional status of a user.
In a fifth aspect, this specification describes a computer-readable medium having computer-readable code stored thereon, the computer-readable code, when executed by at least one processor, causing performance of: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (wherein each usage signature may be anonymized); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
In a sixth aspect, this specification describes an apparatus comprising: means for obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); means for converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and means for aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
In a seventh aspect, this specification describes a non-transitory computer-readable medium comprising program instructions stored thereon for performing at least the following: obtaining a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); converting each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
In an eighth aspect, this specification describes an apparatus comprising: at least one processor; and at least one memory including computer program code which, when executed by the at least one processor, causes the apparatus to: obtain a plurality of user inputs from one or more user devices (such as one or more mobile communication devices and/or one or more games controllers), each user device being used by a user for accessing an application (such as a computer game); convert each of the plurality of user inputs into a usage signature (which may, for example, be a vector) (wherein each usage signature may be anonymized); and aggregate some or all of said usage signatures into one or more application profiles and/or one or more user profiles (e.g. at a server).
Brief description of the drawings
Example embodiments will now be described, by way of non-limiting examples, with reference to the following schematic drawings, in which:
Figure 1 is a block diagram of an example system;
Figure 2 is a flow chart showing an algorithm in accordance with an example embodiment;
Figure 3 is a block diagram of a portion of a system in accordance with an example embodiment;
Figure 4 shows an example user device that may be used in example embodiments;
Figures 5 and 6 show example data signals provided for use with example embodiments;
Figures 7 to 9 are block diagrams of systems in accordance with example embodiments;
Figures 10 to 18 are flow charts showing algorithms in accordance with example embodiments; and
Figure 19 is a block diagram of a system in accordance with an example embodiment.
Detailed description
Figure 1 is a block diagram of an example system, indicated generally by the reference numeral 1. The system 1 includes a processor 10. The processor has an input receiving user information and provides an output.
By way of example, the system 1 may be used to recommend applications to a user based on information about the user that is input to the processor 10. The user information may include details such as: the age of the user, the number of applications that the user has installed, user demographics, and details of the user device. Based on the user information provided, the processor may suggest applications that might be of interest to the user. The system 1 is not able to make use of information regarding how the user interacts with one or more existing applications in the generation of the output (e.g. how the relevant applications are actually used).
Figure 2 is a flow chart showing an algorithm, indicated generally by the reference numeral 20, in accordance with an example embodiment.
The algorithm 20 starts at step 22, where user input from one or more users is collected. The user inputs may take many forms. Figure 3 is a block diagram of a portion of a system, indicated generally by the reference numeral 30, showing examples of user input data that might be collected in the step 22. The system 30 includes a collect user input(s) module 32 that receives inputs including one or more of the following: swipes made on a user interface, pressure applied to a user interface (and/or to a button), degree of pressure applied (to the user interface and/or to the button), user reaction time, user playing style, user hand speed, user hand position, information relating to touches on the user interface, joystick positions, gyroscopic data and accelerometer data.
The inputs shown in Figure 3 are provided by way of example only. In any instance of the step 22 of the algorithm 20, some of those inputs may be omitted and/or other inputs may be provided. For example, the inputs considered in the step 22 of the algorithm 20 may include inputs that are not related to the direct user interaction with a user device. For example, biometric data (such as heart rate data) for a user may be provided as an input. Such biometric data may be used, for example, in determining an emotion of the user (e.g. tired, excited, stimulated, bored etc.). Other external user input data sources (not related to user inputs or biometric data) could also be provided, as shown in Figure 3. For example, at least some user input data may be labelled with external information or data (such as biometric data or user emotion data).
At step 24 of the algorithm 20, usage signatures are generated based on the user information collected in the step 22. Such usage signatures may, for example, be an interaction profile indicating how a particular user interacts with a particular application. As discussed further below, the usage signatures may, for example, be modified from the user data such that the data is anonymised.
At step 26, one or more profiles are generated based on the usage signatures. As discussed further below, the generated profile(s) may include a user profile (relevant to a particular user) and/or an application profile (relating to a particular application). The profiles may be generated by aggregating data from multiple usage signatures. A user profile may include user information that is not related to the user's interaction with one or more applications (such as biometric data, or some other data indicative of user emotion).
Figure 4 shows an example user device 40 that may be used in example embodiments. The user device 40 may be used as a user input device (for example for providing input to an application, such as a computer game). The user device 40 may provide at least some of the data provided to the collect user input(s) module 32 described above. The user device 40 may, for example, be a mobile phone or similar device, and/or a games controller. Alternative implementations for the user device 40 are possible.
A number of potential interactions between a user and the user device are shown in Figure 4. These include a swipe left command 42a, a swipe right command 42b, a user device tilting 44 and user device shaking 46.
Figure 5 shows example data signals, indicated generally by the reference numeral 50, that might be obtained from the user device 40. The data signals 50 show a swipe signature plotted against time. The swipe signature includes left facing arrows indicative of a swipe left command 42a and right facing arrows indicative of a swipe right command 42b. The position of the arrows on the x-axis indicates the time (and duration) of the swipe. The position of the arrows on the y-axis can be used to indicate the vertical position on the user device screen of the swipe command.
Figure 6 shows an example data signal, indicated generally by the reference numeral 60, that might be obtained from the user device 40. The data signal 60 shows a tilt signature plotted over time. The tilt signature is derived from the user device tilting 44 discussed above. The plot 60 plots the degree of tilt (on the y-axis) against time.
The data signals 50 and 60 are two examples of data signal formats and are provided by way of example only. Many other data signal formats are possible and may be used, as will be readily apparent to those skilled in the art.
Figure 7 is a block diagram of a system, indicated generally by the reference numeral 70, in accordance with an example embodiment. The system 70 includes a user inputs module 72, a data compression module 74, a database 76 and a processor 78.
The user inputs module 72 may take the form of one or more collect user input(s) modules 32 described above. The module 72 may therefore implement the step 22 of the algorithm 20 described above. The user inputs module 72 may include one or more user devices 40.
The data compression module 74 is used to anonymize the user inputs obtained from the user inputs module 72. The module 74 may therefore implement the step 24 of the algorithm 20 described above. The module 74 may compress the user input data using sparse coding or some other form of principal component analysis (PCA) to generate a usage signature vector. PCA is a technique that can be used to reduce the dimensionality of data, for example by applying statistical procedures to the underlying data. The compression carried out by the module 74 may be used to reduce the quantity of data and to increase privacy by preventing the original data from being reconstructed from the compressed data. Alternative approaches to compressing the data might be a machine learning neural network encoder or manually created statistics. The data generated by the data compression module 74 may, for example, be referred to as a fingerprint or a signature. The fingerprint or signature may be provided as a vector. The fingerprint or signature may include other user information (such as general user information, such as age, user demographics etc., and/or measured user information, such as heart rate and other biometric data). The fingerprint(s) or signature(s) may be generated by any suitable one-way process (from which the user data cannot be recovered). Such techniques include machine learning techniques, such as autoencoders, data statistics such as mean/variance/probability distribution, or manually designed methods.
Consider, for example, the data shown in Figures 5 and 6. The data compression step may compress the data such that the information content is available, but the original data (such as the time, duration and position of the swipe data in Figure 5 and the shape of each tilt signature in Figure 6) cannot be reconstructed from the compressed data. For example, the swipe signature may be compressed so that the time and distance between swipes can be recovered, but the location of each swipe on the user interface screen cannot be reconstructed.
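The PCA-style compression described above can be sketched as follows; the data here is randomly generated stand-in data (real inputs would be swipe, tilt and pressure features), and the feature count and number of retained components are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Stand-in usage data: 100 interaction windows, each described by
# 8 raw features (e.g. swipe speed, tilt angle, applied pressure).
raw = rng.normal(size=(100, 8))

# PCA via singular value decomposition: centre the data and project
# it onto the top 3 principal components.
centred = raw - raw.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
signatures = centred @ vt[:3].T

# Each 8-dimensional window is reduced to a 3-dimensional usage
# signature; the discarded components cannot be recovered from it.
```

Because the projection discards five of the eight dimensions, the original windows cannot be reconstructed exactly from the signatures, which mirrors the privacy property discussed above.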
The compressed data may be stored at the database 76. The database 76 may be located, for example, at a server and may store data relating to many users, obtained from many user inputs. By storing usage signatures generated by one or more instances of the data compression module 74, the database 76 does not need to store information that can be used to reconstruct the original data provided by the user inputs 72. This may be advantageous for privacy or security reasons.
The processor 78 has access to the database 76 and therefore has access to the compressed data. The processor 78 may be used to extract information from the stored data, as discussed further below. The processor 78 may, for example, extract or generate one or more user profiles and/or one or more application profiles from the data stored in the database 76, thereby implementing the step 26 of the algorithm 20 described above.
The processor 78 may aggregate information from multiple data sources in order to generate the user and/or application profiles. For example, as discussed below, data from a single user across multiple applications may be aggregated to generate a user profile, and data from multiple users for a single application may be aggregated to generate an application profile. The aggregation may involve a simple average, but other methods, such as the use of alternating least squares, are possible.
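The simple-average aggregation mentioned above might look like the following; the signature values are invented purely for illustration:

```python
import numpy as np

# Hypothetical usage-signature vectors for one application, one row
# per user of that application.
signatures = np.array([
    [0.9, 0.1, 0.4],   # user A
    [0.8, 0.2, 0.5],   # user B
    [0.7, 0.3, 0.6],   # user C
])

# An application profile formed as the element-wise average of the
# usage signatures of all users of the application.
application_profile = signatures.mean(axis=0)
```

A user profile would be formed the same way, averaging one user's signatures across applications; a more elaborate scheme such as alternating least squares could replace the plain mean.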
The system 70 described above assumes that the aggregation of data to generate user and application profiles is carried out by the processor 78. This is not essential. For example, data could be aggregated within the database 76.
Figure 8 is a block diagram of a system, indicated generally by the reference numeral 80, in accordance with an example embodiment.
The system 80 comprises a first user module 82, a second user module 83 and a third user module 84. Each of the user modules 82 to 84 may comprise a user input device and a data compression module (such as the user input device 72 and data compression module 74 described above). The system 80 further comprises a database 86 and a user profile generation module 87. The database 86 may be the database 76 of the system 70 described above. The user profile generation module 87 may be an example of the processor 78 described above.
As shown in Figure 8, the first user module 82 is used by a first user to access a first application, the second user module 83 is used by the first user to access a second application and the third user module 84 is used by the first user to access a third application. Of course, the first, second and third modules 82 to 84 could be the same user module used (e.g. at different times) to access different applications.
The database 86 can therefore store data (which may be compressed and/or anonymized) relating to how the first user interacts with multiple applications. The user profile generation module 87 can therefore use the data stored in the database 86 to create a user profile for the first user including information across multiple applications.
Figure 9 is a block diagram of a system, indicated generally by the reference numeral 90, in accordance with an example embodiment. The system 90 comprises a first user module 92, a second user module 93 and a third user module 94. Each of the user modules 92 to 94 may comprise a user input device and a data compression module (such as the user input device 72 and data compression module 74 described above). The system 90 further comprises a database 96 and a user profile generation module 97. The database 96 may be the database 76 of the system 70 described above. The user profile generation module 97 may be an example of the processor 78 described above.
As shown in Figure 9, the first user module 92 is used by a first user to access a first application, the second user module 93 is used by a second user to access the first application and the third user module 94 is used by a third user to access the first application. Of course, the first, second and third modules 92 to 94 could be the same module used (at different times) by different users to access the first application.
The database 96 can therefore store data (which may be compressed and/or anonymized) relating to how different users interact with the first application. The user profile generation module 97 can therefore use the data stored in the database 96 to create an application profile for the first application including information across multiple users.
The system 90 may additionally comprise a fourth user module 98 and a fifth user module 99. Each of the user modules 98 and 99 may comprise a user input device and a data compression module (such as the user input device 72 and data compression module 74 described above). As shown in Figure 9, the fourth user module 98 may be used by the first user to access a second application and the fifth user module 99 may be used by the first user to access a third application. The database 96 can therefore store data relating to how the first user interacts with multiple applications. Thus, the user profile generation module 97 may also be able to create a user profile for the first user including information across multiple applications.
As discussed above, the system 80 can be used to generate a user profile and the system 90 can be used to generate an application profile (and optionally a user profile). Such user profiles and application profiles are examples of the profile(s) that may be generated in the step 26 of the algorithm 20 discussed above.
User and application profiles generated in accordance with the principles described above have a wide variety of potential uses. A number of potential uses are described below with reference to Figures 10 to 18. It should be understood that these are some of many example uses of such user and application profiles that will be apparent to those skilled in the art.
Figure 10 is a flow chart showing an algorithm, indicated generally by the reference numeral 100, in accordance with an example embodiment. The algorithm 100 starts at operation 102, where user input data is collected, as described above. A user profile may be generated based on a user that is currently using a particular application or device.
At step 104, the user data (e.g. user profile) of a person currently using the particular application or device is compared with the user profile of the normal user of the application or device. A difference between the normal user and the current user may be indicative of an unauthorised user. If no difference is detected, the algorithm terminates at step 108. If a difference is detected, the algorithm moves to step 106 where access to the device or application may be restricted or prohibited, before the algorithm terminates at step 108.
By way of example, the step 106 may allow access to a device but prevent certain function(s), such as payment functions. This would enable, for example, a child to use a mobile phone owned by a parent to access games, while the detection of a different user (based on a user profile generated from game-play style) may be used to prevent the child from authorising payments from the mobile phone.
There are further potential uses for the algorithm 100. For example, the user identification process could be used as an alternative to other device locking methods (e.g. as an alternative to providing a password).
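The comparison of steps 104 to 106 admits a very short realisation. The sketch below is illustrative only and is not the claimed implementation: the current usage signature is compared with the stored profile of the normal user, and a distance above a threshold (the threshold value and the restricted-function set are invented for the example) triggers restriction of sensitive functions:

```python
def euclidean(a, b):
    # Distance between two equal-length usage-signature vectors.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def check_user(current_signature, normal_profile, threshold=1.0):
    """Return the set of functions to restrict. A large distance between
    the current signature and the stored profile is treated as a likely
    change of user, so sensitive functions (e.g. payments) are blocked."""
    if euclidean(current_signature, normal_profile) > threshold:
        return {"payments"}  # restricted functions (illustrative)
    return set()             # no significant difference detected

normal = [0.3, 0.7, 0.2]
ok = check_user([0.31, 0.69, 0.21], normal)   # close to the profile
restricted = check_user([2.0, 3.0, 1.5], normal)  # very different style
```

In the parent-and-child example above, `ok` would correspond to the parent playing and `restricted` to the child's distinct play style being detected.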
Figure 11 is a flow chart showing an algorithm, indicated generally by the reference numeral 110, in accordance with an example embodiment. The algorithm 110 starts at operation 112, where user input data is collected, as described above.
At step 114, an indication of user performance is generated. The user performance indication may be generated from the user data stored within a database (e.g. database 76, 86 or 96). By way of example, the performance indication could be based on reaction time and/or user accuracy, which may be derivable from the stored data. The step could be based on data for a single application (such as a game), but could also be provided for a single user across multiple applications. Performance data for multiple users (of a single application or across multiple applications) could also be provided.
The step 114 could additionally output information regarding how the user’s performance has changed over time and could compare the user’s performance with other users, based on the user profile data stored in the database.
With the performance data indicated to the user, the algorithm 110 terminates at step 116.
The algorithm 110 can be adapted to provide different outputs (in addition to, or instead of, performance data). For example, the step 114 could provide a prediction of an emotional status of a user (for example whether the user is one or more of: content, excited, calm, tired and bored). Other uses of the principles of the algorithm 110 will be apparent to those skilled in the art.
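The performance indication of step 114 can be sketched as a simple summary over the stored data. The function and field names below are hypothetical; reaction time and accuracy are the two quantities the description says may be derivable from the stored data:

```python
def performance_indication(reaction_times_ms, hits, attempts):
    """Summarise user performance from stored input data: the mean
    reaction time and the hit accuracy. The exact summary statistics
    are illustrative, not prescribed by the application."""
    mean_rt = sum(reaction_times_ms) / len(reaction_times_ms)
    accuracy = hits / attempts
    return {"mean_reaction_ms": mean_rt, "accuracy": accuracy}

# Hypothetical stored data for one user of one game.
perf = performance_indication([250, 300, 350], hits=18, attempts=20)
```

Comparable summaries computed over different time windows would support the change-over-time and user-versus-user comparisons mentioned above.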
Figure 12 is a flow chart showing an algorithm, indicated generally by the reference numeral 120, in accordance with an example embodiment. The algorithm 120 starts at operation 122, where user input data is collected, as described above.
At step 124, the user profile data is interrogated to determine whether the data indicates any potential short-term health issues. If so, the algorithm 120 moves to step 126; if not, the algorithm terminates at step 128.
Short-term health issues may be identified in step 124 by identifying a change in user performance, as indicated by the user profile data (e.g. comparing current performance with historical performance). The step 124 could, for example, be used to identify fatigue and may, for example, indicate that the user should take a break. In the event that a short-term health issue is identified, an alert is raised to the user at step 126 (e.g. recommending or requiring a break) and the algorithm 120 then terminates at step 128.
Figure 13 is a flow chart showing an algorithm, indicated generally by the reference numeral 130, in accordance with an example embodiment. The algorithm 130 starts at operation 132, where user input data is collected, as described above.
At step 134, the user profile data is interrogated to determine whether the data indicates any potential long-term health issues. If so, the algorithm 130 moves to step 136; if not, the algorithm terminates at step 138.
Long-term health issues may be identified in step 134 by identifying a change in user performance, as indicated by the user profile data (e.g. comparing current performance with historical performance). The step 134 could, for example, be used to identify changes in levels of attention or reaction time. In the event that a long-term health issue is identified, an alert is raised to the user at step 136 and the algorithm 130 then terminates at step 138.
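Both the short-term check of algorithm 120 and the long-term check of algorithm 130 amount to comparing recent performance against a historical baseline. The sketch below is one plausible realisation; the window size and the drop ratio are invented for the example and nothing in the application fixes them:

```python
def detect_performance_drop(historical, recent, short_window=3, drop_ratio=0.8):
    """Flag a potential issue when recent scores fall well below the
    historical mean. Applied over a short recent window this suggests a
    short-term issue such as fatigue; applied over a long series it would
    indicate a long-term change in, say, attention or reaction time."""
    baseline = sum(historical) / len(historical)
    window = recent[-short_window:]
    recent_mean = sum(window) / len(window)
    return recent_mean < drop_ratio * baseline

# Hypothetical performance scores (higher is better).
fatigued = detect_performance_drop([100, 98, 102, 100], [70, 72, 68])
steady = detect_performance_drop([100, 98, 102, 100], [99, 101, 100])
```

A `True` result would correspond to raising the alert of step 126 or step 136.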
Figure 14 is a flow chart showing an algorithm, indicated generally by the reference numeral 140, in accordance with an example embodiment. The algorithm 140 starts at operation 142, where user input data is collected, as described above. The user data collected in operation 142 may be based on many users of a particular application (such as a game).
At step 144, user feedback, for example in the form of an application profile, is provided. For example, the user feedback could be provided to the developer of the application. The algorithm 140 then terminates at operation 146. The user feedback provided in step 144 may enable a game or application developer to obtain information regarding how users are interacting with their game or application. Such data can be provided anonymously and can provide data regarding real users, rather than a test community.
Figure 15 is a flow chart showing an algorithm, indicated generally by the reference numeral 150, in accordance with an example embodiment. The algorithm 150 starts at operation 152, where user input data is collected, as described above. The user data collected in operation 152 may be based on many users of a particular application (such as a game).
At step 154, similar applications may be identified. Applications may be deemed to be similar in the event that the application profiles generated for the applications share predefined metrics. The identification of similar applications may be of interest, for example, to a user who enjoys a particular style of game and wishes to identify games with similar attributes. The algorithm 150 then terminates at step 156.
In addition to, or instead of, the step 154, the algorithm 150 may include identifying similar users on the basis of users having similar user fingerprints and/or identifying similar user groups on the basis of users having similar user fingerprints for a given application.
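One way of realising the "shared predefined metrics" of step 154 is a similarity measure over profile vectors. Cosine similarity is an assumption made for this sketch (the application does not name a metric), and the profile names, vectors and threshold are all invented:

```python
def cosine_similarity(a, b):
    # Cosine of the angle between two profile vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb)

def similar_items(target, candidates, min_similarity=0.95):
    """Return the names of candidate profiles whose similarity to the
    target profile meets the predefined threshold."""
    return [name for name, prof in candidates.items()
            if cosine_similarity(target, prof) >= min_similarity]

apps = {"puzzle_b": [1.0, 2.0, 0.0], "racer_c": [0.0, 0.1, 3.0]}
matches = similar_items([2.0, 4.0, 0.1], apps)  # finds the like-styled game
```

The same routine applied to usage signatures rather than application profiles would identify similar users or user groups, as in the variant of the algorithm 150 described above.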
Figure 16 is a flow chart showing an algorithm, indicated generally by the reference numeral 160, in accordance with an example embodiment. The algorithm 160 starts at operation 162, where user input data is collected, as described above. The user data collected in operation 162 may be based on many users of a particular application (such as a game) and/or users of a plurality of applications.
At step 164, clustering by similarity is carried out. The step 164 may, for example, cluster users into groups of users having similar user characteristics (as extracted from the user inputs collected in step 162). Alternatively, or in addition, the step 164 may cluster applications into groups of applications having similar user characteristics (as extracted from the user inputs collected in step 162).
At step 166, the cluster information collected in step 164 is used. For example, the cluster information could be used for one or more of: targeted advertising, improved recommendations and tailoring mobile experiences to specific user groups. Once grouped, generalised statistics from user or application groups can be obtained. Insights obtained from such statistics may be usable to further improve user experiences. Experimental results, novel insights and correlations of performance, preference and/or engagement are all possible applications of the step 166. Other uses will be apparent to the skilled person.
The algorithm 160 then terminates at step 168.
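The clustering of step 164 could be realised with any standard grouping technique. The following sketch shows one assignment step of a k-means-style grouping; the centroids, signatures and names are hypothetical and the choice of a k-means-like method is an assumption for illustration:

```python
def cluster_by_similarity(signatures, centroids):
    """Assign each usage signature to the nearest centroid. Users (or
    applications) whose signatures land in the same cluster are treated
    as having similar characteristics."""
    def dist(a, b):
        # Squared Euclidean distance is enough for nearest-centroid tests.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    clusters = {i: [] for i in range(len(centroids))}
    for name, sig in signatures.items():
        nearest = min(range(len(centroids)),
                      key=lambda i: dist(sig, centroids[i]))
        clusters[nearest].append(name)
    return clusters

sigs = {"user_a": [0.1, 0.1], "user_b": [0.2, 0.0], "user_c": [5.0, 5.1]}
groups = cluster_by_similarity(sigs, centroids=[[0.0, 0.0], [5.0, 5.0]])
```

Generalised statistics of the kind mentioned at step 166 could then be computed per cluster rather than per user.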
Figure 17 is a flow chart showing an algorithm, indicated generally by the reference numeral 170, in accordance with an example embodiment. The algorithm 170 starts at operation 172, where user input data is collected, as described above. The user data collected in operation 172 may be based on many users of a particular application (such as a game) and/or users of a plurality of applications.
At step 174, one or more predictions are made based on the user input(s) collected in step 172. The algorithm 170 then terminates at step 176.
For example, the step 174 may be used to predict or infer data. A prediction of movies that a particular user might like might, for instance, be inferred on the basis of knowledge of the preferences of near neighbours to the user (with the near neighbours being identified, for example, by identifying other users with similar user characteristics).
In some embodiments, the algorithms 160 and 170 may be combined. For example, the step 174 may make use of the clustering techniques of the step 164 in the identification of near neighbours for providing predictions.
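The near-neighbour inference of step 174 can be sketched as follows. A single nearest neighbour is used here for brevity; the neighbour records and the movie names are invented for the example:

```python
def predict_preferences(target_sig, neighbours):
    """Infer likely preferences for a user from the preferences of the
    nearest neighbour by usage signature."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = min(neighbours, key=lambda n: dist(target_sig, n["signature"]))
    return nearest["liked_movies"]

# Hypothetical neighbours with known preferences.
neighbours = [
    {"signature": [0.1, 0.9], "liked_movies": ["sci-fi film"]},
    {"signature": [0.9, 0.1], "liked_movies": ["romance film"]},
]
suggestions = predict_preferences([0.2, 0.8], neighbours)
```

When combined with the algorithm 160, the candidate neighbours would be drawn from the target user's cluster rather than from the whole population.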
Figure 18 is a flow chart showing an algorithm, indicated generally by the reference numeral 180, in accordance with an example embodiment. The algorithm 180 starts at operation 182, where user input data is collected, as described above. The user data collected in operation 182 may be based on usage signatures of a single user for one or more applications.
At step 184, one or more applications are suggested based on a determined emotion of the user (based on the user input(s) collected in step 182). The algorithm 180 then terminates at step 186.
For example, the step 184 may make use of the way in which a user is interacting with a game to identify, determine or predict an emotional status of the user (e.g. excited, bored, short-tempered etc). In addition, the step 184 may make use of biometric data (e.g. heart rate data) in the identification/determination/prediction of emotional status. Once an emotion has been identified, determined or predicted, games (or other
applications) may be suggested accordingly. Such games may, for example, be suggested in order to reduce an emotional response (e.g. calming games if a user is over-excited) or to make use of a detected emotion.
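A minimal sketch of step 184 follows. The thresholds, the emotion labels and the suggestion table are all invented for the example; the application deliberately leaves the detection method open:

```python
def infer_emotion(taps_per_second, heart_rate_bpm):
    """Very rough illustrative rule combining interaction rate with
    biometric (heart rate) data: rapid input plus a high heart rate is
    read as over-excited; slow input with a low heart rate as bored."""
    if taps_per_second > 4 and heart_rate_bpm > 100:
        return "over-excited"
    if taps_per_second < 1 and heart_rate_bpm < 70:
        return "bored"
    return "calm"

# Hypothetical mapping from detected emotion to suggested applications.
SUGGESTIONS = {
    "over-excited": ["calming puzzle"],   # reduce the emotional response
    "bored": ["fast-paced racer"],        # make use of the detected emotion
}

def suggest_apps(taps_per_second, heart_rate_bpm):
    return SUGGESTIONS.get(infer_emotion(taps_per_second, heart_rate_bpm), [])
```

A production system would of course learn such a mapping from the usage-signature data rather than hard-code it.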
If desired, different functions discussed herein may be performed in a different order and/or concurrently with each other. Furthermore, if desired, one or more of the above-described functions may be optional or may be combined. For example, any combination of the algorithms described above with reference to Figures 10 to 18 may be provided within a single implementation. For example, the detection of potential short-term and long-term health issues as described above with reference to the algorithms 120 and 130 could readily be combined. Moreover, those skilled in the art will be aware of variants to such algorithms.
For completeness, Figure 19 is a schematic diagram of a system, indicated generally by the reference numeral 200, that may be used in example implementations of the principles described herein. The system 200 comprises a processor 202, a memory 204 (for example, including RAM and/or ROM), input means 206 and output means 208. The processor 202 is in communication with each of the other components in the system 200 in order to control operation thereof. The processor 202 may take any suitable form, such as a microcontroller, plural microcontrollers, a processor, or plural processors.
The memory 204 may include a non-volatile memory, a hard disk drive (HDD) or a solid state drive (SSD) and may, for example, store an operating system and/or one or more software applications. The operating system may contain code which, when executed by the processor, implements aspects of the algorithms described herein.
The input means 206 and the output means 208 may take many different forms and may be provided, for example, to allow a user (such as an application or games developer) to interact with the system 200.
It will be appreciated that the above described example embodiments are purely illustrative and not limiting on the scope of the invention. Other variants and modifications will be apparent to persons skilled in the art upon reading the present specification.
Moreover, the disclosure of the present application should be understood to include any novel features or any novel combination of features either explicitly or implicitly disclosed herein or any generalization thereof and, during the prosecution of the present application or of any application derived therefrom, new claims may be formulated to cover any such features and/or combination of such features.
Although various aspects of the invention are set out in the independent claims, other aspects of the invention comprise other combinations of features from the described embodiments and/or the dependent claims with the features of the independent claims, and not solely the combinations explicitly set out in the claims.
It is also noted herein that while the above describes various examples, these descriptions should not be viewed in a limiting sense. Rather, there are several variations and modifications which may be made without departing from the scope of the present invention as defined in the appended claims.

Claims (13)

1. A method comprising:
obtaining a plurality of user inputs from one or more user devices, each user
device being used by a user for accessing an application;
converting each of the plurality of user inputs into a usage signature; and aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles.
2. A method as claimed in claim 1, wherein each usage signature is anonymized.
3. A method as claimed in claim 1 or claim 2, wherein each application profile is based on usage signatures of a plurality of users of the respective application.
4. A method as claimed in any one of claims 1 to 3, wherein each user profile is based on usage signatures of a single user for a plurality of applications.
5. A method as claimed in any one of the preceding claims, wherein the user devices comprise one or more mobile communication devices and/or one or more
games controllers.
6. A method as claimed in any one of the preceding claims, wherein the user inputs include one or more of: swipes of one or more of the input devices; touches of one or more of the input devices; pressure applied to one or more of the input devices;
pressure size applied to one or more of the input devices; buttons pressed; pressure applied to buttons; joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data and other external data.
7. A method as claimed in any one of the preceding claims, wherein each user input is converted into a usage signature using sparse coding and/or principal component analysis.
8. A method as claimed in any one of the preceding claims, wherein each user
input is converted into a usage signature using a function that generates a unique fingerprint without storing the user input data.
9. A method as claimed in any one of the preceding claims, wherein each usage signature is a vector.
10. A method as claimed in any one of the preceding claims, further comprising using one or more of the user profile(s) to identify a likely change in a user of a user device.
11. A method as claimed in claim 10, further comprising blocking access to some or all functions of a user device on identifying the likely change in the user of a user device.
12. A method as claimed in any one of the preceding claims, further comprising using one or more of the user profile(s) to provide user performance feedback over
time.
13. A method as claimed in any one of the preceding claims, further comprising identifying changes in user profile data indicative of a potential health problem.
14. A method as claimed in claim 13, wherein the potential health problem includes one or more short-term health problems and/or one or more long-term health problems.
15. A method as claimed in any one of the preceding claims, further comprising
using one or more of the user profile(s) to predict an emotional status of a user.
16. A method as claimed in any one of the preceding claims, further comprising using one or more of said application profiles to generate user feedback for the respective application.
17. A method as claimed in any one of the preceding claims, further comprising identifying similar applications on the basis of applications having similar application profiles.
18. A method as claimed in any one of the preceding claims, further comprising identifying similar users on the basis of users having a similar usage signature.
19. A method as claimed in any one of the preceding claims, further comprising identifying similar user groups on the basis of users having similar usage signatures for a given application.
20. A method as claimed in any one of the preceding claims, further comprising suggesting applications to a user based on an identified, determined or predicted emotional status of the user.
21. An apparatus comprising:
an input for receiving a plurality of user inputs from one or more user devices, each user device being used by a user for accessing an application;
a converter for converting each of a plurality of user inputs into a usage signature; and
an aggregator for aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles.
22. An apparatus as claimed in claim 21, wherein each usage signature is anonymized.
23. An apparatus as claimed in claim 21 or claim 22, wherein the aggregator is configured to generate each application profile based on usage signatures of a plurality of users of the respective application.
24. An apparatus as claimed in claim 23, wherein the aggregator generates the application profiles from an average of the relevant usage signatures.
25. An apparatus as claimed in any one of claims 21 to 24, wherein the aggregator is configured to generate each user profile based on usage signatures of a single user for a
plurality of applications.
26. An apparatus as claimed in claim 25, wherein the aggregator generates the user profile(s) from an average of the relevant usage signatures.
27. An apparatus as claimed in any one of claims 21 to 26, wherein the user inputs include one or more of: swipes of one or more of the input devices; touches of one or
more of the input devices; pressure applied to one or more of the input devices; pressure size applied to one or more of the input devices; buttons pressed; pressure applied to buttons; joystick positions; gyroscope data; accelerometer data; user reaction time; user hand position; user hand speed; user playing style; biometric data and other external data.
28. An apparatus as claimed in any one of claims 21 to 27, wherein the converter comprises a sparse coding module and/or a principal component analysis module and/or a module using a function that generates a unique fingerprint without storing
the user input data.
29. An apparatus as claimed in any one of claims 21 to 28, further comprising an output module.
30. An apparatus as claimed in claim 29, wherein the output module is configured to identify one or more of:
a likely change in a user of a user device;
user performance feedback over time;
changes in user profile data indicative of a potential health problem;
an emotional status of a user;
user feedback for the respective application;
similar applications, on the basis of applications having similar application profiles;
similar users, on the basis of users having a similar usage signature;
similar user groups, on the basis of users having similar usage signatures for a given application;
user behaviour prediction;
user profile prediction; and applications based on an identified or determined emotion(s) of a user.
31. A computer-readable medium having computer-readable code stored thereon, the computer-readable code, when executed by at least one processor, causes performance of:
obtaining a plurality of user inputs from one or more user devices, each user
device being used by a user for accessing an application;
converting each of the plurality of user inputs into a usage signature; and
aggregating some or all of said usage signatures into one or more application profiles and/or one or more user profiles.
GB1807658.8A 2018-05-11 2018-05-11 User and/or application profiles Withdrawn GB2575236A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
GB1807658.8A GB2575236A (en) 2018-05-11 2018-05-11 User and/or application profiles
PCT/KR2019/005574 WO2019216671A1 (en) 2018-05-11 2019-05-09 User and/or application profiles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1807658.8A GB2575236A (en) 2018-05-11 2018-05-11 User and/or application profiles

Publications (2)

Publication Number Publication Date
GB201807658D0 GB201807658D0 (en) 2018-06-27
GB2575236A true GB2575236A (en) 2020-01-08

Family

ID=62623247

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1807658.8A Withdrawn GB2575236A (en) 2018-05-11 2018-05-11 User and/or application profiles

Country Status (2)

Country Link
GB (1) GB2575236A (en)
WO (1) WO2019216671A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170161478A1 (en) * 2015-08-12 2017-06-08 Kryptowire LLC Active Authentication of Users
US20170299624A1 (en) * 2011-11-30 2017-10-19 The Nielsen Company (Us), Llc Multiple meter detection and processing using motion data

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7231378B2 (en) * 2001-04-26 2007-06-12 General Electric Company System and method for managing user profiles
US8892649B2 (en) * 2009-12-02 2014-11-18 Novatium Solutions Pvt. Ltd. Management of user profiles in a cloud based managed utility computing environment
US10453070B2 (en) * 2011-06-15 2019-10-22 Blue Kai, Inc. Non-invasive sampling and fingerprinting of online users and their behavior
US9544212B2 (en) * 2012-01-27 2017-01-10 Microsoft Technology Licensing, Llc Data usage profiles for users and applications
US10021169B2 (en) * 2013-09-20 2018-07-10 Nuance Communications, Inc. Mobile application daily user engagement scores and user profiles

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170299624A1 (en) * 2011-11-30 2017-10-19 The Nielsen Company (Us), Llc Multiple meter detection and processing using motion data
US20170161478A1 (en) * 2015-08-12 2017-06-08 Kryptowire LLC Active Authentication of Users

Also Published As

Publication number Publication date
WO2019216671A1 (en) 2019-11-14
GB201807658D0 (en) 2018-06-27

Similar Documents

Publication Publication Date Title
Chikersal et al. Detecting depression and predicting its onset using longitudinal symptoms captured by passive sensing: a machine learning approach with robust feature selection
US11102221B2 (en) Intelligent security management
Kommiya Mothilal et al. Towards unifying feature attribution and counterfactual explanations: Different means to the same end
US9842313B2 (en) Employee wellness tracking and recommendations using wearable devices and human resource (HR) data
Murmuria et al. Continuous authentication on mobile devices using power consumption, touch gestures and physical movement of users
US20190325119A1 (en) Methods and system for passive authentication through user attributes
US9813908B2 (en) Dynamic unlock mechanisms for mobile devices
CN113826368A (en) Detecting behavioral anomalies for cloud users for outlier actions
US9589137B2 (en) Method for detecting unfair use and device for detecting unfair use
US11397873B2 (en) Enhanced processing for communication workflows using machine-learning techniques
US10547616B2 (en) Systems and methods for supporting information security and sub-system operational protocol conformance
WO2016126867A1 (en) Biometric measures profiling analytics
Silva et al. Eye tracking support for visual analytics systems: foundations, current applications, and research challenges
EP3908945A1 (en) Systems and methods for enhanced host classification
US11059492B2 (en) Managing vehicle-access according to driver behavior
US11449773B2 (en) Enhanced similarity detection between data sets with unknown prior features using machine-learning
US20210264251A1 (en) Enhanced processing for communication workflows using machine-learning techniques
US20230008904A1 (en) Systems and methods for de-biasing campaign segmentation using machine learning
Jeong et al. Examining the current status and emerging trends in continuous authentication technologies through citation network analysis
WO2021175010A1 (en) User gender identification method and apparatus, electronic device, and storage medium
US11386805B2 (en) Memory retention enhancement for electronic text
Wang et al. The effectiveness of zoom touchscreen gestures for authentication and identification and its changes over time
GB2575236A (en) User and/or application profiles
US11475221B2 (en) Techniques for selecting content to include in user communications
US20210201237A1 (en) Enhanced user selection for communication workflows using machine-learning techniques

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)