EP4381435A1 - Computer system architecture for machine learning - Google Patents

Computer system architecture for machine learning

Info

Publication number
EP4381435A1
Authority
EP
European Patent Office
Prior art keywords
computer system
user device
transaction request
server
remote
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP22854024.1A
Other languages
English (en)
French (fr)
Inventor
Abhishek CHHIBBER
Darshankumar Bhadrasinh DESAI
Michael Charles Todasco
Vidyut Mukund Naware
Nitin S. Sharma
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
PayPal Inc
Original Assignee
PayPal Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/395,204 (US12081541B2)
Priority claimed from US17/395,014 (US20230041015A1)
Application filed by PayPal Inc
Publication of EP4381435A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N20/00 Machine learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/44 Program or device authentication
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/50 Monitoring users, programs or devices to maintain the integrity of platforms, e.g. of processors, firmware or operating systems
    • G06F21/55 Detecting local intrusion or implementing counter-measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q20/00 Payment architectures, schemes or protocols
    • G06Q20/38 Payment protocols; Details thereof
    • G06Q20/40 Authorisation, e.g. identification of payer or payee, verification of customer or shop credentials; Review and approval of payers, e.g. check credit lines or negative lists
    • G06Q20/401 Transaction verification
    • G06Q20/4016 Transaction verification involving fraud or risk level assessment in transaction processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/09 Supervised learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G06N3/098 Distributed learning, e.g. federated learning

Definitions

  • This disclosure relates generally to security in computer systems, and in particular to the use of a federated machine learning model to determine a response to transaction requests.
  • Computer systems that are available to access from unsecured networks such as the Internet use various techniques to determine whether requests made to the computer systems are authentic and approved.
  • authentication information such as usernames and passwords is used.
  • computer systems may also use machine learning algorithms that are able to identify patterns that are associated with legitimate requests and/or with malicious requests. Some of such machine learning algorithms use sensitive information that may be usable to personally identify individual users.
  • FIG. 1 is a block diagram illustrating an embodiment of a computer system configured to implement a federated machine learning model.
  • FIG. 2 is an expanded block diagram of the user device of Fig. 1 in accordance with various embodiments.
  • FIG. 3 is an expanded block diagram of the remote edge server of Fig. 1 in accordance with various embodiments.
  • FIG. 4 is an expanded block diagram of the server computer system of Fig. 1 in accordance with various embodiments.
  • FIG. 5A is a block diagram of an example of the federated machine learning model of Fig. 1 in accordance with various embodiments.
  • FIG. 5B is a block diagram illustrating clusters of clusters of trained machine learning algorithms of a federated machine learning model in accordance with various embodiments.
  • FIGs. 6A-6D are flowcharts illustrating various embodiments of a transaction request evaluation method using a federated machine learning model in accordance with various embodiments.
  • FIG. 7 is a flowchart illustrating various embodiments of a transaction request evaluation method using a federated machine learning model that is implemented using the user device and the server computer system of Fig. 1 in accordance with various embodiments.
  • FIG. 8 is a flowchart illustrating various embodiments of a transaction request evaluation method using a federated machine learning model that is implemented using the user device, the remote edge server, and the server computer system of Fig. 1 in accordance with various embodiments.
  • FIG. 9 is a flowchart illustrating an embodiment of a server computer system portion of a transaction request evaluation method in accordance with various embodiments.
  • FIG. 10 is a flowchart illustrating an embodiment of a remote computer system portion of a transaction request evaluation method in accordance with various embodiments.
  • FIG. 11 is a block diagram of an exemplary computer system, which may implement the various components of Figs. 1-4.
  • a “computer system configured to collect user behavior information” is intended to cover, for example, a computer system that has circuitry that performs this function during operation, even if the computer system in question is not currently being used (e.g., a power supply is not connected to it).
  • an entity described or recited as “configured to” perform some task refers to something physical, such as a device, circuit, memory storing program instructions executable to implement the task, etc. This phrase is not used herein to refer to something intangible.
  • the “configured to” construct is not used herein to refer to a software entity such as an application programming interface (API).
  • first, second, etc. are used as labels for nouns that they precede, and do not imply any type of ordering (e.g., spatial, temporal, logical, etc.) unless specifically stated.
  • references to “first” and “second” subsets of transaction request evaluation factors would not imply an ordering between the two unless otherwise stated.
  • the term “based on” is used to describe one or more factors that affect a determination. This term does not foreclose the possibility that additional factors may affect a determination. That is, a determination may be solely based on specified factors or based on the specified factors as well as other, unspecified factors.
  • modules operable to perform designated functions are shown in the figures and described in detail (e.g., federated machine learning module 112, transaction request analysis module 122, etc.).
  • a “module” refers to software or hardware that is operable to perform a specified set of operations.
  • a module may refer to a set of software instructions that are executable by a computer system to perform the set of operations.
  • a module may also refer to hardware that is configured to perform the set of operations.
  • a hardware module may constitute general-purpose hardware as well as a non-transitory computer-readable medium that stores program instructions, or specialized hardware such as a customized ASIC.
  • a module that is described as being “executable” to perform operations refers to a software module
  • a module that is described as being “configured” to perform operations refers to a hardware module
  • a module that is described as “operable” to perform operations refers to a software module, a hardware module, or some combination thereof. Further, for any discussion herein that refers to a module that is “executable” to perform certain operations, it is to be understood that those operations may be implemented, in other embodiments, by a hardware module “configured” to perform the operations, and vice versa.
  • authentication may be performed using authentication factors that are typically divided into three categories: knowledge, possession, and inherence.
  • Knowledge factors include factors that are related to demonstrating that a user seeking authentication knows one or more pieces of information (e.g., a username, a password) that are compared to information stored at the authenticating computer system (e.g., comparing a hash of a password entered by a user with a hash of the password stored at the authenticating computer system).
  • Possession factors include factors that are related to demonstrating that a user seeking authentication has possession of an article or device that is known to be associated with the user (e.g., a physical token or program that generates predictable pseudorandom numbers, a device having the same network address as previous accesses, a device that is operable to receive an out-of-band communication such as a text message or email message).
  • Inherence factors include factors that are related to demonstrating that a user seeking authentication is the same individual as the user who owns the account (e.g., biometric information, user behavior information).
  • While some information used to satisfy authentication factors may be of a less sensitive nature, other information (especially information that relates to inherence or possession) may be personally identifiable information or other sensitive information that users may desire not to be shared with the authenticating computer system. Additionally, makers of user devices have begun notifying users when usage information is being collected and providing users with configuration options that limit information collection. Moreover, recent government regulations such as the EU’s General Data Protection Regulation (“EU GDPR”) impose significant requirements on how such sensitive information is obtained and stored, with significant penalties for noncompliance. Further, in some instances collecting some information to use to demonstrate possession or inherence may be cumbersome for logistical reasons.
  • For example, if a large enterprise requires multi-factor authentication to authenticate accesses by its employees to a secured computer service, some information, like users’ personal cellular telephone numbers, may be difficult to obtain and keep up to date, such that some traditional multi-factor authentication techniques may be difficult to implement. Storing such personal information may also be regulated by the EU GDPR and similar regulations. Accordingly, multi-factor authentication is increasingly preferred (or required), but the collection and usage of sensitive information is also increasingly fraught.
  • sensitive information and machine learning may also be usable to determine whether a requested transaction is authorized.
  • an enterprise employing hundreds of users may desire to monitor trends in user behavior to proactively identify deviations, which may be usable to identify a compromised computer system and prevent improper transactions from being allowed.
  • With a federated machine learning model, various computer systems including a user device and/or one or more edge servers are configured to apply information collected about a requested transaction to local portions of a federated machine learning model that have been distributed across the computer system.
  • distributing a federated machine learning model refers to sending portions of the federated machine learning model from a centralized server computer system to other computer systems. These distributed portions are usable to generate scores by applying input to the portion of the federated machine learning model. The generated scores may be used to make decisions (such as determining whether to grant a request).
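The distribute-then-score flow described above can be sketched as follows. This is an illustrative assumption, not the disclosed implementation: the `ModelPortion` class, its logistic scorer, and the feature names are all hypothetical stand-ins for a distributed portion of a federated model.

```python
import math

# Hypothetical local portion of a federated model: per-feature weights
# that a device applies to its own (possibly sensitive) feature values.
class ModelPortion:
    def __init__(self, weights, bias=0.0):
        self.weights = weights  # per-feature weights for this portion
        self.bias = bias

    def score(self, features):
        # Apply the portion locally and squash to a 0..1 score; only
        # this score, not the raw features, is sent onward.
        z = self.bias + sum(w * features[name] for name, w in self.weights.items())
        return 1.0 / (1.0 + math.exp(-z))

# A central server would distribute portions like this one; the
# sensitive feature values below never leave the device.
device_portion = ModelPortion({"typing_speed": 0.8, "geo_match": -1.5})
local_features = {"typing_speed": 0.9, "geo_match": 1.0}  # stays on device
score = device_portion.score(local_features)  # a scalar safe to transmit
```

In this sketch only `score` would be transmitted toward the authenticating computer system, which can then use it (together with its own portion of the model) to decide whether to grant the request.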
  • scores that are useable to satisfactorily demonstrate multi-factor authentication or authorization may be generated using sensitive information, and these scores can be communicated instead of the sensitive information itself.
  • user behavior information such as how a user holds their user device and where the user uses the device may be collected along with biometric information about the user themselves and evaluated with a portion of the federated machine learning model implemented on the user device.
  • the portion of the federated machine learning model implemented on the user device is operable to generate scores that are sent to the authenticating computer system instead of the sensitive information itself.
  • sensitive information may be sent to an edge server that is operable to apply a portion of the federated machine learning model instead of the authenticating computer system (which may be owned by a different entity and/or on the other side of a less secure wide area network), which also reduces distribution of sensitive information and the risks inherent in distribution of sensitive information.
  • sensitive device data, user behavior data, user information, and/or network data would not leave the confines of a local network. Instead, portions of machine learning models would be downloaded to user devices and/or to edge servers. The scores generated by these portions could then be sent to the authenticating computer system and married with machine learning models running on the authenticating computer system to determine how to respond to a transaction request.
  • This kind of hybrid machine learning implementation not only helps increase the security of the computer system using machine learning models but also reduces risk around privacy and sensitive information. As little to no sensitive information is taken off of user devices or outside a local network, these techniques are less likely to run afoul of user privacy concerns or regulations.
  • Fig. 1 is a block diagram illustrating an embodiment of a computer system 100 that includes a server computer system 110, a user device 120, a remote edge server 130, and a local edge server 140 that are configured to implement a federated machine learning model to evaluate transaction requests made by a user using user device 120.
  • the various components of computer system 100 communicate with each other via a combination of local and wide area networks.
  • user device 120 communicates with remote edge server 130 using a local broadband cellular network or WiFi network
  • remote edge server 130 communicates with local edge server 140 over a wide area network (e.g., the Internet)
  • local edge server 140 communicates with server computer system 110 via a local wired network.
  • a line indicating a network boundary 150 separates components of computer system 100 that are on either side of a wide area network.
  • the term “local” refers to devices that are local to the server computer system 110 (i.e., on the same side of a wide area network connection) and the term “remote” refers to devices that are remote from the server computer system 110 (i.e., on the other side of a wide area network connection).
  • no separate local edge server 140 is present and server computer system 110 performs the functions of local edge server 140 to communicate with remote edge server 130 over a wide area network connection. While only one of each of server computer system 110, user device 120, remote edge server 130, and local edge server 140 is shown in Fig. 1, any number of each may be present in various embodiments.
  • server computer system 110 is a computer system that is operable to interact with user device 120 and other components of computer system 100 to evaluate and facilitate transaction requests.
  • server computer system 110 is implemented by using software running on a single computer system (e.g., a desktop computer, a laptop computer, a tablet computer, a mobile phone, a server) or a plurality of computer systems (e.g., a network of servers operating as a cloud).
  • computer system 110 is implemented in specialized hardware (e.g., on an FPGA) or in a combination of hardware and software.
  • Server computer system 110 includes a federated machine learning module 112 that is operable to generate a federated machine learning model (e.g., federated machine learning model 400 shown in Fig. 4), which includes the various portions 114, 124, 134, and 144 shown in Fig. 1, to distribute the portions 124, 134, and 144 to other components of computer system 100, and to apply a server portion 114 of the federated machine learning model to determine how to respond to transaction request 102.
  • the various portions 114, 124, 134, and 144 are operable to take as input various sets of transaction request evaluation factors and to output scores generated by applying the transaction request evaluation factors to the various portions 114, 124, 134, and 144 of the federated machine learning model.
  • Some transaction request evaluation factors may be of a sensitive nature with privacy implications (e.g., personally identifiable information) while others are less sensitive (e.g., time and date information, operating system type and version of user device 120).
  • the various portions 114, 124, 134, and 144 are usable to determine whether to grant transaction requests 102 without sensitive information being sent to server computer system 110 in various embodiments.
  • server computer system 110 implements its own server portion 114 of the federated machine learning model, which allows server computer system 110 to have a final say on whether to grant a transaction request 102, to adjust the machine learning model based on changing conditions across computer system 100, and to apply different evaluation rules across different subsets of user devices 120 (e.g., across different regions, across different types of user devices 120, etc.).
  • server computer system 110 is operable to perform other functions in addition to facilitating the use of a federated machine learning model to evaluate transaction requests 102 from users 104 (e.g., providing a website, providing a computer-implemented financial transaction service, etc.). Server computer system 110 is discussed in further detail in reference to Fig. 4.
  • User device 120 is a computer system that is operable to facilitate interactions between users 104 and server computer system 110.
  • user device 120 can be implemented as one of (but not limited to) a desktop computer, a laptop computer, a tablet computer, a mobile phone, and/or a wearable computer.
  • user device 120 can be implemented by software running on a computer system (e.g., a server) or a plurality of computer systems (e.g., a network of servers operating as a cloud).
  • user device 120 is implemented in specialized hardware (e.g., on an FPGA) or in a combination of hardware and software.
  • User device 120 is operable to create a transaction request 102 in response to user input received at the user device 120, and to send this transaction request to the server computer system 110.
  • Computer system 100 is operable to evaluate such a transaction request 102 using a federated machine learning model that is implemented on various portions of computer system 100.
  • transaction request 102 may be a request to access any of a number of secured electronic resources including but not limited to (a) a request to log in to a secure computer service or website, (b) a request to purchase goods or services, (c) a financial transaction involving moving funds from one account to another, or (d) a request to access a secured data or media file.
  • User device 120 includes a transaction request analysis module 122 that is operable to apply a user device portion 124 of the federated machine learning model (referred to herein as a user device portion 124 or a remote device portion of the federated machine learning model) to information that is related to transaction request 102.
  • user device 120 does not include a transaction request analysis module 122 and instead sends collected information to remote edge server 130 or other computer systems for analysis as discussed herein.
  • User device 120 includes a data collection module 126 that is operable to collect information about user device 120, about how a user has used user device 120, and/or about user 104.
  • user device 120 is operable to evaluate transaction request evaluation factors that may include sensitive information (e.g., personally identifiable information, health information, etc.) to determine whether a transaction request should be granted.
  • such a determination is made without sending sensitive information to server computer system 110.
  • such sensitive information is evaluated using transaction request analysis module 122 and is not sent to another computer (i.e., such information does not leave user device 120).
  • some of such sensitive information may be sent to remote edge server 130 for analysis using a remote edge server portion 134 of the federated machine learning model that is applied by remote edge server 130. In such embodiments, at least some of the sensitive information sent to remote edge server 130 is not sent over network boundary 150.
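One way to picture the edge-scoring path described above is the sketch below. The function names, feature names, and weights are assumptions for illustration only: the point is that the device's features stay on the local side of the boundary, while only a derived score crosses it.

```python
# Hypothetical edge-server scoring: the user device forwards selected
# features to a nearby edge server on its side of the network boundary;
# only the resulting scalar score is forwarded across the boundary.
def edge_score(features, weights):
    # Apply the remote edge server portion (here a simple linear model).
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

def forward_across_boundary(features, weights):
    # The payload that crosses the wide-area boundary contains no raw
    # sensitive features, only the score derived from them.
    return {"score": edge_score(features, weights)}

payload = forward_across_boundary(
    {"wifi_match": 1.0, "ip_risk": 0.2},   # stays within the local network
    {"wifi_match": -0.4, "ip_risk": 2.0},  # edge portion's assumed weights
)
```

The authenticating computer system receives only `payload`, which it can combine with its own server portion of the model.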
  • User device 120 is discussed in additional detail in reference to Fig. 2.
  • remote edge server 130 and local edge server 140 are computer systems that are situated on either side of network boundary 150.
  • remote edge server 130 and local edge server 140 are implemented using computer systems (e.g., servers, groups of servers operating together as a cloud).
  • remote edge server 130 and local edge server 140 are edge computing systems that facilitate networking with computer systems on their side of network boundary 150 and communication across network boundary 150.
  • remote edge server 130 and local edge server 140 are 5G (or subsequent generation) edge servers that are operable to perform advanced computing functions beyond simply facilitating communication across network boundary 150.
  • remote edge server 130 is operable to accept distributed processing requests from user device 120 (e.g., requests from user device 120 to perform calculations or other processing).
  • local edge server 140 is operable to accept distributed processing requests from server computer system 110.
  • remote edge server 130 and local edge server 140 are operable to apply respective portions 134, 144 of the federated machine learning model, which are shown in dashed lines to indicate that such portions 134, 144 of the federated machine learning model are not present in all embodiments.
  • Remote edge server 130 and local edge server 140 are discussed in further detail in reference to Fig. 3.
  • computer system 100 is operable to collect transaction request evaluation factors such as information about user device 120, user behavior of user 104, and information about user 104 (some or all of which may include sensitive information), and apply such transaction request evaluation factors to a federated machine learning model to evaluate a transaction request 102.
  • Such analysis may be performed without sending sensitive information off of user device 120 and/or without sending sensitive information across network boundary 150. Additionally, even if sensitive information is sent across network boundary 150, the sensitive information may be processed at local edge server 140 and not stored at server computer system 110.
  • Because server computer system 110 applies a server portion 114 of the federated machine learning model (which in various embodiments includes the decision threshold(s) used to determine whether to grant the transaction request), server computer system 110 has not completely outsourced the decision making to other components of computer system 100. Accordingly, a federated machine learning model can be applied to evaluating a transaction request 102, enabling analysis of transaction request evaluation factors which are sensitive information while reducing risks relating to data privacy for such sensitive information.
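The server-side final say described above might combine remote scores with the server's own score along the following lines. The combination rule (a simple mean) and the threshold value are assumptions for this sketch; the disclosure leaves the actual combination function open.

```python
# Sketch of server-side decision making: remote portions report only
# scores, the server adds the output of its own portion, and a
# server-controlled decision threshold determines the response.
def decide(remote_scores, server_score, threshold=0.5):
    # Average the remote scores with the server's own score; the server
    # keeps the final say because it controls the threshold.
    combined = (sum(remote_scores) + server_score) / (len(remote_scores) + 1)
    # Treat scores as risk: below the threshold, the request is granted.
    return "grant" if combined < threshold else "deny"

low_risk = decide([0.1, 0.2], 0.15)   # device + edge scores, server score
high_risk = decide([0.6, 0.7], 0.9)
```

Because the threshold lives only on the server, it can be adjusted centrally (e.g., per region or per device type) without redistributing the remote model portions.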
  • Fig. 2 is an expanded block diagram illustrating user device 120 (and components thereof) in additional detail.
  • user device 120 includes transaction request analysis module 122, user device portion 124 of the federated machine learning model, and data collection module 126 discussed in reference to Fig. 1 as well as additional components.
  • Fig. 2 illustrates additional detail about transaction request analysis module 122, which may be installed on user device 120 (e.g., by server computer system 110) as part of adding user device 120 to computer system 100, as well as various built-in components of user device 120 such as user device input/output (“I/O”) 202 and user device operating system (“OS”) 204.
  • various other applications may be installed on user device 120 such as a web browser, a financial application (e.g., an application operated by a financial institution with which user 104 is associated), a media player, a file synchronization and storage application, a shopping application, etc.
  • User device I/O 202 includes any of a number of I/O devices (and firmware and software for the same) that is usable to receive input from and provide output to user 104 or other components of computer system 100.
  • User device I/O 202 includes but is not limited to one or more displays (e.g., touchscreen displays), keyboards or other buttons, pointing devices, cameras, biometric sensors (e.g., fingerprint scanners, retinal scanners), gyroscopes or other motion sensing devices, location sensors (e.g., GPS interfaces), communication devices (e.g., Bluetooth, NFC, cellular, WiFi, wired communication).
  • the type of transaction request evaluation factors that may be collected and evaluated will vary according to the I/O capabilities of user device 120 (e.g., GPS coordinates can be collected if user device 120 includes a GPS sensor, fingerprint information cannot be collected if user device 120 does not include a fingerprint scanner).
  • various types of user device I/O 202 may collect information that is only of a sensitive nature (e.g., a fingerprint scanner), information that may be sensitive or may be less sensitive (e.g., a camera that is useable to capture images of the face of user 104 as well as a pattern on a wall behind user 104), or information that is generally less sensitive (e.g., the time of day that user device 120 is being used, the time of day that user device 120 is idle).
  • what information is collected using user device I/O 202 and made available to transaction request analysis module 122 may be controlled by user device OS 204 and/or explicit settings configured by user 104.
  • user device 120 is operable to collect (and to analyze using transaction request analysis module 122) user behavior information.
  • user behavior information refers to any information about how one or more users 104 have used a particular user device 120.
  • user behavior information includes but is not limited to: location information (e.g., a geolocation of user device 120), temporal information (e.g., when user device 120 is used, when user device 120 is idle), accelerometer information and gyroscopic information (e.g., how user 104 holds and moves user device 120), input patterns (e.g., typing speed, cursor movements), what peripheral devices are connected to user device 120 (e.g., whether wireless headphones or fitness trackers are connected with user device 120), etc.
  • user behavior information may also include information about the environment in which user device 120 is used, such as readings from temperature, humidity, and pressure sensors. It will be understood that some subsets of user behavior information may be more sensitive than others (e.g., location information may be more sensitive than gyroscope information). Moreover, a user device 120 may include explicit settings marking some or all of such user behavior information as sensitive (e.g., user 104 has configured user device 120 not to share gyroscope information with other computer systems).
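As a sketch of how raw behavior signals could be reduced to coarse, on-device features before any model portion sees them, consider the hypothetical helper below; the event types and the 8:00-22:00 "daytime" window are assumptions, not part of the disclosure.

```python
from datetime import datetime

# Hypothetical on-device preprocessing: raw keystroke timings and usage
# timestamps are reduced to coarse aggregate features, so the raw events
# themselves never need to leave user device 120.
def behavior_features(keystroke_gaps_ms, usage_times):
    # Average gap between keystrokes, a crude proxy for typing rhythm.
    avg_gap = sum(keystroke_gaps_ms) / len(keystroke_gaps_ms)
    # Fraction of usage in an assumed "typical" window of 8:00-22:00.
    daytime = sum(1 for t in usage_times if 8 <= t.hour < 22) / len(usage_times)
    return {"avg_keystroke_gap_ms": avg_gap, "daytime_usage_ratio": daytime}

feats = behavior_features(
    [120, 140, 130],
    [datetime(2024, 1, 1, 9), datetime(2024, 1, 1, 23)],
)
```

Features like these could then be fed to the user device portion of the federated model, keeping the raw keystroke and timestamp data local.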
  • user device 120 is operable to collect (and to analyze using transaction request analysis module 122) user device information.
  • user device information refers to any information about user device 120 itself.
  • user device information includes but is not limited to: networking information (e.g., IP addresses, MAC addresses, network paths, whether user device 120 is connected to WiFi networks or cellular networks), information about hardware of user device 120 (e.g., serial numbers, processor model numbers, connected peripherals), information about software of user device 120 (e.g., version and type of user device OS 204, a user agent of a web browser), and device fingerprinting (e.g., hardware differences in user device 120 as a result of manufacturing variation that are unique to the user device 120 and may be measured using various algorithms).
  • a user device 120 may include explicit settings marking some or all of such user device information as sensitive (e.g., user 104 has configured user device 120 not to share a list of WiFi networks detected by user device 120 with other computer systems).
  • user device 120 is operable to collect (and to analyze using transaction request analysis module 122) user information.
  • user information refers to information about user 104.
  • user information includes but is not limited to: biometric information (e.g., fingerprints, retinal scans), facial scans or other visual information about user 104, voice patterns, health or fitness information (e.g., heartbeat, number of steps in a period of time), etc.
  • a user device 120 may include explicit settings marking some or all of such user information as sensitive (e.g., user 104 has configured user device 120 not to share fingerprint information with other computer systems).
  • Collectively, user behavior information, user device information, and user information may be referred to as “transaction request evaluation factors.”
  • the term “transaction request evaluation factors” may also include information that was not collected by user device 120, however, such as information collected by server computer system 110 discussed below. Such “transaction request evaluation factors” are usable by computer system 100 using a federated learning model to determine how to respond to transaction request 102. In various instances, some or all transaction request evaluation factors may be sensitive information (including information that is usable to personally identify user 104). As discussed herein, in various embodiments some or all of the transaction request evaluation factors collected by user device 120 are not sent off of user device 120 at all.
  • some or all transaction request evaluation factors collected by user device 120 are sent to remote edge server 130 as transaction request evaluation factors 232 to be analyzed using a remote edge server portion 134 of the federated machine learning model.
  • some transaction request evaluation factors collected by user device 120 are sent to server computer system 110 (and/or local edge server 140) as transaction request evaluation factors 232.
  • transaction request evaluation factors that include sensitive information are not sent across network boundary 150 and are applied by user device 120 and/or remote edge server 130 to user device portion 124 and/or remote edge server portion 134 to generate scores as discussed herein.
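By way of non-limiting illustration only (the field names and sensitivity labels below are hypothetical assumptions, not recited in the specification), such sensitivity-based routing of transaction request evaluation factors might be sketched as:

```python
# Hypothetical sketch: partition collected transaction request evaluation
# factors so that sensitive factors stay on the user-device/remote-edge
# side of network boundary 150. Key names are illustrative assumptions.
SENSITIVE_KEYS = {"geolocation", "fingerprint", "wifi_ssids"}

def route_evaluation_factors(factors):
    """Split factors into those kept local and those that may be sent
    across network boundary 150 to server computer system 110."""
    local_only = {k: v for k, v in factors.items() if k in SENSITIVE_KEYS}
    sendable = {k: v for k, v in factors.items() if k not in SENSITIVE_KEYS}
    return {"local_only": local_only, "sendable": sendable}

routed = route_evaluation_factors({
    "geolocation": (37.77, -122.42),   # sensitive: never crosses the boundary
    "typing_speed_wpm": 62,            # less sensitive: may be sent onward
    "os_version": "14.2",
})
```

Sensitive factors in `local_only` would then be applied only to user device portion 124 and/or remote edge server portion 134, with only the resulting scores sent onward.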
  • other transaction request evaluation factors (e.g., transaction request information about other transaction requests from other user devices) may also be used to evaluate transaction request 102.
  • transaction information refers to information about a transaction requested by transaction request 102. Such transaction information describes the requested transaction: what is requested, by whom, etc. Such transaction information may vary according to the kind of transaction being requested. If the transaction request 102 is a login request, for example, such transaction information may include but is not limited to a username and password entered by user 104.
  • transaction information may include but is not limited to where the purchase request was input (e.g., a URL of a website running on a browser, an application used to submit the purchase request), an identifier of the user account requesting the purchase, an identifier of the merchant, an identifier of a payment method for the purchase, an identifier of a billing address and/or shipping address, an amount of the purchase, etc.
  • if transaction request 102 is a financial request, such as a request to transfer funds between bank accounts, transaction information may include but is not limited to an indication of the source account, an indication of the destination account, a username of the requestor, etc.
  • if the transaction request 102 is a request to access a secured resource (e.g., one or more files), transaction information may include but is not limited to a file pathway of the secured resource, a username of the requestor, etc.
  • User device OS 204 may be any of a number of operating systems installed on user device 120 that is operable to control the functions of user device 120.
  • user device OS 204 controls user device I/O 202 and determines what information to collect, whether and how it is stored, and whether such information is made available to transaction request analysis module 122.
  • data collection module 126 of user device OS 204 includes an application program interface (API) 210 that is operable to send collected information to a data collection and abstraction module 220 of transaction request analysis module 122 (e.g., regularly, upon request).
  • information may be collected by data collection module 126 of user device OS 204 and made available, via API 210, to then be collected (and in embodiments, abstracted or anonymized) by third-party applications such as data collection and abstraction module 220 of transaction request analysis module 122.
  • user device OS 204 includes one or more user device settings 206 which may be set by a manufacturer of user device 120, user 104, or another entity associated with user device 120 (e.g., an internet service provider, an employer of user 104).
  • user device settings 206 are settable to control what information is collected, whether such information is stored, and whether such information is shared with third-party applications such as transaction request analysis module 122. For example, a first user device setting 206 may dictate that location information may not be shared with third-party applications. Another user device setting 206 may disable a WiFi module of user device 120, thus preventing monitoring of nearby WiFi SSIDs.
  • a thumbprint sensor may be enabled in a user device setting 206, allowing for collection of thumbprint information.
  • user device settings 206 generally limit what data is collected and how it is used.
  • user 104 may have very granular control via such user device settings 206, thus allowing for fine tuning of what data is collected and shared.
  • user device settings 206 may explicitly prevent any personally identifiable information from leaving user device 120 such that personally identifiable information is never shared with other components of computer system 100.
  • user device OS 204 includes an optional safe list 208 (indicated in dashed lines).
  • user device OS 204 may generally require authorization (e.g., from user 104) to collect sensitive information and/or share such sensitive information with third-party applications such as transaction request analysis module 122.
  • user device OS 204 may notify a user that a third-party application is requesting location data and ask the user to sign off on such requests. Such request and sign-off workflows, however, can impede ease of use by user 104. In some instances, therefore, user device OS 204 may maintain a safe list 208 of third-party applications that are allowed to access some or all of the sensitive information collected by user device 120 without explicit user permission.
  • membership on the safe list 208 may be conditioned on meeting certain criteria with respect to sensitive information. For example, membership on safe list 208 may be conditioned on the third-party application anonymizing data and/or masking personally identifiable information (e.g., by removing names, by abstracting more specific information like a full IP address to less specific information like an IP address range). In some embodiments, membership on the safe list 208 may be conditioned on proving that the third-party application sends no personally identifiable information off of user device 120 (e.g., all analysis on personally identifiable information is performed on user device 120 or information is transformed such that it is no longer personally identifiable before being sent off of user device 120). In some embodiments, membership on the safe list 208 is controlled by an outside body such as a governmental regulatory agency or nongovernmental standards-setting body.
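As one possible, non-limiting sketch of such a safe list check (the application identifier and list contents below are hypothetical, not part of the specification):

```python
# Hypothetical sketch of a safe list 208 lookup: sensitive information is
# shared without an explicit user prompt only with safe-listed applications.
SAFE_LIST = {"transaction_request_analysis_module"}  # assumed app identifier

def may_share_without_prompt(app_id, is_sensitive):
    """Non-sensitive information is always shareable without a prompt;
    sensitive information requires membership on the safe list."""
    return (not is_sensitive) or (app_id in SAFE_LIST)
```

Under this sketch, a non-listed application requesting sensitive data would fall back to the request-and-sign-off workflow described above.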
  • Transaction request analysis module 122 (also referred to herein as authentication software) is a third-party application that is installed on user device 120 (e.g., an application installed on user device 120 by server computer system 110 to facilitate transactions using computer system 100) which is operable to authenticate transaction requests 102 using the federated machine learning model techniques discussed herein.
  • transaction request analysis module 122 includes a user device portion 124 of the federated machine learning model that was received from server computer system 110 prior to user device 120 receiving transaction request 102.
  • user device portion 124 of the federated machine learning model was generated by a server computer system 110 using a dataset of previous transaction requests.
  • federated machine learning model includes a server portion 114 and one or more remote device portions including user device portion 124 and optionally a remote edge server portion 134 and/or a local edge server portion 144.
  • Transaction request analysis module 122 includes a data collection and abstraction module 220 that is operable to receive transaction request evaluation factors collected by user device 120 (e.g., via API 210).
  • data collection and abstraction module 220 is operable to anonymize and/or abstract information received by data collection and abstraction module 220.
  • information received by data collection and abstraction module 220 is personally identifiable information usable to identify user 104, and in some embodiments anonymizing and/or abstracting such information makes it no longer personally identifiable information.
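One possible, non-limiting abstraction of the sort described above (field names are illustrative assumptions) might drop direct identifiers and coarsen a full IP address to an address range:

```python
def abstract_factors(factors):
    """Hypothetical sketch: make collected factors no longer personally
    identifiable by removing a name field and coarsening a full IPv4
    address to its /24 range (a less specific value)."""
    out = dict(factors)
    out.pop("name", None)  # remove direct identifier entirely
    if "ip_address" in out:
        octets = out.pop("ip_address").split(".")
        out["ip_range"] = ".".join(octets[:3]) + ".0/24"  # less specific
    return out
```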
  • data collection and abstraction module 220 is operable to convert collected information into a different format to enable the collected information to be applied to user device portion 124 of the federated machine learning model.
  • Transaction request analysis module 122 is operable to generate one or more user device scores 230 by applying collected transaction request evaluation factors to the user device portion 124 of the federated machine learning model.
  • user device portion 124 of the federated machine learning model takes as input various transaction request evaluation factors.
  • user device portion 124 of the federated machine learning model takes as input user behavior information, user device information, and user information, but in other embodiments user device portion 124 of the federated machine learning model may take fewer inputs and send collected transaction request evaluation factors to other portions of computer system 100 as transaction request evaluation factors 232.
  • federated machine learning model (and user device portion 124 of the federated machine learning model) is trained to identify deviations from more typical transaction requests.
  • user device portion 124 of the federated machine learning model is trained using data collected by user device 120. If transaction request evaluation factors differ markedly from the norm, such a deviation may be detected upon applying the transaction request evaluation factors to user device portion 124 of the federated machine learning model.
  • user device scores 230 are generated on a regular basis and not in response to receiving transaction request 102.
  • user device scores 230 may include a “trust score” that is indicative of a degree to which the federated machine learning model has determined that the current usage pattern and device information has not materially deviated from prior patterns. Accordingly, such a trust score does not represent an indication of whether a transaction request 102 itself is fraudulent, but rather represents a holistic evaluation of the circumstances surrounding the submission of transaction request 102 and whether present circumstances are different from the circumstances surrounding the submission of previous transaction requests.
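As a non-limiting sketch of one way such a trust score might be computed (the deviation-to-score mapping below is an illustrative assumption, not a recited formula):

```python
import statistics

def trust_score(history, current):
    """Hypothetical sketch: map how far the current reading deviates from
    prior readings onto a 0..1 trust score. 1.0 indicates no material
    deviation; the score saturates to 0.0 at three standard deviations."""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # guard against zero spread
    z = abs(current - mean) / stdev
    return max(0.0, 1.0 - z / 3.0)
```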
  • federated machine learning model (and user device portion 124 of the federated machine learning model) is trained to detect whether a particular transaction request 102 represents a deviation from an ideal (e.g., defined by computer security analysts).
  • federated machine learning model (and user device portion 124 of the federated machine learning model) are operable to determine whether a particular transaction request 102 is “correct” or “incorrect” based on comparing it to the ideal.
  • the generation of user device scores 230 by applying transaction request evaluation factors to user device portion 124 of the federated machine learning model is discussed in additional detail in reference to Fig. 5A.
  • Transaction request evaluation factors 232 may be sent to any combination of remote edge server 130, local edge server 140, and server computer system 110 depending on what kind of transaction request evaluation factors 232 are being sent (e.g., less sensitive transaction request evaluation factors may be sent to server computer system 110 whereas sensitive transaction request evaluation factors are only sent to remote edge server 130). As discussed herein, in some embodiments, no transaction request evaluation factors 232 are sent from user device 120 to any other portion of computer system 100. In some embodiments, transaction request evaluation factors used to generate user device scores 230 are not sent off of user device 120.
  • biometric information (even abstracted biometric information) that is applied to user device portion 124 to generate user device scores 230 pertaining to biometric matches may not be sent to any other computer systems; rather, the user device scores 230 are sent instead (in some of such embodiments transaction request evaluation factors 232 that were not used to generate user device scores 230 may be sent out as well).
  • transaction request evaluation factors may be used to generate user device scores 230 and also sent to other components of computer system 100 for additional analysis (e.g., as a second opinion and/or to be aggregated with transaction request evaluation factors collected by other user devices 120 and processed by remote edge server 130).
  • transaction request analysis module 122 is operable to send an indication of transaction request 234.
  • the indication of transaction request 234 includes the transaction information input by user 104 as part of inputting transaction request 102 as well as information about how, when, and where transaction request 102 was entered (e.g., via a web browser, when transaction request 102 was received by user device 120, etc.).
  • transaction request analysis module 122 includes one or more rules modules 226.
  • rules modules 226 are operable to make various decisions based on user device scores 230 (e.g., whether to deny a transaction request because the user device score 230 for a particular transaction request differs from the established trend by more than a standard deviation).
  • rules modules 226 may be operable to make a limited set of decisions without communicating with server computer system 110.
  • such rules may simply deny transaction requests 102 that represent obvious deviations or ask for additional information from user 104 prior to communicating with server computer system 110.
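A minimal, non-limiting sketch of such a local rule (the one-standard-deviation criterion follows the example above; function and return names are hypothetical):

```python
import statistics

def decide_locally(score_history, score):
    """Hypothetical sketch of a rules module 226 decision: deny on an
    obvious deviation (more than one standard deviation from the
    established trend of user device scores 230); otherwise defer the
    decision to server computer system 110."""
    mean = statistics.fmean(score_history)
    stdev = statistics.pstdev(score_history) or 1e-9
    if abs(score - mean) > stdev:
        return "deny"  # obvious deviation: decide without the server
    return "forward_to_server"
```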
  • rules modules 226 may operate behind the scenes from the perspective of user 104.
  • rules modules 226 may cause data collection and abstraction module 220 to collect additional information for analysis. For example, such a second collection may collect a larger set of information that is useable to generate additional user device scores 230 or to send to other components of computer system 100.
  • if user device information collected around the time a particular transaction request 102 is received indicates a dramatic change in the number and SSIDs of WiFi networks detected by user device 120, such information may be sent to remote edge server 130 where such device information may be combined with WiFi network information from other user devices 120 and applied to a remote edge server portion 134 of the federated machine learning model implemented by remote edge server 130.
  • transaction request analysis module 122 includes modules that are operable to modify user device portion 124 of the federated machine learning model: an update module 222 that is operable to adjust user device portion 124 of the federated machine learning model as transaction request evaluation factors are received and analyzed (e.g., by changing weights applied to various aspects of user device portion 124 of the federated machine learning model as discussed in further detail in reference to Fig. 5A) and a synchronization module 224 that is operable to adjust user device portion 124 of the federated machine learning model based on updates provided by other portions of computer system 100 including server computer system 110.
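The two adjustment paths above might be sketched as follows, using a simple weight vector as a stand-in for user device portion 124 (a non-limiting illustration; the linear update and blending factor are assumptions):

```python
def local_update(weights, features, error, lr=0.01):
    """Update-module-style adjustment (cf. update module 222): nudge
    local weights by a single gradient-like step as new transaction
    request evaluation factors are analyzed."""
    return [w - lr * error * x for w, x in zip(weights, features)]

def synchronize(local_weights, server_weights, mix=0.5):
    """Synchronization-module-style adjustment (cf. synchronization
    module 224): blend weights pushed by server computer system 110
    into the local portion of the model."""
    return [(1 - mix) * lw + mix * sw
            for lw, sw in zip(local_weights, server_weights)]
```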
  • synchronization module 224 is operable to send an indication of the user device portion 124 as adjusted by update module 222.
  • the user device portion 124 of the federated machine learning model may be a “headstart model” that is selected for user device 120 based on various characteristics of user device 120 (e.g., where user device 120 is located, what type of hardware and software is installed on user device 120, etc.). As the headstart model is used, it may be updated (e.g., weights may be adjusted) according to circumstances at user device 120. This may be particularly useful in embodiments in which the federated machine learning model is trained to identify deviations in the circumstances surrounding receiving transaction request 102.
  • user device portion 124 of the federated machine learning model can be revised to reflect user behavior and updated device information.
  • synchronization module 224 is operable to update user device portion 124 of the federated machine learning model to reflect that change.
  • synchronization module 224 is operable to check for updates regularly and/or receive updates pushed by server computer system 110.
  • After sending user device scores 230, optional transaction request evaluation factors 232, and an indication of transaction request 234, user device 120 is configured to receive a response to transaction request 102. In some instances, the response will be simply to grant or deny transaction request 102. In other instances, the response will be a “step-up challenge” asking user 104 to provide additional information (e.g., by performing an image recognition challenge such as identifying which of a series of pictures has a motorcycle in it) or for user device 120 to collect additional information (e.g., additional transaction request evaluation factors). Step-up challenges are discussed in additional detail in reference to Figs. 6D and 8.
  • user device 120 is operable to collect transaction request evaluation factors (including user behavior information, device information, and/or user information) and to analyze the collected transaction request evaluation factors using a local portion of a federated machine learning model and without sending some or all of the collected information to server computer system 110, which reduces privacy risk.
  • Fig. 3 is an expanded block diagram illustrating remote edge server 130 (and components thereof) in additional detail.
  • remote edge server 130 optionally includes a transaction request analysis module 300 that is similar to the transaction request analysis module 122 of user device 120 except that no data collection and abstraction module 220 is present.
  • no data collection and abstraction module 220 is present because remote edge server 130 receives transaction request evaluation factors 232 from user device 120 and does not otherwise collect transaction request evaluation factors.
  • transaction request analysis module 300 is operable to apply transaction request evaluation factors 232 (and in embodiments indication of transaction request 234 and/or user device scores 230) to a remote edge server portion 134 of the federated machine learning model to generate edge server scores 310.
  • remote edge server 130 may be any of a number of advanced edge servers that are disposed at opposite ends of network boundary 150 (across which communication includes use of a wide area network). Unlike less advanced devices that are merely operable to facilitate communication across a wide area network, remote edge server 130 (and local edge server 140) are operable to perform more computationally intensive processing such as collecting information and applying it to machine learning models.
  • applying transaction request evaluation factors 232 (and other information in certain embodiments) to remote edge server portion 134 of the federated machine learning model is performed in a similar manner as by transaction request analysis module 122 except that the portions of the federated machine learning model and inputs thereto differ.
  • the generation of edge server scores 310 by applying transaction request evaluation factors 232 (and other information in certain embodiments) to remote edge server portion 134 of the federated machine learning model is discussed in additional detail in reference to Fig. 5A.
  • remote edge server 130 is operable to receive collected information from a plurality of user devices 120.
  • collected information may be useable to identify deviations in behavior that warrant further investigation or denial of a transaction request 102. If the usage information and device information of a particular user device 120 suddenly deviates not only from its prior patterns but is an outlier among other user devices 120, then that particular user device 120 may be flagged for further investigation or prevented from submitting transaction requests 102.
  • For example, if temperature and pressure readings collected by a particular user device 120 suddenly deviate from those of other user devices 120 communicating with remote edge server 130, this may indicate that the particular user device 120 is being used in a different location, which may be as a result of theft of the particular user device 120. Such deviations may be detected by applying the transaction request evaluation factors 232 (and in embodiments indication of transaction request 234 and/or user device scores 230) to a remote edge server portion 134 of the federated machine learning model to produce remote edge server scores 310. If remote edge server scores 310 suddenly change from a previously stable pattern, this may indicate malicious activity such as a takeover of user device 120 or of a user account associated with user 104.
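By way of non-limiting illustration, such cross-device outlier detection at the edge might be sketched as follows (the z-score criterion, device identifiers, and threshold are hypothetical assumptions):

```python
import statistics

def flag_outlier_devices(readings, threshold=2.0):
    """Hypothetical sketch: given one sensor reading per user device
    (factors aggregated at remote edge server 130), flag devices whose
    reading deviates from the fleet by more than `threshold` standard
    deviations."""
    values = list(readings.values())
    mean = statistics.fmean(values)
    stdev = statistics.pstdev(values) or 1e-9
    return [dev for dev, v in readings.items()
            if abs(v - mean) / stdev > threshold]
```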
  • remote edge server 130 is operable to send an indication of transaction request 234, one or more transaction request evaluation factors 312, and one or more edge server scores 310 across network boundary 150.
  • one or more transaction request evaluation factors 312 are the same as transaction request evaluation factors 232 received by remote edge server 130 (i.e., remote edge server 130 merely passes along the transaction request evaluation factors 232).
  • remote edge server 130 does not send transaction request evaluation factors that were used to generate edge server scores 310.
  • transaction request evaluation factors 232 do not cross network boundary 150 and are not received by server computer system 110; rather, edge server scores 310 generated from such transaction request evaluation factors 232 using remote edge server portion 134 of the federated machine learning model are sent instead. As discussed herein, doing so reduces risks associated with storing and communicating sensitive information.
  • remote edge server 130 also passes along user device scores 230, although in other embodiments user device 120 is operable to communicate user device scores 230 to server computer system 110 without sending user device scores 230 through remote edge server 130.
  • transaction request analysis module 300 includes one or more rules modules 306.
  • rules modules 306 are operable to make various decisions based on edge server scores 310 and/or user device scores 230 (e.g., whether to deny a transaction request because the edge server scores 310 and/or user device score 230 for a particular transaction request differ from the established trend by more than a standard deviation).
  • rules modules 306 may be operable to make a limited set of decisions without communicating with server computer system 110.
  • such rules may simply deny transaction requests 102 that represent obvious deviations or ask for additional information from user 104 prior to communicating with server computer system 110.
  • rules modules 306 may operate behind the scenes from the perspective of user 104. In such embodiments, rather than asking user 104 for additional information, rules modules 306 are operable to communicate with user device 120 to cause data collection and abstraction module 220 to collect additional information for analysis. For example, such a second collection may collect a larger set of information that is useable to generate additional user device scores 230 and/or edge server scores 310 to send to other components of computer system 100.
  • remote edge server portion 134 of the federated machine learning model is operable to be updated (e.g., by adjusting model weights) by an update module 302 as transaction request evaluation factors 232 are received and evaluated. Such updates may allow for aligning remote edge server portion 134 of the federated machine learning model with changing usage patterns such that a sudden deviation is more apparent. Further, in various embodiments, as with user device portion 124 of the federated machine learning model, remote edge server portion 134 of the federated machine learning model may be initialized using a headstart model (e.g., a model that has been trained by other remote edge servers 130 having similar characteristics).
  • synchronization module 304 is operable to update remote edge server portion 134 of the federated machine learning model to reflect that change. In various embodiments, synchronization module 304 is operable to check for updates regularly and/or receive updates pushed by server computer system 110.
  • Accordingly, having remote edge server 130 apply a separate portion of the federated machine learning model facilitates using additional information to evaluate transaction requests 102. Rather than merely determining whether a deviation is detected in a particular user device 120 based on transaction request evaluation factors collected by user device 120, remote edge server 130 is operable to detect deviations for the user device 120 from among other user devices (not shown) with which remote edge server 130 has communicated. This may be especially useful in an enterprise setting where various user devices 120 are all associated with the enterprise (e.g., a plurality of laptops disposed at desks in an office building that use remote edge server 130 to communicate with server computer system 110).
  • If one particular user device 120 has not deviated from its usual pattern substantially, then malicious action may not be detected based on a local deviation alone. But if the pattern for a particular user device 120 is evaluated along with other patterns from similar user devices, a malicious action by the particular user device 120 may be more detectable if it deviates from patterns of other user devices.
  • Because remote edge server 130 is on the same side of network boundary 150 and is applying a remote edge server portion 134 of the federated machine learning model, such deviations may be detected without exposing sensitive information. This may be especially advantageous in an enterprise setting. For example, a deviation in access requests to sensitive health information can be detected without exposing the health information to computer systems outside of the enterprise, and a deviation can be detected based on a change in personally identifiable user behavior information without exposing personally identifiable information to computer systems outside of the enterprise.
  • local edge server 140 may also include a transaction request analysis module that may operate similarly to transaction request analysis module 300 except that the transaction request analysis module of local edge server applies transaction request evaluation factors 232 (from user device 120) and/or transaction request evaluation factors 312 (from remote edge server 130) to a local edge server portion 144 of federated machine learning model (shown on Fig. 1).
  • both user device 120 and remote edge server 130 may be referred to as “remote computer systems” because they are remote from server computer system 110 (i.e., separated by a network boundary 150 such that communication with server computer system 110 includes the use of a wide area network).
  • either or both remote computer systems are operable to apply transaction request evaluation factors to respective “remote portions” of a federated machine learning model to determine “remote computer system scores” that are useable to evaluate the transaction request 102.
  • recitations of a remote computer system “receiving” transaction request evaluation factors can include a transaction request analysis module 122 of user device 120 receiving transaction request evaluation factors collected by user device 120.
  • a remote computer system “receiving” transaction request evaluation factors can also include a transaction request analysis module 300 of remote edge server 130 receiving transaction request evaluation factors 232 from user device 120.
  • a “remote computer system score” may be user device scores 230 or edge server scores 310, depending on the context.
  • Fig. 4 is an expanded block diagram illustrating server computer system 110 (and components thereof) in additional detail.
  • server computer system 110 includes federated machine learning module 112 and server portion 114 of federated machine learning model as well as additional components.
  • Federated machine learning module 112 is operable to generate and maintain a federated machine learning model 400 which includes server portion 114 of the federated machine learning model as well as the various other portions 124, 134, and/or 144 shown in Fig. 1.
  • server portion 114 includes one or more decision thresholds 402.
  • federated machine learning module 112 also includes a synchronization module 410, an aggregation module 412, and an update module 414 that are operable to maintain federated machine learning model 400 as well as a decision module 416 that is operable to determine how to respond to transaction request 102.
  • Server computer system 110 generates and maintains federated machine learning model 400 using information about transaction requests stored in transaction datastore 430 and/or indications of ground truth 432. Additionally, in various embodiments server computer system 110 includes other modules such as an identity services module 420, a domain services module 422, and an orchestration module 424.
  • Federated machine learning module 112 is operable to generate and train federated machine learning model 400 using a dataset of previous transaction requests stored in transaction datastore 430. In some instances, training is supervised or partially supervised using indications of ground truth 432. As used herein, “ground truth” refers to information that is known to be real or true and is typically input by human analysts or users (e.g., fraud reports filed on certain prior transaction requests, results of audits of prior transaction requests). As discussed herein, in various embodiments federated machine learning model 400 is divisible into server portion 114 as well as one or more remote computer system portions (i.e., user device portion 124 and/or remote edge server portion 134). In some embodiments, federated machine learning model 400 may also include a local edge server portion 144. Subsequent to training, the various portions may be distributed across computer system 100 to evaluate subsequent transaction requests (e.g., transaction request 102).
  • federated machine learning model 400 takes various inputs (e.g., transaction request evaluation factors) and uses the inputs to calculate weighted scores.
  • federated machine learning model 400 is divisible into portions such that different subsets of transaction request evaluation factors are applied to the different portions, and the scores generated from the various portions can be analyzed separately or together. But because the sets of inputs are separate, the various portions are operable to generate their separate scores without needing to communicate with the computer systems generating the other scores. For example, a particular federated machine learning model 400 may take six inputs.
  • the portions of federated machine learning model 400 used to generate the first and second scores may be separated into different portions.
  • the structure of federated machine learning model 400 and various embodiments explaining how it can be separated into different portions are discussed in reference to Fig. 5A.
  • server portion 114 is useable to generate one or more server scores (e.g., server scores 500 shown in Fig. 5A).
  • server scores 500 are generated by applying transaction request evaluation factors 232 (collected by user device 120) and/or transaction request evaluation factors 312 (collected by user device 120 and sent to server computer system 110 via remote edge server 130) and/or indication of transaction request 234 (including transaction information) to server portion 114.
  • user device 120 can collect a set of transaction request evaluation factors and generate user device scores 230 using some of such factors.
  • User device 120 can send a subset of transaction request evaluation factors 232 that were not used to generate user device scores 230.
  • edge servers 130, 140 are operable to generate edge server scores 310 and send a subset of transaction request evaluation factors 312 that were not used to generate edge server scores 310.
  • user device 120 and/or edge servers 130, 140 may use particular transaction request evaluation factors (such as non-sensitive information about times of user device 120 usage) to generate scores and still send the particular transaction request evaluation factors to server computer system 110.
  • other transaction request evaluation factors (e.g., personally identifiable information or other sensitive information) may be used to generate scores without being sent to server computer system 110.
  • federated machine learning module 112 is operable (e.g., using update module 414) to adjust server portion 114 by adjusting weights used to calculate server scores as well as adjusting decision thresholds 402 in response to receiving transaction request 102.
  • federated machine learning module 112 may also apply to server portion 114 information that was not collected by user device 120.
  • server computer system 110 may use records of prior transactions stored in transaction datastore 430 (e.g., what was requested, whether the prior requests were granted).
  • federated machine learning module 112 may apply other information such as records of user accounts and interactions between user accounts, records of funding instruments on file, analyses of a secure resource (e.g., what file type secure resource is, access patterns of the particular secure resource) stored at server computer system 110 to which access is requested in transaction request 102.
  • Server portion 114 includes one or more decision thresholds 402 that are used by decision module 416 to evaluate server scores, user device scores 230, and/or edge server scores 310 and determine how to respond to transaction request 102.
  • decision module 416 evaluates scores generated for a particular requested transaction and compares the scores to decision thresholds.
  • decision module 416 includes rule-based decision making such that if the scores exceed the decision thresholds 402 by a large amount, transaction request 102 is granted; if the scores are within a range of decision thresholds 402, a step-up operation is initiated; and if the scores are below decision thresholds 402 by a large amount, transaction request 102 is denied.
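The grant/step-up/deny rule structure described above might be sketched as follows. The threshold and margin values are illustrative placeholders (the patent describes thresholds 402 as learned and adjusted, not fixed):

```python
# Sketch of decision module 416's rule-based evaluation against a decision
# threshold: well above -> grant, well below -> deny, near -> step-up.
def decide(score, threshold=0.5, margin=0.2):
    if score >= threshold + margin:
        return "grant"
    if score <= threshold - margin:
        return "deny"
    return "step-up"  # score falls within the margin around the threshold

response = decide(0.92)  # a score well above the threshold
```

Different (threshold, margin) pairs could then be selected per region or per enterprise customer, matching the per-circumstance rules described below.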
  • different rules may be applied depending on circumstances (e.g., different rules for different regions, different rules for different enterprise customers, different rules for individual users 104 versus users 104 associated with an enterprise, etc.)
  • decision thresholds 402 may be used for different types of transaction request 102.
  • a first decision threshold 402 may be used for transaction requests 102 originating from user devices 120 located in country A while a second decision threshold 402 may be used for transaction requests 102 originating from user devices 120 located in country B.
  • decision thresholds 402 are constantly (e.g., in response to every transaction request 102) or regularly (e.g., every hour, every day, etc.) being adjusted by federated machine learning module 112 (e.g., using update module 414) as transaction requests 102 (and in some embodiments, indications of ground truth 432) are received and analyzed.
  • federated machine learning module 112 is able to adjust federated machine learning model 400 to account for global patterns while leveraging the federated nature of the model to allow for detection of trends in a particular user device 120 or among user devices 120 communicating with particular edge servers 130 or 140.
  • Because the final decision on how to respond to a particular transaction request 102 is made using decision thresholds 402, server computer system 110 is ultimately still in control of evaluating transaction request 102. Thus, even though portions of federated machine learning model 400 have been distributed across computer system 100, a malicious attack on a particular user device 120 and/or edge server 130 or 140 is not alone sufficient to improperly approve a particular transaction request 102.
  • decision thresholds 402 may be set for different authentication factors.
  • decision thresholds 402 may be set by analysts, but in other embodiments machine learning may be used to set and/or adjust decision thresholds 402 as discussed herein.
  • a knowledge factor for a transaction request may be satisfied by authenticating a username and password (included in indication of transaction request 234) using identity services module 420 as discussed below.
  • such a determination is rule-based (e.g., is there a match: yes or no?) and does not leverage machine learning. Whether a second authentication factor has been met, however, is established using federated machine learning model 400 and the various scores calculated therewith.
  • various scores may be used to determine whether an inherence factor has been met and/or whether a possession factor has been met.
  • various scores generated using user behavior information may be used to determine whether user behavior patterns have materially deviated from prior patterns (e.g., user device 120 is suddenly being held in a different hand and used in a different time period), and establishing that no such deviation is detected may be sufficient to establish an inherence factor for user 104. If the inherence factor is sufficiently established, computer system 100 concludes that user 104 for the present transaction request 102 has been sufficiently determined to be the same person as prior transaction requests.
  • patterns in user information may also be used to establish inherence (e.g., based on facial scans of user 104).
  • various scores generated using user device information may be used to determine whether a profile of user device 120 has materially deviated from prior patterns (e.g., user device 120 is suddenly coupled to a different set of WiFi networks and has different peripheral devices), and establishing that no such deviation is detected may be sufficient to establish a possession factor for user device 120. If a possession factor is sufficiently established, computer system 100 concludes that a particular user device 120 that is associated with the present transaction request 102 has been sufficiently determined to be the same user device 120 as prior transaction requests. Whether inherence or possession factors are established is based on one or more authentication factor thresholds.
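The comparison against authentication factor thresholds might be sketched as below. The scores and per-factor thresholds are illustrative assumptions, not values from the patent:

```python
# Sketch: an authentication factor is treated as established when its
# deviation-based score meets a per-factor threshold.
def factor_established(score, factor_threshold):
    # A higher score means a closer match to prior behavior/device patterns.
    return score >= factor_threshold

inherence_ok = factor_established(0.93, factor_threshold=0.80)   # e.g., grip/usage-time patterns
possession_ok = factor_established(0.61, factor_threshold=0.80)  # e.g., WiFi/peripheral profile
```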
  • synchronization module 410 is operable to communicate with synchronization modules 224 and 304 to distribute revisions to portions 124, 134, and/or 144 of the federated machine learning model.
  • update modules 222 and 302 may revise user device portions 124 and edge server portions 134, 144.
  • indications of such revised portions 124, 134, 144 may be sent to server computer system 110 and used to update (e.g., using update module 414) federated machine learning model 400.
  • revised portions 124, 134, 144 from various user devices 120 and/or edge servers 130, 140 may be aggregated together (e.g., by averaging or performing other statistical processes on changed values applied to the portions 124, 134, 144) and used to update federated machine learning model 400.
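The averaging-based aggregation might look like the following sketch (federated-averaging style; the weight names and values are hypothetical):

```python
# Sketch of aggregation module 412 averaging revised model portions
# returned by user devices / edge servers into one set of updated weights.
def aggregate(portions):
    # portions: list of {weight_name: value} dicts, one per revised portion
    keys = portions[0].keys()
    return {k: sum(p[k] for p in portions) / len(portions) for k in keys}

updated = aggregate([
    {"a": 0.4, "b": 0.6},  # revised portion from device 1
    {"a": 0.6, "b": 0.2},  # revised portion from device 2
])
```

Other statistical processes (e.g., weighted or trimmed means) could be swapped in for the plain average.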
  • aggregations may also be used to generate the headstart models discussed in reference to Fig. 5B.
  • Transaction datastore 430 may be implemented using any of a number of computer storage devices.
  • Transaction datastore 430 may store records associated with any number (e.g., thousands, millions, billions, etc.) of previous transaction requests, transaction information for the requested transactions, an outcome of the transaction, and any indications of ground truth 432 such as fraud reports. Records in transaction datastore 430 are used to generate and train federated machine learning model 400 using any of a number of suitable machine learning techniques. As transaction requests 102 are received and analyzed, records are added to transaction datastore in various embodiments.
  • server computer system 110 includes identity services module 420, which is operable to authenticate information such as access tokens, user names and passwords, and cryptographic indicators that are used to secure computer system 100.
  • hashes of user names and passwords are compared to hashes of usernames and passwords included with indications of transaction request 234, for example.
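The hash comparison might be sketched as follows. This is illustrative only: the patent does not specify a hash function, and a production system would use a salted, slow key-derivation function (e.g., bcrypt or scrypt) rather than bare SHA-256:

```python
# Sketch of identity services module 420 comparing stored credential hashes
# to hashes of credentials submitted with an indication of a transaction request.
import hashlib
import hmac

def credential_hash(username, password):
    return hashlib.sha256(f"{username}:{password}".encode()).hexdigest()

stored_hash = credential_hash("alice", "correct-horse")
submitted_hash = credential_hash("alice", "correct-horse")
# Constant-time comparison avoids leaking information via timing.
match = hmac.compare_digest(stored_hash, submitted_hash)
```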
  • domain services module 422 is operable to facilitate the requested transaction once transaction request 102 has been granted. In various embodiments, therefore, domain services module 422 is operable to transfer funds between accounts, cause secured information to be sent to user device 120, and/or to cause pages of a secure website to be sent to user device 120 for display.
  • orchestration module 424 is operable to activate or deactivate additional computation resources to scale the computational capabilities of server computer system 110 to handle the current workload of transaction requests 102.
  • server computer system 110 is operable to generate a federated machine learning model 400 and distribute portions 124, 134, 144 to various components of computer system 100 such that some or all transaction request evaluation factors (especially personally identifiable information or other sensitive information) may be analyzed by user device 120, remote edge server 130, and/or local edge server 140 without being sent to server computer system 110. Because server computer system 110 is operable to adjust federated machine learning model 400 (and update and distribute revised portions 124, 134, 144 accordingly), server computer system 110 is still able to adjust to global changes. Moreover, because server computer system 110 ultimately decides whether to grant a particular transaction request 102, security in decision making is not compromised despite some analysis being performed by potentially more vulnerable components of computer system 100.
  • server computer system 110 is operable to adjust decision thresholds 402 and/or apply different sets of decision thresholds 402 in different circumstances.
  • sensitive information can be applied to advanced machine learning models to determine how to respond to transaction requests without reception and storage of sensitive information at server computer system 110.
  • Fig. 5A is a block diagram of a simplified example of the federated machine learning model 400.
  • federated machine learning model 400 includes two portions: a user device portion 124 and a server portion 114.
  • federated machine learning model 400 includes a server portion 114 and any combination of portions 124, 134, and 144. It will also be understood that the example shown in Fig. 5A is simplified for the sake of explanation.
  • Federated machine learning model 400 in Fig. 5A has six data inputs 510: D1 510A, D2 510B, D3 510C, D4 510D, D5 510E, and D6 510F.
  • Data inputs D1 510A, D2 510B, and D3 510C are inputs to user device portion 124 and data inputs D4 510D, D5 510E, and D6 510F are inputs to server portion 114.
  • data inputs are used to generate four variables: V1 512A, V2 512B, V3 512C, and V4 512D.
  • As shown in Fig. 5A, some variables 512 are generated using more than one data input 510, and some data inputs 510 are used to generate more than one variable 512.
  • Data inputs 510 can be any of the transaction request evaluation factors discussed herein (e.g., user behavior information, user device information, or user information collected by user device 120; information about other transactions stored in transaction datastore 430, etc.) or transaction information for the particular transaction request 102.
  • Variables 512 may be generated based on data inputs 510 (e.g., position of user device 120 based on data inputs from multiple gyroscopes; user facial recognition match calculated by user device 120, etc.).
  • variables 512 are calculated using simple arithmetic operations, but in other embodiments more complex calculations may be made (e.g., statistical analyses such as means and standard deviations; derivatives or integrals) to generate variables 512 from data inputs 510. While the federated machine learning model 400 in Fig. 5A includes only six data inputs 510 and four variables, in various embodiments dozens or hundreds of each may be used. As shown in Fig. 5A, user device portion 124 and server portion 114 use data inputs 510 that are completely independent; none of D1-D3 510A-C are taken as input by server portion 114 and none of D4-D6 510D-F are taken as input by user device portion 124. In other embodiments, however, some data inputs 510 may be used in more than one portion 114, 124, 134, 144.
  • variables 512 are applied to machine learning algorithms 514 to generate scores.
  • machine learning algorithm M1 514A takes a weighted sum of variables V1 and V2 (a*V1 + b*V2) to generate user device scores 230.
  • machine learning algorithm M2 514B takes a weighted sum of variables V3 and V4 (c*V3 + d*V4) to generate server scores 500.
  • weights a, b, c, and d may be adjusted using any of a number of machine learning techniques, including a wide range of supervised and semi-supervised algorithms that are non-parametric or of a neural-computation nature.
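The two weighted-sum algorithms M1 514A and M2 514B can be sketched as below. The numeric weight values are hypothetical placeholders; in practice they are learned and adjusted as described above:

```python
# Sketch of Fig. 5A's scoring: M1 runs on the user device over V1, V2;
# M2 runs on the server over V3, V4. Neither needs the other's inputs.
a, b, c, d = 0.7, 0.3, 0.5, 0.5  # hypothetical learned weights

def m1(v1, v2):
    # User device portion: generates user device scores 230.
    return a * v1 + b * v2

def m2(v3, v4):
    # Server portion: generates server scores 500.
    return c * v3 + d * v4

user_device_score = m1(0.8, 0.4)
server_score = m2(0.6, 1.0)
```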
  • The machine learning algorithms 514 shown in Fig. 5A are simplified examples; any suitable machine learning algorithm may be applied to variables 512.
  • adjustments to machine learning algorithms 514 may be made locally (e.g., by an update module 222 of a user device 120 or by an update module 302 of an edge server 130 or 140) or by server computer system 110 (e.g., using update module 414) and distributed to the rest of computer system 100.
  • different user devices 120 and edge servers 130, 140 are operable to adjust their respective portions 124, 134, and 144 as transaction requests 102 are received and analyzed.
  • the two user device portions 124 will have diverged from each other as a result of receiving transaction requests 102. For example, if a first user 104 of a first user device 120 holds her user device 120 in her right hand and a second user 104 of a second user device 120 holds his user device 120 in his left hand, the model weights for their respective user device portions 124 will differ accordingly in various embodiments.
  • scores such as user device scores 230 and server scores 500 may be evaluated using decision thresholds 402 to determine a response (e.g., grant, deny, step-up) for a particular transaction request 102.
  • decision thresholds 402 may be adjusted dynamically using any of a number of machine learning techniques, including a wide range of supervised and semi-supervised algorithms that are non-parametric or of a neural-computation nature.
  • Federated machine learning model 400 may be adjusted automatically (e.g., by adjusting model weights such as weights a, b, c, and d; by adding or removing layers of federated machine learning model 400; by adding or removing data inputs 510; by adding or removing variables 512; by changing how variables 512 are calculated; etc.). Such adjustments may be made in response to indications of ground truth 432 and/or based on performance data (e.g., increasing thresholds 402 in response to determining that a false positive rate was lower than expected, removing data inputs 510 in response to determining that such data inputs had little effect on the final scores and were consuming computational resources).
  • Fig. 5B is a block diagram illustrating clusters of trained machine learning algorithms (including machine learning algorithm M1 514A of Fig. 5A) of a federated machine learning model 400 in accordance with various embodiments.
  • the various trained machine learning algorithms in Fig. 5B were trained by various user devices 120 and information about them was sent back to server computer system 110 (e.g., by update modules 222). Having received the trained machine learning algorithms, server computer system 110 has performed a clustering analysis (e.g., using a k-means algorithm, a k-nearest-neighbor algorithm, or other suitable clustering algorithm) and grouped the trained machine learning algorithms together.
  • Machine learning algorithm M1 514A of Fig. 5A has been grouped with M1 530I and M1 530J, for example.
  • In Fig. 5B, three clusters are shown, although any number of clusters may be identified among the machine learning algorithms.
  • trained machine learning algorithms within a cluster may be aggregated together (e.g., by averaging model weights, by finding a center of the cluster and calculating model weights by scaling model weights of clustered trained machine learning algorithms) to generate the “headstart models.”
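One possible sketch of this cluster-then-average step, using a toy nearest-center assignment in place of a full k-means implementation (all weight values and the cluster centers are hypothetical):

```python
# Sketch: group trained models' weight vectors by nearest cluster center,
# then average each cluster's members to form a "headstart" weight vector.
def cluster_and_average(weight_vectors, centers):
    clusters = {i: [] for i in range(len(centers))}
    for w in weight_vectors:
        nearest = min(
            range(len(centers)),
            key=lambda i: sum((wi - ci) ** 2 for wi, ci in zip(w, centers[i])),
        )
        clusters[nearest].append(w)
    headstarts = {}
    for i, members in clusters.items():
        if members:
            # Element-wise average of the cluster's weight vectors.
            headstarts[i] = [sum(vals) / len(members) for vals in zip(*members)]
    return headstarts

trained_models = [[0.1, 0.2], [0.2, 0.2], [0.9, 0.8], [1.0, 0.9]]
headstarts = cluster_and_average(trained_models, centers=[[0.0, 0.0], [1.0, 1.0]])
```

A production system could instead find centers with a clustering library and scale weights relative to the center, as the bullet above suggests.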
  • “headstart models” are generic models that are generated from models that have been trained by components of computer system 100 (e.g., a user device 120, a remote edge server 130).
  • Models trained by different components are clustered (e.g., clustered according to attributes of the various user devices that trained the models) and the clusters are useable to generate a headstart model for the cluster.
  • the new user device 120 can receive a headstart model that corresponds to characteristics that the new user device 120 shares. For example, it may be the case that M1 530A-D correspond to user devices 120 that are mobile phones in the United States, M1 530E-H correspond to user devices 120 that are laptops in Spain, and M1 514A, M1 530I, and M1 530J correspond to tablet computers in Japan.
  • new user device 120 may be sent headstart model 3 524 upon being added to computer system 100 (e.g., when transaction request analysis module 122 is installed and prior to the reception of any transaction requests 102 by new user device 120).
  • headstart models may be distributed to newly installed (or reset) edge servers 130 and 140.
  • a first remote edge server 130 that is associated with a large enterprise in the United States may be sent a headstart model generated from portions 134 of other remote edge servers 130 that are associated with large enterprises in the United States.
  • a second remote edge server 130 that services a university may be sent headstart models generated from portions 134 of other remote edge servers 130 that are associated with universities.
  • local edge server 140 may be sent headstart models generated from portions 144 of other local edge servers 140 having similar characteristics.
  • any of a number of other characteristics may be used at any level of granularity (e.g., a headstart model generated from user device portions 124 for mobile phones that have 128 GB of internal storage with 11 GB of available space, run version 14.6 of user device operating system 204, are located in Texas, and are used by users 104 who are left-handed).
  • Figs. 6A-6D are flowcharts illustrating various embodiments of a transaction request evaluation method 600 using federated machine learning model 400.
  • Method 600 includes actions performed by server computer system 110, user device 120, remote edge server 130, and local edge server 140 including various optional actions that are not performed in all embodiments. These actions are represented in Figs. 6A-6D as blocks and information passed between components is represented as arrows.
  • network boundary 150 separates user device 120 and remote edge server 130 from server computer system 110 and local edge server 140.
  • Method 600 includes certain steps that may be performed more frequently than others (e.g., generating federated machine learning model 400 at block 602 may be performed less frequently than receiving transaction request 102 at block 606).
  • federated machine learning model 400 is generated. As discussed above, in various embodiments, federated machine learning model 400 includes server portion 114 and any combination of user device portion 124, remote edge server portion 134, and local edge server portion 144. User device portion 124 and remote edge server portion 134 are also referred to herein as remote computer system portions.
  • user device portion 124, remote edge server portion 134, and/or local edge server portion 144 are distributed by server computer system 110.
  • user device 120 receives (and stores with transaction request analysis module 122) a user device portion 124.
  • remote edge server 130 receives a remote edge server portion 134 and local edge server 140 receives a local edge server portion 144.
  • user device 120 receives transaction request 102 via transaction request analysis module 122.
  • user 104 enters transaction information about the requested transaction as they submit transaction request 102.
  • transaction request analysis module 122 requests transaction request evaluation factors from data collection module 126.
  • data collection module 126 collects various transaction request evaluation factors (e.g., user behavior information, user device information, user information).
  • data collection module 126 sends the collected transaction request evaluation factors to transaction request analysis module 122.
  • transaction request analysis module 122 anonymizes and/or abstracts the transaction request evaluation factors.
  • transaction request analysis module 122 analyzes the transaction request evaluation factors sent by data collection module 126 using user device portion 124 to generate user device scores 230.
  • the operations described in connection with line 608, block 610, line 612, and block 614 are performed independently of the reception of transaction request 102 at block 606.
  • the transaction request evaluation factors are continuously collected and/or collected on a regular schedule.
  • user device scores 230 may be continuously generated and/or generated on a regular schedule. In such embodiments, deviations in user device scores 230 may be used to identify changes in user behavior, device information, or determine that a different user 104 may be using user device 120 (e.g., by determining that user information has deviated from prior patterns).
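Such deviation detection on regularly generated scores might be sketched as below (the score history window and the deviation cutoff are illustrative assumptions):

```python
# Sketch: flag a deviation when the newest regularly generated user device
# score strays too far from the mean of recent scores.
def deviated(score_history, new_score, cutoff=0.3):
    baseline = sum(score_history) / len(score_history)
    return abs(new_score - baseline) > cutoff

history = [0.82, 0.85, 0.80, 0.84]        # recent user device scores 230
same_user_likely = not deviated(history, 0.83)
```

A detected deviation could then feed into the determination that a different user 104 may be using user device 120.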
  • user device 120 sends user device scores 230 and/or transaction request evaluation factors to server computer system 110.
  • user device 120 sends user device scores 230 and/or transaction request evaluation factors via remote edge server 130 and local edge server 140. In other embodiments, however, user device 120 sends user device scores 230 and/or transaction request evaluation factors to server computer system 110 using a separate transmission pathway that does not pass through remote edge server 130 and local edge server 140.
  • User device 120 communicates an indication of transaction request 234 (e.g., along with user device scores 230 and/or transaction request evaluation factors). As discussed herein, in various embodiments, user device 120 does not send some or all of the transaction request evaluation factors that were used to generate user device score 230.
  • transaction request evaluation factors may be continuously collected and/or collected on a regular schedule.
  • user device 120 optionally sends subsets of transaction request evaluation factors to remote edge server 130 (line 620) and/or local edge server 140 (line 626).
  • the subset of transaction request evaluation factors sent to remote edge server 130 at line 620 does not include transaction request evaluation factors used to generate user device scores 230. In other embodiments, however, the subset of transaction request evaluation factors sent to remote edge server 130 at line 620 includes some or all of the transaction request evaluation factors used to generate user scores 230.
  • the subset of transaction request evaluation factors sent to local edge server 140 at line 626 does not include transaction request evaluation factors used to generate user device scores 230. In other embodiments, however, the subset of transaction request evaluation factors sent to local edge server 140 at line 626 includes some or all of the transaction request evaluation factors used to generate user scores 230. In some embodiments, the subsets of transaction request evaluation factors sent at lines 620 and 626 are identical, but in other embodiments the subsets of transaction request evaluation factors sent at lines 620 and 626 differ (e.g., personally identifiable information is sent to remote edge server 130 but not to local edge server 140).
  • remote edge server 130 generates remote edge server scores 310 by applying received transaction request evaluation factors to remote edge server portion 134.
  • remote edge server 130 sends edge server scores 310 to server computer system 110 (via local edge server 140 or via a different transmission pathway).
  • local edge server 140 generates additional edge server scores 310 by applying received transaction request evaluation factors to local edge server portion 144.
  • local edge server 140 sends edge server scores 310 to server computer system 110.
  • server computer system 110 generates server scores (e.g., server scores 500) by applying received transaction request evaluation factors to server portion 114.
  • computer system 100 is configured such that sensitive information does not cross network boundary 150, and can be instead analyzed on user device 120 and/or remote edge server 130.
  • sending subsets of transaction request evaluation factors from user device 120 to local edge server 140 is performed via remote edge server 130.
  • the subset of transaction request evaluation factors is sent from user device 120 to remote edge server 130, analyzed by applying received transaction request evaluation factors to remote edge server portion 134 (block 622), and then some or all of the subset of transaction request evaluation factors are sent across network boundary 150 from remote edge server 130 to local edge server 140.
  • fifteen transaction request evaluation factors are collected at block 610.
  • the fifteen transaction request evaluation factors include factors A through E that include personally identifiable information and factors F through I that include other sensitive information.
  • the transaction request evaluation factors A through E are used to generate user device scores 230 at block 614.
  • the remaining ten transaction request evaluation factors (F through O) are sent to remote edge server 130.
  • transaction request evaluation factors F, G, H, and I are applied by remote edge server 130 to remote edge server portion 134 to generate edge server scores 310.
  • transaction request evaluation factors J and K are used by local edge server 140 to generate additional edge server scores 310.
  • Transaction request evaluation factors L, M, N, and O are sent to server computer system 110 for analysis.
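The fifteen-factor walk-through above can be sketched as a simple routing table. The A-O partition follows the example; the dictionary layout and function name are illustrative, not part of the patent:

```python
# Sketch: map each example factor (A-O) to the component that analyzes it,
# keeping the most sensitive factors closest to the user.
ROUTING = {
    "user_device": list("ABCDE"),  # personally identifiable information
    "remote_edge": list("FGHI"),   # other sensitive information
    "local_edge":  list("JK"),
    "server":      list("LMNO"),
}

def analyzing_component(factor):
    for component, factors in ROUTING.items():
        if factor in factors:
            return component
    raise ValueError(f"unknown factor {factor!r}")
```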
  • transaction request evaluation factors may be analyzed by more than one computer system (e.g., user device 120 analyzes factors A through F, and remote edge server 130 analyzes factors D through I).
  • server computer system 110 determines how to respond to transaction request 102.
  • server computer system 110 authenticates transaction request 102 (e.g., using identity services module 420). If transaction request 102 is authenticated, at block 644, server computer system 110 evaluates the received scores (e.g., user scores 230, edge server scores 310) and/or server scores against one or more decision thresholds 402 to determine how to respond to transaction request 102.
  • server computer system 110 determines to initiate a step-up operation (block 645), which causes method 600 to advance to block 650.
  • server computer system 110 determines to grant transaction request 102 (block 646), which causes method 600 to jump to block 660.
  • server computer system 110 determines to deny transaction request 102 (block 647), which causes method 600 to end and user device 120 to receive an indication that transaction request 102 was denied.
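The three-way decision at blocks 645-647 can be sketched as a comparison of combined scores against decision thresholds 402. The thresholds, the mean-based score combination, and all names below are illustrative assumptions, not details from the patent.

```python
# Hypothetical sketch of blocks 644-647: evaluate received scores against
# decision thresholds to choose among grant, deny, and step-up.
def decide(scores, grant_threshold=0.8, deny_threshold=0.4):
    """Combine scores and compare against decision thresholds.

    Returns "grant", "step_up", or "deny". The simple mean and the two
    threshold values are placeholders for decision thresholds 402.
    """
    combined = sum(scores) / len(scores)
    if combined >= grant_threshold:
        return "grant"    # block 646: jump to block 660
    if combined < deny_threshold:
        return "deny"     # block 647: method ends, user device notified
    return "step_up"      # block 645: advance to block 650
```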
  • a step-up operation is performed.
  • a step-up challenge is sent from server computer system 110 to user device 120 (arrow 652).
  • user 104 solves the step-up challenge or user device 120 solves the step-up challenge.
  • a message is sent to server computer system 110 with the step-up solution (arrow 656).
  • Server computer system 110 checks the step-up solution to determine whether to grant transaction request 102.
  • the step-up challenge is a request for additional information from user 104 (e.g., a secondary password, a request to input a one-time password sent via a secondary channel or generated by another device or application, etc.).
  • the step-up challenge is performed by user device 120 performing additional data collection and user device 120 or an edge server 130, 140 performing additional scoring (e.g., user device 120 collects additional transaction request evaluation factors and applies them to user device portion 124 to generate additional user device scores).
  • step-up operations may be performed securely without exposing sensitive information to server computer system 110.
  • server computer system 110 performs the requested transaction (block 660). As discussed above, this transaction may be a login, a financial transaction, or access to secured information or media. After the requested transaction is successfully completed, server computer system 110 sends an indication to user device 120 that the transaction was successfully completed (arrow 662).
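One common form of step-up challenge mentioned above is a one-time password sent via a secondary channel. A minimal server-side sketch of issuing and verifying such a code is shown below; the function names and the SHA-256/constant-time-comparison design are assumptions for illustration, not claimed implementation details.

```python
import hashlib
import hmac
import secrets

def issue_step_up_challenge():
    """Server side: generate a one-time code and keep only its digest.

    The code itself would be delivered to user 104 via a secondary channel
    (arrow 652); the server retains only the hash for later verification.
    """
    code = f"{secrets.randbelow(10**6):06d}"            # 6-digit one-time code
    digest = hashlib.sha256(code.encode()).hexdigest()  # stored server-side
    return code, digest

def verify_step_up_solution(solution, digest):
    """Server side: constant-time check of the submitted solution (arrow 656)."""
    candidate = hashlib.sha256(solution.encode()).hexdigest()
    return hmac.compare_digest(candidate, digest)
```

Constant-time comparison avoids leaking information about the stored digest through timing differences.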
  • FIG. 7 is a flowchart illustrating an example transaction request evaluation method 700 using a federated machine learning model 400 that is implemented using user device 120 and server computer system 110.
  • Method 700 includes various elements discussed in Figs. 6A-6D, but neither edge server 130, 140 is shown because edge servers 130, 140 are not used to evaluate transaction request 102 in method 700.
  • server computer system 110 generates federated machine learning model 400 (block 602) and distributes user device portion 124 to user device 120 (line 604). Blocks 606, 610, 614, and 632 (and the lines 608, 612, and 616 connecting them) proceed in the same manner as Figs. 6A-6D.
  • server computer system 110 determines whether to grant transaction request 102. In method 700, no step-up operation is performed, and block 660 and arrow 662 are omitted for brevity.
  • user device portion 124 has been revised to become revised user device portion 702.
  • user device 120 sends an indication of revised user device portion 702 (e.g., revised model weights) to server computer system 110.
  • server computer system 110 uses revised user device portion 702 to update federated machine learning model 400 (e.g., using modules 412 and 414). In some embodiments, such updating results in a revised server device portion 708 and revised user device portion 702 being used as an updated federated machine learning model 400 to evaluate subsequent transactions.
  • revised user device portion 702 may also be used to generate headstart models (discussed in connection to Fig. 5B) that can be sent to a second user device 120.
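The model-update loop above (revised device weights flowing back to the server, and headstart models flowing out to new devices) can be sketched with a FedAvg-style aggregation. Both function names, the simple averaging, and the `shrink` damping factor are illustrative assumptions; the patent does not specify the aggregation algorithm.

```python
# Hypothetical sketch of the update path: devices send revised weights
# (revised user device portion 702) and the server aggregates them.
def federated_average(weight_sets):
    """Aggregate revised per-device weights into updated global weights."""
    n = len(weight_sets)
    keys = weight_sets[0].keys()
    return {k: sum(w[k] for w in weight_sets) / n for k in keys}

def headstart_model(global_weights, shrink=0.5):
    """Derive a generic starting model for a second user device.

    Damping the aggregated weights (the `shrink` value is an assumption)
    gives a new device a useful starting point without copying any single
    device's personalized behavior.
    """
    return {k: v * shrink for k, v in global_weights.items()}
```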
  • the process of method 700 may be performed by a particular user device 120 that a particular user 104 desires to use to log in to an application provided by server computer system 110.
  • server computer system 110 distributes user device portion 124 to the particular user device 120 when user 104 caused transaction request analysis module 122 to be installed with the application provided by server computer system 110.
  • a headstart model was used to initialize the user device portion 124 on the particular user device 120.
  • User 104 has previously used user device 120 to successfully log in two times since installing the application prior to inputting the current login request.
  • the particular user 104 enters the login request (e.g., a transaction request 102), including their username and password.
  • server computer system 110 requires at least a second factor be established as well before a login request is granted.
  • the particular user device 120 has collected user behavior information and user device information over the last two weeks (block 610) and this collected information is applied to the user device portion 124 to generate user device scores (block 614).
  • User device 120 sends personally identifiable transaction request evaluation factors and user device scores 230 to server computer system 110 (line 616). After receiving the user device score 230 and performing additional scoring by applying the received transaction request evaluation factors to server portion 114, server computer system 110 is able to establish an inherence factor and grants the login request. Subsequently, server computer system 110 incorporates the particular user device’s 120 revised user device portion 702 into federated machine learning model 400 and uses the revised user device portion 702 to generate a headstart model that is sent to a second user device 120.
  • a flowchart illustrates an example transaction request evaluation method 800 using a federated machine learning model 400 that is implemented using user device 120, remote edge server 130, and server computer system 110.
  • Method 800 includes various elements discussed in Figs. 6A-6D, but local edge server 140 is not shown because local edge server 140 is not used to evaluate transaction request 102 in method 800.
  • server computer system 110 generates federated machine learning model 400 (block 602) and distributes user device portion 124 to user device 120 and remote edge server portion 134 to remote edge server 130 (line 604).
  • Transaction request 102 is received at block 606.
  • Transaction request evaluation factors are collected by data collection module 126 (which is omitted from Fig. 8 for simplicity), sent to transaction request analysis module 122 (line 612), and analyzed using user device portion 124 (block 614).
  • User device scores 230 and transaction request evaluation factors are sent to server computer system 110 (line 616) along with an indication of transaction request 234.
  • transaction request evaluation factors are sent to remote edge server 130 (line 620).
  • the received transaction request evaluation factors are analyzed using remote edge server portion 134.
  • Edge server scores 310 are sent to server computer system 110 (line 624).
  • server computer system 110 determines whether to grant transaction request 102 and determines to perform a step-up operation. Step-up operation is performed (block 650), and after server computer system 110 determines that the step-up operation has been successfully performed, the transaction is performed (block 660).
  • the step-up operation may require asking user 104 for additional authentication information such as additional passwords, answering security questions, or performing an image recognition check. In other embodiments, however, the step-up operation is performed behind the scenes using additional transaction request evaluation factors and additional scoring.
  • the process of method 800 may be performed by a particular user device 120 that a particular user 104 uses to perform tasks for their employer ABC Corp, an enterprise with hundreds of workers.
  • Prior to allowing particular user 104 to perform a task that imposes a certain level of risk (e.g., before performing a financial transaction for $10,000 or more), server computer system 110 requires a step-up operation.
  • Transaction request analysis module 122 is installed on the particular user device 120 as well as hundreds of other user devices that are associated with ABC Corp.
  • ABC Corp has transaction request analysis module 300 on the remote edge server 130 that serves the ABC Corp office where particular user 104 works.
  • ABC Corp does not allow personally identifiable information about employees to go beyond network boundary 150. When particular user 104 submits a $20,000 transaction request as part of their job, it does not appear to particular user 104 that additional authentication information was needed to perform the task.
  • FIG. 9 is a flowchart illustrating an embodiment of a server computer system portion 900 of a transaction request evaluation method in accordance with the various embodiments.
  • the various actions associated with portion 900 are implemented by server computer system 110.
  • server computer system 110 trains a federated machine learning model 400 using a dataset of previous transaction requests (e.g., stored in transaction datastore 430). A server portion 114 of the federated machine learning model 400 is usable by the server computer system 110 to analyze a first set of factors for subsequent transaction requests 102, and a remote portion (e.g., a user device portion 124, a remote edge server portion 134) of the federated machine learning model 400 is usable by a remote computer system (e.g., a user device 120, a remote edge server 130) to analyze a second set of factors for subsequent transaction requests 102.
  • server computer system 110 sends the remote portion of the federated machine learning model 400 to the remote computer system.
  • server computer system 110 receives, from the remote computer system: (a) an indication of a particular subsequent transaction request 234, (b) the first set of factors for the particular subsequent transaction request 102 (e.g., line 616 in Fig. 6A, line 624 in Fig. 6B), and (c) remote scores (e.g., user device scores 230, edge server scores 310) that were generated by analyzing the second set of factors of the particular subsequent transaction request 102 with the remote portion of the federated machine learning model 400.
  • server computer system 110 generates server scores (e.g., server scores 500) by analyzing the first set of factors for the particular subsequent transaction request 102 with the server portion 114 of the federated machine learning model.
  • server computer system 110 determines whether to grant the particular subsequent transaction request 102.
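The server-side portion 900 just described (receive remote scores and the first set of factors, generate server scores with server portion 114, then decide) can be sketched end to end. The class, the linear scorer, and the threshold are placeholders chosen for illustration; the patent does not specify the model form.

```python
# Hypothetical sketch of portion 900: server-side scoring and grant decision.
class ServerPortion:
    """Minimal stand-in for server portion 114: a linear scorer."""
    def __init__(self, weights):
        self.weights = weights

    def score(self, factors):
        return sum(self.weights.get(k, 0.0) * v for k, v in factors.items())

def evaluate_request(server_portion, first_set_factors, remote_scores,
                     threshold=1.0):
    """Generate server scores from the first set of factors, combine them
    with the received remote scores, and decide whether to grant.

    The additive combination and the threshold value are assumptions.
    """
    server_score = server_portion.score(first_set_factors)
    combined = server_score + sum(remote_scores)
    return combined >= threshold
```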
  • FIG. 10 is a flowchart illustrating an embodiment of a remote computer system portion 1000 of a transaction request evaluation method in accordance with the various embodiments.
  • the various actions associated with portion 1000 are implemented by user device 120 or remote edge server 130.
  • a remote computer system (e.g., user device 120, remote edge server 130) stores a remote portion (e.g., a user device portion 124, a remote edge server portion 134) of a federated machine learning model 400 that was generated by a server computer system 110 using a dataset of previous transaction requests.
  • the federated machine learning model 400 includes at least the remote portion and a server portion 114.
  • the remote computer system receives a transaction request 102 from a user 104.
  • the remote computer system receives transaction request evaluation factors (e.g., line 612 of Fig. 6A, line 620 of Fig. 6B).
  • the transaction request evaluation factors include information that is usable to identify the user 104.
  • remote computer system generates, using the remote portion of the federated machine learning model 400 and a first subset of the transaction request evaluation factors, one or more remote computer system scores (e.g., user device scores 230, edge server scores 310).
  • the remote computer system sends to the server computer system 110, an indication of the transaction request 234, the one or more remote computer system scores, and a second subset of the transaction request evaluation factors (e.g., line 616 in Fig. 6A, line 624 in Fig. 6B).
  • the remote computer system receives a response to the transaction request 102 from the server computer system 110.
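The message the remote computer system assembles for the server — the indication of the transaction request, the remote scores, and only the second (shareable) subset of factors — can be sketched as below. All names are hypothetical; the key point, drawn from the description, is that the identifying first subset never leaves the remote computer system.

```python
# Hypothetical sketch: build the payload sent from the remote computer
# system to server computer system 110 (e.g., line 616 / line 624).
def build_server_message(transaction_id, remote_scores, factors,
                         shareable_keys):
    """Assemble the server-bound payload.

    Only the second subset of transaction request evaluation factors
    (those in `shareable_keys`) crosses the network boundary; factors
    usable to identify the user are withheld.
    """
    second_subset = {k: v for k, v in factors.items() if k in shareable_keys}
    return {
        "transaction": transaction_id,
        "remote_scores": remote_scores,
        "factors": second_subset,
    }
```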
  • a user device comprising: a computer processor circuit; and a computer memory circuit storing instructions that when executed by the computer processor circuit cause the user device to perform operations including: receiving a user device portion of a federated machine learning model that was generated by a server computer system using a dataset of previous transaction requests, wherein the federated machine learning model includes the user device portion and a server portion; receiving a transaction request from a user; collecting (a) user behavior information about how the user has used the user device and (b) transaction information about a transaction requested by the transaction request; generating, using the user device portion of the federated machine learning model and the user behavior information, one or more user device scores by applying the user behavior information to the federated machine learning model; sending, to the server computer system, an indication of the transaction request, the one or more user device scores, and the transaction information; and receiving, from the server computer system, a response to the transaction request.
  • the operations further include: training the user device portion using the user behavior information; and wherein generating the one or more user device scores includes generating an indication of whether user behavior in a period of time before receiving the transaction request has deviated from a previous trend in user behavior.
  • the collecting includes collecting (c) user device information; wherein generating the one or more user device scores includes applying the user device information to the federated machine learning model; and wherein the user behavior information and device information used to generate the one or more user device scores is not sent from the user device to another computer system.
  • operations further include: sending, from the user device to an edge server, user behavior information and device information that was not used to generate the one or more user device scores.
  • the collecting includes (c) collecting biometric information about the user of the user device; wherein generating the one or more user device scores includes applying indications of the biometric information to the federated machine learning model; and wherein biometric information is not sent from the user device to another computer system.
  • the operations further comprise: receiving a step-up challenge from the server computer system; receiving a solution to the step-up challenge from the user; and sending an indication of the solution to the step-up challenge to the server computer system.
  • a method comprising: storing, at a remote computer system, a remote portion of a federated machine learning model that was generated by a server computer system using a dataset of previous transaction requests, wherein the federated machine learning model includes the remote portion and a server portion; receiving, by the remote computer system, a transaction request from a user; receiving, by the remote computer system, transaction request evaluation factors, wherein the transaction request evaluation factors include information that is usable to identify the user; generating, by the remote computer system using the remote portion of the federated machine learning model and a first subset of the transaction request evaluation factors, one or more remote computer system scores; sending, from the remote computer system to the server computer system, an indication of the transaction request, the one or more remote computer system scores, and a second subset of the transaction request evaluation factors; and receiving, from the server computer system, a response to the transaction request.
  • the user device includes a safe list of software approved to collect and analyze information that is usable to identify a user without permission from the user; wherein the transaction request evaluation factors are collected by authentication software running on the user device; wherein the first subset of the transaction request evaluation factors includes information that is usable to identify the user; and wherein at least some of the information that is usable to identify the user is not sent from the user device to another computer system.
  • receiving the transaction request evaluation factors includes receiving the transaction request evaluation factors from a user device.
  • the federated machine learning model includes a user device portion that was previously sent to the user device via the edge server; the method further comprising: receiving, at the edge server from the user device, one or more user device scores generated by the user device using the user device portion of the federated machine learning model and a third subset of transaction request evaluation factors that was collected by the user device but not sent to the edge server; and sending, from the edge server to the server computer system, the one or more user device scores.
  • the response to the transaction request is sending a step-up challenge
  • the method further comprising: receiving, at the remote computer system, a step-up challenge from the server computer system; receiving, at the remote computer system, a solution to the step-up challenge from the user; and sending an indication of the solution to the step-up challenge to the server computer system.
  • a method comprising: storing, at a user device, a user device portion of a federated machine learning model that was generated by a server computer system using a dataset of previous transaction requests, wherein the federated machine learning model includes the user device portion and a server portion; receiving, at the user device, a transaction request from a user; collecting, by the user device, (a) user behavior information about how the user has used the user device, (b) user device information, and (c) transaction information about a transaction requested by the transaction request; generating, by the user device using the user device portion of the federated machine learning model and the user behavior information, one or more user device scores by applying the user behavior information and user device information to the federated machine learning model; sending, from the user device to the server computer system, an indication of the transaction request, the one or more user device scores, and the transaction information; and receiving, at the user device from the server computer system, a response to the transaction request.
  • the method of claim 16 further comprising: receiving, at the user device from the server computer system, an updated user device portion of the federated machine learning model, wherein the user device portion was generated based on ground truth indications for one or more transaction requests analyzed using the federated machine learning model.
  • the transaction request includes a knowledge authentication factor for the transaction requested by the transaction request; and wherein the one or more user device scores are useable to determine whether a second authentication factor is established for the transaction requested by the transaction request.
  • the response to the transaction request is sending a step-up challenge
  • the method further comprising: receiving, at the user device, a step-up challenge from the server computer system; receiving, at the user device, a solution to the step-up challenge from the user; and sending, from the user device, an indication of the solution to the step-up challenge to the server computer system.
  • Computer system 1100 includes a processor subsystem 1180 that is coupled to a system memory 1120 and I/O interface(s) 1140 via an interconnect 1160 (e.g., a system bus). I/O interface(s) 1140 is coupled to one or more I/O devices 1150.
  • Computer system 1100 may be any of various types of devices, including, but not limited to, a server system, personal computer system, desktop computer, laptop or notebook computer, mainframe computer system, tablet computer, handheld computer, workstation, network computer, a consumer device such as a mobile phone, music player, or personal data assistant (PDA). Although a single computer system 1100 is shown in Figure 11 for convenience, system 1100 may also be implemented as two or more computer systems operating together.
  • Processor subsystem 1180 may include one or more processors or processing units. In various embodiments of computer system 1100, multiple instances of processor subsystem 1180 may be coupled to interconnect 1160. In various embodiments, processor subsystem 1180 (or each processor unit within 1180) may contain a cache or other form of on-board memory.
  • System memory 1120 is usable to store program instructions executable by processor subsystem 1180 to cause system 1100 to perform various operations described herein.
  • System memory 1120 may be implemented using different physical memory media, such as hard disk storage, floppy disk storage, removable disk storage, flash memory, random access memory (RAM: SRAM, EDO RAM, SDRAM, DDR SDRAM, RAMBUS RAM, etc.), read-only memory (PROM, EEPROM, etc.), and so on.
  • Memory in computer system 1100 is not limited to primary storage such as memory 1120. Rather, computer system 1100 may also include other forms of storage such as cache memory in processor subsystem 1180 and secondary storage on I/O Devices 1150 (e.g., a hard drive, storage array, etc.). In some embodiments, these other forms of storage may also store program instructions executable by processor subsystem 1180.
  • I/O interfaces 1140 may be any of various types of interfaces configured to couple to and communicate with other devices, according to various embodiments.
  • I/O interface 1140 is a bridge chip (e.g., Southbridge) from a front-side to one or more back-side buses.
  • I/O interfaces 1140 may be coupled to one or more I/O devices 1150 via one or more corresponding buses or other interfaces.
  • I/O devices 1150 include storage devices (hard drive, optical drive, removable flash drive, storage array, SAN, or their associated controller), network interface devices (e.g., to a local or wide-area network), or other devices (e.g., graphics, user interface devices, etc.).
  • computer system 1100 is coupled to a network via a network interface device (e.g., configured to communicate over WiFi, Bluetooth, Ethernet, etc.), another example of an I/O device 1150.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Security & Cryptography (AREA)
  • Software Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Computer Hardware Design (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Medical Informatics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Storage Device Security (AREA)
EP22854024.1A 2021-08-05 2022-07-22 Computersystemarchitektur für maschinelles lernen Pending EP4381435A1 (de)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US17/395,204 US12081541B2 (en) 2021-08-05 2021-08-05 Device-side federated machine learning computer system architecture
US17/395,014 US20230041015A1 (en) 2021-08-05 2021-08-05 Federated Machine Learning Computer System Architecture
PCT/US2022/074037 WO2023015111A1 (en) 2021-08-05 2022-07-22 Machine learning computer system architecture

Publications (1)

Publication Number Publication Date
EP4381435A1 true EP4381435A1 (de) 2024-06-12

Family

ID=85156397

Family Applications (1)

Application Number Title Priority Date Filing Date
EP22854024.1A Pending EP4381435A1 (de) 2021-08-05 2022-07-22 Computersystemarchitektur für maschinelles lernen

Country Status (3)

Country Link
EP (1) EP4381435A1 (de)
AU (1) AU2022323412A1 (de)
WO (1) WO2023015111A1 (de)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11526745B2 (en) * 2018-02-08 2022-12-13 Intel Corporation Methods and apparatus for federated training of a neural network using trusted edge devices
US10616369B1 (en) * 2018-04-04 2020-04-07 Fuze, Inc. System and method for distributing communication requests based on collaboration circle membership data using machine learning
US11443240B2 (en) * 2019-09-06 2022-09-13 Oracle International Corporation Privacy preserving collaborative learning with domain adaptation
US11188791B2 (en) * 2019-11-18 2021-11-30 International Business Machines Corporation Anonymizing data for preserving privacy during use for federated machine learning
WO2021121585A1 (en) * 2019-12-18 2021-06-24 Telefonaktiebolaget Lm Ericsson (Publ) Methods for cascade federated learning for telecommunications network performance and related apparatus
CN112232528B (zh) * 2020-12-15 2021-03-09 之江实验室 一种联邦学习模型训练方法、装置及联邦学习系统

Also Published As

Publication number Publication date
AU2022323412A1 (en) 2024-01-18
WO2023015111A1 (en) 2023-02-09

Similar Documents

Publication Publication Date Title
US20230041015A1 (en) Federated Machine Learning Computer System Architecture
US11403413B2 (en) Avoiding user session misclassification using configuration and activity fingerprints
US10419427B2 (en) Authenticating identity for password changes
US20180248699A1 (en) Systems and methods for providing a universal decentralized solution for verification of users with cross-verification features
US20180343120A1 (en) Systems and methods for providing a universal decentralized solution for verification of users with cross-verification features
US20200380112A1 (en) Verification of access to secured electronic resources
US10984410B2 (en) Entity-sovereign data wallets using distributed ledger technology
US10846385B1 (en) Systems and methods for user-authentication despite error-containing password
Biswas et al. DAAC: Digital asset access control in a unified blockchain based e-health system
US20200053085A1 (en) Context-based possession-less access of secure information
US11765162B2 (en) Systems and methods for automatically performing secondary authentication of primary authentication credentials
US20140101752A1 (en) Secure gesture
US8943559B2 (en) Access authentication method and system
WO2023204916A2 (en) Apparatus and methods for mapping user-associated data to an identifier
US12081541B2 (en) Device-side federated machine learning computer system architecture
Ahmad et al. Machine learning-based intelligent security framework for secure cloud key management
US8965340B1 (en) Mobile device indentification by device element collection
EP4381435A1 (de) Computersystemarchitektur für maschinelles lernen
WO2022272262A1 (en) Federated machine learning management
Sinno et al. How biometrics can save companies from ‘fire and forget’
Junquera-Sánchez et al. JBCA: Designing an adaptative continuous authentication architecture
US20230177528A1 (en) Systems and methods for data insights from consumer accessible data
US20230140665A1 (en) Systems and methods for continuous user authentication based on behavioral data and user-agnostic pre-trained machine learning algorithms
US20230370473A1 (en) Policy scope management
US20240340339A1 (en) Peer-to-peer identity verification

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20240104

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR