US20190258791A1 - Expression recognition in messaging systems - Google Patents

Expression recognition in messaging systems

Info

Publication number
US20190258791A1
Authority
US
United States
Prior art keywords
messaging
expression
biometric
user
message
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/397,787
Inventor
Eric Leuthardt
Scott Stern
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
FaceToFace Biometrics Inc
Original Assignee
FaceToFace Biometrics Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed: https://patents.darts-ip.com/?family=54017629&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=US20190258791(A1). "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Assigned to FaceToFace Biometrics, Inc. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: STERN, SCOTT; LEUTHARDT, ERIC
Application filed by FaceToFace Biometrics Inc filed Critical FaceToFace Biometrics Inc
Priority to US16/397,787
Publication of US20190258791A1
Priority to US16/831,432
Priority to US17/353,270
Legal status: Abandoned (current)

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/30Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F21/31User authentication
    • G06F21/32User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60Protecting data
    • G06F21/62Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/629Protecting access to data via a platform, e.g. using keys or access control rules to features or functions of an application
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0269Targeted advertisements based on user profile or attribute
    • G06Q30/0271Personalized advertisement
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L63/00Network architectures or network communication protocols for network security
    • H04L63/08Network architectures or network communication protocols for network security for authentication of entities
    • H04L63/0861Network architectures or network communication protocols for network security for authentication of entities using biometrical features, e.g. fingerprint, retina-scan
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/06Authentication
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/20Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel
    • H04W4/21Services signaling; Auxiliary data signalling, i.e. transmitting data via a non-traffic channel for social networking applications
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/011Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174Facial expression recognition
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W12/00Security arrangements; Authentication; Protecting privacy or anonymity
    • H04W12/60Context-dependent security
    • H04W12/65Environment-dependent, e.g. using captured environmental data
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04WWIRELESS COMMUNICATION NETWORKS
    • H04W4/00Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/12Messaging; Mailboxes; Announcements

Abstract

Some embodiments include a messaging system capable of expression-based communication and/or expression-based actions. The messaging system can run on a computing device. For example, the messaging system monitors a camera feed from a camera of the computing device to detect a biometric signature when a messaging interface of the messaging application is actively being used. The messaging system can match the detected biometric signature against a known profile utilizing a facial recognition process to authenticate an operating user to use the messaging application. The messaging system can determine a human expression based on the detected biometric signature utilizing an expression recognition process to associate a contextual tag with an activity on the messaging interface. The messaging system can then communicate with a message server system to associate the contextual tag with content presented to the operating user or with a conversation in which the operating user participates via the messaging application.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This application is a continuation of U.S. patent application Ser. No. 14/643,810, entitled “EXPRESSION RECOGNITION IN MESSAGING SYSTEMS,” which was filed on Mar. 10, 2015, which claims the benefit of U.S. Provisional Patent Application No. 61/950,423, entitled “BIOMETRIC FOR MOBILE ACCESS,” which was filed on Mar. 10, 2014; U.S. Provisional Patent Application No. 61/985,059, entitled “USE OF BIOMETRIC FOR ACCESS TO DATA DEVICE AND ASSOCIATED SOLUTIONS FOR DIFFICULT BIOMETRIC READING SCENARIOS,” which was filed on Apr. 28, 2014; and U.S. Provisional Patent Application No. 62/051,031, entitled “EXPRESSION RECOGNITION IN MESSAGING SYSTEMS,” which was filed on Sep. 16, 2014; which are all incorporated by reference herein in their entirety.
  • RELATED FIELD
  • At least one embodiment of this disclosure relates generally to an electronic messaging system, and in particular to privacy and security of an electronic messaging system.
  • BACKGROUND
  • With the wide availability of mobile devices, in some areas/cultures, electronic messaging is becoming an integral part of a person's life. Because of this, privacy and security concerns arise over the use of such systems. Conventional technology protects against privacy violations by providing a screen lock on a mobile device whenever the mobile device is not in use. An authorized user can unlock the screen by typing a passcode into the mobile device. However, the passcode is knowledge that is transferable, and hence may be stolen. Furthermore, the screen lock prevents access to other applications on the mobile device, making it unnecessarily inconvenient. Other solutions protect against such violations by scheduling the destruction of a message to ensure that its content does not survive indefinitely.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1A is a diagram illustrating a messaging application of a mobile device revealing content of a message when a facial profile associated with a recipient account is recognized, in accordance with various embodiments.
  • FIG. 1B is a diagram illustrating a messaging application of a mobile device hiding content of a message when a facial profile associated with a recipient account is not detected, in accordance with various embodiments.
  • FIG. 1C is a diagram illustrating a messaging application of a mobile device hiding content of a message when a facial profile associated with a recipient account is detected and a second unauthorized facial profile is detected, in accordance with various embodiments.
  • FIG. 2 is a block diagram of a system environment of a messaging system implementing a biometric security mechanism, in accordance with various embodiments.
  • FIG. 3 is a flow chart of a method of operating a messaging application on a computing device that implements an expression recognition process, in accordance with various embodiments.
  • FIG. 4 is a flow chart of a method of operating a messaging application on a computing device capable of delivering advertisements, in accordance with various embodiments.
  • FIG. 5 is a flow chart of a method of operating a message server system that facilitates conversations between computing devices, in accordance with various embodiments.
  • FIG. 6 is a block diagram of an example of a computing device, which may represent one or more computing devices or servers described herein, in accordance with various embodiments.
  • The figures depict various embodiments of this disclosure for purposes of illustration only. One skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles of the invention described herein.
  • DETAILED DESCRIPTION
  • Disclosed is a technology that pertains to protecting messages through a messaging system via a biometric security mechanism. The messaging system includes a message server system comprising one or more computer servers and messaging applications running on end-user devices. The end-user devices can include personal computers, smart phones, tablets, wearable devices, gaming consoles, smart TVs, other electronic gadgets, or any combination thereof. The messaging applications act as independent agents for the messaging system. For example, a messaging application may be installed on a general-purpose operating system (e.g., Windows, Android, iOS, etc.). The messaging application can also be implemented by client-side script (e.g., JavaScript) that may be executed on a web browser of the end-user devices.
  • The biometric security mechanism is implemented via the messaging application. Unlike the conventional security mechanisms for a recipient user device that protect privacy only of the user of the recipient user device, the disclosed biometric security mechanism also protects the privacy and security of the sender of the message. This enables a new paradigm of protection for stakeholders (e.g., the message senders) who are conventionally dependent solely on the message recipients to protect their interests.
  • In various embodiments, the disclosed technology uses sensors in end-user devices to monitor and profile an end-user. Such profiling can serve to ensure security (for example, by utilizing biometric recognition, such as facial recognition or ear recognition). Such profiling can also serve to enhance context mining (for example, by utilizing expression recognition). In some embodiments, the expression recognition can be implemented without implementing the biometric recognition. In some embodiments, the biometric recognition can be implemented without implementing the expression recognition.
  • Security
  • The biometric security mechanism implements a biometric recognition process to verify one or more identities as authorized by the message senders. For example, the biometric recognition process may be based on facial recognition, ear recognition, silhouette recognition, speaker recognition, fingerprint recognition, device motion pattern recognition, contact pattern recognition, etc. The biometric security mechanism can detect biometric patterns utilizing a sensor, such as a camera, a microphone, an accelerometer, a touch sensor, a gyroscope, etc. The biometric security mechanism can then profile (e.g., by recording patterns and then training one or more computational functions that recognize one or more shared attributes or characteristics of the patterns) the biometric patterns of one or more users into a profile model utilizing machine learning algorithms, such as support vector machines, Gaussian mixture models, hidden Markov models, etc. Based on the profiles of the users, the biometric security mechanism can recognize a user in real-time by matching the biometric pattern detected by a sensor with the profile model.
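  • By way of a hedged illustration of the profiling and matching described above, the sketch below trains a per-user profile model from enrollment feature vectors and then checks a live reading against it. The use of fixed-length embeddings, the choice of a one-class support vector machine, and all names are assumptions for this sketch, not requirements of the disclosure.

```python
# Minimal sketch: enrollment feature vectors (assumed to be fixed-length
# embeddings produced elsewhere, e.g. by a face-landmark or voice pipeline)
# train a one-class profile model; live readings are then matched against it.
import numpy as np
from sklearn.svm import OneClassSVM

def enroll_profile(enrollment_vectors: np.ndarray) -> OneClassSVM:
    """Train a one-class profile model from a user's enrollment samples."""
    model = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1)
    model.fit(enrollment_vectors)
    return model

def matches_profile(model: OneClassSVM, live_vector: np.ndarray) -> bool:
    """Return True if a live biometric reading is consistent with the profile."""
    return model.predict(live_vector.reshape(1, -1))[0] == 1

# Example with synthetic 128-dimensional embeddings.
rng = np.random.default_rng(0)
profile = enroll_profile(rng.normal(size=(50, 128)))
print(matches_profile(profile, rng.normal(size=128)))
```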
  • The biometric security mechanism is instantiated whenever the messaging interface of the messaging application is actively opened and running on an end-user device (e.g., such that the mechanism is active while the messaging application is open). The biometric security mechanism utilizes one or more types of biometric recognition processes to determine whether the people who have access to the end-user device (e.g., looking at the screen of the end-user device) are consistent with the privacy settings of the sender account and the recipient account.
  • To use the messaging application on a user device, a user may have to identify a messaging system account associated with the messaging system. The biometric security mechanism uses the one or more types of biometric recognition processes to ensure that an operator using the messaging application matches a biometric profile of the messaging system account. This procedure ensures that a third party (e.g., a brother or a classmate) cannot send messages on behalf of the user. In the case that there are multiple operators for the same end-user device, this procedure also ensures that independent private sessions of the messaging application can be established based on automatic biometric recognition.
  • The biometric security mechanism can provide customizable privacy shields per message or conversation. These privacy shields may be customized by privacy settings dictated by the sender of the message or the initiator of the conversation. The biometric security mechanism can use the one or more types of biometric recognition processes to ensure that the operator using the messaging application matches an authorized biometric profile in accordance with a privacy setting of an individual conversation or message. For example, the default privacy setting may be that the message or conversation is revealed when a biometric profile of the recipient account is recognized by the biometric security mechanism. In another example, the default privacy setting may be that the message or conversation is revealed only when the biometric profile of the recipient account is recognized without the presence of other detected human beings in the proximate area of the recipient end-user device.
  • In some embodiments, the privacy setting can indicate a single authorized account or biometric profile. In some embodiments, the privacy setting can indicate a group of authorized accounts or biometric profiles. In some embodiments, the privacy setting can indicate an attribute of a biometric profile, such as age, gender, facial gesture, facial expression, vocal pitch, etc. The attribute can be the sole variable to authorize a person. For example, the privacy setting can indicate that any female face (e.g., in the case of facial recognition) or any female voice (e.g., in the case of speaker recognition) would be authorized. The attribute can be an additional condition on top of a specific account or biometric profile. For example, a sender account (e.g., John) may send a message with a privacy setting indicating Nancy as the recipient account. In some cases, the privacy setting can additionally require an attribute of the recipient profile, such as being female.
  • The privacy shield may be implemented, for example, as a screen lock, a scrambling of the message content, a blur of the message content, a blackout of the message content, or any combination thereof. In some embodiments, the biometric security mechanism continuously loops through the one or more types of biometric recognition processes. In some embodiments, the biometric security mechanism disengages the privacy shield whenever an authorized biometric profile is recognized and detected. In some embodiments, the biometric security mechanism engages the privacy shield whenever a non-authorized biometric profile is detected.
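  • A minimal sketch of the per-message privacy check described above follows, assuming the detected people and the sender's privacy setting are available as simple records; the dataclass fields and function names are hypothetical, not taken from the disclosure.

```python
# Sketch: the shield reveals content only when an authorized viewer is present
# and, unless the setting allows bystanders, nobody else is in front of the screen.
from dataclasses import dataclass, field

@dataclass
class DetectedPerson:
    account_id: str | None                  # None if the face is not recognized
    attributes: set[str] = field(default_factory=set)   # e.g. {"female", "adult"}

@dataclass
class PrivacySetting:
    authorized_accounts: set[str]           # accounts allowed to view the message
    required_attributes: set[str] = field(default_factory=set)
    allow_bystanders: bool = False          # reveal even if others are present?

def shield_should_reveal(setting: PrivacySetting,
                         detected: list[DetectedPerson]) -> bool:
    authorized = [p for p in detected
                  if p.account_id in setting.authorized_accounts
                  and setting.required_attributes <= p.attributes]
    if not authorized:
        return False
    return setting.allow_bystanders or len(authorized) == len(detected)

# Example: the recipient is present but a second, unrecognized face is detected.
setting = PrivacySetting(authorized_accounts={"nancy"})
people = [DetectedPerson("nancy", {"female"}), DetectedPerson(None)]
print(shield_should_reveal(setting, people))  # False -> keep content hidden
```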
  • A sender account or a recipient account can be associated with a single individual or a group of individuals. In the case of a group of individuals, biometric profiles of every member are associated with the sender account or the recipient account.
  • Expression Recognition
  • The use of the biometric security mechanism provides a dependable authentication process that improves or guarantees privacy of the users of the messaging system. Accordingly, the users can trust that the biometric recognition process is used for protection instead of exploitation. Because of this trust, an expression recognition process can be run concurrently with the biometric recognition process. That is, a user, who otherwise would not use a messaging application with an expression recognition process, would use that same messaging application if the biometric recognition process were there to guarantee security and privacy.
  • The disclosed technology can implement an expression recognition process in addition to the biometric recognition process to provide additional contextual information associated with a user's emotional/mood state when using a messaging application. The expression recognition of a user of the messaging system can provide several benefits. For example, the message server system can maintain a stimulus response database that maps associations between stimuli presented on the messaging interface and expressions recognized by the expression recognition process. The message server system can then generate and provide a query interface to present consumer data (e.g., anonymized consumer data) based on the stimulus response database for advertisers, researchers, or business intelligence departments.
  • For another example, the messaging application can use the expression recognition process to personalize the messaging application. In some cases, based on the responding expression of a user when viewing a particular type of message (e.g., from a particular sender or during a particular time of day), the messaging application can determine whether or not to present another message of the same type via the messaging interface.
  • For another example, the messaging application can use the expression recognition process to add gesture control to the messaging application. In some cases, certain facial expressions can correspond to a gesture control to activate or deactivate an interactive component of the messaging application. In one specific example, a frown detected by the expression recognition process can cause the messaging application to suspend.
  • For another example, the messaging application can use the expression recognition process to customize targeted advertisement. In some cases, an advertisement may be selected based on the current mood of the user. The current mood of the user, in turn, can be estimated based on the expression recognized. In some cases, presentation of an advertisement may be triggered based on a user expression condition. For example, an advertisement may be shown only when the user is recognized to be smiling.
  • For another example, the messaging application can use the expression recognition process to add context to conversations between users of the messaging system. In some cases, the recognized expression of a sender user can be added as an emoticon to a message. In some cases, the recognized expression of a viewer user can be fed back to the sender as a status update.
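  • As a hedged illustration of the context-adding examples above, a recognized sender expression could be embedded in an outgoing message as an emoticon, and a viewer expression could be fed back as a time-stamped status; the expression-to-emoticon mapping and message fields below are assumptions, not prescribed by the disclosure.

```python
# Sketch: attach the sender's recognized expression to an outgoing message and
# build a time-stamped status update reporting a viewer's reaction.
import time

EXPRESSION_TO_EMOTICON = {
    "smile": ":)",
    "laugh": ":D",
    "frown": ":(",
    "surprise": ":o",
}

def tag_outgoing_message(text: str, expression: str) -> dict:
    """Embed the sender's recognized expression as an emoticon in the message."""
    return {"body": text,
            "emoticon": EXPRESSION_TO_EMOTICON.get(expression, ""),
            "sent_at": time.time()}

def viewer_status_update(viewer_id: str, expression: str) -> dict:
    """Report the viewer's recognized reaction back to the sender."""
    return {"viewer": viewer_id,
            "expression": expression,
            "observed_at": time.time()}

print(tag_outgoing_message("See you at 8", "smile"))
print(viewer_status_update("user-9", "laugh"))
```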
  • Alternative Embodiments
  • The disclosure above pertains to the specific example of an electronic messaging system for delivery of text, images, audio clips, or video clips. However, it is within the contemplation of this disclosure to apply the biometric security mechanism to other similar systems that include registration of a sender account and a receiver account. For example, the biometric security mechanism can apply to email, social networks, dating networks, event/meet-up invitations, physical delivery of goods, enterprise messaging (e.g., financial agent to customer messaging, real estate agent to customer messaging, medical professional messaging, etc.), or any combination thereof. The biometric security mechanism is advantageous in enforcing privacy settings of the sender at the device of the recipient. This is an improvement over existing technology that only seeks to protect against unauthorized exposure of content on behalf of the device owner but not on behalf of the sender.
  • The biometric security mechanism is also advantageous in preventing impersonation attempts, such as for the purpose of cyber bullying, by verifying the identity of a message sender when composing a message. This improves the overall accountability of messaging within the messaging system. Similarly, in an enterprise environment, the authentication in both viewing and composition of the message enhances enterprise-level security and compliance (e.g., eHealth compliance when the message content involves medical information).
  • The disclosure above assumes that the messaging application implements both the biometric security mechanism and a messaging interface (e.g., to compose and read messages). However, it is within the contemplation of this disclosure to implement the biometric security mechanism on a separate device or application from where the messaging interface is implemented. For example, in the case of an enterprise email system, an email access interface may be provided on a desktop computer while the biometric security mechanism may be implemented on a mobile device (e.g., a smart phone). In this example, the biometric security mechanism can require the mobile device to connect with the email access interface on the desktop computer. The biometric security mechanism can perform the biometric recognition process to verify that there is a secure environment near the desktop computer (e.g., no unauthorized user is around and/or an authorized user is present). In response to verifying a secure environment, the biometric security mechanism can notify the email access interface to reveal content of a message.
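  • One possible sketch of this split-device arrangement appears below; it assumes, purely for illustration, that the mobile device reaches the desktop email interface through a plain HTTP endpoint and that the payload fields shown are sufficient.

```python
# Sketch: the mobile device runs the biometric check and then tells the paired
# email interface whether to reveal or hide message content.
import json
import time
import urllib.request

def notify_email_interface(desktop_url: str, session_id: str,
                           environment_secure: bool) -> None:
    """Send the result of the environment check to the desktop email interface."""
    payload = json.dumps({
        "session": session_id,
        "action": "reveal" if environment_secure else "hide",
        "checked_at": time.time(),
    }).encode("utf-8")
    request = urllib.request.Request(
        desktop_url, data=payload,
        headers={"Content-Type": "application/json"}, method="POST")
    with urllib.request.urlopen(request, timeout=5) as response:
        response.read()   # the desktop side acknowledges the state change

# Example call (requires a listening endpoint; URL is hypothetical):
# notify_email_interface("http://desktop.local/privacy-shield", "abc123", True)
```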
  • FIGS. 1A through 1C illustrate an example of the biometric security mechanism that implements a facial recognition process to protect the privacy of both a sender of a message and a recipient of the message. FIG. 1A is a diagram illustrating a messaging application of a mobile device revealing content of a message when a facial profile associated with a recipient account is recognized, in accordance with various embodiments. FIG. 1B is a diagram illustrating a messaging application of a mobile device hiding content of a message when a facial profile associated with a recipient account is not detected, in accordance with various embodiments. FIG. 1C is a diagram illustrating a messaging application of a mobile device hiding content of a message when a facial profile associated with a recipient account is detected and a second unauthorized facial profile is detected, in accordance with various embodiments.
  • FIG. 2 is a block diagram of a system environment of a messaging system 200 implementing a biometric security mechanism, in accordance with various embodiments.
  • The messaging system 200 can communicate with client devices 202 (e.g., mobile phones, tablets, desktop computers, laptops, other network-enabled devices, or any combination thereof). The messaging system 200 can include a messaging platform system 204 (e.g., one or more computer servers) configured to provide a service to facilitate human-understandable electronic communication between user accounts. The human-understandable electronic communication can include emoticons, text, photos, audio clips, videos, links, images, or any combination thereof. The human-understandable content of the electronic communication may be part of an electronic message or can be referenced in the electronic message (e.g., stored elsewhere and accessible through a network).
  • In some embodiments, each of the client devices 202 can have its own instance of a messaging interface 206 and a corresponding instance of a biometric security engine 207 running thereon that communicates with the messaging platform system 204. In some embodiments, the messaging interface 206 and the biometric security engine 207 are part of a messaging application running and/or installed on the client device. In some embodiments, the messaging interface 206 is installed and/or running on a first client device and the biometric security engine 207 is installed and/or running on a second client device. In these embodiments, the biometric security engine 207 on the second client device can control a privacy shield implemented by the messaging interface 206.
  • For example, the messaging interface 206 and/or the biometric security engine 207 can be embodied as a mobile application running on operating systems of some of the client devices 202. In another example, the messaging interface 206 and/or the biometric security engine 207 can be implemented as a web-based application running on web browsers on some of the client devices 202.
  • The client devices 202 can be associated with user accounts. In some embodiments, a user account of the messaging system 200 can have multiple client devices associated therewith. In some embodiments, a client device can have multiple user accounts associated therewith. Conversations between user accounts are tracked by the messaging system 200 such that the messaging system 200 can deliver an electronic message from a client device of one user account to a client device of another user account.
  • In some embodiments, the messaging system 200 can include a user profile database 208. The user profile database 208 is configured to store user profiles of one or more user accounts. The user profiles may be associated with one or more social networking systems (e.g., an affiliated social networking system, a social networking system integrated with the messaging system 200, or an external social networking system) and social network profiles in the social networking systems.
  • In various embodiments, the messaging interface 206 can implement a privacy shield. The biometric security engine 207, for example, can recognize whether or not an authorized user is present by analyzing a video feed from its respective client device using a facial recognition algorithm. The messaging platform system 204 can maintain biometric profiles of user accounts in the user profile database 208. The messaging platform system 204 associates a sender account and a receiver account with every message (e.g., as specified by the sender account). The biometric profiles of the sender account and the receiver account can both be considered “authorized users.” One or more lists of authorized users may be stored in an authorization database 210. The authorization database 210 can also maintain one or more lists of blacklisted user accounts that are explicitly unauthorized. In some embodiments, the sender account, the receiver account, or both can add additional user accounts to the list of authorized users. In some embodiments, the sender account, the receiver account, or both can add one or more user accounts to the list of “blacklisted” user accounts.
  • The authorization database 210 can store a list of authorized users specific to a message conversation or specific to a user account (e.g., specific to a sender account, a receiver account, or both). For example, for each message or conversation sent or each message or conversation received, a user can add or remove one or more accounts from the list of authorized users or blacklisted users. For another example, a first user account can add a second user account as an authorized user (e.g., a spouse) or a blacklisted user (e.g., a rival friend or sibling) for all conversations that the first user account participates in. In some embodiments (e.g., as required by law), the authorization database 210 can also store a list of globally authorized users.
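  • One way such per-conversation and per-account lists might be organized is sketched below; the storage layout, method names, and the choice that blacklists take precedence over allow lists are illustrative assumptions.

```python
# Sketch: authorized and blacklisted accounts scoped to a conversation or to an
# account, with deny entries overriding allow entries.
from collections import defaultdict

class AuthorizationStore:
    def __init__(self):
        self.by_conversation = defaultdict(lambda: {"allow": set(), "deny": set()})
        self.by_account = defaultdict(lambda: {"allow": set(), "deny": set()})

    def allow(self, scope: str, key: str, account_id: str) -> None:
        getattr(self, f"by_{scope}")[key]["allow"].add(account_id)

    def deny(self, scope: str, key: str, account_id: str) -> None:
        getattr(self, f"by_{scope}")[key]["deny"].add(account_id)

    def is_authorized(self, conversation_id: str, owner_account: str,
                      viewer_account: str) -> bool:
        conv = self.by_conversation[conversation_id]
        acct = self.by_account[owner_account]
        if viewer_account in conv["deny"] or viewer_account in acct["deny"]:
            return False
        return viewer_account in conv["allow"] or viewer_account in acct["allow"]

store = AuthorizationStore()
store.allow("conversation", "conv-42", "nancy")
store.deny("account", "john", "sibling")
print(store.is_authorized("conv-42", "john", "nancy"))    # True
print(store.is_authorized("conv-42", "john", "sibling"))  # False
```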
  • The messaging platform system 204 can communicate with the biometric security engine 207 to secure content of messages. For example, the messaging platform system 204 can send biometric profiles (e.g., facial profiles) of the authorized users and/or expressly unauthorized users to be verified on a client device by the biometric security engine 207. The biometric security engine 207 can monitor outputs of a sensor 214 (e.g., a camera) to detect and recognize biometric profiles. In some embodiments, the messaging platform system 204 can request the biometric security engine 207 to capture and send an image or a video of its operator to be verified on the messaging platform system 204. In those embodiments, the messaging platform system 204 returns the result of the verification back to the biometric security engine 207.
  • Once the biometric security engine 207 determines that a secure environment is present, the biometric security engine 207 can control the privacy shield implemented in the messaging interface 206 to either reveal or hide content of a message. The privacy shield implemented by the messaging interface 206 can reveal content of a message conversation when the authorized user is detected. In some embodiments, the privacy shield can hide the content in response to detecting both an authorized user and an unauthorized user (e.g., implicitly unauthorized or expressly unauthorized). In some embodiments, the privacy shield can hide the content whenever an expressly unauthorized user account is detected.
  • In some embodiments, the client devices 202 can each also include an expression recognition engine 216. The expression recognition engine 216 can configure a client device 202 to execute the disclosed expression recognition process. For example, the expression recognition engine 216 can implement the methods described in FIG. 3, FIG. 4, and FIG. 5. In some embodiments, the expression recognition engine 216 can be integrated with the biometric security engine 207. In some embodiments, the expression recognition engine 216 can be part of the same messaging application as the messaging interface 206.
  • The expression recognition engine 216 can utilize the one or more sensors 214 to monitor at least a biometric signature of an operating user. For example, the biometric signature can be a facial profile, a head profile, a mouth profile, a hand gesture profile, a gait profile, a vocal profile, or any combination thereof. From the biometric signature, the expression recognition engine 216 can determine an expression state. The expression state can correspond to an emotion, a mood, or other patterns recognizable across the biometric signatures of various people.
  • In some embodiments, the expression recognition engine 216 can upload recognized expression states of an operating user to the messaging platform system 204. The expression recognition engine 216 can also upload associated metadata (e.g., time stamp, associated user identifier, etc.) to the messaging platform system 204. Based on the expression states and the associated metadata, the messaging platform system 204 can maintain a stimulus response database 218. The stimulus response database 218 logs and tracks the expression state of an operating user in response to an activity occurring on the messaging interface 206. The activity, for example, can be the operating user composing a message, viewing a message, viewing an advertisement, viewing a media object, or any combination thereof. The stimulus response database 218 can generate a query interface such that an authorized agent of the messaging system 200 can access statistics for specific users or groups of users.
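  • A minimal sketch of this stimulus-response logging follows, assuming the uploaded record carries the expression state plus metadata fields like those named above; the exact schema is an assumption.

```python
# Sketch: the server appends each recognized expression state, keyed by the
# on-screen activity or stimulus, so the log can later be queried.
import time
from collections import defaultdict

class StimulusResponseDB:
    def __init__(self):
        self._log = defaultdict(list)   # stimulus_id -> list of response records

    def record(self, stimulus_id: str, user_id: str, expression: str,
               activity: str, timestamp: float | None = None) -> None:
        self._log[stimulus_id].append({
            "user": user_id,
            "expression": expression,
            "activity": activity,        # e.g. "viewing_advertisement"
            "timestamp": timestamp if timestamp is not None else time.time(),
        })

    def responses_for(self, stimulus_id: str) -> list:
        return list(self._log[stimulus_id])

db = StimulusResponseDB()
db.record("ad-1337", "user-9", "smile", "viewing_advertisement")
print(db.responses_for("ad-1337"))
```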
  • FIG. 3 is a flow chart of a method 300 of operating a messaging application (e.g., the expression recognition engine 216 of FIG. 2) on a computing device (e.g., one of the client devices 202 of FIG. 2) that implements an expression recognition process, in accordance with various embodiments.
  • At step 302, the computing device can monitor a video feed from a camera of the computing device to detect a biometric signature when a messaging interface of the messaging application is actively being used. At step 304, the computing device can match the detected biometric signature against a known profile utilizing a facial recognition process to authenticate an operating user to use the messaging application. The known profile can be stored in the computing device or a messaging server system (e.g., the messaging platform system 204 of FIG. 2). In some embodiments, the biometric signature is a facial profile and the biometric recognition process is a facial recognition process.
  • At step 306, the computing device can determine a human expression based on the detected biometric signature utilizing an expression recognition process to associate a contextual tag with an activity on the messaging interface. At step 308, the computing device can communicate with the message server system to associate the contextual tag with a conversation in which the operating user participates (e.g., a conversation the operating user engages in through the messaging application).
  • In some cases, the messaging interface is being used to compose a message in the conversation. In those cases, communicating with the message server system can include sending the contextual tag as an emoticon embedded as part of the message to configure the message server system to distribute the emoticon to another participant of the conversation. In another example, communicating with the message server system can include sending the contextual tag as an emotion status that is time stamped to configure the message server system to maintain an emotion status record of at least a participant of the conversation. In some cases, the messaging interface is being used to read a message in the conversation. In those cases, communicating with the message server system can include sending the contextual tag as an emotion status that is time stamped to configure the message server system to present the emotion status to another participant of the conversation. The other participant, for example, can be the writer of the message.
  • In some cases, communicating with the message server system can include personalizing the messaging application based on the contextual tag. In some cases, communicating with the message server system can include archiving an expression log associated with the operating user on the message server system.
  • Optionally, at step 310, the computing device can activate or deactivate an interactive component of the messaging interface in response to detecting a specific expression via the expression recognition process. That is, biometric expressions (e.g., facial expressions) can be used as a gesture control of the messaging interface/messaging application.
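  • A compact sketch of the client-side flow of method 300 appears below; the recognition steps are supplied as callables because the disclosure does not prescribe particular algorithms, and the function and parameter names are hypothetical.

```python
# Sketch of steps 302-308: monitor the camera feed, authenticate the operator,
# recognize an expression, and associate the resulting contextual tag with the
# current activity on the messaging interface.
from typing import Callable, Iterable, Optional

def run_expression_pipeline(camera_frames: Iterable,
                            authenticate: Callable[[object], Optional[str]],
                            recognize_expression: Callable[[object], Optional[str]],
                            send_contextual_tag: Callable[[str, str], None],
                            current_activity: Callable[[], str]) -> None:
    for frame in camera_frames:                    # step 302: monitor the feed
        user_id = authenticate(frame)              # step 304: match known profile
        if user_id is None:
            continue                               # not an authenticated operator
        expression = recognize_expression(frame)   # step 306: determine expression
        if expression:
            send_contextual_tag(expression, current_activity())   # step 308

# Example wiring with trivial stand-ins.
run_expression_pipeline(
    camera_frames=[object()],
    authenticate=lambda frame: "user-9",
    recognize_expression=lambda frame: "smile",
    send_contextual_tag=lambda tag, activity: print(tag, activity),
    current_activity=lambda: "composing_message")
```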
  • FIG. 4 is a flow chart of a method 400 of operating a messaging application (e.g., the expression recognition engine 216 of FIG. 2) on a computing device (e.g., one of the client devices 202 of FIG. 2) capable of delivering advertisements, in accordance with various embodiments. At step 402, the computing device monitors a video feed from a camera (e.g., one of the sensors 214 of FIG. 2) to detect a biometric signature when a messaging interface of the messaging application is actively being used. At step 404, the computing device can match the detected biometric signature against a known profile utilizing a facial recognition process to authenticate an operating user to use the messaging application.
  • At step 406, the computing device determines a human expression based on the detected biometric signature utilizing an expression recognition process to associate a contextual tag with an activity on the messaging interface. Optionally, at step 408, the computing device can receive an expression trigger condition associated with the targeted advertisement from a message server system or an advertisement service server. At step 410, the computing device presents a targeted advertisement on the messaging interface based on the contextual tag and an identity of the operating user. The targeted advertisement can be presented in response to determining that the contextual tag corresponds to the expression trigger condition. The expression trigger condition, for example, can include a smile, a laugh, a grimace, a frown, a pout, or any combination thereof. The expression trigger condition can include an expression corresponding to a mood or an emotion, a micro expression, a stimulated expression, a neutralized expression, a masked expression, or any combination thereof.
  • Presenting the targeted advertisement can include selecting the targeted advertisement from multiple options based on the contextual tag. Selecting the targeted advertisement can be based on a time stamp of the contextual tag in relation to an activity (e.g., user action or messaging-app-side activity) occurring on the messaging interface near or at the time stamp. Selecting the targeted advertisement can be based on an expression trigger condition associated with the targeted advertisement.
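  • The trigger check of method 400 might look like the sketch below, which assumes the expression trigger condition arrives as a set of acceptable expressions; that format, and the ad record fields, are assumptions for illustration.

```python
# Sketch: show an advertisement only when the latest contextual tag satisfies
# its expression trigger condition; otherwise hold it back.
def select_advertisement(candidates: list, contextual_tag: str):
    for ad in candidates:
        trigger = ad.get("trigger_expressions", set())
        if not trigger or contextual_tag in trigger:
            return ad
    return None

ads = [
    {"id": "ad-1", "trigger_expressions": {"smile", "laugh"}},
    {"id": "ad-2", "trigger_expressions": {"frown"}},
]
print(select_advertisement(ads, "frown"))    # {'id': 'ad-2', ...}
print(select_advertisement(ads, "neutral"))  # None -> hold back the ad
```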
  • FIG. 5 is a flow chart of a method 500 of operating a message server system (e.g., the messaging platform system 204 of FIG. 2) that facilitates conversations between computing devices (e.g., the client devices 202 of FIG. 2), in accordance with various embodiments. At step 502, the message server system implements a messaging service to facilitate a messaging application executing on a first computing device. At step 504, the message server system authenticates an operating user of the first computing device based on a facial recognition process in conjunction with the messaging application when the operating user is actively using a messaging interface of the messaging application.
  • At step 506, the message server system receives an expression status from the messaging application based on an expression recognition process. Receiving the expression status can include receiving metadata associated with the expression status. The metadata, for example, can include a time stamp, a user identifier, a language identifier (e.g., identifies the language used in the conversation), an ethnic identifier of the user, an associated user action identifier (e.g., what the user was doing on the messaging interface), user response time, or any combination thereof.
  • At step 508, the message server system associates the expression status with a stimulus in a stimulus response database maintained by the message server system. At step 510, the message server system generates a query interface to the stimulus response database to provide associated expressions in response to a stimulus identifier.
  • The stimulus identifier, for example, can correspond to a corporate word, a political word, or a social word. The stimulus identifier, for another example, can correspond to a media object (e.g., image, text string, audio, video clip, etc.). The stimulus identifier can also correspond to an advertisement.
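  • A minimal sketch of such a query interface follows, assuming the stimulus response database can be viewed as a log of records keyed by a stimulus identifier; the field names are assumptions.

```python
# Sketch: aggregate the expression statuses recorded against one stimulus.
from collections import Counter

def query_expressions(log: list, stimulus_id: str) -> Counter:
    return Counter(entry["expression"]
                   for entry in log
                   if entry["stimulus_id"] == stimulus_id)

log = [
    {"stimulus_id": "word:launch", "expression": "smile", "timestamp": 1.0},
    {"stimulus_id": "word:launch", "expression": "frown", "timestamp": 2.0},
    {"stimulus_id": "ad-1337", "expression": "smile", "timestamp": 3.0},
]
print(query_expressions(log, "word:launch"))  # Counter({'smile': 1, 'frown': 1})
```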
  • While processes or methods are presented in a given order, alternative embodiments may perform routines having steps, or employ systems having blocks, in a different order, and some processes or blocks may be deleted, moved, added, subdivided, combined, and/or modified to provide alternative or sub-combinations. Each of these processes or blocks may be implemented in a variety of different ways. In addition, while processes or blocks are at times shown as being performed in series, these processes or blocks may instead be performed in parallel, or may be performed at different times.
  • FIG. 6 is a block diagram of an example of a computing device 600, which may represent one or more computing devices or servers described herein, in accordance with various embodiments. The computing device 600 can be one or more computing devices that implement the messaging system 200 of FIG. 2 or methods and processes described in this disclosure. The computing device 600 includes one or more processors 610 and memory 620 coupled to an interconnect 630. The interconnect 630 shown in FIG. 6 is an abstraction that represents any one or more separate physical buses, point-to-point connections, or both, connected by appropriate bridges, adapters, or controllers. The interconnect 630, therefore, may include, for example, a system bus, a Peripheral Component Interconnect (PCI) bus or PCI-Express bus, a HyperTransport or industry standard architecture (ISA) bus, a small computer system interface (SCSI) bus, a universal serial bus (USB), IIC (I2C) bus, or an Institute of Electrical and Electronics Engineers (IEEE) standard 1394 bus, also called "Firewire".
  • The processor(s) 610 is/are the central processing unit (CPU) of the computing device 600 and thus controls the overall operation of the computing device 600. In certain embodiments, the processor(s) 610 accomplishes this by executing software or firmware stored in memory 620. The processor(s) 610 may be, or may include, one or more programmable general-purpose or special-purpose microprocessors, digital signal processors (DSPs), programmable controllers, application-specific integrated circuits (ASICs), programmable logic devices (PLDs), trusted platform modules (TPMs), or the like, or a combination of such devices.
  • The memory 620 is or includes the main memory of the computing device 600. The memory 620 represents any form of random access memory (RAM), read-only memory (ROM), flash memory, or the like, or a combination of such devices. In use, the memory 620 may contain a code 670 containing instructions according to the messaging system disclosed herein.
  • Also connected to the processor(s) 610 through the interconnect 630 are a network adapter 640 and a storage adapter 650. The network adapter 640 provides the computing device 600 with the ability to communicate with remote devices over a network and may be, for example, an Ethernet adapter or Fibre Channel adapter. The network adapter 640 may also provide the computing device 600 with the ability to communicate with other computers. The storage adapter 650 enables the computing device 600 to access a persistent storage, and may be, for example, a Fibre Channel adapter or SCSI adapter.
  • The code 670 stored in memory 620 may be implemented as software and/or firmware to program the processor(s) 610 to carry out actions described above. In certain embodiments, such software or firmware may be initially provided to the computing device 600 by downloading it from a remote system (e.g., via the network adapter 640).
  • The techniques introduced herein can be implemented by, for example, programmable circuitry (e.g., one or more microprocessors) programmed with software and/or firmware, or entirely in special-purpose hardwired circuitry, or in a combination of such forms. Special-purpose hardwired circuitry may be in the form of, for example, one or more application-specific integrated circuits (ASICs), programmable logic devices (PLDs), field-programmable gate arrays (FPGAs), etc.
  • Software or firmware for use in implementing the techniques introduced here may be stored on a machine-readable storage medium and may be executed by one or more general-purpose or special-purpose programmable microprocessors. A “machine-readable storage medium,” as the term is used herein, includes any mechanism that can store information in a form accessible by a machine (a machine may be, for example, a computer, network device, cellular phone, personal digital assistant (PDA), manufacturing tool, any device with one or more processors, etc.). For example, a machine-accessible storage medium includes recordable/non-recordable media (e.g., read-only memory (ROM); random access memory (RAM); magnetic disk storage media; optical storage media; flash memory devices; etc.), etc.
  • The term “logic,” as used herein, can include, for example, programmable circuitry programmed with specific software and/or firmware, special-purpose hardwired circuitry, or a combination thereof.
  • Some embodiments of the disclosure have other aspects, elements, features, and steps in addition to or in place of what is described above. These potential additions and replacements are described throughout the rest of the specification.

Claims (1)

What is claimed is:
1. A computer-implemented method of operating a messaging application on a computing device, comprising:
monitoring a camera feed (e.g., continuous video stream or one or more discrete photographs) from a camera of the computing device to detect a biometric signature when a messaging interface of the messaging application is actively being used;
matching the detected biometric signature against a known profile utilizing a facial recognition process to authenticate an operating user to use the messaging application;
determining a human expression based on the detected biometric signature utilizing an expression recognition process to associate a contextual tag with an activity on the messaging interface; and
communicating with a message server system to associate the contextual tag with content presented to the operating user or with a conversation participated by the operating user via the messaging application.
US16/397,787 2014-03-10 2019-04-30 Expression recognition in messaging systems Abandoned US20190258791A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US16/397,787 US20190258791A1 (en) 2014-03-10 2019-04-30 Expression recognition in messaging systems
US16/831,432 US11042623B2 (en) 2014-03-10 2020-03-26 Expression recognition in messaging systems
US17/353,270 US20210312028A1 (en) 2014-03-10 2021-06-21 Expression recognition in messaging systems

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US201461950423P 2014-03-10 2014-03-10
US201461985059P 2014-04-28 2014-04-28
US201462051031P 2014-09-16 2014-09-16
US14/643,810 US10275583B2 (en) 2014-03-10 2015-03-10 Expression recognition in messaging systems
US16/397,787 US20190258791A1 (en) 2014-03-10 2019-04-30 Expression recognition in messaging systems

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US14/643,810 Continuation US10275583B2 (en) 2014-03-10 2015-03-10 Expression recognition in messaging systems

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/831,432 Continuation US11042623B2 (en) 2014-03-10 2020-03-26 Expression recognition in messaging systems

Publications (1)

Publication Number Publication Date
US20190258791A1 true US20190258791A1 (en) 2019-08-22

Family

ID=54017629

Family Applications (4)

Application Number Title Priority Date Filing Date
US14/643,810 Active 2037-02-16 US10275583B2 (en) 2014-03-10 2015-03-10 Expression recognition in messaging systems
US16/397,787 Abandoned US20190258791A1 (en) 2014-03-10 2019-04-30 Expression recognition in messaging systems
US16/831,432 Active US11042623B2 (en) 2014-03-10 2020-03-26 Expression recognition in messaging systems
US17/353,270 Abandoned US20210312028A1 (en) 2014-03-10 2021-06-21 Expression recognition in messaging systems

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/643,810 Active 2037-02-16 US10275583B2 (en) 2014-03-10 2015-03-10 Expression recognition in messaging systems

Family Applications After (2)

Application Number Title Priority Date Filing Date
US16/831,432 Active US11042623B2 (en) 2014-03-10 2020-03-26 Expression recognition in messaging systems
US17/353,270 Abandoned US20210312028A1 (en) 2014-03-10 2021-06-21 Expression recognition in messaging systems

Country Status (1)

Country Link
US (4) US10275583B2 (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10043060B2 (en) * 2008-07-21 2018-08-07 Facefirst, Inc. Biometric notification system
US10929651B2 (en) * 2008-07-21 2021-02-23 Facefirst, Inc. Biometric notification system
US11080513B2 (en) * 2011-01-12 2021-08-03 Gary S. Shuster Video and still image data alteration to enhance privacy
US9817960B2 (en) * 2014-03-10 2017-11-14 FaceToFace Biometrics, Inc. Message sender security in messaging system
US10275583B2 (en) 2014-03-10 2019-04-30 FaceToFace Biometrics, Inc. Expression recognition in messaging systems
KR101583181B1 (en) * 2015-01-19 2016-01-06 주식회사 엔씨소프트 Method and computer program of recommending responsive sticker
CN106502712A (en) * 2015-09-07 2017-03-15 北京三星通信技术研究有限公司 APP improved methods and system based on user operation
US10437332B1 (en) * 2015-10-30 2019-10-08 United Services Automobile Association System and method for emotional context communication
US10732809B2 (en) * 2015-12-30 2020-08-04 Google Llc Systems and methods for selective retention and editing of images captured by mobile image capture device
CN107566243B (en) * 2017-07-11 2020-07-24 阿里巴巴集团控股有限公司 Picture sending method and equipment based on instant messaging
CN107480622A (en) * 2017-08-07 2017-12-15 深圳市科迈爱康科技有限公司 Micro- expression recognition method, device and storage medium
US20200028810A1 (en) * 2018-07-20 2020-01-23 International Business Machines Corporation Cognitive recognition and filtering of cyberbullying messages
US10593152B1 (en) 2018-08-22 2020-03-17 Aristocrat Technologies Australia Pty Limited Gaming machine and method for evaluating player reactions
CN109542216B (en) * 2018-10-11 2022-11-22 平安科技(深圳)有限公司 Man-machine interaction method, system, computer equipment and storage medium
US11133002B2 (en) * 2019-01-14 2021-09-28 Ford Global Technologies, Llc Systems and methods of real-time vehicle-based analytics and uses thereof
US11189130B2 (en) 2019-01-23 2021-11-30 Aristocrat Technologies Australia Pty Limited Gaming machine security devices and methods
CN110321477B (en) * 2019-05-24 2022-09-09 平安科技(深圳)有限公司 Information recommendation method and device, terminal and storage medium
US11308761B2 (en) 2019-05-31 2022-04-19 Aristocrat Technologies, Inc. Ticketing systems on a distributed ledger
US11373480B2 (en) 2019-05-31 2022-06-28 Aristocrat Technologies, Inc. Progressive systems on a distributed ledger
US11263866B2 (en) 2019-05-31 2022-03-01 Aristocrat Technologies, Inc. Securely storing machine data on a non-volatile memory device
CN110889366A (en) * 2019-11-22 2020-03-17 成都市映潮科技股份有限公司 Method and system for judging user interest degree based on facial expression
US11195371B2 (en) 2019-12-04 2021-12-07 Aristocrat Technologies, Inc. Preparation and installation of gaming devices using blockchain
US11636726B2 (en) 2020-05-08 2023-04-25 Aristocrat Technologies, Inc. Systems and methods for gaming machine diagnostic analysis
US11361062B1 (en) 2021-03-02 2022-06-14 Bank Of America Corporation System and method for leveraging microexpressions of users in multi-factor authentication
US11468713B2 2021-03-02 2022-10-11 Bank Of America Corporation System and method for leveraging a time-series of microexpressions of users in customizing media presentation based on users' sentiments

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080218472A1 (en) * 2007-03-05 2008-09-11 Emotiv Systems Pty., Ltd. Interface to convert mental states and facial expressions to application input
US20100141662A1 (en) * 2007-02-05 2010-06-10 Amegoworld, Ltd. Communication network and devices for text to speech and text to facial animation conversion

Family Cites Families (62)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8479978B1 (en) * 1998-04-17 2013-07-09 Diebold Self-Service Systems Division Of Diebold, Incorporated Banking system controlled responsive to data bearing records
US7246244B2 (en) 1999-05-14 2007-07-17 Fusionarc, Inc. A Delaware Corporation Identity verification method using a central biometric authority
US6836846B1 (en) 1999-10-21 2004-12-28 International Business Machines Corporation Method and apparatus for controlling e-mail access
AU1474401A (en) 1999-12-15 2001-06-25 Reuben Bahar Method and system for confirming receipt of electronic mail transmitted via a communications network
US6873710B1 (en) * 2000-06-27 2005-03-29 Koninklijke Philips Electronics N.V. Method and apparatus for tuning content of information presented to an audience
US7689832B2 (en) 2000-09-11 2010-03-30 Sentrycom Ltd. Biometric-based system and method for enabling authentication of electronic messages sent over a network
CA2372380A1 (en) 2001-02-20 2002-08-20 Martin D. Levine Method for secure transmission and receipt of data over a computer network using biometrics
US20030214535A1 (en) 2002-05-14 2003-11-20 Motorola, Inc. User interface for a messaging device and method
US7266847B2 (en) 2003-09-25 2007-09-04 Voltage Security, Inc. Secure message system with remote decryption service
US20050163379A1 (en) * 2004-01-28 2005-07-28 Logitech Europe S.A. Use of multimedia data for emoticons in instant messaging
JP2006344049A (en) 2005-06-09 2006-12-21 Konica Minolta Business Technologies Inc Image processor and image processing system
US8700469B2 (en) * 2006-03-06 2014-04-15 Apple Inc. System and method for delivering advertising with enhanced effectiveness
US8645379B2 (en) * 2006-04-27 2014-02-04 Vertical Search Works, Inc. Conceptual tagging with conceptual message matching system and method
WO2008042879A1 (en) 2006-10-02 2008-04-10 Global Rainmakers, Inc. Fraud resistant biometric financial transaction system and method
US20120164613A1 (en) * 2007-11-07 2012-06-28 Jung Edward K Y Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US8203530B2 (en) * 2007-04-24 2012-06-19 Kuo-Ching Chiang Method of controlling virtual object by user's figure or finger motion for electronic device
US20090016617A1 (en) 2007-07-13 2009-01-15 Samsung Electronics Co., Ltd. Sender dependent messaging viewer
US20090077163A1 (en) * 2007-09-14 2009-03-19 Phorm Uk, Inc. Approach for identifying and providing targeted content to a network client with reduced impact to the service provider
US8462949B2 (en) 2007-11-29 2013-06-11 Oculis Labs, Inc. Method and apparatus for secure display of visual content
US8600120B2 (en) 2008-01-03 2013-12-03 Apple Inc. Personal computing device control using face detection and recognition
US7814061B2 (en) * 2008-01-24 2010-10-12 Eastman Kodak Company Method for preserving privacy with image capture
KR20100002756A (en) * 2008-06-30 2010-01-07 Samsung Electronics Co., Ltd. Matrix blogging system and service support method thereof
US8266536B2 (en) * 2008-11-20 2012-09-11 Palo Alto Research Center Incorporated Physical-virtual environment interface
WO2010101697A2 (en) 2009-02-06 2010-09-10 Oculis Labs, Inc. Video-based privacy supporting system
US20110125844A1 (en) * 2009-05-18 2011-05-26 Telcordia Technologies, Inc. mobile enabled social networking application to support closed, moderated group interactions for purpose of facilitating therapeutic care
US8922480B1 (en) 2010-03-05 2014-12-30 Amazon Technologies, Inc. Viewer-based device control
US8694899B2 (en) * 2010-06-01 2014-04-08 Apple Inc. Avatars reflecting user states
US20110304629A1 (en) * 2010-06-09 2011-12-15 Microsoft Corporation Real-time animation of facial expressions
US9183557B2 (en) * 2010-08-26 2015-11-10 Microsoft Technology Licensing, Llc Advertising targeting based on image-derived metrics
JP2012073998A (en) * 2010-08-31 2012-04-12 Casio Comput Co Ltd Image distribution system, image display device, image distribution server, and program
JP6148431B2 (en) * 2010-12-28 2017-06-14 Canon Inc. Imaging apparatus and control method thereof
WO2013006351A2 (en) * 2011-07-01 2013-01-10 3G Studios, Inc. Techniques for controlling game event influence and/or outcome in multi-player gaming environments
US20150033017A1 (en) 2011-09-30 2015-01-29 Mithun ULIYAR Methods and Apparatuses for Electronic Message Authentication
US9189797B2 (en) 2011-10-26 2015-11-17 Apple Inc. Systems and methods for sentiment detection, measurement, and normalization over social networks
KR20130084543A (en) * 2012-01-17 2013-07-25 Samsung Electronics Co., Ltd. Apparatus and method for providing user interface
US9215395B2 (en) * 2012-03-15 2015-12-15 Ronaldo Luiz Lisboa Herdy Apparatus, system, and method for providing social content
US20130275223A1 (en) * 2012-04-17 2013-10-17 Yahoo! Inc. Future ad targeting
US8848068B2 (en) * 2012-05-08 2014-09-30 Oulun Yliopisto Automated recognition algorithm for detecting facial expressions
US20130322705A1 (en) * 2012-05-30 2013-12-05 Google Inc. Facial and fingerprint authentication
US20140195345A1 (en) * 2013-01-09 2014-07-10 Philip Scott Lyren Customizing advertisements to users
US9117066B2 (en) 2013-01-14 2015-08-25 Sap Portals Israel Ltd Camera-based portal content security
US8973149B2 (en) 2013-01-14 2015-03-03 Lookout, Inc. Detection of and privacy preserving response to observation of display screen
JP2016517052A (en) 2013-02-08 2016-06-09 Emotient Collecting machine learning training data for facial expression recognition
US9552535B2 (en) 2013-02-11 2017-01-24 Emotient, Inc. Data acquisition for machine perception systems
WO2014127065A2 (en) 2013-02-12 2014-08-21 Emotient Facial expression measurement for assessment, monitoring, and treatment evaluation of affective and neurological disorders
US20140316881A1 (en) 2013-02-13 2014-10-23 Emotient Estimation of affective valence and arousal with automatic facial expression measurement
WO2014127333A1 (en) 2013-02-15 2014-08-21 Emotient Facial expression training using feedback from automatic facial expression recognition
WO2014130748A1 (en) 2013-02-20 2014-08-28 Emotient Automatic analysis of rapport
US9104905B2 (en) 2013-05-02 2015-08-11 Emotient, Inc. Automatic analysis of individual preferences for attractiveness
US9105119B2 (en) 2013-05-02 2015-08-11 Emotient, Inc. Anonymization of facial expressions
US20140351163A1 (en) 2013-05-21 2014-11-27 Kevin Alan Tussy System and method for personalized delivery verification
US9721107B2 (en) 2013-06-08 2017-08-01 Apple Inc. Using biometric verification to grant access to redacted content
US10037530B2 (en) 2013-06-13 2018-07-31 Paypal, Inc. Payment recipient verification
KR102182398B1 (en) * 2013-07-10 2020-11-24 LG Electronics Inc. Electronic device and control method thereof
US9104907B2 (en) 2013-07-17 2015-08-11 Emotient, Inc. Head-pose invariant recognition of facial expressions
JP2016529612A (en) 2013-08-02 2016-09-23 Emotient, Inc. Filters and shutters based on image emotion content
US9553859B2 (en) 2013-08-08 2017-01-24 Google Technology Holdings LLC Adaptive method for biometrically certified communication
WO2015024002A1 (en) 2013-08-15 2015-02-19 Emotient Emotion and appearance based spatiotemporal graphics systems and methods
US9450957B1 (en) 2013-08-22 2016-09-20 Isaac S. Daniel Interactive mail verification system and method
US10275583B2 (en) 2014-03-10 2019-04-30 FaceToFace Biometrics, Inc. Expression recognition in messaging systems
US9817960B2 (en) 2014-03-10 2017-11-14 FaceToFace Biometrics, Inc. Message sender security in messaging system
US9525668B2 (en) 2014-06-27 2016-12-20 Intel Corporation Face based secure messaging

Also Published As

Publication number Publication date
US20150254447A1 (en) 2015-09-10
US10275583B2 (en) 2019-04-30
US11042623B2 (en) 2021-06-22
US20210312028A1 (en) 2021-10-07
US20200226239A1 (en) 2020-07-16

Similar Documents

Publication Publication Date Title
US11042623B2 (en) Expression recognition in messaging systems
US11334653B2 (en) Message sender security in messaging system
US10971158B1 (en) Designating assistants in multi-assistant environment based on identified wake word received from a user
US8850536B2 (en) Methods and systems for identity verification in a social network using ratings
US9282090B2 (en) Methods and systems for identity verification in a social network using ratings
US10607035B2 (en) Method of displaying content on a screen of an electronic processing device
US9055071B1 (en) Automated false statement alerts
US8490157B2 (en) Authentication—circles of trust
US10277588B2 (en) Systems and methods for authenticating a user based on self-portrait media content
US8850597B1 (en) Automated message transmission prevention based on environment
US10467387B2 (en) Computerized system and method for modifying a media file by automatically applying security features to select portions of media file content
US20190081919A1 (en) Computerized system and method for modifying a message to apply security features to the message's content
US8887300B1 (en) Automated message transmission prevention based on a physical reaction
US9256748B1 (en) Visual based malicious activity detection
US20160188862A1 (en) Method and system of silent biometric security privacy protection for smart devices
US20200193068A1 (en) Method Of Displaying Content On A Screen Of An Electronic Processing Device
US9208326B1 (en) Managing and predicting privacy preferences based on automated detection of physical reaction
US20160043979A1 (en) Automatic biographical summary compilation and speaker recognition based messaging system
CA3178249A1 (en) Systems and methods for conducting remote attestation
US10470043B1 (en) Threat identification, prevention, and remedy
US10498840B2 (en) Method and system for efficient review of exchanged content
US20200311226A1 (en) Methods, systems, apparatuses and devices for facilitating secure publishing of a digital content
Alkaeed et al. Privacy Preservation in Artificial Intelligence and Extended Reality (AI-XR) Metaverses: A Survey
GB2566043A (en) A method of displaying content on a screen of an electronic processing device
WO2015035057A1 (en) Systems and methods for verifying identities

Legal Events

Date Code Title Description
AS Assignment

Owner name: FACETOFACE BIOMETRICS, INC., MISSOURI

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEUTHARDT, ERIC;STERN, SCOTT;SIGNING DATES FROM 20150318 TO 20150323;REEL/FRAME:049025/0161

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION