US20210004573A1 - Sentiment analysis - Google Patents
- Publication number: US20210004573A1 (application US16/763,494)
- Authority
- US
- United States
- Prior art keywords
- subject
- sentiment
- facial features
- computing device
- sentiment level
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts; Occluding parts, e.g. glasses; Geometrical relationships
- G06V40/172—Classification, e.g. identification
- G06V40/174—Facial expression recognition
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/18—Status alarms
- G06K9/00248; G06K9/00281; G06K9/00288; G06K9/00302
Definitions
- Identifying the sentiment of customers and employees can be a factor in providing services.
- Subjects, such as customers and/or employees can be surveyed before, during, and/or after a transaction by asking if the transaction experience was satisfactory or not.
- Customers can be surveyed post-transaction based on their recollection of an event, time, and/or day of the transaction.
- FIG. 1 is a diagram of an example system to perform sentiment analysis according to the disclosure.
- FIG. 2 is a block diagram of an example of a computing device to perform sentiment analysis according to the disclosure.
- FIG. 3 is a block diagram of an example of a system consistent with the disclosure.
- FIG. 4 is an example of a computing device to perform sentiment analysis according to the disclosure.
- Surveys, reviews, and/or voice detection of subjects before, during, and/or after a transaction to determine a sentiment level of the subjects can allow for insight into trends and early signs of issues.
- The analysis of surveys, reviews, and/or voice detection can be limited to the subgroup of subjects who are either happy or upset enough to leave a review, ask for customer assistance, and/or take part in a survey.
- Surveys can be time-consuming to create and may be biased by question phrasing; reviews can be fraudulent; analysis is typically gathered post-transaction and may depend on a subject's recollection of the transaction; and surveys and reviews may reflect the bias of their creator.
- Sentiment analysis can allow for a subject's sentiment level to be determined and monitored.
- The subject can undergo sentiment analysis while being monitored by a camera.
- The term “subject” can, for example, refer to a person as an object of interest.
- Sentiment analysis can provide for insights into a subject's sentiment regarding a transaction while removing the workload of creating and filling out surveys and/or reviews and deriving meaning from those surveys and/or reviews.
- Sentiment analysis can refer to determining an attitude of a speaker, writer, or other subject with respect to some topic or the overall contextual polarity or emotional reaction to a document, interaction, or event.
- The term “sentiment level” can, for example, refer to a degree to which a subject has a sentiment. Sentiment levels can include a happy sentiment level, a frustrated sentiment level, an upset sentiment level, and/or a satisfied sentiment level, among other types of sentiment levels.
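The sentiment levels named here could be modeled as a simple enumeration. The sketch below is illustrative only and is not part of the disclosure; the `SentimentLevel` name is an assumption:

```python
from enum import Enum

class SentimentLevel(Enum):
    """Illustrative sentiment levels named in the disclosure (name assumed)."""
    HAPPY = "happy"
    FRUSTRATED = "frustrated"
    UPSET = "upset"
    SATISFIED = "satisfied"

# A determined level can then be compared or displayed by name.
level = SentimentLevel.HAPPY
print(level.value)  # happy
```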
- Determining a sentiment level of a subject may include analyzing a subject's sentiment level using the subject's identity based on facial features.
- Facial features may be determined via a digital image of the subject received from a camera.
- Facial features may include an element of the face.
- The term “element of a face” can, for example, refer to an ear, nose, mouth, hair, jaw, and/or cheekbones of a subject, among other types of facial elements of a subject.
- Sentiment analysis via sentiment level determination can allow for analyzing and determining an identity of a subject from facial features.
- The term “identity” can, for example, refer to a distinguishing character or personality of an individual.
- A subject's identity can distinguish the subject from other subjects.
- A sentiment level can be determined for each subject, where the subjects are distinguishable via their respective identities.
- A sentiment level may be displayed based on the determination of a sentiment level of a subject via sentiment analysis.
- A subject's sentiment level and identity, as well as contextual data, may be stored for future use to improve customer satisfaction.
- An alert may be generated in response to the determined sentiment level being different from a previous sentiment level, as is further described herein.
- FIG. 1 is a diagram of an example system 100 to perform sentiment analysis according to the disclosure.
- The system 100 may include a computing device 102, a camera 110, and a database 108.
- System 100 may include database 108 .
- Database 108 can perform functions related to sentiment analysis.
- Database 108 can be included in computing device 102.
- Alternatively, database 108 can be located remote from computing device 102, as illustrated in FIG. 1.
- Data can be transmitted to and/or from database 108 by computing device 102 via a network relationship.
- For instance, data can be transmitted to and/or from database 108 by computing device 102 via a wired or wireless network.
- The term “data” can refer to a set of values of qualitative or quantitative variables.
- The data included in database 108 can be hardware data and/or software data of the database 108, among other types of data.
- The wired or wireless network can be a network relationship that connects the database 108 to computing device 102.
- Examples of such a network relationship can include a local area network (LAN), wide area network (WAN), personal area network (PAN), a distributed computing environment (e.g., a cloud computing environment), storage area network (SAN), Metropolitan area network (MAN), a cellular communications network, and/or the Internet, among other types of network relationships.
- Computing device 102 can receive from camera 110 a digital image of a subject.
- A “camera” can refer to a device for recording visual images in the form of photographs, film, and/or video signals. Examples include compact digital cameras, Digital Single Lens Reflex (DSLR) cameras, mirrorless cameras, infrared (IR) cameras, action cameras, and 360-degree cameras, among other types of cameras.
- Digital images may be periodically transmitted to computing device 102 .
- Camera 110 may transmit digital images to computing device 102 based on a predetermined time period. For example, camera 110 can transmit a digital image to computing device 102 every fifteen minutes, every ten minutes, and/or any other time period.
- Camera 110 may transmit digital images to computing device 102 in response to a subject's change in position.
- For instance, in response to a subject changing position, camera 110 can take and transmit a digital image to computing device 102.
- A change in position can, for instance, refer to a change in the physical position of the subject.
- For example, the subject may move their arm, take a step to a different physical location than where the subject was previously standing, or move their torso, among other types of changes in position.
- Camera 110 may also transmit digital images to computing device 102 in response to an action by a subject that causes the camera 110 to transmit digital images.
- For instance, camera 110 may take a digital image upon a new client's arrival and transmit the digital image to computing device 102.
- The action by the subject can include picking up a predetermined product, standing in or entering a predetermined area, etc.
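The three transmit triggers described above (a predetermined time period, a change in position, and a subject action) could be combined in a single policy check. The following sketch is a hypothetical illustration; the function name and parameters are assumptions, not part of the disclosure:

```python
# Hypothetical trigger policy for when the camera sends a frame to the
# computing device: periodically, on a position change, or on a subject action.
PERIOD_SECONDS = 15 * 60  # e.g. every fifteen minutes

def should_transmit(last_sent, now, position_changed, action_triggered):
    """Return True when any of the three triggers described above fires."""
    if action_triggered:   # e.g. subject entered a predetermined area
        return True
    if position_changed:   # subject moved an arm, stepped, or turned their torso
        return True
    return (now - last_sent) >= PERIOD_SECONDS

# Example: 16 minutes after the last frame, with no other trigger.
print(should_transmit(0, 16 * 60, False, False))  # True
```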
- Computing device 102 can analyze the digital image received from camera 110 to detect facial features of a subject.
- A “facial feature” can, for example, refer to a distinguishing element of a face.
- An element of the subject's face may be an ear, nose, mouth, hair, jaw, and/or cheekbones of the subject.
- Facial features can be detected by computing device 102 via object detection.
- Object detection can, for example, refer to detecting instances of semantic objects in digital images.
- Semantic objects can include facial features.
- For example, computing device 102 can utilize object detection to detect facial elements such as a subject's ear, nose, eye, mouth, hair, jaw, and/or cheekbones, among other facial features and/or combinations thereof.
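One way to picture this step: run a generic object detector over the frame and keep only the detections whose labels are facial elements. The detector itself (e.g. a cascade or neural model) is outside this sketch; the labels, confidence threshold, and tuple layout below are assumptions for illustration:

```python
# Hypothetical post-processing of an object detector's output: keep only
# detections whose labels are facial elements of interest.
FACIAL_ELEMENTS = {"ear", "nose", "eye", "mouth", "hair", "jaw", "cheekbone"}

def facial_features(detections):
    """detections: list of (label, confidence, bounding_box) tuples."""
    return [d for d in detections if d[0] in FACIAL_ELEMENTS and d[1] >= 0.5]

dets = [("nose", 0.92, (40, 50, 20, 18)),
        ("chair", 0.88, (0, 0, 60, 60)),   # not a facial element
        ("eye", 0.31, (30, 35, 10, 8))]    # low-confidence eye is dropped
print([d[0] for d in facial_features(dets)])  # ['nose']
```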
- Computing device 102 can receive the digital image of the subject from camera 110 and analyze the detected facial features of the subject. Analyzing the detected facial features can include analyzing an element of the subject's face. For example, computing device 102 can analyze an ear, nose, eye, mouth, hair, jaw, cheekbones, and/or combinations thereof of the subject's face.
- Analyzing an element of the subject's face can include determining various characteristics about the element of the subject's face.
- Characteristics of an element of the subject's face can include a shape of the element, a size of the element, a color of the element, and distinguishing features of the element, among other types of characteristics.
- For example, computing device 102 may analyze an element of the subject's face such as the subject's eye. Analyzing the subject's eye may include determining a shape of the eye, size of the eye, color of the eye, etc.
- Computing device 102 can analyze the detected facial features to determine an identity of a subject. In some examples, computing device 102 can identify a subject as an existing subject or as a new subject, as is further described herein.
- In some examples, computing device 102 can identify the subject as an existing subject. For example, computing device 102 may receive a digital image from camera 110 and analyze the detected facial features based on an element of the subject's face, such as the subject's nose, mouth, and jaw. If the facial features included in the image received from camera 110 match the facial features of an existing image included in database 108, computing device 102 can identify the subject as an existing subject.
- In other examples, computing device 102 can identify the subject as a new subject. If the facial features included in the image received from camera 110 do not match the facial features of any existing image included in database 108, computing device 102 can identify the subject as a new subject.
- While computing device 102 is described above as utilizing a subject's nose, mouth, and jaw, examples of the disclosure are not so limited.
- For instance, computing device 102 can utilize a subject's ear(s), nose, mouth, hair, jaw, and/or cheekbones, and/or any other facial element and/or combination thereof to determine the identity of a subject.
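A minimal sketch of the identity check, assuming facial features are reduced to numeric feature vectors and matched by Euclidean distance against vectors stored in database 108; the threshold and representation are illustrative assumptions, not the disclosed method:

```python
import math

# Hypothetical identity check: compare a feature vector extracted from the
# new image against vectors stored in the database. A match below the
# threshold identifies an existing subject; otherwise the subject is new.
MATCH_THRESHOLD = 0.6

def identify(features, database):
    """database: dict mapping subject id -> stored feature vector."""
    best_id, best_dist = None, float("inf")
    for subject_id, stored in database.items():
        dist = math.dist(features, stored)
        if dist < best_dist:
            best_id, best_dist = subject_id, dist
    if best_dist <= MATCH_THRESHOLD:
        return ("existing", best_id)
    return ("new", None)

db = {"subject-1": [0.1, 0.9, 0.4], "subject-2": [0.8, 0.2, 0.5]}
print(identify([0.12, 0.88, 0.41], db))  # ('existing', 'subject-1')
print(identify([0.5, 0.5, 0.99], db))    # ('new', None)
```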
- Computing device 102 can determine a sentiment level of a subject using a sentiment analysis. For example, computing device 102 can determine the sentiment level by detecting facial features and the identity of a subject. Computing device 102 can determine a sentiment analysis via machine learning. For example, computing device 102 can utilize decision tree learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, Bayesian networks, and/or learning classifier systems to determine a sentiment analysis, among other types of machine learning techniques.
- In some examples, computing device 102 can determine the subject's sentiment level based on a facial expression of the subject. For example, computing device 102 may determine a subject's sentiment level as a happy sentiment level, a frustrated sentiment level, an upset sentiment level, and/or a satisfied sentiment level, among other types of sentiment levels. For instance, computing device 102 can determine the subject's sentiment level as happy based on the mouth of the subject being oriented in a smile, or as upset based on the subject's eyebrows being turned down.
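As an illustration of expression-based determination, the examples in the text (a smile indicating happy, downturned eyebrows indicating upset) can be written as simple rules. The `jaw` cue and the neutral default are added assumptions, and a trained machine-learning model, as described above, could replace this sketch entirely:

```python
# Hypothetical rule-based mapping from observed facial cues to a sentiment level.
def sentiment_from_expression(cues):
    """cues: dict of observed facial cues, e.g. {'mouth': 'smile'}."""
    if cues.get("mouth") == "smile":
        return "happy"                  # smile -> happy (from the text)
    if cues.get("eyebrows") == "turned_down":
        return "upset"                  # downturned eyebrows -> upset (from the text)
    if cues.get("jaw") == "clenched":   # assumed extra cue, for illustration
        return "frustrated"
    return "satisfied"                  # assumed neutral default

print(sentiment_from_expression({"mouth": "smile"}))           # happy
print(sentiment_from_expression({"eyebrows": "turned_down"}))  # upset
```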
- Computing device 102 can also determine a subject's sentiment level based on an identity of the subject. For example, if computing device 102 identifies a subject as an existing subject, computing device 102 can determine the sentiment level to be the previous sentiment level of the existing subject. Further, based on the analysis of the subject, computing device 102 can update the subject's sentiment level by comparing the subject's sentiment level with facial features and related sentiment levels of subjects received from database 108.
- Computing device 102 can determine customer satisfaction based on the determined sentiment level of the subject.
- The term “customer satisfaction” can, for example, refer to a measure of how well a product or service meets a customer's expectations.
- In some examples, computing device 102 can determine customer satisfaction utilizing the facial features analysis, as is further described herein.
- Computing device 102 may determine a customer satisfaction level as dissatisfied based on the determined sentiment level. For example, computing device 102 can identify a subject as a new customer and determine the sentiment level of the subject, based on facial features analysis, as frustrated. Based on the determination of the subject's sentiment level as frustrated, computing device 102 may determine the subject has a dissatisfied customer satisfaction level.
- Similarly, computing device 102 may identify a subject as an existing customer based on facial features analysis and determine the subject's sentiment level as a happy sentiment level by comparing the subject's facial features with sentiment levels stored in and received from database 108. Based on the determination of the subject's sentiment level as a happy sentiment level, computing device 102 may determine the customer is satisfied.
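The sentiment-to-satisfaction examples above (frustrated implies dissatisfied, happy implies satisfied) can be sketched as a lookup table; the mapping of the remaining levels is an assumption for illustration:

```python
# Hypothetical mapping from a determined sentiment level to a customer
# satisfaction level, following the examples in the text.
SATISFACTION = {
    "happy": "satisfied",
    "satisfied": "satisfied",
    "frustrated": "dissatisfied",  # frustrated -> dissatisfied (from the text)
    "upset": "dissatisfied",       # assumed
}

def customer_satisfaction(sentiment_level):
    return SATISFACTION.get(sentiment_level, "unknown")

print(customer_satisfaction("frustrated"))  # dissatisfied
print(customer_satisfaction("happy"))       # satisfied
```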
- Sentiment level information stored in database 108 can include existing subjects' information. Sentiment level information stored in database 108 can be information from other subjects, collected in various places and at various points in time.
- Computing device 102 can display the sentiment level of the subject via a display.
- A “display” can, for example, refer to an output device which can present information via a screen.
- A display may include a television, computer monitor, mobile device display, or other type of display device, or any combination thereof, which can receive and output a video signal.
- The display can be a liquid crystal display (LCD), light-emitting diode (LED) display, organic light-emitting diode (OLED) display, polymer light-emitting diode (PLED) display, micro-LED display, electronic paper display (EPD), bi-stable display, and/or a quantum-dot LED (QLED) display, among other types of displays.
- Computing device 102 may determine a subject's sentiment level and display the sentiment level via a display.
- For instance, an existing subject's sentiment level may be determined as “dissatisfied”.
- The dissatisfied sentiment level may be shown on a display so that an employee, supervisor, and/or other user may view the determined sentiment level.
- Appropriate action can then be taken.
- For example, further employee training can be performed to improve customer sentiment levels and customer satisfaction.
- In some examples, the subject may be given coupons or other discounts in order to improve customer sentiment levels.
- In other examples, appropriate personnel may be notified based on the subject's sentiment level.
- Computing device 102 can generate a report including the determined sentiment level and/or the past sentiment level of the subject.
- The term “report” can, for example, refer to an account or statement describing an event.
- The report generated by computing device 102 can include the sentiment level of the subject (for instance, whether the subject has a happy, frustrated, upset, and/or dissatisfied sentiment level, among other types of sentiment levels), whether the subject is a new or existing subject, and the customer satisfaction of the subject (for instance, whether the subject is satisfied or dissatisfied, among other satisfaction levels).
- The report can include information to allow personnel, such as a supervisor and/or employee, to determine whether to take action to improve the subject's experience by improving their sentiment level and/or customer satisfaction, to give an employee further training, etc.
- The report can be displayed via a display.
- In some examples, the report can be printed by an imaging device, such as a printer, such that the report can be physically distributed among personnel.
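A report carrying the fields listed above might be assembled as follows; the field names and format are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical report assembly covering the fields the text lists: sentiment
# level, new/existing status, and customer satisfaction.
def generate_report(subject_id, sentiment_level, is_existing, satisfaction):
    status = "existing" if is_existing else "new"
    return (f"Subject {subject_id}: {status} subject, "
            f"sentiment level '{sentiment_level}', "
            f"customer satisfaction '{satisfaction}'")

report = generate_report("subject-1", "frustrated", True, "dissatisfied")
print(report)
```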
- FIG. 2 is a block diagram 220 of an example computing device 202 for sentiment analysis consistent with the disclosure.
- As described herein, the computing device 202 may perform a number of functions related to sentiment analysis.
- The computing device 202 may include a processor and a machine-readable storage medium.
- Although the following descriptions refer to a single processor and a single machine-readable storage medium, the descriptions may also apply to a system with multiple processors and multiple machine-readable storage mediums.
- In such examples, the instructions executed by computing device 202 may be stored across multiple machine-readable storage mediums and executed across multiple processors, such as in a distributed or virtual computing environment.
- Processing resource 204 may be a central processing unit (CPU), a semiconductor based microprocessor, and/or other hardware devices suitable for retrieval and execution of machine-readable instructions 201 , 203 , 205 , 207 , stored in memory resource 206 .
- Processing resource 204 may fetch, decode, and execute instructions 201 , 203 , 205 , 207 .
- Processing resource 204 may include a plurality of electronic circuits that include electronic components for performing the functionality of instructions 201, 203, 205, 207.
- Memory resource 206 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions 201, 203, 205, 207 and/or data.
- For example, memory resource 206 may be Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like.
- Memory resource 206 may be disposed within computing device 202 , as shown in FIG. 2 . Additionally, and/or alternatively, memory resource 206 may be a portable, external or remote storage medium, for example, that allows computing device 202 to download the instructions 201 , 203 , 205 , 207 from a portable/external/remote storage medium.
- Computing device 202 may include instructions 201 stored in the memory resource 206 and executable by the processing resource 204 to receive a digital image of a subject.
- For example, computing device 202 may execute instructions 201 via the processing resource 204 to receive, from a camera, a digital image of a subject.
- Digital images taken by a camera may be periodically transmitted to computing device 202.
- For instance, a camera may transmit digital images to computing device 202 periodically, such as every fifteen minutes, and/or in response to a subject's change in position.
- In some examples, a camera may transmit digital images to computing device 202 in response to a subject's triggered action. For instance, an employee may trigger a camera to take a digital image upon a new subject's arrival and transmit the digital image to computing device 202.
- Computing device 202 may include instructions 203 stored in the memory resource 206 and executable by the processing resource 204 to analyze the digital image to detect facial features.
- For example, computing device 202 may execute instructions 203 via the processing resource 204 to analyze the digital image to detect facial features of the subject.
- Facial elements of the subject can, for example, include an ear, nose, mouth, eye, hair, jaw, and/or cheekbones of the subject, among other facial elements and/or combinations thereof.
- Computing device 202 may include instructions 205 stored in the memory resource 206 and executable by the processing resource 204 to determine a sentiment level of the subject using a sentiment analysis. For example, computing device 202 may execute instructions 205 via the processing resource 204 to determine a sentiment level of the subject using a sentiment analysis, where the sentiment analysis uses the detected facial features and the identity of the subject to determine the sentiment level of the subject.
- In some examples, computing device 202 can determine the identity of the subject as an existing subject; in other examples, computing device 202 can determine the identity of the subject as a new subject.
- Computing device 202 can analyze the detected facial features of a subject based on an element of the subject's face. For example, computing device 202 can analyze a subject's nose, mouth, and jaw. Computing device 202 can then compare the facial features from the digital image with facial features of existing images in a database. In response to the facial features from the digital image matching facial features of an existing image in the database, computing device 202 can identify the subject as an existing subject.
- In response to the facial features from the digital image not matching facial features of an existing image in the database, computing device 202 can identify the subject as a new subject.
- Computing device 202 may include instructions 207 stored in the memory resource 206 and executable by the processing resource 204 to display the sentiment level. For example, computing device 202 may execute instructions 207 via the processing resource 204 to display the sentiment level via a display.
- FIG. 3 is a block diagram of an example of a system 322 consistent with the disclosure.
- System 322 includes a processor 304 and a machine-readable storage medium 312.
- Although the following descriptions refer to an individual processing resource and an individual machine-readable storage medium, the descriptions may also apply to a system with multiple processing resources and multiple machine-readable storage mediums.
- In such examples, the instructions may be distributed across multiple machine-readable storage mediums and executed across multiple processing resources. Put another way, the instructions may be stored across multiple machine-readable storage mediums and executed across multiple processing resources, such as in a distributed computing environment.
- Processor 304 may be a central processing unit (CPU), microprocessor, and/or other hardware device suitable for retrieval and execution of instructions stored in machine-readable storage medium 312 .
- Processor 304 may receive, determine, and send instructions 309, 311, 313, 315, 317.
- Processor 304 may include an electronic circuit comprising a number of electronic components for performing the operations of the instructions in machine-readable storage medium 312.
- With respect to the executable instruction representations or boxes described and shown herein, it should be understood that part or all of the executable instructions and/or electronic circuits included within one box may be included in a different box shown in the figures or in a different box not shown.
- Machine-readable storage medium 312 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions.
- Machine-readable storage medium 312 may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like.
- The executable instructions may be “installed” on the system 322 illustrated in FIG. 3.
- Machine-readable storage medium 312 may alternatively be a portable, external or remote storage medium, for example, that allows the system 322 to download the instructions from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an “installation package”.
- In some examples, machine-readable storage medium 312 may be encoded with executable instructions for sentiment analysis.
- Instructions 309, when executed by processor 304, may cause system 322 to receive a digital image of a subject.
- For example, a computing device including processor 304 and machine-readable storage medium 312 can receive a digital image of a subject from a camera.
- Instructions 311, when executed by processor 304, may cause system 322 to analyze the digital image to detect facial features of the subject.
- Facial features of the subject can include an element of the subject's face.
- For example, elements of a subject's face may include an ear, nose, mouth, hair, jaw, and/or cheekbones of the subject, among other facial elements and/or combinations thereof.
- Instructions 312, when executed by processor 304, may cause system 322 to analyze the detected facial features to determine an identity of the subject by comparing the detected facial features to facial features included in a database.
- The computing device can determine the subject to be an existing subject in response to the detected facial features matching the facial features included in the database. In some examples, the computing device can determine the subject to be a new subject in response to the detected facial features not matching the facial features included in the database.
- Instructions 313, when executed by processor 304, may cause system 322 to determine a sentiment level of the subject using a sentiment analysis, where the sentiment analysis uses the detected facial features and an identity of the subject to determine the sentiment level of the subject.
- Instructions 315, when executed by processor 304, may cause system 322 to compare the sentiment level of the subject with a past sentiment level of the subject in response to the subject being an existing subject.
- Instructions 317, when executed by processor 304, may cause system 322 to generate an alert in response to the determined sentiment level having changed from the past sentiment level. For example, an alert may be generated such that an employee can be notified that a sentiment level of the subject has changed. In some examples, the employee can, in response, approach the subject differently, offer the subject coupons, and/or take other actions.
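Instructions 315 and 317 together amount to a compare-then-alert step, which might look like the following sketch; the function name, history structure, and alert string are assumptions for illustration:

```python
# Hypothetical alert check: compare the determined sentiment level with the
# stored past level for an existing subject and emit an alert on change.
def check_for_alert(subject_id, current_level, past_levels):
    """past_levels: dict mapping subject id -> last recorded level."""
    past = past_levels.get(subject_id)
    past_levels[subject_id] = current_level  # record the new level
    if past is not None and past != current_level:
        return f"ALERT: {subject_id} changed from {past} to {current_level}"
    return None  # new subject, or level unchanged: no alert

history = {"subject-1": "happy"}
print(check_for_alert("subject-1", "upset", history))
# ALERT: subject-1 changed from happy to upset
```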
- FIG. 4 is a block diagram of an example computing device 402 to perform sentiment analysis consistent with the disclosure. As described herein, the computing device 402 may perform a number of functions related to sentiment analysis.
- Processing resource 404 may be a central processing unit (CPU), a semiconductor based microprocessor, and/or other hardware devices suitable for retrieval and execution of machine-readable instructions 419 , 421 , 423 , 425 , 427 , 429 stored in memory resource 406 .
- Processing resource 404 may fetch, decode, and execute instructions 419 , 421 , 423 , 425 , 427 , 429 .
- Processing resource 404 may include a plurality of electronic circuits that include electronic components for performing the functionality of instructions 419, 421, 423, 425, 427, 429.
- Memory resource 406 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions 419, 421, 423, 425, 427, 429 and/or data.
- For example, memory resource 406 may be Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like.
- Memory resource 406 may be disposed within computing device 402, as shown in FIG. 4.
- Additionally, and/or alternatively, memory resource 406 may be a portable, external or remote storage medium, for example, that allows computing device 402 to download the instructions 419, 421, 423, 425, 427, 429 from a portable/external/remote storage medium.
- Computing device 402 may include instructions 419 stored in the memory resource 406 and executable by the processing resource 404 to receive a digital image of a subject. For example, computing device 402 may execute instructions 419 via the processing resource 404 to receive, from a camera, a digital image of a subject.
- Computing device 402 may include instructions 421 stored in the memory resource 406 and executable by the processing resource 404 to analyze the digital image to detect facial features. For example, computing device 402 may execute instructions 421 via the processing resource 404 to analyze the digital image to detect facial features of the subject. Facial elements of the subject can, for example, include an ear, nose, mouth, eye, hair, jaw, and/or cheekbones of the subject, among other facial elements and/or combinations thereof.
- Computing device 402 may include instructions 422 stored in the memory resource 406 and executable by the processing resource 404 to analyze the detected facial features to determine an identity of the subject. For example, computing device 402 may execute instructions 422 via the processing resource 404 to analyze the detected facial features to determine an identity of the subject by comparing the detected facial features to facial features included in a database. The subject can be determined to be an existing subject in response to the detected facial features matching the facial features included in the database. The subject can be determined to be a new subject in response to the detected facial features not matching the facial features included in the database.
- Computing device 402 may include instructions 423 stored in the memory resource 406 and executable by the processing resource 404 to determine a sentiment level of the subject using a sentiment analysis. For example, computing device 402 may execute instructions 423 via the processing resource 404 to determine a sentiment level of the subject using a sentiment analysis, where the sentiment analysis uses the detected facial features and the identity of the subject to determine the sentiment level of the subject.
- Computing device 402 may include instructions 425 stored in the memory resource 406 and executable by the processing resource 404 to compare the sentiment level of the subject with a past sentiment level. For example, computing device 402 may execute instructions 425 via the processing resource 404 to compare the sentiment level of the subject with a past sentiment level of the subject in response to the subject being an existing subject.
- Computing device 402 may include instructions 427 stored in the memory resource 406 and executable by the processing resource 404 to analyze the sentiment level to determine customer satisfaction. For example, the customer may be satisfied or dissatisfied.
- Computing device 402 may include instructions 429 stored in the memory resource 406 and executable by the processing resource 404 to display the sentiment level and the customer satisfaction. For example, computing device 402 may execute instructions 429 via the processing resource 404 to display the sentiment level and the customer satisfaction of the subject via display 414 .
- Display 414 can be, for instance, a touch-screen display. As previously described in connection with FIG. 1 , the display may include a television, computer monitor, mobile device display, other type of display device, or any combination thereof, which can receive and output a video signal.
Landscapes
- Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Life Sciences & Earth Sciences (AREA)
- Psychiatry (AREA)
- Emergency Management (AREA)
- Business, Economics & Management (AREA)
- Social Psychology (AREA)
- Molecular Biology (AREA)
- Hospice & Palliative Care (AREA)
- Educational Technology (AREA)
- Biophysics (AREA)
- Pathology (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Psychology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Geometry (AREA)
- Developmental Disabilities (AREA)
- Child & Adolescent Psychology (AREA)
- Image Analysis (AREA)
Abstract
Description
- Identifying the sentiment of customers and employees can be a factor in providing services. Subjects, such as customers and/or employees, can be surveyed before, during, and/or after a transaction by asking if the transaction experience was satisfactory or not. Customers can be surveyed post-transaction based on their recollection of an event, time, and/or day of the transaction.
- FIG. 1 is a diagram of an example system to perform sentiment analysis according to the disclosure.
- FIG. 2 is a block diagram of an example of a computing device to perform sentiment analysis according to the disclosure.
- FIG. 3 is a block diagram of an example of a system consistent with the disclosure.
- FIG. 4 is an example of a computing device to perform sentiment analysis according to the disclosure.
- Surveys, reviews, and/or voice detection of subjects before, during, and/or after a transaction to determine a sentiment level of the subjects can allow for insight into trends and early signs of issues. However, the analysis of surveys, reviews, and/or voice detection can be limited to a subgroup of subjects who are either happy or upset enough to want to leave a review, ask for customer assistance, and/or take part in a survey. Further, surveys can be time-consuming to create and may be subject to bias in question phrasing; reviews can be fraudulent; analysis is typically gathered post-transaction and may be dependent on a subject's recollection of the transaction; and surveys and reviews may be subject to the bias of their creator.
- Sentiment analysis, according to the disclosure, can allow for a subject's sentiment level to be determined and monitored. For example, the subject can be subjected to sentiment analysis while they are monitored by a camera. As used herein, the term “subject” can, for example, refer to a person as an object of interest. Sentiment analysis can provide for insights into a subject's sentiment regarding a transaction while removing the workload of creating and filling out surveys and/or reviews and deriving meaning from those surveys and/or reviews.
- Sentiment analysis, according to the disclosure, can refer to determining an attitude of a speaker, writer, or other subject with respect to some topic or the overall contextual polarity or emotional reaction to a document, interaction, or event. As used herein, the term “sentiment level” can, for example, refer to a degree to which a subject has a sentiment. Sentiment levels can include a happy sentiment level, a frustrated sentiment level, an upset sentiment level, and/or a satisfied sentiment level, among other types of sentiment levels.
- Determining a sentiment level of a subject may include analyzing a subject's sentiment level using the subject's identity based on facial features. In some instances, facial features may be determined via a digital image of the subject received from a camera. In some instances, facial features may include an element of the face. As used herein, an “element of a face” can, for example, refer to an ear, nose, mouth, hair, jaw, and/or cheekbones of a subject, among other types of facial elements of a subject.
- Sentiment analysis via sentiment level determination according to the disclosure can allow for analyzing and determining an identity of a subject from facial features. As used herein, the term “identity” can, for example, refer to a distinguishing character or personality of an individual. A subject's identity can distinguish the subject from other subjects. A sentiment level can be determined for each subject, where the subjects are distinguishable via their respective identities.
- A sentiment level may be displayed based on the determination of a sentiment level of a subject via sentiment analysis. A subject's sentiment level and identity, as well as contextual data, may be stored for future use to improve customer satisfaction. In some examples, an alert may be generated in response to the determined sentiment level being different from a previous sentiment level, as is further described herein.
- FIG. 1 is a diagram of an example system 100 to perform sentiment analysis according to the disclosure. As illustrated in FIG. 1, the system 100 may include a computing device 102, a camera 110, and a database 108.
- System 100 may include database 108. Database 108 can perform functions related to sentiment analysis. In some examples, database 108 can be included in computing device 102. In some instances, database 108 can be located remote from computing device 102, as illustrated in FIG. 1.
- Data can be transmitted to and/or from database 108 by computing device 102 via a network relationship. For example, data can be transmitted to and/or from database 108 by computing device 102 via a wired or wireless network. As used herein, “data” can refer to a set of values of qualitative or quantitative variables. The data included in database 108 can be hardware data and/or software data of the database 108, among other types of data.
- The wired or wireless network can be a network relationship that connects the database 108 to computing device 102. Examples of such a network relationship can include a local area network (LAN), wide area network (WAN), personal area network (PAN), a distributed computing environment (e.g., a cloud computing environment), storage area network (SAN), metropolitan area network (MAN), a cellular communications network, and/or the Internet, among other types of network relationships.
- Computing device 102 can receive from camera 110 a digital image of a subject. As used herein, a “camera” can refer to a device for recording visual images in the form of photographs, film, and/or video signals. Examples include compact digital cameras, Digital Single Lens Reflex (DSLR) cameras, mirrorless cameras, infrared (IR) cameras, action cameras, and/or 360 cameras, among other types of cameras.
- Digital images may be periodically transmitted to computing device 102. In some examples, camera 110 may transmit digital images to computing device 102 based on a predetermined time period. For example, camera 110 can transmit a digital image to computing device 102 every fifteen minutes, every ten minutes, and/or at any other time period.
- In some examples, camera 110 may transmit digital images to computing device 102 in response to a subject's change in position. For example, in response to a subject changing position, camera 110 can take and transmit a digital image to computing device 102. A change in position can, for instance, refer to a change in the physical position of the subject. For example, the subject may move their arm, take a step to a different physical location than where the subject was previously standing, or move their torso, among other types of changes in position of a subject.
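The change-in-position trigger described above can be sketched with a simple frame-difference test. The toy "frames" below are flat lists of grayscale pixel values and the threshold is an arbitrary assumption; this is an illustration of the idea, not the actual mechanism of camera 110.

```python
# Illustrative sketch: transmit a new digital image only when consecutive
# frames differ enough to suggest the subject changed position.
# Frames here are toy grayscale rasters (lists of ints in 0-255).

def frames_differ(prev, curr, threshold=10):
    """True when the mean absolute pixel difference exceeds the threshold."""
    diff = sum(abs(p - c) for p, c in zip(prev, curr)) / len(curr)
    return diff > threshold

frame_a = [10, 10, 10, 10]
frame_b = [10, 12, 10, 10]   # subject essentially still
frame_c = [90, 80, 10, 10]   # subject changed position

# frames_differ(frame_a, frame_b) -> False: no transmission needed
# frames_differ(frame_a, frame_c) -> True: transmit to the computing device
```

A real camera pipeline would compute this over full image arrays; the per-pixel mean difference is just one plausible movement heuristic.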
- In some examples, camera 110 may transmit digital images to computing device 102 in response to an action by a subject causing the camera 110 to transmit digital images to computing device 102. For instance, camera 110 may take a digital image upon a new client's arrival and transmit the digital image to computing device 102. In some examples, the action by the subject can include picking up a predetermined product, standing in or entering a predetermined area, etc.
- Computing device 102 can analyze the digital image received from camera 110 to detect facial features of a subject. As used herein, the term “facial feature” can, for example, refer to a distinguishing element of a face. For example, an element of the subject's face may be an ear, nose, mouth, hair, jaw, and/or cheekbones of the subject.
- Facial features can be detected by computing device 102 via object detection. Object detection can, for example, refer to detecting instances of semantic objects in digital images. Semantic objects can include facial features. For example, computing device 102 can utilize object detection to detect facial elements such as a subject's ear, nose, eye, mouth, hair, jaw, and/or cheekbones, among other facial features and/or combinations thereof.
- Computing device 102 can receive the digital image of the subject from camera 110 and analyze the detected facial features of the subject. Analyzing the detected facial features can include analyzing an element of the subject's face. For example, computing device 102 can analyze an ear, nose, eye, mouth, hair, jaw, cheekbones, and/or combinations thereof of the subject's face.
- Analyzing an element of the subject's face can include determining various characteristics of the element. For example, characteristics of an element of the subject's face can include a shape of the element, a size of the element, a color of the element, and distinguishing features of the element, among other types of characteristics. For example, computing device 102 may analyze an element of the subject's face such as the subject's eye. Analyzing the subject's eye may include determining a shape of the eye, a size of the eye, a color of the eye, etc.
- Computing device 102 can analyze the detected facial features to determine an identity of a subject. In some examples, computing device 102 can identify a subject as an existing subject or as a new subject, as is further described herein.
- In some examples, computing device 102 can identify the subject as an existing subject. For example, computing device 102 may receive a digital image from camera 110. Computing device 102 may then analyze the detected facial features of the subject based on an element of the subject's face. For example, computing device 102 may analyze a subject's nose, mouth, and jaw. Based on the analysis, computing device 102 may identify the subject as an existing subject. For instance, if facial features included in the image received from camera 110 match the facial features of an existing image included in database 108, computing device 102 can identify the subject as an existing subject.
- In some examples, computing device 102 can identify the subject as a new subject. For example, computing device 102 may receive a digital image from camera 110. Computing device 102 may then analyze the detected facial features of the subject based on an element of the subject's face. For example, computing device 102 may analyze a subject's nose, mouth, and jaw. Based on the analysis, computing device 102 may not identify the subject as an existing subject. For instance, if facial features included in the image received from camera 110 do not match the facial features of an existing image included in database 108, computing device 102 can identify the subject as a new subject.
- Although computing device 102 is described above as utilizing a subject's nose, mouth, and jaw, examples of the disclosure are not so limited. For example, computing device 102 can utilize a subject's ear(s), nose, mouth, hair, jaw, and/or cheekbones, and/or any other facial element and/or combination thereof to determine the identity of a subject.
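The existing-versus-new determination described above amounts to a nearest-match lookup against the database. A minimal sketch follows, assuming the detected facial features have already been reduced to numeric vectors (e.g., measurements of the nose, mouth, and jaw); the vector values, the Euclidean distance metric, and the tolerance are illustrative assumptions, not part of the disclosure.

```python
# Illustrative sketch: identify a subject by comparing a detected feature
# vector against feature vectors stored in a database (here, a plain dict).
import math

feature_database = {
    "existing-subject-42": [0.71, 0.33, 0.58],  # hypothetical stored features
}

def identify(detected, db, tolerance=0.1):
    """Return the matching subject id (existing subject), or None (new subject)."""
    for subject_id, stored in db.items():
        if math.dist(detected, stored) <= tolerance:
            return subject_id
    return None

# identify([0.70, 0.34, 0.57], feature_database) -> "existing-subject-42"
# identify([0.20, 0.90, 0.10], feature_database) -> None, i.e. a new subject
```

A new subject's features would then be enrolled in the database so later images can match them as an existing subject.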
- Computing device 102 can determine a sentiment level of a subject using a sentiment analysis. For example, computing device 102 can determine the sentiment level using the detected facial features and the identity of a subject. Computing device 102 can perform the sentiment analysis via machine learning. For example, computing device 102 can utilize decision tree learning, artificial neural networks, deep learning, inductive logic programming, support vector machines, Bayesian networks, and/or learning classifier systems to perform the sentiment analysis, among other types of machine learning techniques.
- In some examples, computing device 102 can determine the subject's sentiment level based on a facial expression of the subject. For example, computing device 102 may determine a subject's sentiment level as a happy sentiment level, a frustrated sentiment level, an upset sentiment level, and/or a satisfied sentiment level, among other types of sentiment levels. For instance, computing device 102 can determine the subject's sentiment level as happy based on the mouth of the subject being oriented in a smile. In some examples, computing device 102 can determine the subject's sentiment level as upset based on the subject's eyebrows being turned down.
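The smile and eyebrow examples above can be expressed as a toy rule table. A production system would more likely use one of the machine-learning techniques listed earlier; the cue names, rule set, and fallback value here are illustrative assumptions.

```python
# Illustrative sketch: map detected facial cues to a sentiment level,
# following the smile -> happy and eyebrows-down -> upset examples above.

def sentiment_from_expression(cues):
    """cues: dict of detected facial cue observations; returns a sentiment level."""
    if cues.get("mouth") == "smile":
        return "happy"
    if cues.get("eyebrows") == "turned_down":
        return "upset"
    return "neutral"  # fallback when no rule matches (an assumption)

# sentiment_from_expression({"mouth": "smile"}) -> "happy"
# sentiment_from_expression({"eyebrows": "turned_down"}) -> "upset"
```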
- In some examples, computing device 102 can determine a subject's sentiment level based on an identity of the subject. For example, if computing device 102 identifies a subject as an existing subject, computing device 102 can determine the sentiment level to be the previous sentiment level of the existing subject. Further, based on the analysis of the subject, computing device 102 can update the subject's sentiment level by comparing the subject's sentiment level with facial features and related sentiment levels of subjects received from database 108.
- Computing device 102 can determine customer satisfaction based on the determined sentiment level of the subject. As used herein, the term “customer satisfaction” can, for example, refer to a measure of how a product or service meets a customer expectation. For example, computing device 102 can determine the customer satisfaction utilizing the facial features analysis, as is further described herein.
- In some examples, computing device 102 may determine a customer satisfaction level as dissatisfied based on the determined sentiment level. For example, computing device 102 can identify a subject as a new customer and determine the sentiment level of the subject based on facial features analysis. For example, the sentiment level may be determined by computing device 102 as frustrated. Based on the determination of the subject's sentiment level as frustrated, computing device 102 may determine the subject has a dissatisfied customer satisfaction level.
- In some examples, computing device 102 may identify a subject as an existing customer based on facial features analysis. Computing device 102 may then determine the subject's sentiment level as a happy sentiment level by comparing the subject's facial features with sentiment levels stored in and received from database 108. Based on the determination of the subject's sentiment level as a happy sentiment level, computing device 102 may determine the customer is satisfied.
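The two scenarios above (frustrated leading to dissatisfied, happy leading to satisfied) suggest a simple mapping from sentiment level to customer satisfaction. The table below is an illustrative assumption extrapolated from those two examples, not a mapping given in the disclosure.

```python
# Illustrative sketch: derive customer satisfaction from a sentiment level.
# The mapping table is an assumption extrapolated from the examples above.

SATISFACTION = {
    "happy": "satisfied",
    "satisfied": "satisfied",
    "frustrated": "dissatisfied",
    "upset": "dissatisfied",
}

def customer_satisfaction(sentiment_level):
    """Map a determined sentiment level to a customer satisfaction value."""
    return SATISFACTION.get(sentiment_level, "unknown")

# customer_satisfaction("frustrated") -> "dissatisfied"
# customer_satisfaction("happy") -> "satisfied"
```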
- Sentiment level information stored in database 108 can include existing subjects' information. Sentiment level information stored in database 108 can also include information from other subjects, collected in various places and at various points in time.
- Computing device 102 can display the sentiment level of the subject via a display. As used herein, the term “display” can, for example, refer to an output device which can display information via a screen. A display may include a television, computer monitor, mobile device display, other type of display device, or any combination thereof, which can receive and output a video signal. The display can be a liquid crystal display (LCD), LED display, organic light-emitting diode (OLED) display, polymer light-emitting diode (PLED) display, micro-LED display, electronic paper display (EPD), bi-stable display, and/or quantum-dot LED (QLED) display, among other types of displays.
- In some examples, computing device 102 may determine a subject's sentiment level and display the sentiment level via a display. In one example, an existing subject's sentiment level may be determined as “dissatisfied”. The dissatisfied sentiment level may be displayed on a display so that an employee, supervisor, and/or other user may view the determined sentiment level. Based on the determined sentiment level, appropriate action can be taken. In some examples, further employee training can be performed to improve customer sentiment levels and customer satisfaction. In some examples, the subject may be given coupons or other discounts in order to improve customer sentiment levels. In some examples, appropriate personnel may be notified based on the subject's sentiment level.
- Computing device 102 can generate a report including the determined sentiment level and/or the past sentiment level of the subject. As used herein, the term “report” can, for example, refer to an account or statement describing an event. For example, the report generated by computing device 102 can include: the sentiment level of the subject, including, for instance, whether the subject has a happy, frustrated, upset, and/or dissatisfied sentiment level, among other types of sentiment levels; whether the subject is a new or existing subject; and the customer satisfaction of the subject, including, for instance, whether the subject is satisfied or dissatisfied, among other types of satisfaction levels.
- The report can include information to allow personnel, such as a supervisor and/or employee, to determine whether to take action to improve the subject's experience by improving their sentiment level and/or customer satisfaction, to give an employee further training, etc. In some examples, the report can be displayed via a display. In some examples, the report can be printed by an imaging device, such as a printer, such that the report can be physically distributed among personnel.
- FIG. 2 is a block diagram 220 of an example computing device 202 for sentiment analysis consistent with the disclosure. As described herein, the computing device 202 may perform a number of functions related to sentiment analysis. Although not illustrated in FIG. 2, the computing device 202 may include a processor and a machine-readable storage medium. Although the following descriptions refer to a single processor and a single machine-readable storage medium, the descriptions may also apply to a system with multiple processors and multiple machine-readable storage mediums. In such examples, the instructions of the computing device 202 may be distributed across multiple machine-readable storage mediums and across multiple processors. Put another way, the instructions executed by the computing device 202 may be stored across multiple machine-readable storage mediums and executed across multiple processors, such as in a distributed or virtual computing environment.
- Processing resource 204 may be a central processing unit (CPU), a semiconductor based microprocessor, and/or other hardware devices suitable for retrieval and execution of machine-readable instructions 201, 203, 205, 207 stored in memory resource 206. Processing resource 204 may fetch, decode, and execute instructions 201, 203, 205, 207. As an alternative or in addition to retrieving and executing instructions 201, 203, 205, 207, processing resource 204 may include a plurality of electronic circuits that include electronic components for performing the functionality of instructions 201, 203, 205, 207.
- Memory resource 206 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions 201, 203, 205, 207 and/or data. Thus, memory resource 206 may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like. Memory resource 206 may be disposed within computing device 202, as shown in FIG. 2. Additionally, and/or alternatively, memory resource 206 may be a portable, external or remote storage medium, for example, that allows computing device 202 to download the instructions 201, 203, 205, 207 from a portable/external/remote storage medium.
- Computing device 202 may include instructions 201 stored in the memory resource 206 and executable by the processing resource 204 to receive a digital image of a subject. For example, computing device 202 may execute instructions 201 via the processing resource 204 to receive, from a camera, a digital image of a subject.
- For example, digital images taken by a camera may be periodically transmitted to computing device 202. In one example, a camera may transmit digital images to computing device 202 periodically, such as every fifteen minutes, and/or in response to a subject's change in position.
- In one example, a camera may transmit digital images to computing device 202 in response to a subject's triggered action. For instance, an employee may trigger a camera to take a digital image upon a new subject's arrival and transmit the digital image to computing device 202.
- Computing device 202 may include instructions 203 stored in the memory resource 206 and executable by the processing resource 204 to analyze the digital image to detect facial features. For example, computing device 202 may execute instructions 203 via the processing resource 204 to analyze the digital image to detect facial features of the subject. Facial elements of the subject can, for example, include an ear, nose, mouth, eye, hair, jaw, and/or cheekbones of the subject, among other facial elements and/or combinations thereof.
- Computing device 202 may include instructions 205 stored in the memory resource 206 and executable by the processing resource 204 to determine a sentiment level of the subject using a sentiment analysis. For example, computing device 202 may execute instructions 205 via the processing resource 204 to determine a sentiment level of the subject using a sentiment analysis, where the sentiment analysis uses the detected facial features and the identity of the subject to determine the sentiment level of the subject.
- In one example, computing device 202 can determine the identity of a subject as an existing subject. In some examples, computing device 202 can determine the identity of the subject as a new subject.
- In some examples, computing device 202 can analyze the detected facial features of a subject based on an element of the subject's face. For example, the computing device 202 can analyze a subject's nose, mouth, and jaw. Computing device 202 can compare the facial features from the digital image with facial features of existing images in a database. In response to the facial features from the digital image matching facial features of an existing image in the database, computing device 202 can identify the subject as an existing subject.
- In some examples, computing device 202 can analyze the detected facial features of a subject based on an element of the subject's face. For example, the computing device 202 can analyze a subject's nose, mouth, and jaw. Computing device 202 can compare the facial features from the digital image with facial features of existing images in a database. In response to the facial features from the digital image not matching facial features of an existing image in the database, computing device 202 can identify the subject as a new subject.
- Computing device 202 may include instructions 207 stored in the memory resource 206 and executable by the processing resource 204 to display the sentiment level. For example, computing device 202 may execute instructions 207 via the processing resource 204 to display the sentiment level via a display.
- FIG. 3 is a block diagram of an example of a system 322 consistent with the disclosure. In the example of FIG. 3, system 322 includes a processor 304 and a machine-readable storage medium 312. Although the following descriptions refer to an individual processing resource and an individual machine-readable storage medium, the descriptions may also apply to a system with multiple processing resources and multiple machine-readable storage mediums. In such examples, the instructions may be distributed across multiple machine-readable storage mediums and across multiple processing resources. Put another way, the instructions may be stored across multiple machine-readable storage mediums and executed across multiple processing resources, such as in a distributed computing environment.
- Processor 304 may be a central processing unit (CPU), microprocessor, and/or other hardware device suitable for retrieval and execution of instructions stored in machine-readable storage medium 312. In the particular example shown in FIG. 3, processor 304 may receive, determine, and send instructions 309, 311, 313, 315, 317. As an alternative or in addition to retrieving and executing instructions, processor 304 may include an electronic circuit comprising a number of electronic components for performing the operations of the instructions in machine-readable storage medium 312. With respect to the executable instruction representations or boxes described and shown herein, it should be understood that part or all of the executable instructions and/or electronic circuits included within one box may be included in a different box shown in the figures or in a different box not shown.
- Machine-readable storage medium 312 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions. Thus, machine-readable storage medium 312 may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like. The executable instructions may be “installed” on the system 322 illustrated in FIG. 3. Machine-readable storage medium 312 may be a portable, external or remote storage medium, for example, that allows the system 322 to download the instructions from the portable/external/remote storage medium. In this situation, the executable instructions may be part of an “installation package”. As described herein, machine-readable storage medium 312 may be encoded with executable instructions for sentiment analysis.
- Instructions 309 to receive a digital image, when executed by processor 304, may cause system 322 to receive a digital image of a subject. For example, a computing device including processor 304 and machine-readable storage medium 312 can receive a digital image of a subject from a camera.
- Instructions 311 to analyze the digital image to detect facial features of the subject, when executed by processor 304, may cause system 322 to analyze the digital image to detect facial features of the subject. Facial features of the subject can include an element of the subject's face. For example, elements of a subject's face may include an ear, nose, mouth, hair, jaw, and/or cheekbones of the subject, among other facial elements and/or combinations thereof.
- Instructions 312 to analyze the detected facial features to determine an identity of the subject, when executed by processor 304, may cause system 322 to analyze the detected facial features to determine an identity of the subject by comparing the detected facial features to facial features included in a database. In some examples, the computing device can determine the subject to be an existing subject in response to the detected facial features matching the facial features included in the database. In some examples, the computing device can determine the subject to be a new subject in response to the detected facial features not matching the facial features included in the database.
Instructions 313 to determine a sentiment level of the subject using a sentiment analysis, when executed by processor 304, may cause system 322 to determine a sentiment level of the subject using a sentiment analysis, where the sentiment analysis uses the detected facial features and an identity of the subject to determine the sentiment level of the subject.
Instructions 315 to compare the sentiment level of the subject with a past sentiment level of the subject, when executed by processor 304, may cause system 322 to compare the sentiment level of the subject with a past sentiment level of the subject in response to the subject being an existing subject.
Instructions 317 to generate an alert in response to the determined sentiment level having changed, when executed by processor 304, may cause system 322 to generate an alert in response to the determined sentiment level having changed from the past sentiment level. For example, an alert may be generated such that an employee can be notified that a sentiment level of the subject has changed. In some examples, the employee can, in response to the sentiment level having changed, approach the subject differently, offer the subject coupons, and/or take other actions.
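The compare-and-alert flow for an existing subject might look like the following sketch. The history store and the alert message format are assumptions; the disclosure only requires detecting that the sentiment level has changed from a past level and generating an alert.

```python
def compare_and_alert(subject_id, current_level, history):
    """Compare the current sentiment level with the stored past level and
    return an alert message when it has changed (None otherwise)."""
    past = history.get(subject_id)
    history[subject_id] = current_level  # record the latest observation
    if past is not None and current_level != past:
        # e.g. notify an employee that the subject's sentiment changed
        return f"Sentiment of {subject_id} changed: {past} -> {current_level}"
    return None
```

A first observation of a subject merely seeds the history; a later observation with a different level produces the alert, and an unchanged level produces none.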
FIG. 4 is a block diagram of an example computing device 402 to perform sentiment analysis consistent with the disclosure. As described herein, the computing device 402 may perform a number of functions related to sentiment analysis.
Processing resource 404 may be a central processing unit (CPU), a semiconductor based microprocessor, and/or other hardware devices suitable for retrieval and execution of machine-readable instructions 419, 421, 423, 425, 427, 429 stored in memory resource 406. Processing resource 404 may fetch, decode, and execute instructions 419, 421, 423, 425, 427, 429. As an alternative or in addition to retrieving and executing instructions 419, 421, 423, 425, 427, 429, processing resource 404 may include a plurality of electronic circuits that include electronic components for performing the functionality of instructions 419, 421, 423, 425, 427, 429.
Memory resource 406 may be any electronic, magnetic, optical, or other physical storage device that stores executable instructions 419, 421, 423, 425, 427, 429 and/or data. Thus, memory resource 406 may be, for example, Random Access Memory (RAM), an Electrically-Erasable Programmable Read-Only Memory (EEPROM), a storage drive, an optical disc, and the like. Memory resource 406 may be disposed within computing device 402, as shown in FIG. 4. Additionally, and/or alternatively, memory resource 406 may be a portable, external or remote storage medium, for example, that allows computing device 402 to download the instructions 419, 421, 423, 425, 427, 429 from a portable/external/remote storage medium.
Computing device 402 may include instructions 419 stored in the memory resource 406 and executable by the processing resource 404 to receive a digital image of a subject. For example, computing device 402 may execute instructions 419 via the processing resource 404 to receive, from a camera, a digital image of a subject.
Computing device 402 may include instructions 421 stored in the memory resource 406 and executable by the processing resource 404 to analyze the digital image to detect facial features. For example, computing device 402 may execute instructions 421 via the processing resource 404 to analyze the digital image to detect facial features of the subject. Facial features of the subject can, for example, include an ear, nose, mouth, eye, hair, jaw, and/or cheekbones of the subject, among other facial elements and/or combinations thereof.
Computing device 402 may include instructions 422 stored in the memory resource 406 and executable by the processing resource 404 to analyze the detected facial features to determine an identity of the subject. For example, computing device 402 may execute instructions 422 via the processing resource 404 to analyze the detected facial features to determine an identity of the subject by comparing the detected facial features to facial features included in a database. The subject can be determined to be an existing subject in response to the detected facial features matching the facial features included in the database. The subject can be determined to be a new subject in response to the detected facial features not matching the facial features included in the database.
Computing device 402 may include instructions 423 stored in the memory resource 406 and executable by the processing resource 404 to determine a sentiment level of the subject using a sentiment analysis. For example, computing device 402 may execute instructions 423 via the processing resource 404 to determine a sentiment level of the subject using a sentiment analysis, where the sentiment analysis uses the detected facial features and the identity of the subject to determine the sentiment level of the subject.
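As a sketch of how detected facial features and the subject's identity could combine into a sentiment level: the disclosure leaves the scoring model unspecified, so the measurement names, weights, 1-5 output scale, and per-identity baseline below are invented purely for illustration.

```python
def sentiment_level(features, baseline=0.0):
    """Map hypothetical facial measurements (each in [-1, 1]) to a 1-5
    sentiment level, adjusted by an identity-specific baseline
    (e.g. a known subject's neutral expression)."""
    # Weighted combination of invented measurements; a real system would
    # use a trained model over many facial landmarks.
    score = 0.6 * features["mouth_curvature"] + 0.4 * features["brow_raise"]
    score = max(-1.0, min(1.0, score + baseline))  # clamp to [-1, 1]
    return round((score + 1.0) * 2.0) + 1          # map [-1, 1] -> 1..5
```

Under this mapping a strongly positive expression scores 5, a neutral one 3, and a strongly negative one 1; the identity-derived baseline lets the same raw expression be calibrated per subject.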
Computing device 402 may include instructions 425 stored in the memory resource 406 and executable by the processing resource 404 to compare the sentiment level of the subject with a past sentiment level. For example, computing device 402 may execute instructions 425 via the processing resource 404 to compare the sentiment level of the subject with a past sentiment level of the subject in response to the subject being an existing subject.
Computing device 402 may include instructions 427 stored in the memory resource 406 and executable by the processing resource 404 to analyze the sentiment level to determine customer satisfaction. For example, the customer may be satisfied or dissatisfied.
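A minimal sketch of the satisfaction determination, assuming a 1-5 sentiment scale with a cutoff at 3: the disclosure only states the two outcomes (satisfied or dissatisfied), so the threshold here is an assumption.

```python
def customer_satisfaction(sentiment_level, threshold=3):
    """Classify a numeric sentiment level as 'satisfied' or 'dissatisfied'.
    The threshold value is chosen for illustration only."""
    return "satisfied" if sentiment_level >= threshold else "dissatisfied"
```

The resulting label, together with the sentiment level itself, is what instructions 429 would then render on display 414.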
Computing device 402 may include instructions 429 stored in the memory resource 406 and executable by the processing resource 404 to display the sentiment level and the customer satisfaction. For example, computing device 402 may execute instructions 429 via the processing resource 404 to display the sentiment level and the customer satisfaction of the subject via display 414.
- Display 414 can be, for instance, a touch-screen display. As previously described in connection with FIG. 1, the display may include a television, computer monitor, mobile device display, other type of display device, or any combination thereof, which can receive and output a video signal.
- The figures herein follow a numbering convention in which the first digit corresponds to the drawing figure number and the remaining digits identify an element or component in the drawing. Similar elements or components between different figures may be identified by the use of similar digits. For example, 102 may reference element "02" in FIG. 1, and a similar element may be referenced as 202 in FIG. 2. Elements shown in the various figures herein can be added, exchanged, and/or eliminated so as to provide a plurality of additional examples of the disclosure. In addition, the proportion and the relative scale of the elements provided in the figures are intended to illustrate the examples of the disclosure and should not be taken in a limiting sense. Further, as used herein, "a plurality of" an element and/or feature can refer to more than one of such elements and/or features.
Claims (15)
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/US2018/021513 WO2019172910A1 (en) | 2018-03-08 | 2018-03-08 | Sentiment analysis |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| US20210004573A1 true US20210004573A1 (en) | 2021-01-07 |
Family
ID=67847366
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/763,494 Abandoned US20210004573A1 (en) | 2018-03-08 | 2018-03-08 | Sentiment analysis |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US20210004573A1 (en) |
| WO (1) | WO2019172910A1 (en) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20210383103A1 (en) * | 2019-09-19 | 2021-12-09 | Arctan Analytics Pte. Ltd. | System and method for assessing customer satisfaction from a physical gesture of a customer |
| JP6815667B1 (en) * | 2019-11-15 | 2021-01-20 | 株式会社Patic Trust | Information processing equipment, information processing methods, programs and camera systems |
| IT202300003027A1 (en) | 2023-02-22 | 2024-08-22 | Mayak Games And Solutions Oue | SYSTEM OF SUPPORT AND EXECUTION OF INVESTMENTS IN FINANCIAL INSTRUMENTS AND SERVICES, PRODUCTS AND INVESTMENTS OF ANY TYPE OF MOVABLE, REAL ESTATE, MONETARY AND CRYPTOCURRENCY-RELATED ASSETS |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2005034025A1 (en) * | 2003-10-08 | 2005-04-14 | Xid Technologies Pte Ltd | Individual identity authentication systems |
| US8235725B1 (en) * | 2005-02-20 | 2012-08-07 | Sensory Logic, Inc. | Computerized method of assessing consumer reaction to a business stimulus employing facial coding |
| US9648061B2 (en) * | 2014-08-08 | 2017-05-09 | International Business Machines Corporation | Sentiment analysis in a video conference |
2018
- 2018-03-08 US US16/763,494 patent/US20210004573A1/en not_active Abandoned
- 2018-03-08 WO PCT/US2018/021513 patent/WO2019172910A1/en not_active Ceased
Also Published As
| Publication number | Publication date |
|---|---|
| WO2019172910A1 (en) | 2019-09-12 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GUPTA, MOHIT;PETTIT, LUCAS;KRUGER, CHRIS;SIGNING DATES FROM 20180227 TO 20180301;REEL/FRAME:052673/0332 |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |