WO2019039659A1 - Procédé de gestion d'utilisateur basé sur les émotions et appareils le mettant en oeuvre


Info

Publication number
WO2019039659A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
state
emotion
image
emotional
Prior art date
Application number
PCT/KR2017/013278
Other languages
English (en)
Korean (ko)
Inventor
김은이
Original Assignee
건국대학교 산학협력단
Priority date
Filing date
Publication date
Application filed by 건국대학교 산학협력단 filed Critical 건국대학교 산학협력단
Publication of WO2019039659A1 publication Critical patent/WO2019039659A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10 Services
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06Q INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00 Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/01 Social networking

Definitions

  • the following embodiments relate to a method of managing a user based on emotion and apparatuses for performing the same.
  • the emotion of the user is represented not only by the voice but also through various contents.
  • the user's emotional information can be obtained in various ways.
  • the user emotion information varies depending on various applications.
  • Emotion recognition technology can extract basic human emotions from the complex signals obtained from people and the environment in which they live, and can give systems the ability to respond intelligently to human emotions.
  • emotion recognition technology has been utilized in the field of human interface technology.
  • Embodiments can provide a content analysis technique with improved performance through emotion estimation not only for images but also for the parts of speech, sentence structure, and morphology of text and for emoticons.
  • embodiments may provide a technique that allows a user to monitor his or her own status by providing the user with information about that status.
  • Embodiments can also provide techniques for diagnosing and managing depression by providing customized user management for the user who created the content.
  • the emotion-based user management method includes estimating the emotional state of the user based on learning data for images tagged with emotion values, a sentiment lexicon, and content generated by the user, and managing the user by determining the state of the user based on the emotional state.
  • the content may include at least one of the image, text, and emoticon of the user.
  • the step of estimating the first emotion state may include generating the learning data based on emotion values for arbitrary images, and estimating the first emotion state by analyzing the correlation between the emotion features of the image and the learning data using either deep learning or probabilistic latent semantic analysis (pLSA) learning.
  • the generating step may include acquiring the arbitrary image, determining the win or loss of the arbitrary image with respect to a specific emotion based on responses of arbitrary users, determining an emotion value for the arbitrary image based on the determination result, and generating the learning data by normalizing the emotion value and tagging the normalized emotion value to the arbitrary image.
  • the step of estimating the second emotional state may include generating the emotion dictionary based on the emotion features of the parts of speech, sentence structure, and morphology of arbitrary text and of arbitrary emoticons, and estimating the second emotional state by analyzing the emotion features of the user's text and emoticons using the emotion dictionary.
  • the managing may include determining the state of the user based on the integrated emotional state and the social networking service (SNS) activity propensity of the user, classifying the user into a group corresponding to the state of the user, and managing the user by providing information about the user.
  • the determining may include determining a first state of the user based on the integrated emotional state, determining a second state of the user based on the SNS activity propensity, and determining the state of the user by integrating the first state and the second state.
  • determining the first state may include monitoring the integrated emotional state and determining the first state of the user by calculating a statistical value for the integrated emotional state based on the monitoring result.
  • the group may be any one of a normal group and a depressed group.
  • the information may include at least one of a graph of the emotional state of the user and a feature value of the group.
  • the feature value of the group may include at least one of a word, a sentence, and a color of an image frequently used in the group.
  • the emotion-based user management apparatus includes a transceiver that receives content generated by a user, and a controller that estimates the emotional state of the user based on learning data for images tagged with emotion values, a sentiment lexicon, and the content, and that manages the user by determining the state of the user based on the emotional state.
  • the content may include at least one of the image, text, and emoticon of the user.
  • the controller may include an estimator that estimates a first emotion state corresponding to the image using the learning data, estimates a second emotion state corresponding to the text and the emoticon using the emotion dictionary, and generates an integrated emotion state by integrating the first emotion state and the second emotion state.
  • the estimator may generate the learning data based on emotion values for arbitrary images and estimate the first emotion state by analyzing the correlation between the emotion features of the image and the learning data using either deep learning or probabilistic latent semantic analysis learning.
  • the estimator may acquire the arbitrary image, determine the win or loss of the arbitrary image with respect to a specific emotion based on responses of arbitrary users, determine an emotion value for the arbitrary image based on the determination result, and generate the learning data by normalizing the emotion value and tagging the normalized emotion value to the arbitrary image.
  • the estimator may generate the emotion dictionary based on the emotion features of the parts of speech, sentence structure, and morphology of arbitrary text and of arbitrary emoticons, and estimate the second emotional state by analyzing the emotion features of the user's text and emoticons using the emotion dictionary.
  • the controller may further include a determination unit that determines the state of the user based on the integrated emotion state and the social networking service activity propensity of the user, and a management unit that classifies the user into a group corresponding to the state of the user and manages the user by providing information about the user.
  • the determination unit may determine the first state of the user based on the integrated emotion state, determine the second state of the user based on the SNS activity propensity, and determine the state of the user by integrating the first state and the second state.
  • the determining unit may determine the first state of the user by monitoring the integrated state of emotion and calculating a statistical value for the integrated state of emotion based on the monitoring result.
  • the group may be any one of a normal group and a depressed group.
  • the information may include at least one of a graph of the emotional state of the user and a feature value of the group.
  • the feature value of the group may include at least one of a word, a sentence, and a color of an image frequently used in the group.
  • FIG. 1 shows a schematic block diagram of a user management system according to one embodiment.
  • Fig. 2 shows a schematic block diagram of the user management apparatus shown in Fig. 1.
  • Fig. 3 shows an example for explaining learning data.
  • Fig. 4 shows an example for explaining the correlation of learning data.
  • Fig. 5 shows an example for explaining the integrated emotion state.
  • FIGS. 6A and 6B show an example for explaining a graph of the emotional state of the user.
  • FIG. 7 shows a flowchart for explaining the operation of the user management apparatus shown in FIG. 1.
  • first, second, or the like may be used to describe various elements, but the elements should not be limited by the terms.
  • the terms are used only for the purpose of distinguishing one element from another; for example, without departing from the scope of rights according to the concept of the present invention, a first element may be referred to as a second element, and similarly, a second element may also be referred to as a first element.
  • FIG. 1 shows a schematic block diagram of a user management system according to one embodiment.
  • a user management system 10 includes a user device 100, a user management apparatus 200, and a content repository 300.
  • the user device 100 may be implemented as an electronic device.
  • the electronic device may be implemented as a personal computer (PC), a data server, or a portable electronic device.
  • the portable electronic device may be a laptop computer, a mobile phone, a smart phone, a tablet PC, a mobile internet device (MID), a personal digital assistant (PDA), an enterprise digital assistant (EDA), a digital still camera, a digital video camera, a portable multimedia player (PMP), a personal navigation device or portable navigation device (PND), a handheld game console, an e-book, or a smart device.
  • the smart device can be implemented as a smart watch or a smart band.
  • the user device 100 may communicate with at least one of the user management device 200 and the content repository 300.
  • the user device 100 may be connected to the user management device 200 and the content repository 300 through various communication networks such as an Internet communication network, an intranet, a local area network (LAN), a wireless LAN, Wi-Fi, LF, XBee, ZigBee, and Bluetooth.
  • the user device 100 may transmit the content created by the user to at least one of the user management apparatus 200 and the content repository 300 based on the user response.
  • the user device 100 may be provided with an application program (APP) for registering and managing content created by a user, and may be linked to the user management apparatus 200 and the content repository 300.
  • the application program (APP) may be a program associated with a social networking service (SNS).
  • the user device 100 can receive and visualize information on the status of the user from either the user management device 200 or the content storage 300.
  • the user device 100 may visualize information about the user transmitted by the user management device 200 to allow the user to monitor the status of the user through an application program (APP).
  • the user management apparatus 200 can provide a customized user management according to the state of the user by estimating (or recognizing) the emotional state of the user through the content created by the user and determining the state of the user.
  • the user management apparatus 200 may be a user management apparatus based on emotion.
  • the user management apparatus 200 may continuously monitor the user's image, text, and emoticons to determine the user's state by estimating (or recognizing) the emotional state of the user with respect to the user-generated content. At this time, the user management apparatus 200 can estimate the emotional state of the user through the characteristics of the emotion of the user's image, the parts of the text of the user, the structure of the sentence, and the emotional characteristics of the form.
  • the user management apparatus 200 can provide customized user management for the user who generated the content by classifying the user according to the status of the user and providing information on the status of the user.
  • the user management apparatus 200 can provide a content analysis technique with improved performance through emotion estimation not only for images but also for the parts of speech, sentence structure, and morphology of text and for emoticons.
  • the improved content analysis technique can be used to investigate preferences for specific politicians, opinions on specific issues, and reviews of and preferences for specific products.
  • the user management apparatus 200 may provide the user with information on the status of the user, thereby enabling the user to monitor the status of the user.
  • the user management apparatus 200 can diagnose and manage depression by providing customized user management to the user who created the content.
  • the content store 300 may store various contents.
  • the content repository 300 may store content created by the user, content created by any user, and information transmitted by the user management device 200.
  • the content storage 300 may include a database 310, a social network service (SNS) server 330, and the Internet 350.
  • the content storage 300 includes the database 310, the SNS server 330, and the Internet 350, but is not limited thereto.
  • the content repository 300 may include various repositories that store various content that the user management device 200 acquires.
  • FIG. 2 shows a schematic block diagram of the user management apparatus shown in FIG. 1.
  • FIG. 3 shows an example for explaining learning data.
  • FIG. 4 shows an example for explaining the correlation of learning data.
  • FIG. 5 shows an example for explaining the integrated emotion state.
  • FIGS. 6A and 6B show an example for explaining a graph of the emotional state of the user.
  • the user management apparatus 200 includes a transceiver 210 and a controller 230.
  • the transceiver 210 may receive the signal (or data) transmitted from the user device 100 and the content storage 300 and may transmit the received signal (or data) to the controller 230.
  • the signal may be user generated content, and any user generated content.
  • the user-generated content may be at least one of a user's image, text, and emoticon.
  • the content created by an arbitrary user may be at least one of video, text, and emoticons of an arbitrary user.
  • the transceiver 210 may transmit the signal (or data) transmitted from the controller 230 to at least one of the user device 100 and the content repository 300.
  • the controller 230 can control the overall operation of the user management apparatus 200. For example, the controller 230 may control the operation of the transceiver 210.
  • the controller 230 may estimate (or recognize) the mood state of the user based on the learning data for the tagged image, the sentiment lexicon, and the content generated by the user.
  • the controller 230 can manage the user by determining the state of the user based on the emotional state.
  • the controller 230 includes an estimation unit 231, a determination unit 233, and a management unit 235.
  • the estimation unit 231 can estimate (or recognize) the first emotion state corresponding to the user's image using the learning data, estimate (or recognize) the second emotion state corresponding to the user's text and emoticons using the emotion dictionary, and integrate the first emotion state and the second emotion state to generate an integrated emotion state.
  • the estimation unit 231 can generate learning data based on emotion values for an arbitrary image.
  • An arbitrary image may be an arbitrary user's image.
  • the estimation unit 231 can acquire arbitrary images and determine the win or loss of an arbitrary image for a specific emotion based on responses of arbitrary users. At this time, the estimation unit 231 can determine the win or loss of an arbitrary image using an online rating game based on crowdsourcing and pair-wise competition.
  • the estimation unit 231 may provide arbitrary images to arbitrary users through the crowdsourcing, pair-wise-competition online rating game so that the arbitrary users can select an image.
  • the arbitrary images may be two different, randomly selected images, and the selected image may be the one with the higher relevance to the particular emotion.
  • the estimation unit 231 can repeat this comparison for arbitrary pairs of images, for example 100 times.
  • the estimation unit 231 can determine and estimate (or recognize) the win or loss of an arbitrary image based on the selection results. For example, the estimation unit 231 may count a selected image as a win. The estimation unit 231 can also infer the win or loss of an unobserved pair from the determined results; for instance, it can estimate (or recognize) the win or loss between image A and image C based on the win or loss between image A and image B and the win or loss between image B and image C. Image A, image B, and image C may be different arbitrary images.
  • the estimation unit 231 can determine the emotion value for an arbitrary image based on the determination result. For example, the estimation unit 231 may determine emotion values for arbitrary images using visual-link analysis, a three-column method, and PageRank. The emotion value may serve as a ground-truth value.
  • the estimation unit 231 may generate a visual-link graph based on the determination result using the visual-link analysis. For example, the estimation unit 231 may place an image with a large number of wins at the head of the visual-link graph and an image with a large number of losses at the tail of the visual-link graph.
  • the estimation unit 231 can determine the ranking for an arbitrary image using the visual link graph. For example, the estimator 231 may use a page rank to determine a high ranking for any image disposed at the head of the visual link graph. The estimator 231 may generate an inferred matrix as in the algorithm of FIG. 3 using a three-column method, and determine a ranking for an arbitrary image. The estimation unit 231 can determine an emotion value of an arbitrary image according to the rank of an arbitrary image.
  • the estimator 231 can generate the learning data by normalizing the emotion value for an arbitrary image and tagging the normalized emotion value to an arbitrary image.
  • the estimation unit 231 can generate learning data using the algorithm of FIG.
  • the normalized emotion value may be a value between -1 and 1. A value close to 1 corresponds to the winner image and a value close to -1 may correspond to the loser image.
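As a rough illustration of the steps above, the sketch below (not the patented implementation) builds a visual-link graph from hypothetical crowd-sourced pair-wise results, ranks the images with the networkx library's PageRank as a stand-in for the ranking step, and normalizes the ranks to the [-1, 1] range used for tagging the learning data. The image names and outcomes are made up.

```python
# A minimal sketch of tagging images with emotion values from crowd-sourced
# pair-wise comparisons: losers link to winners in a visual-link graph,
# PageRank ranks the images, and the ranks are normalized to [-1, 1].
import networkx as nx

# (loser, winner) outcomes of an online rating game for one emotion, e.g. "romantic"
pairwise_results = [
    ("img_B", "img_A"),  # img_A chosen over img_B
    ("img_C", "img_A"),
    ("img_C", "img_B"),
    ("img_D", "img_C"),
]

def tag_emotion_values(results):
    """Return {image: emotion value in [-1, 1]} from pair-wise win/loss results."""
    graph = nx.DiGraph()
    for loser, winner in results:
        # direct the link from the losing image to the winning image so that
        # frequently-winning images collect a high PageRank score
        graph.add_edge(loser, winner)
    ranks = nx.pagerank(graph)
    lo, hi = min(ranks.values()), max(ranks.values())
    # normalize: ~1 for the strongest "winner" image, ~-1 for the clearest "loser"
    return {img: 2 * (r - lo) / (hi - lo) - 1 for img, r in ranks.items()}

if __name__ == "__main__":
    for img, value in sorted(tag_emotion_values(pairwise_results).items()):
        print(f"{img}: {value:+.2f}")
```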
  • the estimation unit 231 can estimate (or recognize) the first emotion state by analyzing the correlation between the emotion features of the user's image and the learning data using either deep learning or probabilistic latent semantic analysis (pLSA) learning.
  • the estimating unit 231 can generate correlation of learning data such as GRAPH1 in FIG. 4 through an expectation-maximization (EM) algorithm.
  • GRAPH1 can be a graph showing the correlation of 1170 color combinations with particular emotions. If the correlation between a color combination and a certain emotion is high, the corresponding point in GRAPH1 may be close to white. The emotion may be at least one of pretty, colorful, dynamic, gorgeous, wild, romantic, natural, graceful, quiet, classic, dandy, jazz, cool, pure, and modern.
  • the estimator 231 may estimate (or recognize) the first emotion state by extracting visual features of the user's image and comparing the correlation of the learning data.
  • the visual feature may be at least one of color compositions, and type information based on scale invariant feature transform (SIFT).
  • the estimation unit 231 can estimate (or recognize) the first emotion state using Equation (1).
  • In Equation (1), P(I_i, e_j) is the joint probability that the j-th emotion is estimated (or recognized) for the image of the i-th user, and P(I_i) is the probability that the image of the i-th user is input.
  • P(e_j | I_i) is the conditional probability that the j-th emotion is present in the image of the i-th user, and f_k is the k-th visual feature that can be extracted from the user's image.
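Assuming Equation (1) follows the usual pLSA decomposition, i.e. P(I_i, e_j) = P(I_i) * sum_k P(e_j | f_k) * P(f_k | I_i), the sketch below illustrates how a first emotion state could be read off once the two probability tables have been learned (for example with the EM algorithm). The emotion labels and probability values are hypothetical placeholders, not values from the patent.

```python
# A minimal sketch of pLSA-style first-emotion-state estimation: the probability
# of the j-th emotion for the i-th user's image is a mixture over visual
# features f_k (color compositions, SIFT-based type information).
import numpy as np

EMOTIONS = ["romantic", "dynamic", "quiet"]

# P(f_k | image_i): distribution of K visual features for one image (sums to 1)
p_feature_given_image = np.array([0.5, 0.3, 0.2])

# P(emotion_j | f_k): learned correlation between visual features and emotions;
# rows = features, columns = emotions (each row sums to 1)
p_emotion_given_feature = np.array([
    [0.7, 0.2, 0.1],
    [0.2, 0.6, 0.2],
    [0.1, 0.1, 0.8],
])

def first_emotion_state(p_f_given_i, p_e_given_f, emotions=EMOTIONS):
    """Estimate the first emotion state of an image via the pLSA mixture."""
    p_e_given_i = p_f_given_i @ p_e_given_f   # sum_k P(e_j|f_k) P(f_k|I_i)
    return dict(zip(emotions, p_e_given_i))

if __name__ == "__main__":
    state = first_emotion_state(p_feature_given_image, p_emotion_given_feature)
    print(state)                       # per-emotion probabilities
    print(max(state, key=state.get))   # most likely first emotion state
```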
  • the estimation unit 231 can generate the emotion dictionary based on the emotion features of the parts of speech and sentence structure of arbitrary text, the emotion features of its morphology, and the emotion features of arbitrary emoticons. The arbitrary text and arbitrary emoticons may be the text and emoticons of arbitrary users.
  • the estimation unit 231 may generate the emotion dictionary using a visual sentiment ontology (VSO), SentiStrength, and Wikipedia.
  • the estimator 231 can learn the emotion dictionary using a support vector machine (SVM).
  • the learned emotion dictionary is composed of emotion vocabularies belonging to interrogatives, negations, exclamations, punctuation, adjectives, nouns, verbs, and adverbs.
  • the emotion dictionary can learn the correlation between these base emotion features and the parts of speech, sentence structure, and morphology of arbitrary text.
  • the estimation unit 231 can estimate (or recognize) the second emotional state by analyzing the emotion features of the user's text and emoticons using the emotion dictionary. For example, the estimation unit 231 can extract emotion features from the parts of speech, sentence structure, and morphology of the user's text, divided into sentences, and from the user's emoticons. The estimation unit 231 can then estimate (or recognize) the second emotion state by comparing and analyzing the extracted emotion features against the emotion dictionary.
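The sketch below illustrates the text-and-emoticon path with an SVM, as the description suggests, but on a fabricated miniature corpus that stands in for the emotion dictionary built from VSO, SentiStrength, and Wikipedia; the sentences, emoticon tokens, and labels are invented for illustration only.

```python
# A minimal sketch of estimating a second emotion state from text and emoticons
# with an SVM trained on a tiny, fabricated sentiment corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

train_texts = [
    "such a wonderful day :)",
    "feeling great and grateful",
    "nothing matters anymore :(",
    "so tired of everything",
    "had lunch, then went home",
]
train_labels = ["positive", "positive", "negative", "negative", "neutral"]

# token_pattern is widened so emoticons like ":)" survive tokenization
second_state_model = make_pipeline(
    TfidfVectorizer(token_pattern=r"[^\s]+", ngram_range=(1, 2)),
    LinearSVC(),
)
second_state_model.fit(train_texts, train_labels)

def second_emotion_state(text):
    """Estimate the second emotion state for a user's text and emoticons."""
    return second_state_model.predict([text])[0]

if __name__ == "__main__":
    print(second_emotion_state("worst week ever :("))   # likely: negative
```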
  • the estimation unit 231 may assign weights to the first emotion state and the second emotion state using a harmonic analysis technique. For example, the estimation unit 231 may assign a higher weight to the first emotion state than to the second emotion state.
  • The harmonic analysis technique may be at least one of a rule-based aggregation technique and a learning-based aggregation technique. The optimum weight values for the first emotion state and the second emotion state can be determined through experiments.
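A minimal sketch of rule-based aggregation of the two emotion states is shown below; the 0.6/0.4 weights are assumptions standing in for the experimentally determined optimum weights, with the first (image-based) state weighted more heavily as described.

```python
# A minimal sketch of combining the image-based first emotion state and the
# text/emoticon-based second emotion state into the integrated emotion state.
W_IMAGE, W_TEXT = 0.6, 0.4   # assumed weights, W_IMAGE > W_TEXT

def integrated_emotion_state(first_state, second_state):
    """Combine two {emotion: score} dicts into the integrated emotion state."""
    emotions = set(first_state) | set(second_state)
    return {
        e: W_IMAGE * first_state.get(e, 0.0) + W_TEXT * second_state.get(e, 0.0)
        for e in emotions
    }

if __name__ == "__main__":
    first = {"positive": 0.7, "negative": 0.3}    # from the image
    second = {"positive": 0.2, "negative": 0.8}   # from text and emoticons
    print(integrated_emotion_state(first, second))
    # positive: 0.6*0.7 + 0.4*0.2 = 0.5, negative: 0.6*0.3 + 0.4*0.8 = 0.5
```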
  • the determination unit 233 can determine the status of the user based on the integrated emotion state and the social networking service activity tendency of the user.
  • the determination unit 233 may determine the first state of the user based on the integrated emotion state, determine the second state of the user based on the SNS activity propensity, and determine the state of the user by integrating the first state and the second state.
  • the user's state can be any one of a negative state, a positive state, and a normal state.
  • the determination unit 233 may monitor the integrated emotion state. For example, the determination unit 233 may monitor the integrated emotional state of the user's image, text, and emoticon for a predetermined period of time. The period may be various periods such as a day, a week, and a month.
  • the determination unit 233 can determine the first state of the user by calculating a statistical value for the integrated emotion state based on the monitoring result. For example, the determination unit 233 may calculate a statistical value for the integrated emotion state using the algorithm shown in FIG.
  • the determination unit 233 may extract at least one of the user's SNS activity propensity and the user's posting habits from the information associated with the user-generated content.
  • the information may include at least one of a content volume, a content length, a content word count, a social engagement in the SNS, and a content upload time.
  • the determining unit 233 can determine the second state of the user based on any one of the extracted user's SNS activity propensity and the user's posting habits.
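The sketch below illustrates, under assumed thresholds and rules, how a first state could be computed from a statistic over the monitored integrated emotion values and how a second state could be derived from posting habits such as upload time and social engagement; none of the thresholds, rules, or data are taken from the patent.

```python
# A minimal sketch of the two determinations: a first state from a statistic
# (here, the mean) over monitored integrated emotion values, and a second state
# from posting-habit features such as upload time and social engagement.
from statistics import mean

def first_user_state(daily_emotion_values, low=-0.2, high=0.2):
    """Classify the monitored integrated emotion values into a first state."""
    avg = mean(daily_emotion_values)
    if avg < low:
        return "negative"
    if avg > high:
        return "positive"
    return "normal"

def second_user_state(posts):
    """Classify SNS activity propensity (posting habits) into a second state."""
    night_posts = sum(1 for p in posts if 0 <= p["upload_hour"] < 6)
    avg_engagement = mean(p["likes"] + p["comments"] for p in posts)
    # assumed rule: frequent late-night posting with little engagement -> negative
    if night_posts / len(posts) > 0.5 and avg_engagement < 2:
        return "negative"
    return "normal"

if __name__ == "__main__":
    week_of_emotion = [-0.4, -0.1, -0.5, -0.3, -0.2, -0.6, -0.4]   # monitored values
    posts = [
        {"upload_hour": 2, "likes": 0, "comments": 1},
        {"upload_hour": 3, "likes": 1, "comments": 0},
        {"upload_hour": 14, "likes": 5, "comments": 2},
    ]
    print(first_user_state(week_of_emotion))   # mean ~ -0.36 -> negative
    print(second_user_state(posts))            # avg engagement 3 -> normal
```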
  • the management unit 235 classifies users into groups corresponding to the status of the user based on the status of the user, and provides information on the users to manage the users.
  • the management unit 235 may classify users into groups corresponding to the status of the user using Bayesian, clustering, and a convolutional neural network (CNN).
  • the group can be either a normal group, or a depressed group.
  • the normal group may be a group of users classified as being in a normal state or a positive state.
  • the depressed group may be a group of users classified as being in a negative state.
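As one of the techniques named above, the sketch below uses a Bayesian classifier (scikit-learn's GaussianNB) to group users into the normal group or the depressed group from a two-value state vector; the training vectors and labels are fabricated for illustration.

```python
# A minimal sketch of grouping users with a Bayesian classifier.
from sklearn.naive_bayes import GaussianNB

# each row: [statistic of integrated emotion state, SNS-activity score]
X_train = [
    [-0.6, -0.5], [-0.4, -0.7], [-0.5, -0.3],   # users labelled "depressed"
    [0.3, 0.2], [0.1, 0.4], [0.5, 0.1],         # users labelled "normal"
]
y_train = ["depressed", "depressed", "depressed", "normal", "normal", "normal"]

group_classifier = GaussianNB().fit(X_train, y_train)

def classify_user(user_state_vector):
    """Classify a user into the normal group or the depressed group."""
    return group_classifier.predict([user_state_vector])[0]

if __name__ == "__main__":
    print(classify_user([-0.35, -0.4]))   # expected: depressed
```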
  • the management unit 235 can provide information on the user to the user device 100 and the content repository 300 to manage the user.
  • the management unit 235 may provide information on the user classified into the depression group to the user device 100 and the content repository 300 to manage the user in a negative state.
  • the information may include at least one of a graph of the emotional state of the user, and a feature value of the group.
  • a graph of the emotional state of the user may be as shown in FIGS. 6A and 6B.
  • GRAPH2, GRAPH3, and GRAPH4 in FIG. 6A may be graphs of the emotional state of the user classified in the depression group.
  • GRAPH5, GRAPH6, and GRAPH7 in FIG. 6B may be graphs of the emotional state of the user classified into the normal group.
  • the X axis of FIGS. 6A and 6B may represent the date, and the Y axis may represent the emotion value.
  • the feature value of the group may include at least one of the words, sentences, and color of the image frequently used in the group.
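The sketch below shows one way such group feature values might be computed: the most frequent words in the group's posts and a representative color of the group's images. The posts, colors, and the choice of mean RGB as the color summary are illustrative assumptions, not details from the patent.

```python
# A minimal sketch of computing a group's feature values: frequent words and a
# representative image color (mean of per-image average RGB values).
from collections import Counter

group_posts = [
    "so tired of everything again",
    "tired and alone tonight",
    "everything feels heavy",
]
group_image_colors = [(40, 40, 60), (55, 50, 70), (30, 35, 50)]   # avg RGB per image

def group_feature_values(posts, image_colors, top_k=3):
    """Return the group's most frequent words and its mean image color."""
    words = Counter(w for post in posts for w in post.lower().split())
    mean_color = tuple(round(sum(c) / len(c)) for c in zip(*image_colors))
    return {"frequent_words": [w for w, _ in words.most_common(top_k)],
            "representative_color": mean_color}

if __name__ == "__main__":
    print(group_feature_values(group_posts, group_image_colors))
```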
  • FIG. 7 shows a flowchart for explaining the operation of the user management apparatus shown in FIG. 1.
  • the transmitter / receiver 210 may receive user-generated content from the user device 100 and the content repository 300 (S710).
  • the estimator 231 estimates and generates the first emotion state, the second emotion state, and the integrated emotion state of the user on the basis of the learning data for the tagged emotion value, the emotion dictionary, and the contents generated by the user (S730).
  • the estimation unit 231 can estimate the first emotion state of the user based on the learning data and the user's image.
  • the estimating unit 231 can estimate the second emotional state of the user based on the emotional dictionary and the user's text and emoticons.
  • the estimator 231 can generate the integrated emotion state by integrating the first emotion state and the second emotion state.
  • the determining unit 233 may determine the state of the user based on the integrated emotion state, the user's SNS activity propensity, and the user's posting habit (S750).
  • the determination unit 233 can determine the first state of the user based on the statistical value of the integrated emotion state.
  • the determination unit 233 can determine the second state of the user based on the user's SNS activity propensity extracted from the information associated with the user-generated content, and the posting habit of the user.
  • the determination unit 233 may determine the state of the user by integrating the first state and the second state.
  • the management unit 235 can classify users into groups corresponding to the status of the user based on the status of the user, and provide information on the users to the user device 100 and the content storage 300 to manage the users (S770).
  • the management unit 235 may classify a user who is in a negative state into the depressed group, and provide information on the user in the negative state to the user device 100 and the content storage 300 to manage the user.
  • the apparatus described above may be implemented as a hardware component, a software component, and / or a combination of hardware components and software components.
  • the apparatus and components described in the embodiments may be implemented using one or more general-purpose or special-purpose computers, such as, for example, a processor, a controller, an arithmetic logic unit (ALU), a digital signal processor, a microcomputer, a field programmable gate array (FPGA), a programmable logic unit (PLU), a microprocessor, or any other device capable of executing and responding to instructions.
  • the processing device may execute an operating system (OS) and one or more software applications running on the operating system.
  • the processing device may also access, store, manipulate, process, and generate data in response to execution of the software.
  • the processing apparatus may be described as being used singly, but those skilled in the art will recognize that the processing apparatus may include a plurality of processing elements and/or a plurality of types of processing elements.
  • the processing unit may comprise a plurality of processors or one processor and one controller.
  • Other processing configurations are also possible, such as a parallel processor.
  • the software may include a computer program, code, instructions, or a combination of one or more of these, and may configure the processing device to operate as desired or may command the processing device independently or collectively.
  • the software and/or data may be embodied permanently or temporarily in any type of machine, component, physical device, virtual equipment, computer storage medium or device, or in a transmitted signal wave, so as to be interpreted by the processing device or to provide instructions or data to the processing device.
  • the software may be distributed over a networked computer system and stored or executed in a distributed manner.
  • the software and data may be stored on one or more computer readable recording media.
  • the method according to an embodiment may be implemented in the form of a program command that can be executed through various computer means and recorded in a computer-readable medium.
  • the computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination.
  • the program instructions to be recorded on the medium may be those specially designed and configured for the embodiments or may be available to those skilled in the art of computer software.
  • Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, flash memory, and the like.
  • program instructions include machine language code such as those produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like.
  • the hardware devices described above may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Tourism & Hospitality (AREA)
  • Theoretical Computer Science (AREA)
  • Psychiatry (AREA)
  • General Physics & Mathematics (AREA)
  • Primary Health Care (AREA)
  • Strategic Management (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Human Resources & Organizations (AREA)
  • Economics (AREA)
  • Computing Systems (AREA)
  • Psychology (AREA)
  • Molecular Biology (AREA)
  • Hospice & Palliative Care (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Educational Technology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)

Abstract

The invention relates to an emotion-based user management method and to apparatuses for performing the method. The emotion-based user management method according to one embodiment comprises the steps of: estimating a mood state of a user on the basis of learning data for an image tagged with emotion values, a sentiment lexicon, and content generated by the user; and managing the user by determining a state of the user on the basis of the mood state.
PCT/KR2017/013278 2017-08-23 2017-11-21 Procédé de gestion d'utilisateur basé sur les émotions et appareils le mettant en oeuvre WO2019039659A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2017-0106490 2017-08-23
KR1020170106490A KR101894194B1 (ko) 2017-08-23 2017-08-23 감성 기반의 사용자 관리 방법 및 이를 수행하는 장치들

Publications (1)

Publication Number Publication Date
WO2019039659A1 (fr) 2019-02-28

Family

ID=63862650

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2017/013278 WO2019039659A1 (fr) 2017-08-23 2017-11-21 Procédé de gestion d'utilisateur basé sur les émotions et appareils le mettant en oeuvre

Country Status (2)

Country Link
KR (1) KR101894194B1 (fr)
WO (1) WO2019039659A1 (fr)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102128274B1 (ko) * 2018-12-19 2020-06-30 주식회사 카케냐 페르소나와 딥-러닝 모델을 이용한 소셜네트워크상의 콘텐츠 추천 방법
KR102347328B1 (ko) * 2020-02-20 2022-01-04 인제대학교 산학협력단 빅데이터 기반의 텍스트를 활용한 개인 감성분석 모니터링 시스템 아키텍쳐
KR102359466B1 (ko) * 2020-02-24 2022-02-08 인제대학교 산학협력단 딥 메타데이터 기반 감성분석 방법 및 그 시스템
KR102556972B1 (ko) * 2021-09-02 2023-07-18 전남대학교산학협력단 딥러닝 기반의 그래프 융합을 이용한 시청자 감정 예측 시스템 및 방법

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007219929A (ja) * 2006-02-17 2007-08-30 Nec Corp 感性評価システム及び方法
KR20120109943A (ko) * 2011-03-28 2012-10-09 가톨릭대학교 산학협력단 문장에 내재한 감정 분석을 위한 감정 분류 방법
KR20160069027A (ko) * 2014-12-05 2016-06-16 건국대학교 산학협력단 감성 기반의 영상 색인 시스템 및 방법
KR20160097841A (ko) * 2015-02-10 2016-08-18 류창형 원격 심리지원 서비스를 위한 통합 관리 시스템
KR101739538B1 (ko) * 2016-01-25 2017-05-25 주식회사 솔트룩스 기계 학습 및 규칙에 기반한 감성 분석 시스템 및 방법

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012134180A2 (fr) 2011-03-28 2012-10-04 가톨릭대학교 산학협력단 Procédé de classification des émotions pour analyser des émotions inhérentes dans une phrase et procédé de classement des émotions pour des phrases multiples à l'aide des informations de contexte

Also Published As

Publication number Publication date
KR101894194B1 (ko) 2018-10-04

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17922094

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17922094

Country of ref document: EP

Kind code of ref document: A1