WO2020088102A1 - Emotional intervention method, device and system, computer-readable storage medium, and healing cabin - Google Patents

Emotional intervention method, device and system, computer-readable storage medium, and healing cabin

Info

Publication number
WO2020088102A1
WO2020088102A1 (application PCT/CN2019/104911)
Authority
WO
WIPO (PCT)
Prior art keywords
emotional
user
intervention
emotional intervention
picture
Prior art date
Application number
PCT/CN2019/104911
Other languages
English (en)
French (fr)
Inventor
朱红文
徐志红
Original Assignee
BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co., Ltd. (京东方科技集团股份有限公司)
Priority to US application 16/758,743 (published as US11617526B2)
Publication of WO2020088102A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/279 Recognition of textual entities
    • G06F40/284 Lexical analysis, e.g. tokenisation or collocates
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/165 Evaluating the state of mind, e.g. depression, anxiety
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/48 Other medical applications
    • A61B5/486 Bio-feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/237 Lexical tools
    • G06F40/242 Dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/30 Semantic analysis
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; Using context analysis; Selection of dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 Processing image or video features in feature spaces; Using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/772 Determining representative reference patterns, e.g. averaging or distorting patterns; Generating dictionaries
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174 Facial expression recognition
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/70 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mental therapies, e.g. psychological therapy or autogenous training
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • G16H30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • the present disclosure relates to the field of computer technology, and in particular, to an emotional intervention method, device, and system, as well as computer-readable storage media and healing cabins.
  • an emotional intervention method including: identifying a user's emotional state based on the user's first biometric information; and recommending at least one emotional intervention method corresponding to the emotional state.
  • At least one emotional intervention method corresponding to the emotional state is recommended.
  • recommending at least one emotional intervention method corresponding to the emotional state includes: identifying the user's physical state based on the user's second biometric information; and, based on the user's physical state, recommending at least one emotional intervention method corresponding to the emotional state.
  • the emotional intervention method includes at least one of outputting media data, adjusting the environment, providing diet, providing psychological counseling, providing emotional management courses, and performing physical therapy.
  • identifying the emotional state of the user includes: acquiring the first biometric information of the user in real time; determining the real-time emotional state of the user based on the first biometric information acquired in real time; counting the proportion of each real-time emotional state of the user within a unit time; and identifying the real-time emotional state with the largest proportion as the user's emotional state for that unit time.
  • acquiring the first biometric information of the user includes: acquiring an image of the user; identifying the user's face from the image; recognizing the user's facial expression based on the features of the face; and using the recognized facial expression as the first biometric information.
  • recommending at least one emotional intervention method corresponding to the emotional state includes: obtaining corresponding emotional intervention data according to the emotional state of the user, the intervention data including at least one of physical therapy recommendations and media data; and, based on the acquired emotional intervention data, recommending at least one emotional intervention method corresponding to the emotional state.
  • the emotional intervention method further includes: labeling the acquired emotional intervention data; deleting emotional intervention data that does not match the emotional intervention target; and using the remaining emotional intervention data to build an emotional intervention knowledge base.
  • the emotional intervention data is obtained through a text similarity matching algorithm.
  • acquiring the emotional intervention data through a text similarity matching algorithm includes: acquiring a keyword dictionary corresponding to the emotional intervention target, the keyword dictionary including w keywords, where w is a positive integer; comparing the text similarity between the keyword dictionary and the text to be compared; and determining the media data corresponding to text whose text similarity exceeds a similarity threshold as the emotional intervention data.
  • comparing the text similarity between the keyword dictionary and the text to be compared includes weighting the keywords in the keyword dictionary and the keywords in the text to be compared, where the weights reflect the importance of the keywords.
  • the keywords in the keyword dictionary have n weights, where n is a positive integer; an AND operation is performed on the keywords with the same weight in the keyword dictionary and in the text to be compared, to obtain n keyword sets, which include a total of a keywords, where a is an integer; and the ratio of a to w is calculated to obtain the text similarity between the text to be compared and the keyword dictionary.
  • the keywords in the keyword dictionary are used for searching to obtain text to be compared.
  • the first biometric information includes at least one of facial expressions and sound; the second biometric information includes at least one of height, weight, and health status.
  • the emotional intervention method further includes: determining whether the user selects the recommended emotional intervention method; and when the user selects the recommended emotional intervention method, starting the corresponding emotional intervention method.
  • the emotional intervention method further includes: determining the content conformity of a picture based on the hue of the background color of the picture and/or the objects included in the picture; and using pictures whose content conformity is greater than or equal to a first threshold to construct a picture knowledge base as a knowledge base for emotional intervention.
  • the emotional intervention method further includes: searching the descriptive text of the picture for keywords matching keyword dictionary A, where keyword dictionary A includes a0 keywords and a0 is a positive integer, the matched keywords forming keyword dictionary A1; constructing keyword dictionary B by expanding keyword dictionary A with similar keywords; searching the descriptive text of the picture for keywords matching keyword dictionary B, the matched keywords forming keyword dictionary A2; searching the descriptive text of the picture, by means of semantic analysis, for sentences semantically similar to the keywords in keyword dictionary B, the keywords of keyword dictionary B matched by those sentences forming keyword dictionary A3; merging keyword dictionaries A1, A2, and A3 into keyword dictionary C, the number of keywords in keyword dictionary C being c, where c is a positive integer; calculating the keyword matching degree according to a0 and c; and using pictures whose keyword matching degree is not less than a second threshold to construct a picture knowledge base as a knowledge base for emotional intervention.
  • the emotional intervention method further includes: determining the greater of the content conformity of the picture and the keyword matching degree as the conformity of the picture; and using pictures whose conformity is greater than or equal to a third threshold to construct a picture knowledge base as a knowledge base for emotional intervention.
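The dictionary merging and the final conformity rule described above can be sketched as follows. The formula c/a0 for the keyword matching degree is an assumption (the disclosure only says the degree is computed from a0 and c), and the example dictionaries are made up for illustration:

```python
def keyword_matching_degree(dict_a, dict_a1, dict_a2, dict_a3):
    """Merge keyword dictionaries A1, A2 and A3 into dictionary C and
    relate its size c to the a0 keywords of dictionary A."""
    a0 = len(dict_a)
    c = len(set(dict_a1) | set(dict_a2) | set(dict_a3))  # dictionary C
    return c / a0  # assumed formula for the matching degree

def picture_conformity(content_conformity, matching_degree):
    # the conformity of the picture is the greater of the two scores
    return max(content_conformity, matching_degree)

degree = keyword_matching_degree(
    ["calm", "nature", "bright", "warm"],  # dictionary A, a0 = 4
    ["calm", "bright"],                    # A1: direct matches
    ["serene"],                            # A2: matches of expanded dictionary B
    ["calm", "tranquil"],                  # A3: matches found via semantic analysis
)
print(picture_conformity(0.6, degree))  # 1.0 (c = 4, so degree = 1.0)
```

The picture would then be admitted to the picture knowledge base if this conformity reaches the third threshold.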
  • an emotional intervention device including: a recognition unit configured to recognize the user's emotional state based on the user's first biometric information; and a recommendation unit configured to recommend at least one emotional intervention method corresponding to the emotional state.
  • an emotional intervention device including: a memory; and a processor coupled to the memory, the processor configured to be based on instructions stored in the memory, Perform the emotional intervention method as described in any of the foregoing embodiments.
  • a computer-readable storage medium on which a computer program is stored, which when executed by a processor implements the emotional intervention method as described in any of the foregoing embodiments.
  • an emotional intervention system including the emotional intervention device of any of the foregoing embodiments.
  • the emotional intervention system further includes at least one of a physiotherapy device, an environmental atmosphere adjustment device, a display, a player, a diet provision module, a psychological counseling module, and an emotion management course module, wherein: the physiotherapy device is configured to perform physical therapy on the user when the recommended emotional intervention method includes performing physical therapy; the environmental atmosphere adjustment device is configured to adjust the environmental atmosphere when the recommended emotional intervention method includes adjusting the environmental atmosphere; the display and the player are configured to output media data when the recommended emotional intervention method includes outputting media data; the diet provision module is configured to provide a corresponding diet, stimulating the user's nervous system through taste, when the recommended emotional intervention method includes providing a diet; the psychological counseling module is configured to provide an online psychological counseling referral appointment service when the recommended emotional intervention method includes providing psychological counseling; and the emotion management course module is configured to provide emotional management courses, such as online psychological management courses, when the recommended emotional intervention method includes providing emotional management courses.
  • the emotional intervention system further includes at least one of an image sensor, a sound sensor, a measurement device, and an input device, wherein: the image sensor and the sound sensor are configured to acquire the first biometric information of the user; and the measurement device and the input device are configured to acquire the second biometric information of the user.
  • the physiotherapy device includes a massage chair.
  • a healing cabin including the emotional intervention system of any of the foregoing embodiments.
  • FIG. 1A is a flowchart illustrating an emotional intervention method according to an embodiment of the present disclosure.
  • FIG. 1B is a flowchart illustrating an emotional intervention method according to another embodiment of the present disclosure.
  • FIG. 2 is a flowchart illustrating an emotion recognition method according to an embodiment of the present disclosure.
  • FIG. 3 is a flowchart illustrating a text similarity matching algorithm according to an embodiment of the present disclosure.
  • FIG. 4 is a flowchart illustrating a text similarity comparison method according to an embodiment of the present disclosure.
  • FIG. 5A is a flowchart illustrating a method for constructing an emotional intervention knowledge base according to an embodiment of the present disclosure.
  • FIG. 5B is a flowchart illustrating a method of constructing a picture knowledge base according to an embodiment of the present disclosure.
  • FIG. 5C is a flowchart illustrating a method of constructing a picture knowledge base according to another embodiment of the present disclosure.
  • FIG. 5D is a flowchart illustrating a method for constructing a picture knowledge base according to yet another embodiment of the present disclosure.
  • FIG. 6 is a block diagram illustrating an emotional intervention device according to an embodiment of the present disclosure.
  • FIG. 7 is a block diagram showing an emotional intervention device according to another embodiment of the present disclosure.
  • FIG. 8 is a block diagram illustrating an emotional intervention system according to an embodiment of the present disclosure.
  • FIG. 9A is a schematic structural diagram showing a healing cabin according to an embodiment of the present disclosure.
  • FIG. 9B is a schematic structural diagram showing a massage chair according to an embodiment of the present disclosure.
  • FIG. 10 is a block diagram showing a computer system for implementing one embodiment of the present disclosure.
  • the present disclosure proposes a scheme for emotional intervention based on emotional recognition.
  • FIG. 1A is a flowchart illustrating an emotional intervention method according to an embodiment of the present disclosure. As shown in FIG. 1A, the emotional intervention method includes steps S2 and S4.
  • In step S2, the user's emotional state is identified based on the user's first biometric information.
  • the first biometric information includes, for example, facial expressions, sounds, and other information that can reflect emotional states.
  • Emotional states include but are not limited to: neutral, happy, sad, angry, contempt, disgust, surprise, fear.
  • emotional states can be divided into three types of emotions: neutral, positive, and negative.
  • Happiness can correspond to positive emotions.
  • Sadness, anger, contempt, disgust, surprise, fear can correspond to negative emotions.
  • FIG. 2 shows a flowchart of an emotion recognition method according to an embodiment of the present disclosure. As shown in FIG. 2, the emotion recognition method includes steps S21-S24.
  • In step S21, the user's first biometric information is acquired in real time.
  • the first biometric information is obtained based on an image of the user captured by an image sensor such as a camera. For example, a face is recognized from the user's image, and based on the correlation between facial features and expressions, the user's facial expression is recognized as the first biometric information. User images may be acquired periodically at regular intervals.
  • the image sensor may include a camera set.
  • the camera group can be set in different orientations to obtain user images from multiple angles. For example, one camera can be placed directly opposite to the user, and the remaining cameras can be scattered on both sides of the user.
  • the angle between two adjacent cameras can be set to 180°/n, where n is the number of cameras.
  • the acquired user image can be compressed and encoded to reduce the image size, thereby facilitating storage.
  • the time when the user's image is acquired can also be recorded, so that images from multiple angles acquired at the same time can be stored in association.
  • the sound of the user may also be sensed by sound sensors such as a microphone as the first biometric information.
  • In step S22, the real-time emotional state of the user is determined according to the first biometric information acquired in real time.
  • the user's real-time emotional state can be determined based on the information acquired in real time. For example, a pre-built emotion recognition model can be used to classify different biological features into corresponding emotional states to establish the correlation between biological features and emotional states.
  • In step S23, the proportion of each real-time emotional state of the user within the unit time is counted.
  • the proportion of each real-time emotional state of the user within the unit time is counted. For example, taking 2 seconds as the unit time, there are 8 possible real-time emotional states. The number of occurrences of the i-th real-time emotional state within a unit time is recorded as n_i, 1 ≤ i ≤ 8. The total number of occurrences is recorded as N = n_1 + n_2 + ... + n_8. The proportion of each real-time emotional state is p_i = n_i / N.
  • In step S24, the real-time emotional state with the largest proportion is identified as the user's emotional state within the unit time.
  • For example, anger, with a proportion of 0.7, is identified as the user's emotional state within the 2-second unit time.
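The counting in steps S21 to S24 can be sketched in a few lines. This is a minimal illustration: the emotion labels and the 2-second window come from the example above, while the sampling interface itself is hypothetical.

```python
from collections import Counter

def dominant_emotion(samples):
    """Count each real-time emotional state observed within one unit
    time and return the state with the largest proportion, together
    with that proportion (n_i / N)."""
    counts = Counter(samples)              # n_i for each emotional state
    total = sum(counts.values())           # N, the total number of samples
    state, n = counts.most_common(1)[0]    # state with the largest n_i
    return state, n / total

# Real-time states sampled within one 2-second unit time:
samples = ["angry"] * 7 + ["sad"] * 2 + ["neutral"]
state, proportion = dominant_emotion(samples)
print(state, proportion)  # angry 0.7
```

The returned state then serves as the user's emotional state for that unit time, matching the anger example above.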
  • the user's emotional report may be generated based on the identified emotional state.
  • Returning to FIG. 1A, the following describes how to recommend a corresponding emotional intervention method after the user's emotional state has been identified.
  • In step S4, at least one emotional intervention method corresponding to the emotional state of the user is recommended.
  • the corresponding emotional intervention data is obtained according to the emotional state of the user; based on the acquired emotional intervention data, at least one emotional intervention method corresponding to the emotional state is recommended.
  • Emotional intervention methods may include at least one of physical therapy, output of media data, adjustment of the environmental atmosphere, provision of a diet, provision of psychological counseling, and provision of emotional management courses. Accordingly, the emotional intervention data may include the corresponding types of data.
  • Physiotherapy can target different body parts, for example eye physiotherapy, head physiotherapy, shoulder physiotherapy, neck physiotherapy, and leg physiotherapy.
  • the physiotherapy methods include, for example, massage, light therapy, and magnetic therapy.
  • Media data includes pictures, audio, and video that can emotionally intervene in users.
  • In form, emotional intervention data can include pictures, text, audio, video, etc.
  • the ambient atmosphere includes lights, negative oxygen ions, smells, etc.
  • Psychological counseling evaluates the user's mental health status based on the user's emotional state using big-data modeling, and provides a self-referral appointment service for users at high risk of mental imbalance.
  • Emotional health management courses include, for example, parent-child courses, intimate relationship courses, emotional intelligence courses, adversity quotient courses, and social skills courses.
  • FIG. 1B is a flowchart illustrating an emotional intervention method according to another embodiment of the present disclosure. FIG. 1B differs from FIG. 1A in that it further includes steps S6 and S8. Only the differences between FIG. 1B and FIG. 1A will be described below; the similarities will not be repeated.
  • In step S6, it is determined whether the user selects the recommended emotional intervention method.
  • In step S8, when the user selects the recommended emotional intervention method, the corresponding emotional intervention method is activated.
  • For example, according to the user's selection of pictures or music, the corresponding display or player is started, and corresponding intervention pictures or music are pushed at random.
  • When the user chooses to perform physical therapy, the physical therapy device is activated, and the user is prompted to perform physical therapy according to the physiotherapy recommendations.
  • When the user selects negative oxygen ion intervention, the negative oxygen ion generator is started, and a timer is set so that the generator can be turned off in time as needed.
  • When the user selects aroma intervention, the aroma generator is started.
  • emotional intervention data is obtained through a text similarity matching algorithm.
  • the text similarity matching algorithm according to some embodiments of the present disclosure will be described below in conjunction with FIG. 3.
  • FIG. 3 shows a flowchart of a text similarity matching algorithm according to an embodiment of the present disclosure. As shown in FIG. 3, the text similarity matching algorithm includes steps S41-S43.
  • In step S41, a keyword dictionary corresponding to the goal of emotional intervention is obtained.
  • the goal of emotional intervention is, for example, to relieve the user's emotions. Taking the case where the user's emotion is recognized as angry as an example, the goal of emotional intervention is to relieve the angry emotion.
  • the keyword dictionary corresponding to alleviating angry emotions includes w keywords, where w is a positive integer. When the keywords are, for example, soothing, calm, and cheerful, w is 3.
  • these keywords may also be expanded with similar keywords.
  • For example, a semantic similarity matching algorithm is used to search for words similar to "cheerful", obtaining "joyful" and "pleasant".
  • the expanded keyword dictionary includes soothing, calm, cheerful, joyful, and pleasant.
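As a sketch, the expansion step might look as follows. A real system would query a semantic similarity model; the hand-written synonym table here is purely an illustrative assumption standing in for that model.

```python
# Hypothetical synonym table standing in for a semantic similarity model.
SIMILAR_WORDS = {"cheerful": ["joyful", "pleasant"]}

def expand_keyword_dictionary(keywords):
    """Expand a keyword dictionary with similar keywords,
    preserving order and avoiding duplicates."""
    expanded = list(keywords)
    for word in keywords:
        for similar in SIMILAR_WORDS.get(word, []):
            if similar not in expanded:
                expanded.append(similar)
    return expanded

print(expand_keyword_dictionary(["soothing", "calm", "cheerful"]))
# ['soothing', 'calm', 'cheerful', 'joyful', 'pleasant']
```

With this expansion, w grows from 3 to 5, matching the expanded dictionary described above.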
  • In step S42, the text similarity between the keyword dictionary and the text to be compared is compared.
  • the text to be compared is a collection of words, such as the textual descriptions of pictures, music, and videos, or articles.
  • the text to be compared includes "This music is cheerful, lively, and pleasant."
  • the text to be compared can be crawled directly from the Internet, or obtained by searching with the keywords in the keyword dictionary.
  • a sparse algorithm can be used to convert the keyword dictionary and the text to be compared into binary codes, such as "01000001".
  • the length and value of the code depends on the specific sparse algorithm.
  • FIG. 4 shows a flowchart of a text similarity comparison method according to an embodiment of the present disclosure. As shown in FIG. 4, the text similarity comparison method includes steps S421-S423.
  • In step S421, the keywords in the keyword dictionary and the keywords in the text to be compared are weighted, respectively. The weights reflect the importance of the keywords.
  • the keywords in the keyword dictionary have n weights, and n is a positive integer.
  • For example, the weights of "soothing" and "calm" can be marked as 4, and the weights of "cheerful", "joyful", and "pleasant" can be marked as 3. In this case, n = 2.
  • the weights of the keywords in the text to be compared, such as "this", "music", "cheerful", "lively", and "pleasant", can be marked as 1, 2, 3, 3, and 3, respectively.
  • In step S422, an AND operation is performed on the keywords with the same weight in the keyword dictionary and in the text to be compared, to obtain n keyword sets.
  • the n keyword sets include a total of a keywords, where a is an integer.
  • the text to be compared is searched for keywords with the same weight as the keyword dictionary.
  • the searched keywords with the same weight are cheerful, lively, and pleasant.
  • In step S423, the ratio of a to w is calculated to obtain the text similarity between the text to be compared and the keyword dictionary.
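One plausible reading of steps S421 to S423 is sketched below: keywords carrying the same weight in the dictionary and in the text are intersected (the AND operation), the a matches are counted, and the similarity is a/w. Note that the example above also counts "lively", which suggests semantically similar words may be matched as well; the strict intersection used here is an assumption.

```python
def text_similarity(dictionary_weights, text_weights):
    """Compare a weighted keyword dictionary against the weighted
    keywords extracted from the text to be compared."""
    w = len(dictionary_weights)  # number of dictionary keywords
    # AND: keywords that appear in both with the same weight
    matched = {kw for kw, weight in dictionary_weights.items()
               if text_weights.get(kw) == weight}
    a = len(matched)             # total keywords across the n sets
    return a / w

# Expanded dictionary: soothing/calm weighted 4, the others weighted 3.
dictionary = {"soothing": 4, "calm": 4,
              "cheerful": 3, "joyful": 3, "pleasant": 3}
# "This music is cheerful, lively, and pleasant."
text = {"this": 1, "music": 2, "cheerful": 3, "lively": 3, "pleasant": 3}
print(text_similarity(dictionary, text))  # 0.4 (a = 2, w = 5)
```

Texts whose similarity exceeds the threshold would then contribute their media data to the emotional intervention data, as in step S43.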
  • In step S43, the media data corresponding to text whose text similarity exceeds the similarity threshold is determined as the emotional intervention data.
  • When the user's emotional state is recognized as sad, there may be multiple emotional intervention methods.
  • For example, playing cheerful music or displaying positive pictures may be recommended to the user to lift the user's mood.
  • the user's second biometric information may also be considered.
  • the second biometric information includes height, weight, health status, and other information reflecting the state of the body.
  • the second biometric information can be obtained through inquiries or the like. Information such as height and weight can also be obtained through corresponding measurements.
  • the user's physical state is identified according to the user's second biometric information; according to the user's physical state, at least one emotional intervention method corresponding to the emotional state is recommended.
  • the recommended emotional intervention method takes into account the user's physical state, and can therefore intervene in the user's emotions more effectively.
  • a corresponding physical therapy method is recommended. For example, when the user's emotional state is recognized as sad and the user's eyes are uncomfortable, eye treatments such as eye massage can be recommended to the user. This is because the user's sadness may be caused by the eye discomfort; relieving the eye discomfort can effectively reduce the user's sadness.
  • multiple intervention methods can also be recommended to interfere with the user's emotions. For example, while recommending eye massage to the user, cheerful music is recommended. On the one hand, alleviating eye discomfort can effectively reduce the user's sadness; on the other hand, cheerful music itself can also relieve the user's sadness. In this way, emotional intervention is more effective.
  • FIG. 5A shows a flowchart of a method for building an emotional intervention knowledge base according to an embodiment of the present disclosure.
  • the emotional intervention knowledge base construction method includes steps S51-S53.
  • In step S51, the acquired emotional intervention data is labeled.
  • the target of emotional intervention may be stabilizing emotions; for example, quiet pictures, text, audio, or video may be searched for, and the corresponding search data marked as emotion-stabilizing.
  • In step S52, the emotional intervention data that does not match the emotional intervention target is deleted.
  • the acquired emotional intervention data may include data that does not match the target of the emotional intervention, for example, data that cannot stabilize emotions. Such data that do not match the target of emotional intervention can therefore be deleted.
  • In step S53, the remaining emotional intervention data is used to construct an emotional intervention knowledge base.
  • Emotional intervention knowledge base includes but is not limited to: picture knowledge base, music knowledge base, video knowledge base, physiotherapy method knowledge base.
  • the emotional intervention data in the constructed emotional intervention knowledge base matches the emotional intervention target; based on such emotional intervention data, recommending emotional intervention methods is more efficient.
  • Constructing a picture knowledge base based on picture content search includes: determining the content conformity of a picture based on the hue of the background color of the picture and/or the objects included in the picture; and constructing the picture knowledge base using pictures whose content conformity is greater than or equal to a threshold.
  • FIG. 5B shows a flowchart of a method for constructing a picture knowledge base according to an embodiment of the present disclosure.
  • In step 512, the content conformity of the picture is determined according to the hue of the background color of the picture.
  • Suppose the number of warm-toned color pixels in the picture is n_1, the number of neutral-toned color pixels is n_2, and the number of cool-toned color pixels is n_3. Then, according to the following formulas (1)-(3), the percentages of warm-toned, neutral-toned, and cool-toned color pixels in the total number of pixels are calculated:
  • color_1 = n_1 / (n_1 + n_2 + n_3), (1)
  • color_2 = n_2 / (n_1 + n_2 + n_3), (2)
  • color_3 = n_3 / (n_1 + n_2 + n_3), (3)
  • If the weighted result is large, the content of the picture may be positive; otherwise, the content of the picture may be negative. That is, the probability that the content of the picture is positive, i.e., the content conformity of the picture, can be determined according to the hue of the background color of the picture. For example, the weighted sum of color_1, color_2, and color_3 can be used to reflect the content conformity of the picture.
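The hue statistics and weighted sum described above might look like the following sketch. The hue boundaries used to classify warm/neutral/cool pixels and the weight values are illustrative assumptions, not specified by the disclosure:

```python
def hue_class(hue):
    """Classify a hue angle (0-360 degrees) as warm, neutral, or cool.
    The boundaries here are illustrative assumptions."""
    if hue < 90 or hue >= 330:
        return "warm"
    if hue < 150:
        return "neutral"
    return "cool"

def hue_conformity(hues, weights=(1.0, 0.5, -1.0)):
    """Compute color_1..color_3 per formulas (1)-(3) and return their
    weighted sum as the hue-based content conformity of the picture."""
    counts = {"warm": 0, "neutral": 0, "cool": 0}
    for h in hues:
        counts[hue_class(h)] += 1
    total = sum(counts.values())
    color = [counts[k] / total for k in ("warm", "neutral", "cool")]
    return sum(w * c for w, c in zip(weights, color))

# A picture whose background pixels are mostly warm-toned:
score = hue_conformity([10, 20, 30, 100, 200])
```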
  • A picture knowledge base is constructed using pictures whose content conformity is greater than or equal to a threshold. That is, when the content conformity of a picture is greater than or equal to the threshold, the corresponding picture is put into the picture knowledge base.
  • positive emotion pictures include objects reflecting victory, entertainment and tourism, beautiful scenery, flowers and trees, cute animals, pastimes, famous cars, banknotes, gold and silver, sports scenes, happy expressions, affection, friendship, love, etc.
  • Pictures of neutral emotions include objects reflecting daily necessities, life and work scenes, buildings, vehicles, diet, geometric figures, expressionless people, etc.
  • pictures of negative emotions include objects reflecting accidents, natural disasters, damage to objects (such as buildings), various kinds of garbage, ghosts, insects, medical treatment, corpses, environmental pollution, crying, disability, blood, military scenes, violent conflict, weapons, etc.
  • In step 514, the content conformity of the picture is determined according to the objects included in the picture.
  • Suppose the number of objects corresponding to positive emotions in the picture is m_1, the number of objects corresponding to neutral emotions is m_2, and the number of objects corresponding to negative emotions is m_3. Then the percentages of the numbers of positive, neutral, and negative emotional objects in the total number of objects in the picture can be calculated according to the following formulas (4)-(6):
  • mod_1 = m_1 / (m_1 + m_2 + m_3), (4)
  • mod_2 = m_2 / (m_1 + m_2 + m_3), (5)
  • mod_3 = m_3 / (m_1 + m_2 + m_3), (6)
  • If the weighted result is large, the content of the picture may be positive; otherwise, the content of the picture may be negative. That is, the probability that the content of the picture is positive, i.e., the content conformity of the picture, can be determined according to the objects included in the picture.
  • the weighted sum of mod 1 , mod 2 , and mod 3 can be used to reflect the content conformity of the picture.
  • A picture knowledge base is constructed using pictures whose content conformity is greater than or equal to a threshold. That is, when the content conformity of a picture is greater than or equal to the threshold, the corresponding picture is put into the picture knowledge base.
  • the content conformity of the picture may be determined only according to steps 512 or 514, that is, only steps 512 and 518 or only steps 514 and 518 are performed.
  • the results of steps 512 and 514 can also be synthesized to determine the content conformity of the picture, that is, step 516 is performed.
  • In step 516, the content conformity of the picture is comprehensively determined according to the hue of the background color of the picture and the objects included in the picture.
  • α_1, α_2, α_3, β_1, β_2, β_3, γ_1, and γ_2 can be set according to actual needs. For example, α_1 = 1, α_2 = 0.5, α_3 = -1; β_1 = 1, β_2 = 0.5, β_3 = -1; γ_1 = 0.5, γ_2 = 0.5.
  • A picture knowledge base is constructed using pictures whose content conformity is greater than or equal to a threshold. That is, when the content conformity of a picture is greater than or equal to the threshold, the corresponding picture is put into the picture knowledge base.
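One plausible reading of step 516 combines the hue-based percentages and the object-based percentages with two sets of weights and a pair of combination weights, using the example values given above; the exact combination form is an assumption, not stated explicitly by the disclosure:

```python
def combined_conformity(color, mod,
                        alpha=(1.0, 0.5, -1.0),   # weights on color_1..3
                        beta=(1.0, 0.5, -1.0),    # weights on mod_1..3
                        gamma=(0.5, 0.5)):        # weights on the two scores
    """Weighted combination of the hue-based percentages (color_1..3)
    and the object-based percentages (mod_1..3)."""
    hue_score = sum(a * c for a, c in zip(alpha, color))
    obj_score = sum(b * m for b, m in zip(beta, mod))
    return gamma[0] * hue_score + gamma[1] * obj_score

# 60% warm / 20% neutral / 20% cool pixels; all detected objects positive:
score = combined_conformity([0.6, 0.2, 0.2], [1.0, 0.0, 0.0])
```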
  • the pictures placed in the picture knowledge base may also be filtered in advance. For example, in step 510, it is determined whether the resolution x of the picture is greater than a threshold y (i.e., the desired resolution). Only if the resolution x of the picture is greater than y are the subsequent steps performed.
  • FIG. 5C shows a flowchart of a method for constructing a picture knowledge base according to another embodiment of the present disclosure.
  • In step 521, keywords that match the keyword dictionary A are searched from the descriptive text of the picture to form a keyword dictionary A_1.
  • the keyword dictionary A can be constructed based on a word-based natural language processing method.
  • the keyword dictionary A may be constructed in advance, or may be constructed before step 521 is executed.
  • the keyword dictionary A includes a_0 keywords, where a_0 is a positive integer.
  • Suppose the number of keywords in the descriptive text of the picture that match the keyword dictionary A is a_1; these keywords constitute the keyword dictionary A_1, and a_1 is a positive integer.
  • Matching includes, for example, keywords being identical, semantically identical, or semantically similar.
  • In step S522, a keyword dictionary B is constructed by performing similar-word expansion on the keywords in the keyword dictionary A.
  • In step S523, keywords that match the keyword dictionary B are searched from the descriptive text of the picture to form a keyword dictionary A_2.
  • Suppose the number of searched keywords matching the keyword dictionary B is a_2; these keywords constitute the keyword dictionary A_2, and a_2 is a positive integer.
  • In step S524, a semantic analysis method is used to search the descriptive text of the picture for sentences that are semantically similar to the keywords in the keyword dictionary B, to form a keyword dictionary A_3.
  • Suppose the number of keywords in the keyword dictionary B matched by the searched semantically similar sentences is a_3; the corresponding keywords constitute the keyword dictionary A_3, and a_3 is a positive integer.
  • In step 525, the keyword dictionaries A_1, A_2, and A_3 are merged to form a keyword dictionary C.
  • Suppose the number of keywords in the keyword dictionary C that match the keyword dictionary A is c, where c is a positive integer.
  • For example, a union operation can be performed on the keywords in the keyword dictionaries A_1, A_2, and A_3 to realize the merging of the keyword dictionaries.
  • In step 526, the keyword matching degree is calculated according to a_0 and c.
  • In step 527, pictures whose keyword matching degree is greater than or equal to a threshold are used to construct a picture knowledge base as an emotional intervention knowledge base. That is, similar to step 518, when the keyword matching degree of a picture is greater than or equal to the threshold, the corresponding picture is placed in the picture knowledge base.
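Steps 521-527 can be sketched as follows. The disclosure only says the matching degree is calculated from a_0 and c, so the ratio c / a_0 used here is an illustrative choice, and the input sets stand in for the exact-match, expanded-match, and semantic-match search results:

```python
def keyword_matching_degree(dict_a, found_a1, found_a2, found_a3):
    """Merge the keywords found by exact match (A_1), expanded-dictionary
    match (A_2), and sentence-level semantic match (A_3) into dictionary C
    (step 525), then score the picture against the a_0 keywords of
    dictionary A (step 526)."""
    dict_c = set(found_a1) | set(found_a2) | set(found_a3)  # step 525 (union)
    c = len(dict_c & set(dict_a))
    return c / len(dict_a)  # illustrative formula: c / a_0

A = {"quiet", "calm", "peaceful", "serene"}
degree = keyword_matching_degree(A, {"quiet"}, {"calm"}, {"calm", "serene"})
```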
  • FIG. 5D shows a flowchart of a method for constructing a picture knowledge base according to yet another embodiment of the present disclosure.
  • FIG. 5D uses the results of step 516 in FIG. 5B and step 526 in FIG. 5C to build a picture knowledge base.
  • In step 530, the greater of the content conformity of the picture and the keyword matching degree of the picture can be determined as the picture matching degree according to formula (8):
  • matching degree = max(content conformity, keyword matching degree). (8)
  • A picture knowledge base is constructed using pictures whose matching degree is greater than or equal to a threshold z. That is, in the case where the matching degree of a picture is greater than or equal to z, the corresponding picture is put into the picture knowledge base; otherwise, the picture is discarded.
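Taking the greater of the two degrees (formula (8)) and applying the threshold z might be sketched as below; the threshold value and the field names are illustrative assumptions:

```python
def picture_matching_degree(content_conformity, keyword_matching):
    """Formula (8): the greater of the two degrees is used."""
    return max(content_conformity, keyword_matching)

def admit_to_knowledge_base(pictures, z=0.6):
    """Keep pictures whose matching degree is >= z; discard the rest.
    The threshold value z is illustrative."""
    return [p for p in pictures
            if picture_matching_degree(p["content"], p["keyword"]) >= z]

pictures = [
    {"name": "lake.jpg", "content": 0.8, "keyword": 0.4},
    {"name": "crash.jpg", "content": 0.1, "keyword": 0.2},
]
kept = admit_to_knowledge_base(pictures)
```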
  • In some cases, manual calibration may be performed before a picture is placed in the picture knowledge base.
  • Combining picture-content-based search with text-based natural language processing to construct the picture knowledge base can improve the accuracy of picture search and matching.
  • the emotional intervention device 60 includes a recognition unit 620 and a recommendation unit 640.
  • the recognition unit 620 is configured, for example, to perform step S2 shown in FIG. 1.
  • the facial expression of the user can be acquired through the image sensor, and then the emotional state of the user can be identified according to the correlation between the facial expression and the emotional state.
  • the recommendation unit 640 is configured, for example, to perform step S4 shown in FIG. 1.
  • the corresponding emotional intervention data can be obtained based on the emotional state of the user, and then at least one emotional intervention method corresponding to the emotional state can be selected therefrom.
  • the emotional intervention method is recommended by comprehensively considering the emotional state and physical state of the user.
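A minimal sketch of how the recommendation unit 640 might select intervention methods for a recognized emotional state. The state-to-intervention table, the function names, and the physical-state adjustment are hypothetical, not the disclosed implementation:

```python
# Hypothetical mapping from emotional state to candidate interventions,
# standing in for lookups against the emotional intervention knowledge base.
INTERVENTIONS = {
    "sad": ["cheerful music", "eye massage", "comedy video"],
    "anxious": ["quiet pictures", "aroma", "negative oxygen ions"],
}

def recommend(emotional_state, physical_state=None, k=2):
    """Select up to k intervention methods for the state; a physical
    condition (e.g. 'eye strain') can promote a matching method first."""
    candidates = list(INTERVENTIONS.get(emotional_state, []))
    if physical_state == "eye strain" and "eye massage" in candidates:
        candidates.remove("eye massage")
        candidates.insert(0, "eye massage")
    return candidates[:k]

recs = recommend("sad", physical_state="eye strain")
```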
  • FIG. 7 is a block diagram illustrating an emotional intervention device according to another embodiment of the present disclosure.
  • the emotional intervention device 70 includes a memory 710 and a processor 720 coupled to the memory 710.
  • the memory 710 is used to store instructions for executing the corresponding embodiment of the emotional intervention method.
  • the processor 720 is configured to execute the emotional intervention method in any of the embodiments of the present disclosure based on the instructions stored in the memory 710.
  • each step in the foregoing emotional intervention method may be implemented by a processor, and may be implemented in any manner of software, hardware, firmware, or a combination thereof.
  • embodiments of the present disclosure may also take the form of computer program products implemented on one or more non-volatile storage media containing computer program instructions. Therefore, an embodiment of the present disclosure also provides a computer-readable storage medium on which computer instructions are stored, which when executed by a processor implements the emotional intervention method in any of the foregoing embodiments.
  • An embodiment of the present disclosure also provides an emotional intervention system, including the emotional intervention device described in any one of the foregoing embodiments.
  • FIG. 8 is a block diagram illustrating an emotional intervention system according to an embodiment of the present disclosure.
  • the emotion intervention system 8 includes a controller 80, an emotion recognition subsystem 81 and an emotion intervention subsystem 82.
  • the controller 80 is configured to perform the emotional intervention method described in any of the foregoing embodiments.
  • the structure of the controller 80 may be similar to the aforementioned emotional intervention device 60 or 70.
  • the emotion recognition subsystem 81 is configured to recognize the user's emotions.
  • the emotion recognition subsystem 81 includes at least one of an image sensor 811, a sound sensor 812, a measuring device 813, and an input device 814.
  • the image sensor 811 is configured to acquire the first biometric information of the user.
  • the user's image is taken through a camera or the like.
  • the user's facial expression can be obtained as the user's first biometric information.
  • the image sensor 811 may also be configured to obtain the second biometric information of the user. For example, the height of the user may be calculated based on the user's whole-body image and the size of the reference object in the image as the user's second biometric information.
  • the sound sensor 812 is configured to acquire the first biometric information of the user. For example, the user's voice is sensed through a microphone or the like as the user's first biometric information.
  • the measuring device 813 is configured to acquire the second biometric information of the user. For example, the height of the user can be measured with a scale, and the weight of the user can be measured with a weight scale.
  • the input device 814 is configured to obtain the user's second biometric information.
  • the user's second biometric information can be obtained in an inquiry manner. That is, the user can input the second biometric information such as height, weight, and health status through text and the like. In some embodiments, the user can also directly input his own emotion perception through the input device. In some embodiments, the user's more detailed and accurate second biometric information can also be obtained by inputting the user's medical case and the like.
  • the emotion intervention subsystem 82 is configured to intervene in the user's emotions.
  • the emotional intervention subsystem 82 includes at least one of a display 821, a player 822, a physiotherapy device 823, and an ambient atmosphere adjustment device 824.
  • the display 821 and the player 822 are configured to output media data when the recommended emotional intervention method includes outputting media data.
  • the display 821 may also be configured to display data such as text or pictures input by the user.
  • the display includes a liquid crystal display or an OLED (organic light-emitting diode) display.
  • the display can be any product or component with display function, such as mobile phones, tablet computers, televisions, notebook computers, digital photo frames, navigators, projection screens, etc.
  • the display method can also be virtual reality (VR), augmented reality (AR), holographic projection, and so on.
  • the player 822 may also be configured to play voice input by the user.
  • the player is, for example, a speaker or headphones.
  • the physiotherapy device 823 is configured to perform physiotherapy on the user when the recommended emotional intervention method includes performing physiotherapy. Depending on the recommended physiotherapy method, such as massage, light therapy, magnetic therapy, etc., different physiotherapy equipment can be used.
  • the physiotherapy device can be activated by wireless Bluetooth connection or wired connection. In some embodiments, the physiotherapy device is a massage chair.
  • the environmental atmosphere adjustment device 824 is configured to provide a negative oxygen ion environment or a specified lighting environment in the case where the recommended emotional intervention method includes environmental atmosphere adjustment.
  • the environmental atmosphere adjusting device 824 can provide different negative oxygen ion environments or different lighting environment effects as needed.
  • the environmental atmosphere adjustment device 824 includes a negative oxygen ion generator and a negative oxygen ion controller, or a light generator (ie, light source) and a light controller.
  • the environmental atmosphere adjusting device 824 is configured to provide a floral fragrance, a grass fragrance, etc. capable of emotional intervention.
  • the environmental atmosphere adjustment device 824 includes various fragrance generators and fragrance controllers.
  • the environmental atmosphere adjustment device 824 further includes a light sensor, a temperature sensor, or a humidity sensor, etc., so as to adjust the light, temperature, or humidity of the environment as needed.
  • the environmental atmosphere adjusting device 824 may also include auxiliary devices such as timers, buzzers, etc., so as to periodically switch the corresponding devices as needed to achieve a desired environmental atmosphere.
  • the emotional intervention subsystem 82 may further include: a diet providing module 825 configured to provide a corresponding diet in the case where the recommended emotional intervention method includes the provision of diet, so as to stimulate the user's nervous system through taste to achieve the purpose of emotional intervention.
  • the diet providing module 825 may include a commodity storage machine, an online payment system, a supply channel, a wireless transceiver module, and the like.
  • the diet providing module 825 is, for example, a vending machine.
  • the emotional intervention subsystem 82 further includes: a psychological consultation module 826 configured to provide an online psychological consultation referral appointment service if the recommended emotional intervention method includes providing psychological consultation.
  • the psychological consultation module 826 is configured to evaluate the user's mental health status using big data modeling based on the user's emotional state, and to provide a referral appointment service for users at high risk of mental imbalance, in order to give emotional health management advice.
  • the emotional intervention subsystem 82 further includes: an emotional management course module 827, which is configured to provide online psychological management and other emotional management courses when the recommended emotional intervention method includes providing an emotional management course.
  • Course directions include parent-child courses, intimate relationship courses, emotional intelligence courses, adversity quotient courses, social courses, and other directions.
  • Course formats include video courses, e-book courses and other forms. Courses can be displayed and played on the display.
  • the emotional intervention system 8 may also include an information management subsystem 83.
  • the information management subsystem 83 includes: a user mobile terminal 831, a psychologist / doctor mobile terminal 832, an information management platform 833, a back-end maintenance platform 834, an information security platform 835, an evaluation platform 836, and the like.
  • the user mobile terminal 831 has a registration login module, an online appointment module, an online psychological evaluation module, an online course viewing module, an online social module, a consulting customer service module, and a personal information management module.
  • the registration and login module includes two functions: registration and login.
  • the online booking module includes the online booking experience shop function, online booking intervention package function, online booking psychological consultant function, online booking doctor function, online booking course function, etc.
  • the online psychological evaluation module includes an online evaluation function mainly based on the psychological scale and supplemented by other online evaluation methods.
  • the online course viewing module includes functions for viewing courses online, such as viewing course categories, course lists, course details, course playback, course search and other functions.
  • the online social module includes social functions such as online post bar communication and social groups.
  • Consulting customer service module includes the function of contacting and consulting with customer service.
  • the personal information management module includes functions such as managing personal information, member information, system messages, reservation information, message push, clearing the cache, and logging out.
  • the psychological counselor / doctor mobile terminal 832 includes a registration login-qualification verification module, an appointment information viewing module, a schedule information viewing module, a message notification module, and a system setting module.
  • the registered login-qualification verification module includes the registration function of the psychological consultant / doctor, the login function of the psychological consultant / doctor, and the qualification verification function of the psychological consultant / doctor.
  • the reservation information viewing module includes a function to view reservation details such as reservation person, reservation time, reservation place and the like.
  • the message notification module includes functions such as notification of new reservation information, notification of cancellation of reservation information, and timing notification of all reservation information.
  • the system setting module includes functions such as time setting, weekly cycle setting, time period setting, and account status adjustment.
  • the information management platform 833 includes a user management module, a psychological consultant / doctor management module, an appointment management module, a social management module, a course management module, a promotion management module, an experience store management module, a setting module, a payment module, an online customer service module, a message push module, etc.
  • the user management module includes functions such as user registration management, registered user statistics, member statistics, and member payment data statistics.
  • the user management module may also include a member administrator's management function for member information.
  • the consultant / doctor management module includes the consultant account and the introduction and maintenance of the consultant.
  • the appointment management module includes functions such as the member's appointment for emotional intervention packages and psychological consultation packages management.
  • the appointment management module can also include experience store appointment statistics, consultant appointment statistics, doctor appointment statistics, intervention package appointment statistics, and other functions.
  • the social management module includes functions such as social information management and maintenance, and is used to provide a communication platform between members and between members and managers.
  • For example, members of the platform can be grouped according to information such as the member's work unit, industry, and home address, which allows members to integrate more quickly into the circles of their industry and living area, and to learn about and solve problems in their industry and community.
  • the course management module includes functions such as course list, course details, and course release.
  • the promotion management module includes such functions as coupon type, coupon usage data statistics, coupon usage rules creation, coupon quota creation, coupon usage period creation, and coupon user selection.
  • the experience store management module includes functions such as experience store information maintenance and experience store administrator management.
  • the setting module includes functions such as carousel configuration, role management, administrator account and permission configuration, message push, and system log.
  • the payment module is used to pay for the costs of emotional healing packages, membership fees, beverage costs, course fees and psychological counseling fees.
  • the message pushing module is used to push messages such as WeChat.
  • the back-end maintenance platform 834 includes a picture music knowledge base update module, a course update module, an expert tagging module, and a maintenance login module.
  • the picture music knowledge base update module is used to update the picture knowledge base and music knowledge base in the healing knowledge base.
  • the course update module is used to update the content of the course.
  • the expert tagging module is used to tag pictures and music automatically searched from the network, and manually added pictures and music.
  • the maintenance login module is used for login-system maintenance, back-end database maintenance, and administrator-system maintenance, including functions such as picture and music knowledge base updates, course updates, administrator management and maintenance, member information maintenance, psychological consultant information management and maintenance, and doctor information management and maintenance.
  • the information security platform 835 is used to ensure the information security of the overall system.
  • the evaluation platform 836 is used to evaluate the service quality and the system, and includes a service evaluation and feedback system.
  • Service evaluation includes evaluation of three aspects: environmental experience, service experience, and system experience. The evaluated experience is fed back to the system in order to update the system.
  • the evaluation platform can also analyze the status of the user group based on the user's recorded frequency of use, time of last login, and total amount of consumption. For example, a customer model can be used to identify important value members, important development members, important retention members, important win-back members, general value members, general development members, general retention members, and general win-back members, and to give early warning of customer churn.
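The eight member categories above follow the classic RFM segmentation (recency of last login, frequency of use, monetary consumption). A sketch with illustrative thresholds, which are assumptions rather than values from the disclosure:

```python
def rfm_segment(days_since_login, uses_per_month, spend):
    """Classify a member by recency (R), frequency (F), and monetary (M)
    flags; the thresholds are illustrative, not from the disclosure."""
    r = days_since_login <= 30   # recently active
    f = uses_per_month >= 4      # frequent user
    m = spend >= 100             # high spender
    names = {
        (True, True, True): "important value member",
        (True, False, True): "important development member",
        (False, True, True): "important retention member",
        (False, False, True): "important win-back member",
        (True, True, False): "general value member",
        (True, False, False): "general development member",
        (False, True, False): "general retention member",
        (False, False, False): "general win-back member",
    }
    return names[(r, f, m)]

segment = rfm_segment(days_since_login=7, uses_per_month=8, spend=300)
```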
  • the emotional intervention system may be installed in various places, for example, in an open, semi-open, or closed environment in clinics, shopping malls, and other occasions.
  • it can be installed in a closed room with a door, so as to further control the temperature of the environment and the content of negative oxygen ions, etc., to form a healing cabin capable of emotional intervention.
  • the healing cabin 9 includes an emotion recognition subsystem.
  • the emotion recognition subsystem is located in the black box 90.
  • the structure of the emotion recognition subsystem is similar to the emotion recognition subsystem 81 of FIG. 8, for example, an image sensor is provided, which is used to take a user image.
  • the black box 90 is provided with a plurality of cameras 911. As mentioned above, analyzing the facial features of the user's image can obtain the user's facial expressions, thereby identifying the user's emotional state.
  • the healing cabin 9 also includes an emotional intervention subsystem.
  • the structure of the emotional intervention subsystem is similar to the emotional intervention subsystem 82 of FIG. 8, and includes, for example, the display 921, the player 922, the massage chair 923, and environmental atmosphere adjustment equipment such as the light source 9241, the negative oxygen ion generator 9242, and the aroma generator 9243.
  • the healing cabin 9 also includes an information management subsystem.
  • the structure of the information management subsystem is similar to the information management subsystem 83 of FIG. 8 and includes, for example, a user mobile terminal such as a tablet computer 931.
  • the black box 90 may also include some components of the emotional intervention subsystem and the information management subsystem.
  • environmental atmosphere adjusting devices such as timers, aroma controllers, etc. may be provided in the black box 90.
  • the back-end maintenance platform modules, such as the picture music knowledge base update module, the course update module, and the expert tagging module, can be set in the black box.
  • the controller for controlling the emotional intervention system can also be set in the black box.
  • the black box 90 is provided with various interfaces to connect with external devices.
  • the interface is, for example, a USB interface, an HDMI interface, an audio interface 901, a power interface 902, and the like.
  • the aforementioned display 921, player 922, massage chair 923, light source 9241, negative oxygen ion generator 9242, aroma generator 9243, tablet computer 931, etc. can be wirelessly connected to the black box 90.
  • the black box 90 may be provided with corresponding wireless transceiver modules, such as a Bluetooth module, a WIFI module, and a 3G / 4G / 5G module.
  • the above connection may also be a wired method.
  • the black box 90 is further provided with buttons such as an on-off key 903, a volume key 904, and a restart key 905, so that the user can perform operations as needed.
  • the top surface of the black box 90 may also be provided with a touch screen for user operation.
  • In the healing cabin, the user's face image is collected in real time through the camera to identify the user's emotional state. For negative emotions that are not conducive to the user's health, hearing (such as music), vision (such as pictures), touch (such as physiotherapy), smell (such as aroma), taste (such as drinks), and environment (such as negative oxygen ions) are combined in a "six in one" way to intervene in the user's negative emotions.
  • The healing cabin can be used by psychological sub-health groups of all ages, has the functions of emotion recognition and emotional intervention, and can be used in various scenarios such as communities, families, and business districts.
  • FIG. 9B shows a schematic structural diagram of a massage chair according to an embodiment of the present disclosure.
  • the massage chair 923 is equipped with an image sensor 911, such as a camera, for capturing the user's image.
  • the massage chair 923 is also equipped with a player 922, such as a stereo or headphones.
  • the massage chair 923 may also be equipped with a sound sensor, such as a microphone, for sensing the user's voice.
  • the massage chair can also be provided with measuring equipment, such as a scale for measuring height.
  • the massage chair 923 is further provided with an adjustment switch 923S, which is used to adjust the angle of the massage chair, the massage strength of the massage chair, and the stretching strength.
  • FIG. 10 is a block diagram showing a computer system for implementing one embodiment of the present disclosure.
  • the computer system can be expressed in the form of a general-purpose computing device.
  • the computer system includes a memory 1010, a processor 1020, and a bus 1000 connecting different system components.
  • the memory 1010 may include, for example, a system memory, a non-volatile storage medium, and the like.
  • the system memory stores, for example, an operating system, application programs, a boot loader (Boot Loader), and other programs.
  • System memory may include volatile storage media, such as random access memory (RAM) and / or cache memory.
  • the non-volatile storage medium stores, for example, instructions to execute the corresponding embodiments of the display method.
  • Non-volatile storage media include, but are not limited to, disk storage, optical storage, flash memory, and so on.
  • the processor 1020 may be implemented by means of discrete hardware components, such as a central processing unit (CPU), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, discrete gates, or transistors.
  • each module, such as the judgment module and the determination module, can be implemented by a central processing unit (CPU) executing instructions of the corresponding steps stored in memory, or by a dedicated circuit that executes the corresponding steps.
  • the bus 1000 can use any of various bus structures.
  • the bus structure includes but is not limited to an industry standard architecture (ISA) bus, a micro channel architecture (MCA) bus, and a peripheral component interconnect (PCI) bus.
  • The computer system may further include an input/output interface 1030, a network interface 1040, a storage interface 1050, and the like. These interfaces 1030, 1040, and 1050, the memory 1010, and the processor 1020 may be connected via the bus 1000.
  • The input/output interface 1030 provides a connection interface for input/output devices such as a display, a mouse, and a keyboard.
  • The network interface 1040 provides a connection interface for various networked devices.
  • The storage interface 1050 provides a connection interface for external storage devices such as floppy disks, USB flash drives, and SD cards.


Abstract

本公开涉及一种情绪干预方法、装置和系统,以及计算机可读存储介质和疗愈小屋。情绪干预方法包括:根据用户的第一生物特征信息,识别所述用户的情绪状态;推荐与所述情绪状态对应的至少一种情绪干预方式。

Description

情绪干预方法、装置和系统,以及计算机可读存储介质和疗愈小屋
相关申请的交叉引用
本申请是以CN申请号为201811303279.5,申请日为2018年11月2日的申请为基础,并主张其优先权,该CN申请的公开内容在此作为整体引入本申请中。
技术领域
本公开涉及计算机技术领域,特别涉及一种情绪干预方法、装置和系统,以及计算机可读存储介质和疗愈小屋。
背景技术
人类情感在人们的社会交往中发挥着重要作用。随着人机交互技术的快速发展,情感计算成为人工智能的重要研究领域之一。
发明内容
根据本公开实施例的第一方面,提供了一种情绪干预方法,包括:根据用户的第一生物特征信息,识别所述用户的情绪状态;推荐与所述情绪状态对应的至少一种情绪干预方式。
在一些实施例中,根据所述用户的第二生物特征信息,推荐与所述情绪状态对应的至少一种情绪干预方式。
在一些实施例中,推荐与所述情绪状态对应的至少一种情绪干预方式包括:根据所述用户的第二生物特征信息,识别所述用户的身体状态;根据所述用户的身体状态,推荐与所述情绪状态对应的至少一种情绪干预方式。
在一些实施例中,所述情绪干预方式包括输出媒体数据、调节环境氛围、提供饮食、提供心理咨询、提供情绪管理课程和进行理疗中的至少一种。
在一些实施例中,识别所述用户的情绪状态包括:实时获取所述用户的第一生物特征信息;根据实时获取的所述第一生物特征信息,确定所述用户的实时情绪状态;统计所述用户的各实时情绪状态在单位时间内的占比;将占比最大的实时情绪状态识别为所述用户在该单位时间内的情绪状态。
在一些实施例中,获取所述用户的第一生物特征信息包括:获取所述用户的图像;从所述图像中识别所述用户的脸部;根据所述脸部的特征,识别所述用户的脸部表情;将识别的脸部表情作为所述第一生物特征信息。
在一些实施例中,推荐与所述情绪状态对应的至少一种情绪干预方式包括:根据所述用户的情绪状态,获取对应的情绪干预数据,所述干预数据包括理疗建议和媒体数据中的至少一种;基于获取的情绪干预数据,推荐与所述情绪状态对应的至少一种情绪干预方式。
在一些实施例中,所述情绪干预方法还包括:对获取的情绪干预数据进行标注;删除与情绪干预目标不匹配的情绪干预数据;利用剩下的情绪干预数据构建情绪干预知识库。
在一些实施例中,通过文本相似度匹配算法来获取所述情绪干预数据。
在一些实施例中,通过文本相似度匹配算法来获取所述情绪干预数据包括:获取与情绪干预目标对应的关键词字典,所述关键词字典包括w个关键词,w为正整数;比较所述关键词字典与待比对文本之间的文本相似度;将文本相似度超过相似阈值的文本所对应的媒体数据,确定为所述情绪干预数据。
在一些实施例中,比较所述关键词字典与待比对文本之间的文本相似度包括:对所述关键词字典中的关键词、待比对文本中的关键词分别进行权值标记,所述权值反映关键词的重要程度,所述关键词字典中的关键词具有n种权值,n为正整数;将所述关键词字典和待比对文本中权值相同的关键词进行与操作,得到n个关键词集合,n个关键词集合中共包括a个关键词,a为整数;计算a与w的比值,得到待比对文本与所述关键词字典的文本相似度。
在一些实施例中,利用所述关键词字典中的关键词进行搜索,得到待比对文本。
在一些实施例中,所述第一生物特征信息包括脸部表情、声音中的至少一种;所述第二生物特征信息包括身高、体重、健康状况中的至少一种。
在一些实施例中,所述情绪干预方法还包括:确定所述用户是否选择推荐的情绪干预方式;在所述用户选择推荐的情绪干预方式的情况下,启动相应的情绪干预方式。
在一些实施例中,所述情绪干预方法还包括:根据图片背景色的色调和/或图片中包括的物体,确定图片的内容符合程度;利用图片的内容符合程度大于等于第一阈值的图片,构建图片知识库,作为情绪干预知识库。
在一些实施例中,所述情绪干预方法还包括:从图片的描述性文本中搜索到与关键词字典A匹配的关键词,其中,关键词字典A包括a₀个关键词,a₀为正整数,匹配到关键词字典A中的关键词构成关键词字典A₁;通过对关键词字典A中的关键词进行相似词扩充,构建关键词字典B;从图片的描述性文本中搜索到与关键词字典B匹配的关键词,搜索到的匹配到关键词字典B中的关键词构成关键词字典A₂;从图片的描述性文本中,利用语意分析方法,搜索与关键词字典B中的关键词语意相近的句子,搜索到的语意相近的句子匹配到关键词字典B中的关键词构成关键词字典A₃;将关键词字典A₁、A₂和A₃进行合并,构成关键词字典C,关键词字典C中与关键词字典A匹配的关键词个数为c,c为正整数;根据a₀和c计算关键词匹配度;利用关键词匹配度大于等于第二阈值的图片,构建图片知识库,作为情绪干预知识库。
在一些实施例中,所述情绪干预方法还包括:将图片的内容符合程度和关键词匹配度的较大值,确定为图片的符合程度;利用符合程度大于等于第三阈值的图片,构建图片知识库,作为情绪干预知识库。
根据本公开实施例的第二方面,提供了一种情绪干预装置,包括:识别单元,被配置为根据用户的第一生物特征信息,识别所述用户的情绪状态;推荐单元,被配置为推荐与所述情绪状态对应的至少一种情绪干预方式。
根据本公开实施例的第三方面,提供了一种情绪干预装置,包括:存储器;和耦接至所述存储器的处理器,所述处理器被配置为基于存储在所述存储器中的指令,执行如前述任一实施例所述的情绪干预方法。
根据本公开实施例的第四方面,提供了一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现如前述任一实施例所述的情绪干预方法。
根据本公开实施例的第五方面,提供了一种情绪干预系统,包括前述任一实施例的情绪干预装置。
在一些实施例中,所述情绪干预系统还包括理疗设备、环境氛围调节设备、显示器、播放器、饮食提供模块、心理咨询模块、情绪管理课程模块中的至少一种,其中:所述理疗设备被配置为在推荐的情绪干预方式中包括进行理疗的情况下,对用户进行理疗;所述环境氛围调节设备被配置为在推荐的情绪干预方式中包括进行环境氛围调节的情况下,进行环境氛围调节;所述显示器、所述播放器被配置为在推荐的情绪干预方式包括输出媒体数据的情况下,输出媒体数据;所述饮食提供模块被配置为在推荐的情绪干预方式中包括提供饮食的情况下,提供相应的饮食,以便从味觉上刺激所述用户的神经系统;所述心理咨询模块被配置为在推荐的情绪干预方式中包括提供心理咨询的情况下,提供在线心理咨询转诊预约服务;所述情绪管理课程模块被配置为在推荐的情绪干预方式中包括提供情绪管理课程的情况下,提供在线心理管理等情绪管理课程。
在一些实施例中,所述情绪干预系统还包括图像传感器、声音传感器、测量设备、输入设备中的至少一种,其中:所述图像传感器、所述声音传感器被配置为获取用户的第一生物特征信息;所述测量设备、所述输入设备被配置为获取用户的第二生物特征信息。
在一些实施例中,所述理疗设备包括按摩椅。
根据本公开实施例的第六方面,提供了一种疗愈小屋,包括前述任一实施例的情绪干预系统。
通过以下参照附图对本公开的示例性实施例的详细描述,本公开的其它特征及其优点将会变得清楚。
附图说明
构成说明书的一部分的附图描述了本公开的实施例,并且连同说明书一起用于解释本公开的原理。
参照附图,根据下面的详细描述,可以更加清楚地理解本公开,其中:
图1A是示出根据本公开一个实施例的情绪干预方法的流程图;
图1B是示出根据本公开另一个实施例的情绪干预方法的流程图;
图2是示出根据本公开一个实施例的情绪识别方法的流程图;
图3是示出根据本公开一个实施例的文本相似度匹配算法的流程图;
图4是示出根据本公开一个实施例的文本相似度比较方法的流程图;
图5A是示出根据本公开一个实施例的情绪干预知识库构建方法的流程图;
图5B是示出根据本公开一个实施例的图片知识库构建方法的流程图;
图5C是示出根据本公开另一个实施例的图片知识库构建方法的流程图;
图5D是示出根据本公开又一个实施例的图片知识库构建方法的流程图;
图6是示出根据本公开一个实施例的情绪干预装置的框图;
图7是示出根据本公开另一个实施例的情绪干预装置的框图;
图8是示出根据本公开一个实施例的情绪干预系统的框图;
图9A是示出本公开一个实施例的疗愈小屋的结构示意图;
图9B是示出根据本公开一个实施例的按摩椅的结构示意图;
图10是示出用于实现本公开一个实施例的计算机系统的框图。
应当明白,附图中所示出的各个部分的尺寸并不是按照实际的比例关系绘制的。此外,相同或类似的参考标号表示相同或类似的构件。
具体实施方式
现在将参照附图来详细描述本公开的各种示例性实施例。对示例性实施例的描述仅仅是说明性的,决不作为对本公开及其应用或使用的任何限制。本公开可以以许多不同的形式实现,不限于这里所述的实施例。提供这些实施例是为了使本公开透彻且完整,并且向本领域技术人员充分表达本公开的范围。应注意到:除非另外具体说明,否则在这些实施例中阐述的部件和步骤的相对布置应被解释为仅仅是示例性的,而不是作为限制。
本公开使用的所有术语(包括技术术语或者科学术语)与本公开所属领域的普通技术人员理解的含义相同,除非另外特别定义。还应当理解,在诸如通用字典中定义的术语应当被解释为具有与它们在相关技术的上下文中的含义相一致的含义,而不应用理想化或极度形式化的意义来解释,除非这里明确地这样定义。
对于相关领域普通技术人员已知的技术、方法和设备可能不作详细讨论,但在适当情况下,所述技术、方法和设备应当被视为说明书的一部分。
本公开提出一种基于情绪识别进行情绪干预的方案。
图1A是示出根据本公开一个实施例的情绪干预方法的流程图。如图1A所示,情绪干预方法包括步骤S2和S4。
在步骤S2,根据用户的第一生物特征信息,识别所述用户的情绪状态。
第一生物特征信息包括例如脸部表情、声音等能够反映情绪状态的信息。情绪状态包括但不限于:中性、高兴、悲伤、生气、轻蔑、厌恶、吃惊、害怕。在一些实施例中,可以将情绪状态分为中性、积极、和消极三类情绪。高兴可以对应于积极情绪。悲伤、生气、轻蔑、厌恶、吃惊、害怕可以对应于消极情绪。
图2示出根据本公开一个实施例的情绪识别方法的框图。如图2所示,情绪识别方法包括步骤S21-S24。
在步骤S21,实时获取用户的第一生物特征信息。
在一些实施例中,根据摄像头等图像传感器拍摄的用户的图像,获取第一生物特征信息。例如,从用户的图像中识别脸部,并基于脸部特征与表情的相关性,识别用户的脸部表情,作为第一生物特征信息。可以等时间间隔周期性地获取用户图像。
当然,可以对获取的用户图像进行压缩编码,以减小图像尺寸,从而方便存储。还可以记录获取用户图像的时间,从而可以将同一时间获取的多个角度的图像进行关联存储。
在另一些实施例中,也可以通过麦克风等声音传感器来感测用户的声音,作为第一生物特征信息。
在步骤S22,根据实时获取的第一生物特征信息,确定用户的实时情绪状态。
基于脸部表情、声音等生物特征信息与情绪的相关性,可以根据实时获取的信息确定用户的实时情绪状态。例如,可以通过预先构建的情绪识别模型,将不同的生物特征分类到相应的情绪状态,来建立生物特征与情绪状态的相关性。
在步骤S23,统计用户的各实时情绪状态在单位时间内的占比。
在一些实施例中,根据各实时情绪状态在单位时间的出现次数,统计用户的各实时情绪状态在该单位时间内的占比。例如,以2秒为一个单位时间,可能出现的实时情绪状态有8种,将不同的实时情绪状态在单位时间内的出现次数分别记为nᵢ(1≤i≤8),则实时情绪状态的总出现次数记为N = n₁ + n₂ + … + n₈,各实时情绪状态的占比分别为Pᵢ = nᵢ/N。
设在2秒的单位时间内,仅出现生气和平静2种实时情绪状态。以每0.2秒获取1次信息为例,统计结果可以为:在2秒内生气出现7次,平静出现3次,则有n₁=7,n₂=3,N=10,P₁=0.7,P₂=0.3。
在步骤S24,将占比最大的实时情绪状态识别为用户在该单位时间内的情绪状态。在上述示例中,将占比为0.7的生气识别为用户在该2秒的单位时间内的情绪状态。
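上述步骤S21-S24的统计逻辑可以用如下Python代码作一个示意(dominant_emotion等名称为示意用的假设,并非本公开限定的实现):

```python
from collections import Counter

def dominant_emotion(samples):
    """统计单位时间内各实时情绪状态的占比,返回占比最大的情绪状态及其占比。"""
    counts = Counter(samples)              # 各实时情绪状态的出现次数 n_i
    emotion, n = counts.most_common(1)[0]  # 出现次数最多的实时情绪状态
    return emotion, n / sum(counts.values())  # 占比 P_i = n_i / N

# 示例:2秒单位时间内每0.2秒采样一次,共10个实时情绪状态
samples = ["生气"] * 7 + ["平静"] * 3
print(dominant_emotion(samples))  # ('生气', 0.7)
```

该示意与正文示例一致:生气占比0.7,被识别为该单位时间内的情绪状态。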
在一些实施例中,可以根据识别的情绪状态,生成用户的情绪报告。
下面返回图1A描述在识别出用户的情绪状态后,如何推荐对应的情绪干预方式。
在步骤S4,推荐与所述用户的情绪状态对应的至少一种情绪干预方式。
在一些实施例中,根据用户的情绪状态,获取对应的情绪干预数据;基于获取的情绪干预数据,推荐与情绪状态对应的至少一种情绪干预方式。
情绪干预方式可以包括:进行理疗、输出媒体数据、调节环境氛围、提供饮食、提供心理咨询、提供情绪管理课程中的至少一种。相应地,情绪干预数据可以包括对应数据中的至少一种。
理疗可以涉及不同部位,例如眼部理疗、头部理疗、肩部理疗、颈部理疗、腿部理疗。理疗的方式例如包括按摩、光疗、磁疗等。媒体数据包括能够对用户进行情绪干预的图片、音频和视频等。情绪干预数据从形式上可以包括图片、文字、音频、视频等。
环境氛围包括灯光、负氧离子、气味等。心理咨询例如为根据用户的情绪状态,利用大数据建模的方式对用户的心理健康状态进行测评,并对测评结果为精神失衡危险性高的用户,提供心理自行转诊预约服务,以便给出情绪健康管理办法。情绪管理课程例如包括亲子课程、亲密关系课程、情商课程、逆商课程、社交课程等。
图1B是示出根据本公开另一个实施例的情绪干预方法的流程图。图1B与图1A的不同之处在于,还包括步骤S6和S8。下面将仅描述图1B与图1A的不同之处,相同之处不再赘述。
如图1B所示,在步骤S6中,确定用户是否选择推荐的情绪干预方式。
在步骤S8中,在用户选择推荐的情绪干预方式的情况下,启动相应的情绪干预方式。
例如,在用户选择输出媒体数据的干预方式的情况下,根据用户选择图片或音乐启动相应的显示器或播放器,并随机推送相应的干预图片或音乐。
又例如,在用户选择进行理疗的情况下,启动理疗设备,并提示用户根据理疗建议进行理疗。
在用户选择负氧离子干预的情况下,启动负氧离子发生器,并设置定时器,以便根据需要及时关闭负氧离子发生器。而在用户选择香气干预的情况下,启动香气发生器。当然,也可以同时设置定时器,以便在预定的时间关闭香气发生器。
在一些实施例中,通过文本相似度匹配算法来获取情绪干预数据。下面将结合图3来描述根据本公开一些实施例的文本相似度匹配算法。图3示出根据本公开一个实施例的文本相似度匹配算法的流程图。如图3所示,文本相似度匹配算法包括步骤 S41-S43。
在步骤S41,获取与情绪干预目标对应的关键词字典。
情绪干预目标为例如缓解用户的情绪。以用户的情绪被识别为生气为例,情绪干预目标即缓解生气情绪。与缓解生气情绪对应的关键词字典包括w个关键词,w为正整数。在关键词为例如舒缓、平静、欢快时,w为3。
在一些实施例中,也可以对这些关键词进行相似词扩充。例如,采用语意相似度匹配算法搜索欢快的相似词,得到欢乐、愉快,则扩充后的关键词字典包括舒缓、平静、欢快、欢乐、愉快。
在步骤S42,比较所述关键词字典与待比对文本之间的文本相似度。
待比对文本是一些词语的集合,例如为图片、音乐、视频等的文字说明或文章。例如,待比对文本包括“这首音乐欢快、活泼,令人愉快”。可以从网络上直接爬取待比对文本。也可以利用关键词字典中的关键词进行搜索,得到待比对文本。
为了便于计算机实现比较,可以利用稀疏算法将关键词字典与待比对文本转化为二进制编码,例如“01000001”。编码的长度及数值取决于具体的稀疏算法。
图4示出根据本公开一个实施例的文本相似度比较方法的流程图。如图4所示,文本相似度比较方法包括步骤S421-S423。
在步骤S421,对关键词字典中的关键词、待比对文本中的关键词分别进行权值标记。权值反映关键词的重要程度。关键词字典中的关键词具有n种权值,n为正整数。
仍以关键词字典包括舒缓、平静、欢快、欢乐、愉快等关键词为例,可将舒缓、平静的权值均标记为4,而将欢快、欢乐、愉快的权值均标记为3,则关键词字典中具有2种权值,即n=2。类似地,在包括“这首音乐欢快、活泼,令人愉快”的待比对文本中,这首、音乐、欢快、活泼、令人、愉快等关键词的权值可分别标记为1、2、3、3、1、3。
在步骤S422,将所述关键词字典和待比对文本中权值相同的关键词进行与操作,得到n个关键词集合。n个关键词集合中共包括a个关键词,a为整数。
在一些实施例中,在待比对文本中搜索与关键词字典中权值相同的关键词。在上述的示例中,搜索到的权值相同的关键词有欢快、活泼、令人愉快。将这些权值相同的关键词进行与操作,可以得到1个包括欢快、愉快的关键词集合,即n=1。
由于前面对关键词字典进行了相似词的扩充,相应地,在步骤S422中得到的关键词集合中可删除相似词。即可以从关键词集合中删除愉快,由此得到a=1。
在步骤S423,计算a与w的比值,得到待比对文本与所述关键词字典的文本相似度。
根据前面的分析,可以计算出a/w=1/3。在一些实施例中,可以将a与w的比值作为待比对文本与关键词字典的文本相似度。不同的关键词字典有不同的w值。若关键词字典中不包括舒缓,则可得到w=2。相应地,可以得到待比对文本与关键词字典的文本相似度为1/2。
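步骤S421-S423的计算过程可以用如下Python代码作一个示意(text_similarity、synonym_group等名称为示意用的假设;相似词按同义词组去重,对应正文中删除相似词的处理):

```python
def text_similarity(dict_weights, text_weights, synonym_group, w):
    """按权值相同的关键词做与操作,去除相似词后计算文本相似度 a/w。"""
    # 步骤S422:同时出现在关键词字典与待比对文本中、且权值相同的关键词
    matched = {k for k, wt in dict_weights.items() if text_weights.get(k) == wt}
    # 相似词扩充得到的关键词按同义词组去重,得到 a 个关键词
    a = len({synonym_group.get(k, k) for k in matched})
    # 步骤S423:文本相似度为 a 与 w 的比值
    return a / w

# 示例:与正文一致,欢乐、愉快为欢快的相似词,基础关键词字典的 w=3
dict_weights = {"舒缓": 4, "平静": 4, "欢快": 3, "欢乐": 3, "愉快": 3}
text_weights = {"这首": 1, "音乐": 2, "欢快": 3, "活泼": 3, "令人": 1, "愉快": 3}
synonym_group = {"欢乐": "欢快", "愉快": "欢快"}
print(text_similarity(dict_weights, text_weights, synonym_group, w=3))  # 1/3
```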
下面返回图3描述在得到用户的待比对文本与关键词字典的文本相似度后,如何确定情绪干预数据。
在步骤S43,将文本相似度超过阈值的文本所对应的媒体数据,确定为所述情绪干预数据。
根据实际情况可以设置不同的阈值。设阈值为45%,对于上面的示例,对w=2的情况,得到的相似度为50%,则可将待比对文本对应的媒体数据(如音乐)确定为情绪干预数据;而对w=3的情况,得到的相似度约为33%,则不将待比对文本对应的媒体数据确定为情绪干预数据。
下面结合具体示例来描述在获取了情绪干预数据后,如何推荐情绪干预方式。
以用户的情绪状态被识别为悲伤为例,可以有多种情绪干预方式。例如,可以向用户推荐播放欢快的音乐或积极的图片来调动用户的情绪。也可以根据情绪状态与理疗方式的相关性,来推荐进行相应的理疗。例如,按摩相关穴位,可有效缓解悲伤情绪。当然,也可以同时推荐多种情绪干预方式,以提升干预效果。
在推荐情绪干预方式时,还可以考虑用户的第二生物特征信息。第二生物特征信息包括身高、体重、健康状况等反映身体状态的信息。第二生物特征信息可以通过问询等方式获取。对于身高、体重等信息,也可以通过相应的测量来获取。
在一些实施例中,根据用户的第二生物特征信息,识别用户的身体状态;根据用户的身体状态,推荐与情绪状态对应的至少一种情绪干预方式。这样推荐的情绪干预方式兼顾了用户的身体状态,能够更有效地干预用户的情绪。
仍以用户的情绪状态被识别为悲伤为例,可以向用户推荐欢快的音乐或积极的图片来调动用户的情绪。但是,在用户的眼睛不适的情况下,推荐欢快的音乐就比推荐积极的图片更容易被用户接受。因此,考虑用户的身体状态来推荐情绪干预方式,将使得情绪干预更有效。
在一些实施例中,根据用户的身体状态,推荐相应的理疗方式。仍以用户的情绪状态被识别为悲伤、但眼睛不适的情况为例,可以向用户推荐例如眼部按摩的眼部理疗。这是因为用户的悲伤情绪有可能由眼部不适引起,缓解了眼部不适就可以有效减轻用户的悲伤情绪。
如前所述,也可以同时推荐多种干预方式来干预用户的情绪。例如,在向用户推荐眼部按摩的同时,推荐欢快的音乐。一方面,缓解了眼部不适就可以有效减轻用户的悲伤情绪;另一方面欢快的音乐本身也可以缓解用户的悲伤情绪。这样,情绪干预更加有效。
在获取了情绪干预数据后,除了进行情绪干预方式的推荐,还可以利用获取的情绪干预数据构建情绪干预知识库。
图5A示出根据本公开一个实施例的情绪干预知识库构建方法的流程图。如图5A所示,情绪干预知识库构建方法包括步骤S51-S53。
在步骤S51,对获取的情绪干预数据进行标注。例如,在识别用户的情绪为生气时,情绪干预目标可为稳定情绪,例如可以搜索宁静的图片、文字、音频或视频,相应搜索的数据可标注为稳定情绪。当然,也可以对输入的情绪干预数据进行标注。
在步骤S52,删除与情绪干预目标不匹配的情绪干预数据。
获取的情绪干预数据中可能包括与情绪干预目标不匹配的数据,例如,这些数据无法稳定情绪。因此,可以删除这些与情绪干预目标不匹配的数据。
在步骤S53,利用剩下的情绪干预数据构建情绪干预知识库。情绪干预知识库包括但不限于:图片知识库、音乐知识库、视频知识库、理疗方式知识库。
在一些实施例中,在识别用户的情绪状态后,直接在构建好的情绪干预知识库中搜索情绪干预数据。基于这样的情绪干预数据,推荐情绪的干预方式更高效。
下面以情绪干预数据为图片为例,结合图5B来描述根据本公开一些实施例的基于图片的内容搜索构建图片知识库的方法。基于图片的内容搜索构建图片知识库包括:根据图片背景色的色调和/或图片中所包括的物体,确定图片的内容符合程度;利用图片的内容符合程度大于等于阈值的图片,构建图片知识库。
图5B示出根据本公开一个实施例的图片知识库构建方法的流程图。
在步骤512中,根据图片背景色的色调,确定图片的内容符合程度。
可以根据图片背景色的分类方法,确定图片的背景色属于暖色调、中性色调还是冷色调。例如,暖色调的颜色包括红、橙、黄等;中性色调的颜色包括紫、黑、灰、白等;冷色调的颜色包括绿、青、蓝等。
在一些实施例中,根据上述背景色的分类进行分别统计,统计暖色调的颜色像素个数为n₁、中性色调的颜色像素个数为n₂、冷色调的颜色像素个数为n₃;然后,根据如下公式(1)-(3)分别计算暖色调的颜色像素个数、中性色调的颜色像素个数、冷色调的颜色像素个数占总像素个数的百分比:
暖色调的颜色像素个数的百分比 color₁ = n₁/(n₁+n₂+n₃),(1),
中性色调的颜色像素个数的百分比 color₂ = n₂/(n₁+n₂+n₃),(2),
冷色调的颜色像素个数的百分比 color₃ = n₃/(n₁+n₂+n₃),(3)。
一般而言,图片背景色属于暖色调或中性色调,则图片的内容可能是积极的;反之,则图片的内容可能是消极的。即,可以根据图片背景色的色调来确定图片的内容是积极的概率,即确定图片的内容符合程度。例如,可以用color₁、color₂、color₃的加权和来反映图片的内容符合程度。
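公式(1)-(3)的统计可以用如下Python代码作一个示意(暖色、中性色的颜色集合为示意用的简化假设):

```python
WARM = {"红", "橙", "黄"}           # 暖色调
NEUTRAL = {"紫", "黑", "灰", "白"}  # 中性色调;其余(绿、青、蓝等)视为冷色调

def tone_ratios(pixel_colors):
    """计算暖、中性、冷色调像素个数占总像素个数的百分比 color1、color2、color3。"""
    n1 = sum(c in WARM for c in pixel_colors)
    n2 = sum(c in NEUTRAL for c in pixel_colors)
    n3 = len(pixel_colors) - n1 - n2
    total = n1 + n2 + n3
    return n1 / total, n2 / total, n3 / total

pixels = ["红"] * 6 + ["白"] * 3 + ["蓝"]
print(tone_ratios(pixels))  # (0.6, 0.3, 0.1)
```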
然后,在步骤518中,利用图片的内容符合程度大于等于阈值的图片,构建图片知识库。即,在图片阈值符合程度大于等于阈值的情况下,将相应的图片放入图片知识库中。
在另一些实施例中,也可以根据图片中对应积极情绪、中性情绪、消极情绪的物体的个数来确定图片的内容是积极内容、中性内容,还是消极内容。
例如,积极情绪的图片中包括反映获胜、娱乐旅游、美丽的风光、花草树木、可爱的动物、消遣、名车、钞票、金银、运动场景、快乐的表情、亲情、友情、爱情等的物体;中性情绪的图片包括反映日常用品、生活工作场景、建筑物、交通工具、饮食、几何图形、面无表情的人等的物体;消极情绪的图片包括反映事故、自然灾害、物体(如建筑物)损毁、各种垃圾、鬼怪、昆虫、医疗、尸体、环境污染、哭泣、伤残、血淋、军事场面、暴力冲突、武器凶器等的物体。
在步骤514中,根据图片中包括的物体,确定图片的内容符合程度。
例如,可以根据上述图片积极内容、中性内容、消极内容的分类,统计图片中对应积极情绪的物体个数为m₁,对应中性情绪的物体个数为m₂,对应消极情绪的物体个数为m₃。然后,可以根据如下公式(4)-(6)计算图片中积极情绪物体个数、中性情绪物体个数和消极情绪物体个数占总物体个数的百分比:
积极情绪物体个数的百分比 mod₁ = m₁/(m₁+m₂+m₃),(4),
中性情绪物体个数的百分比 mod₂ = m₂/(m₁+m₂+m₃),(5),
消极情绪物体个数的百分比 mod₃ = m₃/(m₁+m₂+m₃),(6)。
一般而言,图片中包括积极物体或中性物体,则图片的内容可能是积极的;反之,则图片的内容可能是消极的。即,可以根据图片中所包括的物体来确定图片的内容是积极的概率,即确定图片的内容符合程度。例如,可以用mod₁、mod₂、mod₃的加权和来反映图片的内容符合程度。
然后,在步骤518中,利用图片的内容符合程度大于等于阈值的图片,构建图片知识库。即,在图片的内容符合程度大于等于阈值的情况下,将相应的图片放入图片知识库中。
应当理解,可以仅根据步骤512或514来确定图片的内容符合程度,即仅执行步骤512和518,或仅执行步骤514和518。当然,为了更准确地确定图片的内容符合程度,也可以综合步骤512和514的结果来确定图片的内容符合程度,即执行步骤516。
在步骤516中,根据图片背景色的色调和图片中包括的物体,综合确定图片的内容符合程度。
例如,根据如下公式(7)来计算图片的内容符合程度:
f₁ = γ₁(α₁·mod₁ + α₂·mod₂ + α₃·mod₃) + γ₂(β₁·color₁ + β₂·color₂ + β₃·color₃),(7),
其中,α₁、α₂、α₃、β₁、β₂、β₃、γ₁、γ₂的值可以根据实际需要来设置。例如,α₁取1,α₂取0.5,α₃取-1;β₁取1,β₂取0.5,β₃取-1;γ₁取0.5,γ₂取0.5。
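公式(7)的加权求和可以用如下Python代码作一个示意(参数默认值取正文给出的示例取值,content_score为示意用的函数名):

```python
def content_score(mods, colors,
                  alpha=(1, 0.5, -1), beta=(1, 0.5, -1), gamma=(0.5, 0.5)):
    """按公式(7)计算图片的内容符合程度 f1。

    mods:   (mod1, mod2, mod3),积极/中性/消极情绪物体个数的百分比
    colors: (color1, color2, color3),暖/中性/冷色调像素个数的百分比
    """
    obj_term = sum(a * m for a, m in zip(alpha, mods))     # α1·mod1 + α2·mod2 + α3·mod3
    color_term = sum(b * c for b, c in zip(beta, colors))  # β1·color1 + β2·color2 + β3·color3
    return gamma[0] * obj_term + gamma[1] * color_term

f1 = content_score(mods=(0.6, 0.3, 0.1), colors=(0.6, 0.3, 0.1))
print(round(f1, 4))  # 0.65
```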
然后,在步骤518中,利用图片的内容符合程度大于等于阈值的图片,构建图片知识库。即,在图片的内容符合程度大于等于阈值的情况下,将相应的图片放入图片知识库中。
在一些实施例中,为了使得构建的图片知识库中的图片至少具有期望的分辨率,也可以提前筛选图片。例如,在步骤510中,判断图片的分辨率x是否大于阈值y(即期望的分辨率)。仅在图片的分辨率x大于y的情况下,才执行后续的步骤。
下面仍以情绪干预数据为图片为例,结合图5C来描述根据本公开一些实施例的基于文字的自然语言处理方法构建图片知识库的方法。图5C示出根据本公开另一个实施例的图片知识库构建方法的流程图。
在步骤521中,从图片的描述性文本中搜索到与关键词字典A匹配的关键词,构成关键词字典A₁。关键词字典A可以基于文字的自然语言处理方法构建。关键词字典A可以是提前构建好的,也可以在执行步骤521之前构建。关键词字典A包括a₀个关键词,a₀为正整数。图片的描述性文本匹配到关键词字典A中的关键词个数为a₁,构成关键词字典A₁,a₁为正整数。匹配例如包括关键词完全相同、语意相同、或语义相似等。
在步骤S522中,通过对关键词字典A中的关键词进行相似词扩充,构建关键词字典B。
在步骤S523中,从图片的描述性文本中搜索到与关键词字典B匹配的关键词,构成关键词字典A₂。搜索到的匹配到关键词字典B中的关键词个数为a₂,构成关键词字典A₂,a₂为正整数。
在步骤S524中,从图片的描述性文本中,利用语意分析方法,搜索与关键词字典B中的关键词语意相近的句子,构成关键词字典A₃。搜索到的语意相近的句子匹配到关键词字典B中的关键词个数为a₃,构成关键词字典A₃,a₃为正整数。
在步骤525中,将关键词字典A₁、A₂和A₃进行合并,构成关键词字典C,关键词字典C中与关键词字典A匹配的关键词个数为c,c为正整数。例如,可以将关键词字典A₁、A₂和A₃中的关键词进行与操作,来实现关键词字典的合并。
在步骤526中,根据a₀和c计算关键词匹配度。例如,计算关键词匹配度f₂ = c/a₀。
在步骤527中,利用关键词匹配度大于等于阈值的图片,构建图片知识库,作为情绪干预知识库。即,类似于步骤518,在图片的关键词匹配度大于等于阈值的情况下,将相应的图片放入图片知识库中。
为了更准确地确定图片的内容符合程度,还可以综合图5B和5C的结果来构建图片知识库。下面结合图5D来描述根据本公开一些实施例的基于图片的内容搜索,结合基于文字的自然语言处理方法,构建图片知识库的方法。图5D示出根据本公开又一个实施例的图片知识库构建方法的流程图。
图5D利用图5B中步骤516和图5C中步骤526的结果来构建图片知识库。如图5D所示,在步骤512、514或516获得了图片的内容符合程度f₁、并在步骤526获得了图片的关键词匹配度f₂之后,可以在步骤530中利用公式(8)将图片的内容符合程度和图片的关键词匹配度中的较大值,确定为图片的符合程度:
f = max(f₁, f₂),(8)。
在步骤532中,利用符合程度大于等于阈值z的图片,构建图片知识库。即,在图片符合程度f大于等于z的情况下,将相应的图片放入图片知识库中。反之,则舍弃该图片。在一些实施例中,在放入图片知识库中之前,还可以进行人工校准。
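公式(8)及步骤532的阈值判断可以用如下Python代码作一个示意(picture_score、accept_picture为示意用的函数名):

```python
def picture_score(f1, f2):
    """公式(8):取内容符合程度 f1 与关键词匹配度 f2 中的较大值作为图片符合程度 f。"""
    return max(f1, f2)

def accept_picture(f1, f2, z):
    """符合程度 f 大于等于阈值 z 时,将图片放入图片知识库;反之舍弃。"""
    return picture_score(f1, f2) >= z

print(accept_picture(0.3, 0.6, 0.5))  # True:f = 0.6 >= 0.5,入库
print(accept_picture(0.3, 0.4, 0.5))  # False:f = 0.4 < 0.5,舍弃
```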
在上述实施例中,通过基于图片的内容搜索,结合基于文字的自然语言处理方法,构建图片知识库,可以提高搜索到的图片与情绪干预目标相符合的准确率。
图6是示出根据本公开一个实施例的情绪干预装置的框图。如图6所示,情绪干预装置60包括:识别单元620和推荐单元640。
识别单元620被配置为执行例如图1A所示的步骤S2。如前所述,可以通过图像传感器来获取用户的脸部表情,再根据脸部表情与情绪状态的相关性,来识别用户的情绪状态。
推荐单元640被配置为执行例如图1A所示的步骤S4。如前所述,可以基于用户的情绪状态来获取对应的情绪干预数据,再从中选择与情绪状态对应的至少一种情绪干预方式。在一些实施例中,综合考虑用户的情绪状态和身体状态来推荐情绪干预方式。
图7是示出根据本公开另一个实施例的情绪干预装置的框图。
如图7所示,情绪干预装置70包括:存储器710以及耦接至该存储器710的处理器720。存储器710用于存储执行情绪干预方法对应实施例的指令。处理器720被配置为基于存储在存储器710中的指令,执行本公开中任意一些实施例中的情绪干预方法。
应当理解,前述情绪干预方法中的各个步骤都可以通过处理器来实现,并且可以软件、硬件、固件或其结合的任一种方式实现。
除了情绪干预方法、装置之外,本公开实施例还可采用在一个或多个包含有计算机程序指令的非易失性存储介质上实施的计算机程序产品的形式。因此,本公开实施例还提供一种计算机可读存储介质,其上存储有计算机指令,该指令被处理器执行时实现前述任意实施例中的情绪干预方法。
本公开实施例还提供一种情绪干预系统,包括前述任一实施例所述的情绪干预装置。
图8是示出根据本公开一个实施例的情绪干预系统的框图。
如图8所示,情绪干预系统8包括控制器80、情绪识别子系统81和情绪干预子系统82。控制器80被配置为执行前述任一实施例所述的情绪干预方法。控制器80的结构可以类似于前述的情绪干预装置60或70。
情绪识别子系统81被配置为识别用户的情绪。在一些实施例中,情绪识别子系统81包括:图像传感器811、声音传感器812、测量设备813、输入设备814中的至少一个。
图像传感器811被配置为获取用户的第一生物特征信息。例如,通过相机等拍摄用户的图像。如前所述,对用户的图像进行脸部特征的分析,可获取用户的脸部表情,作为用户的第一生物特征信息。
在一些实施例中,图像传感器811也可以被配置为获取用户的第二生物特征信息。例如,可以根据用户的全身图像、该图像中的参照物的尺寸来计算用户的身高,作为用户的第二生物特征信息。
声音传感器812被配置为获取用户的第一生物特征信息。例如,通过麦克风等感测用户的声音,作为用户的第一生物特征信息。
测量设备813被配置为获取用户的第二生物特征信息。例如,可以通过刻度尺来测量用户的身高、通过体重秤测量用户的体重。
输入设备814被配置为获取用户的第二生物特征信息。如前所述,可以问询方式获取用户的第二生物特征信息。即,用户可以通过文字等方式输入身高、体重、健康状况等第二生物特征信息。在一些实施例中,用户也可以通过输入设备直接输入对自身的情绪感知。在一些实施例中,还可以通过输入用户的医疗病例等获取用户更为详尽、准确的第二生物特征信息。
情绪干预子系统82被配置为对用户的情绪进行干预。在一些实施例中,情绪干预子系统82包括显示器821、播放器822、理疗设备823、环境氛围调节设备824中的至少一种。
显示器821、播放器822被配置为在推荐的情绪干预方式包括输出媒体数据的情况下,输出媒体数据。
显示器821也可以被配置为显示用户输入的文字或图片等数据。在一些实施例中,显示器包括液晶显示器或OLED(Organic Light-Emitting Diode,有机发光二极管)显示器。显示器可以为:手机、平板电脑、电视机、笔记本电脑、数码相框、导航仪、投影荧幕等任何具有显示功能的产品或部件。显示的方式除了可以是常规的二维显示之外,还可以为虚拟现实(VR)、增强现实(AR)、全息投影等。
播放器822除了播放情绪干预用的音频之外,也可以被配置为播放用户输入的语音。播放器例如为音箱或耳机。
理疗设备823被配置为在推荐的情绪干预方式中包括进行理疗的情况下,对用户进行理疗。根据推荐的理疗方式的不同,例如按摩、光疗、磁疗等,可以采用不同的理疗设备。可以通过无线蓝牙连接的方式或有线连接的方式启动理疗设备。在一些实施例中,理疗设备是按摩椅。
环境氛围调节设备824被配置为在推荐的情绪干预方式中包括进行环境氛围调节的情况下,提供负氧离子环境或指定的灯光环境。环境氛围调节设备824可以根据需要提供不同的负氧离子环境或不同灯光环境效果。例如,环境氛围调节设备824包括负氧离子发生器和负氧离子控制器,或灯光产生器(即光源)和灯光控制器。
在另一些实施例中,环境氛围调节设备824被配置为提供花香、草香等能够进行情绪干预的设备。例如,环境氛围调节设备824包括各种香味发生器和香味控制器。
在又一些实施例中,环境氛围调节设备824还包括光传感器、温度传感器或湿度传感器等,以便根据需要调节环境的光照、温度或湿度。
在一些实施例中,环境氛围调节设备824还可以包括定时器、声鸣器等辅助设备,以便根据需要定时开关相应的设备,从而实现期望的环境氛围。
在一些实施例中,情绪干预子系统82还可以包括:饮食提供模块825,被配置为在推荐的情绪干预方式中包括提供饮食的情况下,提供相应的饮食,以便从味觉上刺激用户的神经系统,达到情绪干预的目的。饮食提供模块825可以包括商品储藏机、在线支付系统、供货通道、无线收发模块等。饮食提供模块825例如为自动售卖机。
在一些实施例中,情绪干预子系统82还包括:心理咨询模块826,被配置为在推荐的情绪干预方式中包括提供心理咨询的情况下,提供在线心理咨询转诊预约服务。心理咨询模块826被配置为根据用户的情绪状态,利用大数据建模的方式对用户的心理健康状态进行测评,并对测评结果为精神失衡危险性高的用户,提供心理自行转诊预约服务,以便给出情绪健康管理办法。
在另一些实施例中,情绪干预子系统82还包括:情绪管理课程模块827,被配置为在推荐的情绪干预方式中包括提供情绪管理课程的情况下,提供在线心理管理等情绪管理课程。课程方向包括亲子课程、亲密关系课程、情商课程、逆商课程、社交课程等方向。课程形式包括视频课程、电子书课程等形式。课程可以通过显示器来显示和播放。
在一些实施例中,情绪干预系统8还可以包括信息管理子系统83。在一些实施例中,信息管理子系统83包括:用户移动端831、心理咨询师/医生移动端832、信息管理平台833、后端维护平台834、信息安全平台835、评估平台836等。
用户移动端831具有注册登录模块、在线预约模块、在线心理测评模块、在线课程查看模块、在线社交模块、咨询客服模块、个人信息管理模块。
注册登录模块包括注册和登录两个功能。在线预约模块包括在线预约体验店功能、在线预约干预套餐功能、在线预约心理咨询师功能、在线预约医生功能、在线预约课程功能等。在线心理测评模块包括以心理量表为主、其它在线测评方式为辅的在线测评功能。在线课程查看模块包括在线观看课程的功能,例如查看课程分类、课程列表、课程详情、课程播放、课程搜索等功能。在线社交模块包括以在线贴吧交流、社交群为主的社交功能。咨询客服模块包括与客服进行联系、咨询的功能。个人信息管理模块包括管理个人信息、会员信息、系统消息、预约信息、消息推送、清除缓存、退出登录等功能。
心理咨询师/医生移动端832包括注册登录-资质验证模块、预约信息查看模块、日程信息查看模块、消息通知模块、系统设置模块。
注册登录-资质验证模块包括心理咨询师/医生的注册功能、心理咨询师/医生的登录功能、心理咨询师/医生的资质验证功能。预约信息查看模块包括查看例如预约人、预约时间、预约地点等预约详情的功能。消息通知模块包括新预约信息的通知、取消预约信息的通知、全部预约信息的定时通知等功能。系统设置模块包括可预约时间的设置、每周循环设置、时间段设置、调整账户状态等功能。
信息管理平台833包括用户管理模块、心理咨询师/医生管理模块、预约管理模块、社交管理模块、课程管理模块、促销管理模块、体验店管理模块、设置模块、支付模块、在线客服模块、消息推送模块等。
用户管理模块包括用户的注册登录管理、注册用户统计、会员统计、会员支付数据统计等功能。用户管理模块还可以包括会员管理者对会员信息的管理功能。咨询师/医生管理模块包括咨询师账户、咨询师介绍维护等功能。
预约管理模块包括会员预约情绪干预套餐和心理咨询套餐管理等功能。预约管理模块还可以包括体验店预约统计、咨询师预约统计、医生预约统计、干预套餐预约统计等功能。
社交管理模块包括社交信息管理与维护等功能,用于提供会员与会员之间、会员和管理人员之间的交流平台。该平台能够根据会员工作单位、工作的行业和家庭住址等信息对会员进行分组,可以让会员更快地融入到工作行业和生活区的圈子中,更多地了解工作行业和生活社区的信息,解决问题。
课程管理模块包括课程列表、课程详情、课程发布等功能。促销管理模块包括优惠券类型、券使用数据统计、优惠券使用规则的新建、优惠券额度的新建、优惠券使用期限的新建、优惠券用户的选择等功能。体验店管理模块包括体验店信息维护、体验店管理员管理等功能。
设置模块包括轮播图配置、角色管理、管理员账户与权限配置、消息推送、系统日志等功能。支付模块用于支付情绪疗愈套餐的费用、会员费用、饮品费用、课程费用和心理咨询费用等。消息推送模块例如用于推送微信等消息。
后端维护平台834包括图片音乐知识库更新模块、课程更新模块、专家标注模块和维护登录模块。
图片音乐知识库更新模块用于更新疗愈知识库中的图片知识库和音乐知识库。课程更新模块用于更新课程的内容。专家标注模块用于对从网络中自动搜索的图片和音乐、手动添加的图片和音乐进行标注。维护登录模块用于登录系统的维护、后端数据库的维护、管理员系统的维护,包括图片音乐知识库更新、课程更新、管理员管理维护、会员信息维护、心理咨询师信息管理维护、医生信息管理维护等功能。信息安全平台用于保证整体系统的信息安全。
评估平台836用于评估服务质量和系统,包括服务评估与反馈系统两部分。服务评估包括对环境体验感、服务体验感、系统体验感这三个方面的评估。评估的体验感反馈到系统中,以便更新系统。评估平台还可以根据记录的用户的使用频次、最近登录的时间、消费的总金额,分析用户群体状况。例如,可以利用客户模型,分析重要价值会员、重要发展会员、重要保持会员、重要挽留会员、一般价值会员、一般发展会员、一般保持会员、一般挽留会员,并进行客户流失预警。
可以在多种场所安装本公开实施例提供的情绪干预系统,例如安装于诊所、商场等场合的开放式、半开放式或封闭式环境中。例如可以安装于带有门的封闭式房间内,以便进一步的控制环境的温度、负氧离子含量等,即形成一个能够进行情绪干预的疗愈小屋。
图9A示出本公开一个实施例的疗愈小屋的结构示意图。在一些实施例中,疗愈小屋9包括情绪识别子系统。如图9A所示,情绪识别子系统位于黑盒90中。情绪识别子系统的结构与图8的情绪识别子系统81类似,例如设有图像传感器,用于拍摄用户图像。如图9A所示,黑盒90设有多个摄像头911。如前所述,对用户的图像进行脸部特征的分析,可获取用户的脸部表情,从而识别用户的情绪状态。
如图9A所示,疗愈小屋9还包括情绪干预子系统。情绪干预子系统的结构与图8 的情绪干预子系统82类似,例如包括显示器921、播放器922、按摩椅923,以及光源9241、负氧离子发生器9242、香气发生器9243等环境氛围调节设备。
疗愈小屋9还包括信息管理子系统。信息管理子系统的结构与图8的信息管理子系统83类似,例如包括pad平板电脑931等用户移动端。
黑盒90中还可以包括情绪干预子系统和信息管理子系统的一些组成部分。例如,定时器、香气控制器等环境氛围调节设备可以设置在黑盒90中。又例如,图片音乐知识库更新模块、课程更新模块、专家标注模块等后端维护平台可以设置在黑盒中。当然,用于控制情绪干预系统的控制器也可以设置在黑盒中。
如图9A所示,黑盒90上设有各种接口,以便与外部的设备进行连接。接口例如为USB接口、HDMI接口、音频接口901、电源接口902等。前述的显示器921、播放器922、按摩椅923,以及光源9241、负氧离子发生器9242、香气发生器9243、pad平板电脑931等可以无线连接到黑盒90。相应地,黑盒90上可以设置有相应的无线收发模块,例如蓝牙模块、WIFI模块、3G/4G/5G模块。当然,上述连接也可以是有线方式。
在一些实施例中,如图9A所示,黑盒90上还设置有开关键903、音量键904和重启键905等按键,以便用户根据需要进行操作。黑盒90的顶面上也可以设置有触摸屏,以便用户操作。
在一些实施例的疗愈小屋中,通过摄像头实时采集用户的人脸图像,识别出用户的情绪状态,并针对不利于用户健康状态的消极情绪,采用声觉(如音乐)、视觉(如图片)、触觉(如理疗)、嗅觉(如香气)、味觉(如饮品)和环境(如负氧离子)“六位一体”的方式对用户的消极情绪进行干预。该疗愈小屋可以面向各年龄段的心理亚健康人群,具有情绪识别、情绪干预等作用,可用于社区、家庭和商业圈等各种场景。
图9B示出本公开一个实施例的按摩椅的结构示意图。
如图9B所示,按摩椅923上搭载了图像传感器911,例如摄像头。按摩椅923上还搭载了播放器922,例如音响或耳机。在一些实施例中,按摩椅上可以搭载声音传感器,例如麦克风。用户也可以通过麦克风输入语音信息。按摩椅上还可以设置测量设备,例如用来测量身高的刻度尺。
如图9B所示,按摩椅923上还设置有调节开关923S,用于调节按摩椅的角度、按摩椅的按摩力度、拉伸力度等。
图10是示出用于实现本公开一个实施例的计算机系统的框图。
如图10所示,计算机系统可以以通用计算设备的形式表现。计算机系统包括存储器1010、处理器1020和连接不同系统组件的总线1000。
存储器1010例如可以包括系统存储器、非易失性存储介质等。系统存储器例如存储有操作系统、应用程序、引导装载程序(Boot Loader)以及其他程序等。系统存储器可以包括易失性存储介质,例如随机存取存储器(RAM)和/或高速缓存存储器。非易失性存储介质例如存储有执行显示方法的对应实施例的指令。非易失性存储介质包括但不限于磁盘存储器、光学存储器、闪存等。
处理器1020可以用中央处理器(CPU)、数字信号处理器(DSP)、应用专用集成电路(ASIC)、现场可编程门阵列(FPGA)或其它可编程逻辑设备、分立门或晶体管等分立硬件组件方式来实现。相应地,诸如判断模块和确定模块的每个模块,可以通过中央处理器(CPU)运行存储器中执行相应步骤的指令来实现,也可以通过执行相应步骤的专用电路来实现。
总线1000可以使用多种总线结构中的任意总线结构。例如,总线结构包括但不限于工业标准体系结构(ISA)总线、微通道体系结构(MCA)总线、外围组件互连(PCI)总线。
计算机系统还可以包括输入输出接口1030、网络接口1040、存储接口1050等。这些接口1030、1040、1050以及存储器1010和处理器1020之间可以通过总线1000连接。输入输出接口1030可以为显示器、鼠标、键盘等输入输出设备提供连接接口。网络接口1040为各种联网设备提供连接接口。存储接口1050为软盘、U盘、SD卡等外部存储设备提供连接接口。
至此,已经详细描述了本公开的各种实施例。为了避免遮蔽本公开的构思,没有描述本领域所公知的一些细节。本领域技术人员根据上面的描述,完全可以明白如何实施这里公开的技术方案。
虽然已经通过示例对本公开的一些特定实施例进行了详细说明,但是本领域的技术人员应该理解,以上示例仅是为了进行说明,而不是为了限制本公开的范围。本领域的技术人员应该理解,可在不脱离本公开的范围和精神的情况下,对以上实施例进行修改或者对部分技术特征进行等同替换。本公开的范围由所附权利要求来限定。

Claims (25)

  1. 一种情绪干预方法,包括:
    根据用户的第一生物特征信息,识别所述用户的情绪状态;
    推荐与所述情绪状态对应的至少一种情绪干预方式。
  2. 根据权利要求1所述的情绪干预方法,其中,根据所述用户的第二生物特征信息,推荐与所述情绪状态对应的至少一种情绪干预方式。
  3. 根据权利要求1或2所述的情绪干预方法,其中,推荐与所述情绪状态对应的至少一种情绪干预方式包括:
    根据所述用户的第二生物特征信息,识别所述用户的身体状态;
    根据所述用户的身体状态,推荐与所述情绪状态对应的至少一种情绪干预方式。
  4. 根据权利要求1至3中任一项所述的情绪干预方法,其中,所述情绪干预方式包括输出媒体数据、调节环境氛围、提供饮食、提供心理咨询、提供情绪管理课程和进行理疗中的至少一种。
  5. 根据权利要求1至4中任一项所述的情绪干预方法,其中,识别所述用户的情绪状态包括:
    实时获取所述用户的第一生物特征信息;
    根据实时获取的所述第一生物特征信息,确定所述用户的实时情绪状态;
    统计所述用户的各实时情绪状态在单位时间内的占比;
    将占比最大的实时情绪状态识别为所述用户在该单位时间内的情绪状态。
  6. 根据权利要求1至5中任一项所述的情绪干预方法,其中,获取所述用户的第一生物特征信息包括:
    获取所述用户的图像;
    从所述图像中识别所述用户的脸部;
    根据所述脸部的特征,识别所述用户的脸部表情;
    将识别的脸部表情作为所述第一生物特征信息。
  7. 根据权利要求1至6中任一项所述的情绪干预方法,其中,推荐与所述情绪状态对应的至少一种情绪干预方式包括:
    根据所述用户的情绪状态,获取对应的情绪干预数据,所述干预数据包括理疗建议和媒体数据中的至少一种;
    基于获取的情绪干预数据,推荐与所述情绪状态对应的至少一种情绪干预方式。
  8. 根据权利要求1至7中任一项所述的情绪干预方法,还包括:
    对获取的情绪干预数据进行标注;
    删除与情绪干预目标不匹配的情绪干预数据;
    利用剩下的情绪干预数据构建情绪干预知识库。
  9. 根据权利要求1至8中任一项所述的情绪干预方法,其中,通过文本相似度匹配算法来获取所述情绪干预数据。
  10. 根据权利要求1至9中任一项所述的情绪干预方法,其中,通过文本相似度匹配算法来获取所述情绪干预数据包括:
    获取与情绪干预目标对应的关键词字典,所述关键词字典包括w个关键词,w为正整数;
    比较所述关键词字典与待比对文本之间的文本相似度;
    将文本相似度超过相似阈值的文本所对应的媒体数据,确定为所述情绪干预数据。
  11. 根据权利要求1至10中任一项所述的情绪干预方法,其中,比较所述关键词字典与待比对文本之间的文本相似度包括:
    对所述关键词字典中的关键词、待比对文本中的关键词分别进行权值标记,所述权值反映关键词的重要程度,所述关键词字典中的关键词具有n种权值,n为正整数;
    将所述关键词字典和待比对文本中权值相同的关键词进行与操作,得到n个关键词集合,n个关键词集合中共包括a个关键词,a为整数;
    计算a与w的比值,得到待比对文本与所述关键词字典的文本相似度。
  12. 根据权利要求1至11中任一项所述的情绪干预方法,其中,利用所述关键词字典中的关键词进行搜索,得到待比对文本。
  13. 根据权利要求1至12中任一项所述的情绪干预方法,其中:
    所述第一生物特征信息包括脸部表情、声音中的至少一种;
    所述第二生物特征信息包括身高、体重、健康状况中的至少一种。
  14. 根据权利要求1至13中任一项所述的情绪干预方法,还包括:
    确定所述用户是否选择推荐的情绪干预方式;
    在所述用户选择推荐的情绪干预方式的情况下,启动相应的情绪干预方式。
  15. 根据权利要求1至14中任一项所述的情绪干预方法,还包括:
    根据图片背景色的色调和/或图片中包括的物体,确定图片的内容符合程度;
    利用图片的内容符合程度大于等于第一阈值的图片,构建图片知识库,作为情绪干预知识库。
  16. 根据权利要求1至15中任一项所述的情绪干预方法,还包括:
    从图片的描述性文本中搜索到与关键词字典A匹配的关键词,其中,关键词字典A包括a₀个关键词,a₀为正整数,匹配到关键词字典A中的关键词构成关键词字典A₁;
    通过对关键词字典A中的关键词进行相似词扩充,构建关键词字典B;
    从图片的描述性文本中搜索到与关键词字典B匹配的关键词,搜索到的匹配到关键词字典B中的关键词构成关键词字典A₂;
    从图片的描述性文本中,利用语意分析方法,搜索与关键词字典B中的关键词语意相近的句子,搜索到的语意相近的句子匹配到关键词字典B中的关键词构成关键词字典A₃;
    将关键词字典A₁、A₂和A₃进行合并,构成关键词字典C,关键词字典C中与关键词字典A匹配的关键词个数为c,c为正整数;
    根据a₀和c计算关键词匹配度;
    利用关键词匹配度大于等于第二阈值的图片,构建图片知识库,作为情绪干预知识库。
  17. 根据权利要求1至16中任一项所述的情绪干预方法,还包括:
    将图片的内容符合程度和关键词匹配度的较大值,确定为图片的符合程度;
    利用符合程度大于等于第三阈值的图片,构建图片知识库,作为情绪干预知识库。
  18. 一种情绪干预装置,包括:
    识别单元,被配置为根据用户的第一生物特征信息,识别所述用户的情绪状态;
    推荐单元,被配置为推荐与所述情绪状态对应的至少一种情绪干预方式。
  19. 一种情绪干预装置,包括:
    存储器;和
    耦接至所述存储器的处理器,所述处理器被配置为基于存储在所述存储器中的指令,执行如权利要求1至17中任一项所述的情绪干预方法。
  20. 一种计算机可读存储介质,其上存储有计算机程序,该程序被处理器执行时实现如权利要求1至17中任一项所述的情绪干预方法。
  21. 一种情绪干预系统,包括:如权利要求19或20所述的情绪干预装置。
  22. 根据权利要求21所述的情绪干预系统,还包括理疗设备、环境氛围调节设备、显示器、播放器、饮食提供模块、心理咨询模块、情绪管理课程模块中的至少一种,其中:
    所述理疗设备被配置为在推荐的情绪干预方式中包括进行理疗的情况下,对用户进行理疗;
    所述环境氛围调节设备被配置为在推荐的情绪干预方式中包括进行环境氛围调节的情况下,进行环境氛围调节;
    所述显示器、所述播放器被配置为在推荐的情绪干预方式包括输出媒体数据的情况下,输出媒体数据;
    所述饮食提供模块被配置为在推荐的情绪干预方式中包括提供饮食的情况下,提供相应的饮食,以便从味觉上刺激所述用户的神经系统;
    所述心理咨询模块被配置为在推荐的情绪干预方式中包括提供心理咨询的情况下,提供在线心理咨询转诊预约服务;
    所述情绪管理课程模块被配置为在推荐的情绪干预方式中包括提供情绪管理课程的情况下,提供在线心理管理等情绪管理课程。
  23. 根据权利要求21或22所述的情绪干预系统,还包括图像传感器、声音传感器、测量设备、输入设备中的至少一种,其中:
    所述图像传感器、所述声音传感器被配置为获取用户的第一生物特征信息;
    所述测量设备、所述输入设备被配置为获取用户的第二生物特征信息。
  24. 根据权利要求21至23中任一项所述的情绪干预系统,其中,所述理疗设备包括按摩椅。
  25. 一种疗愈小屋,包括:如权利要求21至24中任一项所述的情绪干预系统。
PCT/CN2019/104911 2018-11-02 2019-09-09 情绪干预方法、装置和系统,以及计算机可读存储介质和疗愈小屋 WO2020088102A1 (zh)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/758,743 US11617526B2 (en) 2018-11-02 2019-09-09 Emotion intervention method, device and system, and computer-readable storage medium and healing room

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201811303279.5A CN111145871A (zh) 2018-11-02 2018-11-02 情绪干预方法、装置和系统,以及计算机可读存储介质
CN201811303279.5 2018-11-02

Publications (1)

Publication Number Publication Date
WO2020088102A1 true WO2020088102A1 (zh) 2020-05-07

Family

ID=70464548

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2019/104911 WO2020088102A1 (zh) 2018-11-02 2019-09-09 情绪干预方法、装置和系统,以及计算机可读存储介质和疗愈小屋

Country Status (3)

Country Link
US (1) US11617526B2 (zh)
CN (1) CN111145871A (zh)
WO (1) WO2020088102A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148907A (zh) * 2020-10-23 2020-12-29 北京百度网讯科技有限公司 图像数据库的更新方法、装置、电子设备和介质
WO2022117776A1 (fr) 2020-12-04 2022-06-09 Socialdream Dispositif immersif

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20210035968A (ko) * 2019-09-24 2021-04-02 엘지전자 주식회사 사용자의 표정이나 발화를 고려하여 마사지 동작을 제어하는 인공 지능 마사지 장치 및 그 방법
US11657071B1 (en) * 2020-03-13 2023-05-23 Wells Fargo Bank N.A. Mapping disparate datasets
CN113925376B (zh) * 2021-11-15 2022-11-15 杭州赛孝健康科技有限公司 一种智能坐便椅
CN115054248B (zh) * 2021-12-10 2023-10-20 荣耀终端有限公司 情绪监测方法和情绪监测装置
CN117338298B (zh) * 2023-12-05 2024-03-12 北京超数时代科技有限公司 情绪干预方法、装置、可穿戴情绪干预设备及存储介质

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101370195A (zh) * 2007-08-16 2009-02-18 英华达(上海)电子有限公司 移动终端中实现情绪调节的方法及装置
US20100325078A1 (en) * 2009-06-22 2010-12-23 Lee Ho-Sub Device and method for recognizing emotion and intention of a user
CN102467668A (zh) * 2010-11-16 2012-05-23 鸿富锦精密工业(深圳)有限公司 情绪侦测及舒缓系统及方法
CN103164691A (zh) * 2012-09-20 2013-06-19 深圳市金立通信设备有限公司 基于手机用户的情绪识别系统及方法
CN105536118A (zh) * 2016-02-19 2016-05-04 京东方光科技有限公司 一种情绪调节装置、可穿戴设备和缓解情绪的帽子

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6752772B2 (en) * 2002-04-03 2004-06-22 Rocky Kahn Manipulation device with dynamic intensity control
GB0809563D0 (en) * 2008-05-28 2008-07-02 Health Smart Ltd A behaviour modification system
US9098114B2 (en) * 2009-09-03 2015-08-04 Human Touch, Llc Comprehensive user control system for therapeutic wellness devices
US20170095192A1 (en) * 2010-06-07 2017-04-06 Affectiva, Inc. Mental state analysis using web servers
TW201220216A (en) 2010-11-15 2012-05-16 Hon Hai Prec Ind Co Ltd System and method for detecting human emotion and appeasing human emotion
KR101560995B1 (ko) 2012-03-28 2015-10-15 미쓰비시덴키 가부시키가이샤 철도 차량 시스템
WO2014085910A1 (en) * 2012-12-04 2014-06-12 Interaxon Inc. System and method for enhancing content using brain-state data
US9141604B2 (en) * 2013-02-22 2015-09-22 Riaex Inc Human emotion assessment reporting technology—system and method
CN103390060A (zh) * 2013-07-30 2013-11-13 百度在线网络技术(北京)有限公司 基于移动终端的歌曲推荐方法与装置
EP2857276B1 (en) * 2013-08-20 2018-12-12 Harman International Industries, Incorporated Driver assistance system
WO2015025049A1 (en) * 2013-08-23 2015-02-26 Sicpa Holding Sa Method and system for authenticating a device
US20150169832A1 (en) * 2013-12-18 2015-06-18 Lenovo (Singapore) Pte, Ltd. Systems and methods to determine user emotions and moods based on acceleration data and biometric data
US20160063874A1 (en) * 2014-08-28 2016-03-03 Microsoft Corporation Emotionally intelligent systems
US9610510B2 (en) * 2015-07-21 2017-04-04 Disney Enterprises, Inc. Sensing and managing vehicle behavior based on occupant awareness
US10965975B2 (en) * 2015-08-31 2021-03-30 Orcam Technologies Ltd. Systems and methods for recognizing faces using non-facial information
US10872354B2 (en) * 2015-09-04 2020-12-22 Robin S Slomkowski System and method for personalized preference optimization
CN105354184B (zh) * 2015-10-28 2018-04-20 甘肃智呈网络科技有限公司 一种使用优化的向量空间模型实现文档自动分类的方法
US20180032126A1 (en) * 2016-08-01 2018-02-01 Yadong Liu Method and system for measuring emotional state
US20180063064A1 (en) * 2016-08-29 2018-03-01 International Business Machines Corporation Modifying a mood through selective feeding of content
US10192171B2 (en) * 2016-12-16 2019-01-29 Autonomous Fusion, Inc. Method and system using machine learning to determine an automotive driver's emotional state
JP6866715B2 (ja) * 2017-03-22 2021-04-28 カシオ計算機株式会社 情報処理装置、感情認識方法、及び、プログラム
CN107220293B (zh) * 2017-04-26 2020-08-18 天津大学 基于情绪的文本分类方法
CN107424019A (zh) * 2017-08-15 2017-12-01 京东方科技集团股份有限公司 基于情绪识别的艺术品推荐方法、装置、介质和电子设备
US10397350B2 (en) * 2017-09-26 2019-08-27 Disney Enterprises, Inc. Tracking wearables or other devices for emoji stories
CN108198607A (zh) * 2018-03-07 2018-06-22 美的集团股份有限公司 一种食材推荐的方法、设备及计算机存储介质
US20200085673A1 (en) * 2018-09-19 2020-03-19 Golden Gm Holdings Sdn. Bhd. Method and system for customized operation of a therapeutic device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101370195A (zh) * 2007-08-16 2009-02-18 英华达(上海)电子有限公司 移动终端中实现情绪调节的方法及装置
US20100325078A1 (en) * 2009-06-22 2010-12-23 Lee Ho-Sub Device and method for recognizing emotion and intention of a user
CN102467668A (zh) * 2010-11-16 2012-05-23 鸿富锦精密工业(深圳)有限公司 情绪侦测及舒缓系统及方法
CN103164691A (zh) * 2012-09-20 2013-06-19 深圳市金立通信设备有限公司 基于手机用户的情绪识别系统及方法
CN105536118A (zh) * 2016-02-19 2016-05-04 京东方光科技有限公司 一种情绪调节装置、可穿戴设备和缓解情绪的帽子

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112148907A (zh) * 2020-10-23 2020-12-29 北京百度网讯科技有限公司 图像数据库的更新方法、装置、电子设备和介质
WO2022117776A1 (fr) 2020-12-04 2022-06-09 Socialdream Dispositif immersif
FR3117221A1 (fr) * 2020-12-04 2022-06-10 Socialdream dispositif immersif

Also Published As

Publication number Publication date
US11617526B2 (en) 2023-04-04
US20210219891A1 (en) 2021-07-22
CN111145871A (zh) 2020-05-12

Similar Documents

Publication Publication Date Title
WO2020088102A1 (zh) 情绪干预方法、装置和系统,以及计算机可读存储介质和疗愈小屋
US20210192819A1 (en) Intelligent interactive and augmented reality cloud platform
JP6320143B2 (ja) 健康情報サービスシステム
US20190236361A1 (en) Systems and methods for using persistent, passive, electronic information capturing devices
CN109564706B (zh) 基于智能交互式增强现实的用户交互平台
Tomkins et al. What and where are the primary affects? Some evidence for a theory
Greene et al. The briefest of glances: The time course of natural scene understanding
US10579866B2 (en) Method and system for enhancing user engagement during wellness program interaction
US20120164613A1 (en) Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US20090119154A1 (en) Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
US20090118593A1 (en) Determining a demographic characteristic based on computational user-health testing of a user interaction with advertiser-specified content
CN107392124A (zh) 情绪识别方法、装置、终端及存储介质
US20140142967A1 (en) Method and system for assessing user engagement during wellness program interaction
Fotios et al. Appraising the intention of other people: Ecological validity and procedures for investigating effects of lighting for pedestrians
JP6646084B2 (ja) 健康情報サービスシステム
Cao et al. An AdaBoost‐backpropagation neural network for automated image sentiment classification
Budner et al. " Making you happy makes me happy"--Measuring Individual Mood with Smartwatches
CN111326235B (zh) 一种情绪调节方法、设备及系统
Akram et al. Altered perception of facially expressed tiredness in insomnia
US20220253418A1 (en) Maintaining User Privacy of Personal, Medical, and Health Care Related Information in Recommendation Systems
Shan et al. Human health assessments of green infrastructure designs using virtual reality
Banerjee Evolution of Dada Uttam Kumar: Performing Masculinity and the Disillusioned Bhadralok Mahanayak in the 1970s’ Popular Melodramas
Chhabra An approach for the transformation of human emotion and energy-field using sound therapy
Korner Liveliness
Silva Age prediction through the influence of fatigue levels in human-computer interaction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19880092

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19880092

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27.10.2021)
