CN111741116B - Emotion interaction method and device, storage medium and electronic device - Google Patents

Emotion interaction method and device, storage medium and electronic device

Info

Publication number
CN111741116B
Authority
CN
China
Prior art keywords
target
target object
preset
emotion
terminal
Prior art date
Legal status
Active
Application number
CN202010598934.5A
Other languages
Chinese (zh)
Other versions
CN111741116A (en)
Inventor
祖岩岩
Current Assignee
Haier Uplus Intelligent Technology Beijing Co Ltd
Original Assignee
Haier Uplus Intelligent Technology Beijing Co Ltd
Priority date
Filing date
Publication date
Application filed by Haier Uplus Intelligent Technology Beijing Co Ltd
Priority to CN202010598934.5A
Publication of CN111741116A
Application granted
Publication of CN111741116B
Legal status: Active

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/01: Protocols
    • H04L67/02: Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L67/025: Protocols based on web technology, e.g. hypertext transfer protocol [HTTP], for remote control or remote monitoring of applications
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00: Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10: Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16: Human faces, e.g. facial parts, sketches or expressions
    • G06V40/174: Facial expression recognition
    • G: PHYSICS
    • G10: MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L: SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L15/00: Speech recognition
    • G10L15/26: Speech to text systems
    • G10L25/00: Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
    • G10L25/48: Speech or voice analysis techniques specially adapted for particular use
    • G10L25/51: Speech or voice analysis techniques specially adapted for comparison or discrimination
    • G10L25/63: Speech or voice analysis techniques for estimating an emotional state
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00: Network arrangements or protocols for supporting network services or applications
    • H04L67/01: Protocols
    • H04L67/12: Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00: Data switching networks
    • H04L12/28: Data switching networks characterised by path configuration, e.g. LAN [Local Area Networks] or WAN [Wide Area Networks]
    • H04L12/2803: Home automation networks
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02: Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Signal Processing (AREA)
  • Acoustics & Sound (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Computational Linguistics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Hospice & Palliative Care (AREA)
  • Computing Systems (AREA)
  • Child & Adolescent Psychology (AREA)
  • Medical Informatics (AREA)
  • Psychiatry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Information Transfer Between Computers (AREA)
  • Telephonic Communication Services (AREA)

Abstract

The invention provides an emotion interaction method and device, a storage medium, and an electronic device. The emotion interaction method includes: monitoring the emotional state of a target object; when the emotional state of the target object is monitored to be a predetermined emotional state, sending state data to a control terminal or a server so that the control terminal or the server generates a control instruction according to the state data, where the control instruction is used to remotely control a target application terminal to perform a predetermined operation corresponding to the predetermined emotional state; and receiving the control instruction and controlling the target application terminal to perform the predetermined operation according to the control instruction. The method and device solve the problem in the related art that emotion interaction is geographically limited and cannot identify changes in a target object's emotion promptly and effectively, extend the geographical applicability of emotion interaction, enable timely and effective identification of emotional changes, and improve the user experience.

Description

Emotion interaction method and device, storage medium and electronic device
Technical Field
The present invention relates to the field of communications, and in particular, to an emotion interaction method, an emotion interaction device, a storage medium, and an electronic device.
Background
Driven by new-generation information technologies such as artificial intelligence, the Internet of Things, and big data, the smart home is evolving from single-product intelligence and interconnected intelligence toward proactive intelligence, and households' expectations are shifting from convenience, intelligence, and efficiency toward proactive, humanized service. Current smart home research, however, concentrates on the functional requirements of intelligent human-computer interaction and neglects the emotional requirements inherent in important family relationships.
Among the many factors affecting people's health and well-being there is one that is extremely important and hard to manage: the "family mood". It bears on the physical and mental health of every family member and affects the happiness and longevity of the family as a whole. Family mood arises from the emotions of individual members and from the relationships among them. Effectively managing family members' emotions, and perceiving and soothing abnormal emotions in time, can therefore improve a poor family mood and shape a harmonious, positive family relationship. Yet the fast pace of modern life makes it difficult for people to control their own emotions in parent-child and spousal relationships, and easy to miss or ignore emotional changes in the other party.
In the related art, emotion soothing is generally limited to face-to-face communication scenarios and cannot address remote communication with, or remote monitoring of, family members experiencing abnormal emotions.
The related art therefore suffers from the problem that emotion interaction is geographically limited and cannot identify changes in a target object's emotion promptly and effectively.
No effective solution to these problems has yet been proposed.
Disclosure of Invention
Embodiments of the invention provide an emotion interaction method and device, a storage medium, and an electronic device, so as to at least solve the problem in the related art that emotion interaction is geographically limited and cannot identify changes in a target object's emotion promptly and effectively.
According to one embodiment of the present invention, there is provided an emotion interaction method including: monitoring the emotion state of a target object; under the condition that the emotion state of the target object is monitored to be a preset emotion state, sending state data to a control terminal or a server so that the control terminal or the server generates a control instruction according to the state data, wherein the control instruction is used for remotely controlling a target application terminal to execute preset operation corresponding to the preset emotion state; and receiving the control instruction and controlling the target application terminal to execute the preset operation according to the control instruction.
According to another embodiment of the present invention, there is provided an emotion interaction method including: acquiring state data from a target application terminal, wherein the state data is used for indicating the emotion state of a target object to be a preset emotion state; generating a control instruction based on the state data, wherein the control instruction is used for controlling the target application terminal to execute a preset operation corresponding to the preset emotion state; and sending the control instruction to the target application terminal so as to remotely control the target application terminal to execute the preset operation.
According to still another embodiment of the present invention, there is provided an emotion interaction device including: the detection module is used for monitoring the emotion state of the target object; the first sending module is used for sending state data to a control terminal or a server under the condition that the emotion state of the target object is monitored to be a preset emotion state, so that the control terminal or the server generates a control instruction according to the state data, wherein the control instruction is used for remotely controlling a target application terminal to execute preset operation corresponding to the preset emotion state; and the receiving module is used for receiving the control instruction and controlling the target application terminal to execute the preset operation according to the control instruction.
According to still another embodiment of the present invention, there is provided an emotion interaction device including: the system comprises an acquisition module, a judgment module and a control module, wherein the acquisition module is used for acquiring state data from a target application terminal, wherein the state data is used for indicating the emotion state of a target object to be a preset emotion state; the control module is used for generating a control instruction based on the state data, wherein the control instruction is used for controlling the target application terminal to execute a preset operation corresponding to the preset emotion state; and the second sending module is used for sending the control instruction to the target application terminal so as to remotely control the target application terminal to execute the preset operation.
According to a further embodiment of the invention, there is also provided a computer readable storage medium having stored therein a computer program, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
According to a further embodiment of the invention, there is also provided an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
In the embodiments of the invention, the emotional state of the target object is actively monitored, and when it is a predetermined emotional state, state data are sent to the control terminal or the server so that the control terminal or the server remotely controls the target application terminal to perform the operation corresponding to that state. This solves the problem in the related art that emotion interaction is geographically limited and cannot identify changes in a target object's emotion promptly and effectively, extends the geographical applicability of emotion interaction, enables timely and effective identification of emotional changes, and improves the user experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a block diagram of a hardware architecture of a mobile terminal of an emotion interaction method according to an embodiment of the present application;
FIG. 2 is a first flowchart of an emotion interaction method according to an embodiment of the present application;
FIG. 3 is a second flowchart of an emotion interaction method according to an embodiment of the present application;
FIG. 4 is a flowchart of an emotion interaction method according to a specific embodiment of the present application;
FIG. 5 is a first block diagram of an emotion interaction device according to an embodiment of the present application;
FIG. 6 is a second block diagram of an emotion interaction device according to an embodiment of the present application.
Detailed Description
The application will be described in detail hereinafter with reference to the drawings in conjunction with embodiments. It should be noted that, without conflict, the embodiments of the present application and features of the embodiments may be combined with each other.
It should be noted that the terms "first," "second," and the like in the description and the claims of the present application and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order.
The method embodiments provided in the embodiments of the present application may be executed on a mobile terminal, a computer terminal, or a similar computing device. Taking execution on a mobile terminal as an example, fig. 1 is a block diagram of the hardware structure of a mobile terminal running an emotion interaction method according to an embodiment of the present application. As shown in fig. 1, the mobile terminal 10 may include one or more processors 102 (only one is shown in fig. 1; the processor 102 may include, but is not limited to, a microprocessor (MCU), a programmable logic device (FPGA), or another processing device) and a memory 104 for storing data, and optionally a transmission device 106 for communication functions and an input/output device 108. Those skilled in the art will appreciate that the structure shown in fig. 1 is merely illustrative and does not limit the structure of the mobile terminal; for example, the mobile terminal 10 may include more or fewer components than shown in fig. 1, or have a different configuration.
The memory 104 may be used to store computer programs, for example, software programs and modules of application software, such as the computer program corresponding to the emotion interaction method in the embodiments of the present invention; the processor 102 runs the computer programs stored in the memory 104, thereby performing various functional applications and data processing, that is, implementing the method described above. The memory 104 may include high-speed random access memory and may also include non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory. In some examples, the memory 104 may further include memory located remotely from the processor 102, which may be connected to the mobile terminal 10 via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The transmission device 106 is configured to receive or transmit data via a network. Specific examples of the network may include a wireless network provided by a communication provider of the mobile terminal 10. In one example, the transmission device 106 includes a network interface controller (NIC) that can connect to other network devices through a base station so as to communicate with the internet. In another example, the transmission device 106 may be a radio frequency (RF) module, which communicates with the internet wirelessly.
In this embodiment, an emotion interaction method is provided. Fig. 2 is a flowchart of an emotion interaction method according to an embodiment of the present invention; as shown in fig. 2, the flow includes the following steps:
Step S202: monitoring the emotional state of a target object;
Step S204: when the emotional state of the target object is monitored to be a predetermined emotional state, sending state data to a control terminal or a server so that the control terminal or the server generates a control instruction according to the state data, where the control instruction is used to remotely control a target application terminal to perform a predetermined operation corresponding to the predetermined emotional state;
Step S206: receiving the control instruction and controlling the target application terminal to perform the predetermined operation according to the control instruction.
In the above embodiment, the emotional state of the target object may be monitored through a speech recognition module, a facial expression recognition module, or the like included in the target application terminal, to obtain the emotion information of the target object. The target object may be a person, an animal, and so on. When the target object is a person, the emotional state may include calm, happy, sad, surprised, afraid, angry, disgusted, tense, depressed, etc., and the predetermined emotional state may include one or more of these. When the target object is detected to be in the predetermined emotional state, the control terminal (a smart device such as a mobile phone or tablet) may remotely control the target application terminal (for example, one or more smart home devices) to perform a predetermined operation, such as playing soothing music or playing a video. Different emotional states of different objects correspond to different coping operations: for a tense state, a target application terminal in the target object's area (for example, a smart playback terminal) may play relaxing music; for a sad state, a target application terminal in the area (for example, a smart display terminal) may play an amusing video or picture, and so on.
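To make this correspondence concrete, the following minimal Python sketch expresses the state-to-operation mapping as a lookup table. The state names, terminal identifiers, and actions are illustrative assumptions, not the patent's literal data model.

```python
# Illustrative mapping from a predetermined emotional state to a coping
# operation on a target application terminal (all names are hypothetical).
PREDETERMINED_OPERATIONS = {
    "tense": ("smart_player", "play_relaxing_music"),
    "sad": ("smart_display", "play_amusing_video"),
    "afraid": ("smart_tv", "play_cartoon"),
}

def predetermined_operation(emotional_state):
    """Return (terminal, action) for a predetermined emotional state,
    or None when the state requires no coping operation."""
    return PREDETERMINED_OPERATIONS.get(emotional_state)

print(predetermined_operation("sad"))  # ('smart_display', 'play_amusing_video')
```

A real system would populate such a table per object and per scene, since the text stresses that different objects' states map to different operations.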
Alternatively, the method may be applied to a smart home system including a plurality of smart terminals (corresponding to the application terminals described above). The steps may be executed by a target application terminal on which a predetermined application is installed, by another device with similar processing capability, or by a machine integrating at least an image acquisition device, a sound acquisition device, and a data processing device, where the image acquisition device may include an image acquisition module such as a camera, the sound acquisition device may include a sound acquisition module such as a microphone, and the data processing device may include a terminal such as a computer or a mobile phone, but is not limited thereto.
In the embodiments of the invention, the emotional state of the target object is actively monitored, and when it is a predetermined emotional state, state data are sent to the control terminal or the server so that the control terminal or the server remotely controls the target application terminal to perform the operation corresponding to that state. This solves the problem in the related art that emotion interaction is geographically limited and cannot identify changes in a target object's emotion promptly and effectively, extends the geographical applicability of emotion interaction, enables timely and effective identification of emotional changes, and improves the user experience.
In an alternative embodiment, sending the status data to the control terminal so that the control terminal generates the control instruction according to the status data includes: sending a notification message to a control-terminal application on the control terminal, where the notification message instructs the control-terminal application to generate the control instruction and send it to the target application terminal so as to remotely control the target application terminal to perform the predetermined operation. In this embodiment, when the emotional state of the target object is detected to be a predetermined emotional state, the target application terminal may send a notification message to an application installed on the control terminal, so that the application generates a control instruction to remotely control the target application terminal. For example, when a home robot detects that a child at home is sad, it may send a notification message to a control application installed on a parent's mobile phone, and the control application may remotely direct the home robot to play a cartoon for the child.
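A minimal sketch of this notification step follows, assuming the control-terminal application exposes an HTTP endpoint; the URL and JSON payload shape are hypothetical.

```python
import json
import urllib.request

def notify_control_app(state_data, url="http://control-app.example/notify"):
    """Push a notification carrying the state data to the control-terminal
    application, which generates the control instruction in response.
    The endpoint URL and payload shape are assumptions for illustration."""
    body = json.dumps({"type": "abnormal_emotion", "state": state_data}).encode("utf-8")
    request = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    urllib.request.urlopen(request, timeout=5)
```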
In an alternative embodiment, after monitoring the emotional state of the target object, the method further includes performing at least one of the following when the emotional state is the predetermined emotional state: issuing first predetermined information corresponding to the predetermined emotional state of the target object; and having a target application terminal located less than a predetermined distance from the target object issue second predetermined information corresponding to the predetermined emotional state. In this embodiment, the terminal performing these operations may be an application terminal near the target object, which may issue the first predetermined information immediately after detecting the emotional state. The first predetermined information may be light-hearted voice information meant to ease the atmosphere, lighting cues, and the like, and may be personalized to the application scenario. The control-end application may also remotely have devices less than the predetermined distance from the target object issue the second predetermined information, that is, operate televisions, air conditioners, sound systems, music players, mobile phones, computers, tablets, lights, and so on to ease the atmosphere. The predetermined distance may be 2 m (this value is just one feasible choice; different distances, such as 1 m or 3 m, may be set for different scenarios). For example, when the target object is a sad person, the first predetermined information may be a voice prompt such as "let go of the sadness; being happy matters most" (this prompt is only an example of a reminder to ease the target object's emotion; users may set prompts to their own taste, and different prompts may be set for different scenes), and the second predetermined information may be turning on a nearby device such as a television, music player, or mobile phone to play cheerful music, jokes, and the like. When the emotional state is tension, the first predetermined information may be a voice prompt such as "relax; don't be nervous", and the second predetermined information may be adjusting the lighting where the target object is, playing relaxing music, and so on.
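Selecting the devices that issue the second predetermined information amounts to a distance filter. The sketch below uses the 2 m threshold mentioned above; the planar coordinate model and the device list are assumptions for illustration.

```python
import math

def terminals_within(terminals, target_xy, predetermined_distance=2.0):
    """Return the application terminals whose distance to the target object
    is below the predetermined distance (2 m by default, configurable)."""
    tx, ty = target_xy
    return [t for t in terminals
            if math.hypot(t["x"] - tx, t["y"] - ty) < predetermined_distance]

devices = [{"name": "tv", "x": 1.0, "y": 0.5},
           {"name": "aircon", "x": 4.0, "y": 3.0}]
print(terminals_within(devices, (0.0, 0.0)))  # only the tv qualifies
```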
In this embodiment, considering that the actual communication environment may span different home scenes (for example a bedroom and a living room, or a living room and a kitchen), the smart home application terminal that issues the reminder information may be a single terminal or several linked terminals in different home scenes.
In an alternative embodiment, when there are at least two target objects, issuing the first predetermined information corresponding to the predetermined emotional state of each target object includes: determining the pacifying priority of each target object; and issuing the first predetermined information for each target object in order of priority from high to low. In this embodiment, the pacifying priority may be determined by the degree of each target object's emotional fluctuation; once priorities are determined, the first predetermined information is issued in descending order of priority. When a dispute arises among family members in a home scene, the pacifying priority can follow the degree of emotional fluctuation: for example, when the adults in a family quarrel, a child may feel fear while the adults feel anger, sadness, and so on, so fear may take first priority, sadness second, and anger third. Note that this ordering is only one feasible scheme: different pacifying priorities may be set per application scenario, and users may also set them themselves. For example, priority may follow age or sex. By age, the elderly may take first priority, children second, the middle-aged third, and the young fourth; by sex, women may take first priority and men second; in each case the first priority is higher than the second, the second higher than the third, and so on.
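The fear-first ordering in this example reduces to a sort over a priority table. In the sketch below the numeric ranks mirror the example above (fear, then sadness, then anger); the default rank for unlisted states is an assumption.

```python
# Lower rank = pacified earlier; the table is user-configurable per scene.
PACIFYING_PRIORITY = {"fear": 1, "sadness": 2, "anger": 3}

def pacifying_order(target_objects):
    """Order target objects from highest to lowest pacifying priority."""
    return sorted(target_objects,
                  key=lambda obj: PACIFYING_PRIORITY.get(obj["state"], 99))

family = [{"name": "father", "state": "anger"},
          {"name": "child", "state": "fear"},
          {"name": "mother", "state": "sadness"}]
for member in pacifying_order(family):
    print("pacify:", member["name"])  # child, then mother, then father
```

The same routine covers the age- and sex-based schemes mentioned above: only the priority table changes.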
In an optional embodiment, when there are at least two target objects, controlling the target application terminals less than a predetermined distance from each target object to issue the second predetermined information includes: determining the pacifying priority of each target object; and, in order of priority from high to low, controlling the terminals near each target object to issue the corresponding second predetermined information. In this embodiment, the pacifying priority is determined first, and the nearby target application terminals are then controlled in descending order of priority. For example, when the adults in a family quarrel, a child may feel fear while the adults feel anger, sadness, and so on; with fear as first priority, sadness second, and anger third, the system may first turn on a television within the predetermined distance of the child to play cartoons or tell stories, then switch the lights near the sad member to a warm tone and have a music player play cheerful music, and finally turn on an air conditioner near the angry member to cool the room, and so on.
In an alternative embodiment, before sending the status data to the control terminal so that the control terminal generates the control instruction according to the status data, the method further includes: determining the control terminal corresponding to the target object based on a pre-configured correspondence, where the correspondence records the binding relationship between objects and control terminals, and different objects may bind the same or different control terminals. In this embodiment, when the target object is a person, the other party cannot directly receive the reminder information in scenarios of remote communication or remote monitoring of a family member's abnormal emotional state, so the target object may be bound to the target control-terminal application together with the family members, and a home scene (such as "emotion") may be customized in that application. The target control terminal may be a mobile phone, tablet, computer, smart wearable device, and so on.
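A sketch of the pre-configured correspondence follows, assuming plain string identifiers for objects and control terminals; note that several objects may bind the same control terminal.

```python
# Hypothetical pre-configured object -> control-terminal bindings.
BINDINGS = {
    "child_01": "mother_phone",
    "father_01": "mother_phone",   # same control terminal as child_01
    "mother_01": "father_tablet",
}

def control_terminal_for(target_object):
    """Look up the control terminal bound to the target object, or None."""
    return BINDINGS.get(target_object)

print(control_terminal_for("child_01"))  # mother_phone
```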
In this embodiment, an emotion interaction method is also provided. Fig. 3 is a second flowchart of an emotion interaction method according to an embodiment of the present invention; as shown in fig. 3, the flow includes the following steps:
Step S302: acquiring state data from a target application terminal, where the state data indicates that the emotional state of a target object is a predetermined emotional state;
Step S304: generating a control instruction based on the state data, where the control instruction is used to control the target application terminal to perform a predetermined operation corresponding to the predetermined emotional state;
Step S306: sending the control instruction to the target application terminal so as to remotely control the target application terminal to perform the predetermined operation.
In the above embodiment, the target application terminal may monitor the emotional state of the target object through its speech recognition module, facial expression recognition module, or the like, obtain the emotion information, and report the predetermined emotional state to an application installed on the control terminal; that application may then remotely control the target application terminal to perform the predetermined operation. Different emotional states of different objects may correspond to different coping operations: for a tense state, the application on the control terminal may remotely direct a smart playback terminal in the target object's area to play soothing music; for a sad state, it may direct a smart display terminal in the area to play an amusing video or picture, and so on. The target object may be a person, an animal, etc.; when the target object is a person, the emotional state may include calm, happy, sad, surprised, afraid, angry, disgusted, tense, depressed, etc., and the predetermined emotional state may include one or more of these.
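Steps S302-S306 reduce to a short handler on the control terminal or server. The field names and the transport stub below are assumptions for illustration.

```python
def send_to_terminal(terminal_id, instruction):
    """Stand-in for the real transport back to the application terminal."""
    print(f"-> {terminal_id}: {instruction}")

def handle_state_data(state_data):
    """S302-S306 in miniature: take state data reported by the target
    application terminal, generate the control instruction for the
    corresponding predetermined operation, and send it back."""
    operations = {"tense": "play_relaxing_music", "sad": "play_amusing_video"}
    action = operations.get(state_data["emotional_state"])
    if action is not None:
        send_to_terminal(state_data["terminal_id"], {"operation": action})

handle_state_data({"terminal_id": "robot_01", "emotional_state": "sad"})
```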
Alternatively, the execution subject of the above steps may be a control terminal or a server installed with a predetermined application, or other devices with similar processing capabilities, for example, a mobile phone, a computer, a tablet computer, a smart wearable device, etc.
In the embodiments of the invention, the emotional state of the target object is actively monitored, and when it is a predetermined emotional state, state data are sent to the control terminal or the server so that the control terminal or the server remotely controls the target application terminal to perform the operation corresponding to that state. This solves the problem in the related art that emotion interaction is geographically limited and cannot identify changes in a target object's emotion promptly and effectively, extends the geographical applicability of emotion interaction, enables timely and effective identification of emotional changes, and improves the user experience.
In an alternative embodiment, sending the control instruction to the target application terminal includes at least one of the following: sending the control instruction via the target application terminal that is less than a predetermined distance from the target object, so as to control that terminal to issue second predetermined information corresponding to the predetermined emotional state of the target object; and purchasing, online, goods corresponding to the predetermined emotional state of the target object. In this embodiment, after receiving the abnormal emotional state perceived by the target application terminal, the application installed on the control terminal may send a control instruction to that terminal; that is, after the terminal's message is received, an "emotion" scene may be executed with one key, and the scene may include one-key control of the smart home application terminal to play entertainment information, place a purchase order, and so on.
In an alternative embodiment, after the goods corresponding to the predetermined emotional state of the target object are purchased online, the method further includes: controlling a target application terminal less than the predetermined distance from the target object to display the purchase information. In this embodiment, the control terminal may have a nearby target application terminal display the purchase information, or have it announce the purchase information by voice.
In an alternative embodiment, when there are at least two target objects, having the target application terminals less than a predetermined distance from each target object issue the second predetermined information includes: determining the pacifying priority of each target object; and, in order of priority from high to low, having the nearby terminals issue the corresponding second predetermined information. For example, when the adults in a family quarrel, a child may feel fear while the adults feel anger, sadness, and so on; with fear as first priority, sadness second, and anger third, a television within the predetermined distance of the child may first play cartoons or tell stories, the lights near the sad member may then be switched to a warm tone and a music player may play cheerful music, and finally an air conditioner near the angry member may be turned on to cool the room. Note that this ordering is only one feasible scheme; different pacifying priorities may be set per scenario or by the user, for instance by age (the elderly first, children second, the middle-aged third, the young fourth) or by sex (women first, men second), where each priority outranks the next.
In an alternative embodiment, when there are at least two target objects, purchasing online the goods corresponding to their predetermined emotional states includes: determining the pacifying priority of each target object; and purchasing the corresponding goods in order of priority from high to low. In this embodiment, the priorities of the target objects are determined first, the goods corresponding to each object's emotion are purchased in descending order of priority, and the purchase information is then sent to devices less than the predetermined distance from each target object, which display it or announce it by voice.
The emotion interaction method is described below with reference to a specific embodiment.
FIG. 4 is a flowchart of an emotion interaction method according to a specific embodiment of the present invention. As shown in fig. 4, the flow includes the following steps.
In step S402, the smart home application terminal (corresponding to the target application terminal above) obtains the user's abnormal emotion information (corresponding to the emotional state of the target object) through an emotion recognition module, which may be a speech recognition module or a facial expression recognition module.
In step S404, in a close-range communication home scene, the smart home application terminal issues reminder information immediately after sensing the user's abnormal emotional state. On one hand, the reminder information may be light-hearted voice information meant to ease the atmosphere, lighting information, or the like, personalized per household; on the other hand, since the actual communication environment may span different home scenes (for example a bedroom and a living room, or a living room and a kitchen), the terminal issuing the reminder may be a single terminal or several linked terminals in different scenes.
In step S406, in scenarios of remote communication or remote monitoring of family members' abnormal emotional states, the other party cannot directly receive reminder information from the smart home application terminal in the home scene. The smart home application terminal therefore needs to be bound to the smart home control terminal (corresponding to the control terminal above) and to the family members. Through the application terminal and a home scene customized on the control terminal (such as "emotion"), the user's abnormal emotional state is automatically and actively pushed to the application installed on the control terminal; the bound family members can then remotely control the smart home application terminal or other linked terminal devices through that application and have commands executed, so as to soothe or improve the user's abnormal emotion, strengthen intimacy among family members, and help prevent tragedies.
In step S408, the smart home application terminal or other linked terminals are controlled remotely: after the application terminal senses the user's abnormal emotional state, it immediately reports it to the control-end application over the wireless network, and after a family member receives the message, an "emotion" scene can be executed with one key; the scene may include one-key instructions for the smart home application terminal to play entertainment information, place a shopping order, and so on.
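The one-key "emotion" scene can be read as a stored batch of device commands fired in order. The scene contents and the dispatch stub below are illustrative assumptions, not the patent's protocol.

```python
# A customized "emotion" scene: a stored batch of device commands that the
# control-end application executes with one key (contents are hypothetical).
EMOTION_SCENE = [
    {"device": "smart_speaker", "command": "play", "args": {"playlist": "soothing"}},
    {"device": "living_room_light", "command": "set_tone", "args": {"tone": "warm"}},
    {"device": "smart_tv", "command": "play", "args": {"channel": "comedy"}},
]

def execute_scene(scene):
    """Dispatch every command in the scene in order (stubbed transport)."""
    for step in scene:
        print("dispatch:", step["device"], step["command"], step["args"])

execute_scene(EMOTION_SCENE)
```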
In the foregoing embodiment, the emotion-recognition-based smart home application terminal automatically recognizes the user's abnormal emotion and supports a proactive-service system for effective communication (intelligent reminders, emotion soothing) across different home scenes, together with an emotion-recognition-based smart home service system. Reminding and soothing of family members' abnormal emotional states can thus be carried out through remote communication or remote monitoring, actively serving effective communication among family members and making the smart home system more proactive and humanized: when the terminal automatically recognizes a user's abnormal emotional state, it intelligently reminds the user, or the family members communicating with the user, to manage the emotion, improving the user experience.
In addition, the applicability of the smart home system for actively serving family emotion is improved. On one hand, the scope in which the smart home reminds users, or the families communicating with them, to manage emotion is extended from face-to-face family scenes to remote communication scenes. On the other hand, the remotely monitored abnormal emotional state of one family member can be automatically pushed to other members, automatically reminding them to pay attention and respond with timely emotional soothing, avoiding tragedies and meeting family emotional needs in different application scenarios. That is, the user's abnormal emotional state information is automatically pushed to the smart home control-end application bound to the smart home application terminal; after other family members promptly learn of the member's emotional change remotely, they remotely control the smart home application terminal or other linked devices to execute commands, so as to soothe or improve the family member's abnormal emotion. This solves the problem that the home scenes of existing smart home service systems only satisfy physical functional requirements in the home environment and cannot meet the emotional requirements of actively serving family relationships.
From the description of the above embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus the necessary general-purpose hardware platform, or of course by hardware, though in many cases the former is preferred. Based on this understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (such as ROM/RAM, a magnetic disk, or an optical disc) and including several instructions for causing a terminal device (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods of the embodiments of the present invention.
In this embodiment, an emotion interaction device is further provided. The device is used to implement the foregoing embodiments and preferred implementations; what has already been described is not repeated. As used below, the term "module" may be a combination of software and/or hardware that implements a predetermined function. Although the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
FIG. 5 is a block diagram of an emotion interaction device according to an embodiment of the present invention. As shown in FIG. 5, the device includes:
a detection module 52 for monitoring an emotional state of the target object;
a first sending module 54, configured to send status data to a control terminal or a server when it is detected that an emotion status of the target object is a predetermined emotion status, so that the control terminal or the server generates a control instruction according to the status data, where the control instruction is used to remotely control a target application terminal to perform a predetermined operation corresponding to the predetermined emotion status;
and the receiving module 56 is configured to receive the control instruction and control the target application terminal to execute the predetermined operation according to the control instruction.
In an alternative embodiment, the first sending module 54 may send status data to the control terminal so that the control terminal generates a control instruction according to the status data by: sending a notification message to a control-terminal application of the control terminal, where the notification message instructs the control-terminal application to generate the control instruction and send it to the target application terminal so as to remotely control the target application terminal to perform the predetermined operation.
In an alternative embodiment, the apparatus may be configured to perform, after monitoring the emotional state of the target object and upon detecting that it is a predetermined emotional state, at least one of the following operations: issuing first predetermined information corresponding to the predetermined emotional state of the target object; and having a target application terminal that is less than a predetermined distance from the target object issue second predetermined information corresponding to the predetermined emotional state of the target object.
In an alternative embodiment, when the number of the target objects is at least two, the apparatus may be configured to issue the first predetermined information corresponding to the predetermined emotional states of the target objects by: determining the pacifying priority of each target object; and issuing the first predetermined information for each target object in order of priority from high to low.
In an alternative embodiment, when the number of the target objects is at least two, the apparatus may be configured to control the target application terminals that are less than a predetermined distance from each target object to issue the second predetermined information corresponding to the predetermined emotional states by: determining the pacifying priority of each target object; and controlling the nearby target application terminals to issue the second predetermined information in order of priority from high to low.
In an alternative embodiment, the apparatus may be further configured to determine, before sending status data to a control terminal so that the control terminal generates a control instruction according to the status data, the control terminal corresponding to the target object based on a pre-configured correspondence, where the correspondence records the binding relationship between objects and control terminals, and different objects may bind the same or different control terminals.
FIG. 6 is a second block diagram of an emotion interaction device according to an embodiment of the present invention. As shown in FIG. 6, the device includes:
the acquiring module 62 is configured to acquire status data from the target application terminal, where the status data is used to indicate that an emotion state of the target object is a predetermined emotion state;
a control module 64, configured to generate a control instruction based on the state data, where the control instruction is configured to control the target application terminal to perform a predetermined operation corresponding to the predetermined emotion state;
and a second sending module 66, configured to send the control instruction to the target application terminal, so as to remotely control the target application terminal to perform the predetermined operation.
In an alternative embodiment, the second sending module 66 may send the control instruction to the target application terminal in at least one of the following ways: sending the control instruction via the target application terminal that is less than a predetermined distance from the target object, so as to control that terminal to issue second predetermined information corresponding to the predetermined emotional state of the target object; and purchasing, online, goods corresponding to the predetermined emotional state of the target object.
In an alternative embodiment, the apparatus may be configured to control, after the goods corresponding to the predetermined emotional state of the target object are purchased online, a target application terminal that is less than the predetermined distance from the target object to display the purchase information.
In an alternative embodiment, when the number of the target objects is at least two, the apparatus may be configured to have the target application terminals that are less than a predetermined distance from each target object issue the second predetermined information corresponding to the predetermined emotional states by: determining the pacifying priority of each target object; and having the nearby target application terminals issue the second predetermined information in order of priority from high to low.
In an alternative embodiment, when the number of the target objects is at least two, the apparatus may be further configured to purchase online the goods corresponding to the predetermined emotional states of the target objects by: determining the pacifying priority of each target object; and purchasing the corresponding goods in order of priority from high to low.
It should be noted that each of the above modules may be implemented by software or hardware; for the latter, the modules may be implemented in, but are not limited to, the following forms: the modules are all located in the same processor, or the modules are located in different processors in any combination.
Embodiments of the present invention also provide a computer readable storage medium having a computer program stored therein, wherein the computer program is arranged to perform the steps of any of the method embodiments described above when run.
Alternatively, in the present embodiment, the above-described computer-readable storage medium may be configured to store a computer program for performing the steps of:
S1, monitoring the emotional state of a target object;
S2, when the emotional state of the target object is monitored to be a predetermined emotional state, sending state data to a control terminal or a server so that the control terminal or the server generates a control instruction according to the state data, where the control instruction is used to remotely control a target application terminal to perform a predetermined operation corresponding to the predetermined emotional state;
S3, receiving the control instruction and controlling the target application terminal to perform the predetermined operation according to the control instruction.
Optionally, the computer readable storage medium is further arranged to store a computer program for performing the steps of:
S1, acquiring state data from a target application terminal, where the state data indicates that the emotional state of a target object is a predetermined emotional state;
S2, generating a control instruction based on the state data, where the control instruction is used to control the target application terminal to perform a predetermined operation corresponding to the predetermined emotional state;
S3, sending the control instruction to the target application terminal so as to remotely control the target application terminal to perform the predetermined operation.
Alternatively, in this embodiment, the above computer-readable storage medium may include, but is not limited to: a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, an optical disc, or any other medium capable of storing a computer program.
An embodiment of the invention also provides an electronic device comprising a memory having stored therein a computer program and a processor arranged to run the computer program to perform the steps of any of the method embodiments described above.
Optionally, the electronic apparatus may further include a transmission device and an input/output device, where the transmission device is connected to the processor, and the input/output device is connected to the processor.
Alternatively, in the present embodiment, the above processor may be configured to perform the following steps by means of a computer program:
S1: monitoring the emotional state of a target object;
S2: when the emotional state of the target object is monitored to be a predetermined emotional state, sending state data to a control terminal or a server, so that the control terminal or the server generates a control instruction according to the state data, wherein the control instruction is used for remotely controlling a target application terminal to execute a predetermined operation corresponding to the predetermined emotional state;
S3: receiving the control instruction and controlling the target application terminal to execute the predetermined operation according to the control instruction.
Optionally, the above processor may be further configured to perform the following steps by means of a computer program:
S1: acquiring state data from a target application terminal, wherein the state data is used for indicating that the emotional state of a target object is a predetermined emotional state;
S2: generating a control instruction based on the state data, wherein the control instruction is used for controlling the target application terminal to execute a predetermined operation corresponding to the predetermined emotional state;
S3: sending the control instruction to the target application terminal so as to remotely control the target application terminal to execute the predetermined operation.
Alternatively, for specific examples in this embodiment, reference may be made to the examples described in the foregoing embodiments and optional implementations; details are not repeated here.
It will be appreciated by those skilled in the art that the modules or steps of the invention described above may be implemented by a general-purpose computing device; they may be concentrated on a single computing device or distributed across a network of computing devices; optionally, they may be implemented by program code executable by a computing device, so that they may be stored in a storage device and executed by a computing device; in some cases, the steps shown or described may be performed in an order different from that described here; alternatively, they may be separately fabricated into individual integrated-circuit modules, or multiple modules or steps among them may be fabricated into a single integrated-circuit module. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description covers only the preferred embodiments of the present invention and is not intended to limit it; various modifications and variations can be made by those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the principle of the present invention shall fall within the protection scope of the present invention.

Claims (11)

1. An emotion interaction method, characterized by comprising:
monitoring the emotional state of a target object;
when the emotional state of the target object is monitored to be a predetermined emotional state, sending state data to a control terminal or a server, so that the control terminal or the server generates a control instruction according to the state data, wherein the control instruction is used for remotely controlling a target application terminal to execute a predetermined operation corresponding to the predetermined emotional state;
receiving the control instruction and controlling the target application terminal to execute the predetermined operation according to the control instruction;
wherein sending the state data to the control terminal so that the control terminal generates the control instruction according to the state data comprises: sending a notification message to a control terminal application of the control terminal, wherein the notification message is used for instructing the control terminal application to generate the control instruction and to send the control instruction to the target application terminal so as to remotely control the target application terminal to execute the predetermined operation;
wherein different emotional states of different target objects correspond to different predetermined operations;
wherein, after monitoring the emotional state of the target object, the method further comprises:
when the emotional state of the target object is monitored to be the predetermined emotional state, performing the following operation: issuing, by the target application terminal that is less than a predetermined distance from the target object, second predetermined information corresponding to the predetermined emotional state of the target object;
wherein, when the number of target objects is at least two, controlling the target application terminal that is less than the predetermined distance from the target object to issue the second predetermined information corresponding to the predetermined emotional state of the target object comprises:
determining a pacifying priority for each target object;
controlling, in descending order of priority, the target application terminal that is less than the predetermined distance from each target object to issue the second predetermined information corresponding to that target object's predetermined emotional state;
wherein the pacifying priority is determined according to the mood swings of the target objects.
2. The method of claim 1, wherein after monitoring the emotional state of the target object, the method further comprises:
when the emotional state of the target object is monitored to be the predetermined emotional state, performing the following operation:
issuing first predetermined information corresponding to the predetermined emotional state of the target object.
3. The method of claim 2, wherein, when the number of target objects is at least two, issuing the first predetermined information corresponding to the predetermined emotional states of the target objects comprises:
determining a pacifying priority for each target object;
and issuing, in descending order of priority, the first predetermined information corresponding to each target object's predetermined emotional state.
4. The method of claim 1, wherein, before sending the state data to the control terminal so that the control terminal generates the control instruction according to the state data, the method further comprises:
determining the control terminal corresponding to the target object based on a pre-configured correspondence, wherein the correspondence records the binding relation between objects and control terminals, and different objects are bound to the same or different control terminals.
5. An emotion interaction method, characterized by comprising:
acquiring state data from a target application terminal, wherein the state data is used for indicating that the emotional state of a target object is a predetermined emotional state;
generating a control instruction based on the state data, wherein the control instruction is used for controlling the target application terminal to execute a predetermined operation corresponding to the predetermined emotional state;
sending the control instruction to the target application terminal so as to remotely control the target application terminal to execute the predetermined operation;
wherein the target application terminal is configured to send the state data to the control terminal, so that the control terminal generates the control instruction according to the state data, in the following manner:
sending a notification message to a control terminal application of the control terminal, wherein the notification message is used for instructing the control terminal application to generate the control instruction and to send the control instruction to the target application terminal so as to remotely control the target application terminal to execute the predetermined operation;
wherein different emotional states of different target objects correspond to different predetermined operations;
wherein sending the control instruction to the target application terminal comprises: sending the control instruction to the target application terminal that is less than a predetermined distance from the target object, so as to control that terminal to issue second predetermined information corresponding to the predetermined emotional state of the target object;
wherein, when the number of target objects is at least two, issuing the second predetermined information corresponding to the predetermined emotional states of the target objects through the target application terminals that are less than the predetermined distance from the target objects comprises: determining a pacifying priority for each target object; and issuing, in descending order of priority, by the target application terminal that is less than the predetermined distance from each target object, the second predetermined information corresponding to that target object's predetermined emotional state;
wherein the pacifying priority is determined according to the mood swings of the target objects.
6. The method of claim 5, wherein sending the control instruction to the target application terminal further comprises:
purchasing, online, goods corresponding to the predetermined emotional state of the target object.
7. The method of claim 6, wherein, after the goods corresponding to the predetermined emotional state of the target object are purchased online, the method further comprises:
controlling the target application terminal that is less than the predetermined distance from the target object to display purchase information of the purchased goods.
8. An emotion interaction device, comprising:
a detection module, configured to monitor the emotional state of a target object;
a first sending module, configured to send state data to a control terminal or a server when the emotional state of the target object is monitored to be a predetermined emotional state, so that the control terminal or the server generates a control instruction according to the state data, wherein the control instruction is used for remotely controlling a target application terminal to execute a predetermined operation corresponding to the predetermined emotional state;
a receiving module, configured to receive the control instruction and control the target application terminal to execute the predetermined operation according to the control instruction;
wherein the first sending module is further configured to send the state data to the control terminal, so that the control terminal generates the control instruction according to the state data, in the following manner: sending a notification message to a control terminal application of the control terminal, wherein the notification message is used for instructing the control terminal application to generate the control instruction and to send the control instruction to the target application terminal so as to remotely control the target application terminal to execute the predetermined operation;
wherein different emotional states of different target objects correspond to different predetermined operations;
wherein the device is further configured to, after the emotional state of the target object is monitored, perform the following operation when the emotional state is monitored to be the predetermined emotional state: issuing, by the target application terminal that is less than a predetermined distance from the target object, second predetermined information corresponding to the predetermined emotional state of the target object;
wherein the device is configured to control, when the number of target objects is at least two, the target application terminal that is less than the predetermined distance from the target object to issue the second predetermined information corresponding to the predetermined emotional state of the target object in the following manner: determining a pacifying priority for each target object; controlling, in descending order of priority, the target application terminal that is less than the predetermined distance from each target object to issue the second predetermined information corresponding to that target object's predetermined emotional state;
wherein the pacifying priority is determined according to the mood swings of the target objects.
9. An emotion interaction device, comprising:
an acquisition module, configured to acquire state data from a target application terminal, wherein the state data is used for indicating that the emotional state of a target object is a predetermined emotional state;
a control module, configured to generate a control instruction based on the state data, wherein the control instruction is used for controlling the target application terminal to execute a predetermined operation corresponding to the predetermined emotional state;
a second sending module, configured to send the control instruction to the target application terminal so as to remotely control the target application terminal to execute the predetermined operation;
wherein the target application terminal sends the state data to the control terminal, so that the control terminal generates the control instruction according to the state data, in the following manner:
sending a notification message to a control terminal application of the control terminal, wherein the notification message is used for instructing the control terminal application to generate the control instruction and to send the control instruction to the target application terminal so as to remotely control the target application terminal to execute the predetermined operation;
wherein different emotional states of different target objects correspond to different predetermined operations;
wherein the second sending module is configured to send the control instruction to the target application terminal in the following manner: sending the control instruction to the target application terminal that is less than a predetermined distance from the target object, so as to control that terminal to issue second predetermined information corresponding to the predetermined emotional state of the target object;
wherein the second sending module is further configured to control, when the number of target objects is at least two, the target application terminals that are less than the predetermined distance from the target objects to issue the second predetermined information corresponding to the predetermined emotional states of the target objects in the following manner: determining a pacifying priority for each target object; and issuing, in descending order of priority, by the target application terminal that is less than the predetermined distance from each target object, the second predetermined information corresponding to that target object's predetermined emotional state;
wherein the pacifying priority is determined according to the mood swings of the target objects.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has a computer program stored therein, wherein the computer program is arranged to perform the method of any one of claims 1 to 4 or the method of any one of claims 5 to 7 when run.
11. An electronic device comprising a memory and a processor, characterized in that the memory has a computer program stored therein, and the processor is arranged to run the computer program to perform the method of any one of claims 1 to 4 or the method of any one of claims 5 to 7.
CN202010598934.5A 2020-06-28 2020-06-28 Emotion interaction method and device, storage medium and electronic device Active CN111741116B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010598934.5A CN111741116B (en) 2020-06-28 2020-06-28 Emotion interaction method and device, storage medium and electronic device

Publications (2)

Publication Number Publication Date
CN111741116A CN111741116A (en) 2020-10-02
CN111741116B (en) 2023-08-22

Family

ID=72651446

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010598934.5A Active CN111741116B (en) 2020-06-28 2020-06-28 Emotion interaction method and device, storage medium and electronic device

Country Status (1)

Country Link
CN (1) CN111741116B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113569634B (en) * 2021-06-18 2024-03-26 青岛海尔科技有限公司 Scene characteristic control method and device, storage medium and electronic device

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN201742464U (en) * 2010-08-06 2011-02-09 华为终端有限公司 Mobile terminal with function of nursing baby
CN102715902A (en) * 2012-06-15 2012-10-10 天津大学 Emotion monitoring method for special people
CN105528074A (en) * 2015-12-04 2016-04-27 小米科技有限责任公司 Intelligent information interaction method and apparatus, and user terminal
CN106843458A (en) * 2016-12-12 2017-06-13 北京光年无限科技有限公司 A kind of man-machine interaction method and device for intelligent robot
CN107491380A (en) * 2017-07-27 2017-12-19 珠海市魅族科技有限公司 Information output method and device, computer installation and readable storage medium storing program for executing
CN108334583A (en) * 2018-01-26 2018-07-27 上海智臻智能网络科技股份有限公司 Affective interaction method and device, computer readable storage medium, computer equipment
CN108549720A (en) * 2018-04-24 2018-09-18 京东方科技集团股份有限公司 It is a kind of that method, apparatus and equipment, storage medium are pacified based on Emotion identification
CN108733209A (en) * 2018-03-21 2018-11-02 北京猎户星空科技有限公司 Man-machine interaction method, device, robot and storage medium
CN108874895A (en) * 2018-05-22 2018-11-23 北京小鱼在家科技有限公司 Interactive information method for pushing, device, computer equipment and storage medium
CN108960402A (en) * 2018-06-11 2018-12-07 上海乐言信息科技有限公司 A kind of mixed strategy formula emotion towards chat robots pacifies system
CN108937972A (en) * 2018-06-08 2018-12-07 青岛大学附属医院 A kind of medical user emotion monitoring method of multiple features fusion
CN109376633A (en) * 2018-10-15 2019-02-22 北京车联天下信息技术有限公司 A kind of children pacify method and device
CN110135257A (en) * 2019-04-12 2019-08-16 深圳壹账通智能科技有限公司 Business recommended data generation, device, computer equipment and storage medium
CN110598611A (en) * 2019-08-30 2019-12-20 深圳智慧林网络科技有限公司 Nursing system, patient nursing method based on nursing system and readable storage medium
CN110808071A (en) * 2019-10-29 2020-02-18 浙江萌宠日记信息科技股份有限公司 Mother and infant information transfer method and system based on information fusion
CN111144906A (en) * 2019-12-26 2020-05-12 联想(北京)有限公司 Data processing method and device and electronic equipment

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11269891B2 (en) * 2014-08-21 2022-03-08 Affectomatics Ltd. Crowd-based scores for experiences from measurements of affective response

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research on Agent-based virtual human emotion interaction modeling technology; Xue Weimin (薛为民); Journal of Beijing Union University, No. 02; full text *

Also Published As

Publication number Publication date
CN111741116A (en) 2020-10-02

Similar Documents

Publication Publication Date Title
US20160171979A1 (en) Tiled grammar for phrase spotting with a persistent companion device
US10623198B2 (en) Smart electronic device for multi-user environment
WO2014152015A1 (en) Apparatus and methods for providing a persistent companion device
CN105847099B (en) Internet of things implementation system and method based on artificial intelligence
CA3019535A1 (en) Persistent companion device configuration and deployment platform
CN105979312B (en) Information sharing method and device
US20140215505A1 (en) Systems and methods for supplementing content with audience-requested information
JP2016076799A (en) Consumer electronics administrative system, consumer electronics, remote-control device, and robot
CN108306851B (en) Information acquisition method, information providing method, information acquisition device, information providing device and information acquisition system
CN104407702A (en) Method, device and system for performing actions based on context awareness
US20140288678A1 (en) Electrical appliance control apparatus, electrical appliance control method, electrical appliance control system, input device, and electrical appliance
KR20160071111A (en) Providing personal assistant service in an electronic device
US10949371B2 (en) Interactive content distribution system with mobile charging device interface
CN106325228A (en) Method and device for generating control data of robot
KR20180133593A (en) Mediating method and device
CN108600680A (en) Method for processing video frequency, terminal and computer readable storage medium
CN111741116B (en) Emotion interaction method and device, storage medium and electronic device
US9392320B1 (en) Adaptive battery life enhancer
CN103905837A (en) Image processing method and device and terminal
KR20200079913A (en) Method for dynamically recommending catalog and electonic device therof
CN111050105A (en) Video playing method and device, toy robot and readable storage medium
JP6883451B2 (en) Servers, information processing methods, network systems, and terminals
CN113825004B (en) Multi-screen sharing method and device for display content, storage medium and electronic device
CN110245295A (en) A kind of face identification method for information recommendation, apparatus and system
CN109166585A (en) The method and device of voice control, storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant