CN108875025B - Smart home emotion interaction system - Google Patents

Smart home emotion interaction system

Info

Publication number
CN108875025B
CN108875025B
Authority
CN
China
Prior art keywords
emotion
text
sentence
words
emotional
Prior art date
Legal status
Active
Application number
CN201810640781.9A
Other languages
Chinese (zh)
Other versions
CN108875025A (en)
Inventor
Wei Qiaoping (魏巧萍)
Current Assignee
Jiangsu Haoji Network Technology Group Co.,Ltd.
Original Assignee
Jiangsu Haoji Network Technology Group Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Jiangsu Haoji Network Technology Group Co., Ltd.
Priority to CN201810640781.9A
Publication of CN108875025A
Application granted
Publication of CN108875025B

Classifications

    • G PHYSICS
    • G10 MUSICAL INSTRUMENTS; ACOUSTICS
    • G10L SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
    • G10L 15/00 Speech recognition
    • G10L 15/26 Speech to text systems
    • G10L 25/00 Speech or voice analysis techniques not restricted to a single one of groups G10L 15/00-G10L 21/00
    • G10L 25/48 Speech or voice analysis techniques not restricted to a single one of groups G10L 15/00-G10L 21/00 specially adapted for particular use
    • G10L 25/51 Speech or voice analysis techniques not restricted to a single one of groups G10L 15/00-G10L 21/00 specially adapted for particular use for comparison or discrimination
    • G10L 25/63 Speech or voice analysis techniques not restricted to a single one of groups G10L 15/00-G10L 21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Acoustics & Sound (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • Multimedia (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychiatry (AREA)
  • Signal Processing (AREA)
  • Machine Translation (AREA)

Abstract

The invention provides a smart home emotion interaction system comprising a voice acquisition module, a character conversion module, an emotion recognition module and an emotion interaction module. The voice acquisition module acquires the user's voice information, the character conversion module converts the voice information into text, the emotion recognition module determines the emotional state of the text, and the emotion interaction module interacts with the user according to the emotional state of the text. The beneficial effects of the invention are: by recognizing the user's emotional state, accurate human-computer emotional interaction in the smart home is achieved.

Description

Smart home emotion interaction system
Technical Field
The invention relates to the technical field of smart homes, and in particular to a smart home emotion interaction system.
Background
With continual advances in science and technology, disciplines such as information science, cognitive science, brain science, psychology and anthropology are changing with each passing day, and their development places new demands on the future of artificial intelligence. At the same time, as society grows more developed, both the level of science and technology and people's cultural life keep rising, bringing rapid economic development and great improvements in living standards and quality of life. It is against this backdrop that the Internet of Things has emerged, and the smart home, one of its most widely applied forms, has attracted the attention of researchers at home and abroad.
In smart home research, however, researchers have considered mainly the hardware environment of the system while neglecting the software environment, and have thus tended to overlook users' personalized experience and inherent needs. Therefore, besides intelligent hardware facilities, an intelligent soft environment is also needed to achieve harmonious, personalized human-computer interaction.
Disclosure of Invention
In view of the above problems, the invention aims to provide a smart home emotion interaction system.
The object of the invention is achieved by the following technical solution:
the intelligent home emotion interaction system comprises a voice acquisition module, a character conversion module, an emotion recognition module and an emotion interaction module, wherein the voice acquisition module is used for acquiring voice information of a user, the character conversion module is used for converting the voice information into a text, the emotion recognition module is used for determining the emotion state of the text, and the emotion interaction module is used for interacting with the user according to the emotion state of the text.
The beneficial effects of the invention are: by recognizing the user's emotional state, accurate human-computer emotional interaction in the smart home is achieved.
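For concreteness, the following is a minimal sketch of how the four modules could be wired together in Python. The patent specifies no implementation, so every class and method name here (capture, to_text, and so on) is an illustrative assumption; a real system would plug in an actual microphone driver, a speech-to-text engine, and the emotion model described below.

```python
class SmartHomeEmotionSystem:
    """Illustrative wiring of the four modules; all component interfaces are assumed."""

    def __init__(self, voice_acquisition, char_conversion,
                 emotion_recognition, emotion_interaction):
        self.voice_acquisition = voice_acquisition      # captures the user's voice
        self.char_conversion = char_conversion          # converts speech to text
        self.emotion_recognition = emotion_recognition  # maps text to an emotional state
        self.emotion_interaction = emotion_interaction  # responds according to that state

    def handle_utterance(self) -> None:
        audio = self.voice_acquisition.capture()           # voice acquisition module
        text = self.char_conversion.to_text(audio)         # character conversion module
        state = self.emotion_recognition.text_state(text)  # emotion recognition module
        self.emotion_interaction.respond(state)            # emotion interaction module
```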
Optionally, the emotion recognition module includes a word segmentation module, an emotion word determination module, an emotion word modeling module, a sentence emotion state determination module, and a text emotion state determination module, where the word segmentation module is configured to segment a sentence in a text into words, the emotion word determination module is configured to screen out words representing emotion characteristics from the words as emotion words, the emotion word modeling module is configured to establish a mathematical model of the emotion words, the sentence emotion state determination module is configured to determine an emotion state of the sentence according to the mathematical model of the emotion words, and the text emotion state determination module is configured to determine an emotion state of the text according to the emotion state of the sentence.
Optionally, the emotion word modeling module is configured to establish a mathematical model of an emotion word, and specifically includes:
an emotion word is denoted by q_n and its emotional state by Z(q_n) = [b(q_n), c(q_n)], where n = 1, 2, …, N, N is the number of emotion words in the sentence, b(q_n) denotes the emotional polarity of the emotion word, and c(q_n) denotes the emotional intensity of the emotion word;
the emotion polarity of the emotion words is specifically: if b (q)n) If 1, the emotion word is a forward emotion word, and if b (q)n) If the emotion words are negative emotion words, the negative emotion words are negative emotion words;
the emotion intensity of the emotion words is specifically as follows:
[The defining formula for c(q_n) appears in the original only as an image (BDA0001702336340000021) and is not reproduced in the text.]
c(q_n) denotes the emotional intensity of emotion word q_n; the larger c(q_n), the greater the emotional intensity of the emotion word.
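As a concrete reading of this word-level model, the sketch below encodes each emotion word as a (polarity, intensity) pair. The lexicon entries and the to_emotion_word helper are hypothetical, and since c(q_n) is defined only in a formula image in the original, the intensities here are simply assumed lexicon values:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class EmotionWord:
    token: str  # the segmented word q_n
    b: int      # emotional polarity b(q_n): +1 positive, -1 negative
    c: float    # emotional intensity c(q_n) > 0; assumed to come from a lexicon,
                # as the defining formula appears only as an image in the original

# Hypothetical lexicon: word -> (polarity, intensity)
LEXICON = {"happy": (1, 0.9), "pleasant": (1, 0.6), "sad": (-1, 0.8)}

def to_emotion_word(token: str) -> Optional[EmotionWord]:
    """Screen a segmented word: keep it as an emotion word only if it carries emotion."""
    if token in LEXICON:
        b, c = LEXICON[token]
        return EmotionWord(token, b, c)
    return None
```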
Optionally, the sentence emotional state determining module is configured to determine the emotional state of a sentence according to the mathematical model of the emotion words, and specifically includes:
a sentence is denoted by p_m and its emotional state by Z(p_m) = [d(p_m), e(p_m)], where m = 1, 2, …, M, M is the number of sentences in the text, d(p_m) denotes the emotional polarity of the sentence, and e(p_m) denotes the emotional intensity of the sentence;
the emotional polarity of the sentence is specifically: d (p) if the sum of the emotional intensity of the positive emotional words in the sentence is greater than that of the negative emotional words in the sentencem) 1, the sentence is a positive emotion sentence, and if the sum of the emotion intensities of the positive emotion words in the sentence is smaller than the sum of the emotion intensities of the negative emotion words in the sentence, d (p)m) If the sum of the emotion intensities of the positive emotion words in the sentence is equal to the sum of the emotion intensities of the negative emotion words in the sentence, d (p)m) 0, the sentence is a neutral emotion sentence;
the emotional intensity of the sentence is specifically as follows:
[The defining formula for e(p_m) appears in the original only as an image (BDA0001702336340000022) and is not reproduced in the text; per the surrounding description it is computed from the two intensity sums below.]
In the formula, e(p_m) denotes the emotional intensity of sentence p_m, Σ[c(q_n)]_pos denotes the sum of the emotional intensities of the positive emotion words in the sentence, and Σ[c(q_n)]_neg denotes the sum of the emotional intensities of the negative emotion words in the sentence; the larger e(p_m), the greater the emotional intensity of the sentence.
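Continuing the sketch (and reusing EmotionWord from above), the sentence-level rule follows directly from comparing the two intensity sums. Because the formula for e(p_m) is only an image in the original, the absolute difference of the sums is used here as an assumed stand-in that at least preserves the stated monotonicity:

```python
def sentence_state(words: list[EmotionWord]) -> tuple[int, float]:
    """Return Z(p_m) = (d, e): polarity d in {1, -1, 0} and intensity e >= 0."""
    pos = sum(w.c for w in words if w.b == 1)   # sum over positive emotion words
    neg = sum(w.c for w in words if w.b == -1)  # sum over negative emotion words
    if pos > neg:
        d = 1    # positive emotion sentence
    elif pos < neg:
        d = -1   # negative emotion sentence
    else:
        d = 0    # neutral emotion sentence
    e = abs(pos - neg)  # ASSUMED form of e(p_m); the original formula is not in the text
    return d, e
```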
Optionally, the text emotional state determining module is configured to determine an emotional state of the text according to an emotional state of the sentence, and specifically includes:
the text is represented by w, the text emotion state is represented by Z (w) ═ f (w), h (w), wherein f (w) represents the emotion polarity of the text, and h (w) represents the emotion intensity of the text;
the emotion polarity of the text is specifically: if the sum of the emotion intensities of the positive emotion sentences in the text is greater than the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 1, the text is a positive emotion text, if the sum of the emotion intensities of the positive emotion sentences in the text is less than the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 1, the text is a negative emotion text, if the sum of the emotion intensities of the positive emotion sentences in the text is equal to the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 0, and the text is a neutral emotion text;
the emotion intensity of the text is specifically as follows:
[The defining formula for h(w) appears in the original only as an image (BDA0001702336340000031) and is not reproduced in the text; per the surrounding description it is computed from the two intensity sums below.]
where h(w) denotes the emotional intensity of text w, Σ[e(p_m)]_pos denotes the sum of the emotional intensities of the positive emotion sentences in the text, and Σ[e(p_m)]_neg denotes the sum of the emotional intensities of the negative emotion sentences in the text; the larger h(w), the greater the emotional intensity of the text.
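The text-level rule mirrors the sentence-level one, with per-sentence intensities in place of word intensities; as before, the absolute difference is an assumed stand-in for the h(w) formula shown only as an image:

```python
def text_state(sentences: list[tuple[int, float]]) -> tuple[int, float]:
    """Return Z(w) = (f, h) from the per-sentence states (d, e)."""
    pos = sum(e for d, e in sentences if d == 1)   # intensities of positive sentences
    neg = sum(e for d, e in sentences if d == -1)  # intensities of negative sentences
    if pos > neg:
        f = 1    # positive emotion text
    elif pos < neg:
        f = -1   # negative emotion text
    else:
        f = 0    # neutral emotion text
    h = abs(pos - neg)  # ASSUMED form of h(w); the original formula is not in the text
    return f, h

# Example: a text with one clearly positive and one mildly negative sentence
print(text_state([(1, 1.5), (-1, 0.5)]))  # -> (1, 1.0): positive text, intensity 1.0
```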
Drawings
The invention is further illustrated by the accompanying drawing; the embodiment shown in the drawing does not limit the invention in any way, and a person skilled in the art can obtain other drawings from it without inventive effort.
FIG. 1 is a schematic structural view of the present invention;
reference numerals:
the device comprises a voice acquisition module 1, a character conversion module 2, an emotion recognition module 3 and an emotion interaction module 4.
Detailed Description
The invention is further described with reference to the following examples.
Referring to fig. 1, the smart home emotion interaction system of this embodiment includes a voice acquisition module 1, a character conversion module 2, an emotion recognition module 3 and an emotion interaction module 4; the voice acquisition module 1 is used for acquiring the user's voice information, the character conversion module 2 is used for converting the voice information into text, the emotion recognition module 3 is used for determining the emotional state of the text, and the emotion interaction module 4 is used for interacting with the user according to the emotional state of the text.
By recognizing the user's emotional state, this embodiment achieves accurate human-computer emotional interaction in the smart home.
Preferably, the emotion recognition module 3 comprises a word segmentation module, an emotion word determination module, an emotion word modeling module, a sentence emotion state determination module and a text emotion state determination module, wherein the word segmentation module is used for segmenting a sentence in the text into words, the emotion word determination module is used for screening out words representing emotion characteristics from the words as emotion words, the emotion word modeling module is used for establishing a mathematical model of the emotion words, the sentence emotion state determination module is used for determining the emotion state of the sentence according to the mathematical model of the emotion words, and the text emotion state determination module is used for determining the emotion state of the text according to the emotion state of the sentence;
the preferred embodiment lays a foundation for text emotion recognition by screening the emotion words and establishing a mathematical model, and realizes accurate recognition of text emotion.
Preferably, the emotion word modeling module is configured to establish a mathematical model of an emotion word, and specifically includes:
an emotion word is denoted by q_n and its emotional state by Z(q_n) = [b(q_n), c(q_n)], where n = 1, 2, …, N, N is the number of emotion words in the sentence, b(q_n) denotes the emotional polarity of the emotion word, and c(q_n) denotes the emotional intensity of the emotion word;
the emotion polarity of the emotion words is specifically: if b (q)n) If 1, the emotion word is a forward emotion word, and if b (q)n) If the emotion words are negative emotion words, the negative emotion words are negative emotion words;
the emotion intensity of the emotion words is specifically as follows:
[The defining formula for c(q_n) appears in the original only as an image (BDA0001702336340000041) and is not reproduced in the text.]
c(q_n) denotes the emotional intensity of emotion word q_n; the larger c(q_n), the greater the emotional intensity of the emotion word;
the preferred embodiment establishes a mathematical model of the emotional words by defining the emotional polarity and the emotional intensity of the emotional words, and lays a foundation for determining the emotional states of subsequent sentences and texts;
the sentence emotional state determining module is used for determining the emotional state of the sentence according to the mathematical model of the emotional word, and specifically comprises the following steps:
a sentence is denoted by p_m and its emotional state by Z(p_m) = [d(p_m), e(p_m)], where m = 1, 2, …, M, M is the number of sentences in the text, d(p_m) denotes the emotional polarity of the sentence, and e(p_m) denotes the emotional intensity of the sentence;
the emotional polarity of the sentence is specifically: d (p) if the sum of the emotional intensity of the positive emotional words in the sentence is greater than that of the negative emotional words in the sentencem) 1, the sentence is a positive emotion sentence, and if the sum of the emotion intensities of the positive emotion words in the sentence is smaller than the sum of the emotion intensities of the negative emotion words in the sentence, d (p)m) If the sum of the emotion intensities of the positive emotion words in the sentence is equal to the sum of the emotion intensities of the negative emotion words in the sentence, d (p)m) 0, the sentence is a neutral emotion sentence;
the emotional intensity of the sentence is specifically as follows:
[The defining formula for e(p_m) appears in the original only as an image (BDA0001702336340000051) and is not reproduced in the text; per the surrounding description it is computed from the two intensity sums below.]
In the formula, e(p_m) denotes the emotional intensity of sentence p_m, Σ[c(q_n)]_pos denotes the sum of the emotional intensities of the positive emotion words in the sentence, and Σ[c(q_n)]_neg denotes the sum of the emotional intensities of the negative emotion words in the sentence; the larger e(p_m), the greater the emotional intensity of the sentence;
the text emotional state determining module is used for determining the emotional state of the text according to the emotional state of the sentence, and specifically comprises the following steps:
the text is represented by w, the text emotion state is represented by Z (w) ═ f (w), h (w), wherein f (w) represents the emotion polarity of the text, and h (w) represents the emotion intensity of the text;
the emotion polarity of the text is specifically: if the sum of the emotion intensities of the positive emotion sentences in the text is greater than the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 1, the text is a positive emotion text, if the sum of the emotion intensities of the positive emotion sentences in the text is less than the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 1, the text is a negative emotion text, if the sum of the emotion intensities of the positive emotion sentences in the text is equal to the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 0, and the text is a neutral emotion text;
the emotion intensity of the text is specifically as follows:
[The defining formula for h(w) appears in the original only as an image (BDA0001702336340000052) and is not reproduced in the text; per the surrounding description it is computed from the two intensity sums below.]
where h(w) denotes the emotional intensity of text w, Σ[e(p_m)]_pos denotes the sum of the emotional intensities of the positive emotion sentences in the text, and Σ[e(p_m)]_neg denotes the sum of the emotional intensities of the negative emotion sentences in the text; the larger h(w), the greater the emotional intensity of the text;
the preferred embodiment analyzes the emotional words forming the sentences to determine the emotional states of the sentences, analyzes the sentences forming the text to determine the emotional states of the text, lays a foundation for accurate identification of emotion, and particularly realizes accurate acquisition of the emotional states of the text by defining the emotional polarities and the emotional intensities of the sentences and the text according to the emotional states of the emotional words.
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the invention, not to limit its scope of protection. Although the invention has been described in detail with reference to the preferred embodiments, those of ordinary skill in the art should understand that modifications or equivalent substitutions may be made to the technical solutions of the invention without departing from their spirit and scope.

Claims (3)

1. The smart home emotion interaction system is characterized by comprising a voice acquisition module, a character conversion module, an emotion recognition module and an emotion interaction module, wherein the voice acquisition module is used for acquiring voice information of a user, the character conversion module is used for converting the voice information into a text, the emotion recognition module is used for determining the emotion state of the text, and the emotion interaction module is used for interacting with the user according to the emotion state of the text;
the emotion recognition module comprises a word segmentation module, an emotion word determination module, an emotion word modeling module, a sentence emotion state determination module and a text emotion state determination module, wherein the word segmentation module is used for segmenting a sentence in a text into words, the emotion word determination module is used for screening out words representing emotion characteristics from the words as emotion words, the emotion word modeling module is used for building a mathematical model of the emotion words, the sentence emotion state determination module is used for determining the emotion state of the sentence according to the mathematical model of the emotion words, and the text emotion state determination module is used for determining the emotion state of the text according to the emotion state of the sentence;
the emotion word modeling module is used for establishing a mathematical model of the emotion words, and specifically comprises the following steps:
an emotion word is denoted by q_n and its emotional state by Z(q_n) = [b(q_n), c(q_n)], where n = 1, 2, …, N, N is the number of emotion words in the sentence, b(q_n) denotes the emotional polarity of the emotion word, and c(q_n) denotes the emotional intensity of the emotion word;
the emotion polarity of the emotion words is specifically: if b (q)n) If 1, the emotion word is a forward emotion word, and if b (q)n) If the emotion words are negative emotion words, the negative emotion words are negative emotion words;
the emotion intensity of the emotion words is specifically as follows:
[The defining formula for c(q_n) appears in the original only as an image (FDA0003032028820000011) and is not reproduced in the text.]
c(q_n) denotes the emotional intensity of emotion word q_n; the larger c(q_n), the greater the emotional intensity of the emotion word.
2. The smart home emotion interaction system of claim 1, wherein the sentence emotion state determination module is configured to determine an emotion state of a sentence according to a mathematical model of an emotion word, and specifically comprises:
a sentence is denoted by p_m and its emotional state by Z(p_m) = [d(p_m), e(p_m)], where m = 1, 2, …, M, M is the number of sentences in the text, d(p_m) denotes the emotional polarity of the sentence, and e(p_m) denotes the emotional intensity of the sentence;
the emotional polarity of the sentence is specifically: d (p) if the sum of the emotional intensity of the positive emotional words in the sentence is greater than that of the negative emotional words in the sentencem) 1, the sentence is a positive emotion sentence, and if the sum of the emotion intensities of the positive emotion words in the sentence is smaller than the sum of the emotion intensities of the negative emotion words in the sentence, d (p)m) If the sum of the emotion intensities of the positive emotion words in the sentence is equal to the sum of the emotion intensities of the negative emotion words in the sentence, d (p)m) 0, the sentence is a neutral emotion sentence;
the emotional intensity of the sentence is specifically as follows:
[The defining formula for e(p_m) appears in the original only as an image (FDA0003032028820000021) and is not reproduced in the text; per the surrounding description it is computed from the two intensity sums below.]
In the formula, e(p_m) denotes the emotional intensity of sentence p_m, Σ[c(q_n)]_pos denotes the sum of the emotional intensities of the positive emotion words in the sentence, and Σ[c(q_n)]_neg denotes the sum of the emotional intensities of the negative emotion words in the sentence; the larger e(p_m), the greater the emotional intensity of the sentence.
3. The smart home emotion interaction system of claim 2, wherein the text emotion state determination module is configured to determine an emotion state of a text according to an emotion state of a sentence, and specifically is configured to:
the text is represented by w, the text emotion state is represented by Z (w) ═ f (w), h (w), wherein f (w) represents the emotion polarity of the text, and h (w) represents the emotion intensity of the text;
the emotion polarity of the text is specifically: if the sum of the emotion intensities of the positive emotion sentences in the text is greater than the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 1, the text is a positive emotion text, if the sum of the emotion intensities of the positive emotion sentences in the text is less than the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 1, the text is a negative emotion text, if the sum of the emotion intensities of the positive emotion sentences in the text is equal to the sum of the emotion intensities of the negative emotion sentences in the text, f (w) ═ 0, and the text is a neutral emotion text;
the emotion intensity of the text is specifically as follows:
[The defining formula for h(w) appears in the original only as an image (FDA0003032028820000022) and is not reproduced in the text; per the surrounding description it is computed from the two intensity sums below.]
where h(w) denotes the emotional intensity of text w, Σ[e(p_m)]_pos denotes the sum of the emotional intensities of the positive emotion sentences in the text, and Σ[e(p_m)]_neg denotes the sum of the emotional intensities of the negative emotion sentences in the text; the larger h(w), the greater the emotional intensity of the text.
Application CN201810640781.9A (priority date 2018-06-21, filing date 2018-06-21): Smart home emotion interaction system. Granted as CN108875025B; status: Active.

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN201810640781.9A | 2018-06-21 | 2018-06-21 | Smart home emotion interaction system

Applications Claiming Priority (1)

Application Number | Priority Date | Filing Date | Title
CN201810640781.9A | 2018-06-21 | 2018-06-21 | Smart home emotion interaction system

Publications (2)

Publication Number | Publication Date
CN108875025A | 2018-11-23
CN108875025B | 2021-12-03

Family

ID=64340160

Family Applications (1)

Application Number | Title | Priority Date | Filing Date | Status
CN201810640781.9A (granted as CN108875025B) | Smart home emotion interaction system | 2018-06-21 | 2018-06-21 | Active

Country Status (1)

Country | Publication
CN | CN108875025B

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN111833907B * | 2020-01-08 | 2023-07-18 | 北京嘀嘀无限科技发展有限公司 | Man-machine interaction method, terminal and computer readable storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
US7991607B2 * | 2005-06-27 | 2011-08-02 | Microsoft Corporation | Translation and capture architecture for output of conversational utterances

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number | Priority date | Publication date | Assignee | Title
CN105244042A * | 2015-08-26 | 2016-01-13 | 安徽建筑大学 | FSA (Finite State Automaton) based voice emotion interaction device and method
CN107291696A * | 2017-06-28 | 2017-10-24 | 达而观信息科技(上海)有限公司 | Comment text sentiment analysis method and system based on deep learning
CN107944008A * | 2017-12-08 | 2018-04-20 | 神思电子技术股份有限公司 | Method for emotion recognition of natural language

Also Published As

Publication number | Publication date
CN108875025A | 2018-11-23


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
TA01: Transfer of patent application right
    Effective date of registration: 2021-11-18
    Address after: 222000 17-2-1805 Huaguoshan Avenue, Haizhou District, Lianyungang City, Jiangsu Province
    Applicant after: Jiangsu Haoji Network Technology Group Co., Ltd.
    Address before: 030600 No. 4, Row 2, Zone 1, Xiaohe Huayuan, Yuci District, Jinzhong City, Shanxi Province
    Applicant before: Wei Qiaoping
GR01: Patent grant