CN108984522B - Intelligent nursing system - Google Patents


Info

Publication number
CN108984522B
Authority
CN
China
Prior art keywords
emotion
text
emotional
sentence
words
Prior art date
Legal status
Active
Application number
CN201810666145.3A
Other languages
Chinese (zh)
Other versions
CN108984522A (en)
Inventor
肖金保 (Xiao Jinbao)
Current Assignee
Beijing Yijia Lao Xiao Technology Co ltd
Original Assignee
Beijing Yijia Lao Xiao Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Beijing Yijia Lao Xiao Technology Co ltd
Priority to CN201810666145.3A
Publication of CN108984522A
Application granted
Publication of CN108984522B

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00: Handling natural language data
    • G06F40/20: Natural language analysis
    • G06F40/279: Recognition of textual entities
    • G06F40/284: Lexical analysis, e.g. tokenisation or collocates
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00: ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/30: ditto, for calculating health indices; for individual health risk assessment
    • G16H80/00: ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring


Abstract

The invention provides an intelligent nursing system comprising a data acquisition subsystem, a data processing subsystem, an emotion recognition subsystem, an emotion interaction subsystem and a display device. The data acquisition subsystem collects the user's physiological data, including heart rate and body temperature, through sensors; the data processing subsystem analyzes the user's health condition from the physiological data; the emotion recognition subsystem recognizes the user's emotional state; the emotion interaction subsystem interacts with the user according to the health condition and emotional state; and the display device displays the user's health condition. The beneficial effect of the invention is an intelligent nursing system that recognizes both the user's health condition and emotional state, improving the user experience while providing health care.

Description

Intelligent nursing system
Technical Field
The invention relates to the technical field of nursing, in particular to an intelligent nursing system.
Background
As society continues to develop and the level of science and technology rises, people's living standards and quality of life have improved greatly, and people place higher demands on health. Many elderly people and children require health care.
Existing nursing systems tend to ignore the user's personalized experience and emotional needs, resulting in a poor user experience.
Disclosure of Invention
In view of the above problems, the present invention aims to provide an intelligent nursing system.
The object of the invention is achieved by the following technical solution:
the intelligent nursing system comprises a data acquisition subsystem, a data processing subsystem, an emotion recognition subsystem, an emotion interaction subsystem and a display device; the data acquisition subsystem collects the user's physiological data, including heart rate and body temperature, through sensors; the data processing subsystem analyzes the user's health condition from the physiological data; the emotion recognition subsystem recognizes the user's emotional state; the emotion interaction subsystem interacts with the user according to the health condition and emotional state; and the display device displays the user's health condition.
The beneficial effects of the invention are: an intelligent nursing system is provided that recognizes the user's health condition and emotional state, improving the user experience while providing health care.
Optionally, the emotion recognition subsystem includes a voice acquisition module, a text conversion module and an emotion recognition module. The voice acquisition module acquires the user's voice; the text conversion module converts the voice into text; and the emotion recognition module determines the emotional state of the text.
The emotion recognition module comprises a word segmentation module, an emotion word determination module, an emotion word modeling module, a sentence emotional state determination module and a text emotional state determination module. The word segmentation module segments each sentence of the text into words; the emotion word determination module screens out the words that express emotional features as emotion words; the emotion word modeling module builds a mathematical model of the emotion words; the sentence emotional state determination module determines each sentence's emotional state from that model; and the text emotional state determination module determines the text's emotional state from the sentences' emotional states.
Optionally, the emotion word modeling module is configured to establish a mathematical model of an emotion word, and specifically includes:
an emotion word is denoted by q_n and its emotional state by Z(q_n) = [b(q_n), c(q_n)], where n = 1, 2, …, N and N is the number of emotion words in the sentence; b(q_n) denotes the emotional polarity of the emotion word and c(q_n) its emotional intensity;
the emotional polarity of an emotion word is defined as follows: if b(q_n) = 1, the emotion word is a positive emotion word; if b(q_n) = -1, the emotion word is a negative emotion word;
the emotional intensity of an emotion word is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
c(q_n) denotes the emotional intensity of emotion word q_n; the larger c(q_n), the greater the emotional intensity of the emotion word.
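As an illustration only (not part of the patent), the emotion-word model Z(q_n) = [b(q_n), c(q_n)] can be sketched as a small lexicon of polarity/intensity pairs. The lexicon entries and intensity values below are assumptions, since the patent gives its intensity formula only as an image:

```python
from dataclasses import dataclass

@dataclass
class EmotionWord:
    word: str         # q_n
    polarity: int     # b(q_n): +1 positive, -1 negative
    intensity: float  # c(q_n): larger means stronger emotion

# Toy lexicon standing in for the screened emotion words (values are assumptions).
LEXICON = {
    "happy": EmotionWord("happy", +1, 0.75),
    "calm": EmotionWord("calm", +1, 0.25),
    "sad": EmotionWord("sad", -1, 0.5),
}

def screen_emotion_words(tokens):
    """Screening step: keep only tokens that carry emotional features."""
    return [LEXICON[t] for t in tokens if t in LEXICON]
```

The screening step simply intersects the segmented words with the lexicon, which is how the emotion word determination module is described here.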
Optionally, the sentence emotional state determination module determines the emotional state of a sentence according to the mathematical model of the emotion words, specifically:
a sentence is denoted by p_m and its emotional state by Z(p_m) = [d(p_m), e(p_m)], where m = 1, 2, …, M and M is the number of sentences in the text; d(p_m) denotes the emotional polarity of the sentence and e(p_m) its emotional intensity;
the emotional polarity of a sentence is defined as follows: if the sum of the emotional intensities of the positive emotion words in the sentence is greater than that of the negative emotion words, d(p_m) = 1 and the sentence is a positive emotion sentence; if it is smaller, d(p_m) = -1 and the sentence is a negative emotion sentence; if the two sums are equal, d(p_m) = 0 and the sentence is a neutral emotion sentence;
the emotional intensity of a sentence is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
in the formula, e(p_m) denotes the emotional intensity of sentence p_m, Σ[c(q_n)]_pos denotes the sum of the emotional intensities of the positive emotion words in the sentence, and Σ[c(q_n)]_neg the sum for the negative emotion words; the larger e(p_m), the greater the emotional intensity of the sentence.
Optionally, the text emotional state determining module is configured to determine an emotional state of the text according to an emotional state of the sentence, and specifically includes:
a text is denoted by w and its emotional state by Z(w) = [f(w), h(w)], where f(w) denotes the emotional polarity of the text and h(w) its emotional intensity;
the emotional polarity of a text is defined as follows: if the sum of the emotional intensities of the positive emotion sentences in the text is greater than that of the negative emotion sentences, f(w) = 1 and the text is a positive emotion text; if it is smaller, f(w) = -1 and the text is a negative emotion text; if the two sums are equal, f(w) = 0 and the text is a neutral emotion text;
the emotional intensity of a text is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
in the formula, h(w) denotes the emotional intensity of text w, Σ[e(p_m)]_pos denotes the sum of the emotional intensities of the positive emotion sentences in the text, and Σ[e(p_m)]_neg the sum for the negative emotion sentences; the larger h(w), the greater the emotional intensity of the text.
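The text-level rule mirrors the sentence level. Here is a minimal sketch taking per-sentence (d, e) pairs, again with |Σe_pos - Σe_neg| as an assumed form for h(w), since the source formula is an unrendered image:

```python
def text_state(sentence_states):
    """Compute Z(w) = [f(w), h(w)] from per-sentence (d, e) pairs."""
    pos = sum(e for d, e in sentence_states if d > 0)  # positive sentences
    neg = sum(e for d, e in sentence_states if d < 0)  # negative sentences
    f = 1 if pos > neg else (-1 if pos < neg else 0)   # f(w) per the comparison rule
    h = abs(pos - neg)  # h(w): assumed form, not the patent's exact formula
    return f, h
```

Neutral sentences (d = 0) contribute to neither sum, matching the polarity definition above.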
Drawings
The invention is further illustrated by the accompanying drawings. The embodiments shown in the drawings do not limit the invention in any way; a person skilled in the art can obtain other drawings from them without inventive effort.
FIG. 1 is a schematic structural view of the present invention;
reference numerals are as follows:
the system comprises a data acquisition subsystem 1, a data processing subsystem 2, an emotion recognition subsystem 3, an emotion interaction subsystem 4 and a display device 5.
Detailed Description
The invention is further described with reference to the following examples.
Referring to fig. 1, the intelligent nursing system of this embodiment includes a data acquisition subsystem 1, a data processing subsystem 2, an emotion recognition subsystem 3, an emotion interaction subsystem 4 and a display device 5. The data acquisition subsystem 1 collects the user's physiological data, including heart rate and body temperature, through sensors; the data processing subsystem 2 analyzes the user's health condition from the physiological data; the emotion recognition subsystem 3 recognizes the user's emotional state; the emotion interaction subsystem 4 interacts with the user according to the health condition and emotional state; and the display device 5 displays the user's health condition.
The embodiment provides an intelligent nursing system, which identifies the health condition and the emotional state of a user, realizes the health nursing of the user and simultaneously improves the user experience.
Preferably, the emotion recognition subsystem 3 includes a voice acquisition module, a text conversion module and an emotion recognition module. The voice acquisition module acquires the user's voice; the text conversion module converts the voice into text; and the emotion recognition module determines the emotional state of the text.
The emotion recognition module comprises a word segmentation module, an emotion word determination module, an emotion word modeling module, a sentence emotional state determination module and a text emotional state determination module. The word segmentation module segments each sentence of the text into words; the emotion word determination module screens out the words that express emotional features as emotion words; the emotion word modeling module builds a mathematical model of the emotion words; the sentence emotional state determination module determines each sentence's emotional state from that model; and the text emotional state determination module determines the text's emotional state from the sentences' emotional states;
the preferred embodiment lays a foundation for text emotion recognition by screening the emotion words and establishing a mathematical model, and realizes accurate recognition of text emotion.
Preferably, the emotion word modeling module is used for establishing a mathematical model of an emotion word, and specifically comprises:
an emotion word is denoted by q_n and its emotional state by Z(q_n) = [b(q_n), c(q_n)], where n = 1, 2, …, N and N is the number of emotion words in the sentence; b(q_n) denotes the emotional polarity of the emotion word and c(q_n) its emotional intensity;
the emotional polarity of an emotion word is defined as follows: if b(q_n) = 1, the emotion word is a positive emotion word; if b(q_n) = -1, the emotion word is a negative emotion word;
the emotional intensity of an emotion word is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
c(q_n) denotes the emotional intensity of emotion word q_n; the larger c(q_n), the greater the emotional intensity of the emotion word;
the preferred embodiment establishes a mathematical model of the emotional words by defining the emotional polarity and the emotional intensity of the emotional words, and lays a foundation for determining the emotional states of subsequent sentences and texts;
the sentence emotional state determining module is used for determining the emotional state of the sentence according to the mathematical model of the emotional word, and specifically comprises the following steps:
a sentence is denoted by p_m and its emotional state by Z(p_m) = [d(p_m), e(p_m)], where m = 1, 2, …, M and M is the number of sentences in the text; d(p_m) denotes the emotional polarity of the sentence and e(p_m) its emotional intensity;
the emotional polarity of a sentence is defined as follows: if the sum of the emotional intensities of the positive emotion words in the sentence is greater than that of the negative emotion words, d(p_m) = 1 and the sentence is a positive emotion sentence; if it is smaller, d(p_m) = -1 and the sentence is a negative emotion sentence; if the two sums are equal, d(p_m) = 0 and the sentence is a neutral emotion sentence;
the emotional intensity of a sentence is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
in the formula, e(p_m) denotes the emotional intensity of sentence p_m, Σ[c(q_n)]_pos denotes the sum of the emotional intensities of the positive emotion words in the sentence, and Σ[c(q_n)]_neg the sum for the negative emotion words; the larger e(p_m), the greater the emotional intensity of the sentence;
the text emotional state determining module is used for determining the emotional state of the text according to the emotional state of the sentence, and specifically comprises the following steps:
a text is denoted by w and its emotional state by Z(w) = [f(w), h(w)], where f(w) denotes the emotional polarity of the text and h(w) its emotional intensity;
the emotional polarity of a text is defined as follows: if the sum of the emotional intensities of the positive emotion sentences in the text is greater than that of the negative emotion sentences, f(w) = 1 and the text is a positive emotion text; if it is smaller, f(w) = -1 and the text is a negative emotion text; if the two sums are equal, f(w) = 0 and the text is a neutral emotion text;
the emotional intensity of a text is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
in the formula, h(w) denotes the emotional intensity of text w, Σ[e(p_m)]_pos denotes the sum of the emotional intensities of the positive emotion sentences in the text, and Σ[e(p_m)]_neg the sum for the negative emotion sentences; the larger h(w), the greater the emotional intensity of the text;
the preferred embodiment analyzes the emotional words forming the sentences to determine the emotional states of the sentences, analyzes the sentences forming the text to determine the emotional states of the text, lays a foundation for accurate identification of emotion, and particularly realizes accurate acquisition of the emotional states of the text by defining the emotional polarities and the emotional intensities of the sentences and the text according to the emotional states of the emotional words.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present invention, not to limit its scope of protection. Although the present invention is described in detail with reference to the preferred embodiments, a person of ordinary skill in the art should understand that modifications or equivalent substitutions can be made to the technical solutions of the present invention without departing from their spirit and scope.

Claims (3)

1. An intelligent nursing system, characterized by comprising a data acquisition subsystem, a data processing subsystem, an emotion recognition subsystem, an emotion interaction subsystem and a display device, wherein the data acquisition subsystem collects the user's physiological data, including heart rate and body temperature, through sensors; the data processing subsystem analyzes the user's health condition from the physiological data; the emotion recognition subsystem recognizes the user's emotional state; the emotion interaction subsystem interacts with the user according to the health condition and emotional state; and the display device displays the user's health condition;
the emotion recognition subsystem comprises a voice acquisition module, a text conversion module and an emotion recognition module; the voice acquisition module acquires the user's voice, the text conversion module converts the voice into text, and the emotion recognition module determines the emotional state of the text;
the emotion recognition module comprises a word segmentation module, an emotion word determination module, an emotion word modeling module, a sentence emotional state determination module and a text emotional state determination module; the word segmentation module segments each sentence of the text into words, the emotion word determination module screens out the words that express emotional features as emotion words, the emotion word modeling module builds a mathematical model of the emotion words, the sentence emotional state determination module determines each sentence's emotional state from that model, and the text emotional state determination module determines the text's emotional state from the sentences' emotional states;
the emotion word modeling module is used for establishing a mathematical model of the emotion words, and specifically comprises the following steps:
an emotion word is denoted by q_n and its emotional state by Z(q_n) = [b(q_n), c(q_n)], where n = 1, 2, …, N and N is the number of emotion words in the sentence; b(q_n) denotes the emotional polarity of the emotion word and c(q_n) its emotional intensity;
the emotional polarity of an emotion word is defined as follows: if b(q_n) = 1, the emotion word is a positive emotion word; if b(q_n) = -1, the emotion word is a negative emotion word;
the emotional intensity of an emotion word is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
c(q_n) denotes the emotional intensity of emotion word q_n; the larger c(q_n), the greater the emotional intensity of the emotion word.
2. The intelligent nursing system of claim 1, wherein the sentence emotional state determination module is configured to determine the emotional state of the sentence according to the mathematical model of the emotional word, and specifically:
a sentence is denoted by p_m and its emotional state by Z(p_m) = [d(p_m), e(p_m)], where m = 1, 2, …, M and M is the number of sentences in the text; d(p_m) denotes the emotional polarity of the sentence and e(p_m) its emotional intensity;
the emotional polarity of a sentence is defined as follows: if the sum of the emotional intensities of the positive emotion words in the sentence is greater than that of the negative emotion words, d(p_m) = 1 and the sentence is a positive emotion sentence; if it is smaller, d(p_m) = -1 and the sentence is a negative emotion sentence; if the two sums are equal, d(p_m) = 0 and the sentence is a neutral emotion sentence;
the emotional intensity of a sentence is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
in the formula, e(p_m) denotes the emotional intensity of sentence p_m, Σ[c(q_n)]_pos denotes the sum of the emotional intensities of the positive emotion words in the sentence, and Σ[c(q_n)]_neg the sum for the negative emotion words; the larger e(p_m), the greater the emotional intensity of the sentence.
3. The intelligent nursing system of claim 2, wherein the text emotional state determination module is configured to determine the emotional state of the text according to the emotional state of the sentence, and specifically:
a text is denoted by w and its emotional state by Z(w) = [f(w), h(w)], where f(w) denotes the emotional polarity of the text and h(w) its emotional intensity;
the emotional polarity of a text is defined as follows: if the sum of the emotional intensities of the positive emotion sentences in the text is greater than that of the negative emotion sentences, f(w) = 1 and the text is a positive emotion text; if it is smaller, f(w) = -1 and the text is a negative emotion text; if the two sums are equal, f(w) = 0 and the text is a neutral emotion text;
the emotional intensity of a text is defined as follows:
(the intensity formula is rendered only as an image in the source and is not reproduced here)
in the formula, h(w) denotes the emotional intensity of text w, Σ[e(p_m)]_pos denotes the sum of the emotional intensities of the positive emotion sentences in the text, and Σ[e(p_m)]_neg the sum for the negative emotion sentences; the larger h(w), the greater the emotional intensity of the text.
CN201810666145.3A 2018-06-21 2018-06-21 Intelligent nursing system Active CN108984522B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810666145.3A CN108984522B (en) 2018-06-21 2018-06-21 Intelligent nursing system


Publications (2)

Publication Number Publication Date
CN108984522A CN108984522A (en) 2018-12-11
CN108984522B true CN108984522B (en) 2022-12-23

Family

ID=64538268

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810666145.3A Active CN108984522B (en) 2018-06-21 2018-06-21 Intelligent nursing system

Country Status (1)

Country Link
CN (1) CN108984522B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105244042A (en) * 2015-08-26 2016-01-13 安徽建筑大学 FSA (Finite State Automaton) based voice emotion interaction device and method
CN105740236A (en) * 2016-01-29 2016-07-06 中国科学院自动化研究所 Writing feature and sequence feature combined Chinese sentiment new word recognition method and system
CN205924042U (en) * 2016-04-11 2017-02-08 西北师范大学 Intelligence emotion discernment health supervision appearance
CN106776557A (en) * 2016-12-13 2017-05-31 竹间智能科技(上海)有限公司 Affective state memory recognition methods and the device of emotional robot
CN106815192A (en) * 2015-11-27 2017-06-09 北京国双科技有限公司 Model training method and device and sentence emotion identification method and device
CN107291696A (en) * 2017-06-28 2017-10-24 达而观信息科技(上海)有限公司 A kind of comment word sentiment analysis method and system based on deep learning

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007167105A (en) * 2005-12-19 2007-07-05 Olympus Corp Apparatus and method for evaluating mind-body correlation data
CN105832073A (en) * 2016-03-22 2016-08-10 华中科技大学 Intelligent interactive emotional care bolster robot system
CN105975798A (en) * 2016-05-30 2016-09-28 深圳达实智能股份有限公司 Smart ward information interaction system and method


Also Published As

Publication number Publication date
CN108984522A (en) 2018-12-11


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20221125

Address after: 100020 2008, 2009, 20th Floor, Building 402, Baiziwan Xili, Chaoyang District, Beijing

Applicant after: Beijing Yijia Lao Xiao Technology Co.,Ltd.

Address before: 030060 No.4, row 2, area 1, Xiaohe Huayuan, Yuci District, Jinzhong City, Shanxi Province

Applicant before: Xiao Jinbao

GR01 Patent grant