RU2000118824A - DEVICE AND METHODS FOR EMOTION DETECTION - Google Patents

DEVICE AND METHODS FOR EMOTION DETECTION

Info

Publication number
RU2000118824A
Authority
RU
Russia
Prior art keywords
individual
information
emotional state
speech
flat sections
Prior art date
Application number
RU2000118824/09A
Other languages
Russian (ru)
Other versions
RU2294023C2 (en)
Inventor
Амир ЛИБЕРМАН
Original Assignee
Амир ЛИБЕРМАН
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from IL122632A0
Application filed by Амир ЛИБЕРМАН filed Critical Амир ЛИБЕРМАН
Publication of RU2000118824A
Application granted granted Critical
Publication of RU2294023C2

Claims (19)

1. A device for determining the emotional state of an individual, comprising a speech analyzer configured to receive a speech sample produced by the individual and to extract intonation information from it, and an emotional state reporting device configured to generate an output indication of the individual's emotional state based on said intonation information.
2. The device according to claim 1, characterized in that said speech sample is supplied to said speech analyzer over a telephone.
3. The device according to claim 1, characterized in that said indication of the individual's emotional state includes a lie detection report based on the individual's emotional state.
4. The device according to any one of claims 1-3, characterized in that said intonation information comprises multidimensional intonation information.
5. The device according to claim 4, characterized in that said multidimensional information comprises at least 3-dimensional information.
6. The device according to claim 5, characterized in that said multidimensional information comprises at least 4-dimensional information.
7. The device according to any one of claims 1-3 and 5-6, characterized in that said intonation information includes information relating to speech spikes.
8. The device according to claim 7, characterized in that said information relating to said spikes comprises the number of spikes within a predetermined period of time.
9. The device according to claim 8, characterized in that said information relating to the spikes comprises the distribution of said spikes in time.
10. The device according to any one of claims 1-3, 5-6 and 8-9, characterized in that said intonation information includes information relating to flat sections of the speech sample.
11. The device according to claim 10, characterized in that said information relating to said flat sections comprises the number of flat sections within predetermined periods of time.
12. The device according to claim 11, characterized in that said information relating to said flat sections comprises information relating to the duration of said flat sections.
13. The device according to claim 12, characterized in that said information relating to the duration of said flat sections comprises the average duration of said flat sections within a predetermined period of time.
14. The device according to claim 12, characterized in that said information relating to the duration of said flat sections comprises the standard deviation of the duration of said flat sections within a predetermined period of time.
15. The device according to any one of claims 1-3, 5-6, 8-9 and 11-14, characterized in that said speech sample comprises a main speech wave having a period, and in that said speech analyzer is configured to analyze the speech sample and determine the frequency of occurrence of flat sections, each flat section indicating that a local low-frequency wave is superimposed on the main speech wave, the emotional state reporting device being configured to generate a corresponding output indication based on the frequency of occurrence of said flat sections.
16. A lie detection system comprising a multidimensional speech analyzer configured to receive a speech sample produced by an individual and to numerically determine a plurality of characteristics of said speech sample, and a credibility evaluation device configured to generate an output indication of the individual's veracity, including lie detection, based on said plurality of numerical characteristics.
17. A method for multidimensional lie detection, according to which a speech sample produced by an individual is received, a plurality of characteristics of said speech sample are numerically determined, and an output indication of the individual's veracity, including lie detection, is generated based on said plurality of numerical characteristics.
18. A method for determining an emotional state, according to which a range of multidimensional characteristics describing the range of emotions of an individual in a calm state is established by monitoring the individual for a plurality of emotional-state-related parameters during a first period of time, during which the individual is in a neutral emotional state, and determining the range of characteristics as a function of the range of those emotional-state-related parameters during said first period; the individual is then monitored for said emotional-state-related parameters during a second period of time, during which it is desired to determine the individual's emotional state, so as to obtain measured values of said plurality of emotional-state-related parameters; and said measured values are adjusted to take said range into account.
19. A method for determining the emotional state of an individual, according to which a speech sample produced by the individual is received, intonation information is extracted from it, and an output indication of the individual's emotional state is generated based on said intonation information.
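
Claims 7-15 above name concrete intonation features: the number of spikes within a predetermined time window, the number of flat sections, and the average and standard deviation of flat-section durations. As an illustration only, the Python sketch below computes such features from a raw sample array; the envelope-based detection rule, the thresholds, and all function and variable names are assumptions by the editor, since the claims define the features but not how they are obtained.

    import numpy as np

    def intonation_features(samples, flat_threshold=0.02, spike_threshold=0.35, min_flat_len=3):
        """Count spikes and flat sections in a speech sample and summarize flat-section durations."""
        envelope = np.abs(np.asarray(samples, dtype=float))   # crude amplitude envelope
        diffs = np.abs(np.diff(envelope))                     # sample-to-sample change

        # Claim 15 reads a flat section as a local low-frequency wave superimposed on the
        # main speech wave; here it is approximated as a run of samples whose envelope
        # barely changes (an assumed detection rule, not taken from the claims).
        flat_lengths = []
        run = 0
        for is_flat in diffs < flat_threshold:
            if is_flat:
                run += 1
            elif run:
                if run >= min_flat_len:
                    flat_lengths.append(run)
                run = 0
        if run >= min_flat_len:
            flat_lengths.append(run)

        # Claims 7-9: number of spikes within the window; a spike is approximated here as
        # a large sample-to-sample jump in the envelope.
        spike_count = int(np.sum(diffs > spike_threshold))

        return {
            "spike_count": spike_count,                                                   # claim 8
            "flat_count": len(flat_lengths),                                              # claim 11
            "mean_flat_duration": float(np.mean(flat_lengths)) if flat_lengths else 0.0,  # claim 13
            "std_flat_duration": float(np.std(flat_lengths)) if flat_lengths else 0.0,    # claim 14
        }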
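
Claims 16-18 describe the other half of the scheme: a baseline range of emotion-related parameters is established while the individual is in a neutral state, later measurements are adjusted against that range, and the adjusted values drive an output indication of veracity. The sketch below shows one way those steps could fit together; the linear normalization, the deviation score, and the tolerance value are illustrative assumptions, as the claims do not specify how the adjustment or the decision is computed.

    def neutral_range(neutral_measurements):
        """Per-parameter (min, max) observed during the first, neutral-state period (claim 18)."""
        keys = neutral_measurements[0].keys()
        return {k: (min(m[k] for m in neutral_measurements),
                    max(m[k] for m in neutral_measurements)) for k in keys}

    def adjust_to_range(measurement, baseline):
        """Re-express a second-period measurement relative to the neutral range (claim 18)."""
        adjusted = {}
        for key, (low, high) in baseline.items():
            span = (high - low) or 1.0                 # guard against a constant parameter
            adjusted[key] = (measurement[key] - low) / span
        return adjusted

    def veracity_indication(adjusted, tolerance=0.5):
        """Map range-adjusted parameters to an output indication of veracity (claims 16-17)."""
        worst = max(max(value - 1.0, -value, 0.0) for value in adjusted.values())
        return "possible deception or high stress" if worst > tolerance else "within neutral baseline"

A caller would build the baseline from several first-period feature dictionaries (for example, outputs of intonation_features above), then pass each later measurement through adjust_to_range and veracity_indication.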
RU2000118824/09A 1997-12-16 1998-12-16 Device and method for detecting emotions RU2294023C2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL12263297A IL122632A0 (en) 1997-12-16 1997-12-16 Apparatus and methods for detecting emotions
IL122632 1997-12-16

Publications (2)

Publication Number Publication Date
RU2000118824A true RU2000118824A (en) 2002-07-27
RU2294023C2 RU2294023C2 (en) 2007-02-20

Family

ID=11070983

Family Applications (1)

Application Number Title Priority Date Filing Date
RU2000118824/09A RU2294023C2 (en) 1997-12-16 1998-12-16 Device and method for detecting emotions

Country Status (19)

Country Link
US (1) US6638217B1 (en)
EP (1) EP1038291B1 (en)
JP (1) JP4309053B2 (en)
CN (1) CN1174373C (en)
AT (1) ATE354159T1 (en)
AU (1) AU770410B2 (en)
BR (1) BR9814288A (en)
CA (1) CA2313526C (en)
DE (1) DE69837107T2 (en)
ES (1) ES2283082T3 (en)
HK (1) HK1034796A1 (en)
HU (1) HU226537B1 (en)
IL (1) IL122632A0 (en)
PL (1) PL341296A1 (en)
PT (1) PT1038291E (en)
RU (1) RU2294023C2 (en)
TR (1) TR200001765T2 (en)
TW (1) TW446933B (en)
WO (1) WO1999031653A1 (en)

Families Citing this family (53)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL129399A (en) * 1999-04-12 2005-03-20 Liberman Amir Apparatus and methods for detecting emotions in the human voice
US6480826B2 (en) * 1999-08-31 2002-11-12 Accenture Llp System and method for a telephonic emotion detection that provides operator feedback
US6427137B2 (en) 1999-08-31 2002-07-30 Accenture Llp System, method and article of manufacture for a voice analysis system that detects nervousness for preventing fraud
US6275806B1 (en) * 1999-08-31 2001-08-14 Andersen Consulting, Llp System method and article of manufacture for detecting emotion in voice signals by utilizing statistics for voice signal parameters
US7222075B2 (en) * 1999-08-31 2007-05-22 Accenture Llp Detecting emotions using voice signal analysis
US6463415B2 (en) * 1999-08-31 2002-10-08 Accenture Llp 69voice authentication system and method for regulating border crossing
US7590538B2 (en) 1999-08-31 2009-09-15 Accenture Llp Voice recognition system for navigating on the internet
TWI221574B (en) * 2000-09-13 2004-10-01 Agi Inc Sentiment sensing method, perception generation method and device thereof and software
ES2177437B1 (en) * 2000-12-13 2003-09-01 Neturin S L ANIMAL ANALYZING DEVICE FOR MAMMALS.
IL144818A (en) * 2001-08-09 2006-08-20 Voicesense Ltd Method and apparatus for speech analysis
US6721704B1 (en) 2001-08-28 2004-04-13 Koninklijke Philips Electronics N.V. Telephone conversation quality enhancer using emotional conversational analysis
EP1300831B1 (en) * 2001-10-05 2005-12-07 Sony Deutschland GmbH Method for detecting emotions involving subspace specialists
US7191134B2 (en) * 2002-03-25 2007-03-13 Nunally Patrick O'neal Audio psychological stress indicator alteration method and apparatus
CN100534103C (en) * 2004-09-10 2009-08-26 松下电器产业株式会社 Information processing terminal
US20060229882A1 (en) * 2005-03-29 2006-10-12 Pitney Bowes Incorporated Method and system for modifying printed text to indicate the author's state of mind
US7580512B2 (en) * 2005-06-28 2009-08-25 Alcatel-Lucent Usa Inc. Selection of incoming call screening treatment based on emotional state criterion
US20080040110A1 (en) * 2005-08-08 2008-02-14 Nice Systems Ltd. Apparatus and Methods for the Detection of Emotions in Audio Interactions
CA2622365A1 (en) * 2005-09-16 2007-09-13 Imotions-Emotion Technology A/S System and method for determining human emotion by analyzing eye properties
WO2007072485A1 (en) * 2005-12-22 2007-06-28 Exaudios Technologies Ltd. System for indicating emotional attitudes through intonation analysis and methods thereof
US8204747B2 (en) * 2006-06-23 2012-06-19 Panasonic Corporation Emotion recognition apparatus
MX2009000206A (en) * 2006-07-12 2009-06-08 Medical Cyberworlds Inc Computerized medical training system.
BRPI0716106A2 (en) * 2006-09-07 2014-07-01 Procter & Gamble METHODS FOR MEASURING EMOTIONAL RESPONSE AND PREFERENCE OF CHOICE
CN101517636A (en) * 2006-10-03 2009-08-26 安德烈·耶夫根尼耶维奇·纳兹德拉坚科 Method for determining nervous state of a person according to voice and device for implementing same
US20080260212A1 (en) * 2007-01-12 2008-10-23 Moskal Michael D System for indicating deceit and verity
US20110022395A1 (en) * 2007-02-15 2011-01-27 Noise Free Wireless Inc. Machine for Emotion Detection (MED) in a communications device
EP1998452A1 (en) * 2007-06-01 2008-12-03 EADS Deutschland GmbH Method for compression and expansion of audio signals
WO2009086033A1 (en) 2007-12-20 2009-07-09 Dean Enterprises, Llc Detection of conditions from sound
US8219397B2 (en) * 2008-06-10 2012-07-10 Nuance Communications, Inc. Data processing system for autonomously building speech identification and tagging data
US20090318773A1 (en) * 2008-06-24 2009-12-24 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Involuntary-response-dependent consequences
US20100010370A1 (en) 2008-07-09 2010-01-14 De Lemos Jakob System and method for calibrating and normalizing eye data in emotional testing
US8136944B2 (en) 2008-08-15 2012-03-20 iMotions - Eye Tracking A/S System and method for identifying the existence and position of text in visual media content and for determining a subjects interactions with the text
US8340974B2 (en) * 2008-12-30 2012-12-25 Motorola Mobility Llc Device, system and method for providing targeted advertisements and content based on user speech data
WO2010100567A2 (en) 2009-03-06 2010-09-10 Imotions- Emotion Technology A/S System and method for determining emotional response to olfactory stimuli
EP2420045A1 (en) * 2009-04-17 2012-02-22 Koninklijke Philips Electronics N.V. An ambient telephone communication system, a movement member, method, and computer readable medium therefor
EP2659486B1 (en) * 2010-12-30 2016-03-23 Nokia Technologies Oy Method, apparatus and computer program for emotion detection
US20120182309A1 (en) * 2011-01-14 2012-07-19 Research In Motion Limited Device and method of conveying emotion in a messaging application
US20120182211A1 (en) * 2011-01-14 2012-07-19 Research In Motion Limited Device and method of conveying emotion in a messaging application
GB2500362A (en) * 2011-02-03 2013-09-25 Research In Motion Ltd Device and method of conveying emotion in a messaging application
GB2500363A (en) * 2011-02-03 2013-09-25 Research In Motion Ltd Device and method of conveying emotion in a messaging application
US10445846B2 (en) 2011-04-14 2019-10-15 Elwha Llc Cost-effective resource apportionment technologies suitable for facilitating therapies
US9626650B2 (en) 2011-04-14 2017-04-18 Elwha Llc Cost-effective resource apportionment technologies suitable for facilitating therapies
US10373508B2 (en) * 2012-06-27 2019-08-06 Intel Corporation Devices, systems, and methods for enriching communications
RU2553413C2 (en) * 2012-08-29 2015-06-10 Федеральное государственное бюджетное образовательное учреждение высшего профессионального образования "Воронежский государственный университет" (ФГБУ ВПО "ВГУ") Method of detecting emotional state of person from voice
RU2525284C2 (en) * 2012-11-16 2014-08-10 Валерий Викторович Курышев Method for determining degree of emotional impact of entertainment shows on spectators
US9378741B2 (en) 2013-03-12 2016-06-28 Microsoft Technology Licensing, Llc Search results using intonation nuances
TWI500023B (en) 2013-04-11 2015-09-11 Univ Nat Central Hearing assisting device through vision
US10204642B2 (en) 2013-08-06 2019-02-12 Beyond Verbal Communication Ltd Emotional survey according to voice categorization
CN103829958B (en) * 2014-02-19 2016-11-09 广东小天才科技有限公司 A kind of method and device monitoring people's mood
US9786299B2 (en) 2014-12-04 2017-10-10 Microsoft Technology Licensing, Llc Emotion type classification for interactive dialog system
JP2018515155A (en) 2015-03-09 2018-06-14 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. System, device, and method for remotely monitoring a user's goodness using a wearable device
EP3674951A1 (en) * 2018-12-31 2020-07-01 Nederlandse Organisatie voor toegepast- natuurwetenschappelijk onderzoek TNO System and method of obtaining authentication information for user input information
US11138379B2 (en) 2019-04-25 2021-10-05 Sorenson Ip Holdings, Llc Determination of transcription accuracy
CN110265063B (en) * 2019-07-22 2021-09-24 东南大学 Lie detection method based on fixed duration speech emotion recognition sequence analysis

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3971034A (en) * 1971-02-09 1976-07-20 Dektor Counterintelligence And Security, Inc. Physiological response analysis method and apparatus
US3855418A (en) 1972-12-01 1974-12-17 F Fuller Method and apparatus for phonation analysis leading to valid truth/lie decisions by vibratto component assessment
DE2431458C2 (en) * 1974-07-01 1986-05-28 Philips Patentverwaltung Gmbh, 2000 Hamburg Procedure and arrangement for automatic speaker recognition
US4093821A (en) 1977-06-14 1978-06-06 John Decatur Williamson Speech analyzer for analyzing pitch or frequency perturbations in individual speech pattern to determine the emotional state of the person
US5148483A (en) 1983-08-11 1992-09-15 Silverman Stephen E Method for detecting suicidal predisposition
US5029214A (en) 1986-08-11 1991-07-02 Hollander James F Electronic speech control apparatus and methods
JPH01107240U (en) * 1988-01-08 1989-07-19
JPH0512023A (en) * 1991-07-04 1993-01-22 Omron Corp Emotion recognizing device
IL108401A (en) * 1994-01-21 1996-12-05 Hashavshevet Manufacture 1988 Method and apparatus for indicating the emotional state of a person
JP3280825B2 (en) * 1995-04-26 2002-05-13 富士通株式会社 Voice feature analyzer
JPH09265378A (en) * 1996-03-28 1997-10-07 Hitachi Ltd Supporting method and supporting system for operator
US5853005A (en) * 1996-05-02 1998-12-29 The United States Of America As Represented By The Secretary Of The Army Acoustic monitoring system
US5875427A (en) * 1996-12-04 1999-02-23 Justsystem Corp. Voice-generating/document making apparatus voice-generating/document making method and computer-readable medium for storing therein a program having a computer execute voice-generating/document making sequence
US6055501A (en) * 1997-07-03 2000-04-25 Maccaughelty; Robert J. Counter homeostasis oscillation perturbation signals (CHOPS) detection

Similar Documents

Publication Publication Date Title
RU2000118824A (en) DEVICE AND METHODS FOR EMOTION DETECTION
RU2294023C2 (en) Device and method for detecting emotions
US5521840A (en) Diagnostic system responsive to learned audio signatures
EP1171873B1 (en) Apparatus and methods for detecting emotions in the human voice
Patterson A pulse ribbon model of monaural phase perception
Huron et al. An improved model of tonality perception incorporating pitch salience and echoic memory.
Massaro Preperceptual images, processing time, and perceptual units in auditory perception.
DE69334139T2 (en) Testing of communication device
ATE345730T1 (en) METHOD AND DEVICE FOR ESTIMATING A PHYSIOLOGICAL PARAMETER USING TRANSFORMATIONS
Moore Psychoacoustics
DE69531776D1 (en) DEVICE AND METHOD FOR DETERMINING A FIRST PARAMETER OF AN OBJECT
NL8400552A (en) SYSTEM FOR ANALYZING HUMAN SPEECH.
WO2000075746A3 (en) System and method for measuring temporal coverage detection
DE60010224D1 (en) DEVICE, COMPUTER SYSTEM AND COMPUTER PROGRAM FOR DETERMINING A CARDIOVASCULAR PARAMETER
EP1696218B9 (en) Knocking presence evaluation circuit for an internal combustion engine, knocking identification and control system and corresponding pressure signal processing method
Duifhuis Audibility of high harmonics in a periodic pulse. II. Time effect
EP1001352A1 (en) Data conversion method, data converter, and program storage medium
JPH1123411A (en) Strange sound judging apparatus and method therefor
Rennies et al. Temporal weighting in loudness of broadband and narrowband signals
Buunen On the perception of phase differences in acoustic signals
Maurus et al. Interrelations between structure and function in the vocal repertoire of Saimiri: asking the monkeys themselves where to split and where to lump
JPH01116600A (en) Identification of speech signal
Arana et al. An Efficient Algorithm for the Evaluation of Tonality and the Determination of the Tonal Frequency According to IEC 61400-11
JP3145317B2 (en) Hearing test equipment
Arana Burgui et al. An efficient algorithm for the evaluation of tonality and the determination of the tonal frequency according to IEC 61400-11