CN108073336A - User emotion detecting system and method based on touch - Google Patents
User emotion detecting system and method based on touch
- Publication number
- CN108073336A (application CN201611031201.3A)
- Authority
- CN
- China
- Prior art keywords
- touch
- user
- interaction
- data
- intelligent
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/045—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means using resistive elements, e.g. a single continuous surface or two parallel surfaces put in contact
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Manipulator (AREA)
Abstract
This application discloses a user emotion detecting system comprising an intelligent interaction module, a data acquisition module, a data processing module, and a touch array. The intelligent interaction module is configured for touch interaction with a user. The data acquisition module includes multiple touch sensors that collect touch interaction data between the intelligent interaction module and the user. The data processing module obtains, from predetermined data information, the user emotion type matching the touch interaction data. The multiple touch sensors are arranged in an array that is in contact with the intelligent interaction module.
Description
Technical field
This application relates to the field of emotion detection, and in particular to a touch-based user emotion detecting system and method.
Background technology
User emotion needs to be detected in many fields, for example affective decision-making, intelligent robots, companion robots, and service robots. Take the medical rehabilitation field as an example. Institutions such as nursing homes, sanatoriums for senile dementia, and hospitals for autistic children must provide careful care for special populations. First, owing to limited human resources and similar constraints, medical staff cannot attend to every patient at all times. In addition, these patients often have limited verbal communication ability, or even subjectively resist communicating with medical staff because of mental illness. As a result, medical staff find it difficult to monitor patients' states efficiently and to take corresponding medical measures in time.
Rather than communicating verbally, these patients are more likely to express their emotions through actions, for example by interacting with a pet through touch. Their emotions can therefore be detected by observing their touch interactions. However, ordinary duty nurses cannot accurately and promptly judge a patient's emotion from the patient's actions, and it is impractical to assign a caregiver with such skills to every patient. A system and method that can detect user emotion based on touch are therefore needed to monitor the user's state accurately, promptly, and effectively.
Chinese patent document CN 105126355 A discloses a children's companion robot and companion system. In that application, the human-machine touch interaction is a touch-control operation: it is used to control the robot but is unrelated to user emotion detection.
Chinese patent document CN 204819536 U discloses a companion robot whose sensor acquisition module includes a touch sensor, but it does not describe how to detect user emotion with that sensor.
Chinese patent document CN 102176222 B discloses a multi-sensor information acquisition and analysis system and an autistic-child custody auxiliary system. The system includes a touch sensor system of up to 16 sensors (which can be added or removed as needed) distributed over the shell at the top of the robot. Using the principle that the human body carries charge, the system's electrodes detect the change in capacitance upon human contact to determine whether a person has touched the robot, enhancing mutual perception during interactive training with autistic children and improving the interaction success rate and treatment effect. In that system, however, touch interaction is used only to determine whether a person has touched the robot; it cannot detect the user's emotion.
Chinese patent document CN 202123513 U discloses a multi-point sensing touch robot. The robot mainly uses capacitive touch sensing and three-dimensional multi-point touch sensing, developed along the two dimensions of sensation and emotion. It interacts with people through a sensory system that employs multimodal human-computer interaction, using multiple parallel and cooperating sensory channels; this resembles the way the human nervous system transmits different information through different sensory channels. The three-dimensional multi-point touch-control system extends the touch system from a planar to a three-dimensional function, enhancing the practicality of human-computer touch interaction, and elastic conductors are used for the touch connections to improve touch reliability.
In that robot, however, the capacitive touch sensing and three-dimensional multi-point touch sensing serve only to improve human-machine interaction performance and touch reliability, keeping the robot in an optimal operating state; they cannot detect the user's emotion.
Summary of the invention
The technical problem to be solved by this application is to provide a user emotion detecting system that can detect user emotion based on touch and thereby monitor the user's state. To this end, the application also provides a method of detecting user emotion through an intelligent interactive system.
One aspect of the embodiments of this application provides a user emotion detecting system. The system may include: an intelligent interaction module configured for touch interaction with a user; a data acquisition module, which may include multiple touch sensors for collecting touch interaction data between the intelligent interaction module and the user; and a data processing module that obtains, from predetermined data information, the user emotion type matching the touch interaction data.
According to embodiments of this application, the multiple touch sensors may form, in an array, a touch array in contact with the intelligent interaction module.
According to embodiments of this application, the intelligent interaction module may include an intelligent robot.
According to embodiments of this application, the touch interaction data may include the contact mode, contact position, and contact force.
According to embodiments of this application, each touch sensor may include: a resistive touch sensor attached to the intelligent interaction module, usable for collecting the touch interaction data between the intelligent interaction module and the user; a capacitive touch sensor usable for judging whether the user touches the intelligent interaction module; and a metal foil arranged between the resistive touch sensor and the capacitive touch sensor, usable for increasing the touch contact area and sensitivity.
According to embodiments of this application, each touch sensor may further include a middle part arranged between the resistive touch sensor and the metal foil. The middle part may be made of a flexible insulating material, so as to prevent the resistive and capacitive touch sensors from interfering with each other and to give the user a soft touch feeling.
According to embodiments of this application, the data information may include the trigger state of the touch array during the interaction, the touch interaction data, and the user's mood.
Another aspect of the embodiments of this application provides a method of detecting user emotion through an intelligent interactive system. The method may include: collecting touch interaction data between the intelligent interactive system and a user; and obtaining, from predetermined data information, the user emotion type matching the touch interaction data. According to embodiments of this application, the step of collecting the touch interaction data between the intelligent interactive system and the user may include collecting it through multiple touch sensors arranged in an array.
According to embodiments of this application, the touch interaction data may include the contact mode, contact position, and contact force.
According to embodiments of this application, the method may further include a step of obtaining the data information, which may include: having users interact with the user emotion detecting system in different emotional states; collecting the trigger state of the touch array during the interaction, the touch interaction data, and the user's mood; and establishing a direct mapping between the touch interaction data and the user's mood.
Description of the drawings
Other features, objects, and advantageous effects will become more apparent from the following detailed description read with reference to the accompanying drawings, in which the same or similar elements in different drawings are denoted by the same reference numerals. In the drawings:
Fig. 1 is a block diagram showing a user emotion detecting system according to an embodiment of this application;
Fig. 2 is a perspective view showing an example of the user emotion detecting system according to an embodiment of this application;
Fig. 3 is a schematic diagram showing a touch sensor according to an embodiment of this application;
Fig. 4 is an example diagram of the touch sensor according to an embodiment of this application;
Fig. 5 is a schematic diagram showing the collection of data information according to an embodiment of this application; and
Fig. 6 is a flow chart showing a touch-data-based emotion detection method according to an embodiment of this application.
Specific embodiments
Reference will now be made in detail to some specific embodiments of the disclosure, including the best modes contemplated by the inventors for carrying out the disclosure. Examples of these specific embodiments are illustrated in the accompanying drawings. While the disclosure is described in conjunction with these specific embodiments, it will be understood that this is not intended to limit the disclosure to the described embodiments. On the contrary, the description is intended to cover alternatives, modifications, and equivalents within the spirit and scope of the disclosure as defined by the appended claims. In the following description, numerous specific details are set forth to provide a thorough understanding of the disclosure. The disclosure may be practiced without some or all of these specific details. In other instances, well-known process operations have not been described in detail so as not to unnecessarily obscure the disclosure.
The terminology used herein is for describing particular embodiments only and is not intended to limit the disclosure. As used herein, the singular forms are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprising" and/or "including", when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or combinations thereof.
Fig. 1 shows a block diagram of a user emotion detecting system 100 according to an embodiment of this application. The user emotion detecting system 100 includes an intelligent interaction module 1, a data acquisition module 2, and a data processing module 3. The intelligent interaction module 1 is configured for touch interaction with a user. The data acquisition module 2 may include multiple touch sensors for collecting touch interaction data between the intelligent interaction module 1 and the user. The data processing module 3 may obtain, from predetermined data information, the user emotion type matching the touch interaction data.
Fig. 2 shows a perspective view of a specific example of the user emotion detecting system 100 according to an embodiment of this application. In this example, the user emotion detecting system 100 includes an intelligent interaction module 1, a data acquisition module 2, and a data processing module 3 (not shown in Fig. 2). The intelligent interaction module 1 is an intelligent interaction robot configured for touch interaction with the user. The data acquisition module 2 may include multiple touch sensors 12 for collecting touch interaction data between the intelligent interaction module 1 and the user. The data processing module 3 may obtain, from predetermined data information, the user emotion type matching the touch interaction data.
In this embodiment, with the intelligent interaction robot as the platform, touch sensors 12 are distributed in an array over the whole body of the robot, enabling touch-point detection at many places on the body (for example, more than 50), thereby forming a touch array 4 in contact with the intelligent interaction robot.
The intelligent interaction robot may be a medical robot with an endearing shape, including but not limited to the shape of a panda, puppy, or kitten.
When a user interacts with the intelligent interaction robot, touch is one of the most important interaction means, including but not limited to stroking, patting, slapping, hugging, and kissing. The user's behavior and force under different moods can therefore be captured by the touch array 4 distributed over the robot's whole body.
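One natural representation of the whole-body touch array described above is a fixed-length vector with one force reading per sensor. This is a sketch under assumptions: the patent does not prescribe a data layout, and the sensor count and field names are illustrative.

```python
# Illustrative snapshot of the whole-body touch array 4 as a flat
# feature vector: one force value per sensor, 0.0 where untouched.
N_SENSORS = 50  # the text mentions touch detection at 50+ points

def array_snapshot(readings):
    """readings: dict {sensor_index: force} for currently touched
    sensors. Returns a length-N_SENSORS list of floats."""
    return [float(readings.get(i, 0.0)) for i in range(N_SENSORS)]

# Example: a pat triggering sensor 3 strongly and sensor 17 lightly.
snap = array_snapshot({3: 0.8, 17: 0.2})
```

Such a vector can then serve directly as the "trigger state" that later sections classify against the database.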
Studies have shown that a user's contact mode, contact position, and contact force with the robot differ across emotional states, and these data can be collected by the touch sensors. Thus, by collecting a large amount of data and applying pattern recognition methods, a mapping can be established between the trigger states of the touch sensor array and the user's current emotion. Given enough subjects and experimental data, classifying the collected data makes it possible to detect the user's current emotion fairly accurately from the contact state between the user and the intelligent interaction robot.
Different types of touch sensor each have advantages and disadvantages. Therefore, in embodiments of this application, several different sensors may be combined when building the touch sensor array, rather than selecting a single type. When choosing sensors for the array, several important factors must be considered, such as sensitivity, accuracy, and cost, as well as ease of constructing the whole-body touch array 4.
Fig. 3 shows a schematic diagram of the touch sensor 12 according to an embodiment of this application.
In embodiments, the touch sensor 12 may include a resistive touch sensor 121, a capacitive touch sensor 122, a metal foil 123, and a middle part 124. In the touch sensor 12, the resistive touch sensor 121 is attached to the intelligent interaction module 1 and can be used to collect the touch interaction data between the intelligent interaction module and the user. The capacitive touch sensor 122 can be used to judge whether the user touches the intelligent interaction module 1. The metal foil 123 is arranged between the resistive touch sensor 121 and the capacitive touch sensor 122 and can be used to increase the touch contact area and sensitivity. The metal foil 123 may be made of copper and may come in different sizes. The middle part 124 may be arranged between the resistive touch sensor 121 and the metal foil 123, and may be made of a flexible insulating material such as sponge. The middle part 124 provides insulation, preventing the resistive touch sensor 121 and the capacitive touch sensor 122 from interfering with each other, and gives the user a soft, comfortable touch feeling.
Fig. 4 shows an example diagram of the touch sensor 12 according to an embodiment of this application. To assemble the touch sensor 12, the resistive touch sensor 121 is first attached to the shell of the intelligent interaction robot; in this example, the resistive touch sensor 121 is an FSR (force-sensitive resistor). Next, an antistatic middle part 124 is laid over the resistive touch sensor 121; in this example, the middle part 124 is a sponge of a certain thickness (for example, 10 cm). Finally, the metal foil welded to the capacitive sensor 122 is attached onto the middle part 124; in this example, the metal foil is copper foil.
Owing to this structure, the touch sensor 12 of this application has high sensitivity: even through fur of a certain thickness, it can accurately detect the current touch state. It also has high precision, since it is based on a high-precision resistive touch sensor. In addition, the detection area of the touch sensor 12 is adjustable: because the capacitive touch sensor 122 is connected to metal foils 123 of different sizes, the sensing area of the touch sensor 12 can easily be changed by changing the size of the metal foil 123. The touch sensor 12 of this application also has the advantages of a large load range, light weight, easy installation, and an excellent touch feeling.
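The layered structure above implies a two-stage read-out: the capacitive layer 122 answers "is a person touching?", and the resistive layer 121 answers "how hard?". A minimal sketch of that fusion logic follows; the threshold value and the raw-reading inputs are assumptions, not from the patent.

```python
# Hedged sketch of combining the capacitive (touch presence) and
# resistive (force) layers of touch sensor 12. Raw values are assumed
# to be normalized to [0, 1]; the threshold is illustrative.
def read_touch(capacitive_raw, resistive_raw, cap_threshold=0.5):
    """Return (touched, force). Force is reported only when the
    capacitive layer confirms a genuine human touch, which filters out
    mechanical pressure noise on the resistive layer."""
    touched = capacitive_raw > cap_threshold
    force = resistive_raw if touched else 0.0
    return touched, force
```

For example, a firm press with the hand would read as `(True, force)`, while an object resting on the sensor without capacitive coupling would read as `(False, 0.0)`.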
After the intelligent interaction module 1 and the data acquisition module 2 of the user emotion detecting system 100 have been set up, and before the system is used to detect user emotion, initial data information for training the data processing module 3 must be obtained in advance; this initial data information may take the form of a database. The initial data information is obtained through the following steps: having users interact with the user emotion detecting system 100 in different emotional states; collecting the trigger state of the touch array 4 during the interaction, the touch interaction data, and the user's mood; and establishing a direct mapping between the touch interaction data and the user's mood.
Fig. 5 shows a schematic diagram of collecting the initial data information according to an embodiment of this application. As shown in Fig. 5, the content of the initial data information can be obtained through tests with a large number of volunteers. The volunteers are asked to interact with the intelligent interaction robot in different emotional states. For each action under each mood, the trigger state of the touch array 4 and the magnitude of the force are recorded in time; the action is repeated several times, and the results from the touch array 4 are recorded each time. A pattern recognition method such as SVM (support vector machine) is used to classify and train on the recorded data, establishing a direct mapping between the touch array 4 results and the user's emotion. This mapping serves as the database of initial data information for training the user emotion detecting system 100.
The initial data information is constructed as follows. n volunteers are invited, and each volunteer interacts with the intelligent interaction robot in m moods (in this example, m = 7: happy, sad, angry, surprised, disgusted, pensive, and afraid). For each mood, the volunteer completes the k (in this example, k = 3) touch interaction actions most likely under that mood. Each volunteer therefore provides m × k reaction actions under different moods. As the above description shows, the more volunteers there are (that is, the larger n is), the more complete the obtained data information, and thus the more accurate the emotion detection result. In theory, m, n, and k can be arbitrarily large.
Once the user emotion detecting system 100 according to embodiments of this application has been set up, it can be used to detect the user's emotion. The method of detecting the user's emotion with the user emotion detecting system 100 comprises the following steps: collecting the touch interaction data between the user emotion detecting system 100 and the user; and then obtaining, from predetermined data information, the user emotion type matching the touch interaction data.
Fig. 6 shows a flow chart of the touch-data-based emotion detection method according to an embodiment of this application. The flow divides into two parts: a database training part and a technical realization part.
Database training part: each action corresponds one-to-one to a touch array state (for example, which touch sensors the action triggers and how large the force on each triggered sensor is), and the touch array state corresponding to each action is detected. In addition, the database training can be extended beyond the sensor array to other input forms such as vision and voice, and data fusion methods can be used to preprocess the multimedia data before it is used to train the database. The detected state is then transmitted to the robot's master controller for data processing and storage, thereby determining the correspondence between a volunteer's action and a touch array state of the robot. As can be seen from Fig. 5, each volunteer can provide 21 different actions, so 21 groups of relations between interaction actions and touch array states can be generated. With n volunteers, 21 × n groups of valid data are obtained to build the raw database.
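The text names SVM for classifying touch array states into moods. As a dependency-free stand-in playing the same role, the sketch below trains a nearest-centroid classifier on synthetic (state, mood) pairs; in practice one would train an actual SVM (e.g. scikit-learn's `svm.SVC`) on the same data. The feature vectors, labels, and function names are all illustrative assumptions.

```python
# Stand-in for the pattern-recognition training step: learn one
# centroid per mood from labeled touch-array state vectors.
def train_centroids(X, y):
    """X: list of touch-array state vectors; y: mood labels.
    Returns {mood: mean state vector}."""
    by_mood = {}
    for vec, mood in zip(X, y):
        by_mood.setdefault(mood, []).append(vec)
    return {m: [sum(col) / len(vs) for col in zip(*vs)]
            for m, vs in by_mood.items()}

def classify(centroids, vec):
    """Return the mood whose centroid is nearest to vec."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda m: dist2(centroids[m], vec))

# Synthetic 3-sensor states: hard contact near sensor 0 vs. gentle
# stroke near sensor 2.
X = [[0.9, 0.0, 0.1], [0.8, 0.1, 0.0],
     [0.0, 0.2, 0.9], [0.1, 0.0, 0.8]]
y = ["angry", "angry", "happy", "happy"]
centroids = train_centroids(X, y)
mood = classify(centroids, [0.85, 0.05, 0.05])
```

The choice of classifier is incidental here; the point is the pipeline shape: labeled array states in, a state-to-mood decision function out.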
Technical realization part: once the database is complete, the user emotion detecting system 100 can be used to detect the user's current emotion. When the user touches and interacts with the user emotion detecting system 100 in a certain manner, the data acquisition module 2 collects the corresponding touch array state through the touch array 4 and matches this state against the sample states stored in the database. Since the emotional state corresponding to each sample state is known, the user's current emotional state can be obtained through the matching.
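The detection-time matching just described can be sketched as a nearest-sample lookup. The matching criterion (squared Euclidean distance over force vectors) is an assumption; the patent only says the live state is "matched" against stored sample states.

```python
# Sketch of the technical-realization step: match a live touch-array
# state against stored sample states whose moods are known.
def detect_mood(samples, live_state):
    """samples: list of (state_vector, mood) pairs from the database;
    live_state: the vector just collected by the touch array.
    Returns the mood of the nearest stored sample."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, mood = min(samples, key=lambda s: dist2(s[0], live_state))
    return mood

# Toy database with two 2-sensor sample states.
samples = [([1.0, 0.0], "happy"), ([0.0, 1.0], "sad")]
```

A live reading close to a stored sample inherits that sample's mood label, which is exactly the inference the flow chart's second half performs.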
In addition, all of the mathematical concepts in this application can be abstracted as computations over various state transitions.
Although preferred examples of the invention have been described, those skilled in the art can make various changes or modifications to these examples based on the basic inventive concept. The appended claims are intended to cover the preferred examples and all changes or modifications falling within the scope of the invention.
Obviously, those skilled in the art can make various changes or modifications to the invention without departing from its spirit and scope. If these changes or modifications fall within the scope of the claims and their technical equivalents, they also fall within the scope of the invention.
Claims (10)
1. A user emotion detecting system, comprising:
an intelligent interaction module configured for touch interaction with a user;
a data acquisition module, including multiple touch sensors for collecting touch interaction data between the intelligent interaction module and the user; and
a data processing module that obtains, from predetermined data information, the user emotion type matching the touch interaction data.
2. The user emotion detecting system according to claim 1, wherein the multiple touch sensors form, in an array, a touch array in contact with the intelligent interaction module.
3. The user emotion detecting system according to claim 1, wherein the intelligent interaction module includes an intelligent robot.
4. The user emotion detecting system according to claim 1, wherein the touch interaction data include a contact mode, a contact position, or a contact force.
5. The user emotion detecting system according to any one of claims 1-4, wherein each touch sensor includes:
a resistive touch sensor attached to the intelligent interaction module for collecting the touch interaction data between the intelligent interaction module and the user;
a capacitive touch sensor for judging whether the user touches the intelligent interaction module; and
a metal foil arranged between the resistive touch sensor and the capacitive touch sensor to increase the touch contact area and touch sensitivity.
6. The user emotion detecting system according to claim 5, wherein each touch sensor further includes:
a middle part made of flexible insulating material, arranged between the resistive touch sensor and the metal foil.
7. The user emotion detecting system according to claim 1, wherein the data information includes:
the trigger state of the touch array during the interaction, the touch interaction data, and the user's mood.
8. A method of detecting user emotion through an intelligent interactive system, comprising:
collecting touch interaction data between the intelligent interactive system and a user; and
obtaining, from predetermined data information, the user emotion type matching the touch interaction data,
wherein the step of collecting the touch interaction data between the intelligent interactive system and the user includes:
collecting the touch interaction data between the intelligent interactive system and the user through multiple touch sensors arranged in an array.
9. The method according to claim 8, wherein the touch interaction data include a contact mode, a contact position, and a contact force.
10. The method according to claim 8 or 9, further including a step of obtaining the data information, the step including:
having users interact with the user emotion detecting system in different emotional states;
collecting the trigger state of the touch array during the interaction, the touch interaction data, and the user's mood; and
establishing a direct mapping between the touch interaction data and the user's mood.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611031201.3A CN108073336A (en) | 2016-11-18 | 2016-11-18 | User emotion detecting system and method based on touch |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108073336A true CN108073336A (en) | 2018-05-25 |
Family
ID=62160833
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611031201.3A Pending CN108073336A (en) | 2016-11-18 | 2016-11-18 | User emotion detecting system and method based on touch |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108073336A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108714903A (en) * | 2018-06-28 | 2018-10-30 | 香港中文大学(深圳) | A kind of eyes simulation mechanism and the expression robot using the eyes simulation mechanism |
CN110653813A (en) * | 2018-06-29 | 2020-01-07 | 深圳市优必选科技有限公司 | Robot control method, robot and computer storage medium |
WO2020211701A1 (en) * | 2019-04-17 | 2020-10-22 | 华为技术有限公司 | Model training method, emotion recognition method, related apparatus and device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1732543A (en) * | 2002-12-09 | 2006-02-08 | Iee国际电子及工程股份有限公司 | Foil-type switching element with multi-layered carrier foil |
CN101648079A (en) * | 2009-09-02 | 2010-02-17 | 杭州全动科技有限公司 | Emotional doll |
CN102622039A (en) * | 2012-03-27 | 2012-08-01 | 华东理工大学 | Intelligent control handle based on natural contact sensors and applications thereof |
US20130185648A1 (en) * | 2012-01-17 | 2013-07-18 | Samsung Electronics Co., Ltd. | Apparatus and method for providing user interface |
US20160117029A1 (en) * | 2012-04-25 | 2016-04-28 | Electronic Entertainment Design And Research | Interactive gaming analysis systems and methods |
US20160162672A1 (en) * | 2014-04-10 | 2016-06-09 | Bank Of America Corporation | Rhythm-based user authentication |
CN105988591A (en) * | 2016-04-26 | 2016-10-05 | 北京光年无限科技有限公司 | Intelligent robot-oriented motion control method and intelligent robot-oriented motion control device |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108714903A (en) * | 2018-06-28 | 2018-10-30 | 香港中文大学(深圳) | A kind of eyes simulation mechanism and the expression robot using the eyes simulation mechanism |
CN110653813A (en) * | 2018-06-29 | 2020-01-07 | 深圳市优必选科技有限公司 | Robot control method, robot and computer storage medium |
WO2020211701A1 (en) * | 2019-04-17 | 2020-10-22 | 华为技术有限公司 | Model training method, emotion recognition method, related apparatus and device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Zhu et al. | | Haptic-feedback smart glove as a creative human-machine interface (HMI) for virtual/augmented reality applications |
Cheng et al. | | Smart-surface: Large scale textile pressure sensors arrays for activity recognition |
Xu et al. | | ecushion: A textile pressure sensor array design and calibration for sitting posture analysis |
Picard et al. | | Evaluating affective interactions: Alternatives to asking what users feel |
EP2678757B1 (en) | | Gesture recognition system |
Meyer et al. | | Design and modeling of a textile pressure sensor for sitting posture classification |
KR101497690B1 (en) | | Method and system providing healthcare program service based on bio-signals and symptoms information |
Ran et al. | | A portable sitting posture monitoring system based on a pressure sensor array and machine learning |
Zu et al. | | Multiangle, self-powered sensor array for monitoring head impacts |
Fujiwara et al. | | Optical fiber force myography sensor for identification of hand postures |
Hudec et al. | | A smart IoT system for detecting the position of a lying person using a novel textile pressure sensor |
JP2019091498A (en) | | System and method for correcting answers |
Segil et al. | | Multi-modal prosthetic fingertip sensor with proximity, contact, and force localization capabilities |
CN108073336A (en) | | User emotion detecting system and method based on touch |
Welch et al. | | A review on measuring affect with practical sensors to monitor driver behavior |
Vera Anaya et al. | | Stretchable triboelectric sensor for measurement of the forearm muscles movements and fingers motion for Parkinson's disease assessment and assisting technologies |
Orefice et al. | | Pressure variation study in human-human and human-robot handshakes: Impact of the mood |
CN116049751A (en) | | Exception data processing system |
CN102622039A (en) | | Intelligent control handle based on natural contact sensors and applications thereof |
Ye et al. | | Force-sensing glove system for measurement of hand forces during motorbike riding |
Cui et al. | | Recent developments in impedance-based tactile sensors: A review |
Archana et al. | | Stress detection using machine learning algorithms |
Müller et al. | | Smart fur tactile sensor for a socially assistive mobile robot |
Haratian | | Assistive Wearable Technology for Mental Wellbeing: Sensors and Signal Processing Approaches |
Roggen et al. | | Signal processing technologies for activity-aware smart textiles |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||