CN104244824A - Affect-monitoring system - Google Patents
- Publication number
- CN104244824A (application CN201380019356.3A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A61B5/743 — Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
- A61B5/0077 — Devices for viewing the surface of the body, e.g. camera, magnifying lens
- A61B5/165 — Evaluating the state of mind, e.g. depression, anxiety
- A61B5/18 — Devices for evaluating the psychological state for vehicle drivers or machine operators
- A61B5/7405 — Details of notification to user or communication with user or patient using sound
- A61B5/742 — Details of notification to user or communication with user or patient using visual displays
- G06V20/59 — Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V40/174 — Facial expression recognition
- G06V40/175 — Static expression
- A61B2576/02 — Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
- A61B5/7415 — Sound rendering of measured values, e.g. by pitch or volume variation
- G16H30/40 — ICT specially adapted for processing medical images, e.g. editing
Abstract
An affect-monitoring system (1) characterized in being provided with: a facial image-acquiring unit (3) capable of acquiring facial images of occupants (101) seated at two or more seating positions of a vehicle; an affect information-acquiring unit (13), which acquires affect information representing the affect of the occupants on the basis of the facial images acquired by the facial image-acquiring unit; and an affect display-generating unit (15), which generates an affect display that corresponds to the affect information acquired by the affect information-acquiring unit.
Description
Cross-Reference to Related Application
This application is based on Japanese Patent Application No. 2012-89493 filed on April 10, 2012, the contents of which are incorporated herein by reference.
Technical Field
The present application relates to an emotion monitoring system.
Background Art
Patent Document 1 discloses the following technology: the facial expressions of multiple passengers in a vehicle are acquired, the favorability of each passenger's expression is learned in association with the environment outside the vehicle and tracked as a time series, and a travel route (outside environment) along which the passengers' expressions remain favorable is proposed.
Prior Art Documents
Patent Documents
Patent Document 1: Japanese Unexamined Patent Application Publication No. 2011-117905
Summary of the invention
In the technology described in Patent Document 1, the information on the passengers' facial expressions is used only for the route-proposal process and is not displayed, so no passenger can know the emotions of the other passengers. The present application was made in view of this problem, and its object is to provide an emotion monitoring system that enables each passenger to know the emotions of the other passengers.
An emotion monitoring system according to one aspect of the present application comprises: a facial-image acquisition unit capable of acquiring, for two or more seating positions of a vehicle, a facial image of the passenger occupying each seating position; an emotional-information acquisition unit that acquires emotional information representing a passenger's emotion based on the facial image acquired by the facial-image acquisition unit; and an emotion-display creation unit that creates an emotion display corresponding to the emotional information acquired by the emotional-information acquisition unit.
When, for example, there are multiple passengers in the vehicle, the emotion monitoring system of this aspect can create an emotion display representing the emotion of each passenger. Therefore, even if a passenger occupies a position that the other passengers cannot see directly, an abnormal condition or the like of that passenger can still be noticed easily.
Brief Description of the Drawings
The above and other objects, features, and advantages of the present application will become clearer from the following detailed description given with reference to the accompanying drawings.
Fig. 1 is a block diagram showing the configuration of the emotion monitoring system;
Fig. 2 is a flowchart showing the main process executed by the emotion monitoring system;
Fig. 3 is a flowchart showing the emotion determination process executed by the emotion monitoring system;
Fig. 4 is a flowchart showing the emotion-display creation process executed by the emotion monitoring system;
Fig. 5 is an explanatory diagram showing the AUs (Action Units) of a passenger's face;
Fig. 6 is an explanatory diagram showing a display example of the emotion display;
Fig. 7 is an explanatory diagram showing a display example of the emotion display;
Fig. 8 is an explanatory diagram showing a display example of the emotion display; and
Fig. 9 is an explanatory diagram showing a display example of the emotion display.
Detailed Description of the Invention
1. Configuration of the emotion monitoring system 1
The configuration of the emotion monitoring system 1 is described based on Fig. 1. The emotion monitoring system 1 is an onboard system mounted on a vehicle and comprises a camera (facial-image acquisition unit) 3, a seating-position detection unit 5, a passenger identification unit 7, a storage unit 9, a feature-amount extraction unit 11, an emotion determination unit (emotional-information acquisition unit) 13, an emotion-display creation unit 15, a storage unit (first storage unit, second storage unit) 17, and an interface unit 19. The seating-position detection unit 5, the passenger identification unit 7, the feature-amount extraction unit 11, the emotion determination unit 13, and the emotion-display creation unit 15 are constituted by a known computer.
The camera 3 is installed on the ceiling of the vehicle interior, and its imaging range covers all seating positions of the vehicle interior (driver's seat, front passenger seat, rear right seat, rear left seat, and so on). Thus, the camera 3 can photograph, for every seating position, the face and body of the passenger 101 occupying that seating position. Alternatively, one camera 3 may be provided for each seating position, in which case each camera 3 photographs the face and body of the one passenger 101 occupying the corresponding seating position. As the camera 3, a camera photographing the front of the vehicle interior (driver's seat and front passenger seat) and a camera photographing the rear (rear seats) may also be provided.
The seating-position detection unit 5 detects the seating positions of the passengers 101 from the image taken by the camera 3. That is, it determines by image recognition whether a passenger 101 is seated at each seating position in the image taken by the camera 3.
The passenger identification unit 7 identifies the passengers 101 using the image taken by the camera 3, as described in detail later.
The storage unit 9 stores identifiers for identifying the passengers 101. An identifier is a feature unique to each passenger that can be extracted from an image by image recognition, such as a facial pattern (the shape and size of facial elements such as the eyes, nose, and mouth, or the positional relationships among facial elements), the shape or color of clothes, or the physique. The storage unit 9 stores the identifier of each previously registered passenger 101. In addition, the seating position, the emotional information described later, the passenger's name, and so on can be stored in the storage unit 9 in association with the identifier. The storage unit 9 also stores the database that the emotion determination unit 13 refers to when determining emotions.
In this application, "information" is used both as an uncountable noun and as a countable noun.
The feature-amount extraction unit 11 recognizes the face of a passenger 101 from the image taken by the camera 3 and then extracts preset feature amounts (AUs) from that face by image processing. If the faces of multiple passengers 101 are recognized in the image taken by the camera 3, feature amounts (AUs) are extracted for each individual passenger. Details of the feature-amount extraction are described later.
The emotion determination unit 13 determines the emotion of a passenger 101 from the feature amounts (AUs) extracted by the feature-amount extraction unit 11 and creates emotional information. If the faces of multiple passengers 101 are recognized in the image taken by the camera 3, emotional information is created for each individual passenger. Details of the creation of the emotional information are described later.
The emotion-display creation unit 15 creates an emotion display using the emotional information created by the emotion determination unit 13. The emotion-display creation unit 15 can also create a notification sound. Details of the creation of the emotion display and the notification sound are described later.
The storage unit 17 stores the mappings (correspondence relations between the kind of emotional information, the intensity of the emotion, and the form of the emotion display) that the emotion-display creation unit 15 uses when creating the emotion display and the notification sound. There are two kinds of mappings: (i) mappings stored for individual passengers 101 in association with the identifiers of those passengers 101, and (ii) a general mapping. The contents of the mappings are described in detail later.
The interface unit 19 comprises a display 19a capable of showing images, a speaker 19b capable of outputting sound, a microphone 19c capable of sound input, and an operation unit (keyboard, various switches, touch panel, etc.) 19d that accepts input operations by the passengers 101. The interface unit 19 shows the emotion display created by the emotion-display creation unit 15 on the display 19a, and outputs the notification sound created by the emotion-display creation unit 15 through the speaker 19b. In addition, the mappings stored in the storage unit 17 can be edited (adding a new mapping, adding to or updating the contents of an existing mapping, deleting a mapping, etc.) by voice input through the microphone 19c or by input operations on the operation unit 19d.
The display 19a of the interface unit 19 may be a display that only the driver of the vehicle can observe (for example, one that the other passengers can hardly see or cannot see at all), such as the display screen of the instrument panel or a dual-screen display of the navigation screen. Alternatively, the display 19a may be a display that some or all of the passengers other than the driver can watch (for example, one that the driver can hardly see or cannot see at all), such as a display mounted on the back of a seat or on the ceiling. The display 19a may also be a display that both the driver and the other passengers can watch.
2. Processes executed by the emotion monitoring system 1
The processes executed by the emotion monitoring system 1 are described based on Figs. 2 to 8.
Here, each flowchart described in this application consists of multiple sections (also referred to as steps), and each section is denoted, for example, S1. Each section can be divided into multiple subsections; conversely, multiple sections can be combined into one section. Each section configured in this way can serve as a device, a module, or means.
While the ignition switch of the vehicle equipped with the system is on, the emotion monitoring system 1 repeatedly executes the process shown in Fig. 2 at required time intervals. In S1 of Fig. 2, the camera 3 acquires an image of a range covering all seating positions of the vehicle. If a passenger occupies any seating position in the vehicle, the image includes the face and body of that passenger; if passengers occupy two or more seating positions, the image includes the faces and bodies of all of those passengers.
In S2, the seating-position detection unit 5 analyzes the image acquired in S1 and determines, for each seating position of the vehicle, whether a passenger is seated.
In S3, the passenger identification unit 7 identifies the seated passenger for every seating position determined in S2 to be occupied, as follows. First, for a seating position α determined in S2 to be occupied, the identifier of the passenger occupying this seating position α is extracted from the image acquired in S1. Then, the extracted identifier is compared with the identifiers stored in the storage unit 9. If, as a result of the comparison, an identifier identical or similar to the identifier of the passenger occupying seating position α exists among the stored identifiers, that identifier is stored in the storage unit 9 in association with seating position α. If, on the other hand, no identical or similar identifier exists among the stored identifiers, the identifier of the passenger occupying seating position α is newly stored in the storage unit 9 in association with seating position α.
The same process is performed not only for seating position α but also for every other seating position β, γ determined in S2 to be occupied.
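The match-or-register logic of S3 can be sketched as follows. This is a minimal illustration: the publication does not specify how identifier similarity is judged, so the similarity measure, the threshold, and all names below are assumptions.

```python
def match_or_register(extracted, registry, threshold=0.8):
    """Return the ID of a registered passenger whose feature vector is
    sufficiently similar to `extracted`; otherwise register it as new.

    `registry` maps passenger IDs to feature vectors. The normalized
    dot-product similarity and the 0.8 threshold are illustrative
    assumptions, not taken from the publication."""
    def similarity(a, b):
        # Stand-in for a real face-feature comparison.
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(y * y for y in b) ** 0.5
        return dot / (na * nb) if na and nb else 0.0

    best_id, best_sim = None, 0.0
    for pid, stored in registry.items():
        s = similarity(extracted, stored)
        if s > best_sim:
            best_id, best_sim = pid, s
    if best_id is not None and best_sim >= threshold:
        return best_id                    # identical or similar: reuse
    new_id = f"P{len(registry) + 1}"      # otherwise register as new
    registry[new_id] = extracted
    return new_id
```

A returning passenger thus keeps the same identifier across trips, while an unknown face is stored for future trips.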
In S4, for each seating position determined in S2 to be occupied, an emotion determination process that acquires emotional information representing the passenger's emotion is executed. If multiple seating positions were determined in S2 to be occupied, emotional information is acquired for all of those seating positions.
This emotion determination process is described based on Fig. 3. In S11 of Fig. 3, the camera 3 repeatedly acquires, at required time intervals, images covering all seating positions of the vehicle interior.
In S12, the feature-amount extraction unit 11 analyzes each image repeatedly acquired in S11 (including the facial image of the passenger's face), extracts characteristic parts of the passenger's face (for example, the eyebrows, eyelids, and mouth), and further extracts the movements of those characteristic parts over time (hereinafter referred to as AUs (Action Units)). The AUs are those shown in Table 1 and Fig. 5.
R and L in Fig. 5 denote the right and left sides of the face.
[table 1]
It is well known that a correlation exists between human emotions and AUs (the Facial Action Coding System proposed by Ekman).
In S13, the emotion determination unit 13 determines the passenger's emotion from the AUs extracted in S12, specifically as follows. The storage unit 9 stores in advance a database showing the correspondence between AUs and emotions, for example the following:
Happiness: 6R, 6L, 12R, 12L, 25
Surprise: 1R, 1L, 2R, 2L, 5R, 5L, 25, 27
Sadness: 1R, 1L, 4R, 4L, 15R, 15L, 17
By applying the AUs extracted in S12 to this database, the kind of the passenger's emotion (happiness, sadness, etc.) can be determined, and the intensity of the emotion can be determined from the magnitude of the AUs. The information comprising the kind and the intensity of the emotion serves as the emotional information. Emotional information is created for every seating position determined in S2 to be occupied, and is stored in the storage unit 9 in association with the corresponding seating position and identifier. For example, the emotional information of the passenger occupying seating position α (whose identifier is identifier X) is stored in the storage unit 9 in association with seating position α and identifier X. As described above, the emotion determination unit 13 acquires emotional information representing a passenger's emotion based on the facial image acquired by the camera 3.
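The application of the extracted AUs to the database in S13 can be sketched as a set comparison. This is a minimal illustration: the AU sets follow the correspondence examples above, but the overlap-fraction scoring rule and all names are assumptions, since the publication only says the AUs are "applied to" the database.

```python
# AU-to-emotion database, following the correspondence examples in the text.
EMOTION_DB = {
    "happiness": {"6R", "6L", "12R", "12L", "25"},
    "surprise":  {"1R", "1L", "2R", "2L", "5R", "5L", "25", "27"},
    "sadness":   {"1R", "1L", "4R", "4L", "15R", "15L", "17"},
}

def judge_emotion(extracted_aus):
    """Return the emotion kind whose AU set best overlaps the extracted AUs.

    Scoring by the fraction of a database entry's AUs that are present
    is an illustrative assumption."""
    extracted = set(extracted_aus)
    best_kind, best_score = None, 0.0
    for kind, aus in EMOTION_DB.items():
        score = len(extracted & aus) / len(aus)
        if score > best_score:
            best_kind, best_score = kind, score
    return best_kind
```

Note that AU 25 appears in both the happiness and surprise entries, which is why a best-match score over the whole set, rather than membership of a single AU, is needed.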
Returning to Fig. 2, in S5 the emotion-display creation unit 15 performs an emotion-display creation process that creates an emotion display corresponding to the emotional information acquired in S4. This emotion-display creation process is described based on Fig. 4. In S21 of Fig. 4, the emotional information stored in the storage unit 9 in S13 is called; if multiple pieces of emotional information exist, all of them are called.
In S22, a mapping is called from the storage unit 17, specifically as follows. First, for a seating position α determined in S2 to be occupied, the identifier stored in association with this seating position α is read from the storage unit 9. Then, the storage unit 17 is searched for a mapping associated with the read identifier; if an associated mapping exists, it is read, and if not, the general mapping is read. If seating positions other than seating position α were determined in S2 to be occupied, mappings are called in the same way for all of those seating positions.
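The mapping selection in S22 amounts to a fallback lookup; a minimal sketch, in which the function and argument names are assumptions:

```python
def select_mapping(seat, seat_to_identifier, per_passenger_maps, general_map):
    """S22: read the identifier stored for the seating position, use the
    mapping associated with that identifier if one exists, and otherwise
    fall back to the general mapping."""
    identifier = seat_to_identifier.get(seat)
    return per_passenger_maps.get(identifier, general_map)
```

A passenger registered with a personal mapping thus gets personalized display rules, while an unregistered passenger is handled by the general mapping.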
Here, the mappings are as shown in Tables 2 to 4.
[Table 2]
Mapping for passenger (identifier X)
Emotion kind | Emotion display created | Display color | Lit or blinking | Notification sound |
Smile (happiness) | ○ | Pink | Lit up to L5, blinking from L6 | From L6 |
Anger (irritation) | ○ | Red | Lit up to L5, blinking from L6 | From L6 |
Surprise | ○ | Blue | Lit up to L5, blinking from L6 | From L6 |
Sadness (crying face) | × | ― | ― | ― |
Boredom | ○ | Green | Lit up to L5, blinking from L6 | From L6 |
Confusion | × | ― | ― | ― |
Drowsiness | ○ | Purple | Lit up to L5, blinking from L6 | From L6 |
[Table 3]
Mapping for passenger (identifier Y)
Emotion kind | Emotion display created | Display color | Lit or blinking | Notification sound |
Smile (happiness) | ○ | Pink | Lit up to L6, blinking from L7 | From L8 |
Anger (irritation) | ○ | Red | Lit up to L6, blinking from L7 | From L8 |
Surprise | ○ | Blue | Lit up to L6, blinking from L7 | From L8 |
Sadness (crying face) | ○ | Orange | Lit up to L6, blinking from L7 | From L8 |
Boredom | ○ | Green | Lit up to L6, blinking from L7 | From L8 |
Confusion | ○ | Yellow | Lit up to L6, blinking from L7 | From L8 |
Drowsiness | ○ | Purple | Lit up to L6, blinking from L7 | From L8 |
[Table 4]
General mapping
Emotion kind | Emotion display created | Display color | Lit or blinking | Notification sound |
Smile (happiness) | ○ | Pink | Lit up to L7, blinking from L8 | From L8 |
Anger (irritation) | ○ | Red | Lit up to L7, blinking from L8 | From L8 |
Surprise | × | ― | ― | ― |
Sadness (crying face) | × | ― | ― | ― |
Boredom | × | ― | ― | ― |
Confusion | × | ― | ― | ― |
Drowsiness | ○ | Purple | Lit up to L7, blinking from L8 | From L8 |
The mapping in Table 2 is associated with a certain identifier X, the mapping in Table 3 is associated with an identifier Y different from identifier X, and the mapping in Table 4 is the general mapping not associated with any identifier.
Each mapping includes a criterion concerning the kinds of emotional information for which an emotion display should be created. That is, each mapping defines emotional information having some of the emotion kinds as emotional information for which an emotion display is created, and emotional information having the other emotion kinds as emotional information for which no emotion display is created (the "Emotion display created" column in Tables 2 to 4). The contents of this column may differ for each mapping (that is, for each identifier) or may be common.
For example, in the mapping of Table 2, emotional information whose emotion kind is any of smile (happiness), anger (irritation), surprise, boredom, or drowsiness, marked "○" in the "Emotion display created" column, is emotional information for which an emotion display is created, while emotional information whose emotion kind is sadness (crying face) or confusion, marked "×" in that column, is emotional information for which no emotion display is created. The same applies to the mappings of Tables 3 and 4.
In addition, each mapping defines the display form of the emotion display (display color, lit or blinking). That is, in each mapping, the "Display color" column specifies in advance the color of the emotion display for each emotion kind, and the "Lit or blinking" column defines the relation between the intensity of the emotion in the emotional information (expressed in ten levels L1 to L10, a larger number meaning a stronger emotion) and the form of the emotion display. For example, the mapping of Table 2 defines that the color of the emotion display created from emotional information whose kind is smile (happiness) is pink, that the emotion display stays lit (at fixed brightness) when the intensity of the emotion is in the range L1 to L5, and that it blinks in the range L6 to L10. The same applies to the mappings of Tables 3 and 4.
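The per-passenger rules of Table 2 can be sketched as a small data structure plus a lookup. This is a minimal illustration: the dictionary keys, field names, and return format are assumptions not found in the publication.

```python
# Per-passenger mapping following Table 2 (identifier X). `None` marks
# emotion kinds for which no emotion display is created.
MAPPING_X = {
    "happiness":  {"color": "pink",   "blink_from": 6, "sound_from": 6},
    "irritation": {"color": "red",    "blink_from": 6, "sound_from": 6},
    "surprise":   {"color": "blue",   "blink_from": 6, "sound_from": 6},
    "sadness":    None,
    "boredom":    {"color": "green",  "blink_from": 6, "sound_from": 6},
    "confusion":  None,
    "drowsiness": {"color": "purple", "blink_from": 6, "sound_from": 6},
}

def display_decision(mapping, kind, intensity):
    """Turn one piece of emotional information (kind plus intensity
    L1..L10) into a display decision according to the mapping: color,
    lit or blinking form, and whether a notification sound is due."""
    rule = mapping.get(kind)
    if rule is None:
        return None  # "×" in the table: no emotion display for this kind
    return {
        "color": rule["color"],
        "mode": "blink" if intensity >= rule["blink_from"] else "lit",
        "sound": intensity >= rule["sound_from"],
    }
```

Swapping in a structure built from Table 3 or the general mapping of Table 4 changes only the thresholds and the `None` entries, not the lookup logic.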
In S23, the emotion display and the notification sound are created as follows. For a seating position α determined in S2 to be occupied, the emotion display is created from the mapping read in S22 and the emotional information associated with seating position α. To that end, it is first judged whether the emotion kind in the emotional information is a kind marked "○" in the "Emotion display created" column of the mapping (that is, whether it satisfies the criterion concerning the kind of emotional information). If it is a kind marked "○", an emotion display is created for this emotional information; if it is a kind marked "×", no emotion display is created.
When an emotion display is created, its design is determined by the emotion kind in the emotional information; the associations between emotion kinds and designs are established in advance and stored in the storage unit 17. For example, the design of the emotion display when the emotion kind is smile (happiness) is that of the emotion display 217 in Fig. 6, the design when the kind is anger (irritation) is that of the emotion display 219 in Fig. 6, and the design when the kind is drowsiness is that of the emotion display 221 in Fig. 6. In addition, when the emotion display is created, its color is determined according to the mapping, and the lit or blinking form is determined according to the intensity of the emotion in the emotional information.
Furthermore, when the emotion display is created for seating position α, the display position of the emotion display is determined by the position of seating position α in the vehicle interior. Specifically, if seating position α is the driver's seat, the display position is set to the position corresponding to the driver's seat; if it is the front passenger seat, to the position corresponding to the front passenger seat; if it is the rear right seat, to the position corresponding to the rear right seat; and if it is the rear left seat, to the position corresponding to the rear left seat.
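The seat-to-position rule above amounts to a fixed lookup table; a minimal sketch, in which the normalized screen coordinates are placeholder assumptions:

```python
# Display position for each seating position, as normalized screen
# coordinates (x, y). The values are placeholders for illustration.
SEAT_TO_DISPLAY_POS = {
    "driver":        (0.75, 0.25),
    "front-passenger": (0.25, 0.25),
    "rear-right":    (0.75, 0.75),
    "rear-left":     (0.25, 0.75),
}

def display_position(seat):
    """Map a seating position to where its emotion display is drawn,
    so that the layout on screen mirrors the layout of the cabin."""
    return SEAT_TO_DISPLAY_POS[seat]
```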
From the above, the emotion display (design, color, lit or blinking form) for seating position α and its display position are determined. If seating positions other than seating position α were determined in S2 to be occupied, the creation of the emotion display and its display position is performed in the same way for all of those seating positions.
In addition, in S23, whether a notification sound is to be emitted is also determined from the emotional information. If any emotional information for which an emotion display is created satisfies the "notification sound" condition that the map sets on the intensity of the emotion, a notification sound is emitted. The notification sound may differ according to the kind of emotion that satisfies the "notification sound" condition. Conversely, if no emotional information for which an emotion display is created satisfies the "notification sound" condition in the map, no notification sound is emitted. The kind or volume of the notification sound may be set according to the identifier of the passenger whose emotion intensity satisfies the "notification sound" condition in the map, or according to that passenger's seating position.
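The notification-sound decision amounts to checking each displayed emotion against the map's intensity condition. The sketch below is illustrative only; the threshold value and sound names are assumptions, not taken from the embodiment.

```python
# Illustrative sketch: a notification sound is emitted if any emotion for
# which a display was created meets the map's "notification sound"
# intensity condition. Threshold and sound names are assumed.
def decide_notification(emotions, sound_map, threshold=7):
    """emotions: list of (category, intensity) pairs for which emotion
    displays were created. Returns the sound for the first qualifying
    emotion, or None if no emotion meets the condition."""
    for category, intensity in emotions:
        if intensity >= threshold:
            # the sound may differ per emotion kind, per the description
            return sound_map.get(category, "default_chime")
    return None
```

Per the description, `sound_map` could equally be keyed by passenger identifier or seating position instead of emotion category.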
Returning to Fig. 2, in S6 the emotion display creation unit 15 outputs to the interface unit 19 a signal representing the content determined in S23 (the emotion displays and their display positions, and whether a notification sound is to be emitted). If multiple emotion displays were created, all of them are output.
Also in S6, if it was judged in S2 that some seating positions are unoccupied, the position information of those seating positions is output to the interface unit 19. This position information is used by the interface unit 19 to indicate seating positions where no passenger is seated. Furthermore, if it was judged in S23 that a seating position is occupied but no emotion display was created for it, the position information of that seating position is output to the interface unit 19. This position information is used by the interface unit 19 to indicate seating positions that are occupied but for which no emotion display was created.
The interface unit 19 performs the following display or sound output according to the signal input from the emotion display creation unit 15. That is, the display 19a of the interface unit 19 shows emotion displays 215, 217, 219, 221, as shown for example in Fig. 6. Fig. 6 is a display example for the case where passengers are detected at all four seating positions and emotion displays are created for all of them.
The design, color, and lighting/blinking form of emotion displays 215, 217, 219, 221 are those determined in S23. Based on the display positions determined in S23, each emotion display is shown in association with its corresponding seating position. For example, emotion display 215 is created from the emotional information of the passenger (the driver) occupying the driver's seat, and is shown at position 205, which schematically represents the driver's seat. Similarly, emotion display 217, created from the emotional information of the passenger in the front passenger seat, is shown at position 207, which schematically represents the front passenger seat; emotion display 219, created from the emotional information of the passenger in the rear right seat, is shown at position 209, which schematically represents the rear right seat; and emotion display 221, created from the emotional information of the passenger in the rear left seat, is shown at position 211, which schematically represents the rear left seat.
On the display 19a of the interface unit 19, a line 203 representing the outline of the vehicle cabin, an arrow 213 representing the traveling direction of the vehicle, a mark 215 representing the steering wheel, and the like indicate the driver's seat position 205, the front passenger seat position 207, the rear right seat position 209, and the rear left seat position 211.
Fig. 7 is a display example for the case where passengers are detected at all four seating positions, but emotion displays are created only for the driver's seat and front passenger seat and not for the rear seats. In this case, at position 209 (schematically representing the rear right seat) and position 211 (schematically representing the rear left seat) on the display 19a of the interface unit 19, a distinctive display is shown indicating that a passenger was detected but no emotion display was created. This display consists of hollow circles 225, 227, which differ from every emotion display and also from the display shown when no passenger is detected.
Fig. 8 is a display example for the case where passengers are detected at the driver's seat and front passenger seat and emotion displays are created for those seats, but no passengers are detected at the rear seats. In this case, nothing is shown at position 209 (schematically representing the rear right seat) or position 211 (schematically representing the rear left seat) on the display 19a of the interface unit 19 (a state that differs from every emotion display and also from the hollow circles 225, 227).
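The three per-seat display states distinguished in Figs. 6 to 8 can be summarized in a small sketch (the state names are illustrative assumptions, not terms from the embodiment):

```python
# Illustrative summary of the per-seat display states in Figs. 6-8.
# State names are assumed for illustration.
def seat_display_state(passenger_detected, emotion_display_created):
    """Map the detection/creation results for one seating position to
    what the display 19a draws there: an emotion display (Fig. 6), a
    hollow circle like 225/227 (Fig. 7), or nothing (Fig. 8)."""
    if not passenger_detected:
        return "nothing"            # Fig. 8: seat position left blank
    if emotion_display_created:
        return "emotion_display"    # Fig. 6: display shown at the seat
    return "hollow_circle"          # Fig. 7: passenger seen, no display made
```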
If the emotion display creation unit 15 has determined that a notification sound is to be emitted, the interface unit 19 emits the notification sound through the speaker 19b.
In S7, it is judged whether the ignition switch of the vehicle is on. If it is on, the process returns to S1; if it is off, the present process ends.
The map stored in the storage unit 17 can be changed or reset by voice input from a passenger to the microphone 19c of the interface unit 19, or by operating the operating unit 19d of the interface unit 19. If the map is changed or reset partway through the process shown in Fig. 2, the process is interrupted and returns to S1. Changing or resetting the map by voice input can be performed at any time, while changing or resetting it via the operating unit 19d can be performed only while the vehicle is stopped.
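The rule that voice edits are allowed at any time while panel edits require a stopped vehicle could be modeled as below; this is a sketch only, and the function and parameter names are hypothetical.

```python
# Hypothetical sketch of the map-edit permission rule; names are assumed.
def map_edit_allowed(input_source, vehicle_stopped):
    """Voice input (microphone 19c) may change/reset the map at any time;
    the operating unit 19d ("panel") may do so only while stopped."""
    if input_source == "voice":
        return True
    if input_source == "panel":
        return vehicle_stopped
    return False
```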
3. Effects of the emotion monitoring system 1
(1) When there are multiple passengers in the vehicle, the emotion monitoring system 1 can create an emotion display representing the emotion of each passenger. A passenger can therefore easily notice an abnormal condition of another passenger seated at a position that cannot be seen directly. In particular, the driver can grasp the condition of fellow passengers while keeping the attention needed for driving.
(2) The emotion monitoring system 1 shows each emotion display in association with the seating position of the corresponding passenger. The correspondence between emotion displays and passengers can therefore be understood easily.
(3) The emotion monitoring system 1 has a map defining criteria relating to the kind of emotional information, and creates emotion displays for emotional information that satisfies those criteria. The emotion displays created can therefore be limited to emotions of high importance. Moreover, since the map is stored for each passenger, the emotion displays to be shown can be determined according to the circumstances of each passenger.
(4) The emotion monitoring system 1 has a map defining the correspondence between the intensity of an emotion and the form of the emotion display, and can change the form of the emotion display according to the intensity of the emotion based on this map. The intensity of an emotion can therefore be understood easily from the emotion display. Moreover, since the map is stored for each passenger, the form of the display can be determined according to the circumstances of each passenger.
(5) The emotion monitoring system 1 identifies each passenger and uses the corresponding map based on the recognition result. Even if a passenger changes seating position partway through a trip, the map corresponding to that passenger can therefore still be used.
4. Other embodiments
(1) For example, the emotion monitoring system 1 may detect only the driver as a passenger and, on condition that the emotion category of the driver's emotional information is a specific kind and the intensity of the emotion is at or above a predetermined threshold, emit a mechanically synthesized voice (for example, "What's the matter?") through the speaker 19b of the interface unit 19.
(2) On condition that the emotion category of a passenger's emotional information is a specific kind and the intensity of the emotion is at or above a predetermined threshold, the emotion monitoring system 1 may propose to that passenger or to other passengers some or all of the comfort-improving measures associated in advance with that passenger (for example, adjusting the air conditioning, the volume, or the music selection, or offering a greeting).
(3) On condition that emotional information with a predetermined emotion category is obtained from more than a predetermined ratio of the detected passengers, a notification may be given (for example, sounding a buzzer through the speaker 19b of the interface unit 19).
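The ratio-based buzzer condition of variant (3) amounts to a simple proportion check. The sketch below is illustrative; the ratio value and category names are assumptions.

```python
# Illustrative sketch of variant (3): notify when more than a
# predetermined ratio of detected passengers show a given emotion
# category. The ratio of 0.5 is an assumed example value.
def should_buzz(detected_emotions, target_category, ratio=0.5):
    """detected_emotions: emotion category per detected passenger.
    Returns True if the buzzer should sound."""
    if not detected_emotions:
        return False
    matching = sum(1 for c in detected_emotions if c == target_category)
    return matching / len(detected_emotions) > ratio
```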
(4) When a camera 3 is provided for each seating position, the power of any camera 3 that does not detect a passenger may be turned off.
(5) Detection of a passenger at a seating position may be performed by a sensor (for example, a seat sensor) provided at each seating position.
(6) The emotion monitoring system 1 need not have the interface unit 19. In that case, the emotion display creation unit 15 can output signals to an external device (for example, an in-vehicle navigation system or a portable terminal), and that external device can show the emotion displays and emit the notification sound. The display of the external device can be the same as that of the display 19a described above.
(7) The form in which emotion displays are shown on the display 19a of the interface unit 19 may be the form shown in Fig. 9. To the left of emotion displays 303, 304, 305, 306, labels such as "driver's seat", "front passenger seat", "rear right seat", and "rear left seat" are shown, indicating the seating position corresponding to each emotion display. That is, each emotion display 303, 304, 305, 306 is shown in association with its corresponding seating position (the seating position of the passenger whose emotional information is the basis of that emotion display).
To the right of emotion displays 303, 304, 305, 306, the name of the passenger corresponding to each emotion display is shown. That is, in S3, if an identifier identical or closely similar to the identifier of the passenger at a seating position exists among the identifiers stored in the storage unit 9, the name of the passenger corresponding to that identifier is shown to the right of the emotion display; if no such identifier exists, a label such as "Guest 1" or "Guest 2" is shown instead.
(8) In the "emotion display creation" column of the map, all emotion categories may be set to "○". In that case, emotion displays can be created based on all emotional information.
(9) The form of the emotion display can be set arbitrarily. For example, even for the same emotion category, various aspects of the emotion display (for example, its color, brightness, size, or motion) may be varied according to the intensity of the emotion. If the intensity of the emotion does not reach a predetermined threshold, the emotion display need not be created; the display in that case can be like the circles 225, 227 in Fig. 7.
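Variant (9), scaling color, brightness, size, or motion with intensity and suppressing the display below a threshold, could look like the following sketch; the threshold, scaling factors, and attribute names are all illustrative assumptions.

```python
# Illustrative sketch of variant (9): display attributes scale with
# emotion intensity; below the threshold no emotion display is created
# (a plain circle like 225/227 in Fig. 7 would be drawn instead).
# All numeric values are assumed for illustration.
def style_for_intensity(intensity, threshold=3):
    """Return display attributes for a given intensity, or None when
    the intensity does not reach the threshold."""
    if intensity < threshold:
        return None
    return {
        "brightness": min(1.0, intensity / 10),  # cap at full brightness
        "size_px": 16 + 4 * intensity,           # larger for stronger emotion
        "motion": "pulse" if intensity >= 8 else "static",
    }
```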
(10) The maps may be stored in the storage unit 17 in association with seating positions. In that case, in S22, the map associated with seating position α is read and used to create the emotion display and notification sound for seating position α. The criteria relating to the kind of emotional information, and the correspondence between emotion intensity and display form, can thus be set for each seating position.
The present application has been described in accordance with embodiments, but it should be understood that the application is not limited to those embodiments or structures. The application also covers various modifications and variations within the range of equivalents. In addition, various combinations and modes, including those that contain only a single element, or more or fewer elements, fall within the scope and spirit of the application.
Claims (6)
1. An emotion monitoring system (1), characterized by comprising:
a face-image acquisition unit (3) capable of acquiring, for each of two or more seating positions of a vehicle, a face image of a passenger (101) occupying that seating position;
an emotional-information acquisition unit (13) that acquires emotional information representing the emotion of the passenger, based on the face image acquired by the face-image acquisition unit; and
an emotion display creation unit (15) that creates an emotion display corresponding to the emotional information acquired by the emotional-information acquisition unit.
2. The emotion monitoring system according to claim 1, characterized by comprising:
a display unit (19a) that shows the emotion display in association with the seating position of the passenger corresponding to that emotion display.
3. The emotion monitoring system according to claim 1 or 2, characterized in that
the emotion monitoring system comprises a 1st storage unit (17) that stores criteria relating to the kind of the emotional information, and
the emotion display creation unit creates the emotion display for emotional information that satisfies the criteria relating to the kind of the emotional information.
4. The emotion monitoring system according to claim 3, characterized in that
the emotion monitoring system comprises a passenger identification unit (7) that identifies the passenger,
the 1st storage unit stores, for each passenger, the criteria relating to the kind of the emotional information, and
the emotion display creation unit uses, based on the recognition result of the passenger identification unit, the criteria relating to the kind of the emotional information that correspond to that passenger.
5. The emotion monitoring system according to any one of claims 1 to 4, characterized in that
the emotional information includes the intensity of the emotion,
the emotion monitoring system comprises a 2nd storage unit (17) that stores the correspondence between the intensity of the emotion and the form of the emotion display, and
the form of the emotion display created by the emotion display creation unit is determined based on the correspondence.
6. The emotion monitoring system according to claim 5, characterized in that
the emotion monitoring system comprises a passenger identification unit (7) that identifies the passenger,
the 2nd storage unit stores the correspondence for each passenger, and
the emotion display creation unit uses, based on the recognition result of the passenger identification unit, the correspondence that corresponds to that passenger.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012089493A JP5729345B2 (en) | 2012-04-10 | 2012-04-10 | Emotion monitoring system |
JP2012-089493 | 2012-04-10 | ||
PCT/JP2013/002333 WO2013153781A1 (en) | 2012-04-10 | 2013-04-04 | Affect-monitoring system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN104244824A true CN104244824A (en) | 2014-12-24 |
CN104244824B CN104244824B (en) | 2016-09-07 |
Family
ID=49327366
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201380019356.3A Expired - Fee Related CN104244824B (en) | 2012-04-10 | 2013-04-04 | Mood monitoring system |
Country Status (5)
Country | Link |
---|---|
US (1) | US9465978B2 (en) |
JP (1) | JP5729345B2 (en) |
CN (1) | CN104244824B (en) |
DE (1) | DE112013001979T5 (en) |
WO (1) | WO2013153781A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104853481A (en) * | 2015-04-01 | 2015-08-19 | 浙江农林大学 | LED mood presenting and adjusting device and method |
CN106127828A (en) * | 2016-06-28 | 2016-11-16 | 广东欧珀移动通信有限公司 | The processing method of a kind of augmented reality, device and mobile terminal |
CN106562793A (en) * | 2015-10-08 | 2017-04-19 | 松下电器(美国)知识产权公司 | Method for controlling information display apparatus, and information display apparatus |
CN108501956A (en) * | 2018-03-13 | 2018-09-07 | 深圳市海派通讯科技有限公司 | A kind of intelligent braking method based on Emotion identification |
CN109050396A (en) * | 2018-07-16 | 2018-12-21 | 浙江合众新能源汽车有限公司 | A kind of vehicle intelligent robot |
CN109711299A (en) * | 2018-12-17 | 2019-05-03 | 北京百度网讯科技有限公司 | Vehicle passenger flow statistical method, device, equipment and storage medium |
CN110667468A (en) * | 2018-07-03 | 2020-01-10 | 奥迪股份公司 | Driving assistance system and driving assistance method |
CN111772648A (en) * | 2020-06-10 | 2020-10-16 | 南京七岩电子科技有限公司 | Method and device for judging emotion by combining HRV signal and facial expression |
CN113221611A (en) * | 2020-02-05 | 2021-08-06 | 丰田自动车株式会社 | Emotion estimation device, method, program, and vehicle |
Families Citing this family (38)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015174526A1 (en) * | 2014-05-16 | 2015-11-19 | エイディシーテクノロジー株式会社 | Vehicle control system |
JP5818050B1 (en) * | 2015-01-28 | 2015-11-18 | ビックリック株式会社 | Status judgment system |
JP2017054241A (en) * | 2015-09-08 | 2017-03-16 | 株式会社東芝 | Display control device, method, and program |
KR101823611B1 (en) * | 2015-10-05 | 2018-01-31 | 주식회사 감성과학연구센터 | Method for extracting Emotional Expression information based on Action Unit |
JP6656079B2 (en) * | 2015-10-08 | 2020-03-04 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Control method of information presentation device and information presentation device |
US11194405B2 (en) | 2015-10-08 | 2021-12-07 | Panasonic Intellectual Property Corporation Of America | Method for controlling information display apparatus, and information display apparatus |
CN113532464A (en) * | 2015-10-08 | 2021-10-22 | 松下电器(美国)知识产权公司 | Control method, personal authentication apparatus, and recording medium |
FR3048175A1 (en) * | 2016-02-29 | 2017-09-01 | Peugeot Citroen Automobiles Sa | DEVICE AND METHOD FOR DETECTING A COGNITIVE DISTRACTION STATUS OF A DRIVER |
JP6818424B2 (en) * | 2016-04-13 | 2021-01-20 | キヤノン株式会社 | Diagnostic support device, information processing method, diagnostic support system and program |
JP2019531560A (en) | 2016-07-05 | 2019-10-31 | ナウト, インコーポレイテッドNauto, Inc. | Automatic driver identification system and method |
JP2019527832A (en) | 2016-08-09 | 2019-10-03 | ナウト, インコーポレイテッドNauto, Inc. | System and method for accurate localization and mapping |
US10733460B2 (en) | 2016-09-14 | 2020-08-04 | Nauto, Inc. | Systems and methods for safe route determination |
US9928432B1 (en) | 2016-09-14 | 2018-03-27 | Nauto Global Limited | Systems and methods for near-crash determination |
JP6657048B2 (en) * | 2016-09-30 | 2020-03-04 | 本田技研工業株式会社 | Processing result abnormality detection device, processing result abnormality detection program, processing result abnormality detection method, and moving object |
JP6382273B2 (en) * | 2016-09-30 | 2018-08-29 | 本田技研工業株式会社 | Facility satisfaction calculation device |
EP3535646A4 (en) | 2016-11-07 | 2020-08-12 | Nauto, Inc. | System and method for driver distraction determination |
JP7060779B2 (en) * | 2017-03-28 | 2022-04-27 | テイ・エス テック株式会社 | Combination selection system |
JP6686959B2 (en) * | 2017-04-11 | 2020-04-22 | 株式会社デンソー | Vehicle alarm device |
US10430695B2 (en) | 2017-06-16 | 2019-10-01 | Nauto, Inc. | System and method for contextualized vehicle operation determination |
WO2018229549A2 (en) | 2017-06-16 | 2018-12-20 | Nauto Global Limited | System and method for digital environment reconstruction |
US10453150B2 (en) | 2017-06-16 | 2019-10-22 | Nauto, Inc. | System and method for adverse vehicle event determination |
JP6897377B2 (en) * | 2017-07-11 | 2021-06-30 | トヨタ自動車株式会社 | Information provider |
CN107647873B (en) * | 2017-08-31 | 2021-04-02 | 维沃移动通信有限公司 | Emotion detection method and mobile terminal |
JP6613290B2 (en) | 2017-11-28 | 2019-11-27 | 株式会社Subaru | Driving advice device and driving advice method |
KR102564854B1 (en) | 2017-12-29 | 2023-08-08 | 삼성전자주식회사 | Method and apparatus of recognizing facial expression based on normalized expressiveness and learning method of recognizing facial expression |
US11453333B2 (en) * | 2018-01-04 | 2022-09-27 | Harman International Industries, Incorporated | Moodroof for augmented media experience in a vehicle cabin |
JP6713490B2 (en) * | 2018-02-07 | 2020-06-24 | 本田技研工業株式会社 | Information providing apparatus and information providing method |
US11392131B2 (en) | 2018-02-27 | 2022-07-19 | Nauto, Inc. | Method for determining driving policy |
JP7091796B2 (en) * | 2018-04-12 | 2022-06-28 | 株式会社Jvcケンウッド | Video control device, vehicle shooting device, video control method and program |
JP7235276B2 (en) | 2018-08-21 | 2023-03-08 | 株式会社データ・テック | Operation control device and computer program |
US11049299B2 (en) * | 2018-09-26 | 2021-06-29 | The Alchemy Of You, Llc | System and method for improved data structures and related interfaces |
JP7115216B2 (en) * | 2018-10-24 | 2022-08-09 | トヨタ自動車株式会社 | Information processing device and information processing method |
JP7429498B2 (en) * | 2018-11-06 | 2024-02-08 | 日産自動車株式会社 | Driving support method and driving support device |
JP2020095538A (en) * | 2018-12-13 | 2020-06-18 | 本田技研工業株式会社 | Information processor and program |
JP7092045B2 (en) * | 2019-01-16 | 2022-06-28 | トヨタ自動車株式会社 | Vehicle interior control device |
JP7131433B2 (en) * | 2019-02-26 | 2022-09-06 | トヨタ自動車株式会社 | In-vehicle information processing device, inter-vehicle information processing system, and information processing system |
US20240135729A1 (en) * | 2021-03-16 | 2024-04-25 | Mitsubishi Electric Corporation | Occupant detection device, occupant detection system, and occupant detection method |
US11793434B1 (en) * | 2022-05-18 | 2023-10-24 | GM Global Technology Operations LLC | System to perform visual recognition and vehicle adaptation |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003339681A (en) * | 2002-05-27 | 2003-12-02 | Denso Corp | Display device for vehicle |
CN101120598A (en) * | 2005-11-17 | 2008-02-06 | 普利电子有限公司 | Emoticon message transforming system and method for the same |
CN101540090A (en) * | 2009-04-14 | 2009-09-23 | 华南理工大学 | Driver fatigue monitoring device based on multivariate information fusion and monitoring method thereof |
JP2010092094A (en) * | 2008-10-03 | 2010-04-22 | Nikon Corp | Image processing apparatus, image processing program, and digital camera |
JP2010226484A (en) * | 2009-03-24 | 2010-10-07 | Nikon Corp | Image display and digital camera |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1845491A4 (en) * | 2005-02-03 | 2008-02-13 | Pioneer Corp | Image editing device, image editing method, image editing program and computer readable recording medium |
JP2011117905A (en) * | 2009-12-07 | 2011-06-16 | Pioneer Electronic Corp | Route selection support device and route selection support method |
2012
- 2012-04-10: JP application JP2012089493A filed (patent JP5729345B2), not active (Expired - Fee Related)
2013
- 2013-04-04: WO application PCT/JP2013/002333 filed (WO2013153781A1), active (Application Filing)
- 2013-04-04: CN application CN201380019356.3A filed (patent CN104244824B), not active (Expired - Fee Related)
- 2013-04-04: US application US14/390,806 filed (patent US9465978B2), active
- 2013-04-04: DE application DE112013001979.5T filed (DE112013001979T5), not active (Withdrawn)
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104853481B (en) * | 2015-04-01 | 2018-08-14 | 浙江农林大学 | A kind of LED moods are presented and regulating device and method |
CN104853481A (en) * | 2015-04-01 | 2015-08-19 | 浙江农林大学 | LED mood presenting and adjusting device and method |
CN106562793B (en) * | 2015-10-08 | 2021-12-21 | 松下电器(美国)知识产权公司 | Information presentation device control method and information presentation device |
CN106562793A (en) * | 2015-10-08 | 2017-04-19 | 松下电器(美国)知识产权公司 | Method for controlling information display apparatus, and information display apparatus |
CN106127828A (en) * | 2016-06-28 | 2016-11-16 | 广东欧珀移动通信有限公司 | The processing method of a kind of augmented reality, device and mobile terminal |
CN108501956A (en) * | 2018-03-13 | 2018-09-07 | 深圳市海派通讯科技有限公司 | A kind of intelligent braking method based on Emotion identification |
CN110667468A (en) * | 2018-07-03 | 2020-01-10 | 奥迪股份公司 | Driving assistance system and driving assistance method |
CN109050396A (en) * | 2018-07-16 | 2018-12-21 | 浙江合众新能源汽车有限公司 | A kind of vehicle intelligent robot |
CN109711299A (en) * | 2018-12-17 | 2019-05-03 | 北京百度网讯科技有限公司 | Vehicle passenger flow statistical method, device, equipment and storage medium |
US11562307B2 (en) | 2018-12-17 | 2023-01-24 | Apollo Intelligent Driving Technology (Beijing) Co., Ltd. | Vehicle passenger flow statistical method, apparatus, device, and storage medium |
CN113221611A (en) * | 2020-02-05 | 2021-08-06 | 丰田自动车株式会社 | Emotion estimation device, method, program, and vehicle |
CN113221611B (en) * | 2020-02-05 | 2024-03-15 | 丰田自动车株式会社 | Emotion estimation device, method, program, and vehicle |
CN111772648A (en) * | 2020-06-10 | 2020-10-16 | 南京七岩电子科技有限公司 | Method and device for judging emotion by combining HRV signal and facial expression |
Also Published As
Publication number | Publication date |
---|---|
JP2013216241A (en) | 2013-10-24 |
CN104244824B (en) | 2016-09-07 |
DE112013001979T5 (en) | 2015-03-12 |
US20150078632A1 (en) | 2015-03-19 |
US9465978B2 (en) | 2016-10-11 |
WO2013153781A1 (en) | 2013-10-17 |
JP5729345B2 (en) | 2015-06-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104244824A (en) | Affect-monitoring system | |
JP6643461B2 (en) | Advertising billboard display and method for selectively displaying advertisements by sensing demographic information of vehicle occupants | |
TWI738132B (en) | Human-computer interaction method based on motion analysis, in-vehicle device | |
CN105292125B (en) | Driver state monitoring method | |
US11609565B2 (en) | Methods and systems to facilitate monitoring center for ride share and safe testing method based for selfdriving cars to reduce the false call by deuddaction systems based on deep learning machine | |
JP2012148710A (en) | Image display apparatus | |
US20220324328A1 (en) | Display control device, display system, and display control method for controlling display of alert | |
CN110516622A (en) | A kind of gender of occupant, age and emotional intelligence recognition methods and system | |
CN113782020A (en) | In-vehicle voice interaction method and system | |
CN111813491A (en) | Vehicle-mounted assistant anthropomorphic interaction method and device and automobile | |
CN105383497B (en) | Vehicle-mounted system | |
CN112644375B (en) | Mood perception-based in-vehicle atmosphere lamp adjusting method, system, medium and terminal | |
CN115402333A (en) | In-vehicle interaction control system and method based on driver emotion and storage medium | |
CN115471890A (en) | Vehicle interaction method and device, vehicle and storage medium | |
CN112506353A (en) | Vehicle interaction system, method, storage medium and vehicle | |
CN218287648U (en) | Vehicle-mounted entertainment system of automobile and automobile | |
CN110176011A (en) | Vehicle abnormality based reminding method and device | |
CN210983432U (en) | Driver detection system that makes a call based on computer vision technique | |
JP2019125305A (en) | Support device for creating teacher data | |
CN115101078A (en) | Voiceprint capturing and displaying system, vehicle with voiceprint capturing and displaying system, control method and storage medium | |
CN118019186A (en) | Method and device for adjusting atmosphere lamp of vehicle, computer equipment and storage medium | |
CN118082881A (en) | Driving mode recommendation method and device and electronic equipment | |
CN114290988A (en) | Method and device for adjusting light in vehicle, electronic equipment and storage medium | |
CN117508019A (en) | Auxiliary talking method, storage medium and vehicle | |
CN114115656A (en) | Vehicle-mounted augmented reality vehicle window entertainment system and operation method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee | ||
Granted publication date: 20160907 |