EP3899852A1 - Method and device for monitoring an occupant of a vehicle - Google Patents
Method and device for monitoring an occupant of a vehicle
- Publication number
- EP3899852A1 (application EP19816229.9A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- information content
- vehicle
- output
- data
- conversation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0242—Determining effectiveness of advertisements
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/013—Eye tracking input arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/20—Natural language analysis
- G06F40/279—Recognition of textual entities
- G06F40/284—Lexical analysis, e.g. tokenisation or collocates
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F40/00—Handling natural language data
- G06F40/30—Semantic analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0265—Vehicular advertisement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0251—Targeted advertisements
- G06Q30/0265—Vehicular advertisement
- G06Q30/0266—Vehicular advertisement based on the position of the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/08—Speech classification or search
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
- G10L25/51—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination
- G10L25/63—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use for comparison or discrimination for estimating an emotional state
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/21—Voice
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/22—Psychological state; Stress level or workload
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/225—Direction of gaze
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/229—Attention level, e.g. attentive to driving, reading or sleeping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/01—Indexing scheme relating to G06F3/01
- G06F2203/011—Emotion or mood input determined on the basis of sensed human body parameters such as pulse, heart rate or beat, temperature of skin, facial expressions, iris, voice pitch, brain activity patterns
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L15/00—Speech recognition
- G10L15/08—Speech classification or search
- G10L2015/088—Word spotting
-
- G—PHYSICS
- G10—MUSICAL INSTRUMENTS; ACOUSTICS
- G10L—SPEECH ANALYSIS TECHNIQUES OR SPEECH SYNTHESIS; SPEECH RECOGNITION; SPEECH OR VOICE PROCESSING TECHNIQUES; SPEECH OR AUDIO CODING OR DECODING
- G10L25/00—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00
- G10L25/48—Speech or voice analysis techniques not restricted to a single one of groups G10L15/00 - G10L21/00 specially adapted for particular use
Definitions
- The present invention relates to a method for monitoring an occupant of a vehicle, in which it is determined whether information content perceivable in the vehicle is output, and recognition data for the output information content are acquired. Furthermore, the invention relates to a device for monitoring an occupant of a vehicle, which device has a determination device designed to determine whether information content perceptible in the vehicle is output. The device further has a recognition device for acquiring recognition data for output information content, and a microphone which is arranged in the interior of the vehicle and which is designed to record a conversation in the interior of the vehicle.
- In vehicles, advertising content is output visually or audibly. It is in the interest of the advertising company to find out whether the advertisements output were perceived by an occupant of the vehicle. This information can be obtained by monitoring the occupants of the vehicle.
- An entertainment system for a vehicle is known from US 2016/0316237 A1, via which films, audio programs, information videos, text descriptions and advertising content can be output.
- the system includes a camera with which the user's face is recorded.
- the images captured by the camera are used to analyze the user's emotions. In this way, for example, the reaction of a user to advertising can be recorded.
- the direction of the user's gaze can be recorded in order to determine where the user is looking on a display.
- Personal characteristics of the user, such as age and hair color, can also be obtained.
- DE 10 2014 204 530 A1 discloses a method and a device for a subjective advertisement effectiveness analysis. Advertising is presented in the vehicle and the user's reaction to this advertising is analyzed. For this purpose, a vehicle camera is provided which visually records occupant reactions. In this way it can be determined whether the user liked or disliked advertisements. For example, facial recognition software can run while an advertisement is reproduced, so that user emotions can be recorded and time-stamped.
- DE 10 2012 213 466 A1 describes a method and an apparatus in which the gaze direction of a vehicle occupant is detected by means of a gaze detection device, and the gaze direction is automatically assigned to a detected object. The object can, for example, be an advertising poster. An environment detection device comprises a video and/or a radar system, which can be used to identify objects, such as an advertising poster, that are located on the route, in particular on the street or at the roadside. The objects are identified by means of a distance estimate.
- The present invention is based on the object of specifying a method and a device of the type mentioned at the outset with which a vehicle occupant's perception of output information content can be better monitored.
- According to the invention, a conversation in the interior of the vehicle is recorded, and by analyzing the conversation recorded during or for a time interval after the output of the information content, it is determined whether the output information content was perceived. The conversation can take place, for example, between at least two occupants of the vehicle, between at least one occupant and a caller, between at least one occupant and a person outside the vehicle, or as a soliloquy of an occupant. The explanations below apply to all of these options.
- the information content is any form of information that can be perceived by a vehicle occupant.
- the information content is, in particular, an advertising content. Advertising is thus output, the content of which can be perceived by an occupant of the vehicle.
- The method thus monitors the vehicle occupant to determine whether advertising content has been perceived. In the conversation analysis, it is determined whether there is a connection between the content of the conversation and the recognition data of the assigned information content. If there is such a connection, it is concluded that the information content was perceived by a vehicle occupant.
- the conversation analysis ensures that the perception of information content can be concluded very reliably.
- the time interval after the information content has been output can include, for example, a period of up to 30 minutes after the end of the output of the information content.
- the time interval comprises in particular a period of 20 minutes, preferably 10 minutes or 2 minutes, after the information content has been output.
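The time-interval condition described above can be sketched as a small helper. This is only an illustrative sketch; the function name and the default 10-minute window are assumptions, not taken from the patent:

```python
from datetime import datetime, timedelta

def in_analysis_window(conversation_time: datetime,
                       output_end: datetime,
                       window_minutes: int = 10) -> bool:
    """Check whether a recorded conversation falls within the configurable
    time interval after the end of the information-content output."""
    window = timedelta(minutes=window_minutes)
    return output_end <= conversation_time <= output_end + window

end = datetime(2019, 6, 1, 12, 0)
print(in_analysis_window(datetime(2019, 6, 1, 12, 5), end))   # within the 10-minute window
print(in_analysis_window(datetime(2019, 6, 1, 12, 45), end))  # outside the window
```

A conversation recorded during the output itself would be covered by a separate check against the output's start time.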
- the information content can be output audibly. Alternatively or additionally, it can be output visually.
- The information content can be output visually, for example by a display in the vehicle or by an object in the traffic area outside the vehicle, so that it can be perceived in the vehicle. It can also be output acoustically via a loudspeaker.
- the recognition data are obtained by analyzing the information content output.
- a visual output of the information content can be captured by an interior camera or a camera that records the surroundings of the vehicle.
- An auditory output can be recorded acoustically via an interior microphone.
- the data recorded by a camera and / or the interior microphone can then be analyzed in order to obtain recognition data for the information content.
- the identification data can advantageously be obtained solely from devices in the vehicle, without it being necessary to access devices external to the vehicle.
- A device which outputs the information content is determined. Data for identifying the determined device are then transmitted to a vehicle-external device. On the basis of these data, the vehicle-external device determines the recognition data for the information content output by the determined device, and the ascertained recognition data are transmitted from the vehicle-external device to the vehicle.
- In this way, the recognition data can be obtained reliably and, if applicable, in a predefined manner, since data may be available to the vehicle-external device indicating which information content is output at what time via the determined device.
- setting data for a setting of the device which outputs the information content and / or for the point in time or the time interval of the output of the information content are recorded.
- This setting data is transmitted to the vehicle-external device, which uses the setting data to determine the information content output by the device. This ensures that the vehicle-external device can obtain data about which information content is output by a device if this device can output different information content with different settings.
- the device can be a radio, for example.
- the setting data can then indicate which radio station outputs the information content.
- The vehicle-external device then determines which information content was output by the radio station at which time or in which time interval, and which recognition data the acoustic output contained. For example, the vehicle-external device can access a data memory which contains data about what each radio station outputs and when. Predefined recognition data assigned to this information output can be stored there, which the vehicle-external device can then transmit to the vehicle.
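A data memory of the kind described, mapping radio stations and broadcast times to predefined recognition data, could look roughly like this. The station name, schedule and keywords are invented purely for illustration:

```python
from datetime import datetime

# Hypothetical data memory: per radio station, a list of
# (broadcast start, broadcast end, predefined keywords) entries.
SCHEDULE = {
    "station_a": [
        (datetime(2019, 6, 1, 12, 0), datetime(2019, 6, 1, 12, 1),
         ["sportscar", "roadster"]),
    ],
}

def lookup_recognition_data(station: str, t: datetime):
    """Return the predefined keywords for the content playing on
    `station` at time `t`, or None if nothing matches."""
    for start, end, keywords in SCHEDULE.get(station, []):
        if start <= t < end:
            return keywords
    return None

print(lookup_recognition_data("station_a", datetime(2019, 6, 1, 12, 0, 30)))
# ['sportscar', 'roadster']
```

The setting data transmitted from the vehicle (station, timestamp) would serve as the lookup key.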
- the information content can be output from an in-vehicle device, independently of a set radio station, in the interior of the vehicle.
- In this case, the data indicating which information content was output at what point in time, and which identification data the output contained, are available inside the vehicle.
- the identification data can include, for example, keywords of the information content output. This advantageously ensures that a particularly reliable conversation analysis can be carried out.
- In the analysis of the recorded conversation, which takes place during or for a time interval after the output of the information content, it is determined whether there is a match between at least a subset of the keywords belonging to the information content and words of the conversation. If words occur in the conversation that correspond to keywords of the information content, it can advantageously be concluded very reliably that the conversation content is related to the information content output.
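The keyword-matching step of the conversation analysis can be illustrated with a minimal sketch. The tokenisation and the match threshold are assumptions for illustration only:

```python
def conversation_mentions_content(conversation: str,
                                  keywords: set[str],
                                  min_matches: int = 2) -> bool:
    """Heuristic: the information content counts as perceived if at least
    `min_matches` of its keywords occur in the recorded conversation."""
    words = {w.strip(".,!?").lower() for w in conversation.split()}
    return len(words & {k.lower() for k in keywords}) >= min_matches

keywords = {"roadster", "convertible", "horsepower"}
print(conversation_mentions_content(
    "Did you hear that? A convertible roadster, finally!", keywords))  # True
```

A production system would use proper speech-to-text output and word normalisation rather than simple whitespace splitting.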
- The keywords are therefore special or rarely occurring words of the information content, so that in the conversation analysis a clear distinction can be made between a general conversation and a conversation about the information content. In a further development, in the analysis of the recorded conversation which was conducted during or for a time interval after the output of the information content, the content of the conversation is determined, and it is determined what the reaction of at least one occupant to the information content is. For example, it can be analyzed whether the occupant has responded positively, neutrally or negatively to the information content. With this configuration, it can advantageously be determined not only whether the output of the information content was perceived, but also what the reaction to the information content is.
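The reaction determination can be sketched with predefined word lists, as a rough illustration; the word lists, labels and scoring are assumptions, not the patent's actual method:

```python
# Illustrative reaction-word lists (invented for this sketch).
POSITIVE = {"great", "nice", "want", "love"}
NEGATIVE = {"awful", "boring", "hate", "annoying"}

def classify_reaction(conversation: str) -> str:
    """Count predefined reaction words to label the occupant's response
    to the information content as positive, negative or neutral."""
    words = [w.strip(".,!?").lower() for w in conversation.split()]
    score = sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

print(classify_reaction("I love that car, I want one"))  # positive
print(classify_reaction("What a boring ad"))             # negative
```

More robust sentiment analysis would of course consider negation and context, but the word-list approach mirrors the "predefined recognition words" mentioned later in the description.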
- In a further development, the mood of the occupant is determined by means of a mood detection device at the time or in the time interval at which the information content is output. The mood detection device can be coupled, for example, to an interior camera and/or a microphone. In this way, the voice, facial expressions and/or gestures of the occupant can be analyzed and classified by the mood detection device and thus assigned to a specific mood. Personal characteristics of the occupant can also be recorded. For example, the identity of the occupant, in particular the driver of the vehicle, can be determined using the vehicle key or based on an entry made when starting to drive.
- The occupant's personal characteristics are detected using an occupant detection device. This can access an interior camera and/or a microphone. In this way, for example, the age group and the sex of the occupant can be determined automatically. This information can support the further evaluation of the data on the perception of information content.
- An in-vehicle analysis device detects which information content was perceived, preferably which information content was perceived in the vehicle, by which occupant and when. The information content is in particular output advertising content.
- The in-vehicle analysis device also detects which metadata the information content contains.
- This metadata can, for example, contain information about which product or company was advertised when advertising content is output. This metadata can also be included in the recognition data.
- The name of the advertised product or of the advertising company can also serve as keywords of the information content output.
- The vehicle-internal analysis device can also record which personal characteristics and/or which mood the occupant who perceived the information content had.
- the data recorded by the analysis device can in particular be transmitted to an evaluation device external to the vehicle.
- This evaluation device can advantageously receive the data from a large number of vehicles and in this way carry out a comprehensive and efficient evaluation of the effectiveness of the information content output.
- The evaluation device external to the vehicle can determine from the data of a large number of vehicles, for example, how information content associated with certain metadata is perceived.
- The device according to the invention is characterized by an analysis device which is coupled to the microphone and which is designed to analyze a conversation recorded at least during or for a time interval after the output of the information content and, on the basis of this analysis and the acquired recognition data, to determine whether the output information content was perceived.
- the device according to the invention is in particular designed to carry out the method according to the invention as explained above. It therefore has the same advantages as the method according to the invention.
- the figure shows an embodiment of the device according to the invention.
- the device according to the invention is arranged in a vehicle 1.
- An occupant of a vehicle 1 can be monitored by means of the device. It comprises a display 2 for the visual output of information content.
- the device comprises a radio 3, which is connected to a loudspeaker 4, so that information content can be output audibly.
- the information content in the exemplary embodiment is an advertising content.
- the advertising content can thus be output acoustically by means of the radio 3 via the loudspeaker 4 and / or visually by means of the display 2.
- The information content is assigned recognition data. In the exemplary embodiment, the recognition data are keywords that are related to the information content.
- the key words can be words which are shown visually when the information content is output on the display 2 or which occur in spoken text when the information content is output audibly.
- The identification data, in particular the keywords, can also be related to the information content in other ways. It is not absolutely necessary that the keywords are displayed or output acoustically.
- The identification data are related to the information content in such a way that the identification data can be used to determine whether a user has perceived the output of the information content. For example, the recognition data can also contain keywords that are related to the information content only in a figurative sense. These may be keywords that are likely to be spoken during a conversation about the information content, without the keywords themselves occurring in the information content. In particular, the keywords are chosen so that the frequency of occurrence of these words is as low as possible, so that the keywords are less likely to be mentioned in a different context.
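Selecting keywords with a low frequency of occurrence, as described above, could be approximated by comparing candidate words against a reference corpus. The threshold, corpus and word lists below are all invented for illustration:

```python
from collections import Counter

def select_rare_keywords(content_words, corpus_words,
                         max_rel_freq=0.001, n=5):
    """Keep candidate keywords whose relative frequency in a reference
    corpus is at most `max_rel_freq`, so a match in a conversation is
    unlikely to be coincidental (threshold is an assumed parameter)."""
    freq = Counter(w.lower() for w in corpus_words)
    total = max(len(corpus_words), 1)
    rare = [w for w in content_words if freq[w.lower()] / total <= max_rel_freq]
    return rare[:n]

# Toy corpus: common words repeated often, one rare word appearing once.
corpus = ["the", "car", "the", "a", "roadster"] * 200 + ["turbocharged"]
content = ["the", "turbocharged", "roadster"]
print(select_rare_keywords(content, corpus))  # ['turbocharged']
```

In practice the reference corpus would be a large general-language word-frequency list rather than a toy sample.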
- the display 2 and the radio 3 are coupled to an information output device 5.
- The information output device 5 controls the information output via the display 2 and the radio 3. Setting data of the display 2 and the radio 3 are also transmitted to the information output device 5.
- the device further comprises a sensor device 6 arranged in the vehicle 1, which is coupled to a plurality of sensors 7.
- The sensors 7 include an interior camera 7-1, which records, among other things, an occupant of the vehicle 1, in particular the driver, and an environment camera 7-2, which records images of the surroundings of the vehicle 1, in particular in the direction of travel.
- The sensors 7 further include an interior microphone 7-3, which is arranged such that it can record conversations in the interior of the vehicle 1.
- The sensors 7 also comprise an identification unit 7-4, via which an occupant of the vehicle 1 can identify himself.
- The information output device 5 and the sensor device 6 are connected to a determination device 8. The determination device 8 is designed to determine whether information content that is perceptible in the vehicle 1 is output. For example, the determination device 8 can use the information output device 5 to determine whether the radio 3 is switched on or whether information is shown on the display 2. Alternatively or additionally, the determination device 8 can use the data from the sensors 7 transmitted by the sensor device 6 to determine whether a certain information content, for example an advertising content, was output visually by means of the display 2 or audibly by means of the radio 3 and the loudspeaker 4 connected to it. Further alternatively or additionally, the determination device 8 can access, via a communication interface 13, a vehicle-external device 14 which comprises a data memory 15.
- The data memory 15 can contain information about which information content, for example an advertising content, is output on which radio station and when. This information can be transmitted to the determination device 8 via the communication interface 13.
- For this purpose, the radio 3 transmits the information about which radio station is currently set, via the information output device 5, as setting data.
- the determination device 8 can then use this information to determine when the information content relevant for monitoring the occupant is output.
- The information that a relevant information content, for example an advertising content, is being output is transmitted by the determination device 8 to a recognition device 9, with which recognition data on the output information content can be acquired.
- In the exemplary embodiment, the recognition data include keywords which are related to the output information content.
- the recognition data can be obtained in various ways using the recognition device 9:
- the detection device 9 is coupled to the sensor device 6.
- The information content output is recorded via the interior microphone 7-3. This recording is analyzed by the recognition device 9, and the recognition data are extracted from it.
- Alternatively or additionally, the detection device 9 is provided with detection data directly via the information output device 5. For example, the information output device 5 can provide detection data belonging to the output information content to the detection device 9.
- the recognition device 9 uses the information output device 5 to determine which device is used to output the information content, that is to say in the present exemplary embodiment whether the information content is output visually via the display 2 and / or audibly via the loudspeaker 4.
- the data for identifying the determined device are transmitted to the external device 14 together with a time stamp.
- On the basis of the data for identifying the determined device, the vehicle-external device 14 determines the recognition data for the information content output by the device, by data retrieval from the data store 15.
- For this purpose, the data store 15 stores not only which information content is output via which radio station and at what time, but also the respectively assigned recognition data.
- the data memory 15 can also store corresponding recognition data relating to information content visually output on the display 2.
- Recognition data can, for example, also be stored for information content shown on the display 2 as part of a television program.
- The detection data determined by the vehicle-external device 14 are transmitted to the vehicle via the communication interface 13.
- The recognition device 9 can also acquire metadata on the information content output. This metadata can be provided to the recognition device 9.
- the detection device 9 is connected to an in-vehicle analysis device 12.
- The analysis device 12 is also coupled to the sensor device 6, which transmits recordings of the interior microphone 7-3 to the analysis device 12.
- the analysis device 12 is designed to analyze a conversation recorded during the output of an information content or for a time interval after the output of an information content.
- On the basis of this conversation analysis and the detected recognition data, the analysis device 12 determines whether the output information content was perceived. In particular, the analysis device 12 extracts words of the detected conversation and compares them with the keywords of the recognition data. If there is a sufficient match, the analysis device 12 concludes that a vehicle occupant has perceived the information content output.
- The information that a certain information content was perceived by a vehicle occupant is stored by the analysis device 12 in an internal memory, together with associated time information for the output of the information content and any setting data of the device for outputting the information content.
- Furthermore, the analysis device 12 is designed, in the analysis of the detected conversation which was conducted during or for a time interval after the information content was output, to determine the content of the conversation and to determine what the reaction of at least one occupant to the information content is. For example, predefined recognition words can be used to determine whether the occupant's reaction to the information content is positive, neutral or negative. This information is also stored internally by the analysis device 12.
- the analysis device 12 is connected to a mood detection device 10.
- the mood detection device 10 detects the mood of the occupant at the time, or during the time interval, at which the information content was output.
- the mood detection device 10 is coupled to the sensor device 6.
- the facial expressions, gestures and voice of the occupant can be analyzed and assigned to a specific mood class.
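As a toy illustration of this assignment to a mood class, the sketch below fuses normalized facial-expression, voice and gesture features into one of three classes. The features, weights, thresholds and class names are all invented for illustration; the patent does not specify any of them:

```python
def mood_class(smile: float, voice_pitch_var: float, gesture_activity: float) -> str:
    """Map normalized features in [0, 1] (smile intensity, voice pitch
    variability, gesture activity) to an illustrative mood class."""
    score = 0.5 * smile + 0.3 * voice_pitch_var + 0.2 * gesture_activity
    if score > 0.66:
        return "cheerful"
    if score > 0.33:
        return "neutral"
    return "subdued"
```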
- the analysis device 12 is connected to an occupant detection device 11, which in turn is connected to the sensor device 6.
- the occupant detection device 11 can detect personal characteristics of the occupant on the basis of the data acquired by the sensors 7.
- the identity can be determined directly via the identification unit 7-4, for example on the basis of a key personally assigned to the occupant.
- the occupant's personal characteristics can also be determined by image analysis and/or voice analysis; for this purpose, data from the interior camera 7-1 and/or the interior microphone 7-3 can be evaluated.
- the data obtained via the mood detection device 10 and the occupant detection device 11 are also stored by the analysis device 12 in the internal memory. In this way, a data record is stored in the analysis device 12 which indicates which information content was perceived in which vehicle 1, by which occupant, and at what time.
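The data record accumulated in the internal memory of the analysis device 12 might look like the following sketch. The field names and types are hypothetical; the patent only lists the kinds of information stored (content, vehicle, occupant, time, device settings, reaction, mood):

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class PerceptionRecord:
    """One illustrative entry of the analysis device's internal memory."""
    vehicle_id: str          # which vehicle 1
    occupant_id: str         # which occupant perceived the content
    content_id: str          # identifies the information content / its metadata
    output_time: float       # time stamp of the output
    device_settings: dict    # e.g. selected radio station, volume
    reaction: str            # "positive" | "neutral" | "negative"
    mood: Optional[str] = None  # mood class from the mood detection device 10
```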
- this evaluation device 16 collects the data obtained from the monitoring of a large number of occupants of vehicles 1 and evaluates them in order to obtain data on the effectiveness of the output of the information content.
- the evaluation device external to the vehicle also uses the data from a large number of vehicles to determine how information content associated with metadata is perceived.
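A minimal sketch of this cross-vehicle evaluation is given below, computing a positivity rate per information content from records collected from many vehicles. The record format and the chosen effectiveness metric are assumptions; the patent leaves the evaluation method open:

```python
from collections import defaultdict

def effectiveness_by_content(records):
    """Aggregate perception records (dicts with 'content_id' and 'reaction')
    into a share of positive reactions per information content."""
    stats = defaultdict(lambda: {"perceived": 0, "positive": 0})
    for rec in records:
        stats[rec["content_id"]]["perceived"] += 1
        if rec["reaction"] == "positive":
            stats[rec["content_id"]]["positive"] += 1
    return {cid: s["positive"] / s["perceived"] for cid, s in stats.items()}
```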
- the information content is not output visually or auditorily inside the vehicle 1, but outside the vehicle 1.
- the information content can be output by an object in the traffic area in which the vehicle 1 is moving.
- an analysis of a conversation in the interior of the vehicle can be used to determine whether information content output outside the vehicle 1 was perceived by an occupant of the vehicle 1.
- the output of information content is understood to mean, inter alia, changing display content on a display external to the vehicle.
- the information content can also be output via a stationary advertising poster, i.e. without changing information content; in this case, it is a stationary information display. If the vehicle 1 passes such an object in the traffic area, the conversation analysis determines whether a vehicle occupant has perceived this information content.
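The stationary-poster case can be sketched as a geofence check combined with the same word matching used in the in-vehicle case. Positions in metres on a local flat plane, the 50 m radius, and the function names are all illustrative assumptions:

```python
import math

def near_poster(vehicle_pos, poster_pos, radius_m=50.0):
    """True if the vehicle position lies within `radius_m` of the poster
    (local flat-plane approximation; positions as (x, y) in metres)."""
    dx = vehicle_pos[0] - poster_pos[0]
    dy = vehicle_pos[1] - poster_pos[1]
    return math.hypot(dx, dy) <= radius_m

def poster_perceived(track, poster_pos, conversation, recognition_words):
    """Run the conversation analysis only if the vehicle's track passed the
    poster; then match conversation words against the recognition data."""
    if not any(near_poster(p, poster_pos) for p in track):
        return False
    words = set(conversation.lower().split())
    return bool(words & {w.lower() for w in recognition_words})
```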
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Health & Medical Sciences (AREA)
- General Engineering & Computer Science (AREA)
- Accounting & Taxation (AREA)
- Human Computer Interaction (AREA)
- Strategic Management (AREA)
- Finance (AREA)
- Development Economics (AREA)
- General Health & Medical Sciences (AREA)
- Multimedia (AREA)
- Computational Linguistics (AREA)
- Audiology, Speech & Language Pathology (AREA)
- Oral & Maxillofacial Surgery (AREA)
- General Business, Economics & Management (AREA)
- Marketing (AREA)
- Economics (AREA)
- Entrepreneurship & Innovation (AREA)
- Game Theory and Decision Science (AREA)
- Artificial Intelligence (AREA)
- Acoustics & Sound (AREA)
- Psychiatry (AREA)
- Hospice & Palliative Care (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automation & Control Theory (AREA)
- Mathematical Physics (AREA)
- Transportation (AREA)
- Mechanical Engineering (AREA)
- Signal Processing (AREA)
- Child & Adolescent Psychology (AREA)
- User Interface Of Digital Computer (AREA)
- Navigation (AREA)
- Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018133453.2A DE102018133453A1 (en) | 2018-12-21 | 2018-12-21 | Method and device for monitoring an occupant of a vehicle |
PCT/EP2019/082819 WO2020126376A1 (en) | 2018-12-21 | 2019-11-27 | Method and device for monitoring an occupant of a vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3899852A1 true EP3899852A1 (en) | 2021-10-27 |
Family
ID=68806721
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19816229.9A Pending EP3899852A1 (en) | 2018-12-21 | 2019-11-27 | Method and device for monitoring an occupant of a vehicle |
Country Status (5)
Country | Link |
---|---|
US (1) | US11810149B2 (en) |
EP (1) | EP3899852A1 (en) |
CN (1) | CN113168644A (en) |
DE (1) | DE102018133453A1 (en) |
WO (1) | WO2020126376A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102018133453A1 (en) | 2018-12-21 | 2020-06-25 | Volkswagen Aktiengesellschaft | Method and device for monitoring an occupant of a vehicle |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE10033333A1 (en) | 2000-07-01 | 2002-01-17 | Berndt Gmbh & Co Kg | Method for determination of the value of advertising carried by vehicles in which route and time data recorded with a navigation system are correlated with traffic data for locations and times to determine an advertising value |
JP3529348B2 (en) * | 2000-11-06 | 2004-05-24 | Necソフト株式会社 | In-vehicle information provision system |
JP2004226070A (en) | 2003-01-20 | 2004-08-12 | Nissan Motor Co Ltd | Information acquiring device for vehicle, and vehicle roadside communication system |
DE102004020878A1 (en) | 2004-04-28 | 2005-11-24 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method and device for information reproduction |
WO2008032329A2 (en) * | 2006-09-13 | 2008-03-20 | Alon Atsmon | Providing content responsive to multimedia signals |
US10034034B2 (en) * | 2011-07-06 | 2018-07-24 | Symphony Advanced Media | Mobile remote media control platform methods |
US20120143693A1 (en) | 2010-12-02 | 2012-06-07 | Microsoft Corporation | Targeting Advertisements Based on Emotion |
DE102012213466A1 (en) | 2012-07-31 | 2014-02-06 | Robert Bosch Gmbh | Method and device for monitoring a vehicle occupant |
US20140214933A1 (en) | 2013-01-28 | 2014-07-31 | Ford Global Technologies, Llc | Method and Apparatus for Vehicular Social Networking |
US20140279021A1 (en) | 2013-03-14 | 2014-09-18 | Ford Global Technologies, Llc | Ad Manager for a Vehicle Multimedia System |
US20140278910A1 (en) | 2013-03-15 | 2014-09-18 | Ford Global Technologies, Llc | Method and apparatus for subjective advertisment effectiveness analysis |
US20140278933A1 (en) * | 2013-03-15 | 2014-09-18 | F. Gavin McMillan | Methods and apparatus to measure audience engagement with media |
US20140379456A1 (en) * | 2013-06-24 | 2014-12-25 | United Video Properties, Inc. | Methods and systems for determining impact of an advertisement |
US20150142552A1 (en) | 2013-11-21 | 2015-05-21 | At&T Intellectual Property I, L.P. | Sending Information Associated with a Targeted Advertisement to a Mobile Device Based on Viewer Reaction to the Targeted Advertisement |
US9852355B2 (en) | 2015-04-21 | 2017-12-26 | Thales Avionics, Inc. | Facial analysis for vehicle entertainment system metrics |
JP6751436B2 (en) | 2015-09-04 | 2020-09-02 | ローベルト ボツシユ ゲゼルシヤフト ミツト ベシユレンクテル ハフツングRobert Bosch Gmbh | Access to autonomous vehicles and driving control |
US20190122661A1 (en) * | 2017-10-23 | 2019-04-25 | GM Global Technology Operations LLC | System and method to detect cues in conversational speech |
DE102018133453A1 (en) | 2018-12-21 | 2020-06-25 | Volkswagen Aktiengesellschaft | Method and device for monitoring an occupant of a vehicle |
- 2018
  - 2018-12-21 DE DE102018133453.2A patent/DE102018133453A1/en active Pending
- 2019
  - 2019-11-27 CN CN201980084892.9A patent/CN113168644A/en active Pending
  - 2019-11-27 WO PCT/EP2019/082819 patent/WO2020126376A1/en unknown
  - 2019-11-27 US US17/415,925 patent/US11810149B2/en active Active
  - 2019-11-27 EP EP19816229.9A patent/EP3899852A1/en active Pending
Also Published As
Publication number | Publication date |
---|---|
US11810149B2 (en) | 2023-11-07 |
CN113168644A (en) | 2021-07-23 |
WO2020126376A1 (en) | 2020-06-25 |
DE102018133453A1 (en) | 2020-06-25 |
US20220084061A1 (en) | 2022-03-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102014204530A1 (en) | METHOD AND DEVICE FOR SUBJECTIVE ADVERTISING EFFECTIVITY ANALYSIS | |
DE102018219984B3 (en) | Method and system for supporting an automated vehicle | |
DE102019118184A1 (en) | System and method for user-specific adaptation of vehicle parameters | |
DE102017215283A1 (en) | Method for anonymizing an image for a camera system of a motor vehicle, image processing device, camera system and motor vehicle | |
DE102018128634A1 (en) | Method for providing visual information about at least part of an environment, computer program product, mobile communication device and communication system | |
DE102015225135A1 (en) | System and method for adapting an acoustic output of a navigation system | |
WO2020126376A1 (en) | Method and device for monitoring an occupant of a vehicle | |
WO2020126375A1 (en) | Method and apparatus for monitoring an occupant of a vehicle and system for analysing the perception of objects | |
DE102019126688A1 (en) | SYSTEM AND METHOD FOR AUTOMATIC SUBTITLE DISPLAY | |
DE102020004792A1 (en) | Method and device for the detection and reporting of parking accidents for vehicles | |
DE112017008236T5 (en) | METHOD AND DEVICE TO ASSIST DRIVING | |
EP3729423B1 (en) | Method for operating a sound output device of a motor vehicle, voice-analysis and control device, motor vehicle and motor-vehicle-external server device | |
DE102019118183A1 (en) | Information system and process | |
DE102013019563B4 (en) | Method for providing information about an environment to a smart device | |
DE102015218967A1 (en) | Method and system for identifying and using property relationships | |
DE102018130754A1 (en) | SEAMLESS ADVISOR INTERVENTION | |
EP3785236B1 (en) | Method for processing surroundings information of a motor vehicle and electronic information processing system | |
DE102021130155A1 (en) | Method and system for providing information requested in a motor vehicle about an object in the vicinity of the motor vehicle | |
DE102011113633A1 (en) | Method for supporting user of motor car during detecting another motor car of e.g. friend in personal network located in environment of motor car, involves providing hit record with number plate when automatic alignment is taken place | |
DE102020005466A1 (en) | Procedure for selecting data for ADAS systems | |
DE112018006597B4 (en) | Speech processing device and speech processing method | |
DE102020135046A1 (en) | Method for triggering a compensation process to compensate for damage to a motor vehicle and associated processor circuit | |
DE102020201742A1 (en) | Selection of training data related to the sensor environment | |
DE102014101785A1 (en) | System for processing data and method for operating the system | |
DE102019204849A1 (en) | Detection of a potential danger posed by people |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: UNKNOWN |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
| 17P | Request for examination filed | Effective date: 20210721 |
| AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| DAV | Request for validation of the european patent (deleted) | |
| DAX | Request for extension of the european patent (deleted) | |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: EXAMINATION IS IN PROGRESS |
| 17Q | First examination report despatched | Effective date: 20230322 |