EP3899851A1 - Method and apparatus for monitoring an occupant of a vehicle and system for analysing the perception of objects - Google Patents
Method and apparatus for monitoring an occupant of a vehicle and system for analysing the perception of objects
Info
- Publication number
- EP3899851A1 (application EP19816228.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- vehicle
- occupant
- perceived
- detected
- determined
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/02—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0242—Determining effectiveness of advertisements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/02—Marketing; Price estimation or determination; Fundraising
- G06Q30/0241—Advertisements
- G06Q30/0272—Period of advertisement exposure
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
- G06V20/58—Recognition of moving objects or obstacles, e.g. vehicles or pedestrians; Recognition of traffic objects, e.g. traffic signs, traffic lights or roads
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09626—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/54—Audio sensitive means, e.g. ultrasound
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/225—Direction of gaze
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/10—Recognition assisted with metadata
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09F—DISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
- G09F15/00—Boards, hoardings, pillars, or like structures for notices, placards, posters, or the like
- G09F15/0006—Boards, hoardings, pillars, or like structures for notices, placards, posters, or the like planar structures comprising one or more panels
Definitions
- the present invention relates to a method for monitoring an occupant of a vehicle, in which the geographical position of the vehicle is recorded, and a gaze direction of the occupant is detected.
- the invention relates to a device for monitoring an occupant of a vehicle, which has a position detection device for detecting the geographic position of the vehicle and a gaze detection device for detecting a gaze direction of the occupant.
- the invention relates to a system for analyzing the perception of objects in a traffic area, which comprises a plurality of vehicles which have the device for monitoring an occupant of the vehicle.
- advertising objects are known in which sensors are integrated that recognize when a person is viewing the advertising object. It is disadvantageous here that high costs are incurred for the sensors in the advertising objects and the range of the sensors is limited, so that it cannot be detected when occupants of passing vehicles view the advertising object.
- in a known method, objects in the surroundings of the vehicle are detected by means of an environment detection device, the viewing direction of a vehicle occupant is detected by means of an eye detection device, and the detected viewing direction is automatically assigned to a detected object.
- the object can, for example, be an advertising poster.
- the environment detection device comprises a video camera and/or a radar system, which can be used to identify objects, such as an advertising poster, that are located on the route, in particular on the road or at the roadside.
- the objects are detected by a distance estimate and object classification from the video camera, or by a distance measurement and object determination from the radar system.
- a disadvantage of this method or device is that the object detection is unreliable.
- An entertainment system for a vehicle is known from US 2016/0316237 A1, via which films, audio programs, information videos, text descriptions and advertising content can be output.
- the system includes a camera with which the user's face is recorded.
- the images captured by the camera are used to analyze the user's emotions. In this way, for example, the reaction of a user to advertising can be recorded. Furthermore, the direction of the user's gaze can be recorded in order to determine where the user is looking on a display. Finally, personal characteristics of the user, such as his age, hair color and similar characteristics can be obtained.
- DE 10 2014 204 530 A1 discloses a method and a device for a subjective advertising-effectiveness analysis. Advertising is presented in the vehicle and the user's reaction to this advertising is analyzed. For this purpose, a vehicle camera is provided that visually records occupant reactions. In this way it can be determined whether the user liked or disliked an advertisement. For example, facial-recognition software can run while an advertisement is displayed, so that user emotions can be recorded and time-stamped.
- the object of the invention is to provide a method, a device and a system of the type mentioned above with which the perception of an object in the surroundings of the vehicle by a vehicle occupant can be reliably determined.
- this object is achieved by a method with the features of claim 1, a device with the features of claim 14 and a system with the features of claim 15.
- the method according to the invention is characterized in that a data memory is accessed in which objects and their associated geographic positions are stored, and it is determined on the basis of the detected geographic position of the vehicle whether one of the objects is in the vicinity of the vehicle. Furthermore, depending on the detected line of sight of the occupant, the detected geographic position of the vehicle and the geographic position of the determined object in the surroundings of the vehicle, it is determined whether the object was perceived by the occupant.
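As an illustration only (not part of the claims), the nearby-object lookup described above can be sketched as follows; the function names, the search radius and the record layout are assumptions made for this sketch:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS84 positions."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def objects_near_vehicle(vehicle_pos, stored_objects, radius_m=250.0):
    """Return the stored objects within radius_m of the vehicle position.

    stored_objects: iterable of dicts with 'id', 'lat', 'lon' keys,
    standing in for the entries of the data memory.
    """
    lat, lon = vehicle_pos
    return [o for o in stored_objects
            if haversine_m(lat, lon, o["lat"], o["lon"]) <= radius_m]
```

In a real system the lookup would run against a spatial index rather than a linear scan, but the decision "is one of the stored objects in the vicinity of the vehicle" reduces to this distance test.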
- the objects relevant for monitoring the occupant of the vehicle are registered in that information associated with them, which in particular includes the geographical position of the object, is stored in a data memory, which is accessed in order to determine whether one of the objects is located in the surroundings of the vehicle.
- the data memory can in particular be located outside the vehicle. In this case, the data memory is accessed via a wireless communication interface for transmitting the current geographical position of the vehicle to the data memory and for transmitting the geographical position of the determined object back to the vehicle.
- the data memory is, for example, a server device external to the vehicle connected to the motor vehicle by means of a communication link, and/or a navigation system of the motor vehicle.
- in a further development, the orientations of the stored objects are also stored in the data memory. In this case, it is determined as a function of the detected viewing direction of the occupant, the detected geographic position of the vehicle and the geographic position and orientation of the determined object in the surroundings of the vehicle whether the object was perceived by the occupant. This further development is particularly expedient if the information content represented by the object can only be perceived from a certain viewing direction.
- the data memory can thus in particular store map data and the locations and orientations of the objects.
- the objects can also be classified.
- the objects can in particular be advertising objects, such as billboards.
- for such advertising objects, an orientation is stored, from which a viewing direction can be determined. The viewing direction indicates the viewing angles from which the object can be visually perceived.
- a viewing area is assigned to the object as a function of the geographic position and orientation of the object. Depending on the geographical position of the vehicle, it is determined whether the vehicle is within the viewing area. If the vehicle is within the viewing area, it is determined on the basis of the detected viewing direction of the occupant whether the object in the surroundings of the vehicle was perceived by the occupant.
- the viewing area defines in particular the area in which a vehicle must be located so that an occupant of the vehicle can perceive the object.
- a position within the viewing area is a necessary, but not a sufficient, condition for the perception of the object, since the visual perceptibility of the object can be restricted under limited visibility conditions, such as fog. In this way, however, it is very easy to determine whether an object is suitable for further analysis. If the vehicle is outside the viewing area of an object, there is no need to further determine whether the occupant of the vehicle has perceived this object.
- the viewing area can also define a minimum distance that is required to perceive the object. If the object includes, for example, font of a certain size, a minimum distance can be defined, which is necessary so that an occupant of the vehicle can not only see the object but can also read the font. In this case, perception of the object is defined in such a way that the content of the written notification can be perceived.
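The viewing-area test can be sketched as a sector check in a local metric frame; the sector parameters (distance band, half-angle) and all names below are illustrative assumptions, not values from the patent:

```python
import math

def in_viewing_area(vehicle_xy, object_xy, facing_deg,
                    min_dist=10.0, max_dist=150.0, half_angle_deg=60.0):
    """Check whether a vehicle position lies in the sector-shaped viewing
    area in front of an object (local metric coordinates, metres).

    facing_deg: direction the object's display surface faces, in degrees.
    The distance band models both the maximum perception range and the
    distance needed, for example, to read lettering of a certain size.
    """
    dx = vehicle_xy[0] - object_xy[0]
    dy = vehicle_xy[1] - object_xy[1]
    dist = math.hypot(dx, dy)
    if not (min_dist <= dist <= max_dist):
        return False
    # bearing from the object to the vehicle, compared to the facing direction
    bearing = math.degrees(math.atan2(dy, dx))
    diff = (bearing - facing_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= half_angle_deg
```

Vehicles outside this sector can be discarded immediately, which is the pre-filter role the viewing area plays in the method.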
- the gaze direction of the occupant is determined in that the gaze detection device first determines the gaze direction of the occupant relative to the orientation of the vehicle; the gaze direction of the occupant relative to the surroundings of the vehicle is then determined from the orientation of the vehicle and this relative gaze direction.
- the orientation of the vehicle can be determined relatively easily on the basis of the change in the geographical position of the vehicle over time.
- the alignment can be determined in a manner known per se via the sensors located in the vehicle.
- the gaze detection device then no longer has to determine the gaze direction relative to the object in the vicinity of the vehicle, but only relative to the reference system of the vehicle. Since the gaze detection device is arranged in the vehicle, that is, in the reference system of the vehicle, known eye tracking systems can be used which are arranged in the vehicle and which, for example, track movements of the pupil of the driver or of another vehicle occupant.
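A minimal sketch of this two-step determination: the vehicle heading derived from the change of position over time, then the gaze measured in the vehicle frame rotated into the world frame. All function names are hypothetical:

```python
import math

def heading_from_positions(p_prev, p_curr):
    """Vehicle heading in degrees (world frame), derived from the change
    of the geographical position over time, as described in the text."""
    return math.degrees(math.atan2(p_curr[1] - p_prev[1],
                                   p_curr[0] - p_prev[0]))

def gaze_world_deg(gaze_vehicle_deg, vehicle_heading_deg):
    """Gaze direction in the world frame: the eye tracker yields the gaze
    relative to the vehicle; adding the vehicle heading rotates it out
    of the vehicle's reference system."""
    return (gaze_vehicle_deg + vehicle_heading_deg) % 360.0
```

For example, a vehicle driving due north (heading 90° in this convention) with the occupant looking 30° to the left of straight ahead yields a world-frame gaze of 120°.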
- from the detected viewing direction and the change in the detected geographic position of the vehicle, it is determined how long the object in the surroundings of the vehicle was perceived by the occupant.
- the object is then particularly marked as perceived.
- a minimum time interval can be defined. Only if the duration of the perception of the object exceeds this minimum time interval is the object marked as perceived.
- it is determined continuously, or for successive points in time, whether the viewing direction is directed towards the object at the respective geographical position of the vehicle. Since the vehicle is moving, the direction in which the occupant looks must also change when he looks at an object for a certain length of time. This change in the position of the vehicle is taken into account in the method according to the invention in order to determine how long the object in the surroundings of the vehicle was perceived.
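The duration logic, including the minimum time interval before an object is marked as perceived, can be sketched as follows; the sample format, tolerance and threshold are assumptions:

```python
def perception_duration(samples, hit_tolerance_deg=5.0, min_duration_s=0.5):
    """Accumulate how long the gaze stayed on the object and decide whether
    to mark it as perceived.

    samples: list of (timestamp_s, angle_to_object_deg) pairs, where
    angle_to_object_deg is the angular offset between the occupant's gaze
    direction and the direction from vehicle to object at that instant
    (already compensated for the vehicle's motion).
    Returns (total_seconds, marked_as_perceived).
    """
    total = 0.0
    for (t0, a0), (t1, _a1) in zip(samples, samples[1:]):
        if abs(a0) <= hit_tolerance_deg:  # gaze on object during this interval
            total += t1 - t0
    return total, total >= min_duration_s
```

Only if the accumulated duration exceeds the minimum interval is the object marked as perceived, matching the development described above.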
- metadata of the perceived object can also be determined. This metadata can include, for example, information about the advertising content shown by the object, as well as information about the advertised product or company.
- the metadata can also be stored in the data memory and obtained by accessing it. The determination of such metadata is advantageous for further analysis. It is also possible to output this metadata to the occupant of the vehicle; in this way, the occupant can be given information associated with the perceived object.
- the metadata of the perceived object change over time.
- the object can comprise a display which shows messages, advertising content or other information, such as traffic information, at different time intervals.
- the time or the time interval in which the object was perceived by the occupant is detected, and the metadata of the perceived object are determined for the time or the time interval in which the object was perceived. In this way, it is not only recorded that the object was perceived by the occupant, but also which information the object was displaying at that time or in that time interval.
- in a further development of the method, it is determined whether an information content is output by the determined object. Recognition data on the output information content are then recorded. At least during, or for a time interval after, the output of the information content, a conversation between occupants of the vehicle is recorded by means of a microphone in the interior of the vehicle. An analysis of the captured conversation, which is made during or for a time interval after the output of the information content, then determines whether the output information content was perceived by an occupant of the vehicle. In this way, it can be determined more precisely, on the basis of information content output by the object, whether the object was perceived.
- the time interval after the information content has been output may include, for example, a period of up to 30 minutes after the end of the output of the information content.
- the time interval comprises in particular a period of 20 minutes, preferably 10 minutes or 2 minutes, after the information content has been output.
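The follow-up window can be expressed as a simple time check; the function name is an assumption, and the default of 600 s corresponds to the 10-minute interval named above:

```python
def in_followup_window(output_end_s, event_s, window_s=600.0):
    """True if an event (e.g. a recorded utterance) falls within the
    follow-up window after the information content stopped being output.

    window_s defaults to 600 s (10 minutes); the text also mentions
    windows of 2, 20 and up to 30 minutes.
    """
    return 0.0 <= event_s - output_end_s <= window_s
```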
- the information content is any form of information that can be perceived by a vehicle occupant.
- the information content is, in particular, an advertising content. Advertising is thus output, the content of which can be perceived by an occupant of the vehicle.
- the vehicle occupant is monitored to determine whether he has perceived the advertising content.
- the content of a conversation between occupants of the vehicle is analyzed.
- in this way, a connection with the recognition data of the associated information content can be established. If such a connection exists, it is concluded that the information content was perceived by a vehicle occupant.
- by means of the conversation analysis, it is achieved that the perception of the information content, and thus of the object that output this information content, can be concluded very reliably.
- the recognition data are obtained by analyzing the information content output.
- a visual output of the information content can be captured by a camera that records the surroundings of the vehicle.
- the data captured by a camera can then be analyzed in order to obtain recognition data for the information content.
- the recognition data can advantageously be obtained solely from devices in the vehicle, without it being necessary to access devices external to the vehicle.
- recognition data relating to the information content of the determined object in the vicinity of the vehicle is transmitted from the data memory to the vehicle.
- the recognition data can be, for example, keywords of the output information content. In the analysis of the recorded conversation, which takes place during or for a time interval after the output of the information content, it is determined whether there is a match between at least a subset of the keywords belonging to the information content and words of the conversation. If words occur in the conversation that correspond to keywords of the information content, it can advantageously be concluded very reliably that the conversation content is related to the output information content.
- the keywords are therefore special or rarely occurring words of the information content, so that in the conversation analysis a clear distinction can be made between a general conversation and a conversation about the information content. If a conversation about the information content was conducted, the content of the conversation is determined and it is established what the reaction of at least one occupant to the information content is. For example, it can be analyzed whether the occupant responded positively, neutrally or negatively to the information content. With this configuration, it can advantageously be determined not only whether the output of the information content was perceived, but also what the reaction to it was.
- the mood of the occupant is determined by means of a mood detection device at the time, or in the time interval, in which the object was perceived. The mood detection device can be coupled, for example, to an interior camera and/or a microphone. In this way, the voice, facial expressions and/or gestures of the occupant can be analyzed and classified by the mood detection device and thus assigned to a specific mood.
- personal characteristics of the occupant can also be recorded. For example, the identity of the occupant, in particular of the driver of the vehicle, can be determined via the vehicle key or on the basis of an input made at the start of the journey.
- the occupant's personal characteristics are detected by means of an occupant detection device. This can access an interior camera and/or a microphone. In this way, for example, the age group and the sex of the occupant can be determined automatically. This information can support the further evaluation of the data on the perception of objects.
- an in-vehicle analysis device detects which object in the surroundings of the vehicle was perceived by which occupant and when.
- the in-vehicle analysis device can detect which metadata the perceived object has, how long the object has been perceived and / or which personal characteristics and / or which mood the occupant who perceived the object had.
- the data recorded by the analysis device can then be transmitted to an evaluation device external to the vehicle. This makes it possible to evaluate the data obtained during the monitoring of the vehicle occupant via an external, in particular central, evaluation device in order to obtain data on the perception of objects.
- the evaluation device external to the vehicle can, in particular, use the data from a large number of vehicles to determine how an information content associated with certain metadata is perceived. Alternatively or additionally, it can determine the effectiveness of the output of the information content.
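As an illustrative sketch of the central evaluation step (report format and names are assumptions), the per-vehicle reports can be aggregated into perception counts and total viewing time per object:

```python
from collections import Counter

def aggregate_perceptions(reports):
    """Combine per-vehicle reports into perception statistics per object.

    reports: iterable of (vehicle_id, object_id, duration_s) tuples, as
    might be sent by the in-vehicle analysis devices to the central
    evaluation device. Returns (views per object, total seconds per object).
    """
    views = Counter()
    total_time = Counter()
    for _vehicle, obj, dur in reports:
        views[obj] += 1
        total_time[obj] += dur
    return views, total_time
```

From such aggregates, the effectiveness of an advertising object's output can be compared across locations and time intervals.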
- the data memory can be at least one non-vehicle server device connected to the motor vehicle by means of a communication link and / or a navigation system of the motor vehicle.
- the device according to the invention for monitoring an occupant of a vehicle is characterized by an interface for access to a data memory in which objects and their geographic positions are stored. Furthermore, the device is characterized by an analysis device, which is designed to determine on the basis of the detected geographical position of the vehicle whether one of the objects is in the vicinity of the vehicle, and also to determine, as a function of a detected viewing direction of the occupant, the detected geographic position of the vehicle and the geographic position of the determined object, whether the object in the surroundings of the vehicle was perceived by the occupant.
- the analysis device is arranged in particular in the vehicle.
- the data memory is arranged, in particular, outside the vehicle. In this case, the interface is in particular a wireless communication interface for transmitting data between the vehicle and the data memory.
- the device according to the invention is particularly designed to carry out the method according to the invention. It therefore also has the same advantages as the method according to the invention.
- the invention further relates to a system for analyzing the perception of objects in a traffic area.
- the system comprises a large number of vehicles, each of which comprises the device according to the invention described above.
- the vehicles each include an analysis device.
- the system further comprises an evaluation device external to the vehicle, which is designed to communicate with the interfaces of the vehicles and to receive data from the analysis devices of the vehicles that indicate which objects were perceived by the occupants of the vehicles.
- in this way, data on the perception of objects by occupants of the vehicles that pass the object, which is arranged, for example, at the roadside, can advantageously be obtained over a long period of time.
- Figure 1 illustrates the perception of an object by a vehicle occupant according to an embodiment of the method
- FIG. 2 shows an embodiment of the device according to the invention.
- An object 5 is positioned next to the street. This can be, for example, an advertising object, such as a billboard.
- object 5 has a specific orientation, so that it can be perceived from a certain viewing area 6 by a person located within this viewing area 6.
- the occupant 2 of the vehicle 1, who is located in the viewing area 6, can look with the viewing direction 3 towards a surface of the object 5 on which an information content is displayed. In this way, the occupant 2 can perceive the object 5 and the information content displayed by the object 5.
- as the vehicle 1 moves past the object 5 on the lane 4, the viewing direction 3 of the occupant 2 will change such that it is directed towards the object 5 for a certain duration when the occupant 2 perceives the object or the displayed information content.
- an exemplary embodiment of the device according to the invention and of the method according to the invention is explained with reference to FIG. 2:
- the vehicle 1 comprises various sensors 7.
- the sensors 7 include, for example, an interior camera 7-1, an interior microphone 7-2, an identification unit 7-3 and a position detection device 7-4.
- the sensors 7 are coupled with a gaze detection device 8, a mood detection device 9 and an occupant detection device 10.
- the gaze direction 3 of a vehicle occupant 2 relative to the reference system of the vehicle can be determined on the basis of the data recorded by the sensors 7
- the gaze detection device 8 can thus comprise an eye tracking system. It is detected, for example, at which point the gaze of the occupant 2 meets a vehicle window, for example the windshield or a side window. Furthermore, the position of the eyes of the occupant 2 within the vehicle 1 can be determined, from which the viewing direction 3 can then be calculated.
- the mood detection device 9 can detect the mood of the occupant 2 at a specific point in time or at a time interval on the basis of the data transmitted by the sensors 7. For example, the facial expressions, gestures and the voice of the occupant 2 can be analyzed and assigned to a specific mood class.
- the occupant detection device 10 can detect personal characteristics of the occupant 2 on the basis of the data acquired by the sensors 7. On the one hand, the identity of the occupant 2 can be determined directly via the identification unit 7-3, for example on the basis of a key personally assigned to the occupant 2. Alternatively or additionally, the personal characteristics of the occupant 2 can be determined by an image analysis and / or voice analysis.
- the gaze direction 3 of the occupant 2 detected by the gaze detection device 8, the mood of the occupant 2 detected by the mood detection device 9 and the personal characteristics of the occupant 2 detected by the occupant detection device 10 are transmitted continuously, or within certain time intervals, to an in-vehicle analysis device 11.
- the position detection device 7-4 also detects the current geographical position of the vehicle 1 and the pose, that is to say the spatial position or orientation, of the vehicle 1, and transmits them continuously to the analysis device 11.
- from these data, the analysis device 11 can determine the speed and the direction of movement of the vehicle 1.
- the in-vehicle analysis device 11 is coupled to an object determination device 13 via the communication interface 12, for example a data connection via the mobile radio network.
- the object determination device 13 accesses a data memory 15.
- Data on objects 5 which are relevant for monitoring the occupant 2 are stored in the data memory 15.
- the data memory 15 stores geographical positions that are assigned to the objects 5. Furthermore, the orientations assigned to the objects are stored in the data memory if it is only possible to perceive the object from certain viewing directions.
- the objects 5 can be advertising objects, objects that show information and references, or other objects 5 whose perception by an occupant 2 is of interest.
- Metadata can also be stored for objects 5. This includes data on the viewing area 6, data on information content shown by the object 5, in particular advertising content, and time data, which may indicate which information content was or will be displayed and when.
- the information content can also include information about an advertised product and an advertised company.
- the in-vehicle analysis device 11 is also connected via the wireless communication interface 12 to an evaluation device 14 external to the vehicle.
- the geographic position of the vehicle 1 is continuously detected by the position detection device 7-4.
- the position of the vehicle 1 is continuously transmitted to the object determination device 13 via the communication interface 12 by means of the in-vehicle analysis device 11.
- the object determination device 13 determines those objects 5 which are in the vicinity of the vehicle 1 as a function of the geographic position of the vehicle 1 transmitted by the analysis device 11. For example, the object determination device 13 can determine whether the vehicle 1 is within the viewing area 6 of an object 5.
- the object determination device 13 then transmits data on the objects 5 in the vicinity of the vehicle 1 back to the analysis device 11. These data include the geographical position and the orientation of the object 5. Furthermore, the data can include a time stamp and all or a subset of the aforementioned metadata of the object 5.
- furthermore, the viewing direction 3 of the occupant 2 is continuously captured. If the viewing direction 3 is detected relative to the reference system of the vehicle 1, the analysis device 11 calculates, taking into account the movement of the vehicle 1, the viewing direction 3 of the occupant 2 continuously in the reference system of the roadway 4, in which the object 5 is located. In this frame of reference, the object is, in particular, immobile.
- the analysis device 11 determines, depending on the detected current viewing direction 3 of the occupant 2, the detected current geographic position of the vehicle 1 and the geographic position and orientation of the object 5 in the surroundings of the vehicle 1, whether the object 5 was perceived by the occupant 2. If an object 5 can only be perceived with a certain orientation relative to a viewer, the orientation of the determined object 5 is also taken into account when determining whether the object 5 in the surroundings of the vehicle 1 was perceived by the occupant 2.
- Such an alignment is to be taken into account, for example, in the case of a flat object 5, if the object 5 is only to be marked as perceived if it was perceived from a certain viewing direction, that is to say if, for example, a surface with displayed information was perceived and not only that Object itself.
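A minimal sketch of this perception test, assuming a 2D geometry, angle tolerances chosen freely for illustration, and an optional surface normal for flat objects (none of these specifics are taken from the patent):

```python
import math

def object_perceived(vehicle_pos, gaze_dir, obj_pos, obj_normal=None,
                     gaze_tolerance_deg=10.0, facing_tolerance_deg=60.0):
    """Heuristic perception test: the object counts as perceived if the
    occupant's gaze points at the object within a tolerance, and (for a
    flat object with a display surface) that surface faces the viewer."""
    to_obj = (obj_pos[0] - vehicle_pos[0], obj_pos[1] - vehicle_pos[1])

    def angle_deg(u, v):
        dot = u[0] * v[0] + u[1] * v[1]
        nu, nv = math.hypot(*u), math.hypot(*v)
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / (nu * nv)))))

    if angle_deg(gaze_dir, to_obj) > gaze_tolerance_deg:
        return False          # occupant is not looking toward the object
    if obj_normal is not None:
        to_viewer = (-to_obj[0], -to_obj[1])
        if angle_deg(obj_normal, to_viewer) > facing_tolerance_deg:
            return False      # display surface is turned away from the viewer
    return True
```

The orientation branch implements the flat-object case described above: a billboard seen edge-on or from behind is not marked as perceived.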
- the analysis device 11 detects at what point in time and for how long the object 5 was perceived by the occupant 2. This information is stored in an internal memory of the analysis device 11. In connection with this, the detected mood of the occupant 2 while he was perceiving the object 5 and the personal characteristics of the occupant 2 are also stored.
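For illustration only, such a stored perception record could be laid out as follows; the field names and types are assumptions, not taken from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class PerceptionEvent:
    """Hypothetical record in the analysis device's internal memory."""
    object_id: str            # identifier of the perceived object 5
    perceived_at: float       # timestamp of first perception
    duration_s: float         # how long the object was looked at
    occupant_mood: str        # detected mood during perception
    occupant_traits: dict = field(default_factory=dict)  # personal characteristics
```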
- the mentioned information is stored in the internal memory of the analysis device 11 whenever the vehicle 1 passes an object 5. This stored information can then be transmitted via the communication interface 12, or in another way, to the vehicle-external evaluation device 14, which can thus further evaluate the data obtained during the monitoring of the occupant 2 of the vehicle 1.
- an information content is output by the object 5. It can be a static display or a changing display.
- the analysis device 11 determines whether an information content is output by the determined object 5.
- the information content is, in particular, an advertising content.
- Recognition data and possibly metadata are assigned to the information content.
- the recognition data are keywords that are related to the information content.
- the keywords can be words that visually appear when the information content is output.
- the recognition data, in particular the keywords, may also be related to the information content in other ways. It is not essential that the keywords have been displayed.
- the recognition data are related to the information content in such a way that the recognition data can be used to determine whether a user has perceived the output of the information content.
- the recognition data can also contain keywords that are related to the information content only in a figurative sense. For example, these may be keywords that are likely to be spoken during a conversation about the information content without the keywords themselves occurring in the information content. In particular, the keywords are chosen such that their general frequency of occurrence is as low as possible, so that the keywords are unlikely to be mentioned in a different context.
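This low-frequency selection criterion can be sketched as follows, assuming a precomputed corpus-frequency table; the function name and the table layout are illustrative, not part of the patent:

```python
def select_recognition_keywords(candidate_words, corpus_frequency, max_keywords=5):
    """Pick the candidate keywords with the lowest general frequency of
    occurrence, so that a match in a recorded conversation is unlikely
    to stem from an unrelated context."""
    ranked = sorted(candidate_words, key=lambda w: corpus_frequency.get(w, 0.0))
    return ranked[:max_keywords]
```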
- if the analysis device 11 has determined that an object 5 is outputting an information content, it acquires recognition data for the outputted information content.
- the analysis device 11 can obtain the recognition data in that recognition data relating to the information content of the determined object 5 in the vicinity of the vehicle 1 are transmitted from the data memory 15 to the analysis device 11. For this purpose, the data memory 15 can store not only at what time which information content is output by which object 5, but additionally also the respectively assigned recognition data.
- the analysis device 11 thus records recognition data for the outputted information content.
- the analysis device 11 also receives recordings from the interior microphone 7-2. It is designed to analyze a conversation recorded during the output of an information content or during a time interval after the output of an information content.
- based on this conversation analysis and the acquired recognition data, the analysis device 11 determines whether the outputted information content was perceived.
- for this purpose, the analysis device 11 extracts, in particular, words of the detected conversation and compares them with the keywords of the recognition data.
- in this way, the analysis device 11 can verify that an occupant 2 has perceived the object 5 which has output the information content.
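A minimal sketch of this keyword comparison, assuming the conversation is already available as a text transcript (speech recognition itself is out of scope here; function name and tokenization are illustrative assumptions):

```python
import re

def conversation_mentions_content(transcript, recognition_keywords):
    """Check whether any recognition keyword occurs in a conversation
    transcript recorded during or shortly after the content was output.
    Returns (perceived, matched_keywords)."""
    words = set(re.findall(r"[\w-]+", transcript.lower()))
    hits = {kw for kw in recognition_keywords if kw.lower() in words}
    return len(hits) > 0, hits
```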
- a large number of vehicles 1 are equipped with the device according to the invention described above.
- the vehicles 1 monitor one or more occupants 2 of the respective vehicle 1 with regard to their perception of objects 5.
- the respective analysis device 11 of a vehicle 1 detects which object 5 was perceived by which occupant 2 at what time. Furthermore, the aforementioned metadata on the perception of the object 5 can be recorded. The data recorded by the analysis devices 11 are transmitted to the vehicle-external evaluation device 14, so that it can comprehensively evaluate how a specific object 5 was perceived.
- the vehicle-external evaluation device 14 uses the data from a large number of vehicles 1 to determine how information contents associated with certain metadata are perceived.
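The fleet-wide evaluation described above can be sketched as a simple aggregation; the event layout (dicts with `object_id` and `duration_s` keys) is an assumption carried over from the hypothetical record sketch, not part of the patent:

```python
from collections import defaultdict

def aggregate_perceptions(events):
    """Aggregate perception events reported by many vehicles: per object,
    count how often it was perceived and average the viewing duration."""
    stats = defaultdict(lambda: {"count": 0, "total_duration_s": 0.0})
    for ev in events:
        s = stats[ev["object_id"]]
        s["count"] += 1
        s["total_duration_s"] += ev["duration_s"]
    return {oid: {"count": s["count"],
                  "mean_duration_s": s["total_duration_s"] / s["count"]}
            for oid, s in stats.items()}
```

The same grouping could additionally be keyed by the stored metadata (e.g. occupant mood) to evaluate how a content is perceived by different groups.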
- Reference numeral list: 1 vehicle
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Business, Economics & Management (AREA)
- Theoretical Computer Science (AREA)
- Strategic Management (AREA)
- Finance (AREA)
- Accounting & Taxation (AREA)
- Development Economics (AREA)
- Multimedia (AREA)
- General Business, Economics & Management (AREA)
- Marketing (AREA)
- Economics (AREA)
- Automation & Control Theory (AREA)
- Game Theory and Decision Science (AREA)
- Entrepreneurship & Innovation (AREA)
- Human Computer Interaction (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Ophthalmology & Optometry (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Mathematical Physics (AREA)
- Traffic Control Systems (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018133445.1A DE102018133445A1 (en) | 2018-12-21 | 2018-12-21 | Method and device for monitoring an occupant of a vehicle and system for analyzing the perception of objects |
PCT/EP2019/082818 WO2020126375A1 (en) | 2018-12-21 | 2019-11-27 | Method and apparatus for monitoring an occupant of a vehicle and system for analysing the perception of objects |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3899851A1 true EP3899851A1 (en) | 2021-10-27 |
Family
ID=68806720
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19816228.1A Pending EP3899851A1 (en) | 2018-12-21 | 2019-11-27 | Method and apparatus for monitoring an occupant of a vehicle and system for analysing the perception of objects |
Country Status (6)
Country | Link |
---|---|
US (1) | US20220019824A1 (en) |
EP (1) | EP3899851A1 (en) |
KR (1) | KR102663092B1 (en) |
CN (1) | CN113168643A (en) |
DE (1) | DE102018133445A1 (en) |
WO (1) | WO2020126375A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11704698B1 (en) * | 2022-03-29 | 2023-07-18 | Woven By Toyota, Inc. | Vehicle advertising system and method of using |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070205963A1 (en) * | 2006-03-03 | 2007-09-06 | Piccionelli Gregory A | Heads-up billboard |
US9047256B2 (en) * | 2009-12-30 | 2015-06-02 | Iheartmedia Management Services, Inc. | System and method for monitoring audience in response to signage |
DE102012213466A1 (en) | 2012-07-31 | 2014-02-06 | Robert Bosch Gmbh | Method and device for monitoring a vehicle occupant |
US20140278910A1 (en) | 2013-03-15 | 2014-09-18 | Ford Global Technologies, Llc | Method and apparatus for subjective advertisment effectiveness analysis |
US20140278933A1 (en) * | 2013-03-15 | 2014-09-18 | F. Gavin McMillan | Methods and apparatus to measure audience engagement with media |
US20140379456A1 (en) * | 2013-06-24 | 2014-12-25 | United Video Properties, Inc. | Methods and systems for determining impact of an advertisement |
DE102014109079A1 (en) * | 2013-06-28 | 2014-12-31 | Harman International Industries, Inc. | DEVICE AND METHOD FOR DETECTING THE INTEREST OF A DRIVER ON A ADVERTISING ADVERTISEMENT BY PURSUING THE OPERATOR'S VIEWS |
US20150310451A1 (en) * | 2014-04-29 | 2015-10-29 | Ford Global Technologies, Llc | Vehicle driver tracking and reporting |
US9607515B2 (en) * | 2014-12-22 | 2017-03-28 | Intel Corporation | System and method for interacting with digital signage |
US9852355B2 (en) | 2015-04-21 | 2017-12-26 | Thales Avionics, Inc. | Facial analysis for vehicle entertainment system metrics |
US10970747B2 (en) * | 2015-09-04 | 2021-04-06 | Robert Bosch Gmbh | Access and control for driving of autonomous vehicle |
JP2017123029A (en) * | 2016-01-06 | 2017-07-13 | 富士通株式会社 | Information notification apparatus, information notification method and information notification program |
JP6694112B2 (en) * | 2017-03-17 | 2020-05-13 | マクセル株式会社 | AR display device and AR display method |
US10083547B1 (en) * | 2017-05-23 | 2018-09-25 | Toyota Jidosha Kabushiki Kaisha | Traffic situation awareness for an autonomous vehicle |
- 2018
- 2018-12-21 DE DE102018133445.1A patent/DE102018133445A1/en active Pending
- 2019
- 2019-11-27 WO PCT/EP2019/082818 patent/WO2020126375A1/en unknown
- 2019-11-27 EP EP19816228.1A patent/EP3899851A1/en active Pending
- 2019-11-27 US US17/414,838 patent/US20220019824A1/en active Granted
- 2019-11-27 CN CN201980084891.4A patent/CN113168643A/en active Pending
- 2019-11-27 KR KR1020217022757A patent/KR102663092B1/en active IP Right Grant
Also Published As
Publication number | Publication date |
---|---|
US20220019824A1 (en) | 2022-01-20 |
CN113168643A (en) | 2021-07-23 |
DE102018133445A1 (en) | 2020-06-25 |
WO2020126375A1 (en) | 2020-06-25 |
KR102663092B1 (en) | 2024-05-03 |
KR20210100731A (en) | 2021-08-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE102012213466A1 (en) | Method and device for monitoring a vehicle occupant | |
DE102017213177A1 (en) | Method for operating a screen of a motor vehicle and motor vehicle | |
DE102016014712B4 (en) | Vehicle and method for outputting information to a vehicle environment | |
DE102017111468A1 (en) | A vehicle system and method for determining whether a vehicle occupant has sensed an off-vehicle object | |
EP3547244A1 (en) | Method, device and computer readable storage medium with instructions for providing content for display for a passenger of a motor vehicle | |
EP2813999B1 (en) | Augmented reality system and method of generating and displaying augmented reality object representations for a vehicle | |
DE102014207398A1 (en) | Object association for contact-analogue display on an HMD | |
DE102015119704A1 (en) | Alarm system for obstacles and a method for its operation | |
DE102012018556B4 (en) | Assistance system to enable an extended foresight for subsequent road users | |
EP3899851A1 (en) | Method and apparatus for monitoring an occupant of a vehicle and system for analysing the perception of objects | |
DE102018210852A1 (en) | Procedure for the determination of illegal driving behavior by a vehicle | |
DE102019001092A1 (en) | Method for operating a driver assistance system, as well as electronic computing device, computer program product and data carrier | |
DE102017209370B4 (en) | Method for determining overtaking information | |
DE102018117015A1 (en) | Method for detecting an interest of a user of a motor vehicle in an object, detection system and motor vehicle | |
WO2022058143A1 (en) | Method and device for providing information in a vehicle | |
WO2020126376A1 (en) | Method and device for monitoring an occupant of a vehicle | |
EP4252218A1 (en) | Method for operating an assistance system of a motor vehicle, computer program product, and assistance system | |
DE102020004792A1 (en) | Method and device for the detection and reporting of parking accidents for vehicles | |
EP2872943B1 (en) | Device for operating multiple optical display devices of a vehicle | |
DE102016103037A1 (en) | Method for checking the functionality of a camera-monitor system, camera-monitor system and motor vehicle | |
DE102019219171A1 (en) | Driver assistance system, crowdsourcing module, procedure and computer program | |
DE102013003035A1 (en) | Method for obtaining movement profile of observed motor car e.g. bus, involves receiving vehicle identifier and associated observation data with meeting time point and position indicator for meeting location of observed motor car | |
DE102018220617A1 (en) | Method for recognizing a gaze area of a person, in particular an occupant in a vehicle | |
DE102018212171B4 (en) | Method and device for detecting road users in the vicinity of a vehicle | |
DE102016212185A1 (en) | Procedure for exchanging and displaying location-related information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20210721 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
17Q | First examination report despatched |
Effective date: 20230320 |