US20180232770A1 - Billboard display and method for selectively displaying advertisements by sensing demographic information of occupants of vehicles - Google Patents
- Publication number
- US20180232770A1 (application US 15/751,327)
- Authority
- US
- United States
- Prior art keywords
- occupants
- vehicle
- billboard display
- demographic information
- vehicles
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06Q30/0266—Vehicular advertisement based on the position of the vehicle
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters related to drivers or passengers
- B60W60/0059—Estimation of the risk associated with autonomous or manual driving, e.g. situation too complex, sensor failure or driver incapacity
- G06Q30/0242—Determining effectiveness of advertisements
- G06Q30/0269—Targeted advertisements based on user profile or attribute
- G06V20/593—Recognising seat occupancy
- G06V40/161—Human faces: detection; localisation; normalisation
- B60W2420/403—Image sensing, e.g. optical camera
- B60W2540/043—Identity of occupants
- B60W2540/045—Occupant permissions
- B60W2540/049—Number of occupants
- G06Q30/0645—Rental transactions; leasing transactions
- G06V40/178—Human faces: estimating age from face image; using age information for improving recognition
Definitions
- Embodiments of the invention relate to a billboard display system that receives data from interior occupant sensing systems of vehicles and analyzes the data to provide advertising on a billboard display that is directed toward the occupants of the vehicles.
- the concept of electronic billboard displays is old and known.
- the electronic billboard displays provide various advertisements that change periodically.
- the advertisements are simply selected based on payments by advertisers and displayed without any knowledge of, or relationship to, the persons in vehicles that are positioned to view the billboard displays.
- the invention provides a method for selectively displaying advertisements with a billboard display system to occupants in one or more vehicles approaching a billboard display located near a roadway.
- the method includes receiving a number of occupants, demographic information of occupants, and vehicle position information from an interior occupant sensing system for one or more vehicles approaching the billboard display, processing the demographic information and the number of occupants to determine advertising relevant to occupants, and selectively displaying relevant advertising with the billboard display to one or more vehicles approaching the billboard display.
- a billboard display system provides advertising on a billboard display to occupants in one or more vehicles in response to operation of an interior occupant sensing system in one or more vehicles to obtain demographic information of occupants.
- the billboard display system includes a display controller including a processor and a memory, along with a communication unit for providing communication between the display controller and the interior occupant sensing system of one or more vehicles.
- the processor is configured to receive via the communication unit, a number of occupants disposed in one or more vehicles, the demographic information for each occupant disposed in one or more vehicles, and vehicle position information from an interior occupant sensing system that includes a video camera disposed in each of the one or more vehicles.
- the processor provides an output to the billboard display to selectively display advertising that is targeted to the demographic information of occupants of the one or more vehicles.
- the invention provides a method of selectively displaying advertisements on a billboard display of a billboard display system to occupants in one or more vehicles approaching the billboard display located near a roadway.
- the method includes, in an interior occupant sensing system of a vehicle approaching the billboard display, operating a video camera to obtain video data including faces of occupants in a vehicle.
- the video data is provided to an electronic control unit.
- the method includes sensing a GPS signal and determining a vehicle position information of a vehicle, detecting a number of occupants disposed in a vehicle from the video data, and determining demographic information from the video data of the faces of occupants.
- the method further includes storing the vehicle position information, the number of occupants, and the demographic information of occupants in a memory of the interior occupant sensing system, and wirelessly transmitting the number of occupants, the demographic information, and the vehicle position information. Further, the method includes, in a display controller of the billboard display system, receiving the number of occupants, the demographic information of occupants, and the vehicle position information wirelessly transmitted by the interior occupant sensing system for a vehicle approaching the billboard display. The billboard display system processes the demographic information and the number of occupants to determine advertising relevant to occupants; and selectively displays relevant advertising on the billboard display to occupants in an approaching vehicle.
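- the method above (receive the number of occupants, demographic information, and vehicle position; process; selectively display) can be sketched at a high level in Python. Everything in this sketch, including the OccupantReport payload, the DisplayController class, and the predicate-based advertisement table, is an illustrative assumption for discussion, not the patent's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class OccupantReport:
    """Payload a vehicle's interior occupant sensing system might transmit."""
    vehicle_id: str
    num_occupants: int
    demographics: list   # one dict per occupant, e.g. {"age": 34}
    position_m: float    # distance to the billboard along the roadway (from GPS)

class DisplayController:
    """Hypothetical stand-in for the display controller 22."""
    def __init__(self, ads):
        # ads maps an advertisement name to a targeting predicate
        # evaluated over the pooled list of demographic records
        self.ads = ads

    def select_ad(self, reports):
        # pool demographics from all approaching vehicles
        population = [d for r in reports for d in r.demographics]
        # pick the first advertisement whose targeting predicate matches
        for name, matches in self.ads.items():
            if matches(population):
                return name
        return "default"

ads = {
    "family_restaurant": lambda pop: any(d["age"] < 12 for d in pop),
    "coffee_brand": lambda pop: all(d["age"] >= 18 for d in pop),
}
controller = DisplayController(ads)
reports = [OccupantReport("v1", 3, [{"age": 40}, {"age": 38}, {"age": 8}], 250.0)]
print(controller.select_ad(reports))  # → family_restaurant
```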
- FIG. 1 is a diagram of a roadway and a billboard display system, according to some embodiments.
- FIG. 2 is a block diagram of an interior occupant sensing system, according to some embodiments.
- FIG. 3 is a block diagram of an electronic controller, according to some embodiments.
- FIG. 4 is a front view of a dashboard and a windshield of a vehicle, according to some embodiments.
- FIG. 5 is a flow chart illustrating a method for detecting data for vehicle occupants, according to some embodiments.
- FIG. 6 is a flow chart illustrating a method for detecting marketing metadata for vehicle occupants that includes speech recognition, according to some embodiments.
- FIG. 7 is a flow chart illustrating a method for interacting with a billboard display system using occupant metadata, according to some embodiments.
- embodiments may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware.
- the electronic based aspects of the embodiments may be implemented in software (e.g., stored on non-transitory computer-readable medium) executable by one or more processors.
- "processing units" and "controllers" described in the specification can include standard processing components, such as one or more processors, one or more memory modules including non-transitory computer-readable medium, one or more input/output interfaces, and various connections (e.g., a system bus) connecting the components.
- FIG. 1 illustrates a roadway 10 with vehicles 12 , and a nearby billboard display system 20 .
- the billboard display system 20 near the roadway 10 includes a display controller 22 that includes a processor 24 in communication with a memory 26 and a display driver 28 .
- the display driver 28 connects to or communicates with a billboard display 30 located adjacent the roadway 10 .
- the processor 24 is linked to a communication unit 34 for communicating with vehicles 12 , etc.
- the billboard display 30 selectively displays various advertising images.
- the billboard display 30 is a digital electronic display for providing various advertising images stored in memory 26 .
- the selected advertising images include video images, such that the image displayed on the billboard display 30 changes visually to attract attention to the billboard display 30 .
- the processor 24 provides a digital image signal for advertising to the display driver 28 , which provides the image to the billboard display 30 for viewing thereon.
- the processor 24 obtains the digital image signal from the memory 26 or from a remote cloud server 36 shown in FIG. 1 or other electronic storage device via an internet connection or via the communication unit 34 .
- FIG. 2 illustrates an interior occupant sensing system 40 .
- the interior occupant sensing system 40 is integrated into each vehicle 12 , and includes an electronic controller 42 , a video camera 44 , a microphone 46 , a database 48 , a communications system 50 , a human-machine interface (HMI) 52 , a global positioning signal (GPS) navigation system 54 , an entertainment system 56 , and other vehicle systems 58 .
- the components of the interior occupant sensing system 40 are in communication with the electronic controller 42 .
- the electronic controller 42 shown in FIG. 2 may be dedicated to the interior occupant sensing system 40 , or may also be part of another vehicle operating system. In some embodiments, the electronic controller 42 is integrated with the video camera 44 and microphone 46 into a single device. In other embodiments, the electronic controller 42 is part of a multi-camera system that includes interior and exterior cameras. In some embodiments, the interior occupant sensing system 40 includes more than one camera.
- the video camera 44 is a digital video camera in one embodiment. The video camera 44 is positioned to view the interior of the vehicle and the occupants (e.g., the driver and one or more passengers) of the vehicle.
- the electronic controller 42 is configured to receive and process images or video data from the video camera 44 .
- the microphone 46 is positioned in the interior of the vehicle and is configured to sense or detect sound (including voices), convert the sound or audio signal to audio data, and provide the audio data to the electronic controller 42 .
- the electronic controller 42 is configured to receive and process the audio data from the microphone 46 .
- the microphone 46 may be a stand-alone device or it may be part of another vehicle system (e.g., a hands-free cellular system).
- the database 48 shown in FIG. 2 electronically stores information regarding the interior occupant sensing system 40 , including metadata of occupants, according to the methods described herein.
- the electronic controller 42 is configured to read and write such information to and from the database 48 .
- the database 48 is accessible by the electronic controller 42 .
- the database 48 may be located in the electronic controller 42 , or housed on a suitable database server or other system external to the interior occupant sensing system 40 and accessible over one or more intervening networks.
- the communications system 50 shown in FIG. 2 includes hardware and software components that allow the electronic controller 42 to communicate wirelessly using one or more modalities from the group consisting of: cellular data, vehicle-to-everything (V2X) communications, and Wi-Fi. Other communication arrangements and protocols are also contemplated.
- the communications system 50 enables the electronic controller 42 to communicate with other systems using public data networks (e.g., the Internet).
- the HMI 52 shown in FIG. 2 provides an interface between the interior occupant sensing system 40 and the occupants of the vehicle 12 .
- the HMI 52 is coupled to the electronic controller 42 and is configured to receive inputs from the occupants, receive data from the controller, and provide warnings or other information to the occupants based on the data.
- the HMI 52 includes suitable input and output mechanisms, including, for example, buttons, a touch-screen display having a graphical user interface (GUI), voice recognition, etc.
- the GPS navigation system 54 includes hardware and software for locating the vehicle using global positioning system (GPS) signals and plotting routes for the driver.
- the entertainment system 56 includes hardware (e.g., a display screen) and software configured to provide video and audio entertainment content to occupants of the vehicle. In some embodiments, the entertainment content includes video content streamed through the communications system 50 .
- the electronic controller 42 is also coupled to other vehicle systems 58 , including, for example, steering, braking, and automated vehicle control systems.
- the electronic controller 42 includes an electronic processing unit 60 (e.g., a microprocessor or another suitable programmable device), a non-transitory memory 64 (e.g., a computer-readable storage medium), and an input/output interface 68 .
- the input/output interface 68 includes one or more control or data buses enabling the electronic processing unit 60 to communicate with the various devices illustrated in FIG. 2 .
- the input/output interface 68 provides an electrical connection over a data bus or a wired, wireless, or optical connection that enables the devices shown in FIG. 2 to communicate using network communications protocols, for example, Ethernet, CAN, or FlexRay.
- the memory 64 can include a program storage area (e.g., read-only memory (ROM)), a data storage area (e.g., random access memory (RAM)), and other non-transitory computer-readable media.
- the electronic processing unit 60 executes software stored in the memory 64 .
- the software may include instructions and algorithms for performing methods as described herein.
- the input/output interface 68 shown in FIG. 3 receives inputs and provides outputs to and from systems external to the electronic controller 42 , including the devices and systems shown in FIG. 2 .
- the electronic controller 42 includes a plurality of electrical and electronic components that provide power, operational control, and protection to the components and modules within the electronic controller 42 . It should be understood that the electronic controller 42 may include additional, fewer, or different components.
- the electronic controller 42 is configured to perform machine learning functions.
- the memory 64 stores one or more learning engines executable by the electronic processing unit 60 to process data received from the video camera 44 and the microphone 46 , and develop demographic metadata on the occupants of the vehicle.
- Machine learning generally refers to the ability of a computer application to learn without being explicitly programmed.
- a computer application performing machine learning (sometimes referred to as a learning engine) is configured to develop an algorithm based on training data.
- the training data includes example inputs and corresponding desired (e.g., actual) outputs, and the learning engine progressively develops a model that maps inputs to the outputs included in the training data.
- Machine learning can be performed using various types of methods and mechanisms including, but not limited to, decision tree learning, association rule learning, artificial neural networks, inductive logic programming, support vector machines, clustering, Bayesian networks, reinforcement learning, representation learning, similarity and metric learning, sparse dictionary learning, and genetic algorithms.
- the electronic controller 42 accesses one or more sources of training data, predictive models, or learning engines through one or more communication networks, such as the Internet and other public and private networks. Alternatively or in addition, the electronic controller 42 stores training data in the database 48 or in the memory 64 .
- FIG. 4 is a partial perspective view of a front interior portion 70 of a vehicle 12 that includes a dashboard 72 , a windshield 74 and a roof 76 . Further, a video camera 44 is mounted to the roof 76 near the rear view mirror 78 to obtain video data within the vehicle. Further, FIG. 4 shows the human-machine interface 52 , such as a touch screen.
- the interior occupant sensing system 40 for each vehicle operates as follows.
- the electronic processing unit 60 determines the number, pose and location of each occupant that is detected (step 102 ). For each face, demographic information and markers are estimated or determined (step 106 ) by the electronic processing unit 60 executing classifier algorithms. Determining demographic information includes utilizing the video data or video images for a given occupant's face to 1) classify age, 2) classify gender and 3) classify ethnicity and/or race. In some embodiments, other classification information is obtained. The demographic information is determined for each individual occupant.
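- the per-face classification of step 106 can be illustrated with a minimal Python sketch. The rule-based functions below are stand-ins for the trained classifiers the patent contemplates, and the feature names (apparent_age, gender_estimate) are assumptions for illustration only.

```python
def classify_age_band(face):
    # stand-in for a trained age classifier operating on face features
    age = face["apparent_age"]
    if age < 12:
        return "child"
    return "teen" if age < 20 else "adult"

def classify_demographics(faces):
    """Return one demographic record per detected occupant face (step 106 sketch)."""
    return [
        {
            "occupant": i,
            "age_band": classify_age_band(face),
            "gender": face.get("gender_estimate", "unknown"),
        }
        for i, face in enumerate(faces)
    ]

faces = [{"apparent_age": 35, "gender_estimate": "M"}, {"apparent_age": 7}]
records = classify_demographics(faces)
print(records[0]["age_band"], records[1]["age_band"])  # → adult child
```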
- upon determining the demographic information from video images of faces of occupants, the electronic processing unit 60 further determines emotive responses (step 110 ).
- the emotive responses include 1) gaze estimation, including where the face is looking and 2) current emotions, such as a smile, frown or laugh. Changes in the occupants' emotive response to stimuli may be caused by changes in operation of the vehicle itself or externally (i.e., current emotion). For instance, an occupant might display happiness when the vehicle accelerates, or display anxiety when a warning is issued by the HMI 52 .
- the electronic processing unit 60 is configured for storing the emotions and the exact time that the emotions or emotive responses occur in the memory 64 .
- the face 104 of an occupant is tracked or followed within the video image (step 114 ).
- Tracking faces 104 of occupants enables the interior occupant sensing system 40 to determine emotions of occupants and their emotive responses to stimuli, such as advertising. That way, a face 104 that has been classified is followed and reclassification is not necessarily required.
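- the timestamped logging of emotive responses described above can be sketched as follows. The EmotionTracker class and its interface are hypothetical; emotion detection itself is assumed to happen upstream in the classifier.

```python
import time

class EmotionTracker:
    """Stores each occupant's emotive responses with the time they occurred
    (a sketch of steps 110 and 114)."""
    def __init__(self):
        self.log = []  # (timestamp, occupant_id, emotion)

    def record(self, occupant_id, emotion, timestamp=None):
        ts = time.time() if timestamp is None else timestamp
        self.log.append((ts, occupant_id, emotion))

    def responses_between(self, start, end):
        """Emotive responses in a time window, e.g. while an ad was visible."""
        return [(o, e) for ts, o, e in self.log if start <= ts <= end]

tracker = EmotionTracker()
tracker.record(0, "smile", timestamp=100.0)
tracker.record(1, "frown", timestamp=105.0)
print(tracker.responses_between(99.0, 101.0))  # → [(0, 'smile')]
```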
- the electronic processing unit 60 is further configured to combine the demographic information with other vehicle information (i.e. GPS location, duration of trip, passengers entering/exiting, etc.) (step 116 ).
- Other vehicle information may include an intended destination for the vehicle, and the make and model of the vehicle.
- the clothing worn by the occupants may be utilized in classifying the occupants and predicting what advertising to provide thereto.
- the electronic processing unit 60 is configured to improve classifier estimates as the occupant is tracked over time (step 120 ). Influences such as lighting, occupant movement, and occupant clothing may vary over time, causing the electronic processing unit 60 to make different demographic estimations over time.
- the electronic processing unit 60 uses machine learning algorithms to improve its estimates by recognizing trends or eliminating outliers. This continuous refinement improves the reliability of the stored metadata.
- the electronic processing unit 60 is configured to store the demographic information, the vehicle position information, the trip destination information, the trip duration information and additional information as metadata in the database 48 (step 124 ).
- the electronic processing unit 60 classifies the data as occupant specific data (for example, based on face, and as set forth below, voice recognition algorithms); generic demographic data tied to a specific vehicle at a specific time and location; or as generic metadata that is not tied to any individual or vehicle, but aggregated to extract demographic patterns of vehicle use and transportation. Depending on the application, this allows for either specific customization of the application to the user, or broad statistics based applications that can ensure individual privacy.
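- the three-way storage classification described above can be sketched with two small Python functions. The field names (face_id, voice_id, vehicle_id, timestamp) are assumptions for illustration, not the patent's schema.

```python
def classify_record(record):
    """Sort a metadata record into one of the three storage categories:
    occupant-specific, generic but tied to a vehicle/time, or fully aggregated."""
    if "face_id" in record or "voice_id" in record:
        return "occupant_specific"
    if "vehicle_id" in record and "timestamp" in record:
        return "generic_vehicle"
    return "aggregate"

def anonymize(record):
    """Strip identifying fields so only demographic patterns remain,
    supporting the privacy-preserving aggregate category."""
    keep = ("age_band", "gender", "num_occupants")
    return {k: v for k, v in record.items() if k in keep}

r = {"face_id": "f1", "vehicle_id": "v1", "timestamp": 100.0, "age_band": "adult"}
print(classify_record(r))             # → occupant_specific
print(classify_record(anonymize(r)))  # → aggregate
```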
- the microphone 46 collects occupant speech to better ascertain occupant emotive state and reactions to external events.
- the electronic processing unit 60 is configured to perform speech recognition on audio data received from the microphone 46 (step 144 ). Key words or phrases are then identified (step 148 ). Thereafter, the electronic processing unit 60 is further configured to match the speaker to a face (step 152 ). Such matching relies on estimating when a face of an occupant is moving or speaking and comparing that timing with the key word.
- This key word and specific occupant information is stored as metadata in the database 48 (step 160 ) along with the demographic information and other metadata for each specific user. Further, the metadata is wirelessly transmitted via communications system 50 to the communication unit 34 of the billboard display system 20 .
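- the key-word spotting of step 148 and the speaker-to-face matching of step 152 can be sketched as below. The phrase table, the interval-based mouth-activity representation, and both function names are illustrative assumptions.

```python
KEY_PHRASES = {"i'm hungry": "food", "need gas": "fuel"}  # illustrative table

def extract_keywords(transcript):
    """Step 148 sketch: find key phrases in recognized speech."""
    return [topic for phrase, topic in KEY_PHRASES.items()
            if phrase in transcript.lower()]

def match_speaker(utterance_time, mouth_activity):
    """Step 152 sketch: attribute an utterance to the face whose mouth was
    moving at that moment. mouth_activity maps an occupant id to a list of
    (start, end) speaking intervals estimated from the video."""
    for occupant, intervals in mouth_activity.items():
        if any(start <= utterance_time <= end for start, end in intervals):
            return occupant
    return None

print(extract_keywords("I'm hungry, can we stop?"))            # → ['food']
print(match_speaker(12.5, {0: [(10.0, 14.0)], 1: [(20.0, 22.0)]}))  # → 0
```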
- the interior occupant sensing system 40 is capable of providing dense profile models.
- the electronic controller 42 is configured to fuse detected persons with their demographic information, their emotive response and the speech recognition of key words. Collecting occupant data, along with vehicle dynamics, provides market researchers a real-time response of an occupant to an advertisement displayed on the billboard display system.
- the interior occupant sensing system 40 interacts with the billboard display system 20 using occupant data, such as metadata.
- data on the number of faces in a video image corresponding to the number of occupants, the occupant demographics, and other metrics are wirelessly transmitted by the communications system 50 via V2X to the communication unit 34 of an upcoming billboard display system 20 .
- a vehicle 12 enters the zone of the billboard display system 20 (step 202 ).
- the communications system 50 of the interior occupant sensing system 40 provides wireless transmission or wireless communication (step 206 ) with the communication unit 34 of the billboard display system 20 .
- for communication to occur, the vehicle 12 must be within broadcasting range of the billboard display system 20 .
- the wireless transmission includes the vehicle position information of the vehicle located on a roadway relative to the billboard display 30 as determined from GPS signals.
- the display controller 22 of the billboard display system 20 analyzes a population of occupants approaching the billboard display 30 from one or more approaching vehicles (step 210 ). Further, the display controller 22 is configured to analyze the metrics (step 212 ) from the one or more vehicles approaching the billboard display 30 . Besides the number of occupants and the demographic information ( 214 ) of occupants in multiple approaching vehicles, along with trip destinations ( 216 ) and trip durations ( 218 ) when the trip data is available, the metrics may include other factors. The type of vehicle, the clothing worn by occupants, and the time of day are also provided as metrics in some embodiments.
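- the pooling of metrics across approaching vehicles in steps 210 and 212 can be sketched as follows. The report field names (num_occupants, demographics, age) are assumptions for illustration.

```python
def analyze_population(vehicle_reports):
    """Pool occupant metrics from all vehicles approaching the billboard."""
    ages = [d["age"] for r in vehicle_reports for d in r["demographics"]]
    return {
        "vehicles": len(vehicle_reports),
        "occupants": sum(r["num_occupants"] for r in vehicle_reports),
        "mean_age": sum(ages) / len(ages) if ages else None,
    }

reports = [
    {"num_occupants": 2, "demographics": [{"age": 30}, {"age": 32}]},
    {"num_occupants": 1, "demographics": [{"age": 60}]},
]
summary = analyze_population(reports)
print(summary["occupants"], round(summary["mean_age"], 1))  # → 3 40.7
```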
- the display controller 22 determines and displays a targeted advertisement (step 220 ).
- the targeted advertisement is considered most applicable to the population consisting of the occupants of each of the vehicles in the field of view of the billboard display 30 .
- a targeted advertisement is based on the various factors, including the time of day. For instance, in one embodiment, advertising for hotels or motels is provided in the mid or late evening of a trip of long duration. At what is considered close to typical mealtimes, advertising is provided for approaching restaurants. The type of restaurants or establishments displayed in advertising is also based on the ages of the occupants.
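- the time-of-day and trip-duration targeting described above can be sketched as a rule table. The thresholds and mealtime hours here are illustrative assumptions, not values taken from the patent.

```python
def pick_ad_category(hour, trip_duration_h, ages):
    """Rule-based sketch of time-of-day / trip-duration ad targeting."""
    if hour >= 21 and trip_duration_h >= 4:
        return "hotels"            # late evening on a long trip
    if hour in (7, 8, 12, 13, 18, 19):  # near typical mealtimes
        return "family_restaurants" if any(a < 12 for a in ages) else "restaurants"
    return "general"

print(pick_ad_category(hour=22, trip_duration_h=6, ages=[35, 33]))  # → hotels
print(pick_ad_category(hour=12, trip_duration_h=1, ages=[40, 9]))   # → family_restaurants
```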
- the various metrics interactively assist in the selective display of advertising on the billboard display 30 .
- the billboard display system 20 queries the approaching vehicles 12 for additional or updated metrics (step 230 ).
- the updated metrics include impression counts ( 232 ), emotive responses ( 234 ), and gaze of eyes ( 236 ) of the occupants of the approaching vehicles.
- the interior occupant sensing system 40 is configured to communicate the impression counts, the emotional states, any emotive responses and gaze of eyes of occupants to the billboard display system (step 240 ), along with additional metrics.
- the display controller 22 executes a program using the above information to target or select a future or immediate advertisement that is more likely relevant or useful to occupants of one or more vehicles approaching the billboard display 30 .
- the display controller 22 is configured to process the trip destination information and/or the trip duration information, in combination with the demographic information and number of occupants, to determine advertising relevant to occupants.
- the emotional state of the occupants assists in selection of advertising. For occupants in a positive mood, the advertising is different than the advertising for occupants in an angry or bad mood.
- the display controller 22 is configured to track impression counts for current advertising. Using occupant gaze estimations, advertisers accurately analyze and gauge the effectiveness of various advertising schemes. Analyzing the effectiveness from the impression counts maximizes the relevance of the advertising to be displayed in the future to particular occupants of one or more vehicles.
- Other metrics can be used to target advertisements.
- the billboard display system 20 if the billboard display system 20 is informed that a vehicle's fuel level is low, advertising for nearby gas stations appears.
- child-friendly venues advertise using the billboard display 30 to vehicles 12 carrying children as determined from the demographic information.
- At least one of impression counts, gaze estimation and emotive responses are collected and transmitted to advertisers remotely, such as to the remote cloud server 36 , to gauge the success of advertisements provided on the billboard display 30 .
- the billboard display system 20 is in communication over the internet or other communication with the cloud server 36 .
- the billboard display system 20 provides demographic information to the cloud server 36 and receives advertising for viewing on the billboard display 30 .
- the above described system is controlled using at least one display controller 22 .
- the display controller 22 includes one or more processing units (e.g., a processor, application specific integrated circuits (“ASIC”), etc.).
- the vehicle electronic controller 42 can include one or more processing units as listed above.
- the memory 26 of the display controller 22 and the memory 64 of the vehicle electronic controller 42 are non-transitory computer-readable medium in one embodiment. Both random access memory (RAM) and read only memory (ROM) are contemplated.
- RAM random access memory
- ROM read only memory
- the electronic controller 42 uses the input/output interface 68 for sending and receiving information from one or more sensors or systems external to the electronic controller 42 (e.g., over a vehicle communication bus, such as a controller area network (CAN) bus or a FlexRay bus).
- a vehicle communication bus such as a controller area network (CAN) bus or a FlexRay bus.
- the electronic controller 42 can also include one or more additional internal sensors or systems.
- the billboard display 30 is an electronic billboard display that displays the electronic images from the display driver 28 .
- the billboard display 30 is a video screen of plasma, LCD, LED or other digital light emitting arrangements.
- a loudspeaker or sound output is also provided.
- one vehicle can be considered a first vehicle having a first interior occupant sensing system with a first video camera and another vehicle can be considered a second vehicle having a second interior occupant sensing system with a second video camera.
- first vehicle having a first interior occupant sensing system with a first video camera
- second vehicle having a second interior occupant sensing system with a second video camera.
- third, fourth, fifth, etc. vehicles having interior occupant sensing systems are also contemplated.
- FIG. 1 shows vehicles in one direction approaching the billboard display 30 receiving advertising
- vehicles traveling in opposing directions view advertising messages on respective sides of the billboard display 30 .
- the invention provides, among other things, methods for operating a billboard display system 20 to provide advertising directed to the occupants of vehicles 12 approaching a billboard display 30 and for operating an interior occupant sensing system 40 to provide information or data thereto.
Description
- This application claims the benefit of U.S. Provisional Patent Application No. 62/214,416 filed Sep. 4, 2015, the entire contents of which are incorporated herein by reference.
- Embodiments of the invention relate to a billboard display system that receives data from interior occupant sensing systems of vehicles and analyzes the data to provide advertising on a billboard display that is directed toward the occupants of the vehicles.
- The concept of electronic billboard displays is old and known. The electronic billboard displays provide various advertisements that change periodically. The advertisements, however, are simply selected based on payments by advertisers and displayed without any knowledge of, or relationship to, the persons in vehicles that are positioned to view the billboard displays.
- Increasingly, products, including vehicles, are being customized for specific customer demographics. Many applications can benefit from improved knowledge of the demographic profile of each occupant in a vehicle. For example, with demographic knowledge of the occupants, vehicles can be designed for the needs of their users, and automotive systems can respond and adjust based on the needs and actions of the occupants. In addition to determining the number and profile of occupants in the vehicle, it would also be useful to analyze each occupant's reactions to external stimuli, such as reactions to advertisements (e.g., billboards). Thus, advertisers can better target advertisements to occupants.
- Through data mining and machine learning, dense profiles are modeled and analyzed to assist marketers and advertisers with targeted out-of-vehicle electronic displays.
- In one embodiment, the invention provides a method for selectively displaying advertisements with a billboard display system to occupants in one or more vehicles approaching a billboard display located near a roadway. The method includes receiving a number of occupants, demographic information of occupants, and vehicle position information from an interior occupant sensing system for one or more vehicles approaching the billboard display, processing the demographic information and the number of occupants to determine advertising relevant to occupants, and selectively displaying relevant advertising with the billboard display to one or more vehicles approaching the billboard display.
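The processing step of the method above, mapping received occupant counts and demographics to a relevant advertisement, can be sketched as follows. This is a minimal Python illustration; the ad inventory, segment labels, and coverage-count scoring rule are assumptions made for the sketch, not details from the disclosure.

```python
from collections import Counter

# Hypothetical ad inventory: each ad lists the demographic segments it targets.
# Ad names and segment labels are illustrative assumptions.
AD_INVENTORY = {
    "family_restaurant": {"child", "adult"},
    "luxury_watch": {"adult", "senior"},
    "video_game": {"teen", "child"},
}

def select_advertisement(occupant_segments):
    """Pick the ad whose target segments cover the most approaching occupants."""
    counts = Counter(occupant_segments)

    def score(ad):
        targets = AD_INVENTORY[ad]
        return sum(n for seg, n in counts.items() if seg in targets)

    return max(AD_INVENTORY, key=score)

# Two children and one adult reported by approaching vehicles.
print(select_advertisement(["child", "child", "adult"]))  # family_restaurant
```

A real display controller would apply richer scoring (time of day, trip duration, emotive feedback), but the shape of the decision is the same: aggregate the received demographics, then rank the inventory.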
- In another embodiment, a billboard display system provides advertising on a billboard display to occupants in one or more vehicles in response to operation of an interior occupant sensing system in one or more vehicles to obtain demographic information of occupants. Besides the billboard display located near a roadway, the billboard display system includes a display controller including a processor and a memory, along with a communication unit for providing communication between the display controller and the interior occupant sensing system of one or more vehicles. The processor is configured to receive, via the communication unit, a number of occupants disposed in one or more vehicles, the demographic information for each occupant disposed in one or more vehicles, and vehicle position information from an interior occupant sensing system that includes a video camera disposed in each of the one or more vehicles. In response to at least the demographic information and the vehicle position information, the processor provides an output to the billboard display to selectively display advertising that is targeted to the demographic information of occupants of the one or more vehicles.
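The information the processor receives via the communication unit can be pictured as one structured message per vehicle. The sketch below assumes a JSON wire format and invented field names; the disclosure does not specify a message format.

```python
import json
from dataclasses import dataclass, asdict

# Illustrative vehicle-to-billboard message; all field names are assumptions.
@dataclass
class OccupantReport:
    vehicle_id: str
    occupant_count: int
    demographics: list          # one dict per occupant (age band, gender, ...)
    position: tuple             # (latitude, longitude) from GPS
    trip_destination: str = ""  # optional metric
    trip_duration_min: int = 0  # optional metric

    def to_wire(self) -> str:
        """Serialize for wireless transmission to the billboard's communication unit."""
        return json.dumps(asdict(self))

report = OccupantReport(
    vehicle_id="veh-001",
    occupant_count=2,
    demographics=[{"age_band": "adult"}, {"age_band": "child"}],
    position=(48.77, 9.18),
)
payload = report.to_wire()
```

A receiving display controller would parse such a payload and feed the demographics and position into its advertisement-selection logic.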
- In another embodiment, the invention provides a method of selectively displaying advertisements on a billboard display of a billboard display system to occupants in one or more vehicles approaching the billboard display located near a roadway. The method includes, in an interior occupant sensing system of a vehicle approaching the billboard display, operating a video camera to obtain video data including faces of occupants in a vehicle. The video data is provided to an electronic control unit. The method includes sensing a GPS signal and determining a vehicle position information of a vehicle, detecting a number of occupants disposed in a vehicle from the video data, and determining demographic information from the video data of the faces of occupants. The method further includes storing the vehicle position information, the number of occupants, and the demographic information of occupants in a memory of the interior occupant sensing system, and wirelessly transmitting the number of occupants, the demographic information, and the vehicle position information. Further, the method includes, in a display controller of the billboard display system, receiving the number of occupants, the demographic information of occupants, and the vehicle position information wirelessly transmitted by the interior occupant sensing system for a vehicle approaching the billboard display. The billboard display system processes the demographic information and the number of occupants to determine advertising relevant to occupants; and selectively displays relevant advertising on the billboard display to occupants in an approaching vehicle.
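The vehicle-side steps of the method above (detect faces in the video data, count occupants, classify demographics, and assemble a record for storage and wireless transmission) can be outlined as follows. The face detector and demographic classifier are placeholders, since a real system would run trained vision models on camera frames; all names are illustrative.

```python
def detect_faces(frame):
    # Placeholder detector: pretend each item in `frame` is one detected face.
    # A real implementation would locate faces in a camera image.
    return list(frame)

def classify_demographics(face):
    # Placeholder classifier returning an (age band, gender) estimate per face.
    return {
        "age_band": face.get("age", "adult"),
        "gender": face.get("gender", "unknown"),
    }

def sense_occupants(frame, gps_position):
    """Build the record that is stored in the vehicle and then transmitted."""
    faces = detect_faces(frame)
    return {
        "position": gps_position,
        "occupant_count": len(faces),
        "demographics": [classify_demographics(f) for f in faces],
    }

# A stubbed "frame" with two occupants.
frame = [{"age": "adult", "gender": "female"}, {"age": "child"}]
record = sense_occupants(frame, (48.77, 9.18))
```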
- Other aspects of the invention will become apparent by consideration of the detailed description and accompanying drawings.
- FIG. 1 is a diagram of a roadway and a billboard display system, according to some embodiments.
- FIG. 2 is a block diagram of an interior occupant sensing system, according to some embodiments.
- FIG. 3 is a block diagram of an electronic controller, according to some embodiments.
- FIG. 4 is a front view of a dashboard and a windshield of a vehicle, according to some embodiments.
- FIG. 5 is a flow chart illustrating a method for detecting data for vehicle occupants, according to some embodiments.
- FIG. 6 is a flow chart illustrating a method for detecting marketing metadata for vehicle occupants that includes speech recognition, according to some embodiments.
- FIG. 7 is a flow chart illustrating a method for interacting with a billboard display system using occupant metadata, according to some embodiments.
- Before any embodiments of the invention are explained in detail, it is to be understood that the invention is not limited in its application to the details of construction and the arrangement of components set forth in the following description or illustrated in the following drawings. The invention is capable of other embodiments and of being practiced or of being carried out in various ways.
- Also, it is to be understood that the phraseology and terminology used herein is for the purpose of description and should not be regarded as limiting. The use of “including,” “comprising” or “having” and variations thereof herein is meant to encompass the items listed thereafter and equivalents thereof as well as additional items. The terms “mounted,” “connected” and “coupled” are used broadly and encompass both direct and indirect mounting, connecting and coupling. Further, “connected” and “coupled” are not restricted to physical or mechanical connections or couplings, and can include electrical connections or couplings, whether direct or indirect. Also, electronic communications and notifications may be performed using any known means including wired connections, wireless connections, etc.
- It should also be noted that a plurality of hardware and software based devices, as well as a plurality of different structural components may be used to implement the embodiments. In addition, it should be understood that embodiments may include hardware, software, and electronic components or modules that, for purposes of discussion, may be illustrated and described as if the majority of the components were implemented solely in hardware. However, one of ordinary skill in the art, and based on a reading of this detailed description, would recognize that, in at least one embodiment, the electronic based aspects of the embodiments may be implemented in software (e.g., stored on non-transitory computer-readable medium) executable by one or more processors. As such, it should be noted that a plurality of hardware and software based devices, as well as a plurality of different structural components may be utilized to implement the embodiments. For example, “processing units” and “controllers” described in the specification can include standard processing components, such as one or more processors, one or more memory modules including non-transitory computer-readable medium, one or more input/output interfaces, and various connections (e.g., a system bus) connecting the components.
- External Billboard Display System
-
FIG. 1 illustrates a roadway 10 with vehicles 12, and a nearby billboard display system 20. The billboard display system 20 near the roadway 10 includes a display controller 22 that includes a processor 24 in communication with a memory 26 and a display driver 28. The display driver 28 connects to or communicates with a billboard display 30 located adjacent the roadway 10. The processor 24 is linked to a communication unit 34 for communicating with vehicles 12, etc. The billboard display 30 selectively displays various advertising images. In one embodiment, the billboard display 30 is a digital electronic display for providing various advertising images stored in memory 26. In some embodiments, the selected advertising images include video images, such that the image displayed on the billboard display 30 changes visually to attract attention to the billboard display 30. - The
processor 24 provides a digital image signal for advertising to the display driver 28, which provides the image to the billboard display 30 for viewing thereon. The processor 24 obtains the digital image signal from the memory 26 or from a remote cloud server 36 shown in FIG. 1 or other electronic storage device via an internet connection or via the communication unit 34. - Interior Occupant Sensing System
-
FIG. 2 illustrates an interior occupant sensing system 40. The interior occupant sensing system 40 is integrated into each vehicle 12, and includes an electronic controller 42, a video camera 44, a microphone 46, a database 48, a communications system 50, a human-machine interface (HMI) 52, a global positioning system (GPS) navigation system 54, an entertainment system 56, and other vehicle systems 58. As illustrated in FIG. 2, the components of the interior occupant sensing system 40 are in communication with the electronic controller 42. - The
electronic controller 42 shown in FIG. 2 may be dedicated to the interior occupant sensing system 40, or may also be part of another vehicle operating system. In some embodiments, the electronic controller 42 is integrated with the video camera 44 and microphone 46 into a single device. In other embodiments, the electronic controller 42 is part of a multi-camera system that includes interior and exterior cameras. In some embodiments, the interior occupant sensing system 40 includes more than one camera. The video camera 44 is a digital video camera in one embodiment. The video camera 44 is positioned to view the interior of the vehicle and the occupants (e.g., the driver and one or more passengers) of the vehicle. The electronic controller 42 is configured to receive and process images or video data from the video camera 44. The microphone 46 is positioned in the interior of the vehicle and is configured to sense or detect sound (including voices), convert the sound or audio signal to audio data, and provide the audio data to the electronic controller 42. The electronic controller 42 is configured to receive and process the audio data from the microphone 46. The microphone 46 may be stand alone or it may be part of another vehicle system (e.g., a hands-free cellular system). - The
database 48 shown in FIG. 2 electronically stores information regarding the interior occupant sensing system 40, including metadata of occupants, according to the methods described herein. The electronic controller 42 is configured to read and write such information to and from the database 48. In the illustrated embodiment, the database 48 is accessible by the electronic controller 42. In alternative embodiments, the database 48 may be located in the electronic controller 42, or housed on a suitable database server or other system external to the interior occupant sensing system 40 and accessible over one or more intervening networks. - The
communications system 50 shown in FIG. 2 includes hardware and software components that allow the electronic controller 42 to communicate wirelessly using one or more modalities from the group consisting of: cellular data, vehicle-to-everything (V2X) communications, and Wi-Fi. Other communication arrangements and protocols are also contemplated. The communications system 50 enables the electronic controller 42 to communicate with other systems using public data networks (e.g., the Internet). - The
HMI 52 shown in FIG. 2 provides an interface between the interior occupant sensing system 40 and the occupants of the vehicle 12. The HMI 52 is coupled to the electronic controller 42 and is configured to receive inputs from the occupants, receive data from the controller, and provide warnings or other information to the occupants based on the data. The HMI 52 includes suitable input and output mechanisms, including, for example, buttons, a touch-screen display having a graphical user interface (GUI), voice recognition, etc. The GPS navigation system 54 includes hardware and software for locating the vehicle using signals from the global positioning system and for plotting routes for the driver. The entertainment system 56 includes hardware (e.g., a display screen) and software configured to provide video and audio entertainment content to occupants of the vehicle. In some embodiments, the entertainment content includes video content streamed through the communications system 50. The electronic controller 42 is also coupled to other vehicle systems 58, including, for example, steering, braking, and automated vehicle control systems. - As illustrated in
FIG. 3, in one embodiment, the electronic controller 42 includes an electronic processing unit 60 (e.g., a microprocessor or another suitable programmable device), a non-transitory memory 64 (e.g., a computer-readable storage medium), and an input/output interface 68. In one embodiment, the input/output interface 68 includes one or more control or data buses enabling the electronic processing unit 60 to communicate with the various devices illustrated in FIG. 2. The input/output interface 68 provides an electrical connection over a data bus or a wired, wireless, or optical connection that enables the devices shown in FIG. 2 to communicate using network communications protocols, for example, the Ethernet, CAN, or FlexRay protocol. - The
memory 64 can include a program storage area (e.g., read only memory (ROM)), a data storage area (e.g., random access memory (RAM)), and another non-transitory computer-readable medium. The electronic processing unit 60 executes software stored in the memory 64. The software may include instructions and algorithms for performing methods as described herein. - The input/
output interface 68 shown in FIG. 3 receives inputs and provides outputs to and from systems external to the electronic controller 42, including the devices and systems shown in FIG. 2. In some embodiments, the electronic controller 42 includes a plurality of electrical and electronic components that provide power, operational control, and protection to the components and modules within the electronic controller 42. It should be understood that the electronic controller 42 may include additional, fewer, or different components. - In one embodiment, the
electronic controller 42 is configured to perform machine learning functions. The memory 64 stores one or more learning engines executable by the electronic processing unit 60 to process data received from the video camera 44 and the microphone 46, and develop demographic metadata on the occupants of the vehicle. Machine learning generally refers to the ability of a computer application to learn without being explicitly programmed. In particular, a computer application performing machine learning (sometimes referred to as a learning engine) is configured to develop an algorithm based on training data. For example, to perform supervised learning, the training data includes example inputs and corresponding desired (e.g., actual) outputs, and the learning engine progressively develops a model that maps inputs to the outputs included in the training data. Machine learning can be performed using various types of methods and mechanisms including, but not limited to, decision tree learning, association rule learning, artificial neural networks, inductive logic programming, support vector machines, clustering, Bayesian networks, reinforcement learning, representation learning, similarity and metric learning, sparse dictionary learning, and genetic algorithms. - In some embodiments, the
electronic controller 42 accesses one or more sources of training data, predictive models, or learning engines through one or more communication networks, such as the Internet and other public and private networks. Alternatively or in addition, the electronic controller 42 stores training data in the database 48 or in the memory 64. -
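As a toy illustration of such a learning engine, the sketch below trains a nearest-centroid model from labeled examples and then classifies a new input, following the supervised-learning pattern described above. The two numeric "face features" and the age-band labels are invented for the example; a production engine would use one of the methods listed in the specification.

```python
from statistics import mean

def train(samples):
    """samples: list of (feature_vector, label). Returns per-label centroids."""
    by_label = {}
    for features, label in samples:
        by_label.setdefault(label, []).append(features)
    # Centroid = per-dimension mean of all training vectors with that label.
    return {label: tuple(mean(col) for col in zip(*rows))
            for label, rows in by_label.items()}

def predict(centroids, features):
    """Classify by returning the label of the nearest centroid."""
    def dist2(label):
        return sum((a - b) ** 2 for a, b in zip(features, centroids[label]))
    return min(centroids, key=dist2)

# Invented training data: (feature_1, feature_2) -> age band.
training_data = [
    ((0.2, 0.9), "child"), ((0.3, 0.8), "child"),
    ((0.8, 0.2), "adult"), ((0.9, 0.3), "adult"),
]
model = train(training_data)
print(predict(model, (0.85, 0.25)))  # adult
```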
FIG. 4 is a partial perspective view of a front interior portion 70 of a vehicle 12 that includes a dashboard 72, a windshield 74, and a roof 76. A video camera 44 is mounted to the roof 76 near the rear view mirror 78 to obtain video data within the vehicle. Further, FIG. 4 shows the human-machine interface 52, such as a touch screen. - Operation of Interior Occupant Sensing System
- For the
billboard display system 20 to operate properly, information from at least one of the vehicles 12 shown in FIG. 1 must be provided to the billboard display system. Thus, the interior occupant sensing system 40 for each vehicle operates as follows. - As shown in the images and
flow chart 100 of FIG. 5, four faces 104 of occupants are sensed by video camera 44 disposed in a vehicle 12 and provided as video data to the electronic processing unit 60. Using machine learning occupant detection algorithms, the electronic processing unit 60 determines the number, pose and location of each occupant that is detected (step 102). For each face, demographic information and markers are estimated or determined (step 106) by the electronic processing unit 60 executing classifier algorithms. Determining demographic information includes utilizing the video data or video images for a given occupant's face to 1) classify age, 2) classify gender and 3) classify ethnicity and/or race. In some embodiments, other classification information is obtained. The demographic information is determined for each individual occupant. - Upon determining the demographic information from video images of faces of occupants, the
electronic processing unit 60 further determines emotive responses (step 110). The emotive responses include 1) gaze estimation, including where the face is looking, and 2) current emotions, such as a smile, frown, or laugh. Changes in the occupants' emotive responses may be caused by stimuli from the operation of the vehicle itself or by external stimuli. For instance, an occupant might display happiness when the vehicle accelerates, or display anxiety when a warning is issued by the HMI 52. Besides determining emotions, the electronic processing unit 60 is configured to store the emotions and the exact time that the emotions or emotive responses occur in the memory 64. - Thereafter, the
face 104 of an occupant is tracked or followed within the video image (step 114). Tracking faces 104 of occupants enables the interior occupant sensing system 40 to determine emotions of occupants and their emotive responses to stimuli, such as advertising. That way, a face 104 that has been classified is followed and reclassification is not necessarily required. - The
electronic processing unit 60 is further configured to combine the demographic information with other vehicle information (i.e., GPS location, duration of trip, passengers entering/exiting, etc.) (step 116). Other vehicle information may include an intended destination for the vehicle, and the make and model of the vehicle. Further, the clothing worn by the occupants may be utilized in classifying the occupants and predicting what advertising to provide thereto. - The
electronic processing unit 60 is configured to improve classifier estimates as the occupant is tracked over time (step 120). Influences such as lighting, occupant movement, and occupant clothing may vary over time, causing the electronic processing unit 60 to make different demographic estimations over time. The electronic processing unit 60 uses machine learning algorithms to improve its estimates by recognizing trends or eliminating outliers. This continuous improvement ensures the most reliable metadata. - The
electronic processing unit 60 is configured to store the demographic information, the vehicle position information, the trip destination information, the trip duration information and additional information as metadata in the database 48 (step 124). The electronic processing unit 60 classifies the data as occupant specific data (for example, based on face, and as set forth below, voice recognition algorithms); generic demographic data tied to a specific vehicle at a specific time and location; or as generic metadata that is not tied to any individual or vehicle, but aggregated to extract demographic patterns of vehicle use and transportation. Depending on the application, this allows for either specific customization of the application to the user, or broad statistics based applications that can ensure individual privacy. - As illustrated in the flow chart 140 of
FIG. 6, in some embodiments, the microphone 46 collects occupant speech to better ascertain occupant emotive state and reactions to external events. The electronic processing unit 60 is configured to perform speech recognition on audio data received from the microphone 46 (step 144). Key words or phrases are then identified (step 148). Thereafter, the electronic processing unit 60 is further configured to match the speaker to a face (step 152). Such matching relies on estimating which occupant's face is moving or speaking and comparing that movement with the key word. This key word and specific occupant information is stored as metadata in the database 48 (step 160) along with the demographic information and other metadata for each specific user. Further, the metadata is wirelessly transmitted via the communications system 50 to the communication unit 34 of the billboard display system 20. - Moreover, the interior
occupant sensing system 40 is capable of providing dense profile models. In one exemplary embodiment, the electronic controller 42 is configured to fuse detected persons with their demographic information, their emotive response, and the speech recognition of key words. Collecting occupant data, along with vehicle dynamics, provides market researchers a real-time response of an occupant to an advertisement displayed on the billboard display system. - As illustrated in the
flow chart 200 of FIG. 7, in one embodiment the interior occupant sensing system 40 interacts with the billboard display system 20 using occupant data, such as metadata. For example, data on the number of faces in a video image corresponding to the number of occupants, the occupant demographics, and other metrics (e.g., trip destinations and durations) are wirelessly transmitted by the communications system 50 via V2X to an upcoming communication unit 34 of a billboard display system 20. - More specifically, as shown in
FIG. 7, a vehicle 12 enters the zone of the billboard display system 20 (step 202). The communications system 50 of the interior occupant sensing system 40 provides wireless transmission or wireless communication (step 206) with the communication unit 34 of the billboard display system 20. For communication to occur, the vehicle 12 must be within broadcasting range of the billboard display system 20. The wireless transmission includes the vehicle position information of the vehicle located on a roadway relative to the billboard display 30, as determined from GPS signals. - The
display controller 22 of the billboard display system 20 analyzes a population of occupants approaching the billboard display 30 from one or more approaching vehicles (step 210). Further, the display controller 22 is configured to analyze the metrics (step 212) from one or more vehicles approaching the billboard display 30. Besides the number of occupants and the demographic information (214) of occupants in multiple vehicles that are approaching the billboard display 30, along with trip destinations (216) and trip durations (218) of the multiple vehicles when the trip data is available, the metrics may include other factors. The type of vehicle, the clothing worn by occupants, and the time of day are also provided as metrics in some embodiments. - Utilizing the analyzed metrics, the
display controller 22 determines and displays a targeted advertisement (step 220). The targeted advertisement is considered most applicable to the population consisting of the occupants of each of the vehicles in the field of view of the billboard display 30. As set forth above, a targeted advertisement is based on the various factors and on the time of day. For instance, in one embodiment, advertising for hotels or motels is provided in the mid or late evening of a long trip. At times close to typical mealtimes, advertising is provided for approaching restaurants. The type of restaurants or establishments displayed in advertising is also based on the ages of the occupants. In conclusion, the various metrics interactively assist in the selective display of advertising on the billboard display 30. - Upon the display of advertising for a limited time based on the metrics, the
billboard display system 20 queries the approachingvehicles 12 for additional or updated metrics (step 230). The updated metrics include impression counts (232), emotive responses (234), and gaze of eyes (236) of the occupants of the approaching vehicles. The interioroccupant sensing system 40 is configured to communicate the impression counts, the emotional states, any emotive responses and gaze of eyes of occupants to the billboard display system (step 240), along with additional metrics. - The
display controller 22 executes a program using the above information to target or select a future or immediate advertisement that is more likely relevant or useful to occupants of one or more vehicles approaching thebillboard display 30. Thedisplay controller 22 is configured to process the trip destination information and/or the trip duration information, in combination with the demographic information and number of occupants, to determine advertising relevant to occupants. In one embodiment, the emotional state of the occupants assists in selection of advertising. For occupants in a positive mood, the advertising is different than the advertising for occupants in an angry or bad mood. - Further, the
display controller 22 is configured to track impression counts for current advertising. Using occupant gaze estimations, advertisers accurately analyze and gauge the effectiveness of various advertising schemes. Analyzing the effectiveness from the impression counts maximizes the relevance of the advertising to be displayed in the future to particular occupants of one or more vehicles. - Other metrics, can be used to target advertisements. In one operation, if the
billboard display system 20 is informed that a vehicle's fuel level is low, advertising for nearby gas stations appears. In another scenario, child-friendly venues advertise using thebillboard display 30 tovehicles 12 carrying children as determined from the demographic information. - In one embodiment, at least one of impression counts, gaze estimation and emotive responses are collected and transmitted to advertisers remotely, such as to the
remote cloud server 36, to gauge the success of advertisements provided on thebillboard display 30. Thebillboard display system 20 is in communication over the internet or other communication with thecloud server 36. Thus, thebillboard display system 20 provides demographic information to thecloud server 36 and receives advertising for viewing on thebillboard display 30. - In some implementations, the above described system is controlled using at least one
display controller 22. Thedisplay controller 22 includes one or more processing units (e.g., a processor, application specific integrated circuits (“ASIC”), etc.). Likewise, the vehicleelectronic controller 42 can include one or more processing units as listed above. - The
memory 26 of thedisplay controller 22 and thememory 64 of the vehicleelectronic controller 42 are non-transitory computer-readable medium in one embodiment. Both random access memory (RAM) and read only memory (ROM) are contemplated. - Besides the
display controller 22, theelectronic controller 42 uses the input/output interface 68 for sending and receiving information from one or more sensors or systems external to the electronic controller 42 (e.g., over a vehicle communication bus, such as a controller area network (CAN) bus or a FlexRay bus). In some implementations, theelectronic controller 42 can also include one or more additional internal sensors or systems. - The
billboard display 30 is an electronic billboard display that displays the electronic images from thedisplay driver 28. Thebillboard display 30 is a video screen of plasma, LCD, LED or other digital light emitting arrangements. In some embodiments, a loudspeaker or sound output is also provided. - While at least one
video camera 44 is contemplated, a plurality of video cameras are provided to obtain video data to detect occupants in some embodiments. - In referencing a situation wherein two vehicles are approaching the
billboard display system 20, one vehicle can be considered a first vehicle having a first interior occupant sensing system with a first video camera and another vehicle can be considered a second vehicle having a second interior occupant sensing system with a second video camera. Of course, third, fourth, fifth, etc. vehicles having interior occupant sensing systems are also contemplated. - While
FIG. 1 shows vehicles in one direction approaching thebillboard display 30 receiving advertising, in another embodiment vehicles traveling in opposing directions view advertising messages on respective sides of thebillboard display 30. - With the availability of an
interior video camera 44 primarily used for occupant sensing, additional demographic data of each occupant in avehicle 12 can be extracted from a camera image. This demographic metadata can be stored, analyzed and fused with a variety of other sensors and vehicle states to provide dense demographic profiles of drivers and passengers. This data is useful for marketers and advertisers. - Thus, the invention provides, among other things, methods for operating a
billboard display system 20 to provide advertising directed to the occupants ofvehicles 12 approaching abillboard display 30 and for operating an interioroccupant sensing system 40 to provide information or data thereto.
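The vehicle-to-billboard message exchange described above (steps 206, 230 and 240) can be sketched as a simple payload builder. The JSON schema, field names, and helper function below are illustrative assumptions for this sketch; the specification does not define a message format.

```python
import json


def build_metrics_payload(vehicle_id, gps_position, occupants, trip=None,
                          impressions=None):
    """Assemble a message an interior occupant sensing system might transmit
    to the billboard display system: position, occupant count, demographics,
    and, when queried for updated metrics, impression counts, emotive
    responses, and gaze data. (Hypothetical schema, not from the patent.)"""
    payload = {
        "vehicle_id": vehicle_id,
        "position": gps_position,  # (lat, lon) of the vehicle on the roadway
        "occupant_count": len(occupants),
        "demographics": [{"age": o["age"], "gender": o.get("gender")}
                         for o in occupants],
    }
    if trip:  # trip data is only included when available
        payload["trip_destination"] = trip.get("destination")
        payload["trip_duration_hours"] = trip.get("duration_hours")
    if impressions:  # updated metrics returned in response to a query
        payload["impression_count"] = impressions.get("count", 0)
        payload["emotive_responses"] = impressions.get("emotions", [])
        payload["gaze_on_billboard"] = impressions.get("gaze", 0)
    return json.dumps(payload)
```

Serializing to JSON keeps the sketch transport-agnostic: the same string could travel over a cellular link or a direct broadcast channel.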
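The metric-driven selection of a targeted advertisement (steps 210 through 220, plus the fuel-level and child-occupant examples) amounts to a small rule engine. The class name, field names, thresholds, and ad categories below are illustrative assumptions, not values from the specification.

```python
from dataclasses import dataclass


@dataclass
class VehicleMetrics:
    # Metrics reported by one approaching vehicle's interior occupant
    # sensing system (field names are hypothetical).
    occupant_count: int
    occupant_ages: list
    trip_duration_hours: float = 0.0
    fuel_level_low: bool = False


def select_advertisement(vehicles, hour_of_day):
    """Pick the ad category most applicable to the whole approaching
    population, combining demographics, trip duration, and time of day."""
    ages = [a for v in vehicles for a in v.occupant_ages]
    # A reported low fuel level takes priority: advertise nearby gas stations.
    if any(v.fuel_level_low for v in vehicles):
        return "gas_station"
    # Mid or late evening on a long trip: advertise hotels or motels.
    if hour_of_day >= 20 and any(v.trip_duration_hours > 4 for v in vehicles):
        return "hotel"
    # Near typical mealtimes: advertise restaurants, chosen by occupant ages.
    if hour_of_day in (12, 13, 18, 19):
        return "family_restaurant" if any(a < 12 for a in ages) else "restaurant"
    return "general"
```

For example, a vehicle six hours into a trip passing at 9 p.m. would be shown hotel advertising, while the same vehicle at noon, carrying a child, would be shown a child-friendly restaurant.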
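Tracking impression counts and gaze estimations to gauge advertising effectiveness (steps 230 and 240) can be sketched as a per-advertisement tally with a gaze-through rate. This is a minimal illustration of the idea, not the patented method; the class and its scoring are assumptions.

```python
from collections import defaultdict


class ImpressionTracker:
    """Accumulates impression counts and gaze hits per advertisement so the
    display controller can prefer ads that occupants actually look at."""

    def __init__(self):
        self.impressions = defaultdict(int)  # occupants an ad was shown to
        self.gaze_hits = defaultdict(int)    # occupants whose gaze reached it

    def record(self, ad_id, occupant_count, gazed_count):
        # Called once per passing vehicle after its sensing system reports back.
        self.impressions[ad_id] += occupant_count
        self.gaze_hits[ad_id] += gazed_count

    def effectiveness(self, ad_id):
        # Gaze-through rate: fraction of exposed occupants who looked.
        shown = self.impressions[ad_id]
        return self.gaze_hits[ad_id] / shown if shown else 0.0

    def best_ad(self, candidates):
        # Prefer the candidate ad with the highest rate observed so far.
        return max(candidates, key=self.effectiveness)
```

Feeding these scores back into selection is what lets future advertising converge toward what particular occupant populations respond to.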
Claims (17)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/751,327 US20180232770A1 (en) | 2015-09-04 | 2016-09-02 | Billboard display and method for selectively displaying advertisements by sensing demographic information of occupants of vehicles |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562214416P | 2015-09-04 | 2015-09-04 | |
PCT/US2016/050106 WO2017040924A1 (en) | 2015-09-04 | 2016-09-02 | Billboard display and method for selectively displaying advertisements by sensing demographic information of occupants of vehicles |
US15/751,327 US20180232770A1 (en) | 2015-09-04 | 2016-09-02 | Billboard display and method for selectively displaying advertisements by sensing demographic information of occupants of vehicles |
Publications (1)
Publication Number | Publication Date |
---|---|
US20180232770A1 true US20180232770A1 (en) | 2018-08-16 |
Family
ID=56926335
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/751,323 Active 2037-08-17 US10970747B2 (en) | 2015-09-04 | 2016-09-02 | Access and control for driving of autonomous vehicle |
US15/751,327 Abandoned US20180232770A1 (en) | 2015-09-04 | 2016-09-02 | Billboard display and method for selectively displaying advertisements by sensing demographic information of occupants of vehicles |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/751,323 Active 2037-08-17 US10970747B2 (en) | 2015-09-04 | 2016-09-02 | Access and control for driving of autonomous vehicle |
Country Status (5)
Country | Link |
---|---|
US (2) | US10970747B2 (en) |
EP (2) | EP3345148A1 (en) |
JP (2) | JP6643461B2 (en) |
CN (2) | CN107924528A (en) |
WO (2) | WO2017040924A1 (en) |
Families Citing this family (69)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9972054B1 (en) | 2014-05-20 | 2018-05-15 | State Farm Mutual Automobile Insurance Company | Accident fault determination for autonomous vehicles |
US11669090B2 (en) | 2014-05-20 | 2023-06-06 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle operation feature monitoring and evaluation of effectiveness |
US10373259B1 (en) | 2014-05-20 | 2019-08-06 | State Farm Mutual Automobile Insurance Company | Fully autonomous vehicle insurance pricing |
US10354330B1 (en) | 2014-05-20 | 2019-07-16 | State Farm Mutual Automobile Insurance Company | Autonomous feature use monitoring and insurance pricing |
US10599155B1 (en) | 2014-05-20 | 2020-03-24 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle operation feature monitoring and evaluation of effectiveness |
US10475127B1 (en) | 2014-07-21 | 2019-11-12 | State Farm Mutual Automobile Insurance Company | Methods of providing insurance savings based upon telematics and insurance incentives |
US20210118249A1 (en) | 2014-11-13 | 2021-04-22 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle salvage and repair |
US20210272207A1 (en) | 2015-08-28 | 2021-09-02 | State Farm Mutual Automobile Insurance Company | Vehicular driver profiles and discounts |
US10386835B2 (en) * | 2016-01-04 | 2019-08-20 | GM Global Technology Operations LLC | System and method for externally interfacing with an autonomous vehicle |
US10134278B1 (en) | 2016-01-22 | 2018-11-20 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle application |
US10324463B1 (en) | 2016-01-22 | 2019-06-18 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle operation adjustment based upon route |
US11719545B2 (en) | 2016-01-22 | 2023-08-08 | Hyundai Motor Company | Autonomous vehicle component damage and salvage assessment |
US10747234B1 (en) | 2016-01-22 | 2020-08-18 | State Farm Mutual Automobile Insurance Company | Method and system for enhancing the functionality of a vehicle |
US11242051B1 (en) | 2016-01-22 | 2022-02-08 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle action communications |
US10395332B1 (en) | 2016-01-22 | 2019-08-27 | State Farm Mutual Automobile Insurance Company | Coordinated autonomous vehicle automatic area scanning |
US11441916B1 (en) | 2016-01-22 | 2022-09-13 | State Farm Mutual Automobile Insurance Company | Autonomous vehicle trip routing |
WO2017138935A1 (en) * | 2016-02-10 | 2017-08-17 | Harman International Industries, Incorporated | Systems and methods for vehicle assistance |
US10365932B2 (en) * | 2017-01-23 | 2019-07-30 | Essential Products, Inc. | Dynamic application customization for automated environments |
JP2018188029A (en) * | 2017-05-09 | 2018-11-29 | オムロン株式会社 | Stop intention determination device and stop intention determination method |
CN108876944A (en) * | 2017-05-12 | 2018-11-23 | 阿里巴巴集团控股有限公司 | A kind of method and apparatus of vehicle-mounted payment |
JP6981051B2 (en) * | 2017-06-06 | 2021-12-15 | 株式会社デンソー | Information processing equipment, information processing system, information processing method and information processing program |
WO2018227297A1 (en) * | 2017-06-14 | 2018-12-20 | Signalisation Kalitec Inc. | Intelligent roadway sign and related method |
JP6852790B2 (en) * | 2017-06-23 | 2021-03-31 | 株式会社村田製作所 | Position estimation system |
JP6785968B2 (en) * | 2017-07-07 | 2020-11-18 | 三菱電機株式会社 | Automatic parking support device and automatic parking support method |
US10416671B2 (en) | 2017-07-11 | 2019-09-17 | Waymo Llc | Methods and systems for vehicle occupancy confirmation |
CA3027627C (en) | 2017-07-13 | 2021-08-10 | Beijing Didi Infinity Technology And Development Co., Ltd. | Systems and methods for trajectory determination |
JP6930274B2 (en) * | 2017-08-08 | 2021-09-01 | トヨタ自動車株式会社 | Digital signage control device, digital signage control method, program, recording medium |
US10395457B2 (en) * | 2017-08-10 | 2019-08-27 | GM Global Technology Operations LLC | User recognition system and methods for autonomous vehicles |
JP2019036048A (en) * | 2017-08-10 | 2019-03-07 | トヨタ自動車株式会社 | Information providing device, method for providing information, program, and recording medium |
JP6888474B2 (en) * | 2017-08-10 | 2021-06-16 | トヨタ自動車株式会社 | Digital signage control device, digital signage control method, program, recording medium |
USD886694S1 (en) * | 2017-08-11 | 2020-06-09 | Trifo, Inc. | Autonomous vehicle sensor housing |
DE102017008084A1 (en) * | 2017-08-25 | 2019-02-28 | Daimler Ag | Procedure for granting access and driving authorization |
EP3470947B1 (en) * | 2017-10-12 | 2021-07-28 | Volvo Car Corporation | Method and system for guiding an autonomous vehicle |
DE102017222884A1 (en) | 2017-12-15 | 2019-06-19 | Zf Friedrichshafen Ag | Method for operating a motor vehicle |
US11080753B2 (en) | 2018-02-27 | 2021-08-03 | Ad Connected, Inc. | Apparatus and method for using connected vehicles as an advertisement platform |
US11252545B2 (en) | 2018-01-17 | 2022-02-15 | Ad Connected, Inc. | Apparatus and method for using connected vehicles as an advertisement platform |
US20190222885A1 (en) * | 2018-01-17 | 2019-07-18 | Ad Connected, Inc. | Apparatus and method for delivering advertisement content to connected vehicles |
CN112292706A (en) * | 2018-01-17 | 2021-01-29 | 广告连接有限公司 | Apparatus and method for delivering advertising content to connected vehicles |
US10838425B2 (en) | 2018-02-21 | 2020-11-17 | Waymo Llc | Determining and responding to an internal status of a vehicle |
JP7149724B2 (en) * | 2018-03-29 | 2022-10-07 | 株式会社Subaru | vehicle controller |
CN110411398A (en) * | 2018-04-27 | 2019-11-05 | 北京林业大学 | A kind of angle gauge measures the technical method of natural Over stand wood tree age |
US10573104B2 (en) * | 2018-06-29 | 2020-02-25 | Robert Bosch Gmbh | Ultra-wideband based vehicle access system and communication protocol for localization of a target device |
KR102545356B1 (en) * | 2018-08-07 | 2023-06-20 | 현대모비스 주식회사 | Apparatus for exchanging control right of autonomous vehicle and method thereof |
US10739150B2 (en) * | 2018-08-21 | 2020-08-11 | GM Global Technology Operations LLC | Interactive routing information between users |
JP7040360B2 (en) * | 2018-08-24 | 2022-03-23 | トヨタ自動車株式会社 | Information processing equipment, information processing methods and programs |
US11009763B2 (en) | 2018-08-30 | 2021-05-18 | OE Solutions Co., Ltd. | Method and apparatus for mitigating adverse effects of bonding wire of external optical modulators |
WO2020073147A1 (en) * | 2018-10-08 | 2020-04-16 | Qualcomm Incorporated | Vehicle entry detection |
CN111079474A (en) | 2018-10-19 | 2020-04-28 | 上海商汤智能科技有限公司 | Passenger state analysis method and device, vehicle, electronic device, and storage medium |
US11030655B2 (en) | 2018-11-02 | 2021-06-08 | International Business Machines Corporation | Presenting targeted content to vehicle occupants on electronic billboards |
DE102018220491A1 (en) | 2018-11-28 | 2020-06-10 | Zf Friedrichshafen Ag | Method for operating a motor vehicle |
US10729378B2 (en) * | 2018-11-30 | 2020-08-04 | Toyota Motor North America, Inc. | Systems and methods of detecting problematic health situations |
JP7062584B2 (en) * | 2018-12-19 | 2022-05-06 | Kddi株式会社 | Advertising media auction device |
DE102018133453A1 (en) | 2018-12-21 | 2020-06-25 | Volkswagen Aktiengesellschaft | Method and device for monitoring an occupant of a vehicle |
DE102018133445A1 (en) * | 2018-12-21 | 2020-06-25 | Volkswagen Aktiengesellschaft | Method and device for monitoring an occupant of a vehicle and system for analyzing the perception of objects |
CN109808697B (en) * | 2019-01-16 | 2021-09-07 | 北京百度网讯科技有限公司 | Vehicle control method, device and equipment |
US10988115B2 (en) | 2019-02-11 | 2021-04-27 | Ford Global Technologies, Llc | Systems and methods for providing vehicle access using biometric data |
JP7196683B2 (en) * | 2019-02-25 | 2022-12-27 | トヨタ自動車株式会社 | Information processing system, program, and control method |
US11094027B2 (en) * | 2019-04-05 | 2021-08-17 | GM Global Technology Operations LLC | System and method to establish primary and secondary control of rideshare experience features |
CN110473017A (en) * | 2019-08-15 | 2019-11-19 | 恩亿科(北京)数据科技有限公司 | A kind of determination method of advertisement placement method, advertisement serving policy |
US11750605B2 (en) * | 2019-08-21 | 2023-09-05 | Texas Instruments Incorporated | Identity validation using Bluetooth fingerprinting authentication |
CN110597383A (en) * | 2019-08-21 | 2019-12-20 | 北京梧桐车联科技有限责任公司 | Information processing method and device, vehicle and storage medium |
US11341781B2 (en) | 2019-10-18 | 2022-05-24 | Toyota Motor Engineering And Manufacturing North America, Inc. | Vehicular communications through identifiers and online systems |
CN110861635B (en) * | 2019-11-15 | 2022-01-07 | 安徽省阜阳市好希望工贸有限公司 | Reminding method and device for safety seat |
DE102019219757A1 (en) * | 2019-12-16 | 2021-06-17 | Volkswagen Aktiengesellschaft | Driver assistance system for providing environmental data for a motor vehicle |
US10926738B1 (en) * | 2019-12-17 | 2021-02-23 | Robert Bosch Gmbh | Method and system for self-learning radio node positions within a vehicle structure |
US11445342B2 (en) * | 2020-03-27 | 2022-09-13 | Qualcomm Incorporated | Enhancements to pedestrian to vehicles (P2V) communications |
JP7415852B2 (en) * | 2020-08-25 | 2024-01-17 | トヨタ自動車株式会社 | Control device, system, program, and advertisement display method |
US11650065B2 (en) | 2020-11-05 | 2023-05-16 | Ford Global Technologies, Llc | Systems and methods for using in-vehicle voce recognition, IoT sensors and vehicle state data for augmenting car-generated GPS/location-based data for predicting travel patterns |
DE102022122441B3 (en) | 2022-09-05 | 2023-09-21 | Audi Aktiengesellschaft | Motor vehicle and method for providing a protective function for the motor vehicle |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080089288A1 (en) * | 2006-10-12 | 2008-04-17 | Bellsouth Intellectual Property Corporation | Methods, systems, and computer program products for providing advertising and/or information services over mobile ad hoc cooperative networks using electronic billboards and related devices |
US20080298562A1 (en) * | 2007-06-04 | 2008-12-04 | Microsoft Corporation | Voice aware demographic personalization |
US7921036B1 (en) * | 2002-04-30 | 2011-04-05 | Videomining Corporation | Method and system for dynamically targeting content based on automatic demographics and behavior analysis |
US20150141043A1 (en) * | 2013-08-23 | 2015-05-21 | Cellepathy Ltd. | Corrective navigation instructions |
US9293042B1 (en) * | 2014-05-19 | 2016-03-22 | Allstate Insurance Company | Electronic display systems connected to vehicles and vehicle-based systems |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3757584B2 (en) * | 1997-11-20 | 2006-03-22 | 株式会社富士通ゼネラル | Advertising effect confirmation system |
US20030233305A1 (en) * | 1999-11-01 | 2003-12-18 | Neal Solomon | System, method and apparatus for information collaboration between intelligent agents in a distributed network |
JP3908425B2 (en) * | 1999-12-24 | 2007-04-25 | アルパイン株式会社 | Navigation device |
US7084385B2 (en) | 2003-08-19 | 2006-08-01 | Autoliv Asp, Inc. | Range discriminating optical sensor having a wide angle lens with a fixed focal length |
US20070050248A1 (en) * | 2005-08-26 | 2007-03-01 | Palo Alto Research Center Incorporated | System and method to manage advertising and coupon presentation in vehicles |
US20070124762A1 (en) * | 2005-11-30 | 2007-05-31 | Microsoft Corporation | Selective advertisement display for multimedia content |
US20080027599A1 (en) * | 2006-07-28 | 2008-01-31 | James Logan | Autonomous vehicle and systems and methods for the operation thereof |
JP5340123B2 (en) * | 2009-12-02 | 2013-11-13 | 株式会社日立国際電気 | In-vehicle monitoring system, in-vehicle monitoring method, imaging apparatus, and imaging method |
JP5834232B2 (en) | 2011-01-17 | 2015-12-16 | パナソニックIpマネジメント株式会社 | Captured image recognition apparatus, captured image recognition system, and captured image recognition method |
US9928524B2 (en) | 2011-03-14 | 2018-03-27 | GM Global Technology Operations LLC | Learning driver demographics from vehicle trace data |
WO2013025803A1 (en) | 2011-08-17 | 2013-02-21 | Eyal Shlomot | Smart electronic roadside billboard |
US20130060642A1 (en) * | 2011-09-01 | 2013-03-07 | Eyal Shlomot | Smart Electronic Roadside Billboard |
US9037852B2 (en) * | 2011-09-02 | 2015-05-19 | Ivsc Ip Llc | System and method for independent control of for-hire vehicles |
US8527146B1 (en) | 2012-01-30 | 2013-09-03 | Google Inc. | Systems and methods for updating vehicle behavior and settings based on the locations of vehicle passengers |
US9177246B2 (en) * | 2012-06-01 | 2015-11-03 | Qualcomm Technologies Inc. | Intelligent modular robotic apparatus and methods |
US9638537B2 (en) * | 2012-06-21 | 2017-05-02 | Cellepathy Inc. | Interface selection in navigation guidance systems |
US20140094987A1 (en) * | 2012-09-28 | 2014-04-03 | Intel Corporation | Tiered level of access to a set of vehicles |
US9311564B2 (en) | 2012-10-05 | 2016-04-12 | Carnegie Mellon University | Face age-estimation and methods, systems, and software therefor |
US8825258B2 (en) | 2012-11-30 | 2014-09-02 | Google Inc. | Engaging and disengaging for autonomous driving |
US10636046B2 (en) | 2013-03-13 | 2020-04-28 | Ford Global Technologies, Llc | System and method for conducting surveys inside vehicles |
US9702349B2 (en) * | 2013-03-15 | 2017-07-11 | ClearMotion, Inc. | Active vehicle suspension system |
US20140278841A1 (en) | 2013-03-15 | 2014-09-18 | Ron Natinsky | Remote sensing of vehicle occupancy |
US20140327752A1 (en) | 2013-05-01 | 2014-11-06 | Nissan North America, Inc. | Vehicle occupancy detection system |
EP2992692B1 (en) * | 2013-05-04 | 2018-08-29 | DECHARMS, Christopher | Mobile security technology |
JP6403776B2 (en) * | 2013-08-19 | 2018-10-10 | ビーエーエスエフ ソシエタス・ヨーロピアBasf Se | Optical detector |
US20150242944A1 (en) * | 2013-09-20 | 2015-08-27 | Eugene S. Willard | Time dependent inventory asset management system for industries having perishable assets |
US20160301698A1 (en) * | 2013-12-23 | 2016-10-13 | Hill-Rom Services, Inc. | In-vehicle authorization for autonomous vehicles |
US20150235538A1 (en) | 2014-02-14 | 2015-08-20 | GM Global Technology Operations LLC | Methods and systems for processing attention data from a vehicle |
US9205805B2 (en) * | 2014-02-14 | 2015-12-08 | International Business Machines Corporation | Limitations on the use of an autonomous vehicle |
US10139824B2 (en) * | 2014-04-30 | 2018-11-27 | Mico Latta Inc. | Automatic driving vehicle and program for automatic driving vehicle |
US20150379782A1 (en) * | 2014-06-26 | 2015-12-31 | Alpine Electronics, Inc. | Method of automatically adjusting toll collection information based on a number of occupants in a vehicle |
US9547985B2 (en) * | 2014-11-05 | 2017-01-17 | Here Global B.V. | Method and apparatus for providing access to autonomous vehicles based on user context |
US10021254B2 (en) * | 2015-01-22 | 2018-07-10 | Verizon Patent And Licensing Inc. | Autonomous vehicle cameras used for near real-time imaging |
US9552564B1 (en) * | 2015-03-19 | 2017-01-24 | Amazon Technologies, Inc. | Autonomous delivery transportation network |
US10049375B1 (en) * | 2015-03-23 | 2018-08-14 | Amazon Technologies, Inc. | Automated graph-based identification of early adopter users |
US9802638B1 (en) * | 2015-06-19 | 2017-10-31 | Waymo Llc | Removable manual controls for an autonomous vehicle |
US9610510B2 (en) * | 2015-07-21 | 2017-04-04 | Disney Enterprises, Inc. | Sensing and managing vehicle behavior based on occupant awareness |
US9958864B2 (en) * | 2015-11-04 | 2018-05-01 | Zoox, Inc. | Coordination of dispatching and maintaining fleet of autonomous vehicles |
US10940854B2 (en) * | 2018-11-29 | 2021-03-09 | Ford Global Technologies, Llc | Systems and methods for reducing vehicle evaporative emissions |
- 2016
- 2016-09-02 CN CN201680050935.8A patent/CN107924528A/en active Pending
- 2016-09-02 US US15/751,323 patent/US10970747B2/en active Active
- 2016-09-02 US US15/751,327 patent/US20180232770A1/en not_active Abandoned
- 2016-09-02 CN CN201680050951.7A patent/CN107924523A/en active Pending
- 2016-09-02 WO PCT/US2016/050106 patent/WO2017040924A1/en active Application Filing
- 2016-09-02 EP EP16766190.9A patent/EP3345148A1/en not_active Withdrawn
- 2016-09-02 WO PCT/US2016/050111 patent/WO2017040929A1/en active Application Filing
- 2016-09-02 JP JP2018511646A patent/JP6643461B2/en not_active Expired - Fee Related
- 2016-09-02 JP JP2018511741A patent/JP6751436B2/en active Active
- 2016-09-02 EP EP16767438.1A patent/EP3345138A1/en not_active Ceased
Cited By (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10636303B2 (en) * | 2016-08-24 | 2020-04-28 | Kyocera Corporation | Electronic device, method of communication, and non-transitory computer readable storage medium |
US11113727B2 (en) * | 2017-10-11 | 2021-09-07 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for dynamic road sign personalization |
US20190108548A1 (en) * | 2017-10-11 | 2019-04-11 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for dynamic road sign personalization |
US20210374805A1 (en) * | 2017-10-11 | 2021-12-02 | Toyota Motor Engineering & Manufacturing North America, Inc. | Systems and methods for dynamic road sign personalization |
US20200020258A1 (en) * | 2018-07-10 | 2020-01-16 | Color Vision, S.A. | Smart Screen for Citizen Interactions and Communications |
US10950151B2 (en) * | 2018-07-10 | 2021-03-16 | Color Vision S.A. | Smart screen for citizen interactions and communications |
CN110942332A (en) * | 2018-09-21 | 2020-03-31 | 丰田自动车株式会社 | Information processing apparatus and information processing method |
US11373211B2 (en) | 2018-09-21 | 2022-06-28 | Toyota Jidosha Kabushiki Kaisha | Information processing apparatus and information processing method |
US11367107B2 (en) | 2018-09-21 | 2022-06-21 | Toyota Jidosha Kabushiki Kaisha | Information processing apparatus and information processing method |
CN110942718A (en) * | 2018-09-25 | 2020-03-31 | 丰田自动车株式会社 | Information system, information processing method, and non-transitory storage medium |
US10805777B2 (en) * | 2019-03-06 | 2020-10-13 | GM Global Technology Operations LLC | System and method to display information |
CN111667297A (en) * | 2019-03-06 | 2020-09-15 | 通用汽车环球科技运作有限责任公司 | System and method for displaying information |
US20200288289A1 (en) * | 2019-03-06 | 2020-09-10 | GM Global Technology Operations LLC | System and method to display information |
US11766998B2 (en) * | 2019-08-02 | 2023-09-26 | Volkswagen Aktiengesellschaft | Vehicle guiding attachment |
US12010381B2 (en) | 2021-06-11 | 2024-06-11 | Sony Group Corporation | Orientation control of display device based on content |
Also Published As
Publication number | Publication date |
---|---|
JP2018534187A (en) | 2018-11-22 |
JP2018526749A (en) | 2018-09-13 |
JP6643461B2 (en) | 2020-02-12 |
US20180231979A1 (en) | 2018-08-16 |
US10970747B2 (en) | 2021-04-06 |
WO2017040924A1 (en) | 2017-03-09 |
CN107924528A (en) | 2018-04-17 |
JP6751436B2 (en) | 2020-09-02 |
WO2017040929A1 (en) | 2017-03-09 |
EP3345138A1 (en) | 2018-07-11 |
EP3345148A1 (en) | 2018-07-11 |
CN107924523A (en) | 2018-04-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20180232770A1 (en) | Billboard display and method for selectively displaying advertisements by sensing demographic information of occupants of vehicles | |
JP6394735B2 (en) | Detection of limbs using hierarchical context-aware | |
US10222227B2 (en) | Navigation systems and associated methods | |
US11067405B2 (en) | Cognitive state vehicle navigation based on image processing | |
JP6831053B2 (en) | Systems and methods for providing targeted advertising to charging stations for electric vehicles | |
JP6456610B2 (en) | Apparatus and method for detecting a driver's interest in advertisements by tracking the driver's eye gaze | |
US10049389B2 (en) | System and method for interacting with digital signage | |
US8954340B2 (en) | Risk evaluation based on vehicle operator behavior | |
CN107004363B (en) | Image processing device, on-vehicle display system, display device, and image processing method | |
US20170343375A1 (en) | Systems to dynamically guide a user to an autonomous-driving vehicle pick-up location by augmented-reality walking directions | |
US20170072850A1 (en) | Dynamic vehicle notification system and method | |
US20210339759A1 (en) | Cognitive state vehicle navigation based on image processing and modes | |
US11436744B2 (en) | Method for estimating lane information, and electronic device | |
US11669781B2 (en) | Artificial intelligence server and method for updating artificial intelligence model by merging plurality of pieces of update information | |
US20190392383A1 (en) | Refrigerator for providing information on item using artificial intelligence and method of operating the same | |
US11729444B2 (en) | System and methods for sensor-based audience estimation during digital media display sessions on mobile vehicles | |
US20190005565A1 (en) | Method and system for stock-based vehicle navigation | |
US11854059B2 (en) | Smart apparatus | |
US11074814B2 (en) | Portable apparatus for providing notification | |
US11302304B2 (en) | Method for operating a sound output device of a motor vehicle using a voice-analysis and control device | |
CN112109645B (en) | Method and system for providing assistance to a vehicle user | |
US20190370863A1 (en) | Vehicle terminal and operation method thereof | |
US11348585B2 (en) | Artificial intelligence apparatus | |
US20220172249A1 (en) | Systems and Methods for Providing Targeted Advertising | |
US20230392936A1 (en) | Method and apparatus for determining lingering communication indicators |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: ROBERT BOSCH GMBH, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MILLER, JAMES STEPHEN;GRAF, PATRICK;RIGGI, FRANK;REEL/FRAME:044870/0633 Effective date: 20160902 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: ADVISORY ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |