US20180154826A1 - Vehicle projection system - Google Patents

Vehicle projection system

Info

Publication number
US20180154826A1
US20180154826A1 (application US15/366,476)
Authority
US
United States
Prior art keywords
vehicle
projector
visual content
insert
semitransparent
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US15/366,476
Other versions
US9987978B1
Inventor
Dewayne Bontrager
Lawayne Bontrager
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US15/366,476
Application granted
Publication of US9987978B1
Publication of US20180154826A1
Legal status: Active

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/503Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking using luminous text or symbol displays in or on the vehicle, e.g. static text
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/2661Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic mounted on parts having other functions
    • B60Q1/268Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic mounted on parts having other functions on windscreens or windows
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0241Advertisements
    • G06Q30/0251Targeted advertisements
    • G06Q30/0265Vehicular advertisement
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00Advertising or display means not otherwise provided for
    • G09F19/12Advertising or display means not otherwise provided for using special optical effects
    • G09F19/18Advertising or display means not otherwise provided for using special optical effects involving the use of optical projection means, e.g. projection of images on clouds
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F21/00Mobile visual advertising
    • G09F21/04Mobile visual advertising by land vehicles
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F21/00Mobile visual advertising
    • G09F21/04Mobile visual advertising by land vehicles
    • G09F21/049Mobile visual advertising by land vehicles giving information to passengers inside the vehicles
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • G09G3/002Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to project the image of a two-dimensional display, such as an array of light emitting or modulating elements or a CRT
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3179Video signal processing therefor
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N9/3191Testing thereof
    • H04N9/3194Testing thereof including sensor feedback
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/44Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating braking action or preparation for braking, e.g. by detection of the foot approaching the brake pedal
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q1/00Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor
    • B60Q1/26Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic
    • B60Q1/50Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking
    • B60Q1/506Arrangement of optical signalling or lighting devices, the mounting or supporting thereof or circuits therefor the devices being primarily intended to indicate the vehicle, or parts thereof, or to give signals, to other traffic for indicating other intentions or conditions, e.g. request for waiting or overtaking specific to silent vehicles, e.g. for warning that a hybrid or electric vehicle is approaching
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60QARRANGEMENT OF SIGNALLING OR LIGHTING DEVICES, THE MOUNTING OR SUPPORTING THEREOF OR CIRCUITS THEREFOR, FOR VEHICLES IN GENERAL
    • B60Q2400/00Special features or arrangements of exterior signal lamps for vehicles
    • B60Q2400/50Projected symbol or information, e.g. onto the road or car body
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F19/00Advertising or display means not otherwise provided for
    • G09F19/12Advertising or display means not otherwise provided for using special optical effects

Definitions

  • Vehicles can be affixed with text or graphics for advertising purposes.
  • advertising images may be adhered to vehicle windows.
  • a disadvantage to this approach is that an advertiser is limited to still-frame images.
  • changing the displayed advertisement requires the removal of the original and the production of a new decal, which has drawbacks in both cost and time.
  • a semitransparent insert can be inserted into a frame for a vehicle window.
  • the insert can be adhered to a vehicle window, or cut to size so that it rests in the window frame without adhesion.
  • a projector in the vehicle can project visual content, including image or video content, on the semitransparent insert.
  • the projected light passes through the insert, allowing the projected content to be viewed from the vehicle exterior.
  • the projector may be communicatively coupled to one or more sensors that alter the displayed content based on operating circumstances of the vehicle.
  • FIG. 1 is an implementation of an installation of a semitransparent insert of a vehicle projection system
  • FIG. 2A is a view of an example implementation of a vehicle projection system
  • FIG. 2B is a view of an example implementation of a vehicle projection system
  • FIG. 2C is a view of an example implementation of a vehicle projection system
  • FIG. 3 is a block diagram of an example implementation of a vehicle projection system
  • FIG. 4 is a flowchart of an example method
  • FIG. 5 is a block diagram of an example computing device in the vehicle projection system.
  • the word “comprise” and variations of the word, such as “comprising” and “comprises,” mean “including but not limited to,” and are not intended to exclude, for example, other components, integers, or steps.
  • “Exemplary” means “an example of” and is not intended to convey an indication of a preferred or ideal embodiment. “Such as” is not used in a restrictive sense, but for explanatory purposes.
  • the methods and systems may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects.
  • the methods and systems may take the form of a computer program product on a computer-readable storage medium having computer-readable program instructions (e.g., computer software) embodied in the storage medium.
  • the present methods and systems may take the form of web-implemented computer software. Any suitable computer-readable storage medium may be utilized including hard disks, CD-ROMs, optical storage devices, or magnetic storage devices.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including computer-readable instructions for implementing the function specified in the flowchart block or blocks.
  • the computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • blocks of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
  • the present disclosure relates to a vehicle projection system for projecting visual content visible from the exterior of a vehicle.
  • a vehicle projection system for projecting visual content visible from the exterior of a vehicle.
  • visual content in the context of advertising, it is understood that the displayed visual content may extend to other uses, including entertainment or artistic purposes.
  • the following discussion addresses a vehicle projection system deployed in an automobile, it is understood that this is a non-limiting example and the vehicle projection system may be deployed in other vehicles.
  • Vehicle exteriors may be used for advertising purposes by adhering decals or other imagery to the vehicle body. This includes perforated decals adhered to windows, as well as other decals or art adhered to the vehicle body or frame.
  • This existing approach has several disadvantages. First, the advertising content is limited to still images, preventing more dynamic advertisements from being presented. Additionally, as the advertising content is adhered directly to the vehicle, changing the advertising content requires a complete removal of an existing advertisement. This requires substantial effort, and may result in the destruction of the removed advertisement.
  • a vehicle projection system allows visual content to be projected within the frame of a vehicle window.
  • a semitransparent insert is inserted into the frame of the vehicle window.
  • a projector internal to the vehicle projects visual content, including still images or video, on the semitransparent insert.
  • the projected visual content is visible from the outside of the vehicle.
  • the insert can include a semitransparent perforated layer to facilitate viewing through the insert from inside the vehicle. Additionally, the semitransparent insert material can minimize the amount of projected imagery visible from the inside of the vehicle.
  • the projector can be coupled to a controller to modify the projected visual content according to operating conditions of the vehicle.
  • the controller may be communicatively coupled to one or more sensors, including a motion sensor, proximity motion sensor, global positioning system radio, light sensor, or other sensor as can be appreciated. Additionally, the controller may be in communication with a mobile device, server, or other computing device to monitor projection usage and vehicle activity for tracking incentives.
  • FIG. 1 illustrates an example installation portion 100 of a vehicle projection system.
  • the disclosed installation portion 100 includes a semitransparent insert 102 .
  • the semitransparent insert allows for rear-projected light projected onto the insert 102 from the interior of the vehicle to be visible from the exterior of the vehicle.
  • the insert 102 comprises plexiglass, fiberglass, plastic, or another substantially rigid material as can be appreciated.
  • the insert 102 can be composed of a semitransparent material.
  • the insert 102 can be composed of a transparent material layer and a semitransparent material layer.
  • the insert 102 can include a transparent plexiglass layer and a semitransparent rear screen projection film layer.
  • the semitransparent rear screen projection film layer is perforated to facilitate visibility from a driver or passenger of the vehicle through the insert 102 .
  • the insert 102 can be cut to size for fitting within a frame of a vehicle window.
  • the window frame holds the insert 102 in place without need for adhesion.
  • the insert 102 can be adhered directly to a vehicle window.
  • FIG. 2A is a view 200 of an example implementation of a vehicle projection system. Shown is a vehicle 201 with a projector 202 a affixed near a front end of the vehicle 201 cabin. The projector 202 a is configured to rear-project light onto an insert 102 .
  • FIG. 2B is another view 210 of an example implementation of a vehicle projection system. Shown is a vehicle 201 with a projector 202 b affixed near a rear end of the vehicle 201 cabin. The projector 202 b is configured to rear-project light onto an insert 102 . In this example view 210 , the projector 202 b is a short-throw projector capable of focused projections over a short distance.
  • FIG. 2C is another view 220 of an example implementation of a vehicle projection system.
  • the view 220 can correspond to an overhead view of the view 200 .
  • Shown is a vehicle 201 with a projector 202 a affixed near a front end of the vehicle 201 cabin.
  • the projector 202 a is configured to rear-project light onto an insert 102 .
  • FIG. 3 is an example block diagram 300 for a vehicle projection system.
  • the vehicle projection system includes a projector 302 for projecting visual content.
  • the projector 302 can be designed for projection over a short distance onto a projection medium, i.e., a short-throw projector.
  • the projector 302 can also be designed for medium to long range projection, such as from a midpoint in a vehicle cabin or a dashboard onto a projection medium.
  • the visual content projected by the projector 302 can be accessed from a media source 304 .
  • the media source 304 can include a memory, including hard drives, solid state drives, or other memory as can be appreciated.
  • the media source 304 can be remotely located from the projector 302 and can be accessed wirelessly as will be discussed below.
  • the media source 304 can also include a wired memory coupled to the projector, or a memory internal to the projector. Additionally, the media source 304 can include a device outputting a video signal for projection by the projector 302 . Such a device can include a computing device, video disc player, or other device as can be appreciated.
  • the projector 302 can be coupled to a controller 306 for controlling the projection operations of the projector 302 .
  • the controller 306 can include any combination of circuitry, software, computing devices, or other components as can be appreciated.
  • the controller 306 can load visual content from a media source 304 for projection.
  • the controller 306 can access a physically coupled media source 304 .
  • the controller 306 can wirelessly access a media source 304 . This can include accessing a media source 304 through a personal area network. This can also include wirelessly accessing a server functioning as a media source 304 .
  • the controller 306 can select visual content from the media source 304 for projection based on operating conditions of the vehicle. Accordingly, the controller 306 can access one or more sensors including a brake sensor 308 , motion sensor 310 , and proximity motion sensor 312 .
  • the brake sensor 308 detects whether brakes of the vehicle are currently being applied.
  • the motion sensor 310 detects whether the vehicle is in motion.
  • the motion sensor 310 can include an accelerometer.
  • the motion sensor 310 can include a global positioning system (GPS) radio, with changes in GPS location indicating a vehicle in motion.
  • the motion sensor 310 can include a speedometer, with a non-zero speed indicating a vehicle in motion.
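The motion determination described in the preceding bullets reduces to a boolean derived from whichever sensor happens to be available. The disclosure specifies no implementation; the following Python sketch is purely illustrative, and the function name and numeric thresholds are assumptions, not values from the patent:

```python
def vehicle_in_motion(speed_mph=None, gps_delta_m=None, accel_ms2=None):
    """Return True if any available sensor indicates the vehicle is moving.

    Each argument models one sensor from the description: a speedometer
    reading, a change in GPS position (meters), or an accelerometer
    reading. Thresholds are illustrative assumptions.
    """
    if speed_mph is not None and speed_mph > 0:
        return True  # non-zero speed indicates a vehicle in motion
    if gps_delta_m is not None and gps_delta_m > 2.0:
        return True  # change in GPS location indicates motion
    if accel_ms2 is not None and abs(accel_ms2) > 0.5:
        return True  # accelerometer activity indicates motion
    return False
```

Any sensor that reports no motion (or is absent) simply contributes nothing; a single positive reading is treated as motion.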
  • the controller 306 can also include a proximity motion sensor 312 detecting moving vehicles proximate to the vehicle in which the vehicle projection system is installed.
  • the controller 306 can modify a projection of the visual content by the projector 302 based on whether or not brakes are applied as indicated by the brake sensor 308 .
  • the controller 306 can increase or decrease a brightness of the projection when brakes are applied.
  • the controller 306 can apply a video, image, or text overlay when brakes are applied. The overlay can emphasize or indicate that brakes are being applied, or display other information.
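The brake-based modifications in the two preceding bullets (brightness change, overlay) can be sketched as a single function. The frame representation, function name, brightness factor, and overlay text are all hypothetical illustrations, not part of the disclosure:

```python
def adjust_for_brakes(frame, brakes_applied,
                      brightness_factor=1.5, overlay_text="BRAKING"):
    """Return a possibly modified projection frame.

    `frame` is a dict describing the content to project; the keys used
    here (`brightness`, `overlay`) are assumed for illustration.
    """
    if not brakes_applied:
        return frame
    modified = dict(frame)
    # Increase (or decrease) the brightness of the projection.
    modified["brightness"] = frame.get("brightness", 1.0) * brightness_factor
    # Apply a text overlay emphasizing that brakes are being applied.
    modified["overlay"] = overlay_text
    return modified
```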
  • the controller 306 can also modify a projection of the visual content by the projector 302 based on whether or not the vehicle is in motion as indicated by the motion sensor 310 .
  • the controller 306 can select static or still image visual content for projection while the vehicle is in motion.
  • the controller 306 can select dynamic or video content for projection when the vehicle is not in motion.
  • the controller 306 can select static or still image visual content for projection while the vehicle is stopped and one or more proximate vehicles are detected to be in motion as indicated by the proximity motion sensor 312 .
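The content-selection rules in the preceding three bullets can be summarized in one function. The rule set follows the description directly; the function name and return labels are assumptions for illustration:

```python
def select_content(vehicle_in_motion, proximity_motion):
    """Choose the projection content type from operating conditions.

    - Vehicle in motion: static or still-image content.
    - Vehicle stopped but proximate vehicles moving: static content.
    - Vehicle stopped with no proximate motion: video content.
    """
    if vehicle_in_motion:
        return "static"
    if proximity_motion:
        return "static"
    return "video"
```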
  • any combination of the brake sensor 308 , motion sensor 310 , and proximity motion sensor 312 may be in direct communication with the controller 306 . That is, each may be communicatively coupled directly to the controller 306 by a wired or wireless interface.
  • any combination of the brake sensor 308 , motion sensor 310 , and proximity motion sensor 312 may be components or peripheral components of the vehicle in which the vehicle projection system is deployed. Accordingly, any of the brake sensor 308 , motion sensor 310 , and proximity motion sensor 312 may be coupled to an onboard computing device in the vehicle accessible to the controller via a vehicle interface 314 .
  • the controller 306 may also be in communication with a mobile device 316 , such as a smartphone, tablet, laptop, or other computing device.
  • the mobile device 316 may serve as a media source 304 transmitting visual content to the controller 306 .
  • the mobile device 316 may provide a text message or image to the controller 306 for overlay on the projected visual content.
  • the mobile device 316 may monitor projector 302 usage, vehicle activity from the vehicle interface 314 , or other activity for reporting to a central server. This aggregated data can be used, for example, to calculate incentives associated with a projection of visual content during operation of a vehicle.
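The usage reporting described above might aggregate projector and vehicle activity into a record for the central server. The disclosure gives no schema; every field name and the JSON encoding below are hypothetical illustrations:

```python
import json
import time

def build_usage_report(projector_minutes, miles_driven, vehicle_id):
    """Aggregate projection usage and vehicle activity for incentive
    tracking. The record layout is assumed, not specified by the patent."""
    return {
        "vehicle_id": vehicle_id,
        "timestamp": int(time.time()),
        "projector_minutes": projector_minutes,
        "miles_driven": miles_driven,
    }

report = build_usage_report(42, 17.5, "demo-vehicle")
payload = json.dumps(report)  # would be transmitted to the central server
```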
  • block diagram 300 is only exemplary, and that any of the depicted components could be removed and additional components may be added. Additionally, it is understood that one or more of the depicted components may be included in the same entity, or their disclosed functions performed by a same device.
  • FIG. 4 is a flowchart 400 of an example method.
  • the vehicle projection system projects static visual content via a projector 302 onto a semitransparent insert 102 in a vehicle.
  • the static visual content can include, for example, a still image, text, or other static content.
  • the controller 306 determines if brakes are being applied according to information from a brake sensor 308 . If brakes are not being applied, the process returns to step 402 , where the system continues to project static visual content. Otherwise, the process advances to step 406 , where the controller determines if the vehicle is stopped according to information from a motion sensor 310 .
  • if the vehicle is not stopped, the process advances to step 408 , where the controller 306 projects modified static visual content.
  • modified static visual content can include, for example, dimming or brightening a projection of the visual content.
  • This can also include applying a visual or text overlay to the projected visual content.
  • the process then returns to step 404 , such that the modified static visual content is projected until the vehicle comes to a full stop, as described above, or the brakes are no longer applied.
  • if the vehicle is determined to be stopped at step 406 , the process advances to step 410 , which determines if there is proximity movement detected, i.e., whether vehicles are moving within a detection proximity of a proximity motion sensor 312 . If so, then the process advances to step 412 , where the controller 306 projects static visual content. This static visual content can correspond to the originally projected static visual content referenced in step 402 , the modified static visual content referenced in step 408 , or other static visual content as can be appreciated. If no proximity movement is detected in step 410 , the process advances to step 414 , where the controller 306 projects video content. The process then advances back to step 404 . Thus, on reaching step 414 , video content will continue to be projected until the vehicle starts moving, the brakes are released, or proximity movement is detected.
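The FIG. 4 flow can be sketched as a single decision pass that the controller would execute each cycle. Sensor reads are modeled as booleans, and the function name and state labels are assumptions for illustration:

```python
def projection_step(brakes_applied, vehicle_stopped, proximity_motion):
    """One pass through the FIG. 4 decision flow.

    Returns the content the controller would project this cycle:
    'static' (steps 402/412), 'modified_static' (step 408), or
    'video' (step 414).
    """
    if not brakes_applied:          # step 404: brakes not applied
        return "static"             # continue static projection (step 402)
    if not vehicle_stopped:         # step 406: braking but still moving
        return "modified_static"    # step 408: dimmed/brightened or overlaid
    if proximity_motion:            # step 410: stopped, nearby traffic moving
        return "static"             # step 412: keep static content
    return "video"                  # step 414: stopped, no nearby motion
```

Calling this function repeatedly reproduces the looping behavior of the flowchart: the projected content changes only when a sensor condition changes.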
  • FIG. 5 is a block diagram illustrating an exemplary operating environment for performing the disclosed methods.
  • This exemplary operating environment is only an example of an operating environment and is not intended to suggest any limitation as to the scope of use or functionality of operating environment architecture. Neither should the operating environment be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment.
  • the present methods and systems can be operational with numerous other general purpose or special purpose computing system environments or configurations.
  • Examples of well-known computing systems, environments, and/or configurations that can be suitable for use with the systems and methods comprise, but are not limited to, personal computers, server computers, laptop devices, and multiprocessor systems. Additional examples comprise set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that comprise any of the above systems or devices, and the like.
  • the processing of the disclosed methods and systems can be performed by software components.
  • the disclosed systems and methods can be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers or other devices.
  • program modules comprise computer code, routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types.
  • the disclosed methods can also be practiced in grid-based and distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network.
  • program modules can be located in both local and remote computer storage media including memory storage devices.
  • the components of the computer 501 can comprise, but are not limited to, one or more processors 503 , a system memory 512 , and a system bus 513 that couples various system components including the one or more processors 503 to the system memory 512 .
  • the system can utilize parallel computing.
  • the system bus 513 represents one or more of several possible types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, or local bus using any of a variety of bus architectures.
  • bus architectures can comprise an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, an Accelerated Graphics Port (AGP) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express bus, a Personal Computer Memory Card Industry Association (PCMCIA) bus, a Universal Serial Bus (USB), and the like.
  • the bus 513 and all buses specified in this description can also be implemented over a wired or wireless network connection and each of the subsystems, including the one or more processors 503 , a mass storage device 504 , an operating system 505 , projection software 506 , projection data 507 , a network adapter 508 , the system memory 512 , an Input/Output Interface 510 , a display adapter 509 , a display device 511 , and a human machine interface 502 , can be contained within one or more remote computing devices 514 a,b,c at physically separate locations, connected through buses of this form, in effect implementing a fully distributed system.
  • the projectors 202 a , 202 b , and 302 of FIGS. 2A, 2B, 2C, and 3 can be a display device 511 .
  • the computer 501 typically comprises a variety of computer readable media. Exemplary readable media can be any available media that is accessible by the computer 501 and comprises, for example and not meant to be limiting, both volatile and non-volatile media, removable and non-removable media.
  • the system memory 512 comprises computer readable media in the form of volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read only memory (ROM).
  • the system memory 112 typically contains data such as the projection data 507 and/or program modules such as the operating system 505 and the projection software 506 that are immediately accessible to and/or are presently operated on by the one or more processors 503 .
  • the computer 501 can also comprise other removable/non-removable, volatile/non-volatile computer storage media.
  • FIG. 5 illustrates the mass storage device 504 which can provide non-volatile storage of computer code, computer readable instructions, data structures, program modules, and other data for the computer 501 .
  • the mass storage device 504 can be a hard disk, a removable magnetic disk, a removable optical disk, magnetic cassettes or other magnetic storage devices, flash memory cards, CD-ROM, digital versatile disks (DVD) or other optical storage, random access memories (RAM), read only memories (ROM), electrically erasable programmable read-only memory (EEPROM), and the like.
  • any number of program modules can be stored on the mass storage device 504 , including by way of example, the operating system 505 and the projection software 506 .
  • Each of the operating system 505 and the projection software 506 (or some combination thereof) can comprise elements of the programming and the projection software 506 .
  • the projection data 507 can also be stored on the mass storage device 504 .
  • the projection data 507 can be stored in any of one or more databases known in the art. Examples of such databases comprise, DB2®, Microsoft® Access, Microsoft® SQL Server, Oracle®, mySQL, PostgreSQL, and the like.
  • the databases can be centralized or distributed across multiple systems.
  • the user can enter commands and information into the computer 501 via an input device (not shown).
  • input devices comprise, but are not limited to, a keyboard, a pointing device (e.g., a “mouse”), a microphone, a joystick, a scanner, tactile input devices such as gloves and other body coverings, and the like.
  • These and other input devices can be connected to the one or more processors 503 via the human machine interface 502 that is coupled to the system bus 513 , but can be connected by other interface and bus structures, such as a parallel port, game port, an IEEE 1394 Port (also known as a Firewire port), a serial port, or a universal serial bus (USB).
  • the display device 511 can also be connected to the system bus 513 via an interface, such as the display adapter 509 .
  • the computer 501 can have more than one display adapter 509 and the computer 501 can have more than one display device 511 .
  • the display device 511 can be a monitor, an LCD (Liquid Crystal Display), or a projector.
  • the display device 511 can include an LCD display shaped to fit within a frame of a vehicle window.
  • video content can be displayed according to the aspects described above, but using the LCD display in place of a projector and semitransparent insert.
  • output peripheral devices can comprise components such as speakers (not shown) and a printer (not shown) which can be connected to the computer 501 via the Input/Output Interface 510 .
  • Any step and/or result of the methods can be output in any form to an output device.
  • Such output can be any form of visual representation, including, but not limited to, textual, graphical, animation, audio, tactile, and the like.
  • the display device 511 and computer 501 can be part of one device, or separate devices.
  • the computer 501 can operate in a networked environment using logical connections to one or more remote computing devices 514 a,b,c .
  • a remote computing device can be a personal computer, portable computer, smartphone, a server, a router, a network computer, a peer device or other common network node, and so on.
  • Logical connections between the computer 501 and a remote computing device 514 a,b,c can be made via a network 515 , such as a local area network (LAN) and/or a general wide area network (WAN).
  • Such network connections can be through the network adapter 508 .
  • the network adapter 508 can be implemented in both wired and wireless environments. Such networking environments are conventional and commonplace in dwellings, offices, enterprise-wide computer networks, intranets, and the Internet.
  • Computer readable media can comprise “computer storage media” and “communications media.”
  • “Computer storage media” comprise volatile and non-volatile, removable and non-removable media implemented in any methods or technology for storage of information such as computer readable instructions, data structures, program modules, or other data.
  • Exemplary computer storage media comprises, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
  • the methods and systems can employ Artificial Intelligence techniques such as machine learning and iterative learning. Examples of such techniques include, but are not limited to, expert systems, case-based reasoning, Bayesian networks, behavior-based AI, neural networks, fuzzy systems, evolutionary computation (e.g., genetic algorithms), swarm intelligence (e.g., ant algorithms), and hybrid intelligent systems (e.g., expert inference rules generated through a neural network or production rules from statistical learning).


Abstract

Disclosed are various embodiments of a vehicle projection system. Visual content is projected from a vehicle interior onto a rear projection insert. The insert can include a semitransparent component to facilitate rear projection. The insert can also be perforated for enhanced visibility from the vehicle interior through the insert. A controller can change or modify the projected visual content based on operating conditions of the vehicle.

Description

    BACKGROUND
  • Vehicles can be affixed with text or graphics for advertising purposes. As an example, advertising images may be adhered to vehicle windows. A disadvantage to this approach is that an advertiser is limited to still-frame images. Moreover, changing the displayed advertisement requires the removal of the original and the production of a new decal, which has drawbacks in both cost and time.
  • SUMMARY
  • It is to be understood that both the following general description and the following detailed description are exemplary and explanatory only and are not restrictive. Provided are methods and systems for a vehicle projection system. A semitransparent insert can be inserted into a frame for a vehicle window. The insert can be adhered to a vehicle window, or cut to size so that it rests in the window frame without adhesion. A projector in the vehicle can project visual content, including image or video content, onto the semitransparent insert. The projected light passes through the insert, allowing the projected content to be viewed from the vehicle exterior. The projector may be communicatively coupled to one or more sensors that alter the displayed content based on operating circumstances of the vehicle.
  • Additional advantages will be set forth in part in the description which follows or may be learned by practice. The advantages will be realized and attained by means of the elements and combinations particularly pointed out in the appended claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments and together with the description, serve to explain the principles of the methods and systems:
  • FIG. 1 is an implementation of an installation of a semitransparent insert of a vehicle projection system;
  • FIG. 2A is a view of an example implementation of a vehicle projection system;
  • FIG. 2B is a view of an example implementation of a vehicle projection system;
  • FIG. 2C is a view of an example implementation of a vehicle projection system;
  • FIG. 3 is a block diagram of an example implementation of a vehicle projection system;
  • FIG. 4 is a flowchart of an example method; and
  • FIG. 5 is a block diagram of an example computing device in the vehicle projection system.
  • DETAILED DESCRIPTION
  • Before the present methods and systems are disclosed and described, it is to be understood that the methods and systems are not limited to specific methods, specific components, or to particular implementations. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting.
  • As used in the specification and the appended claims, the singular forms “a,” “an” and “the” include plural referents unless the context clearly dictates otherwise. Ranges may be expressed herein as from “about” one particular value, and/or to “about” another particular value. When such a range is expressed, another embodiment includes from the one particular value and/or to the other particular value. Similarly, when values are expressed as approximations, by use of the antecedent “about,” it will be understood that the particular value forms another embodiment. It will be further understood that the endpoints of each of the ranges are significant both in relation to the other endpoint, and independently of the other endpoint.
  • “Optional” or “optionally” means that the subsequently described event or circumstance may or may not occur, and that the description includes instances where said event or circumstance occurs and instances where it does not.
  • Throughout the description and claims of this specification, the word “comprise” and variations of the word, such as “comprising” and “comprises,” means “including but not limited to,” and is not intended to exclude, for example, other components, integers or steps. “Exemplary” means “an example of” and is not intended to convey an indication of a preferred or ideal embodiment. “Such as” is not used in a restrictive sense, but for explanatory purposes.
  • Disclosed are components that can be used to perform the disclosed methods and systems. These and other components are disclosed herein, and it is understood that when combinations, subsets, interactions, groups, etc. of these components are disclosed that while specific reference of each various individual and collective combinations and permutation of these may not be explicitly disclosed, each is specifically contemplated and described herein, for all methods and systems. This applies to all aspects of this application including, but not limited to, steps in disclosed methods. Thus, if there are a variety of additional steps that can be performed it is understood that each of these additional steps can be performed with any specific embodiment or combination of embodiments of the disclosed methods.
  • The present methods and systems may be understood more readily by reference to the following detailed description of preferred embodiments and the examples included therein and to the Figures and their previous and following description.
  • As will be appreciated by one skilled in the art, the methods and systems may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the methods and systems may take the form of a computer program product on a computer-readable storage medium having computer-readable program instructions (e.g., computer software) embodied in the storage medium. More particularly, the present methods and systems may take the form of web-implemented computer software. Any suitable computer-readable storage medium may be utilized including hard disks, CD-ROMs, optical storage devices, or magnetic storage devices.
  • Embodiments of the methods and systems are described below with reference to block diagrams and flowchart illustrations of methods, systems, apparatuses and computer program products. It will be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, respectively, can be implemented by computer program instructions. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create a means for implementing the functions specified in the flowchart block or blocks.
  • These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including computer-readable instructions for implementing the function specified in the flowchart block or blocks. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer-implemented process such that the instructions that execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart block or blocks.
  • Accordingly, blocks of the block diagrams and flowchart illustrations support combinations of means for performing the specified functions, combinations of steps for performing the specified functions and program instruction means for performing the specified functions. It will also be understood that each block of the block diagrams and flowchart illustrations, and combinations of blocks in the block diagrams and flowchart illustrations, can be implemented by special purpose hardware-based computer systems that perform the specified functions or steps, or combinations of special purpose hardware and computer instructions.
  • The present disclosure relates to a vehicle projection system for projecting visual content visible from the exterior of a vehicle. Although the following discussion sets forth visual content in the context of advertising, it is understood that the displayed visual content may extend to other uses, including entertainment or artistic purposes. Additionally, although the following discussion addresses a vehicle projection system deployed in an automobile, it is understood that this is a non-limiting example and the vehicle projection system may be deployed in other vehicles.
  • Vehicle exteriors may be used for advertising purposes by adhering decals or other imagery to the vehicle body. This includes perforated decals adhered to windows, as well as other decals or art adhered to the vehicle body or frame. This existing approach has several disadvantages. First, the advertising content is limited to still images, preventing more dynamic advertisements from being presented. Additionally, as the advertising content is adhered directly to the vehicle, changing the advertising content requires a complete removal of an existing advertisement. This requires substantial effort, and may result in the destruction of the removed advertisement.
  • A vehicle projection system allows visual content to be projected within the frame of a vehicle window. A semitransparent insert is inserted into the frame of the vehicle window. A projector internal to the vehicle projects visual content, including still images or video, on the semitransparent insert. As the insert is semitransparent, the projected visual content is visible from the outside of the vehicle. The insert can include a semitransparent perforated layer to facilitate viewing through the insert from inside the vehicle. Additionally, the semitransparent insert material can minimize the amount of projected imagery visible from the inside of the vehicle. In an aspect, the projector can be coupled to a controller to modify the projected visual content according to operating conditions of the vehicle. This can include dimming or brightening a projection, switching between static image and dynamic video projected content, applying image or text overlays, or otherwise modifying the projected visual content. Accordingly, the controller may be communicatively coupled to one or more sensors, including a motion sensor, proximity motion sensor, global positioning system radio, light sensor, or other sensor as can be appreciated. Additionally, the controller may be in communication with a mobile device, server, or other computing device to monitor projection usage and vehicle activity for tracking incentives.
  • Although the following figures and discussion address a vehicle projection system with respect to a rear window of a vehicle, it is understood that this is a non-limiting example, and that the vehicle projection system may be implemented using any window of a vehicle.
  • FIG. 1 illustrates an example installation portion 100 of a vehicle projection system. The disclosed installation portion 100 includes a semitransparent insert 102. The semitransparent insert 102 allows light projected onto it from the interior of the vehicle to be visible from the exterior of the vehicle. In an aspect, the insert 102 comprises plexiglass, fiberglass, plastic, or another substantially rigid material as can be appreciated. In an aspect, the insert 102 can be composed of a semitransparent material. In another aspect, the insert 102 can be composed of a transparent material layer and a semitransparent material layer. For example, the insert 102 can include a transparent plexiglass layer and a semitransparent rear screen projection film layer. In an aspect, the semitransparent rear screen projection film layer is perforated to facilitate visibility from a driver or passenger of the vehicle through the insert 102. In an aspect, the insert 102 can be cut to size for fitting within a frame of a vehicle window. Thus, the window frame holds the insert 102 in place without need for adhesion. In another aspect, the insert 102 can be adhered directly to a vehicle window.
  • FIG. 2A is a view 200 of an example implementation of a vehicle projection system. Shown is a vehicle 201 with a projector 202 a affixed near a front end of the vehicle 201 cabin. The projector 202 a is configured to rear project light onto the insert 102.
  • FIG. 2B is another view 210 of an example implementation of a vehicle projection system. Shown is a vehicle 201 with a projector 202 b affixed near a rear end of the vehicle 201 cabin. The projector 202 b is configured to rear project light onto the insert 102. In this example view 210, the projector 202 b is a short throw projector capable of focused projections over a short distance.
  • FIG. 2C is another view 220 of an example implementation of a vehicle projection system. In an aspect, the view 220 can correspond to an overhead view of the view 200. Shown is a vehicle 201 with a projector 202 a affixed near a front end of the vehicle 201 cabin. The projector 202 a is configured to rear project light onto the insert 102.
  • FIG. 3 is an example block diagram 300 for a vehicle projection system. The vehicle projection system includes a projector 302 for projecting visual content. The projector 302 can be designed for projection over a short distance onto a projection medium, i.e. a short throw projector. The projector 302 can also be designed for medium to long range projection, such as from a midpoint in a vehicle cabin or a dashboard onto a projection medium. The visual content projected by the projector 302 can be accessed from a media source 304. In an aspect, the media source 304 can include a memory, including hard drives, solid state drives, or other memory as can be appreciated. The media source 304 can be remotely located from the projector 302 and can be accessed wirelessly as will be discussed below. The media source 304 can also include a wired memory coupled to the projector, or a memory internal to the projector. Additionally, the media source 304 can include a device outputting a video signal for projection by the projector 302. Such a device can include a computing device, video disc player, or other device as can be appreciated.
  • The projector 302 can be coupled to a controller 306 for controlling the projection operations of the projector 302. The controller 306 can include any combination of circuitry, software, computing devices, or other components as can be appreciated. In an aspect, the controller 306 can load visual content from a media source 304 for projection. For example, the controller 306 can access a physically coupled media source 304. In another example, the controller 306 can wirelessly access a media source 304. This can include accessing a media source 304 through a personal area network. This can also include wirelessly accessing a server functioning as a media source 304.
  • In an aspect, the controller 306 can select visual content from the media source 304 for projection based on operating conditions of the vehicle. Accordingly, the controller 306 can access one or more sensors including a brake sensor 308, motion sensor 310, and proximity motion sensor 312. The brake sensor 308 detects whether brakes of the vehicle are currently being applied. The motion sensor 310 detects whether the vehicle is in motion. In an aspect, the motion sensor 310 can include an accelerometer. In another aspect, the motion sensor 310 can include a global positioning system (GPS) radio, with changes in GPS location indicating a vehicle in motion. In another aspect, the motion sensor 310 can include a speedometer, with a non-zero speed indicating a vehicle in motion. The proximity motion sensor 312 detects moving vehicles proximate to the vehicle in which the vehicle projection system is installed.
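The motion determination described above can be sketched in code. The following is an illustrative sketch only, not taken from the patent; the function names, the GPS sampling interval, and the minimum-speed threshold are all assumptions:

```python
# Illustrative sketch: deciding whether the vehicle is in motion from
# either a speedometer reading or two successive GPS fixes, as the
# motion sensor 310 description suggests. Thresholds are assumptions.
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS fixes, in meters."""
    R = 6371000.0  # mean Earth radius in meters
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * R * asin(sqrt(a))

def is_in_motion(speed_kph=None, prev_fix=None, curr_fix=None,
                 interval_s=1.0, min_speed_kph=1.0):
    """Return True if either available source indicates the vehicle is moving."""
    if speed_kph is not None and speed_kph > 0:
        return True  # non-zero speedometer reading indicates motion
    if prev_fix and curr_fix:
        dist_m = haversine_m(*prev_fix, *curr_fix)
        kph = (dist_m / interval_s) * 3.6  # m/s to km/h
        return kph >= min_speed_kph       # change in GPS location indicates motion
    return False
```

In practice the controller would poll these sources periodically; the threshold guards against GPS jitter registering a parked vehicle as moving.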
  • In an aspect, the controller 306 can modify a projection of the visual content by the projector 302 based on whether or not brakes are applied as indicated by the brake sensor 308. For example, the controller 306 can increase or decrease a brightness of the projection when brakes are applied. As another example, the controller 306 can apply a video, image, or text overlay when brakes are applied. The overlay can emphasize or indicate that brakes are being applied, or display other information.
  • The controller 306 can also modify a projection of the visual content by the projector 302 based on whether or not the vehicle is in motion as indicated by the motion sensor 310. For example, the controller 306 can select static or still image visual content for projection while the vehicle is in motion. In an aspect, the controller 306 can select dynamic or video content for projection when the vehicle is not in motion. In another aspect, the controller 306 can select static or still image visual content for projection while the vehicle is stopped and one or more proximate vehicles are detected to be in motion as indicated by the proximity motion sensor 312.
  • In an aspect, any combination of the brake sensor 308, motion sensor 310, and proximity motion sensor 312 may be in direct communication with the controller 306. That is, each may be communicatively coupled directly to the controller 306 by a wired or wireless interface. In another aspect, any combination of the brake sensor 308, motion sensor 310, and proximity motion sensor 312 may be components or peripheral components of the vehicle in which the vehicle projection system is deployed. Accordingly, any of the brake sensor 308, motion sensor 310, and proximity motion sensor 312 may be coupled to an onboard computing device in the vehicle accessible to the controller 306 via a vehicle interface 314.
  • In an aspect, the controller 306 may also be in communication with a mobile device 316, such as a smartphone, tablet, laptop, or other computing device. In such an aspect, the mobile device 316 may serve as a media source 304 transmitting visual content to the controller 306. In another aspect, the mobile device 316 may provide a text message or image to the controller 306 for overlay on the projected visual content. Additionally, the mobile device 316 may monitor projector 302 usage, vehicle activity from the vehicle interface 314, or other activity for reporting to a central server. This aggregated data can be used, for example, to calculate incentives associated with a projection of visual content during operation of a vehicle.
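The incentive calculation suggested here could, for example, reduce to a simple aggregation over usage reports. This is a hypothetical illustration only; the report record fields and the per-hour rate are invented for the sketch:

```python
# Hypothetical sketch: a central server aggregating projector-usage
# reports from mobile devices to compute an incentive. The field name
# "projection_seconds" and the flat hourly rate are assumptions.
def incentive(reports, rate_per_hour=2.0):
    """Sum projection hours across usage reports and apply a flat rate."""
    hours = sum(r["projection_seconds"] for r in reports) / 3600.0
    return round(hours * rate_per_hour, 2)
```

A real deployment would presumably also weight reports by vehicle activity (e.g., miles driven while projecting), which the sketch omits.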
  • It is understood that the block diagram 300 is only exemplary, and that any of the depicted components could be removed and additional components may be added. Additionally, it is understood that one or more of the depicted components may be included in the same entity, or their disclosed functions performed by a same device.
  • FIG. 4 is a flowchart 400 of an example method. Beginning with step 402, the vehicle projection system projects static visual content via a projector 302 onto a semitransparent insert 102 in a vehicle. The static visual content can include, for example, a still image, text, or other static content. Next, in step 404, the controller 306 determines if brakes are being applied according to information from a brake sensor 308. If brakes are not being applied, the process returns to step 402 where the system continues to project static visual content. Otherwise, the process advances to step 406 where the controller determines if the vehicle is stopped according to information from a motion sensor 310.
  • If the vehicle is not stopped, the process advances to step 408, where the controller 306 projects modified static visual content. This can include, for example, dimming or brightening a projection of the visual content. This can also include applying a visual or text overlay to the projected visual content. The process then returns to step 404, such that the modified static visual content is projected until the vehicle comes to a full stop, as described above, or the brakes are no longer applied.
  • If, in step 406, it is determined that the vehicle is stopped, the process advances to step 410, where the controller 306 determines if proximity movement is detected, i.e., whether vehicles are moving within a detection proximity of a proximity motion sensor 312. If so, then the process advances to step 412 where the controller 306 projects static visual content. This static visual content can correspond to the originally projected static visual content referenced in step 402, the modified static visual content referenced in step 408, or other static visual content as can be appreciated. If no proximity movement is detected in step 410, the process advances to step 414 where the controller 306 projects video content. The process then advances back to step 404. Thus, on reaching step 414, video content will continue to be projected until the vehicle begins moving, the brakes are released, or proximity movement is detected.
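The branching logic of flowchart 400 can be summarized as a small decision function. This is a minimal sketch under assumed boolean inputs, not the patent's implementation:

```python
# Sketch of the FIG. 4 decision loop: static content by default, a
# modified static projection while braking in motion, and video only
# when the vehicle is stopped with no nearby movement detected.
def select_projection(brakes_applied, vehicle_stopped, proximity_movement):
    """Return which content the controller 306 would project."""
    if not brakes_applied:
        return "static"           # step 402: default static content
    if not vehicle_stopped:
        return "modified_static"  # step 408: e.g., dimmed or overlaid
    if proximity_movement:
        return "static"           # step 412: nearby traffic still moving
    return "video"                # step 414: stopped, no nearby movement
```

Evaluating this function on each sensor poll reproduces the loop in the flowchart: the selected mode persists until one of the three inputs changes.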
  • In an exemplary aspect, the methods and systems can be implemented on a computer 501 as illustrated in FIG. 5 and described below. By way of example, the controller 306 of FIG. 3 can be a computer as illustrated in FIG. 5. Similarly, the methods and systems disclosed can utilize one or more computers to perform one or more functions in one or more locations. FIG. 5 is a block diagram illustrating an exemplary operating environment for performing the disclosed methods. This exemplary operating environment is only an example of an operating environment and is not intended to suggest any limitation as to the scope of use or functionality of operating environment architecture. Neither should the operating environment be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment.
  • The present methods and systems can be operational with numerous other general purpose or special purpose computing system environments or configurations. Examples of well-known computing systems, environments, and/or configurations that can be suitable for use with the systems and methods comprise, but are not limited to, personal computers, server computers, laptop devices, and multiprocessor systems. Additional examples comprise set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that comprise any of the above systems or devices, and the like.
  • The processing of the disclosed methods and systems can be performed by software components. The disclosed systems and methods can be described in the general context of computer-executable instructions, such as program modules, being executed by one or more computers or other devices. Generally, program modules comprise computer code, routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The disclosed methods can also be practiced in grid-based and distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules can be located in both local and remote computer storage media including memory storage devices.
  • Further, one skilled in the art will appreciate that the systems and methods disclosed herein can be implemented via a general-purpose computing device in the form of a computer 501. The components of the computer 501 can comprise, but are not limited to, one or more processors 503, a system memory 512, and a system bus 513 that couples various system components including the one or more processors 503 to the system memory 512. The system can utilize parallel computing.
  • The system bus 513 represents one or more of several possible types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, or a local bus using any of a variety of bus architectures. By way of example, such architectures can comprise an Industry Standard Architecture (ISA) bus, a Micro Channel Architecture (MCA) bus, an Enhanced ISA (EISA) bus, a Video Electronics Standards Association (VESA) local bus, an Accelerated Graphics Port (AGP) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express bus, a Personal Computer Memory Card Industry Association (PCMCIA) bus, a Universal Serial Bus (USB), and the like. The bus 513 and all buses specified in this description can also be implemented over a wired or wireless network connection and each of the subsystems, including the one or more processors 503, a mass storage device 504, an operating system 505, projection software 506, projection data 507, a network adapter 508, the system memory 512, an Input/Output Interface 510, a display adapter 509, a display device 511, and a human machine interface 502, can be contained within one or more remote computing devices 514 a,b,c at physically separate locations, connected through buses of this form, in effect implementing a fully distributed system. As an example, the projectors 202 a, 202 b, and 302 of FIGS. 2A, 2B, 2C, and 3, respectively, can be a display device 511.
  • The computer 501 typically comprises a variety of computer readable media. Exemplary readable media can be any available media that is accessible by the computer 501 and comprises, for example and not meant to be limiting, both volatile and non-volatile media, and removable and non-removable media. The system memory 512 comprises computer readable media in the form of volatile memory, such as random access memory (RAM), and/or non-volatile memory, such as read only memory (ROM). The system memory 512 typically contains data such as the projection data 507 and/or program modules such as the operating system 505 and the projection software 506 that are immediately accessible to and/or are presently operated on by the one or more processors 503.
  • In another aspect, the computer 501 can also comprise other removable/non-removable, volatile/non-volatile computer storage media. By way of example, FIG. 5 illustrates the mass storage device 504 which can provide non-volatile storage of computer code, computer readable instructions, data structures, program modules, and other data for the computer 501. For example and not meant to be limiting, the mass storage device 504 can be a hard disk, a removable magnetic disk, a removable optical disk, magnetic cassettes or other magnetic storage devices, flash memory cards, CD-ROM, digital versatile disks (DVD) or other optical storage, random access memories (RAM), read only memories (ROM), electrically erasable programmable read-only memory (EEPROM), and the like.
  • Optionally, any number of program modules can be stored on the mass storage device 504, including by way of example, the operating system 505 and the projection software 506. Each of the operating system 505 and the projection software 506 (or some combination thereof) can comprise elements of the programming and the projection data 507. The projection data 507 can also be stored on the mass storage device 504, in any of one or more databases known in the art. Examples of such databases comprise DB2®, Microsoft® Access, Microsoft® SQL Server, Oracle®, MySQL, PostgreSQL, and the like. The databases can be centralized or distributed across multiple systems.
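As a purely illustrative sketch, the projection data 507 could be kept in an embedded SQLite database. The patent names example database products but specifies no schema, so the table name, columns, and media path below are hypothetical:

```python
import sqlite3

# Hypothetical schema for the projection data 507; the patent does not
# define tables or columns, so everything here is illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute(
    """CREATE TABLE projection_data (
           id INTEGER PRIMARY KEY,
           content_path TEXT NOT NULL,       -- location of the still image or video
           content_type TEXT NOT NULL,       -- 'still' or 'video'
           display_seconds REAL DEFAULT 0.0  -- accrued display time
       )"""
)
conn.execute(
    "INSERT INTO projection_data (content_path, content_type) VALUES (?, ?)",
    ("/media/ad_001.mp4", "video"),  # hypothetical media path
)
conn.commit()

# Fetch the stored record back for the projector to display.
row = conn.execute(
    "SELECT content_path, content_type FROM projection_data WHERE id = 1"
).fetchone()
print(row)  # ('/media/ad_001.mp4', 'video')
```

An embedded database like this could equally be swapped for any of the centralized or distributed databases named above.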
  • In another aspect, the user can enter commands and information into the computer 501 via an input device (not shown). Examples of such input devices comprise, but are not limited to, a keyboard, a pointing device (e.g., a “mouse”), a microphone, a joystick, a scanner, and tactile input devices such as gloves and other body coverings. These and other input devices can be connected to the one or more processors 503 via the human machine interface 502 that is coupled to the system bus 513, but can be connected by other interface and bus structures, such as a parallel port, a game port, an IEEE 1394 port (also known as a FireWire port), a serial port, or a universal serial bus (USB).
  • In yet another aspect, the display device 511 can also be connected to the system bus 513 via an interface, such as the display adapter 509. It is contemplated that the computer 501 can have more than one display adapter 509 and the computer 501 can have more than one display device 511. For example, the display device 511 can be a monitor, an LCD (Liquid Crystal Display), or a projector. For example, the display device 511 can include an LCD display shaped to fit within a frame of a vehicle window. In such an aspect, video content can be displayed according to the aspects described above, but using the LCD display in place of a projector and semitransparent insert. In addition to the display device 511, other output peripheral devices can comprise components such as speakers (not shown) and a printer (not shown) which can be connected to the computer 501 via the Input/Output Interface 510. Any step and/or result of the methods can be output in any form to an output device. Such output can be any form of visual representation, including, but not limited to, textual, graphical, animation, audio, tactile, and the like. The display device 511 and computer 501 can be part of one device, or separate devices.
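The motion-based choice between a still image and video content referenced above (and recited in the claims) can be sketched as a small decision function. The function name and the string return values are illustrative, not part of the disclosure:

```python
def select_content(vehicle_stopped: bool, other_vehicle_in_motion: bool) -> str:
    """Illustrative decision rule for what the projector shows.

    Mirrors the claimed behavior: video content plays only while the
    vehicle is stopped and no other vehicle is detected in motion;
    otherwise a still image is displayed.
    """
    if vehicle_stopped and not other_vehicle_in_motion:
        return "video"  # stopped with no moving traffic detected: play video
    return "still"      # in motion, or stopped with traffic moving: still image

print(select_content(True, False))   # video
print(select_content(True, True))    # still
print(select_content(False, False))  # still
```

The same rule applies whether the output device is a projector with a semitransparent insert or the LCD display described in this aspect.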
  • The computer 501 can operate in a networked environment using logical connections to one or more remote computing devices 514 a,b,c. By way of example, a remote computing device can be a personal computer, portable computer, smartphone, a server, a router, a network computer, a peer device or other common network node, and so on. Logical connections between the computer 501 and a remote computing device 514 a,b,c can be made via a network 515, such as a local area network (LAN) and/or a general wide area network (WAN). Such network connections can be through the network adapter 508. The network adapter 508 can be implemented in both wired and wireless environments. Such networking environments are conventional and commonplace in dwellings, offices, enterprise-wide computer networks, intranets, and the Internet.
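For illustration only, a logical connection between the computer 501 and a remote computing device 514 (for example, a mobile device supplying an overlay) could be as simple as a TCP socket. The patent specifies no wire protocol, so the loopback transport and text payload here are assumptions:

```python
import socket
import threading

def serve_once(server: socket.socket, received: list) -> None:
    """Accept one connection and record its text payload."""
    conn, _ = server.accept()
    with conn:
        received.append(conn.recv(1024).decode("utf-8"))

# "Computer 501" side: listen on an OS-assigned loopback port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
received: list = []
t = threading.Thread(target=serve_once, args=(server, received))
t.start()

# "Remote device 514" side: connect and send a hypothetical overlay message.
with socket.create_connection(server.getsockname()) as client:
    client.sendall(b"SALE TODAY")

t.join()
server.close()
print(received[0])  # SALE TODAY
```

In practice the network adapter 508 could carry this traffic over a wired or wireless LAN or WAN exactly as described above.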
  • For purposes of illustration, application programs and other executable program components such as the operating system 505 are illustrated herein as discrete blocks, although it is recognized that such programs and components reside at various times in different storage components of the computing device 501, and are executed by the one or more processors 503 of the computer. An implementation of the projection software 506 can be stored on or transmitted across some form of computer readable media. Any of the disclosed methods can be performed by computer readable instructions embodied on computer readable media. Computer readable media can be any available media that can be accessed by a computer. By way of example and not meant to be limiting, computer readable media can comprise “computer storage media” and “communications media.” “Computer storage media” comprise volatile and non-volatile, removable and non-removable media implemented in any methods or technology for storage of information such as computer readable instructions, data structures, program modules, or other data. Exemplary computer storage media comprises, but is not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store the desired information and which can be accessed by a computer.
  • The following examples are put forth so as to provide those of ordinary skill in the art with a complete disclosure and description of how the compounds, compositions, articles, devices and/or methods claimed herein are made and evaluated, and are intended to be purely exemplary and are not intended to limit the scope of the methods and systems. Efforts have been made to ensure accuracy with respect to numbers (e.g., amounts, temperature, etc.), but some errors and deviations should be accounted for. Unless indicated otherwise, parts are parts by weight, temperature is in ° C. or is at ambient temperature, and pressure is at or near atmospheric.
  • The methods and systems can employ Artificial Intelligence techniques such as machine learning and iterative learning. Examples of such techniques include, but are not limited to, expert systems, case based reasoning, Bayesian networks, behavior based AI, neural networks, fuzzy systems, evolutionary computation (e.g. genetic algorithms), swarm intelligence (e.g. ant algorithms), and hybrid intelligent systems (e.g. Expert inference rules generated through a neural network or production rules from statistical learning).
  • While the methods and systems have been described in connection with preferred embodiments and specific examples, it is not intended that the scope be limited to the particular embodiments set forth, as the embodiments herein are intended in all respects to be illustrative rather than restrictive.
  • Unless otherwise expressly stated, it is in no way intended that any method set forth herein be construed as requiring that its steps be performed in a specific order. Accordingly, where a method claim does not actually recite an order to be followed by its steps or it is not otherwise specifically stated in the claims or descriptions that the steps are to be limited to a specific order, it is in no way intended that an order be inferred, in any respect. This holds for any possible non-express basis for interpretation, including: matters of logic with respect to arrangement of steps or operational flow; plain meaning derived from grammatical organization or punctuation; the number or type of embodiments described in the specification.
  • Throughout this application, various publications are referenced. The disclosures of these publications in their entireties are hereby incorporated by reference into this application in order to more fully describe the state of the art to which the methods and systems pertain.
  • It will be apparent to those skilled in the art that various modifications and variations can be made without departing from the scope or spirit. Other embodiments will be apparent to those skilled in the art from consideration of the specification and practice disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit being indicated by the following claims.

Claims (21)

1. An apparatus comprising:
a perforated semitransparent insert shaped for insertion into a frame of a window of a vehicle; and
a projector configured to:
receive an indication of the motion of the vehicle from a speedometer communicatively coupled to the projector;
project visual content onto the perforated semitransparent insert based on the indication of the motion by:
displaying a still image in response to the vehicle being stopped and detecting another vehicle in motion; and
displaying video content in response to the vehicle being stopped and not detecting another vehicle in motion.
2. The apparatus of claim 1, wherein the visual content comprises video content.
3. (canceled)
4. The apparatus of claim 1, wherein the projector is configured to display a still image in response to the vehicle being in motion.
5. (canceled)
6. (canceled)
7. (canceled)
8. The apparatus of claim 1, wherein the perforated semitransparent insert comprises plexiglass.
9. The apparatus of claim 1, wherein the perforated semitransparent insert is unadhered to the window of the vehicle.
10. The apparatus of claim 1, wherein the perforated semitransparent insert is adhered to the window of the vehicle.
11. An apparatus comprising:
a semitransparent insert shaped for insertion into a frame of a window of a vehicle;
a projector configured to project visual content onto the semitransparent insert; and
a global positioning system radio communicatively coupled to the projector, wherein the projector is configured to:
determine the visual content projected onto the semitransparent insert based on an indication of movement of the vehicle received from the global positioning system radio by:
displaying a still image in response to the vehicle being stopped and detecting another vehicle in motion; and
displaying video content in response to the vehicle being stopped and not detecting another vehicle in motion.
12. The apparatus of claim 11, wherein the projector is configured to dim a projection of the visual content in response to a movement of the vehicle.
13. The apparatus of claim 11, wherein the projector is configured to determine the visual content projected onto the semitransparent insert further based on a braking status of the vehicle.
14. The apparatus of claim 13, wherein the projector is configured to dim a projection of the visual content in response to a braking of the vehicle.
15. The apparatus of claim 13, wherein the projector is configured to display an overlay on the visual content in response to a braking of the vehicle.
16. The apparatus of claim 13, further comprising a mobile device in communication with the projector.
17. The apparatus of claim 16, wherein the projector is configured to display, on the visual content, an overlay received from the mobile device.
18. The apparatus of claim 16, wherein the projector is configured to receive the visual content from the mobile device.
19. The apparatus of claim 16, wherein the mobile device is configured to monitor an amount of time the visual content is displayed.
20. The apparatus of claim 19, wherein the mobile device is configured to track an incentive accrued based on the amount of time the visual content is displayed.
21. The apparatus of claim 17, wherein the projector is configured to receive the visual content from the mobile device.
US15/366,476 2016-12-01 2016-12-01 Vehicle projection system Active US9987978B1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/366,476 US9987978B1 (en) 2016-12-01 2016-12-01 Vehicle projection system


Publications (2)

Publication Number Publication Date
US9987978B1 US9987978B1 (en) 2018-06-05
US20180154826A1 (en) 2018-06-07

Family

ID=62235107

Family Applications (1)

Application Number Title Priority Date Filing Date
US15/366,476 Active US9987978B1 (en) 2016-12-01 2016-12-01 Vehicle projection system

Country Status (1)

Country Link
US (1) US9987978B1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109823261A (en) * 2019-03-06 2019-05-31 李良杰 Safety reminding device after vehicle
CN110796468A (en) * 2018-08-03 2020-02-14 丰田自动车株式会社 Information processing apparatus, information processing method, and non-transitory storage medium
GB2595316A (en) * 2020-05-20 2021-11-24 Joseph Brooks Aaron Mobile marketing communication systems and methods

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6470335B2 (en) * 2017-03-15 2019-02-13 株式会社Subaru Vehicle display system and method for controlling vehicle display system
USD861793S1 (en) * 2017-09-01 2019-10-01 Loud & Clear Products, LLC Window cling
US11284051B2 (en) * 2018-07-30 2022-03-22 Pony Ai Inc. Systems and methods for autonomous vehicle interactive content presentation
CN110871684A (en) * 2018-09-04 2020-03-10 比亚迪股份有限公司 In-vehicle projection method, device, equipment and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5796350A (en) * 1996-03-13 1998-08-18 Toyota Jidosha Kabushiki Kaisha Automobile screen control apparatus
US7477140B1 (en) * 2003-12-26 2009-01-13 Booth Kenneth C See-through lighted information display
US20110018738A1 (en) * 2008-12-04 2011-01-27 Verizon Patent And Licensing, Inc. Motion controlled display
US20140368324A1 (en) * 2013-06-17 2014-12-18 Jerry A. SEIFERT Rear end collision prevention apparatus

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7154383B2 (en) * 2003-07-09 2006-12-26 Steven Earl Berquist Dynamic mobile advertising system





Legal Events

Date Code Title Description
STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: SURCHARGE FOR LATE PAYMENT, MICRO ENTITY (ORIGINAL EVENT CODE: M3554); ENTITY STATUS OF PATENT OWNER: MICROENTITY

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, MICRO ENTITY (ORIGINAL EVENT CODE: M3551); ENTITY STATUS OF PATENT OWNER: MICROENTITY

Year of fee payment: 4