US20230302900A1 - Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object
- Publication number
- US20230302900A1 (application No. US17/702,093)
- Authority
- US
- United States
- Prior art keywords
- vehicle
- visually imperceptible
- notification symbol
- augmented reality
- windscreen
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/21—Output arrangements using visual output, e.g. blinking lights or matrix displays
- B60K35/23—Head-up displays [HUD]
- B60K35/233—Head-up displays [HUD] controlling the size or position in display areas of virtual images depending on the condition of the vehicle or the driver
- B60K35/235—Head-up displays [HUD] with means for detecting the driver's gaze direction or eye points
- B60K35/28—Output arrangements characterised by the type or purpose of the output information, e.g. for attracting the attention of the driver
- B60K35/285—Output arrangements for improving awareness by directing driver's gaze direction or eye points
- B60K35/81—Arrangements for controlling instruments for controlling displays
- B60K2360/166—Navigation; B60K2360/167—Vehicle dynamics information; B60K2360/168—Target or limit values; B60K2360/177—Augmented reality; B60K2360/178—Warnings; B60K2360/21—Optical features of instruments using cameras
- B60K2370/1529, B60K2370/166, B60K2370/167, B60K2370/168, B60K2370/177, B60K2370/178, B60K2370/21, B60K2370/52 (legacy indexing codes)
- G01S7/12—Plan-position indicators, i.e. P.P.I.; G01S7/20—Stereoscopic displays; Three-dimensional displays; Pseudo-three-dimensional displays; G01S7/22—Producing cursor lines and indicia by electronic means
- G01S13/865—Combination of radar systems with lidar systems; G01S13/931—Radar specially adapted for anti-collision purposes of land vehicles; G01S17/931—Lidar systems specially adapted for anti-collision purposes of land vehicles
- G01S2013/9323—Alternative operation using light waves; G01S2013/93271—Sensor installation details in the front of the vehicles
- G02B27/01—Head-up displays; G02B27/0101—Head-up displays characterised by optical features; G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera; G02B2027/014—Head-up displays comprising information/image processing systems; G02B2027/0183—Adaptation to parameters characterising the motion of the vehicle
- G06T11/001—Texturing; Colouring; Generation of texture or colour; G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06V10/60—Extraction of image or video features relating to illumination properties, e.g. using a reflectance or lighting model; G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle; G06V40/193—Eye characteristics: preprocessing and feature extraction
Description
- The present disclosure relates to an augmented reality head-up display for generating a notification symbol upon the windscreen of a vehicle.
- The notification symbol is overlaid at a position upon the windscreen where a visually imperceptible object would normally be visible.
- Augmented reality involves enhancing the real world with virtual elements that are shown in three-dimensional space and that permit real-time interaction with users.
- A head-up display projects information, such as vehicle speed and navigational instructions, directly onto the windscreen of a vehicle, within the driver's forward field of view. Accordingly, the head-up display provides drivers with information without requiring them to look away from the road.
- One possible implementation for augmented reality is an augmented reality head-up display (AR-HUD) for a vehicle. By overlaying images on the windscreen, AR-HUDs enhance a driver's view of the environment outside the vehicle, creating a greater sense of environmental awareness.
- However, while current augmented reality head-up displays achieve their intended purpose, there is a need in the art for an improved approach for providing information to vehicle occupants.
- According to several aspects, an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle is disclosed.
- The augmented reality head-up display system includes one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle, one or more image-capturing devices that capture image data of the environment surrounding the vehicle, and a graphic projection device for generating images upon the windscreen of the vehicle.
- The system also includes a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device, wherein the controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and the image data from the one or more image-capturing devices.
- The controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. Finally, the controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
- In another aspect, the controller executes instructions to determine a rate of approach towards the visually imperceptible object by the vehicle and to adjust at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
- In yet another aspect, the visual parameter is an overall size of the notification symbol, where the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and decreases as the vehicle travels away from the visually imperceptible object.
- In an aspect, the visual parameter is a color of the notification symbol.
- In another aspect, the controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
- In yet another aspect, the perception data includes one or more of the following: ambient lighting conditions, sun position, headlamp coverage, and weather input.
- In an aspect, the visually imperceptible object is identified based on driver vision capability.
- In another aspect, the one or more non-visual object detection sensors include one or more of the following: a radar, LiDAR, and one or more infrared sensors.
- In yet another aspect, the controller identifies the visually imperceptible object by determining a luminance contrast ratio between the plurality of detection points and the image data of the environment surrounding the vehicle.
- In an aspect, the controller instructs the graphic projection device of the augmented reality head-up display system to project cluster content information within a near-field image plane of the windscreen.
- In another aspect, information regarding the notification symbol is displayed within the near-field image plane.
- In yet another aspect, the controller instructs the graphic projection device to project the notification symbol within a far-field image plane of the windscreen.
- In an aspect, the notification symbol is one of the following: a caution symbol, a vehicle icon, an animal icon, and a pedestrian icon.
- In another aspect, the visually imperceptible object is one of the following: roadway signage, roadway markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, and road conditions that require attention.
- In an aspect, a method for displaying graphics upon a windscreen of a vehicle is disclosed. The method includes receiving, by a controller, a plurality of detection points that indicate a presence of an object from one or more non-visual object detection sensors and image data from one or more image-capturing devices. The method includes comparing the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the method includes determining a notification symbol that signifies the visually imperceptible object. Finally, the method includes instructing, by the controller, a graphic projection device to generate the notification symbol upon the windscreen of the vehicle, where the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
- In another aspect, the method includes determining a rate of approach towards the visually imperceptible object by the vehicle and adjusting at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
- In yet another aspect, the method includes receiving perception data indicative of human vision relative to camera vision, calculating a driver's field of view based on the perception data, and identifying the visually imperceptible object based on the perception data.
- In an aspect, an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle is disclosed.
- The augmented reality head-up display system includes one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle, one or more image-capturing devices that capture image data of the environment surrounding the vehicle, a graphic projection device for generating images upon the windscreen of the vehicle, and a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device.
- The controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors.
- The controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. The controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible. The controller determines a rate of approach towards the visually imperceptible object by the vehicle and adjusts at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
- In an aspect, the visual parameter is an overall size of the notification symbol, and the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and decreases as the vehicle travels away from the visually imperceptible object.
- In an aspect, the controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
- Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
- The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
- FIG. 1 is a schematic diagram of the disclosed augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, according to an exemplary embodiment;
- FIG. 2 illustrates an interior view of the windscreen with a notification symbol overlaid at a position on the windscreen where a visually imperceptible object is located, according to an exemplary embodiment;
- FIGS. 3A-3C illustrate the notification symbol shown in FIG. 2 increasing in overall size, according to an exemplary embodiment;
- FIGS. 4A-4C illustrate various types of notification symbols, according to an exemplary embodiment; and
- FIG. 5 is a process flow diagram illustrating a method for displaying graphics upon the windscreen of the vehicle by the augmented reality head-up display system, according to an exemplary embodiment.
- The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses.
- Referring to FIG. 1, an exemplary augmented reality head-up display system 10 for displaying graphics upon a windscreen 12 of a vehicle 14 is illustrated.
- The augmented reality head-up display system 10 includes one or more controllers 20 in electronic communication with one or more image-capturing devices 22, one or more non-visual object detection sensors 24, a graphic projection device 26, and an eye location system 28.
- The image-capturing devices 22 may be cameras that obtain periodic or sequential images.
- The one or more non-visual object detection sensors 24 are configured to detect a position, velocity, and direction of travel of objects in an environment 40 surrounding the vehicle 14.
- In the example as shown in FIG. 1, the one or more non-visual object detection sensors 24 include a radar 30, LiDAR 32, and one or more infrared sensors 34; however, it is to be appreciated that other sensors that employ non-visual techniques to detect the presence of objects may be used as well, and information from other sensors may be used as well.
- The graphic projection device 26 is configured to generate images upon the windscreen 12 of the vehicle 14 and includes a projection device for creating an excitation light for projecting images.
- The eye location system 28 includes one or more sensors for determining the location of the driver's head as well as the orientation or gaze location of the driver's eyes.
- It is to be appreciated that the vehicle 14 may be any type of vehicle such as, but not limited to, a sedan, truck, sport utility vehicle, van, or motor home. The vehicle 14 is not limited to an automobile and may be any other type of land vehicle, marine vehicle, or air vehicle. In an embodiment, the vehicle 14 is an autonomous or semi-autonomous vehicle; however, a manually driven vehicle may be used as well.
- The one or more controllers 20 may also be in electronic communication with a global positioning system (GPS) 41, one or more vehicle systems 42, one or more road databases 44, and one or more external networks 46.
- The one or more vehicle systems 42 include, but are not limited to, a driver monitoring system (DMS) and an automated driving system.
- The vehicle 14 may wirelessly connect to the one or more external networks 46. Some examples of external networks 46 include, but are not limited to, cellular networks, dedicated short-range communications (DSRC) networks, and vehicle-to-everything (V2X) networks.
- FIG. 2 is an exemplary interior view of the windscreen 12, where the environment 40 surrounding the vehicle 14 is visible through the windscreen 12.
- The images generated by the graphic projection device 26 are projected as light upon the windscreen 12, where the light is reflected off the windscreen 12 and is directed to a driver of the vehicle 14.
- As a result, the images generated by the graphic projection device 26 appear to be in front of the vehicle 14, beyond the windscreen 12, when viewed by the driver.
- The augmented reality head-up display system 10 identifies a visually imperceptible object located in the environment 40 surrounding the vehicle 14.
- The visually imperceptible object is any type of incident or object situated along a roadway 60 along which the vehicle 14 travels.
- Some examples of visually imperceptible objects include, but are not limited to, roadway signage and markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, or road conditions that require attention.
- Some examples of road conditions that require attention include, but are not limited to, icy or slippery road surfaces, potholes, and debris obstructing the roadway 60 . It is to be appreciated that the visually imperceptible object is not visible to a driver of the vehicle 14 because of low-visibility conditions that reduce the driver's ability to view objects on the roadway 60 .
- Some examples of low-visibility conditions include, but are not limited to, snow, rain, fog, nighttime conditions, and low-light conditions.
- In response to identifying the visually imperceptible object, the augmented reality head-up display system 10 determines a notification symbol 36 that signifies the visually imperceptible object.
- The notification symbol 36 is generated upon the windscreen 12 of the vehicle 14 and is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible to the driver. That is, absent the low-visibility conditions, the driver would be able to view the visually imperceptible object at the position 38. For example, the driver of the vehicle 14 would be able to view a roadway sign that is otherwise visually imperceptible.
- In the example shown in FIG. 2, the notification symbol 36 is a caution symbol; however, it is to be appreciated that other types of symbols may be used as well, as illustrated in FIGS. 4A-4C.
- The augmented reality head-up display system 10 determines a rate of approach towards the visually imperceptible object by the vehicle 14 and adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach towards the visually imperceptible object.
- In an embodiment, the visual parameter of the notification symbol 36 is an overall size, and as the vehicle 14 approaches the visually imperceptible object, the overall size of the notification symbol 36 increases.
- The windscreen 12 includes a first, near-field image plane 50 and a second, far-field image plane 52; however, it is to be appreciated that more than two image planes may be used as well.
- The controller 20 instructs the graphic projection device 26 of the augmented reality head-up display system 10 to project cluster content information 54 upon the windscreen 12 within the near-field image plane 50.
- The cluster content information 54 informs the driver of the vehicle 14 of driving conditions such as, but not limited to, vehicle speed, speed limit, gear position, fuel level, current position, and navigational instructions. In the example shown, the cluster content information 54 includes vehicle speed and navigational directions.
- The augmented reality head-up display system 10 also projects information regarding the notification symbol 36 upon the windscreen 12 within the near-field image plane 50. Examples of such information include a description of the visually imperceptible object (e.g., whether it is debris on the roadway 60, another vehicle, or a road sign) and the distance from the vehicle 14 to the visually imperceptible object.
- The controller 20 instructs the graphic projection device 26 to project the notification symbol 36 upon the windscreen 12 within the far-field image plane 52, where the notification symbol 36 is overlaid at the position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible.
- The far-field image plane 52 contains images overlaid upon the roadway 60 that is visible through the windscreen 12.
- In the embodiment shown, the far-field image plane 52 covers only a portion of the entire plane of the windscreen 12; however, it is to be appreciated that in another implementation the far-field image plane 52 may cover the entire plane of the windscreen 12 that is not occupied by the near-field image plane 50.
- Although FIG. 2 illustrates the far-field image plane 52 spanning only a portion of the lanes 62 that are part of the roadway 60, in embodiments the far-field image plane 52 spans each lane 62 of the roadway 60.
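- Placing the notification symbol 36 at the position 38 amounts to mapping the object's location in the environment 40 to a point on the windscreen 12. The disclosure does not spell out this mapping, so the sketch below illustrates one plausible approach: intersect the ray from the driver's eye (as reported by an eye location system) to the object with a planar approximation of the windscreen. The function names, the planar-windscreen assumption, and the numbers in the example are illustrative, not taken from the patent.

```python
import numpy as np

def overlay_position(eye_pos, object_pos, plane_point, plane_normal):
    """Intersect the eye-to-object ray with a planar windscreen model.

    eye_pos, object_pos: 3-D points in the vehicle frame (metres).
    plane_point, plane_normal: a point on the windscreen plane and its
    unit normal, also in the vehicle frame.
    Returns the 3-D point on the plane where the symbol should be drawn,
    or None when no valid intersection exists.
    """
    direction = object_pos - eye_pos
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-9:       # ray parallel to the windscreen plane
        return None
    t = float(np.dot(plane_normal, plane_point - eye_pos)) / denom
    if t <= 0.0:                # object behind the driver's eye
        return None
    return eye_pos + t * direction

# Hypothetical numbers: eye at the origin, object 40 m ahead and slightly
# left, windscreen plane 1 m ahead of the eye and raked back 30 degrees.
eye = np.array([0.0, 0.0, 0.0])
obj = np.array([40.0, -1.5, 0.5])
plane_pt = np.array([1.0, 0.0, 0.0])
plane_n = np.array([np.cos(np.radians(30.0)), 0.0, np.sin(np.radians(30.0))])
print(overlay_position(eye, obj, plane_pt, plane_n))
```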
- The notification symbol 36 includes any type of graphic image that provides an alert to direct the attention of the driver of the vehicle 14 towards the position 38 of the visually imperceptible object.
- In the example shown, the notification symbol 36 is a caution symbol; however, it is to be appreciated that other types of symbols may be used as well.
- In an embodiment, the notification symbol 36 includes a vehicle icon to alert the driver to a vehicle that is visually imperceptible.
- In another embodiment, the notification symbol 36 includes an animal icon to alert the driver to wildlife that is visually imperceptible.
- In yet another embodiment, the notification symbol 36 includes a pedestrian icon to alert the driver to a pedestrian that is visually imperceptible.
- The notification symbol 36 may also be animated to draw the driver's eye. For example, an alert may be transient to draw the driver's attention to a particular region in the visual field.
- The image-capturing devices 22 obtain image data of the environment 40 surrounding the vehicle 14.
- The one or more non-visual object detection sensors 24 obtain data in the form of detection points that indicate a presence of an object within the environment 40 of the vehicle 14.
- The controller 20 receives the plurality of detection points that indicate a presence of the object as well as the image data.
- The controller 20 compares the plurality of detection points with the image data to identify the visually imperceptible object.
- In an embodiment, the controller 20 identifies the visually imperceptible object based on the driver's vision capabilities.
- The driver's vision capabilities may be entered manually or, in the alternative, inferred based on age.
- In another embodiment, the controller 20 identifies the visually imperceptible object based on driver perception data received from the eye location system 28, where the driver perception data includes the location of the driver's head and the orientation or gaze location of the driver's eyes.
- The driver's eye and head positions are at a different location than the image-capturing devices 22; therefore, there may be areas in the environment 40 that the driver can view that are not captured by the image-capturing devices 22, and vice versa.
- The offset between the driver's eye and head position and the image-capturing devices 22 may be calculated to account for these differences.
- The controller 20 identifies the visually imperceptible object by first determining a luminance contrast ratio between the plurality of detection points and the image data of the environment 40, and then comparing the luminance contrast ratio with a contrast threshold ratio.
- The image data captured by the one or more image-capturing devices 22 includes data indicating both object luminance and background luminance, where the luminance contrast ratio is determined based on the object luminance and the background luminance.
- In response to determining that the luminance contrast ratio fails to satisfy the contrast threshold ratio, the controller 20 identifies the object being detected as the visually imperceptible object.
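- The disclosure does not give a formula for the luminance contrast ratio, so the sketch below assumes a simple object-to-background luminance ratio, sampled from the camera frame at the image locations of the detection points, together with a hypothetical contrast threshold. The threshold value, the sampling scheme, and all names are assumptions for illustration only.

```python
import numpy as np

CONTRAST_THRESHOLD = 1.2  # hypothetical: ratios below this flag the object

def mean_luminance(image, points, pad=2):
    """Mean luminance of small patches around the given (row, col) points.
    `image` is a single-channel luminance array derived from a camera frame."""
    samples = []
    for r, c in points:
        patch = image[max(r - pad, 0):r + pad + 1, max(c - pad, 0):c + pad + 1]
        samples.append(patch.mean())
    return float(np.mean(samples))

def is_visually_imperceptible(image, object_px, background_px):
    """Compare luminance at the projected detection points against the
    surrounding background; low contrast means the object is flagged."""
    obj = mean_luminance(image, object_px)
    bg = mean_luminance(image, background_px)
    ratio = max(obj, bg) / max(min(obj, bg), 1e-6)
    return ratio < CONTRAST_THRESHOLD

# Hypothetical frame: a faint object at (50, 80) on a uniform background.
frame = np.full((120, 160), 100.0)
frame[48:53, 78:83] = 110.0
print(is_visually_imperceptible(frame, [(50, 80)], [(50, 60)]))  # True
```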
- The controller 20 determines the notification symbol 36 (FIG. 2) that signifies the visually imperceptible object.
- The controller 20 then instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14.
- The notification symbol 36 is overlaid at the position 38 of the visually imperceptible object.
- The controller 20 calculates the type, size, shape, and color of the notification symbol 36 based on the driver's eye position, the location of the vehicle 14, and data regarding the environment 40 such as, but not limited to, data from an inclinometer, vehicle speed, roadway curvature, and steering angle.
- The controller 20 determines a rate of approach towards the visually imperceptible object by the vehicle 14 based on one or more inputs from the one or more image-capturing devices 22, the one or more non-visual object detection sensors 24, the one or more vehicle systems 42, the one or more road databases 44, and the one or more external networks 46.
- The controller 20 adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach towards the visually imperceptible object.
- In the embodiment shown in FIGS. 3A-3C, the visual parameter is an overall size of the notification symbol 36, where the overall size increases as the vehicle 14 travels towards the visually imperceptible object and decreases as the vehicle 14 travels away from it.
- In another embodiment, the visual parameter is a color of the notification symbol 36, where the color indicates when the vehicle 14 is too close to the visually imperceptible object. For example, the color of the notification symbol 36 may start as yellow, turn to orange as the vehicle 14 approaches the visually imperceptible object, and eventually turn to red.
- In yet another embodiment, the visual parameter is an animation of the notification symbol 36. Alternatively, the visual parameter may be size, where the size of the graphic increases or decreases to capture the attention of the driver.
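- One way to realize the size and color behaviour described above is to drive both parameters from the distance to the object and the rate of approach. The scaling law, pixel limits, and the time-to-reach bands below are illustrative assumptions, not values from the disclosure.

```python
def symbol_size(base_px, distance_m, reference_m=50.0, lo=16, hi=96):
    """Grow the symbol as the object gets closer and shrink it as the
    vehicle pulls away, clamped to sensible on-screen limits."""
    scale = reference_m / max(distance_m, 1.0)
    return int(min(max(base_px * scale, lo), hi))

def symbol_color(distance_m, rate_of_approach_mps):
    """Yellow -> orange -> red as the vehicle closes in; stay yellow when
    the object is receding (non-positive rate of approach)."""
    if rate_of_approach_mps <= 0.0:
        return "yellow"
    time_to_reach_s = distance_m / rate_of_approach_mps
    if time_to_reach_s < 2.0:
        return "red"
    if time_to_reach_s < 5.0:
        return "orange"
    return "yellow"

# Hypothetical: 30 m away, closing at 10 m/s -> 3 s to reach -> orange.
print(symbol_size(32, 30.0), symbol_color(30.0, 10.0))
```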
- The controller 20 receives perception data indicative of human vision relative to camera vision from the one or more image-capturing devices 22, the one or more non-visual object detection sensors 24, the one or more vehicle systems 42, the one or more road databases 44, and the one or more external networks 46.
- The perception data includes, but is not limited to, ambient lighting conditions, sun position, headlamp coverage, and weather input.
- The controller 20 calculates a driver's field of view based on the perception data and further identifies the visually imperceptible object based on the perception data.
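- The disclosure lists the perception inputs but not how they combine. Below is a hedged heuristic sketch that turns ambient light, headlamp coverage, and weather into an estimated driver visibility range, which could then gate whether an object detected at a given distance is classified as visually imperceptible. Every factor and threshold value is an assumption for illustration.

```python
def driver_visibility_range_m(ambient_lux, headlamp_range_m, weather):
    """Crude visibility estimate: daylight visual range by default, fall
    back to headlamp coverage at night, then derate for weather."""
    weather_factor = {"clear": 1.0, "rain": 0.7, "snow": 0.5, "fog": 0.3}
    factor = weather_factor.get(weather, 1.0)
    if ambient_lux > 1000.0:      # daylight
        base = 250.0
    elif ambient_lux > 10.0:      # dusk or street lighting
        base = 120.0
    else:                         # night: limited by the headlamps
        base = headlamp_range_m
    return base * factor

# A radar return at 80 m in night-time fog lies well beyond the estimated
# ~18 m of driver visibility, so that object would be flagged.
print(driver_visibility_range_m(0.5, 60.0, "fog"))
```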
- FIG. 5 is a process flow diagram illustrating a method 200 for displaying the notification symbol 36 upon the windscreen 12 of the vehicle 14 by the augmented reality head-up display system 10 .
- The method 200 may begin at block 202. In block 202, the controller 20 receives the plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors 24 and the image data from the one or more image-capturing devices 22. The method 200 may then proceed to block 204.
- In block 204, the controller 20 compares the plurality of detection points with the image data of the environment 40 surrounding the vehicle 14 to identify the visually imperceptible object. As mentioned above, the visually imperceptible object may be identified by determining a luminance contrast ratio between the plurality of detection points and the image data. The method 200 may then proceed to block 206.
- In block 206, the controller 20 determines the notification symbol 36 that signifies the visually imperceptible object. The method 200 may then proceed to block 208.
- In block 208, the controller 20 instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14, where the notification symbol 36 is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible. The method 200 may then terminate.
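- Pulling the method 200 together, one iteration might look like the skeleton below. Every helper on the hypothetical `controller` object (sensor access, fusion, the imperceptibility test, symbol selection, and the projection interface) is an assumed stand-in for functionality the disclosure describes only abstractly; this is a structural sketch, not the patented implementation.

```python
def method_200_step(controller):
    """One pass through blocks 202-208. The `controller` attributes are
    hypothetical stand-ins for the sensor and HUD interfaces."""
    # Block 202: receive detection points and camera image data.
    points = controller.non_visual_sensors.detection_points()
    image = controller.cameras.latest_frame()

    # Block 204: compare detection points with the image data to find
    # objects the sensors see but the driver likely cannot.
    candidates = controller.fuse(points, image)
    hidden = [obj for obj in candidates
              if controller.is_visually_imperceptible(obj, image)]

    # Blocks 206 and 208: pick a symbol for each object and overlay it at
    # the windscreen position where the object would normally be visible.
    for obj in hidden:
        symbol = controller.select_symbol(obj)   # caution, vehicle, animal, pedestrian
        position = controller.overlay_position(obj)
        controller.graphic_projection_device.draw(symbol, position)
```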
- The disclosed augmented reality head-up display provides various technical effects and benefits. Specifically, the disclosed augmented reality head-up display generates a notification symbol upon the windscreen of a vehicle to alert the driver of a visually imperceptible object. Therefore, the augmented reality head-up display provides enhanced situational awareness of roadway objects that are not evident to a driver during low-visibility conditions. Moreover, as the vehicle continues to travel towards the visually imperceptible object, the size and color of the notification symbol may change to assist the driver in determining if the visually imperceptible object is stationary, traveling towards the vehicle, or traveling away from the vehicle.
- The controllers may refer to, or be part of, an electronic circuit, a combinational logic circuit, a field-programmable gate array (FPGA), a processor (shared, dedicated, or group) that executes code, or a combination of some or all of the above, such as in a system-on-chip.
- The controllers may be microprocessor-based, such as a computer having at least one processor, memory (RAM and/or ROM), and associated input and output buses.
- The processor may operate under the control of an operating system that resides in memory.
- The operating system may manage computer resources so that computer program code embodied as one or more computer software applications, such as an application residing in memory, may have instructions executed by the processor.
- Alternatively, the processor may execute the application directly, in which case the operating system may be omitted.
Abstract
An augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle includes a controller in electronic communication with one or more non-visual object detection sensors, one or more image-capturing devices, and a graphic projection device. The controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors. The controller executes instructions to compare the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. The notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
Description
- The present disclosure relates to an augmented reality head-up display for generating a notification symbol upon the windscreen of a vehicle. The notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
- Augmented reality (AR) involves enhancing the real world with virtual elements that are shown in three-dimensional space and that permit real-time interaction with users. A head-up display (HUD) shows information such as, for example, vehicle speed and navigational instructions, directly onto a windscreen of a vehicle, within the driver's forward field of view. Accordingly, the head-up display provides drivers with information without looking away from the road. One possible implementation for augmented reality is an augmented reality head-up display (AR-HUD) for a vehicle. By overlaying images on the windscreen, AR-HUDs enhance a driver's view of the environment outside the vehicle, creating a greater sense of environmental awareness.
- However, while current augmented reality head-up displays achieve their intended purpose, there is a need in the art for an improved approach for providing information to vehicle occupants.
- According to several aspects, an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle is disclosed. The augmented reality head-up display system includes one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle, and one or more image-capturing devices that capture image data of the environment surrounding the vehicle, and a graphic projection device for generating images upon the windscreen of the vehicle. The system also includes a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device, wherein the controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and the image data from the one or more non-visual object detection sensors. The controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. Finally, the controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
- In another aspect, the controller executes instructions to determine a rate of approach towards the visually imperceptible object by the vehicle and adjusts at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
- In yet another aspect, the visual parameter is an overall size of the notification symbol, and where the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
- In an aspect, the visual parameter is a color of the notification symbol.
- In another aspect, the controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
- In yet another aspect, the perception data includes one or more of the following: ambient lighting conditions, sun position, headlamp coverage, and weather input.
- In an aspect, identifying the visually imperceptible object is determined based on driver vision capability.
- In another aspect, the one or more non-visual object detection sensors include one or more of the following: a radar, LiDAR, and one or more infrared sensors.
- In yet another aspect, the controller identifies the visually imperceptible object by determining a luminance contrast ratio between the plurality of detection points and the image data of the environment surrounding the vehicle.
- In an aspect, the controller instructs the graphic projection device of the augmented reality head-up display system to project cluster content information within a near-field image plane of the windscreen.
- In another aspect, information regarding the notification symbol displayed within the near-field image plane.
- In yet another aspect, the controller instructs the graphic projection device to project the notification symbol within a far-field image plane of the windscreen.
- In an aspect, the notification symbol is one of the following: a caution symbol, a vehicle icon, an animal icon, and a pedestrian icon.
- In another aspect, the visually imperceptible object is one of the following: roadway signage, roadway markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, and road conditions that require attention.
- In an aspect, a method for displaying graphics upon a windscreen of a vehicle. The method includes receiving, by a controller, a plurality of detection points that indicate a presence of an object from one or more non-visual object detection sensors and image data from one or more image-capturing devices. The method includes comparing the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the method includes determining a notification symbol that signifies the visually imperceptible object. Finally, the method includes instructing, by the controller, a graphic projection device to generate the notification symbol upon the windscreen of the vehicle, where the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
- In another aspect, the method includes determining a rate of approach towards the visually imperceptible object by the vehicle and adjusting at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
- In yet another aspect, the method includes receiving perception data indicative of human vision relative to camera vision, calculating a driver's field of view based on the perception data, and identifying the visually imperceptible object based on the perception data.
- In an aspect, an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle. The augmented reality head-up display system includes one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle, one or more image-capturing devices that capture image data of the environment surrounding the vehicle, a graphic projection device for generating images upon the windscreen of the vehicle, and a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device. The controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors. The controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. The controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible. The controller determines a rate of approach towards the visually imperceptible object by the vehicle and adjusts at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
- In an aspect, the visual parameter is an overall size of the notification symbol, and the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
- In an aspect, the controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
- Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
- The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
- FIG. 1 is a schematic diagram of the disclosed augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, according to an exemplary embodiment;
- FIG. 2 illustrates an interior view of the windscreen illustrating a notification symbol overlaid at a position on the windscreen where a visually imperceptible object is located, according to an exemplary embodiment;
- FIGS. 3A-3C illustrate the notification symbol shown in FIG. 2 increasing in overall size, according to an exemplary embodiment;
- FIGS. 4A-4C illustrate various types of notification symbols, according to an exemplary embodiment; and
- FIG. 5 is a process flow diagram illustrating a method for displaying graphics upon the windscreen of the vehicle by the augmented reality head-up display system, according to an exemplary embodiment.
- The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses.
- Referring to FIG. 1, an exemplary augmented reality head-up display system 10 for displaying graphics upon a windscreen 12 of a vehicle 14 is illustrated. The augmented reality head-up display system 10 includes one or more controllers 20 in electronic communication with one or more image-capturing devices 22, one or more non-visual object detection sensors 24, a graphic projection device 26, and an eye location system 28. The image-capturing devices 22 may be cameras that obtain periodic or sequential images. The one or more non-visual object detection sensors 24 are configured to detect a position, velocity, and direction of travel of objects in an environment 40 surrounding the vehicle 14. In the example as shown in FIG. 1, the one or more non-visual object detection sensors 24 include a radar 30, a LiDAR 32, and one or more infrared sensors 34; however, it is to be appreciated that other sensors that employ non-visual techniques to detect the presence of objects may be used as well. Moreover, information from other sensors may be used as well. The graphic projection device 26 is configured to generate images upon the windscreen 12 of the vehicle 14 and includes a projection device for creating an excitation light for projecting images. The eye location system 28 includes one or more sensors for determining the location of a head of the driver of the vehicle 14 as well as the orientation or gaze location of the driver's eyes. It is to be appreciated that the vehicle 14 may be any type of vehicle such as, but not limited to, a sedan, truck, sport utility vehicle, van, or motor home. It is also to be appreciated that the vehicle 14 is not limited to an automobile and may be any other type of land vehicle, marine vehicle, or air vehicle. In an embodiment, the vehicle 14 is an autonomous or semi-autonomous vehicle. However, it is to be appreciated that a manually driven vehicle may be used as well.
- The one or more controllers 20 may also be in electronic communication with a global positioning system (GPS) 41, one or more vehicle systems 42, one or more road databases 44, and one or more external networks 46. The one or more vehicle systems 42 include, but are not limited to, a driver monitoring system (DMS) and an automated driving system. The vehicle 14 may wirelessly connect to the one or more external networks 46. Some examples of external networks 46 include, but are not limited to, cellular networks, dedicated short-range communications (DSRC) networks, and vehicle-to-everything (V2X) networks.
- FIG. 2 is an exemplary interior view of the windscreen 12, where the environment 40 surrounding the vehicle 14 is visible through the windscreen 12. It is to be appreciated that the images generated by the graphic projection device 26 are projected as light upon the windscreen 12, where the light is reflected off the windscreen 12 and is directed to a driver of the vehicle 14. Thus, the images generated by the graphic projection device 26 appear to be in front of the vehicle 14, and beyond the windscreen 12, when viewed by a driver. Referring to both FIGS. 1 and 2, the augmented reality head-up display system 10 identifies a visually imperceptible object located in the environment 40 surrounding the vehicle 14. The visually imperceptible object is any type of incident or object situated along a roadway 60 that the vehicle 14 travels along. Some examples of visually imperceptible objects include, but are not limited to, roadway signage and markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, or road conditions that require attention. Some examples of road conditions that require attention include, but are not limited to, icy or slippery road surfaces, potholes, and debris obstructing the roadway 60. It is to be appreciated that the visually imperceptible object is not visible to a driver of the vehicle 14 because of low-visibility conditions that reduce the driver's ability to view objects on the roadway 60. Some examples of low-visibility conditions include, but are not limited to, snow, rain, fog, nighttime conditions, and low-light conditions.
- As explained below, in response to identifying the visually imperceptible object, the augmented reality head-up display system 10 determines a notification symbol 36 that signifies the visually imperceptible object. As seen in FIG. 2, the notification symbol 36 is generated upon the windscreen 12 of the vehicle 14 and is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible to the driver. In other words, if the low-visibility conditions were not present, then the driver would be able to view the visually imperceptible object. For example, if the fog were to clear, then the driver of the vehicle 14 would be able to view a roadway sign that is otherwise visually imperceptible. In the example as shown in FIG. 2, the notification symbol 36 is a caution symbol; however, it is to be appreciated that other types of symbols may be used as well and are illustrated in FIGS. 4A-4C. As also explained below, the augmented reality head-up display system 10 determines a rate of approach towards the visually imperceptible object by the vehicle 14 and adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach. For example, as seen in FIGS. 3A-3C, in an embodiment the visual parameter of the notification symbol 36 is an overall size, and as the vehicle 14 approaches the visually imperceptible object, the overall size of the notification symbol 36 increases.
- Referring to FIGS. 1 and 2, the windscreen 12 includes a first, near-field image plane 50 and a second, far-field image plane 52; however, it is to be appreciated that more than two image planes may be used as well. The controller 20 instructs the graphic projection device 26 of the augmented reality head-up display system 10 to project cluster content information 54 upon the windscreen 12 within the near-field image plane 50. The cluster content information 54 informs the driver of the vehicle 14 of driving conditions such as, but not limited to, vehicle speed, speed limit, gear position, fuel level, current position, and navigational instructions. In the example as shown in FIG. 2, the cluster content information 54 includes vehicle speed and navigational directions. In an embodiment, the augmented reality head-up display system 10 projects information regarding the notification symbol 36 upon the windscreen 12 within the near-field image plane 50. Some examples of information regarding the notification symbol 36 include a description of the visually imperceptible object (i.e., whether the visually imperceptible object is debris on the roadway 60, another vehicle, a road sign, etc.) and a distance to the visually imperceptible object from the vehicle 14.
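- For illustration only (this sketch is not part of the original disclosure), the two image planes may be thought of as regions of the windscreen that the graphic projection device 26 is permitted to draw into. A minimal Python sketch follows; the normalized plane bounds and the routing rule are assumptions of this description, not values from the disclosure.

```python
# Illustrative sketch only; plane bounds and content routing are assumptions.
from dataclasses import dataclass

@dataclass
class ImagePlane:
    """Normalized windscreen region [0..1] that projected images may occupy."""
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max

# Hypothetical layout: cluster content low on the windscreen, far-field above it.
NEAR_FIELD = ImagePlane("near-field", 0.30, 0.00, 0.70, 0.20)
FAR_FIELD = ImagePlane("far-field", 0.10, 0.25, 0.90, 0.80)

def route_content(kind: str) -> ImagePlane:
    # Cluster content (speed, navigation) goes near-field; overlays go far-field.
    return NEAR_FIELD if kind == "cluster" else FAR_FIELD
```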
- The controller 20 instructs the graphic projection device 26 to project the notification symbol 36 upon the windscreen 12 within the far-field image plane 52, where the notification symbol 36 is overlaid at the position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible. The far-field image plane 52 contains images overlaid upon the roadway 60 that is visible through the windscreen 12. In the embodiment as shown in FIG. 2, the far-field image plane 52 covers only a portion of the entire plane of the windscreen 12; however, it is to be appreciated that in another implementation the far-field image plane 52 may cover the entire plane of the windscreen 12 that is not occupied by the near-field image plane 50. Moreover, although FIG. 2 illustrates the far-field image plane 52 spanning across only a portion of the lanes 62 that are part of the roadway 60, in embodiments the far-field image plane 52 spans across each lane 62 of the roadway 60.
- The notification symbol 36 includes any type of graphic image that provides an alert to direct the attention of the driver of the vehicle 14 towards the position 38 of the visually imperceptible object. In the example as shown in FIG. 2, the notification symbol 36 is a caution symbol; however, it is to be appreciated that other types of symbols may be used as well. For example, as seen in FIG. 4A, the notification symbol 36 includes a vehicle icon to alert the driver to a vehicle that is visually imperceptible. In the embodiment as shown in FIG. 4B, the notification symbol 36 includes an animal icon to alert the driver to wildlife that is visually imperceptible. In the embodiment as shown in FIG. 4C, the notification symbol 36 includes a pedestrian icon to alert the driver to a pedestrian that is visually imperceptible. Other examples of symbols that may be used as the notification symbol 36 include, but are not limited to, traffic sign icons, slippery roadway icons, traffic lights, a stopped vehicle, pedestrians, animals, cross traffic, and the edge of the road. In embodiments, the notification symbol 36 may also be animated to draw the driver's eye. For example, an alert may be transient to draw the driver's attention to a particular region in the visual field.
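- By way of a non-authoritative example, the selection of a notification symbol from a detected object class might be expressed as a simple lookup. The class labels below are hypothetical; the disclosure names the icon types (FIGS. 2 and 4A-4C) but does not specify any classifier taxonomy.

```python
# Minimal sketch; the object-class labels are assumptions, not disclosure terms.
CAUTION = "caution_symbol"  # generic fallback, as in FIG. 2

SYMBOL_FOR_CLASS = {
    "vehicle": "vehicle_icon",        # FIG. 4A
    "animal": "animal_icon",          # FIG. 4B
    "pedestrian": "pedestrian_icon",  # FIG. 4C
    "traffic_sign": "traffic_sign_icon",
    "slippery_road": "slippery_roadway_icon",
}

def select_notification_symbol(object_class: str) -> str:
    # Fall back to the generic caution symbol for unrecognized classes.
    return SYMBOL_FOR_CLASS.get(object_class, CAUTION)
```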
- Referring back to FIG. 1, the image-capturing devices 22 obtain image data of the environment 40 surrounding the vehicle 14. The one or more non-visual object detection sensors 24 obtain data in the form of detection points that indicate a presence of an object within the environment 40 of the vehicle 14. The controller 20 receives the plurality of detection points that indicate the presence of the object and the image data. The controller 20 compares the plurality of detection points with the image data to identify the visually imperceptible object.
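- A minimal sketch of this comparison, assuming both sensing paths report object positions in a common vehicle frame, is shown below; the 2.0 meter association gate is an illustrative assumption, not a value from the disclosure.

```python
# Illustrative fusion sketch: flag non-visual detections that have no
# corresponding camera detection nearby.
import math

def visually_imperceptible(
    detection_points: list[tuple[float, float]],   # from radar/LiDAR/infrared
    camera_detections: list[tuple[float, float]],  # objects found in image data
    match_radius_m: float = 2.0,                   # assumed association gate
) -> list[tuple[float, float]]:
    """Return non-visual detections with no matching camera detection."""
    unmatched = []
    for px, py in detection_points:
        near_camera_hit = any(
            math.hypot(px - cx, py - cy) <= match_radius_m
            for cx, cy in camera_detections
        )
        if not near_camera_hit:
            unmatched.append((px, py))
    return unmatched
```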
- In embodiments, the controller 20 identifies the visually imperceptible object based on the driver's vision capabilities. The driver's vision capabilities are entered manually or, in the alternative, may be inferred based on age. In another embodiment, the controller 20 identifies the visually imperceptible object based on driver perception data received from the eye location system 28, where the driver perception data includes the location of the head of the driver and the orientation or gaze location of the driver's eyes. It is to be appreciated that the driver eye and head positions are at a different location than the image-capturing devices 22, and therefore there may be areas in the environment 40 that the driver may view that are not captured by the image-capturing devices 22, and vice versa. Furthermore, the offset between the driver eye position and the head position may be calculated as well.
- In an embodiment, the controller 20 identifies the visually imperceptible object by first determining a luminance contrast ratio between the plurality of detection points and the image data of the environment 40, and then comparing the luminance contrast ratio with a contrast threshold ratio. Specifically, the image data captured from the one or more image-capturing devices 22 includes data indicating both object luminance and background luminance, where the luminance contrast ratio is determined based on the object luminance and the background luminance. In response to determining that the luminance contrast ratio is greater than or equal to the contrast threshold ratio, the controller 20 identifies the object being detected as the visually imperceptible object.
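- The contrast test described above might be sketched as follows. The particular ratio definition (background luminance over object luminance) and the threshold value are assumptions for illustration; the disclosure specifies only that a luminance contrast ratio is compared against a contrast threshold ratio.

```python
# Sketch of the contrast test; ratio form and threshold are assumptions.
def luminance_contrast_ratio(object_luminance: float,
                             background_luminance: float) -> float:
    # Assumed form: a bright background washing out a dim object yields a
    # high ratio, consistent with the "greater than or equal" test above.
    eps = 1e-6  # guard against division by zero
    return background_luminance / max(object_luminance, eps)

def is_visually_imperceptible(obj_lum: float, bg_lum: float,
                              threshold_ratio: float = 4.0) -> bool:
    # Per the description, a ratio at or above the threshold flags the object.
    return luminance_contrast_ratio(obj_lum, bg_lum) >= threshold_ratio
```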
- In response to identifying the visually imperceptible object, the controller 20 determines the notification symbol 36 (FIG. 2) that signifies the visually imperceptible object. The controller 20 then instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14. As seen in FIG. 2, the notification symbol 36 is overlaid at the position 38 of the visually imperceptible object. It is to be appreciated that the controller 20 calculates a type of symbol, the size, the shape, and the color of the notification symbol 36 based on the driver eye position, a location of the vehicle 14, and data regarding the environment 40 such as, but not limited to, data from an inclinometer, vehicle speed, roadway curvature, and steering angle.
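- One way to place the overlay at the position 38 is to intersect the sight line from the driver eye position (available from the eye location system 28) through the object with the windscreen plane. The following is a minimal geometric sketch under that assumption; the plane parameters and frame conventions are illustrative, not taken from the disclosure.

```python
# Geometric sketch: find where the eye-to-object sight line crosses the
# windscreen so the symbol lands over the object. Plane placement is assumed.
import numpy as np

def overlay_position(
    eye_xyz: np.ndarray,      # driver eye position (vehicle frame, meters)
    object_xyz: np.ndarray,   # visually imperceptible object position
    plane_point: np.ndarray,  # any point on the windscreen plane
    plane_normal: np.ndarray, # windscreen plane normal (unit vector)
) -> np.ndarray:
    """Intersect the eye->object ray with the windscreen plane."""
    direction = object_xyz - eye_xyz
    denom = float(np.dot(plane_normal, direction))
    if abs(denom) < 1e-9:
        raise ValueError("sight line is parallel to the windscreen plane")
    t = float(np.dot(plane_normal, plane_point - eye_xyz)) / denom
    return eye_xyz + t * direction  # 3-D point on the windscreen
```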
- In embodiments, the controller 20 determines a rate of approach towards the visually imperceptible object by the vehicle 14 based on one or more inputs from the one or more image-capturing devices 22, the one or more non-visual object detection sensors 24, the one or more vehicle systems 42, the one or more road databases 44, and the one or more external networks 46. Referring to FIGS. 1 and 3A-3C, the controller 20 adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach towards the visually imperceptible object. In the embodiment as shown in FIGS. 3A-3C, the visual parameter is an overall size of the notification symbol 36, where the overall size of the notification symbol 36 increases as the vehicle 14 travels towards the visually imperceptible object and the overall size of the notification symbol 36 decreases as the vehicle 14 travels away from the visually imperceptible object. In another embodiment, the visual parameter is a color of the notification symbol 36, where the color indicates when the vehicle 14 is too close to the visually imperceptible object. For example, the color of the notification symbol 36 may start as yellow, turn to orange as the vehicle 14 approaches the visually imperceptible object, and eventually turn to red. In still another embodiment, the visual parameter is an animation of the notification symbol 36. For example, the size of the graphic may repeatedly increase and decrease to capture the attention of the driver.
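- A hedged sketch of one possible adjustment rule is shown below; the scaling constant and the time-to-reach breakpoints that drive the yellow-to-orange-to-red ramp are assumptions of this description, not values from the disclosure.

```python
# Sketch of one possible size/color adjustment; constants are illustrative.
def symbol_scale(base_size_px: float, distance_m: float,
                 reference_m: float = 50.0) -> float:
    # Grow the symbol as the object gets closer; shrink as it recedes.
    distance_m = max(distance_m, 1.0)
    return base_size_px * (reference_m / distance_m)

def symbol_color(distance_m: float, closing_speed_mps: float) -> str:
    # An assumed time-to-reach heuristic drives the yellow->orange->red ramp.
    if closing_speed_mps <= 0.0:
        return "yellow"  # not closing on the object
    time_to_reach_s = distance_m / closing_speed_mps
    if time_to_reach_s < 2.0:
        return "red"
    if time_to_reach_s < 5.0:
        return "orange"
    return "yellow"
```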
- Referring back to FIG. 1, the controller 20 receives perception data indicative of human vision relative to camera vision from the one or more image-capturing devices 22, the one or more non-visual object detection sensors 24, the one or more vehicle systems 42, the one or more road databases 44, and the one or more external networks 46. In embodiments, the perception data includes, but is not limited to, ambient lighting conditions, sun position, headlamp coverage, and weather input. The controller 20 calculates a driver's field of view based on the perception data, and further identifies the visually imperceptible object based on the perception data.
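- As a rough, illustrative sketch only, the perception data might be folded into an effective visibility distance for the driver; every factor and weight below is an assumption, since the disclosure does not specify how the driver's field of view is computed.

```python
# Very rough sketch: combine perception inputs into an effective visibility
# distance for the driver. All factors and defaults are assumptions.
def driver_visibility_m(
    base_visibility_m: float = 250.0,  # assumed clear-daylight baseline
    ambient_light: float = 1.0,        # 0 (dark) .. 1 (daylight)
    weather_factor: float = 1.0,       # 0 (dense fog) .. 1 (clear)
    headlamp_range_m: float = 100.0,   # dominates when ambient light is low
) -> float:
    daylight_estimate = base_visibility_m * ambient_light * weather_factor
    night_estimate = headlamp_range_m * weather_factor
    return max(daylight_estimate, night_estimate)

# An object sensed beyond this range is a candidate visually imperceptible
# object even if the camera can still resolve it.
```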
- FIG. 5 is a process flow diagram illustrating a method 200 for displaying the notification symbol 36 upon the windscreen 12 of the vehicle 14 by the augmented reality head-up display system 10. Referring to FIGS. 1, 2, and 5, the method 200 may begin at block 202. In block 202, the controller 20 receives the plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors 24 and the image data from the one or more image-capturing devices 22. The method 200 may then proceed to block 204.
- In block 204, the controller 20 compares the plurality of detection points with the image data of the environment 40 surrounding the vehicle 14 to identify the visually imperceptible object. As mentioned above, in an embodiment, the visually imperceptible object may be identified by determining a luminance contrast ratio between the plurality of detection points and the image data. The method 200 may then proceed to block 206.
- In block 206, in response to identifying the visually imperceptible object, the controller 20 determines the notification symbol 36 that signifies the visually imperceptible object. The method 200 may then proceed to block 208.
- In block 208, the controller 20 instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14. As seen in FIG. 2, the notification symbol 36 is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible. The method 200 may then terminate.
- Referring generally to the figures, the disclosed augmented reality head-up display provides various technical effects and benefits. Specifically, the disclosed augmented reality head-up display generates a notification symbol upon the windscreen of a vehicle to alert the driver of a visually imperceptible object. Therefore, the augmented reality head-up display provides enhanced situational awareness of roadway objects that are not evident to a driver during low-visibility conditions. Moreover, as the vehicle continues to travel towards the visually imperceptible object, the size and color of the notification symbol may change to assist the driver in determining whether the visually imperceptible object is stationary, traveling towards the vehicle, or traveling away from the vehicle.
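- To tie blocks 202 through 208 together, a minimal end-to-end sketch of the method 200 is shown below. The sensor, camera, and projector interfaces are hypothetical stand-ins for the hardware described above, and the helper functions are placeholders rather than elements of the disclosure.

```python
# End-to-end sketch of method 200 (blocks 202-208); all interfaces are
# hypothetical stand-ins, not APIs defined by the disclosure.
def run_method_200(sensors, camera, projector):
    # Block 202: receive detection points and image data.
    detection_points = sensors.read_detection_points()
    image_data = camera.read_image()

    # Block 204: compare to identify visually imperceptible objects.
    hidden = [p for p in detection_points if not camera.sees(image_data, p)]

    for obj in hidden:
        # Block 206: determine the notification symbol for the object.
        symbol = select_symbol(obj)
        # Block 208: overlay the symbol where the object would be visible.
        projector.draw(symbol, position=windscreen_position(obj))

def select_symbol(obj):
    return "caution_symbol"  # placeholder; see FIGS. 4A-4C for other icons

def windscreen_position(obj):
    return (0.5, 0.5)  # placeholder normalized windscreen coordinates
```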
- The controllers may refer to, or be part of, an electronic circuit, a combinational logic circuit, a field-programmable gate array (FPGA), a processor (shared, dedicated, or group) that executes code, or a combination of some or all of the above, such as in a system-on-chip. Additionally, the controllers may be microprocessor-based, such as a computer having at least one processor, memory (RAM and/or ROM), and associated input and output buses. The processor may operate under the control of an operating system that resides in memory. The operating system may manage computer resources so that computer program code embodied as one or more computer software applications, such as an application residing in memory, may have instructions executed by the processor. In an alternative embodiment, the processor may execute the application directly, in which case the operating system may be omitted.
- The description of the present disclosure is merely exemplary in nature and variations that do not depart from the gist of the present disclosure are intended to be within the scope of the present disclosure. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure.
Claims (20)
1. An augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, the augmented reality head-up display system comprising:
one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle;
one or more cameras that capture image data of the environment surrounding the vehicle;
a graphic projection device for generating images upon the windscreen of the vehicle, wherein the graphic projection device includes a projection device that creates an excitation light that creates the images projected upon the windscreen; and
a controller in electronic communication with the one or more non-visual object detection sensors, the one or more cameras, and the graphic projection device, wherein the controller executes instructions to:
receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and the image data from the one or more cameras;
compare the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment;
in response to identifying the visually imperceptible object, determine a notification symbol that signifies the visually imperceptible object; and
instruct the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
2. The augmented reality head-up display system of claim 1, wherein the controller executes instructions to:
determine a rate of approach towards the visually imperceptible object by the vehicle; and
adjust at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
3. The augmented reality head-up display system of claim 2, wherein the visual parameter is an overall size of the notification symbol, and wherein the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
4. The augmented reality head-up display system of claim 2, wherein the visual parameter is a color of the notification symbol.
5. The augmented reality head-up display system of claim 1, wherein the controller executes instructions to:
receive perception data indicative of human vision relative to camera vision;
calculate a driver's field of view based on the perception data; and
identify the visually imperceptible object based on the perception data.
6. The augmented reality head-up display system of claim 5, wherein the perception data includes one or more of the following: ambient lighting conditions, sun position, headlamp coverage, and weather input.
7. The augmented reality head-up display system of claim 1, wherein the visually imperceptible object is identified based on driver vision capability.
8. The augmented reality head-up display system of claim 1, wherein the one or more non-visual object detection sensors include one or more of the following: a radar, LiDAR, and one or more infrared sensors.
9. The augmented reality head-up display system of claim 1, wherein the controller identifies the visually imperceptible object by determining a luminance contrast ratio between the plurality of detection points and the image data of the environment surrounding the vehicle.
10. The augmented reality head-up display system of claim 1, wherein the controller instructs the graphic projection device of the augmented reality head-up display system to project cluster content information within a near-field image plane of the windscreen.
11. The augmented reality head-up display system of claim 10, wherein information regarding the notification symbol is displayed within the near-field image plane.
12. The augmented reality head-up display system of claim 1, wherein the controller instructs the graphic projection device to project the notification symbol within a far-field image plane of the windscreen.
13. The augmented reality head-up display system of claim 1, wherein the notification symbol is one of the following: a caution symbol, a vehicle icon, an animal icon, and a pedestrian icon.
14. The augmented reality head-up display system of claim 1, wherein the visually imperceptible object is one of the following: roadway signage, roadway markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, and road conditions that require attention.
15. A method for displaying graphics upon a windscreen of a vehicle, the method comprising:
receiving, by a controller, a plurality of detection points that indicate a presence of an object from one or more non-visual object detection sensors and image data from one or more cameras;
comparing the plurality of detection points with the image data of an environment surrounding the vehicle to identify a visually imperceptible object located in the environment;
in response to identifying the visually imperceptible object, determining a notification symbol that signifies the visually imperceptible object; and
instructing, by the controller, a graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible, wherein the graphic projection device includes a projection device that creates an excitation light that creates the images projected upon the windscreen.
16. The method of claim 15, further comprising:
determining a rate of approach towards the visually imperceptible object by the vehicle; and
adjusting at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
17. The method of claim 15, further comprising:
receiving perception data indicative of human vision relative to camera vision;
calculating a driver's field of view based on the perception data; and
identifying the visually imperceptible object based on the perception data.
18. An augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, the augmented reality head-up display system comprising:
one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle;
one or more cameras that capture image data of the environment surrounding the vehicle;
a graphic projection device for generating images upon the windscreen of the vehicle, wherein the graphic projection device includes a projection device that creates an excitation light that creates the images projected upon the windscreen; and
a controller in electronic communication with the one or more non-visual object detection sensors, the one or more cameras, and the graphic projection device, wherein the controller executes instructions to:
receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors;
compare the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment;
in response to identifying the visually imperceptible object, determine a notification symbol that signifies the visually imperceptible object;
instruct the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible;
determine a rate of approach towards the visually imperceptible object by the vehicle; and
adjust at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
19. The augmented reality head-up display system of claim 18, wherein the visual parameter is an overall size of the notification symbol, and wherein the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
20. The augmented reality head-up display system of claim 18, wherein the controller executes instructions to:
receive perception data indicative of human vision relative to camera vision;
calculate a driver's field of view based on the perception data; and
identify the visually imperceptible object based on the perception data.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/702,093 US11766938B1 (en) | 2022-03-23 | 2022-03-23 | Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object |
DE102022127379.2A DE102022127379A1 (en) | 2022-03-23 | 2022-10-19 | AUGMENTED REALITY HEAD-UP DISPLAY FOR OVERLAYING A NOTIFICATION ICON ON A VISUALLY UNDERPERCEIVE OBJECT |
CN202211288780.5A CN116841042A (en) | 2022-03-23 | 2022-10-20 | Augmented reality head-up display with symbols superimposed on visually imperceptible objects |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/702,093 US11766938B1 (en) | 2022-03-23 | 2022-03-23 | Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object |
Publications (2)
Publication Number | Publication Date |
---|---|
US11766938B1 (en) | 2023-09-26 |
US20230302900A1 (en) | 2023-09-28 |
Family
ID=87930676
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/702,093 Active US11766938B1 (en) | 2022-03-23 | 2022-03-23 | Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object |
Country Status (3)
Country | Link |
---|---|
US (1) | US11766938B1 (en) |
CN (1) | CN116841042A (en) |
DE (1) | DE102022127379A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2022184350A (en) * | 2021-06-01 | 2022-12-13 | Mazda Motor Corporation | Head-up display device |
Citations (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5995903A (en) * | 1996-11-12 | 1999-11-30 | Smith; Eric L. | Method and system for assisting navigation using rendered terrain imagery |
US20070031006A1 (en) * | 2005-04-19 | 2007-02-08 | Valeo Vision | Method for detecting nocturnal fog and system for implementing the method |
US20070230800A1 (en) * | 2006-03-29 | 2007-10-04 | Denso Corporation | Visibility range measuring apparatus for vehicle and vehicle drive assist system |
US20080061950A1 (en) * | 2006-09-12 | 2008-03-13 | Denso Corporation | Apparatus for detecting the presence of fog for automotive vehicle |
US20090118909A1 (en) * | 2007-10-31 | 2009-05-07 | Valeo Vision | Process for detecting a phenomenon limiting the visibility for a motor vehicle |
US20100253540A1 (en) * | 2009-04-02 | 2010-10-07 | Gm Global Technology Operations, Inc. | Enhanced road vision on full windshield head-up display |
US7920250B2 (en) * | 2007-08-03 | 2011-04-05 | Valeo Vision | System for the detection by a motor vehicle of a phenomenon that interferes with visibility |
US20120001742A1 (en) * | 2010-07-05 | 2012-01-05 | Denso Corporation | Obstacle search system |
US20140093131A1 (en) * | 2012-10-01 | 2014-04-03 | Xerox Corporation | Visibility improvement in bad weather using enhanced reality |
US20150310313A1 (en) * | 2012-12-18 | 2015-10-29 | Mitsubishi Electric Corporation | Visibility estimation device, visibility estimation method, and safe driving support system |
US20150352952A1 (en) * | 2014-03-11 | 2015-12-10 | Cessna Aircraft Company | Adjustable Synthetic Vision |
US20160327402A1 (en) * | 2014-02-05 | 2016-11-10 | Panasonic Intellectual Property Management Co., Ltd. | Display apparatus for vehicle and display method of display apparatus for vehicle |
US20170192091A1 (en) * | 2016-01-06 | 2017-07-06 | Ford Global Technologies, Llc | System and method for augmented reality reduced visibility navigation |
US20170345321A1 (en) * | 2014-11-05 | 2017-11-30 | Sierra Nevada Corporation | Systems and methods for generating improved environmental displays for vehicles |
US20180129854A1 (en) * | 2016-11-09 | 2018-05-10 | Samsung Electronics Co., Ltd. | Method and apparatus for generating virtual driving lane for traveling vehicle |
US20180141563A1 (en) * | 2016-06-30 | 2018-05-24 | Faraday&Future Inc. | Classifying of weather situations using cameras on automobiles |
US20180201134A1 (en) * | 2017-01-17 | 2018-07-19 | Lg Electronics, Inc. | User interface apparatus for vehicle and vehicle |
US20180211121A1 (en) * | 2017-01-25 | 2018-07-26 | Ford Global Technologies, Llc | Detecting Vehicles In Low Light Conditions |
US20180328752A1 (en) * | 2017-05-09 | 2018-11-15 | Toyota Jidosha Kabushiki Kaisha | Augmented reality for vehicle lane guidance |
US20200324787A1 (en) * | 2018-10-25 | 2020-10-15 | Samsung Electronics Co., Ltd. | Augmented reality method and apparatus for driving assistance |
US10846833B2 (en) * | 2015-09-02 | 2020-11-24 | SMR Patents S.à.r.l. | System and method for visibility enhancement |
Also Published As
Publication number | Publication date |
---|---|
DE102022127379A1 (en) | 2023-09-28 |
CN116841042A (en) | 2023-10-03 |
US11766938B1 (en) | 2023-09-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10909765B2 (en) | Augmented reality system for vehicle blind spot prevention | |
US10168174B2 (en) | Augmented reality for vehicle lane guidance | |
US11996018B2 (en) | Display control device and display control program product | |
US11447067B2 (en) | Maintaining road safety when there is a disabled autonomous vehicle | |
US8514099B2 (en) | Vehicle threat identification on full windshield head-up display | |
US8543254B1 (en) | Vehicular imaging system and method for determining roadway width | |
WO2020125178A1 (en) | Vehicle driving prompting method and apparatus | |
CN113710530A (en) | Method for operating a driver information system in an autonomous vehicle and driver information system | |
WO2017221945A1 (en) | Display apparatus, display system, moving body, and display method | |
CN113498388A (en) | Method for operating a driver information system in a self-propelled vehicle and driver information system | |
US10946744B2 (en) | Vehicular projection control device and head-up display device | |
JP6748947B2 (en) | Image display device, moving body, image display method and program | |
CN112119398A (en) | Method and device for operating a camera-monitor system of a motor vehicle | |
CN112758013A (en) | Display device and display method for vehicle | |
US11697346B1 (en) | Lane position in augmented reality head-up display system | |
JP7255608B2 (en) | DISPLAY CONTROLLER, METHOD, AND COMPUTER PROGRAM | |
US11766938B1 (en) | Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object | |
US11663939B1 (en) | Augmented reality head-up display for generating a contextual graphic signifying a visually occluded object | |
US12030512B2 (en) | Collision warning system for a motor vehicle having an augmented reality head up display | |
KR102695051B1 (en) | Apparatus and method for generating traffic information including low visibility image improvement function | |
US20230406356A1 (en) | Fail-safe corrective actions based on vision information for autonomous vehicles | |
US11630302B1 (en) | Driving guidance system for a motor vehicle having an augmented reality head up display | |
US20240192313A1 (en) | Methods and systems for displaying information to an occupant of a vehicle | |
US20230347921A1 (en) | Vehicle display control system, computer-readable medium, vehicle display control method, and vehicle display control device | |
JP2022154208A (en) | Image processing device, image processing system and image processing method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WEISS, JOHN P.;SZCZERBA, JOSEPH F.;TSIMHONI, OMER;AND OTHERS;SIGNING DATES FROM 20220321 TO 20220323;REEL/FRAME:059378/0483 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |