US20230302900A1 - Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object - Google Patents


Info

Publication number
US20230302900A1
Authority
US
United States
Prior art keywords
vehicle
visually imperceptible
notification symbol
augmented reality
windscreen
Prior art date
Legal status
Granted
Application number
US17/702,093
Other versions
US11766938B1 (en)
Inventor
John P. Weiss
Joseph F. Szczerba
Omer Tsimhoni
Thomas A. Seder
Kai-Han Chang
Current Assignee
GM Global Technology Operations LLC
Original Assignee
GM Global Technology Operations LLC
Priority date
Filing date
Publication date
Application filed by GM Global Technology Operations LLC filed Critical GM Global Technology Operations LLC
Priority to US17/702,093 (granted as US11766938B1)
Assigned to GM Global Technology Operations LLC. Assignors: TSIMHONI, OMER; CHANG, KAI-HAN; SEDER, THOMAS A.; SZCZERBA, JOSEPH F.; WEISS, JOHN P.
Priority to DE102022127379.2A
Priority to CN202211288780.5A
Application granted
Publication of US11766938B1
Publication of US20230302900A1
Legal status: Active
Anticipated expiration


Classifications

    • B60K35/00 Instruments specially adapted for vehicles; arrangement of instruments in or on vehicles
    • B60K35/23 Head-up displays [HUD]
    • B60K35/233 Head-up displays controlling the size or position in display areas of virtual images depending on the condition of the vehicle or the driver
    • B60K35/235 Head-up displays with means for detecting the driver's gaze direction or eye points
    • B60K35/28 Output arrangements characterised by the type or purpose of the output information, e.g. for attracting the attention of the driver
    • B60K35/285 Output arrangements for improving awareness by directing the driver's gaze direction or eye points
    • B60K35/81 Arrangements for controlling instruments for controlling displays
    • B60K2360/166 Navigation; B60K2360/167 Vehicle dynamics information; B60K2360/168 Target or limit values; B60K2360/177 Augmented reality; B60K2360/178 Warnings; B60K2360/21 Optical features of instruments using cameras
    • B60K2370/1529; B60K2370/166; B60K2370/167; B60K2370/168; B60K2370/177; B60K2370/178; B60K2370/21; B60K2370/52
    • G01S7/12 Plan-position indicators, i.e. P.P.I.; G01S7/20 Stereoscopic, three-dimensional or pseudo-three-dimensional displays; G01S7/22 Producing cursor lines and indicia by electronic means
    • G01S13/865 Combination of radar systems with lidar systems; G01S13/931 Radar anti-collision systems for land vehicles; G01S17/931 Lidar anti-collision systems for land vehicles
    • G01S2013/9323 Alternative operation using light waves; G01S2013/93271 Sensor installation details in the front of the vehicle
    • G02B27/01 Head-up displays; G02B27/0101 Head-up displays characterised by optical features; G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0138 Head-up displays comprising image capture systems, e.g. camera; G02B2027/014 Head-up displays comprising information/image processing systems; G02B2027/0183 Adaptation to parameters characterising the motion of the vehicle
    • G06T3/40 Scaling of whole images or parts thereof; G06T11/001 Texturing; colouring; generation of texture or colour
    • G06V10/60 Extraction of image or video features relating to illumination properties; G06V20/56 Context of the image exterior to a vehicle using sensors mounted on the vehicle; G06V40/193 Eye-characteristic preprocessing and feature extraction

Definitions

  • The present disclosure relates to an augmented reality head-up display for generating a notification symbol upon the windscreen of a vehicle to signify a visually imperceptible object.
  • The notification symbol is overlaid at the position upon the windscreen where the visually imperceptible object would normally be visible.
  • Augmented reality involves enhancing the real world with virtual elements that are shown in three-dimensional space and that permit real-time interaction with users.
  • A head-up display shows information such as, for example, vehicle speed and navigational instructions directly upon the windscreen of a vehicle, within the driver's forward field of view. Accordingly, the head-up display provides drivers with information without requiring them to look away from the road.
  • One possible implementation for augmented reality is an augmented reality head-up display (AR-HUD) for a vehicle. By overlaying images on the windscreen, AR-HUDs enhance a driver's view of the environment outside the vehicle, creating a greater sense of environmental awareness.
  • Disclosed herein is an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle.
  • The augmented reality head-up display system includes one or more non-visual object detection sensors that detect objects in an environment surrounding the vehicle, one or more image-capturing devices that capture image data of the environment surrounding the vehicle, and a graphic projection device for generating images upon the windscreen of the vehicle.
  • The system also includes a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device, wherein the controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and the image data from the one or more image-capturing devices.
  • the controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. Finally, the controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
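The comparison step described above can be pictured as a simple cross-check: objects reported by the non-visual sensors but not confirmed in the camera-derived image data are flagged as visually imperceptible. The sketch below is purely illustrative; the class, function names, matching radius, and confidence threshold are assumptions, not details from the patent.

```python
from dataclasses import dataclass


@dataclass
class DetectionPoint:
    x: float           # longitudinal distance from the vehicle, metres
    y: float           # lateral offset, metres
    confidence: float  # sensor confidence, 0..1


def find_visually_imperceptible(radar_points, camera_objects, match_radius=2.0):
    """Return detection points with no camera-confirmed object nearby."""
    imperceptible = []
    for p in radar_points:
        # An object is "confirmed" if any camera detection lies within
        # match_radius of the non-visual detection point.
        confirmed = any(
            (p.x - c.x) ** 2 + (p.y - c.y) ** 2 <= match_radius ** 2
            for c in camera_objects
        )
        if not confirmed and p.confidence > 0.5:
            imperceptible.append(p)
    return imperceptible
```

In this toy model a pedestrian picked up by radar at 30 m with no matching camera detection would be flagged, while an object the camera also sees would not.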
  • the controller executes instructions to determine a rate of approach towards the visually imperceptible object by the vehicle and adjusts at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
  • the visual parameter is an overall size of the notification symbol, and where the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
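The size adjustment described above might be sketched as follows. The function name, gain, and pixel limits are illustrative assumptions; the patent specifies only that the symbol grows on approach and shrinks when moving away.

```python
def scaled_symbol_size(base_size_px, rate_of_approach_mps, gain=1.5,
                       min_px=16, max_px=128):
    """Scale the notification symbol with the rate of approach.

    rate_of_approach_mps > 0 means the vehicle is closing on the object,
    so the symbol grows; a negative rate shrinks it. The result is
    clamped to a displayable range.
    """
    size = base_size_px + gain * rate_of_approach_mps
    return max(min_px, min(max_px, size))
```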
  • the visual parameter is a color of the notification symbol.
  • The controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
  • the perception data includes one or more of the following: ambient lighting conditions, sun position, headlamp coverage, and weather input.
  • identifying the visually imperceptible object is determined based on driver vision capability.
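One hedged way to picture how such perception data could feed an estimate of driver vision capability is to degrade an assumed clear-weather sight distance by lighting and weather conditions. Every input, threshold, and coefficient below is an assumption for illustration only.

```python
def effective_sight_distance(base_m, ambient_lux, headlamp_range_m,
                             weather_factor):
    """Estimate how far a driver can plausibly see.

    base_m: assumed clear-daylight sight distance, metres.
    ambient_lux: ambient illuminance; below ~10 lux is treated as night.
    headlamp_range_m: usable headlamp coverage at night.
    weather_factor: (0, 1]; 1.0 = clear, smaller = rain/fog/snow.
    """
    if ambient_lux < 10:
        # At night the driver is limited by headlamp coverage.
        base_m = min(base_m, headlamp_range_m)
    return base_m * weather_factor
```

An object detected by radar beyond this distance, but within sensor range, would be a candidate visually imperceptible object under this toy model.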
  • the one or more non-visual object detection sensors include one or more of the following: a radar, LiDAR, and one or more infrared sensors.
  • the controller identifies the visually imperceptible object by determining a luminance contrast ratio between the plurality of detection points and the image data of the environment surrounding the vehicle.
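The patent does not give the contrast formula, so the sketch below borrows the common WCAG-style luminance contrast ratio, (L1 + 0.05) / (L2 + 0.05), as a stand-in; the imperceptibility threshold is likewise an assumption.

```python
def luminance_contrast_ratio(object_luminance, background_luminance):
    """WCAG-style contrast ratio between two relative luminances (0..1)."""
    lighter = max(object_luminance, background_luminance)
    darker = min(object_luminance, background_luminance)
    return (lighter + 0.05) / (darker + 0.05)


def is_visually_imperceptible(object_luminance, background_luminance,
                              threshold=1.2):
    """Flag an object whose contrast against its background is too low."""
    return luminance_contrast_ratio(object_luminance,
                                    background_luminance) < threshold
```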
  • the controller instructs the graphic projection device of the augmented reality head-up display system to project cluster content information within a near-field image plane of the windscreen.
  • the controller instructs the graphic projection device to project the notification symbol within a far-field image plane of the windscreen.
  • the notification symbol is one of the following: a caution symbol, a vehicle icon, an animal icon, and a pedestrian icon.
  • the visually imperceptible object is one of the following: roadway signage, roadway markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, and road conditions that require attention.
  • a method for displaying graphics upon a windscreen of a vehicle includes receiving, by a controller, a plurality of detection points that indicate a presence of an object from one or more non-visual object detection sensors and image data from one or more image-capturing devices. The method includes comparing the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the method includes determining a notification symbol that signifies the visually imperceptible object. Finally, the method includes instructing, by the controller, a graphic projection device to generate the notification symbol upon the windscreen of the vehicle, where the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
  • the method includes determining a rate of approach towards the visually imperceptible object by the vehicle and adjusting at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
  • the method includes receiving perception data indicative of human vision relative to camera vision, calculating a driver's field of view based on the perception data, and identifying the visually imperceptible object based on the perception data.
  • an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle.
  • the augmented reality head-up display system includes one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle, one or more image-capturing devices that capture image data of the environment surrounding the vehicle, a graphic projection device for generating images upon the windscreen of the vehicle, and a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device.
  • the controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors.
  • the controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. The controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible. The controller determines a rate of approach towards the visually imperceptible object by the vehicle and adjusts at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
  • the visual parameter is an overall size of the notification symbol, and the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
  • the controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
  • FIG. 1 is a schematic diagram of the disclosed augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, according to an exemplary embodiment
  • FIG. 2 illustrates an interior view of the windscreen illustrating a notification symbol overlaid at a position on the windscreen where a visually imperceptible object is located, according to an exemplary embodiment
  • FIGS. 3A-3C illustrate the notification symbol shown in FIG. 2 increasing in overall size, according to an exemplary embodiment
  • FIGS. 4A-4C illustrate various types of notification symbols, according to an exemplary embodiment
  • FIG. 5 is a process flow diagram illustrating a method for displaying graphics upon the windscreen of the vehicle by the augmented reality head-up display system, according to an exemplary embodiment.
  • Referring to FIG. 1, the augmented reality head-up display system 10 for displaying graphics upon a windscreen 12 of a vehicle 14 is illustrated.
  • the augmented reality head-up display system 10 includes one or more controllers 20 in electronic communication with one or more image-capturing devices 22 , one or more non-visual object detection sensors 24 , a graphic projection device 26 , and an eye location system 28 .
  • the image-capturing devices 22 may be cameras that obtain periodic or sequential images.
  • the one or more non-visual object detection sensors 24 are configured to detect a position, velocity, and direction of travel of objects in an environment 40 surrounding the vehicle 14.
  • in the example as shown in FIG. 1, the one or more non-visual object detection sensors 24 include a radar 30, LiDAR 32, and one or more infrared sensors 34, however, it is to be appreciated that other sensors that employ non-visual techniques to detect the presence of objects may be used as well. Moreover, information from other sensors may be used as well.
  • the graphic projection device 26 is configured to generate images upon the windscreen 12 of the vehicle 14 and includes a projection device for creating an excitation light for projecting images.
  • the eye location system 28 includes one or more sensors for determining the location of a head of the driver of the vehicle 14 as well as the orientation or gaze location of the driver's eyes.
  • the vehicle 14 may be any type of vehicle such as, but not limited to, a sedan, truck, sport utility vehicle, van, or motor home. It is also to be appreciated that the vehicle 14 is not limited to an automobile, and may be any other type of land vehicle, marine vehicle, or air vehicle. In an embodiment, the vehicle 14 is an autonomous or semi-autonomous vehicle. However, it is to be appreciated that a manually driven vehicle may be used as well.
  • the one or more controllers 20 may also be in electronic communication with a global positioning system (GPS) 41, one or more vehicle systems 42, one or more road databases 44, and one or more external networks 46.
  • the one or more vehicle systems 42 include, but are not limited to, a driver monitoring system (DMS) and an automated driving system.
  • the vehicle 14 may wirelessly connect to the one or more external networks 46 .
  • Some examples of external networks 46 include, but are not limited to, cellular networks, dedicated short-range communications (DSRC) networks, and vehicle-to-infrastructure (V2X) networks.
  • FIG. 2 is an exemplary interior view of the windscreen 12, where the environment 40 surrounding the vehicle 14 is visible through the windscreen 12.
  • the images generated by the graphic projection device 26 are projected as light upon the windscreen 12, where the light is reflected off the windscreen 12 and is directed to a driver of the vehicle 14.
  • the images generated by the graphic projection device 26 appear to be in front of the vehicle 14, and beyond the windscreen 12, when viewed by a driver.
  • the augmented reality head-up display system 10 identifies a visually imperceptible object located in the environment 40 surrounding the vehicle 14 .
  • the visually imperceptible object is any type of incident or object situated along a roadway 60 that the vehicle 14 travels along.
  • Some examples of visually imperceptible objects include, but are not limited to, roadway signage and markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, or road conditions that require attention.
  • Some examples of road conditions that require attention include, but are not limited to, icy or slippery road surfaces, potholes, and debris obstructing the roadway 60. It is to be appreciated that the visually imperceptible object is not visible to a driver of the vehicle 14 because of low-visibility conditions that reduce the driver's ability to view objects on the roadway 60.
  • Some examples of low-visibility conditions include, but are not limited to, snow, rain, fog, nighttime conditions, and low-light conditions.
  • the augmented reality head-up display system 10 determines a notification symbol 36 that signifies the visually imperceptible object.
  • the notification symbol 36 is generated upon the windscreen 12 of the vehicle 14 and is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible by the driver.
  • the driver would be able to view the visually imperceptible object.
  • the driver of the vehicle 14 would be able to view a roadway sign that is otherwise visually imperceptible.
  • FIG. 2 illustrates the notification symbol 36 that signifies the visually imperceptible object.
  • the notification symbol 36 is a caution symbol, however, it is to be appreciated that other types of symbols may be used as well and are illustrated in FIGS. 4A-4C.
  • the augmented reality head-up display system 10 determines a rate of approach towards the visually imperceptible object by the vehicle 14 and adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach towards the visually imperceptible object by the vehicle 14.
  • the visual parameter of the notification symbol 36 is an overall size, and as the vehicle 14 approaches the visually imperceptible object, the overall size of the notification symbol 36 increases.
  • the windscreen 12 includes a first, near-field image plane 50 and a second, far-field image plane 52, however, it is to be appreciated that more than two image planes may be used as well.
  • the controller 20 instructs the graphic projection device 26 of the augmented reality head-up display system 10 to project cluster content information 54 upon the windscreen 12 within the near-field image plane 50 .
  • the cluster content information 54 informs the driver of the vehicle 14 of driving conditions such as, but not limited to, vehicle speed, speed limit, gear position, fuel level, current position, and navigational instructions.
  • the cluster content information 54 includes vehicle speed and navigational directions.
  • the augmented reality head-up display system 10 projects information regarding the notification symbol 36 upon the windscreen 12 within the near-field image plane 50 .
  • Some examples of information regarding the notification symbol 36 include a description of the visually imperceptible object (i.e., whether the visually imperceptible object is debris on the roadway 60, another vehicle, a road sign, etc.) and a distance to the visually imperceptible object from the vehicle 14.
  • the controller 20 instructs the graphic projection device 26 to project the notification symbol 36 upon the windscreen 12 within the far-field image plane 52 , where the notification symbol 36 is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible.
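The overlay position 38 implies a mapping from the object's real-world location to a point on the windscreen 12. A minimal geometric sketch of that mapping, assuming the windscreen is approximated by a flat plane a fixed distance ahead of the driver's eye and the overlay point is where the eye-to-object sight line crosses that plane (the patent does not specify the projection geometry; all names are illustrative):

```python
def windscreen_overlay_position(eye, obj, plane_distance):
    """Project a world-space object onto an approximated windscreen plane.

    Hypothetical geometry: the windscreen is treated as a vertical plane a
    fixed distance ahead of the driver's eye, and the overlay point is the
    intersection of the eye-to-object sight line with that plane.
    Coordinates are (forward, lateral, vertical) in metres, eye-relative.
    """
    forward = obj[0] - eye[0]
    if forward <= plane_distance:
        raise ValueError("object must lie beyond the windscreen plane")
    t = plane_distance / forward  # similar-triangles scale factor
    lateral = eye[1] + t * (obj[1] - eye[1])
    vertical = eye[2] + t * (obj[2] - eye[2])
    return lateral, vertical
```

Because the result depends on the eye location, the same calculation explains why the eye location system 28 feeds into where the symbol is drawn.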
  • the far-field image plane 52 contains images overlaid upon the roadway 60 that is visible through the windscreen 12 .
  • the far-field image plane 52 only covers a portion of the entire plane of the windscreen 12, however, it is to be appreciated that in another implementation the far-field image plane 52 may cover the entire plane of the windscreen 12 that is not occupied by the near-field image plane 50.
  • While FIG. 2 illustrates the far-field image plane 52 spanning across only a portion of the lanes 62 that are part of the roadway 60, in embodiments the far-field image plane 52 spans across each lane 62 of the roadway 60.
  • the notification symbol 36 includes any type of graphic image that provides an alert to direct the attention of the driver of the vehicle 14 towards the position 38 of the visually imperceptible object.
  • the notification symbol 36 is a caution symbol, however, it is to be appreciated that other types of symbols may be used as well.
  • the notification symbol 36 includes a vehicle icon to alert the driver to a vehicle that is visually imperceptible.
  • the notification symbol 36 includes an animal icon to alert the driver to wildlife that are visually imperceptible.
  • the notification symbol 36 includes a pedestrian icon to alert the driver to a pedestrian that is visually imperceptible.
  • the notification symbol 36 may also be animated to draw the driver's eye. For example, an alert may be transient to draw the driver's attention to a particular region in the visual field.
  • the image-capturing devices 22 obtain image data of the environment 40 surrounding the vehicle 14 .
  • the one or more non-visual object detection sensors 24 obtain data in the form of detection points that indicate a presence of an object within the environment 40 of the vehicle 14 .
  • the controller 20 receives the plurality of detection points that indicate a presence of the object and the image data.
  • the controller 20 compares the plurality of detection points with the image data to identify the visually imperceptible object.
  • the controller 20 identifies the visually imperceptible object based on the driver's vision capabilities.
  • the driver's vision capabilities are entered manually or, in the alternative, may be inferred based on age.
  • the controller 20 identifies the visually imperceptible object based on driver perception data received from the eye location system 28 , where the driver perception data includes the location of a head of the driver and the orientation or gaze location of the driver's eyes.
  • the driver eye and head positions are at a different location than the image-capturing devices 22 , and therefore there may be areas in the environment 40 that the driver may view that are not captured by the image-capturing devices 22 , and vice versa.
  • the offset between the driver's eye and head position and the location of the image-capturing devices 22 may be calculated as well.
  • the controller 20 identifies the visually imperceptible object by first determining a luminance contrast ratio between the plurality of detection points and the image data of the environment 40, and then comparing the luminance contrast ratio with a contrast threshold ratio.
  • the image data captured from the one or more image-capturing devices 22 includes data indicating both object luminance and background luminance, where the luminance contrast ratio is determined based on the object and background luminance.
  • in response to the luminance contrast ratio falling below the contrast threshold ratio, the controller 20 identifies the object being detected as the visually imperceptible object.
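The contrast test described above can be made concrete with a short sketch. The brighter-over-darker ratio form, the 1.25 threshold, and the small additive offset are illustrative assumptions, not values from the patent:

```python
def is_visually_imperceptible(object_luminance, background_luminance,
                              contrast_threshold=1.25):
    """Flag a sensor-detected object the camera can barely distinguish.

    Hypothetical formulation: the luminance contrast ratio is the brighter
    luminance over the darker one (both in cd/m^2). A ratio below the
    threshold means the object does not stand out from its background, so
    a driver is unlikely to perceive it either.
    """
    lighter = max(object_luminance, background_luminance)
    darker = min(object_luminance, background_luminance)
    ratio = (lighter + 0.05) / (darker + 0.05)  # offset guards divide-by-zero
    return ratio < contrast_threshold
```

An object the radar 30 or LiDAR 32 reports, but for which this test returns true at the corresponding image region, would be a candidate for the notification symbol 36.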
  • the controller 20 determines the notification symbol 36 ( FIG. 2 ) that signifies the visually imperceptible object.
  • the controller 20 then instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14 .
  • the notification symbol 36 is overlaid at the position 38 of the visually imperceptible object.
  • the controller 20 calculates a type of symbol, the size, the shape, and the color of the notification symbol 36 based on the driver eye position, a location of the vehicle 14 , and data regarding the environment 40 such as, but not limited to, data from an inclinometer, vehicle speed, roadway curvature, and steering angle.
  • the controller 20 determines a rate of approach towards the visually imperceptible object by the vehicle 14 based on one or more inputs from the one or more image-capturing devices 22 , the one or more non-visual object detection sensors 24 , the one or more vehicle systems 42 , the one or more road databases 44 , and the one or more external networks 46 .
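The rate of approach itself can be sketched as a simple difference of successive range samples from the non-visual object detection sensors 24; the function name and sampling scheme are assumptions for illustration:

```python
def rate_of_approach(prev_distance_m, curr_distance_m, dt_s):
    """Closing speed toward a detected object from two range samples.

    Hypothetical calculation: differencing successive range readings
    (e.g. from radar 30 or LiDAR 32). A positive value means the vehicle
    14 is closing on the object; a negative value means it is pulling away.
    """
    if dt_s <= 0:
        raise ValueError("sample interval must be positive")
    return (prev_distance_m - curr_distance_m) / dt_s
```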
  • the controller 20 adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach towards the visually imperceptible object.
  • in the embodiment as shown in FIGS. 3A-3C, the visual parameter is an overall size of the notification symbol 36, where the overall size of the notification symbol 36 increases as the vehicle 14 travels towards the visually imperceptible object and decreases as the vehicle 14 travels away from the visually imperceptible object.
  • the visual parameter is a color of the notification symbol, where the color indicates when the vehicle 14 is too close to the visually imperceptible object.
  • the color of the notification symbol may start as yellow, turn to orange as the vehicle 14 approaches the visually imperceptible object, and eventually turn to red.
  • the visual parameter is an animation of the notification symbol 36 .
  • the visual parameter may be size, where the size of the graphic increases or decreases to capture the attention of the driver.
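The size and color adjustments described above could be combined in a single policy like the following. The inverse-distance scaling rule, the pixel bounds, and the distance bands for the yellow/orange/red escalation are all illustrative assumptions, not values from the patent:

```python
def adjust_notification_symbol(distance_m, rate_of_approach_mps,
                               base_size_px=32, max_size_px=96):
    """Scale and recolour a notification symbol as the gap closes.

    Hypothetical policy: size grows inversely with distance (clamped to
    [base, max]), and colour steps yellow -> orange -> red at fixed
    distance bands, mirroring the escalation described in the text. A
    negative rate of approach (vehicle pulling away) returns the symbol
    to its base size.
    """
    size = base_size_px * (50.0 / max(distance_m, 1.0))
    size = max(base_size_px, min(max_size_px, size))
    if rate_of_approach_mps < 0:  # travelling away: shrink back down
        size = base_size_px
    if distance_m < 15.0:
        color = "red"
    elif distance_m < 40.0:
        color = "orange"
    else:
        color = "yellow"
    return round(size), color
```

Calling this every frame with fresh range samples yields the growing/shrinking behaviour of FIGS. 3A-3C.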
  • the controller 20 receives perception data indicative of human vision relative to camera vision from the one or more image-capturing devices 22 , the one or more non-visual object detection sensors 24 , the one or more vehicle systems 42 , the one or more road databases 44 , and the one or more external networks 46 .
  • the perception data includes, but is not limited to, ambient lighting conditions, sun position, headlamp coverage, and weather input.
  • the controller 20 calculates a driver's field of view based on the perception data, and further identifies the visually imperceptible object based on the perception data.
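One way the perception data might feed the driver's field-of-view calculation is as a visual-range estimate against which sensor detections are screened. The multiplicative factor values below are illustrative placeholders, not calibrated figures from the patent:

```python
def estimate_driver_visual_range(base_range_m, ambient_lux,
                                 weather, headlamps_on):
    """Estimate how far a driver can see under current conditions.

    Hypothetical model: start from a clear-daylight visual range and apply
    multiplicative penalties for weather and low ambient light, partially
    offset by headlamp coverage.
    """
    weather_factor = {"clear": 1.0, "rain": 0.6, "snow": 0.4, "fog": 0.2}
    visual_range = base_range_m * weather_factor.get(weather, 1.0)
    if ambient_lux < 10.0:  # nighttime / low-light conditions
        visual_range *= 0.5 if headlamps_on else 0.2
    return visual_range
```

An object the radar 30 or LiDAR 32 detects beyond this estimated range, but within sensor range, would be a natural candidate for the visually imperceptible object.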
  • FIG. 5 is a process flow diagram illustrating a method 200 for displaying the notification symbol 36 upon the windscreen 12 of the vehicle 14 by the augmented reality head-up display system 10 .
  • the method 200 may begin at block 202 .
  • the controller 20 receives the plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors 24 and the image data from one or more image-capturing devices 22 .
  • the method 200 may then proceed to block 204 .
  • the controller 20 compares the plurality of detection points with the image data of the environment 40 surrounding the vehicle 14 to identify the visually imperceptible object.
  • the visually imperceptible object may be identified by determining a luminance contrast ratio between the plurality of detection points and the image data. The method 200 may then proceed to block 206.
  • the controller 20 determines the notification symbol 36 that signifies the visually imperceptible object. The method 200 may then proceed to block 208 .
  • the controller 20 instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14 .
  • the notification symbol 36 is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible. The method 200 may then terminate.
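The blocks of method 200 can be sketched end to end as a single pass. The three callables stand in for controller subroutines whose internals the patent leaves open; all names are illustrative:

```python
def run_method_200(detection_points, image_data,
                   identify, choose_symbol, project):
    """Sketch of method 200 (blocks 202-208) as one pass.

    `identify` fuses detection points with image data and returns the
    visually imperceptible object (or None), `choose_symbol` maps the
    object to a notification symbol, and `project` draws that symbol on
    the windscreen at the object's apparent position.
    """
    # Block 202: receive sensor detections and camera image data.
    # Block 204: fuse them to find an object the camera cannot resolve.
    obj = identify(detection_points, image_data)
    if obj is None:
        return None  # nothing imperceptible: no overlay is generated
    # Block 206: pick the symbol type that signifies the object.
    symbol = choose_symbol(obj)
    # Block 208: overlay the symbol at the object's windscreen position.
    project(symbol, obj["position"])
    return symbol
```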
  • the disclosed augmented reality head-up display provides various technical effects and benefits. Specifically, the disclosed augmented reality head-up display generates a notification symbol upon the windscreen of a vehicle to alert the driver of a visually imperceptible object. Therefore, the augmented reality head-up display provides enhanced situational awareness of roadway objects that are not evident to a driver during low-visibility conditions. Moreover, as the vehicle continues to travel towards the visually imperceptible object, the size and color of the notification symbol may change to assist the driver in determining if the visually imperceptible object is stationary, traveling towards the vehicle, or away from the vehicle.
  • the controllers may refer to, or be part of, an electronic circuit, a combinational logic circuit, a field programmable gate array (FPGA), a processor (shared, dedicated, or group) that executes code, or a combination of some or all of the above, such as in a system-on-chip.
  • the controllers may be microprocessor-based, such as a computer having at least one processor, memory (RAM and/or ROM), and associated input and output buses.
  • the processor may operate under the control of an operating system that resides in memory.
  • the operating system may manage computer resources so that computer program code embodied as one or more computer software applications, such as an application residing in memory, may have instructions executed by the processor.
  • the processor may execute the application directly, in which case the operating system may be omitted.

Abstract

An augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle includes a controller in electronic communication with one or more non-visual object detection sensors, one or more image-capturing devices, and a graphic projection device. The controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and image data of the environment surrounding the vehicle from the one or more image-capturing devices. The controller executes instructions to compare the plurality of detection points with the image data to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. The notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.

Description

    INTRODUCTION
  • The present disclosure relates to an augmented reality head-up display for generating a notification symbol upon the windscreen of a vehicle. The notification symbol is overlaid at a position upon the windscreen where a visually imperceptible object would normally be visible.
  • Augmented reality (AR) involves enhancing the real world with virtual elements that are shown in three-dimensional space and that permit real-time interaction with users. A head-up display (HUD) shows information such as, for example, vehicle speed and navigational instructions, directly onto a windscreen of a vehicle, within the driver's forward field of view. Accordingly, the head-up display provides drivers with information without looking away from the road. One possible implementation for augmented reality is an augmented reality head-up display (AR-HUD) for a vehicle. By overlaying images on the windscreen, AR-HUDs enhance a driver's view of the environment outside the vehicle, creating a greater sense of environmental awareness.
  • However, while current augmented reality head-up displays achieve their intended purpose, there is a need in the art for an improved approach for providing information to vehicle occupants.
  • SUMMARY
  • According to several aspects, an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle is disclosed. The augmented reality head-up display system includes one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle, one or more image-capturing devices that capture image data of the environment surrounding the vehicle, and a graphic projection device for generating images upon the windscreen of the vehicle. The system also includes a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device, wherein the controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and the image data from the one or more image-capturing devices. The controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. Finally, the controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
  • In another aspect, the controller executes instructions to determine a rate of approach towards the visually imperceptible object by the vehicle and adjusts at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
  • In yet another aspect, the visual parameter is an overall size of the notification symbol, wherein the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and decreases as the vehicle travels away from the visually imperceptible object.
  • In an aspect, the visual parameter is a color of the notification symbol.
  • In another aspect, the controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
  • In yet another aspect, the perception data includes one or more of the following: ambient lighting conditions, sun position, headlamp coverage, and weather input.
  • In an aspect, the visually imperceptible object is identified based on driver vision capability.
  • In another aspect, the one or more non-visual object detection sensors include one or more of the following: a radar, LiDAR, and one or more infrared sensors.
  • In yet another aspect, the controller identifies the visually imperceptible object by determining a luminance contrast ratio between the plurality of detection points and the image data of the environment surrounding the vehicle.
  • In an aspect, the controller instructs the graphic projection device of the augmented reality head-up display system to project cluster content information within a near-field image plane of the windscreen.
  • In another aspect, information regarding the notification symbol is displayed within the near-field image plane.
  • In yet another aspect, the controller instructs the graphic projection device to project the notification symbol within a far-field image plane of the windscreen.
  • In an aspect, the notification symbol is one of the following: a caution symbol, a vehicle icon, an animal icon, and a pedestrian icon.
  • In another aspect, the visually imperceptible object is one of the following: roadway signage, roadway markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, and road conditions that require attention.
  • In an aspect, a method for displaying graphics upon a windscreen of a vehicle is disclosed. The method includes receiving, by a controller, a plurality of detection points that indicate a presence of an object from one or more non-visual object detection sensors and image data from one or more image-capturing devices. The method includes comparing the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the method includes determining a notification symbol that signifies the visually imperceptible object. Finally, the method includes instructing, by the controller, a graphic projection device to generate the notification symbol upon the windscreen of the vehicle, where the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
  • In another aspect, the method includes determining a rate of approach towards the visually imperceptible object by the vehicle and adjusting at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
  • In yet another aspect, the method includes receiving perception data indicative of human vision relative to camera vision, calculating a driver's field of view based on the perception data, and identifying the visually imperceptible object based on the perception data.
  • In an aspect, an augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle is disclosed. The augmented reality head-up display system includes one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle, one or more image-capturing devices that capture image data of the environment surrounding the vehicle, a graphic projection device for generating images upon the windscreen of the vehicle, and a controller in electronic communication with the one or more non-visual object detection sensors, the one or more image-capturing devices, and the graphic projection device. The controller executes instructions to receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and the image data from the one or more image-capturing devices. The controller compares the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment. In response to identifying the visually imperceptible object, the controller determines a notification symbol that signifies the visually imperceptible object. The controller instructs the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible. The controller determines a rate of approach towards the visually imperceptible object by the vehicle and adjusts at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
  • In an aspect, the visual parameter is an overall size of the notification symbol, and the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
  • In an aspect, the controller executes instructions to receive perception data indicative of human vision relative to camera vision, calculate a driver's field of view based on the perception data, and identify the visually imperceptible object based on the perception data.
  • Further areas of applicability will become apparent from the description provided herein. It should be understood that the description and specific examples are intended for purposes of illustration only and are not intended to limit the scope of the present disclosure.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The drawings described herein are for illustration purposes only and are not intended to limit the scope of the present disclosure in any way.
  • FIG. 1 is a schematic diagram of the disclosed augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, according to an exemplary embodiment;
  • FIG. 2 illustrates an interior view of the windscreen illustrating a notification symbol overlaid at a position on the windscreen where a visually imperceptible object is located, according to an exemplary embodiment;
  • FIGS. 3A-3C illustrate the notification symbol shown in FIG. 2 increasing in overall size, according to an exemplary embodiment;
  • FIGS. 4A-4C illustrate various types of notification symbols, according to an exemplary embodiment; and
  • FIG. 5 is a process flow diagram illustrating a method for displaying graphics upon the windscreen of the vehicle by the augmented reality head-up display system, according to an exemplary embodiment.
  • DETAILED DESCRIPTION
  • The following description is merely exemplary in nature and is not intended to limit the present disclosure, application, or uses.
  • Referring to FIG. 1 , an exemplary augmented reality head-up display system 10 for displaying graphics upon a windscreen 12 of a vehicle 14 is illustrated. The augmented reality head-up display system 10 includes one or more controllers 20 in electronic communication with one or more image-capturing devices 22, one or more non-visual object detection sensors 24, a graphic projection device 26, and an eye location system 28. The image-capturing devices 22 may be cameras that obtain periodic or sequential images. The one or more non-visual object detection sensors 24 are configured to detect a position, velocity, and direction of travel of objects in an environment 40 surrounding the vehicle 14. In the example as shown in FIG. 1 , the one or more non-visual object detection sensors 24 include a radar 30, LiDAR 32, and one or more infrared sensors 34, however, it is to be appreciated that other sensors that employ non-visual techniques to detect the presence of objects may be used as well. Moreover, information from other sensors may be used as well. The graphic projection device 26 is configured to generate images upon the windscreen 12 of the vehicle 14 and includes a projection device for creating an excitation light for projecting images. The eye location system 28 includes one or more sensors for determining the location of a head of the driver of the vehicle 14 as well as the orientation or gaze location of the driver's eyes. It is to be appreciated that the vehicle 14 may be any type of vehicle such as, but not limited to, a sedan, truck, sport utility vehicle, van, or motor home. It is also to be appreciated that the vehicle 14 is not limited to an automobile, and may be any other type of land vehicle, marine vehicle, or air vehicle. In an embodiment, the vehicle 14 is an autonomous or semi-autonomous vehicle. However, it is to be appreciated that a manually driven vehicle may be used as well.
  • The one or more controllers 20 may also be in electronic communication with a global positioning system (GPS) 41, one or more vehicle systems 42, one or more road databases 44, and one or more external networks 46. The one or more vehicle systems 42 include, but are not limited to, a driver monitoring system (DMS) and an automated driving system. The vehicle 14 may wirelessly connect to the one or more external networks 46. Some examples of external networks 46 include, but are not limited to, cellular networks, dedicated short-range communications (DSRC) networks, and vehicle-to-infrastructure (V2X) networks.
  • FIG. 2 is an exemplary interior view of the windscreen 12, where the environment 40 surrounding the vehicle 14 is visible through the windscreen 12. It is to be appreciated that the images generated by the graphic projection device 26 are projected as light upon the windscreen 12, where the light is reflected off the windscreen 12 and is directed to a driver of the vehicle 14. Thus, the images generated by the graphic projection device 26 appear to be in front of the vehicle 14, and beyond the windscreen 12, when viewed by a driver. Referring to both FIGS. 1 and 2, the augmented reality head-up display system 10 identifies a visually imperceptible object located in the environment 40 surrounding the vehicle 14. The visually imperceptible object is any type of incident or object situated along a roadway 60 that the vehicle 14 travels along. Some examples of visually imperceptible objects include, but are not limited to, roadway signage and markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, or road conditions that require attention. Some examples of road conditions that require attention include, but are not limited to, icy or slippery road surfaces, potholes, and debris obstructing the roadway 60. It is to be appreciated that the visually imperceptible object is not visible to a driver of the vehicle 14 because of low-visibility conditions that reduce the driver's ability to view objects on the roadway 60. Some examples of low-visibility conditions include, but are not limited to, snow, rain, fog, nighttime conditions, and low-light conditions.
  • As explained below, in response to identifying the visually imperceptible object, the augmented reality head-up display system 10 determines a notification symbol 36 that signifies the visually imperceptible object. As seen in FIG. 2 , the notification symbol 36 is generated upon the windscreen 12 of the vehicle 14 and is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible by the driver. In other words, if the low-visibility conditions were not present, then the driver would be able to view the visually imperceptible object. For example, if the fog were to clear, then the driver of the vehicle 14 would be able to view a roadway sign that is otherwise visually imperceptible. In the example as shown in FIG. 2 , the notification symbol 36 is a caution symbol, however, it is to be appreciated that other types of symbols may be used as well and are illustrated in FIGS. 4A-4C. As also explained below, the augmented reality head-up display system 10 determines a rate of approach towards the visually imperceptible object by the vehicle 14 and adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach towards the visually imperceptible object. For example, as seen in FIGS. 3A-3C, in an embodiment the visual parameter of the notification symbol 36 is an overall size, and as the vehicle 14 approaches the visually imperceptible object, the overall size of the notification symbol 36 increases.
  • Referring to FIGS. 1 and 2 , the windscreen 12 includes a first, near-field image plane 50 and a second, far-field image plane 52, however, it is to be appreciated that more than two image planes may be used as well. The controller 20 instructs the graphic projection device 26 of the augmented reality head-up display system 10 to project cluster content information 54 upon the windscreen 12 within the near-field image plane 50. The cluster content information 54 informs the driver of the vehicle 14 of driving conditions such as, but not limited to, vehicle speed, speed limit, gear position, fuel level, current position, and navigational instructions. In the example as shown in FIG. 2 , the cluster content information 54 includes vehicle speed and navigational directions. In an embodiment, the augmented reality head-up display system 10 projects information regarding the notification symbol 36 upon the windscreen 12 within the near-field image plane 50. Some examples of information regarding the notification symbol 36 include a description of the visually imperceptible object (i.e., is the visually imperceptible object debris on the roadway 60, another vehicle, a road sign, etc.) and a distance to the visually imperceptible object from the vehicle 14.
  • The controller 20 instructs the graphic projection device 26 to project the notification symbol 36 upon the windscreen 12 within the far-field image plane 52, where the notification symbol 36 is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible. The far-field image plane 52 contains images overlaid upon the roadway 60 that is visible through the windscreen 12. In the embodiment as shown in FIG. 2 , the far-field image plane 52 only covers a portion of the entire plane of the windscreen 12, however, it is to be appreciated that in another implementation the far-field image plane 52 may cover the entire plane of the windscreen 12 that is not occupied by the near-field image plane 50. Moreover, although FIG. 2 illustrates the far-field image plane 52 only spanning across a portion of the lanes 62 that are part of the roadway 60, in embodiments the far-field image plane 52 spans across each lane 62 across the roadway 60.
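  • The overlay position 38 can be understood as the intersection of the driver's line of sight to the object with the windscreen plane. The following Python sketch illustrates that geometry under simplifying assumptions (a flat, vertical windscreen plane and a vehicle-frame coordinate system with x forward, y left, z up); the function name and frame conventions are illustrative, not details from the patent.

```python
import numpy as np

def windscreen_overlay_position(object_xyz, eye_xyz, screen_x):
    """Intersect the driver's sight line to the object with a vertical
    windscreen plane at x = screen_x. Assumes the object lies ahead of
    the plane (object_x > eye_x), so the denominator is nonzero."""
    obj = np.asarray(object_xyz, dtype=float)
    eye = np.asarray(eye_xyz, dtype=float)
    t = (screen_x - eye[0]) / (obj[0] - eye[0])  # parameter along sight line
    return eye + t * (obj - eye)                 # point on windscreen plane
```

With the driver's eye at (0, 0, 1.2) m, an object 50 m ahead and 2 m to the left, and the windscreen plane 1 m ahead of the eye, the overlay lands slightly left of and below the eye point, which matches the intuition that distant objects project near the horizon.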
  • The notification symbol 36 includes any type of graphic image that provides an alert to direct the attention of the driver of the vehicle 14 towards the position 38 of the visually imperceptible object. In the example as shown in FIG. 2 , the notification symbol 36 is a caution symbol, however, it is to be appreciated that other types of symbols may be used as well. For example, as seen in FIG. 4A, the notification symbol 36 includes a vehicle icon to alert the driver to a vehicle that is visually imperceptible. In the embodiment as shown in FIG. 4B, the notification symbol 36 includes an animal icon to alert the driver to wildlife that are visually imperceptible. In the embodiment as shown in FIG. 4C, the notification symbol 36 includes a pedestrian icon to alert the driver to a pedestrian that is visually imperceptible. Other examples of symbols that may be used as the notification symbol 36 include, but are not limited to, traffic sign icons, slippery roadway icons, traffic lights, a stopped vehicle, pedestrians, animals, cross traffic, and the edge of the road. In embodiments, the notification symbol 36 may also be animated to draw the driver's eye. For example, an alert may be transient to draw the driver's attention to a particular region in the visual field.
  • Referring back to FIG. 1 , the image-capturing devices 22 obtain image data of the environment 40 surrounding the vehicle 14. The one or more non-visual object detection sensors 24 obtain data in the form of detection points that indicate a presence of an object within the environment 40 of the vehicle 14. The controller 20 receives the plurality of detection points that indicate a presence of the object and the image data. The controller 20 compares the plurality of detection points with the image data to identify the visually imperceptible object.
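  • One way to realize this comparison is to flag detection points that have no counterpart among camera-detected objects: a radar or LiDAR return with no nearby camera detection is a candidate visually imperceptible object. The Python sketch below is illustrative only; the function name, data layout, and gating distance are assumptions, not details from the patent.

```python
import numpy as np

def fuse_detections(detection_points, image_objects, gate_m=2.0):
    """Associate non-visual detection points (x, y in road coordinates)
    with camera-detected object centroids. Points with no camera match
    within gate_m metres are returned as candidate visually
    imperceptible objects."""
    unmatched = []
    for pt in detection_points:
        if image_objects.size == 0:
            unmatched.append(pt)        # no camera detections at all
            continue
        dists = np.linalg.norm(image_objects - pt, axis=1)
        if dists.min() > gate_m:        # no camera object within the gate
            unmatched.append(pt)
    return np.array(unmatched)
```

For example, a radar point 30 m ahead with no camera detection nearby survives the gate, while a point that coincides with a camera-detected object is discarded.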
  • In embodiments, the controller 20 identifies the visually imperceptible object based on the driver's vision capabilities. The driver's vision capabilities are entered manually or, in the alternative, may be inferred based on age. In another embodiment, the controller 20 identifies the visually imperceptible object based on driver perception data received from the eye location system 28, where the driver perception data includes the location of a head of the driver and the orientation or gaze location of the driver's eyes. It is to be appreciated that the driver eye and head positions are at a different location than the image-capturing devices 22, and therefore there may be areas in the environment 40 that the driver may view that are not captured by the image-capturing devices 22, and vice versa. Furthermore, the offset between the driver's eye and head position and the location of the image-capturing devices 22 may be calculated as well.
  • In an embodiment, the controller 20 identifies the visually imperceptible object by first determining a luminance contrast ratio between the plurality of detection points and the image data of the environment 40, and then comparing the luminance contrast ratio with a contrast threshold ratio. Specifically, the image data captured from the one or more image-capturing devices 22 includes data indicating both object luminance and background luminance, where the luminance contrast ratio is determined based on the object and the background luminance. In response to determining the luminance contrast ratio is greater than or equal to the contrast threshold ratio, the controller 20 identifies the object being detected as the visually imperceptible object.
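  • As a minimal sketch of that test, the following Python function computes a brighter-to-darker luminance ratio and applies the passage's criterion, under which a ratio at or above the threshold flags the object as visually imperceptible (for example, a dim object against glare). The threshold value and function name are illustrative assumptions.

```python
def is_visually_imperceptible(object_luminance, background_luminance,
                              contrast_threshold_ratio=4.0):
    """Return True when the luminance contrast ratio meets or exceeds
    the contrast threshold ratio, per the passage's criterion. The
    ratio is taken as brighter-to-darker luminance (cd/m^2)."""
    lo, hi = sorted((object_luminance, background_luminance))
    ratio = hi / max(lo, 1e-6)  # guard against division by zero
    return ratio >= contrast_threshold_ratio
```

A 10 cd/m² object against a 100 cd/m² background yields a ratio of 10 and is flagged, while an object nearly matching its background in luminance is not.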
  • In response to identifying the visually imperceptible object, the controller 20 determines the notification symbol 36 (FIG. 2 ) that signifies the visually imperceptible object. The controller 20 then instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14. As seen in FIG. 2 , the notification symbol 36 is overlaid at the position 38 of the visually imperceptible object. It is to be appreciated that the controller 20 calculates a type of symbol, the size, the shape, and the color of the notification symbol 36 based on the driver eye position, a location of the vehicle 14, and data regarding the environment 40 such as, but not limited to, data from an inclinometer, vehicle speed, roadway curvature, and steering angle.
  • In embodiments, the controller 20 determines a rate of approach towards the visually imperceptible object by the vehicle 14 based on one or more inputs from the one or more image-capturing devices 22, the one or more non-visual object detection sensors 24, the one or more vehicle systems 42, the one or more road databases 44, and the one or more external networks 46. Referring to FIGS. 1 and 3A-3C, the controller 20 adjusts at least one visual parameter of the notification symbol 36 based on the rate of approach towards the visually imperceptible object. In the embodiment as shown in FIGS. 3A-3C, the visual parameter is an overall size of the notification symbol 36, where the overall size of the notification symbol 36 increases as the vehicle 14 travels towards the visually imperceptible object and decreases as the vehicle 14 travels away from the visually imperceptible object. In another embodiment, the visual parameter is a color of the notification symbol 36, where the color indicates when the vehicle 14 is too close to the visually imperceptible object. For example, the color of the notification symbol 36 may start as yellow, turn to orange as the vehicle 14 approaches the visually imperceptible object, and eventually turn to red. In still another embodiment, the visual parameter is an animation of the notification symbol 36. For example, the notification symbol 36 may pulse, with its size increasing and decreasing to capture the attention of the driver.
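  • A simple way to sketch these adjustments in Python: size scales inversely with distance, and color steps from yellow to orange to red as the time to reach the object shrinks. All constants, thresholds, and names here are illustrative assumptions, not values from the patent.

```python
def adjust_symbol(distance_m, closing_speed_mps):
    """Return an (overall size in pixels, color) pair for the
    notification symbol from distance and rate of approach."""
    base_px, max_px = 24, 96
    # Symbol grows as the gap closes (inverse-distance scaling).
    size = min(max_px, base_px * max(1.0, 50.0 / max(distance_m, 1.0)))
    # Color escalates as the time to reach the object shrinks.
    if closing_speed_mps > 0:
        time_to_reach = distance_m / closing_speed_mps
    else:
        time_to_reach = float("inf")  # vehicle holding or moving away
    if time_to_reach < 2.0:
        color = "red"
    elif time_to_reach < 5.0:
        color = "orange"
    else:
        color = "yellow"
    return round(size), color
```

At 100 m and 10 m/s the symbol stays small and yellow; at 10 m the same closing speed produces a maximum-size red symbol.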
  • Referring back to FIG. 1 , the controller 20 receives perception data indicative of human vision relative to camera vision from the one or more image-capturing devices 22, the one or more non-visual object detection sensors 24, the one or more vehicle systems 42, the one or more road databases 44, and the one or more external networks 46. In embodiments, the perception data includes, but is not limited to, ambient lighting conditions, sun position, headlamp coverage, and weather input. The controller 20 calculates a driver's field of view based on the perception data, and further identifies the visually imperceptible object based on the perception data.
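  • The driver's visibility might be estimated from such perception data along the following lines: a base visibility distance chosen from ambient lighting (headlamp range at night, a longer daytime range otherwise), attenuated by a weather factor. The attenuation factors, thresholds, and function name are purely illustrative assumptions, not values from the patent.

```python
def estimate_visibility_m(ambient_lux, weather, headlamp_range_m):
    """Rough driver visibility estimate (metres) from perception data."""
    weather_factor = {"clear": 1.0, "rain": 0.6,
                      "snow": 0.4, "fog": 0.25}.get(weather, 1.0)
    # Below ~10 lux the driver relies on headlamps; otherwise use a
    # nominal daytime sight distance.
    base_m = headlamp_range_m if ambient_lux < 10 else 250.0
    return base_m * weather_factor
```

Objects detected by the non-visual sensors beyond this estimated distance would then be candidates for visually imperceptible objects.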
  • FIG. 5 is a process flow diagram illustrating a method 200 for displaying the notification symbol 36 upon the windscreen 12 of the vehicle 14 by the augmented reality head-up display system 10. Referring to FIGS. 1, 2, and 5 , the method 200 may begin at block 202. In block 202, the controller 20 receives the plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors 24 and the image data from one or more image-capturing devices 22. The method 200 may then proceed to block 204.
  • In block 204, the controller 20 compares the plurality of detection points with the image data of the environment 40 surrounding the vehicle 14 to identify the visually imperceptible object. As mentioned above, in an embodiment, the visually imperceptible object may be identified by determining a luminance contrast ratio between the plurality of detection points and the image data. The method 200 may then proceed to block 206.
  • In block 206, in response to identifying the visually imperceptible object, the controller 20 determines the notification symbol 36 that signifies the visually imperceptible object. The method 200 may then proceed to block 208.
  • In block 208, the controller 20 instructs the graphic projection device 26 to generate the notification symbol 36 upon the windscreen 12 of the vehicle 14. As seen in FIG. 2 , the notification symbol 36 is overlaid at a position 38 upon the windscreen 12 where the visually imperceptible object would normally be visible. The method 200 may then terminate.
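  • Blocks 202 through 208 can be wired together as a single pass, sketched here in Python with stand-in callables for the controller's operations; the callable names and the data shapes are assumptions for illustration only.

```python
def display_notification_step(receive, identify, determine, project):
    """One pass of method 200: receive sensor data (block 202),
    identify a visually imperceptible object (block 204), determine a
    notification symbol (block 206), and project it (block 208)."""
    points, image = receive()          # block 202: detection points + image data
    obj = identify(points, image)      # block 204: compare to find the object
    if obj is None:                    # nothing imperceptible found this pass
        return None
    symbol = determine(obj)            # block 206: symbol signifying the object
    project(symbol, obj["position"])   # block 208: overlay upon the windscreen
    return symbol
```

In a real system the four callables would be bound to the controller's sensor-fusion, classification, and graphic projection routines, and the pass would repeat at the display refresh rate.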
  • Referring generally to the figures, the disclosed augmented reality head-up display provides various technical effects and benefits. Specifically, the disclosed augmented reality head-up display generates a notification symbol upon the windscreen of a vehicle to alert the driver of a visually imperceptible object. Therefore, the augmented reality head-up display provides enhanced situational awareness of roadway objects that are not evident to a driver during low-visibility conditions. Moreover, as the vehicle continues to travel towards the visually imperceptible object, the size and color of the notification symbol may change to assist the driver in determining if the visually imperceptible object is stationary, traveling towards the vehicle, or traveling away from the vehicle.
  • The controllers may refer to, or be part of, an electronic circuit, a combinational logic circuit, a field programmable gate array (FPGA), a processor (shared, dedicated, or group) that executes code, or a combination of some or all of the above, such as in a system-on-chip. Additionally, the controllers may be microprocessor-based such as a computer having at least one processor, memory (RAM and/or ROM), and associated input and output buses. The processor may operate under the control of an operating system that resides in memory. The operating system may manage computer resources so that computer program code embodied as one or more computer software applications, such as an application residing in memory, may have instructions executed by the processor. In an alternative embodiment, the processor may execute the application directly, in which case the operating system may be omitted.
  • The description of the present disclosure is merely exemplary in nature and variations that do not depart from the gist of the present disclosure are intended to be within the scope of the present disclosure. Such variations are not to be regarded as a departure from the spirit and scope of the present disclosure.

Claims (20)

1. An augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, the augmented reality head-up display system comprising:
one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle;
one or more cameras that capture image data of the environment surrounding the vehicle;
a graphic projection device for generating images upon the windscreen of the vehicle, wherein the graphic projection device includes a projection device that creates an excitation light that creates the images projected upon the windscreen; and
a controller in electronic communication with the one or more non-visual object detection sensors, the one or more cameras, and the graphic projection device, wherein the controller executes instructions to:
receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors and the image data from the one or more cameras;
compare the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment;
in response to identifying the visually imperceptible object, determine a notification symbol that signifies the visually imperceptible object; and
instruct the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible.
2. The augmented reality head-up display system of claim 1, wherein the controller executes instructions to:
determine a rate of approach towards the visually imperceptible object by the vehicle; and
adjust at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
3. The augmented reality head-up display system of claim 2, wherein the visual parameter is an overall size of the notification symbol, and wherein the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
4. The augmented reality head-up display system of claim 2, wherein the visual parameter is a color of the notification symbol.
5. The augmented reality head-up display system of claim 1, wherein the controller executes instructions to:
receive perception data indicative of human vision relative to camera vision;
calculate a driver's field of view based on the perception data; and
identify the visually imperceptible object based on the perception data.
6. The augmented reality head-up display system of claim 5, wherein the perception data includes one or more of the following: ambient lighting conditions, sun position, headlamp coverage, and weather input.
7. The augmented reality head-up display system of claim 1, wherein identifying the visually imperceptible object is determined based on driver vision capability.
8. The augmented reality head-up display system of claim 1, wherein the one or more non-visual object detection sensors include one or more of the following: a radar, LiDAR, and one or more infrared sensors.
9. The augmented reality head-up display system of claim 1, wherein the controller identifies the visually imperceptible object by determining a luminance contrast ratio between the plurality of detection points and the image data of the environment surrounding the vehicle.
10. The augmented reality head-up display system of claim 1, wherein the controller instructs the graphic projection device of the augmented reality head-up display system to project cluster content information within a near-field image plane of the windscreen.
11. The augmented reality head-up display system of claim 10, wherein information regarding the notification symbol is displayed within the near-field image plane.
12. The augmented reality head-up display system of claim 1, wherein the controller instructs the graphic projection device to project the notification symbol within a far-field image plane of the windscreen.
13. The augmented reality head-up display system of claim 1, wherein the notification symbol is one of the following: a caution symbol, a vehicle icon, an animal icon, and a pedestrian icon.
14. The augmented reality head-up display system of claim 1, wherein the visually imperceptible object is one of the following: roadway signage, roadway markings, another vehicle, a pedestrian, a bicyclist, a traffic incident, and road conditions that require attention.
15. A method for displaying graphics upon a windscreen of a vehicle, the method comprising:
receiving, by a controller, a plurality of detection points that indicate a presence of an object from one or more non-visual object detection sensors and image data from one or more cameras;
comparing the plurality of detection points with the image data of an environment surrounding the vehicle to identify a visually imperceptible object located in the environment;
in response to identifying the visually imperceptible object, determining a notification symbol that signifies the visually imperceptible object; and
instructing, by the controller, a graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible, wherein the graphic projection device includes a projection device that creates an excitation light that creates the images projected upon the windscreen.
16. The method of claim 15, further comprising:
determining a rate of approach towards the visually imperceptible object by the vehicle; and
adjusting at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
17. The method of claim 15, further comprising:
receiving perception data indicative of human vision relative to camera vision;
calculating a driver's field of view based on the perception data; and
identifying the visually imperceptible object based on the perception data.
18. An augmented reality head-up display system for displaying graphics upon a windscreen of a vehicle, the augmented reality head-up display system comprising:
one or more non-visual object detection sensors to detect objects in an environment surrounding the vehicle;
one or more cameras that capture image data of the environment surrounding the vehicle;
a graphic projection device for generating images upon the windscreen of the vehicle, wherein the graphic projection device includes a projection device that creates an excitation light that creates the images projected upon the windscreen; and
a controller in electronic communication with the one or more non-visual object detection sensors, the one or more cameras, and the graphic projection device, wherein the controller executes instructions to:
receive a plurality of detection points that indicate a presence of an object from the one or more non-visual object detection sensors;
compare the plurality of detection points with the image data of the environment surrounding the vehicle to identify a visually imperceptible object located in the environment;
in response to identifying the visually imperceptible object, determine a notification symbol that signifies the visually imperceptible object;
instruct the graphic projection device to generate the notification symbol upon the windscreen of the vehicle, wherein the notification symbol is overlaid at a position upon the windscreen where the visually imperceptible object would normally be visible;
determine a rate of approach towards the visually imperceptible object by the vehicle; and
adjust at least one visual parameter of the notification symbol based on the rate of approach towards the visually imperceptible object by the vehicle.
19. The augmented reality head-up display system of claim 18, wherein the visual parameter is an overall size of the notification symbol, and wherein the overall size of the notification symbol increases as the vehicle travels towards the visually imperceptible object and the overall size of the notification symbol decreases as the vehicle travels away from the visually imperceptible object.
20. The augmented reality head-up display system of claim 18, wherein the controller executes instructions to:
receive perception data indicative of human vision relative to camera vision;
calculate a driver's field of view based on the perception data; and
identify the visually imperceptible object based on the perception data.
US17/702,093 2022-03-23 2022-03-23 Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object Active US11766938B1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US17/702,093 US11766938B1 (en) 2022-03-23 2022-03-23 Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object
DE102022127379.2A DE102022127379A1 (en) 2022-03-23 2022-10-19 AUGMENTED REALITY HEAD-UP DISPLAY FOR OVERLAYING A NOTIFICATION ICON ON A VISUALLY UNDERPERCEIVE OBJECT
CN202211288780.5A CN116841042A (en) 2022-03-23 2022-10-20 Augmented reality head-up display with symbols superimposed on visually imperceptible objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US17/702,093 US11766938B1 (en) 2022-03-23 2022-03-23 Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object

Publications (2)

Publication Number Publication Date
US11766938B1 US11766938B1 (en) 2023-09-26
US20230302900A1 true US20230302900A1 (en) 2023-09-28

Family

ID=87930676

Family Applications (1)

Application Number Title Priority Date Filing Date
US17/702,093 Active US11766938B1 (en) 2022-03-23 2022-03-23 Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object

Country Status (3)

Country Link
US (1) US11766938B1 (en)
CN (1) CN116841042A (en)
DE (1) DE102022127379A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2022184350A (en) * 2021-06-01 2022-12-13 マツダ株式会社 head-up display device

Citations (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5995903A (en) * 1996-11-12 1999-11-30 Smith; Eric L. Method and system for assisting navigation using rendered terrain imagery
US20070031006A1 (en) * 2005-04-19 2007-02-08 Valeo Vision Method for detecting nocturnal fog and system for implementing the method
US20070230800A1 (en) * 2006-03-29 2007-10-04 Denso Corporation Visibility range measuring apparatus for vehicle and vehicle drive assist system
US20080061950A1 (en) * 2006-09-12 2008-03-13 Denso Corporation Apparatus for detecting the presence of fog for automotive vehicle
US20090118909A1 (en) * 2007-10-31 2009-05-07 Valeo Vision Process for detecting a phenomenon limiting the visibility for a motor vehicle
US20100253540A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Enhanced road vision on full windshield head-up display
US7920250B2 (en) * 2007-08-03 2011-04-05 Valeo Vision System for the detection by a motor vehicle of a phenomenon that interferes with visibility
US20120001742A1 (en) * 2010-07-05 2012-01-05 Denso Corporation Obstacle search system
US20140093131A1 (en) * 2012-10-01 2014-04-03 Xerox Corporation Visibility improvement in bad weather using enchanced reality
US20150310313A1 (en) * 2012-12-18 2015-10-29 Mitsubishi Electric Corporation Visibility estimation device, visibility estimation method, and safe driving support system
US20150352952A1 (en) * 2014-03-11 2015-12-10 Cessna Aircraft Company Adjustable Synthetic Vision
US20160327402A1 (en) * 2014-02-05 2016-11-10 Panasonic Intellectual Property Management Co., Ltd. Display apparatus for vehicle and display method of display apparatus for vehicle
US20170192091A1 (en) * 2016-01-06 2017-07-06 Ford Global Technologies, Llc System and method for augmented reality reduced visibility navigation
US20170345321A1 (en) * 2014-11-05 2017-11-30 Sierra Nevada Corporation Systems and methods for generating improved environmental displays for vehicles
US20180129854A1 (en) * 2016-11-09 2018-05-10 Samsung Electronics Co., Ltd. Method and apparatus for generating virtual driving lane for traveling vehicle
US20180141563A1 (en) * 2016-06-30 2018-05-24 Faraday&Future Inc. Classifying of weather situations using cameras on automobiles
US20180201134A1 (en) * 2017-01-17 2018-07-19 Lg Electronics, Inc. User interface apparatus for vehicle and vehicle
US20180211121A1 (en) * 2017-01-25 2018-07-26 Ford Global Technologies, Llc Detecting Vehicles In Low Light Conditions
US20180328752A1 (en) * 2017-05-09 2018-11-15 Toyota Jidosha Kabushiki Kaisha Augmented reality for vehicle lane guidance
US20200324787A1 (en) * 2018-10-25 2020-10-15 Samsung Electronics Co., Ltd. Augmented reality method and apparatus for driving assistance
US10846833B2 (en) * 2015-09-02 2020-11-24 SMR Patents S.à.r.l. System and method for visibility enhancement


Also Published As

Publication number Publication date
DE102022127379A1 (en) 2023-09-28
CN116841042A (en) 2023-10-03
US11766938B1 (en) 2023-09-26

Similar Documents

Publication Publication Date Title
US10909765B2 (en) Augmented reality system for vehicle blind spot prevention
US10168174B2 (en) Augmented reality for vehicle lane guidance
US11996018B2 (en) Display control device and display control program product
US11447067B2 (en) Maintaining road safety when there is a disabled autonomous vehicle
US8514099B2 (en) Vehicle threat identification on full windshield head-up display
US8543254B1 (en) Vehicular imaging system and method for determining roadway width
WO2020125178A1 (en) Vehicle driving prompting method and apparatus
CN113710530A (en) Method for operating a driver information system in an autonomous vehicle and driver information system
WO2017221945A1 (en) Display apparatus, display system, moving body, and display method
CN113498388A (en) Method for operating a driver information system in a self-propelled vehicle and driver information system
US10946744B2 (en) Vehicular projection control device and head-up display device
JP6748947B2 (en) Image display device, moving body, image display method and program
CN112119398A (en) Method and device for operating a camera-monitor system of a motor vehicle
CN112758013A (en) Display device and display method for vehicle
US11697346B1 (en) Lane position in augmented reality head-up display system
JP7255608B2 (en) DISPLAY CONTROLLER, METHOD, AND COMPUTER PROGRAM
US11766938B1 (en) Augmented reality head-up display for overlaying a notification symbol over a visually imperceptible object
US11663939B1 (en) Augmented reality head-up display for generating a contextual graphic signifying a visually occluded object
US12030512B2 (en) Collision warning system for a motor vehicle having an augmented reality head up display
KR102695051B1 (en) Apparatus and method for generating traffic information including low visibility image improvement function
US20230406356A1 (en) Fail-safe corrective actions based on vision information for autonomous vehicles
US11630302B1 (en) Driving guidance system for a motor vehicle having an augmented reality head up display
US20240192313A1 (en) Methods and systems for displaying information to an occupant of a vehicle
US20230347921A1 (en) Vehicle display control system, computer-readable medium, vehicle display control method, and vehicle display control device
JP2022154208A (en) Image processing device, image processing system and image processing method

Legal Events

Date Code Title Description
AS Assignment

Owner name: GM GLOBAL TECHNOLOGY OPERATIONS LLC, MICHIGAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:WEISS, JOHN P.;SZCZERBA, JOSEPH F.;TSIMHONI, OMER;AND OTHERS;SIGNING DATES FROM 20220321 TO 20220323;REEL/FRAME:059378/0483

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCF Information on status: patent grant

Free format text: PATENTED CASE