US20040066376A1 - Mobility assist device - Google Patents

Mobility assist device

Info

Publication number
US20040066376A1
US20040066376A1
Authority
US
United States
Prior art keywords
display
objects
location
mobile body
signal
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US10/626,953
Inventor
Max Donath
Craig Shankwitz
Heon Lim
Bryan Newstrom
Alec Gorjestani
Sameer Pardhy
Lee Alexander
Pi-Ming Cheng
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Minnesota
Original Assignee
University of Minnesota
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Minnesota filed Critical University of Minnesota
Priority to US10/626,953
Publication of US20040066376A1
Assigned to UNIVERSITY OF MINNESOTA reassignment UNIVERSITY OF MINNESOTA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: ALEXANDER, LEE, CHENG, PI-MING, DONATH, MAX, GORJESTANI, ALEC, LIM, HEON MIN, NEWSTROM, BRYAN, PARDHY, SAMEER, SHANKWITZ, CRAIG R.

Classifications

    • G01C 21/365 — Navigation; route guidance using head-up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself (under G01C 21/00 Navigation; 21/26 specially adapted for navigation in a road network; 21/34 Route searching, route guidance; 21/36 Input/output arrangements for on-board computers; 21/3626 Details of the output of route guidance instructions)
    • B60R 1/00 — Optical viewing arrangements; real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R 2300/205 — Viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of display used: head-up display
    • B60R 2300/301 — Characterised by the type of image processing: combining image information with other obstacle sensor information, e.g. using RADAR/LIDAR/SONAR sensors for estimating risk of collision
    • B60R 2300/302 — Combining image information with GPS information or vehicle data, e.g. vehicle speed, gyro, steering angle data
    • B60R 2300/304 — Using merged images, e.g. merging camera image with stored images
    • B60R 2300/305 — Merging camera image with lines or icons
    • B60R 2300/307 — Virtually distinguishing relevant parts of a scene from the background of the scene
    • B60R 2300/60 — Characterised by monitoring and displaying vehicle exterior scenes from a transformed perspective
    • B60R 2300/804 — Characterised by the intended use of the viewing arrangement: lane monitoring
    • B60R 2300/8093 — Characterised by the intended use of the viewing arrangement: obstacle warning
    • B60T 2201/08 — Particular use of vehicle brake systems: lane monitoring; lane keeping systems
    • B60T 2201/086 — Lane monitoring; lane keeping systems using driver related features

Definitions

  • the present invention deals with mobility assistance. More particularly, the present invention deals with a vision assist device in the form of a head up display (HUD) for assisting mobility of a mobile body, such as a person, a non-motorized vehicle, or a motor vehicle.
  • the driver's forward-looking vision simply does not provide enough information to facilitate safe control of the vehicle. This can be exacerbated, particularly on snow removal equipment, because even on a relatively calm, clear day, snow can be blown up from the front or sides of snowplow blades, substantially obstructing the visual field of the driver.
  • the present invention is directed to a visual assist device which provides a conformal, augmented display to assist in movement of a mobile body.
  • the mobile body is a vehicle (motorized or non-motorized) and the present invention assists the driver in either lane keeping or collision avoidance, or both.
  • the system can display lane boundaries, other navigational or guidance elements or a variety of other objects in proper perspective, to assist the driver.
  • the mobile body is a person (or group of people) and the present invention assists the person in either staying on a prescribed path or collision avoidance or both.
  • the system can display path boundaries, other navigational or guidance elements or a variety of other objects in proper perspective, to assist the walking person.
  • FIG. 1 is a block diagram of a mobility assist device in accordance with one embodiment of the present invention.
  • FIG. 2 is a more detailed block diagram of another embodiment of the mobility assist device.
  • FIG. 3A is a partial-pictorial and partial block diagram illustrating operation of a mobility assist device in accordance with one embodiment of the present invention.
  • FIG. 3B illustrates the concept of a combiner and virtual screen.
  • FIGS. 3C, 3D and 3E are pictorial illustrations of a conformal, augmented projection and display in accordance with one embodiment of the present invention.
  • FIGS. 3F, 3G, 3H and 3I are pictorial illustrations of an actual conformal, augmented display in accordance with an embodiment of the present invention.
  • FIGS. 4A-4C are flow diagrams illustrating general operation of the mobility assist device.
  • FIG. 5A illustrates coordinate frames used in accordance with one embodiment of the present invention.
  • FIGS. 5B-1 to 5K-3 illustrate the development of a coordinate transformation matrix in accordance with one embodiment of the present invention.
  • FIG. 6 is a side view of a vehicle employing the ranging system in accordance with one embodiment of the present invention.
  • FIG. 7 is a flow diagram illustrating a use of the present invention in performing system diagnostics and improved radar processing.
  • FIG. 8 is a pictorial view of a head up virtual mirror, in accordance with one embodiment of the present invention.
  • FIG. 9 is a top view of one embodiment of a system used to obtain position information corresponding to a vehicle.
  • FIG. 10 is a block diagram of another embodiment of the present invention.
  • FIG. 1 is a simplified block diagram of one embodiment of driver assist device 10 in accordance with the present invention.
  • Driver assist device 10 includes controller 12 , vehicle location system 14 , geospatial database 16 , ranging system 18 , operator interface 20 and display 22 .
  • controller 12 is a microprocessor, microcontroller, digital computer, or other similar control device having associated memory and timing circuitry. It should be understood that the memory can be integrated with controller 12 , or be located separately therefrom. The memory, of course, may include random access memory, read only memory, magnetic or optical disc drives, tape memory, or any other suitable computer readable medium.
  • Operator interface 20 is illustratively a keyboard, a touch-sensitive screen, a point and click user input device (e.g. a mouse), a keypad, a voice activated interface, joystick, or any other type of user interface suitable for receiving user commands, and providing those commands to controller 12 , as well as providing a user viewable indication of operating conditions from controller 12 to the user.
  • the operator interface may also include, for example, the steering wheel and the throttle and brake pedals suitably instrumented to detect the operator's desired control inputs of heading angle and speed.
  • Operator interface 20 may also include, for example, an LCD screen, LEDs, a plasma display, a CRT, audible noise generators, or any other suitable operator interface display or speaker unit.
  • vehicle location system 14 determines and provides a vehicle location signal, indicative of the vehicle location in which driver assist device 10 is mounted, to controller 12 .
  • vehicle location system 14 can include a global positioning system receiver (GPS receiver) such as a differential GPS receiver, an earth reference position measuring system, a dead reckoning system (such as odometry and an electronic compass), an inertial measurement unit (such as accelerometers, inclinometers, or rate gyroscopes), etc.
  • vehicle location system 14 periodically provides a location signal to controller 12 which indicates the location of the vehicle on the surface of the earth.
  • Geospatial database 16 contains a digital map which digitally locates road boundaries, lane boundaries, possibly some landmarks (such as road signs, water towers, or other landmarks) and any other desired items (such as road barriers, bridges, etc.), and describes a precise location and attributes of those items on the surface of the earth.
  • Because the earth is approximately spherical in shape, it is convenient to determine a location on the surface of the earth by expressing the location values in terms of angles from a reference point.
  • Longitude and latitude are the most commonly used angles to express a location on the earth's surface or in orbits around the earth.
  • Latitude is a measurement on a globe of location north or south of the equator, and longitude is a measurement of the location east or west of the prime meridian at Greenwich, the specifically designated imaginary north-south line that passes through both geographic poles of the earth and Greenwich, England.
  • the combination of meridians of longitude and parallels of latitude establishes a framework or grid by means of which exact positions can be determined in reference to the prime meridian and the equator.
  • Many of the currently available GPS systems provide latitude and longitude values as location data.
  • One standard projection method is the Lambert Conformal Conic Projection Method. This projection method is extensively used in an ellipsoidal form for large-scale mapping of regions of predominantly east-west extent, including topographic quadrangles for many of the U.S. state plane coordinate system zones, maps in the International Map of the World series and the U.S. State Base maps. The method uses well known, and publicly available, conversion equations to calculate state plane coordinate values from GPS receiver longitude and latitude angle data.
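  • As an illustration of the projection step just described, the following is a minimal sketch (not taken from the patent) of the spherical form of the Lambert Conformal Conic forward equations. The standard parallels, projection origin and earth radius below are placeholder values; an actual state plane zone would define its own constants and would use the ellipsoidal equations for survey-grade accuracy.

```python
import math

def lambert_conformal_conic(lat_deg, lon_deg,
                            lat1_deg=45.0, lat2_deg=47.0,
                            lat0_deg=44.0, lon0_deg=-94.0,
                            radius=6378137.0):
    """Forward Lambert Conformal Conic projection (spherical form).

    Returns easting/northing in meters for a latitude/longitude pair.
    The two standard parallels and the projection origin are placeholders
    and must be distinct; a state plane zone defines its own values.
    """
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)
    lat1, lat2 = math.radians(lat1_deg), math.radians(lat2_deg)

    # Cone constant n and scaling factor F from the two standard parallels.
    n = (math.log(math.cos(lat1) / math.cos(lat2)) /
         math.log(math.tan(math.pi / 4 + lat2 / 2) /
                  math.tan(math.pi / 4 + lat1 / 2)))
    F = math.cos(lat1) * math.tan(math.pi / 4 + lat1 / 2) ** n / n

    # Radii of the parallels through the point and through the origin.
    rho = radius * F / math.tan(math.pi / 4 + lat / 2) ** n
    rho0 = radius * F / math.tan(math.pi / 4 + lat0 / 2) ** n

    theta = n * (lon - lon0)
    x = rho * math.sin(theta)            # easting
    y = rho0 - rho * math.cos(theta)     # northing
    return x, y
```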
  • the digital map stored in the geospatial database 16 contains a series of numeric location data of, for example, the center line and lane boundaries of a road on which system 10 is to be used, as well as construction data which is given by a number of shape parameters including starting and ending points of straight paths, the centers of circular sections, and starting and ending angles of circular sections. While the present system is described herein in terms of starting and ending points of circular sections, it could be described in terms of starting and ending points and any curvature between those points. For example, a straight path can be characterized as a section of zero curvature. Each of these items is indicated by a parameter marker, which indicates the type of parameter it is, and has associated location data giving the precise geographic location of that point on the map.
  • each road point of the digital map in database 16 was generated at uniform 10 meter intervals.
  • the road points represent only the centerline of the road, and the lane boundaries are calculated from that centerline point.
  • both the center line and lane boundaries are mapped.
  • geospatial database 16 also illustratively contains the exact location data indicative of the exact geographical location of street signs and other desirable landmarks.
  • Database 16 can be obtained by manual mapping operations or by a number of automated methods such as, for example, placing a GPS receiver on the lane stripe paint spraying nozzle or tape laying mandrel to continuously obtain locations of lane boundaries.
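  • The following sketch suggests one way the map records, shape parameters and parameter markers described in the preceding items might be represented in software. It is purely illustrative: the patent does not specify a schema, so every name here (MapPoint, RoadSegment, the marker strings, the field names) is a hypothetical choice.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class MapPoint:
    """One entry of the digital map: a parameter marker plus its location.

    Marker names such as "centerline", "lane_boundary", "arc_center" or
    "sign" are illustrative; the patent only states that each item carries
    a marker indicating its type and precise location data.
    """
    marker: str          # type of parameter this point represents
    easting: float       # projected x coordinate, meters
    northing: float      # projected y coordinate, meters
    elevation: float = 0.0

@dataclass
class RoadSegment:
    """A stretch of road described by shape parameters and sampled points."""
    start: MapPoint
    end: MapPoint
    curvature: float = 0.0                                      # zero = straight section
    centerline: List[MapPoint] = field(default_factory=list)    # ~10 m spacing
    lane_boundaries: List[List[MapPoint]] = field(default_factory=list)
```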
  • Ranging system 18 is configured to detect targets in the vicinity of the vehicle in which system 10 is implemented, and also to detect a location (such as range, range rate and azimuth angle) of the detected targets, relative to the vehicle.
  • Targets are illustratively objects which must be monitored because they may collide with the mobile body either due to motion of the body or of the object.
  • ranging system 18 is a radar system commercially available from Eaton Vorad.
  • ranging system 18 can also include a passive or active infrared system (which could also provide the amount of heat emitted from the target), a laser-based ranging system, a directional ultrasonic system, or other similar systems.
  • Another embodiment of system 18 is an infrared sensor calibrated to obtain a scaling factor for range, range rate and azimuth which is used for transformation to an eye coordinate system.
  • Display 22 includes a projection unit and one or more combiners which are described in greater detail later in the specification.
  • the projection unit receives a video signal from controller 12 and projects video images onto one or more combiners.
  • the projection unit illustratively includes a liquid crystal display (LCD) matrix and a high-intensity light source similar to a conventional video projector, except that it is small so that it fits near the driver's seat space.
  • the combiner is a partially-reflective, partially transmissive beam splitter formed of optical glass or polymer for reflecting the projected light from the projection unit back to the driver.
  • the combiner is positioned such that the driver looks through the combiner, when looking through the forward-looking visual field, so that the driver can see both the actual outside road scene, as well as the computer generated images projected onto the combiner.
  • the computer-generated images substantially overlay the actual images.
  • combiners or other similar devices can be placed about the driver to cover substantially all fields of view or be implemented in the glass of the windshield and windows. This can illustratively be implemented using a plurality of projectors or a single projector with appropriate optics to scan the projected image across the appropriate fields of view.
  • FIG. 2 illustrates that controller 12 may actually be formed of first controller 24 and second controller 26 (or any number of controllers with processing distributed among them, as desired).
  • first controller 24 performs the primary data processing functions with respect to sensory data acquisition, and also performs database queries in the geospatial database 16 . This entails obtaining velocity and heading information from GPS receiver and correction system 28 .
  • First controller 24 also performs processing of the target signal from radar ranging system 18 .
  • FIG. 2 also illustrates that vehicle location system 14 may illustratively include a differential GPS receiver and correction system 28 as well as an auxiliary inertial measurement unit (IMU) 30 (although other approaches would also work).
  • Second controller 26 processes signals from auxiliary IMU 30 , where necessary, and handles graphics computations for providing the appropriate video signal to display 22 .
  • differential GPS receiver and correction system 28 is illustratively a Novatel RT-20 differential GPS (DGPS) system with 20-centimeter accuracy operating at a 5 Hz sampling rate, or a Trimble MS750 with 2-centimeter accuracy operating at a 10 Hz sampling rate.
  • FIG. 2 also illustrates that system 10 can include optional vehicle orientation detection system 31 and head tracking system 32 .
  • Vehicle orientation detection system 31 detects the orientation (such as roll and pitch) of the vehicle in which system 10 is implemented.
  • the roll angle refers to the rotational orientation of the vehicle about its longitudinal axis (which is parallel to its direction of travel).
  • the roll angle can change, for example, if the vehicle is driving over a banked road, or on uneven terrain.
  • the pitch angle is the angle that the vehicle makes in a vertical plane along the longitudinal direction. The pitch angle becomes significant if the vehicle is climbing up or descending down a hill. Taking into account the pitch and roll angles can make the projected image more accurate, and more closely conform to the actual image seen by the driver.
  • Optional head tracking system 32 can be provided to accommodate for movements in the driver's head or eye position relative to the vehicle.
  • the actual head and eye position of the driver is not monitored.
  • the dimensions of the cab or operator compartment of the vehicle in which system 10 is implemented are taken and used, along with ergonomic data, such as the height and eye position of an operator, given the dimension of the operator compartment, and the image is projected on display 22 such that the displayed images will substantially overlie the actual images for an average operator.
  • Specific measurements can be taken for any given operator as well, such that such a system can more closely conform to any given operator.
  • Head tracking system 32 tracks the position of the operator's head, and eyes, in real time.
  • FIGS. 3A-3E better illustrate the display of information on display 22.
  • FIG. 3A illustrates that display 22 includes projector 40 , and combiner 42 .
  • FIG. 3A also illustrates an operator 44 sitting in an operator compartment which includes seat 46 and which is partially defined by windshield 48 .
  • Projector 40 receives the video display signal from controller 12 and projects road data onto combiner 42 .
  • Combiner 42 is partially reflective and partially transmissive. Therefore, the operator looks forward through combiner 42 and windshield 48 to a virtual focal plane 50 .
  • the road data (such as lane boundaries) are projected from projector 40 in proper perspective onto combiner 42 such that the lane boundaries appear to substantially overlie those which the operator actually sees, in the correct perspective. In this way, when the operator's view of the actual lane boundaries becomes obstructed, the operator can safely maintain lane keeping because the operator can navigate by the projected lane boundaries.
  • FIG. 3A also illustrates that combiner 42 , in one illustrative embodiment, is hinged to an upper surface or side surface or other structural part 52 , of the operator compartment. Therefore, combiner 42 can be pivoted along an arc generally indicated by arrow 54 , up and out of the view of the operator, on days when no driver assistance is needed, and down to the position shown in FIG. 3A, when the operator desires to look through combiner 42 .
  • FIG. 3B better illustrates combiner 42 , window 48 and virtual screen or focal plane 50 .
  • Combiner 42, while being partially reflective, is essentially a transparent, optically correct, coated glass or polymer lens. Light reaching the eyes of operator 44 is a combination of light passing through the lens and light reflected off of the lens from the projector. With an unobstructed forward-looking visual field, the driver actually sees two images accurately superimposed together. The image passing through the combiner 42 comes from the actual forward-looking field of view, while the reflected image is generated by the graphics processor portion of controller 12.
  • the optical characteristics of combiner 42 allow the combination of elements to generate the virtual screen, or virtual focal plane 50 , which is illustratively projected to appear approximately 30-80 feet ahead of combiner 42 . This feature results in a virtual focus in front of the vehicle, and ensures that the driver's eyes are not required to focus back and forth between the real image and the virtual image, thus reducing eyestrain and fatigue.
  • combiner 42 is formed such that the visual image size spans approximately 30° along a horizontal axis and 15° along a vertical axis with the projector located approximately 18 inches from the combiner.
  • Another embodiment is a helmet-supported visor (or eyeglass device) onto which images are projected, and through which the driver can still see.
  • Such displays might include technologies such as those available from Kaiser Electro-Optics, Inc. of Carlsbad, Calif., The MicroOptical Corporation of Westwood, Mass., Universal Display Corporation of Ewing, N.J., Microvision, Inc. of Bothell, Wash. and IODisplay System LLC of Menlo Park, Calif.
  • FIGS. 3C and 3D are illustrative displays from projector 40 which are projected onto combiner 42 .
  • the left most line is the left side road boundary.
  • the dotted line corresponds to the centerline of a two-way road, while the right most curved line, with vertical poles, corresponds to the right-hand side road boundary.
  • the gray circle near the center of the image shown in FIG. 3C corresponds to a target detected and located by ranging system 18 described in greater detail later in the application.
  • the gray shape need not be a circle but could be any icon or shape and could be transparent, opaque or translucent.
  • the screens illustrated in FIGS. 3C and 3D can illustratively be projected in the forward-looking visual field of the driver by projecting them onto combiner 42 with the correct scale so that objects (including the painted line stripes and road boundaries) in the screen are superimposed on the actual objects in the outer scene observed by the driver.
  • the black area on the screens illustrated in FIGS. 3C and 3D appears transparent on combiner 42 under typical operating conditions. Only the brightly colored lines appear on the virtual image that is projected onto combiner 42.
  • the thickness and colors of the road boundaries illustrated in FIGS. 3C and 3D can be varied as desired; illustratively, they are white lines that are approximately 1-5 pixels thick, while the center line is also white and is approximately 1-5 pixels thick as well.
  • FIG. 3E illustrates a virtual image projected onto an actual image as seen through combiner 42 by the driver.
  • the outline of combiner 42 can be seen in the illustration of FIG. 3E and the area 60 which includes the projected image has been outlined in FIG. 3E for the sake of clarity, although no such outline actually appears on the display.
  • the display generated is a conformal, augmented display which is highly useful in low-visibility situations.
  • Geographic landmarks are projected onto combiner 42 and are aligned with the view out of the windshield.
  • Fixed roadside signs (i.e., traditional speed limit signs, exit information signs, etc.) can also be displayed. Data supporting fixed signage and other fixed items projected onto the display are retrieved from geospatial database 16.
  • FIGS. 3F-3H are pictorial illustrations of actual displays.
  • FIG. 3F illustrates two vehicles in close proximity to the vehicle on which system 10 is deployed. It can be seen that the two vehicles have been detected by ranging system 18 (discussed in greater detail below) and have icons projected thereover.
  • FIG. 3G illustrates a vehicle more distant than those in FIG. 3F.
  • FIG. 3G also shows lane boundaries which are projected over the actual boundaries.
  • FIG. 3H shows even more distant vehicles and also illustrates objects around an intersection. For example, right turn lane markers are shown displayed over the actual lane boundaries.
  • variable road signs, such as stoplights, caution lights, railroad crossing warnings, etc., can also be displayed. When processor 12 determines, based on access to the geospatial database, that a variable sign is within the normal viewing distance of the vehicle, the current state of the sign can be obtained (through a radio frequency (RF) receiver, for instance), and processor 12 then proceeds to project the variable sign information to the driver on the projector.
  • this can take any desirable form. For instance, a stop light with a currently red light can be projected, such that it overlies the actual stoplight and such that the red light is highly visible to the driver.
  • Other suitable information and display items can be implemented as well.
  • text of signs or road markers can be enlarged to assist drivers with poor night vision.
  • Items outside the driver's field of view can be displayed (e.g., at the top or sides of the display) to give the driver information about objects out of view.
  • Such items can be fixed or transitory objects, or in the nature of advertising, such as goods or services available in the vicinity of the vehicle.
  • Such information can be included in the geospatial database and selectively retrieved based on vehicle position.
  • Directional signs can also be incorporated into the display to guide the driver to a destination (such as a rest area or hotel), as shown in FIG. 3I. It can be seen that the directional arrows are superimposed directly over the lane.
  • database 16 can be stored locally on the vehicle or queried remotely. Also, database 16 can be periodically updated (either remotely or directly) with a wide variety of information such as detour or road construction information or any other desired information.
  • Transitory obstacles (also referred to herein as unexpected targets) can also be displayed. Transitory obstacle information indicative of such transitory targets or obstacles is derived from ranging system 18.
  • Transitory obstacles are distinguished from conventional roadside obstacles (such as road signs, etc.) by processor 12 .
  • Processor 12 senses an obstacle from the signal provided by ranging system 18 .
  • Processor 12 determines whether the target indicated by ranging system 18 actually corresponds to a conventional, expected roadside obstacle which has been mapped into database 16 .
  • the transitory targets basically represent items which are not in a fixed location during normal operating conditions on the roadway.
  • Such objects can include water towers, trees, bridges, road dividers, other landmarks, etc.
  • Such indicators can also be warnings or alarms, such as not to turn the wrong way on a one-way road or an off ramp, or that the vehicle is approaching an intersection or work zone at too high a rate of speed.
  • the combiner can perform other tasks as well.
  • Such tasks can include the display of blocking templates which block out or reduce glare from the sun or headlights from other cars. The location of the sun can be computed from the time, and its position relative to the driver can also be computed (the same is true for cars). Therefore, an icon can simply be displayed to block the undesired glare.
  • the displays can be integrated with other operator perceptible features, such as a haptic feedback, sound, seat or steering wheel vibration, etc.
  • FIGS. 4A-4C illustrate the operation of system 10 in greater detail.
  • FIG. 4A is a functional block diagram of a portion of system 10 illustrating software components and internal data flow throughout system 10 .
  • FIG. 4B is a simplified flow diagram illustrating operation of system 10.
  • FIG. 4C is a simplified flow diagram illustrating target filtering in accordance with one embodiment of the present invention.
  • It is first determined whether system 10 is receiving vehicle location information from its primary vehicle location system. This is indicated by block 62 in FIG. 4B.
  • this signal may be temporarily lost. The signal may be lost, for instance, when the vehicle goes under a bridge, or simply goes through a pocket or area where GPS or correction signals cannot be received or are distorted. If the primary vehicle location signal is available, that signal is received as indicated by block 64. If not, system 10 accesses information from auxiliary inertial measurement unit 30.
  • Auxiliary IMU 30 may, illustratively, be complemented by a dead reckoning system which utilizes the last known position provided by the GPS receiver, as well as speed and angle information, in order to determine a new position. Receiving the location signal from auxiliary IMU 30 is illustrated by block 66.
  • system 10 also optionally receives head or eye location information, as well as optional vehicle orientation data.
  • vehicle orientation information can be obtained from a roll rate gyroscope 68 to obtain the roll angle, and a tilt sensor 70 (such as an accelerometer) to obtain the pitch angle as well as a yaw rate sensor 69 to obtain yaw angle 83 .
  • obtaining the head or eye location data and the vehicle orientation data are illustrated by optional blocks 72 and 74 in FIG. 4B.
  • the optional driver's eye data is illustrated by block 76 in FIG. 4A
  • the vehicle location data is indicated by block 78
  • the pitch and roll angles are indicated by blocks 80 and 82 , respectively.
  • a coordinate transformation matrix is constructed, as described in greater detail below, from the location and heading angle of the moving vehicle, and from the optional driver's head or eye data and vehicle orientation data, where that data is sensed.
  • the location data is converted into a local coordinate measurement using the transformation matrix, and is then fed into the perspective projection routines to calculate and draw the road shape and target icons in the computer's graphic memory.
  • the road shape and target icons are then projected as a virtual view in the driver's visual field, as illustrated in FIG. 3B above.
  • the coordinate transformation block transforms the coordinate frame of the digital map from the global coordinate frame to the local coordinate frame.
  • the local coordinate frame is a moving coordinate frame that is illustratively attached to the driver's head.
  • the coordinate transformation is illustratively performed by multiplying the road data points by a four-by-four homogeneous transformation matrix, although any other coordinate system transformation can be used, such as the Quaternion or other approach. Because the vehicle is moving, the matrix must be updated in real time. Movement of the driver's eye that is included in the matrix is also measured and fed into the matrix calculation in real time. Where no head tracking system 32 is provided, the head angle and position of the driver's eyes are assumed to be constant and the driver is assumed to be looking forward from a nominal position.
  • the heading angle of the vehicle is estimated from the past history of the GPS location data.
  • a rate gyroscope can be used to determine vehicle heading as well.
  • An absolute heading angle is used in computing the correct coordinate transformation matrix.
  • any other suitable method to measure an absolute heading angle can be used as well, such as a magnetometer (electronic compass) or an inertial measurement unit. Further, where pitch and roll sensors are not used, these angles can be assumed to be 0.
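  • A minimal sketch of one way the heading could be estimated from the past history of GPS location data, assuming global-frame coordinates with x pointing east and y pointing north and the clockwise-from-north yaw convention described later for the vehicle frame. The function name and the simple two-fix differencing are illustrative, not from the patent.

```python
import math

def heading_from_fixes(x_prev, y_prev, x_curr, y_curr):
    """Estimate absolute heading from two successive position fixes.

    Coordinates are in the global frame (x east, y north). The heading is
    returned in degrees, measured clockwise from true north, matching the
    yaw-angle convention described for the vehicle coordinate frame.
    """
    dx = x_curr - x_prev   # eastward displacement
    dy = y_curr - y_prev   # northward displacement
    heading = math.degrees(math.atan2(dx, dy))   # clockwise from north
    return heading % 360.0
```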
  • the ranging information from ranging system 18 is also received by controller 12 (shown in FIG. 2). This is indicated by block 83 in FIG. 4A and by block 86 in FIG. 4B.
  • the ranging data illustratively indicates the presence and location of targets around the vehicle.
  • the radar ranging system 18 developed and available from Eaton Vorad, or Delphi, Celsius Tech, or other vendors provides a signal indicative of the presence of a radar target, its range, its range rate and the azimuth angle of that target with respect to the radar apparatus.
  • controller 12 queries the digital road map in geospatial database 16 and extracts local road data 88 .
  • the local road data provides information with respect to road boundaries as seen by the operator in the position of the vehicle, and also other potential radar targets, such as road signs, road barriers, etc.
  • Accessing geospatial database 16 (which can be stored on the vehicle and receive periodic updates or can be stored remotely and accessed wirelessly) is indicated by block 90 in FIG. 4B.
  • Controller 12 determines whether the targets indicated by target data 83 are expected targets. Controller 12 does this by examining the information in geospatial database 16. In other words, if the targets correspond to road signs, road barriers, bridges, or other information which would provide a radar return to ranging system 18, but which is expected because it is mapped into database 16 and does not need to be brought to the attention of the driver, that information can be filtered out such that the driver is not alerted to every single possible item on the road which would provide a radar return. Certain objects may a priori be programmed to be brought to the attention of the driver. Such items may be guard rails, bridge abutments, etc., and the filtering can be selective, as desired.
  • all filtering can be turned off so all objects are brought to the driver's attention.
  • the driver can change filtering based on substantially any predetermined filtering criteria, such as distance from the road or driver, location relative to the road or the driver, whether the objects are moving or stationary, or substantially any other criteria.
  • criteria can be invoked by the user through the user interface, or they can be pre-programmed into controller 12 .
  • Otherwise, the target information is determined to correspond to an unexpected target, such as a moving vehicle ahead of the vehicle on which system 10 is implemented, a stalled car or a pedestrian on the side of the road, or some other transitory target which has not been mapped into the geospatial database as a permanent, or expected, target. It has been found that if all expected targets are brought to the operator's attention, this substantially amounts to noise such that when real targets are brought to the operator's attention, they are not as readily perceived by the operator. Therefore, filtering of targets not posing a threat to the driver is performed, as illustrated by block 92 in FIG. 4B.
  • the frame transformation is performed using the transformation matrix.
  • the result of the coordinate frame transformation provides the road boundary data, as well as the target data, seen from the driver's eye perspective.
  • the road boundary and target data is output, as illustrated by block 94 in FIG. 4B, and as indicated by block 96 in FIG. 4A.
  • the road and target shapes are generated by processor 12 for projection in the proper perspective.
  • the actual image projected is clipped such that it only includes that part of the road which would be visible by the operator with an unobstructed forward-looking visual field. Clipping is described in greater detail below, and is illustrated by block 104 in FIG. 4A. The result of the entire process is the projected road and target data as illustrated by block 106 in FIG. 4A.
  • FIG. 4C is a more detailed flow diagram illustrating how targets are projected or filtered from the display.
  • It is first determined whether ranging system 18 is providing a target signal indicating the presence of a target. This is indicated by block 108. If so, then when controller 12 accesses geospatial database 16, controller 12 determines whether sensed targets correlate to any expected targets. This is indicated by block 110. If so, the expected targets are filtered from the sensed targets.
  • ranging system 18 may provide an indication of a plurality of targets at any given time. In that case, only the expected targets are filtered from the target signal. This is indicated by block 112 . If any targets remain, other than the expected targets, the display signal is generated in which the unexpected, or transitory, targets are placed conformally on the display. This is indicated by block 114 .
  • the display signal is also configured such that guidance markers (such as lane boundaries, lane striping or road edges) are also placed conformally on the display. This is indicated by block 116.
  • the display signal is then output to the projector such that the conformal, augmented display is provided to the user. This is indicated by block 118 .
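  • The following sketch illustrates the filtering idea described above: sensed targets are compared with expected, mapped objects and suppressed when they match within a tolerance, so only transitory targets are drawn. The (range, azimuth) representation and the tolerance values are assumptions for illustration; a fielded system would tune them to the ranging sensor's accuracy.

```python
def filter_targets(sensed, expected, range_tol=3.0, azimuth_tol_deg=2.0):
    """Remove sensed targets that match an expected (mapped) roadside object.

    `sensed` and `expected` are lists of (range_m, azimuth_deg) tuples in
    the vehicle frame. Anything left over is treated as a transitory,
    unexpected target to be placed conformally on the display.
    """
    unexpected = []
    for rng, az in sensed:
        matches_expected = any(
            abs(rng - e_rng) <= range_tol and abs(az - e_az) <= azimuth_tol_deg
            for e_rng, e_az in expected
        )
        if not matches_expected:
            unexpected.append((rng, az))   # transitory target: show it
    return unexpected
```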
  • the term “conformal” is used herein to indicate that the “virtual image” generated by the present system projects images represented by the display in a fashion such that they are substantially aligned, and in proper perspective with, the actual images which would be seen by the driver, with an unobstructed field of view.
  • augmented means that the actual image perceived by the operator is supplemented by the virtual image projected onto the head up display. Therefore, even if the driver's forward-looking visual field is obstructed, the augmentation allows the operator to receive and process information, in the proper perspective, as to the actual objects which would be seen with an unobstructed view.
  • In FIG. 5A, the global coordinate frame is illustrated by the axes 120. All distances and angles are measured about these axes.
  • FIG. 5A also shows vehicle 124 , with the vehicle coordinate frame represented by axes 126 and the user's eye coordinate frame (also referred to as the graphic screen coordinate frame) illustrated by axes 128 .
  • FIG. 5A also shows road point data 130 , which illustrates data corresponding to the center of road 132 .
  • the capital letters “X”, “Y” and “Z” in this description are used as names of each axis.
  • the positive Y-axis is the direction to true north
  • the positive X-axis is the direction to true east in global coordinate frame 120 .
  • Compass 122 is drawn to illustrate that the Y-axis of global coordinate frame 120 points due north.
  • the elevation is defined by the Z-axis and is used to express elevation of the road shape and objects adjacent to, or on, the road.
  • All of the road points 130 stored in the road map file in geospatial database 16 are illustratively expressed in terms of the global coordinate frame 120 .
  • the vehicle coordinate frame 126 (V) is defined and used to express the vehicle configuration data, including the location and orientation of the driver's eye within the operator compartment, relative to the origin of the vehicle.
  • the vehicle coordinate frame 126 is attached to the vehicle and moves as the vehicle moves.
  • the origin is defined as the point on the ground under the location of the GPS receiver antenna. Everything in the vehicle is measured from the ground point under the GPS antenna. Other points, such as located on a vertical axis through the GPS receiver antenna or at any other location on the vehicle, can also be selected.
  • the forward moving direction is defined as the positive y-axis.
  • the direction to the right when the vehicle is moving forward is defined as the positive x-axis
  • the vertical upward direction is defined as the positive z-axis which is parallel to the global coordinate frame Z-axis.
  • the yaw angle, i.e. heading angle, of the vehicle is measured from true north, and has a positive value in the clockwise direction (since the positive z-axis points upward).
  • the pitch angle is measured about the x-axis in coordinate frame 126 and the roll angle is measured as a rotation about the y-axis in coordinate frame 126 .
  • the local L-coordinate frame 128 is defined and used to express the road data relative to the viewer's location and direction.
  • the coordinate system 128 is also referred to herein as the local coordinate frame. Even though the driver's eye location and orientation may be assumed to be constant (where no head tracking system 32 is used), the global information still needs to be converted into the eye-coordinate frame 128 for calculating the perspective projection.
  • the location of the eye i.e. the viewing point, is the origin of the local coordinate frame.
  • the local coordinate frame 128 is defined with respect to the vehicle coordinate frame.
  • the relative location of the driver's eye from the origin of the vehicle coordinate frame is measured and used in the coordinate transformation matrix described in greater detail below.
  • the directional angle information from the driver's line of sight is used in constructing the projection screen. This angle information is also integrated into the coordinate transformation matrix.
  • the objects in the outer world are drawn on a flat two-dimensional video projection screen which corresponds to the virtual focal plane, or virtual screen 50 perceived by human drivers.
  • the virtual screen coordinate frame has only two axes.
  • the positive x-axis of the screen is defined to be the same as the positive x-axis of the vehicle coordinate frame 126 for ease in coordinate conversion.
  • the upward direction in the screen coordinate frame is the same as the positive z-axis and the forward-looking direction (or distance to the objects located on the visual screen) is the positive y-axis.
  • the positive x-axis and the y-axis in the virtual projection screen 50 are mapped to the positive x-axis and the negative y-axis in computer memory space, because the upper left corner is deemed to be the beginning of the video memory.
  • Road data points, including the left and right edges, which are expressed with respect to the global coordinate frame {G} as P_k, shown in FIG. 5B-1, are converted into the local coordinate frame {L}, which is attached to the moving vehicle 124 coordinate frame {V}. Its origin (O_V) and direction (θ_V) are changing continually as the vehicle 124 moves.
  • the origin (O_L) of the local coordinate frame {L}, i.e. the driver's eye location, and its orientation (θ_E) change as the driver moves his or her head and eyeballs.
  • a homogeneous transformation matrix [T] was defined and used to convert the global coordinate data into local coordinate data.
  • the matrix [T] is developed illustratively, as follows.
  • the parameters in FIGS. 5B-1 and 5B-2 are as follows:
  • P_k is the k-th road point
  • O_G is the origin of the global coordinate frame
  • O_V is the origin of the vehicle coordinate frame with respect to the global coordinate frame
  • O_E is the origin of the local eye-attached coordinate frame.
  • Any point in 3-dimensional space can be expressed in terms of either a global coordinate frame or a local coordinate frame. Because everything seen by the driver is defined with respect to his or her location and viewing direction (i.e. the relative geometrical configuration between the viewer and the environment), all of the viewable environment should be expressed in terms of a local coordinate frame. Then, any objects or line segments can be projected onto a flat surface or video screen by means of the perspective projection. Thus, the mathematical calculation of the coordinate transformation is performed by constructing the homogeneous transformation matrix and applying the matrix to the position vectors.
  • the coordinate transformation matrix [T] is defined as a result of the multiplication of a number of matrices described in the following paragraphs.
  • the symbol $^{G}P$ denotes a point in terms of coordinates X, Y, Z as referenced from the global coordinate system.
  • the symbol $^{L}P$ represents the same point in terms of x, y, z in the local coordinate system.
  • the transformation matrix $^{L}_{G}[T_{tran}]$ allows for a translational transformation from the global {G} coordinate system to the local {L} coordinate system.
  • $\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta & 0 & 0 \\ -\sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$ (Eq. 5)
  • $[T] = \begin{bmatrix} C_E & S_E & 0 & -O_{Lx}C_E - O_{Ly}S_E \\ -S_E & C_E & 0 & O_{Lx}S_E - O_{Ly}C_E \\ 0 & 0 & 1 & -O_{Lz} \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} C_V & S_V & 0 & -O_{Vx}C_V - O_{Vy}S_V \\ -S_V & C_V & 0 & O_{Vx}S_V - O_{Vy}C_V \\ 0 & 0 & 1 & -O_{Vz} \\ 0 & 0 & 0 & 1 \end{bmatrix}$, where $C_E = \cos\theta_E$, $S_E = \sin\theta_E$, $C_V = \cos\theta_V$, $S_V = \sin\theta_V$ (Eq. 11)
  • $T_{13} = 0$ (Eq. 16)
  • $T_{23} = 0$ (Eq. 20)
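  • A brief numpy sketch, assuming the form of [T] reconstructed in Eq. 11 above: each frame contributes a rotation about the vertical axis through its heading (clockwise-positive) and a translation of its origin, and the global-to-eye transform is the product of the eye-frame and vehicle-frame matrices. The variable and function names are illustrative, and the eye offset and its heading are taken to be measured in the vehicle frame.

```python
import numpy as np

def frame_matrix(theta_rad, origin):
    """Rotation about the vertical axis by theta (clockwise-positive heading)
    combined with translation of the frame origin, as a 4x4 homogeneous
    matrix with the structure of the factors in Eq. 11."""
    c, s = np.cos(theta_rad), np.sin(theta_rad)
    ox, oy, oz = origin
    return np.array([
        [c,   s,   0.0, -ox * c - oy * s],
        [-s,  c,   0.0,  ox * s - oy * c],
        [0.0, 0.0, 1.0, -oz],
        [0.0, 0.0, 0.0, 1.0],
    ])

def global_to_eye(theta_v, origin_v, theta_e, origin_l):
    """[T] of Eq. 11: eye-frame matrix times vehicle-frame matrix."""
    return frame_matrix(theta_e, origin_l) @ frame_matrix(theta_v, origin_v)

# Applying [T] to a global road point expressed homogeneously:
# local_point = global_to_eye(theta_v, o_v, theta_e, o_l) @ np.array([X, Y, Z, 1.0])
```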
  • the display screen is the virtual focal plane.
  • the display screen is the plane located at position S_y, parallel to the z-x plane, where s_x and s_z are the horizontal and vertical dimensions of the display screen.
  • when the object is projected onto the screen, it should be projected with the correct perspective so that the projected images match the outer scene.
  • the head up display system matches the drawn road shapes (exactly, or at least closely) to the actual road which is in front of the driver.
  • the perspective projection makes closer objects appear larger and further objects appear smaller.
  • the perspective projection can be calculated from triangle similarity as shown in FIGS. 5G to 5H-2. From the figures, one can find the location of the point s(x, z) for the known data p(x, y, z).
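  • A small sketch of the triangle-similarity projection just described, using the s(x, z) and p(x, y, z) naming from the text; the screen-distance parameter corresponds to S_y. The guard against non-positive depth reflects the divide-by-zero caveat discussed for the "enlarging space" below. The function name is an assumption.

```python
def project_point(px, py, pz, screen_distance):
    """Perspective projection of a local-frame point p(x, y, z) onto the
    virtual screen located screen_distance (S_y) ahead of the eye.

    By similar triangles, a point twice as far away projects at half the
    offset, so nearer objects appear larger and farther objects smaller.
    """
    if py <= 0:
        raise ValueError("point must lie in front of the viewer")
    sx = screen_distance * px / py   # horizontal screen coordinate
    sz = screen_distance * pz / py   # vertical screen coordinate
    return sx, sz
```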
  • the points are connected using straight lines to build up the road shapes.
  • the line-connected road shape provides a better visual cue of the road geometry than plotting just a series of dots.
  • the visible limit is illustrated by FIGS. 5I to 5J-3.
  • the visible three-dimensional volume is defined as a rectangular cone cut at the display screen. Every object in this visible region needs to be displayed on the projection screen. Objects in the small rectangular cone defined by O_L and the display screen, a three-dimensional volume between the viewer's eye and the display screen, are displayed in an enlarged size. If the object in this region is too close to the viewer, then it results in an out-of-limit error or a divide-by-zero error during the calculation. However, usually there are no objects located in the "enlarging space."
  • FIGS. 5J-1 to 5J-3 and the following equations of lines were used for checking whether an object is in the visible space or not. Using these clipping techniques, if the position of a point in the local coordinate frame is defined as p(x, y, z), then this point is visible to the viewer only if:
  • the point p is in front of the display screen.
  • Equations in the diagrams of FIGS. 5J-1 to 5J-3 are not line equations but equations of planes in three-dimensional space.
  • the above conditions can be expressed by the following equations mathematically, which describe what we mean by "in front of."
  • FIGS. 5K-1 to 5K-3 show one of many possible situations.
  • FIG. 5K-1 is a top view, which is a projection onto the x-y plane. It will now be described how to locate point p so that only the contained segment is drawn.
  • k is an arbitrary real number (0 < k < 1), and
  • the projected values s_x and s_z can be calculated by a perspective projection in the same manner as the other parameters.
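  • A sketch of the visibility test and parametric clipping step described above. For simplicity it checks only the frustum defined by the eye and the screen edges and clips only against the screen plane y = S_y; the half-dimension parameters and function names are assumptions, not the patent's notation.

```python
def is_visible(px, py, pz, sy, sx_half, sz_half):
    """Check whether local-frame point p lies inside the viewing frustum.

    sy is the screen distance; sx_half and sz_half are half the screen's
    horizontal and vertical dimensions. The planes through the eye and the
    screen edges bound the visible volume.
    """
    if py < sy:                        # behind the virtual screen
        return False
    return (abs(px) <= sx_half * py / sy and
            abs(pz) <= sz_half * py / sy)

def clip_to_screen(p_in, p_out, sy):
    """Find the point where a segment crosses the screen plane y = sy.

    p_in is inside (y >= sy), p_out is outside (y < sy). The crossing point
    is p_out + k * (p_in - p_out) with 0 < k < 1, the parametric form used
    for clipping so that only the contained segment is drawn.
    """
    k = (sy - p_out[1]) / (p_in[1] - p_out[1])
    return tuple(o + k * (i - o) for i, o in zip(p_in, p_out))
```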
  • FIG. 6 illustrates a vehicle 200 with placement of ranging system 18 thereon.
  • Vehicle 200 is, illustratively, a snow plow which includes an operator compartment 202 and a snow plow blade 204 .
  • Ranging system 18 in the embodiment illustrated in FIG. 6, includes a first radar subsystem 206 and a second radar subsystem 208 . It can be desirable to be able to locate targets closely proximate to blade 204 . However, since radar subsystems 206 and 208 are directional, it is difficult, with one subsystem, to obtain target coverage close to blade 204 , yet still several hundred meters ahead of vehicle 200 , because of the placement of blade 204 .
  • the two subsystems 206 and 208 are employed to form ranging system 18.
  • Radar subsystem 208 is located just above blade 204 and is directed approximately straight ahead, in a horizontal plane. Radar subsystem 206 is located above blade 204 and is directed downwardly, such that targets can be detected closely proximate the front of blade 204.
  • the radar subsystems 206 and 208 are each illustratively an array of aligned radar detectors which is continuously scanned by a processor such that radar targets can be detected, and their range, range rate and azimuth angle from the radar subsystem 206 or 208 can be estimated as well. In this way, information regarding the location of radar targets can be provided to controller 12 such that controller 12 can display an icon or other visual element representative of the target on the head up display 22. Of course, the icon can be opaque or transparent.
  • the icon representative of the target can be shaped in any desirable shape.
  • bit maps can be placed on the head up display 22 which represent targets.
  • targets can be sized, colored or otherwise coded to indicate distance. In other words, if the targets are very close to vehicle 200, they can be large, begin to flash, or turn red. Similarly, if the targets are a long distance from vehicle 200, they can maintain a constant glow or halo.
  • FIG. 7 is a flow diagram illustrating how ranging system 18 can be used, in combination with the remainder of the system, to verify operation of the subsystems.
  • controller 12 receives a position signal. This is indicated by block 210 . This is the signal, illustratively, from the vehicle location system 14 . Controller 12 then receives a ranging signal, as indicated by block 212 in FIG. 7. This is the signal from ranging system 18 which is indicative of targets located within the ranging field of vehicle 200 .
  • controller 12 queries geospatial database 16. This is indicated by block 214. In querying geospatial database 16, controller 12 verifies that targets which are mapped in the database, such as street signs, road barriers, etc., are in fact being detected by ranging system 18 at their expected locations.
  • If the sensed targets correlate with the mapped items, controller 12 determines that system 10 is operating properly. This is indicated by blocks 216 and 218. In view of this determination, controller 12 can provide an output to user interface 20 indicating that the system is healthy.
  • Otherwise, controller 12 determines that something is not operating correctly: either ranging system 18 is malfunctioning, the vehicle positioning system is malfunctioning, information retrieval from the geospatial database 16 is malfunctioning, the geospatial database 16 has been corrupted, etc.
  • controller 12 illustratively provides an output to user interface (UI) 20 indicating a system problem exists. This is indicated by block 220 . Therefore, while controller 12 may not be able to detect the exact type of error which is occurring, controller 12 can detect that an error is occurring and provide an indication to the operator to have the system checked or to have further diagnostics run.
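  • A rudimentary sketch of this diagnostic cross-check: expected roadside objects retrieved from the database for the current position are compared with what ranging system 18 actually reports, and a poor match suggests that some part of the chain (ranging system, positioning, database retrieval) is suspect. The range-only comparison and the thresholds are illustrative assumptions, not values from the patent.

```python
def system_healthy(sensed, expected, range_tol=5.0, min_match_ratio=0.5):
    """Rudimentary diagnostic: are mapped roadside objects actually being seen?

    `sensed` and `expected` are lists of ranges (meters) to targets in the
    current ranging field. If too few expected items are matched, the system
    flags a potential fault without identifying which subsystem caused it.
    """
    if not expected:
        return True   # nothing mapped nearby, nothing to verify
    matched = sum(
        1 for e in expected if any(abs(e - s) <= range_tol for s in sensed)
    )
    return matched / len(expected) >= min_match_ratio
```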
  • The present invention need not be provided only for the forward-looking field of view of the operator.
  • For example, the present system 10 can be implemented as a side-looking or rear-looking virtual mirror.
  • In that case, ranging system 18 includes radar detectors (or other similar devices) located on the sides or to the rear of vehicle 200.
  • The transformation matrix would be adjusted to transform the view of the operator to the side-looking or rear-looking field of view, as appropriate.
  • Vehicles or objects which are sensed, but which are not part of the fixed geospatial landscape, are presented iconically based on the radar or other range sensing devices in ranging system 18.
  • The fixed lane boundaries are also presented conformally to the driver.
  • Fixed geospatial landmarks which may be relevant to the driver (such as the backs of road signs, special pavement markings, bridges being passed under, water towers, trees, etc.) can also be presented to the user, in the proper perspective. This gives the driver a sense of motion as well as cues to proper velocity.
  • One illustration of the present invention as both a forward-looking driver assist device and one which assists in a rear view is illustrated in FIG. 8.
  • In FIG. 8, a forward-looking field of view is illustrated by block 250 while the virtual rear view mirror is illustrated by block 252.
  • In the virtual rear view mirror, the view is provided just as the operator would see it when looking in a traditional mirror.
  • The mirror may illustratively be virtually gimbaled along any axis (i.e., the image is rotated from side-to-side or top-to-bottom) in software such that the driver can change the angle of the mirror, just as the driver currently can mechanically, to accommodate different driver sizes, or to obtain a different view than is currently being represented by the mirror.
  • FIG. 9 gives another illustrative embodiment of a vehicle positioning system which provides vehicle position along the roadway.
  • The system illustrated in FIG. 9 can, illustratively, be used as the auxiliary vehicle positioning system 30 illustrated in FIG. 2A. This can provide vehicle positioning information when, for example, the DGPS signal is lost, momentarily, for whatever reason.
  • In this embodiment, vehicle 200 includes an array of magnetic sensors 260.
  • The road lane 262 is bounded by magnetic strips 264 which, illustratively, are formed of tape having magnetized portions 266 therein. Although a wide variety of such magnetic strips could be used, one illustrative embodiment is illustrated in U.S. Pat. No. 5,853,846 to the 3M Company of St. Paul, Minn.
  • The magnetometers in array 260 are monitored such that the field strength sensed by each magnetometer is identified. Therefore, as the vehicle approaches strip 264 and begins to cross lane boundary 268, magnetometers 270 and 272 begin to provide a signal indicating a larger field strength.
  • Scanning the array of magnetometers is illustratively accomplished using a microprocessor which scans them quickly enough to detect even fairly high frequency changes in vehicle position toward or away from the magnetic elements in the marked lane boundaries. In this way, a measure of the vehicle's position in the lane can be obtained, even if the primary vehicle system is temporarily not working.
  • Although FIG. 9 shows magnetometers mounted to the front of the vehicle, they can be mounted to the rear as well. This would allow an optional calculation of the vehicle's yaw angle relative to the magnetic strips.
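  • One simple way to turn the scanned magnetometer readings into a lateral-position estimate is a field-strength-weighted centroid across the array; with both a front and a rear array, the two offsets also yield a yaw angle relative to the strip. The following is only a sketch under those assumptions (sensor spacing, baseline and the centroid method are illustrative choices, not the prescribed algorithm).

```python
import math

def lateral_offset(strengths, spacing_m=0.1):
    """Estimate where the magnetic strip lies under a sensor array.

    strengths -- field strength per magnetometer, ordered left to right
    spacing_m -- assumed spacing between adjacent magnetometers
    Returns the offset (meters) of the strip from the array center,
    computed as a field-strength-weighted centroid.
    """
    total = sum(strengths)
    if total == 0.0:
        return None                       # strip not detected
    center = (len(strengths) - 1) / 2.0
    centroid = sum(i * s for i, s in enumerate(strengths)) / total
    return (centroid - center) * spacing_m

def yaw_relative_to_strip(front_offset, rear_offset, baseline_m=5.0):
    """Yaw angle (radians) of the vehicle relative to the magnetic strip,
    from front- and rear-array offsets separated by an assumed baseline."""
    return math.atan2(front_offset - rear_offset, baseline_m)

front = lateral_offset([0, 1, 3, 8, 3, 1, 0])
rear = lateral_offset([0, 0, 1, 3, 8, 3, 1])
print(front, rear, yaw_relative_to_strip(front, rear))
```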
  • FIG. 10 is a block diagram of another embodiment of the present invention. All items are the same as those illustrated in FIG. 1 and are similarly numbered, and operate substantially the same way. However, rather than providing an output to display 22 , controller 12 provides an output to neurostimulator 300 .
  • Neurostimulator 300 is a stimulating device which operates in a known manner to provide stimulation signals to the cortex to elicit image formation in the brain.
  • The signal provided by controller 12 includes information as to eye perspective and image size and shape, thus enhancing the ability of neurostimulator 300 to properly stimulate the cortex in a meaningful way. Of course, as the person using the system moves and turns the head, the image stimulation will change accordingly.
  • It can thus be seen that the present invention provides a significant advancement in the art of mobility assist devices, particularly with respect to moving in conditions where the outward-looking field of view of the observer is partially or fully obstructed.
  • The present invention provides assistance in not only lane keeping, but also in collision avoidance, since the driver can use the system to steer around displayed obstacles.
  • The present invention can also be used in many environments such as snow removal, mining or any other environment where airborne matter obscures vision.
  • The invention can also be used when walking or driving in low light areas or at night, or through wooded or rocky areas where vision is obscured by the terrain.
  • The present invention can be used on ships or boats to, for example, guide the water-going vessel into port, through a canal, through locks and dams, or around rocks or other obstacles.
  • The present invention can also be used on non-motorized, earth-based vehicles such as bicycles or wheelchairs, by skiers, or with substantially any other vehicle.
  • The present invention can also be used to aid blind or vision impaired persons.

Abstract

The present invention is directed to a visual mobility assist device which provides a conformal, augmented display to assist a moving body. When the moving body is a motor vehicle, for instance (although it can be substantially any other body), the present invention assists the driver in either lane keeping or collision avoidance, or both. The system can display objects such as lane boundaries, targets, other navigational and guidance elements or objects, or a variety of other indicators, in proper perspective, to assist the driver.

Description

    BACKGROUND OF THE INVENTION
  • The present invention deals with mobility assistance. More particularly, the present invention deals with a vision assist device in the form of a head up display (HUD) for assisting mobility of a mobile body, such as a person, a non-motorized vehicle or a motor vehicle. [0001]
  • Driving a motor vehicle on the road, with a modicum of safety, can be accomplished if two different aspects of driving are maintained. The first is referred to as “collision avoidance” which means maintaining motion of a vehicle without colliding with other obstacles. The second aspect in maintaining safe driving conditions is referred to as “lane keeping” which means maintaining forward motion of a vehicle without erroneously departing from a given driving lane. [0002]
  • Drivers accomplish collision avoidance and lane keeping by continuously controlling vehicle speed, lateral position and heading direction by adjusting the acceleration and brake pedals, as well as the steering wheel. The ability to adequately maintain both collision avoidance and lane keeping is greatly compromised when the forward-looking visual field of a driver is obstructed. In fact, many researchers have concluded that the driver's ability to perceive the forward-looking visual field is the most essential input for the task of driving. [0003]
  • There are many different conditions which can obstruct (to varying degrees) the forward-looking visual field of a driver. For example, heavy snowfall, heavy rain, fog, smoke, darkness, blowing dust or sand, or any other substance or mechanism which obstructs (either partially or fully) the forward-looking visual field of a driver makes it difficult to identify obstacles and road boundaries which, in turn, compromises collision avoidance and lane keeping. Similarly, even on sunny, or otherwise clear days, blowing snow or complete coverage of the road by snow, may result in a loss of visual perception of the road. Such “white out” conditions are often encountered by snowplows working on highways, due to the nature of their task. The driver's forward-looking vision simply does not provide enough information to facilitate safe control of the vehicle. This can be exacerbated, particularly on snow removal equipment, because even on a relatively calm, clear day, snow can be blown up from the front or sides of snowplow blades, substantially obstructing the visual field of the driver. [0004]
  • Similarly, driving at night in heavy snowfall causes the headlight beams of the vehicle to be reflected into the driver's forward-looking view. Snow flakes glare brightly when they are illuminated at night and make the average brightness level perceived by the driver's eye higher than normal. This higher brightness level causes the iris to adapt to the increased brightness and, as a result, the eye becomes insensitive to the darker objects behind the glaring snowflakes, which are often vital to driving. Such objects can include road boundaries, obstacles, other vehicles, signs, etc. [0005]
  • Research has also been done which indicates that prolonged deprivation of visual stimulation can lead to confusion. For example, scientists believe that one third of human brain neurons are devoted to visual processing. Pilots, who are exposed to an empty visual field for longer than a certain amount of time, such as during high-altitude flight, or flight in thick fog, have a massive number of unstimulated visual neurons. This can lead to control confusion which makes it difficult for the pilot to control the vehicle. A similar condition can occur when attempting to navigate or plow a snowy road during daytime heavy snowfall in a featureless rural environment. [0006]
  • Many other environments are also plagued by poor visibility conditions. For instance, in military or other environments one may be moving through terrain at night, either in a vehicle or on foot, without the assistance of lights. Further, in mining environments, or simply when driving on a dirt, sand or gravel surface, particulate matter can obstruct vision. In water-going vehicles, it can be difficult to navigate through canals, around rocks, into a port, or through locks and dams because obstacles may be obscured by fog, below the water, or by other weather conditions. Similarly, surveyors may find it difficult to survey land with dense vegetation or rock formations which obstruct vision. People in non-motorized vehicles (such as in wheelchairs, on bicycles, on skis, etc.) can find themselves in these environments as well. All such environments, and many others, have visual conditions which act as a hindrance to persons working in, or moving through, those environments. [0007]
  • SUMMARY OF THE INVENTION
  • The present invention is directed to a visual assist device which provides a conformal, augmented display to assist in movement of a mobile body. In one example, the mobile body is a vehicle (motorized or non-motorized) and the present invention assists the driver in either lane keeping or collision avoidance, or both. The system can display lane boundaries, other navigational or guidance elements or a variety of other objects in proper perspective, to assist the driver. In another example, the mobile body is a person (or group of people) and the present invention assists the person in either staying on a prescribed path or collision avoidance or both. The system can display path boundaries, other navigational or guidance elements or a variety of other objects in proper perspective, to assist the walking person. [0008]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a mobility assist device in accordance with one embodiment of the present invention. [0009]
  • FIG. [0010] 2 is a more detailed block diagram of another embodiment of the mobility assist device.
  • FIG. 3A is a partial-pictorial and partial block diagram illustrating operation of a mobility assist device in accordance with one embodiment of the present invention. [0011]
  • FIG. 3B illustrates the concept of a combiner and virtual screen. [0012]
  • FIGS. 3C, 3D and [0013] 3E are pictorial illustrations of a conformal, augmented projection and display in accordance with one embodiment of the present invention.
  • FIGS. 3F, 3G, [0014] 3H and 3I are pictorial illustrations of an actual conformal, augmented display in accordance with an embodiment of the present invention.
  • FIGS. [0015] 4A-4C are flow diagrams illustrating general operation of the mobility assist device.
  • FIG. 5A illustrates coordinate frames used in accordance with one embodiment of the present invention. [0016]
  • FIGS. [0017] 5B-1 to 5K-3 illustrate the development of a coordinate transformation matrix in accordance with one embodiment of the present invention.
  • FIG. 6 is a side view of a vehicle employing the ranging system in accordance with one embodiment of the present invention. [0018]
  • FIG. 7 is a flow diagram illustrating a use of the present invention in performing system diagnostics and improved radar processing. [0019]
  • FIG. 8 is a pictorial view of a head up virtual mirror, in accordance with one embodiment of the present invention. [0020]
  • FIG. 9 is a top view of one embodiment of a system used to obtain position information corresponding to a vehicle. [0021]
  • FIG. 10 is a block diagram of another embodiment of the present invention.[0022]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • The present invention can be used with substantially any mobile body, such as a human being, a motor vehicle or a non-motorized vehicle. However, the present description proceeds with respect to an illustrative embodiment in which the invention is implemented on a motor vehicle as a driver assist device. FIG. 1 is a simplified block diagram of one embodiment of [0023] driver assist device 10 in accordance with the present invention. Driver assist device 10 includes controller 12, vehicle location system 14, geospatial database 16, ranging system 18, operator interface 20 and display 22.
  • In one embodiment, [0024] controller 12 is a microprocessor, microcontroller, digital computer, or other similar control device having associated memory and timing circuitry. It should be understood that the memory can be integrated with controller 12, or be located separately therefrom. The memory, of course, may include random access memory, read only memory, magnetic or optical disc drives, tape memory, or any other suitable computer readable medium.
  • [0025] Operator interface 20 is illustratively a keyboard, a touch-sensitive screen, a point and click user input device (e.g. a mouse), a keypad, a voice activated interface, joystick, or any other type of user interface suitable for receiving user commands, and providing those commands to controller 12, as well as providing a user viewable indication of operating conditions from controller 12 to the user. The operator interface may also include, for example, the steering wheel and the throttle and brake pedals suitably instrumented to detect the operator's desired control inputs of heading angle and speed. Operator interface 20 may also include, for example, a LCD screen, LEDs, a plasma display, a CRT, audible noise generators, or any other suitable operator interface display or speaker unit.
  • As is described in greater detail later in the specification, [0026] vehicle location system 14 determines and provides a vehicle location signal, indicative of the location of the vehicle in which driver assist device 10 is mounted, to controller 12. Thus, vehicle location system 14 can include a global positioning system receiver (GPS receiver) such as a differential GPS receiver, an earth reference position measuring system, a dead reckoning system (such as odometry and an electronic compass), an inertial measurement unit (such as accelerometers, inclinometers, or rate gyroscopes), etc. In any case, vehicle location system 14 periodically provides a location signal to controller 12 which indicates the location of the vehicle on the surface of the earth.
  • [0027] Geospatial database 16 contains a digital map which digitally locates road boundaries, lane boundaries, possibly some landmarks (such as road signs, water towers, or other landmarks) and any other desired items (such as road barriers, bridges etc . . . ) and describes a precise location and attributes of those items on the surface of the earth.
  • It should be noted that there are many possible coordinate systems that can be used to express a location on the surface of the earth, but the most common coordinate frames include longitudinal and latitudinal angle, state coordinate frame, and county coordinate frame. [0028]
  • Because the earth is approximately spherical in shape, it is convenient to determine a location on the surface of the earth if the location values are expressed in terms of an angle from a reference point. Longitude and latitude are the most commonly used angles to express a location on the earth's surface or in orbits around the earth. Latitude is a measurement on a globe of location north or south of the equator, and longitude is a measurement of the location east or west of the prime meridian at Greenwich, the specifically designated imaginary north-south line that passes through both geographic poles of the earth and Greenwich, England. The combination of meridians of longitude and parallels of latitude establishes a framework or grid by means of which exact positions can be determined in reference to the prime meridian and the equator. Many of the currently available GPS systems provide latitude and longitude values as location data. [0029]
  • Even though the actual landscape on the earth is a curved surface, it is recognized that land is utilized as if it were a flat surface. A Cartesian coordinate system whose axes are defined as three perpendicular vectors is usually used. Each state has its own standard coordinate system to locate points within its boundaries. All construction and measurements are done using distance dimensions (such as meters or feet). Therefore, a curved surface on the earth needs to be converted into a flat surface, and this conversion is referred to as a projection. There are many projection methods used as standards for various local areas on the earth's surface. Every projection involves some degree of distortion due to the fact that a surface of a sphere is constrained to be mapped onto a plane. [0030]
  • One standard projection method is the Lambert Conformal Conic Projection Method. This projection method is extensively used in an ellipsoidal form for large scale mapping of regions of predominantly east-west extent, including topographic quadrangles for many of the U.S. state plane coordinate system zones, maps in the International Map of the World series and the U.S. State Base maps. The method uses well known, and publicly available, conversion equations to calculate state coordinate values from GPS receiver longitude and latitude angle data. [0031]
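  • For illustration only, the following sketch implements the spherical form of the Lambert Conformal Conic projection. Production state plane coordinates use the ellipsoidal form with official zone constants; the standard parallels, origin and radius below are assumed example values, not the parameters of any particular state plane zone.

```python
import math

def lambert_conformal_conic(lat_deg, lon_deg,
                            lat1_deg=45.0, lat2_deg=47.0,    # standard parallels (assumed)
                            lat0_deg=44.0, lon0_deg=-94.0,   # projection origin (assumed)
                            radius_m=6_371_000.0):
    """Project latitude/longitude (degrees) to x, y (meters) using the
    spherical Lambert Conformal Conic equations with two standard parallels."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    lat1, lat2 = math.radians(lat1_deg), math.radians(lat2_deg)
    lat0, lon0 = math.radians(lat0_deg), math.radians(lon0_deg)

    n = (math.log(math.cos(lat1) / math.cos(lat2)) /
         math.log(math.tan(math.pi / 4 + lat2 / 2) / math.tan(math.pi / 4 + lat1 / 2)))
    F = math.cos(lat1) * math.tan(math.pi / 4 + lat1 / 2) ** n / n
    rho = radius_m * F / math.tan(math.pi / 4 + lat / 2) ** n
    rho0 = radius_m * F / math.tan(math.pi / 4 + lat0 / 2) ** n

    x = rho * math.sin(n * (lon - lon0))           # easting from the central meridian
    y = rho0 - rho * math.cos(n * (lon - lon0))    # northing from the projection origin
    return x, y

print(lambert_conformal_conic(44.97, -93.23))      # an example point near Minneapolis
```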
  • The digital map stored in the [0032] geospatial database 16 contains a series of numeric location data of, for example, the center line and lane boundaries of a road on which system 10 is to be used, as well as construction data which is given by a number of shape parameters including starting and ending points of straight paths, the centers of circular sections, and starting and ending angles of circular sections. While the present system is described herein in terms of starting and ending points of circular sections, it could be described in terms of starting and ending points and any curvature between those points. For example, a straight path can be characterized as a section of zero curvature. Each of these items is indicated by a parameter marker, which indicates the type of parameter it is, and has associated location data giving the precise geographic location of that point on the map.
  • In one embodiment, each road point of the digital map in [0033] database 16 was generated at uniform 10 meter intervals. In one embodiment, the road points represent only the centerline of the road, and the lane boundaries are calculated from that centerline point. In another embodiment, both the center line and lane boundaries are mapped. Of course, geospatial database 16 also illustratively contains the exact location data indicative of the exact geographical location of street signs and other desirable landmarks. Database 16 can be obtained by manual mapping operations or by a number of automated methods such as, for example, placing a GPS receiver on the lane stripe paint spraying nozzle or tape laying mandrel to continuously obtain locations of lane boundaries.
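  • To make the database layout concrete, here is one hypothetical way such road points and parameter markers could be represented and queried by proximity. The field names and the simple linear scan are illustrative assumptions, not the schema or query method of geospatial database 16.

```python
from dataclasses import dataclass
import math

@dataclass
class RoadPoint:
    x: float              # state plane easting (m)
    y: float              # state plane northing (m)
    marker: str           # parameter marker, e.g. "centerline", "lane_boundary",
                          # "arc_start", "arc_end", "sign"
    attributes: dict      # e.g. {"lane_width_m": 3.6} or {"sign": "speed_limit_55"}

def nearby_points(db, vehicle_xy, radius_m=200.0):
    """Return the road points within radius_m of the vehicle position,
    i.e. the local road data extracted for display."""
    vx, vy = vehicle_xy
    return [p for p in db if math.hypot(p.x - vx, p.y - vy) <= radius_m]

# centerline points at uniform 10 meter intervals, as described above
db = [RoadPoint(100.0, 10.0 * i, "centerline", {"lane_width_m": 3.6}) for i in range(50)]
print(len(nearby_points(db, (100.0, 0.0))))    # number of points within 200 m
```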
  • Ranging [0034] system 18 is configured to detect targets in the vicinity of the vehicle in which system 10 is implemented, and also to detect a location (such as range, range rate and azimuth angle) of the detected targets, relative to the vehicle. Targets are illustratively objects which must be monitored because they may collide with the mobile body either due to motion of the body or of the object. In one illustrative embodiment, ranging system 18 is a radar system commercially available from Eaton Vorad. However, ranging system 18 can also include a passive or active infrared system (which could also provide the amount of heat emitted from the target) or laser based ranging system, or a directional ultrasonic system, or other similar systems. Another embodiment of system 18 is an infrared sensor calibrated to obtain a scaling factor for range, range rate and azimuth which is used for transformation to an eye coordinate system.
  • [0035] Display 22 includes a projection unit and one or more combiners which are described in greater detail later in the specification. Briefly, the projection unit receives a video signal from controller 12 and projects video images onto one or more combiners. The projection unit illustratively includes a liquid crystal display (LCD) matrix and a high-intensity light source similar to a conventional video projector, except that it is small so that it fits near the driver's seat space. The combiner is a partially-reflective, partially transmissive beam splitter formed of optical glass or polymer for reflecting the projected light from the projection unit back to the driver. In one embodiment, the combiner is positioned such that the driver looks through the combiner, when looking through the forward-looking visual field, so that the driver can see both the actual outside road scene, as well as the computer generated images projected onto the combiner. In one illustrative embodiment, the computer-generated images substantially overlay the actual images.
  • It should also be noted, however, that combiners or other similar devices can be placed about the driver to cover substantially all fields of view or be implemented in the glass of the windshield and windows. This can illustratively be implemented using a plurality of projectors or a single projector with appropriate optics to scan the projected image across the appropriate fields of view. [0036]
  • Before discussing the operation of [0037] system 10 in greater detail, it is worth pointing out that system 10 can also, in one illustrative embodiment, be varied, as desired. For example, FIG. 2 illustrates that controller 12 may actually be formed of first controller 24 and second controller 26 (or any number of controllers with processing distributed among them, as desired). In that embodiment, first controller 24 performs the primary data processing functions with respect to sensory data acquisition, and also performs database queries in the geospatial database 16. This entails obtaining velocity and heading information from GPS receiver and correction system 28. First controller 24 also performs processing of the target signal from radar ranging system 18.
  • FIG. 2 also illustrates that [0038] vehicle location system 14 may illustratively include a differential GPS receiver and correction system 28 as well as an auxiliary inertial measurement unit (IMU) 30 (although other approaches would also work). Second controller 26 processes signals from auxiliary IMU 30, where necessary, and handles graphics computations for providing the appropriate video signal to display 22.
  • In a specific illustrative embodiment, differential GPS receiver and correction [0039] system 28 is illustratively a Novatel RT-20 differential GPS (DGPS) system with 20-centimeter accuracy operating at a 5 Hz sampling rate, or a Trimble MS 750 with 2-centimeter accuracy operating at a 10 Hz sampling rate.
  • FIG. 2 also illustrates that [0040] system 10 can include optional vehicle orientation detection system 31 and head tracking system 32. Vehicle orientation detection system 31 detects the orientation (such as roll and pitch) of the vehicle in which system 10 is implemented. The roll angle refers to the rotational orientation of the vehicle about its longitudinal axis (which is parallel to its direction of travel). The roll angle can change, for example, if the vehicle is driving over a banked road, or on uneven terrain. The pitch angle is the angle that the vehicle makes in a vertical plane along the longitudinal direction. The pitch angle becomes significant if the vehicle is climbing up or descending down a hill. Taking into account the pitch and roll angles can make the projected image more accurate, and more closely conform to the actual image seen by the driver.
  • Optional [0041] head tracking system 32 can be provided to accommodate for movements in the driver's head or eye position relative to the vehicle. Of course, in one illustrative embodiment, the actual head and eye position of the driver is not monitored. Instead, the dimensions of the cab or operator compartment of the vehicle in which system 10 is implemented are taken and used, along with ergonomic data, such as the height and eye position of an operator, given the dimension of the operator compartment, and the image is projected on display 22 such that the displayed images will substantially overlie the actual images for an average operator. Specific measurements can be taken for any given operator as well, such that such a system can more closely conform to any given operator.
  • Alternatively, optional [0042] head tracking system 32 is provided. Head tracking system 32 tracks the position of the operator's head, and eyes, in real time.
  • FIGS. [0043] 3A-3E better illustrate the display of information on display 22. FIG. 3A illustrates that display 22 includes projector 40, and combiner 42. FIG. 3A also illustrates an operator 44 sitting in an operator compartment which includes seat 46 and which is partially defined by windshield 48.
  • [0044] Projector 40 receives the video display signal from controller 12 and projects road data onto combiner 42. Combiner 42 is partially reflective and partially transmissive. Therefore, the operator looks forward through combiner 42 and windshield 48 to a virtual focal plane 50. The road data (such as lane boundaries) are projected from projector 40 in proper perspective onto combiner 42 such that the lane boundaries appear to substantially overlie those which the operator actually sees, in the correct perspective. In this way, when the operator's view of the actual lane boundaries becomes obstructed, the operator can safely maintain lane keeping because the operator can navigate by the projected lane boundaries.
  • FIG. 3A also illustrates that [0045] combiner 42, in one illustrative embodiment, is hinged to an upper surface or side surface or other structural part 52, of the operator compartment. Therefore, combiner 42 can be pivoted along an arc generally indicated by arrow 54, up and out of the view of the operator, on days when no driver assistance is needed, and down to the position shown in FIG. 3A, when the operator desires to look through combiner 42.
  • FIG. 3B better illustrates [0046] combiner 42, window 48 and virtual screen or focal plane 50. Combiner 42, while being partially reflective, is essentially a transparent, optically correct, coated glass or polymer lens. Light reaching the eyes of operator 44 is a combination of light passing through the lens and light reflected off of the lens from the projector. With an unobstructed forward-looking visual field, the driver actually sees two images accurately superimposed together. The image passing through the combiner 42 comes from the actual forward-looking field of view, while the reflected image is generated by the graphics processor portion of controller 12. The optical characteristics of combiner 42 allow the combination of elements to generate the virtual screen, or virtual focal plane 50, which is illustratively projected to appear approximately 30-80 feet ahead of combiner 42. This feature results in a virtual focus in front of the vehicle, and ensures that the driver's eyes are not required to focus back and forth between the real image and the virtual image, thus reducing eyestrain and fatigue.
  • In one illustrative embodiment, [0047] combiner 42 is formed such that the visual image size spans approximately 30° along a horizontal axis and 15° along a vertical axis with the projector located approximately 18 inches from the combiner.
  • Another embodiment is a helmet supported visor (or eyeglass device) on which images are projected, through which the driver can still see. Such displays might include technologies such as those available from Kaiser Electro-Optics, Inc. of Carlsbad, Calif., The MicroOptical Corporation of Westwood, Mass., Universal Display Corporation of Ewing, N.J., Microvision, Inc. of Bothell, Wash. and IODisplay System LLC of Menlo Park, Calif. [0048]
  • FIGS. 3C and 3D are illustrative displays from [0049] projector 40 which are projected onto combiner 42. In FIGS. 3C and 3D, the left most line is the left side road boundary. The dotted line corresponds to the centerline of a two-way road, while the right most curved line, with vertical poles, corresponds to the right-hand side road boundary. The gray circle near the center of the image shown in FIG. 3C corresponds to a target detected and located by ranging system 18 described in greater detail later in the application. Of course, the gray shape need not be a circle but could be any icon or shape and could be transparent, opaque or translucent.
  • The screens illustrated in FIGS. 3C and 3D can illustratively be projected in the forward-looking visual field of the driver by projecting them onto [0050] combiner 42 with the correct scale so that objects (including the painted line stripes and road boundaries) in the screen are superimposed on the actual objects in the outer scene observed by the driver. The black area on the screens illustrated in FIGS. 3C and 3D appear transparent on combiner 42 under typical operating conditions. Only the brightly colored lines appear on the virtual image that is projected onto combiner 42. While the thickness and colors of the road boundaries illustrated in FIGS. 3C and 3D can be varied, as desired, they are illustratively white lines that are approximately 1-5 pixels thick while the center line is also white and is approximately 1-5 pixels thick as well.
  • FIG. 3E illustrates a virtual image projected onto an actual image as seen through [0051] combiner 42 by the driver. The outline of combiner 42 can be seen in the illustration of FIG. 3E and the area 60 which includes the projected image has been outlined in FIG. 3E for the sake of clarity, although no such outline actually appears on the display. It can be seen that the display generated is a conformal, augmented display which is highly useful in low-visibility situations. Geographic landmarks are projected onto combiner 42 and are aligned with the view out of the windshield. Fixed roadside signs (i.e., traditional speed limit signs, exit information signs, etc.) can be projected onto the display, and if desired virtually aligned with actual road signs found in the geospatial landscape. Data supporting fixed signage and other fixed items projected onto the display are retrieved from geospatial database 16.
  • FIGS. [0052] 3F-3H are pictorial illustrations of actual displays. FIG. 3F illustrates two vehicles in close proximity to the vehicle on which system 10 is deployed. It can be seen that the two vehicles have been detected by ranging system 18 (discussed in greater detail below) and have icons projected thereover. FIG. 3G illustrates a vehicle more distant than those in FIG. 3F. FIG. 3G also shows line boundaries which are projected over the actual boundaries. FIG. 3H shows even more distant vehicles and also illustrates objects around an intersection. For example, right turn lane markers are shown displayed over the actual lane boundaries.
  • The presence and condition of variable road signs (such as stoplights, caution lights, railroad crossing warnings, etc.) can also be incorporated into the display. In that instance, [0053] processor 12 determines, based on access to the geospatial database, that a variable sign is within the normal viewing distance of the vehicle. At the same time, a radio frequency (RF) receiver (for instance) which is mounted on the vehicle decodes the signal being broadcast from the variable sign, and provides that information to processor 12. Processor 12 then proceeds to project the variable sign information to the driver on the projector. Of course, this can take any desirable form. For instance, a stop light with a currently red light can be projected, such that it overlies the actual stoplight and such that the red light is highly visible to the driver. Other suitable information and display items can be implemented as well.
  • For instance, text of signs or road markers can be enlarged to assist drivers with poor night vision. Items outside the driver's field of view can be displayed (e.g., at the top or sides of the display) to give the driver information about objects out of view. Such items can be fixed or transitory objects, or in the nature of advertising such as goods or services available in the vicinity of the vehicle. Such information can be included in the geospatial database and selectively retrieved based on vehicle position. [0054]
  • Directional signs can also be incorporated into the display to guide the driver to a destination (such as a rest area or hotel), as shown in FIG. 3I. It can be seen that the directional arrows are superimposed directly over the lane. [0055]
  • It should be noted that [0056] database 16 can be stored locally on the vehicle or queried remotely. Also, database 16 can be periodically updated (either remotely or directly) with a wide variety of information such as detour or road construction information or any other desired information.
  • The presence and location of transitory obstacles (also referred to herein as unexpected targets) such as stalled cars, moving cars, pedestrians, etc. are also illustratively projected onto [0057] combiner 42 with proper perspective such that they substantially overlie the actual obstacles. Transitory obstacle information indicative of such transitory targets or obstacles is derived from ranging system 18. Transitory obstacles are distinguished from conventional roadside obstacles (such as road signs, etc.) by processor 12. Processor 12 senses an obstacle from the signal provided by ranging system 18. Processor 12, then during its query of geospatial database 16, determines whether the target indicated by ranging system 18 actually corresponds to a conventional, expected roadside obstacle which has been mapped into database 16. If not, it is construed as a transitory obstacle, and projected, as a predetermined geometric shape, or bit map, or other icon, in its proper perspective, on combiner 42. The transitory targets basically represent items which are not in a fixed location during normal operating conditions on the roadway.
  • Of course, other objects can be displayed as well. Such objects can include water towers, trees, bridges, road dividers, other landmarks, etc. Such indicators can also be warnings or alarms, such as a warning not to turn the wrong way on a one-way road or an off ramp, or that the vehicle is approaching an intersection or work zone at too high a rate of speed. Further, where the combiner is equipped with an LCD film or embedded layer, it can perform other tasks as well. Such tasks can include the display of blocking templates which block out or reduce glare from the sun or headlights from other cars. The location of the sun can be computed from the time, and its position relative to the driver can also be computed (the same is true for cars). Therefore, an icon can simply be displayed to block the undesired glare. Similarly, the displays can be integrated with other operator perceptible features, such as haptic feedback, sound, seat or steering wheel vibration, etc. [0058]
  • FIGS. [0059] 4A-4C illustrate the operation of system 10 in greater detail. FIG. 4A is a functional block diagram of a portion of system 10 illustrating software components and internal data flow throughout system 10. FIG. 4B is a simplified flow diagram illustrating operation of system 10, and FIG. 4C is a simplified flow diagram illustrating target filtering in accordance with one embodiment of the present invention.
  • It is first determined whether [0060] system 10 is receiving vehicle location information from its primary vehicle location system. This is indicated by block 62 in FIG. 4B. In other words, where the primary vehicle location system constitutes a GPS receiver, this signal may be temporarily lost. The signal may be lost, for instance, when the vehicle goes under a bridge, or simply goes through a pocket or area where GPS or correction signals cannot be received or are distorted. If the primary vehicle location signal is available, that signal is received as indicated by block 64. If not, system 10 accesses information from auxiliary inertial measurement unit 30.
  • [0061] Auxiliary IMU 30 may, illustratively, be complemented by a dead reckoning system which utilizes the last known position provided by the GPS receiver, as well as speed and angle information, in order to determine a new position. Receiving the location signal from auxiliary IMU 30 is illustrated by block 66.
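  • A bare-bones dead-reckoning update of the kind described above is sketched below. Treating speed and heading as constant over the sample interval is an assumption made only for illustration.

```python
import math

def dead_reckon(last_xy, speed_mps, heading_rad, dt_s):
    """Advance the last known position by speed and heading over dt_s seconds.

    heading_rad is measured clockwise from true north (the convention used
    elsewhere in this description), so east = sin(heading), north = cos(heading).
    """
    x, y = last_xy
    return (x + speed_mps * dt_s * math.sin(heading_rad),
            y + speed_mps * dt_s * math.cos(heading_rad))

# last GPS fix at (1000 m E, 2000 m N), 20 m/s, heading 90 deg (due east), 0.2 s step
print(dead_reckon((1000.0, 2000.0), 20.0, math.radians(90.0), 0.2))   # ~(1004, 2000)
```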
  • In any case, once [0062] system 10 has received the vehicle location data, system 10 also optionally receives head or eye location information, as well as optional vehicle orientation data. As briefly discussed above, the vehicle orientation information can be obtained from a roll rate gyroscope 68 to obtain the roll angle, a tilt sensor 70 (such as an accelerometer) to obtain the pitch angle, and a yaw rate sensor 69 to obtain yaw angle 83. Obtaining the head or eye location data and the vehicle orientation data is illustrated by optional blocks 72 and 74 in FIG. 4B. Also, the optional driver's eye data is illustrated by block 76 in FIG. 4A, the vehicle location data is indicated by block 78, and the pitch and roll angles are indicated by blocks 80 and 82, respectively.
  • A coordinate transformation matrix is constructed, as described in greater detail below, from the location and heading angle of the moving vehicle, and from the optional driver's head or eye data and vehicle orientation data, where that data is sensed. The location data is converted into a local coordinate measurement using the transformation matrix, and is then fed into the perspective projection routines to calculate and draw the road shape and target icons in the computer's graphic memory. The road shape and target icons are then projected as a virtual view in the driver's visual field, as illustrated in FIG. 3B above. [0063]
  • The coordinate transformation block transforms the coordinate frame of the digital map from the global coordinate frame to the local coordinate frame. The local coordinate frame is a moving coordinate frame that is illustratively attached to the driver's head. The coordinate transformation is illustratively performed by multiplying a four-by-four homogeneous transformation matrix with the road data points, although any other coordinate system transformation can be used, such as the Quaternion or other approach. Because the vehicle keeps moving, the matrix must be updated in real time. Movement of the driver's eye that is included in the matrix is also measured and fed into the matrix calculation in real time. Where no [0064] head tracking system 32 is provided, then the head angle and position of the driver's eyes are assumed to be constant and the driver is assumed to be looking forward from a nominal position.
  • The heading angle of the vehicle is estimated from the past history of the GPS location data. Alternatively, a rate gyroscope can be used to determine vehicle heading as well. An absolute heading angle is used in computing the correct coordinate transformation matrix. As noted initially, though heading angle estimation by successive differentiation of GPS data can be used, any other suitable method to measure an absolute heading angle can be used as well, such as a magnetometer (electronic compass) or an inertial measurement unit. Further, where pitch and roll sensors are not used, these angles can be assumed to be 0. [0065]
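  • A minimal sketch of heading estimation from two successive position fixes, using the same north-referenced, clockwise-positive convention described above, is shown below. The east/north tuple layout is an assumption for illustration.

```python
import math

def heading_from_fixes(prev_xy, curr_xy):
    """Absolute heading (degrees, clockwise from true north) estimated from
    two successive position fixes, where x = east and y = north."""
    dx = curr_xy[0] - prev_xy[0]   # eastward displacement
    dy = curr_xy[1] - prev_xy[1]   # northward displacement
    return math.degrees(math.atan2(dx, dy)) % 360.0

print(heading_from_fixes((0.0, 0.0), (1.0, 1.0)))   # 45.0 (heading northeast)
```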
  • In any case, after the [0066] vehicle position data 78 is received, the ranging information from ranging system 18 is also received by controller 12 (shown in FIG. 2). This is indicated by blocks 83 in FIG. 4A and by block 86 in FIG. 4B. The ranging data illustratively indicates the presence and location of targets around the vehicle. For example, the radar ranging system 18 developed and available from Eaton Vorad, or Delphi, Celsius Tech, or other vendors provides a signal indicative of the presence of a radar target, its range, its range rate and the azimuth angle of that target with respect to the radar apparatus.
  • Based on the position signal, [0067] controller 12 queries the digital road map in geospatial database 16 and extracts local road data 88. The local road data provides information with respect to road boundaries as seen by the operator in the position of the vehicle, and also other potential radar targets, such as road signs, road barriers, etc. Accessing geospatial database 16 (which can be stored on the vehicle and receive periodic updates or can be stored remotely and accessed wirelessly) is indicated by block 90 in FIG. 4B.
  • [0068] Controller 12 determines whether the targets indicated by target data 83 are expected targets. Controller 12 does this by examining the information in geospatial database 16. In other words, if the targets correspond to road signs, road barriers, bridges, or other information which would provide a radar return to ranging system 18, but which is expected because it is mapped into database 16 and does not need to be brought to the attention of the driver, that information can be filtered out such that the driver is not alerted to every single possible item on the road which would provide a radar return. Certain objects may a priori be programmed to be brought to the attention of the driver. Such items may be guard rails, bridge abutments, etc . . . and the filtering can be selective, as desired. If, for example, the driver were to exit the roadway, all filtering can be turned off so all objects are brought to the driver's attention. The driver can change filtering based on substantially any predetermined filtering criteria, such as distance from the road or driver, location relative to the road or the driver, whether the objects are moving or stationary, or substantially any other criteria. Such criteria can be invoked by the user through the user interface, or they can be pre-programmed into controller 12.
  • However, where the geospatial database does not indicate an expected target in the present location, then the target information is determined to correspond to an unexpected target, such as a moving vehicle ahead of the vehicle on which [0069] system 10 is implemented, such as a stalled car or a pedestrian on the side of the road, or some other transitory target which has not been mapped to the geospatial database as a permanent, or expected target. It has been found that if all expected targets are brought to the operator's attention, this substantially amounts to noise such that when real targets are brought to the operator's attention, they are not as readily perceived by the operator. Therefore, filtering of targets not posing a threat to the driver is performed as is illustrated by block 92 in FIG. 4B.
  • Once such targets have been filtered, the frame transformation is performed using the transformation matrix. The result of the coordinate frame transformation provides the road boundary data, as well as the target data, seen from the driver's eye perspective. The road boundary and target data is output, as illustrated by [0070] block 94 in FIG. 4B, and as indicated by block 96 in FIG. 4A. Based on the output road and target data, the road and target shapes are generated by processor 12 for projection in the proper perspective.
  • Generation of road and target shapes is illustrated by [0071] block 98 in FIG. 4A, and the perspective projection is illustrated by blocks 100 in FIG. 4A and 102 in FIG. 4B.
  • It should also be noted that the actual image projected is clipped such that it only includes that part of the road which would be visible by the operator with an unobstructed forward-looking visual field. Clipping is described in greater detail below, and is illustrated by [0072] block 104 in FIG. 4A. The result of the entire process is the projected road and target data as illustrated by block 106 in FIG. 4A.
  • FIG. 4C is a more detailed flow diagram illustrating how targets are projected or filtered from the display. First, it is determined whether ranging [0073] system 18 is providing a target signal indicating the presence of a target. This is indicated by block 108. If so, then when controller 12 accesses geospatial database 16, controller 12 determines whether sensed targets correlate to any expected targets. This is indicated by block 110. If so, the expected targets are filtered from the sensed targets. It should be noted that ranging system 18 may provide an indication of a plurality of targets at any given time. In that case, only the expected targets are filtered from the target signal. This is indicated by block 112. If any targets remain, other than the expected targets, the display signal is generated in which the unexpected, or transitory, targets are placed conformally on the display. This is indicated by block 114.
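  • The filtering step can be pictured as the following sketch, which matches each sensed target against mapped objects within a distance tolerance and keeps only the unmatched (transitory) ones for display. The tolerance and tuple layout are assumptions for illustration, not the exact correlation method of controller 12.

```python
import math

def filter_expected_targets(sensed, expected, tol_m=5.0):
    """Remove sensed targets that correlate with expected (mapped) objects.

    sensed, expected -- lists of (x, y) positions in a common coordinate frame
    Returns the transitory targets to be drawn conformally on the display.
    """
    transitory = []
    for sx, sy in sensed:
        if not any(math.hypot(sx - ex, sy - ey) <= tol_m for ex, ey in expected):
            transitory.append((sx, sy))
    return transitory

sensed = [(3.0, 60.0), (-1.0, 120.0)]       # radar returns
expected = [(3.2, 59.5)]                    # a mapped road sign at that location
print(filter_expected_targets(sensed, expected))   # [(-1.0, 120.0)] -> unexpected target
```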
  • Of course, the display signal is also configured such that guidance markers (such as lane boundaries, lane striping or road edges) are also placed conformally on the display. This is indicated by [0074] block 116. The display signal is then output to the projector such that the conformal, augmented display is provided to the user. This is indicated by block 118.
  • It can thus be seen that the term “conformal” is used herein to indicate that the “virtual image” generated by the present system projects images represented by the display in a fashion such that they are substantially aligned, and in proper perspective with, the actual images which would be seen by the driver, with an unobstructed field of view. The term “augmented”, as used herein, means that the actual image perceived by the operator is supplemented by the virtual image projected onto the head up display. Therefore, even if the driver's forward-looking visual field is obstructed, the augmentation allows the operator to receive and process information, in the proper perspective, as to the actual objects which would be seen with an unobstructed view. [0075]
  • A discussion of coordinate frames, in greater detail, is now provided for the sake of clarity. There are essentially four coordinate frames used to construct the graphics projected in [0076] display 22. Those coordinate frames include the global coordinate frame, the vehicle-attached coordinate frame, the local or eye coordinate frame, and the graphics screen coordinate frame. The position sensor may be attached to a backpack or helmet worn by a walking person in which case this would be the vehicle-attached coordinate frame. The global coordinate frame is the coordinate frame used for road map data construction as illustrated by FIG. 5A. The global coordinate frame is illustrated by the axes 120. All distances and angles are measured about these axes. FIG. 5A also shows vehicle 124, with the vehicle coordinate frame represented by axes 126 and the user's eye coordinate frame (also referred to as the graphic screen coordinate frame) illustrated by axes 128. FIG. 5A also shows road point data 130, which illustrates data corresponding to the center of road 132.
  • The capital letters “X”, “Y” and “Z” in this description are used as names of each axis. The positive Y-axis is the direction to true north, and the positive X-axis is the direction to true east in global coordinate [0077] frame 120. Compass 122 is drawn to illustrate that the Y-axis of global coordinate frame 120 points due north. The elevation is defined by the Z-axis and is used to express elevation of the road shape and objects adjacent to, or on, the road.
  • All of the road points [0078] 130 stored in the road map file in geospatial database 16 are illustratively expressed in terms of the global coordinate frame 120. The vehicle coordinate frame 126, (V) is defined and used to express the vehicle configuration data, including the location and orientation of the driver's eye within the operator compartment, relative to the origin of the vehicle. The vehicle coordinate frame 126 is attached to the vehicle and moves as the vehicle moves. The origin is defined as the point on the ground under the location of the GPS receiver antenna. Everything in the vehicle is measured from the ground point under the GPS antenna. Other points, such as located on a vertical axis through the GPS receiver antenna or at any other location on the vehicle, can also be selected.
  • The forward moving direction is defined as the positive y-axis. The direction to the right when the vehicle is moving forward is defined as the positive x-axis, and the vertical upward direction is defined as the positive z-axis which is parallel to the global coordinate frame Z-axis. The yaw angle, i.e. heading angle, of the vehicle, is measured from true north, and has a positive value in the clockwise direction (since the positive z-axis points upward). The pitch angle is measured about the x-axis in coordinate [0079] frame 126 and the roll angle is measured as a rotation about the y-axis in coordinate frame 126.
  • The local L-coordinate [0080] frame 128 is defined and used to express the road data relative to the viewer's location and direction. The coordinate system 128 is also referred to herein as the local coordinate frame. Even though the driver's eye location and orientation may be assumed to be constant (where no head tracking system 32 is used), the global information still needs to be converted into the eye-coordinate frame 128 for calculating the perspective projection. The location of the eye, i.e. the viewing point, is the origin of the local coordinate frame. The local coordinate frame 128 is defined with respect to the vehicle coordinate frame. The relative location of the driver's eye from the origin of the vehicle coordinate frame is measured and used in the coordinate transformation matrix described in greater detail below. The directional angle information from the driver's line of sight is used in constructing the projection screen. This angle information is also integrated into the coordinate transformation matrix.
  • Ultimately, the objects in the outer world are drawn on a flat two-dimensional video projection screen which corresponds to the virtual focal plane, or [0081] virtual screen 50 perceived by human drivers. The virtual screen coordinate frame has only two axes. The positive x-axis of the screen is defined to be the same as the positive x-axis of the vehicle coordinate frame 126 for ease in coordinate conversion. The upward direction in the screen coordinate frame is the same as the positive z-axis and the forward-looking direction (or distance to the objects located on the visual screen) is the positive y-axis. The positive x-axis and the y-axis in the virtual projection screen 50 are mapped to the positive x-axis and the negative y-axis in computer memory space, because the upper left corner is deemed to be the beginning of the video memory.
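  • The axis flip between the virtual screen frame and video memory can be summarized in a small sketch. The screen dimensions and pixel resolution below are assumed example values, not parameters of display 22.

```python
def screen_to_pixel(x_screen, z_screen, screen_w=0.6, screen_h=0.3,
                    px_w=1024, px_h=512):
    """Convert virtual-screen coordinates (x to the right, z up, origin at the
    screen center, dimensions in meters) to video-memory pixel coordinates,
    whose origin is the upper-left corner with the row index increasing downward."""
    col = int((x_screen / screen_w + 0.5) * px_w)      # x maps directly to columns
    row = int((0.5 - z_screen / screen_h) * px_h)      # vertical axis is flipped
    return col, row

print(screen_to_pixel(0.0, 0.0))      # (512, 256): screen center -> image center
print(screen_to_pixel(0.3, 0.15))     # (1024, 0): top-right edge of the screen
```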
  • Road data points including the left and right edges, which are expressed with respect to the global coordinate frame {G} as P[0082] k, shown in FIG. 5B-1, are converted into the local coordinate frame {L} which is attached to the moving vehicle 124 coordinate frame {V}. Its origin (Ov) and direction (θv) are changing continually as the vehicle 124 moves. The origin (OL) of the local coordinate frame {L}, i.e. driver's eye location, and its orientation (θE) change as the driver moves his or her head and eyeballs. Even though the driver's orientation (θE) can be assumed as constant for a simplified embodiment of system 10, all of the potential effects are considered in the coordinate transformation equations below for a more detailed illustrative embodiment of system 10. All road data that are given in terms of the global coordinate frame {G} ultimately need to be converted into the eye coordinate frame {L}. Then they are projected into the video screen 22 using a perspective transformation.
  • A homogeneous transformation matrix [T] was defined and used to convert the global coordinate data into local coordinate data. The matrix [T] is developed illustratively, as follows. The parameters in FIGS. [0083] 5B-1 and 5B-2 are as follows:
  • $P_k$ is the k-th road point; [0084]
  • $O_G$ is the origin of the global coordinate frame; [0085]
  • $O_V$ is the origin of the vehicle coordinate frame with respect to the global coordinate frame; and [0086]
  • $O_E$ is the origin of the local eye-attached coordinate frame. [0087]
  • Any point in 3-dimensional space can be expressed in terms of either a global coordinate frame or a local coordinate frame. Because everything seen by the driver is defined with respect to his or her location and viewing direction (i.e. the relative geometrical configuration between the viewer and the environment) all of the viewable environment should be expressed in terms of a local coordinate frame. Then, any objects or line segments can be projected onto a flat surface or video screen by means of the perspective projection. Thus, the mathematical calculation of the coordinate transformation is performed by constructing the homogenous transformation matrix and applying the matrix to the position vectors. The coordinate transformation matrix [T] is defined as a result of the multiplication of a number of matrices described in the following paragraphs. [0088]
  • To change the global coordinate data to the local coordinate data, the translation and rotation of the frame should be considered together. The translation of the coordinate frame transforms point data using the following matrix equation (with reference to FIG. 5C): [0089]
  • $x = X - O_L^{X}$
  • $y = Y - O_L^{Y}$
  • $z = Z - O_L^{Z}$  Eq. 1
  • or [0090]
    $$\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} = \begin{bmatrix} 1 & 0 & 0 & -O_L^{X} \\ 0 & 1 & 0 & -O_L^{Y} \\ 0 & 0 & 1 & -O_L^{Z} \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \quad \text{or} \quad {}^{L}p = {}^{L}_{G}[T_{tran}]\,{}^{G}P \qquad \text{Eq. 2}$$
  • where, [0091]
    $${}^{L}p = \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}, \qquad {}^{G}P = \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}, \qquad {}^{L}_{G}[T_{tran}] = \begin{bmatrix} 1 & 0 & 0 & -O_L^{X} \\ 0 & 1 & 0 & -O_L^{Y} \\ 0 & 0 & 1 & -O_L^{Z} \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad \text{Eq. 3}$$
  • The symbol [0092] ${}^{G}P$ denotes a point in terms of coordinates X, Y, Z as referenced from the global coordinate system. The symbol ${}^{L}p$ represents the same point in terms of x, y, z in the local coordinate system. The transformation matrix ${}^{L}_{G}[T_{tran}]$ allows for a translational transformation from the global {G} coordinate system to the local {L} coordinate system.
  • The rotation of the coordinate frame about the Z-axis can be expressed by the following matrix equation (with respect to FIG. 5D): [0093]
  • $x = X \cos\theta + Y \sin\theta$
  • $y = -X \sin\theta + Y \cos\theta$
  • $z = Z$  Eq. 4
  • or, in matrix form [0094]
    $$\begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta & 0 & 0 \\ -\sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \qquad \text{Eq. 5}$$
  • This equation can be written using the following matrix equation, [0095]
  • ${}^{L}P = {}^{L}_{G}[T_{rot}]\,{}^{G}P$  Eq. 6
  • where the rotational transformation from the {G} to the {L} coordinate system is [0096]
    $${}^{L}_{G}[T_{rot}] = \begin{bmatrix} \cos\theta & \sin\theta & 0 & 0 \\ -\sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad \text{Eq. 7}$$
  • For rotation and translation at the same time, these two matrices can be combined by the following equations, [0097]
  • {}^{L}p = {}^{L}_{G}[T]\,{}^{G}P  (Eq. 8)
  • where [0098]
    {}^{L}_{G}[T] = {}^{L}_{G}[T_{rot}]\;{}^{L}_{G}[T_{tran}] = \begin{bmatrix} \cos\theta & \sin\theta & 0 & 0 \\ -\sin\theta & \cos\theta & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} 1 & 0 & 0 & -O_{LX} \\ 0 & 1 & 0 & -O_{LY} \\ 0 & 0 & 1 & -O_{LZ} \\ 0 & 0 & 0 & 1 \end{bmatrix} = \begin{bmatrix} \cos\theta & \sin\theta & 0 & -O_{LX}\cos\theta - O_{LY}\sin\theta \\ -\sin\theta & \cos\theta & 0 & O_{LX}\sin\theta - O_{LY}\cos\theta \\ 0 & 0 & 1 & -O_{LZ} \\ 0 & 0 & 0 & 1 \end{bmatrix}  (Eq. 9)
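  • As an informal numerical illustration of Eq. 9 (not part of the original derivation), the combined rotation-translation transform can be assembled in a few lines. The helper below is a minimal Python sketch, assuming a planar rotation about the Z-axis and a frame origin expressed in the source frame; the sample pose and point are illustrative values only.

      import numpy as np

      def frame_transform(theta, origin):
          # Homogeneous transform of Eq. 9: translate by -origin, then rotate by theta
          # about the Z-axis (origin = new frame origin expressed in the old frame).
          c, s = np.cos(theta), np.sin(theta)
          rot = np.array([[  c,   s, 0.0, 0.0],
                          [ -s,   c, 0.0, 0.0],
                          [0.0, 0.0, 1.0, 0.0],
                          [0.0, 0.0, 0.0, 1.0]])
          tran = np.eye(4)
          tran[:3, 3] = -np.asarray(origin, dtype=float)
          return rot @ tran              # [T] = [T_rot][T_tran]

      # Example: express a global point in a frame located at (10, 5, 0) and rotated 30 degrees.
      T = frame_transform(np.radians(30.0), [10.0, 5.0, 0.0])
      local_point = T @ np.array([20.0, 8.0, 0.0, 1.0])   # homogeneous point [X, Y, Z, 1]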
  • This relationship can be extended through the {G}, {V}, and {L} coordinate frames. The coordinate transform matrix [T] is defined as follows, assuming that only the heading angles θ_E and θ_V are considered as rotational angle data: [0099]
  • {}^{L}p = {}^{L}_{V}[T]\;{}^{V}_{G}[T]\;{}^{G}P = [T]\,{}^{G}P  (Eq. 10)
  • where, [0100]
    [T] = \begin{bmatrix} c_E & s_E & 0 & -O_{LX}c_E - O_{LY}s_E \\ -s_E & c_E & 0 & O_{LX}s_E - O_{LY}c_E \\ 0 & 0 & 1 & -O_{LZ} \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} c_V & s_V & 0 & -O_{VX}c_V - O_{VY}s_V \\ -s_V & c_V & 0 & O_{VX}s_V - O_{VY}c_V \\ 0 & 0 & 1 & -O_{VZ} \\ 0 & 0 & 0 & 1 \end{bmatrix}  (Eq. 11)
  • and, [0101]
  • c_E = cos θ_E,  s_E = sin θ_E,  c_V = cos θ_V,  s_V = sin θ_V,
  • c_{E+V} = cos(θ_E + θ_V),  and  s_{E+V} = sin(θ_E + θ_V)  (Eq. 12)
  • The resultant matrix [T] is then as follows: [0102]
    [T] = \begin{bmatrix} T_{11} & T_{12} & T_{13} & T_{14} \\ T_{21} & T_{22} & T_{23} & T_{24} \\ T_{31} & T_{32} & T_{33} & T_{34} \\ T_{41} & T_{42} & T_{43} & T_{44} \end{bmatrix}  (Eq. 13)
  • where, [0103]
  • T_{11} = c_E c_V − s_E s_V = cos(θ_E + θ_V)  (Eq. 14)
  • T_{12} = c_E s_V + s_E c_V = sin(θ_E + θ_V)  (Eq. 15)
  • T_{13} = 0  (Eq. 16)
  • T_{14} = c_E(−O_{VX}c_V − O_{VY}s_V) + s_E(O_{VX}s_V − O_{VY}c_V) + (−O_{LX}c_E − O_{LY}s_E) = −c_{E+V}(O_{VX} + O_{LX}c_V − O_{LY}s_V) − s_{E+V}(O_{VY} + O_{LX}s_V + O_{LY}c_V)  (Eq. 17)
  • T_{21} = −s_E c_V − c_E s_V = −sin(θ_E + θ_V)  (Eq. 18)
  • T_{22} = −s_E s_V + c_E c_V = cos(θ_E + θ_V)  (Eq. 19)
  • T_{23} = 0  (Eq. 20) [0104]
  • T_{24} = s_E(O_{VX}c_V + O_{VY}s_V) + c_E(O_{VX}s_V − O_{VY}c_V) + (O_{LX}s_E − O_{LY}c_E) = s_{E+V}(O_{VX} + O_{LX}c_V − O_{LY}s_V) − c_{E+V}(O_{VY} + O_{LX}s_V + O_{LY}c_V)  (Eq. 21)
  • T_{31} = 0  (Eq. 22)
  • T_{32} = 0  (Eq. 23)
  • T_{33} = 1  (Eq. 24)
  • T_{34} = −O_{VZ} − O_{LZ}  (Eq. 25)
  • T_{41} = 0  (Eq. 26)
  • T_{42} = 0  (Eq. 27)
  • T_{43} = 0  (Eq. 28)
  • T_{44} = 1  (Eq. 29)
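  • The full chain of Eq. 10 can be checked numerically against the closed-form elements of Eqs. 14 through 29. The snippet below is a minimal sketch that reuses the frame_transform helper (and numpy import) from the previous sketch; the vehicle and eye poses are illustrative values only, not values taken from this description.

      # Eq. 10: [T] = (V -> L transform using theta_E, O_L) x (G -> V transform using theta_V, O_V)
      theta_V, O_V = np.radians(10.0), (100.0, 50.0, 0.0)    # vehicle heading and origin in {G}
      theta_E, O_L = np.radians(-5.0), (0.5, 1.2, 1.1)       # eye orientation and origin in {V}
      T = frame_transform(theta_E, O_L) @ frame_transform(theta_V, O_V)

      # Spot checks against the closed-form matrix elements:
      assert np.isclose(T[0, 0],  np.cos(theta_E + theta_V))    # T11, Eq. 14
      assert np.isclose(T[1, 0], -np.sin(theta_E + theta_V))    # T21, Eq. 18
      assert np.isclose(T[2, 3], -(O_V[2] + O_L[2]))            # T34, Eq. 25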
  • By multiplying the road points P by the [T] matrix, we obtain local coordinate data p. The resultant local coordinate value p is then fed into the perspective projection routine to calculate the projected points on the heads up display screen 22. The calculations for the perspective projection are now discussed. [0105]
  • After the coordinate transformation, all the road data are expressed with respect to the driver's viewing location and orientation. These local coordinate data are illustratively projected onto a flat screen (i.e., the virtual screen 50 of heads up display 22), as shown in FIGS. 5E-1 to 5F-3. [0106]
  • Projecting the scene onto the display screen can be done using simple and well-known geometrical mathematics and computer graphics theory. Physically, the display screen is the virtual focal plane. Thus, the display screen is the plane located at the s_y position, parallel to the z-x plane, where s_x and s_z denote the horizontal and vertical coordinates on the display screen. When an object is projected onto the screen, it should be projected with the correct perspective so that the projected images match the outer scene. It is desirable that the head up display system match the drawn road shapes (exactly, or at least closely) to the actual road in front of the driver. The perspective projection makes closer objects appear larger and farther objects appear smaller. [0107]
  • The perspective projection can be calculated from triangle similarity as shown in FIGS. 5G to 5H-2. From the figures, one can find the location of the point s(x, z) for the known data p(x, y, z). [0108]
  • The values of s_x and s_z can be found by similarity of triangles. [0109]
  • p_y : s_y = p_x : s_x  (Eq. 30)
  • so, [0110]
  • s_x = p_x s_y / p_y  (Eq. 31)
  • s_z = p_z s_y / p_y  (Eq. 32)
  • As expected, s_x and s_z are small when the value p_y is large (i.e., when the object is located far away). This is the nature of perspective projection. [0111]
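  • A minimal sketch of the projection of Eqs. 31 and 32, under the convention used above (y is the depth ahead of the eye, x is horizontal, z is vertical, and s_y is the distance from the eye to the virtual screen), might look as follows; the sample point and screen distance are illustrative values only.

      def project_to_screen(p, s_y):
          # Perspective projection of a local-frame point p = (x, y, z) onto the display screen.
          x, y, z = p
          if y <= 0.0:
              raise ValueError("point is behind the viewer; clip before projecting")
          s_x = x * s_y / y     # Eq. 31
          s_z = z * s_y / y     # Eq. 32
          return s_x, s_z

      # A road-edge point 40 m ahead and 2 m to the right, projected onto a screen 1 m away.
      print(project_to_screen((2.0, 40.0, -1.5), s_y=1.0))   # distant points map near the screen center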
  • After calculating the projected road points on the display screen by the perspective projection, the points are connected using straight lines to build up the road shapes. The line-connected road shape provides a better visual cue of the road geometry than plotting just a series of dots. [0112]
  • The road points that have passed behind the driver's moving position do not need to be drawn. Furthermore, because the projection screen has limited size, only road points and objects that fall within the visible field of view need to be drawn on the projection screen. Finding, and then not attempting to draw, the points outside the field of view can be important in order to reduce the computation load of controller 12 and to enhance the display refresh speed. [0113]
  • The visible limit is illustrated by FIGS. 5I to 5J-3. The visible three-dimensional volume is defined as a rectangular cone cut at the display screen. Every object in this visible region needs to be displayed on the projection screen. Objects in the small rectangular cone defined by O_L and the display screen, i.e., the three-dimensional volume between the viewer's eye and the display screen, are displayed at an enlarged size. If an object in this region is too close to the viewer, it results in an out-of-limit error or a divide-by-zero error during the calculation. However, there are usually no objects located in this "enlarging space." FIGS. 5J-1 to 5J-3 and the following equations were used for checking whether an object is in the visible space or not. Using these clipping techniques, if the position of a point in the local coordinate frame is defined as p(x, y, z), then this point is visible to the viewer only if: [0114]
  • the point p is in front of the y = +c_1x plane (which is marked as dark in the top view diagram of FIG. 5J-1); [0115]
  • the point p is in front of the y = −c_1x plane; [0116]
  • the point p is in front of the y = +c_2z plane (the dark region in the right hand side view diagram of FIG. 5J-3); [0117]
  • the point p is in front of the y = −c_2z plane; and [0118]
  • the point p is in front of the display screen. [0119]
  • The equations in the diagrams of FIGS. 5J-1 to 5J-3 (e.g., y = +c_1x) are not line equations but equations of planes in three-dimensional space. The above conditions can be expressed mathematically by the following equations, which describe what is meant by "in front of": [0120]
  • p_y > +c_1 p_x  (Eq. 33)
  • p_y > −c_1 p_x  (Eq. 34)
  • p_y > +c_2 p_z  (Eq. 35)
  • p_y > −c_2 p_z  (Eq. 36)
  • and
  • p_y > s_y  (Eq. 37)
  • Only those points that satisfy all of the five conditions are in the visible region and are then drawn on the projection screen. [0121]
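  • The five conditions of Eqs. 33 through 37 amount to a simple point-in-frustum test. The function below is a minimal sketch; the constants c_1 and c_2, which set the horizontal and vertical extent of the viewing cone, and the sample points are illustrative assumptions rather than values from this description.

      def is_visible(p, c1, c2, s_y):
          # Returns True only if point p = (x, y, z) satisfies all five clipping conditions.
          x, y, z = p
          return (y > +c1 * x and    # Eq. 33
                  y > -c1 * x and    # Eq. 34
                  y > +c2 * z and    # Eq. 35
                  y > -c2 * z and    # Eq. 36
                  y > s_y)           # Eq. 37

      road_points_local = [(2.0, 40.0, -1.5), (-30.0, 5.0, -1.5), (0.5, 0.2, -1.5)]
      drawable = [p for p in road_points_local if is_visible(p, c1=1.5, c2=3.0, s_y=1.0)]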
  • In some cases, there may be a road line segment with one end in the visible region and the other end outside of it. In this case, the visible portion of the line segment should be calculated and drawn on the screen. FIGS. 5K-1 to 5K-3 show one of many possible situations. FIG. 5K-1 is a top view, which is a projection onto the xy plane. It will now be described how to locate the point p at which the segment crosses the visibility boundary, so that only the contained portion of the segment is drawn. [0122]
  • The ratio value k, which parameterizes the position of point p along the segment from point p_1, ranges from 0.0 to 1.0. The position of point p can be written as [0123]
  • p = p_1 + k(p_2 − p_1) = p_1 + kΔp  (Eq. 38)
  • where, [0124]
  • k is a real number (0 ≤ k ≤ 1), and [0125]
  • p_1 = (p_{1x}, p_{1y}, p_{1z}),  p_2 = (p_{2x}, p_{2y}, p_{2z}), and [0126]
  • Δp = p_2 − p_1 = (p_{2x} − p_{1x}, p_{2y} − p_{1y}, p_{2z} − p_{1z}) [0127]
  • The x and y components of the above equation can be written as follows: [0128]
  • p_x = p_{1x} + kΔp_x  (Eq. 39)
  • p_y = p_{1y} + kΔp_y  (Eq. 40)
  • The x and y components of point p must also satisfy the line equation y = +c_1x in order to intersect with that line. Therefore, [0129]
  • p_y = p_{1y} + kΔp_y = c_1(p_{1x} + kΔp_x) = c_1 p_{1x} + kc_1Δp_x  (Eq. 41)
  • k(Δp_y − c_1Δp_x) = c_1 p_{1x} − p_{1y}  (Eq. 42)
  • then, [0130]
  • k = (c_1 p_{1x} − p_{1y}) / (Δp_y − c_1Δp_x)  (Eq. 43)
  • Applying this value of k to the above equations, p_x, p_y and p_z can be determined as follows: [0131]
  • p_x = p_{1x} + Δp_x (c_1 p_{1x} − p_{1y}) / (Δp_y − c_1Δp_x)  (Eq. 44)
  • p_y = p_{1y} + Δp_y (c_1 p_{1x} − p_{1y}) / (Δp_y − c_1Δp_x)  (Eq. 45)
  • p_z = p_{1z} + Δp_z (c_1 p_{1x} − p_{1y}) / (Δp_y − c_1Δp_x)  (Eq. 46)
  • Using these values of p_x, p_y and p_z, the projected values s_x and s_z can be calculated by perspective projection in the same manner as for the other points. [0132]
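  • For a segment with one visible and one non-visible endpoint, Eqs. 38 through 46 give the crossing point with a single clip plane. The sketch below handles only the y = c_1x plane; the other clip planes are handled in the same way, and the endpoint values are illustrative assumptions.

      def clip_to_plane(p1, p2, c1):
          # Intersect segment p1 -> p2 with the plane y = c1 * x (Eqs. 43-46).
          p1x, p1y, p1z = p1
          dpx, dpy, dpz = (p2[0] - p1[0], p2[1] - p1[1], p2[2] - p1[2])   # delta p
          denom = dpy - c1 * dpx
          if abs(denom) < 1e-12:
              return None                         # segment parallel to the clip plane
          k = (c1 * p1x - p1y) / denom            # Eq. 43
          if not 0.0 <= k <= 1.0:
              return None                         # intersection lies outside the segment
          return (p1x + k * dpx, p1y + k * dpy, p1z + k * dpz)   # Eqs. 44-46

      # One endpoint well inside the view, the other far off to the right of the viewing cone.
      print(clip_to_plane((2.0, 40.0, -1.5), (60.0, 10.0, -1.5), c1=1.5))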
  • FIG. 6 illustrates a vehicle 200 with placement of ranging system 18 thereon. Vehicle 200 is, illustratively, a snow plow which includes an operator compartment 202 and a snow plow blade 204. Ranging system 18, in the embodiment illustrated in FIG. 6, includes a first radar subsystem 206 and a second radar subsystem 208. It can be desirable to be able to locate targets closely proximate to blade 204. However, since radar subsystems 206 and 208 are directional, it is difficult, with one subsystem, to obtain target coverage close to blade 204, yet still several hundred meters ahead of vehicle 200, because of the placement of blade 204. Therefore, in one embodiment, the two subsystems 206 and 208 are employed to form ranging system 18. Radar subsystem 208 is located just above blade 204 and is directed approximately straight ahead, in a horizontal plane. Radar subsystem 206 is located above blade 204 and is directed downwardly, such that targets can be detected closely proximate the front of blade 204. The radar subsystems 206 and 208 are each illustratively an array of aligned radar detectors which is continuously scanned by a processor such that radar targets can be detected, and their range, range rate and azimuth angle from the radar subsystem 206 or 208 can be estimated as well. In this way, information regarding the location of radar targets can be provided to controller 12 such that controller 12 can display an icon or other visual element representative of the target on the head up display 22. Of course, the icon can be opaque or transparent. [0133]
  • It should also be noted that, while the target illustrated in FIG. 3C is round, and could represent a pedestrian, a vehicle, or any other radar target, the icon representative of the target can be given any desirable shape. In addition, bit maps which represent targets can be placed on the head up display 22. Further, targets can be sized, colored or otherwise coded to indicate distance. In other words, if the targets are very close to vehicle 200, they can be drawn large, begin to flash, or turn red. Similarly, if the targets are a long distance from vehicle 200, they can maintain a constant glow or halo. [0134]
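  • One possible coding of target icons by distance, in the spirit of the description above, is sketched below; the thresholds, sizes and colors are hypothetical choices for illustration and are not specified by this description.

      def target_icon_style(range_m):
          # Map target range to hypothetical icon attributes (size, color, flash, halo).
          if range_m < 20.0:
              return {"size_px": 48, "color": "red", "flash": True}
          if range_m < 100.0:
              return {"size_px": 24, "color": "yellow", "flash": False}
          return {"size_px": 12, "color": "white", "flash": False, "halo": True}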
  • FIG. 7 is a flow diagram illustrating how ranging system 18 can be used, in combination with the remainder of the system, to verify operation of the subsystems. First, controller 12 receives a position signal. This is indicated by block 210. This is, illustratively, the signal from the vehicle location system 14. Controller 12 then receives a ranging signal, as indicated by block 212 in FIG. 7. This is the signal from ranging system 18 which is indicative of targets located within the ranging field of vehicle 200. Next, controller 12 queries geospatial database 16. This is indicated by block 214. In querying geospatial database 16, controller 12 verifies that targets, such as street signs, road barriers, etc., are in the proper places, as detected by the ranging signal received in block 212. If the targets identified by the target signal correlate to expected targets in geospatial database 16, given the current position of the vehicle, then controller 12 determines that system 10 is operating properly. This is indicated by blocks 216 and 218. In view of this determination, controller 12 can provide an output to user interface 20 indicating that the system is healthy. [0135]
  • If, however, the detected targets do not correlate to expected targets in the geospatial database for the current vehicle position, then controller 12 determines that something is not operating correctly: either the ranging system 18 is malfunctioning, the vehicle positioning system is malfunctioning, information retrieval from the geospatial database 16 is malfunctioning, or the geospatial database 16 has been corrupted, etc. In any case, controller 12 illustratively provides an output to user interface (UI) 20 indicating that a system problem exists. This is indicated by block 220. Therefore, while controller 12 may not be able to detect the exact type of error which is occurring, controller 12 can detect that an error is occurring and provide an indication to the operator to have the system checked or to have further diagnostics run. [0136]
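  • The correlation step of FIG. 7 can be thought of as matching each expected fixed target from geospatial database 16 against the targets reported by ranging system 18. The function below is a minimal sketch of such a check; the tolerance value, the data layout and the user-interface call are assumptions for illustration only.

      import math

      def system_healthy(detected_targets, expected_targets, match_tolerance_m=5.0):
          # Both lists hold (x, y) target positions expressed in the vehicle frame.
          for ex, ey in expected_targets:
              if not any(math.hypot(dx - ex, dy - ey) <= match_tolerance_m
                         for dx, dy in detected_targets):
                  return False        # an expected sign, barrier, etc. was not detected
          return True

      # The controller could then drive user interface 20 accordingly, e.g.:
      # message = "system healthy" if system_healthy(radar_targets, database_targets) else "check system"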
  • It should also be noted that the present invention need not be provided only for the forward-looking field of view of the operator. Instead, the present system 10 can be implemented as a side-looking or rear-looking virtual mirror. In that instance, ranging system 18 includes radar detectors (or other similar devices) located on the sides or to the rear of vehicle 200. The transformation matrix would be adjusted to transform the view of the operator to the side-looking or rear-looking field of view, as appropriate. [0137]
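  • One way the viewing transform might be redirected for a virtual rear or side view is to add a yaw offset to the eye orientation before building the chain of Eq. 10, and then mirror the drawn image left-to-right for a rear view. The sketch below reuses the frame_transform helper and numpy import from the earlier sketches; the offsets and the mirroring step are assumptions for illustration, not a statement of the actual implementation.

      VIEW_YAW_OFFSET = {"forward": 0.0, "rear": np.pi, "left": np.pi / 2, "right": -np.pi / 2}

      def view_transform(theta_E, O_L, theta_V, O_V, view="forward"):
          # Rotate the eye frame so that the selected direction becomes the viewing axis.
          return frame_transform(theta_E + VIEW_YAW_OFFSET[view], O_L) @ frame_transform(theta_V, O_V)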
  • Vehicles or objects which are sensed, but which are not part of the fixed geospatial landscape, are presented iconically based on the radar or other range sensing devices in ranging system 18. The fixed lane boundaries, of course, are also presented conformally to the driver. Fixed geospatial landmarks which may be relevant to the driver (such as the backs of road signs, special pavement markings, bridges being passed under, water towers, trees, etc.) can also be presented to the user, in the proper perspective. This gives the driver a sense of motion as well as cues to proper velocity. [0138]
  • One illustration of the present invention as both a forward-looking driver assist device and one which assists in a rear view is shown in FIG. 8. A forward-looking field of view is illustrated by block 250, while the virtual rear view mirror is illustrated by block 252. It can be seen that the view is provided just as the operator would see it when looking in a traditional mirror. It should also be noted that the mirror may illustratively be virtually gimbaled along any axis (i.e., the image is rotated from side to side or top to bottom) in software, such that the driver can change the angle of the mirror, just as the driver currently can mechanically, to accommodate different driver sizes or to obtain a different view than is currently being represented by the mirror. [0139]
  • FIG. 9 shows another illustrative embodiment of a vehicle positioning system which provides vehicle position along the roadway. The system illustrated in FIG. 9 can, illustratively, be used as the auxiliary vehicle positioning system 30 illustrated in FIG. 2A. This can provide vehicle positioning information when, for example, the DGPS signal is momentarily lost for whatever reason. In the embodiment illustrated in FIG. 9, vehicle 200 includes an array of magnetic sensors 260. The road lane 262 is bounded by magnetic strips 264 which, illustratively, are formed of tape having magnetized portions 266 therein. Although a wide variety of such magnetic strips could be used, one illustrative embodiment is described in U.S. Pat. No. 5,853,846, assigned to the 3M Company of St. Paul, Minn. The magnetometers in array 260 are monitored such that the field strength sensed by each magnetometer is identified. Therefore, as the vehicle approaches magnetic strip 264 and begins to cross lane boundary 268, magnetometers 270 and 272 begin to provide a signal indicating a larger field strength. [0140]
  • Scanning the array of magnetometers is illustratively accomplished using a microprocessor which scans them quickly enough to detect even fairly high frequency changes in vehicle position toward or away from the magnetic elements in the marked lane boundaries. In this way, a measure of the vehicle's position in the lane can be obtained, even if the primary vehicle positioning system is temporarily not working. Further, while FIG. 9 shows magnetometers mounted to the front of the vehicle, they can be mounted to the rear as well. This would allow an optional calculation of the vehicle's yaw angle relative to the magnetic strips. [0141]
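  • A field-strength-weighted centroid across the array is one simple way to turn the scanned magnetometer readings into a lateral position estimate. The sketch below is such an estimate under stated assumptions (uniform sensor spacing, readings ordered left to right); the description itself only requires that the individual field strengths be scanned and monitored.

      def lateral_offset(field_strengths, sensor_spacing_m=0.1):
          # Offset of the magnetic-strip response from the center of the array, in meters.
          total = sum(field_strengths)
          if total <= 0.0:
              return None                             # no magnetic marker within range
          center_index = (len(field_strengths) - 1) / 2.0
          centroid = sum(i * s for i, s in enumerate(field_strengths)) / total
          return (centroid - center_index) * sensor_spacing_m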
  • FIG. 10 is a block diagram of another embodiment of the present invention. All items are the same as those illustrated in FIG. 1, are similarly numbered, and operate in substantially the same way. However, rather than providing an output to display 22, controller 12 provides an output to neurostimulator 300. Neurostimulator 300 is a stimulating device which operates in a known manner to provide stimulation signals to the cortex to elicit image formation in the brain. The signal provided by controller 12 includes information as to eye perspective and image size and shape, thus enhancing the ability of neurostimulator 300 to properly stimulate the cortex in a meaningful way. Of course, as the person using the system moves and turns his or her head, the image stimulation changes accordingly. [0142]
  • It can thus be seen that the present invention provides a significant advancement in the art of mobility assist devices, particularly with respect to moving in conditions where the outward-looking field of view of the observer is partially or fully obstructed. In an earth-based motor vehicle environment, the present invention provides assistance not only in lane keeping, but also in collision avoidance, since the driver can use the system to steer around displayed obstacles. Of course, the present invention can also be used in many environments, such as snow removal, mining or any other environment where airborne matter obscures vision. The invention can also be used when walking or driving in low-light areas or at night, or through wooded or rocky areas where vision is obscured by the terrain. Further, the present invention can be used on ships or boats to, for example, guide the water-going vessel into port, through a canal, through locks and dams, or around rocks or other obstacles. [0143]
  • Of course, the present invention can also be used with non-motorized, earth-based vehicles such as bicycles and wheelchairs, by skiers, or with substantially any other vehicle. The present invention can also be used to aid blind or vision-impaired persons. [0144]
  • Although the present invention has been described with reference to preferred embodiments, workers skilled in the art will recognize that changes may be made in form and detail without departing from the spirit and scope of the invention. [0145]

Claims (47)

What is claimed is:
1. A display on a mobile body, comprising:
a conformal, augmented display.
2. The display of claim 1 wherein the conformal, augmented display, comprises:
displayed objects, displayed at a perspective approximately equal to a perspective that would be perceived from an operator position at a location of the mobile body by an operator who has visual contact with actual objects corresponding to the displayed objects.
3. The display of claim 2 wherein the displayed objects include blocking templates displayed in a position to reduce glare.
4. The display of claim 2 wherein the displayed objects include enhanced text of signage located proximate to the mobile body.
5. The display of claim 1 wherein the conformal, augmented display comprises:
a guidance indicator guiding the mobile body in a desired direction.
6. The display of claim 2 wherein the displayed objects are positioned within a field of view of the operator in the operator position, at a location which approximately overlies the actual objects in the field of view.
7. The display of claim 6 wherein the displayed objects are see through.
8. The display of claim 6 wherein the displayed objects are displayed in a forward-looking field of view of the operator.
9. The display of claim 6 wherein the displayed objects are displayed in a rear or side view of the operator.
10. The display of claim 9 wherein the mobile body is a vehicle and wherein the displayed objects are displayed in a location simulating a perspective from the operator through a rearview mirror.
11. The display of claim 6 wherein the displayed objects are displayed in a side view of the operator.
12. The display of claim 11 wherein the mobile body is a vehicle and wherein the displayed objects are displayed in a location simulating a perspective from the operator through a side view mirror.
13. The display of claim 6 wherein the displayed objects comprise:
at least one of traffic lane markings or virtual path boundaries.
14. The display of claim 13 wherein the displayed objects comprise: at least one of traffic lights, traffic signals and traffic signs.
15. The display of claim 13 wherein the displayed objects comprise: landmarks.
16. The display of claim 1 wherein the conformal, augmented display, comprises:
displayed target objects, displayed at a perspective approximately equal to a perspective that would be perceived from an operator position at a location of the mobile body by an operator who has visual contact with actual targets corresponding to the displayed target objects.
17. The display of claim 16 wherein the displayed target objects are positioned within a field of view of the operator in the operator position, at a location which approximately overlies the actual target objects in the field of view.
18. The display of claim 17 wherein the displayed target elements are displayed in a forward-looking view of the operator.
19. The display of claim 18 wherein the mobile body comprises a vehicle and wherein the vehicle travels over a roadway and wherein the displayed target elements correspond to transitory targets, not fixed in place during normal operating circumstances of the roadway.
20. The display of claim 19 wherein the transitory targets comprise:
other vehicles proximate to the roadway.
21. The display of claim 19 wherein the transitory targets comprise:
pedestrians or animals proximate to the roadway.
22. The display of claim 6 and further comprising:
an object display indicative of objects outside the field of view of the driver.
23. The display of claim 22 wherein the object display is indicative of service or goods available in a vicinity of the mobile body.
24. The display of claim 1 and further comprising a warning display, warning of an object which the mobile body is approaching.
25. A mobility assist device, comprising:
a location system providing a location signal indicative of a location of a mobile body;
a data storage system storing object information indicative of objects located in a plurality of locations;
a display system; and
a controller coupled to the location system, the data storage system and the display system, and configured to receive the location signal and retrieve object information based on the location signal and provide a display signal to the display system such that the display system displays objects in substantially a correct perspective of an observer located at the location of the mobile body.
26. The mobility assist device of claim 25 wherein the display system is configured to provide a conformal augmented display of the objects based on the display signal.
27. The mobility assist device of claim 25 wherein the controller provides the display signal such that the objects are displayed at a position in a field of view of the observer at a location which substantially overlies the actual objects in the field of view.
28. The mobility assist device of claim 26 wherein the display system comprises:
a projection system providing a projection of an image of the objects; and
a partially reflective, partially transmissive screen, positioned in the field of view of the observer and positioned to receive the projection to allow the observer to see through the screen and to see the image of the objects projected thereon.
29. The mobility assist device of claim 25 and further comprising:
a ranging system, coupled to the controller and configured to detect transitory objects and provide a detection signal to the controller indicative of the location of the transitory object relative to the mobile body.
30. The mobility assist device of claim 29 wherein the controller is further configured to provide the display signal, based at least in part on the detection signal, such that the display system displays the transitory objects in substantially a correct perspective of an observer located at the location of the mobile body.
31. The mobility assist device of claim 25 wherein the controller is configured to filter the display signal such that the display system displays only transitory objects based on operator-selected criteria.
32. The mobility assist device of claim 25 wherein the controller is configured to filter the display signal such that the display system displays only transitory objects and selected objects indicated by the object information that have been selected for display.
33. The mobility assist device of claim 25 and further comprising:
a mobile body orientation detection system, coupled to the controller and the mobile body, detecting an orientation of the mobile body and providing an orientation signal to the controller.
34. The mobility assist device of claim 25 wherein the observer comprises a human with a head and further comprising:
a head orientation tracking system, coupled to the controller, detecting an orientation of the observer's head and providing a head orientation signal to the controller.
35. The mobility assist device of claim 25 wherein the object information is intermittently updated.
36. The mobility assist device of claim 25 wherein the display system comprises a helmet-mounted display system.
37. The mobility assist device of claim 25 wherein the display system comprises a visor-mounted display system.
38. The mobility assist device of claim 25 wherein the display system comprises an eyeglass-mounted display system.
39. A method of monitoring operation of a mobility assist device having a location system providing a location signal indicative of a location of a mobile body, a data storage system storing object information indicative of objects located in a plurality of locations, a display system, a ranging system detecting a location of objects and transitory objects relative to the mobile body and providing an object detection signal based thereon, and a controller coupled to the location system, the data storage system, the ranging system and the display system, and configured to receive the location signal and the object detection signal and retrieve object information based on the location signal and provide a display signal to the display system such that the display system displays objects and transitory objects in substantially a correct perspective of an observer located at the location of the mobile body, the method comprising:
receiving the object detection signal;
determining whether the object detection signal correlates to the object information in the data storage system; and
providing an output at least indicative of a system problem when the object detection signal and the object information are determined not to correlate.
40. The method of claim 39 wherein determining whether the object detection signal correlates to the object information in the data storage system comprises:
accessing the data storage system based on the location signal; and
determining whether the object detection signal indicates the presence of objects indicated by the object information for the location of the mobile body.
41. The method of claim 39 wherein providing an output comprises:
when the object detection signal does not indicate the presence of objects indicated by the object information for the location of the mobile body, providing a user observable indication of a possible malfunction.
42. The method of claim 40 wherein providing an output comprises:
when the object detection signal indicates the presence of objects indicated by the object information for the location of the mobile body, providing a user observable indication of proper operation.
43. The method of claim 39 wherein providing an output comprises:
providing a visual display.
44. A method of controlling a mobility assist device having a location system providing a location signal indicative of a location of a mobile body, a data storage system storing object information indicative of objects located in a plurality of locations, a display system, a ranging system detecting a location of objects and transitory objects relative to the mobile body and providing an object detection signal based thereon, and a controller coupled to the location system, the data storage system, the ranging system and the display system, and comprising:
receiving the location signal and the object detection signal;
retrieving object information based on the location signal; and
providing a filtered display signal to the display system, the display signal being filtered such that the display system displays objects and transitory objects, based on operator selected filtering criteria, in substantially a correct perspective of an observer located at the location of the mobile body.
45. A mobility assist device, comprising:
a location system providing a location signal indicative of a location of a mobile body;
a data storage system storing object information indicative of objects located in a plurality of locations;
a neurostimulation system; and
a controller coupled to the location system, the data storage system and the neurostimulation system, and configured to receive the location signal and retrieve object information based on the location signal and provide a stimulation signal to the neurostimulation system.
46. The mobility assist device of claim 45 and further comprising:
a ranging system, coupled to the controller and configured to detect transitory objects and provide a detection signal to the controller indicative of the location of the transitory object relative to the mobile body.
47. The mobility assist device of claim 46 wherein the controller is further configured to provide the display signal, based at least in part on the detection signal.
US10/626,953 2000-07-18 2003-07-25 Mobility assist device Abandoned US20040066376A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/626,953 US20040066376A1 (en) 2000-07-18 2003-07-25 Mobility assist device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US09/618,613 US6977630B1 (en) 2000-07-18 2000-07-18 Mobility assist device
US10/626,953 US20040066376A1 (en) 2000-07-18 2003-07-25 Mobility assist device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US09/618,613 Continuation US6977630B1 (en) 2000-07-18 2000-07-18 Mobility assist device

Publications (1)

Publication Number Publication Date
US20040066376A1 true US20040066376A1 (en) 2004-04-08

Family

ID=32043605

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/618,613 Expired - Lifetime US6977630B1 (en) 2000-07-18 2000-07-18 Mobility assist device
US10/626,953 Abandoned US20040066376A1 (en) 2000-07-18 2003-07-25 Mobility assist device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US09/618,613 Expired - Lifetime US6977630B1 (en) 2000-07-18 2000-07-18 Mobility assist device

Country Status (1)

Country Link
US (2) US6977630B1 (en)

Cited By (81)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050012682A1 (en) * 2003-07-17 2005-01-20 Jenson Barton James Visual display system for displaying virtual images onto a field of vision
US20050052348A1 (en) * 2003-08-22 2005-03-10 Shunpei Yamazaki Light emitting device, driving support system, and helmet
US20050073136A1 (en) * 2002-10-15 2005-04-07 Volvo Technology Corporation Method and arrangement for interpreting a subjects head and eye activity
US20050107952A1 (en) * 2003-09-26 2005-05-19 Mazda Motor Corporation On-vehicle information provision apparatus
US20050116911A1 (en) * 2003-10-07 2005-06-02 Tomohiro Mukai Information display
WO2005053991A1 (en) * 2003-12-01 2005-06-16 Volvo Technology Corporation Method and system for supporting path control
US20050134479A1 (en) * 2003-12-17 2005-06-23 Kazuyoshi Isaji Vehicle display system
US20050174257A1 (en) * 2002-03-05 2005-08-11 The University Of Minnesota Intersection assistance system and method
US20050206727A1 (en) * 2000-10-13 2005-09-22 L-3 Communications Corporation System and method for forming images for display in a vehicle
US20050219695A1 (en) * 2004-04-05 2005-10-06 Vesely Michael A Horizontal perspective display
US20050264559A1 (en) * 2004-06-01 2005-12-01 Vesely Michael A Multi-plane horizontal perspective hands-on simulator
US6977630B1 (en) 2000-07-18 2005-12-20 University Of Minnesota Mobility assist device
US20060010699A1 (en) * 2004-07-15 2006-01-19 C&N Inc. Mobile terminal apparatus
US20060022811A1 (en) * 2004-07-28 2006-02-02 Karsten Haug Night vision device
US20060087507A1 (en) * 2004-10-25 2006-04-27 Sony Corporation Information processing apparatus and method, program, and navigation apparatus
US7050908B1 (en) * 2005-03-22 2006-05-23 Delphi Technologies, Inc. Lane marker projection method for a motor vehicle vision system
US20060126927A1 (en) * 2004-11-30 2006-06-15 Vesely Michael A Horizontal perspective representation
US20060178815A1 (en) * 2005-02-04 2006-08-10 Samsung Electronics Co., Ltd. Apparatus and method for correcting location information of mobile body, and computer-readable media storing computer program for controlling the apparatus
US7098913B1 (en) * 2002-07-30 2006-08-29 Rockwell Collins, Inc. Method and system for providing depth cues by attenuating distant displayed terrain
US20060250391A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Three dimensional horizontal perspective workstation
US20060252978A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Biofeedback eyewear system
US20060269437A1 (en) * 2005-05-31 2006-11-30 Pandey Awadh B High temperature aluminum alloys
US20070013495A1 (en) * 2005-06-15 2007-01-18 Denso Coropration Vehicle drive assist system
US20070032914A1 (en) * 2005-08-05 2007-02-08 Nissan Motor Co., Ltd. Vehicle driving assist system
US20070040705A1 (en) * 2005-08-19 2007-02-22 Denso Corporation Unsafe location warning system
US20070040905A1 (en) * 2005-08-18 2007-02-22 Vesely Michael A Stereoscopic display using polarized eyewear
US20070043466A1 (en) * 2005-08-18 2007-02-22 Vesely Michael A Stereoscopic display using polarized eyewear
US20070102214A1 (en) * 2005-09-06 2007-05-10 Marten Wittorf Method and system for improving traffic safety
US20070165910A1 (en) * 2006-01-17 2007-07-19 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus, method, and program
US20070257836A1 (en) * 2006-05-05 2007-11-08 Clint Chaplin Site survey tracking
FR2902381A1 (en) * 2006-06-20 2007-12-21 Peugeot Citroen Automobiles Sa Motor vehicle driving assisting method, involves merging image captured by image formation device of night vision system and synthesis image, and displaying merged image on internal display of night vision system
EP1887541A2 (en) * 2006-08-04 2008-02-13 Audi Ag Motor vehicle with a lane detection system
US20080046151A1 (en) * 2006-08-16 2008-02-21 Gm Global Technology Operations, Inc. Method and System for Adjusting Vehicular Components Based on Sun Position
EP1894779A1 (en) * 2006-09-01 2008-03-05 Harman Becker Automotive Systems GmbH Method of operating a night-view system in a vehicle and corresponding night-view system
US20080195315A1 (en) * 2004-09-28 2008-08-14 National University Corporation Kumamoto University Movable-Body Navigation Information Display Method and Movable-Body Navigation Information Display Unit
WO2008108853A1 (en) * 2007-03-02 2008-09-12 Nanolumens Acquisition, Inc. Vehicle video display for exterior view
US7552008B2 (en) 2001-07-18 2009-06-23 Regents Of The University Of Minnesota Populating geospatial database for onboard intelligent vehicle applications
US20090265061A1 (en) * 2006-11-10 2009-10-22 Aisin Seiki Kabushiki Kaisha Driving assistance device, driving assistance method, and program
US20110050548A1 (en) * 2008-03-04 2011-03-03 Elbit Systems Electro Optics Elop Ltd. Head up display utilizing an lcd and a diffuser
US20110102232A1 (en) * 1999-06-14 2011-05-05 Escort Inc. Radar detector with navigation function
US20110153266A1 (en) * 2009-12-23 2011-06-23 Regents Of The University Of Minnesota Augmented vehicle location system
US20110301813A1 (en) * 2010-06-07 2011-12-08 Denso International America, Inc. Customizable virtual lane mark display
EP2412557A1 (en) * 2009-03-23 2012-02-01 Kabushiki Kaisha Toshiba Vehicluar display system, method of displaying and vehicle
US20120044090A1 (en) * 2010-08-18 2012-02-23 GM Global Technology Operations LLC Motor vehicle with digital projectors
EP1857780A3 (en) * 2006-05-16 2012-11-14 Navteq North America, LLC Dual road geometry representation for position and curvature-heading
US20120310531A1 (en) * 2011-05-31 2012-12-06 Broadcom Corporation Navigation system employing augmented labeling and/or indicia
CN102842138A (en) * 2011-04-08 2012-12-26 F·波尔希名誉工学博士公司 Method for operating image-based driver assistance system in motorcycle, involves utilizing connection plane between central vertical axis of pickup unit and alignment axis as x-z-plane to determine data of vehicle
WO2013113500A1 (en) * 2012-02-02 2013-08-08 Audi Ag Driver assistance system and method for virtual representation of a road layout under obscured visibility and/or poor visibility conditions
US20130261950A1 (en) * 2012-03-28 2013-10-03 Honda Motor Co., Ltd. Railroad crossing barrier estimating apparatus and vehicle
US20130294650A1 (en) * 2012-02-16 2013-11-07 Panasonic Corporation Image generation device
US20130342666A1 (en) * 2006-08-15 2013-12-26 Koninklijke Philips N.V. Assistance system for visually handicapped persons
WO2013189927A1 (en) * 2012-06-20 2013-12-27 Bayerische Motoren Werke Aktiengesellschaft Method and device for operating a head-up display for a vehicle
US20140005886A1 (en) * 2012-06-29 2014-01-02 Microsoft Corporation Controlling automotive functionality using internal- and external-facing sensors
US8717423B2 (en) 2005-05-09 2014-05-06 Zspace, Inc. Modifying perspective of stereoscopic images based on changes in user viewpoint
US8717360B2 (en) 2010-01-29 2014-05-06 Zspace, Inc. Presenting a view within a three dimensional scene
US8786529B1 (en) 2011-05-18 2014-07-22 Zspace, Inc. Liquid crystal variable drive voltage
US20140267415A1 (en) * 2013-03-12 2014-09-18 Xueming Tang Road marking illuminattion system and method
US20140310610A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc Vehicle occupant impairment assisted vehicle
US20140348389A1 (en) * 2011-12-29 2014-11-27 David L. Graumann Systems, methods, and apparatus for controlling devices based on a detected gaze
US20150071496A1 (en) * 2013-09-06 2015-03-12 Robert Bosch Gmbh method and control and recording device for the plausibility checking for the wrong-way travel of a motor vehicle
US9064420B2 (en) 2013-03-14 2015-06-23 Honda Motor Co., Ltd. Augmented reality heads up display (HUD) for yield to pedestrian safety cues
US20150211876A1 (en) * 2014-01-29 2015-07-30 Brian R. Edelen Visual guidance system
US20150229885A1 (en) * 2012-08-21 2015-08-13 Robert Bosch Gmbh Method for supplementing a piece of object information assigned to an object and method for selecting objects in surroundings of a vehicle
WO2015144751A1 (en) * 2014-03-25 2015-10-01 Jaguar Land Rover Limited Navigation system
US20160247395A1 (en) * 2013-08-30 2016-08-25 Komatsu Ltd. Management system and management method for mining machine
US20170163863A1 (en) * 2015-12-03 2017-06-08 Fico Mirrors, S.A. Rear vision system for a motor vehicle
USD791158S1 (en) * 2015-10-08 2017-07-04 Mitsubishi Electric Corporation Display screen with graphical user interface
WO2018007003A1 (en) * 2016-07-03 2018-01-11 DDG Benelux S.A. Non-rail bound driving device provided with a virtual reality unit
US20180031849A1 (en) * 2016-07-29 2018-02-01 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Augmented reality head-up display road correction
US20180086262A1 (en) * 2016-09-29 2018-03-29 Valeo Vision Method for projecting an image by a projection system of a motor vehicle, and associated projection system
CN107878301A (en) * 2016-09-29 2018-04-06 法雷奥照明公司 Method and associated optical projection system for the projection system projects image by motor vehicles
CN107924630A (en) * 2015-08-05 2018-04-17 株式会社电装 Position detecting device, method for detecting position and position detecting system
US20190071094A1 (en) * 2017-09-01 2019-03-07 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and storage medium
US20190265818A1 (en) * 2006-03-24 2019-08-29 Northwestern University Haptic device with indirect haptic feedback
US10679530B1 (en) * 2019-02-11 2020-06-09 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for mobile projection in foggy conditions
EP3667239A1 (en) * 2018-12-12 2020-06-17 HERE Global B.V. Method and apparatus for augmented reality based on localization and environmental conditions
US20200348140A1 (en) * 2017-12-27 2020-11-05 Bayerische Motoren Werke Aktiengesellschaft Deformation Correction of a Digital Map for a Vehicle
US11009884B2 (en) * 2017-09-29 2021-05-18 Direct Current Capital LLC Method for calculating nominal vehicle paths for lanes within a geographic region
EP3845861A1 (en) * 2020-01-02 2021-07-07 Samsung Electronics Co., Ltd. Method and device for displaying 3d augmented reality navigation information
US20210403026A1 (en) * 2020-06-29 2021-12-30 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for 3d modeling
US11529872B2 (en) * 2019-03-07 2022-12-20 Deutsche Post Ag Vehicle with display device

Families Citing this family (59)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7630806B2 (en) * 1994-05-23 2009-12-08 Automotive Technologies International, Inc. System and method for detecting and protecting pedestrians
US7209221B2 (en) * 1994-05-23 2007-04-24 Automotive Technologies International, Inc. Method for obtaining and displaying information about objects in a vehicular blind spot
US7783403B2 (en) * 1994-05-23 2010-08-24 Automotive Technologies International, Inc. System and method for preventing vehicular accidents
US8041483B2 (en) 1994-05-23 2011-10-18 Automotive Technologies International, Inc. Exterior airbag deployment techniques
US7426437B2 (en) * 1997-10-22 2008-09-16 Intelligent Technologies International, Inc. Accident avoidance systems and methods
US20150149079A1 (en) * 1999-12-15 2015-05-28 American Vehicular Sciences Llc Vehicle heads-up display navigation system
US7852462B2 (en) * 2000-05-08 2010-12-14 Automotive Technologies International, Inc. Vehicular component control methods based on blind spot monitoring
US6895126B2 (en) 2000-10-06 2005-05-17 Enrico Di Bernardo System and method for creating, storing, and utilizing composite images of a geographic location
SE520042C2 (en) * 2000-10-26 2003-05-13 Autoliv Dev Device for improving the night vision of a vehicle such as a car
US20030202097A1 (en) * 2000-10-26 2003-10-30 Autoliv Development Ab Night vision arrangement
GB2370709A (en) * 2000-12-28 2002-07-03 Nokia Mobile Phones Ltd Displaying an image and associated visual effect
US20040247157A1 (en) * 2001-06-15 2004-12-09 Ulrich Lages Method for preparing image information
DE10131720B4 (en) * 2001-06-30 2017-02-23 Robert Bosch Gmbh Head-Up Display System and Procedures
US7995095B2 (en) * 2001-10-18 2011-08-09 Autoliv Development Ab Night vision device for a vehicle
DE10218010A1 (en) * 2002-04-23 2003-11-06 Bosch Gmbh Robert Method and device for lateral guidance support in motor vehicles
US8718919B2 (en) * 2002-04-23 2014-05-06 Robert Bosch Gmbh Method and apparatus for lane recognition for a vehicle
JP2006511384A (en) * 2002-11-13 2006-04-06 アカレッタ、ラウル デルガド Information display device
DE10253510A1 (en) * 2002-11-16 2004-05-27 Robert Bosch Gmbh Visibility improvement device in motor vehicle, has processing unit with arrangement for detecting road profile from acquired optical signal(s) and controlling signaling arrangement accordingly
US7480512B2 (en) 2004-01-16 2009-01-20 Bones In Motion, Inc. Wireless device, program products and methods of using a wireless device to deliver services
JP4134785B2 (en) * 2003-03-28 2008-08-20 株式会社デンソー Display device
JP4055656B2 (en) * 2003-05-30 2008-03-05 トヨタ自動車株式会社 Collision prediction device
JP4401728B2 (en) * 2003-09-30 2010-01-20 キヤノン株式会社 Mixed reality space image generation method and mixed reality system
JP4609695B2 (en) * 2003-10-21 2011-01-12 日本精機株式会社 Vehicle display device
JP4254501B2 (en) * 2003-11-20 2009-04-15 日産自動車株式会社 VEHICLE DRIVE OPERATION ASSISTANCE DEVICE AND VEHICLE HAVING VEHICLE DRIVE OPERATION ASSISTANCE DEVICE
JP3900162B2 (en) * 2004-02-09 2007-04-04 日産自動車株式会社 VEHICLE DRIVE OPERATION ASSISTANCE DEVICE AND VEHICLE WITH VEHICLE DRIVE OPERATION ASSISTANCE DEVICE
WO2005104063A1 (en) * 2004-04-21 2005-11-03 Mitsubishi Denki Kabushiki Kaisha Facilities display device
US8521411B2 (en) * 2004-06-03 2013-08-27 Making Virtual Solid, L.L.C. En-route navigation display method and apparatus using head-up display
JP4443327B2 (en) * 2004-07-01 2010-03-31 パイオニア株式会社 Information display device
JP4556794B2 (en) * 2004-10-06 2010-10-06 株式会社デンソー Navigation device
KR20060057917A (en) * 2004-11-24 2006-05-29 한국전자통신연구원 Wearable apparatus for converting vision signal into haptic signal and agent system using the same
JP4533762B2 (en) * 2005-01-19 2010-09-01 日立オートモティブシステムズ株式会社 Variable transmittance window system
GB2438783B8 (en) * 2005-03-16 2011-12-28 Lucasfilm Entertainment Co Ltd Three-dimensional motion capture
US7307578B2 (en) * 2005-03-31 2007-12-11 Honeywell International Inc. Declutter of graphical TCAS targets to improve situational awareness
US20060262140A1 (en) * 2005-05-18 2006-11-23 Kujawa Gregory A Method and apparatus to facilitate visual augmentation of perceived reality
KR100721560B1 (en) * 2005-11-30 2007-05-23 한국전자통신연구원 System and method for provision of 3-dimensional car information with arbitrary viewpoint
JP4935145B2 (en) * 2006-03-29 2012-05-23 株式会社デンソー Car navigation system
CN101467003A (en) * 2006-06-30 2009-06-24 电子地图北美公司 Method and system for collecting user update requests regarding geographic data to support automated analysis, processing and geographic data updates
GB2459220B (en) 2007-01-12 2012-09-05 Kopin Corp Head mounted computing device
US9217868B2 (en) 2007-01-12 2015-12-22 Kopin Corporation Monocular display device
US8130225B2 (en) * 2007-01-16 2012-03-06 Lucasfilm Entertainment Company Ltd. Using animation libraries for object identification
US8542236B2 (en) * 2007-01-16 2013-09-24 Lucasfilm Entertainment Company Ltd. Generating animation libraries
US8199152B2 (en) 2007-01-16 2012-06-12 Lucasfilm Entertainment Company Ltd. Combining multiple session content for animation libraries
US8144153B1 (en) 2007-11-20 2012-03-27 Lucasfilm Entertainment Company Ltd. Model production for animation libraries
US8392064B2 (en) * 2008-05-27 2013-03-05 The Board Of Trustees Of The Leland Stanford Junior University Systems, methods and devices for adaptive steering control of automotive vehicles
US8065082B2 (en) * 2008-11-14 2011-11-22 Honeywell International Inc. Display systems with enhanced symbology
US9142024B2 (en) 2008-12-31 2015-09-22 Lucasfilm Entertainment Company Ltd. Visual and physical motion sensing for three-dimensional motion capture
US8437939B2 (en) * 2010-01-29 2013-05-07 Toyota Jidosha Kabushiki Kaisha Road information detecting device and vehicle cruise control device
KR20120113579A (en) * 2011-04-05 2012-10-15 현대자동차주식회사 Apparatus and method for displaying road guide information on the windshield
US8565481B1 (en) * 2011-05-26 2013-10-22 Google Inc. System and method for tracking objects
US8948447B2 (en) 2011-07-12 2015-02-03 Lucasfilm Entertainment Companyy, Ltd. Scale independent tracking pattern
US8954255B1 (en) 2011-09-16 2015-02-10 Robert J. Crawford Automobile-speed control using terrain-based speed profile
US9508176B2 (en) 2011-11-18 2016-11-29 Lucasfilm Entertainment Company Ltd. Path and speed based character control
US8868254B2 (en) 2012-06-08 2014-10-21 Apple Inc. Accessory control with geo-fencing
GB2524514B (en) * 2014-03-25 2017-08-16 Jaguar Land Rover Ltd Navigation system
JP6353881B2 (en) * 2016-08-25 2018-07-04 株式会社Subaru Vehicle display device
JP2018097541A (en) * 2016-12-12 2018-06-21 三菱自動車工業株式会社 Driving support device
CN109427199B (en) * 2017-08-24 2022-11-18 北京三星通信技术研究有限公司 Augmented reality method and device for driving assistance
CN113486796B (en) * 2018-09-07 2023-09-05 百度在线网络技术(北京)有限公司 Unmanned vehicle position detection method, unmanned vehicle position detection device, unmanned vehicle position detection equipment, storage medium and vehicle
EP3720751A4 (en) * 2018-10-25 2021-07-14 Samsung Electronics Co., Ltd. Augmented reality method and apparatus for driving assistance

Citations (74)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US47234A (en) * 1865-04-11 Improvement in boxes for transporting plants
US4120566A (en) * 1977-04-18 1978-10-17 Salvatore Sanci Rearview apparatus for vehicles
US4406501A (en) * 1982-01-29 1983-09-27 Caterpillar Tractor Co. Recoil system with guided slide assembly for track-type vehicles
US4498620A (en) * 1982-08-13 1985-02-12 Champion International Corporation Carton with carrying handle
US5203923A (en) * 1990-11-27 1993-04-20 Research Derivatives, Inc. Apparatus for painting highway markings
US5214757A (en) * 1990-08-07 1993-05-25 Georesearch, Inc. Interactive automated mapping system
US5231379A (en) * 1987-09-18 1993-07-27 Hughes Flight Dynamics, Inc. Automobile head-up display system with apparatus for positioning source information
US5291338A (en) * 1989-06-15 1994-03-01 Jaeger Head-down type optical device for delivering information to the driver of a motor vehicle
US5381338A (en) * 1991-06-21 1995-01-10 Wysocki; David A. Real time three dimensional geo-referenced digital orthophotograph-based positioning, navigation, collision avoidance and decision support system
US5414439A (en) * 1994-06-09 1995-05-09 Delco Electronics Corporation Head up display with night vision enhancement
US5444442A (en) * 1992-11-05 1995-08-22 Matsushita Electric Industrial Co., Ltd. Method for predicting traffic space mean speed and traffic flow rate, and method and apparatus for controlling isolated traffic light signaling system through predicted traffic flow rate
US5497271A (en) * 1993-09-07 1996-03-05 Jaguar Cars Limited Head up displays for motor vehicles
US5499325A (en) * 1992-08-20 1996-03-12 International Business Machines Corporation Brightness controls for visual separation of vector and raster information
US5517419A (en) * 1993-07-22 1996-05-14 Synectics Corporation Advanced terrain mapping system
US5529433A (en) * 1993-12-14 1996-06-25 Pavement Marking Technologies, Inc. Apparatus and method for marking a surface
US5543789A (en) * 1994-06-24 1996-08-06 Shields Enterprises, Inc. Computerized navigation system
US5599133A (en) * 1995-05-25 1997-02-04 Argus International Method and apparatus for painting road surfaces
US5602741A (en) * 1994-02-18 1997-02-11 Trimble Navigation Limited Centimeter accurate global positioning system receiver for on-the-fly real-time kinematic measurement and control
US5652705A (en) * 1995-09-25 1997-07-29 Spiess; Newton E. Highway traffic accident avoidance system
US5721685A (en) * 1995-06-29 1998-02-24 Holland; Robert E. Digi-track digital roadway and railway analyzer
US5734358A (en) * 1994-03-18 1998-03-31 Kansei Corporation Information display device for motor vehicle
US5761630A (en) * 1995-03-23 1998-06-02 Honda Giken Kogyo Kabushiki Kaisha Vehicle control system for merging vehicles safely
US5765116A (en) * 1993-08-28 1998-06-09 Lucas Industries Public Limited Company Driver assistance system for a vehicle
US5826212A (en) * 1994-10-25 1998-10-20 Honda Giken Kogyo Kabushiki Kaisha Current-position map and three dimensional guiding objects displaying device for vehicle
US5848373A (en) * 1994-06-24 1998-12-08 Delorme Publishing Company Computer aided map location system
US5872526A (en) * 1996-05-23 1999-02-16 Sun Microsystems, Inc. GPS collision avoidance system
US5910817A (en) * 1995-05-18 1999-06-08 Omron Corporation Object observing method and device
US5926117A (en) * 1997-06-10 1999-07-20 Hitachi, Ltd. Vehicle control system, vehicle mounting apparatus, base station apparatus and vehicle control method
US5949331A (en) * 1993-02-26 1999-09-07 Donnelly Corporation Display enhancements for vehicle vision system
US5951620A (en) * 1996-01-26 1999-09-14 Navigation Technologies Corporation System and method for distributing information for storage media
US5953722A (en) * 1996-10-25 1999-09-14 Navigation Technologies Corporation Method and system for forming and using geographic data
US5966132A (en) * 1994-06-17 1999-10-12 Namco Ltd. Three-dimensional image synthesis which represents images differently in multiple three dimensional spaces
US5978737A (en) * 1997-10-16 1999-11-02 Intel Corporation Method and apparatus for hazard detection and distraction avoidance for a vehicle
US5999635A (en) * 1996-01-12 1999-12-07 Sumitomo Electric Industries, Ltd. Traffic congestion measuring method and apparatus and image processing method and apparatus
US5999878A (en) * 1997-04-11 1999-12-07 Navigation Technologies Corp. System and method for acquiring geographic data for forming a digital database of road geometry in a geographic region
US6038496A (en) * 1995-03-07 2000-03-14 Daimlerchrysler Ag Vehicle with optical scanning device for a lateral road area
US6038559A (en) * 1998-03-16 2000-03-14 Navigation Technologies Corporation Segment aggregation in a geographic database and methods for use thereof in a navigation application
US6047234A (en) * 1997-10-16 2000-04-04 Navigation Technologies Corporation System and method for updating, enhancing or refining a geographic database using feedback
US6049295A (en) * 1997-12-05 2000-04-11 Fujitsu Limited Method and system for avoiding a collision at an intersection and a recording medium storing programs performing such a method
US6120460A (en) * 1996-09-04 2000-09-19 Abreu; Marcio Marc Method and apparatus for signal acquisition, processing and transmission for evaluation of bodily functions
US6122593A (en) * 1999-08-03 2000-09-19 Navigation Technologies Corporation Method and system for providing a preview of a route calculated with a navigation system
US6144335A (en) * 1998-04-14 2000-11-07 Trimble Navigation Limited Automated differential correction processing of field data in a global positional system
US6157342A (en) * 1997-05-27 2000-12-05 Xanavi Informatics Corporation Navigation device
US6161071A (en) * 1999-03-12 2000-12-12 Navigation Technologies Corporation Method and system for an in-vehicle computing architecture
US6166698A (en) * 1999-02-16 2000-12-26 Gentex Corporation Rearview mirror with integrated microwave receiver
US6184823B1 (en) * 1998-05-01 2001-02-06 Navigation Technologies Corp. Geographic database architecture for representation of named intersections and complex intersections and methods for formation thereof and use in a navigation application program
US6188957B1 (en) * 1999-10-04 2001-02-13 Navigation Technologies Corporation Method and system for providing bicycle information with a navigation system
US6192314B1 (en) * 1998-03-25 2001-02-20 Navigation Technologies Corp. Method and system for route calculation in a navigation application
US6208934B1 (en) * 1999-01-19 2001-03-27 Navigation Technologies Corp. Method and system for providing walking instructions with route guidance in a navigation program
US6208927B1 (en) * 1997-09-10 2001-03-27 Fuji Jukogyo Kabushiki Kaisha Vehicle maneuvering control device
US6212474B1 (en) * 1998-11-19 2001-04-03 Navigation Technologies Corporation System and method for providing route guidance with a navigation application program
US6218934B1 (en) * 1999-07-21 2001-04-17 Daimlerchrysler Corporation Mini-trip computer for use in a rearview mirror assembly
US6226389B1 (en) * 1993-08-11 2001-05-01 Jerome H. Lemelson Motor vehicle warning and control system and method
US6253151B1 (en) * 2000-06-23 2001-06-26 Navigation Technologies Corp. Navigation system with feature for reporting errors
US6268825B1 (en) * 1996-11-25 2001-07-31 Toyota Jidosha Kabushiki Kaisha Navigation device for vehicle and preparation of road shape data used therefor
US6272431B1 (en) * 1997-04-29 2001-08-07 Thomas Zamojdo Method for displaying a map in a vehicle en-route guidance system
US6278942B1 (en) * 2000-03-21 2001-08-21 Navigation Technologies Corp. Method and system for providing routing guidance
US6289278B1 (en) * 1998-02-27 2001-09-11 Hitachi, Ltd. Vehicle position information displaying apparatus and method
US6297516B1 (en) * 1997-11-24 2001-10-02 The Trustees Of Princeton University Method for deposition and patterning of organic thin film
US6308177B1 (en) * 1996-10-25 2001-10-23 Vijaya S. Israni System and method for use and storage of geographic data on physical media
US6314365B1 (en) * 2000-01-18 2001-11-06 Navigation Technologies Corp. Method and system of providing navigation services to cellular phone devices from a server
US6361321B1 (en) * 1997-04-10 2002-03-26 Faac, Inc. Dynamically controlled vehicle simulator system, and methods of constructing and utilizing same
US20020036584A1 (en) * 2000-02-28 2002-03-28 Jocoy Edward H. System and method for avoiding accidents in intersections
US6370475B1 (en) * 1997-10-22 2002-04-09 Intelligent Technologies International Inc. Accident avoidance system
US6370261B1 (en) * 1998-01-30 2002-04-09 Fuji Jukogyo Kabushiki Kaisha Vehicle surroundings monitoring apparatus
US6385539B1 (en) * 1999-08-13 2002-05-07 Daimlerchrysler Ag Method and system for autonomously developing or augmenting geographical databases by mining uncoordinated probe data
US6405132B1 (en) * 1997-10-22 2002-06-11 Intelligent Technologies International, Inc. Accident avoidance system
US6438491B1 (en) * 1999-08-06 2002-08-20 Telanon, Inc. Methods and apparatus for stationary object detection
US6486856B1 (en) * 1998-04-15 2002-11-26 Daimlerchrysler Ag Apparatus for improved contrast in a motor vehicle heads-up display
US20020184236A1 (en) * 2000-07-18 2002-12-05 Max Donath Real time high accuracy geospatial database for onboard intelligent vehicle applications
US20030023614A1 (en) * 2001-07-18 2003-01-30 Newstrom Bryan J. Populating geospatial database for onboard intelligent vehicle applications
US6526352B1 (en) * 2001-07-19 2003-02-25 Intelligent Technologies International, Inc. Method and arrangement for mapping a road
US6587778B2 (en) * 1999-12-17 2003-07-01 Itt Manufacturing Enterprises, Inc. Generalized adaptive signal control method and system
US6690268B2 (en) * 2000-03-02 2004-02-10 Donnelly Corporation Video mirror systems incorporating an accessory module

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6366851B1 (en) 1999-10-25 2002-04-02 Navigation Technologies Corp. Method and system for automatic centerline adjustment of shape point data for a geographic database
US6977630B1 (en) 2000-07-18 2005-12-20 University Of Minnesota Mobility assist device

Patent Citations (81)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US47234A (en) * 1865-04-11 Improvement in boxes for transporting plants
US4120566A (en) * 1977-04-18 1978-10-17 Salvatore Sanci Rearview apparatus for vehicles
US4406501A (en) * 1982-01-29 1983-09-27 Caterpillar Tractor Co. Recoil system with guided slide assembly for track-type vehicles
US4498620A (en) * 1982-08-13 1985-02-12 Champion International Corporation Carton with carrying handle
US5231379A (en) * 1987-09-18 1993-07-27 Hughes Flight Dynamics, Inc. Automobile head-up display system with apparatus for positioning source information
US5291338A (en) * 1989-06-15 1994-03-01 Jaeger Head-down type optical device for delivering information to the driver of a motor vehicle
US5214757A (en) * 1990-08-07 1993-05-25 Georesearch, Inc. Interactive automated mapping system
US5203923A (en) * 1990-11-27 1993-04-20 Research Derivatives, Inc. Apparatus for painting highway markings
US5381338A (en) * 1991-06-21 1995-01-10 Wysocki; David A. Real time three dimensional geo-referenced digital orthophotograph-based positioning, navigation, collision avoidance and decision support system
US5499325A (en) * 1992-08-20 1996-03-12 International Business Machines Corporation Brightness controls for visual separation of vector and raster information
US5444442A (en) * 1992-11-05 1995-08-22 Matsushita Electric Industrial Co., Ltd. Method for predicting traffic space mean speed and traffic flow rate, and method and apparatus for controlling isolated traffic light signaling system through predicted traffic flow rate
US5949331A (en) * 1993-02-26 1999-09-07 Donnelly Corporation Display enhancements for vehicle vision system
US5517419A (en) * 1993-07-22 1996-05-14 Synectics Corporation Advanced terrain mapping system
US6226389B1 (en) * 1993-08-11 2001-05-01 Jerome H. Lemelson Motor vehicle warning and control system and method
US5765116A (en) * 1993-08-28 1998-06-09 Lucas Industries Public Limited Company Driver assistance system for a vehicle
US5497271A (en) * 1993-09-07 1996-03-05 Jaguar Cars Limited Head up displays for motor vehicles
US5529433A (en) * 1993-12-14 1996-06-25 Pavement Marking Technologies, Inc. Apparatus and method for marking a surface
US5602741A (en) * 1994-02-18 1997-02-11 Trimble Navigation Limited Centimeter accurate global positioning system receiver for on-the-fly real-time kinematic measurement and control
US5734358A (en) * 1994-03-18 1998-03-31 Kansei Corporation Information display device for motor vehicle
US5414439A (en) * 1994-06-09 1995-05-09 Delco Electronics Corporation Head up display with night vision enhancement
US5966132A (en) * 1994-06-17 1999-10-12 Namco Ltd. Three-dimensional image synthesis which represents images differently in multiple three dimensional spaces
US6104316A (en) * 1994-06-24 2000-08-15 Navigation Technologies Corporation Computerized navigation system
US6107944A (en) * 1994-06-24 2000-08-22 Navigation Technologies Corporation Electronic navigation system and method
US5543789A (en) * 1994-06-24 1996-08-06 Shields Enterprises, Inc. Computerized navigation system
US5808566A (en) * 1994-06-24 1998-09-15 Navigation Technologies Corporation Electronic navigation system and method
US5848373A (en) * 1994-06-24 1998-12-08 Delorme Publishing Company Computer aided map location system
US5826212A (en) * 1994-10-25 1998-10-20 Honda Giken Kogyo Kabushiki Kaisha Current-position map and three dimensional guiding objects displaying device for vehicle
US6038496A (en) * 1995-03-07 2000-03-14 Daimlerchrysler Ag Vehicle with optical scanning device for a lateral road area
US5761630A (en) * 1995-03-23 1998-06-02 Honda Giken Kogyo Kabushiki Kaisha Vehicle control system for merging vehicles safely
US5910817A (en) * 1995-05-18 1999-06-08 Omron Corporation Object observing method and device
US5599133A (en) * 1995-05-25 1997-02-04 Argus International Method and apparatus for painting road surfaces
US5721685A (en) * 1995-06-29 1998-02-24 Holland; Robert E. Digi-track digital roadway and railway analyzer
US5652705A (en) * 1995-09-25 1997-07-29 Spiess; Newton E. Highway traffic accident avoidance system
US5999635A (en) * 1996-01-12 1999-12-07 Sumitomo Electric Industries, Ltd. Traffic congestion measuring method and apparatus and image processing method and apparatus
US5951620A (en) * 1996-01-26 1999-09-14 Navigation Technologies Corporation System and method for distributing information for storage media
US5872526A (en) * 1996-05-23 1999-02-16 Sun Microsystems, Inc. GPS collision avoidance system
US6120460A (en) * 1996-09-04 2000-09-19 Abreu; Marcio Marc Method and apparatus for signal acquisition, processing and transmission for evaluation of bodily functions
US5953722A (en) * 1996-10-25 1999-09-14 Navigation Technologies Corporation Method and system for forming and using geographic data
US6308177B1 (en) * 1996-10-25 2001-10-23 Vijaya S. Israni System and method for use and storage of geographic data on physical media
US6268825B1 (en) * 1996-11-25 2001-07-31 Toyota Jidosha Kabushiki Kaisha Navigation device for vehicle and preparation of road shape data used therefor
US6361321B1 (en) * 1997-04-10 2002-03-26 Faac, Inc. Dynamically controlled vehicle simulator system, and methods of constructing and utilizing same
US5999878A (en) * 1997-04-11 1999-12-07 Navigation Technologies Corp. System and method for acquiring geographic data for forming a digital database of road geometry in a geographic region
US6272431B1 (en) * 1997-04-29 2001-08-07 Thomas Zamojdo Method for displaying a map in a vehicle en-route guidance system
US6157342A (en) * 1997-05-27 2000-12-05 Xanavi Informatics Corporation Navigation device
US5926117A (en) * 1997-06-10 1999-07-20 Hitachi, Ltd. Vehicle control system, vehicle mounting apparatus, base station apparatus and vehicle control method
US6208927B1 (en) * 1997-09-10 2001-03-27 Fuji Jukogyo Kabushiki Kaisha Vehicle maneuvering control device
US6047234A (en) * 1997-10-16 2000-04-04 Navigation Technologies Corporation System and method for updating, enhancing or refining a geographic database using feedback
US5978737A (en) * 1997-10-16 1999-11-02 Intel Corporation Method and apparatus for hazard detection and distraction avoidance for a vehicle
US6370475B1 (en) * 1997-10-22 2002-04-09 Intelligent Technologies International Inc. Accident avoidance system
US6405132B1 (en) * 1997-10-22 2002-06-11 Intelligent Technologies International, Inc. Accident avoidance system
US6297516B1 (en) * 1997-11-24 2001-10-02 The Trustees Of Princeton University Method for deposition and patterning of organic thin film
US6049295A (en) * 1997-12-05 2000-04-11 Fujitsu Limited Method and system for avoiding a collision at an intersection and a recording medium storing programs performing such a method
US6370261B1 (en) * 1998-01-30 2002-04-09 Fuji Jukogyo Kabushiki Kaisha Vehicle surroundings monitoring apparatus
US6289278B1 (en) * 1998-02-27 2001-09-11 Hitachi, Ltd. Vehicle position information displaying apparatus and method
US6038559A (en) * 1998-03-16 2000-03-14 Navigation Technologies Corporation Segment aggregation in a geographic database and methods for use thereof in a navigation application
US6298303B1 (en) * 1998-03-25 2001-10-02 Navigation Technologies Corp. Method and system for route calculation in a navigation application
US6192314B1 (en) * 1998-03-25 2001-02-20 Navigation Technologies Corp. Method and system for route calculation in a navigation application
US6144335A (en) * 1998-04-14 2000-11-07 Trimble Navigation Limited Automated differential correction processing of field data in a global positional system
US6486856B1 (en) * 1998-04-15 2002-11-26 Daimlerchrysler Ag Apparatus for improved contrast in a motor vehicle heads-up display
US6184823B1 (en) * 1998-05-01 2001-02-06 Navigation Technologies Corp. Geographic database architecture for representation of named intersections and complex intersections and methods for formation thereof and use in a navigation application program
US6212474B1 (en) * 1998-11-19 2001-04-03 Navigation Technologies Corporation System and method for providing route guidance with a navigation application program
US6208934B1 (en) * 1999-01-19 2001-03-27 Navigation Technologies Corp. Method and system for providing walking instructions with route guidance in a navigation program
US6166698A (en) * 1999-02-16 2000-12-26 Gentex Corporation Rearview mirror with integrated microwave receiver
US6161071A (en) * 1999-03-12 2000-12-12 Navigation Technologies Corporation Method and system for an in-vehicle computing architecture
US6218934B1 (en) * 1999-07-21 2001-04-17 Daimlerchrysler Corporation Mini-trip computer for use in a rearview mirror assembly
US6122593A (en) * 1999-08-03 2000-09-19 Navigation Technologies Corporation Method and system for providing a preview of a route calculated with a navigation system
US6249742B1 (en) * 1999-08-03 2001-06-19 Navigation Technologies Corp. Method and system for providing a preview of a route calculated with a navigation system
US6438491B1 (en) * 1999-08-06 2002-08-20 Telanon, Inc. Methods and apparatus for stationary object detection
US6385539B1 (en) * 1999-08-13 2002-05-07 Daimlerchrysler Ag Method and system for autonomously developing or augmenting geographical databases by mining uncoordinated probe data
US6188957B1 (en) * 1999-10-04 2001-02-13 Navigation Technologies Corporation Method and system for providing bicycle information with a navigation system
US6587778B2 (en) * 1999-12-17 2003-07-01 Itt Manufacturing Enterprises, Inc. Generalized adaptive signal control method and system
US6314365B1 (en) * 2000-01-18 2001-11-06 Navigation Technologies Corp. Method and system of providing navigation services to cellular phone devices from a server
US20020036584A1 (en) * 2000-02-28 2002-03-28 Jocoy Edward H. System and method for avoiding accidents in intersections
US6690268B2 (en) * 2000-03-02 2004-02-10 Donnelly Corporation Video mirror systems incorporating an accessory module
US6278942B1 (en) * 2000-03-21 2001-08-21 Navigation Technologies Corp. Method and system for providing routing guidance
US6253151B1 (en) * 2000-06-23 2001-06-26 Navigation Technologies Corp. Navigation system with feature for reporting errors
US6314367B1 (en) * 2000-06-23 2001-11-06 Navigation Technologies Corporation Navigation system with feature for reporting errors
US20020184236A1 (en) * 2000-07-18 2002-12-05 Max Donath Real time high accuracy geospatial database for onboard intelligent vehicle applications
US20030023614A1 (en) * 2001-07-18 2003-01-30 Newstrom Bryan J. Populating geospatial database for onboard intelligent vehicle applications
US6526352B1 (en) * 2001-07-19 2003-02-25 Intelligent Technologies International, Inc. Method and arrangement for mapping a road
US20030128182A1 (en) * 2001-10-01 2003-07-10 Max Donath Virtual mirror

Cited By (146)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9046594B1 (en) * 1999-06-14 2015-06-02 Escort Inc. Radar detector with navigation function
US20110102232A1 (en) * 1999-06-14 2011-05-05 Escort Inc. Radar detector with navigation function
US8525723B2 (en) * 1999-06-14 2013-09-03 Escort Inc. Radar detector with navigation function
US6977630B1 (en) 2000-07-18 2005-12-20 University Of Minnesota Mobility assist device
US7227515B2 (en) * 2000-10-13 2007-06-05 L-3 Communications Corporation System and method for forming images for display in a vehicle
US20050206727A1 (en) * 2000-10-13 2005-09-22 L-3 Communications Corporation System and method for forming images for display in a vehicle
US7552008B2 (en) 2001-07-18 2009-06-23 Regents Of The University Of Minnesota Populating geospatial database for onboard intelligent vehicle applications
US7209051B2 (en) 2002-03-05 2007-04-24 University Of Minnesota Intersection assistance system and method
US20050174257A1 (en) * 2002-03-05 2005-08-11 The University Of Minnesota Intersection assistance system and method
US7098913B1 (en) * 2002-07-30 2006-08-29 Rockwell Collins, Inc. Method and system for providing depth cues by attenuating distant displayed terrain
US20050073136A1 (en) * 2002-10-15 2005-04-07 Volvo Technology Corporation Method and arrangement for interpreting a subject's head and eye activity
US7460940B2 (en) * 2002-10-15 2008-12-02 Volvo Technology Corporation Method and arrangement for interpreting a subject's head and eye activity
US20050012682A1 (en) * 2003-07-17 2005-01-20 Jenson Barton James Visual display system for displaying virtual images onto a field of vision
US7158095B2 (en) * 2003-07-17 2007-01-02 Big Buddy Performance, Inc. Visual display system for displaying virtual images onto a field of vision
US8791878B2 (en) 2003-08-22 2014-07-29 Semiconductor Energy Laboratory Co., Ltd. Light emitting device, driving support system, and helmet
US20050052348A1 (en) * 2003-08-22 2005-03-10 Shunpei Yamazaki Light emitting device, driving support system, and helmet
EP1515295A3 (en) * 2003-08-22 2008-09-10 Semiconductor Energy Laboratory Co., Ltd. Light emitting device, driving support system, and helmet
US7598927B2 (en) 2003-08-22 2009-10-06 Semiconductor Energy Laboratory Co., Ltd. Light-emitting device, driving support system, and helmet
US8456382B2 (en) 2003-08-22 2013-06-04 Semiconductor Energy Laboratory Co., Ltd. Light emitting device, driving support system, and helmet
US20050107952A1 (en) * 2003-09-26 2005-05-19 Mazda Motor Corporation On-vehicle information provision apparatus
US20050116911A1 (en) * 2003-10-07 2005-06-02 Tomohiro Mukai Information display
US7385599B2 (en) * 2003-10-07 2008-06-10 Seiko Epson Corporation Information display
US7656313B2 (en) 2003-11-30 2010-02-02 Volvo Technology Corp. Method and system for supporting path control
WO2005055189A1 (en) * 2003-12-01 2005-06-16 Volvo Technology Corporation Perceptual enhancement displays based on knowledge of head and/or eye and/or gaze position
US8497880B2 (en) * 2003-12-01 2013-07-30 Volvo Technology Corporation Method and system for presenting information
US20080079753A1 (en) * 2003-12-01 2008-04-03 Volvo Technology Corporation Method and system for presenting information
WO2005053991A1 (en) * 2003-12-01 2005-06-16 Volvo Technology Corporation Method and system for supporting path control
US20070139176A1 (en) * 2003-12-01 2007-06-21 Volvo Technology Corporation Method and system for supporting path control
US20050134479A1 (en) * 2003-12-17 2005-06-23 Kazuyoshi Isaji Vehicle display system
US20050219695A1 (en) * 2004-04-05 2005-10-06 Vesely Michael A Horizontal perspective display
US7796134B2 (en) 2004-06-01 2010-09-14 Infinite Z, Inc. Multi-plane horizontal perspective display
US20050275915A1 (en) * 2004-06-01 2005-12-15 Vesely Michael A Multi-plane horizontal perspective display
US20050281411A1 (en) * 2004-06-01 2005-12-22 Vesely Michael A Binaural horizontal perspective display
US20050264857A1 (en) * 2004-06-01 2005-12-01 Vesely Michael A Binaural horizontal perspective display
US20050264559A1 (en) * 2004-06-01 2005-12-01 Vesely Michael A Multi-plane horizontal perspective hands-on simulator
WO2005118998A1 (en) * 2004-06-01 2005-12-15 Vesely Michael A Horizontal perspective simulator
US20060010699A1 (en) * 2004-07-15 2006-01-19 C&N Inc. Mobile terminal apparatus
US7194816B2 (en) * 2004-07-15 2007-03-27 C&N Inc. Mobile terminal apparatus
US20060022811A1 (en) * 2004-07-28 2006-02-02 Karsten Haug Night vision device
US7482909B2 (en) * 2004-07-28 2009-01-27 Robert Bosch Gmbh Night vision device
US8195386B2 (en) * 2004-09-28 2012-06-05 National University Corporation Kumamoto University Movable-body navigation information display method and movable-body navigation information display unit
US20080195315A1 (en) * 2004-09-28 2008-08-14 National University Corporation Kumamoto University Movable-Body Navigation Information Display Method and Movable-Body Navigation Information Display Unit
US7420558B2 (en) * 2004-10-25 2008-09-02 Sony Corporation Information processing apparatus and method, program, and navigation apparatus
US20060087507A1 (en) * 2004-10-25 2006-04-27 Sony Corporation Information processing apparatus and method, program, and navigation apparatus
US20060126927A1 (en) * 2004-11-30 2006-06-15 Vesely Michael A Horizontal perspective representation
US7509213B2 (en) * 2005-02-04 2009-03-24 Samsung Electronics Co., Ltd. Apparatus and method for correcting location information of mobile body, and computer-readable media storing computer program for controlling the apparatus
US20060178815A1 (en) * 2005-02-04 2006-08-10 Samsung Electronics Co., Ltd. Apparatus and method for correcting location information of mobile body, and computer-readable media storing computer program for controlling the apparatus
US7050908B1 (en) * 2005-03-22 2006-05-23 Delphi Technologies, Inc. Lane marker projection method for a motor vehicle vision system
US20060250391A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Three dimensional horizontal perspective workstation
US9292962B2 (en) 2005-05-09 2016-03-22 Zspace, Inc. Modifying perspective of stereoscopic images based on changes in user viewpoint
US7907167B2 (en) 2005-05-09 2011-03-15 Infinite Z, Inc. Three dimensional horizontal perspective workstation
US20060252978A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Biofeedback eyewear system
US8717423B2 (en) 2005-05-09 2014-05-06 Zspace, Inc. Modifying perspective of stereoscopic images based on changes in user viewpoint
US20060252979A1 (en) * 2005-05-09 2006-11-09 Vesely Michael A Biofeedback eyewear system
US9684994B2 (en) 2005-05-09 2017-06-20 Zspace, Inc. Modifying perspective of stereoscopic images based on changes in user viewpoint
US20060269437A1 (en) * 2005-05-31 2006-11-30 Pandey Awadh B High temperature aluminum alloys
US7486175B2 (en) * 2005-06-15 2009-02-03 Denso Corporation Vehicle drive assist system
DE102006027679B4 (en) 2005-06-15 2018-05-30 Denso Corporation Driving assistance system for vehicles
US20070013495A1 (en) * 2005-06-15 2007-01-18 Denso Corporation Vehicle drive assist system
US20070032914A1 (en) * 2005-08-05 2007-02-08 Nissan Motor Co., Ltd. Vehicle driving assist system
US7904246B2 (en) * 2005-08-05 2011-03-08 Nissan Motor Co., Ltd. Vehicle driving assist system
US20070043466A1 (en) * 2005-08-18 2007-02-22 Vesely Michael A Stereoscopic display using polarized eyewear
US20070040905A1 (en) * 2005-08-18 2007-02-22 Vesely Michael A Stereoscopic display using polarized eyewear
US20070040705A1 (en) * 2005-08-19 2007-02-22 Denso Corporation Unsafe location warning system
DE102006041857B4 (en) * 2005-09-06 2017-10-05 GM Global Technology Operations LLC (n. d. Ges. d. Staates Delaware) Method and system for improving traffic safety
US7782184B2 (en) 2005-09-06 2010-08-24 Gm Global Technology Operations, Inc. Method and system for improving traffic safety
US20070102214A1 (en) * 2005-09-06 2007-05-10 Marten Wittorf Method and system for improving traffic safety
US20070165910A1 (en) * 2006-01-17 2007-07-19 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus, method, and program
US8175331B2 (en) * 2006-01-17 2012-05-08 Honda Motor Co., Ltd. Vehicle surroundings monitoring apparatus, method, and program
US20190265818A1 (en) * 2006-03-24 2019-08-29 Northwestern University Haptic device with indirect haptic feedback
US10620769B2 (en) * 2006-03-24 2020-04-14 Northwestern University Haptic device with indirect haptic feedback
US11016597B2 (en) * 2006-03-24 2021-05-25 Northwestern University Haptic device with indirect haptic feedback
US20070257836A1 (en) * 2006-05-05 2007-11-08 Clint Chaplin Site survey tracking
EP1857780A3 (en) * 2006-05-16 2012-11-14 Navteq North America, LLC Dual road geometry representation for position and curvature-heading
FR2902381A1 (en) * 2006-06-20 2007-12-21 Peugeot Citroen Automobiles Sa Motor vehicle driving assisting method, involves merging image captured by image formation device of night vision system and synthesis image, and displaying merged image on internal display of night vision system
EP1887541A3 (en) * 2006-08-04 2010-07-21 Audi Ag Motor vehicle with a lane detection system
EP1887541A2 (en) * 2006-08-04 2008-02-13 Audi Ag Motor vehicle with a lane detection system
US20130342666A1 (en) * 2006-08-15 2013-12-26 Koninklijke Philips N.V. Assistance system for visually handicapped persons
US9603769B2 (en) * 2006-08-15 2017-03-28 Koninklijke Philips N.V. Assistance system for visually handicapped persons
US8330591B2 (en) * 2006-08-16 2012-12-11 GM Global Technology Operations LLC Method and system for adjusting vehicular components based on sun position
US20080046151A1 (en) * 2006-08-16 2008-02-21 Gm Global Technology Operations, Inc. Method and System for Adjusting Vehicular Components Based on Sun Position
EP1894779A1 (en) * 2006-09-01 2008-03-05 Harman Becker Automotive Systems GmbH Method of operating a night-view system in a vehicle and corresponding night-view system
US20090265061A1 (en) * 2006-11-10 2009-10-22 Aisin Seiki Kabushiki Kaisha Driving assistance device, driving assistance method, and program
WO2008108853A1 (en) * 2007-03-02 2008-09-12 Nanolumens Acquisition, Inc. Vehicle video display for exterior view
US20110050548A1 (en) * 2008-03-04 2011-03-03 Elbit Systems Electro Optics Elop Ltd. Head up display utilizing an lcd and a diffuser
US8786519B2 (en) * 2008-03-04 2014-07-22 Elbit Systems Ltd. Head up display utilizing an LCD and a diffuser
EP2412557A1 (en) * 2009-03-23 2012-02-01 Kabushiki Kaisha Toshiba Vehicular display system, method of displaying and vehicle
EP2412557A4 (en) * 2009-03-23 2012-08-22 Toshiba Kk Vehicular display system, method of displaying and vehicle
US20110153266A1 (en) * 2009-12-23 2011-06-23 Regents Of The University Of Minnesota Augmented vehicle location system
US9824485B2 (en) 2010-01-29 2017-11-21 Zspace, Inc. Presenting a view within a three dimensional scene
US8717360B2 (en) 2010-01-29 2014-05-06 Zspace, Inc. Presenting a view within a three dimensional scene
US9202306B2 (en) 2010-01-29 2015-12-01 Zspace, Inc. Presenting a view within a three dimensional scene
US20110301813A1 (en) * 2010-06-07 2011-12-08 Denso International America, Inc. Customizable virtual lane mark display
US20120044090A1 (en) * 2010-08-18 2012-02-23 GM Global Technology Operations LLC Motor vehicle with digital projectors
CN102842138A (en) * 2011-04-08 2012-12-26 F·波尔希名誉工学博士公司 Method for operating image-based driver assistance system in motorcycle, involves utilizing connection plane between central vertical axis of pickup unit and alignment axis as x-z-plane to determine data of vehicle
US9134556B2 (en) 2011-05-18 2015-09-15 Zspace, Inc. Liquid crystal variable drive voltage
US9958712B2 (en) 2011-05-18 2018-05-01 Zspace, Inc. Liquid crystal variable drive voltage
US8786529B1 (en) 2011-05-18 2014-07-22 Zspace, Inc. Liquid crystal variable drive voltage
US20120310531A1 (en) * 2011-05-31 2012-12-06 Broadcom Corporation Navigation system employing augmented labeling and/or indicia
US9517776B2 (en) * 2011-12-29 2016-12-13 Intel Corporation Systems, methods, and apparatus for controlling devices based on a detected gaze
US20140348389A1 (en) * 2011-12-29 2014-11-27 David L. Graumann Systems, methods, and apparatus for controlling devices based on a detected gaze
WO2013113500A1 (en) * 2012-02-02 2013-08-08 Audi Ag Driver assistance system and method for virtual representation of a road layout under obscured visibility and/or poor visibility conditions
US20130294650A1 (en) * 2012-02-16 2013-11-07 Panasonic Corporation Image generation device
US9321460B2 (en) * 2012-03-28 2016-04-26 Honda Motor Co., Ltd. Railroad crossing barrier estimating apparatus and vehicle
US20130261950A1 (en) * 2012-03-28 2013-10-03 Honda Motor Co., Ltd. Railroad crossing barrier estimating apparatus and vehicle
WO2013189927A1 (en) * 2012-06-20 2013-12-27 Bayerische Motoren Werke Aktiengesellschaft Method and device for operating a head-up display for a vehicle
US9791289B2 (en) * 2012-06-20 2017-10-17 Bayerische Motoren Werke Aktiengesellschaft Method and device for operating a head-up display for a vehicle
US20150100234A1 (en) * 2012-06-20 2015-04-09 Bayerische Motoren Werke Aktiengesellschaft Method and Device for Operating a Head-Up Display for a Vehicle
US20140005886A1 (en) * 2012-06-29 2014-01-02 Microsoft Corporation Controlling automotive functionality using internal- and external-facing sensors
US10009580B2 (en) * 2012-08-21 2018-06-26 Robert Bosch Gmbh Method for supplementing a piece of object information assigned to an object and method for selecting objects in surroundings of a vehicle
US20150229885A1 (en) * 2012-08-21 2015-08-13 Robert Bosch Gmbh Method for supplementing a piece of object information assigned to an object and method for selecting objects in surroundings of a vehicle
US20140267415A1 (en) * 2013-03-12 2014-09-18 Xueming Tang Road marking illumination system and method
US9064420B2 (en) 2013-03-14 2015-06-23 Honda Motor Co., Ltd. Augmented reality heads up display (HUD) for yield to pedestrian safety cues
US20140310610A1 (en) * 2013-04-15 2014-10-16 Flextronics Ap, Llc Vehicle occupant impairment assisted vehicle
US10089863B2 (en) * 2013-08-30 2018-10-02 Komatsu Ltd. Management system and management method for mining machine
US20160247395A1 (en) * 2013-08-30 2016-08-25 Komatsu Ltd. Management system and management method for mining machine
US20150071496A1 (en) * 2013-09-06 2015-03-12 Robert Bosch Gmbh Method and control and recording device for the plausibility checking for the wrong-way travel of a motor vehicle
US10002298B2 (en) * 2013-09-06 2018-06-19 Robert Bosch Gmbh Method and control and recording device for the plausibility checking for the wrong-way travel of a motor vehicle
US9255811B2 (en) * 2014-01-29 2016-02-09 Brian R. Edelen Visual guidance system
US20150211876A1 (en) * 2014-01-29 2015-07-30 Brian R. Edelen Visual guidance system
US10408634B2 (en) 2014-03-25 2019-09-10 Jaguar Land Rover Limited Navigation system
AU2015238339B2 (en) * 2014-03-25 2018-02-22 Jaguar Land Rover Limited Navigation system
WO2015144751A1 (en) * 2014-03-25 2015-10-01 Jaguar Land Rover Limited Navigation system
US10106115B2 (en) * 2015-08-05 2018-10-23 Denso Corporation Position detection apparatus, position detection method, and position detection system
CN107924630A (en) * 2015-08-05 2018-04-17 株式会社电装 Position detecting device, method for detecting position and position detecting system
USD791158S1 (en) * 2015-10-08 2017-07-04 Mitsubishi Electric Corporation Display screen with graphical user interface
US20170163863A1 (en) * 2015-12-03 2017-06-08 Fico Mirrors, S.A. Rear vision system for a motor vehicle
WO2018007003A1 (en) * 2016-07-03 2018-01-11 DDG Benelux S.A. Non-rail bound driving device provided with a virtual reality unit
US20180031849A1 (en) * 2016-07-29 2018-02-01 Panasonic Automotive Systems Company Of America, Division Of Panasonic Corporation Of North America Augmented reality head-up display road correction
US10696223B2 (en) * 2016-09-29 2020-06-30 Valeo Vision Method for projecting an image by a projection system of a motor vehicle, and associated projection system
CN107878300A (en) * 2016-09-29 2018-04-06 法雷奥照明公司 Method for projecting an image by a projection system of a motor vehicle, and associated projection system
US20180086262A1 (en) * 2016-09-29 2018-03-29 Valeo Vision Method for projecting an image by a projection system of a motor vehicle, and associated projection system
CN107878300B (en) * 2016-09-29 2023-03-07 法雷奥照明公司 Method for projecting an image by means of a projection system of a motor vehicle and projection system
CN107878301B (en) * 2016-09-29 2022-11-25 法雷奥照明公司 Method for projecting an image by means of a projection system of a motor vehicle and projection system
CN107878301A (en) * 2016-09-29 2018-04-06 法雷奥照明公司 Method for projecting an image by a projection system of a motor vehicle, and associated projection system
US20190071094A1 (en) * 2017-09-01 2019-03-07 Honda Motor Co., Ltd. Vehicle control system, vehicle control method, and storage medium
US11009884B2 (en) * 2017-09-29 2021-05-18 Direct Current Capital LLC Method for calculating nominal vehicle paths for lanes within a geographic region
US20200348140A1 (en) * 2017-12-27 2020-11-05 Bayerische Motoren Werke Aktiengesellschaft Deformation Correction of a Digital Map for a Vehicle
US10870351B2 (en) 2018-12-12 2020-12-22 Here Global B.V. Method and apparatus for augmented reality based on localization and environmental conditions
EP3667239A1 (en) * 2018-12-12 2020-06-17 HERE Global B.V. Method and apparatus for augmented reality based on localization and environmental conditions
US10679530B1 (en) * 2019-02-11 2020-06-09 Toyota Motor Engineering & Manufacturing North America, Inc. Systems and methods for mobile projection in foggy conditions
US11529872B2 (en) * 2019-03-07 2022-12-20 Deutsche Post Ag Vehicle with display device
EP3845861A1 (en) * 2020-01-02 2021-07-07 Samsung Electronics Co., Ltd. Method and device for displaying 3d augmented reality navigation information
US11709069B2 (en) 2020-01-02 2023-07-25 Samsung Electronics Co., Ltd. Method and device for displaying 3D augmented reality navigation information
US20210403026A1 (en) * 2020-06-29 2021-12-30 Beijing Baidu Netcom Science And Technology Co., Ltd. Method and apparatus for 3d modeling
US11697428B2 (en) * 2020-06-29 2023-07-11 Apollo Intelligent Driving Technology (Beijing) Co., Ltd. Method and apparatus for 3D modeling

Also Published As

Publication number Publication date
US6977630B1 (en) 2005-12-20

Similar Documents

Publication Publication Date Title
US6977630B1 (en) Mobility assist device
US20050149251A1 (en) Real time high accuracy geospatial database for onboard intelligent vehicle applications
JP4383862B2 (en) Display method and display device for driving instruction in car navigation system
US7375728B2 (en) Virtual mirror
US11829138B1 (en) Change detection using curve alignment
US8195386B2 (en) Movable-body navigation information display method and movable-body navigation information display unit
ES2330351T3 (en) NAVIGATION DEVICE WITH CAMERA INFORMATION.
US9482540B2 (en) Navigation display method and system
US20210199437A1 (en) Vehicular component control using maps
US7612797B2 (en) Vehicle display apparatus for displaying information of a forward view of the vehicle
US7552008B2 (en) Populating geospatial database for onboard intelligent vehicle applications
US5936553A (en) Navigation device and method for displaying navigation information in a visual perspective view
EP3667239A1 (en) Method and apparatus for augmented reality based on localization and environmental conditions
US6591190B2 (en) Navigation system
WO2008038369A1 (en) Drive controller, drive control method, drive control program and recording medium
US11590902B2 (en) Vehicle display system for displaying surrounding event information
EP2676845A1 (en) A method and system for dynamically adjusting a vehicle mirror
EP2141454A2 (en) Method for providing search area coverage information
Carlson et al. Evaluation of traffic control devices, year 2.
Hu et al. Real-time data fusion on stabilizing camera pose estimation output for vision-based road navigation
Sergi et al. Bus rapid transit technologies: A virtual mirror for eliminating vehicle blind zones
JP2006146362A (en) Display device for vehicle
Ravani. A rural field test of the RoadView system
Sergi et al. Bus Rapid Transit Technologies: A Virtual Mirror for Eliminating Vehicle Blind Zones: Volume 2

Legal Events

Date Code Title Description
AS Assignment

Owner name: UNIVERSITY OF MINNESOTA, MINNESOTA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DONATH, MAX;SHANKWITZ, CRAIG R.;LIM, HEON MIN;AND OTHERS;REEL/FRAME:015255/0838

Effective date: 20001030

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION