EP1689607A1 - Method and system for supporting path control - Google Patents

Method and system for supporting path control

Info

Publication number
EP1689607A1
Authority
EP
European Patent Office
Prior art keywords
path
vehicle
driver
actual
future
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP04803401A
Other languages
German (de)
French (fr)
Inventor
Trent Victor
Johan Jarlengrip
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Volvo Technology AB
Original Assignee
Volvo Technology AB
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from PCT/EP2003/013479 (WO2005055189A1)
Application filed by Volvo Technology AB
Priority to EP04803401A
Publication of EP1689607A1
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 Route searching; Route guidance
    • G01C21/36 Input/output arrangements for on-board computers
    • G01C21/3626 Details of the output of route guidance instructions
    • G01C21/3658 Lane guidance
    • G01C21/365 Guidance using head up displays or projectors, e.g. virtual vehicles or arrows projected on the windscreen or on the road itself
    • G01C21/3697 Output of additional, non-guidance related information, e.g. low fuel level
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • B60K35/10 Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
    • B60K35/20 Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor
    • B60K35/21 Output arrangements using visual output, e.g. blinking lights or matrix displays
    • B60K35/23 Head-up displays [HUD]
    • B60K35/28 Output arrangements characterised by the type of the output information, e.g. video entertainment or vehicle dynamics information; characterised by the purpose of the output information, e.g. for attracting the attention of the driver
    • B60K35/60 Instruments characterised by their location or relative disposition in or on vehicles
    • B60K2360/00 Indexing scheme associated with groups B60K35/00 or B60K37/00 relating to details of instruments or dashboards
    • B60K2360/149 Instrument input by detecting viewing direction not otherwise provided for
    • B60K2360/16 Type of output information
    • B60K2360/166 Navigation
    • B60K2360/20 Optical features of instruments
    • B60K2360/21 Optical features of instruments using cameras
    • B60K2360/33 Illumination features
    • B60K2360/332 Light emitting diodes
    • B60K2360/334 Projection means
    • B60K2360/77 Instrument locations other than the dashboard
    • B60K2360/771 Instrument locations other than the dashboard on the ceiling
    • B60K2360/785 Instrument locations other than the dashboard on or in relation to the windshield or windows
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking

Definitions

  • the invention relates to a method and a system for supporting path control especially of a vehicle on a road or in an off-road environment, or of a ship or an airplane.
  • a general object of this invention is to provide a method and system by which the above mentioned risks can be further reduced and the safety especially of driving a vehicle can be further increased.
  • a considerable advantage of the solution is that the method can be used for off-road applications as well if instead of a road, a course is predetermined by e.g. a navigation system.
  • Fig. 1 a schematic view of a first embodiment of a system according to the invention
  • Fig. 2 a schematic view of a second embodiment of a system according to the invention
  • Fig. 3 a schematic view of a third embodiment of a system according to the invention.
  • Fig. 4 a block diagram of components of a first arrangement of the system for presenting the future path of a vehicle according to the invention
  • Fig. 5 a schematic diagram of a desired future path in comparison to an actual future path
  • Fig. 6 an exemplary representation of a future path trajectory display integrated into a head mounted display
  • Fig. 7 a first embodiment of a future path trajectory display
  • Fig. 8 a second embodiment of a future path trajectory display
  • Fig. 9 a third embodiment of a future path trajectory display
  • Fig. 10 a block diagram of components of a second arrangement of the system for presenting the present path of a vehicle according to the invention
  • Fig. 11 a first embodiment of a display for controlling the present path of a vehicle
  • Fig. 12, 13 a second embodiment of a display for controlling the present path of a vehicle
  • Fig. 14 a third embodiment of a display for controlling the present path of a vehicle
  • Fig. 15 a fourth embodiment of a display for controlling the present path of a vehicle
  • Fig. 16 a fifth embodiment of a display for controlling the present path of a vehicle
  • Fig. 17 a sixth embodiment of a display for controlling the present path of a vehicle
  • Fig. 18 an exemplary representation of an eyeglass mounted display for controlling the present path of a vehicle
  • Fig. 19 a schematic representation of a flow of optical signals for controlling the present path of a vehicle.
  • Figure 1 shows a side view into the cabin C of a vehicle with a driver D who is driving the vehicle and shows a first embodiment of a system for supporting path control according to the invention.
  • a main component of this system is a control device 10 which is connected with a display device 20 e.g. in the form of a laser projector, and a visual behaviour sensor 30 for detecting head and/or eye position of the driver D.
  • the control device 10 is provided for receiving head and/or eye position data from the visual behaviour sensor 30, and for receiving vehicle movement data, generated by at least one sensor (not shown) for detecting the velocity and/or a yaw rate and/or a wheel angle etc. of the vehicle.
  • the main components especially of the control device 10 shall be described with reference to Figures 4 and 10.
  • the control device is provided for processing these data and for controlling the display device 20 for displaying, to the driver D, an estimated actual future path of the vehicle and/or an estimated present deviation of the vehicle from a desired present path on the basis of a detected actual present path.
  • the laser projector 20 is provided and installed to project such an image for example onto a certain location of the windscreen of the vehicle.
  • the term "display device" is used in this disclosure to refer to any source of visual information presentation to the driver.
  • displays include conventional computer displays, e.g. Liquid Crystal Displays (LCD) or similar, used to present GPS-based navigation and map information or other electronic devices, displays in the instrument panel, head-up displays, light emitting diodes (LEDs), and other projection displays.
  • Helmet-mounted-, visor-mounted-, eyeglass-mounted displays can also be used.
  • An example of a projection display is a commercially available diode laser (see e.g. www.lasershow.se) which is capable of producing color picture stimuli.
  • a picture stimulus is made up of a single laser beam which is moved around so quickly that the impression of an image is generated.
  • the beam is controlled by two small electromagnetic motors (x-, and y-axis) with a small mirror on the motor axis.
  • the use of a number of different lasers is advantageous, including a red and/or blue and/or green laser.
  • a simple, inexpensive laser such as those commonly used as pointing devices for presentations in an auditorium; an automotive grade laser could be used as well.
  • this system comprises a visual behavior sensor 30 for eyetracking which is for example a camera mounted on the dashboard or another sensor which can be head mounted for detecting the gaze direction or gaze position of the driver D and which is connected to the control device 10.
  • Figure 2 again shows a side view into the cabin C of a vehicle with a driver D who is driving the vehicle.
  • Figure 2 shows a second embodiment of a system for supporting path control according to the invention in which, instead of the laser projector, two arrays of light emitting diodes 21a, 21b are provided, which are explained in more detail with reference to Figures 12 and 13 below.
  • Figure 3 shows a third embodiment of the invention for supporting path control in which the image is presented to the driver D via a head mounted visor 23 which as well comprises the visual behavior sensor 31 and the control unit 11.
  • Figure 4 shows a block diagram of components of a first arrangement for presenting the future path of the vehicle according to the invention.
  • a first component 40 is provided for detecting head and/or eye position data of the driver.
  • a second component 41 is provided for detecting vehicle data like for example a yaw rate and/or a wheel angle and/or a velocity of the vehicle.
  • a third component 42 is provided for calculating a path prediction on the basis of the detected vehicle data.
  • a fourth component 43 is provided for calculation of display parameters to achieve and obtain a display presentation 44 as disclosed in one of Figures 6 to 10.
  • Figures 5A, B show a schematic diagram of a desired future path in comparison to an actual future path.
  • the arrows 1 indicate the desired future path and the arrows 2 indicate the actual future path.
  • Ef is the error between the desired and the actual future path at the preview distance T.
  • Ep is the error between the current path position and a desired path position.
  • the solid lines represent the borders of the path.
  • Figure 5A shows an actual future path 2 with a correct current (present) path position and an incorrect future path trajectory at the far path point.
  • Figure 5B shows an actual future path 2 with both an incorrect current (present) path position and an incorrect future path trajectory at the far path point.
  • Controlling the future path trajectory involves comparing where the vehicle is going (arrow 2 in Figure 5A) with where it should be going (arrow 1).
  • the driver makes steering corrections based on the discrepancy (Ef) between the actual future path of the vehicle (arrow 2) and the desired future path (arrow 1) at a preview distance T.
  • Drivers evaluate future path error Ef by fixating a point on the future path about 4 degrees down from true horizon and at a preview distance about one second into the future, called the far-path point. Eye-fixations become increasingly concentrated to this future path region as demands on the driver increase. Gaze concentration on the future path intensifies with traffic environment workload, secondary task workload, and driver state factors such as fatigue and impairment.
  • Drivers prioritize the visual guidance of their path-control task over recognition and planning tasks.
  • Figures 6A, B show an example of a future path trajectory display integrated into a head mounted display.
  • Figure 6A shows a presentation integrated into the eyeglasses of the head mounted display wherein Figure 6B shows the driver's view of the same information.
  • the gaze point G of the driver is indicated by a cross and the aiming of the vehicle is indicated by a circle.
  • Figure 7 shows a first embodiment of a future path trajectory display as seen by the driver when looking at the road through the windscreen.
  • the indicated lines L are the border lines of a road wherein the markings M are displayed to the driver.
  • Figure 7A shows an on-path situation when driving on a straight road wherein Figure 7B shows an on-path situation when driving into a left turn.
  • Figure 7C shows an off-path situation when driving along a straight road wherein Figure 7D shows an off-path situation when driving into a left turn.
  • Figures 8 and 9 show a second and a third embodiment, respectively, of a future path trajectory display wherein again the border lines of the road L and the markings M which are presented to the driver are indicated.
  • These Figures show the same on-path and off-path driving situations on a straight road and in a left turn, respectively, as in Figure 7; however, they are provided for clarifying that different kinds of markings M can be used which are displayed to the driver.
  • Displays which present the future paths of moving objects such as vehicles, aircraft (US 5289185 A), ships, and robots are known (see EP 01065642A2, Endsley et al., 1999). These displays are variously called Quickening displays, Predictive displays, and Preview displays (see Lion, 1993; Mathan et al. 1996; and http://www.tunnel-in-the-sky.tudelft.nl/pred.html). Quickening a display means adding an indicator which extrapolates the current state of the moving object. The most notable use of a quickened display is in the flight director of many modern commercial aircraft, which tells the pilot where to head to stay on the flight plan. However, current future path displays do not relate the presentation to eye or head position, or present only the future path at the far path point.
  • the system provides predictive information to the driver about the vehicle's actual future path so that the driver can directly see the difference between where the vehicle is heading and where the driver actually wants to go, i.e. to make the error term more visible.
  • Figures 6 to 9 show examples of how the actual future path can be presented to the driver. These displays ideally require the system to have information of 1) head position and/or eye position (from, for example, US5802479, US5844486, or Seeing Machines FaceLAB at www.seeingmachines.com), 2) a path prediction estimate (calculated, for example, as in US 06466863, US 06542111, or US 06675094), and 3) the means with which to present information. See Figures 4 and 10.
  • Various set-ups can be used, such as those presented in Figures 1 to 3.
  • the fourth component 43 in Figure 4 represents the calculations of display parameters that are needed to present the future path indication as shown in Figures 6 to 9.
  • the actual future path of the vehicle is calculated with a path prediction algorithm (third component 42), and a future path indication, as shown in Figures 6 to 9, is displayed.
  • Knowledge of head and/or eye position data and position of the display surfaces enable geometric calculations to be made wherein the displayed information is correctly positioned relative to the outer environment as is shown in Figures 6 to 9.
  • the distance of the future path point at which the information is presented (distance T in Figure 5) can be set if, for example, speed information from vehicle sensors is available.
  • Figure 10 shows a block diagram of components of the second arrangement of the system for presenting the present path of a vehicle according to the invention.
  • a first component 50 is again provided for detecting head position data and/or eye position data of the driver.
  • a second component 51 is a lanetracker for sensing lane-tracking data, and a third component 52 is provided for detecting the speed of the vehicle.
  • geometric calculations are performed by means of a fourth component 53 to achieve and obtain a display presentation 54 as shown in Figures 11 to 17.
  • Figure 11 shows a first embodiment of a display in the form of a line of light emitting elements 60 which are activated and/or deactivated so that a lane-keeping display is achieved.
  • the vehicle is centered in the lane.
  • the vehicle is on its way out of its lane towards the right.
  • the grey striped LEDs 61 (e.g. green LEDs) indicate the goal state
  • the black LEDs 62 (e.g. red LEDs) represent the amount of error from the goal state
  • the goal state LEDs 61 line up with the lane markings L in Figure 11A and the last in the line of error LEDs 62 also match up with lane markings L in Figure 11B.
  • Figures 12 and 13 show a second embodiment of a display in the form of a pair of matrices 21a, 21b of LED elements (see Figure 2) which are installed at the dashboard of the vehicle.
  • the matrix pair A shows how the display looks when the car is centered in lane.
  • the pairs B and C show steps with an increasing number of black LEDs (e.g. red LEDs) as the vehicle progresses out of the lane towards the left.
  • the grey markings (e.g. green LEDs) represent the goal state.
  • the matrix pair A again shows how the display looks when the car is centered in lane.
  • the pairs B and C show steps with an increasing number of black LEDs (e.g. red LEDs) as the vehicle progresses out of the lane towards the right.
  • the grey markings (e.g. green LEDs) represent the goal state.
  • Figure 14 shows a third embodiment of a display for imaging of lane-keeping information with solid markings.
  • Figure 14A shows markings that are presented when the vehicle is centered in lane. As the vehicle starts moving out of lane towards the right, in Figure 14B, the markings become larger to the right and disappear to the left. The vehicle is moving out of the lane to the left in Figure 14C. It is noted that the presentation is extended to match the lane markings L on the road by projecting information on the windshield as well as on the interior surfaces. Alternatively these same markings can be presented on a head mounted display device. Another alternative is to present the markings only when the vehicle starts moving out of the lane, in which case there would be no markings in Figure 14A.
  • Figure 15 shows a fourth embodiment of a display for imaging of lane-keeping information accompanied by sound and with a goal state.
  • Figure 15A shows goal-state markings GM that are presented when the vehicle is centered in lane. As the vehicle starts moving out of lane towards the right (Figure 15B) the goal state markings GM remain and additional error markings EM become larger to the right. The goal markings GM also disappear to the left. The vehicle is moving out of the lane to the left in Figure 15C. A sound increasing in intensity also accompanies the markings when the vehicle is leaving its lane. In this embodiment the lane markings L are not projected onto the windshield. Alternatively these same markings can be presented on a head mounted display device.
  • Figure 16 shows a fifth embodiment of a display for imaging of lane-keeping information with moving markings.
  • Figure 16A shows markings M that are presented when the vehicle is centered in lane.
  • the markings M are dashed and are presented with the dashes moving toward the driver in Figures 16A, B and C. This movement may increase peripheral vision sensitivity and comprehension of meaning.
  • As the vehicle starts moving out of lane towards the right (Figure 16B) the markings M become larger to the right and disappear to the left. The vehicle is moving out of the lane to the left in Figure 16C. It is noted that the presentation is extended somewhat onto the windshield as well as on the interior surfaces. Alternatively these same markings M can be presented on a head mounted display device.
  • Figure 17 shows a sixth embodiment of a display for imaging of lane-keeping information with moving markings and a goal state.
  • Figure 17A shows the goal-state markings GM that are presented when the vehicle is centered in lane. The markings are dashed and are presented with the dashes moving toward the driver in Figures 17A, B and C. This movement may increase peripheral vision sensitivity and comprehension of meaning.
  • As the vehicle starts moving out of lane towards the right (Figure 17B) the goal state markings GM remain and additional error markings EM become larger to the right.
  • the goal markings GM also disappear to the left.
  • the vehicle is moving out of the lane to the left in Figure 17C. It is noted that the presentation is extended somewhat onto the windshield as well as on the interior surfaces. Alternatively these same markings can be presented on a head mounted display device.
  • Figure 18 shows an example of an eyeglass mounted display.
  • Figure 18A shows what is presented on the glasses, indicating the gaze point G.
  • Figure 18B shows the driver's view through the glasses with the markings M, GM overlaid on the view.
  • Figure 19A shows a synthetic flow of optical signals in a straight environment.
  • the natural optic flow is continued into the vehicle by projecting optical moving dots that move substantially at the same speed, curvature, and expansion as the natural flow.
  • Figure 19B shows a synthetic flow of optical signals in a curved environment.
  • the natural optic flow is continued into the vehicle by projecting optical moving dots that move substantially at the same speed, curvature, and expansion as the natural flow.
  • control of the present-path position is achieved mainly by peripheral vision.
  • Peripheral vision is the part of the visual field that is greater than about 10 degrees visual angle from the gaze point (fovea). Peripheral vision is especially sensitive to movement, spatial orientation, and lighting changes, and remains sensitive in low lighting.
  • While driving, drivers rarely gaze directly at the sides of their own vehicle or at lane markings near the vehicle. Rather, information regarding the vehicle's position in lane is extracted with peripheral vision.
  • the driver compares present position in path with desired present position in path and steers the vehicle to correct this error. The driver most often compares present position relative to lane markings, but can also regulate position relative to objects close to the vehicle.
  • The information specifying present-path position is not always entirely accessible to the visual system. For example, when drivers operate information systems, such as a radio, peripheral information is blocked by the interior of the vehicle. If present-path error can be made more easily recognizable, then steering corrections are improved and unintentional lane exits can potentially be eliminated. Lane departures and run-off-road incidents represent a large portion of accidents. Lane-keeping difficulties often are the consequence of distraction caused by use of in-vehicle devices (e.g. a radio).
  • the display of lane-keeping information is simple enough to enable the driver to recognize information with peripheral vision only, when gaze is not directed at or close to the display, without having to move the eyes off the road.
  • Information is presented in such a way that eye movements towards the display of information and subsequent eye-fixations upon the information are not necessary. The information has to be presented simply enough and large enough to enable information extraction with peripheral vision.
  • Lane-trackers are commercially available products (see US 06665603 or US 06792345). Devices which provide information on head position and/or eye position (for example US 5802479, US 5844486, or Seeing Machines FaceLAB at www.seeingmachines.com) are known systems. Information about inclines could be added by extracting this information from navigation systems using Global Positioning Systems and digital maps containing incline information.
  • An example of a display device for presenting information is again a laser projection display (see Figure 1).
  • Other examples of devices for presenting information are conventional computer displays, e.g. Liquid Crystal Displays (LCD) or similar, Head Up Displays, light emitting diodes (LEDs), helmet-mounted displays, visor-mounted displays, and eyeglass-mounted displays.
  • the information can be designed to work in a number of ways to support lane-keeping as e.g. indicated in Figures 11 to 18.
  • Figures 11, 12, and 13 show how deviation in lane is represented by an increase in the number of LEDs being displayed.
  • the amount and placement of LEDs being displayed corresponds to the amount of lane deviation registered by the lane-tracking device.
  • the goal-state markings, represented by either white or green colors, are calculated from knowledge of head position and/or eye position, knowledge of the position of the vehicle in lane, and the width of the vehicle.
  • Goal state markings are presented in the embodiments shown in Figures 11 to 13, 15, 17, and 18. However, they could be left out of the presentation leaving only the error to be presented.
  • the system can turn on lane-keeping error presentation only when the vehicle is about to leave the lane, or only when the driver is looking inside the vehicle, or only when the driver is performing a secondary task, or only in certain driving situations (for example, only on motorways). The driver should be able to turn the system off and on as he/she pleases.
  • the display of lane-keeping information can increase the values of a number of perceptual characteristics, such as lighting intensity, lighting density, pattern type, sound, vibrations, and movement.
  • both the number of LEDs shown in the embodiments of Figures 12 and 13 and the size of the markings as shown in the embodiments of Figures 14 to 18, as well as their intensity and/or color and/or sound, can increase as the deviation from the goal state increases.
  • the presentation of visual information can be used together with sound as shown in the embodiment of Figure 15.
  • the embodiments of Figures 14 to 17 show different versions of providing information.
  • the embodiments of Figures 16 to 18 show moving indicators added to the markings.
  • Another alternative to support the detection of current path position is to add synthetic optic flow to the interior of the vehicle.
  • the natural optic flow created by motion through an environment, is continued into the vehicle by projecting optical moving dots that move at substantially the same speed, curvature, and expansion as the natural flow.
  • a laser projector can also be used to present to the driver a continuation of the flow of optical dots inside the vehicle (see Figure 19).
  • the synthetic optic-flow projected onto the interior acts as extra spatial orientation information and enhances sensitivity to current path position error (Ep in Figure 5B), or lateral displacement. This increased access to optic flow information inside the vehicle is especially useful when the driver's eyes are diverted from the road. For example, when the driver looks at the gear shift, very little of the outside environment is available on the driver's retina.
  • the driver can easily detect lateral displacements in the optic flow.
  • drivers are able to maintain a more stable course, not weaving in lane as is normally the case when eyes are removed from the road to perform in-vehicle tasks.
  • a random selection of dots moves toward the driver D in a manner that mimics a continuation of the outside optic array.
  • the current optic flow can be estimated from vehicle information such as that used for path prediction (as described above) or it can be estimated by image processing software using video images from a forward looking camera.
  • the presentation of synthetic optic-flow could also be used in combination with the displayed information shown in the embodiments of Figures 7 to 18.
  • the laser projector (or other displays) can also be used to provide a stimulus which induces a corrective lane-keeping action. This is done by exaggerating the synthetic optic-flow to simulate more curvature than is actually the case. For example, if the curved synthetic flow lines in Figure 19B are given more curvature, then the driver is given the impression that the vehicle is turning more to the left than it actually is. The impression of turning more creates a compensatory steering reaction whereby the driver turns a little to the right. Exaggeration of synthetic optic flow works equally well in straight environments. A system with exaggerated synthetic flow induces changes to the driver's steering patterns wherein the driver compensates for lane deviations without being aware of it. Thus the inattentive or distracted driver is able to achieve a better lane-keeping performance. The components are the same as those outlined in Figure 10. The exaggeration of synthetic flow can be incrementally increased as the vehicle moves out of lane.
  • the invention can be used as well if the goal state is deter- mined e.g. by a navigation system like GPS.
  • the lanetracker component 51 shown in Figure 10 is replaced by a pre-programmed evaluation unit for evaluating a deviation from the path which is determined by the navigation system.
  • control of the vehicle's path is a combination of steering corrections derived from information coming from the driver's assessment of future path error and present path error.
  • This invention provides information to the driver to improve both of these tasks in combination, or separately, which can preferably be chosen by the driver.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Instrument Panels (AREA)
  • Traffic Control Systems (AREA)

Abstract

A method and a system are disclosed for supporting path control especially of a vehicle on a road or in an off-road environment, or of a ship or an airplane. The supporting of path control is especially provided by conducting at least one of the following steps (a) and (b): (a) estimating an actual future path of the vehicle on the basis of vehicle movement data and optically and/or acoustically and/or tactilely indicating the estimated actual future path to the driver, (b) detecting the actual present path of the vehicle, estimating a present deviation of the detected actual present path from a desired present path and optically and/or acoustically and/or tactilely indicating the estimated present deviation to the driver.

Description

Method and system for supporting path control
The invention relates to a method and a system for supporting path control especially of a vehicle on a road or in an off-road environment, or of a ship or an airplane.
It is generally known that for safely driving a vehicle the driver has to look predominantly onto the road for observing the traffic and avoiding accidents. However, especially drivers in current vehicles are often required to look away from the road and into the interior of the vehicle. For example, the driver frequently needs to directly fixate (look straight at) the speedometer, the radio or navigation displays, and he must be able to read and understand the information presented there and to operate these and other devices, in addition to driving the vehicle and monitoring the traffic.
Any glances away from the road, for example into the interior of the vehicle, can potentially cause an unsafe driving situation because the driver's ability to detect changes in the on-road environment is reduced. Off-road glances lead to undesirable safety consequences such as increased variability of lane-keeping performance, lane exceedances, increased brake reaction times, missed events and more.
A general object of this invention is to provide a method and system by which the above mentioned risks can be further reduced and the safety especially of driving a vehicle can be further increased.
Especially, it is an object of the invention to provide a method and system for supporting path control especially of a vehicle on a road or in an off-road environment.
These objects are solved by a method according to claim 1 and a system according to claim 13.
A considerable advantage of the solution is that the method can be used for off-road applications as well if instead of a road, a course is predetermined by e.g. a navigation system.
These solutions are based on the surprising finding that the safety of driving a vehicle can substantially be increased if a method for supporting path control is performed with at least one of the steps mentioned in claim 1.
The subclaims disclose advantageous embodiments of the invention.
Further details, features and advantages of the invention are disclosed in the following description of preferred and exemplary embodiments of the invention with reference to the drawings, in which:
Fig. 1 a schematic view of a first embodiment of a system according to the invention;
Fig. 2 a schematic view of a second embodiment of a system according to the invention;
Fig. 3 a schematic view of a third embodiment of a system according to the invention;
Fig. 4 a block diagram of components of a first arrangement of the system for presenting the future path of a vehicle according to the invention;
Fig. 5 a schematic diagram of a desired future path in comparison to an actual future path;
Fig. 6 an exemplary representation of a future path trajectory display integrated into a head mounted display;
Fig. 7 a first embodiment of a future path trajectory display;
Fig. 8 a second embodiment of a future path trajectory display;
Fig. 9 a third embodiment of a future path trajectory display;
Fig. 10 a block diagram of components of a second arrangement of the system for presenting the present path of a vehicle according to the invention;
Fig. 11 a first embodiment of a display for controlling the present path of a vehicle;
Fig. 12, 13 a second embodiment of a display for controlling the present path of a vehicle;
Fig. 14 a third embodiment of a display for controlling the present path of a vehicle;
Fig. 15 a fourth embodiment of a display for controlling the present path of a vehicle;
Fig. 16 a fifth embodiment of a display for controlling the present path of a vehicle;
Fig. 17 a sixth embodiment of a display for controlling the present path of a vehicle;
Fig. 18 an exemplary representation of an eyeglass mounted display for controlling the present path of a vehicle; and
Fig. 19 a schematic representation of a flow of optical signals for controlling the present path of a vehicle.
Figure 1 shows a side view into the cabin C of a vehicle with a driver D who is driving the vehicle and shows a first embodiment of a system for supporting path control according to the invention.
A main component of this system is a control device 10 which is connected with a display device 20 e.g. in the form of a laser projector, and a visual behaviour sensor 30 for detecting head and/or eye position of the driver D.
The control device 10 is provided for receiving head and/or eye position data from the visual behaviour sensor 30, and for receiving vehicle movement data, generated by at least one sensor (not shown) for detecting the velocity and/or a yaw rate and/or a wheel angle etc. of the vehicle. The main components especially of the control device 10 shall be described with reference to Figures 4 and 10.
The control device is provided for processing these data and for controlling the display device 20 for displaying, to the driver D, an estimated actual future path of the vehicle and/or an estimated present deviation of the vehicle from a desired present path on the basis of a detected actual present path. The laser projector 20 is provided and installed to project such an image for example onto a certain location of the windscreen of the vehicle.
The term "display device" is used in this disclosure to refer to any source of visual information presentation to the driver. Examples of displays include conventional computer displays, e.g. Liquid Crystal Displays (LCD) or similar, used to present GPS-based navigation and map information or other electronic devices, displays in the instrument panel, head-up displays, light emitting diods (LEDs), and other projection displays. Helmet-mounted-, visor-mounted-, eyeglass-mounted displays can also be used. An example of a projection display is a commercially available diode laser (see e.g. www.lasershow.se) which is capable of producing color picture stimuli. A picture stimulus is made up of a single laser beam which is moved around so quickly that the impression of an image is generated. The beam is controlled by two small electromagnetic motors (x-, and y-axis) with a small mirror on the motor axis. The use of a number of different lasers is advantageous, including a red and/or blue and/or green laser. However, in many applications it is sufficient to use a simple, inexpensive laser, such as those commonly used for as pointing devices for presentations in an auditorium; an automotive grade laser could be used as well.
Finally, this system comprises a visual behavior sensor 30 for eyetracking which is for example a camera mounted on the dashboard or another sensor which can be head mounted for detecting the gaze direction or gaze position of the driver D and which is connected to the control device 10.
Figure 2 again shows a side view into the cabin C of a vehicle with a driver D who is driving the vehicle. Figure 2 shows a second embodiment of a system for supporting path control according to the invention in which, instead of the laser projector, two arrays of light emitting diodes 21a, 21b are provided, which are explained in more detail with reference to Figures 12 and 13 below.
Figure 3 shows a third embodiment of the invention for supporting path control in which the image is presented to the driver D via a head mounted visor 23 which as well comprises the visual behavior sensor 31 and the control unit 11.
Figure 4 shows a block diagram of components of a first arrangement for presenting the future path of the vehicle according to the invention. A first component 40 is provided for detecting head and/or eye position data of the driver. A second component 41 is provided for detecting vehicle data like for example a yaw rate and/or a wheel angle and/or a velocity of the vehicle. A third component 42 is provided for calculating a path prediction on the basis of the detected vehicle data. On the basis of the head position and/or the eye position data of the driver, and the calculated path prediction, a fourth component 43 is provided for calculation of display parameters to achieve and obtain a display presentation 44 as disclosed in one of Figures 6 to 10.

Figures 5A, B show a schematic diagram of a desired future path in comparison to an actual future path. In these representations the arrows 1 indicate the desired future path and the arrows 2 indicate the actual future path. Ef is the error between the desired and the actual future path at the preview distance T. Ep is the error between the current path position and a desired path position. The solid lines represent the borders of the path.
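As a concrete illustration of what the path-prediction component 42 could compute, the following sketch integrates a constant-speed, constant-yaw-rate motion model over a short horizon. The disclosure does not prescribe a particular algorithm, so the model, the names, and the parameters here are assumptions for illustration only.

```python
import math

def predict_path(speed, yaw_rate, horizon=2.0, dt=0.1):
    """Predict the vehicle's actual future path assuming constant
    speed (m/s) and constant yaw rate (rad/s) -- an illustrative
    model, not the patent's algorithm.  Returns a list of (x, y)
    points in the vehicle frame (x forward, y to the left, origin
    at the current position), one point every dt seconds up to the
    given horizon."""
    points = []
    x, y, heading = 0.0, 0.0, 0.0
    t = 0.0
    while t < horizon:
        x += speed * math.cos(heading) * dt
        y += speed * math.sin(heading) * dt
        heading += yaw_rate * dt
        points.append((x, y))
        t += dt
    return points
```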
Figure 5A shows an actual future path 2 with a correct current (present) path position and an incorrect future path trajectory at the far path point. Figure 5B shows an actual future path 2 with both an incorrect current (present) path position and an incorrect future path trajectory at the far path point.
Controlling the future path trajectory involves comparing where the vehicle is going (arrow 2 in Figure 5A) with where it should be going (arrow 1). To control the future path trajectory, the driver makes steering corrections based on the discrepancy (Ef) between the actual future path of the vehicle (arrow 2) and the desired future path (arrow 1) at a preview distance T. Drivers evaluate future path error Ef by fixating a point on the future path about 4 degrees down from true horizon and at a preview distance about one second into the future, called the far-path point. Eye-fixations become increasingly concentrated to this future path region as demands on the driver increase. Gaze concentration on the future path intensifies with traffic environment workload, secondary task workload, and driver state factors such as fatigue and impairment. Drivers prioritize the visual guidance of their path-control task over recognition and planning tasks.
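Building on the path-prediction sketch above, the future-path error Ef can be pictured as the lateral discrepancy between the predicted and the desired path at the preview distance T, here taken as about one second of travel as just described. The desired-path callback is an assumed stand-in for whatever lane-tracker or navigation data would supply it.

```python
def far_path_error(speed, yaw_rate, desired_lateral_at, preview_time=1.0):
    """Illustrative estimate of the future-path error Ef at the
    far-path point, using predict_path() from the sketch above.

    desired_lateral_at -- callable mapping a forward distance (m) to
                          the desired lateral offset (m); a stand-in
                          for lane-tracker or navigation data.
    preview_time       -- preview of about one second, matching the
                          far-path point described in the text.
    """
    path = predict_path(speed, yaw_rate, horizon=preview_time)
    actual_x, actual_y = path[-1]                   # predicted far-path point
    return actual_y - desired_lateral_at(actual_x)  # Ef: lateral discrepancy

# Example: 25 m/s (90 km/h) with a slight leftward yaw against a
# straight desired path gives a leftward Ef of roughly 0.3 m.
ef = far_path_error(25.0, 0.025, lambda x: 0.0)
```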
Given the significance of the far-path point for detecting future-path error, and the added priority drivers place on it when driving gets demanding, it follows that a system that assists the driver in detecting future-path error would be advantageous. If future-path error can be made more easily recognizable, then steering corrections improve, path control improves, and the driver can place more priority on recognition and planning tasks because she is freer to move her eyes around to other objects and areas of vision during highly demanding situations.
Figures 6A, B show an example of a future path trajectory display integrated into a head mounted display. Figure 6A shows a presentation integrated into the eyeglasses of the head mounted display wherein Figure 6B shows the driver's view of the same information. The gaze point G of the driver is indicated by a cross and the aiming of the vehicle is indicated by a circle.
Figure 7 shows a first embodiment of a future path trajectory display as seen by the driver when looking at the road through the windscreen. The indicated lines L are the border lines of a road wherein the markings M are displayed to the driver.
Figure 7A shows an on-path situation when driving on a straight road wherein Figure 7B shows an on-path situation when driving into a left turn. Figure 7C shows an off-path situation when driving along a straight road wherein Figure 7D shows an off-path situation when driving into a left turn.
Figures 8 and 9 show a second and a third embodiment, respectively, of a future path trajectory display wherein again the border lines of the road L and the markings M which are presented to the driver are indicated. These Figures show the same on-path and off-path driving situations on a straight road and in a left turn, respectively, as in Figure 7; however, they are provided for clarifying that different kinds of markings M can be used which are displayed to the driver.
This first arrangement which has been described with reference to Figures 4 to 9 is provided for supporting future path control and especially for providing a feedback to support future-path trajectory assessment as follows:
Displays which present the future paths of moving objects such as vehicles, aircraft (US 5289185 A), ships, and robots are known (see EP 01065642A2, Endsley et al., 1999). These displays are variously called Quickening displays, Predictive displays, and Preview displays (see Lion, 1993; Mathan et al. 1996; and http://www.tunnel-in-the-sky.tudelft.nl/pred.html). Quickening a display means adding an indicator which extrapolates the current state of the moving object. The most notable use of a quickened display is in the flight director of many modern commercial aircraft, which tells the pilot where to head to stay on the flight plan. However, current future path displays do not relate the presentation to eye or head position, or present only the future path at the far path point.
To support the control of the future path trajectory, the system according to the invention provides predictive information to the driver about the vehicle's actual future path so that the driver can directly see the difference between where the vehicle is heading and where the driver actually wants to go, i.e. to make the error term more visible.
Figures 6 to 9 show examples of how the actual future path can be presented to the driver. These displays ideally require the system to have information of 1) head position and/or eye position (from, for example, US5802479, US5844486, or Seeing Machines FaceLAB at www.seeingmachines.com), 2) a path prediction estimate (calculated, for example, as in US 06466863, US 06542111, or US 06675094), and 3) the means with which to present information. See Figures 4 and 10. Various set-ups can be used, such as those presented in Figures 1 to 3.
The fourth component 43 in Figure 4 represents the calculations of display parameters that are needed to present the future path indication as shown in Figures 6 to 9. The actual future path of the vehicle is calculated with a path prediction algorithm (third component 42), and a future path indication, as shown in Figures 6 to 9, is displayed. Knowledge of head and/or eye position data and of the position of the display surfaces enables geometric calculations to be made wherein the displayed information is correctly positioned relative to the outer environment as is shown in Figures 6 to 9. The distance of the future path point at which the information is presented (distance T in Figure 5) can be set if, for example, speed information from vehicle sensors is available.
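One way to picture these geometric calculations is as a ray-plane intersection: a marking must be drawn where the line from the driver's eye to the predicted far-path point crosses the display surface. The sketch below assumes the windscreen can be approximated locally by a plane; the frames and names are illustrative, not taken from the disclosure.

```python
import numpy as np

def windscreen_point(eye, world_point, plane_point, plane_normal):
    """Intersect the ray from the driver's eye through a point in the
    outside world (e.g. the far-path point) with the windscreen,
    modelled as a plane -- an assumption of this sketch.  All
    arguments are 3-D vectors in the vehicle frame; returns the 3-D
    drawing position on the windscreen, or None if there is no
    usable intersection."""
    eye = np.asarray(eye, dtype=float)
    ray = np.asarray(world_point, dtype=float) - eye
    n = np.asarray(plane_normal, dtype=float)
    denom = ray.dot(n)
    if abs(denom) < 1e-9:
        return None                    # ray runs parallel to the windscreen
    t = (np.asarray(plane_point, dtype=float) - eye).dot(n) / denom
    if t <= 0:
        return None                    # intersection lies behind the eye
    return eye + t * ray
```

Because the eye position enters this calculation, the marking stays registered with the outer scene when the driver's head moves, which is why the head and/or eye position data from the visual behaviour sensor are needed.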
The second arrangement which is provided for supporting a present path control shall now be described with respect to Figures 10 to 18 and in a further developed embodiment with respect to Figure 19.
Figure 10 shows a block diagram of components of the second arrangement of the system for presenting the present path of a vehicle according to the invention. A first component 50 is again provided for detecting head position data and/or eye position data of the driver. A second component 51 is a lanetracker for sensing lane-tracking data, and a third component 52 is provided for detecting the speed of the vehicle. On the basis of the output signals of these three components 50, 51, 52 geometric calculations are performed by means of a fourth component 53 to achieve and obtain a display presentation 54 as shown in Figures 11 to 17.

Figure 11 shows a first embodiment of a display in the form of a line of light emitting elements 60 which are activated and/or deactivated so that a lane-keeping display is achieved.
According to Figure 11A the vehicle is centered in the lane. According to Figure 11B the vehicle is on its way out of its lane towards the right. The grey striped LEDs 61 (e.g. green LEDs) indicate the goal state and the black LEDs 62 (e.g. red LEDs) represent the amount of error from the goal state. The goal state LEDs 61 line up with the lane markings L in Figure 11A and the last in the line of error LEDs 62 also match up with lane markings L in Figure 11B.
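A hedged sketch of the display logic of Figure 11: the lateral offset reported by a lane tracker is mapped onto a line of LEDs, with goal-state LEDs at the positions of the lane markings and a growing run of error LEDs on the side of the deviation. The LED count, lane width, and saturation threshold are invented for illustration.

```python
def led_pattern(lane_offset, n_leds=21, max_error=1.0):
    """Map the lateral offset in lane (m, positive to the right) onto
    a line of LEDs: 'G' marks goal-state LEDs (e.g. green) lining up
    with the lane markings, 'R' marks error LEDs (e.g. red) whose
    count grows with the deviation, '.' is off.  All sizes and
    thresholds are illustrative assumptions."""
    leds = ['.'] * n_leds
    leds[0] = leds[-1] = 'G'                 # goal LEDs at the lane markings
    # The error-LED count grows with the offset, saturating at max_error.
    n_err = round(min(abs(lane_offset) / max_error, 1.0) * (n_leds // 2))
    if lane_offset > 0:                      # drifting towards the right
        for i in range(n_leds - 1 - n_err, n_leds - 1):
            leds[i] = 'R'
    elif lane_offset < 0:                    # drifting towards the left
        for i in range(1, 1 + n_err):
            leds[i] = 'R'
    return ''.join(leds)

print(led_pattern(0.0))   # centered: only the goal LEDs are lit
print(led_pattern(0.6))   # drifting right: error LEDs grow on the right
```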
Figures 12 and 13 show a second embodiment of a display in the form of a pair of matrices 21a, 21b of LED elements (see Figure 2) which are installed at the dashboard of the vehicle.
The matrix pair A shows how the display looks when the car is centered in lane. The pairs B and C show steps with an increasing number of black LEDs (e.g. red LEDs) as the vehicle progresses out of the lane towards the left. The grey markings (e.g. green LEDs) represent the goal state.

In Figure 13 the matrix pair A again shows how the display looks when the car is centered in lane. The pairs B and C show steps with an increasing number of black LEDs (e.g. red LEDs) as the vehicle progresses out of the lane towards the right. The grey markings (e.g. green LEDs) represent the goal state.
Figure 14 shows a third embodiment of a display for imaging lane-keeping information with solid markings. Figure 14A shows the markings that are presented when the vehicle is centered in the lane. As the vehicle starts moving out of the lane towards the right, in Figure 14B, the markings become larger to the right and disappear to the left. The vehicle is moving out of the lane to the left in Figure 14C. It is noted that the presentation is extended to match the lane markings L on the road by projecting information on the windshield as well as on the interior surfaces. Alternatively, these same markings can be presented on a head-mounted display device. Another alternative is to present the markings only when the vehicle starts moving out of the lane, in which case there would be no markings in Figure 14A.

Figure 15 shows a fourth embodiment of a display for imaging lane-keeping information accompanied by sound and with a goal state. Figure 15A shows the goal-state markings GM that are presented when the vehicle is centered in the lane. As the vehicle starts moving out of the lane towards the right (Figure 15B), the goal-state markings GM remain and additional error markings EM become larger to the right. The goal markings GM also disappear to the left. The vehicle is moving out of the lane to the left in Figure 15C. A sound, increasing in intensity, also accompanies the markings when the vehicle is leaving its lane. In this embodiment the lane markings L are not projected onto the windshield. Alternatively, these same markings can be presented on a head-mounted display device.
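As a small numeric illustration of the behaviour described for Figures 14 and 15, the sketch below maps a lateral deviation to a drift side, a normalised marking size, and a normalised sound intensity. The range and the linear mapping are illustrative assumptions, not values taken from the embodiments.

```python
def error_presentation(lane_offset_m, max_offset_m=0.9):
    """Map a lateral deviation to (drift side, marking size, sound level).

    lane_offset_m: deviation from lane centre (positive = right).
    The marking size and sound level are normalised to 0..1.
    """
    magnitude = min(abs(lane_offset_m) / max_offset_m, 1.0)
    side = "right" if lane_offset_m > 0.0 else "left"
    return side, magnitude, magnitude

# Drifting 0.45 m to the right: half-size error markings and
# half-intensity sound on the right side.
print(error_presentation(0.45))
```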
Figure 16 shows a fifth embodiment of a display for imaging lane-keeping information with moving markings. Figure 16A shows the markings M that are presented when the vehicle is centered in the lane. The markings M are dashed and are presented with the dashes moving toward the driver in Figures 16A, B and C. This movement may increase peripheral-vision sensitivity and comprehension of meaning. As the vehicle starts moving out of the lane towards the right (Figure 16B), the markings M become larger to the right and disappear to the left. The vehicle is moving out of the lane to the left in Figure 16C. It is noted that the presentation is extended somewhat onto the windshield as well as onto the interior surfaces. Alternatively, these same markings M can be presented on a head-mounted display device.
Figure 17 shows a sixth embodiment of a display for imaging lane-keeping information with moving markings and a goal state. Figure 17A shows the goal-state markings GM that are presented when the vehicle is centered in the lane. The markings are dashed and are presented with the dashes moving toward the driver in Figures 17A, B and C. This movement may increase peripheral-vision sensitivity and comprehension of meaning. As the vehicle starts moving out of the lane towards the right (Figure 17B), the goal-state markings GM remain and additional error markings EM become larger to the right. The goal markings GM also disappear to the left. The vehicle is moving out of the lane to the left in Figure 17C. It is noted that the presentation is extended somewhat onto the windshield as well as onto the interior surfaces. Alternatively, these same markings can be presented on a head-mounted display device.
Figure 18 shows an example of an eyeglass-mounted display. Figure 18A shows what is presented on the glasses, indicating the gaze point G. Figure 18B shows the driver's view through the glasses with the markings M, GM overlaid on the view.
Figure 19A shows a synthetic flow of optical signals in a straight environment. The natural optic flow is continued into the vehicle by projecting optical moving dots that move substantially at the same speed, curvature, and expansion as the natural flow.
Figure 19B shows a synthetic flow of optical signals in a curved environment. The natural optic flow is continued into the vehicle by projecting optical moving dots that move substantially at the same speed, curvature, and expansion as the natural flow.
This second arrangement, which has been described with reference to Figures 10 to 18 and, in a further developed embodiment, with respect to Figure 19, is provided for supporting present path control as follows:
Generally, control of the present-path position is achieved mainly by peripheral vision. Peripheral vision is the part of the visual field lying more than about 10 degrees of visual angle from the gaze point (fovea). It is especially sensitive to movement, spatial orientation, and lighting changes, and remains sensitive in low lighting. In driving, drivers rarely gaze directly at the sides of their own vehicle or at lane markings near the vehicle; rather, information regarding the vehicle's position in the lane is extracted with peripheral vision. In controlling the present-path position, the driver compares the present position in the path with the desired present position and steers the vehicle to correct this error. The driver most often compares the present position relative to lane markings, but can also regulate position relative to objects close to the vehicle.
The information specifying the present-path position is not always accessible to the driver's visual system. For example, when drivers operate information systems, such as a radio, peripheral information is blocked by the interior of the vehicle. If the present-path error can be made more easily recognizable, steering corrections improve and unintentional lane exits can potentially be eliminated. Lane departures and run-off-road incidents represent a large portion of accidents, and lane-keeping difficulties are often the consequence of distraction caused by the use of in-vehicle devices (e.g. a radio).
The display of lane-keeping information, as shown in Figures 10 to 19, assists the driver in perceiving the effects of lane deviation. The display is simple enough to enable the driver to recognize the information with peripheral vision only, when gaze is not directed at or close to the display, without having to move the eyes off the road. The information is presented in such a way that eye movements towards the display and subsequent eye fixations upon the information are not necessary; it therefore has to be presented simply enough, and large enough, to enable information extraction with peripheral vision.
The display of lane-keeping information uses a combination of a lane tracker 51, a head/eye position sensor 50, and a means to present information 54, as shown in Figure 10 and Figures 1 to 3. Lane trackers are commercially available products (see US 06665603 or US 06792345). Devices which provide information on head position and/or eye position (for example US 5802479, US 5844486, or Seeing Machines FaceLAB at www.seeingmachines.com) are known systems. Information about inclines could be added by extracting it from navigation systems using Global Positioning Systems and digital maps containing incline information.
An example of a display device for presenting the information is again a laser projection display (see Figure 1). Other examples of devices for presenting information are conventional computer displays, e.g. Liquid Crystal Displays (LCDs) or similar, head-up displays, light emitting diodes (LEDs), helmet-mounted displays, visor-mounted displays, and eyeglass-mounted displays.
The information can be designed to work in a number of ways to support lane-keeping, as indicated e.g. in Figures 11 to 18. Figures 11, 12, and 13 show how deviation in the lane is represented by an increase in the number of LEDs being displayed. The amount and placement of the LEDs being displayed correspond to the amount of lane deviation registered by the lane-tracking device. The goal-state markings, represented by either white or green colors (grey in Figures 11 to 13), are calculated from knowledge of head position and/or eye position, knowledge of the position of the vehicle in the lane, and the width of the vehicle. The knowledge of head and/or eye position, and of the geometries of the surfaces on which the information is presented, allows the system to position the goal-state markings to match a continuation of the lane or road markings, as indicated in Figures 11 to 18. If no head and/or eye position data are known, this lane matching would not be possible because of variations in seating position, height, and head movements. Lane-keeping information can be presented on both sides, as in Figure 11, or just on the side at which there is a danger of leaving the lane, as in Figures 12 to 18.
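A simplified two-dimensional (top-down) sketch of this goal-state placement is given below: the marker is placed where the driver's line of sight towards a point on the lane marking crosses the display surface, so that the marker reads as a continuation of the marking. Collapsing the geometry to a top-down view, the chosen look-ahead distance, and all names are assumptions of the sketch.

```python
def goal_marker_lateral(head_x, head_y, marking_y, display_x, lookahead_x=30.0):
    """Lateral position of a goal-state marker on a display at depth display_x.

    Top-down (2-D) simplification: the marker is placed where the sight
    line from the driver's head (head_x, head_y) to a point lookahead_x
    metres ahead on the lane marking (at lateral position marking_y)
    crosses the display surface. x is forward, y is positive to the right.
    """
    t = (display_x - head_x) / (lookahead_x - head_x)
    return head_y + t * (marking_y - head_y)

# Head 0.4 m left of the vehicle centreline, right-hand lane marking
# 1.7 m to the right, display surface 1 m ahead of the head.
y_marker = goal_marker_lateral(head_x=0.0, head_y=-0.4,
                               marking_y=1.7, display_x=1.0)
```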
Goal-state markings are presented in the embodiments shown in Figures 11 to 13, 15, 17, and 18. However, they could be left out of the presentation, leaving only the error to be presented. Alternatively, the system can turn on the lane-keeping error presentation only when the vehicle is about to leave the lane, only when the driver is looking inside the vehicle, only when the driver is performing a secondary task, or only in certain driving situations (for example, only on motorways). The driver should be able to turn the system off and on as he/she pleases.
In general, the display of lane-keeping information can increase the values of a number of perceptual characteristics, such as lighting intensity, lighting density, pattern type, sound, vibration, and movement. For example, both the number of LEDs shown in the embodiments of Figures 12 and 13, or the size of the markings shown in the embodiments of Figures 14 to 18, as well as their intensity and/or color and/or sound, can increase as the deviation from the goal state increases. The presentation of visual information can be combined with sound, as shown in the embodiment of Figure 15. The embodiments of Figures 14 to 17 show different versions of providing the information, and the embodiments of Figures 16 to 18 show moving indicators added to the markings.
Another alternative to support the detection of the current path position is to add a synthetic optic flow to the interior of the vehicle. The natural optic flow, created by motion through an environment, is continued into the vehicle by projecting optical moving dots that move at substantially the same speed, curvature, and expansion as the natural flow. For example, a laser projector can be used to present to the driver a continuation of the flow of optical dots inside the vehicle (see Figure 19). The synthetic optic flow projected onto the interior acts as extra spatial-orientation information and enhances sensitivity to the current path position error (Ep in Figure 5B), or lateral displacement. This increased access to optic-flow information inside the vehicle is especially useful when the driver's eyes are diverted from the road. For example, when the driver looks at the gear shift, very little of the outside environment is available on the driver's retina. With this extra, synthetic spatial-orientation information, the driver can easily detect lateral displacements in the optic flow. Thus, drivers are able to maintain a more stable course, not weaving in the lane as is normally the case when the eyes are removed from the road to perform in-vehicle tasks.
One example of how this can be achieved is to use the laser projector described in connection with Figure 1. A random selection of dots moves toward the driver D in a manner that mimics a continuation of the outside optic array. The current optic flow can be estimated from vehicle information such as that used for path prediction (as described above), or it can be estimated by image-processing software using video images from a forward-looking camera. The presentation of synthetic optic flow could also be used in combination with the displayed information shown in the embodiments of Figures 7 to 18.
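As an illustration, the dot field can be advanced frame by frame in vehicle coordinates: relative to the vehicle, the world streams backwards at the vehicle's speed and rotates opposite to its yaw. The frame rate, the bounds of the dot volume, and the respawn rule below are assumptions of the sketch, not prescribed by the invention.

```python
import math
import random

def step_flow_dots(dots, speed_mps, yaw_rate_rps, dt=1 / 60):
    """Advance each (x, y) dot one frame in vehicle coordinates.

    Relative to the vehicle, the world streams backwards at the
    vehicle's speed and rotates opposite to its yaw, so each dot is
    translated by -speed*dt and rotated by -yaw_rate*dt about the
    vehicle origin (x forward, y left).
    """
    cos_a = math.cos(-yaw_rate_rps * dt)
    sin_a = math.sin(-yaw_rate_rps * dt)
    advanced = []
    for x, y in dots:
        x -= speed_mps * dt                       # stream past the driver
        x, y = x * cos_a - y * sin_a, x * sin_a + y * cos_a
        if x < -2.0:                              # passed the driver: respawn ahead
            x, y = random.uniform(5.0, 15.0), random.uniform(-2.0, 2.0)
        advanced.append((x, y))
    return advanced

# Seed a dot field and advance it one frame at 20 m/s on a gentle curve.
dots = [(random.uniform(-2.0, 15.0), random.uniform(-2.0, 2.0)) for _ in range(100)]
dots = step_flow_dots(dots, speed_mps=20.0, yaw_rate_rps=0.05)
```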
In another embodiment, the laser projector (or other displays) can also be used to provide a stimulus which induces a corrective lane-keeping action. This is done by exaggerating the synthetic optic flow to simulate more curvature than is actually the case. For example, if the curved synthetic flow lines in Figure 19B are given more curvature, the driver is given the impression that the vehicle is turning more to the left than it actually is. The impression of turning more creates a compensatory steering reaction whereby the driver turns a little to the right. Exaggeration of the synthetic optic flow works equally well in straight environments. A system with exaggerated synthetic flow induces changes in the driver's steering patterns whereby the driver compensates for lane deviations without being aware of it. Thus the inattentive or distracted driver is able to achieve better lane-keeping performance. The components are the same as those outlined in Figure 10. The exaggeration of the synthetic flow can be incrementally increased as the vehicle moves out of the lane.
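A minimal sketch of this exaggeration, under an assumed linear gain schedule, scales the yaw rate fed to the synthetic flow so that the projected flow curves more than the real flow as the lane deviation grows. The gain values and the schedule are illustrative assumptions.

```python
def exaggerated_yaw_rate(yaw_rate_rps, lane_offset_m, max_gain=1.5):
    """Scale the yaw rate used for the synthetic flow.

    The gain rises linearly from 1.0 (centred in lane) to max_gain
    (one metre or more off centre); values are illustrative only.
    """
    gain = 1.0 + (max_gain - 1.0) * min(abs(lane_offset_m), 1.0)
    return gain * yaw_rate_rps

# Use in place of the measured yaw rate when stepping the dot field:
# dots = step_flow_dots(dots, speed_mps, exaggerated_yaw_rate(yaw, offset))
```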
For off-road applications the invention can be used as well if the goal state is determined, e.g., by a navigation system such as GPS. In this case the lane-tracker component 51 shown in Figure 10 is replaced by a pre-programmed evaluation unit for evaluating a deviation from the path determined by the navigation system.
In sum, control of the vehicle's path is a combination of steering corrections derived from the driver's assessment of future path error and present path error. This invention provides information to the driver to improve both of these tasks in combination or separately, which can preferably be chosen by the driver.

Claims
1. Method for supporting path control especially of a vehicle by at least one of the following steps (a) and (b): (a) estimating an actual future path of the vehicle on the basis of vehicle movement data and optically and/or acoustically and/or tactilely indicating the estimated actual future path to the driver,
(b) detecting the actual present path of the vehicle, estimating a present deviation of the detected actual present path from a desired present path and optically and/or acoustically and/or tactilely indicating the estimated present deviation to the driver.
2. Method according to claim 1, wherein step (a) comprises an estimation of a future deviation of the actual future path from a desired future path and optically and/or acoustically and/or tactilely indicating the estimated future deviation of the path to the driver.
3. Method according to claim 1, wherein the desired present path and/or the desired future path is given by an actual driving environment, especially a course of a road, or an image of an off-road course established e.g. by a navigation system.
4. Method according to claim 1, comprising a step of determining a head and/or eye position of the driver and presenting the estimated actual future path and/or the estimated present deviation to the driver in an optical relation to an actual driving environment, especially a course of a road, or an image of an off-road course established e.g. by a navigation system.
5. Method according to claim 4, wherein the presentation is provided in the form of an optical overlay of an image of the estimated actual future path and/or of the estimated present deviation over the actual driving environment, especially a course of a road, or over an image of an off-road course established e.g. by a navigation system.
6. Method according to claim 4, wherein the presentation is provided in the form of an optical continuation of the actual driving environment, especially a course of a road or an image of an off-road course established e.g. by a navigation system.
7. Method according to claim 6, wherein the optical continuation comprises an optical indication, especially an optical line, of at least one borderline in the actual driving environment to the driver.
8. Method according to claim 6 or 7, wherein the optical continuation and/or the optical indication, respectively, is three dimensionally presented by means of a laser device.
9. Method according to claim 6 or 7, wherein the optical continuation and/or the optical indication, respectively, is presented by activating and/or deactivating of an arrangement of a plurality of LED elements.
10. Method according to claim 7, wherein the optical indication of the at least one borderline is optically enhanced or attenuated with respect to its intensity and/or thickness and/or contrast and/or colour in correspondence with a decreasing or increasing distance between vehicle and the borderline.
11. Method according to claim 6, wherein the optical continuation of the actual driving environment is provided in the form of a two- or three dimensional flow of optical signals, signs and/or patterns surrounding the driver.
12. Method according to claim 1, wherein the path control is provided in the form of an aiming device in which the aim is provided by an image of the desired future path and/or the desired present path, and the indication of the estimated actual future path and/or the estimated present deviation is presented for supporting the driver's steering of the vehicle.
13. System for supporting path control especially of a vehicle and especially for conducting a method according to at least one of claims 1 to 12, comprising at least one of a first and a second arrangement wherein:
- the first arrangement (40, 41, 42, 43, 44) is provided for estimating an actual future path of the vehicle on the basis of vehicle movement data and optically and/or acoustically and/or tactilely indicating the estimated actual future path to the driver, and
- the second arrangement (50, 51, 52, 53, 54) is provided for detecting the actual present path of the vehicle, estimating a present deviation of the detected actual present path from a desired present path and optically and/or acoustically and/or tactilely indicating the estimated present deviation to the driver.
14. System according to claim 13, wherein the first arrangement (40, 41, 42, 43, 44) comprises a first device for estimating a future deviation of the actual future path from a desired future path and for optically and/or acoustically and/or tactilely indicating the estimated future deviation of the path to a driver.
15. System according to claim 13, wherein the first and/or the second arrangement (40, 41, 42, 43, 44; 50, 51, 52, 53, 54) comprises a device for determining a head and/or eye position of the driver and a control device for controlling a display device so that the estimated actual future path and/or the estimated present deviation can be presented in an optical relation to an actual driving environment, especially a course of a road or an image of an off-road course established e.g. by a navigation system.
16. System according to claim 15, wherein the display device is provided in the form of a laser projector for generating two- or three-dimensional images in the interior of the vehicle.
17. System according to claim 15, wherein the display device is provided in the form of a plurality of LED elements.
18. Computer program comprising computer program code means adapted to perform a method according to at least one of claims 1 to 12 when said program is run on a programmable microcomputer.
19. Computer program according to claim 18 adapted to be downloaded to a system according to claim 13 or one of its components when run on a computer which is connected to the internet.
20. Computer program product stored on a computer readable medium, comprising computer program code means according to claim 18.
EP04803401A 2003-12-01 2004-12-01 Method and system for supporting path control Withdrawn EP1689607A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP04803401A EP1689607A1 (en) 2003-12-01 2004-12-01 Method and system for supporting path control

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
PCT/EP2003/013479 WO2005055189A1 (en) 2003-12-01 2003-12-01 Perceptual enhancement displays based on knowledge of head and/or eye and/or gaze position
EP04803401A EP1689607A1 (en) 2003-12-01 2004-12-01 Method and system for supporting path control
PCT/EP2004/013632 WO2005053991A1 (en) 2003-12-01 2004-12-01 Method and system for supporting path control

Publications (1)

Publication Number Publication Date
EP1689607A1 true EP1689607A1 (en) 2006-08-16

Family

ID=36649186

Family Applications (1)

Application Number Title Priority Date Filing Date
EP04803401A Withdrawn EP1689607A1 (en) 2003-12-01 2004-12-01 Method and system for supporting path control

Country Status (1)

Country Link
EP (1) EP1689607A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH07257228A (en) * 1994-03-18 1995-10-09 Nissan Motor Co Ltd Display device for vehicle
JPH08241493A (en) * 1995-03-02 1996-09-17 Honda Motor Co Ltd Calculation and display device for predictive running locus of vehicle

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2005053991A1 *

Similar Documents

Publication Publication Date Title
US7656313B2 (en) Method and system for supporting path control
US11220274B2 (en) Vehicle display control device
JP6699646B2 (en) Vehicle display control device
US7605773B2 (en) Head-up display system and method for carrying out the location-correct display of an object situated outside a vehicle with regard to the position of the driver
US9649936B2 (en) In-vehicle device, control method of in-vehicle device, and computer-readable storage medium
CN102224443B (en) Vehicle display device and display method
WO2015060193A1 (en) Vehicle information projection system, and projection device
US8937536B2 (en) Vehicle information transmission device
JP4274111B2 (en) Proper inter-vehicle distance display control device
CN106536316A (en) Method for operating a motor vehicle in different driving modes, and motor vehicle
JP2015080988A (en) Vehicle information projection system and projection device
JP6558770B2 (en) Projection display device, projection display method, and projection display program
JP2006184854A (en) Information display method, display controller, and information display device
JP2017206251A (en) Vehicle information projection system
JP7300112B2 (en) Control device, image display method and program
JP2020071415A (en) Head-up display system
CN114537277A (en) Method for displaying virtual elements
WO2019189393A1 (en) Image control apparatus, display apparatus, movable body, and image control method
US20210323403A1 (en) Image control apparatus, display apparatus, movable body, and image control method
JP6669956B2 (en) VEHICLE DISPLAY AND CONTROL METHOD THEREOF
JP2016022919A (en) Head-up display device and vehicle
CN105658479B (en) Visualization system, vehicle and the method for operating visualization system
EP1689607A1 (en) Method and system for supporting path control
CN115867454A (en) Display control device for vehicle, display control system for vehicle, and display control method for vehicle
US20200049984A1 (en) Display device, display control method, storage medium

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20060629

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU MC NL PL PT RO SE SI SK TR

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20130718

RIC1 Information provided on ipc code assigned before grant

Ipc: B60K 35/00 20060101ALI20140731BHEP

Ipc: G01C 21/36 20060101AFI20140731BHEP

Ipc: G02B 27/01 20060101ALI20140731BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20150701