FR3082485A1 - Display device for assisting the driving of a driver of a vehicle - Google Patents

Display device for assisting the driving of a driver of a vehicle

Info

Publication number
FR3082485A1
Authority
FR
France
Prior art keywords
external
driver
relative
display
motor vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
FR1855339A
Other languages
French (fr)
Inventor
Stephane Viegas Panasqueira
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Valeo Comfort and Driving Assistance SAS
Original Assignee
Valeo Comfort and Driving Assistance SAS
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Valeo Comfort and Driving Assistance SAS filed Critical Valeo Comfort and Driving Assistance SAS
Priority to FR1855339 priority Critical
Priority to FR1855339A priority patent/FR3082485A1/en
Publication of FR3082485A1 publication Critical patent/FR3082485A1/en
Pending legal-status Critical Current


Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00 Arrangement of adaptations of instruments
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00 Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/15 Output devices or features thereof
    • B60K2370/152 Displays
    • B60K2370/1529 Head-up displays
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00 Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/16 Type of information
    • B60K2370/177 Augmented reality
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60K ARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K2370/00 Details of arrangements or adaptations of instruments specially adapted for vehicles, not covered by groups B60K35/00, B60K37/00
    • B60K2370/16 Type of information
    • B60K2370/179 Distances to obstacles or vehicles
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/304 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images

Abstract

The invention relates to a display device (100) for driving assistance for a driver of a motor vehicle, comprising: - a detection unit (10) adapted to detect, in the environment of the vehicle, an object external to said motor vehicle, and programmed to evaluate at least one relative position of said external object with respect to the motor vehicle, - a memory unit (20) communicating with the detection unit, adapted to memorize said relative position of said detected external object, and - an image projection unit (30) communicating with the detection unit and with the memory unit, adapted to project into the driver's field of vision a representation of said external object, at the level of the last memorized relative position of said external object, when said detected external object is at least partially hidden from the point of view of the driver of the motor vehicle.

Description

Display device for driving assistance for a driver of a vehicle

Technical field to which the invention relates

The present invention relates generally to a display device for driving assistance for a driver of a motor vehicle, adapted to project information into the driver's field of vision.

TECHNOLOGICAL BACKGROUND

Various display devices are known for assisting the driving of a driver of a motor vehicle, adapted to project various information into the driver's field of vision, such as a route to be followed, advertising information, or the name of the street being taken.

There are also known display devices for driving assistance that are suitable for signaling to the driver the presence of obstacles in his path, or for evaluating a risk of collision with another vehicle, at an intersection for example.

Such devices are capable of detecting objects in the environment of the vehicle, in order to highlight them and attract the driver's attention. However, objects may become hidden by obstacles, so that they are no longer seen by the driver or by the device. These hidden objects can nevertheless be sources of danger for a driver who may have forgotten their presence.

Object of the invention

In order to remedy the aforementioned drawback of the prior art, the present invention provides a display device for driving assistance for a driver of a motor vehicle which reminds the driver of the presence of an object that is not visible from his point of view.

More particularly, according to the invention, a display device is proposed for assisting the driving of a driver of a motor vehicle, comprising:

a detection unit adapted to detect in the environment of the vehicle an object external to said motor vehicle, and programmed to evaluate at least one relative position of said external object relative to the motor vehicle,

a memory unit communicating with the detection unit, adapted to memorize said relative position of said detected external object, and,

an image projection unit communicating with the detection unit and with the memory unit, suitable for projecting into the driver's field of vision a representation of said external object, at the level of the last stored relative position of said external object, when said detected external object is at least partially hidden from the point of view of the driver of the motor vehicle.

Other non-limiting and advantageous characteristics of the device according to the invention, taken individually or in any technically possible combination, are the following:

- the image projection unit is adapted to project into the driver's field of vision the representation of said external object at the level of the last stored relative position of said external object, and to maintain the projection of said representation of the external object at the level of the last stored relative position when said external object is at least partially hidden from the point of view of the driver of the motor vehicle;

- the detection unit includes:

acquisition means adapted to collect data representative of the environment of the motor vehicle, and analysis means adapted to process the data collected to detect said object external to the vehicle and assess its relative position;

- the analysis means are suitable for processing the collected data to assess at least one characteristic dimension of the detected external object;

- the acquisition means of the detection unit comprise at least one image sensor and a rangefinder;

- the analysis means evaluate the relative position of the external object (and/or said characteristic dimension of said external object) from data collected by the rangefinder;

- the rangefinder is a laser rangefinder (LIDAR) and/or an acoustic rangefinder (SONAR) and/or a radio rangefinder (RADAR);

- the detection unit is adapted to evaluate the evolution of the relative position of the external object over time solely by detecting, at different times, at least part of said external object;

- the detection unit is designed to evaluate the relative position of the external object on the sole basis of the data supplied by the analysis means;

- the detection unit is designed to evaluate the relative position of the external object without the use of data received via a communication network external to the vehicle, such as a wireless local area network (WLAN) or a V2X-type network;

- the image projection unit is adapted to project parts of the representation of the detected external object with a transparency effect vis-à-vis another object masking said parts from the driver of the motor vehicle;

- the image projection unit is adapted to project the representation of said external object into the environment of the motor vehicle, at the level of the last stored relative position of said external object;

- the image projection unit includes a head-up display.

Detailed description of an exemplary embodiment

The description which follows with reference to the appended drawing, given by way of nonlimiting example, will make it clear what the invention consists of and how it can be carried out.

In the accompanying drawing, Figure 1 is a schematic representation of the main elements of a display device according to the invention.

In Figure 1, there is shown schematically the main elements of a display device 100 for driving assistance for a driver of a motor vehicle.

This display device 100 is adapted to be mounted in a motor vehicle to project into the driver's field of vision a representation of an object external to the vehicle, in particular when said external object is at least partially hidden from the point of view of the driver of the motor vehicle. This helps the driver to adapt his driving while remaining aware of the presence of said object.

To do this, the display device 100 according to the invention comprises:

a detection unit 10 adapted to detect in the vehicle environment an object external to said motor vehicle, and programmed to evaluate at least one relative position of said external object with respect to the motor vehicle,

a memory unit 20 communicating with the detection unit 10, adapted to memorize said relative position of said detected external object, and,

an image projection unit 30 communicating with the detection unit 10 and with the memory unit 20, adapted to project into the driver's field of vision a representation of said external object, at the level of the last stored relative position of said external object, when said detected external object is at least partially hidden from the point of view of the driver of the motor vehicle.

Thanks to this display device 100, for the driver everything happens as if he could see the external object, even though this external object is actually hidden from his point of view. Everything thus happens as if the driver could see behind obstacles. This reinforces his safety, since he can adapt his behavior according to all the objects in his environment, including those which are not directly visible from his point of view.
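The cooperation between the three units described above can be sketched as follows. This is an illustrative sketch only, not code from the patent: all class, method, and identifier names (for example `MemoryUnit.last_position`, `"pedestrian-1"`) are assumptions chosen to mirror the text.

```python
from dataclasses import dataclass

# Illustrative sketch of the three-unit data flow described above.
# All names and data shapes are assumptions, not taken from the patent.

@dataclass(frozen=True)
class RelativePosition:
    x: float  # metres ahead of the vehicle (vehicle-fixed frame)
    y: float  # metres to the left of the vehicle

class MemoryUnit:
    """Memorizes the last relative position evaluated for each object (unit 20)."""
    def __init__(self):
        self._last = {}

    def store(self, object_id, pos):
        self._last[object_id] = pos

    def last_position(self, object_id):
        return self._last.get(object_id)

class ImageProjectionUnit:
    """Projects a representation at a relative position (unit 30, stubbed)."""
    def project(self, object_id, pos):
        return f"project {object_id} at ({pos.x:.1f}, {pos.y:.1f})"

# Data flow: the detection unit (10) evaluates a position, the memory
# unit (20) records it, and when the object becomes hidden the projection
# unit (30) renders the representation at the last memorized position.
memory = MemoryUnit()
projector = ImageProjectionUnit()
memory.store("pedestrian-1", RelativePosition(x=38.0, y=4.5))
command = projector.project("pedestrian-1", memory.last_position("pedestrian-1"))
```

The key design point, as the text emphasizes, is that the projection unit only ever needs the last memorized position; it does not depend on the object still being visible.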

To do this, as shown diagrammatically in FIG. 1, the detection unit 10 here comprises:

acquisition means 11A, 11B adapted to collect data representative of the environment of the motor vehicle, and

- analysis means 12 adapted to process the collected data to detect said object external to the vehicle and to assess its relative position with respect to the vehicle.

The acquisition means 11A, 11B are more specifically intended for short-range and/or long-range acquisition, that is to say that they can acquire data at shorter or longer distances from the position at which the vehicle is located.

Here, the acquisition means 11A, 11B of the detection unit comprise at least one image sensor 11A and a rangefinder 11B.

The image sensor 11A is adapted to capture images at a predetermined frequency. In practice, the detection threshold of the image sensor 11A is determined by the optical angles formed by the object to be detected with said image sensor 11A, said optical angles depending not only on the distance at which the object is located, but also on the dimensions of this object. The image sensor 11A for example comprises a conventional camera and/or an infrared camera for capturing images at night. The captured images form data which are processed by the analysis means 12.

The rangefinder 11B is an instrument suitable for measuring the distance to a determined point relatively close to the vehicle, within the limit of a maximum distance of the order of fifty meters. It is for example formed by a light-wave acquisition means of the LIDAR type ("Light Detection And Ranging"). It could also be an acoustic rangefinder (SONAR) or a radio rangefinder (RADAR). The distances measured by the rangefinder 11B form data which are processed by the analysis means.

The acquisition means 11A, 11B are for example arranged around the body of the motor vehicle, preferably at the front of the vehicle.

The analysis means 12 are chosen from among shape recognition means, character recognition means, color recognition means, size recognition means, or a combination of these analysis means 12.

Such analysis means 12 are adapted, from the data acquired by the acquisition means 11A, 11B, to evaluate, among other things, the shape of objects located in the environment, their dimensions and their color, and to identify the type of external object detected.

The external objects detected by the analysis means 12 are in particular those useful in the context of the driving assistance device. External objects are recognized here among: other motor vehicles (cars, trucks, motorcycles, etc.), bicycles and pedestrians, traffic signs, traffic lights, and markings on the ground.

In practice, the analysis means 12 are therefore suitable for recognizing at least one external object in the environment of the vehicle, on the basis of data acquired by the acquisition means 11A, 11B.

It is also conceivable that the analysis means 12 are suitable for evaluating at least one characteristic dimension of said external object from the data collected by the acquisition means 11A, 11B.

Once the object has been recognized by the analysis means 12, the analysis means 12 evaluate the relative position of the external object with respect to the motor vehicle. To do this, the analysis means preferably process and use the data collected by the rangefinder 11B.

In practice, the detection unit 10 is thus designed to evaluate the relative position of the external object solely on the basis of the data supplied by the analysis means 12. In other words, the detection unit 10 is designed to evaluate the relative position of the external object without using data which would be received via a communication network external to the vehicle such as a wireless local area network (or WLAN for Wireless Local Area Network) or a V2X type communication system.

The detection unit 10 is then adapted to communicate this relative position to the memory unit 20. More precisely, after having detected and recognized at least one external object in the environment of the motor vehicle, then evaluated the relative position of the external object with respect to the motor vehicle, the detection unit 10 communicates with the memory unit 20 in order to record said relative position. This communication is symbolized by the arrow between blocks "10" and "20" in Figure 1.

The memory unit 20 is adapted to store the relative position of said external object, for a predetermined period, for example for a period of the order of 5 minutes.

In practice, the detection unit 10 communicates to the memory unit 20 spatial coordinates corresponding to the relative position of the external object relative to the vehicle. To do this, it is for example possible to consider that the coordinate system in which the spatial coordinates of the external object are determined is the mobile coordinate system associated with the motor vehicle, in other words the coordinate system in which the motor vehicle is always fixed.
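A minimal sketch of the memory unit's behaviour as just described: positions are (x, y) coordinates in the vehicle-fixed frame, and entries expire after a retention period. The roughly 5-minute retention comes from the description above; the class, its API, and the use of explicit timestamps are assumptions made for illustration.

```python
# Sketch of a position store with a retention period (assumed API).
# Positions are (x, y) coordinates in the vehicle-fixed frame, i.e. the
# coordinate system in which the motor vehicle is always fixed.

class ExpiringPositionStore:
    def __init__(self, retention_s=300.0):  # ~5 minutes, per the description
        self.retention_s = retention_s
        self._entries = {}  # object_id -> (timestamp_s, (x, y))

    def store(self, object_id, xy, now_s):
        self._entries[object_id] = (now_s, xy)

    def last_position(self, object_id, now_s):
        entry = self._entries.get(object_id)
        if entry is None or now_s - entry[0] > self.retention_s:
            return None  # never seen, or older than the retention period
        return entry[1]

store = ExpiringPositionStore()
store.store("pedestrian-1", (38.0, 4.5), now_s=0.0)
```

Using the vehicle-fixed frame means stored coordinates stay meaningful as the vehicle moves, provided the detection unit keeps updating them, which is what the tracking behaviour described next provides.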

In addition, the detection unit 10 is adapted to evaluate the evolution of the relative position of the external object over time. It is in particular adapted to follow the relative movement of the external object relative to the motor vehicle.

Preferably, the detection unit 10 detects the entire external object at least once and evaluates a first relative position of this external object. It then suffices for it to follow the relative movement of this external object relative to the vehicle, and this is possible even when only part of said external object is detected by the detection unit 10.
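The tracking idea above can be sketched as follows: one full detection fixes the object's relative position, and later partial detections only need to supply the displacement observed between frames. This is entirely illustrative; the class and method names are assumptions, not the patent's implementation.

```python
# Sketch: after one full detection establishes an object's relative
# position, partial detections suffice to keep it up to date.

class PartialTracker:
    def __init__(self, initial_xy):
        self.xy = initial_xy  # from a full detection of the object

    def update_from_partial(self, dx, dy):
        # Even if only part of the object remains visible, its
        # frame-to-frame displacement relative to the vehicle can
        # still update the stored relative position.
        x, y = self.xy
        self.xy = (x + dx, y + dy)

tracker = PartialTracker((40.0, 5.0))   # full detection at 40 m ahead
tracker.update_from_partial(-1.5, 0.0)  # object now 1.5 m closer
tracker.update_from_partial(-1.5, 0.0)
```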

The image projection unit 30 is adapted to communicate with the detection unit 10 and with the memory unit 20. This communication is symbolized by the arrows connecting blocks "10" and "30" on the one hand and "20" and "30" on the other hand. The image projection unit 30 is designed to project into the driver's field of vision a representation of said external object, at the level of the last stored relative position of this external object. In particular, the image projection unit 30 projects said representation of the external object when this external object is at least partially hidden from the point of view of the driver of the motor vehicle. Thus, the driver continues to have this external object in mind, even when it is hidden.

In practice, two situations can occur: either the detection unit 10 still partially detects the external object, or the detection unit 10 no longer detects the external object at all. When the detection unit 10 still partially detects the external object, it evaluates the actual position of the detected object, and the image projection unit 30 projects the representation of the detected object at a spatial position precise enough to be perceived by the driver as the actual position of the external object. Thus, the driver sees the representation of the detected object as if it were superimposed on the external object. When the detection unit 10 no longer detects the external object, the image projection unit 30 projects the representation of the external object at the last known position of the external object.
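The two situations above reduce to a simple choice. The function below is a sketch; how visibility and positions are actually reported between the units is an assumption.

```python
# Sketch of the projection decision for the two situations described above.

def position_to_project(partially_detected, current_pos, last_stored_pos):
    """Choose where to project the representation of an external object.

    Still partially detected: use the freshly evaluated actual position,
    so the representation appears superimposed on the real object.
    No longer detected at all: fall back to the last stored position.
    """
    if partially_detected and current_pos is not None:
        return current_pos
    return last_stored_pos
```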

According to an advantageous variant, the image projection unit 30 is adapted to project into the driver's field of vision the representation of said external object at the level of the last stored relative position of said external object, including when the object is visible from the driver's point of view, and to maintain the projection of said representation of the external object at the level of the last stored relative position when said external object is at least partially hidden from the point of view of the driver of the motor vehicle.

According to this advantageous variant, the display device 100 according to the invention does not need to perform complex calculations to predict where the external object, now hidden behind an obstacle, is located. Instead, the display device 100 according to the invention assumes that the external object now hidden by an obstacle is located at the last recorded relative position. The display device 100 according to the invention therefore requires little computation.

Advantageously, the image projection unit 30 is here adapted to project said representation according to an augmented reality projection principle.

In other words, the image projection unit 30 is adapted to make the representation of the detected external object appear as if this representation were, from the driver's point of view, present in the real environment. Thanks to the augmented reality projection, virtual elements, here the representation of the detected external object, are projected so that the driver sees them as if they were part of the real environment itself.

Thus, the image projection unit 30 is adapted to project the representation of said external object into the environment of the motor vehicle, at the level of the last stored relative position of said external object. The last stored relative position corresponds either to the actual position of the external object, as evaluated by the detection unit, or to the last position at which the detection unit 10 detected it. In both cases, this is the last position saved in memory unit 20 for this object.

To do this, the image projection unit 30 here includes a head-up display 31.

Generally, such a head-up display 31 is suitable for creating a virtual image in the field of vision of a driver of the vehicle, so that the driver sees this virtual image without having to look away from the road. The head-up display 31 here comprises an image generation device adapted to generate at least one image of the (virtual) representation of the external object to be projected, and an image relay device adapted to transmit said generated image towards an at least partially reflective plate placed in the driver's field of vision. In the example described, this plate is formed by the windshield of the vehicle itself, a plate which is at least partially reflective and at least partially transparent.

In practice, the image projection unit 30 is suitable for projecting certain parts of the representation with a transparency effect vis-à-vis the obstacle masking these parts from the driver of the motor vehicle.

More specifically, the parts of the detected external object hidden behind the obstacle are projected into the environment so that they appear with less light intensity. Thus, the driver knows that these parts of the external object are actually behind the obstacle and not in front of him.
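The intensity rule just described can be sketched as a single function. The 0.4 attenuation factor below is an assumption chosen for illustration; the text only specifies that hidden parts appear with less light intensity.

```python
# Sketch of the transparency effect: occluded parts of the representation
# are rendered at reduced light intensity so the driver reads them as
# being behind the obstacle. HIDDEN_ATTENUATION is an assumed value.

HIDDEN_ATTENUATION = 0.4  # relative intensity for hidden parts (assumed)

def part_intensity(base_intensity, is_hidden):
    """Projection intensity for one part of the object's representation."""
    return base_intensity * (HIDDEN_ATTENUATION if is_hidden else 1.0)
```

Applied per part, this also covers the variant below where only the hidden portions of the object are projected: visible parts can simply be rendered at full intensity or omitted.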

The image projection unit 30 can be designed to project the entire representation of the detected external object, or only those parts of the representation corresponding to the hidden portions of the external object.

To establish the representation to be projected, the image projection unit 30 preferably uses the data acquired by the acquisition means 11A, 11B of the detection unit 10.

Example of implementation

An example of implementation of the display device 100 is as follows. The driver of the motor vehicle drives in an urban environment. When approaching an intersection, at a distance of about 40 meters, the driver sees on the right side a pedestrian who is about to cross the road. This pedestrian is also detected by the detection unit 10 of the display device 100. After advancing a few meters, the motor vehicle arrives at the level of a bus parked at a bus stop. This bus prevents the driver from distinguishing the pedestrian, who is now behind the bus from the driver's point of view. By means of the processing carried out by the device of Figure 1 as described above, the image projection unit 30 of the display device 100 then projects into the driver's field of vision an augmented reality representation of this hidden pedestrian. More specifically, from the driver's point of view, this representation is superimposed on the real image of the bus, but the representation of the pedestrian is projected with a light intensity such that it appears transparent relative to the bus.

With the bus stationary, the driver decides to pass it. The augmented reality representation reminds him of the presence of the pedestrian, leading him to moderate his speed sharply and to exercise extra caution. Thus, the safety of both the driver and the pedestrian is reinforced thanks to the display device 100 according to the invention.

Claims (10)

1. Display device (100) for driving assistance for a driver of a motor vehicle comprising:
- a detection unit (10) adapted to detect in the environment of the vehicle an object external to said motor vehicle, and programmed to evaluate at least one relative position of said external object with respect to the motor vehicle,
a memory unit (20) communicating with the detection unit, adapted to memorize said relative position of said detected external object, and,
- an image projection unit (30) communicating with the detection unit (10) and with the memory unit (20), adapted to project into the driver's field of vision a representation of said external object, at the level of the last stored relative position of said external object, when said detected external object is at least partially hidden from the point of view of the driver of the motor vehicle.
2. Display device (100) according to claim 1, in which the image projection unit (30) is adapted to project into the driver's field of vision the representation of said external object at the level of the last memorized relative position of said external object, and to maintain the projection of said representation of the external object at the level of the last memorized relative position when said external object is at least partially hidden from the point of view of the driver of the motor vehicle.
3. Display device (100) according to one of claims 1 and 2, in which the detection unit (10) comprises:
- acquisition means (11A, 11B) adapted to collect data representative of the environment of the motor vehicle, and
- analysis means (12) adapted to process the data collected to detect said object external to the vehicle and assess its relative position.
4. Display device (100) according to claim 3, in which the acquisition means (11A, 11B) of the detection unit (10) comprise at least one image sensor (11A) and a rangefinder (11B).
5. Display device (100) according to claim 4, in which the analysis means (12) evaluate the relative position of the external object from data collected by the rangefinder (11B).
6. Display device (100) according to one of claims 1 to 5, in which the detection unit (10) is adapted to evaluate the evolution of the relative position of the external object over time only by detecting, at different times, at least part of said external object.
7. Display device (100) according to one of claims 1 to 6, in which the detection unit (10) is designed to evaluate the relative position of the external object solely on the basis of the data supplied by the means of analysis (12).
8. Display device (100) according to one of claims 1 to 7, in which the image projection unit (30) is adapted to project parts of the representation of the detected external object with a transparency effect vis-à-vis another object masking said parts from the driver of the motor vehicle.
9. Display device (100) according to one of claims 1 to 8, in which the image projection unit (30) is adapted to project the representation of said external object into the environment of the motor vehicle, at the level of the last stored relative position of said external object.
10. Display device (100) according to one of claims 1 to 9, in which the image projection unit (30) comprises a head-up display (31).
FR1855339A 2018-06-18 2018-06-18 Display device for assisting the driving of a driver of a vehicle Pending FR3082485A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
FR1855339 2018-06-18
FR1855339A FR3082485A1 (en) 2018-06-18 2018-06-18 Display device for assisting the driving of a driver of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
FR1855339A FR3082485A1 (en) 2018-06-18 2018-06-18 Display device for assisting the driving of a driver of a vehicle

Publications (1)

Publication Number Publication Date
FR3082485A1 (en) 2019-12-20

Family

ID=62952161

Family Applications (1)

Application Number Title Priority Date Filing Date
FR1855339A Pending FR3082485A1 (en) 2018-06-18 2018-06-18 Display device for assisting the driving of a driver of a vehicle

Country Status (1)

Country Link
FR (1) FR3082485A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102009027755A1 (en) * 2009-07-16 2011-01-20 Robert Bosch Gmbh Method for assisting driving of vehicle e.g. cab vehicle, involves indicating instantaneous position of obstacle to driver of vehicle, and encrusting obstacle by analog contact display unit in vision field of driver
EP2860971A1 (en) * 2013-10-10 2015-04-15 Panasonic Intellectual Property Management Co., Ltd. Display control apparatus, method, recording medium, and vehicle
US20180059779A1 (en) * 2016-08-23 2018-03-01 Toyota Jidosha Kabushiki Kaisha System for Occlusion Adjustment for In-Vehicle Augmented Reality Systems
US20180129888A1 (en) * 2016-11-04 2018-05-10 X Development Llc Intuitive occluded object indicator



Legal Events

Date Code Title Description
PLFP Fee payment (year of fee payment: 2)
PLSC Search report ready (effective date: 20191220)