EP4181109A1 - Method and device for controlling a transparent display configured to display content in a windshield of a vehicle

Method and device for controlling a transparent display configured to display content in a windshield of a vehicle

Info

Publication number
EP4181109A1
Authority
EP
European Patent Office
Prior art keywords
color
vehicle
determined
content
detected object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP21208541.9A
Other languages
English (en)
French (fr)
Inventor
Manjeet Singh BILRA
Mohamed-Saad ABDELHAMEED
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bayerische Motoren Werke AG
Original Assignee
Bayerische Motoren Werke AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bayerische Motoren Werke AG
Priority to EP21208541.9A
Publication of EP4181109A1
Legal status: Pending

Classifications

    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/001Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/02Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
    • G09G5/024Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using colour registers, e.g. to control background, foreground, surface filling
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/06Colour space transformation
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2380/00Specific applications
    • G09G2380/10Automotive applications

Definitions

  • the present invention is directed to a method for controlling a transparent display configured to display content in a windshield of a vehicle, and a control unit configured to carry out the method at least partly.
  • An, optionally automated, vehicle comprising the control unit may be provided.
  • a computer program comprising instructions which, when the program is executed by a computer, e.g., the control unit, cause the computer to carry out the method at least partly, may be provided.
  • a computer-readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the method at least partly, may be provided.
  • Transparent displays configured to display content in a windshield of a vehicle, also called head-up displays, are widely used in vehicles, such as cars, to display content/information to a user/driver of the vehicle.
  • WO 2019/038201 A2 relates to a head-up display comprising a display element, a projection system, a diffusing screen and a mirror element. It is proposed that the diffusing screen has focusing elements on its side facing the projection system and a light-blocking mask on its side facing away from the projection system.
  • one disadvantage of the state of the art is that content displayed in the windshield using the head-up display may irritate the driver of the vehicle since the content may overlap with objects located in a field of view of the driver.
  • the object of the present invention is to provide a device and/or a method being suitable for overcoming at least the above-mentioned disadvantage of the prior art, respectively.
  • the object is solved by a method for controlling a transparent display configured to display content in a windshield of a vehicle.
  • the transparent display may be a head-up display.
  • a head-up display, also known as a HUD, is any transparent display that presents data/content without requiring users to look away from their usual viewpoints while driving the vehicle. This allows the user/driver of the vehicle to view information with the head positioned "up" and looking forward, instead of angled down looking at lower instruments.
  • a HUD also has the advantage that the user's eyes do not need to refocus to view the outside after looking at the optically nearer instruments.
  • the transparent display may be configured to generate the content, e.g., including information being relevant for driving the vehicle, by an image source (possibly with an intermediate image plane) in a dashboard of the vehicle and project the content via an optical system onto a combiner (i.e., a transparent mirror), here the windshield of the vehicle.
  • the windshield itself may comprise or may be the transparent display, e.g., comprising a transparent OLED (i.e., organic light emitting diode) display.
  • the transparent display may be configured to display augmented reality content.
  • the method comprises capturing sensor data relating to an environment of the vehicle using a sensor system installed at the vehicle, determining at least one color included in the sensor data, and determining at least one color of the displayed content based on the determined at least one color included in the sensor data.
  • the sensor system may include one or more cameras, e.g., a front camera and/or a surround view camera system.
  • the method may comprise fusing the sensor data in case more than one camera is provided.
  • the at least one color determined based on the sensor data may be a color distribution.
  • the at least one color may be determined using the sensor data of a whole field of view of the sensor system or may be determined using solely a part of the field of view of the sensor system corresponding to solely a part of the sensor data.
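  • As an illustration of this step, the following minimal Python sketch derives a single representative color from camera sensor data, either over the whole field of view or over a part of it; the function name, the mean-based summary and the (x, y, w, h) region format are assumptions made here for illustration, not part of the application.

```python
import numpy as np

def dominant_color(image, region=None):
    """Summarize the color content of camera sensor data.

    image:  H x W x 3 array of RGB values (0-255), e.g., one camera frame.
    region: optional (x, y, w, h) crop restricting the computation to a
            part of the field of view; None uses the whole field of view.
    Returns the per-channel mean as a single representative (R, G, B).
    """
    if region is not None:
        x, y, w, h = region
        image = image[y:y + h, x:x + w]
    # A per-channel mean is the simplest summary; a histogram or k-means
    # clustering would yield a richer color distribution.
    return tuple(int(c) for c in image.reshape(-1, 3).mean(axis=0))
```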
  • the method may comprise detecting an object of a predefined class in the environment of the vehicle based on the sensor data and determining a color of the detected object as the at least one color included in the sensor data.
  • the predefined class may comprise one or multiple subclasses such as a pedestrian and/or bicyclist, i.e., a so-called vulnerable road user.
  • the method is not limited thereto, and the class may comprise any other subclass such as a vehicle and/or a traffic sign.
  • the method may comprise determining a current and/or a future position of the detected object based on the sensor data, and optionally determining the at least one color of the displayed content based on the determined current and/or future position of the detected object.
  • the method may comprise determining a position thereof. Determining the position may be done using the above-described camera data and may optionally include using data of other sensors, e.g., of a radar sensor, a lidar sensor and/or an ultrasonic sensor. The determined position may be used additionally to determine the at least one color of the displayed content, e.g., the at least one color of the displayed content may vary based on the determined position of the detected object.
  • Determining the future position of the detected object may comprise determining an expected trajectory of the detected object.
  • the trajectory may include information of an expected movement of the detected object, i.e., the trajectory may include information on one or more future positions of the object.
  • the trajectory may be represented by a spline.
  • the trajectory may comprise time information, i.e., information on when the object will be at which position.
  • Determining the trajectory of the detected object may comprise determining a current velocity, a current acceleration and/or a current direction of movement of the detected object.
  • the above-described trajectory may be estimated based on a current movement profile of the detected object.
  • the method is not limited thereto and other information about the environment around the detected object may be used, such as obstacles and/or a road guidance, optionally including traffic signs.
  • Estimating/Determining the trajectory may be done using a machine learning based algorithm and/or a deterministic algorithm.
  • the machine learning based algorithm may comprise a neural network.
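  • A deterministic variant of this estimation could look like the following sketch, which extrapolates future positions from the current movement profile under a constant-acceleration assumption; the function name and the sampling parameters are illustrative assumptions, not taken from the application.

```python
import numpy as np

def predict_positions(position, velocity, acceleration,
                      horizon_s=2.0, step_s=0.5):
    """Extrapolate future 2D positions of a detected object.

    Uses a constant-acceleration model, p(t) = p0 + v0*t + 0.5*a*t^2,
    built from the object's current movement profile. Returns a list of
    (time, position) samples along the expected trajectory, so the
    trajectory carries the information when the object is where.
    """
    p0, v0, a = (np.asarray(x, dtype=float)
                 for x in (position, velocity, acceleration))
    times = np.arange(step_s, horizon_s + step_s, step_s)
    return [(float(t), p0 + v0 * t + 0.5 * a * t * t) for t in times]
```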
  • the method may comprise determining, based on the determined current and/or future position of the detected object, if the detected object and the content displayed in the windshield are overlapping in at least one display area located in a predefined field of view of a user of the vehicle when the user is watching through the displayed content.
  • the transparent display is configured to display the content in a certain area in the windshield of the vehicle.
  • a size and/or a position of this area may be variable, e.g., manually using keys installed in the vehicle and/or automatically based on a position of the user's eyes.
  • the user may not only see the displayed content but may also see the detected object. This might be the case if the detected object is located in front of the vehicle. However, this depends on the size of the area where the content is displayed, the position of the object and the field of view of the user.
  • the predefined field of view of the user may be a fixed field of view, i.e., a default field of view, and/or may be determined based on a position of the user's eyes. In case the user can see both, the displayed content and the detected object, at the same time, the displayed content and the detected object are overlapping (optionally excluding the peripheral field of view of the user).
  • the detected object and the displayed content do not necessarily have to overlap in the whole area where the content is displayed, but may (solely) overlap in a certain sub area thereof, i.e., the so-called at least one display area (or sub area of an area in the windshield where the content is displayed).
  • This display area may be determined by the method.
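  • Reduced to axis-aligned rectangles, this overlap determination could be sketched as follows; treating the object's projection and the content area as (x, y, w, h) rectangles in windshield coordinates is an assumption made here for illustration.

```python
def overlap_area(obj_box, content_box):
    """Axis-aligned overlap test between two rectangles (x, y, w, h).

    obj_box:     bounding box of the detected object, projected into the
                 user's field of view on the windshield.
    content_box: area in the windshield where content is displayed.
    Returns the intersection rectangle (the display area) or None.
    """
    ax, ay, aw, ah = obj_box
    bx, by, bw, bh = content_box
    x1, y1 = max(ax, bx), max(ay, by)
    x2, y2 = min(ax + aw, bx + bw), min(ay + ah, by + bh)
    if x2 <= x1 or y2 <= y1:
        return None  # no overlap: displayed content can stay unchanged
    return (x1, y1, x2 - x1, y2 - y1)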
  • the method may comprise determining the at least one color of the displayed content in the determined at least one display area based on the determined at least one color included in the sensor data.
  • the method may include setting a color in the above-described sub area of the area in the windshield where the content is displayed depending on the color of the detected object.
  • the method is not limited thereto, and it is also possible that the color of the whole area in the windshield where the content is displayed is changed based on the color of the detected object in case it is determined that the detected object and the content displayed in the windshield are overlapping in the at least one display area.
  • the method may comprise determining the at least one color of the displayed content in the whole area where the content is displayed in the windshield based on the determined at least one color included in the sensor data, if it is determined that the detected object and the content displayed in the windshield are overlapping in the at least one display area.
  • the at least one color of the displayed content may be determined to be the inverted color of the at least one color included in the sensor data.
  • determining the color of the displayed content may comprise changing or setting the at least one color of the displayed content depending on the at least one color determined based on the sensor data.
  • the at least one color determined based on the sensor data may be a color of a certain detected object in the environment of the vehicle, but is not limited thereto and may also be any color included in the sensor data.
  • the term "color" as used herein does not necessarily solely include a color of a certain pixel/spot but may also include a color distribution. The color distribution may be determined based on the sensor data. Additionally or alternatively, a color distribution for the displayed content may be determined based on the color determined using the sensor data.
  • the term "color" as used herein may also include black and/or white.
  • color may also include red, blue and/or green, and optionally any kind of mix of these three colors.
  • inverting refers to a process for "reversing colors". Specifically, this means that the "opposite color" of the respective color space may be determined. For example, black becomes white and vice versa.
  • the RGB color space may be used.
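  • In the RGB color space, such an inversion amounts to a per-channel complement, as in this minimal sketch (8-bit channels assumed for illustration):

```python
def invert_rgb(color):
    """Return the opposite color in 8-bit RGB: black becomes white."""
    r, g, b = color
    return (255 - r, 255 - g, 255 - b)

assert invert_rgb((0, 0, 0)) == (255, 255, 255)   # black -> white
assert invert_rgb((255, 0, 0)) == (0, 255, 255)   # red -> cyan
```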
  • the method may comprise determining a control signal based on the determined at least one color of the displayed content and outputting the determined control signal to the transparent display such that the transparent display displays the content with the determined at least one color of the displayed content.
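  • Chaining the steps above, one control cycle could be sketched as follows; the detector and display interfaces are placeholders, since the application does not specify such APIs, and the helper functions are the illustrative sketches given earlier.

```python
def control_cycle(camera_image, content_box, detector, display):
    """One pass of the method: capture, detect, compare, recolor (sketch).

    detector(image) is assumed to return an object with a .bbox attribute
    (or None); display.set_content_color(color, area) stands in for the
    control signal sent to the transparent display. Both are hypothetical.
    """
    obj = detector(camera_image)                # detect object of a class
    if obj is None:
        return
    area = overlap_area(obj.bbox, content_box)  # object vs. displayed content
    if area is None:
        return
    obj_color = dominant_color(camera_image, obj.bbox)  # color of the object
    # Control signal: display the content in the inverted object color,
    # either only inside 'area' or in the whole content area.
    display.set_content_color(invert_rgb(obj_color), area)
```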
  • a solution for improving a head-up display's capability to display information/content such that a user thereof may easily distinguish collision-relevant objects, such as pedestrians, bicycles, motorbikes, obstacles, or any other kind of objects, from the content/information displayed in the head-up display is provided. More specifically, while using a head-up display, it may happen that some collision-relevant objects/obstacles are not clearly visible or cannot be distinguished by the driver from the displayed content due to properties of the head-up display. Hence, the driver may not detect a critical object early enough to take evasive action. This may be overcome by dynamically adapting an interface of the head-up display.
  • the method may take input from different sensors such as a front facing camera and/or a surround view camera to extract properties of collision relevant objects or obstacles. This information may then be provided to a central display control unit which may be configured to adapt properties of the head-up display to distinguish the detected object(s) from the displayed content, hence increasing drivers' attention and/or reducing the driver's mental workload.
  • the information may be displayed using default properties of the head-up display during normal driving conditions.
  • the object or obstacle may be detected by on-board sensors such as the front facing camera of the vehicle and/or by the surround view camera system for objects/obstacles which are in near range.
  • the on-board sensors may extract the properties of the detected object, e.g., of a cyclist, such as its color, size, shape and/or background color.
  • a central display controller may be provided, which may work as an independent electronic control unit (ECU) taking input from the one or multiple sensors.
  • the central display controller may process the data from different sensors, e.g., from different cameras, and may check the plausibility of the properties of a detected object, optionally classified as collision critical, to ensure that qualified object properties with higher confidence may be generated.
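  • One possible reading of this plausibility check, sketched here for the color property: fuse the per-camera estimates and only emit a qualified value when the sensors agree; the tolerance value and the median fusion are assumptions for illustration.

```python
from statistics import median

def plausibilize_color(estimates, tol=40):
    """Fuse per-camera (R, G, B) estimates into one qualified color.

    estimates: list of (R, G, B) tuples, one per camera. Returns the
    per-channel median if all cameras agree within 'tol' per channel
    (higher confidence), otherwise None (property not qualified).
    """
    channels = list(zip(*estimates))
    if any(max(c) - min(c) > tol for c in channels):
        return None  # sensors disagree: do not trust the property
    return tuple(int(median(c)) for c in channels)
```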
  • the central display controller may compare the properties of the detected object to the current properties of the head-up display, e.g., the color of a cyclist to a color of the head-up display information.
  • the central display controller may modify the properties of the head-up display such that it is easily possible for the user/driver to distinguish between the information projected by the head-up display and the detected object.
  • This dynamic adaptation of the head-up display properties ensures that the driver is more attentive to (optionally collision-critical) objects. This enables the driver to easily distinguish the (optionally collision-critical) objects from the displayed content.
  • furthermore, a control unit is provided.
  • the control unit is configured to carry out the above-described method at least partly.
  • the control unit can be an electronic control unit (ECU) for a vehicle.
  • the electronic control unit can be an intelligent processor-controlled unit that can communicate with other modules, optionally via a central gateway (CGW).
  • the control unit can form part of the vehicle's onboard network via fieldbuses such as the CAN bus, LIN bus, MOST bus and/or FlexRay or via automotive Ethernet, optionally together with a telematics control unit.
  • the electronic control unit may be configured to control functions relevant to a driving behavior of the vehicle, such as an engine control system, a power transmission, a braking system and/or a tire pressure control system.
  • driver assistance systems such as parking assistant, adaptive cruise control, lane departure warning, lane change assistant, traffic sign recognition, light signal recognition, approach assistant, night vision assistant, intersection assistant, and/or many others may be controlled by the control unit.
  • the vehicle comprises the above-described control unit and a transparent display connected to the control unit.
  • the vehicle may be an automobile, e.g., a car.
  • the vehicle may be automated.
  • the automated vehicle can be designed to take over lateral and/or longitudinal guidance at least partially and/or temporarily during automated driving of the automated vehicle. For this purpose, inter alia, the sensor data of the above-described sensor system may be used.
  • the control unit may be configured to control the automated driving at least partly.
  • the automated driving may be such that the driving of the vehicle is (largely) autonomous.
  • the vehicle may be a vehicle of autonomy level 1, i.e., have certain driver assistance systems that support the driver in vehicle operation, for example adaptive cruise control (ACC).
  • the vehicle can be a vehicle of autonomy level 2, i.e., be partially automated in such a way that functions such as automatic parking, lane keeping or lateral guidance, general longitudinal guidance, acceleration and/or braking are performed by driver assistance systems.
  • the vehicle may be an autonomy level 3 vehicle, i.e., automated in such a conditional manner that the driver does not need to continuously monitor the system.
  • the vehicle autonomously performs functions such as triggering the turn signal, changing lanes, and/or lane keeping. The driver can attend to other matters, but is prompted by the system to take over control within a warning time if needed.
  • the vehicle may be an autonomy level 4 vehicle, i.e., so highly automated that the driving of the vehicle is permanently taken over by the system. If the driving tasks are no longer handled by the system, the driver may be requested to take over control.
  • the vehicle may be an autonomy level 5 vehicle, i.e., so fully automated that the driver is not required to complete the driving task. No human intervention is required other than setting the destination and starting the system.
  • the vehicle can operate without a steering wheel or pedals.
  • furthermore, a computer program is provided, comprising instructions which, when the program is executed by a computer, optionally the control unit, cause the computer to carry out the above-described method at least partly.
  • the program may comprise any program code, in particular a code suitable for control systems of vehicles.
  • the description given above with respect to the method, the control unit and the vehicle applies mutatis mutandis to the computer program and vice versa.
  • moreover, a computer-readable medium is provided, comprising instructions which, when executed by a computer, cause the computer to carry out the above-described method at least partly.
  • the computer-readable medium may be any digital data storage device, such as a USB flash drive, a hard disk, a CD-ROM, an SD card or an SSD card.
  • the above-described computer program may be stored on the computer-readable medium. However, the computer program does not necessarily have to be stored on such a computer-readable medium, but can also be obtained via the Internet.
  • in figure 2, a vehicle 1 is shown schematically and partly.
  • the vehicle 1 comprises a control unit 2 being connected to a front camera 3 and a surround view camera system 4 on an input side thereof, and to a transparent display 5 on an output side thereof.
  • the front camera 3 and the surround view camera system 4 form part of a sensor system of the vehicle 1.
  • the transparent display 5 is configured to display content/information 6 in a windshield 7 of the vehicle 1, as can be gathered from figure 1 .
  • the sensor system is configured to capture - as sensor data - images from the environment of the vehicle 1.
  • the situation shown in figure 1 shows a predefined field of view 11 of a user 9 of the vehicle 1, who is looking in a main driving direction of the vehicle 1.
  • the control unit 2 is configured to carry out a method for controlling the transparent display 5 based on the sensor data captured by and received from the sensor system shown in figure 2 , wherein a flowchart of the method is shown in figure 3 .
  • the method substantially comprises six steps S1 - S6.
  • in a first step S1 of the method, the sensor system installed at the vehicle 1 captures the sensor data relating to the environment of the vehicle 1.
  • in a second step S2 of the method, the control unit 2 detects an object of a predefined class in the environment of the vehicle 1 based on the sensor data captured in the first step S1 of the method.
  • the detected object is a bicycle 8.
  • this is just an example and every other object, such as a pedestrian and/or an obstacle, may be detected.
  • in a third step S3 of the method, the control unit 2 determines a current position 81 and future positions 82, 83 of the detected bicycle 8, as well as a size, a color and a shape of the detected bicycle 8, based on the sensor data captured in the first step S1 of the method. For determining the future positions 82, 83 of the bicycle 8, the control unit 2 determines an expected trajectory 12 of the detected bicycle 8. Determining the trajectory 12 of the detected bicycle 8 comprises determining a current velocity, a current acceleration and/or a current direction of movement of the bicycle 8.
  • Figure 1 shows the bicycle 8 in its current position 81 on a left side of the windshield 7, in a first future position 82 being located on a right side of the current position 81 and in a second future position 83 being located on a right side of the first future position 82. Therefore, the current position 81 is the start of the trajectory 12 and the first and the second future positions 82, 83 are located on the trajectory 12 such that the three positions 81, 82, 83 are connected by a spline representing the trajectory 12.
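  • The spline through the three positions 81, 82, 83 could, for example, be realized as a per-coordinate polynomial fit, as in this sketch; NumPy's polyfit is used here as a simple stand-in for the spline mentioned in the text, and the function name and sampling count are illustrative assumptions.

```python
import numpy as np

def trajectory_spline(positions, samples=20):
    """Interpolate a smooth curve through the current and future positions.

    positions: list of 2D points, e.g., [p81, p82, p83].
    Fits a quadratic per coordinate over a normalized parameter t and
    returns 'samples' points along the resulting curve.
    """
    pts = np.asarray(positions, dtype=float)
    t = np.linspace(0.0, 1.0, len(pts))
    ts = np.linspace(0.0, 1.0, samples)
    deg = min(2, len(pts) - 1)
    fx = np.polyfit(t, pts[:, 0], deg)
    fy = np.polyfit(t, pts[:, 1], deg)
    return np.stack([np.polyval(fx, ts), np.polyval(fy, ts)], axis=1)
```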
  • the method may optionally include judging based on the class of the detected object 8 and/or its trajectory 12 if the detected object 8 is a safety critical object and then only if the detected object 8 is a safety critical object the following steps S4 - S6 are carried out.
  • in a fourth step S4 of the method, the control unit 2 determines if the detected bicycle 8 and the content 6 displayed in the windshield 7 are overlapping in at least one display area 10 located in a predefined field of view 11 of the user 9 of the vehicle 1 in the situation shown in figure 1, i.e., when the user 9 is watching through the displayed content 6 in the main driving direction of the vehicle 1.
  • the field of view 11 is a default area in the windshield 7 of the vehicle 1 corresponding to an area in which the transparent display 5 may display the content 6, i.e., the method uses a fixed predefined field of view 11.
  • alternatively, the control unit 2 receives image data from a camera installed in the vehicle 1 capturing images of the user 9 of the vehicle 1 and determines a current/varying field of view 11 of the user 9 of the vehicle 1.
  • the control unit 2 uses the trajectory 12 including the current and future positions 81 - 83 of the bicycle 8 as well as the shape and the size of the bicycle 8 to determine a bounding box around the bicycle 8, i.e., the bounding box is the display area 10 inside which the bicycle 8 is located in its current and future positions 81 - 83.
  • the bounding box 10 may also be a box around the current position 81 of the bicycle 8 and may move according to the trajectory 12 together with the bicycle 8.
  • the display area/bounding box 10 is partly located inside the field of view 11, i.e., the displayed content 6 and the detected object 8 overlap, such that the method proceeds with a fifth step S5. Otherwise, the method would start again with the first step S1. That is, the fifth step S5 is only carried out if it is determined that the detected object 8 and the content 6 displayed in the windshield 7 are overlapping in the at least one display area 10 and optionally if the content 6 has similar properties, such as a color, as the bicycle 8. In conclusion, determining the color of the displayed content 6 is also done based on the determined current and future positions 81 - 83 of the detected bicycle 8.
  • in the fifth step S5 of the method, the control unit 2 determines a color of the displayed content 6 based on the color of the bicycle 8 determined in the third step S3 of the method.
  • two alternatives may be distinguished in determining the color of the displayed content 6.
  • the control unit 2 solely determines a color of the content 6 displayed in the display area/bounding box 10 based on the determined color of the bicycle 8 (wherein the remaining part of the content 6 outside the display area/bounding box 10 is not considered by the control unit 2).
  • the control unit 2 determines the at least one color of the displayed content 6 in a whole area where the content 6 is displayed in the windshield 7, here the area defined as the field of view 11, based on the determined color of the bicycle 8.
  • the control unit 2 determines the inverted/opposite color of the detected object, here the bicycle 8, and sets the inverted color as the color of the displayed content 6.
  • in a sixth step S6 of the method, the control unit 2 determines a control signal based on the determined at least one color of the displayed content 6 and outputs the determined control signal to the transparent display 5 such that the transparent display 5 displays the content 6 and/or the bicycle 8 with the determined at least one color of the displayed content 6.
  • the method starts again with the first step S1.
  • the method is not limited to the strict order of the steps S1 - S6 shown in figure 3 .
  • some of the steps may be processed simultaneously, e.g., the first and the fifth step S1, S5.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Traffic Control Systems (AREA)
EP21208541.9A 2021-11-16 2021-11-16 Method and device for controlling a transparent display configured to display content in a windshield of a vehicle Pending EP4181109A1 (de)

Priority Applications (1)

Application Number Priority Date Filing Date Title
EP21208541.9A EP4181109A1 (de) 2021-11-16 2021-11-16 Method and device for controlling a transparent display configured to display content in a windshield of a vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP21208541.9A EP4181109A1 (de) 2021-11-16 2021-11-16 Method and device for controlling a transparent display configured to display content in a windshield of a vehicle

Publications (1)

Publication Number Publication Date
EP4181109A1 (de) 2023-05-17

Family

ID=78676364

Family Applications (1)

Application Number Title Priority Date Filing Date
EP21208541.9A Pending EP4181109A1 (de) Method and device for controlling a transparent display configured to display content in a windshield of a vehicle

Country Status (1)

Country Link
EP (1) EP4181109A1 (de)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100253594A1 (en) * 2009-04-02 2010-10-07 Gm Global Technology Operations, Inc. Peripheral salient feature enhancement on full-windshield head-up display
US20180198955A1 (en) * 2015-07-10 2018-07-12 Shuichi Tayama Vehicle-use image display system and method
WO2019038201A2 (de) 2017-08-22 2019-02-28 Continental Automotive Gmbh Head-up-display



Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN PUBLISHED

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230523

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20231018

RBV Designated contracting states (corrected)

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR