WO2021054579A1 - 3D head-up display system and control method thereof - Google Patents

3D head-up display system and control method thereof

Info

Publication number
WO2021054579A1
WO2021054579A1 (PCT/KR2020/007919)
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
image
user
display
Prior art date
Application number
PCT/KR2020/007919
Other languages
English (en)
Korean (ko)
Inventor
김필재
Original Assignee
김필재
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 김필재 filed Critical 김필재
Publication of WO2021054579A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/31Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles providing stereoscopic vision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/24Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view in front of the vehicle
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/02Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to ambient conditions
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W40/00Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
    • B60W40/08Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146Display means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/403Image sensing, e.g. optical camera
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2420/00Indexing codes relating to the type of sensors based on the principle of their operation
    • B60W2420/40Photo, light or radio wave sensitive means, e.g. infrared sensors
    • B60W2420/408Radar; Laser, e.g. lidar
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2556/00Input parameters relating to data
    • B60W2556/45External transmission of data to or from the vehicle
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems

Definitions

  • the present invention relates to a 3D head-up display system and a control method thereof, and more particularly to a 3D head-up display system that provides the user with an image located at a desired distance by using a 3D display capable of providing different information to each of the user's eyes, and a control method thereof.
  • a head-up display (or windshield display) is a device that displays information such as the vehicle speed or heading direction on the front glass of a vehicle.
  • however, the conventional head-up display exhibits a discrepancy between the head-up display image and an actual external object due to the limited distance between the image and the user's eyes; because of this difference in distance between the image and the actual external object, a sense of disparity is felt during use.
  • in addition, to create a difference in optical path length, the head-up display must form the image at a distance by means of a projection unit and a plurality of reflectors, and the farther away the image is formed, the larger the module inevitably becomes.
  • the image formed in this way can be located at only a single position, so there is a limit to the images that can be expressed.
  • an object of the present invention is to provide a 3D head-up display system, and a control method thereof, that provide the user with an image located at a desired distance by using a 3D display capable of providing different information to each of the user's eyes.
  • another object of the present invention is to provide a 3D head-up display system, and a control method thereof, that generate a 3D image using information obtained through a communication unit, a sensor unit, a photographing unit, etc., and reflect the 3D image displayed on a 3D display through a reflector to display a virtual 3D HUD image on the windshield.
  • a 3D head-up display system includes: a sensor unit that measures information on an object located in front of the vehicle (the vehicle's direction of travel) and collects information related to the vehicle; a photographing unit that acquires image information including the object; a communication unit that receives, from a server, surrounding-situation information according to the location of the vehicle and additional information related to an object in the image information; a gaze tracking unit that detects the gaze direction and gaze position of a user in the vehicle; a control unit that generates 3D content to be provided to the user based on at least one of the measured information on the object located in front of the vehicle, the collected information related to the vehicle, the acquired image information, the received surrounding-situation information according to the location of the vehicle, the received additional information related to the object in the image information, the detected gaze direction and gaze position of the user, and destination information according to a user input; a 3D display that outputs the 3D content generated by the control unit; a reflector that reflects the 3D content output by the 3D display; and a windshield that displays the reflected image as a virtual image.
  • the 3D content includes at least one of route guidance information according to the destination information, additional information on an object located in front of the vehicle, and detailed information related to the vehicle; the additional information includes at least one of the distance between the vehicle and the object, the speed of the object, and further object-related information; and the detailed information may include surrounding-situation information according to the location of the vehicle and the collected information related to the vehicle.
  • when the 3D content includes additional information on an object located in front of the vehicle and detailed information related to the vehicle, the additional information included in the reflected image may be displayed on one side of the windshield, adjacent to the actual object located in front of the vehicle, and the detailed information included in the reflected image may be displayed on the other side of the windshield.
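As an illustration only (the patent discloses no source code), the fusion performed by the control unit can be sketched in Python; every class and function name below is a hypothetical placeholder, and the depth values are assumed, not taken from the patent:

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SensorReading:          # sensor unit: distance/speed of one front object
    object_id: str
    distance_m: float
    relative_speed_mps: float

@dataclass
class HudElement:             # one item composed into the 3D content
    kind: str                 # "route", "object_info", or "vehicle_detail"
    text: str
    depth_m: float            # apparent distance of the virtual image

def generate_3d_content(readings: List[SensorReading],
                        route_hint: Optional[str],
                        vehicle_details: List[str]) -> List[HudElement]:
    """Fuse the collected inputs into HUD elements, placing each object
    annotation at the measured object distance (so it appears adjacent to
    the real object) and vehicle details at a fixed near depth."""
    content: List[HudElement] = []
    if route_hint:
        content.append(HudElement("route", route_hint, depth_m=15.0))
    for r in readings:
        content.append(HudElement("object_info",
                                  f"{r.object_id}: {r.distance_m:.0f} m",
                                  depth_m=r.distance_m))
    for d in vehicle_details:
        content.append(HudElement("vehicle_detail", d, depth_m=2.5))
    return content
```

The per-element depth is what a 3D display can vary, unlike the single fixed image position of the conventional HUD criticized above.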
  • a method of controlling a 3D head-up display system includes: measuring, by a sensor unit, information on an object located in front of the vehicle (the vehicle's direction of travel); collecting, by the sensor unit, information related to the vehicle; acquiring, by a photographing unit, image information including the object; receiving, by a communication unit, surrounding-situation information according to the location of the vehicle and additional information related to an object in the image information from a server; detecting, by a gaze tracking unit, the gaze direction and gaze position of a user in the vehicle; generating, by a control unit, 3D content to be provided to the user based on at least one of the measured information on the object located in front of the vehicle, the collected information related to the vehicle, the acquired image information, the received surrounding-situation information according to the location of the vehicle, the received additional information related to the object in the image information, the detected gaze direction and gaze position of the user, and destination information according to a user input; outputting, by a 3D display, the 3D content generated by the control unit; reflecting, by a reflector, the output 3D content; and displaying, by a windshield, the reflected image as a virtual image.
  • the 3D content may include route guidance information according to the destination information, additional information on an object located in front of the vehicle, and detailed information related to the vehicle.
  • information on the driving direction of the vehicle according to the road guidance information included in the reflected image is displayed on one side of the windshield, based on the actual lane located in front of the vehicle; the additional information included in the reflected image may be displayed on another side of the windshield, adjacent to the actual object located in front of the vehicle; and the detailed information included in the reflected image may be displayed on yet another side of the windshield.
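The claimed method steps can likewise be sketched as one control-loop iteration; every component interface here is a hypothetical placeholder, not an API disclosed in the patent:

```python
# One iteration of the claimed control sequence: measure, collect,
# photograph, receive, track gaze, generate, then output (reflection and
# display onto the windshield follow optically from the output step).
def hud_cycle(sensors, camera, comm, gaze_tracker, controller, display):
    object_info = sensors.measure_front_objects()       # measure front objects
    vehicle_info = sensors.collect_vehicle_info()       # collect vehicle data
    frame = camera.capture()                            # acquire image info
    server_info = comm.receive_surroundings(frame)      # receive server data
    gaze = gaze_tracker.detect()                        # detect gaze
    content = controller.generate(object_info, vehicle_info,
                                  frame, server_info, gaze)
    display.output(content)                             # 3D display output
    return content
```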
  • the present invention provides a more realistic HUD image by using a 3D display capable of providing different information to each of the user's eyes and by presenting the image at a desired distance from the user; as the information is displayed with less sense of disparity, user satisfaction increases.
  • the present invention also generates a 3D image using information obtained through a communication unit, a sensor unit, a photographing unit, etc., and reflects the 3D image displayed on a 3D display through a reflector, thereby displaying a virtual 3D HUD image on the windshield.
  • FIGS. 1 and 2 are block diagrams showing the configuration of a 3D head-up display system according to an embodiment of the present invention.
  • FIG. 3 is a flowchart illustrating a method of controlling a 3D head-up display system according to an embodiment of the present invention.
  • FIG. 4 is a diagram showing an example of image information according to an embodiment of the present invention.
  • FIGS. 5 and 6 are diagrams showing examples of virtual images displayed on a windshield according to an embodiment of the present invention.
  • the terms "first" and "second" used in the present invention may be used to describe elements, but the elements should not be limited by these terms; the terms are used only to distinguish one element from another. For example, without departing from the scope of the present invention, a first element may be referred to as a second element, and similarly, a second element may be referred to as a first element.
  • FIGS. 1 and 2 are block diagrams showing the configuration of a 3D head-up display system 100 according to an embodiment of the present invention.
  • the 3D head-up display system 100 includes a communication unit 110, a sensor unit 120, a photographing unit 130, a gaze tracking unit 140, a storage unit 150, a 3D display 160, a reflector 170, a windshield 180, and a control unit 190. Not all of the components shown in FIGS. 1 and 2 are essential: the 3D head-up display system 100 may be implemented with more components, or with fewer components.
  • the 3D head-up display system (or cradle-type 3D head-up display system) 100 may be arranged (or configured/formed) at a specific position in the vehicle and may be made detachable/attachable.
  • the communication unit 110 communicates with an internal element or at least one external terminal through a wired/wireless communication network.
  • the external terminal may include a server (not shown), another terminal provided in another adjacent vehicle (not shown), and the like.
  • wireless Internet technologies include wireless LAN (WLAN), Digital Living Network Alliance (DLNA), Wireless Broadband (WiBro), World Interoperability for Microwave Access (WiMAX), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), IEEE 802.16, Long Term Evolution (LTE), Long Term Evolution-Advanced (LTE-A), Wireless Mobile Broadband Service (WMBS), etc.
  • the communication unit 110 transmits and receives data according to at least one wireless Internet technology in a range including Internet technologies not listed above.
  • short-range communication technologies include Bluetooth, Radio Frequency Identification (RFID), Infrared Data Association (IrDA), Ultra Wideband (UWB), ZigBee, Near Field Communication (NFC), Ultrasound Communication (USC), Visible Light Communication (VLC), Wi-Fi, and Wi-Fi Direct.
  • wired communication technologies may include power line communication (PLC), USB communication, Ethernet, serial communication, optical/coaxial cable, and the like.
  • the communication unit 110 may mutually transmit information with an arbitrary terminal through a universal serial bus (USB).
  • the communication unit 110 transmits and receives radio signals to and from the base station, the server, the other terminal, and the like according to technical standards or communication methods for mobile communication, for example, Global System for Mobile communication (GSM), Code Division Multiple Access (CDMA), CDMA2000, Enhanced Voice-Data Optimized or Enhanced Voice-Data Only (EV-DO), Wideband CDMA (WCDMA), High Speed Downlink Packet Access (HSDPA), High Speed Uplink Packet Access (HSUPA), Long Term Evolution (LTE), and Long Term Evolution-Advanced (LTE-A).
  • the communication unit 110, under the control of the control unit 190, transmits to the server the location information of the vehicle identified in real time, the image information acquired through the photographing unit 130 (or information on an object in the image information, for example, the license plate number of an object such as a vehicle), and the like.
  • the communication unit 110, under the control of the control unit 190, receives the information transmitted from the server in response: surrounding-situation information according to the location of the vehicle (for example, speed-limit information for the current road, construction information for neighboring roads, etc.) and additional information related to an object in the image information (for example, whether a vehicle is stolen or has unpaid fines).
  • the sensor unit 120 is disposed (or configured/formed) on one side of the vehicle in which the 3D head-up display system 100 is configured.
  • the sensor unit 120 includes various sensors (for example, lidar sensors, ultrasonic sensors, 3D scanners, etc.) for measuring distances and speeds related to objects located in the traveling direction of the vehicle (i.e., in front of the vehicle).
  • the object includes other vehicles, buildings, road signs, traffic signs, etc. located in front of the vehicle.
  • the sensor unit 120 measures (or senses) information on an object including a distance, speed, etc. related to the object located in front of the vehicle.
  • the sensor unit 120 measures information on an object located in a direction opposite to the traveling direction of the vehicle (or the rear of the vehicle).
  • the sensor unit 120 collects information related to the vehicle (for example, driving speed, engine speed (RPM), instantaneous fuel economy, real-time voltage, water temperature, mileage, etc.). In this case, the sensor unit 120 may collect information measured through various other sensors (not shown) provided in the vehicle.
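For illustration, the distance/speed quantities described above could be derived from successive range samples roughly as follows; this is a sketch under the assumption of uniform sampling, not a method disclosed in the patent:

```python
def relative_speed(d_prev_m: float, d_curr_m: float, dt_s: float) -> float:
    """Relative speed of a front object from two successive distance
    measurements; negative means the object is getting closer."""
    if dt_s <= 0:
        raise ValueError("sampling interval must be positive")
    return (d_curr_m - d_prev_m) / dt_s

def time_to_collision(d_curr_m: float, rel_speed_mps: float) -> float:
    """Seconds until the gap closes, or infinity if it is not closing;
    a quantity a HUD could surface as a warning."""
    if rel_speed_mps >= 0:
        return float("inf")
    return d_curr_m / -rel_speed_mps
```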
  • the photographing unit (or camera unit) 130 is disposed (or configured/formed) on one side of the vehicle in which the 3D head-up display system 100 is configured.
  • the photographing unit 130 is configured with one or more image sensors (camera modules or cameras) to photograph areas including the front, rear, and sides of the vehicle.
  • the photographing unit 130 may be configured as a stereo camera capable of obtaining image information in all directions (360 degrees).
  • the photographing unit 130 acquires (or photographs) image information including one or more objects located in front of the vehicle.
  • the photographing unit 130 processes image frames, such as still images or video, obtained by the image sensor (camera module or camera) in a video call mode, photographing mode, video conference mode, and the like. That is, the image data obtained by the image sensor are encoded/decoded according to a codec (CODEC) so as to meet the relevant standard. For example, the photographing unit 130 photographs an object (or subject) and outputs a video signal corresponding to the photographed image.
  • the photographing unit 130 acquires other image information including one or more other objects located at the rear of the vehicle.
  • the image frames (or image information) processed by the photographing unit 130 may be stored in a digital video recorder (DVR), stored in the storage unit 150, or transmitted to an external server or the like through the communication unit 110.
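A minimal stand-in for the DVR-style frame storage described above; the `FrameRecorder` name and the ring-buffer retention policy are assumptions made for illustration, not details from the patent:

```python
from collections import deque

class FrameRecorder:
    """DVR-like ring buffer: keeps only the most recent `capacity`
    encoded frames, discarding the oldest once full."""
    def __init__(self, capacity: int):
        self.frames = deque(maxlen=capacity)

    def record(self, frame_bytes: bytes) -> None:
        self.frames.append(frame_bytes)

    def latest(self) -> bytes:
        return self.frames[-1]
```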
  • the 3D head-up display system 100 may receive destination information according to a user input through an input unit (not shown).
  • the communication unit 110, the sensor unit 120, the photographing unit 130, and the input unit included in the 3D head-up display system 100 can collect (or measure/detect) information related to the vehicle as well as various information about one or more objects located in front of the vehicle (its driving direction) and/or behind the vehicle (the opposite direction).
  • the gaze tracking unit 140 is disposed (or configured/formed) on one side of a vehicle in which the 3D head-up display system 100 is configured.
  • the gaze tracking unit 140 detects (or tracks) the gaze direction and gaze position (or the direction and location in which the user looks outside the vehicle) of the user (or driver) located in the driver's seat of the vehicle.
  • the gaze tracking unit 140 detects a direction and a position in which the user located in the driver's seat looks at the front of the vehicle or a side mirror (not shown) configured on the side of the vehicle.
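The detected gaze direction can be mapped to a point on the windshield by a standard ray-plane intersection; the following sketch assumes a planar windshield and an arbitrary coordinate frame, neither of which is specified by the patent:

```python
def gaze_on_windshield(eye, direction, plane_point, plane_normal):
    """Intersect the gaze ray (eye + t * direction) with the windshield
    plane. Returns the 3D hit point, or None if the gaze is parallel to
    the plane or points away from it. All vectors are (x, y, z) tuples."""
    dot = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(dot) < 1e-9:          # gaze parallel to the windshield plane
        return None
    diff = [p - e for p, e in zip(plane_point, eye)]
    t = sum(a * n for a, n in zip(diff, plane_normal)) / dot
    if t < 0:                    # windshield is behind the gaze direction
        return None
    return tuple(e + t * d for e, d in zip(eye, direction))
```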
  • the storage unit 150 stores various user interfaces (UI), graphic user interfaces (GUI), and the like.
  • the storage unit 150 stores data and programs necessary for the 3D head-up display system 100 to operate.
  • the storage unit 150 may store a plurality of application programs (or applications) driven by the 3D head-up display system 100, together with data and commands for its operation. At least some of these application programs may be downloaded from an external server through wireless communication, and at least some may be present on the 3D head-up display system 100 from the time of shipment to provide its basic functions. An application program is stored in the storage unit 150, installed on the 3D head-up display system 100, and may be driven by the control unit 190 to perform an operation (or function) of the 3D head-up display system 100.
  • the storage unit 150 may include at least one storage medium among a flash memory type, hard disk type, multimedia card micro type, card-type memory (for example, SD or XD memory), magnetic memory, magnetic disk, optical disk, random access memory (RAM), static random access memory (SRAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), and programmable read-only memory (PROM).
  • the 3D head-up display system 100 may operate a web storage that performs a storage function of the storage unit 150 on the Internet, or may operate in connection with the web storage.
  • the storage unit 150 stores, under the control of the control unit 190, the location information of the vehicle transmitted to the server, the image information acquired through the photographing unit 130 (or information on an object in the image information, e.g., the license plate number of an object such as a vehicle), and the like.
  • the storage unit 150 also stores, under the control of the control unit 190, the information received from the server in response: surrounding-situation information according to the location of the vehicle (e.g., speed-limit information for the current road, construction information for neighboring roads, etc.) and additional information related to an object in the image information (e.g., whether a vehicle is stolen or has unpaid fines).
  • the 3D display (or 3D monitor) 160 is disposed (or configured/formed) on one side of the vehicle in which the 3D head-up display system 100 is configured.
  • the 3D display 160 may display various contents such as various menu screens using a user interface and/or a graphic user interface stored in the storage unit 150 under the control of the controller 190.
  • the content displayed on the 3D display 160 includes various text or image data (including various information data) and a menu screen including data such as icons, list menus, and combo boxes.
  • the 3D display 160 may be a touch screen.
  • the 3D display 160 may include at least one of a liquid crystal display (LCD), a thin film transistor liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, an e-ink display, and a light emitting diode (LED) display.
  • the 3D display 160 outputs (or displays) 3D content generated based on the collected various pieces of information under the control of the controller 190.
  • the 3D content includes road guidance information (or route guidance information) according to the destination information, additional information on an object located in front of the vehicle (or an object included in the image information), detailed information related to the vehicle, and the like.
  • the route guidance information, the additional information, the detailed information, etc. may be placed in the 3D content so as to be displayed adjacent to the actual object located in front of the vehicle, taking into account the position of the object in the image information, or at a specific location determined in consideration of the user's viewing angle or the focal distance between the user and the object.
  • the additional information includes the distance between the vehicle and the object, the speed of the object, and further information on the object (for example, whether the vehicle is stolen and whether fines are unpaid).
  • the detailed information includes various information related to the surrounding situation according to the location of the vehicle (for example, speed-limit information for the current road, construction information for surrounding roads, etc.) and information related to the vehicle such as driving speed, engine speed, instantaneous fuel economy, real-time voltage, water temperature, mileage, an alarm switch, an overspeed alarm, and a water-temperature alarm.
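To display additional information adjacent to the real object, the object's measured position can be projected into HUD coordinates; this pinhole-projection sketch uses illustrative camera intrinsics (focal length and principal point), which are not values from the patent:

```python
def project_to_hud(obj_xyz, focal_px=800.0, cx=640.0, cy=360.0):
    """Pinhole-style projection of an object position in the camera frame
    (x right, y down, z forward, in metres) to HUD pixel coordinates, so
    that a label can be drawn adjacent to the real object."""
    x, y, z = obj_xyz
    if z <= 0:
        raise ValueError("object must be in front of the vehicle")
    return (cx + focal_px * x / z, cy + focal_px * y / z)
```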
  • the 3D head-up display system 100 may further include an audio output unit (not shown) that outputs audio information included in a signal processed in a predetermined manner by the control unit 190.
  • the audio output unit may include a receiver, a speaker, a buzzer, and the like.
  • the audio output unit outputs a guide voice generated by the control unit 190.
  • the audio output unit outputs audio information (or sound effect) included in 3D content output through the 3D display 160 under the control of the controller 190.
  • the reflector (or mirror unit/half mirror) 170 is disposed (or configured/formed) on one side of the vehicle in which the 3D head-up display system 100 is configured.
  • the reflector 170 reflects the 3D content output from the 3D display 160 to the windshield 180.
  • the windshield 180 is disposed (or configured/formed) on one side of the windshield of the vehicle in which the 3D head-up display system 100 is configured.
  • the windshield 180 may have a structure in which a reflective film (not shown) is formed between an outer plate glass (not shown) and an inner plate glass (not shown), or a transparent or translucent structure in which the reflective film is formed.
  • the windshield 180 displays an image (or a virtual image/virtual image) in which 3D content output (or displayed) by the 3D display 160 is reflected by the reflector 170.
  • the reflected image is in a state in which a focal length of the reflected image is adjusted in an actual space, and the reflected image may be displayed at an optimum focal position in the actual space.
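The "focal length adjusted in actual space" behavior can be illustrated with the thin-mirror approximation: a display placed inside the focal length of the reflective combiner produces a magnified virtual image at a larger distance behind it. The patent gives no optical parameters, so the numbers below are purely illustrative assumptions.

```python
def virtual_image_distance(f_mm: float, d_obj_mm: float) -> float:
    """Thin-mirror relation 1/f = 1/d_obj + 1/d_img, solved for d_img.
    A negative result means a virtual image behind the combiner,
    which is the case a HUD relies on (d_obj < f)."""
    if d_obj_mm == f_mm:
        raise ValueError("object at focal point: image at infinity")
    return (f_mm * d_obj_mm) / (d_obj_mm - f_mm)

# Display 200 mm from a combiner with a 250 mm focal length:
d_img = virtual_image_distance(250.0, 200.0)  # -1000.0 -> virtual image ~1 m away
```

Moving the display slightly relative to the focal point therefore moves the virtual image over a wide range of apparent distances, which matches the idea of placing the reflected image at an "optimum focal position" in real space.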
  • on one side of the windshield 180, additional information included in the reflected image is displayed adjacent to the corresponding real object located in front of the vehicle, and on another side of the windshield 180, detailed information related to the vehicle included in the reflected image is displayed.
  • that is, on one side of the windshield 180, based on the actual lane located in front of the vehicle, information on the driving direction of the vehicle according to the road guidance information included in the reflected image (for example, straight ahead, left turn, right turn, lane usage information, and left-turn/right-turn/U-turn information several meters ahead) is displayed; on another side, adjacent to the real object located in front of the vehicle, additional information related to that object included in the reflected image is displayed; and on yet another side of the windshield 180, detailed information related to the vehicle included in the reflected image is displayed.
  • the windshield 180 can provide different images of the 3D content generated by the control unit 190 to a single user (for example, a user located in the driver's seat or the passenger seat) or to each of a plurality of users (for example, users located in the driver's seat, the passenger seat, etc.).
  • the reflected image displayed on the windshield 180 provides an image corresponding to information of an actual object.
  • for example, by placing the name tag of an object located 30 m away from the vehicle equipped with the 3D head-up display system 100 (or additional information related to that object) at a position of 30 m in the virtual image, the user perceives the actual scene and the virtual image together as a single real image; alternatively, the actual object 30 m away and the path to it can be expressed in the virtual image across the range of 0 to 30 m.
  • the 3D head-up display system 100 tracks the direction and position of the user's eyes (or gaze) and renders an appropriate virtual image through the 3D display 160, so that the driver perceives the image as lying at the intended distance.
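Lining up a rendered label with a real object from the tracked eye position reduces, in the simplest case, to intersecting the eye-to-object ray with the virtual-image plane. The sketch below is a minimal geometric illustration under assumed coordinates (metres, z pointing forward from the eye); it is not the patent's algorithm, and per-eye rendering would repeat it once for each eye position.

```python
def project_to_plane(eye, target, plane_z):
    """Intersect the ray from 'eye' through 'target' with the plane
    z = plane_z, returning the (x, y) point where the overlay must be
    drawn so it lines up with the real object from that eye position."""
    ex, ey, ez = eye
    tx, ty, tz = target
    t = (plane_z - ez) / (tz - ez)  # parametric distance along the ray
    return (ex + t * (tx - ex), ey + t * (ty - ey))

# Eye at the origin, object 30 m ahead and 1 m to the right,
# virtual-image plane assumed 2 m ahead:
x, y = project_to_plane((0.0, 0.0, 0.0), (1.0, 0.0, 30.0), 2.0)
```

Because the result depends on the eye position, re-running this projection as the tracked eyes move is what keeps the overlay "attached" to the distant object.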
  • the 3D head-up display system 100 may further include a GPS receiver (not shown) that calculates (or acquires) GPS location information of the 3D head-up display system 100.
  • the GPS receiver receives GPS signals transmitted from satellites and generates location data (or location information) of the 3D head-up display system 100 based on the longitude and latitude coordinates included in the received signals.
  • in addition to the signal received through the GPS receiver, precise location information may be provided to the 3D head-up display system 100 over wireless communication technologies standardized by the Institute of Electrical and Electronics Engineers (IEEE): 802.11, the standard for wireless LAN and some infrared communication; Bluetooth, UWB, and ZigBee for wireless PAN (personal area network); wireless MAN (metropolitan area network) technologies including fixed wireless access (FWA) and broadband wireless access (BWA); and 802.20, the standard for mobile Internet, for wireless MAN (Mobile Broadband Wireless Access: MBWA) including WiBro, WiMAX, etc.
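One routine use of the lat/long fixes the GPS receiver produces is computing the remaining distance to a maneuver point for the route guidance described later. The standard haversine formula serves as a sketch; the coordinates below are rough approximations of the Yeoksam Station and Sinsa Station intersections named in the example, not values from the patent.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/long fixes."""
    r = 6_371_000.0  # mean Earth radius, metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Approximate coordinates (assumed for illustration):
d = haversine_m(37.5006, 127.0364, 37.5160, 127.0201)  # a couple of km
```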
  • the GPS receiver, under the control of the control unit 190, checks the location information of the 3D head-up display system 100 (or, when the 3D head-up display system 100 is mounted on a vehicle, the location information of that vehicle).
  • the controller (or microcontroller unit (MCU)) 190 is disposed (or configured/formed) on one side of the vehicle in which the 3D head-up display system 100 is configured.
  • the control unit 190 executes the overall control functions of the 3D head-up display system 100 using the programs and data stored in the storage unit 150.
  • the control unit 190 may include RAM, ROM, CPU, GPU, and bus, and RAM, ROM, CPU, GPU, and the like may be connected to each other through a bus.
  • the CPU can access the storage unit 150, perform booting using the O/S stored in the storage unit 150, and perform various operations using the various programs, contents, data, etc. stored in the storage unit 150.
  • the control unit 190 generates 3D content to be provided to the user based on the information on an object located in front of the vehicle measured (or sensed) through the sensor unit 120 (including, for example, distance, speed, etc.), the information related to the vehicle collected through the sensor unit 120, the image information including one or more objects acquired through the photographing unit 130, the various information related to the surrounding situation of the vehicle received through the communication unit 110, the received destination information, and the detected gaze direction and gaze position of the user.
  • the 3D content includes road guidance information (or route guidance information) according to the destination information, additional information on an object located in front of the vehicle (or an object included in the image information), detailed information related to the vehicle, etc.
  • the route guidance information, the additional information, the detailed information, etc. may be set to be displayed adjacent to the actual object located in front of the vehicle, in consideration of the position of the object in the image information, or at a specific location in the corresponding 3D content, in consideration of the user's viewing angle or the focus between the user and the object.
  • the additional information includes the distance between the vehicle and the corresponding object, the speed of the object, and further information (for example, whether the vehicle is stolen or has fines in arrears), etc.
  • the detailed information includes various information related to the surrounding situation according to the location of the vehicle (for example, speed limit information for the corresponding road, construction information for surrounding roads, etc.) and the collected information related to the vehicle (for example, driving speed, engine RPM, instantaneous fuel economy, real-time voltage, water temperature, mileage, alarm switch, overspeed alarm, water temperature alarm, etc.).
  • the generated 3D content displays additional information related to an object at a location corresponding to the actual object, displays detailed information related to the vehicle at a predetermined specific location, and displays information on the driving direction of the vehicle according to the generated road guidance information at a location corresponding to the actual lane.
  • this content may be generated in consideration of the distance between the user's eyes and the HUD image (or virtual image).
  • the controller 190 reflects the gaze direction and gaze position of the moving user in the 3D content.
  • that is, when the user's gaze moves, the various information included in the 3D content (for example, the road guidance information, the additional information, the detailed information, etc.) can be moved in the direction opposite to that movement.
  • for example, when the user's gaze moves about 5 cm to the left, the control unit 190 moves the road guidance information, additional information, and detailed information included in the 3D content about 5 cm in the opposite direction, and outputs the corrected 3D content through the 3D display 160, so that the various information is shifted opposite to the change in the user's gaze compared with the previous 3D content.
  • the corrected 3D content may be displayed through the windshield 180.
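The opposite-direction correction described above can be sketched in a few lines: every overlay item is shifted by the negated change in the tracked eye position. This is an illustrative sketch, not the patent's implementation; the `gain` factor is an assumed tuning parameter.

```python
def correct_for_gaze(items, prev_eye, cur_eye, gain=1.0):
    """Shift each overlay item opposite to the change in the tracked
    eye position: a 5 cm (0.05 m) move of the gaze to the left shifts
    the content about 5 cm to the right. 'items' maps a label to an
    (x, y) position in the virtual-image plane, in metres."""
    dx = cur_eye[0] - prev_eye[0]
    dy = cur_eye[1] - prev_eye[1]
    return {label: (x - gain * dx, y - gain * dy)
            for label, (x, y) in items.items()}

overlay = {"route": (0.0, -0.1), "detail": (0.3, -0.2)}
# Gaze moved 0.05 m to the left -> content moves 0.05 m to the right:
moved = correct_for_gaze(overlay, prev_eye=(0.0, 0.0), cur_eye=(-0.05, 0.0))
```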
  • an image located at a desired distance may be provided to the user by using a 3D display capable of providing different information to each of the user's eyes.
  • a 3D image is generated using information obtained through the communication unit, the sensor unit, the photographing unit, etc., and the image displayed on the 3D display is reflected by the reflector so that a virtual 3D HUD image can be displayed on the windshield.
  • FIG. 3 is a flowchart illustrating a method of controlling a 3D head-up display system according to an embodiment of the present invention.
  • the sensor unit 120 measures (or senses) information (including, for example, distance, speed, etc.) on an object located in the traveling direction of the vehicle (or the front of the vehicle).
  • the object includes other vehicles, buildings, road signs, traffic signs, etc. located in front of the vehicle.
  • the sensor unit 120 may measure information on an object located in a direction opposite to the traveling direction of the vehicle (or the rear of the vehicle).
  • the sensor unit 120 collects information related to the vehicle (for example, driving speed, engine rotational value (RPM), instantaneous fuel economy, real-time voltage, water temperature, mileage, etc.); in this case, the sensor unit 120 may collect information related to the vehicle measured through various other sensors (not shown) provided in the vehicle.
  • the photographing unit 130 acquires (or photographs) image information including one or more objects located in front of the vehicle.
  • the photographing unit 130 may acquire other image information including one or more other objects located at the rear of the vehicle.
  • the communication unit 110 receives, from a server (not shown), information related to the surrounding situation according to the real-time location of the vehicle, information related to an object in the image information acquired through the photographing unit 130, etc.
  • that is, under the control of the control unit 190, the communication unit 110 transmits to the server the vehicle's location information, checked in real time, and the image information acquired through the photographing unit 130 (or information on an object in the image information); in response to the transmission, it receives from the server information related to the surrounding situation according to the location of the vehicle (for example, speed limit information for the road, construction information for neighboring roads, etc.), additional information related to the object in the image information (for example, whether the vehicle is stolen or has fines in arrears), etc.
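The round trip just described (send location plus recognized objects, receive surrounding-situation and per-object information) can be sketched as plain payload assembly and parsing. The field names, plate string, and response shape below are assumptions for illustration; the actual server protocol is not specified in the patent.

```python
def build_request(vehicle_location, object_ids):
    """Assemble the payload the communication unit would send:
    the vehicle's real-time location plus identifiers of objects
    recognised in the image information."""
    return {"location": vehicle_location, "objects": object_ids}

def parse_response(resp):
    """Split the server's answer into the two kinds of information
    the text names: surrounding-situation info and per-object
    additional info."""
    return resp.get("surroundings", {}), resp.get("object_info", {})

req = build_request((37.5006, 127.0364), ["12GA3456"])
surroundings, object_info = parse_response({
    "surroundings": {"speed_limit_kmh": 80, "construction": []},
    "object_info": {"12GA3456": {"fines_in_arrears": False}},
})
```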
  • the 3D head-up display system 100 receives destination information according to a user input through an input unit (not shown).
  • in this way, the communication unit 110, the sensor unit 120, the photographing unit 130, and the input unit included in the 3D head-up display system 100 can collect (or measure/detect) information related to the vehicle and various information about one or more objects located in front of the moving vehicle and/or behind it, in the direction opposite to driving.
  • the sensor unit 120 measures information on a plurality of vehicles, a plurality of buildings, etc. located in the traveling direction of the first vehicle in which the 3D head-up display system 100 provided with the sensor unit 120 is configured.
  • the sensor unit 120 collects information related to the first vehicle (for example, driving speed, engine rotational value, instantaneous fuel economy, real-time voltage, water temperature, mileage, etc.).
  • the photographing unit 130 acquires first image information 400 including the plurality of vehicles, the plurality of buildings, etc. located in the traveling direction of the first vehicle.
  • the communication unit 110 transmits to the server the location information of the first vehicle, checked in real time under the control of the control unit 190, and the first image information acquired through the photographing unit 130; in response to the transmission, it receives from the server first surrounding situation information related to the current surroundings of the first vehicle (for example, first speed limit information for the driving road (e.g., 80 km/h)) and second additional information for a second vehicle included in the first image information (for example, no fines in arrears).
  • the input unit receives first destination information (for example, Sinsa Station intersection) according to the user input (S310).
  • the gaze tracking unit 140 detects (or tracks) the gaze direction and gaze position (or the direction and location in which the user looks outside the vehicle) of the user (or driver) located in the driver's seat in the vehicle.
  • the gaze tracking unit 140 detects a direction and a position in which the user located in the driver's seat looks at the front of the vehicle or a side mirror (not shown) configured on the side of the vehicle.
  • the gaze tracking unit 140 detects a real-time first gaze direction and gaze position of a user located in a driver's seat in the vehicle (S320).
  • the control unit 190 generates 3D content to be provided to the user based on the information on an object located in front of the vehicle measured (or sensed) through the sensor unit 120 (for example, including distance, speed, etc.), the information related to the vehicle collected through the sensor unit 120, the image information including one or more objects acquired through the photographing unit 130, the various information related to the surrounding situation of the vehicle received through the communication unit 110, the received destination information, and the detected gaze direction and gaze position of the user.
  • the 3D content includes road guidance information (or route guidance information) according to the destination information, additional information on an object located in front of the vehicle (or an object included in the image information), detailed information related to the vehicle, etc.
  • the route guidance information, the additional information, the detailed information, etc. may be set to be displayed adjacent to the actual object located in front of the vehicle, in consideration of the position of the object in the image information, or at a specific location in the corresponding 3D content, in consideration of the user's viewing angle or the focus between the user and the object.
  • the additional information includes the distance between the vehicle and the corresponding object, the speed of the object, and further information (for example, whether the vehicle is stolen or has fines in arrears), etc.
  • the detailed information includes various information related to the surrounding situation according to the location of the vehicle (for example, speed limit information for the corresponding road, construction information for surrounding roads, etc.) and the collected information related to the vehicle (for example, driving speed, engine RPM, instantaneous fuel economy, real-time voltage, water temperature, mileage, alarm switch, overspeed alarm, water temperature alarm, etc.).
  • the control unit 190 generates first 3D content to be provided to the user based on the information on the plurality of vehicles, plurality of buildings, etc. located in front of the vehicle measured through the sensor unit 120; the information related to the first vehicle collected through the sensor unit 120 (for example, driving speed, engine RPM, instantaneous fuel economy, real-time voltage, water temperature, mileage, etc.); the first image information, acquired through the photographing unit 130, including the plurality of vehicles, plurality of buildings, etc. located in the traveling direction of the first vehicle; the first surrounding situation information related to the current surroundings of the first vehicle; the second additional information; and the information on the user's first gaze direction and gaze position.
  • the first 3D content includes second additional information on the second vehicle (for example, a first distance between the first vehicle and the second vehicle (e.g., 50 m), a second speed of the second vehicle (e.g., 80 km/h), no fines in arrears, etc.) and first detailed information about the first vehicle (for example, first speed limit information for the driving road (e.g., 80 km/h), the driving speed of the first vehicle (e.g., 70 km/h), the engine rotational value (e.g., 2,000 RPM), etc.).
  • the control unit 190 generates first route guidance information from the real-time current location of the first vehicle (for example, Yeoksam Station intersection) to the received first destination information (for example, Sinsa Station intersection).
  • the control unit 190 generates second 3D content to be provided to the user based on the information on the plurality of vehicles, plurality of buildings, etc. located in front of the vehicle measured through the sensor unit 120; the information related to the first vehicle collected through the sensor unit 120 (for example, driving speed, engine RPM, instantaneous fuel economy, real-time voltage, water temperature, mileage, etc.); the first image information, acquired through the photographing unit 130, including the plurality of vehicles, plurality of buildings, etc. located in the traveling direction of the first vehicle; the first surrounding situation information (for example, first speed limit information for the driving road (e.g., 80 km/h)); the second additional information on the second vehicle included in the first image information (for example, no fines in arrears); the generated first route guidance information; and the information on the user's first gaze direction and gaze position sensed through the gaze tracking unit 140.
  • the second 3D content includes the second additional information on the second vehicle (for example, a first distance between the first vehicle and the second vehicle (e.g., 50 m), a second speed of the second vehicle (e.g., 80 km/h), no fines in arrears, etc.), second detailed information about the first vehicle (for example, first speed limit information for the driving road (e.g., 80 km/h), the driving speed of the first vehicle (e.g., 70 km/h), the engine rotational value (e.g., 2,000 RPM), etc.), and information on the driving direction of the vehicle according to the first route guidance information (for example, turn right 659 m ahead) (S330).
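Assembling the second 3D content from the example's values can be sketched as a simple composition step. The dictionary keys are illustrative assumptions; the numeric values (50 m, 80 km/h, 70 km/h, 2,000 RPM, 659 m) come from the worked example above.

```python
def build_second_content(distance_m, obj_speed_kmh, fines_ok,
                         speed_limit_kmh, own_speed_kmh, rpm,
                         maneuver, maneuver_dist_m):
    """Compose the three parts of the second 3D content: per-object
    additional info, vehicle detail info, and route guidance."""
    return {
        "additional": {"distance_m": distance_m,
                       "speed_kmh": obj_speed_kmh,
                       "fines_in_arrears": not fines_ok},
        "detail": {"speed_limit_kmh": speed_limit_kmh,
                   "speed_kmh": own_speed_kmh,
                   "engine_rpm": rpm},
        "guidance": {"maneuver": maneuver,
                     "distance_m": maneuver_dist_m},
    }

content = build_second_content(50, 80, True, 80, 70, 2000,
                               "right_turn", 659)
```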
  • the 3D display 160 outputs (or displays) the 3D content generated by the control unit 190.
  • the 3D display 160 outputs the generated 3D content under the control of the controller 190.
  • the 3D display 160 configured (or disposed) at a specific location in the vehicle outputs the generated first 3D content under the control of the controller 190.
  • the 3D display 160 outputs the generated second 3D content under the control of the controller 190 (S340).
  • the windshield 180 configured at the front of the vehicle displays an image (or virtual image) in which the 3D content output (or displayed) by the 3D display 160 is reflected by the reflector 170.
  • the reflected image is in a state in which a focal length of the reflected image is adjusted in an actual space, and the reflected image may be displayed at an optimum focal position in the actual space.
  • on one side of the windshield 180, additional information included in the reflected image is displayed adjacent to the corresponding real object located in front of the vehicle, and on another side of the windshield 180, detailed information related to the vehicle included in the reflected image is displayed.
  • that is, on one side of the windshield 180, based on the actual lane located in front of the vehicle, information on the driving direction of the vehicle according to the road guidance information included in the reflected image (for example, straight ahead, left turn, right turn, lane usage information, and left-turn/right-turn/U-turn information several meters ahead) is displayed; on another side, adjacent to the real object located in front of the vehicle, additional information related to that object included in the reflected image is displayed; and on yet another side of the windshield 180, detailed information related to the vehicle included in the reflected image is displayed.
  • the windshield 180 displays a first virtual image 500 in which the first 3D content output from the 3D display 160 is reflected by the reflector 170.
  • in this case, the second additional information 510 on the second vehicle included in the first 3D content is displayed adjacent to the second vehicle 520, and the first detailed information 530 about the first vehicle is displayed at a preset first specific position.
  • likewise, the windshield 180 displays a second virtual image 600 in which the second 3D content output from the 3D display 160 is reflected by the reflector 170.
  • in this case, the second additional information 610 on the second vehicle included in the second 3D content is displayed adjacent to the second vehicle 620, the second detailed information 630 about the first vehicle is displayed at a preset second specific location, and the information 640 on the vehicle's traveling direction according to the first route guidance information (for example, turn right 650 m ahead) is displayed at a preset third specific location (S350).
  • As described above, an embodiment of the present invention provides a more realistic HUD image by presenting an image located at a desired distance to the user, using a 3D display capable of providing different information to each of the user's eyes.
  • Also as described above, an embodiment of the present invention generates a 3D image using information obtained through the communication unit, the sensor unit, the photographing unit, etc., and reflects the image displayed on the 3D display via the reflector to display a virtual 3D HUD image on the windshield; the entire system can be configured in a compact size to reduce cost, and because the distance between the display and the eye is shorter, even a smaller 3D display provides a wider usage environment and can express information about objects at multiple distances.

Landscapes

  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Transportation (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mathematical Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Combustion & Propulsion (AREA)
  • Chemical & Material Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Instrument Panels (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The present invention relates to a 3D head-up display system and a control method therefor. The present invention provides the user with an image positioned at a desired distance by using a 3D display capable of providing different information to each of the user's eyes, thereby giving the user a more realistic head-up display image; and when information is displayed with less disparity, satisfaction during use can be increased.
PCT/KR2020/007919 2019-09-17 2020-06-18 Système d'affichage tête haute en 3d et son procédé de commande WO2021054579A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2019-0114220 2019-09-17
KR1020190114220A KR20210042431A (ko) 2019-09-17 2019-09-17 3d 헤드업디스플레이 시스템 및 그의 제어 방법

Publications (1)

Publication Number Publication Date
WO2021054579A1 true WO2021054579A1 (fr) 2021-03-25

Family

ID=74883833

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/007919 WO2021054579A1 (fr) 2019-09-17 2020-06-18 Système d'affichage tête haute en 3d et son procédé de commande

Country Status (2)

Country Link
KR (2) KR20210042431A (fr)
WO (1) WO2021054579A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102628667B1 (ko) * 2021-09-23 2024-01-24 그리다텍 주식회사 태양계 공전 시스템을 모사한 vr 인터페이스 시스템
KR102413715B1 (ko) * 2022-04-05 2022-06-27 최귀철 위성 측위 데이터에 기반하는 차량 속도를 헤드업 디스플레이를 통하여 표시하는 방법 및 그 전자장치

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010134639A (ja) * 2008-12-03 2010-06-17 Honda Motor Co Ltd 視覚支援装置
WO2017022051A1 (fr) * 2015-08-03 2017-02-09 三菱電機株式会社 Dispositif de commande d'affichage, dispositif d'affichage et procédé de commande d'affichage
US20180040163A1 (en) * 2016-08-05 2018-02-08 Uber Technologies, Inc. Virtual reality experience for a vehicle
KR20190011944A (ko) * 2017-07-26 2019-02-08 주식회사 크레모텍 차량용 삼차원 헤드업 디스플레이 장치 및 그 형성 방법
KR20190031214A (ko) * 2019-03-07 2019-03-25 삼성전자주식회사 정보 제공 방법 및 이를 위한 정보 제공 차량

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101610275B1 (ko) 2014-11-05 2016-04-07 주식회사 디젠 가변초점 허드 시스템

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010134639A (ja) * 2008-12-03 2010-06-17 Honda Motor Co Ltd 視覚支援装置
WO2017022051A1 (fr) * 2015-08-03 2017-02-09 三菱電機株式会社 Dispositif de commande d'affichage, dispositif d'affichage et procédé de commande d'affichage
US20180040163A1 (en) * 2016-08-05 2018-02-08 Uber Technologies, Inc. Virtual reality experience for a vehicle
KR20190011944A (ko) * 2017-07-26 2019-02-08 주식회사 크레모텍 차량용 삼차원 헤드업 디스플레이 장치 및 그 형성 방법
KR20190031214A (ko) * 2019-03-07 2019-03-25 삼성전자주식회사 정보 제공 방법 및 이를 위한 정보 제공 차량

Also Published As

Publication number Publication date
KR20210088487A (ko) 2021-07-14
KR20210042431A (ko) 2021-04-20

Similar Documents

Publication Publication Date Title
WO2020060308A1 (fr) Dispositif électronique et procédé de commande de dispositif électronique de véhicule, serveur et procédé pour fournir des données cartographiques précises de serveur
WO2017119737A1 (fr) Procédé et dispositif de partage d'informations d'image dans un système de communications
JP4475595B2 (ja) 情報表示装置、ナビゲーション装置
WO2021054579A1 (fr) Système d'affichage tête haute en 3d et son procédé de commande
WO2018026247A1 (fr) Dispositif d'affichage de véhicule et procédé de commande correspondant
WO2011108836A2 (fr) Serveur, système de navigation, système de navigation pour véhicule et procédé d'obtention d'images d'un système de navigation pour véhicule
JP4085928B2 (ja) 車両用ナビゲーションシステム
WO2013018962A1 (fr) Appareil de reconnaissance de voie de trafic et procédé associé
WO2015105236A1 (fr) Visiocasque et son procédé de commande
US20150281593A1 (en) Adaptive low-light view modes
WO2018143589A1 (fr) Procédé et dispositif d'émission d'informations de voie
WO2019117459A1 (fr) Dispositif et procédé d'affichage de contenu
CN108290521A (zh) 一种影像信息处理方法及增强现实ar设备
WO2015083909A1 (fr) Système de guidage de localisation utilisant une navigation transparente, et son procédé
CN104054033A (zh) 用于监视车辆环境的方法
KR20200068776A (ko) 자율주행 및 커넥티드 자동차용 통신 서비스 제공방법
JP2011227874A (ja) 情報処理装置、システム、空きスペース案内方法及びプログラム
KR20140106289A (ko) 차량간 통신을 이용한 협력 영상기록 장치 및 그 방법
WO2018043821A1 (fr) Système de guidage et d'indication d'itinéraire, utilisant des informations météorologiques, pour aéronef sans pilote, procédé associé, et support d'enregistrement sur lequel est enregistré un programme informatique
WO2021045246A1 (fr) Appareil et procédé de fourniture d'une fonction étendue à un véhicule
WO2015046669A1 (fr) Visiocasque et son procédé de commande
US11671700B2 (en) Operation control device, imaging device, and operation control method
WO2021201304A1 (fr) Procédé et dispositif d'aide à la conduite autonome
TW201333753A (zh) 具有實景導覽功能之透明顯示器
WO2023116377A1 (fr) Procédé, appareil et système d'affichage

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20866373

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20866373

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 09/09/2022)

122 Ep: pct application non-entry in european phase

Ref document number: 20866373

Country of ref document: EP

Kind code of ref document: A1