WO2023238992A1 - AR display device for vehicle and method of operating the same - Google Patents

AR display device for vehicle and method of operating the same

Info

Publication number
WO2023238992A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
information
driving
data
processor
Prior art date
Application number
PCT/KR2022/015979
Other languages
English (en)
Korean (ko)
Inventor
이지은
채지석
이한성
손정훈
홍진혁
김일완
최병준
Original Assignee
엘지전자 주식회사
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Priority to KR1020237018104A (published as KR102609960B1)
Priority to EP23175932.5A (published as EP4290186A1)
Priority to CN202310687467.7A (published as CN117215061A)
Priority to US18/208,550 (published as US20230400321A1)
Publication of WO2023238992A1

Links

Images

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
    • G01C21/34 - Route searching; Route guidance
    • G01C21/36 - Input/output arrangements for on-board computers
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 - Manipulating 3D models or images for computer graphics

Definitions

  • the present invention relates to an AR display device linked to a vehicle and a method of operating the same, and more particularly, to an AR display device capable of displaying, in advance and in AR form on the front of the vehicle, a guide for driving situations, and a method of operating the same.
  • vehicle functions are becoming more diverse. These vehicle functions can be divided into convenience functions to promote driver convenience, and safety functions to promote driver and/or pedestrian safety.
  • Convenience functions of a vehicle are developed with driver convenience as a motive, such as providing infotainment (information + entertainment) functions to the vehicle, supporting partial autonomous driving functions, or helping to secure the driver's field of vision, for example through night vision or blind-spot assistance.
  • ACC adaptive cruise control
  • SPAS smart parking assist system
  • NV night vision
  • HUD head up display
  • AHS adaptive headlight system
  • AR objects separated from the AR graphic interface not only provide guidance on predicted situations, but also provide response guides for safe driving in predicted situations.
  • the possible context may be any one of: detection of a hidden object that does not appear in the image in front of the vehicle and the possibility of collision with it, a tracking path and collision possibility according to the selection of a vehicle to follow ahead of the vehicle, diagnostic prediction based on the state of the vehicle and determination that driving is impossible, congestion on the current driving route and the possibility of a detour, and detection of a charging area when the vehicle enters a charging station.
  • the processor may render a second AR object, separated from the first AR object, to be displayed at a location associated with the occurrence of the estimated context while the first AR object displays the current driving state of the vehicle.
  • the processor may determine whether to include additional information in the second AR object based on the separation distance between the location indicated by the separated second AR object and the current location of the vehicle corresponding to the first AR object, and may vary the display of the separated second AR object based on that decision.
  • the additional information may include at least one of a warning display for a hidden object, a change in driving speed or direction of the vehicle being followed ahead, route guidance displays for vehicle inspection, charging, and available parking areas, vehicle congestion-related information and detour route information, and charging status and fee information.
  • the second AR object may include a plurality of fragments when separated, and through the plurality of fragments the processor may indicate the location of the object corresponding to the context, starting from the first AR object.
  • the separated second AR object may be varied to display a plurality of trajectories to be followed.
  • in response to the vehicle entering a charging station, the processor may connect to a server provided in the charging station through the communication module, provide status data of the vehicle to the server, receive from the server first route information guiding the vehicle to an available charger based on the status data, separate the second AR object to display the first guide route, and, when the vehicle has finished charging, receive from the server second route information guiding the vehicle to the exit of the charging station and update the rendering so that the separated second AR object displays the second guide route.
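  • As an illustration of the charging-station flow just described, the following is a minimal Python sketch. The class and method names (ChargingGuidance, connect_to_station_server, request_route_to_available_charger, etc.) are assumptions for illustration only; the patent does not define an API.
      # Hypothetical sketch of the charging-station guidance flow described above.
      class ChargingGuidance:
          def __init__(self, comm_module, ar_interface):
              self.comm = comm_module        # communication module 810 (assumed wrapper)
              self.ar = ar_interface         # AR graphic interface (first/second AR objects)
              self.server = None

          def on_enter_charging_station(self, vehicle_status):
              # connect to the charging-station server and provide vehicle status data
              self.server = self.comm.connect_to_station_server()
              self.server.send_vehicle_status(vehicle_status)
              # first route information: path to a charger available for this vehicle
              first_route = self.server.request_route_to_available_charger()
              self.ar.separate_second_object()         # detach the second AR object
              self.ar.render_guide_route(first_route)  # display the first guide route

          def on_charging_finished(self):
              # second route information: path to the charging-station exit
              second_route = self.server.request_route_to_exit()
              self.ar.render_guide_route(second_route)  # update the separated second AR object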
  • According to the AR display device and its operating method, it is possible to provide an augmented reality navigation screen based on a calibrated front image without separate settings, and to provide guidance on the predicted driving situation on the currently displayed navigation screen.
  • By presenting the guide as an AR object together with the current location of the vehicle, more intuitive and realistic AR guidance can be provided to the vehicle.
  • When the vehicle enters a parking lot or charging station, it communicates with the control server of that location and displays route guidance to parking/charging areas, parking/charging-related information, and route guidance for leaving, through a more intuitive AR graphic interface, thereby providing a direct and smart parking/charging-related UX.
  • Figure 2 is a view of a vehicle related to an embodiment of the present invention viewed from various angles.
  • Figures 3 and 4 are diagrams showing the interior of a vehicle related to an embodiment of the present invention.
  • Figures 5 and 6 are diagrams referenced for explaining various objects related to driving of a vehicle related to an embodiment of the present invention.
  • Figure 7 is a block diagram referenced for explaining a vehicle and an AR display device related to an embodiment of the present invention.
  • FIG. 8 is a detailed block diagram related to the processor of the AR display device according to an embodiment of the present invention.
  • FIG. 9 is a diagram referenced for explaining a navigation screen according to an embodiment of the present invention
  • FIG. 10 is a diagram referenced for explaining an operation of generating the navigation screen of FIG. 9 .
  • Figure 11 is a flowchart referenced to explain a method of displaying an AR graphic interface on a navigation screen according to an embodiment of the present invention.
  • FIGS. 12A and 12B are illustrations of an AR graphic interface according to an embodiment of the present invention, and are reference diagrams for explaining separation and combination into first and second AR objects.
  • Figure 13 is a flowchart referenced to explain a method of estimating context based on network data and displaying related guide information through an AR graphic interface according to an embodiment of the present invention.
  • Figures 15 and 16 are conceptual diagrams showing various modified examples of an AR graphic interface for displaying guide information according to follow vehicle settings, according to an embodiment of the present invention.
  • FIGS. 17, 18A, and 18B are flowcharts and conceptual diagrams used to explain a method of displaying responses according to vehicle status diagnosis through an AR graphic interface, according to an embodiment of the present invention.
  • FIGS. 19A and 19B are flowcharts and conceptual diagrams used to explain a method of displaying a congestion situation of a driving route and responses related to determination of detour possibility through an AR graphic interface, according to an embodiment of the present invention.
  • the vehicle described in this specification may include a car and a motorcycle. Below, description of vehicles will focus on automobiles.
  • the vehicle described in this specification may be a concept that includes all internal combustion engine vehicles having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
  • the left side of the vehicle refers to the left side of the vehicle's traveling direction
  • the right side of the vehicle refers to the right side of the vehicle's traveling direction
  • the “system” disclosed in this specification may include at least one of a server device and a cloud device, but is not limited thereto.
  • a system may consist of one or more server devices.
  • a system may consist of one or more cloud devices.
  • the system may be operated with a server device and a cloud device configured together.
  • "Map information" or "map data" disclosed in this specification may include images captured through vision sensors such as cameras, two-dimensional map information, three-dimensional map information, digital twin three-dimensional maps, high-precision maps (HD maps), and maps of real/virtual space, as well as map data and map-related applications.
  • HD maps high-precision maps
  • "Point of Interest (POI) information" refers to points of interest selected based on the map information or map data, and may include pre-registered POI information (POIs stored in the map of a cloud server), user-set POI information (e.g., my home, school, work, etc.), driving-related POI information (e.g., destination, waypoint, gas station, rest area, parking lot, etc.), and top-search POI information (e.g., recently clicked/highly visited POIs, hot places, etc.). This POI information can be updated in real time based on the current location of the vehicle.
  • the vehicle 100 may include wheels rotated by a power source and a steering input device 510 for controlling the moving direction of the vehicle 100.
  • Vehicle 100 may be an autonomous vehicle.
  • the vehicle 100 may be switched to autonomous driving mode or manual mode based on user input.
  • the vehicle 100 can be switched from manual mode to autonomous driving mode, or from autonomous driving mode to manual mode, based on user input received through the user interface device (hereinafter referred to as 'user terminal') 200.
  • 'user terminal' user interface device
  • the vehicle 100 may be switched to autonomous driving mode or manual mode based on driving situation information.
  • Driving situation information may be generated based on object information provided by the object detection device 300.
  • the vehicle 100 may be switched from manual mode to autonomous driving mode, or from autonomous driving mode to manual mode, based on driving situation information generated by the object detection device 300.
  • the vehicle 100 may be switched from manual mode to autonomous driving mode, or from autonomous driving mode to manual mode, based on driving situation information received through the communication device 400.
  • the autonomous vehicle 100 may be operated based on the operation system 700.
  • the autonomous vehicle 100 may be operated based on information, data, or signals generated by the driving system 710, the exit system 740, and the parking system 750.
  • the autonomous vehicle 100 may receive user input for driving through the driving control device 500. Based on user input received through the driving control device 500, the vehicle 100 may be driven.
  • the vehicle 100 may include a user interface device (hereinafter referred to as a 'user terminal') 200, an object detection device 300, a communication device 400, a driving control device 500, a vehicle driving device 600, an operation system 700, a navigation system 770, a sensing unit 120, a vehicle interface unit 130, a memory 140, a control unit 170, and a power supply unit 190.
  • the vehicle 100 may further include other components in addition to the components described in this specification, or may not include some of the components described.
  • the user interface device 200 is a device for communication between the vehicle 100 and the user.
  • the user interface device 200 may receive user input and provide information generated by the vehicle 100 to the user.
  • the vehicle 100 may implement User Interfaces (UI) or User Experience (UX) through a user interface device (hereinafter referred to as a 'user terminal') 200.
  • UI User Interfaces
  • UX User Experience
  • the input unit 210 is used to receive information from the user, and the data collected by the input unit 210 can be analyzed by the processor 270 and processed as a user's control command.
  • the input unit 210 may be placed inside the vehicle.
  • the input unit 210 may be disposed in one area of the steering wheel, one area of the instrument panel, one area of the seat, one area of each pillar, one area of the door, etc.
  • the voice input unit 211 can convert the user's voice input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the control unit 170.
  • the voice input unit 211 may include one or more microphones.
  • the gesture input unit 212 can convert the user's gesture input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the control unit 170.
  • the gesture input unit 212 may include at least one of an infrared sensor and an image sensor for detecting a user's gesture input. Depending on the embodiment, the gesture input unit 212 may detect a user's 3D gesture input. To this end, the gesture input unit 212 may include a light output unit that outputs a plurality of infrared lights or a plurality of image sensors.
  • the touch input unit 213 can convert the user's touch input into an electrical signal.
  • the converted electrical signal may be provided to the processor 270 or the control unit 170.
  • the touch input unit 213 may include a touch sensor for detecting a user's touch input.
  • the touch input unit 213 may be formed integrally with the display unit 251 to implement a touch screen. This touch screen can provide both an input interface and an output interface between the vehicle 100 and the user.
  • the internal camera 220 can acquire images inside the vehicle.
  • the processor 270 may detect the user's state based on the image inside the vehicle.
  • the processor 270 may obtain the user's gaze information from the image inside the vehicle.
  • the processor 270 may detect a user's gesture from an image inside the vehicle.
  • the biometric detection unit 230 can acquire the user's biometric information.
  • the biometric detection unit 230 includes a sensor that can acquire the user's biometric information, and can obtain the user's fingerprint information, heart rate information, etc. using the sensor. Biometric information can be used for user authentication.
  • the output unit 250 is for generating output related to vision, hearing, or tactile sensation.
  • the output unit 250 may include at least one of a display unit 251, an audio output unit 252, and a haptic output unit 253.
  • the display unit 251 can display graphic objects corresponding to various information.
  • the display unit 251 may include at least one of a liquid crystal display (LCD), a thin film transistor-liquid crystal display (TFT LCD), an organic light-emitting diode (OLED) display, a flexible display, a 3D display, and an e-ink display.
  • the display unit 251 and the touch input unit 213 may form a layered structure or be formed as one piece, thereby implementing a touch screen.
  • the display unit 251 may be implemented as a Head Up Display (HUD).
  • HUD Head Up Display
  • the display unit 251 is equipped with a projection module and can output information through an image projected on a windshield or window.
  • the display unit 251 may include a transparent display.
  • the transparent display can be attached to a windshield or window.
  • a transparent display can display a certain screen while having a certain transparency.
  • the transparent display may include at least one of a transparent TFEL (Thin Film Electroluminescent) display, a transparent OLED (Organic Light-Emitting Diode) display, a transparent LCD (Liquid Crystal Display), a transmissive transparent display, and a transparent LED (Light Emitting Diode) display. The transparency of the transparent display can be adjusted.
  • the display unit 251 may be disposed in one area of the steering wheel, one area of the instrument panel (251a, 251b, 251e), one area of the seat (251d), one area of each pillar (251f), one area of the door (251g), an area of the center console, an area of the headlining, or an area of the sun visor, or may be implemented in an area of the windshield (251c) or an area of the window (251h).
  • the audio output unit 252 converts the electrical signal provided from the processor 270 or the control unit 170 into an audio signal and outputs it. To this end, the sound output unit 252 may include one or more speakers.
  • the haptic output unit 253 generates a tactile output.
  • the haptic output unit 253 may operate to vibrate the steering wheel, seat belt, and seats 110FL, 110FR, 110RL, and 110RR so that the user can perceive the output.
  • Lane OB10 may be a driving lane, a lane next to a driving lane, or a lane in which an oncoming vehicle travels. Lane OB10 may be a concept that includes left and right lines forming a lane.
  • Landforms may include mountains, hills, etc.
  • the object detection device 300 may include a camera 310, radar 320, lidar 330, ultrasonic sensor 340, infrared sensor 350, and processor 370.
  • the object detection apparatus 300 may further include other components in addition to the components described, or may not include some of the components described.
  • the camera 310 may be located at an appropriate location outside the vehicle to obtain images of the exterior of the vehicle.
  • the camera 310 may be a mono camera, a stereo camera 310a, an Around View Monitoring (AVM) camera 310b, or a 360-degree camera.
  • AVM Around View Monitoring
  • camera 310 may be placed close to the front windshield, inside the vehicle, to obtain an image of the front of the vehicle.
  • the camera 310 may be placed around the front bumper or radiator grill.
  • the camera 310 may be placed close to the rear windshield in the interior of the vehicle to obtain an image of the rear of the vehicle.
  • the camera 310 may be placed around the rear bumper, trunk, or tailgate.
  • the camera 310 may be placed close to at least one of the side windows inside the vehicle to obtain an image of the side of the vehicle.
  • the camera 310 may be placed around a side mirror, fender, or door.
  • the camera 310 may provide the acquired image to the processor 370.
  • Radar 320 may include an electromagnetic wave transmitting unit and a receiving unit.
  • the radar 320 may be implemented as a pulse radar or continuous wave radar based on the principle of transmitting radio waves.
  • the radar 320 may be implemented in a frequency modulated continuous wave (FMCW) method or a frequency shift keying (FSK) method, depending on the signal waveform, among the continuous wave radar methods.
  • FMCW frequency modulated continuous wave
  • FSK frequency shift keying
  • the radar 320 detects an object using electromagnetic waves based on a Time of Flight (TOF) method or a phase-shift method, and can detect the location of the detected object, the distance to the detected object, and its relative speed.
  • TOF Time of Flight
  • the radar 320 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
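  • For reference, the Time-of-Flight ranging mentioned above reduces to distance = c x (round-trip time) / 2, and relative speed can be approximated from successive range measurements. A minimal numeric sketch (not taken from the patent):
      # TOF ranging sketch: the wave travels to the object and back.
      C = 299_792_458.0  # speed of light, m/s

      def tof_distance(round_trip_time_s: float) -> float:
          return C * round_trip_time_s / 2.0

      def relative_speed(prev_distance_m: float, curr_distance_m: float, dt_s: float) -> float:
          # negative result means the object is getting closer
          return (curr_distance_m - prev_distance_m) / dt_s

      # Example: an echo received 0.5 microseconds after transmission is ~75 m away.
      d1 = tof_distance(0.5e-6)             # about 74.9 m
      d2 = tof_distance(0.49e-6)            # next measurement, 0.1 s later (~73.4 m)
      v_rel = relative_speed(d1, d2, 0.1)   # about -15 m/s, i.e. closing in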
  • LiDAR 330 may include a laser transmitter and a receiver. LiDAR 330 may be implemented in a time of flight (TOF) method or a phase-shift method.
  • TOF time of flight
  • LiDAR 330 may be implemented as a driven or non-driven type.
  • When implemented in a driven manner, the LIDAR 330 is rotated by a motor and can detect objects around the vehicle 100.
  • the LIDAR 330 can detect objects located within a predetermined range based on the vehicle 100 through optical steering.
  • the vehicle 100 may include a plurality of non-driven LIDARs 330.
  • the LIDAR 330 detects an object via laser light based on a time of flight (TOF) method or a phase-shift method, and can detect the location of the detected object, the distance to the detected object, and its relative speed.
  • TOF time of flight
  • Lidar 330 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
  • the ultrasonic sensor 340 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
  • the infrared sensor 350 may include an infrared transmitter and a receiver.
  • the infrared sensor 350 can detect an object based on infrared light, and detect the location of the detected object, the distance to the detected object, and the relative speed.
  • the infrared sensor 350 may be placed at an appropriate location outside the vehicle to detect objects located in front, behind, or on the sides of the vehicle.
  • the processor 370 may control the overall operation of each unit of the object detection device 300.
  • the processor 370 can detect and track an object based on the acquired image.
  • the processor 370 can perform operations such as calculating a distance to an object and calculating a relative speed to an object through an image processing algorithm.
  • the processor 370 can detect and track an object based on reflected electromagnetic waves that are transmitted when the electromagnetic waves are reflected by the object and returned.
  • the processor 370 may perform operations such as calculating a distance to an object and calculating a relative speed to an object, based on electromagnetic waves.
  • the processor 370 may detect and track an object based on reflected laser light that is returned after the transmitted laser is reflected by the object.
  • the processor 370 may perform operations such as calculating the distance to the object and calculating the relative speed to the object, based on the laser light.
  • the processor 370 may detect and track an object based on reflected ultrasonic waves in which the transmitted ultrasonic waves are reflected by the object and returned.
  • the processor 370 may perform operations such as calculating a distance to an object and calculating a relative speed to an object based on ultrasonic waves.
  • the object detection apparatus 300 may include a plurality of processors 370 or may not include the processor 370.
  • the camera 310, radar 320, lidar 330, ultrasonic sensor 340, and infrared sensor 350 may each individually include a processor.
  • the object detection device 300 may be operated under the control of the processor or control unit 170 of the device in the vehicle 100.
  • the communication device 400 is a device for communicating with an external device.
  • the external device may be another vehicle, mobile terminal, or server.
  • the communication device 400 may further include other components in addition to the components described, or may not include some of the components described.
  • the short-range communication unit 410 may form a wireless area network and perform short-range communication between the vehicle 100 and at least one external device.
  • the V2X communication unit 430 is a unit for performing wireless communication with a server (V2I: Vehicle to Infra), another vehicle (V2V: Vehicle to Vehicle), or a pedestrian (V2P: Vehicle to Pedestrian).
  • the V2X communication unit 430 may include an RF circuit capable of implementing communication with infrastructure (V2I), communication between vehicles (V2V), and communication with pedestrians (V2P) protocols.
  • the optical communication unit 440 is a unit for communicating with an external device through light.
  • the optical communication unit 440 may include an optical transmitter that converts an electrical signal into an optical signal and transmits it to the outside, and an optical receiver that converts the received optical signal into an electrical signal.
  • the light emitting unit may be formed to be integrated with the lamp included in the vehicle 100.
  • the broadcast transceiver 450 is a unit for receiving a broadcast signal from an external broadcast management server through a broadcast channel or transmitting a broadcast signal to the broadcast management server.
  • Broadcast channels may include satellite channels and terrestrial channels.
  • Broadcast signals may include TV broadcast signals, radio broadcast signals, and data broadcast signals.
  • the processor 470 may control the overall operation of each unit of the communication device 400.
  • the communication device 400 may include a plurality of processors 470 or may not include the processor 470.
  • the communication device 400 may implement a vehicle display device together with the user interface device 200.
  • the vehicle display device may be called a telematics device or an AVN (Audio Video Navigation) device.
  • the driving control device 500 is a device that receives user input for driving.
  • the vehicle 100 may be operated based on signals provided by the driving control device 500.
  • the driving control device 500 may include a steering input device 510, an acceleration input device 530, and a brake input device 570.
  • the steering input device 510 may receive an input of the direction of travel of the vehicle 100 from the user.
  • the steering input device 510 is preferably formed in a wheel shape to enable steering input by rotation.
  • the steering input device may be formed in the form of a touch screen, touch pad, or button.
  • the acceleration input device 530 may receive an input for acceleration of the vehicle 100 from the user.
  • the brake input device 570 may receive an input for decelerating the vehicle 100 from the user.
  • the acceleration input device 530 and the brake input device 570 are preferably formed in the form of pedals. Depending on the embodiment, the acceleration input device or the brake input device may be formed in the form of a touch screen, touch pad, or button.
  • the driving control device 500 may be operated under the control of the control unit 170.
  • the vehicle driving device 600 is a device that electrically controls the operation of various devices in the vehicle 100.
  • the vehicle driving device 600 may further include other components in addition to the components described, or may not include some of the components described.
  • the vehicle driving device 600 may include a processor. Each unit of the vehicle driving device 600 may individually include a processor.
  • the power train driver 610 can control the operation of the power train device.
  • the power train driving unit 610 may include a power source driving unit 611 and a transmission driving unit 612.
  • the power source driver 611 may control the power source of the vehicle 100.
  • the power source driver 611 may perform electronic control of the engine. Thereby, the output torque of the engine, etc. can be controlled.
  • the power source driving unit 611 can adjust the engine output torque according to the control of the control unit 170.
  • the power source driver 611 may control the motor.
  • the power source driving unit 611 can adjust the rotational speed and torque of the motor according to the control of the control unit 170.
  • the transmission drive unit 612 can control the transmission.
  • the transmission drive unit 612 can adjust the state of the transmission.
  • the transmission drive unit 612 can adjust the state of the transmission to forward (D), reverse (R), neutral (N), or park (P).
  • the transmission drive unit 612 can adjust the gear engagement state in the forward (D) state.
  • the chassis driver 620 can control the operation of the chassis device.
  • the chassis drive unit 620 may include a steering drive unit 621, a brake drive unit 622, and a suspension drive unit 623.
  • the steering drive unit 621 may perform electronic control of the steering apparatus within the vehicle 100.
  • the steering drive unit 621 can change the moving direction of the vehicle.
  • the brake driver 622 may perform electronic control of the brake apparatus within the vehicle 100. For example, the speed of the vehicle 100 can be reduced by controlling the operation of the brakes disposed on the wheels.
  • the brake driver 622 can individually control each of the plurality of brakes.
  • the brake driver 622 can control braking force applied to a plurality of wheels differently.
  • the suspension drive unit 623 may perform electronic control of the suspension apparatus within the vehicle 100. For example, when the road surface is curved, the suspension drive unit 623 may control the suspension device to reduce vibration of the vehicle 100. Meanwhile, the suspension driving unit 623 can individually control each of the plurality of suspensions.
  • the door/window driving unit 630 may perform electronic control of the door apparatus or window apparatus within the vehicle 100.
  • the door/window driving unit 630 may include a door driving unit 631 and a window driving unit 632.
  • the door driver 631 can control the door device.
  • the door driver 631 can control the opening and closing of a plurality of doors included in the vehicle 100.
  • the door driver 631 can control the opening or closing of the trunk or tail gate.
  • the door driver 631 can control the opening or closing of the sunroof.
  • the window driver 632 may perform electronic control of a window apparatus. It is possible to control the opening or closing of a plurality of windows included in the vehicle 100.
  • the safety device driver 640 may include an airbag driver 641, a seat belt driver 642, and a pedestrian protection device driver 643.
  • the seat belt drive unit 642 may perform electronic control of the seat belt apparatus in the vehicle 100. For example, when danger is detected, the seat belt drive unit 642 can control the passenger to be fixed to the seat (110FL, 110FR, 110RL, 110RR) using the seat belt.
  • the lamp driver 650 may perform electronic control of various lamp apparatuses in the vehicle 100.
  • the air conditioning driver 660 may perform electronic control of the air conditioning device (air conditioner) in the vehicle 100. For example, when the temperature inside the vehicle is high, the air conditioning driver 660 can control the air conditioning device to operate so that cold air is supplied into the vehicle interior.
  • the operation system 700 may include a driving system 710, an exit system 740, and a parking system 750.
  • the operation system 700 may further include other components in addition to the components described, or may not include some of the components described.
  • when the operation system 700 is implemented in software, it may be a sub-concept of the control unit 170.
  • the operation system 700 may be a concept that includes at least one of the user interface device 200, the object detection device 300, the communication device 400, the vehicle driving device 600, and the control unit 170.
  • the driving system 710 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100.
  • the driving system 710 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100.
  • the driving system 710 may receive a signal from an external device through the communication device 400 and provide a control signal to the vehicle driving device 600 to drive the vehicle 100.
  • the exit system 740 can perform taking the vehicle 100 out of a parking space.
  • the parking system 750 may receive navigation information from the navigation system 770 and provide a control signal to the vehicle driving device 600 to park the vehicle 100.
  • the parking system 750 may receive object information from the object detection device 300 and provide a control signal to the vehicle driving device 600 to park the vehicle 100.
  • the parking system 750 may park the vehicle 100 by receiving a signal from an external device through the communication device 400 and providing a control signal to the vehicle driving device 600.
  • the navigation system 770 may include memory and a processor.
  • the memory can store navigation information.
  • the processor may control the operation of the navigation system 770.
  • the sensing unit 120 can obtain sensing signals for vehicle posture information, vehicle collision information, vehicle direction information, vehicle location information (GPS information), vehicle angle information, vehicle speed information, vehicle acceleration information, vehicle tilt information, vehicle forward/backward information, battery information, fuel information, tire information, vehicle lamp information, vehicle interior temperature information, vehicle interior humidity information, steering wheel rotation angle, vehicle exterior illumination, pressure applied to the accelerator pedal, pressure applied to the brake pedal, and the like.
  • the sensing unit 120 may further include an accelerator pedal sensor, a pressure sensor, an engine speed sensor, an air flow sensor (AFS), an intake temperature sensor (ATS), a water temperature sensor (WTS), a throttle position sensor (TPS), a TDC sensor, a crank angle sensor (CAS), and the like.
  • the vehicle interface unit 130 may serve as a passageway for various types of external devices connected to the vehicle 100.
  • the vehicle interface unit 130 may have a port that can be connected to a mobile terminal, and can be connected to a mobile terminal through the port. In this case, the vehicle interface unit 130 can exchange data with the mobile terminal.
  • the vehicle interface unit 130 may serve as a conduit for supplying electrical energy to a connected mobile terminal.
  • the vehicle interface unit 130 may provide electrical energy supplied from the power supply unit 190 to the mobile terminal under the control of the control unit 170.
  • the processors and the control unit 170 included in the vehicle 100 may be implemented using at least one of application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, and other electrical units for performing functions.
  • ASICs application specific integrated circuits
  • DSPs digital signal processors
  • DSPDs digital signal processing devices
  • PLDs programmable logic devices
  • FPGAs field programmable gate arrays
  • the AR display device 800 can display, in real time, an AR graphic interface indicating the driving state of the vehicle on the front image of the vehicle (or on the vehicle's windshield) through AR merging, based on navigation information of the vehicle 100 and information and data received from an AR camera.
  • the AR display device 800 may include a communication module 810 for communicating with other devices/systems, servers, and the vehicle, a processor 820 that controls the overall operation of the AR display device 800, and a display 830 for displaying a navigation screen including a front image on which the AR graphic interface is rendered.
  • the 'front image' disclosed in this specification may include not only an image captured through a camera sensor (or smart glasses including such a function), but also an image reflected on an LCD screen through a camera sensor, a real-space image shown on the windshield/dashboard, and/or a digitally twinned 3D image.
  • the 'navigation screen including a forward image (or driving image)' disclosed in this specification means a navigation screen created based on the current location and navigation information, on which a forward image is layered, the forward image being implemented as one of an image captured through the vehicle's camera, an image reflected on an LCD screen, a real-space image shown through the windshield, and/or a digitally twinned 3D image.
  • the navigation screen may be an AR navigation screen to which AR technology is applied.
  • the 'AR graphic interface' disclosed in this specification is a graphic user interface to which augmented reality (AR) technology is applied, and is AR matched to the front image of the vehicle in real time.
  • AR augmented reality
  • the AR graphic interface may be an AR graphic image representing the current driving state of the vehicle. Additionally, the AR graphic interface disclosed in this specification may be an AR graphic image that further represents a guide to the driving situation of the vehicle simultaneously with the current driving state of the vehicle. At this time, a guide to the driving situation of the vehicle is displayed on the image in front of the vehicle at a certain distance and/or a certain time ahead of the driving situation.
  • the AR display device 800 may be implemented as part of the electrical equipment or system of the vehicle 100, or may be implemented as a separate independent device or system.
  • the AR display device 800 may be implemented in the form of a program consisting of instructions operated by a processor such as a user terminal of the vehicle 100.
  • the AR display device 800 may communicate with the vehicle 100, other devices, and/or a server to receive the front image of the vehicle acquired through an AR camera and sensing data acquired through sensors provided in the vehicle (e.g., a gyroscope sensor, an acceleration sensor, a gravity sensor, a geomagnetic sensor, a temperature sensor, etc.).
  • the AR display device 800 may run a preset application, for example, an (AR) navigation application.
  • based on map data (e.g., information such as route and POI), sensing data, and the front image acquired by the camera, the AR display device 800 can render the AR object separated from the AR graphic interface so that it displays a guide to the driving situation of the vehicle, and can provide the result in real time to the AR GUI surface and the AR camera surface of the navigation application.
  • the separated AR object may be named 'second AR object', and the remaining part of the AR graphic interface after the second AR object is separated may be named 'first AR object'.
  • the AR graphic interface can be said to include a first AR object that represents the current driving state of the vehicle and a second AR object that displays a guide to the driving situation of the vehicle.
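  • One possible way to model this structure in code is sketched below, assuming simple illustrative types (ARObject, ARGraphicInterface) that are not defined by the patent: a first AR object that follows the ego vehicle and a second AR object that can be detached to mark the location of the estimated context and reattached when the context ends.
      # Illustrative data model for the AR graphic interface (not the patent's API).
      from dataclasses import dataclass, field

      @dataclass
      class ARObject:
          position: tuple = (0.0, 0.0)   # position on the front image / navigation screen
          heading_deg: float = 0.0
          label: str = ""

      @dataclass
      class ARGraphicInterface:
          first: ARObject = field(default_factory=ARObject)    # current driving state
          second: ARObject = field(default_factory=ARObject)   # guide for predicted context
          separated: bool = False

          def separate(self, guide_position, guide_label=""):
              # detach the second AR object and move it to the context location
              self.separated = True
              self.second.position = guide_position
              self.second.label = guide_label

          def combine(self):
              # re-attach the second AR object to the first when the context ends
              self.separated = False
              self.second.position = self.first.position
              self.second.label = ""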
  • FIG. 8 is a detailed block diagram related to the processor 820 of the AR display device 800 related to the embodiment of the present invention described above.
  • the conceptual diagram shown in FIG. 8 may include configurations related to operations performed by the processor 820 of the AR display device 800 and information, data, and programs used therefor.
  • the block diagram shown in FIG. 8 may be used to mean a service provided through the processor 820 and/or a system executed/implemented by the processor 820.
  • hereinafter, for convenience of explanation, it will be referred to as the processor 820.
  • FIG. 9 is a diagram referenced for explaining a navigation screen according to an embodiment of the present invention
  • FIG. 10 is a diagram referenced for explaining an operation of generating the navigation screen of FIG. 9 .
  • the navigation engine 910 may include a navigation controller 911.
  • the navigation controller 911 may receive map matching data, map display data, and route guidance data.
  • the navigation controller 911 may provide route guidance data and map display frames to the navigation application 930.
  • the AR engine 920 may include an adapter 921 and a renderer 922.
  • the adapter 921 may receive front image data acquired from a camera (e.g., an AR camera) and/or sensing data obtained from sensors of the vehicle, such as a gyroscope, an acceleration sensor, a gravity sensor, a magnetometer, and/or a temperature sensor (thermometer).
  • the AR engine 920 may receive sensing data obtained from ADAS sensors (e.g., Camera, Radar, Lidar, Ultrasonic, Sonar). For example, through ADAS sensors, driving-related sensing data such as driving direction, speed, and distance from lanes can be obtained as sensing data.
  • the AR engine 920 can receive high-precision map data and programs related thereto.
  • the high-precision map is a map for providing detailed road and surrounding terrain information to autonomous vehicles in advance. It has an accuracy within an error range of about 10 cm and contains, in 3D digital form, lane-level information such as road center lines and boundary lines, as well as information on traffic lights, signs, curbs, road markings, and various structures.
  • the TCU (Transmission Control Unit) of the sensors and map block 940 is a communication control device mounted on the vehicle, and supports, for example, V2X (vehicle to everything), a communication technology for communicating with various elements on the road for autonomous vehicles (e.g., situational data collectable through V2V and V2I), as well as communication with ITS (Intelligent Transport Systems) or C-ITS (Cooperative Intelligent Transport Systems), which are cooperative intelligent transport system technologies.
  • V2X vehicle to everything
  • ITS Intelligent Transport Systems
  • C-ITS Cooperative Intelligent Transport Systems
  • the navigation application 930 can create an AR navigation screen.
  • the AR engine 920 may register a callback function to receive front image data from the camera server 1001.
  • the camera server 1001 may be understood as a concept included in the memory of the AR display device 800, for example.
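  • The callback registration mentioned above can be pictured as follows; CameraServer and AREngine here are simplified stand-ins for the components of FIG. 10, and their method names are assumptions:
      # Sketch of registering a callback to receive front image data from the camera server.
      class CameraServer:
          def __init__(self):
              self._callbacks = []

          def register_frame_callback(self, callback):
              self._callbacks.append(callback)

          def publish_frame(self, frame):
              for cb in self._callbacks:
                  cb(frame)               # deliver each new front-image frame

      class AREngine:
          def __init__(self, camera_server):
              self.latest_frame = None
              camera_server.register_frame_callback(self.on_front_image)

          def on_front_image(self, frame):
              self.latest_frame = frame   # keep the frame for AR merging/rendering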
  • Each process in FIG. 11 may be performed by a processor (or AR engine) unless otherwise specified.
  • unless otherwise specified, operations of the navigation engine 910, the AR engine 920, and the navigation application 930 described above with reference to FIGS. 8 to 10 are included in or performed as part of the process of FIG. 11 by the processor 820, and at least some of them may be performed before or after the process of FIG. 11.
  • the method begins with running a preset application (S10).
  • the processor can merge the AR graphic interface generated in real time with the front image of the vehicle in real time.
  • the preset condition may include a case where a change in the driving situation of the vehicle is predicted from the current driving state based on the vehicle's sensing data.
  • the preset condition may include a case in which a change in the driving situation of the vehicle, or a situation requiring guidance, is detected as predicted from the current driving state based on at least one of ADAS sensing data, high-precision map data, and TCU communication data such as V2X, ITS, and C-ITS.
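  • A hypothetical shape of this preset-condition check is sketched below; the data fields and thresholds are illustrative assumptions, not values from the patent:
      # Hypothetical check of the preset condition for separating the AR graphic interface.
      def separation_needed(adas: dict, hd_map: dict, v2x: dict) -> bool:
          # True when a change in the driving situation (or a need for guidance)
          # is predicted from the current driving state
          hidden_object_ahead = adas.get("hidden_object_probability", 0.0) > 0.5
          sharp_curve_ahead = hd_map.get("curvature_ahead", 0.0) > 0.1
          congestion_reported = v2x.get("congestion_level", 0) >= 2
          return hidden_object_ahead or sharp_curve_ahead or congestion_reported

      # Example usage with dictionaries standing in for the real data sources:
      if separation_needed({"hidden_object_probability": 0.7}, {}, {}):
          pass  # separate the second AR object and render the guide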
  • FIGS. 12A and 12B are examples of an AR graphic interface according to an embodiment of the present invention, and are reference diagrams for explaining separation and combination into first and second AR objects based on predicted changing driving situations.
  • the AR graphic interface 1200 may be implemented as a combination of a first object and a second object.
  • the condition for separating the first and second AR objects 1220 and 1210 is a case in which, based on at least one of the vehicle's ADAS sensing data, high-precision map data, and TCU communication data such as V2X, ITS, and C-ITS, a change in the driving situation of the vehicle or a situation requiring guidance is detected as predicted from the current driving state of the vehicle.
  • the plurality of fragments may be expressed as moving a certain distance ahead of the first AR object 1220.
  • it is implemented to express a driving guide according to the occurrence of a predicted situation while moving based on the current location and driving state of the vehicle.
  • the number and/or display length of the plurality of fragments may be proportional to the maintenance time or maintenance distance of the predicted situation. For example, a case in which the situation is maintained for a long time may be expressed with a greater number of fragments or a longer total display length than a case in which the situation is maintained only briefly.
  • the plurality of fragments of the separated second AR object 1210 provide guidance on situations predicted from the current driving state corresponding to the first AR object 1220 in a more gradual and seamless manner.
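  • A minimal sketch of how such fragments could be generated is shown below; the scaling rule (a base count plus one fragment per 10 s of expected duration) is an assumption used only for illustration:
      # Illustrative fragment generation: fragments are interpolated from the first AR
      # object toward the context location, and their number grows with the expected
      # duration of the predicted situation.
      def make_fragments(first_pos, context_pos, duration_s, base_count=3, per_10s=1):
          count = base_count + int(duration_s // 10) * per_10s
          fragments = []
          for i in range(1, count + 1):
              t = i / (count + 1)  # interpolation factor along the guide path
              x = first_pos[0] + t * (context_pos[0] - first_pos[0])
              y = first_pos[1] + t * (context_pos[1] - first_pos[1])
              fragments.append((x, y))
          return fragments

      # A situation expected to last 25 s yields 5 fragments ahead of the vehicle.
      frags = make_fragments(first_pos=(0.0, 0.0), context_pos=(0.0, 50.0), duration_s=25)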
  • the AR display device 800 receives map data of the vehicle, status data of the vehicle, and network data, and estimates a context that may occur while the vehicle is driving based on the received data.
  • Driving guide information related to the context can be displayed by changing the AR graphic interface.
  • the processor 820 may display a forward image in which an AR graphic interface including the current driving state of the vehicle and a guide for the driving situation of the vehicle is rendered on the navigation screen (S1310).
  • the AR graphic interface includes first and second AR objects.
  • the first AR object displays the current driving state of the vehicle
  • the second AR object is combined or separated from the first AR object and displays a guide for the driving situation expected while the vehicle is driving.
  • the network data is communication data based on external resources, and may include, for example, third-party service-related data received while the vehicle is driving, intelligent transportation system (ITS)-related data, V2V/V2X communication data received from a control server, and other external service data.
  • the processor 820 may render the second AR object, separated from the first AR object, to be displayed at a location associated with the occurrence of the estimated context, while the first AR object continues to display the current driving state of the vehicle.
  • warning displays for hidden objects, changes in the driving speed or direction of the vehicle being followed ahead, route guidance displays for vehicle inspection, charging, or available parking areas, vehicle congestion information and detour route guidance, and charging status and fee information may be included as additional information related to the context.
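  • The distance-based decision described above could look like the following sketch; the 50 m threshold and the returned dictionary layout are illustrative assumptions:
      # Whether the separated second AR object should carry additional information.
      import math

      def should_show_additional_info(vehicle_pos, guide_pos, threshold_m=50.0) -> bool:
          separation = math.dist(vehicle_pos, guide_pos)  # first AR object -> second AR object
          return separation <= threshold_m                # show details only when close enough

      def render_second_object(vehicle_pos, guide_pos, info_text):
          if should_show_additional_info(vehicle_pos, guide_pos):
              return {"position": guide_pos, "info": info_text}  # detailed variant
          return {"position": guide_pos}                          # minimal variant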
  • Figure 14 is a conceptual diagram illustrating an AR graphic interface that displays related guide information according to the detection of hidden objects and the possibility of collision, according to an embodiment of the present invention.
  • the processor 820 of the AR display device 800 may determine whether to detect a possible context by combining network data obtained from external resources with one or more of map data, vehicle sensing data, and location data.
  • the processor 820 may receive, for example, ADAS sensing data, vehicle sensing data (e.g., CAN data), map data such as navigation/map/GPS data, and external service data.
  • the processor 820 can periodically detect the presence of a hidden object in a hidden area around the driving vehicle (or around the driving lane) based on ADAS sensing data or external service data. Additionally, the processor 820 may periodically determine the possibility of collision with the vehicle when a hidden object is detected.
  • the processor 820 may display on the front image, through the separated second AR object, at least one of location information and situation information corresponding to the context (detection of a hidden obstacle in a hidden area).
  • the second object OB2 obscured by the first object OB1 may be detected as a hidden object based on ADAS sensing data or external service data.
  • the second object OB2 is not visible in the front image 1401 and its movement cannot be confirmed because it is obscured by the first object OB1 (e.g., a bus), so there is a possibility of collision with a vehicle traveling straight.
  • the processor 820 separates the AR graphic interface, keeps the first AR object 1420 displayed in the current driving state (e.g., straight driving), and updates the display so that the separated second AR object moves to the predicted position of the second object OB2.
  • the second AR object that has moved to the predicted position of the second object OB2 may output a rotation animation effect so that it faces the predicted position of the hidden second object OB2. Additionally, the rotated second AR object 1410a may display context-related information, for example, 'pedestrian!', in text or image form.
  • the UX display 1410b including context-related information may be provided by being implemented as a variable second AR object or as a separate third AR object.
  • the notification level of the UX display 1410b may be varied to correspond to the degree of possibility of collision between the vehicle and a hidden object.
  • the degree of possibility of collision between the vehicle and a hidden obstacle can be calculated/estimated by applying a weight according to the separation distance between the predicted position of the hidden object and the current vehicle position and/or according to a comparison of before-and-after values of ADAS sensing data or external service data.
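  • One way to realize this weighting is sketched below; the weights, the 30 m normalization, and the notification thresholds are assumptions for illustration, not values given in the patent:
      # Weighted collision-possibility estimate combining (a) the gap between the hidden
      # object's predicted position and the vehicle and (b) how fast that gap is closing.
      import math

      def collision_risk(vehicle_pos, predicted_obj_pos, prev_gap_m, w_dist=0.6, w_trend=0.4):
          gap = math.dist(vehicle_pos, predicted_obj_pos)
          distance_score = max(0.0, 1.0 - gap / 30.0)           # closer -> higher risk
          closing_score = max(0.0, (prev_gap_m - gap) / 10.0)   # shrinking gap -> higher risk
          return w_dist * distance_score + w_trend * min(closing_score, 1.0)

      def notification_level(risk: float) -> str:
          if risk > 0.7:
              return "danger"    # e.g. emphasized 'pedestrian!' UX display
          if risk > 0.4:
              return "warning"
          return "info"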
  • the separated second AR object moves to and recombines with the first AR object 1420 (at this time, the third AR object is removed). That is, when the situation corresponding to the context ends, the AR graphic interface is again provided with the first and second AR objects combined.
  • Figures 15 and 16 are conceptual diagrams showing various modified examples of an AR graphic interface for displaying guide information according to follow vehicle settings, according to an embodiment of the present invention.
  • the processor 820 may determine detection of a context that may occur by combining network data, which is an external resource, with one or more of map data, vehicle sensing data, and location data, and based on the determination, separate second AR. At least one of location information and situation information corresponding to the context can be displayed through the object.
  • At this time, at least one of the location information and situation information may be displayed in association with the current driving state of the vehicle corresponding to the first AR object.
  • the vehicle now drives by following the preceding vehicle, providing convenience to the driver. Furthermore, the present invention provides a UX that allows users to intuitively understand which vehicle the vehicle is following and what driving control is necessary considering the possibility of collision with the following vehicle.
  • the second AR object is transformed to include a plurality of fragments when separated.
  • a fragment close to the first AR object is closely associated with the state (e.g., direction and rotation angle) of the first AR object indicating the current driving state of the vehicle, and the farther a fragment is from the first AR object, the more it points toward the location and direction corresponding to the context.
  • the separated second AR object can be said to point to or move to the location of the following vehicle.
  • the processor 820 may vary the separated second AR object to display a plurality of trajectories that start from the first AR object and follow the position of the object corresponding to the context through the plurality of fragments.
  • the processor 820 provides the navigation application 930 with an updated AR GUI frame based on information about the changed second AR object, allowing the AR GUI surface to be updated in real time, thereby providing a more complete and intuitive AR interface.
  • the following vehicle 30 is changed to the selected following vehicle 30' according to the following vehicle input 1501, and the AR graphic interface is displayed separately.
  • while the first AR object 1520 represents the current driving state of the vehicle, the separated second AR object 1510 is displayed in the form of a plurality of guide trajectories connecting the location of the first AR object (i.e., the current location of the vehicle) and the location of the following vehicle 30'.
  • the lengths of the plurality of guide trajectories represent the length of the following travel distance that the vehicle must follow with respect to the following vehicle 30. Accordingly, as the vehicle approaches the following vehicle 30, the length of the plurality of guide trajectories becomes shorter, and as the vehicle moves further away from the following vehicle 30, the length of the plurality of guide trajectories becomes longer.
  • the desired tracking distance may be displayed on the separated second AR object 1510 or a notification indicating that the tracking distance has reached a threshold may be provided to assist in tracking driving.
  • the plurality of (guide) trajectories constituting the separated second AR object may include a driving guide related to the next driving state of the vehicle predicted based on the moving state of the object corresponding to the context.
  • the separated second AR object may indicate the location and driving state of the following vehicle.
  • the driver will be able to see the second AR object changed in this way and increase the driving speed so as not to move away from the following vehicle 30'. (Alternatively, a signal may be transmitted to the driver of the following vehicle 30' to slow down the driving speed.)
  • the speeding state of the vehicle can be displayed by changing the color of the first AR object, and the speeding state of the following vehicle 30' can be displayed by changing the color of the separated second AR object as described above.
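  • As a simple illustration of this color change, the mapping below flags speeding by color for both AR objects; the speed threshold and the color values are assumptions made only for the sketch.

```python
# Hedged sketch: choose display colors for the first AR object (ego vehicle)
# and the separated second AR object (followed vehicle) from their speeds.
NORMAL_COLOR = "#00FF6A"   # illustrative default color
ALERT_COLOR = "#FF3B30"    # illustrative speeding color

def object_colors(ego_speed_kph, target_speed_kph, limit_kph):
    first_color = ALERT_COLOR if ego_speed_kph > limit_kph else NORMAL_COLOR
    second_color = ALERT_COLOR if target_speed_kph > limit_kph else NORMAL_COLOR
    return first_color, second_color

print(object_colors(72, 95, 80))   # ego normal, followed vehicle speeding
```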
  • Figure 16(b) displays a plurality of guide trajectories when the following vehicle 30' changes lanes.
  • a plurality of guide trajectories are displayed so as to connect the position of the first AR object and the position of the following vehicle 30' that changed lanes.
  • FIGS. 17, 18A, and 18B are flowcharts and conceptual diagrams used to explain a method of displaying responses according to vehicle status diagnosis through an AR graphic interface, according to an embodiment of the present invention.
  • the AR display device 800 can check the status of the vehicle based on ADAS sensing data, vehicle sensing data (e.g., CAN data), navigation/map/GPS data, and external service data received while the vehicle is driving. Depending on the results of the vehicle's status check, customized POI information can be provided through the AR graphic interface.
  • while the vehicle is driving, the processor 820 can detect a vehicle abnormality signal based on ADAS sensing data received in real time/periodically, vehicle sensing data (e.g., CAN data), navigation/map/GPS data, and external service data (1720).
  • the processor 820 determines whether the vehicle is unable to continue driving (1730). If the vehicle can still be driven according to determination 1730, only a notification is displayed and the vehicle continues driving to the destination set in the navigation.
  • if it is determined that driving is impossible, the destination is changed to a location suitable for the vehicle condition corresponding to the vehicle abnormality signal (1740).
  • a location suitable for the vehicle condition refers to a place where the vehicle abnormality can be resolved according to the vehicle status inspection results. For example, if the inspection finds a problem with a vehicle part, a repair shop POI can be selected; if the battery is low, a charging station POI can be selected; and if the fuel level is low, a gas station POI can be determined as the appropriate location.
  • the processor 820 can set a new destination based on the determined place, the current location of the vehicle, and POI history (e.g., a frequently visited repair shop), and guide the newly set destination using the separated second AR object.
  • the processor 820 displays the AR graphic interface variably according to the change in destination (1750). In other words, route guidance to the changed destination is provided through the AR graphic interface. The above processes may be repeated until it is determined that the driving has ended (1760), and when the driving is completed, the guidance process according to the vehicle status check ends.
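  • A compact sketch of this diagnosis-to-rerouting flow (steps 1720-1760) is given below. The diagnosis codes, POI categories, and function names are hypothetical; they only illustrate how an abnormality signal might be mapped to a new destination.

```python
# Hedged sketch of the vehicle-status flow: detect an abnormality signal,
# decide whether driving is still possible, and if not pick a POI category
# and reroute guidance to it (steps 1720-1760 in the flowchart).
POI_FOR_ABNORMALITY = {            # illustrative mapping
    "part_fault": "repair_shop",
    "battery_low": "charging_station",
    "fuel_low": "gas_station",
}

def handle_vehicle_status(abnormality, drivable, current_destination,
                          find_nearby_poi, reroute):
    if abnormality is None:
        return current_destination               # keep guiding to the set destination
    if drivable:
        print(f"notify driver: {abnormality}")   # notification only (1730 -> drivable)
        return current_destination
    category = POI_FOR_ABNORMALITY.get(abnormality, "service_center")
    new_destination = find_nearby_poi(category)  # choose a suitable place (1740)
    reroute(new_destination)                     # update AR guidance (1750)
    return new_destination

# Example with stub callbacks.
dest = handle_vehicle_status("battery_low", False, "office",
                             find_nearby_poi=lambda c: f"nearest {c}",
                             reroute=lambda d: print(f"reroute to {d}"))
print(dest)
```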
  • Figure 18a shows a case in which an emergency situation is determined based on vehicle inspection based on ADAS sensing data, vehicle sensing data (e.g., CAN data), navigation/map/GPS data, and external service data while the vehicle is driving.
  • the processor 820 sets a nearby parking area, as a location appropriate for the vehicle condition, as the new destination (P1). At this time, the previously set destination is canceled. Afterwards, while the first AR object 1820 displays the current driving state of the vehicle, the separated second AR object guides the new destination P1 through a plurality of fragments.
  • the reason for changing to the new destination (P1) and warning notification information may be displayed together through the separated second AR object.
  • Figure 18b shows a case in which a battery shortage situation is determined based on vehicle inspection based on ADAS sensing data, vehicle sensing data (e.g., CAN data), navigation/map/GPS data, and external service data while the vehicle is driving.
  • the reason for changing to the new destination (P1), that is, the low-battery situation, and warning notification information may be displayed together through the separated second AR object.
  • the separated second AR object moves to the new destination (P1) location.
  • the moved second AR object 1810a is displayed along with additional information 1810b' suggesting battery charging (e.g., 'charge here!').
  • FIGS. 19A and 19B are flowcharts and conceptual diagrams used to explain a method of displaying a congestion situation of a driving route and responses related to determination of detour possibility through an AR graphic interface, according to an embodiment of the present invention.
  • the AR display device 800 may collect the following road condition information (1920) while guiding the driving of the vehicle through the AR graphic interface (1710).
  • the processor 820 receives, in real time or periodically, ADAS sensing data, vehicle sensing data (e.g., CAN data), map data such as navigation/map/GPS data, and external service data, and can collect information about road conditions based on these (1920).
  • the processor 820 may collect road condition information on the driving route through ADAS sensing data or external service data.
  • the road condition information may include accident information, construction information, and road conditions (e.g., road damage, potholes, sinkholes, etc.).
  • the processor 820 determines whether the collected information about road conditions is important information (1930).
  • the determination of importance may consider information related to the driving route to the destination, information related to the driving time (e.g., traffic jams or congestion), information related to safe driving of the vehicle, etc.
  • the processor 820 may detect a faster route than the current driving route based on the collected information about road conditions, and determine the detected fast route as important information. At this time, additional information about the new driving route (e.g., shortened time, driving distance, blockage information, etc.) may be provided together.
  • the processor 820 may determine a driving lane change as important information. For example, when bad road condition information such as a pothole is detected (received) on the current driving route based on the collected road condition information, the processor 820 may determine the lane change as important information.
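  • As a rough illustration of determination 1930, the sketch below classifies a collected road-condition item as important when it affects the active route, adds enough delay, or threatens safe driving. The item fields and the delay threshold are assumptions made only for this example.

```python
# Hedged sketch of the importance check (1930): a collected road-condition
# item is treated as important if it lies on the active route and either adds
# a meaningful delay or threatens safe driving (e.g. a pothole ahead).
SAFETY_TYPES = {"accident", "pothole", "sinkhole", "road_damage"}

def is_important(item, active_route_ids, delay_threshold_s=120):
    on_route = item.get("road_id") in active_route_ids
    slow = item.get("extra_delay_s", 0) >= delay_threshold_s
    unsafe = item.get("type") in SAFETY_TYPES
    return on_route and (slow or unsafe)

item = {"road_id": "R12", "type": "pothole", "extra_delay_s": 0}
print(is_important(item, active_route_ids={"R12", "R13"}))   # True -> update AR guidance
```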
  • the processor 820 can change the AR graphic interface based on the collected information about the road situation and render it on the image in front of the vehicle (1940).
  • in order to guide the collected road situation information, the processor 820 can display at least one of the location information and the situation information corresponding to the context, for example the road situation information, through the second AR object separated from the AR graphic interface.
  • At this time, at least one of location and situation information may be displayed in relation to the current driving state of the vehicle corresponding to the first AR object displayed on the front image.
  • the location information includes a new driving direction or driving path corresponding to the context, for example the road situation information, based on the current location of the vehicle.
  • the situation information may include road situation data collected for the vehicle's current driving lane or set driving path.
  • based on the collected road situation data, the processor 820 can variably display the separated second AR object so that it includes a guide trajectory that starts from the first AR object and guides a new driving direction or driving path.
  • a first AR object 1920 indicating the current driving state of the vehicle is displayed on the front image, and a second AR object 1910 separated from it is displayed through a plurality of fragments (or guide trajectories) that guide a driving lane change toward a new driving direction or driving path corresponding to the road situation information.
  • situation information 1930 based on collected road situation data may be displayed as part of the second AR object or through the third AR object.
  • 'forward accident detour' may be displayed as situation information corresponding to the collected road situation data.
  • the processor 820 may update the rendering of the separated second AR object so that the situation information 1930 is displayed between the first AR object 1920 and the starting point of the guide trajectory 1910 of the second AR object.
  • the processor 820 may update the rendering so that the separated second AR object is again displayed coupled to the first AR object in the AR graphic interface.
  • Figure 20 is a flowchart referenced to explain a method of displaying route guidance to a charging area, charging information, and exit route guidance through an AR graphic interface based on network data when a vehicle enters a charging station, according to an embodiment of the present invention.
  • EV charging control is shown for explanation purposes, but the process of FIG. 20 can be applied equally or similarly to parking control. Accordingly, the term 'parking lot/charging station control server' used below is intended to cover both EV charging control and parking control.
  • the AR display device 800 may be communicatively connected to the parking lot/charging station control server (via the vehicle 100). Accordingly, as shown in FIG. 20, the parking lot/charging station control server (for example, the EV charging control server 2000), the AR display device 800, and the vehicle 100 are connected to transmit and/or receive the requested data.
  • the control server of the parking lot/charging station may be included in, or implemented as, a digital twin-based system that uses digital twins to monitor events (situations, operations, functions, etc.) occurring within the parking lot/charging station and to control the devices installed there (e.g., sensors, chargers, and other connected devices).
  • a digital twin refers to a real-world entity (an object, space, environment, process, procedure, etc.) expressed as a digital data model on a computer so that it is copied identically and the two can interact in real time.
  • These digital twins can create virtual models of assets such as physical objects, spaces, environments, people, and processes using software, allowing them to operate or perform the same actions as they do in the real world.
  • the parking lot/charging station control server (hereinafter referred to as 'control server 2000') may include a communication module and a processor.
  • the communication module of the control server can receive sensing data from sensors installed in the parking lot/charging station.
  • the control server's processor can create a digital twin that can reflect the parking/charging situation in real time, and can remotely control vehicles entering the parking lot/charging station when certain conditions are met.
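  • The sketch below shows, in very reduced form, how such a control server might mirror sensor readings into a twin state and decide when remote control of an entering vehicle is allowed. The state fields, the condition, and the class name ControlServer are hypothetical illustrations, not the actual server design.

```python
# Hedged sketch: a control server keeps a digital-twin view of slots/chargers
# from sensor data and grants remote control only when its conditions are met
# (e.g. the vehicle handed over control authority and a free slot exists).
class ControlServer:
    def __init__(self):
        self.twin = {"slots": {}, "vehicles": {}}   # digital-twin state

    def ingest_sensor(self, slot_id, occupied):
        # Mirror the physical slot/charger state into the twin in real time.
        self.twin["slots"][slot_id] = {"occupied": occupied}

    def register_vehicle(self, vehicle_id, control_authority):
        self.twin["vehicles"][vehicle_id] = {"authority": control_authority}

    def may_remote_control(self, vehicle_id):
        free_slot = any(not s["occupied"] for s in self.twin["slots"].values())
        has_authority = self.twin["vehicles"].get(vehicle_id, {}).get("authority", False)
        return free_slot and has_authority

server = ControlServer()
server.ingest_sensor("C-03", occupied=False)
server.register_vehicle("V100", control_authority=True)
print(server.may_remote_control("V100"))   # True under these assumptions
```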
  • in response to the vehicle entering the parking lot/charging station, the processor 820 of the AR display device 800 is connected, through the communication module 810, to the server provided in the parking lot/charging station, that is, the control server, and can provide status data of the vehicle to it.
  • the processor 820 may receive first route information guiding the user to a parking area/charger available for parking/charging from the control server, separate the second AR object of the AR graphic interface, and display the first guide route.
  • the processor 820 can receive, from the control server, second route information guiding the vehicle to the exit of the charging station, and update the rendering so that the second guide path is displayed using the separated second AR object.
  • the control server 2000 can detect the vehicle's entry through a sensor provided in the parking lot/charging station.
  • the control server 2000 transmits a connection request to the vehicle 100 or the AR display device 800 connected to the vehicle 100 based on the detection of the vehicle 100 entering (2002).
  • the connection request may include GPS information, authority information (vehicle control rights), vehicle information, etc.
  • the AR display device 800 receives vehicle information from the vehicle 100 (2003) and transmits the received vehicle information to the control server 2000 (2004).
  • vehicle information may include GPS information, authority information (vehicle control rights), battery information, etc.
  • the AR display device 800 can receive, from the control server 2000, information such as available parking areas or chargeable (fast or slow) chargers determined based on the entering vehicle's information, and real-time parking lot/charging station information (e.g., charging unit price (rate by ultra-rapid/rapid/slow charging speed), charging-spot occupancy, charging waiting time, charger failure information, etc.).
  • the control server 2000 can receive, from the AR display device 800, information acquired by the AR display device 800 or through the sensors of the vehicle 100 (e.g., remaining battery level, whether remote control is possible and the protocol for remote control, charging time, etc.).
  • the AR display device 800 can receive the charger usage status and information about ultra-rapid/rapid/slow chargers from the control server 2000, and can display an AR graphic interface corresponding to the received information to guide the selection.
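  • A minimal sketch of the exchanged payloads is shown below: one structure for what the vehicle side reports and one for what the control server returns. The field names and types are assumptions made for illustration; the actual message format is not specified here.

```python
# Hedged sketch of the two payload directions described above:
# the vehicle/AR display device reports its state, and the control server
# answers with station information used to render the AR guidance.
from dataclasses import dataclass
from typing import Optional

@dataclass
class VehicleReport:                 # sent to the control server
    gps: tuple                       # (latitude, longitude)
    control_authority: bool          # vehicle control rights granted?
    battery_level_pct: float
    remote_control_supported: bool

@dataclass
class StationInfo:                   # received from the control server
    free_charger_id: Optional[str]   # available charger, if any
    charger_speed: str               # "ultra-rapid" / "rapid" / "slow"
    unit_price: float                # rate for the given speed class
    waiting_time_min: int
    charger_fault: bool

report = VehicleReport(gps=(37.56, 126.97), control_authority=True,
                       battery_level_pct=18.0, remote_control_supported=True)
info = StationInfo(free_charger_id="C-03", charger_speed="rapid",
                   unit_price=347.0, waiting_time_min=0, charger_fault=False)
print(report, info, sep="\n")
```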
  • the control server 2000 identifies the location of an empty parking/chargeable area or slot (charger) based on the sensing data transmitted from the sensors, and can generate route information to it (hereinafter referred to as 'first route information').
  • the control server 2000 may generate first path information based on object location estimation information generated based on sensing data.
  • the first path information may be the shortest path connecting the current location of the vehicle 100 to the location of an empty parking/chargeable area or slot (charger).
  • the control server 2000 transmits the first route information to the AR display device 800 based on the GPS information, authority information (vehicle control rights), and battery information. Based on the received first route information, the AR display device 800 changes the AR graphic interface, that is, separates the second AR object and updates the rendering to display the first guide path corresponding to the first route information (2005).
  • when a plurality of pieces of first route information are received, the AR display device 800 can branch the separated second AR object again based on each piece of first route information and update the rendering so that different selection guide paths leading to the respective parking areas/chargers are displayed.
  • control server 2000 regenerates the first route information.
  • when a user selection input is detected, the control server 2000 generates charging information and transmits it to the AR display device 800.
  • the AR display device 800 displays the charging state through a second AR object based on the received charging information (2008).
  • the charging information may include charging unit price, charging vehicle occupancy, charging standby time, charger failure information, etc.
  • the AR display device 800 can receive information about events/promotions associated with the parking lot/charging station (e.g., coffee discounts, car wash discounts, convenience store promotions, etc.) from the control server 2000 and display it through the AR graphic interface.
  • when the end of charging (e.g., charging interruption or completion of charging) or the end of parking (e.g., preparation for leaving the vehicle) is detected, the control server 2000 can generate, as exit information, second route information guiding the vehicle 100 from its current location to the exit of the parking lot/charging station and transmit it to the AR display device 800.
  • the AR display device 800 may display the second guide path through a separated second AR object based on the received second path information.
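  • The end-to-end exchange of FIG. 20 can be summarized as the short sequence below: entry detection, connection request, vehicle report, first (inbound) route, charging status, and finally the second (exit) route. The stub classes, method names, and message shapes are illustrative assumptions, not the actual protocol.

```python
# Hedged sketch of the FIG. 20 message sequence between the control server
# and the AR display device: connection, vehicle report, first route,
# charging status, then second (exit) route.
class StubServer:
    def __init__(self):
        self.cycles = 2                                      # pretend charging spans 2 polls
    def request_connection(self):
        return {"need": ["gps", "authority", "battery"]}     # (2002) connection request
    def first_route_for(self, report):
        return ["entrance", "lane B", "charger C-03"]        # (2005) inbound route
    def charging_in_progress(self):
        self.cycles -= 1
        return self.cycles >= 0
    def charging_info(self):
        return {"unit_price": 347.0, "waiting_time_min": 0}  # (2008) charging state
    def exit_route_for(self):
        return ["charger C-03", "lane A", "exit"]            # second (exit) route

class StubArDisplay:
    def show_guide_path(self, route):
        print("guide path:", " -> ".join(route))
    def show_charging_state(self, info):
        print("charging:", info)

server, ar_display = StubServer(), StubArDisplay()
server.request_connection()                                  # (2002)
report = {"gps": (37.56, 126.97), "authority": True, "battery": 18.0}   # (2003/2004)
ar_display.show_guide_path(server.first_route_for(report))   # inbound guidance
while server.charging_in_progress():
    ar_display.show_charging_state(server.charging_info())
ar_display.show_guide_path(server.exit_route_for())          # exit guidance
```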
  • the final destination of the second route information may include another vehicle waiting area rather than a parking lot/charging station exit.
  • the AR display device 800 may also collect information about surrounding POIs as exit-related information and display it through the second AR object.
  • in addition, the front image can be calibrated without separate settings, and an augmented reality navigation screen based on the calibrated front image can be provided.
  • according to the AR display device and its operating method, external resources such as network data are considered in addition to the vehicle's sensing data and map data to estimate possible contexts, so an intuitive UX can be provided for a wider variety of driving situations predicted during driving. Accordingly, the driver can drive the vehicle more easily and safely. In addition, the possibility of collision with hidden objects that are not visible in the front image can be avoided. Furthermore, when a following vehicle ahead is set, the driving direction and speed to be followed are guided through an intuitive AR graphic interface, enabling flexible tracking and avoiding the possibility of collision. It also provides more reliable notifications and responses for diagnosing the vehicle's condition, and can identify in advance road congestion situations that are difficult to confirm through navigation information and provide a detour route.
  • when the vehicle enters a parking lot or charging station, it communicates with the control server of that location and displays route guidance to the parking/charging area, parking/charging-related information, and route guidance for leaving, through a more intuitive AR graphic interface, thereby providing a direct and smart parking/charging-related UX.
  • the above-described present invention can be implemented as computer-readable code (or application or software) on a program-recorded medium.
  • the control method of the self-driving vehicle described above can be implemented using codes stored in memory, etc.
  • Computer-readable media includes all types of recording devices that store data that can be read by a computer system. Examples of computer-readable media include HDD (Hard Disk Drive), SSD (Solid State Disk), SDD (Silicon Disk Drive), ROM, RAM, CD-ROM, magnetic tape, floppy disk, optical data storage device, etc. It also includes those implemented in the form of carrier waves (e.g., transmission via the Internet). Additionally, the computer may include a processor or control unit. Accordingly, the above detailed description should not be construed as restrictive in all respects and should be considered illustrative. The scope of the present invention should be determined by reasonable interpretation of the appended claims, and all changes within the equivalent scope of the present invention are included in the scope of the present invention.

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Automation & Control Theory (AREA)
  • Traffic Control Systems (AREA)
  • Navigation (AREA)

Abstract

Disclosed are an AR display device linked to a vehicle and a method of operating the same. An AR display device linked to a vehicle, according to the present invention, can predict situations in which a problem may occur while the vehicle is driving by considering external resources in addition to internal resources such as map data and sensing data, and can display an AR graphic interface adjusted according to the prediction in real time so as to intuitively inform the driver of the predicted situation and a response action associated with it. In the present invention, an AR object separated from the AR graphic interface not only displays guidance for the predicted situation but also provides a response guide enabling safe driving in the predicted situation.
PCT/KR2022/015979 2022-06-10 2022-10-19 Dispositif d'affichage de ra pour véhicule et son procédé de fonctionnement WO2023238992A1 (fr)

Priority Applications (4)

Application Number Priority Date Filing Date Title
KR1020237018104A KR102609960B1 (ko) 2022-06-10 2022-10-19 차량의 ar 디스플레이 장치 및 그것의 동작방법
EP23175932.5A EP4290186A1 (fr) 2022-06-10 2023-05-30 Dispositif d'affichage ar pour véhicule et son procédé de fonctionnement
CN202310687467.7A CN117215061A (zh) 2022-06-10 2023-06-09 车辆用ar显示装置及其动作方法
US18/208,550 US20230400321A1 (en) 2022-06-10 2023-06-12 Ar display device for vehicle and method for operating same

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR20220070770 2022-06-10
KR10-2022-0070770 2022-06-10

Publications (1)

Publication Number Publication Date
WO2023238992A1 true WO2023238992A1 (fr) 2023-12-14

Family

ID=89118487

Family Applications (4)

Application Number Title Priority Date Filing Date
PCT/KR2022/095146 WO2023239003A1 (fr) 2022-06-10 2022-10-19 Dispositif d'affichage ar pour véhicule et son procédé de fonctionnement
PCT/KR2022/015982 WO2023238993A1 (fr) 2022-06-10 2022-10-19 Dispositif d'affichage en réalité augmentée (ar) de véhicule et procédé de fonctionnement de celui-ci
PCT/KR2022/015979 WO2023238992A1 (fr) 2022-06-10 2022-10-19 Dispositif d'affichage de ra pour véhicule et son procédé de fonctionnement
PCT/KR2022/095145 WO2023239002A1 (fr) 2022-06-10 2022-10-19 Dispositif d'affichage de ra pour véhicule et son procédé de fonctionnement

Family Applications Before (2)

Application Number Title Priority Date Filing Date
PCT/KR2022/095146 WO2023239003A1 (fr) 2022-06-10 2022-10-19 Dispositif d'affichage ar pour véhicule et son procédé de fonctionnement
PCT/KR2022/015982 WO2023238993A1 (fr) 2022-06-10 2022-10-19 Dispositif d'affichage en réalité augmentée (ar) de véhicule et procédé de fonctionnement de celui-ci

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/KR2022/095145 WO2023239002A1 (fr) 2022-06-10 2022-10-19 Dispositif d'affichage de ra pour véhicule et son procédé de fonctionnement

Country Status (1)

Country Link
WO (4) WO2023239003A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012032811A (ja) * 2010-07-09 2012-02-16 Toshiba Corp 表示装置、画像データ生成装置、画像データ生成プログラム及び表示方法
US20160129836A1 (en) * 2013-07-05 2016-05-12 Clarion Co., Ltd. Drive assist device
KR20170101758A (ko) * 2016-02-26 2017-09-06 자동차부품연구원 증강현실 헤드 업 디스플레이 내비게이션
KR20190136691A (ko) * 2018-05-31 2019-12-10 주식회사 엘지유플러스 모바일 디바이스 및 상기 모바일 디바이스의 도보네비 모드 전환 방법
JP2022058537A (ja) * 2019-02-14 2022-04-12 株式会社デンソー 表示制御装置

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102014261B1 (ko) * 2017-12-12 2019-08-26 엘지전자 주식회사 차량에 구비된 차량 제어 장치 및 차량의 제어방법


Also Published As

Publication number Publication date
WO2023239002A1 (fr) 2023-12-14
WO2023239003A1 (fr) 2023-12-14
WO2023238993A1 (fr) 2023-12-14

Similar Documents

Publication Publication Date Title
WO2019117333A1 (fr) Dispositif d'affichage fourni dans un véhicule et procédé de commande de dispositif d'affichage
WO2017222299A1 (fr) Dispositif de commande de véhicule monté sur un véhicule et procédé de commande du véhicule
WO2019031852A1 (fr) Appareil pour fournir une carte
WO2019098434A1 (fr) Dispositif de commande de véhicule embarqué et procédé de commande de véhicule
WO2018044098A1 (fr) Appareil d'interface utilisateur de véhicule et véhicule
WO2021045257A1 (fr) Dispositif de fourniture d'itinéraire et procédé de fourniture d'itinéraire par ce dernier
WO2018070646A1 (fr) Dispositif de commande de véhicule embarqué et procédé de commande du véhicule
EP3475134A1 (fr) Dispositif de commande de véhicule monté sur un véhicule et procédé de commande du véhicule
WO2021141142A1 (fr) Dispositif de fourniture d'itinéraire et procédé de fourniture d'itinéraire correspondant
WO2018097465A1 (fr) Dispositif embarqué de commande de véhicule et procédé de commande du véhicule
WO2019035652A1 (fr) Système d'assistance à la conduite et véhicule comprenant celui-ci
WO2018088614A1 (fr) Dispositif d'interface utilisateur de véhicule et véhicule
WO2019066477A1 (fr) Véhicule autonome et son procédé de commande
WO2018110789A1 (fr) Technologie de commande de véhicule
EP3545380A1 (fr) Dispositif embarqué de commande de véhicule et procédé de commande du véhicule
WO2022154369A1 (fr) Dispositif d'affichage en interaction avec un véhicule et son procédé de fonctionnement
WO2021045256A1 (fr) Appareil de fourniture d'itinéraire et son procédé de fourniture d'itinéraire
WO2021157760A1 (fr) Appareil de fourniture d'itinéraire et son procédé de fourniture d'itinéraire
WO2017155199A1 (fr) Dispositif de commande de véhicule disposé dans un véhicule, et procédé de commande de véhicule
WO2020017677A1 (fr) Dispositif de diffusion d'images
WO2020149431A1 (fr) Dispositif de fourniture d'itinéraire et procédé de commande associé
WO2021045255A1 (fr) Dispositif de fourniture d'itinéraire et procédé de fourniture d'itinéraire associé
WO2020149427A1 (fr) Dispositif de fourniture d'itinéraire et procédé de fourniture d'itinéraire associé
WO2018236012A1 (fr) Dispositif d'entrée/sortie
WO2021230387A1 (fr) Dispositif de fourniture d'un itinéraire et procédé de fourniture d'un itinéraire pour celui-ci

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22945951

Country of ref document: EP

Kind code of ref document: A1