WO2024085580A1 - Signal processing device and vehicle display device comprising same - Google Patents

Signal processing device and vehicle display device comprising same

Info

Publication number
WO2024085580A1
Authority
WO
WIPO (PCT)
Prior art keywords
signal processing
processing device
data
processor
sensor
Application number
PCT/KR2023/015989
Other languages
English (en)
Korean (ko)
Inventor
김상헌
이주영
Original Assignee
엘지전자 주식회사
Application filed by 엘지전자 주식회사 (LG Electronics Inc.)
Publication of WO2024085580A1

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60W: CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00: Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02: Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/029: Adapting to failures or work around with other constraints, e.g. circumvention by avoiding use of failed parts

Definitions

  • the present disclosure relates to a signal processing device and a vehicle display device equipped with the same, and more specifically, to a signal processing device that can stably drive a vehicle and a vehicle display device provided therewith.
  • a vehicle is a device that moves the user in the desired direction.
  • a representative example is a car.
  • a vehicle signal processing device is installed inside the vehicle.
  • a signal processing device for a vehicle can execute various applications to drive the vehicle.
  • for example, a signal processing device for a vehicle may run an advanced driver assistance system (ADAS) application or an autonomous driving (AD) application.
  • a signal processing device in a vehicle performs signal processing based on sensor data or camera data for vehicle driving.
  • the problem that the present disclosure aims to solve is to provide a signal processing device that can stably drive a vehicle and a vehicle display device equipped with the same.
  • Another problem that the present disclosure aims to solve is to provide a signal processing device that can stably perform control switching between a plurality of signal processing devices and a vehicle display device including the same.
  • a signal processing device and a vehicle display device including the same exchange data with a central signal processing device and include a processor that, in normal mode, controls transmission of sensor data received from a sensor device or camera data received from a camera to the central signal processing device, and the processor controls operation in an emergency mode when the central signal processing device, the sensor device, or the camera malfunctions.
  • the processor may control the sensor device, camera, or actuator to operate based on a control signal received in response to transmission of the sensor data or camera data.
  • the processor may determine that the performance of the sensor device or camera is degraded and control transmission of performance degradation information or degradation mode entry information to the central signal processing device.
  • the processor may determine that the sensor device or camera has failed and control operation in an emergency mode.
  • the processor may control transmission of failure information or emergency mode entry information to the central signal processing device.
  • the processor may control transmission, to the central signal processing device, of second sensor data or second camera data received from a sensor device or camera different from the sensor device or camera.
  • the processor may run a hypervisor and run an area fault manager on the hypervisor.
  • the processor further executes a sensor service for transmission of sensor data or camera data on the hypervisor, and the safety level of the area fault manager may be higher than the safety level of the sensor service.
  • the processor may further execute a system fault manager on the hypervisor, and the safety level of the area fault manager may be equal to the safety level of the system fault manager.
  • the processor may execute a system fault manager.
  • the system fault manager may generate a diagnosis result log based on diagnosis result data received from the central signal processing device.
  • the processor controls the sensor device, camera, or actuator to operate based on a control signal received in response to transmission of sensor data or camera data, and the safety level of the actuator may be higher than the safety level of the sensor device or camera.
  • the processor may generate an emergency control message according to the emergency mode and emergency control the actuator based on the emergency control message.
  • when the processor determines that the central signal processing device is malfunctioning, it may switch to emergency mode, generate a control message to substitute for the central signal processing device, and control the sensor device, camera, or actuator to operate based on the control message.
  • the processor may determine an error grade or recovery grade based on driving state information and fault type information, and control operation in the emergency mode or degraded mode based on the error grade or recovery grade.
  • the processor may control recovery to the previous mode when recovery conditions are met during emergency mode or degraded mode operation.
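  • As an illustration of the mode handling described above, the Python sketch below shows one way a processor could derive an error grade from driving state information and fault type information, enter the degraded or emergency mode accordingly, and return to the previous mode once a recovery condition is met; the class and function names, the grading weights, and the state strings are assumptions for illustration and are not taken from the disclosure.

      from enum import Enum

      class Mode(Enum):
          NORMAL = "normal"
          DEGRADED = "degraded"
          EMERGENCY = "emergency"

      def error_grade(driving_state: str, fault_type: str) -> int:
          """Hypothetical grading: 0 = no action, 1 = degraded mode, 2 = emergency mode."""
          grade = {"none": 0, "performance_degraded": 1, "failed": 2}[fault_type]
          # Treat a fault one grade more severely while driving at speed.
          if grade and driving_state == "high_speed":
              grade = min(2, grade + 1)
          return grade

      class ModeManager:
          def __init__(self) -> None:
              self.mode = Mode.NORMAL
              self._previous = Mode.NORMAL

          def on_fault(self, driving_state: str, fault_type: str) -> Mode:
              grade = error_grade(driving_state, fault_type)
              target = {0: self.mode, 1: Mode.DEGRADED, 2: Mode.EMERGENCY}[grade]
              if target is not self.mode:
                  self._previous, self.mode = self.mode, target
              return self.mode

          def on_recovery_condition_met(self) -> Mode:
              # Recover to the mode that was active before the fault.
              self.mode, self._previous = self._previous, self.mode
              return self.mode

      manager = ModeManager()
      print(manager.on_fault("high_speed", "performance_degraded"))  # Mode.EMERGENCY
      print(manager.on_recovery_condition_met())                     # Mode.NORMAL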
  • a signal processing device and a vehicle display device including the same exchange data with an area signal processing device and include a processor that executes an application for driving the vehicle based on sensor data or camera data received from the area signal processing device, and the processor controls operation in a degradation mode when performance degradation information or degradation mode entry information is received from the area signal processing device.
  • the processor may control transmission of diagnosis result data of sensor data or camera data to the area signal processing device based on sensor data or camera data received from the area signal processing device.
  • the processor executes a hypervisor, executes a plurality of virtualization machines on the hypervisor, and any one of the plurality of virtualization machines may execute a fault manager.
  • the processor executes a second fault manager, and the safety level of the second fault manager may be greater than the safety level of the fault manager running on the hypervisor.
  • a vehicle display device includes at least one display, an area signal processing device that receives sensor data or camera data from a sensor device or a camera, and a central signal processing device that outputs an image signal to be displayed on the display based on the sensor data or camera data.
  • a signal processing device and a vehicle display device including the same exchange data with a central signal processing device and include a processor that, in normal mode, controls transmission of sensor data received from a sensor device or camera data received from a camera to the central signal processing device, and the processor controls operation in an emergency mode when the central signal processing device, the sensor device, or the camera malfunctions. Accordingly, it is possible to stably drive the vehicle. In particular, when a failure of a sensor device or camera is determined, the vehicle can be driven stably.
  • the processor may control the sensor device, camera, or actuator to operate based on a control signal received in response to transmission of the sensor data or camera data. Accordingly, it is possible to stably drive the vehicle in normal mode.
  • the processor may determine that the performance of the sensor device or camera is degraded and control transmission of performance degradation information or degradation mode entry information to the central signal processing device. Accordingly, it is possible to stably drive the vehicle in the degraded mode.
  • the processor may determine that the sensor device or camera has failed and control operation in an emergency mode. Accordingly, it is possible to stably drive the vehicle in emergency mode.
  • the processor may control transmission of failure information or emergency mode entry information to the central signal processing device. Accordingly, it is possible to stably drive the vehicle.
  • the processor may control transmission, to the central signal processing device, of second sensor data or second camera data received from a sensor device or camera different from the sensor device or camera. Accordingly, it is possible to stably drive the vehicle.
  • the processor may run a hypervisor and run an area fault manager on the hypervisor. Accordingly, it is possible to stably drive the vehicle.
  • the processor further executes a sensor service for transmission of sensor data or camera data on the hypervisor, and the safety level of the area fault manager may be higher than the safety level of the sensor service. Accordingly, it is possible to stably drive the vehicle.
  • the processor may further execute a system fault manager on the hypervisor, and the safety level of the area fault manager may be equal to the safety level of the system fault manager. Accordingly, it is possible to stably drive the vehicle.
  • the processor may execute a system fault manager. Accordingly, it is possible to stably drive the vehicle.
  • the system fault manager may generate a diagnosis result log based on diagnosis result data received from the central signal processing device. Accordingly, it is possible to stably drive the vehicle.
  • the processor controls the sensor device, camera, or actuator to operate based on a control signal received in response to transmission of sensor data or camera data, and the safety level of the actuator may be higher than the safety level of the sensor device or camera. Accordingly, it is possible to stably drive the vehicle. In particular, an actuator with a higher safety level can be controlled stably.
  • the processor may generate an emergency control message according to the emergency mode and emergency control the actuator based on the emergency control message. Accordingly, it is possible to stably drive the vehicle in emergency mode.
  • when the processor determines that the central signal processing device is malfunctioning, it may switch to emergency mode, generate a control message to substitute for the central signal processing device, and control the sensor device, camera, or actuator to operate based on the control message. Accordingly, it is possible to stably drive the vehicle in emergency mode.
  • the processor may determine an error grade or recovery grade based on driving state information and fault type information, and control operation in the emergency mode or degraded mode based on the error grade or recovery grade. Accordingly, it is possible to stably drive the vehicle in emergency mode or degraded mode.
  • the processor may control recovery to the previous mode when recovery conditions are met during emergency mode or degraded mode operation. Accordingly, it is possible to stably drive the vehicle.
  • a signal processing device and a vehicle display device including the same exchange data with an area signal processing device and include a processor that executes an application for driving the vehicle based on sensor data or camera data received from the area signal processing device, and the processor controls operation in a degradation mode when performance degradation information or degradation mode entry information is received from the area signal processing device. Accordingly, it is possible to stably drive the vehicle. In particular, vehicle driving can be performed stably in the degraded mode.
  • the processor may control transmission of diagnosis result data of sensor data or camera data to the area signal processing device based on sensor data or camera data received from the area signal processing device. Accordingly, it is possible to stably drive the vehicle.
  • the processor executes a hypervisor, executes a plurality of virtualization machines on the hypervisor, and any one of the plurality of virtualization machines may execute a fault manager. Accordingly, it is possible to stably drive the vehicle.
  • the processor executes a second fault manager, and the safety level of the second fault manager may be greater than the safety level of the fault manager running on the hypervisor. Accordingly, it is possible to stably drive the vehicle.
  • a vehicle display device includes at least one display, an area signal processing device that receives sensor data or camera data from a sensor device or a camera, and a central signal processing device that outputs an image signal to be displayed on the display based on the sensor data or camera data. Accordingly, it is possible to stably drive the vehicle.
  • FIG. 1 is a diagram showing an example of the exterior and interior of a vehicle.
  • Figure 2 is a diagram showing an example of a communication gateway for a vehicle.
  • FIG. 3A is a diagram illustrating an example of the arrangement of a vehicle display device inside a vehicle.
  • FIG. 3B is a diagram illustrating another example of the arrangement of a vehicle display device inside a vehicle.
  • FIG. 4 is an example of an internal block diagram of the vehicle display device of FIG. 3B.
  • FIGS. 5A to 5D are diagrams illustrating various examples of vehicle display devices.
  • Figure 6 is an example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • Figure 7 is an example of an internal block diagram of a signal processing device according to an embodiment of the present disclosure.
  • Figure 8 is a flowchart showing a method of operating a vehicle display device according to an embodiment of the present disclosure.
  • Figure 9 is another example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • Figure 10 is another example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • Figure 11 is another example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • FIGS. 12A to 19C are diagrams referenced in the description of the operation of FIGS. 8 to 11 .
  • the suffixes "module" and "unit" for components used in the following description are given simply for ease of writing this specification and do not in themselves have any particularly important meaning or role. Accordingly, the terms "module" and "unit" may be used interchangeably.
  • FIG. 1 is a diagram showing an example of the exterior and interior of a vehicle.
  • the vehicle 200 includes a plurality of wheels 103FR, 103FL, and 103RL rotated by a power source, and a steering wheel 150 for controlling the traveling direction of the vehicle 200.
  • the vehicle 200 may be further equipped with a camera 195 for acquiring images in front of the vehicle.
  • the vehicle 200 may be equipped with a plurality of displays 180a and 180b inside for displaying images, information, etc.
  • a cluster display 180a and an Audio Video Navigation (AVN) display 180b are illustrated as a plurality of displays 180a and 180b.
  • the AVN (Audio Video Navigation) display 180b may also be called a center information display.
  • the vehicle 200 described in this specification may be a concept that includes all of a vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as power sources, and an electric vehicle having an electric motor as a power source.
  • Figure 2 is a diagram showing an example of a communication gateway for a vehicle.
  • the architecture 300a of a vehicle communication gateway may correspond to a zone-based architecture.
  • sensor devices and processors inside the vehicle may be placed in each of the plurality of zones (Z1 to Z4), and a signal processing device 170a including a vehicle communication gateway (GWDa) may be disposed in the central area of the plurality of zones (Z1 to Z4).
  • the signal processing device 170a may further include an autonomous driving control module (ACC), a cockpit control module (CPG), etc., in addition to the vehicle communication gateway (GWDa).
  • the vehicle communication gateway (GWDa) in the signal processing device 170a may be a High Performance Computing (HPC) gateway.
  • the signal processing device 170a of FIG. 2 is an integrated HPC and can exchange data with an external communication module (not shown) or a processor (not shown) within a plurality of zones (Z1 to Z4).
  • FIG. 3A is a diagram illustrating an example of the arrangement of a vehicle display device inside a vehicle.
  • examples of displays that may be disposed inside the vehicle include the cluster display 180a, the AVN (Audio Video Navigation) display 180b, rear seat entertainment displays 180c and 180d, and a room mirror display (not shown).
  • FIG. 3B is a diagram illustrating another example of the arrangement of a vehicle display device inside a vehicle.
  • the vehicle display device 100 may include a plurality of displays 180a to 180b and a signal processing device 170 that performs signal processing to display images, information, etc. and outputs an image signal to at least one of the displays 180a to 180b.
  • the first display 180a may be a cluster display for displaying driving status, operation information, etc., and the second display 180b may be an AVN (Audio Video Navigation) display for displaying vehicle operation information, navigation maps, and various entertainment information or images.
  • the signal processing device 170 has a processor 175 therein, and can execute first to third virtualization machines (not shown) on a hypervisor (not shown) within the processor 175.
  • a second virtualization machine (not shown) may operate for the first display 180a, and a third virtualization machine (not shown) may operate for the second display 180b.
  • the first virtualization machine (not shown) in the processor 175 may control the shared memory 508 based on the hypervisor 505 to be set for transmission of the same data to the second virtualization machine (not shown) and the third virtualization machine (not shown). Accordingly, the same information or the same image can be displayed in synchronization on the first display 180a and the second display 180b within the vehicle.
  • the first virtual machine (not shown) in the processor 175 shares at least part of the data with the second virtual machine (not shown) and the third virtual machine (not shown) for data sharing processing. Accordingly, data can be shared and processed in multiple virtual machines for multiple displays in the vehicle.
  • the first virtualization machine (not shown) in the processor 175 may receive and process wheel speed sensor data of the vehicle and transmit the processed wheel speed sensor data to at least one of the second virtualization machine (not shown) or the third virtualization machine (not shown). Accordingly, it is possible to share the vehicle's wheel speed sensor data with at least one virtualization machine, etc.
  • the vehicle display device 100 may further include a rear seat entertainment (RSE) display 180c for displaying driving status information, simple navigation information, and various entertainment information or images.
  • the signal processing device 170 may execute a fourth virtualization machine (not shown), in addition to the first to third virtualization machines (not shown), on the hypervisor (not shown) in the processor 175 to control the RSE display 180c.
  • some of the plurality of displays 180a to 180c may operate based on Linux OS, and others may operate based on web OS.
  • the signal processing device 170 can control displays 180a to 180c operating under various operating systems (OS) to display the same information or the same image in synchronization.
  • the figure illustrates an example in which a vehicle speed indicator 212a and a vehicle internal temperature indicator 213a are displayed on the first display 180a, a home screen 222 including a plurality of applications, a vehicle speed indicator 212b, and a vehicle internal temperature indicator 213b is displayed on the second display 180b, and a second home screen 222b including a plurality of applications and a vehicle internal temperature indicator 213c is displayed on the third display 180c.
  • FIG. 4 is an example of an internal block diagram of the vehicle display device of FIG. 3B.
  • a vehicle display device 100 may include an input unit 110, a communication unit 120 for communication with an external device, a plurality of communication modules EMa to EMd for internal communication, a memory 140, a signal processing device 170, a plurality of displays 180a to 180c, an audio output unit 185, and a power supply unit 190.
  • a plurality of communication modules may be respectively disposed in a plurality of zones (Z1 to Z4) in FIG. 2 .
  • the signal processing device 170 may be provided with a communication switch 736b therein for data communication with each communication module (EM1 to EM4).
  • Each communication module may perform data communication with a plurality of sensor devices (SN) or ECU (770) or area signal processing device (170Z).
  • the plurality of sensor devices SN may include a camera 195, LIDAR 196, radar 197, or location sensor 198.
  • the input unit 110 may be equipped with physical buttons, pads, etc. for button input, touch input, etc.
  • the input unit 110 may be equipped with a microphone (not shown) for user voice input.
  • the communication unit 120 can exchange data with the mobile terminal 800 or the server 900 in a wireless manner.
  • the communication unit 120 can exchange data wirelessly with the vehicle driver's mobile terminal.
  • various data communication methods such as Bluetooth, WiFi, WiFi Direct, and APiX are possible.
  • the communication unit 120 may receive weather information, road traffic situation information, for example, Transport Protocol Expert Group (TPEG) information, from the mobile terminal 800 or the server 900.
  • the communication unit 120 may be equipped with a mobile communication module (not shown).
  • the plurality of communication modules may receive sensor data, etc. from the ECU 770, the sensor device SN, or the area signal processing device 170Z, and transmit the received sensor data to the signal processing device 170.
  • the sensor data may include at least one of vehicle direction data, vehicle location data (GPS data), vehicle angle data, vehicle speed data, vehicle acceleration data, vehicle tilt data, vehicle forward/reverse data, battery data, fuel data, tire data, vehicle lamp data, vehicle interior temperature data, and vehicle interior humidity data.
  • such sensor data may be obtained from a heading sensor, a yaw sensor, a gyro sensor, a position module, a vehicle forward/reverse sensor, a wheel sensor, a vehicle speed sensor, a vehicle body tilt sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor based on steering wheel rotation, a vehicle interior temperature sensor, a vehicle interior humidity sensor, etc.
  • the position module may include a GPS module or a location sensor 198 for receiving GPS information.
  • At least one of the plurality of communication modules may transmit location information data sensed by the GPS module or the location sensor 198 to the signal processing device 170.
  • at least one of the plurality of communication modules may receive vehicle front image data, vehicle side image data, vehicle rear image data, obstacle distance information around the vehicle, etc. from the camera 195, the lidar 196, or the radar 197, and transmit the received information to the signal processing device 170.
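  • Purely to make the sensor data flow above concrete, the sketch below models, in Python, a sensor record that a communication module might forward to the signal processing device 170; the field names and the serialization step are illustrative assumptions, not the disclosed message format.

      from dataclasses import dataclass, asdict
      from typing import Optional, Tuple

      @dataclass
      class SensorRecord:
          """Hypothetical subset of the sensor data listed above."""
          zone: str                          # zone Z1 to Z4 served by the communication module
          vehicle_speed_kph: float
          gps: Tuple[float, float]           # (latitude, longitude) from the position module
          wheel_speeds_kph: Tuple[float, float, float, float]
          interior_temp_c: Optional[float] = None
          front_obstacle_distance_m: Optional[float] = None  # e.g. from radar 197 or lidar 196

      def forward_to_signal_processor(record: SensorRecord) -> dict:
          """Serialize the record as a communication module might do before transmission."""
          return asdict(record)

      sample = SensorRecord(zone="Z1", vehicle_speed_kph=62.0, gps=(37.5665, 126.9780),
                            wheel_speeds_kph=(61.8, 62.1, 61.9, 62.2),
                            front_obstacle_distance_m=34.5)
      print(forward_to_signal_processor(sample))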
  • the memory 140 may store various data for the overall operation of the vehicle display device 100, such as a program for processing or controlling the signal processing device 170.
  • the memory 140 may store data related to a hypervisor and first to third virtualization machines for execution in the processor 175.
  • the audio output unit 185 converts the electrical signal from the signal processing device 170 into an audio signal and outputs it. For this purpose, speakers, etc. may be provided.
  • the power supply unit 190 can supply power required for the operation of each component under the control of the signal processing device 170.
  • the power supply unit 190 may receive power from a battery inside the vehicle.
  • the signal processing device 170 controls the overall operation of each unit within the vehicle display device 100.
  • the signal processing device 170 may include a processor 175 that performs signal processing for the vehicle displays 180a and 180b.
  • the processor 175 may execute first to third virtualization machines (not shown) on a hypervisor (not shown) within the processor 175.
  • the first virtualization machine (not shown) may be called a server virtual machine, and the second and third virtualization machines (not shown) may be called guest virtual machines.
  • the first virtualization machine (not shown) in the processor 175 may receive, process, and output sensor data from a plurality of sensor devices, such as vehicle sensor data, location information data, camera image data, audio data, or touch input data.
  • the first virtualization machine may directly receive and process CAN data, Ethernet data, audio data, radio data, USB data, and wireless communication data on behalf of the second and third virtualization machines (not shown).
  • the first virtualization machine may transmit the processed data to the second to third virtualization machines (not shown).
  • since the first virtualization machine (not shown) among the first to third virtualization machines (not shown) receives sensor data, communication data, or external input data from the plurality of sensor devices and performs signal processing, the signal processing burden on the other virtualization machines is reduced, 1:N data communication becomes possible, and synchronization during data sharing becomes possible.
  • the first virtualization machine (not shown) may write data to the shared memory 508 and control the second virtualization machine (not shown) and the third virtualization machine (not shown) to share the same data.
  • the first virtualization machine (not shown) may write the vehicle sensor data, the location information data, the camera image data, or the touch input data to the shared memory 508 and control the second virtualization machine (not shown) and the third virtualization machine (not shown) to share the same data. Accordingly, sharing of data in a 1:N manner becomes possible.
  • the first virtualization machine (not shown) in the processor 175 may control the shared memory 508 based on the hypervisor 505 to be set for transmission of the same data to the second virtualization machine (not shown) and the third virtualization machine (not shown).
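  • The 1:N sharing through the shared memory 508 can be pictured with the toy model below: a server virtualization machine writes one copy of processed sensor data, and each guest virtualization machine for a display reads the same entry. The Python class, the lock-and-sequence layout, and the virtualization machine names are assumptions for illustration only, not the disclosed shared-memory design.

      import threading

      class SharedMemoryStub:
          """Toy stand-in for the shared memory 508: one writer, N readers."""
          def __init__(self) -> None:
              self._lock = threading.Lock()
              self._seq = 0
              self._data = None

          def write(self, data) -> int:
              with self._lock:
                  self._seq += 1
                  self._data = data
                  return self._seq

          def read(self):
              with self._lock:
                  return self._seq, self._data

      shared = SharedMemoryStub()

      def server_vm_publish(wheel_speed_kph: float) -> None:
          # The server virtualization machine processes the sensor input once
          # and writes a single copy for all guests.
          shared.write({"wheel_speed_kph": wheel_speed_kph})

      def guest_vm_render(name: str) -> None:
          # Each guest virtualization machine (e.g. one per display) reads the same
          # entry, so the displays can show the same value in synchronization.
          seq, data = shared.read()
          print(f"{name}: frame {seq} -> {data}")

      server_vm_publish(62.0)
      for guest in ("cluster_display_vm", "avn_display_vm", "rse_display_vm"):
          guest_vm_render(guest)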
  • the signal processing device 170 can process various signals, such as audio signals, video signals, and data signals.
  • the signal processing device 170 may be implemented in the form of a system on chip (SOC).
  • the signal processing device 170 in the display device 100 of FIG. 4 may be the same as the signal processing devices 170, 170a1, and 170a2 of the vehicle display device shown in FIG. 5A and below.
  • FIGS. 5A to 5D are diagrams illustrating various examples of vehicle display devices.
  • FIG. 5A shows an example of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 800a includes signal processing devices 170a1 and 170a2 and a plurality of region signal processing devices 170Z1 to 170Z4.
  • the signal processing devices 170a1 and 170a2 may also be called HPC (High Performance Computing) signal processing devices.
  • the plurality of area signal processing devices 170Z1 to 170Z4 are disposed in each area Z1 to Z4 and can transmit sensor data to the signal processing devices 170a1 and 170a2.
  • the signal processing devices 170a1 and 170a2 receive data by wire from a plurality of area signal processing devices 170Z1 to 170Z4 or the communication device 120.
  • data is exchanged based on wired communication between the signal processing devices 170a1 and 170a2 and the plurality of area signal processing devices 170Z1 to 170Z4, data is exchanged based on wireless communication between the communication device 120 and the server 400, and data may be exchanged based on wired communication between the signal processing devices 170a1 and 170a2 and the communication device 120.
  • data received by the signal processing devices 170a1 and 170a2 may include camera data or sensor data.
  • sensor data within a vehicle includes vehicle wheel speed data, vehicle direction data, vehicle location data (GPS data), vehicle angle data, vehicle speed data, vehicle acceleration data, vehicle tilt data, vehicle forward/reverse data, and battery. It may include at least one of data, fuel data, tire data, vehicle lamp data, vehicle interior temperature data, vehicle interior humidity data, vehicle exterior radar data, and vehicle exterior lidar data.
  • camera data may include camera data outside the vehicle and camera data inside the vehicle.
  • the signal processing devices 170a1 and 170a2 may execute a plurality of virtual machines 820, 830, and 840 based on safety.
  • the figure illustrates that a processor 175 in the signal processing device 170a1 executes a hypervisor 505 and executes first to third virtualization machines 820 to 840 on the hypervisor 505 according to the Automotive Safety Integrity Level (ASIL).
  • the first virtual machine 820 may be a virtual machine corresponding to Quality Management (QM), which is the lowest safety level and non-mandatory level in the Automotive Safety Integrity Level (ASIL).
  • the first virtual machine 820 can run an operating system 822, a container runtime 824 on the operating system 822, and containers 827 and 829 on the container runtime 824.
  • the second virtualization machine 830 may be a virtualization machine corresponding to ASIL A or ASIL B in the Automotive Safety Integrity Level (ASIL), where the sum of severity, exposure, and controllability is 7 or 8.
  • the second virtualization machine 830 may run an operating system 832, a container runtime 834 on the operating system 832, and containers 837 and 839 on the container runtime 834.
  • the third virtualization machine 840 may be a virtualization machine corresponding to ASIL C or ASIL D, where the sum of severity, exposure, and controllability is 9 or 10.
  • ASIL D can correspond to grades requiring the highest safety level.
  • the third virtual machine 840 can run the safety operating system 842 and the application 845 on the operating system 842.
  • the third virtual machine 840 may run the safety operating system 842, the container runtime 844 on the safety operating system 842, and the container 847 on the container runtime 844.
  • the third virtual machine 840 may be executed through a separate core rather than the processor 175. This will be described later with reference to FIG. 5B.
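  • A minimal sketch of the grouping above, assuming the severity, exposure, and controllability factors have already been mapped to numeric scores (the scoring itself and the function names are illustrative, not part of the disclosure): the sum selects QM, ASIL A/B, or ASIL C/D, which in turn selects the first, second, or third virtualization machine of FIG. 5A.

      def asil_class(severity: int, exposure: int, controllability: int) -> str:
          """Classify by the sum of the three factors, as described above."""
          total = severity + exposure + controllability
          if total >= 9:       # 9 or 10
              return "ASIL C/D"
          if total >= 7:       # 7 or 8
              return "ASIL A/B"
          return "QM"

      def target_virtualization_machine(asil: str) -> int:
          """Map the class to the virtualization machine numerals of FIG. 5A."""
          return {"QM": 820, "ASIL A/B": 830, "ASIL C/D": 840}[asil]

      for s, e, c in [(2, 2, 2), (3, 3, 2), (4, 3, 3)]:
          cls = asil_class(s, e, c)
          print(f"S+E+C={s + e + c}: {cls} -> virtualization machine {target_virtualization_machine(cls)}")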
  • FIG. 5B shows another example of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 800b includes signal processing devices 170a1 and 170a2 and a plurality of region signal processing devices 170Z1 to 170Z4.
  • the vehicle display device 800b of FIG. 5B is similar to the vehicle display device 800a of FIG. 5A, but the signal processing device 170a1 has some differences from the signal processing device 170a1 of FIG. 5A.
  • the signal processing device 170a1 may include a processor 175 and a second processor 177.
  • the processor 175 in the signal processing device 170a1 executes a hypervisor 505 and executes first and second virtualization machines 820 and 830 on the hypervisor 505 according to the Automotive Safety Integrity Level (ASIL).
  • the first virtual machine 820 can run an operating system 822, a container runtime 824 on the operating system 822, and containers 827 and 829 on the container runtime 824.
  • the second virtualization machine 830 may run an operating system 832, a container runtime 834 on the operating system 832, and containers 837 and 839 on the container runtime 834.
  • the second processor 177 in the signal processing device 170a1 may execute the third virtual machine 840.
  • the third virtualization machine 840 may execute the safety operating system 842, AUTOSAR 846 on the operating system 842, and the application 845 on AUTOSAR 846. That is, unlike FIG. 5A, AUTOSAR 846 may be further executed on the operating system 842.
  • the third virtualization machine 840 may run the safety operating system 842, the container runtime 844 on the safety operating system 842, and the container 847 on the container runtime 844, similar to FIG. 5A.
  • the third virtualization machine 840, which requires a high safety level, is preferably executed on the second processor 177, which is a different core or different processor, unlike the first and second virtualization machines 820 and 830.
  • when the first signal processing device 170a1 malfunctions, the second signal processing device 170a2, which is provided for backup, may operate.
  • the signal processing devices 170a1 and 170a2 may also operate simultaneously, with the first signal processing device 170a1 operating as the main device and the second signal processing device 170a2 operating as the sub device. This will be described with reference to FIGS. 5C and 5D.
  • Figure 5C shows another example of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 800c includes signal processing devices 170a1 and 170a2 and a plurality of region signal processing devices 170Z1 to 170Z4.
  • the signal processing devices 170a1 and 170a2 may also be called HPC (High Performance Computing) signal processing devices.
  • the plurality of area signal processing devices 170Z1 to 170Z4 are disposed in each area Z1 to Z4 and can transmit sensor data to the signal processing devices 170a1 and 170a2.
  • the signal processing devices 170a1 and 170a2 receive data by wire from a plurality of area signal processing devices 170Z1 to 170Z4 or the communication device 120.
  • data is exchanged based on wired communication between the signal processing devices 170a1 and 170a2 and the plurality of area signal processing devices 170Z1 to 170Z4, data is exchanged based on wireless communication between the communication device 120 and the server 400, and data may be exchanged based on wired communication between the signal processing devices 170a1 and 170a2 and the communication device 120.
  • data received by the signal processing devices 170a1 and 170a2 may include camera data or sensor data.
  • the processor 175 in the first signal processing device 170a1 among the signal processing devices 170a1 and 170a2 may execute the hypervisor 505 and execute a safety virtualization machine 860 and a non-safety virtualization machine 870, respectively, on the hypervisor 505.
  • the processor 175b in the second signal processing device 170a2 among the signal processing devices 170a1 and 170a2 may execute the hypervisor 505b and execute only a safety virtualization machine 880 on the hypervisor 505b.
  • FIG. 5D shows another example of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 800d includes signal processing devices 170a1 and 170a2 and a plurality of region signal processing devices 170Z1 to 170Z4.
  • the vehicle display device 800d of FIG. 5D is similar to the vehicle display device 800c of FIG. 5C, but the second signal processing device 170a2 has some differences from the second signal processing device 170a2 of FIG. 5C.
  • the processor 175b in the second signal processing device 170a2 of FIG. 5D runs a hypervisor 505b and may execute a safety virtualization machine 880 and a non-safety virtualization machine 890, respectively, on the hypervisor 505b.
  • the difference is that the processor 175b in the second signal processing device 170a2 further executes the non-safety virtualization machine 890.
  • accordingly, safety processing and non-safety processing are separated between the first signal processing device 170a1 and the second signal processing device 170a2, thereby improving stability and processing speed.
  • Figure 6 is an example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 900 includes a signal processing device 170 and at least one display.
  • the at least one display is exemplified by a cluster display 180a, an AVN display 180b, and network displays 180c and 180d.
  • the cluster display 180a and the AVN display 180b may each be connected to a display port.
  • the network displays 180c and 180d may be connected to the vehicle's internal network through network ports, respectively.
  • the network at this time may be an Ethernet network based on Ethernet communication.
  • the network displays 180c and 180d are illustrated as being connected to the third area signal processing device 170Z3 and the fourth area signal processing device 170Z4, respectively. Alternatively, however, they may be connected to other area signal processing devices or directly to the signal processing device 170.
  • the vehicle display device 900 may further include a plurality of region signal processing devices 170Z1 to 170Z4.
  • the signal processing device 170 is a high-performance, centralized signal processing and control device having a plurality of CPUs 175, GPUs 178, and NPUs 179, and may be called a High Performance Computing (HPC) signal processing device or a central signal processing device.
  • the plurality of area signal processing devices 170Z1 to 170Z4 and the signal processing device 170 are connected with wired cables (CB1 to CB4).
  • the plurality of area signal processing devices 170Z1 to 170Z4 may be connected with wired cables (CBa to CBd), respectively.
  • the wired cable may include a CAN communication cable, an Ethernet communication cable, or a PCI Express cable.
  • the signal processing device 170 may include at least one processor 175, 177, 178, or 179 and a large-capacity storage device 925.
  • the signal processing device 170 may include central processors 175 and 177, a graphics processor 178, and a neural processor 179.
  • sensor data may be transmitted to the signal processing device 170 from at least one of the plurality of area signal processing devices 170Z1 to 170Z4.
  • sensor data may be stored in the storage device 925 within the signal processing device 170.
  • the sensor data at this time includes camera data, lidar data, radar data, vehicle direction data, vehicle location data (GPS data), vehicle angle data, vehicle speed data, vehicle acceleration data, vehicle tilt data, vehicle forward/backward data, It may include at least one of battery data, fuel data, tire data, vehicle lamp data, vehicle interior temperature data, and vehicle interior humidity data.
  • in this example, camera data from the camera 195a and lidar data from the lidar sensor 196 are input to the first area signal processing device 170Z1, and the camera data and lidar data are transmitted to the signal processing device 170 via the second area signal processing device 170Z2 and the third area signal processing device 170Z3.
  • since the data read or write speed of the storage device 925 is faster than the network speed when sensor data is transmitted to the signal processing device 170 from at least one of the plurality of area signal processing devices 170Z1 to 170Z4, it is desirable to perform multi-path routing to prevent network bottlenecks from occurring.
  • the signal processing device 170 may perform multi-path routing based on a software defined network (SDN). Accordingly, it is possible to secure a stable network environment when reading or writing data in the storage device 925. Furthermore, since data can be transmitted to the storage device 925 using multiple paths, data can be transmitted by dynamically changing the network configuration.
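  • To illustrate the multi-path idea, the sketch below spreads writes toward the storage device 925 over several paths by always picking the least-loaded one; the path names, load metric, and selection rule are assumptions and only mirror the bottleneck-avoidance behavior described above, not the disclosed SDN implementation.

      from dataclasses import dataclass, field

      @dataclass
      class Path:
          name: str
          queued_bytes: int = 0   # pending traffic on this path

      @dataclass
      class MultiPathRouter:
          """Toy multi-path routing toward the storage device 925."""
          paths: list = field(default_factory=lambda: [Path("Z2->HPC"), Path("Z3->HPC")])

          def send(self, payload_bytes: int) -> str:
              # Pick the least-loaded path so no single link becomes a bottleneck.
              path = min(self.paths, key=lambda p: p.queued_bytes)
              path.queued_bytes += payload_bytes
              return path.name

      router = MultiPathRouter()
      for chunk in (8_000_000, 8_000_000, 2_000_000):   # e.g. camera or lidar chunks
          print(router.send(chunk))   # load spreads across both paths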
  • data communication between the signal processing device 170 and the plurality of area signal processing devices 170Z1 to 170Z4 in the vehicle display device 900 according to an embodiment of the present disclosure is preferably PCI Express (Peripheral Component Interconnect Express) communication for high-bandwidth, low-latency communication.
  • Figure 7 is an example of an internal block diagram of a signal processing device according to an embodiment of the present disclosure.
  • a signal processing system 1000 may include a central signal processing device 170 and a region signal processing device 170z.
  • the signal processing device 170 in the system 1000 includes a plurality of processor cores (CR1 to CRn, MR).
  • processor cores may correspond to processor cores in the central processor (CPU) of FIG. 6.
  • some of the plurality of processor cores may correspond to application processor cores in the central processor (CPU) of FIG. 6.
  • some (CR1 to CRn) of the plurality of processor cores CR1 to CRn and MR operate based on the hypervisor 505, and the hypervisor may execute a plurality of virtualization machines 820 to 850.
  • another part (MR) of the plurality of processor cores CR1 to CRn and MR may correspond to an M core or a microcontroller unit (MCU), and the fourth virtualization machine 840 may be executed on it.
  • the fourth virtual machine 840 may execute an application corresponding to a second safety level, such as ASIL D, or a micro service 843 corresponding to an application corresponding to the second safety level. Accordingly, the application or microservice 843 corresponding to the second safety level can be stably performed.
  • the first processor core executes the hypervisor 505, which may support a second safety level such as ASIL D, and on the hypervisor 505 the operating system 805b may be executed, with the first virtualization machine 850 executed on the operating system 805b.
  • the first virtual machine 850 may execute an application corresponding to a first safety level, such as ASIL B, or microservices 853a and 853b corresponding to an application corresponding to the first safety level. Accordingly, applications or microservices 853a and 853b corresponding to the first safety level can be stably performed.
  • that is, the first processor core (CR1) among the plurality of processor cores CR1 to CRn and MR may run the operating system 805b corresponding to the first safety level, such as ASIL B, on the hypervisor 505.
  • the second processor core (CR2) and the third processor core (CR3) run the hypervisor 505, may execute the operating system 805c corresponding to the first safety level such as ASIL B on the hypervisor 505, and may execute the second virtualization machine 830 on the operating system 805c.
  • the second virtualization machine 830 may execute, on the operating system 805c corresponding to the first safety level, a third application corresponding to the first safety level such as ASIL B, or microservices 833a to 833d corresponding to the third application. Accordingly, applications or microservices 833a to 833d corresponding to the first safety level can be performed stably.
  • the remaining processor cores (CR4 to CRn) among the plurality of processor cores CR1 to CRn and MR run the hypervisor 505, may execute the operating system 805d corresponding to a third safety level such as QM on the hypervisor 505, and may execute the third virtualization machine 820 on the operating system 805d.
  • the third virtualization machine 820 may execute, on the operating system 805d corresponding to the third safety level lower than the first safety level, a fourth application corresponding to the third safety level such as QM, or microservices 823a to 823d corresponding to the fourth application. Accordingly, applications or microservices 823a to 823d corresponding to the third safety level can be performed stably.
  • the area signal processing device 170z may be provided with a plurality of application processor cores CRR1 to CRRm and an M core MRb to execute ASIL D applications corresponding to the second safety level, which is the highest safety level.
  • some (CRR1 to CRRm) of the plurality of processor cores CRR1 to CRRm and MRb in the area signal processing device 170z run an operating system 806b corresponding to the first safety level such as ASIL B, and a virtualization machine 830b corresponding to the first safety level may be executed on the operating system 806b.
  • the virtualization machine 830b corresponding to the first safety level may execute an application corresponding to the first safety level, such as ASIL B, or microservices 830ba to 830bd corresponding to that application. Accordingly, applications or microservices 830ba to 830bd corresponding to the first safety level can be performed stably.
  • another part (MRb) of the plurality of processor cores CRR1 to CRRm and MRb in the area signal processing device 170z executes an operating system 806a corresponding to the second safety level such as ASIL D, and a virtualization machine 840b corresponding to the second safety level may be executed on the operating system 806a.
  • the virtual machine 840b corresponding to the second safety level may execute an application corresponding to the second safety level, such as ASIL D, or a microservice 843b corresponding to an application corresponding to the second safety level. Accordingly, the application or microservice 843b corresponding to the second safety level can be stably performed.
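  • One way to picture the core-to-safety-level layout of FIG. 7 described above is as a static configuration table, sketched below; the table format and field names are assumptions, and the reference numerals simply follow the description above.

      # Hypothetical configuration table summarizing the core layout described above.
      CENTRAL_DEVICE_LAYOUT = {
          "CR1":          {"os": "805b", "vm": 850, "runs": "ASIL B apps / microservices 853a-853b"},
          "CR2-CR3":      {"os": "805c", "vm": 830, "runs": "ASIL B apps / microservices 833a-833d"},
          "CR4-CRn":      {"os": "805d", "vm": 820, "runs": "QM apps / microservices 823a-823d"},
          "MR (M core)":  {"os": None,   "vm": 840, "runs": "ASIL D app / microservice 843"},
      }
      AREA_DEVICE_LAYOUT = {
          "CRR1-CRRm":    {"os": "806b", "vm": "830b", "runs": "ASIL B apps / microservices 830ba-830bd"},
          "MRb (M core)": {"os": "806a", "vm": "840b", "runs": "ASIL D app / microservice 843b"},
      }
      for cores, cfg in {**CENTRAL_DEVICE_LAYOUT, **AREA_DEVICE_LAYOUT}.items():
          print(f"{cores}: OS {cfg['os']}, VM {cfg['vm']} -> {cfg['runs']}")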
  • Figure 8 is a flowchart showing a method of operating a vehicle display device according to an embodiment of the present disclosure.
  • the processor (175z in FIG. 9) in the area signal processing device 170z controls, in normal mode, transmission of sensor data received from the sensor device SN or camera data received from the camera 195 to the central signal processing device 170 (S810).
  • the area signal processing device 170z can transmit sensor data or camera data from the connected sensor device SN or camera 195 to the central signal processing device 170.
  • the sensor data at this time may include at least one of vehicle direction data, vehicle location data (GPS data), vehicle angle data, vehicle speed data, vehicle acceleration data, vehicle tilt data, vehicle forward/reverse data, battery data, fuel data, tire data, vehicle lamp data, vehicle interior temperature data, vehicle interior humidity data, road surface humidity data, emergency vehicle detection data, ultrasonic sensor data, lidar data, and radar data.
  • the camera data at this time may include camera data outside the vehicle or camera data inside the vehicle.
  • the central signal processing device 170 may control to output an image signal to the display 180 based on received sensor data or camera data in normal mode.
  • the central signal processing device 170 may control display of speed information on the cluster display 180a, or display of an image of the area in front of the vehicle on the AVN display 180b, based on the received sensor data or camera data.
  • received sensor data or camera data may include data about a vehicle in front, a pedestrian in front, or an obstacle, etc.
  • the central signal processing device 170 may generate a control signal based on the received sensor data or camera data in normal mode and transmit the generated control signal to the area signal processing device 170z (S815).
  • the processor 175 in the central signal processing device 170 may generate and output control signals for lateral or longitudinal control of vehicle travel, vehicle acceleration, or vehicle deceleration, based on the received data about the vehicle ahead, pedestrians ahead, obstacles, etc.
  • in normal mode, the processor 175z in the area signal processing device 170z may control the sensor device SN, the camera 195, or the actuator (SNc in FIG. 9) to operate based on a control signal received in response to transmission of sensor data or camera data. Accordingly, it is possible to stably drive the vehicle in normal mode.
  • the processor 175z in the area signal processing device 170z may control vehicle travel in the lateral direction based on a received control signal.
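  • A compact sketch of the normal-mode cycle of steps S810 and S815: the area device forwards sensor or camera data, the central device answers with a control signal, and the area device applies it to the actuator. The function names, the message fields, and the toy control rule are placeholders, not the disclosed interfaces.

      def area_device_normal_cycle(sensor_data: dict, send_to_central, apply_to_actuator) -> None:
          """One normal-mode cycle on the area signal processing device."""
          control_signal = send_to_central(sensor_data)   # S810: transmit; S815: central replies
          if control_signal is not None:
              apply_to_actuator(control_signal)           # lateral/longitudinal/accel/decel control

      def central_device_handle(sensor_data: dict) -> dict:
          """Toy stand-in for the central device: derive a control signal (S815)."""
          distance = sensor_data.get("front_obstacle_distance_m", float("inf"))
          return {"decelerate": distance < 30.0, "target_speed_kph": min(60.0, distance * 2)}

      area_device_normal_cycle(
          {"vehicle_speed_kph": 62.0, "front_obstacle_distance_m": 25.0},
          send_to_central=central_device_handle,
          apply_to_actuator=lambda cmd: print("actuator command:", cmd),
      )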
  • when the amount of received sensor data or camera data is less than a first reference value during normal mode, the processor 175z in the area signal processing device 170z determines that the performance of the sensor device SN or the camera 195 is degraded, and may generate performance degradation information or degradation mode entry information (S820).
  • the processor 175z in the area signal processing device 170z may control transmission of the generated performance degradation information or degradation mode entry information to the central signal processing device 170.
  • the processor 175 in the central signal processing device 170 controls operation in the degradation mode when performance degradation information or degradation mode entry information is received from the area signal processing device 170z (S828).
  • when the amount of received sensor data or camera data is less than the first reference value, the processor 175z in the area signal processing device 170z may control transmission, to the central signal processing device 170, of second sensor data or second camera data received from a sensor device or camera different from the sensor device SN or the camera 195.
  • after receiving performance degradation information or degradation mode entry information from the area signal processing device 170z, when second sensor data or second camera data received from an auxiliary sensor device SNb or an auxiliary camera, rather than the main sensor device SNa or the main camera 195, is received, the processor 175 in the central signal processing device 170 may perform the degradation mode based on the second sensor data or second camera data.
  • the second sensor data or the second camera data may have a smaller data amount or resolution than the main sensor data or main camera data from the main sensor device (SNa) or the main camera 195.
  • the processor 175 in the central signal processing unit 170 may generate and output a second control signal generated according to the degradation mode.
  • the processor 175z in the area signal processing device 170z may control the sensor device SN, the camera 195, or the actuator (SNc in FIG. 9) to operate based on the second control signal generated in the degraded mode.
  • control in the degraded mode is performed more stably than control in the normal mode.
  • in the degraded mode, the processor 175z in the area signal processing device 170z may control lateral control or longitudinal control of vehicle travel, or vehicle acceleration, to be performed less than in the normal mode. Accordingly, it is possible to stably drive the vehicle in the degraded mode.
  • the processor 175z in the area signal processing device 170z may control the vehicle to perform more decelerated driving in the degraded mode than in the normal mode. Accordingly, it is possible to stably drive the vehicle in the degraded mode.
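  • The degradation branch (S820 and S828) could look roughly like the sketch below; the threshold value standing in for the first reference value, the fallback to auxiliary sensor data, and the reduced speed cap are illustrative assumptions that only mirror the behavior described above.

      FIRST_REFERENCE_BYTES = 1_000_000   # hypothetical stand-in for the first reference value

      def check_degradation(received_bytes: int, main_frame, aux_frame):
          """Area-device side: detect degraded sensor/camera output and pick the data to forward."""
          if received_bytes < FIRST_REFERENCE_BYTES:
              # S820: generate degradation mode entry information and fall back to the
              # auxiliary device's data, which may have a smaller data amount or resolution.
              return {"degradation_mode_entry": True}, aux_frame
          return None, main_frame

      def central_control(degradation_info, speed_kph: float) -> dict:
          """Central-device side (S828): in degraded mode, drive more conservatively."""
          cap = 40.0 if degradation_info else 90.0    # less acceleration, more deceleration
          return {"target_speed_kph": min(speed_kph, cap)}

      info, frame = check_degradation(200_000, main_frame="main_camera", aux_frame="aux_camera")
      print(frame, central_control(info, speed_kph=62.0))   # aux_camera {'target_speed_kph': 40.0}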
  • when the central signal processing device 170 or the processor 175 within the central signal processing device 170 fails, the central signal processing device 170 generates failure information (S832) and may transmit the failure information to the area signal processing device 170z (S835).
  • the processor 175z in the area signal processing device 170z receives the failure information of the central signal processing device 170 and controls operation in the emergency mode based on the received failure information (S838).
  • part of the operation of the central signal processing device 170 may be replaced by the area signal processing device 170z.
  • a driving-related application executed in the central signal processing device 170 may be transferred and executed in the area signal processing device 170z. Accordingly, it is possible to stably drive the vehicle in emergency mode.
  • the processor 175z in the area signal processing device 170z determines whether the sensor device SN or the camera 195 has failed (S842), and in case of failure, may control transmission of the failure information or emergency mode entry information to the central signal processing device 170 (S845).
  • the processor 175z in the area signal processing device 170z may then control operation in the emergency mode (S848).
  • the processor 175z in the area signal processing device 170z may operate in the emergency mode when the amount of received sensor data or camera data is smaller than the first reference value. Accordingly, it is possible to stably drive the vehicle in emergency mode.
  • the processor 175z may control transmission of failure information or emergency mode entry information to the central signal processing device 170 when operating in the emergency mode. Accordingly, it is possible to stably drive the vehicle.
  • the processor 175z in the area signal processing device 170z may control a safe stop to be performed according to the emergency mode when the sensor device SN or the camera 195 fails (see the mode-selection sketch below). Accordingly, vehicle safety can be promoted.
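As a rough illustration of the mode selection described above (degradation when the received data amount falls below the first reference value, emergency mode and a safe stop on a sensor or camera failure), the following Python sketch shows one way such a check could be organized. It is not the patent's implementation; the class names, the byte threshold, and the message tuples are assumptions for illustration only.

```python
# Illustrative sketch only: mode selection in an area signal processing device.
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional


class Mode(Enum):
    NORMAL = auto()
    DEGRADED = auto()
    EMERGENCY = auto()


@dataclass
class SensorFrame:
    source: str          # "main" or "auxiliary"
    payload_bytes: int   # amount of data received in this cycle
    healthy: bool        # result of the sensor self-diagnosis


FIRST_REFERENCE_VALUE = 4096  # assumed per-cycle byte threshold


def select_mode(frame: SensorFrame) -> Mode:
    """Pick the operating mode for one processing cycle."""
    if not frame.healthy:
        return Mode.EMERGENCY          # sensor/camera failure -> emergency mode
    if frame.payload_bytes < FIRST_REFERENCE_VALUE:
        return Mode.DEGRADED           # performance degradation -> degraded mode
    return Mode.NORMAL


def area_processor_step(frame: SensorFrame, aux_frame: Optional[SensorFrame]):
    """One cycle of the area device: forward data and notify the central device."""
    mode = select_mode(frame)
    messages = []
    if mode is Mode.NORMAL:
        messages.append(("central", "sensor_data", frame))
    elif mode is Mode.DEGRADED:
        # report degradation and forward second (auxiliary) sensor data instead
        messages.append(("central", "degradation_mode_entry", None))
        if aux_frame is not None:
            messages.append(("central", "second_sensor_data", aux_frame))
    else:
        # report the failure and request a safe stop locally
        messages.append(("central", "emergency_mode_entry", None))
        messages.append(("actuator", "safe_stop", None))
    return mode, messages


if __name__ == "__main__":
    main = SensorFrame("main", payload_bytes=1024, healthy=True)
    aux = SensorFrame("auxiliary", payload_bytes=2048, healthy=True)
    print(area_processor_step(main, aux))
```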
  • Figure 9 is another example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 900 includes at least one display 180, an area signal processing device 170z that receives sensor data or camera data from a sensor device (SN) or a camera 195, and a central signal processing device 170 that outputs an image signal to the display based on the sensor data or camera data.
  • the area signal processing device 170z is provided with a processor 175z that, in the normal mode, controls the sensor data received from the sensor device SN or the camera data received from the camera 195 to be transmitted to the central signal processing device 170.
  • the processor 175z in the area signal processing device 170z controls operation in an emergency mode when the central signal processing device 170, the sensor device SN, or the camera 195 malfunctions.
  • the processor 175z may include a plurality of processor cores (CRR1 to CRR4).
  • in the drawing, a plurality of processor cores corresponding to ASIL D are illustrated, but various modifications are possible.
  • the processor 175z may execute the hypervisor 505b and execute the area fault manager 2015 on the hypervisor 505b.
  • the processor 175z may further execute sensor services 2012 and 2013 for transmitting sensor data or camera data on the hypervisor 505b.
  • the safety level of the area fault manager (2015) may be higher than the safety level of the sensor service (2012, 2013).
  • the safety level of the area fault manager 2015 is ASIL D, which is higher than ASIL B, the safety level of the sensor services 2012 and 2013. Accordingly, it is possible to stably drive the vehicle.
  • processor 175z may further execute the system fault manager 2017 on the hypervisor 505b.
  • system fault manager 2017 may generate a diagnosis result log based on diagnosis result data received from the central signal processing device 170.
  • the safety level of the area fault manager 2015 and the safety level of the system fault manager 2017 may be the same as ASIL D.
  • the drawing illustrates that a hypervisor 505b is executed on the plurality of processor cores (CRR1 to CRR4) in the area signal processing device 170z, and a plurality of operating systems 806a to 806d are executed on the hypervisor 505b.
  • the plurality of operating systems 806a to 806d may include operating systems 806a and 806b corresponding to ASIL B, the first safety level, and operating systems 806c and 806d corresponding to ASIL D, a second safety level higher than the first safety level.
  • a main sensor service (2012 in FIG. 11) that receives sensor data from the main sensor devices (SNa), and an auxiliary sensor service (2013 in FIG. 11) that receives sensor data from the auxiliary sensor devices (SNb), may be executed.
  • the main sensor devices SNa may include environmental sensors, vehicle sensors, and driver monitoring sensors.
  • the processor 175z may control the sensor device SN, the camera 195, or the actuator SNc to operate based on a control signal received based on transmission of sensor data or camera data.
  • the safety level of the actuator SNc may be higher than the safety level of the sensor device SN or the camera 195.
  • the safety level of the sensor device SN or the camera 195 may correspond to ASIL B, the first safety level, and the safety level of the actuator SNc may correspond to ASIL D, the second safety level higher than the first safety level.
  • the processor 175z can execute the hypervisor 505b and execute the virtualization machine 840b on the hypervisor 505b.
  • the virtualization machine 840b may execute an area fault manager (2015), a system fault manager (2017), and a sensor service (2012, 2013) for transmitting sensor data or camera data.
  • the central signal processing unit 170 includes a processor 175 that executes an application for driving the vehicle based on sensor data or camera data received from the area signal processing unit 170z.
  • when the processor 175 receives performance degradation information or degradation mode entry information from the area signal processing device 170z, it controls operation in the degradation mode. Accordingly, it is possible to stably drive the vehicle. In particular, vehicle driving can be performed stably in the degraded mode.
  • the processor 175 may control the diagnosis result data of the sensor data or camera data to be transmitted to the area signal processing device 170z based on the sensor data or camera data received from the area signal processing device 170z. Accordingly, it is possible to stably drive the vehicle.
  • the processor 175 may execute the hypervisor 505 and execute a plurality of virtual machines (820, 830, 850, 850ra, and 850rb) on the hypervisor 505.
  • the processor 175 may include a plurality of processor cores (MR, CR1 to CR4).
  • the M core executes a safety operating system and can execute communication services, V2X services, system services, fault monitors, etc. on the safety operating system.
  • the first core may execute a safety operating system and may execute, on the safety operating system, the virtualization machine 840 corresponding to ASIL D, the second safety level.
  • the first core among the plurality of processor cores (MR, CR1 to CR4) may execute the virtualization machine 830 corresponding to ASIL B, which is the first safety level.
  • the virtualization machine 830 can run a fault manager, a container orchestrator, etc.
  • the second core (CR2) among the plurality of processor cores (MR, CR1 to CR4) may execute the virtualization machine 850 corresponding to ASIL B, the first safety level, and the virtualization machine 820 corresponding to QM, the third safety level.
  • the virtualization machine 850 may execute services corresponding to ASIL B, the first safety level.
  • the virtualization machine 820 may execute services corresponding to QM, the third safety level.
  • the third core among the plurality of processor cores (MR, CR1 to CR4) may execute a redundant virtualization machine (850ra) corresponding to ASIL B, the first safety level.
  • the fourth core (CR4) among the plurality of processor cores (MR, CR1 to CR4) can execute the redundant virtualization machine (850rb) corresponding to ASIL B, the first safety level.
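The core, virtualization machine, and safety-level assignments described above for FIG. 9 can be pictured as a small placement table. The sketch below assumes a simple ordering QM < ASIL B < ASIL D and a rule that a service may only be placed on a virtualization machine of equal or higher safety level; the mapping, the rule, and the service names are illustrative assumptions, not taken from the patent.

```python
# Illustrative sketch only: placing services on virtualization machines by safety level.
SAFETY_ORDER = {"QM": 0, "ASIL B": 1, "ASIL D": 2}

# core -> list of (virtualization machine id, safety level);
# illustrative mapping loosely based on the description above
CORE_VM_MAP = {
    "CR1": [("VM830", "ASIL B"), ("VM840", "ASIL D")],
    "CR2": [("VM850", "ASIL B"), ("VM820", "QM")],
    "CR3": [("VM850ra", "ASIL B")],
    "CR4": [("VM850rb", "ASIL B")],
}


def can_host(service_level: str, vm_level: str) -> bool:
    """A service may only run on a VM whose safety level is at least as high."""
    return SAFETY_ORDER[vm_level] >= SAFETY_ORDER[service_level]


def place_service(name: str, level: str):
    """Return the first (core, vm) able to host the service, if any."""
    for core, vms in CORE_VM_MAP.items():
        for vm, vm_level in vms:
            if can_host(level, vm_level):
                return name, core, vm
    return None


if __name__ == "__main__":
    print(place_service("ad_judgment_service", "ASIL D"))   # needs an ASIL D VM
    print(place_service("infotainment_service", "QM"))      # any VM will do
```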
  • Figure 10 is another example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 2100 includes a central signal processing device 170 and a plurality of area signal processing devices 170z1 and 170z2.
  • the central signal processing device 170 may include a processor 175 including an application processor core and a second processor 177 including an M core.
  • the second processor 177 may execute a real-time operating system (RTOS) and execute a master fault manager 2022 on the RTOS.
  • the fault manager 2022 can be executed without the hypervisor 505.
  • the processor 175 runs the hypervisor 505, runs an RTOS on a portion of the hypervisor 505, runs a container runtime on the RTOS, and may run a slave fault manager 2024 and a voter 2025 on the container runtime.
  • the processor 175 may run an RTOS on another part of the hypervisor 505 and run a node 2030 including containers 2032 and 2034 on the RTOS.
  • processor 175 may run another service node 2039 on another part of hypervisor 505.
  • the first region signal processing device 170z1 runs a hypervisor 505b, runs an RTOS on a part of the hypervisor 505b, and provides a sensor service 2012 on the RTOS. It can be run.
  • the first region signal processing device 170z1 may run another node 2014 on another part of the hypervisor 505b.
  • the first region signal processing device 170z1 may execute an RTOS on another part of the hypervisor 505b and execute a fault manager 2015 on the RTOS.
  • the first region signal processing unit 170z1 may run an RTOS on another part of the hypervisor 505b and run a system fault manager (2017) and a safe stop planner (2019) on the RTOS.
  • the second region signal processing device 170z2 runs a hypervisor 505c, runs an RTOS on a part of the hypervisor 505c, and provides a sensor service 2041 on the RTOS. It can be run.
  • the second region signal processing device 170z2 may run another node 2043 on another part of the hypervisor 505c.
  • the second region signal processing device 170z2 may execute an RTOS on another part of the hypervisor 505c and execute a fault manager 2045 on the RTOS.
  • fault information is received by the slave fault manager 2024, and the slave fault manager 2024 can transmit the fault information to the master fault manager 2022.
  • the master fault manager 2022 may transmit fault information of the central signal processing device 170 to the system fault manager 2019 in the first area signal processing device 170z1.
  • the area fault manager 2015 in the first area signal processing device 170z1 may receive fault information from the sensor service 2012 and transmit the fault information to the system fault manager 2019.
  • the area fault manager (2045) in the second area signal processing device 170z2 may receive fault information from the sensor service 2041 and transmit the fault information to the system fault manager (2019) in the first area signal processing device 170z1.
  • the system fault manager 2019 in the first area signal processing device 170z1 can integrate and manage fault information.
  • system fault manager 2019 may be placed in the central signal processing unit 170 rather than the area signal processing unit 170z1.
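To make the fault-information flow of FIG. 10 concrete (area fault managers and the master fault manager all forwarding fault information to the system fault manager, which integrates it), here is a minimal sketch. The class and method names are assumptions; only the direction of the reports follows the description above.

```python
# Illustrative sketch only: fault-information aggregation across devices.
from collections import defaultdict


class SystemFaultManager:
    """Integrates fault information from all signal processing devices."""

    def __init__(self):
        self.faults = defaultdict(list)

    def report(self, source: str, fault: str):
        self.faults[source].append(fault)

    def summary(self):
        return dict(self.faults)


class AreaFaultManager:
    def __init__(self, name: str, system_fm: SystemFaultManager):
        self.name, self.system_fm = name, system_fm

    def on_sensor_fault(self, fault: str):
        # fault from a sensor service is forwarded to the system fault manager
        self.system_fm.report(self.name, fault)


class MasterFaultManager:
    def __init__(self, system_fm: SystemFaultManager):
        self.system_fm = system_fm

    def on_central_fault(self, fault: str):
        # faults of the central device are also sent to the system fault manager
        self.system_fm.report("central", fault)


if __name__ == "__main__":
    system_fm = SystemFaultManager()
    MasterFaultManager(system_fm).on_central_fault("node timeout")
    AreaFaultManager("area1", system_fm).on_sensor_fault("lidar dropout")
    AreaFaultManager("area2", system_fm).on_sensor_fault("camera underrun")
    print(system_fm.summary())
```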
  • Figure 11 is another example of a block diagram of a vehicle display device according to an embodiment of the present disclosure.
  • a vehicle display device 2000 includes a central signal processing device 170 and a plurality of area signal processing devices 170z1 and 170z2.
  • the central signal processing device 170 may include a processor 175 including an application processor core and a second processor 177 including an M core.
  • the second processor 177 may execute a real-time operating system (RTOS) and execute communication services 2121 and a fault manager 2022 on the RTOS.
  • the processor 175 runs the hypervisor 505, runs an RTOS on a portion of the hypervisor 505, runs a container runtime on the RTOS, and may run a fault manager 2024 and a voter 2025 on the container runtime.
  • the processor 175 may run an RTOS on another part of the hypervisor 505 and run a node 2030 including containers 2032 and 2034 on the RTOS.
  • processor 175 may run redundant node 1 (2036), redundant node 2 (2037), and teleoperation node (2038) on another part of hypervisor 505.
  • the first region signal processing device 170z1 may include a processor.
  • the first region signal processing device 170z1 runs a hypervisor 505b, runs an RTOS on a part of the hypervisor 505b, and provides sensor services (2012, 2013) on the RTOS. ) can be executed.
  • the first area signal processing device 170z1 may execute an actuator service (2011).
  • the first region signal processing device 170z1 may execute an RTOS on another part of the hypervisor 505b and execute a fault manager 2015 on the RTOS.
  • the first region signal processing unit 170z1 may run an RTOS on another part of the hypervisor 505b and run a system fault manager (2017) and a safe stop planner (2019) on the RTOS.
  • the first sensor service 2012 receives sensor data from the main sensor device (SNa) through a normal path and may transmit the received sensor data to the central signal processing device 170 or the second area signal processing device 170z2 through a normal path.
  • the second sensor service (2013) receives sensor data from the main sensor devices (SNa) or the auxiliary sensor devices (SNb) through a safety path and may transmit the received sensor data to the central signal processing device 170 or the second area signal processing device 170z2 through a safety path.
  • the actuator service 2011 can control the actuator SNc through a normal path or safety path.
  • the second region signal processing device 170z2 may include a processor.
  • the second region signal processing device 170z2 runs a hypervisor 505c, runs an RTOS on a part of the hypervisor 505c, and provides a sensor service 2041 on the RTOS. It can be run.
  • the second area signal processing device 170z2 may execute a redundant node 2044.
  • the second region signal processing device 170z2 may execute an RTOS on another part of the hypervisor 505c and execute a fault manager 2045 on the RTOS.
  • the sensor service 2041 receives sensor data from the main sensor device (SNaa) through a normal path and may transmit the received sensor data to the central signal processing device 170 or the first area signal processing device 170z1 through a normal path.
  • the sensor service 2041 receives sensor data from the main sensor devices (SNaa) or the auxiliary sensor devices (SNba) through a safety path and may transmit the received sensor data to the central signal processing device 170 or the first area signal processing device 170z1 through a safety path.
  • the processor 175 may run a plurality of virtualization machines 820, 830, 850, 850ra, and 850rb on the hypervisor 505, as shown in FIG. 9, and any one of the virtualization machines 820, 830, 850, 850ra, and 850rb may run the fault manager 2024.
  • the second fault manager 2022 may be executed without running the hypervisor 505, and the safety level of the second fault manager 2022 may be higher than the safety level of the fault manager 2024 running on the hypervisor 505.
  • the fault manager 2024 may run on the virtual machine 820 of ASIL B corresponding to the first safety level, and the second fault manager 2022 may be executed on the virtualization machine 840 of ASIL D corresponding to a second safety level higher than the first safety level. Accordingly, it is possible to stably drive the vehicle.
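As a concrete illustration of the normal-path / safety-path arrangement described above for FIG. 11, the sketch below forwards sensor data over the normal path and falls back to the safety path when the normal path fails. The transport abstraction and the fallback-on-failure policy are assumptions; the patent text only states that data may be carried over either path.

```python
# Illustrative sketch only: forwarding sensor data over a normal or safety path.
from typing import Callable, Dict, List

Transport = Callable[[bytes], bool]  # returns True when delivery succeeded


def make_link(log: List[str], name: str, works: bool = True) -> Transport:
    def send(data: bytes) -> bool:
        if works:
            log.append(f"{name}: delivered {len(data)} bytes")
        return works
    return send


def forward(data: bytes, paths: Dict[str, Transport]) -> str:
    """Try the normal path first, then fall back to the safety path."""
    if paths["normal"](data):
        return "normal"
    if paths["safety"](data):
        return "safety"
    raise RuntimeError("both paths failed")  # would trigger emergency handling


if __name__ == "__main__":
    log: List[str] = []
    paths = {
        "normal": make_link(log, "normal path", works=False),  # simulate a fault
        "safety": make_link(log, "safety path", works=True),
    }
    used = forward(b"\x00" * 1500, paths)
    print(used, log)
```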
  • FIGS. 12A to 19C are diagrams referenced in the description of the operation of FIGS. 8 to 11 .
  • FIGS. 12A and 12B are diagrams illustrating driving based on sensor data.
  • the first service 2012 in the first area signal processing device 170z1 receives sensor data from the main sensor device SNa (S1210).
  • the first service 2012 in the first area signal processing unit 170z1 analyzes the received sensor data and transmits the sensor data to the central signal processing unit 170 or the second area signal processing unit 170z2 (S1215).
  • services 2032 and 2034 in containers within the central signal processing unit 170, or redundant nodes 2036 and 2037, may receive sensor data and perform recognition and driving judgment (S1220).
  • the voter 2025 in the central signal processing unit 170 may generate a control message based on sensor data (S1225).
  • the voter 2025 in the central signal processing unit 170 may generate a control message including an acceleration message, a deceleration message, or a steering angle message.
  • the voter 2025 in the central signal processing device 170 may transmit the generated control message to the first area signal processing device 170z1.
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message.
  • the actuator service 2011 in the first area signal processing device 170z1 may control acceleration or deceleration, steering angle adjustment, etc. to be performed. Accordingly, safe driving based on sensor data is possible.
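The FIG. 12 flow above ends with the voter 2025 turning the recognition and driving judgments into a single control message (acceleration, deceleration, or steering angle) for the actuator service 2011. A minimal sketch of such a voting step is shown below; the median-vote rule and the message fields are assumptions, since the text does not specify how the voter combines the judgments.

```python
# Illustrative sketch only: reducing redundant driving judgments to one control message.
from dataclasses import dataclass
from statistics import median
from typing import List


@dataclass
class Judgment:
    accel_mps2: float      # requested acceleration (negative = deceleration)
    steering_deg: float    # requested steering angle


@dataclass
class ControlMessage:
    accel_mps2: float
    steering_deg: float


def vote(judgments: List[Judgment]) -> ControlMessage:
    """Median vote over redundant judgments tolerates a single outlier."""
    return ControlMessage(
        accel_mps2=median(j.accel_mps2 for j in judgments),
        steering_deg=median(j.steering_deg for j in judgments),
    )


def actuator_service(msg: ControlMessage) -> str:
    """Stand-in for the actuator service applying the control message."""
    action = "accelerate" if msg.accel_mps2 >= 0 else "decelerate"
    return f"{action} {abs(msg.accel_mps2):.1f} m/s^2, steer {msg.steering_deg:.1f} deg"


if __name__ == "__main__":
    judgments = [Judgment(0.5, 1.0), Judgment(0.6, 1.2), Judgment(-3.0, 9.0)]  # one outlier
    print(actuator_service(vote(judgments)))
```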
  • FIGS. 13A and 13B are diagrams illustrating the creation of a diagnosis result log based on sensor data.
  • the first service 2012 in the first area signal processing device 170z1 receives sensor data from the main sensor device (SNa).
  • the first service 2012 in the first area signal processing device 170z1 performs functional degradation analysis of the received sensor data during the normal mode (S1240).
  • the first service 2012 in the first area signal processing device 170z1 may determine that the function is degraded if the amount of sensor data or camera data received during the normal mode is less than the first reference value.
  • the sensor data, or the data analyzed as indicating functional degradation, is transmitted to the fault manager 2015 in the first area signal processing device 170z1.
  • the fault manager 2015 in the first area signal processing device 170z1 monitors the diagnosis result based on the sensor data or the data analyzed as indicating functional degradation (S1242).
  • the fault manager 2015 in the first area signal processing device 170z1 may transmit diagnosis result data to the central signal processing device 170 (S1244).
  • the fault manager 2022 in the central signal processing device 170 may receive diagnosis result data and perform diagnosis result monitoring based on the diagnosis result data (S1245).
  • the fault manager 2022 in the central signal processing unit 170 may transmit diagnosis result data of the central signal processing unit 170 to the first area signal processing unit 170z1.
  • the system fault manager 2017 in the first area signal processing device 170z1 may generate a diagnosis result log based on the diagnosis result data of the central signal processing device 170 (S1248). Accordingly, it is possible to generate a log of monitoring and diagnosis results based on functional deterioration of sensor devices, etc.
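The diagnosis-result logging of FIGS. 13A and 13B above can be summarized as: flag degradation when the received data amount is below the reference value, exchange diagnosis results between the area and central devices, and have the system fault manager write a log entry. The sketch below follows that outline; the threshold, the field names, and the JSON log format are assumptions.

```python
# Illustrative sketch only: building a diagnosis result log entry.
import json
import time

FIRST_REFERENCE_VALUE = 4096  # assumed bytes-per-cycle threshold


def analyze(received_bytes: int) -> dict:
    """Area-side functional degradation analysis of received data."""
    degraded = received_bytes < FIRST_REFERENCE_VALUE
    return {"received_bytes": received_bytes, "degraded": degraded}


def diagnosis_log_entry(area_result: dict, central_result: dict) -> str:
    """System fault manager: merge both diagnosis results into one log line."""
    entry = {
        "timestamp": time.time(),
        "area": area_result,
        "central": central_result,
    }
    return json.dumps(entry)


if __name__ == "__main__":
    area = analyze(received_bytes=1200)            # below threshold -> degraded
    central = {"node_ok": True, "latency_ms": 12}  # assumed central-side result
    print(diagnosis_log_entry(area, central))
```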
  • FIGS. 14A and 14B are diagrams illustrating emergency response based on sensor data.
  • the first service 2012 or the second service 2013 in the first area signal processing device 170z1 receives sensor data from the main sensor device (SNa) or the auxiliary sensor device (SNb).
  • the first service 2012 or the second service 2013 in the first area signal processing device 170z1 analyzes the received sensor data and sends the analyzed sensor data or the analyzed unexpected situation information to the system fault manager ( 2017) can be transmitted (S1250).
  • the unexpected situation information may include information such as a child rushing in front of the vehicle while driving, a jaywalker stepping in front of the vehicle while driving at night, or the location of an unexpected obstacle ahead.
  • the system fault manager 2017 in the first area signal processing device 170z1 may control switching to the emergency mode based on the analyzed sensor data or the analyzed emergency situation information (S1252).
  • system fault manager 2017 in the first area signal processing device 170z1 may transmit emergency mode information or emergency mode switch information to the safe stop planner 2019 or the central signal processing device 170.
  • the safe stop planner 2019 in the first area signal processing device 170z1 may generate a control message based on emergency mode information or emergency mode switch information (S1254).
  • the safe stop planner 2019 in the first area signal processing device 170z1 may generate a control message including an acceleration message, a deceleration message, or a steering angle message.
  • the safe stop planner 2019 in the first area signal processing device 170z1 may transmit the generated control message to the actuator service 2011 in the first area signal processing device 170z1.
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message (S1256).
  • the actuator service 2011 in the first area signal processing device 170z1 may control acceleration or deceleration, steering angle adjustment, etc. to be performed. Accordingly, safe driving is possible when an unexpected situation occurs.
  • FIG. 14C is a diagram referenced in the description of FIGS. 14A and 14B.
  • the first service 2012 in the first area signal processing device 170z1 receives sensor data from the main sensor device SNa (S1405).
  • the second service 2013 in the first area signal processing device 170z1 receives sensor data from the auxiliary sensor device (SNb) or the main sensor device (SNa) (S1407).
  • the first service 2012 or the second service 2013 in the first area signal processing device 170z1 analyzes the received sensor data, as in step S1250, and may transmit the analyzed sensor data or the analyzed unexpected situation information to the system fault manager (2017) (S1410).
  • the unexpected situation information may include information such as a child rushing in front of the vehicle while driving, a jaywalker stepping in front of the vehicle while driving at night, or the location of an unexpected obstacle ahead.
  • the system fault manager 2017 in the first area signal processing device 170z1 controls switching to the emergency mode based on the analyzed sensor data or the analyzed emergency situation information and provides emergency mode information or emergency mode switching information. It can be transmitted to the Safe Stop Planner (2019) or the central signal processing unit 170 (S1415).
  • the safe stop planner 2019 in the first area signal processing device 170z1 may generate a control message based on emergency mode information or emergency mode switch information.
  • the safe stop planner 2019 in the first area signal processing device 170z1 may transmit emergency mode information or emergency mode switch information to the central signal processing device 170 (S1418).
  • the safe stop planner 2019 in the first area signal processing device 170z1 may transmit the generated control message to the actuator service 2011 in the first area signal processing device 170z1 (S1420).
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message.
  • the actuator service 2011 in the first area signal processing device 170z1 may perform deceleration and control the vehicle to emergency stop. Accordingly, safe driving is possible when an unexpected situation occurs.
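For the emergency response of FIGS. 14A to 14C above, the safe stop planner 2019 generates deceleration control messages until the vehicle stops. The following sketch shows one plausible shape of such a planner; the deceleration value, the control cycle, and the message layout are assumptions rather than values given in the text.

```python
# Illustrative sketch only: a simple safe-stop planner emitting deceleration messages.
from dataclasses import dataclass
from typing import List

EMERGENCY_DECEL_MPS2 = 4.0   # assumed firm-but-controlled deceleration
CYCLE_S = 0.1                # assumed control cycle


@dataclass
class ControlMessage:
    decel_mps2: float
    steering_deg: float = 0.0  # hold the current lane while stopping


def plan_safe_stop(speed_mps: float) -> List[ControlMessage]:
    """Generate per-cycle deceleration messages until standstill."""
    messages = []
    v = speed_mps
    while v > 0.0:
        messages.append(ControlMessage(decel_mps2=EMERGENCY_DECEL_MPS2))
        v -= EMERGENCY_DECEL_MPS2 * CYCLE_S
    messages.append(ControlMessage(decel_mps2=0.0))  # hold at standstill
    return messages


if __name__ == "__main__":
    msgs = plan_safe_stop(speed_mps=20.0)
    print(f"{len(msgs)} control messages, stop after ~{len(msgs) * CYCLE_S:.1f} s")
```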
  • Figure 15a is a diagram illustrating a degradation mode related to sensor function degradation.
  • the first service 2012 in the first area signal processing device 170z1 receives sensor data from the main sensor device (SNa), and activates the auxiliary sensor device (SNb) when the sensor function deteriorates.
  • the second service 2013 in the first area signal processing device 170z1 receives sensor data from the auxiliary sensor device SNb (S1510).
  • the second service 2013 in the first area signal processing device 170z1 transmits sensor data from the auxiliary sensor device SNb to the central signal processing device 170.
  • the voter 2025 in the central signal processing unit 170 may perform recognition or judgment based on the sensor data (S1512) and generate a control message based on the recognition or judgment (S1514).
  • the voter 2025 in the central signal processing device 170 may transmit the generated control message to the first area signal processing device 170z1.
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message (S1516).
  • FIG. 15B is a diagram referenced in the description of FIG. 15A.
  • the main sensor device (SNa) transmits sensor data to the first service 2012 in the first area signal processing device 170z1 (S1520).
  • the first service 2012 in the first area signal processing device 170z1 receives sensor data from the main sensor device (SNa), analyzes the sensor data, and may transmit the analysis result data or the sensor data to the fault manager 2015 (S1522).
  • the fault manager 2015 may determine that the sensor function is deteriorated and transmit an activation message to the auxiliary sensor device SNb (S1524).
  • the main sensor device (SNa) may enter a degradation mode (S1523).
  • the auxiliary sensor device SNb is activated and can transmit sensor data to the second service 2013 in the first area signal processing device 170z1 (S1526).
  • the second service 2013 in the first area signal processing device 170z1 may receive sensor data from the auxiliary sensor device SNb and transmit the sensor data to the node or the voter 2025 in the central signal processing device 170 (S1528).
  • the voter 2025 in the central signal processing unit 170 receives the sensor data (S1530), performs recognition or judgment based on the sensor data, may generate a control message based on the recognition or judgment, and may transmit the generated control message to the first area signal processing device 170z1 (S1532).
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message.
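The degradation handling of FIGS. 15A and 15B above (analysis of main-sensor data, activation of the auxiliary sensor device SNb, and forwarding of its data to the central device) could be organized roughly as in the sketch below. The payload-threshold analysis rule is an assumed placeholder for whatever analysis the first service 2012 actually performs.

```python
# Illustrative sketch only: switching to the auxiliary sensor on degradation.
from typing import Optional

FIRST_REFERENCE_VALUE = 4096


class SensorPair:
    """Main sensor plus an auxiliary sensor that is activated on demand."""

    def __init__(self):
        self.auxiliary_active = False

    def analyze_main(self, payload_bytes: int) -> bool:
        """Return True when the main sensor is considered degraded."""
        return payload_bytes < FIRST_REFERENCE_VALUE

    def step(self, main_payload: int, aux_payload: Optional[int]) -> str:
        if not self.auxiliary_active and self.analyze_main(main_payload):
            self.auxiliary_active = True          # fault manager activates SNb
        if self.auxiliary_active and aux_payload is not None:
            return f"forward auxiliary data ({aux_payload} bytes) to central device"
        return f"forward main data ({main_payload} bytes) to central device"


if __name__ == "__main__":
    pair = SensorPair()
    print(pair.step(main_payload=8000, aux_payload=None))   # normal mode
    print(pair.step(main_payload=1000, aux_payload=6000))   # degraded -> auxiliary
```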
  • Figure 16a is a diagram illustrating an emergency mode according to a sensor device failure.
  • the first area signal processing device 170z1 receives sensor data from the sensor device SN, and collects or analyzes related data when the sensor device SN fails (S1610).
  • the first area signal processing device 170z1 switches to the emergency mode (S1612).
  • the first area signal processing device 170z1 may determine a failure of the sensor device SN or the camera 195 and control operation in the emergency mode.
  • the first area signal processing device 170z1 may be controlled to transmit failure information or emergency mode entry information to the central signal processing device 170 when operating in the emergency mode.
  • the processor 175z in the first area signal processing device 170z1 may generate an emergency control message according to the emergency mode (S1615) and perform emergency control of the actuator SNc based on the emergency control message (S1616).
  • the processor 175z in the first area signal processing device 170z1 may emergency control the actuator SNc so that the vehicle comes to an emergency stop.
  • accordingly, in the event of a sensor device failure, the vehicle can be safely brought to an emergency stop.
  • FIG. 16B is a diagram referenced in the description of FIG. 16A.
  • steps S1520 to S1532 correspond to the degradation mode of FIG. 15B.
  • the auxiliary sensor device SNb may transmit sensor data to the second service 2013 in the first area signal processing device 170z1 (S1640).
  • the second service 2013 in the first area signal processing device 170z1 receives sensor data from the auxiliary sensor device SNb, analyzes the sensor data, and may transmit the analysis result data or the sensor data to the fault manager 2015 (S1642).
  • the fault manager 2015 in the first area signal processing device 170z1 may transmit sensor device failure information to the system fault manager 2017 in the first area signal processing device 170z1 (S1644).
  • system fault manager 2017 in the first area signal processing device 170z1 may control switching to emergency mode based on sensor device failure information.
  • the system fault manager 2017 in the first area signal processing device 170z1 may transmit emergency mode information or emergency mode switch information to the safe stop planner 2019 (S1645).
  • the safe stop planner 2019 in the first area signal processing device 170z1 may transmit emergency mode information or emergency mode switch information to the central signal processing device 170 (S1646).
  • the safe stop planner 2019 in the first area signal processing device 170z1 may generate a control message based on the emergency mode information or emergency mode switching information and transmit it to the actuator service 2011 (S1648).
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message.
  • the actuator SNc can bring the vehicle to an emergency stop. Accordingly, in the event of a sensor device failure, the vehicle can be safely brought to an emergency stop.
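FIGS. 16A and 16B above escalate from the degraded mode to the emergency mode when the auxiliary-sensor data also indicates a failure, and an emergency-stop control message is produced. The sketch below mirrors that escalation; the failure test and the contents of the emergency-stop message are assumptions.

```python
# Illustrative sketch only: escalation from degraded mode to emergency mode.
from enum import Enum, auto


class Mode(Enum):
    DEGRADED = auto()
    EMERGENCY = auto()


def check_auxiliary(payload_bytes: int, crc_ok: bool) -> bool:
    """Return True when the auxiliary sensor itself must be treated as failed."""
    return payload_bytes == 0 or not crc_ok


def area_device_cycle(mode: Mode, aux_payload: int, aux_crc_ok: bool):
    if mode is Mode.DEGRADED and check_auxiliary(aux_payload, aux_crc_ok):
        # system fault manager switches to emergency mode; the safe stop planner
        # then produces the emergency-stop control message for the actuator
        return Mode.EMERGENCY, {"decel_mps2": 6.0, "target": "standstill"}
    return mode, None


if __name__ == "__main__":
    print(area_device_cycle(Mode.DEGRADED, aux_payload=0, aux_crc_ok=True))
    print(area_device_cycle(Mode.DEGRADED, aux_payload=2048, aux_crc_ok=True))
```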
  • Figure 17a is a diagram illustrating an emergency mode according to a central signal processing device failure.
  • when the central signal processing device 170 fails, failure data can be generated and transmitted to the first area signal processing unit 170z1 (S1710).
  • the first area signal processing device 170z1 switches to the emergency mode (S1712).
  • the first area signal processing device 170z1 generates a control message based on sensor data received from the sensor device SN (S1714) and controls the actuator SNc based on the generated control message. (S1716).
  • when the processor 175z in the first area signal processing device 170z1 determines that the central signal processing device 170 is malfunctioning, it switches to the emergency mode, may generate a control message in place of the central signal processing device 170, and, based on the control message, may control the sensor device (SN), the camera 195, or the actuator (SNc) to operate.
  • accordingly, when the central signal processing device 170 fails, the first area signal processing device 170z1 replaces it, allowing the vehicle to drive safely.
  • FIG. 17B is a diagram referenced in the description of FIG. 17A.
  • steps S1720 to S1736 exemplify the degraded mode, and steps S1740 to S1750 exemplify the emergency mode.
  • the node within the central signal processing unit 170 generates deterioration data (S1720) and transmits the deterioration data to the fault manager 2022 (S1722).
  • the fault manager 2022 in the central signal processing unit 170 transmits deactivation information to the node in the central signal processing unit 170 (S1724).
  • the fault manager 2022 in the central signal processing unit 170 transmits auxiliary sensor device activation information to the first area signal processing unit 170z1 (S1725).
  • the fault manager 2022 in the central signal processing unit 170 may transmit activation information to the teleoperation node (S1727).
  • auxiliary sensor device (SNb) is activated, and the auxiliary sensor device (SNb) transmits sensor data (S1728).
  • the second service 2013 in the first area signal processing device 170z1 receives sensor data from the auxiliary sensor device SNb and transmits the sensor data to the central signal processing device 170.
  • sensor data from the auxiliary sensor device may be transmitted to a node in the central signal processing unit 170 (S1730) and may also be transmitted to the activated teleoperation node in the central signal processing unit 170 (S1732).
  • the node or the teleoperation node may perform recognition and driving judgment based on the received sensor data.
  • the voter 2025 in the central signal processing unit 170 may generate a control message based on sensor data.
  • the voter 2025 in the central signal processing unit 170 may generate a control message including an acceleration message, a deceleration message, or a steering angle message.
  • the voter 2025 in the central signal processing unit 170 may transmit the generated control message to the first area signal processing unit 170z1 (S1736).
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message.
  • the node within the central signal processing unit 170 generates failure data (S1740) and transmits the failure data to the fault manager 2022 (S1742).
  • the fault manager 2022 in the central signal processing unit 170 transmits deactivation information to the node in the central signal processing unit 170 (S1724).
  • the fault manager 2022 in the central signal processing unit 170 transmits fault data of the central signal processing unit 170 (S1744).
  • the system fault manager 2017 in the first area signal processing device 170z1 receives the failure data of the central signal processing device 170 and activates the emergency mode.
  • system fault manager 2017 in the first area signal processing device 170z1 can transmit emergency mode information to the safe stop planner 2019 (S1746).
  • system fault manager 2017 in the first area signal processing device 170z1 may transmit emergency mode information to the central signal processing device 170 (S1748).
  • the safe stop planner 2019 in the first area signal processing device 170z1 may transmit the generated control message to the actuator service 2011 in the first area signal processing device 170z1 based on the emergency mode information (S1750).
  • the actuator service 2011 in the first area signal processing device 170z1 may control the actuator SNc based on the control message.
  • the actuator service 2011 in the first area signal processing device 170z1 may perform deceleration and control the vehicle to emergency stop. Accordingly, it is possible to emergency stop the vehicle when the central signal processing device 170 malfunctions.
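In the FIG. 17 flow above, the first area signal processing device 170z1 substitutes for the central signal processing device 170 when failure data is received and generates the control message itself. A minimal selection function in that spirit is sketched below; the substitute keep-lane-and-slow-down rule is purely an illustrative assumption.

```python
# Illustrative sketch only: the area device takes over when the central device fails.
from typing import Optional


def choose_control(central_msg: Optional[dict], central_failed: bool) -> dict:
    if central_failed or central_msg is None:
        # emergency mode: the area device substitutes for the central device
        return {"source": "area", "accel_mps2": -2.0, "steering_deg": 0.0}
    return {"source": "central", **central_msg}


if __name__ == "__main__":
    print(choose_control({"accel_mps2": 0.3, "steering_deg": 1.5}, central_failed=False))
    print(choose_control(None, central_failed=True))
```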
  • FIGS. 18A to 18C are diagrams showing operations based on vehicle driving status and fault type information.
  • FIG. 18A illustrates an operation flowchart based on vehicle driving status and fault type information.
  • the processor 175z in the area signal processing device 170z1 checks driving state information and fault type information (S1810).
  • the processor 175z may determine an error grade or recovery grade based on the driving state information and fault type information (S1812).
  • the processor 175z can control the emergency mode or degraded mode to operate based on the error grade or recovery grade (S1814). Accordingly, it is possible to stably drive the vehicle in emergency mode or deterioration mode.
  • the processor 175z may control recovery to the previous mode when the recovery condition is met during emergency mode or degraded mode operation (S1816). Accordingly, it is possible to stably drive the vehicle.
  • steps S1810 to S1816 are applicable not only to the operation of the processor 175z in the area signal processing device 170z1 but also to the operation of the processor 175 in the central signal processing device 170.
  • FIG. 18B is a diagram illustrating errors and recovery between normal mode, degraded mode, and emergency mode
  • FIG. 18C is a diagram illustrating various error types and recovery between normal mode, degraded mode, and emergency mode.
  • when error 1 occurs, the normal mode (MDa) enters the degraded mode (MDb); when error 2 occurs, the normal mode (MDa) enters the emergency mode (MDc); and when error 3 occurs, the degraded mode (MDb) enters the emergency mode (MDc).
  • conversely, the degraded mode (MDb) may be converted to the normal mode (MDa), the emergency mode (MDc) may be converted to the normal mode (MDa), and a transition from the emergency mode (MDc) to the degraded mode (MDb) is also possible.
  • error 1 may be a failure of the main sensor device (SNa) or a failure of the auxiliary sensor device (SNb).
  • Error 2 may be a failure of the central signal processing device, etc.
  • Error 3 may include the failure type of error 2, or may be a failure of the redundant path.
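The transitions of FIGS. 18B and 18C above (error 1, error 2, error 3 and the recovery paths between the normal, degraded, and emergency modes) form a small state machine, sketched below. The transition table mirrors the description; the recovery event names are assumptions, since the text only says recovery occurs when the recovery condition is met.

```python
# Illustrative sketch only: mode transitions between MDa (normal), MDb (degraded), MDc (emergency).
TRANSITIONS = {
    ("MDa", "error1"): "MDb",            # normal   -> degraded
    ("MDa", "error2"): "MDc",            # normal   -> emergency
    ("MDb", "error3"): "MDc",            # degraded -> emergency
    ("MDb", "recover"): "MDa",           # degraded -> normal
    ("MDc", "recover_full"): "MDa",      # emergency -> normal
    ("MDc", "recover_partial"): "MDb",   # emergency -> degraded
}


def next_mode(mode: str, event: str) -> str:
    """Return the next mode; unknown events keep the current mode."""
    return TRANSITIONS.get((mode, event), mode)


if __name__ == "__main__":
    mode = "MDa"
    for event in ["error1", "error3", "recover_partial", "recover"]:
        mode = next_mode(mode, event)
        print(event, "->", mode)
```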

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)

Abstract

A signal processing device and a vehicle display device comprising the same, according to an embodiment of the present disclosure, include a processor that controls sensor data received from a sensor device or camera data received from a camera to be transmitted to a central signal processing device in a normal mode, the processor controlling, in the event of a malfunction of the central signal processing device, the sensor device, or the camera device, operation in an emergency mode. Accordingly, vehicle driving can be performed stably.
PCT/KR2023/015989 2022-10-18 2023-10-17 Signal processing device and vehicle display device comprising same WO2024085580A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
KR10-2022-0134127 2022-10-18
KR20220134127 2022-10-18

Publications (1)

Publication Number Publication Date
WO2024085580A1 true WO2024085580A1 (fr) 2024-04-25

Family

ID=90738109

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2023/015989 WO2024085580A1 (fr) Signal processing device and vehicle display device comprising same

Country Status (1)

Country Link
WO (1) WO2024085580A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101848312B1 (ko) * 2017-03-28 2018-04-13 (주) 모토텍 Sensor fusion system for an automatic emergency braking system of a vehicle
CN110133658A (zh) * 2019-04-28 2019-08-16 惠州市德赛西威智能交通技术研究院有限公司 Fault detection method and system applied to vehicle-mounted radar
US20190258251A1 (en) * 2017-11-10 2019-08-22 Nvidia Corporation Systems and methods for safe and reliable autonomous vehicles
KR102158497B1 (ko) * 2019-04-08 2020-09-22 도로교통공단 Autonomous driving evaluation system
US20220171611A1 (en) * 2020-11-27 2022-06-02 Denso Corporation Electronic control unit, software update method, software update program product and electronic control system



Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23880157

Country of ref document: EP

Kind code of ref document: A1