WO2021201305A1 - Method and device for changing a communication protocol for a vehicle


Publication number: WO2021201305A1
Authority: WIPO (PCT)
Prior art keywords: vehicle, signal quality, communication protocol, server, data
Application number: PCT/KR2020/004292
Other languages: English (en), Korean (ko)
Inventors: 임선희, 장은송, 김동욱, 박민규
Original Assignee: 엘지전자 주식회사
Application filed by 엘지전자 주식회사
Priority to PCT/KR2020/004292
Publication of WO2021201305A1


Classifications

    • H04L 1/20 - Arrangements for detecting or preventing errors in the information received, using a signal quality detector
    • H04L 43/00 - Arrangements for monitoring or testing data switching networks
    • H04W 24/04 - Arrangements for maintaining operational condition
    • H04W 4/02 - Services making use of location information

Definitions

  • This specification relates to wireless communication in a vehicle, and more particularly, to a method of changing a communication protocol and a control device for a vehicle using the same.
  • a vehicle is a device that moves a passenger from one place to another.
  • a typical example is a car.
  • V2X (vehicle-to-everything) communication refers to a technology in which a vehicle communicates with other vehicles, pedestrians, road infrastructure, and servers to provide a series of services.
  • Vehicles using V2X operate based on wireless communication with peripheral devices over wireless media. As vehicle-to-vehicle and vehicle-to-server communication becomes more frequent, communication delays or connection errors may occur. A technique to increase communication reliability is therefore required for V2X.
  • the present specification provides a method of changing a communication protocol and a control device for a vehicle using the same.
  • a method of changing a communication protocol performed by a control device of a vehicle.
  • the method includes communicating with a server according to a unidirectional communication protocol, switching from the unidirectional communication protocol to a bidirectional communication protocol when the signal quality with the server is less than or equal to a predetermined standard, and communicating with the server according to the bidirectional communication protocol.
  • a control device for a vehicle includes a processor and a memory coupled to the processor for storing instructions that, when executed by the processor, cause the vehicle to perform a function.
  • the function includes communicating with the server according to a unidirectional communication protocol, switching from the unidirectional communication protocol to a bidirectional communication protocol when the signal quality with the server falls below a predetermined standard, and communicating with the server according to the bidirectional communication protocol.
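The switching rule described above can be sketched as follows. This is a minimal illustration, not the specification's implementation: the class name, the protocol identifiers, and the example threshold value (an RSRP-like dBm figure) are all assumptions.

```python
# Illustrative sketch of the protocol-switching rule: the vehicle's control
# device starts on a unidirectional protocol and switches to a bidirectional
# protocol once the measured signal quality is at or below a predetermined
# standard. All names and the threshold value are assumptions.

UNIDIRECTIONAL = "unidirectional"
BIDIRECTIONAL = "bidirectional"

class VehicleControlDevice:
    def __init__(self, quality_standard: float):
        self.quality_standard = quality_standard  # predetermined standard
        self.protocol = UNIDIRECTIONAL            # initial protocol

    def on_signal_quality(self, measured_quality: float) -> str:
        """Switch to the bidirectional protocol when quality is at or
        below the predetermined standard; otherwise keep the current one."""
        if self.protocol == UNIDIRECTIONAL and measured_quality <= self.quality_standard:
            self.protocol = BIDIRECTIONAL
        return self.protocol

ctrl = VehicleControlDevice(quality_standard=-95.0)     # e.g. RSRP in dBm
assert ctrl.on_signal_quality(-80.0) == UNIDIRECTIONAL  # good signal: no switch
assert ctrl.on_signal_quality(-100.0) == BIDIRECTIONAL  # at/below standard: switch
```

Once switched, the device stays on the bidirectional protocol even if quality later improves; whether a switch back is allowed is not specified here and is left out of the sketch.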
  • FIG. 1 shows a system to which an embodiment is applied.
  • FIG. 2 is a block diagram illustrating a vehicle implementing the present embodiment.
  • FIG. 5 illustrates a map generation method according to an embodiment of the present specification.
  • FIG. 6 shows another example of generating a signal quality map.
  • FIG. 7 shows an example in which a communication protocol is changed while a server and a vehicle communicate with each other.
  • FIG. 8 is a flowchart illustrating a method of changing a communication protocol performed by a control device of a vehicle according to an embodiment of the present specification.
  • FIG. 9 is a flowchart illustrating a method of changing a communication protocol performed by a server according to an embodiment of the present specification.
  • the left side of the vehicle means the left side in the forward driving direction of the vehicle
  • the right side of the vehicle means the right side in the forward driving direction of the vehicle
  • FIG. 1 shows a system to which an embodiment is applied.
  • the system 100 includes a vehicle 200 , a base station 110 , and a server 120 .
  • the vehicle 200 may communicate with the base station 110 and/or the surrounding vehicle 130 using a wireless communication protocol.
  • Examples of wireless communication protocols include Dedicated Short Range Communications (DSRC) based on IEEE (Institute of Electrical and Electronics Engineers) 802.11, WiFi, C-V2X, and/or 3rd Generation Partnership Project (3GPP)-based cellular communication protocols (e.g., Wideband Code Division Multiple Access (WCDMA), Long Term Evolution (LTE), New Radio (NR), etc.).
  • the base station 110 may communicate with the vehicle 200 or other base stations using various wireless communication protocols such as DSRC, C-V2X, and cellular communication protocols.
  • the server 120 is connected to one or more base stations 110 and includes computing hardware that provides a driving data service to the vehicle 200 .
  • the computing hardware may include a processor and memory.
  • the memory stores map data and driving environment information described in the following embodiments, and the processor may provide the data to the vehicle 200 .
  • the processor may update the map data based on data received from one or more vehicles 200 .
  • the server may be a Mobile/Multi-access Edge Computing (MEC)-based server or a centralized server.
  • the vehicle 200 is defined as a means of transport traveling on a road or track.
  • the vehicle 200 is a concept including a car, a train, and a motorcycle.
  • the vehicle 200 may be a concept including an internal combustion engine vehicle having an engine as a power source, a hybrid vehicle having an engine and an electric motor as a power source, and an electric vehicle having an electric motor as a power source.
  • the vehicle 200 may be a vehicle owned by an individual.
  • the vehicle 200 may be a shared vehicle.
  • the vehicle 200 may be an autonomous driving vehicle.
  • the vehicle 200 may be set to operate autonomously.
  • Autonomous driving refers to driving without the assistance of a human driver, for example.
  • the vehicle 200 may be set to detect the surrounding vehicle 130 and determine the path of the detected vehicle.
  • the vehicle 200 may communicate with the surrounding vehicle 130 to exchange information.
  • the vehicle 200 may perform a switching operation from the autonomous driving mode to the manual driving mode, or from the manual driving mode to the autonomous driving mode. For example, the vehicle 200 may change its mode from the autonomous driving mode to the manual driving mode, or vice versa, based on a signal received from the user interface device.
  • ADAS (Advanced Driver Assistance System) may implement at least one of Adaptive Cruise Control (ACC), Autonomous Emergency Braking (AEB), Forward Collision Warning (FCW), Lane Keeping Assist (LKA), Lane Change Assist (LCA), Target Following Assist (TFA), Blind Spot Detection (BSD), Adaptive High Beam Control (HBA), Auto Parking System (APS), Pedestrian Collision Warning (PCW), Traffic Sign Recognition (TSR), Traffic Sign Assist (TSA), Night Vision (NV), Driver Status Monitoring (DSM), and Traffic Jam Assist (TJA).
  • FIG. 2 is a block diagram illustrating a vehicle implementing the present embodiment.
  • the vehicle 200 may include a control device 210, a user interface device 220, an acceleration device 230, a braking device 240, a steering device 250, a sensing device 260, and an engine 270.
  • the devices presented are merely examples, and not all devices are essential.
  • the vehicle 200 may further include additional devices, or specific devices may be omitted. Some of the devices have their own processors and can perform processing related to specific functions of the device.
  • the user interface device 220 is a device for communicating between the vehicle 200 and a user.
  • the user interface device 220 may receive a user input and provide information generated in the vehicle 200 to the user.
  • the vehicle 200 may implement a user interface (UI) or a user experience (UX) through the user interface device 220 .
  • the user interface device 220 may include an input device, an output device, and a user monitoring device.
  • the acceleration device 230 may be a mechanism configured to accelerate the vehicle 200 .
  • the brake device 240 may be a mechanism set to decelerate the vehicle 200 .
  • the steering device 250 may be a mechanism set to control the direction of the vehicle 200 .
  • the vehicle 200 may accelerate through the acceleration device 230 , decelerate through the brake device 240 , and change a driving direction through the steering device 250 .
  • At least one of the acceleration device 230, the brake device 240, and the steering device 250 may be controlled by the control device 210 and/or an additional controller to control the speed and direction of the vehicle 200.
  • the sensing device 260 may include one or more sensors configured to sense information about the location/speed of the vehicle 200 and/or the environment of the vehicle 200 .
  • the sensing device 260 may include a location data generating device for measuring a geographic location of the vehicle 200 and/or an object detecting device for recognizing an object around the vehicle 200 .
  • the object detecting apparatus may generate information about an object outside the vehicle 200 .
  • the information about the object may include at least one of information on the existence of the object, location information of the object, distance information between the vehicle 200 and the object, and relative speed information between the vehicle 200 and the object.
  • the object detecting apparatus may detect an object outside the vehicle 200 .
  • the object detecting apparatus may include at least one sensor capable of detecting an object outside the vehicle 200 .
  • the object detecting apparatus may include at least one of a camera, a radar, a lidar, an ultrasonic sensor, and an infrared sensor.
  • the object detection apparatus may provide data on an object generated based on a sensing signal generated by the sensor to at least one control device included in the vehicle 200 .
  • the camera may generate information about an object outside the vehicle 200 by using the image.
  • the camera may include at least one lens, at least one image sensor, and at least one processor that is electrically connected to the image sensor to process a received signal, and generate data about the object based on the processed signal.
  • the camera may be at least one of a mono camera, a stereo camera, and an AVM (Around View Monitoring) camera.
  • the camera may obtain position information of the object, distance information from the object, or relative speed information with the object by using various image processing algorithms.
  • the camera may acquire distance information and relative velocity information from an object based on a change in the size of the object over time from the acquired image.
  • the camera may acquire distance information and relative speed information with respect to an object through a pinhole model, road surface profiling, or the like.
  • the camera may acquire distance information and relative velocity information from an object based on disparity information in a stereo image obtained from the stereo camera.
  • the camera may be mounted at a position where a field of view (FOV) can be secured in the vehicle in order to photograph the outside of the vehicle.
  • the camera may be disposed adjacent to the front windshield in the interior of the vehicle to acquire an image of the front of the vehicle.
  • the camera may be placed around the front bumper or radiator grill.
  • the camera may be disposed adjacent to the rear glass in the interior of the vehicle to acquire an image of the rear of the vehicle.
  • the camera may be placed around the rear bumper, trunk or tailgate.
  • the camera may be disposed adjacent to at least one of the side windows in the interior of the vehicle in order to acquire an image of the side of the vehicle.
  • the camera may be disposed around a side mirror, a fender or a door.
  • the radar may generate information about an object outside the vehicle 200 using radio waves.
  • the radar may include an electromagnetic wave transmitter, an electromagnetic wave receiver, and at least one processor that is electrically connected to the electromagnetic wave transmitter and the electromagnetic wave receiver, processes a received signal, and generates data for an object based on the processed signal.
  • the radar may be implemented in a pulse radar method or a continuous wave radar method in terms of a radio wave emission principle.
  • among continuous wave radar methods, the radar may be implemented as a frequency modulated continuous wave (FMCW) type or a frequency shift keying (FSK) type according to the signal waveform.
  • the radar detects an object based on electromagnetic waves using a time of flight (TOF) method or a phase-shift method, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the radar may be placed at a suitable location outside of the vehicle to detect objects located in front, rear or side of the vehicle.
  • the lidar may generate information about an object outside the vehicle 200 using laser light.
  • the lidar may include a light transmitter, a light receiver, and at least one processor that is electrically connected to the light transmitter and the light receiver, processes the received signal, and generates data about the object based on the processed signal.
  • the lidar may be implemented in a time of flight (TOF) method or a phase-shift method.
  • Lidar can be implemented as driven or non-driven. When implemented as a driving type, the lidar is rotated by a motor and may detect an object around the vehicle 200 . When implemented as a non-driven type, the lidar may detect an object located within a predetermined range with respect to the vehicle by light steering.
  • Vehicle 200 may include a plurality of non-driven lidar.
  • LiDAR detects an object using laser light based on a time of flight (TOF) method or a phase-shift method, and may detect the position of the detected object, the distance to the detected object, and the relative speed.
  • the lidar may be placed at a suitable location outside of the vehicle to detect an object located in front, rear or side of the vehicle.
  • the location data generating apparatus may generate location data of the vehicle 200 .
  • the apparatus for generating location data may include at least one of a Global Positioning System (GPS) and a Differential Global Positioning System (DGPS).
  • the location data generating apparatus may generate location data of the vehicle 200 based on a signal generated by at least one of GPS and DGPS.
  • the apparatus for generating location data may correct location data based on at least one of an Inertial Measurement Unit (IMU) of the sensing device 260 and a camera of the object detecting apparatus.
  • the location data generating device may be referred to as a Global Navigation Satellite System (GNSS).
  • the sensing device 260 may include a state sensor configured to sense the state of the vehicle 200 .
  • the state sensor may include an inertial measurement unit (IMU) sensor, a collision sensor, a wheel sensor, a speed sensor, an inclination sensor, a weight sensor, a heading sensor, a position module, a vehicle forward/reverse sensor, a battery sensor, a fuel sensor, a tire sensor, a steering sensor, a temperature sensor, a humidity sensor, an ultrasonic sensor, and an illuminance sensor.
  • the inertial measurement unit (IMU) sensor may include at least one of an acceleration sensor, a gyro sensor, and a magnetic sensor.
  • the sensing device 260 may generate state data of the vehicle based on a signal generated by at least one sensor.
  • the vehicle state data may be information generated based on data sensed by various sensors provided inside the vehicle.
  • the sensing device 260 may generate vehicle attitude data, vehicle motion data, vehicle yaw data, vehicle roll data, vehicle pitch data, vehicle collision data, vehicle direction data, vehicle angle data, vehicle speed data, vehicle acceleration data, vehicle inclination data, vehicle forward/reverse data, vehicle weight data, battery data, fuel data, tire pressure data, vehicle interior temperature data, vehicle interior humidity data, steering wheel rotation angle data, vehicle exterior illumination data, pressure data applied to the accelerator pedal, pressure data applied to the brake pedal, and the like.
  • the engine 270 provides propulsion to the vehicle 200 .
  • the engine 270 may include an internal combustion engine, an electric motor, or a combination thereof.
  • the control device 210 communicates with the user interface device 220 , the acceleration device 230 , the brake device 240 , the steering device 250 , and the sensing device 260 to exchange various information or to control these devices.
  • the control device 210 may include a processor 211 and a memory 212 .
  • the control device 210 may include one or more sub-devices according to functions, and each sub-device includes at least one of a processor and a memory, and is configured to perform processing related to the function of the corresponding sub-device.
  • the control device 210 may include a telematics control unit (TCU) responsible for communication inside and outside the vehicle 200 .
  • the control device 210 may include an autonomous driving device in charge of autonomous driving.
  • the control device 210 may include an infotainment system or AVN (Audio Video Navigation) system that displays driving information to passengers or provides various entertainment.
  • the control device 210 may include a TCU or an infotainment system.
  • the control device 210 may include a combination of a TCU and an infotainment system or a combination of other functions.
  • the control device 210 for autonomous driving may generate a path for autonomous driving based on the obtained data.
  • the control device 210 may generate a driving plan for driving along the generated path.
  • the control device 210 may generate a signal for controlling the movement of the vehicle according to the driving plan.
  • the control device 210 may provide the generated signal to the acceleration device 230, the brake device 240, the steering device 250, and the engine 270.
  • the processor 211 may be an application-specific integrated circuit (ASIC), a central processing unit (CPU), an application processor (AP), a digital signal processor (DSP), a digital signal processing device (DSPD), a programmable logic device (PLD), a field programmable gate array (FPGA), a microcontroller, a chipset, a logic circuit, a data processing device, and/or a combination thereof.
  • Memory 212 may store information accessible by processor 211 .
  • the information may include instructions executable by the processor 211 and/or data processed by the processor.
  • Memory 212 may include any form of computer-readable medium operative to store information.
  • the memory 212 may include a read only memory (ROM), a random access memory (RAM), a digital video disc (DVD), an optical disc, a flash memory, a solid state drive (SSD), a hard drive, and combinations thereof.
  • Although the control device 210 is shown as one physical block including a processor and a memory, the control device 210 may include a plurality of processors and a plurality of memories, which may be physically or logically operably connected.
  • the control device 210 may be connected to the display device 280 for displaying information.
  • the display device 280 may include a liquid crystal display (LCD) touch screen or an organic light emitting diode (OLED) touch screen, and may include various sensors (video camera, microphone, etc.) for detecting the state or gesture of the passenger.
  • the control device 210 may be connected to a wireless modem 290 configured to communicate with other devices through a wireless medium.
  • the control device 210 may exchange a wireless signal with a mobile device or server ( 120 in FIG. 1 ) or a surrounding vehicle inside/outside the vehicle 200 through the wireless modem 290 .
  • the wireless modem 290 may support various wireless communication protocols such as cellular communication, WiFi, Bluetooth, Zigbee, and an infrared link.
  • the memory 212 of the control device 210 may have map information and/or driving plan data.
  • the driving plan data may include information about a vehicle trajectory for the vehicle 200 to track the location of the vehicle from the current location to the destination.
  • the driving plan data may be used to guide the driver on a route or for autonomous driving.
  • the map information may include various maps for defining the driving environment. Map information may include information about the shape and elevation of a roadway, lane lines, intersections, crosswalks, speed limits, traffic signals, buildings, or other objects.
  • the map information may further include real-time traffic information, obstacles on the road, road condition information, and the like.
  • the map information and the driving plan data may be updated based on information given by the server 120 or may be updated based on information detected by the sensing device 260 of the vehicle 200 .
  • the control device 210 may generate Electronic Horizon Data.
  • the electronic horizon data may be understood as driving plan data within a range from a point where the vehicle 200 is located to a horizon.
  • the horizon may be understood as a point in front of a preset distance from a point where the vehicle 200 is located based on a preset driving route.
  • the horizon may mean a point to which the vehicle 200 can reach after a predetermined time from a point where the vehicle 200 is located along a preset driving route.
  • the electronic horizon data may include horizon map data and horizon pass data.
  • the horizon map data may include at least one of topology data, road data, HD map data, and dynamic data.
  • the horizon map data may include a plurality of layers.
  • the horizon map data may include a first layer matching topology data, a second layer matching road data, a third layer matching HD map data, and a fourth layer matching dynamic data.
  • the horizon map data may further include static object data.
  • Topology data can be described as a map created by connecting road centers.
  • the topology data is suitable for roughly indicating the location of the vehicle, and may be in the form of data mainly used in navigation for drivers.
  • the topology data may be understood as data on road information excluding information on lanes.
  • the topology data may be generated based on data received from an external server.
  • the topology data may be based on data stored in at least one memory provided in the vehicle 200 .
  • the road data may include at least one of slope data of the road, curvature data of the road, and speed limit data of the road.
  • the road data may further include data on an overtaking prohibited section.
  • the road data may be based on data received from an external server.
  • the road data may be based on data generated by the object detecting apparatus.
  • the HD map data includes detailed lane-by-lane topology information of the road, connection information of each lane, and characteristic information for vehicle localization (eg, traffic signs, Lane Marking/attributes, Road furniture, etc.).
  • the HD map data may be based on data received from an external server through the communication device 220 .
  • the dynamic data may include various dynamic information that may be generated on the road.
  • the dynamic data may include construction information, variable speed lane information, road surface condition information, traffic information, moving object information, and the like.
  • the dynamic data may be based on data received from an external server.
  • the dynamic data may be based on data generated by the object detecting apparatus.
  • the horizon pass data may be described as a trajectory that the vehicle 200 can take within a range from a point where the vehicle 200 is located to the horizon.
  • the horizon pass data may include data representing a relative probability of selecting any one road at a decision point (eg, a fork, a junction, an intersection, etc.).
  • the relative probability may be calculated based on the time it takes to arrive at the final destination. For example, at the decision point, if the time taken to arrive at the final destination is shorter when selecting the first road than when selecting the second road, the probability of selecting the first road may be calculated to be higher than the probability of selecting the second road.
  • the horizon pass data may include a main path and a sub path.
  • the main path may be understood as a track connecting roads with a high relative probability of being selected.
  • the sub-path may diverge at at least one decision point on the main path.
  • the sub-path may be understood as a trajectory connecting at least one road having a low relative probability of being selected from at least one decision point on the main path.
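The relative-probability rule above (a shorter time to the final destination gives a road a higher probability of selection) can be sketched as follows. The inverse-time weighting is one plausible choice assumed for illustration; the specification does not fix a particular formula.

```python
def road_selection_probabilities(arrival_times: dict) -> dict:
    """Assign each candidate road at a decision point (fork, junction,
    intersection) a relative probability that is higher when its time to
    the final destination is shorter. Inverse-time weighting is an
    illustrative assumption, not the patent's formula."""
    weights = {road: 1.0 / t for road, t in arrival_times.items()}
    total = sum(weights.values())
    return {road: w / total for road, w in weights.items()}

# At a fork: the first road reaches the destination in 10 min, the second in 30.
probs = road_selection_probabilities({"road_1": 10.0, "road_2": 30.0})
assert probs["road_1"] > probs["road_2"]     # faster road is more probable
assert abs(sum(probs.values()) - 1.0) < 1e-9  # probabilities sum to 1
```

The main path would then connect the highest-probability roads, with sub-paths diverging at decision points where a lower-probability road branches off.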
  • a map showing the driving route 310 may be displayed to the driver, and the driver may directly drive the vehicle.
  • the vehicle 200 may communicate with the server 120 through the first base station 110a at point A, and communicate with the server 120 through the second base station 110b at point B.
  • the number or arrangement of base stations or servers is merely an example.
  • One or more base stations may be disposed on the driving path 310 .
  • one or more servers may be deployed for each base station.
  • Signal quality with the base station on the driving path 310 may change for various reasons. For example, the distance between the vehicle and the base station, the number of vehicles on the road, the arrangement of buildings, etc. may cause varying path loss, and the signal quality may change accordingly. When the signal quality drops to a certain level, the communication quality between the vehicle 200 and the server 120 may not be guaranteed, or the communication connection may be cut off.
  • the signal strength may include at least one of Received Signal Strength Indicator (RSSI), Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), and Signal to Interference Noise Ratio (SINR).
  • A first reference value TH1 and a second reference value TH2 are thresholds on the signal quality map, and the signal strength is represented by three levels: good, marginal, and poor. If the measured signal strength is greater than TH1, the level is 'G (good)'. If the measured signal strength is between TH1 and TH2, the level is 'M (marginal)'. If the measured signal strength is less than TH2, the level is 'P (poor)'. The number of levels is only an example.
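The three-level mapping described above can be sketched directly; the example threshold values (RSRP-like dBm figures) are assumptions for illustration only.

```python
def classify_signal_strength(measured: float, th1: float, th2: float) -> str:
    """Map a measured signal strength to the three levels on the signal
    quality map: 'G' above TH1, 'M' between TH2 and TH1, 'P' below TH2."""
    if measured > th1:
        return "G"  # good
    if measured > th2:
        return "M"  # marginal
    return "P"      # poor

# Illustrative thresholds (e.g. RSRP in dBm): TH1 = -90, TH2 = -110.
assert classify_signal_strength(-80.0, th1=-90.0, th2=-110.0) == "G"
assert classify_signal_strength(-100.0, th1=-90.0, th2=-110.0) == "M"
assert classify_signal_strength(-120.0, th1=-90.0, th2=-110.0) == "P"
```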
  • the signal quality map may include information about a section on the driving route 310 and signal quality in each section.
  • the signal quality map may include information about the reference signal quality at a reference position (or reference interval).
  • the signal quality map may include information about one or more reference signal qualities.
  • the signal quality map may be generated integrally with the geographic map for driving information, or may be generated separately from the geographic map.
  • the reference position includes a position where the reference signal quality is measured.
  • the reference section includes a section in which the reference signal quality is measured.
  • the reference time includes a time at which the reference signal quality is measured.
  • the reference sensor includes information about the sensor from which the reference signal quality was measured.
  • the signal quality map may include information about at least one of a reference position, a reference interval, a reference time, and a reference sensor.
  • the signal quality may include one or more indicators indicating the quality of communication between the vehicle and the server.
  • the signal quality may include at least one of signal strength, latency, bandwidth, and GPS accuracy.
  • the latency may represent the delay in communication between the vehicle and the server (or between the vehicle and the base station).
  • the bandwidth may indicate the maximum bandwidth (and/or minimum bandwidth) used in communication between the vehicle and the server (or the vehicle and the base station).
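One way to picture an entry of such a signal quality map is as a record keyed by reference position/section and carrying the quality indicators listed above. The field names and types below are illustrative assumptions, not defined in the source.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class MapEntry:
    """One reference point on the signal quality map (field names are illustrative)."""
    position: tuple                       # (lat, lon) of the reference position
    section_id: int                       # reference section on the driving route
    timestamp: float                      # reference time the quality was measured
    sensor_id: str                        # sensor that produced the measurement
    rsrp_dbm: float                       # signal strength indicator (RSRP used here)
    latency_ms: Optional[float] = None    # delay between vehicle and server
    bandwidth_mbps: Optional[float] = None  # max/min bandwidth on the link
```

A map could then be a list of such entries, shared between server and vehicle and updated in place as new measurements pass the reliability checks.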
  • the operation of the vehicle may be implemented by a control device of the vehicle and/or a wireless modem.
  • the operation of the server may be implemented by a processor of the server.
  • in step S510, the server sends the signal quality map to the vehicle through a wireless medium.
  • the server may send a signal quality map to the vehicle.
  • Vehicles can also communicate with servers based on signal quality maps, even in areas they have never driven before.
  • in step S520, the vehicle communicates with the server through the wireless medium along the driving route of the vehicle based on the signal quality map. The vehicle then measures the signal quality along the driving route.
  • a position at which the signal quality is measured on the driving route is referred to as a measurement position
  • a time at which the signal quality is measured is referred to as a measurement time
  • a sensor with which the signal quality is measured is referred to as a measurement sensor.
  • the vehicle may measure the signal quality when the driving route has been driven before.
  • the vehicle may measure the signal quality when driving at least N times (N>1) along the driving path.
  • in step S530, the vehicle updates the signal quality map based on the measured signal quality.
  • the vehicle may update the signal quality map based on the measured reliability of the signal quality.
  • the vehicle may update the signal quality map if the measured signal quality is different from the reference signal quality in the signal quality map.
  • here, "different" may mean that the difference between the reference signal quality and the measured signal quality is greater than a reference value.
  • the vehicle may update the signal quality map when a specific condition is satisfied.
  • the specific condition may include reliability of the measured signal quality.
  • the reliability of the measured signal quality may be determined based on a measurement position, a measurement time, and a measurement sensor. The vehicle may update the signal quality map if (i) the measurement position and the reference position are within a specific distance of each other, (ii) the measurement time is later than the reference time, and (iii) the accuracy of the measurement sensor is higher than the accuracy of the reference sensor. Alternatively, the vehicle may update the signal quality map when at least one of conditions (i) to (iii) is satisfied.
  • the accuracy of the sensor may be estimated based on a manufacturing year of the vehicle, a model number of the vehicle, a sensor type, and a sensor resolution.
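The reliability gate described by conditions (i) to (iii) might be sketched as follows. The 50 m distance threshold and the strict/relaxed toggle are assumptions; the source only names the three conditions.

```python
def should_update(dist_to_ref_m: float,
                  meas_time: float, ref_time: float,
                  meas_sensor_acc: float, ref_sensor_acc: float,
                  max_dist_m: float = 50.0, require_all: bool = True) -> bool:
    """Decide whether a new measurement is reliable enough to update the map.

    (i)   the measurement position lies within max_dist_m of the reference position,
    (ii)  the measurement is more recent than the reference measurement,
    (iii) the measuring sensor is more accurate than the reference sensor.
    require_all=True applies the strict variant; False updates when any condition holds.
    """
    conditions = (
        dist_to_ref_m <= max_dist_m,        # (i)   proximity
        meas_time > ref_time,               # (ii)  freshness
        meas_sensor_acc > ref_sensor_acc,   # (iii) sensor accuracy
    )
    return all(conditions) if require_all else any(conditions)
```

In the relaxed variant a fresh measurement can refresh the map even from a slightly worse sensor, trading some accuracy for more up-to-date coverage.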
  • the signal quality map may include information about a reference interval associated with a reference signal quality.
  • the vehicle may update the signal quality map when the measurement location belongs to the reference section.
  • restricting updates to the reference section in this way prevents the signal quality map from being updated too frequently.
  • the vehicle may transmit the updated signal quality map to the server via a wireless medium.
  • the server can communicate with other vehicles based on the updated signal quality map.
  • the server and vehicle can maintain more stable communication by obtaining section information with poor signal quality based on the signal quality map.
  • by updating the signal quality map based on highly reliable information, the accuracy of the signal quality can be increased.
  • because the vehicle itself stores and updates the signal quality map, operation remains possible even if the connection to the server is lost.
  • the vehicle may predict, based on the signal quality map, with what quality and stability it can maintain communication with the server on the driving route.
  • the vehicle can exchange high-quality/large-capacity messages with the server.
  • the vehicle can exchange a reduced amount of messages with the server, or only essential messages.
  • FIG. 6 shows another example of generating a signal quality map.
  • on a driving route 610 from point A to point B, there are a plurality of reference positions designated by the server.
  • the vehicle receives from the server a signal quality map containing the reference signal quality corresponding to each reference position.
  • the vehicle may establish one or more auxiliary reference positions between the two reference positions.
  • the vehicle measures the signal quality at each auxiliary reference position and stores it as the auxiliary reference signal quality. Thereafter, when the vehicle travels on the driving route 610 again, signal quality is measured not only at the reference positions but also at the auxiliary reference positions. If the measured signal quality differs from the auxiliary reference signal quality, the signal quality map can be updated; as before, the update can be gated on reliability.
  • through the added auxiliary reference positions, the vehicle can identify the signal quality on the driving route 610 in more detail.
  • the vehicle may notify the server of the auxiliary reference position and the associated auxiliary reference signal quality.
  • the server may additionally set the notified auxiliary reference position as a new reference position or replace the existing reference position with the notified auxiliary reference position.
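A simple way to place auxiliary reference positions between two server-designated reference positions is linear interpolation. Treating coordinates as planar (x, y) pairs is a simplifying assumption; real positions would be geodetic.

```python
def auxiliary_positions(ref_a: tuple, ref_b: tuple, n: int = 1) -> list:
    """Return n evenly spaced auxiliary reference positions strictly between
    two reference positions, given as (x, y) pairs."""
    (xa, ya), (xb, yb) = ref_a, ref_b
    return [(xa + (xb - xa) * k / (n + 1), ya + (yb - ya) * k / (n + 1))
            for k in range(1, n + 1)]
```

With n = 1 this yields the midpoint; the server can later promote any of these points to full reference positions as described above.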
  • a unidirectional communication protocol is a protocol in which a receiver does not send a separate response.
  • the unidirectional communication protocol may include Message Queuing Telemetry Transport (MQTT).
  • MQTT: Message Queuing Telemetry Transport
  • the one-way communication protocol is lightweight; the exchange completes once the transmitter publishes information to the receiver.
  • a bidirectional communication protocol involves a two-step message exchange in which a transmitter sends information to a receiver, and the receiver sends a response.
  • the bidirectional communication protocol may include HyperText Transfer Protocol (HTTP).
  • HTTP: HyperText Transfer Protocol
  • the one-way communication protocol uses MQTT and the two-way communication protocol uses HTTP, but this is only an example.
  • FIG. 7 shows an example in which a communication protocol is changed while a server and a vehicle communicate with each other.
  • the server 120 may include a processor 121 and a memory 122 .
  • the memory 122 may store information and instructions processed by the processor 121 .
  • the processor 121 may implement the operation of the server 120 .
  • the server 120 and the vehicle 100 may exchange and update the signal quality map according to the embodiments of FIGS. 5 and 6 .
  • the server 120 and the vehicle 100 may communicate while switching between a plurality of communication protocols.
  • the server 120 and the vehicle 100 may switch between the first communication protocol and the second communication protocol.
  • the first communication protocol may be a unidirectional communication protocol
  • the second communication protocol may be a bidirectional communication protocol.
  • the vehicle 100 and the server 120 communicate according to MQTT. Based on a specific condition, the vehicle 100 and the server 120 may communicate by switching to HTTP. Later, the vehicle 100 and the server 120 may communicate by switching back to MQTT.
  • the specific condition may be a signal quality between the vehicle 100 and the server 120 and/or an indication by the server 120 .
  • the server 120 and the vehicle 100 basically communicate according to MQTT, but may communicate by switching to HTTP depending on the situation. Afterwards, the server 120 and the vehicle 100 can communicate back to MQTT again.
  • Table 1 is an example of measuring the latency of HTTP and MQTT in LTE network and 5G NR network.
  • MQTT: a one-way protocol
  • HTTP: a two-way protocol
  • NR, which is more recent, is faster than LTE.
  • FIG. 8 is a flowchart illustrating a method of changing a communication protocol performed by a control device of a vehicle according to an embodiment of the present specification.
  • in step S810, the vehicle communicates with the server according to MQTT.
  • in step S820, the vehicle determines whether to change the communication protocol based on the signal quality with the server.
  • in step S830, when the signal quality is less than or equal to a predefined criterion, the vehicle switches from MQTT to HTTP and communicates. If the signal quality is good, the fast and lightweight MQTT is used; if the signal quality is poor, the more reliable HTTP is used.
  • the signal quality may include a packet loss rate. For example, if the packet loss rate is less than 3%, the vehicle can keep using MQTT, and if the packet loss rate is greater than 3%, it can switch to HTTP.
  • Signal quality maps may be exchanged between the vehicle and the server.
  • the vehicle can switch communication protocols based on the signal quality map.
  • the signal quality with the server may be determined based on the signal quality map.
  • when the vehicle enters a section with poor signal quality, MQTT may be switched to HTTP. When it re-enters a section with good signal quality, it can switch from HTTP back to MQTT. Since the signal quality map is shared between the vehicle and the server, each side can predict in which regions MQTT or HTTP will be used.
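Because the map is shared, both sides can precompute the protocol for each route section in advance. A minimal sketch, assuming a policy in which only 'P' (poor) sections fall back to HTTP:

```python
def protocols_for_route(section_quality: dict) -> dict:
    """Map each route section's quality level ('G'/'M'/'P') from the shared
    signal quality map to the protocol both sides expect to use there."""
    return {section: ("HTTP" if level == "P" else "MQTT")
            for section, level in section_quality.items()}
```

Running the same function on the same shared map on both sides keeps the vehicle's and the server's protocol choices in agreement without extra signaling.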
  • MQTT is used to speed up message transmission.
  • HTTP is used to increase the reliability of message transmission.
  • FIG. 9 is a flowchart illustrating a method of changing a communication protocol performed by a server according to an embodiment of the present specification.
  • in step S910, the server communicates with the vehicle according to MQTT.
  • in step S920, the server determines whether to change the communication protocol based on the signal quality with the vehicle.
  • in step S930, when the signal quality is less than or equal to a predetermined criterion, the server switches from MQTT to HTTP and communicates. If the signal quality is good, the fast and lightweight MQTT is used; if the signal quality is poor, the more reliable HTTP is used. If the signal quality metric is the packet loss rate, the server can keep using MQTT while the packet loss rate is less than 3% and switch to HTTP when it is greater than 3%.
  • Signal quality maps may be exchanged between the vehicle and the server.
  • the server can switch the communication protocol based on the signal quality map.
  • the signal quality with the vehicle may be determined based on the signal quality map.
  • when the vehicle enters a section with poor signal quality, MQTT may be switched to HTTP. When it re-enters a section with good signal quality, communication can switch from HTTP back to MQTT. Since the signal quality map is shared between the vehicle and the server, each side can predict in which regions MQTT or HTTP will be used.

Abstract

Disclosed are a method and a device for changing a communication protocol for a vehicle. The device communicates with a server according to a one-way communication protocol. If the signal quality between the device and the server is at or below a predetermined reference, the device switches from the one-way communication protocol to a two-way communication protocol.
PCT/KR2020/004292 2020-03-30 2020-03-30 Method and device for changing a communication protocol for a vehicle WO2021201305A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/KR2020/004292 WO2021201305A1 (fr) 2020-03-30 2020-03-30 Method and device for changing a communication protocol for a vehicle


Publications (1)

Publication Number Publication Date
WO2021201305A1 true WO2021201305A1 (fr) 2021-10-07

Family

ID=77928130

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2020/004292 WO2021201305A1 (fr) 2020-03-30 2020-03-30 Method and device for changing a communication protocol for a vehicle

Country Status (1)

Country Link
WO (1) WO2021201305A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20070118264 * 2005-03-23 2007-12-14 Qualcomm Flarion Technologies, Inc. Method and apparatus for using multiple wireless links with a wireless terminal
KR20150136141 * 2013-04-23 2015-12-04 Gurulogic Microsystems Oy Two-way real-time communication system using HTTP
KR20160103223 * 2015-02-23 2016-09-01 Industry-Academic Cooperation Foundation of Chonbuk National University Method and system for spatiotemporal traffic load distribution control using two-way communication
KR20170089579 * 2016-01-27 2017-08-04 Electronics and Telecommunications Research Institute One-way file transfer system and method
KR101804886B1 * 2017-01-03 2017-12-06 Funzin Co., Ltd. Relay server


Similar Documents

Publication Publication Date Title
EP3629059B1 (fr) Partage d'objets classés perçus par des véhicules autonomes
US10176715B2 (en) Navigation system with dynamic mapping mechanism and method of operation thereof
US11835948B2 (en) Systems and methods for improving vehicle operations using movable sensors
KR20220015491A (ko) Merging of data from multiple lidar devices
US10369995B2 (en) Information processing device, information processing method, control device for vehicle, and control method for vehicle
JPWO2016035199A1 (ja) Automatic driving management system, server, and automatic driving management method
US11529955B2 (en) Traffic light estimation
US11885893B2 (en) Localization based on predefined features of the environment
CN112469970A (zh) Method for estimating the localization quality of a vehicle with regard to its self-localization, device for carrying out the method steps of the method, and computer program
KR102548079B1 (ko) Operation of an autonomous vehicle based on availability of navigation information
WO2021201304A1 (fr) Method and device for assisting autonomous driving
JP7172603B2 (ja) Signal processing device, signal processing method, and program
WO2021201308A1 (fr) Method for generating a map reflecting signal quality, and device for a vehicle using said method
WO2021201305A1 (fr) Method and device for changing a communication protocol for a vehicle
KR20210110558A (ko) Electric steering torque compensation
WO2021215559A1 (fr) Vehicle monitoring method and apparatus
DK201970221A1 (en) Traffic light estimation
WO2021201306A1 (fr) Method and device for transmitting video recorded by a vehicle
JP7203123B2 (ja) Communication system, communication terminal, control method, program, and storage medium storing the program
WO2021201307A1 (fr) Method and apparatus for transmitting video recorded by a vehicle
US11480960B2 (en) Systems and methods remote control of vehicles
DK180220B1 (en) Sharing classified objects perceived by autonomous vehicles
WO2023162733A1 (fr) Distance measuring device and distance measuring method
WO2024018920A1 (fr) Information processing device and information processing method
WO2021100518A1 (fr) Signal processing device, signal processing system, and mobile device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20928686

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20928686

Country of ref document: EP

Kind code of ref document: A1