WO2021184841A1 - Internet of Vehicles method, apparatus, device, storage medium and system
- Publication number
- WO2021184841A1 (PCT/CN2020/134932)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- vehicle
- information
- road condition information
- Prior art date
Classifications
- H—ELECTRICITY; H04W—WIRELESS COMMUNICATION NETWORKS; H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION; Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT]
- H04W4/02—Services making use of location information
- H04W4/021—Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
- H04W4/023—Services making use of mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/027—Services making use of location information using movement velocity, acceleration information
- H04W4/38—Services specially adapted for collecting sensor information
- H04W4/44—Services specially adapted for vehicles, for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Definitions
- the present disclosure relates to the technical field of the Internet of Things, and in particular to a method, device, equipment, storage medium and system for the Internet of Vehicles.
- C-V2X (cellular vehicle-to-everything) wireless communication technology and dedicated short-range communication technology are widely used in the Internet of Vehicles industry.
- The present disclosure adopts cellular-based vehicle wireless communication because of its low sending and receiving overhead and its access to data calculation and processing modules. Through the exchange of information among vehicles, roads and people, safer road traffic patterns can be achieved.
- In the related technology, basic data is transmitted to the perception fusion module through the interconnection and intercommunication module, and the perception fusion data is transmitted back to the interconnection and intercommunication module through the basic real-time data module, which increases the time delay and causes unnecessary waste of resources.
- The embodiments of the present disclosure provide a perception fusion-based Internet of Vehicles method, apparatus, device, storage medium, and system, aiming to shorten the time delay of the Internet of Vehicles system.
- the embodiments of the present disclosure provide a vehicle networking method based on perception fusion, including:
- Acquiring perception fusion data, where the perception fusion data is determined by fusing the sensing data obtained from the sensing components;
- determining the road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
- the sensory fusion data includes the first position information of the sensor component
- the vehicle-mounted data includes the second location information of the vehicle-mounted terminal, the second location information includes the latitude and longitude information, and the map information includes the mapping relationship between the area node and the latitude and longitude,
- the method further includes:
- the road condition information is sent to the vehicle-mounted terminal.
- determining the road condition information based on the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
- the road condition information is determined according to the roadside data and the perception fusion data, and the road condition information is sent to the vehicle terminal.
- the roadside data includes third location information of the roadside unit
- The method further includes: sending at least one of the perception fusion data and the roadside data to a big data platform, where the big data platform is used to store at least one of the perception fusion data and the roadside data.
- the roadside data includes at least one of intersection information, status information of signal lights, traffic incident information, traffic sign information, basic road safety information, and basic vehicle safety information;
- the road condition information includes at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and status messages of signal lights.
- the sensing component includes at least one of a video acquisition device, a wave radar device, and a lidar device.
- the embodiments of the present disclosure also provide a car networking device based on perception fusion, including a cloud control platform, which is used to execute the aforementioned car networking method based on perception fusion.
- the embodiments of the present disclosure also provide a car networking device based on perception fusion, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor.
- the computer program includes the above-mentioned perception fusion-based Internet of Vehicles method.
- the embodiments of the present disclosure also provide a computer-readable storage medium, and a computer program is stored on the computer-readable storage medium, and the computer program includes the above-mentioned car networking method based on perception fusion.
- the embodiments of the present disclosure also provide a car networking system based on perception fusion, including:
- the cloud control platform is used to execute the above-mentioned method
- The sensing components are used to obtain sensing data.
- The perception fusion module is used to fuse the sensing data to obtain perception fusion data.
- The perception fusion data is obtained directly from the sensing components, the road condition information is then determined according to the perception fusion data, and the road condition information is sent to the vehicle-mounted terminal.
- Since the perception fusion data is determined directly by fusing the sensing data of the sensing components, the time delay can be reduced.
- The receiving and forwarding system does not need to receive the sensing data, nor does it need to send the sensing data to the perception fusion module, which can effectively improve the work efficiency and resource utilization of the receiving and forwarding system.
- FIG. 1 is a schematic flowchart of a method for Internet of Vehicles based on perception fusion according to an embodiment of the present disclosure
- FIG. 2 is a schematic flowchart of a method for Internet of Vehicles based on perception fusion according to another embodiment of the present disclosure
- FIG. 3 is a schematic flowchart of a method for Internet of Vehicles based on perception fusion according to another embodiment of the present disclosure
- FIG. 4 is a schematic structural diagram of a car networking system based on perception fusion provided by an embodiment of the present disclosure
- FIG. 5 is a topology diagram of hardware modules of a car networking system based on perception fusion according to an embodiment of the present disclosure
- FIG. 6 is a toolkit architecture diagram of a vehicle networking system based on perception fusion provided by an embodiment of the present disclosure.
- FIG. 1 is a flowchart of a method for Internet of Vehicles based on perception fusion provided by an embodiment of the present disclosure. The method includes:
- Step S101 Obtain perception fusion data.
- The perception fusion data is determined by fusing the sensing data obtained from the sensing component.
- the sensing component includes, for example, at least one of a video acquisition device, a wave radar device, and a lidar device.
- The video acquisition device is used to acquire video data, and the characteristics of the target image can be determined according to the video data.
- the wave radar device is, for example, a millimeter wave radar device, which is used to collect information such as the contour, speed, and position of the target.
- Lidar devices are used, for example, to enhance the description of contours.
- Step S102 Determine the road condition information according to the perception fusion data, and send the road condition information to the vehicle-mounted terminal.
- The perception fusion data is obtained directly from the sensing components, the road condition information is then determined according to the perception fusion data, and the road condition information is sent to the vehicle-mounted terminal.
- Since the perception fusion data is determined directly by fusing the sensing data of the sensing components, the time delay can be reduced.
- The receiving and forwarding system does not need to receive the sensing data, nor does it need to send the sensing data to the perception fusion module, which can effectively improve the work efficiency and resource utilization of the receiving and forwarding system.
- step S102 includes: acquiring vehicle-mounted data of the vehicle-mounted terminal, and sending road condition information to the vehicle-mounted terminal according to the perception fusion data and the vehicle-mounted data.
- Information such as the address of the vehicle-mounted terminal can be determined based on the vehicle-mounted data. Therefore, sending road condition information to the vehicle-mounted terminal based on the perception fusion data and the vehicle-mounted data can improve the accuracy of the result, and the road condition information can be sent to the designated vehicle-mounted terminal.
- step S102 includes:
- Step S1021 Determine the first designated area of the first location in the pre-stored map information.
- Step S1022 Determine the road condition information of the first designated area according to the perception fusion data.
- Step S1023 Send road condition information to the vehicle-mounted terminal located in the first designated area.
- The position of the sensing component in the map information can be determined first based on the perception fusion data, and then the first designated area is determined.
- the road condition information of the first designated area can be determined at the same time, and the road condition information can be sent to the vehicle-mounted terminal located in the first designated area, so that the vehicle-mounted terminal can timely determine the road condition environment around the vehicle-mounted terminal according to the road condition information.
- The vehicle-mounted data includes second location information of the vehicle-mounted terminal (for example, latitude and longitude information), and the map information includes the mapping relationship between area nodes and latitude and longitude. According to the second location information and the mapping relationship, the corresponding location of the vehicle-mounted terminal on the map information can be determined; when the corresponding location is within the first designated area, the road condition information is sent to the vehicle-mounted terminal.
- In this way, the corresponding position of the vehicle-mounted terminal on the map information can be accurately determined according to the mapping relationship between latitude and longitude and area nodes, so that the location of the vehicle-mounted terminal can be determined, and the road condition information is sent to the vehicle-mounted terminal when it is located in the first designated area.
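The area-node matching above can be pictured with a minimal grid-based sketch. The patent does not fix a concrete node scheme, so the 0.01-degree cell size, the node keys, and the coordinates below are illustrative assumptions only:

```python
# Hypothetical sketch of the area-node matching: map a terminal's
# latitude/longitude to a pre-stored area node, then decide whether the
# terminal falls inside the first designated area. A node is modelled as a
# cell of a fixed-size grid (the 0.01-degree cell is an assumption).

GRID = 0.01  # degrees per area node (illustrative)

def area_node(lat: float, lon: float) -> tuple:
    """Quantize a latitude/longitude pair to its area-node key."""
    return (int(lat // GRID), int(lon // GRID))

def should_send(terminal_pos: tuple, designated_nodes: set) -> bool:
    """Send road condition info only if the terminal maps into a designated node."""
    return area_node(*terminal_pos) in designated_nodes

# First designated area: the node containing the sensing component's position.
designated = {area_node(39.915, 116.404)}            # hypothetical coordinates
print(should_send((39.9151, 116.4042), designated))  # terminal in the same cell
print(should_send((40.2000, 117.0000), designated))  # terminal far away
```

A real deployment would load the node-to-latitude/longitude mapping from the pre-stored map information rather than derive it from a uniform grid.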
- the car networking method based on perception fusion includes:
- Step S301 Obtain roadside data of the roadside unit.
- Step S302 Obtain perception fusion data.
- The perception fusion data is determined by fusing the sensing data obtained from the sensing component.
- Step S303 Determine the road condition information according to the roadside data and the perception fusion data, and send the road condition information to the vehicle-mounted terminal.
- The order of step S301 and step S302 is not limited; step S301 can be performed before or after step S302.
- the second embodiment is different from the first embodiment in that it also acquires roadside data, and simultaneously determines road condition information based on the roadside data and the perception fusion data, which can further enrich the road condition information and make the information received by the vehicle-mounted terminal more comprehensive.
- The roadside data includes the third location information of the roadside unit. The second designated area of the third location in the pre-stored map information is determined; the road condition information of the second designated area is determined according to the roadside data and the perception fusion data; and the road condition information is sent to the vehicle-mounted terminal located in the second designated area.
- the third designated area in the map information may be determined according to the third position information and the first position information, and the road condition information of the third designated area may be determined according to the roadside data and the perception fusion data, And send road condition information to the vehicle-mounted terminal located in the third designated area.
- At least one of the perception fusion data and the roadside data may be sent to the big data platform, and the big data platform is used to store at least one of the perception fusion data and the roadside data. It is convenient for users to retrieve historical data in time, or conduct research on historical data.
- At least one of the vehicle-mounted data, the perception fusion data, and the roadside data may also be sent to the big data platform.
- The big data platform is used to store at least one of the vehicle-mounted data, the perception fusion data, and the roadside data.
- the type of roadside data is not limited, and the roadside data includes at least one of intersection information, status information of signal lights, traffic incident information, traffic sign information, basic road safety information, and basic vehicle safety information, for example.
- The perception fusion data includes, for example, at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and status messages of signal lights.
- the road condition information includes at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and status messages of signal lights, for example.
- the third embodiment of the present disclosure provides a vehicle networking device based on perception fusion.
- the vehicle network device includes a cloud control platform.
- The cloud control platform is used to execute the aforementioned perception fusion-based Internet of Vehicles method.
- The fourth embodiment of the present disclosure provides a perception fusion-based Internet of Vehicles device, including a processor, a memory, and a computer program stored on the memory and runnable on the processor, the computer program including the perception fusion-based Internet of Vehicles method of any one of the foregoing embodiments.
- the fifth embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored.
- The computer program includes the perception fusion-based Internet of Vehicles method of any one of the above-mentioned first and second embodiments.
- The sixth embodiment of the present disclosure provides a perception fusion-based Internet of Vehicles system, including: a cloud control platform 100 for executing the perception fusion-based Internet of Vehicles method of any one of the above-mentioned first and second embodiments.
- the sensing component 500 is used to obtain sensing data.
- The perception fusion module 300 is used to fuse the sensing data to obtain perception fusion data.
- The perception fusion-based Internet of Vehicles system further includes a toolkit connected between the perception fusion module 300 and the cloud control platform 100. The toolkit is used to convert the perception fusion data into a specified format, so that the cloud control platform 100 can read the perception fusion data in the specified format.
- the vehicle networking system based on perception fusion further includes a roadside unit 400 for acquiring roadside data.
- the cloud control platform 100 is also used to obtain roadside data and determine road condition information based on the roadside data and perception fusion data.
- The following takes FIGS. 4 to 6 as examples to illustrate the perception fusion-based Internet of Vehicles system provided by the sixth embodiment of the present disclosure.
- The roadside unit 400 encodes the acquired roadside data in the ASN.1 format and forwards it to the cloud control platform 100 using the UDP protocol. When the vehicle-mounted terminal 200 sends the vehicle-mounted data, the vehicle-mounted data carries the latitude and longitude information and the device number of the vehicle-mounted terminal 200. The cloud control platform 100 combines the location information reported by the roadside unit 400 and the on-board unit to match them to designated areas of the pre-stored map information. The cloud control platform 100 forwards the message of the roadside unit 400 to the vehicle-mounted terminals 200 that meet the matching rule; that is, when the roadside unit 400 is in the second designated area, road condition information is sent to the vehicle-mounted terminals 200 located in the second designated area.
- the UDP protocol is used during data transmission to achieve high concurrency and low latency.
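The connectionless UDP relay described above can be sketched as follows. The payload bytes, the loopback address, and the port handling are illustrative stand-ins, not values taken from the patent:

```python
# Minimal sketch of the UDP forwarding path: an encoded message is relayed
# as a single datagram, with no connection setup, which is what gives the
# low-latency, high-concurrency property mentioned in the text.
import socket

def forward_datagram(payload: bytes, target: tuple) -> int:
    """Relay one encoded message as a single UDP datagram."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        return sock.sendto(payload, target)

# Receiver side, e.g. the cloud control platform listening for encoded frames.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))                    # ephemeral port for the demo
sent = forward_datagram(b"\x30\x03\x02\x01\x05",   # pretend ASN.1 bytes
                        receiver.getsockname())
frame, _addr = receiver.recvfrom(2048)
receiver.close()
print(sent == len(frame))   # True: the whole datagram arrived intact
```

UDP gives no delivery guarantee; the trade-off accepted here is occasional loss in exchange for low latency, which suits periodic road-condition broadcasts.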
- the entire design basically satisfies the stability, information sharing, real-time and integrity requirements of the computing platform in the Internet of Vehicles application.
- the map information contains the mapping relationship between the area node and the latitude and longitude.
- the road condition information of the local area can be transmitted to the vehicle-mounted terminal 200.
- Road condition information includes intersection information, road section information, lane information, and connection relationships between roads in a local area.
- the basic vehicle safety message (BSM message) carries the latitude and longitude information and reports it to the cloud control platform 100, that is, the vehicle-mounted data carries the latitude and longitude information and reports it to the cloud control platform 100.
- the server of the cloud control platform 100 registers the vehicle-mounted terminal 200 or the serial number of the vehicle-mounted terminal 200 to the corresponding node according to the reported longitude and latitude in the map information.
- Traffic incident messages, traffic sign messages (RSI messages) and basic road safety messages (RSM messages) are matched to the corresponding nodes in the map information according to their latitude and longitude information.
- The roadside unit 400 reports the node information corresponding to the status information (SPAT information) of the signal lights to the server; that is, the roadside data includes the third location information of the roadside unit 400, and the second designated area can be determined according to the third location information.
- the cloud control platform 100 traverses all nodes when forwarding messages from the roadside unit 400, calculates the distances of all vehicle-mounted terminals 200 under the node in turn, and forwards the above-mentioned types of roadside unit 400 messages registered by the nodes to vehicles within a specified distance.
- the specified distance is determined by the data in the configuration file. That is, the road condition information is sent to the vehicles located in the second designated area.
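The per-node distance check can be sketched with a haversine calculation. The great-circle formula is standard; the 200 m threshold and the terminal coordinates below are assumptions standing in for the values read from the configuration file:

```python
# Sketch of the node-traversal matching: compute each registered terminal's
# distance to the roadside unit and forward only to terminals under the
# configured threshold.
from math import asin, cos, radians, sin, sqrt

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in metres
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * r * asin(sqrt(a))

def vehicles_in_range(rsu, terminals, threshold_m=200.0):
    """Ids of terminals registered under the node that lie within the threshold."""
    return [tid for tid, (lat, lon) in terminals.items()
            if haversine_m(rsu[0], rsu[1], lat, lon) < threshold_m]

terminals = {"veh-1": (39.9160, 116.4040),   # roughly 100 m away (hypothetical)
             "veh-2": (39.9500, 116.5000)}   # several kilometres away
print(vehicles_in_range((39.9151, 116.4040), terminals))  # ['veh-1']
```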
- the map information stores a mapping relationship between latitude and longitude and regional nodes, and at the same time, calculates a GeoHash value through the GeoHash algorithm according to the latitude and longitude as a spatial index.
- The cloud control platform 100 calculates the area index value according to the latitude and longitude information carried in at least one of the basic vehicle safety message (BSM), traffic incident message, traffic sign message (RSI), and basic road safety message (RSM), computing through the GeoHash algorithm the GeoHash values of the vehicle's cell and its neighbouring cells.
- The system traverses the calculated 9 GeoHash values in the map message. If one of the GeoHash values exists in the map message, the map information corresponding to this index value is taken out, and the area information of the map message is assigned to the current pending message.
- the data of the perception fusion module 300 may come from video data, millimeter wave radar, lidar, and the like. Video data can be used to determine the characteristics of the target image, millimeter-wave radar can be used to collect information such as the contour, speed and position of the target, and lidar can be used to enhance the description of the contour.
- the merged data is transmitted to the cloud control platform 100 through the message queue telemetry transmission protocol, and is forwarded to the vehicle terminal 200 through a certain matching rule.
- the above information is fused through a specific algorithm, and the JSON structure is output as the result.
- The cloud control platform 100 converts the JSON structure data into a hexadecimal byte array through the software development kit; at this point the byte string complies with the ASN.1 standard and can be forwarded.
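The JSON-to-byte-array step can be illustrated as below. A real implementation would encode against the ASN.1 message schema (MessageFrame), which the patent does not reproduce; this sketch shows only the JSON to bytes to hexadecimal round trip, with invented payload fields:

```python
# Sketch of the SDK conversion path: serialize the fused event to a compact
# JSON byte array, render it as a hexadecimal string for forwarding, and
# reverse the conversion on the receiving side.
import binascii
import json

event = {"type": "accident", "lat": 39.9151, "lon": 116.404}  # hypothetical

raw = json.dumps(event, separators=(",", ":")).encode("utf-8")  # byte array
hex_frame = binascii.hexlify(raw).decode("ascii")               # hex string

# Receiving side: reverse the conversion before handing data to the user side.
decoded = json.loads(binascii.unhexlify(hex_frame))
print(decoded == event)   # True
```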
- the data message provided by perception fusion is more detailed and comprehensive.
- the software development kit provides external interfaces, and calls corresponding data when a request is received.
- The perception fusion module 300 puts the processed data into the software development kit in JSON format, and the software development kit encodes and encapsulates the JSON-format data in the ASN.1 format into a message frame (MessageFrame), which is sent to the intelligent cloud control system via the UDP protocol.
- According to the request, the software development kit converts the five types of standardized data received and forwarded by the cloud control platform 100, as well as the data after perception fusion, from the ASN.1 format encapsulated in the message frame to the JSON format, and provides them to the user side.
- the data can also be uploaded to the cloud of the cloud control management platform through the message encoding and decoding function in the software development kit.
- The roadside unit 400 sends information such as traffic light status and sign information to the cloud control platform 100.
- the vehicle-mounted terminal 200 also sends its own location information, status information, etc. to the cloud control platform 100.
- The perception fusion module 300 uses sensing components 500 such as video and millimeter wave radar to collect and process sensing data to form perception fusion data, and sends the perception fusion data to the cloud control platform 100; the cloud control platform 100 performs matching and forwarding processing on all messages according to the matching rules.
- cluster deployment and big data platform are used for calculation and storage to realize high concurrency scenarios and big data applications.
- The present disclosure provides an Internet of Vehicles system based on perception fusion, which can realize the reception and real-time forwarding of multiple kinds of data: it can receive the five types of messages defined by the ASN.1 standard as well as the perception fusion data.
- the sensory fusion module 300 receives video and radar data, and the data calculated by the algorithm model is encoded and forwarded to the cloud control platform 100.
- For road event reminders, the roadside unit 400 transmits a national-standard message to the cloud control platform 100 at a frequency of 1 Hz; the national-standard message is the roadside data.
- the perception fusion module 300 will also transmit the perception fusion data to the cloud control platform 100.
- When the roadside unit 400 devices send messages, they carry their respective latitude and longitude information.
- the cloud control platform 100 will register them to the nearest node according to the map message, and obtain the regional node data and latitude and longitude coordinates.
- For a reported message, the area node is calculated from the latitude and longitude position, the vehicle-mounted terminals 200 registered under the same node as the message are obtained, the distance to each vehicle is calculated according to the matching rule, and event reminder information is sent to the vehicles whose distance is under the configured threshold.
- The UDP protocol is used in the sending process. In practice, the road vehicle terminals 200 and the roadside terminals generate a large amount of data; these data are uniformly sent to the message queue, and the data middle platform subscribes to it to perform big data processing.
- In automatic driving, when a traffic accident occurs ahead, the perception fusion module 300 combines video, millimeter wave radar, and lidar data to perform perception fusion and obtain perception fusion data; the formed event and decision data are sent to the cloud control platform 100, and the cloud control platform 100 broadcasts them to nearby vehicle-mounted terminals 200 through the 5G communication channel and the Uu interface.
- The characteristics of 5G, namely low latency, high bandwidth, and wide connection, can be fully utilized, effectively reducing the occurrence of secondary accidents.
- The roadside unit 400 obtains real-time status messages of surrounding traffic participants through corresponding detection means and broadcasts them to surrounding vehicles as roadside safety messages (RSM messages).
- For example, a detected pedestrian behavior is sent as a roadside safety message to the corresponding vehicles about to pass the intersection, so that the vehicles can avoid the pedestrians and achieve safe driving.
- the Internet of Vehicles system based on the perception fusion module 300 in the embodiment of the present disclosure collects video by installing video equipment at the roadside.
- the video stream is pushed to the SRS streaming media server built on the edge computing server and sliced by ffmpeg (ffmpeg cuts the required video length according to the incoming start time and cut duration), with the output path of the cut video specified.
- the sliced frames are transmitted to the perception fusion platform through the RTSP protocol.
- the perception fusion platform quickly computes the type of the event and forwards the decision and event to the cloud control platform 100, which forwards the event information to nearby vehicles.
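The ffmpeg slicing step (cut by start time and duration, then write to a chosen output path) can be expressed as a command builder; `-ss` placed before `-i` seeks on the input, and `-c copy` slices without re-encoding. The stream URL and paths below are placeholders, not values from the source.

```python
def build_slice_cmd(src: str, start_s: float, duration_s: float, out_path: str) -> list:
    """Build an ffmpeg command that cuts `duration_s` seconds starting at
    `start_s` from `src` and writes the clip to `out_path`."""
    return [
        "ffmpeg", "-y",         # overwrite output if it exists
        "-ss", str(start_s),    # start time of the cut (input seeking)
        "-i", src,              # input stream or file
        "-t", str(duration_s),  # length of the cut
        "-c", "copy",           # stream copy: no re-encode, fast slicing
        out_path,
    ]
```

The resulting list can be handed to `subprocess.run` on a host where ffmpeg is installed; for frame-accurate cuts one would re-encode instead of stream-copying.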
Abstract
Description
Claims (13)
- 1. An Internet of Vehicles method based on perception fusion, comprising: obtaining perception fusion data, wherein the perception fusion data is determined by fusing sensing data obtained from a sensing component; and determining road condition information according to the perception fusion data, and sending the road condition information to a vehicle-mounted terminal.
- 2. The method according to claim 1, wherein determining the road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal comprises: obtaining vehicle-mounted data of the vehicle-mounted terminal; and sending the road condition information to the vehicle-mounted terminal according to the perception fusion data and the vehicle-mounted data.
- 3. The method according to claim 2, wherein the perception fusion data comprises first position information of the sensing component, and determining the road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal comprises: determining a first designated area of the first position in pre-stored map information; determining the road condition information of the first designated area according to the perception fusion data; and sending the road condition information to the vehicle-mounted terminal located within the first designated area.
- 4. The method according to claim 3, wherein the vehicle-mounted data comprises second position information of the vehicle-mounted terminal, the second position information comprises latitude and longitude information, and the map information comprises a mapping relationship between area nodes and latitudes and longitudes; before the step of sending the perception fusion data to the vehicle-mounted terminal located within the first designated area, the method further comprises: determining a corresponding position of the vehicle-mounted terminal on the map information according to the second position information and the mapping relationship; and when the corresponding position is located within the first designated area, sending the road condition information to the vehicle-mounted terminal.
- 5. The method according to any one of claims 1 to 4, wherein determining the road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal comprises: obtaining roadside data of a roadside unit; and determining the road condition information according to the roadside data and the perception fusion data, and sending the road condition information to the vehicle-mounted terminal.
- 6. The method according to claim 5, wherein the roadside data comprises third position information of the roadside unit, and determining the road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal comprises: determining a second designated area of the third position in the pre-stored map information; determining the road condition information of the second designated area according to the roadside data and the perception fusion data; and sending the road condition information to the vehicle-mounted terminal located in the second designated area.
- 7. The method according to claim 5, further comprising: sending at least one of the perception fusion data and the roadside data to a big data platform, wherein the big data platform is configured to store the at least one of the perception fusion data and the roadside data.
- 8. The method according to claim 5, wherein the roadside data comprises at least one of intersection information, signal light status information, traffic event information, traffic sign information, basic road safety information, and basic vehicle safety information; and the road condition information comprises at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and signal light status messages.
- 9. The method according to claim 1, wherein the sensing component comprises at least one of a video capture device, a radar device, and a lidar device.
- 10. An Internet of Vehicles apparatus based on perception fusion, comprising a cloud control platform, the cloud control platform being configured to execute the Internet of Vehicles method based on perception fusion according to any one of claims 1 to 9.
- 11. An Internet of Vehicles device based on perception fusion, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, wherein the computer program implements the Internet of Vehicles method based on perception fusion according to any one of claims 1 to 9.
- 12. A computer-readable storage medium having a computer program stored thereon, wherein the computer program implements the Internet of Vehicles method based on perception fusion according to any one of claims 1 to 9.
- 13. An Internet of Vehicles system based on perception fusion, comprising: a cloud control platform configured to execute the method according to any one of claims 1 to 9; a sensing component configured to obtain sensing data; and a perception fusion module configured to fuse the sensing data to obtain the perception fusion data.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010197663.2 | 2020-03-19 | ||
CN202010197663.2A CN113498011B (zh) | 2020-03-19 | 2020-03-19 | Internet of Vehicles method, apparatus, device, storage medium and system |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021184841A1 true WO2021184841A1 (zh) | 2021-09-23 |
Family
ID=77771889
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/134932 WO2021184841A1 (zh) | 2020-12-09 | Internet of Vehicles method, apparatus, device, storage medium and system |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113498011B (zh) |
WO (1) | WO2021184841A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114641041B (zh) * | 2022-05-18 | 2022-09-13 | 之江实验室 | Edge-intelligence-oriented Internet of Vehicles slicing method and apparatus |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120135676A1 (en) * | 2010-11-26 | 2012-05-31 | Industrial Technology Research Institute | System and method for deployment and management of interactive regional broadcast services |
CN105280005A (zh) * | 2014-06-06 | 2016-01-27 | 电信科学技术研究院 | Road safety message sending method and apparatus |
CN106331008A (zh) * | 2015-06-26 | 2017-01-11 | 中兴通讯股份有限公司 | Method and apparatus for managing vehicle grouping in the Internet of Vehicles |
CN106331006A (zh) * | 2015-06-26 | 2017-01-11 | 中兴通讯股份有限公司 | Method and apparatus for grouping vehicles in the Internet of Vehicles |
US10394237B2 (en) * | 2016-09-08 | 2019-08-27 | Ford Global Technologies, Llc | Perceiving roadway conditions from fused sensor data |
CN106530782B (zh) * | 2016-09-30 | 2019-11-12 | 广州大正新材料科技有限公司 | Road vehicle traffic warning method |
CN106530703B (zh) * | 2016-11-25 | 2019-09-24 | 四川长虹电器股份有限公司 | IoT-based intelligent terminal road condition collection system |
DE102017213925A1 (de) * | 2017-08-10 | 2019-02-14 | Robert Bosch Gmbh | Method for acquiring road condition information, method for distributing traffic information, and a traffic information system |
CN108417087B (zh) * | 2018-02-27 | 2021-09-14 | 浙江吉利汽车研究院有限公司 | Vehicle safe passage system and method |
CN109709593A (zh) * | 2018-12-28 | 2019-05-03 | 国汽(北京)智能网联汽车研究院有限公司 | Vehicle-mounted terminal platform for intelligent connected vehicles based on tight cloud-end coupling |
CN110570674A (zh) * | 2019-09-06 | 2019-12-13 | 杭州博信智联科技有限公司 | Vehicle-road cooperative data interaction method, system, electronic device, and readable storage medium |
-
2020
- 2020-03-19 CN CN202010197663.2A patent/CN113498011B/zh active Active
- 2020-12-09 WO PCT/CN2020/134932 patent/WO2021184841A1/zh active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3056861A1 (en) * | 2015-02-12 | 2016-08-17 | Honda Research Institute Europe GmbH | Method and system in a vehicle for improving prediction results of an advantageous driver assistant system |
CN105025077A (zh) * | 2015-05-28 | 2015-11-04 | 广州番禺职业技术学院 | Cloud-computing-based vehicle-mounted Internet of Things operation system |
CN105390009A (zh) * | 2015-11-17 | 2016-03-09 | 广东好帮手电子科技股份有限公司 | Dynamic traffic information publishing method and system |
US20180059680A1 (en) * | 2016-08-29 | 2018-03-01 | Denso Corporation | Vehicle location recognition device |
CN110210280A (zh) * | 2019-03-01 | 2019-09-06 | 北京纵目安驰智能科技有限公司 | Beyond-line-of-sight perception method, system, terminal, and storage medium |
CN109738923A (zh) * | 2019-03-18 | 2019-05-10 | 腾讯科技(深圳)有限公司 | Driving navigation method, apparatus, and system |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113895442A (zh) * | 2021-10-11 | 2022-01-07 | 苏州智加科技有限公司 | Vehicle driving decision method and system based on roadside and vehicle-end cooperative perception |
CN113895442B (zh) * | 2021-10-11 | 2023-08-01 | 苏州智加科技有限公司 | Vehicle driving decision method and system based on roadside and vehicle-end cooperative perception |
CN113888871B (zh) * | 2021-10-20 | 2023-05-05 | 上海电科智能系统股份有限公司 | Automated expressway traffic incident handling and linkage system and method |
CN113888871A (zh) * | 2021-10-20 | 2022-01-04 | 上海电科智能系统股份有限公司 | Automated expressway traffic incident handling and linkage system and method |
CN113727304A (zh) * | 2021-11-04 | 2021-11-30 | 深圳市城市交通规划设计研究中心股份有限公司 | Emergency vehicle early-warning system based on 5G communication architecture and early-warning method thereof |
CN114244880A (zh) * | 2021-12-16 | 2022-03-25 | 云控智行科技有限公司 | Operation method, apparatus, device, and medium for cloud control functions of intelligent connected driving |
CN114244880B (zh) * | 2021-12-16 | 2023-12-26 | 云控智行科技有限公司 | Operation method, apparatus, device, and medium for cloud control functions of intelligent connected driving |
CN114333330B (zh) * | 2022-01-27 | 2023-04-25 | 浙江嘉兴数字城市实验室有限公司 | Intersection event detection system based on roadside edge holographic perception |
CN114333330A (zh) * | 2022-01-27 | 2022-04-12 | 浙江嘉兴数字城市实验室有限公司 | Intersection event detection system and method based on roadside edge holographic perception |
CN114596707A (zh) * | 2022-03-16 | 2022-06-07 | 阿波罗智联(北京)科技有限公司 | Traffic control method and apparatus, device, system, and medium |
CN114596707B (zh) * | 2022-03-16 | 2023-09-01 | 阿波罗智联(北京)科技有限公司 | Traffic control method and apparatus, device, system, and medium |
CN114792470A (zh) * | 2022-04-08 | 2022-07-26 | 广州小鹏汽车科技有限公司 | Road condition display method and apparatus, wearable device, and storage medium |
CN115100852A (zh) * | 2022-06-09 | 2022-09-23 | 智能汽车创新发展平台(上海)有限公司 | Highly available roadside fusion perception system and method serving intelligent connected vehicles |
Also Published As
Publication number | Publication date |
---|---|
CN113498011A (zh) | 2021-10-12 |
CN113498011B (zh) | 2023-08-15 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021184841A1 (zh) | Internet of Vehicles method, apparatus, device, storage medium and system | |
US20200209871A1 (en) | Method and Apparatus for Analyzing Driving Risk and Sending Risk Data | |
CN113256976B (zh) | Vehicle-road cooperation system, simulation method, vehicle-mounted device, and roadside device | |
JP6928184B2 (ja) | Target vehicle selection and message delivery in vehicle systems |
WO2017071224A1 (zh) | Driving information sharing method, vehicle-mounted platform, and intelligent transportation system | |
WO2022142664A1 (zh) | Traffic information transmission method, apparatus, medium, electronic device, and program product | |
US20230030446A1 (en) | Remote driving method, apparatus, and system, device, and medium | |
JP7225753B2 (ja) | Information collection device, information collection system, information collection method, and computer program |
CN102546696A (zh) | Driving perception navigation system |
US11308736B2 (en) | Selecting V2X communications interface | |
US20220046391A1 (en) | Vehicle to everything object exchange system | |
US20230269566A1 (en) | System and method of communication between a vehicle and an agent | |
WO2019000745A1 (zh) | Multi-standard-compatible V2X terminal, system, and management method | |
US11170640B2 (en) | Method and apparatus for bridging and optimizing V2X networks | |
US20230336953A1 (en) | Method by which first server transmits second message in wireless communication system, and device therefor | |
JP2023517799A (ja) | Local navigation assisted by vehicle-to-everything (V2X) |
US20230080095A1 (en) | Method and device for generating vru path map related to moving path of vru by softv2x server in wireless communication system supporting sidelink | |
CN112583872B (zh) | Communication method and apparatus | |
KR20210070038A (ko) | Wired and wireless communication network system and provision method for traffic safety services |
WO2023233989A1 (ja) | Communication device and communication method |
WO2023221031A1 (en) | Systems and methods to provide network services to user devices | |
WO2023171371A1 (ja) | Communication device and communication method |
Masahiro et al. | Remote Proxy V2V Messaging using IPv6 and GeoNetworking | |
JP6738945B1 (ja) | Communication device, communication system, and program for a communication device |
Huang | 5G-based intelligent transportation system construction |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20925195 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20925195 Country of ref document: EP Kind code of ref document: A1 |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 03/07/2023) |