WO2021184841A1 - Internet of Vehicles method, apparatus, device, storage medium and system - Google Patents

Internet of Vehicles method, apparatus, device, storage medium and system

Info

Publication number
WO2021184841A1
WO2021184841A1 (application PCT/CN2020/134932)
Authority
WO
WIPO (PCT)
Prior art keywords
data
vehicle
information
road condition
condition information
Prior art date
Application number
PCT/CN2020/134932
Other languages
English (en)
French (fr)
Inventor
张学伦
唐田
熊诚锋
Original Assignee
中移(上海)信息通信科技有限公司
中国移动通信集团有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中移(上海)信息通信科技有限公司 and 中国移动通信集团有限公司
Publication of WO2021184841A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/021 Services related to particular areas, e.g. point of interest [POI] services, venue services or geofences
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L67/00 Network arrangements or protocols for supporting network services or applications
    • H04L67/01 Protocols
    • H04L67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/023 Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/025 Services making use of location information using location based information parameters
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/02 Services making use of location information
    • H04W4/025 Services making use of location information using location based information parameters
    • H04W4/027 Services making use of location information using location based information parameters using movement velocity, acceleration information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/38 Services specially adapted for particular environments, situations or purposes for collecting sensor information
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W4/44 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P] for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D30/00 Reducing energy consumption in communication networks
    • Y02D30/70 Reducing energy consumption in communication networks in wireless communication networks

Definitions

  • the present disclosure relates to the technical field of the Internet of Things, and in particular to a method, device, equipment, storage medium and system for the Internet of Vehicles.
  • Cellular vehicle-to-everything (C-V2X) technology and dedicated short-range communication technology are widely used in the Internet of Vehicles industry.
  • C-V2X is adopted in the present disclosure because of its low transceiving overhead and its access to data computation and processing modules; through the interaction of information among vehicles, roads and people, a safer and more efficient road traffic model can be achieved.
  • The solution in the related art transmits basic data to the perception fusion module through an interconnection module, and transmits the perception fusion data to the interconnection module via a basic-information real-time data module, which increases latency and causes unnecessary waste of resources.
  • The embodiments of the present disclosure provide a method, apparatus, device, storage medium, and system for the Internet of Vehicles based on perception fusion, aiming to reduce the latency of the Internet of Vehicles system.
  • the embodiments of the present disclosure provide a vehicle networking method based on perception fusion, including:
  • Acquire perception fusion data, where the perception fusion data is determined by fusing the sensing data obtained from the sensing components.
  • determining the road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
  • the sensory fusion data includes the first position information of the sensor component
  • the vehicle-mounted data includes the second location information of the vehicle-mounted terminal, the second location information includes the latitude and longitude information, and the map information includes the mapping relationship between the area node and the latitude and longitude,
  • the method further includes:
  • the road condition information is sent to the vehicle-mounted terminal.
  • determining the road condition information based on the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
  • the road condition information is determined according to the roadside data and the perception fusion data, and the road condition information is sent to the vehicle terminal.
  • the roadside data includes third location information of the roadside unit
  • The method further includes: sending at least one of the perception fusion data and the roadside data to a big data platform, where the big data platform is used to store at least one of the perception fusion data and the roadside data.
  • the roadside data includes at least one of intersection information, status information of signal lights, traffic incident information, traffic sign information, basic road safety information, and basic vehicle safety information;
  • the road condition information includes at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and status messages of signal lights.
  • the sensing component includes at least one of a video acquisition device, a wave radar device, and a lidar device.
  • The embodiments of the present disclosure also provide an Internet of Vehicles apparatus based on perception fusion, including a cloud control platform, which is used to execute the aforementioned Internet of Vehicles method based on perception fusion.
  • The embodiments of the present disclosure also provide an Internet of Vehicles device based on perception fusion, including a processor, a memory, and a computer program stored in the memory and capable of running on the processor, the computer program including the above-mentioned Internet of Vehicles method based on perception fusion.
  • The embodiments of the present disclosure also provide a computer-readable storage medium on which a computer program is stored, the computer program including the above-mentioned Internet of Vehicles method based on perception fusion.
  • The embodiments of the present disclosure also provide an Internet of Vehicles system based on perception fusion, including:
  • the cloud control platform is used to execute the above-mentioned method
  • Sensing components used to obtain sensing data
  • the sensory fusion module is used to fuse sensory data to obtain sensory fusion data.
  • In the embodiments of the present disclosure, the perception fusion data is first acquired, where the perception fusion data is obtained and determined directly from the sensing component; the road condition information is then determined according to the perception fusion data and sent to the vehicle-mounted terminal.
  • The perception fusion data is determined directly from the fusion of the sensing data of the sensing components, which can reduce latency.
  • The receiving-and-forwarding system does not need to receive the sensing data, nor to send the sensing data to the perception fusion module, which can effectively improve the working efficiency and resource utilization of the receiving-and-forwarding system.
  • FIG. 1 is a schematic flowchart of a method for Internet of Vehicles based on perception fusion according to an embodiment of the present disclosure
  • FIG. 2 is a schematic flowchart of a method for Internet of Vehicles based on perception fusion according to another embodiment of the present disclosure
  • FIG. 3 is a schematic flowchart of a method for Internet of Vehicles based on perception fusion according to another embodiment of the present disclosure
  • FIG. 4 is a schematic structural diagram of an Internet of Vehicles system based on perception fusion provided by an embodiment of the present disclosure;
  • FIG. 5 is a topology diagram of hardware modules of an Internet of Vehicles system based on perception fusion according to an embodiment of the present disclosure;
  • FIG. 6 is a toolkit architecture diagram of an Internet of Vehicles system based on perception fusion provided by an embodiment of the present disclosure.
  • FIG. 1 is a flowchart of the Internet of Vehicles method based on perception fusion provided by an embodiment of the present disclosure. The method includes:
  • Step S101 Obtain sensory fusion data.
  • the sensory fusion data is determined based on the fusion of the sensory data obtained from the sensor component.
  • the sensing component includes, for example, at least one of a video acquisition device, a wave radar device, and a lidar device.
  • The video acquisition device is used to acquire video data, and the characteristics of the target image can be determined according to the video data.
  • the wave radar device is, for example, a millimeter wave radar device, which is used to collect information such as the contour, speed, and position of the target.
  • Lidar devices are used, for example, to enhance the description of contours.
  • Step S102 Determine the road condition information according to the perception fusion data, and send the road condition information to the vehicle-mounted terminal.
  • In this method, the perception fusion data is first acquired, where the perception fusion data is obtained and determined directly from the sensing component; the road condition information is then determined according to the perception fusion data and sent to the vehicle-mounted terminal.
  • The perception fusion data is determined directly from the fusion of the sensing data of the sensing components, which can reduce latency.
  • The receiving-and-forwarding system does not need to receive the sensing data, nor to send the sensing data to the perception fusion module, which can effectively improve the working efficiency and resource utilization of the receiving-and-forwarding system.
  • step S102 includes: acquiring vehicle-mounted data of the vehicle-mounted terminal, and sending road condition information to the vehicle-mounted terminal according to the perception fusion data and the vehicle-mounted data.
  • Information such as the address of the vehicle-mounted terminal can be determined based on the vehicle-mounted data; therefore, sending road condition information to the vehicle-mounted terminal based on the perception fusion data and the vehicle-mounted data can improve the accuracy of the result and allows the road condition information to be sent to the designated vehicle-mounted terminal.
  • step S102 includes:
  • Step S1021 Determine the first designated area of the first location in the pre-stored map information.
  • Step S1022 Determine the road condition information of the first designated area according to the perception fusion data.
  • Step S1023 Send road condition information to the vehicle-mounted terminal located in the first designated area.
  • the position of the sensing component in the map information can be determined first based on the sensory fusion data, and then the first designated area is determined.
  • the road condition information of the first designated area can be determined at the same time, and the road condition information can be sent to the vehicle-mounted terminal located in the first designated area, so that the vehicle-mounted terminal can timely determine the road condition environment around the vehicle-mounted terminal according to the road condition information.
  • the vehicle-mounted data includes second location information of the vehicle-mounted terminal, the second location information is, for example, latitude and longitude information, and the map information includes the mapping relationship between the area node and the latitude and longitude, according to the second location information and the mapping relationship.
  • the corresponding location of the vehicle-mounted terminal on the map information can be determined, and when the corresponding location is located in the first designated area, the road condition information is sent to the vehicle-mounted terminal.
  • the corresponding position of the vehicle-mounted terminal on the map information can be accurately determined according to the mapping relationship between the latitude and longitude and the area node, so that the location of the vehicle-mounted terminal can be determined.
  • When the vehicle-mounted terminal is located in the first designated area, the road condition information is sent to the vehicle-mounted terminal.
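The node-matching step described above (area nodes keyed by latitude/longitude, with a terminal assigned to a node when its position falls within the node's area) can be sketched as follows. This is an illustrative sketch only: the node names, the radius-based area definition, and the 1 km threshold are assumptions, not details from the disclosure.

```python
import math

# Hypothetical map info: area node -> (lat, lon) centre, standing in for the
# "mapping relationship between area nodes and latitude/longitude".
MAP_NODES = {
    "node_A": (31.2304, 121.4737),
    "node_B": (31.2400, 121.4900),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def match_node(lat, lon, radius_m=1000.0):
    """Return the area node whose centre is nearest to (lat, lon), or None
    if even the nearest centre lies outside radius_m (terminal not in any
    designated area)."""
    name, centre = min(MAP_NODES.items(),
                       key=lambda kv: haversine_m(lat, lon, *kv[1]))
    return name if haversine_m(lat, lon, *centre) <= radius_m else None
```

With this registry, a terminal reporting a position a few metres from node_A's centre is matched to node_A, and road condition information for that area would then be sent to it.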
  • In the second embodiment, the Internet of Vehicles method based on perception fusion includes:
  • Step S301 Obtain roadside data of the roadside unit.
  • Step S302 Obtain sensory fusion data.
  • the sensory fusion data is determined based on the sensory data fusion obtained from the sensor component.
  • Step S303 Determine the road condition information according to the roadside data and the perception fusion data, and send the road condition information to the vehicle-mounted terminal.
  • The order of step S301 and step S302 is not limited; step S301 can be performed before or after step S302.
  • the second embodiment is different from the first embodiment in that it also acquires roadside data, and simultaneously determines road condition information based on the roadside data and the perception fusion data, which can further enrich the road condition information and make the information received by the vehicle-mounted terminal more comprehensive.
  • The roadside data includes the third location information of the roadside unit; the third location is determined to be in the second designated area in the pre-stored map information; the road condition information of the second designated area is determined based on the roadside data and the perception fusion data; and the road condition information is sent to the vehicle-mounted terminal located in the second designated area.
  • the third designated area in the map information may be determined according to the third position information and the first position information, and the road condition information of the third designated area may be determined according to the roadside data and the perception fusion data, And send road condition information to the vehicle-mounted terminal located in the third designated area.
  • At least one of the perception fusion data and the roadside data may be sent to the big data platform, and the big data platform is used to store at least one of the perception fusion data and the roadside data. It is convenient for users to retrieve historical data in time, or conduct research on historical data.
  • At least one of vehicle-mounted data, sensory fusion data, and roadside data may also be sent to the big data platform.
  • The big data platform is used to store at least one of the vehicle-mounted data, the perception fusion data, and the roadside data.
  • the type of roadside data is not limited, and the roadside data includes at least one of intersection information, status information of signal lights, traffic incident information, traffic sign information, basic road safety information, and basic vehicle safety information, for example.
  • the sensory fusion data includes, for example, at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and status messages of signal lights.
  • the road condition information includes at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and status messages of signal lights, for example.
  • The third embodiment of the present disclosure provides an Internet of Vehicles apparatus based on perception fusion.
  • The apparatus includes a cloud control platform.
  • The cloud control platform is used to execute the aforementioned Internet of Vehicles method based on perception fusion.
  • The fourth embodiment of the present disclosure provides an Internet of Vehicles device based on perception fusion, including a processor, a memory, and a computer program stored on the memory and capable of running on the processor, the computer program including the Internet of Vehicles method based on perception fusion of any one of the foregoing embodiments.
  • the fifth embodiment of the present disclosure provides a computer-readable storage medium on which a computer program is stored.
  • The computer program includes the Internet of Vehicles method based on perception fusion of any one of the above-mentioned first and second embodiments.
  • The sixth embodiment of the present disclosure provides an Internet of Vehicles system based on perception fusion, including: a cloud control platform 100, configured to execute the Internet of Vehicles method based on perception fusion of any one of the above-mentioned first and second embodiments.
  • the sensing component 500 is used to obtain sensing data.
  • the sensory fusion module 300 is used to fuse sensory data to obtain sensory fusion data.
  • The Internet of Vehicles system based on perception fusion further includes a toolkit connected between the perception fusion module 300 and the cloud control platform 100; the toolkit is used to convert the perception fusion data into a specified format, so that the cloud control platform 100 can read the perception fusion data in the specified format.
  • the vehicle networking system based on perception fusion further includes a roadside unit 400 for acquiring roadside data.
  • the cloud control platform 100 is also used to obtain roadside data and determine road condition information based on the roadside data and perception fusion data.
  • The following takes FIGS. 4 to 6 as examples to illustrate the Internet of Vehicles system based on perception fusion provided by the sixth embodiment of the present disclosure.
  • The roadside unit 400 encodes the acquired roadside data in the ASN.1 format and forwards it to the cloud control platform 100 using the UDP protocol; when the vehicle-mounted terminal 200 sends the vehicle-mounted data, the vehicle-mounted data carries the latitude and longitude information and the device number of the vehicle-mounted terminal 200, and the cloud control platform 100 combines the location information reported by the roadside unit 400 and the on-board unit to match them to a designated area of the pre-stored map information.
  • the cloud control platform 100 forwards the message of the roadside unit 400 to the vehicle-mounted terminal 200 that meets the matching rule, that is, when the roadside unit 400 is in the second designated area, it sends road condition information to the vehicle-mounted terminal 200 located in the second designated area.
  • the UDP protocol is used during data transmission to achieve high concurrency and low latency.
  • the entire design basically satisfies the stability, information sharing, real-time and integrity requirements of the computing platform in the Internet of Vehicles application.
  • the map information contains the mapping relationship between the area node and the latitude and longitude.
  • the road condition information of the local area can be transmitted to the vehicle-mounted terminal 200.
  • Road condition information includes intersection information, road section information, lane information, and connection relationships between roads in a local area.
  • the basic vehicle safety message (BSM message) carries the latitude and longitude information and reports it to the cloud control platform 100, that is, the vehicle-mounted data carries the latitude and longitude information and reports it to the cloud control platform 100.
  • the server of the cloud control platform 100 registers the vehicle-mounted terminal 200 or the serial number of the vehicle-mounted terminal 200 to the corresponding node according to the reported longitude and latitude in the map information.
  • Traffic incident messages, traffic sign messages (RSI messages), and basic road safety messages (RSM messages) obtain their corresponding nodes in the map information according to the latitude and longitude information.
  • The roadside unit 400 reports the node information corresponding to the status information of the signal lights (SPAT messages) to the server; that is, the roadside data includes the third location information of the roadside unit 400, and the second designated area can be determined according to the third location information.
  • the cloud control platform 100 traverses all nodes when forwarding messages from the roadside unit 400, calculates the distances of all vehicle-mounted terminals 200 under the node in turn, and forwards the above-mentioned types of roadside unit 400 messages registered by the nodes to vehicles within a specified distance.
  • the specified distance is determined by the data in the configuration file. That is, the road condition information is sent to the vehicles located in the second designated area.
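The traverse-and-forward rule above, obtaining the vehicles registered under a node and forwarding only to those within the configured distance, might look like the following sketch. The registry layout, terminal ids, and the equirectangular distance approximation are assumptions for illustration.

```python
import math

# Hypothetical registry built as on-board terminals report BSM messages:
# area node -> {terminal_id: (lat, lon)}.
REGISTRY = {
    "node_A": {"obu-001": (31.2304, 121.4737), "obu-002": (31.2500, 121.5000)},
}

def local_distance_m(lat1, lon1, lat2, lon2):
    """Equirectangular approximation of distance in metres; adequate for the
    short, sub-kilometre ranges used when matching an RSU message to nearby
    vehicles."""
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    y = math.radians(lat2 - lat1)
    return 6371000.0 * math.hypot(x, y)

def targets_for_rsu(node, rsu_lat, rsu_lon, max_dist_m):
    """Return the terminal ids registered under `node` that lie within the
    configured distance, i.e. the vehicles the road condition message should
    be forwarded to."""
    return [tid for tid, (la, lo) in REGISTRY.get(node, {}).items()
            if local_distance_m(rsu_lat, rsu_lon, la, lo) <= max_dist_m]
```

In this sketch the `max_dist_m` argument plays the role of the threshold read from the configuration file.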
  • the map information stores a mapping relationship between latitude and longitude and regional nodes, and at the same time, calculates a GeoHash value through the GeoHash algorithm according to the latitude and longitude as a spatial index.
  • The cloud control platform 100 calculates the regional index value according to the latitude and longitude information carried in at least one of the basic vehicle safety message (BSM), traffic incident message, traffic sign message (RSI), and basic road safety message (RSM), and computes through the GeoHash algorithm the GeoHash value of the vehicle's cell and of its neighbouring cells.
  • The system traverses the nine calculated GeoHash values against the map message; if one of the GeoHash values exists in the map message, the map information corresponding to that index value is taken out and the area information of the map message is assigned to the current pending message.
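The GeoHash lookup described above, computing the vehicle's cell plus its eight neighbours and probing the map index with all nine values, can be sketched like this. The encoder is a standard from-scratch GeoHash implementation; the 6-character precision is an assumption, not a figure from the disclosure.

```python
_BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # GeoHash alphabet (no a, i, l, o)

def geohash_encode(lat, lon, precision=6):
    """Standard GeoHash encoding: interleave longitude/latitude bisection
    bits and emit them 5 at a time as base32 characters."""
    lat_rng, lon_rng = [-90.0, 90.0], [-180.0, 180.0]
    code, even, bits, ch = [], True, 0, 0
    while len(code) < precision:
        rng, val = (lon_rng, lon) if even else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2.0
        ch <<= 1
        if val >= mid:
            ch |= 1
            rng[0] = mid
        else:
            rng[1] = mid
        even = not even
        bits += 1
        if bits == 5:
            code.append(_BASE32[ch])
            bits, ch = 0, 0
    return "".join(code)

def geohash_neighbors9(lat, lon, precision=6):
    """The 3x3 block of GeoHash cells around (lat, lon): the cell itself plus
    its 8 neighbours, obtained by shifting the point by one cell size in each
    direction. These are the nine values probed against the map index."""
    lon_bits = (5 * precision + 1) // 2   # longitude gets the extra bit
    lat_bits = (5 * precision) // 2
    dlat = 180.0 / (1 << lat_bits)        # cell height in degrees
    dlon = 360.0 / (1 << lon_bits)        # cell width in degrees
    return {geohash_encode(lat + i * dlat, lon + j * dlon, precision)
            for i in (-1, 0, 1) for j in (-1, 0, 1)}
```

A map index keyed by GeoHash strings can then be probed with each of the nine values; the first hit yields the area information to attach to the pending message.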
  • the data of the perception fusion module 300 may come from video data, millimeter wave radar, lidar, and the like. Video data can be used to determine the characteristics of the target image, millimeter-wave radar can be used to collect information such as the contour, speed and position of the target, and lidar can be used to enhance the description of the contour.
  • The fused data is transmitted to the cloud control platform 100 through the Message Queuing Telemetry Transport (MQTT) protocol, and is forwarded to the vehicle-mounted terminal 200 through a certain matching rule.
  • the above information is fused through a specific algorithm, and the JSON structure is output as the result.
  • The cloud control platform 100 converts the JSON-structured data into a hexadecimal byte array through the software development kit; at this point the byte string complies with the ASN.1 standard and can be forwarded.
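The conversion step just described (a JSON structure turned into a hex byte array for forwarding, and back again on the receiving side) can be pictured with the following stand-in. Note that this uses plain JSON bytes rendered as hex purely to show the pipeline shape; the real SDK encodes against the national-standard ASN.1 schema (for example with a UPER codec), which is not reproduced here.

```python
import json

def encode_frame(msg: dict) -> str:
    # Stand-in for the SDK's ASN.1 encoding: serialise the JSON structure to
    # canonical bytes and render them as a hexadecimal string. A real V2X
    # message frame would be encoded against the national-standard ASN.1
    # schema rather than as plain JSON bytes.
    payload = json.dumps(msg, separators=(",", ":"), sort_keys=True).encode("utf-8")
    return payload.hex()

def decode_frame(hex_str: str) -> dict:
    # Inverse operation, as performed before handing data to the user side.
    return json.loads(bytes.fromhex(hex_str).decode("utf-8"))
```

The round trip `decode_frame(encode_frame(msg)) == msg` holds for any JSON-serialisable message, mirroring the encode-forward-decode path between the perception fusion module and the user side.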
  • the data message provided by perception fusion is more detailed and comprehensive.
  • the software development kit provides external interfaces, and calls corresponding data when a request is received.
  • The perception fusion module 300 passes the processed data in JSON format to the software development kit, which encodes the JSON data in the ASN.1 format, encapsulates it into a message frame (MessageFrame), and sends it to the intelligent cloud control system via the UDP protocol.
  • According to the request, the software development kit converts the five kinds of standardized data received and forwarded by the cloud control platform 100, together with the perception-fused data, from the message-frame-encapsulated ASN.1 format to the JSON format, and provides them to the user side.
  • the data can also be uploaded to the cloud of the cloud control management platform through the message encoding and decoding function in the software development kit.
  • The roadside unit 400 sends roadside data, such as traffic light and sign information, to the cloud control platform 100.
  • the vehicle-mounted terminal 200 also sends its own location information, status information, etc. to the cloud control platform 100.
  • The perception fusion module 300 uses video, millimeter-wave radar and other sensing components 500 to collect and process sensing data to form the perception fusion data and sends it to the cloud control platform 100, and the cloud control platform 100 performs match-and-forward processing on all messages according to the matching rules.
  • cluster deployment and big data platform are used for calculation and storage to realize high concurrency scenarios and big data applications.
  • The present disclosure provides an Internet of Vehicles system combined with perception fusion, which can receive and forward multiple kinds of data in real time; it can receive the five message types defined by the ASN.1 standard as well as the perception fusion data.
  • the sensory fusion module 300 receives video and radar data, and the data calculated by the algorithm model is encoded and forwarded to the cloud control platform 100.
  • Road event reminder: the roadside unit 400 transmits a national-standard message (i.e., the roadside data) to the cloud control platform 100 at a frequency of 1 Hz.
  • the perception fusion module 300 will also transmit the perception fusion data to the cloud control platform 100.
  • When the roadside unit 400 devices send messages, they carry their respective latitude and longitude information.
  • the cloud control platform 100 will register them to the nearest node according to the map message, and obtain the regional node data and latitude and longitude coordinates.
  • For a reported message, the area location is calculated based on the latitude and longitude position, the vehicles registered under the same node as the message are obtained, the matching rule is applied to determine the vehicles within the configured distance threshold, and an event reminder is sent to them.
  • The UDP protocol is used in the sending process. In practice, the road vehicle-mounted terminals 200 and roadside terminals generate a large amount of data; these data are uniformly sent to the message queue, and the data middle platform subscribes to them to perform big data processing.
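A minimal loopback sketch of the UDP path used for forwarding, illustrating why UDP suits the high-concurrency, low-latency case: no connection setup, no acknowledgement, fire-and-forget datagrams. The JSON payload shape is an assumption for illustration.

```python
import json
import socket

def send_udp(payload: dict, addr) -> int:
    """Serialise `payload` to JSON and send it as a single UDP datagram.
    There is no handshake or acknowledgement, which is the low-latency
    behaviour the platform relies on when forwarding messages."""
    data = json.dumps(payload).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as tx:
        tx.sendto(data, addr)
    return len(data)

def loopback_demo(payload: dict) -> dict:
    """Send `payload` to a local UDP receiver and return what arrives,
    demonstrating the datagram path end to end on one host."""
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))  # let the OS pick a free port
    try:
        send_udp(payload, rx.getsockname())
        data, _ = rx.recvfrom(65536)
    finally:
        rx.close()
    return json.loads(data.decode("utf-8"))
```

In the deployed system the receiver would be the cloud control platform or a vehicle-mounted terminal rather than a loopback socket; the datagram semantics are the same.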
  • In automatic driving, when a traffic accident occurs ahead, the perception fusion module 300 combines video, millimeter-wave radar, and lidar data for perception fusion to obtain the perception fusion data, and the resulting event and decision data are sent to the cloud control platform 100, which broadcasts them to nearby vehicle-mounted terminals 200 through the 5G communication channel and the Uu interface.
  • The characteristics of 5G, namely low latency, high bandwidth, and wide connectivity, can be fully utilized, effectively reducing the occurrence of secondary accidents.
  • The roadside unit 400 obtains real-time status messages of surrounding traffic participants through corresponding detection means, packages them as roadside safety messages (RSM messages), and broadcasts them to surrounding vehicles.
  • For example, when a pedestrian is detected, the behavior is sent as a roadside safety message to the corresponding vehicles that are about to pass the intersection.
  • In this way, the vehicles can avoid the pedestrian and achieve safe driving.
  • The Internet of Vehicles system based on the perception fusion module 300 in the embodiment of the present disclosure collects video by installing video equipment on the roadside.
  • The video stream is pushed to the SRS streaming media server built on the edge computing server and sliced by ffmpeg (ffmpeg cuts the required video segment according to the given start time and cutting duration), with the output path of the cut video specified.
  • The sliced frames are transmitted to the perception fusion platform through the RTSP protocol.
  • The perception fusion platform quickly determines the type of the event and forwards the decision and event to the cloud control platform 100, and the cloud control platform 100 forwards the event information to nearby vehicles.
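The ffmpeg slicing step can be expressed as a command builder like the one below. The `-ss` and `-t` flags select the start time and duration, and `-c copy` copies streams without re-encoding so the cut stays cheap on an edge server; the stream URL and output path shown are hypothetical.

```python
def ffmpeg_slice_cmd(src: str, start_s: float, duration_s: float, out_path: str):
    """Build an ffmpeg command that cuts `duration_s` seconds starting at
    `start_s` from `src` into `out_path`. `-c copy` avoids re-encoding;
    `-y` overwrites an existing output file."""
    return ["ffmpeg", "-ss", str(start_s), "-i", src,
            "-t", str(duration_s), "-c", "copy", "-y", out_path]

# To actually run it (requires ffmpeg installed; URL and path hypothetical):
# import subprocess
# subprocess.run(ffmpeg_slice_cmd("rtmp://srs-server/live/cam1", 10, 5,
#                                 "/tmp/clip.mp4"), check=True)
```

Building the argument list separately keeps the incoming start time and cutting duration easy to validate before the process is spawned.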

Abstract

Embodiments of the present disclosure provide an Internet of Vehicles method, apparatus, device, storage medium and system. The Internet of Vehicles method based on perception fusion includes: acquiring perception fusion data, the perception fusion data being determined by fusing sensing data obtained from sensing components; and determining road condition information according to the perception fusion data and sending the road condition information to a vehicle-mounted terminal.

Description

Internet of Vehicles Method, Apparatus, Device, Storage Medium, and System
Cross-Reference to Related Applications
This application claims priority to Chinese Patent Application No. 202010197663.2, filed in China on March 19, 2020, the entire contents of which are incorporated herein by reference.
Technical Field
The present disclosure relates to the field of Internet of Things technology, and in particular to an Internet of Vehicles method, apparatus, device, storage medium, and system.
Background
With the rapid development of communication technology, faster and more capable mobile networks provide promising prospects for vehicle-road cooperation, which in turn brings application demands in multiple scenarios. Cellular vehicle-to-everything (C-V2X) and dedicated short-range communication (DSRC) are the two mainstream communication technologies widely used in the Internet of Vehicles industry. C-V2X is adopted in this disclosure because of its low transceiving overhead and its ability to connect to data computing and processing modules; through the exchange of information among vehicles, roads, and people, a more intelligent and safer road traffic model can be realized.
In schemes in the related art, basic data is transmitted to a perception fusion module through an interconnection module, and the perception fusion data is transmitted back to the interconnection module via a basic-information real-time data module, which increases latency and causes unnecessary waste of resources.
Therefore, a new perception-fusion-based Internet of Vehicles method, apparatus, device, storage medium, and system is urgently needed.
Summary
Embodiments of the present disclosure provide a perception-fusion-based Internet of Vehicles method, apparatus, device, storage medium, and system, aiming to reduce the latency of the Internet of Vehicles system.
In one aspect, an embodiment of the present disclosure provides a perception-fusion-based Internet of Vehicles method, including:
acquiring perception fusion data, where the perception fusion data is determined by fusing sensing data acquired from sensing components;
determining road condition information according to the perception fusion data, and sending the road condition information to a vehicle-mounted terminal.
According to an embodiment of this aspect, determining road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
acquiring vehicle-mounted data of the vehicle-mounted terminal;
sending the road condition information to the vehicle-mounted terminal according to the perception fusion data and the vehicle-mounted data.
According to any of the foregoing embodiments of this aspect, the perception fusion data includes first position information of the sensing components, and
determining road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
determining a first designated area of the first position in pre-stored map information;
determining the road condition information of the first designated area according to the perception fusion data;
sending the road condition information to vehicle-mounted terminals located in the first designated area.
According to any of the foregoing embodiments of this aspect, the vehicle-mounted data includes second position information of the vehicle-mounted terminal, the second position information includes longitude and latitude information, and the map information includes a mapping relationship between area nodes and longitude/latitude;
before the step of sending the perception fusion data to the vehicle-mounted terminals located in the first designated area, the method further includes:
determining, according to the second position information and the mapping relationship, the corresponding position of the vehicle-mounted terminal on the map information;
when the corresponding position is located in the first designated area, sending the road condition information to the vehicle-mounted terminal.
According to any of the foregoing embodiments of this aspect, determining road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
acquiring roadside data of a roadside unit;
determining the road condition information according to the roadside data and the perception fusion data, and sending the road condition information to the vehicle-mounted terminal.
According to any of the foregoing embodiments of this aspect, the roadside data includes third position information of the roadside unit, and
determining road condition information according to the perception fusion data and sending the road condition information to the vehicle-mounted terminal includes:
determining a second designated area of the third position in pre-stored map information;
determining the road condition information of the second designated area according to the roadside data and the perception fusion data;
sending the road condition information to vehicle-mounted terminals located in the second designated area.
According to any of the foregoing embodiments of this aspect, the method further includes: sending at least one of the perception fusion data and the roadside data to a big data platform, where the big data platform is used to store at least one of the perception fusion data and the roadside data.
According to any of the foregoing embodiments of this aspect, the roadside data includes at least one of intersection information, signal-light status information, traffic event information, traffic sign information, basic road safety information, and basic vehicle safety information;
the road condition information includes at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and signal-light status messages.
According to any of the foregoing embodiments of this aspect, the sensing components include at least one of a video acquisition device, a wave radar device, and a lidar device.
In another aspect, an embodiment of the present disclosure further provides a perception-fusion-based Internet of Vehicles apparatus, including a cloud control platform configured to perform the above perception-fusion-based Internet of Vehicles method.
In yet another aspect, an embodiment of the present disclosure further provides a perception-fusion-based Internet of Vehicles device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program implements the above perception-fusion-based Internet of Vehicles method.
In still another aspect, an embodiment of the present disclosure further provides a computer-readable storage medium storing a computer program, where the computer program implements the above perception-fusion-based Internet of Vehicles method.
In a further aspect, an embodiment of the present disclosure further provides a perception-fusion-based Internet of Vehicles system, including:
a cloud control platform configured to perform the above method;
sensing components configured to acquire sensing data; and
a perception fusion module configured to fuse the sensing data to obtain perception fusion data.
In the perception-fusion-based Internet of Vehicles of the embodiments of the present disclosure, perception fusion data is first acquired, the perception fusion data being determined directly from the sensing components; road condition information is then determined according to the perception fusion data and sent to the vehicle-mounted terminal. In the embodiments of the present disclosure, the perception fusion data is determined directly by fusing the sensing data of the sensing components, which reduces latency: the receiving-and-forwarding system neither needs to receive the sensing data nor to send it to the perception fusion module, which effectively improves the working efficiency and resource utilization of the receiving-and-forwarding system.
Brief Description of the Drawings
Other features, objects, and advantages of the present disclosure will become more apparent from the following detailed description of non-limiting embodiments with reference to the accompanying drawings, in which the same or similar reference numerals denote the same or similar features.
Fig. 1 is a schematic flowchart of a perception-fusion-based Internet of Vehicles method provided by an embodiment of the present disclosure;
Fig. 2 is a schematic flowchart of a perception-fusion-based Internet of Vehicles method provided by another embodiment of the present disclosure;
Fig. 3 is a schematic flowchart of a perception-fusion-based Internet of Vehicles method provided by yet another embodiment of the present disclosure;
Fig. 4 is a schematic structural diagram of a perception-fusion-based Internet of Vehicles system provided by an embodiment of the present disclosure;
Fig. 5 is a topology diagram of the hardware modules of a perception-fusion-based Internet of Vehicles system provided by an embodiment of the present disclosure;
Fig. 6 is a toolkit architecture diagram of a perception-fusion-based Internet of Vehicles system provided by an embodiment of the present disclosure.
Reference numerals: 100, cloud control platform; 200, vehicle-mounted terminal; 300, perception fusion module; 400, roadside unit; 500, sensing component.
Detailed Description
Features and exemplary embodiments of various aspects of the present disclosure are described in detail below. Many specific details are set forth in the following detailed description to provide a thorough understanding of the present disclosure. However, it will be apparent to those skilled in the art that the present disclosure may be practiced without some of these specific details. The following description of the embodiments is merely intended to provide a better understanding of the present disclosure by illustrating examples thereof. In the drawings and the following description, at least some well-known structures and techniques are not shown in order to avoid unnecessarily obscuring the present disclosure; and the dimensions of some structures may be exaggerated for clarity. Furthermore, the features, structures, or characteristics described below may be combined in any suitable manner in one or more embodiments.
In the description of the present disclosure, it should be noted that, unless otherwise specified, "a plurality of" means two or more; the orientation or positional terms "upper", "lower", "left", "right", "inner", "outer", etc. are used only to facilitate and simplify the description of the present disclosure, and do not indicate or imply that the referred devices or elements must have a particular orientation or be constructed and operated in a particular orientation, and therefore should not be construed as limiting the present disclosure. In addition, the terms "first", "second", etc. are used for descriptive purposes only and should not be understood as indicating or implying relative importance.
The orientation terms appearing in the following description refer to the directions shown in the drawings and do not limit the specific structures of the embodiments of the present disclosure. In the description of the present disclosure, it should also be noted that, unless otherwise explicitly specified and defined, the terms "mounted" and "connected" should be understood broadly; for example, a connection may be fixed, detachable, or integral, and may be direct or indirect. Those of ordinary skill in the art can understand the specific meanings of the above terms in the present disclosure according to the specific circumstances.
For a better understanding of the present disclosure, the perception-fusion-based Internet of Vehicles method, apparatus, device, storage medium, and system of the embodiments of the present disclosure are described in detail below with reference to Figs. 1 to 6.
Fig. 1 is a flowchart of a perception-fusion-based Internet of Vehicles method provided by an embodiment of the present disclosure. The method includes:
Step S101: acquiring perception fusion data, where the perception fusion data is determined by fusing sensing data acquired from sensing components.
The sensing components include, for example, at least one of a video acquisition device, a wave radar device, and a lidar device. The video acquisition device is used to acquire video data, from which the features of a target image can be determined. The wave radar device is, for example, a millimeter-wave radar device, used to collect information such as the contour, speed, and position of a target. The lidar device is used, for example, to refine the description of the contour.
Step S102: determining road condition information according to the perception fusion data, and sending the road condition information to a vehicle-mounted terminal.
In the perception-fusion-based Internet of Vehicles of the embodiments of the present disclosure, perception fusion data is first acquired, the perception fusion data being determined directly from the sensing components; road condition information is then determined according to the perception fusion data and sent to the vehicle-mounted terminal. In the embodiments of the present disclosure, the perception fusion data is determined directly by fusing the sensing data of the sensing components, which reduces latency: the receiving-and-forwarding system neither needs to receive the sensing data nor to send it to the perception fusion module, which effectively improves the working efficiency and resource utilization of the receiving-and-forwarding system.
Step S102 can be implemented in multiple ways. In some optional embodiments, step S102 includes: acquiring vehicle-mounted data of the vehicle-mounted terminal, and sending the road condition information to the vehicle-mounted terminal according to the perception fusion data and the vehicle-mounted data. In these optional embodiments, information such as the address of the vehicle-mounted terminal can be determined from the vehicle-mounted data, so sending the road condition information according to both the perception fusion data and the vehicle-mounted data improves the accuracy of the result and allows the road condition information to be sent to a designated vehicle-mounted terminal.
Further, the perception fusion data includes first position information of the sensing components. As shown in Fig. 2, step S102 includes:
Step S1021: determining a first designated area of the first position in pre-stored map information.
Step S1022: determining the road condition information of the first designated area according to the perception fusion data.
Step S1023: sending the road condition information to vehicle-mounted terminals located in the first designated area.
In these optional embodiments, the position of the sensing components in the map information is first determined from the perception fusion data, and the first designated area is then determined. The road condition information of the first designated area can likewise be determined from the perception fusion data and sent to the vehicle-mounted terminals located in the first designated area, so that each vehicle-mounted terminal can promptly judge the road environment around it based on the road condition information.
In some optional embodiments, the vehicle-mounted data includes second position information of the vehicle-mounted terminal, the second position information being, for example, longitude and latitude information, and the map information includes a mapping relationship between area nodes and longitude/latitude. The corresponding position of the vehicle-mounted terminal on the map information can be determined from the second position information and the mapping relationship; when the corresponding position is located in the first designated area, the road condition information is sent to the vehicle-mounted terminal.
In these optional embodiments, the mapping relationship between longitude/latitude and area nodes makes it possible to accurately determine the corresponding position of the vehicle-mounted terminal on the map information and hence its location; when the vehicle-mounted terminal is located in the first designated area, the road condition information is sent to it.
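The node-matching logic described above (map a terminal's reported latitude/longitude to an area node, and forward only to terminals whose node falls in the designated area) can be sketched as follows. This is a minimal illustration, not the patent's implementation: the node names, bounding boxes, and helper functions are invented for the example.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AreaNode:
    """One area node of the pre-stored map information (hypothetical bounding box)."""
    name: str
    lat_min: float
    lat_max: float
    lon_min: float
    lon_max: float

    def contains(self, lat: float, lon: float) -> bool:
        return self.lat_min <= lat < self.lat_max and self.lon_min <= lon < self.lon_max

# Hypothetical pre-stored map information: area nodes keyed by bounding box.
MAP_NODES = [
    AreaNode("node-A", 31.20, 31.25, 121.40, 121.45),
    AreaNode("node-B", 31.25, 31.30, 121.40, 121.45),
]

def locate_node(lat: float, lon: float):
    """Return the area node a reported position falls in, or None."""
    for node in MAP_NODES:
        if node.contains(lat, lon):
            return node
    return None

def should_forward(terminal_pos, designated_node_name: str) -> bool:
    """Forward road condition info only to terminals inside the designated area."""
    node = locate_node(*terminal_pos)
    return node is not None and node.name == designated_node_name
```

A terminal reporting (31.22, 121.42) would register to `node-A` and receive road condition information designated for that node, while a terminal under `node-B` would not.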
As shown in Fig. 3, in a second embodiment, the perception-fusion-based Internet of Vehicles method includes:
Step S301: acquiring roadside data of a roadside unit.
Step S302: acquiring perception fusion data, where the perception fusion data is determined by fusing sensing data acquired from sensing components.
Step S303: determining road condition information according to the roadside data and the perception fusion data, and sending the road condition information to a vehicle-mounted terminal.
The order of steps S301 and S302 is not limited; step S301 may be performed before or after step S302.
The second embodiment differs from the first in that it also acquires roadside data. Determining the road condition information from both the roadside data and the perception fusion data further enriches the road condition information, so that the information received by the vehicle-mounted terminal is more comprehensive.
In some optional embodiments, the roadside data includes third position information of the roadside unit; a second designated area of the third position is determined in the pre-stored map information; the road condition information of the second designated area is determined according to the roadside data and the perception fusion data; and the road condition information is sent to vehicle-mounted terminals located in the second designated area.
In still other optional embodiments, a third designated area in the map information may also be determined from the third position information and the first position information; the road condition information of the third designated area is determined according to the roadside data and the perception fusion data, and sent to vehicle-mounted terminals located in the third designated area.
Further, at least one of the perception fusion data and the roadside data may also be sent to a big data platform, which stores at least one of them. This makes it convenient for users to retrieve historical data in time or to study it.
In other optional embodiments, at least one of the vehicle-mounted data, the perception fusion data, and the roadside data may also be sent to the big data platform, which stores at least one of them.
The type of the roadside data is not limited; the roadside data includes, for example, at least one of intersection information, signal-light status information, traffic event information, traffic sign information, basic road safety information, and basic vehicle safety information.
The perception fusion data includes, for example, at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and signal-light status messages.
The road condition information includes, for example, at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and signal-light status messages.
A third embodiment of the present disclosure provides a perception-fusion-based Internet of Vehicles apparatus, including a cloud control platform configured to perform the perception-fusion-based Internet of Vehicles method described in any of the first and second embodiments above.
A fourth embodiment of the present disclosure provides a perception-fusion-based Internet of Vehicles device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program implements the perception-fusion-based Internet of Vehicles method described in any of the first and second embodiments above.
A fifth embodiment of the present disclosure provides a computer-readable storage medium storing a computer program, where the computer program implements the perception-fusion-based Internet of Vehicles method described in any of the first and second embodiments above.
Referring also to Fig. 4, a sixth embodiment of the present disclosure provides a perception-fusion-based Internet of Vehicles system, including: a cloud control platform 100 configured to perform the perception-fusion-based Internet of Vehicles method described in any of the first and second embodiments above; sensing components 500 configured to acquire sensing data; and a perception fusion module 300 configured to fuse the sensing data to obtain perception fusion data.
In some optional embodiments, the perception-fusion-based Internet of Vehicles system further includes a toolkit connected between the perception fusion module 300 and the cloud control platform 100; the toolkit converts the perception fusion data into a designated format, so that the cloud control platform 100 can read the perception fusion data in the designated format.
In other optional embodiments, the perception-fusion-based Internet of Vehicles system further includes a roadside unit 400 configured to acquire roadside data. The cloud control platform 100 is further configured to acquire the roadside data and determine the road condition information according to the roadside data and the perception fusion data.
Taking Figs. 4 to 6 as examples, the perception-fusion-based Internet of Vehicles system provided by the sixth embodiment of the disclosure is illustrated below.
As shown in Fig. 5, the roadside unit 400 encodes the acquired roadside data in the ASN.1 format and forwards it to the cloud control platform 100 over the UDP protocol. When the vehicle-mounted terminal 200 sends vehicle-mounted data, the data carries the longitude/latitude information and the device number of the vehicle-mounted terminal 200. Combining the position information reported by the roadside unit 400 and the on-board unit, the cloud control platform 100 matches the roadside unit 400 and the on-board unit to designated areas of the pre-stored map information. The cloud control platform 100 forwards the messages of the roadside unit 400 to the vehicle-mounted terminals 200 that satisfy the matching rules; that is, when the roadside unit 400 is in the second designated area, the road condition information is sent to the vehicle-mounted terminals 200 located in the second designated area. UDP is used for data transmission to achieve high concurrency and low latency. The overall design essentially satisfies the stability, information-sharing, real-time, and integrity requirements of a computing platform in Internet of Vehicles applications.
The map information (MAP messages) contains the mapping relationship between area nodes and longitude/latitude, and can convey road condition information of a local area to the vehicle-mounted terminal 200. The road condition information includes the intersection information, road section information, lane information, and connection relationships between roads of the local area.
Basic vehicle safety messages (BSM messages) carrying longitude/latitude information are reported to the cloud control platform 100; that is, the vehicle-mounted data carries longitude/latitude information when reported. According to the reported longitude/latitude, the server of the cloud control platform 100 registers the vehicle-mounted terminal 200, or its number, to the corresponding node in the map information. Traffic event and traffic sign messages (RSI messages) and basic road safety messages (RSM messages) are matched to corresponding nodes in the map information according to their longitude/latitude information. The roadside unit 400 reports the node information corresponding to the signal-light status information (SPAT messages) to the server; that is, the roadside data includes the third position information of the roadside unit 400, from which the second designated area can be determined. When forwarding roadside unit 400 messages, the cloud control platform 100 traverses all nodes, computes in turn the distances of all vehicle-mounted terminals 200 under each node, and forwards the above types of roadside unit 400 messages registered at the node to the vehicles within a prescribed distance, the prescribed distance being determined by data in a configuration file; that is, the road condition information is sent to the vehicles located in the second designated area.
The map information stores a mapping between longitude/latitude and area nodes; at the same time, a GeoHash value is computed from the longitude/latitude via the GeoHash algorithm and used as a spatial index.
The cloud control platform 100 computes an area index from the longitude/latitude carried by at least one of the basic vehicle safety messages (BSM), the traffic event and traffic sign messages (RSI), and the basic road safety messages (RSM): it computes the GeoHash value corresponding to the message via the GeoHash algorithm, and at the same time computes nine GeoHash values, namely those to the northwest, north, northeast, east, southeast, south, southwest, and west of the current longitude/latitude plus that of the cell itself, as indexes in the different directions. The system traverses these nine computed GeoHash values against the map messages; if one of them exists in a map message, the map information corresponding to that index is retrieved and the area information of that map message is assigned to the message currently being processed.
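The nine-cell GeoHash lookup described above can be sketched as follows. The encoder is the standard GeoHash scheme (interleave longitude/latitude bisection bits, then base32-encode); the patent does not state its precision or how neighbours are found, so the precision and the neighbour-by-offset approximation are assumptions of this illustration.

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # geohash alphabet (no a, i, l, o)

def geohash_encode(lat: float, lon: float, precision: int = 6) -> str:
    """Standard geohash: alternately bisect the longitude and latitude ranges,
    collect the bits, and base32-encode them 5 bits per character."""
    lat_rng, lon_rng = [-90.0, 90.0], [-180.0, 180.0]
    bits, even = [], True  # even bits index longitude, odd bits latitude
    while len(bits) < precision * 5:
        rng, val = (lon_rng, lon) if even else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2
        if val >= mid:
            bits.append(1)
            rng[0] = mid
        else:
            bits.append(0)
            rng[1] = mid
        even = not even
    return "".join(
        BASE32[int("".join(map(str, bits[i:i + 5])), 2)]
        for i in range(0, len(bits), 5)
    )

def nine_cell_index(lat: float, lon: float, precision: int = 6) -> set:
    """Approximate the cell itself plus its 8 neighbours by re-encoding
    coordinates offset by one cell size in each compass direction."""
    lon_bits = (precision * 5 + 1) // 2      # longitude gets the extra bit
    lat_bits = precision * 5 - lon_bits
    dlat, dlon = 180.0 / (2 ** lat_bits), 360.0 / (2 ** lon_bits)
    return {
        geohash_encode(lat + i * dlat, lon + j * dlon, precision)
        for i in (-1, 0, 1) for j in (-1, 0, 1)
    }
```

A message's GeoHash shares a prefix with every finer-grained hash of the same point, which is what makes the value usable as a spatial index; the nine-cell set covers the case where a vehicle sits just across a cell boundary from the registered node.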
The data of the perception fusion module 300 may come from video data, millimeter-wave radar, lidar, and the like. Video data can be used to determine the features of a target image; millimeter-wave radar can be used to collect information such as the contour, speed, and position of a target; and lidar is used to refine the description of the contour. The fused data is transmitted to the cloud control platform 100 via the Message Queuing Telemetry Transport (MQTT) protocol and, after certain matching rules are applied, forwarded to the vehicle-mounted terminal 200. The above information is fused by specific algorithms and output as a JSON structure; the cloud control platform 100 converts the JSON data into a hexadecimal byte array via a software development kit, and the resulting string, now conforming to the ASN.1 standard, can be forwarded. The data messages provided by perception fusion are more detailed and comprehensive.
The software development kit exposes external interfaces and invokes the corresponding data when a request is received. The perception fusion module 300 puts the processed data into the software development kit in JSON format; the software development kit encodes the JSON data in the ASN.1 format and encapsulates it into a Message Frame, which is sent to the intelligent cloud control system over the UDP protocol. When a client requests data from the cloud control platform 100, the software development kit converts the five kinds of standardized data received and forwarded by the cloud control platform 100, together with the perception-fused data, from the ASN.1 format encapsulated in Message Frames into the JSON format and provides it to the client. Likewise, through the message encoding/decoding functions of the software development kit, data can also be uploaded to the cloud of the cloud control management platform.
As shown in Fig. 5, roadside units 400 such as traffic lights and signs send the collected roadside data to the cloud control platform 100, and the vehicle-mounted terminal 200 likewise sends its own position information, status information, and so on to the cloud control platform 100. The perception fusion module 300 collects sensing data with sensing components 500 such as video and millimeter-wave radar, processes it into perception fusion data, and sends the perception fusion data to the cloud control platform 100; the cloud control platform 100 matches and forwards all messages according to the matching rules. At the same time, clustered deployment and a big data platform are used for computation and storage, supporting high-concurrency scenarios and big data applications.
The present disclosure is an Internet of Vehicles system combining perception fusion, capable of receiving multiple kinds of data and forwarding them in real time. It can receive the five classes of messages defined under the ASN.1 standard as well as the perception fusion data; the perception fusion module 300 receives video and radar data, encodes the data computed by its algorithm models, and forwards it to the cloud control platform 100.
The embodiments of the present disclosure are illustrated below with specific example events:
Road event reminder event: the roadside unit 400 transmits national-standard messages, i.e., the roadside data, to the cloud control platform 100 at a frequency of 1 Hz; at the same time, the perception fusion module 300 also transmits perception fusion data to the cloud control platform 100. When sending messages, each roadside unit 400 device carries its own longitude/latitude information. According to the forwarding matching rules, the cloud control platform 100 registers the devices to the nearest nodes based on the map messages and obtains the correspondence between area node data and longitude/latitude coordinates. The area of each reporting vehicle-mounted terminal 200 is computed from its longitude/latitude position, the vehicles registered under the same node as the message are retrieved, the matching rules are applied to find the vehicles whose distance is below the configured threshold, and event reminder information is sent to them. UDP is used in the sending process. In practice, the vehicle-mounted terminals 200 and roadside terminals on the road generate large amounts of data; these data are sent uniformly to a message queue and, after being subscribed to by the data middle platform, undergo big data processing.
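The distance filter at the end of this forwarding step (compute the distance from the event to every vehicle registered under the node, keep those under the configured threshold) can be sketched with a haversine great-circle distance. The vehicle list and the threshold below are invented for the illustration; the patent only says the threshold comes from a configuration file.

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in metres between two latitude/longitude points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def vehicles_to_notify(event_pos, vehicles, threshold_m: float):
    """Return the ids of vehicles within the configured distance of the event.
    `vehicles` maps vehicle id -> (lat, lon) of its last reported BSM."""
    lat0, lon0 = event_pos
    return [
        vid for vid, (lat, lon) in vehicles.items()
        if haversine_m(lat0, lon0, lat, lon) <= threshold_m
    ]
```

For example, with a 500 m threshold, a vehicle 11 m from the event gets the reminder while one 5 km away does not.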
Traffic accident and pedestrian detection events: in automatic driving, when a traffic accident occurs ahead, the perception fusion module 300 fuses video, millimeter-wave radar, and lidar data to obtain perception fusion data; the resulting event and decision data are sent to the cloud control platform 100, which broadcasts them to nearby vehicle-mounted terminals 200 through the 5G communication channel and the Uu interface. The low-latency, high-bandwidth, wide-connection characteristics of 5G are thus fully utilized, which can effectively reduce secondary accidents. Meanwhile, the roadside unit 400 obtains real-time status messages of surrounding traffic participants through corresponding detection means and broadcasts them to surrounding vehicles as roadside safety messages (RSM messages). For example, when a pedestrian crosses the road on the driving route, this behavior is sent as a roadside safety message to the vehicles about to pass the intersection; combined with the vehicle-mounted terminal 200 and the driver or the vehicle's automated driving equipment, the pedestrian can be avoided so as to achieve safe driving.
As another example, when a landslide occurs and rocks fall onto the road, such an emergency can easily cause a traffic accident. The perception-fusion-based Internet of Vehicles system of the embodiments of the present disclosure collects video through video equipment installed on the roadside; the video stream is pushed to an SRS streaming-media server built on an edge computing server and sliced by ffmpeg (ffmpeg cuts the video to the required size according to the given start time and cut duration), with a designated output path for storing the cut video. At the same time, the sliced frames are transmitted to the perception fusion platform through the RTSP protocol; the perception fusion platform quickly determines the type of the event and forwards the resulting decision together with the event to the cloud control platform 100, which forwards the event information to nearby vehicles.
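The ffmpeg slicing step (cut a clip of the required length, given a start time and duration, and store it at a designated output path) corresponds to a command along the following lines. The file names are placeholders, and `-c copy` is one reasonable choice (stream copy keeps slicing fast by avoiding re-encoding); the patent does not specify the exact flags used.

```python
def slice_video(src: str, start: str, duration: str, out_path: str) -> list:
    """Build the ffmpeg command that cuts `duration` seconds starting at
    `start` from `src` into `out_path`. Run it with subprocess.run(cmd,
    check=True) in production; returning the argv list keeps this testable."""
    cmd = [
        "ffmpeg",
        "-ss", start,       # seek to the given start time
        "-t", duration,     # keep only this much of the input
        "-i", src,          # input: recording pulled from the SRS server
        "-c", "copy",       # stream copy: no re-encode, fast slicing
        "-y", out_path,     # designated output path for the cut video
    ]
    return cmd
```

For example, `slice_video("stream.flv", "00:00:05", "10", "clip.mp4")` yields the argv for cutting a 10-second clip starting at the 5-second mark.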
Although the present application has been described with reference to optional embodiments, various improvements may be made and components may be replaced with equivalents without departing from the scope of the present application. In particular, the technical features mentioned in the various embodiments may be combined in any manner as long as there is no structural conflict. The present application is not limited to the specific embodiments disclosed herein, but includes all technical solutions falling within the scope of the claims.

Claims (13)

  1. A perception-fusion-based Internet of Vehicles method, comprising:
    acquiring perception fusion data, wherein the perception fusion data is determined by fusing sensing data acquired from sensing components;
    determining road condition information according to the perception fusion data, and sending the road condition information to a vehicle-mounted terminal.
  2. The method according to claim 1, wherein the determining road condition information according to the perception fusion data and sending the road condition information to a vehicle-mounted terminal comprises:
    acquiring vehicle-mounted data of the vehicle-mounted terminal;
    sending the road condition information to the vehicle-mounted terminal according to the perception fusion data and the vehicle-mounted data.
  3. The method according to claim 2, wherein the perception fusion data comprises first position information of the sensing components, and
    the determining road condition information according to the perception fusion data and sending the road condition information to a vehicle-mounted terminal comprises:
    determining a first designated area of the first position in pre-stored map information;
    determining the road condition information of the first designated area according to the perception fusion data;
    sending the road condition information to the vehicle-mounted terminal located in the first designated area.
  4. The method according to claim 3, wherein the vehicle-mounted data comprises second position information of the vehicle-mounted terminal, the second position information comprises longitude and latitude information, and the map information comprises a mapping relationship between area nodes and longitude/latitude, and
    before the step of sending the perception fusion data to the vehicle-mounted terminal located in the first designated area, the method further comprises:
    determining, according to the second position information and the mapping relationship, a corresponding position of the vehicle-mounted terminal on the map information;
    when the corresponding position is located in the first designated area, sending the road condition information to the vehicle-mounted terminal.
  5. The method according to any one of claims 1 to 4, wherein the determining road condition information according to the perception fusion data and sending the road condition information to a vehicle-mounted terminal comprises:
    acquiring roadside data of a roadside unit;
    determining the road condition information according to the roadside data and the perception fusion data, and sending the road condition information to the vehicle-mounted terminal.
  6. The method according to claim 5, wherein the roadside data comprises third position information of the roadside unit, and
    the determining road condition information according to the perception fusion data and sending the road condition information to a vehicle-mounted terminal comprises:
    determining a second designated area of the third position in pre-stored map information;
    determining the road condition information of the second designated area according to the roadside data and the perception fusion data;
    sending the road condition information to the vehicle-mounted terminal located in the second designated area.
  7. The method according to claim 5, further comprising: sending at least one of the perception fusion data and the roadside data to a big data platform, wherein the big data platform is configured to store at least one of the perception fusion data and the roadside data.
  8. The method according to claim 5, wherein
    the roadside data comprises at least one of intersection information, signal-light status information, traffic event information, traffic sign information, basic road safety information, and basic vehicle safety information;
    the road condition information comprises at least one of intersection information, road section information, lane information, connection relationships between roads, traffic event messages, traffic sign messages, basic road safety messages, and signal-light status messages.
  9. The method according to claim 1, wherein the sensing components comprise at least one of a video acquisition device, a wave radar device, and a lidar device.
  10. A perception-fusion-based Internet of Vehicles apparatus, comprising a cloud control platform, wherein the cloud control platform is configured to perform the perception-fusion-based Internet of Vehicles method according to any one of claims 1 to 9.
  11. A perception-fusion-based Internet of Vehicles device, comprising a processor, a memory, and a computer program stored on the memory and executable on the processor, wherein the computer program implements the perception-fusion-based Internet of Vehicles method according to any one of claims 1 to 9.
  12. A computer-readable storage medium storing a computer program, wherein the computer program implements the perception-fusion-based Internet of Vehicles method according to any one of claims 1 to 9.
  13. A perception-fusion-based Internet of Vehicles system, comprising:
    a cloud control platform configured to perform the method according to any one of claims 1 to 9;
    sensing components configured to acquire sensing data; and
    a perception fusion module configured to fuse the sensing data to obtain perception fusion data.
PCT/CN2020/134932 2020-03-19 2020-12-09 Internet of Vehicles method, apparatus, device, storage medium and system WO2021184841A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010197663.2 2020-03-19
CN202010197663.2A CN113498011B (zh) 2020-03-19 Internet of Vehicles method, apparatus, device, storage medium and system

Publications (1)

Publication Number Publication Date
WO2021184841A1 true WO2021184841A1 (zh) 2021-09-23

Family

ID=77771889

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/134932 WO2021184841A1 (zh) 2020-03-19 2020-12-09 车联网方法、装置、设备、存储介质及系统

Country Status (2)

Country Link
CN (1) CN113498011B (zh)
WO (1) WO2021184841A1 (zh)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113727304A (zh) * 2021-11-04 2021-11-30 深圳市城市交通规划设计研究中心股份有限公司 Emergency vehicle early-warning system based on a 5G communication architecture and early-warning method thereof
CN113888871A (zh) * 2021-10-20 2022-01-04 上海电科智能系统股份有限公司 Automated handling and linkage system and method for expressway traffic events
CN113895442A (zh) * 2021-10-11 2022-01-07 苏州智加科技有限公司 Vehicle driving decision method and system based on roadside and vehicle-side cooperative perception
CN114244880A (zh) * 2021-12-16 2022-03-25 云控智行科技有限公司 Operation method, apparatus, device and medium for intelligent connected driving cloud control functions
CN114333330A (zh) * 2022-01-27 2022-04-12 浙江嘉兴数字城市实验室有限公司 Intersection event detection system and method based on roadside edge holographic perception
CN114596707A (zh) * 2022-03-16 2022-06-07 阿波罗智联(北京)科技有限公司 Traffic control method and apparatus, device, system, and medium
CN114792470A (zh) * 2022-04-08 2022-07-26 广州小鹏汽车科技有限公司 Road condition display method and apparatus, wearable device and storage medium
CN115100852A (zh) * 2022-06-09 2022-09-23 智能汽车创新发展平台(上海)有限公司 Highly available roadside fusion perception system and method for intelligent connected vehicles

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114641041B (zh) * 2022-05-18 2022-09-13 之江实验室 Edge-intelligence-oriented Internet of Vehicles slicing method and apparatus

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105025077A (zh) * 2015-05-28 2015-11-04 广州番禺职业技术学院 Cloud-computing-based vehicle-mounted Internet of Things operation system
CN105390009A (zh) * 2015-11-17 2016-03-09 广东好帮手电子科技股份有限公司 Dynamic traffic information publishing method and system
EP3056861A1 (en) * 2015-02-12 2016-08-17 Honda Research Institute Europe GmbH Method and system in a vehicle for improving prediction results of an advantageous driver assistant system
US20180059680A1 (en) * 2016-08-29 2018-03-01 Denso Corporation Vehicle location recognition device
CN109738923A (zh) * 2019-03-18 2019-05-10 腾讯科技(深圳)有限公司 Driving navigation method, apparatus, and system
CN110210280A (zh) * 2019-03-01 2019-09-06 北京纵目安驰智能科技有限公司 Beyond-line-of-sight perception method, system, terminal, and storage medium

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120135676A1 (en) * 2010-11-26 2012-05-31 Industrial Technology Research Institute System and method for deployment and management of interactive regional broadcast services
CN105280005A (zh) * 2014-06-06 2016-01-27 电信科学技术研究院 Road safety message sending method and apparatus
CN106331008A (zh) * 2015-06-26 2017-01-11 中兴通讯股份有限公司 Method and apparatus for managing vehicle groups in the Internet of Vehicles
CN106331006A (zh) * 2015-06-26 2017-01-11 中兴通讯股份有限公司 Method and apparatus for grouping vehicles in the Internet of Vehicles
US10394237B2 (en) * 2016-09-08 2019-08-27 Ford Global Technologies, Llc Perceiving roadway conditions from fused sensor data
CN106530782B (zh) * 2016-09-30 2019-11-12 广州大正新材料科技有限公司 Road vehicle traffic warning method
CN106530703B (zh) * 2016-11-25 2019-09-24 四川长虹电器股份有限公司 IoT-based intelligent terminal road condition collection system
DE102017213925A1 (de) * 2017-08-10 2019-02-14 Robert Bosch Gmbh Method for acquiring road condition information, method for distributing traffic information, and traffic information system
CN108417087B (zh) * 2018-02-27 2021-09-14 浙江吉利汽车研究院有限公司 Vehicle safe passage system and method
CN109709593A (zh) * 2018-12-28 2019-05-03 国汽(北京)智能网联汽车研究院有限公司 Intelligent connected vehicle on-board terminal platform based on tight "cloud-device" coupling
CN110570674A (zh) * 2019-09-06 2019-12-13 杭州博信智联科技有限公司 Vehicle-road cooperative data interaction method and system, electronic device, and readable storage medium



Also Published As

Publication number Publication date
CN113498011A (zh) 2021-10-12
CN113498011B (zh) 2023-08-15


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20925195

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20925195

Country of ref document: EP

Kind code of ref document: A1


32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 03/07/2023)