CN114530041A - New vehicle-road cooperative fusion perception method based on accuracy - Google Patents

New vehicle-road cooperative fusion perception method based on accuracy

Info

Publication number
CN114530041A
CN114530041A
Authority
CN
China
Prior art keywords
perception
complexity
accuracy
vehicle
road
Prior art date
Legal status
Granted
Application number
CN202210143228.0A
Other languages
Chinese (zh)
Other versions
CN114530041B (en)
Inventor
李文亮 (Li Wenliang)
周炜 (Zhou Wei)
高金 (Gao Jin)
刘智超 (Liu Zhichao)
曹琛 (Cao Chen)
Current Assignee
Research Institute of Highway Ministry of Transport
Original Assignee
Research Institute of Highway Ministry of Transport
Priority date
Filing date
Publication date
Application filed by Research Institute of Highway Ministry of Transport filed Critical Research Institute of Highway Ministry of Transport
Priority to CN202210143228.0A priority Critical patent/CN114530041B/en
Publication of CN114530041A publication Critical patent/CN114530041A/en
Application granted granted Critical
Publication of CN114530041B publication Critical patent/CN114530041B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0125 Traffic data processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/28 Databases characterised by their database models, e.g. relational or object models
    • G06F16/284 Relational databases
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/0104 Measuring and analyzing of parameters relative to traffic conditions
    • G08G1/0137 Measuring and analyzing of parameters relative to traffic conditions for specific applications
    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/01 Detecting movement of traffic to be counted or controlled
    • G08G1/048 Detecting movement of traffic to be counted or controlled with provision for compensation of environmental or other condition, e.g. snow, vehicle stopped at detector
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The invention provides a novel accuracy-based vehicle-road cooperative fusion perception method. A unified perception model and a unified perception complexity calculation model are established at the vehicle end and the road end, together with relation models between each side's perception complexity and perception accuracy; the perception accuracy values are then calculated in real time and used to decide which perception result to adopt. By evaluating perception accuracy through perception complexity and selecting the perception result by accuracy, the invention achieves more reliable decision judgment.

Description

New vehicle-road cooperative fusion perception method based on accuracy
Technical Field
The invention relates to a vehicle intelligent control technology, in particular to a new method for vehicle-road cooperative fusion perception based on accuracy.
Background
Vehicle-road cooperation is a current research hotspot in intelligent transportation. Through advanced wireless communication and Internet technologies, it enables dynamic, real-time information interaction among traffic elements such as people, vehicles, and roads, thereby improving the safety and efficiency of vehicle operation. Vehicle-road cooperative perception is a prerequisite for vehicle-road cooperative control. Current cooperative perception mainly exploits the beyond-line-of-sight capability of the roadside: areas the vehicle end cannot perceive are covered entirely by roadside perception information. In most situations, however, both the vehicle end and the roadside can perceive the same scene, and especially when their perception accuracies and perception conclusions disagree, deciding whose signal to adopt becomes critical. A decision judgment is needed to determine the true perception conclusion and thereby improve perception accuracy, yet current research offers no good method for fusing and weighing the vehicle-end and roadside perception results.
Disclosure of Invention
Therefore, to further improve the accuracy of cooperative vehicle-road perception, the invention provides a novel accuracy-based vehicle-road cooperative fusion perception method. Based on vehicle-end and road-end perception complexity models built for different operating conditions, and on relation models between each side's perception complexity and perception accuracy, the real-time values of perception complexity and perception accuracy are calculated from the real-time monitoring information of the vehicle end and the road end; each side's perception accuracy is compared with a threshold, and the perception result to adopt is finally determined.
Therefore, the technical scheme adopted by the invention is as follows:
a new vehicle-road collaborative fusion perception method based on accuracy is provided, aiming at a vehicle end and a road end:
determining a perception element by establishing a perception model;
determining the perception complexity based on the perception elements by establishing a perception complexity calculation model;
determining the perception accuracy based on the perception complexity by establishing a relation model between the perception complexity and the perception accuracy;
and determining an adoption criterion based on the perception accuracy by establishing a perception fusion decision model, and deciding the perception result according to the adoption criterion.
A unified perception model is applied at both the vehicle end and the road end.
The perception elements comprise target object elements and environment elements where the target objects are located, the complexity of each perception element is expressed by numerical values, and the more complex the perception is, the larger the value is.
The complexity of each perception element has the same value interval.
The established perception complexity calculation model comprises a calculation model of target object elements and target object complexity, a calculation model of environment elements and environment complexity of the target object, and a comprehensive perception complexity calculation model based on the target object complexity and the environment complexity;
the perceptual accuracy is determined based on a composite perceptual complexity.
And the vehicle end and the road end calculate and obtain respective perception complexity based on the perception complexity calculation model according to the perception element information monitored in real time.
The vehicle end and the road end are respectively fitted and established with respective perception complexity and perception accuracy relation models;
and then calculating the perception accuracy according to the perception complexity obtained in real time based on respective perception complexity and perception accuracy relation models.
In the perception fusion decision model, the adoption criterion based on perception accuracy is as follows:
Let the perception accuracy threshold be Z_0, the vehicle-end perception accuracy Z_V, and the road-end perception accuracy Z_R.
When Z_V < Z_0 and Z_R < Z_0, neither perception result is adopted;
when Z_V ≥ Z_0 and/or Z_R ≥ Z_0, and Z_V ≠ Z_R, the perception result with the higher accuracy is adopted;
when Z_V > Z_0 and Z_R > Z_0, and Z_V = Z_R, the vehicle-end perception result is adopted.
Due to the adoption of the technical scheme, compared with the prior art, the invention has the following remarkable technical effects:
1. the invention evaluates the elements of the target object and the environment of the target object by establishing a unified perception complexity model of the vehicle-end and road-end systems, sets a value range, presets a perception complexity calculation model, and converts the content of the complex elements into specific numerical values, so that the element evaluation is simplified.
2. The invention establishes a relation model of perception complexity and perception accuracy, enables the perception accuracy to be related to the complexity of environment and target, presets a perception accuracy calculation model for a vehicle end system and a road end system, calculates the value of the perception accuracy in real time according to real-time monitoring information, and enables the perception information of a vehicle-road cooperative system to be timely fed back to a driver for judgment.
3. The invention compares the vehicle-end and road-end perception accuracy values with a preset perception accuracy threshold, and fuses the two values to decide which perception result to adopt, further improving perception accuracy.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, wherein like reference numerals are used to designate like parts throughout.
FIG. 1 is a flow chart of a new method for cooperative vehicle-road fusion perception based on accuracy;
FIG. 2 is a graph of fitting vehicle-end perception complexity to perception accuracy;
fig. 3 is a fitting curve graph of road-end sensing complexity and sensing accuracy.
Detailed Description
The invention is described in detail below with reference to the drawings, which form a part hereof, and which are shown by way of illustration, embodiments of the invention. However, it should be understood by those skilled in the art that the following examples are not intended to limit the scope of the present invention, and any equivalent changes or modifications made within the spirit of the present invention should be considered as falling within the scope of the present invention.
The invention provides a novel cooperative fusion perception method based on the perception accuracy of the vehicle-end and road-end systems. By traversing application scenarios under different operating conditions, a unified vehicle-end and road-end perception model is established, the perception elements and their evaluation and analysis method are determined, and perception complexity calculation models are built at the vehicle end and the road end. The perception complexity is calculated in real time from the monitored target object information and environment information (i.e., the perception elements). Relation models between perception complexity and perception accuracy are then established, the vehicle-end and road-end perception accuracy values are calculated from the real-time perception complexity, and finally each value is compared with a preset perception accuracy threshold under a perception fusion decision model to reach a fusion decision on the vehicle-end and road-end perception results.
The method comprises the following specific steps:
1. establishment of unified perception model of vehicle end and road end
A perception model is established to determine the perception elements and their evaluation method. The elements of the perception model comprise target object elements and the environmental elements of the target object. Target object elements include, without being exhaustive, the target's size, shape, color, material, and dynamic information; the perceived complexity of these elements is called the target object complexity. The environmental elements of the target object include, without being exhaustive, weather conditions, light conditions, road surface state, and traffic flow information; the perceived complexity of these elements is called the environment complexity.
The comprehensive perception complexity combines the target object complexity and the environment complexity; it describes the perceived complexity of the target object and the environment in which it operates, and the vehicle-end and road-end systems perceive this comprehensive complexity under the unified model.
The perception complexity evaluation is an evaluation and analysis of element content. For example: for the weather element, rain, snow, fog, and hail are increasingly complex, and the weaker the light, the more complex the perception; likewise, the smaller the target's volume, the more complex its shape, the less distinctive the reflective characteristics of its material, and the higher its speed, the more complex the perception.
For convenience of calculation, the perception complexity evaluation of each element of the perception model is expressed numerically, for example determined by a normalization method with the value range set to [0,1]; the more complex the perception, the larger the value, as shown in Table 1:
table 1: perception complexity model element and evaluation table
(Table 1 appears as an image in the original publication and is not reproduced here.)
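Since Table 1 is only an image in the source, the normalization idea described above can be sketched as follows. All element names and score values here are illustrative assumptions, not values taken from the patent's table:

```python
# Sketch of the normalization described above: each perception element is
# mapped to a complexity score in [0, 1], larger = harder to perceive.
# The names and numbers below are illustrative assumptions only.

def normalize(value, lo, hi):
    """Min-max normalize a raw quantity into [0, 1], clamped at the ends."""
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

# Qualitative elements (e.g. weather) can be scored with a lookup table.
WEATHER_SCORE = {"clear": 0.0, "rain": 0.4, "snow": 0.6, "fog": 0.8, "hail": 1.0}

# Quantitative elements (e.g. target speed) can use min-max normalization,
# assuming complexity grows with speed up to an assumed cap of 120 km/h.
speed_score = normalize(100, 0, 120)   # 100 km/h -> ~0.833

print(WEATHER_SCORE["fog"], round(speed_score, 3))
```

Any monotone scoring that stays inside the common [0,1] interval would serve the same purpose, since the patent only requires that more complex perception maps to a larger value.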
2. Establishment of unified perception complexity calculation model of vehicle end and road end
The same perception complexity calculation model is preset in the vehicle-end and road-end systems. It comprises the functional relationship between the environmental elements of the target object and the environment complexity, the functional relationship between the target object elements and the target object complexity, and the composition of environment complexity and target object complexity, i.e., the comprehensive perception complexity.
(1) According to the perception model, a calculation model of each environmental element and environmental complexity of the target object is established, and the calculation model can be expressed as follows:
A = f_1(a_1, a_2, a_3, ..., a_i) (1)
wherein:
a is the environmental complexity;
a1,...,aia perceptual complexity value for each environmental element;
i is the number of the environmental elements,
(2) according to the perception model, a calculation model of each target element and the complexity of the target is established, and the calculation model can be expressed as follows:
B = f_2(b_1, b_2, b_3, ..., b_j) (2)
wherein:
b is the complexity of the target;
b1,...,bja perceptual complexity value for each target object element;
j is the number of target elements.
(3) A calculation model of the comprehensive perception complexity B_A is established from the environment complexity and the target object complexity, which can be expressed as:
B_A = f_3(A, B) (3)
wherein:
B_A is the comprehensive perception complexity.
The perception complexity calculation model is preset at both the vehicle end and the road end. In a specific environment and with a specific target object, each end monitors the target object information and environment information in real time, and calculates and outputs its value of comprehensive perception complexity according to the preset model. These values can be expressed respectively as:
vehicle end:
B_A-V = f_3(A_V, B_V) (4)
wherein:
B_A-V is the vehicle-end comprehensive perception complexity; A_V and B_V are the environment complexity and target object complexity calculated at the vehicle end.
Road end:
B_A-R = f_3(A_R, B_R) (5)
wherein: b isA-RFor comprehensive perception of complexity at the end of the road, AR,BRThe calculated environment complexity and the target complexity are respectively the road end.
Because of different perception capabilities, the complexities perceived by the vehicle end and the road end may be the same or different. If the target and environment are identified uniformly under vehicle-road cooperation, the complexity results of vehicle-end and road-end perception should be consistent.
3. Establishment of relation model between vehicle-end and road-end perception complexity and perception accuracy
And respectively establishing a relation model of the vehicle-end perception complexity and the perception accuracy and a relation model of the road-end perception complexity and the perception accuracy.
Perception accuracy is defined, for a specific perception target and its environment, as the ratio of correctly detected targets to all detections, where all detections comprise the correct and the incorrect ones. The vehicle end and the road end have their own perception characteristics; even when monitoring the same target object in the same environment, their perception results may differ, and this is related to perception accuracy.
Therefore, the vehicle-end and road-end perception accuracies under specific target objects and environments are first determined through extensive testing; relation models between the vehicle-end perception complexity and the vehicle-end perception accuracy Z_V, and between the road-end perception complexity and the road-end perception accuracy Z_R, are then established by fitting.
The relationship model of the vehicle-end perception accuracy and the perception complexity can be expressed as follows:
Z_V = f_4(B_A-V) (6)
the relationship model of the road end perception accuracy and the perception complexity can be expressed as follows:
Z_R = f_5(B_A-R) (7)
above f1、f2、f3、f4、f5Both represent function operators.
The following tests verify that, within the range of perception complexity values, at the same perception complexity B_A the experimentally obtained vehicle-end perception accuracy Z_V and road-end perception accuracy Z_R differ, as shown in Table 2:
table 2: comparison of vehicle-end and road-end perception accuracy
(Table 2 appears as an image in the original publication and is not reproduced here.)
The experimental data in the table above are obtained through repeated experiments; the data are then fitted, using a cubic polynomial method or another fitting method, to establish the relation models between the vehicle-end and road-end perception complexity and perception accuracy.
As shown in fig. 2, a curve of a relationship model between sensing complexity and sensing accuracy at a vehicle end has the following fitting formula:
Z_V = -1.1574 B_A^3 + 0.4415 B_A^2 - 0.2682 B_A + 0.9928
as shown in fig. 3, a curve of a model of relationship between sensing complexity and sensing accuracy at a road end has the following fitting formula:
Z_R = -1.6088 B_A^3 + 0.6587 B_A^2 - 0.0472 B_A + 0.9847
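Since Table 2 is only an image here, the cubic fitting step can be sketched by sampling the published vehicle-end cubic and checking that an ordinary least-squares fit via the normal equations recovers its coefficients; with real test data, `xs`/`ys` would simply be the measured (complexity, accuracy) pairs:

```python
# Least-squares cubic fit, as in the patent's "cubic polynomial method".
# Pure-Python normal equations; no external libraries needed.

def polyfit3(xs, ys):
    """Fit y = c0*x^3 + c1*x^2 + c2*x + c3 by least squares."""
    powers = [3, 2, 1, 0]
    # Normal equations M c = v for the monomial basis.
    M = [[sum(x ** (p + q) for x in xs) for q in powers] for p in powers]
    v = [sum(y * x ** p for x, y in zip(xs, ys)) for p in powers]
    n = 4
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n):
                M[r][c] -= f * M[col][c]
            v[r] -= f * v[col]
    # Back substitution.
    coeffs = [0.0] * n
    for r in range(n - 1, -1, -1):
        coeffs[r] = (v[r] - sum(M[r][k] * coeffs[k] for k in range(r + 1, n))) / M[r][r]
    return coeffs

# Sample the published vehicle-end curve on [0, 1] and refit it.
true_c = [-1.1574, 0.4415, -0.2682, 0.9928]
xs = [i / 19 for i in range(20)]
ys = [true_c[0] * x**3 + true_c[1] * x**2 + true_c[2] * x + true_c[3] for x in xs]
fit = polyfit3(xs, ys)
print([round(ci, 4) for ci in fit])
```

On noiseless samples of a cubic, the fit recovers the generating coefficients to within floating-point error; on noisy measured data it returns the least-squares cubic instead.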
the method comprises the steps of presetting a relation model of vehicle end perception accuracy and perception complexity at a vehicle end, presetting a relation model of road end perception accuracy and perception complexity at a road end, under a specific environment and a target object, calculating and outputting values of the vehicle end perception accuracy and the road end perception accuracy according to target object information and environment information which are obtained by the vehicle end and the road end in real time, and combining the current perception complexity through the relation model of the vehicle end perception accuracy and the road end perception complexity, and performing fusion decision according to the values of the vehicle end perception accuracy and the road end perception accuracy.
4. Establishment of perception fusion decision model
A fusion decision model based on perception accuracy is established. A perception accuracy threshold Z_0 is preset in the vehicle-end system. The adoption criterion for the vehicle-end and road-end perception accuracy compares the real-time vehicle-end perception accuracy Z_V and road-end perception accuracy Z_R with the threshold Z_0 to decide which result to adopt. The adoption criterion is as follows:
when Z isV<Z0And Z isR<Z0If no sensing result is collected, the vehicle-road cooperative system quits working and prompts the driver;
when Z isV≥Z0And/or ZR≥Z0And Z isV≠ZRSensing results with high confidence acquisition sensing accuracy;
when Z isV>Z0And Z isR>Z0And Z isV=ZRAnd the vehicle end is used as the main vehicle under the condition of the same sensing accuracy, so the signal collecting vehicle end senses the result.
Examples
1. According to the perception model, a vehicle operating condition in a specific environment is selected. In this embodiment, a standard A-class passenger car travels at 100 km/h in light fog. The elements and evaluation of the established perception model are shown in Table 3:
table 3: perception model element and evaluation example table
(Table 3 appears as an image in the original publication and is not reproduced here.)
2. A calculation model of the environment complexity is established. For the passenger car of this embodiment (Table 3), the environmental elements and the environment complexity are calculated as follows:
(Formula (8), the explicit mapping from the environmental element values to A, appears as an image in the original publication.)
wherein:
n is the number of environmental elements, n = 6;
a_4 = 0.2, the others are 0;
A is the environment complexity.
A calculation model of the target object complexity is established. For the passenger car of this embodiment (Table 3), the target object elements and the target object complexity are calculated as follows:
(Formula (9), the explicit mapping from the target object element values to B, appears as an image in the original publication.)
wherein:
m is the number of target object elements, m = 4;
b_4 = 0.2, the others are 0;
B is the target object complexity.
For this embodiment's working condition, the comprehensive perception complexity of the passenger car is expressed in terms of the environment complexity and the target object complexity as:
B_A = (A + B)/2 (10)
the cooperative sensing of the vehicle end and the road end can be obtained by substituting the formula (8) and the formula (9) into the formula (10):
(Formula (11), the result of this substitution, appears as an image in the original publication.)
3. The relationship between vehicle-end perception complexity and perception accuracy is established as:
Z_V = -1.1574 B_A^3 + 0.4415 B_A^2 - 0.2682 B_A + 0.9928
the perception complexity and the perception accuracy of the road end are related as follows:
Z_R = -1.6088 B_A^3 + 0.6587 B_A^2 - 0.0472 B_A + 0.9847
according to the formula (11) BAThe obtained value is brought in to obtain the vehicle end perception accuracy rate ZVSum end sensing accuracy ZRThe values of (a) are 0.95 and 0.98, respectively.
4. Adoption of the perception result
The perception accuracy threshold is set to Z_0 = 0.90. The calculated vehicle-end perception accuracy Z_V = 0.95 and road-end perception accuracy Z_R = 0.98 are both greater than the set threshold Z_0, and Z_R > Z_V. According to the adoption criterion, the perception result with the higher accuracy value is adopted; since the road-end perception accuracy Z_R is higher, the road-end perception result is adopted.

Claims (8)

1. A new vehicle-road collaborative fusion perception method based on accuracy is characterized in that: aiming at the vehicle end and the road end,
determining a perception element by establishing a perception model;
determining the perception complexity based on the perception elements by establishing a perception complexity calculation model;
determining the perception accuracy based on the perception complexity by establishing a relation model between the perception complexity and the perception accuracy;
and determining an adoption criterion based on the perception accuracy by establishing a perception fusion decision model, and deciding the perception result according to the adoption criterion.
2. The new accuracy-based vehicle-road cooperative fusion perception method according to claim 1, characterized in that:
and the vehicle end and the road end are applied with a unified perception model.
3. The new accuracy-based vehicle-road cooperative fusion perception method according to claim 1, characterized in that:
the perception elements comprise target object elements and environment elements where the target objects are located, the complexity of each perception element is expressed by numerical values, and the more complex the perception is, the larger the value is.
4. The new accuracy-based vehicle-road cooperative fusion perception method according to claim 3, characterized in that:
the complexity of each perception element has the same value interval.
5. The new accuracy-based vehicle-road cooperative fusion perception method according to one of claims 1-4, characterized in that: the established perception complexity calculation model comprises a calculation model of target object elements and target object complexity, a calculation model of environment elements and environment complexity of the target object, and a comprehensive perception complexity calculation model based on the target object complexity and the environment complexity;
the perceptual accuracy is determined based on a composite perceptual complexity.
6. The new accuracy-based vehicle-road cooperative fusion perception method according to claim 1, characterized in that:
and the vehicle end and the road end calculate and obtain respective perception complexity based on the perception complexity calculation model according to the perception element information monitored in real time.
7. The new accuracy-based vehicle-road cooperative fusion perception method according to claim 1, characterized in that:
the vehicle end and the road end are respectively fitted and established with respective perception complexity and perception accuracy relation models;
and then calculating the perception accuracy according to the perception complexity obtained in real time based on respective perception complexity and perception accuracy relation models.
8. The new accuracy-based vehicle-road cooperative fusion perception method according to claim 1, characterized in that: in the perception fusion decision model, the adoption criterion based on perception accuracy is as follows:
setting a perception accuracy threshold value as Z0Vehicle end sensing accuracy is ZVRoad end sensing accuracy is ZR
When Z isV<Z0And Z isR<Z0No confidence in any perception result is made;
when Z isV≥Z0And/or ZR≥Z0And Z isV≠ZRSensing results with high confidence acquisition sensing accuracy;
when Z isV>Z0And Z isR>Z0And Z isV=ZRAnd the signal collecting vehicle end senses the result.
CN202210143228.0A 2022-02-16 2022-02-16 Novel vehicle-road collaborative fusion sensing method based on accuracy Active CN114530041B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210143228.0A CN114530041B (en) 2022-02-16 2022-02-16 Novel vehicle-road collaborative fusion sensing method based on accuracy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210143228.0A CN114530041B (en) 2022-02-16 2022-02-16 Novel vehicle-road collaborative fusion sensing method based on accuracy

Publications (2)

Publication Number Publication Date
CN114530041A true CN114530041A (en) 2022-05-24
CN114530041B CN114530041B (en) 2023-05-02

Family

ID=81622717

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210143228.0A Active CN114530041B (en) 2022-02-16 2022-02-16 Novel vehicle-road collaborative fusion sensing method based on accuracy

Country Status (1)

Country Link
CN (1) CN114530041B (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004184230A (en) * 2002-12-03 2004-07-02 Mazda Motor Corp Information providing device for vehicle
CN110853393A (en) * 2019-11-26 2020-02-28 清华大学 Intelligent network vehicle test field data acquisition and fusion method and system
WO2021077809A1 (en) * 2019-10-26 2021-04-29 华为技术有限公司 Information fusion method and system
CN113064193A (en) * 2021-03-25 2021-07-02 上海智能新能源汽车科创功能平台有限公司 Combined positioning system based on vehicle road cloud cooperation
CN113743479A (en) * 2021-08-19 2021-12-03 东南大学 End-edge-cloud vehicle-road cooperative fusion perception architecture and construction method thereof
CN113763738A (en) * 2021-09-14 2021-12-07 上海智能网联汽车技术中心有限公司 Method and system for matching roadside perception and vehicle-end perception of vehicle-road cooperative system in real time
CN113887522A (en) * 2021-11-02 2022-01-04 清华大学 Data processing method, system and storage medium based on vehicle-road cooperation
CN113895442A (en) * 2021-10-11 2022-01-07 苏州智加科技有限公司 Vehicle driving decision method and system based on roadside and vehicle end cooperative sensing
CN114023061A (en) * 2021-10-25 2022-02-08 华录易云科技有限公司 Traffic flow acquisition capacity evaluation method based on vehicle-road cooperative roadside sensing system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Cheng Chen et al.: "Research on a Test and Evaluation System for the Operational Safety of Autonomous Vehicles", China Public Security (Academy Edition) *

Also Published As

Publication number Publication date
CN114530041B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
CN108445503B (en) Unmanned path planning algorithm based on fusion of laser radar and high-precision map
CN109094575A (en) Control method for vehicle, server-side and the client of intelligent scene
US10475336B2 (en) System for forecasting traffic condition pattern by analysis of traffic data and forecasting method thereof
CN110837800A (en) Port severe weather-oriented target detection and identification method
CN112116031B (en) Target fusion method, system, vehicle and storage medium based on road side equipment
CN103065151A (en) Vehicle identification method based on depth information
CN111723849A (en) Road adhesion coefficient online estimation method and system based on vehicle-mounted camera
CN110194041B (en) Self-adaptive vehicle body height adjusting method based on multi-source information fusion
CN112464773B (en) Road type identification method, device and system
CN106408944A (en) Congestion level analysis platform based on double communication data
CN107591025A (en) The method for early warning and system, server, car terminals, memory of vehicle traveling
CN202008739U (en) Traffic information vehicle-mounted service system
CN113538357B (en) Shadow interference resistant road surface state online detection method
CN216357337U (en) Intelligent network connection research testing device
CN114475573A (en) Fluctuating road condition identification and vehicle control method based on V2X and vision fusion
US11492006B2 (en) Apparatus and methodology of road condition classification using sensor data
CN114530041A (en) New vehicle-road cooperative fusion perception method based on accuracy
CN110509925B (en) Method for identifying sharp turn based on Internet of vehicles data
CN105684062A (en) Method and apparatus for providing an event message with respect to an imminent event for a vehicle
CN110782671A (en) Real-time updating method and server for road congestion state
CN116486359A (en) All-weather-oriented intelligent vehicle environment sensing network self-adaptive selection method
CN114024997B (en) Intelligent equipment based on automatic driving and AIOT Internet of things platform method
CN107329471B (en) A kind of intelligent decision system of automatic driving vehicle
CN112347953B (en) Recognition device for road condition irregular obstacles of unmanned vehicle
CN115326131A (en) Intelligent analysis method and system for unmanned mine road conditions

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant