CN114173307A - Roadside perception fusion system based on vehicle-road cooperation and optimization method - Google Patents

Roadside perception fusion system based on vehicle-road cooperation and optimization method

Info

Publication number
CN114173307A
Authority
CN
China
Prior art keywords
vehicle
positioning
roadside
road
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111548249.2A
Other languages
Chinese (zh)
Inventor
张莉
邢珺
李志伟
沈志伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Haikang Zhilian Technology Co ltd
Original Assignee
Zhejiang Haikang Zhilian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Haikang Zhilian Technology Co ltd filed Critical Zhejiang Haikang Zhilian Technology Co ltd
Priority to CN202111548249.2A
Publication of CN114173307A
Legal status: Pending

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W 4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W 4/30 Services specially adapted for particular environments, situations or purposes
    • H04W 4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H04W 4/44 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P], for communication between vehicles and infrastructures, e.g. vehicle-to-cloud [V2C] or vehicle-to-home [V2H]
    • H04W 4/06 Selective distribution of broadcast services, e.g. multimedia broadcast multicast service [MBMS]; Services to user groups; One-way selective calling services
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S 13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S 13/865 Combination of radar systems with lidar systems
    • G01S 13/867 Combination of radar systems with cameras
    • G08 SIGNALLING
    • G08B SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B 21/00 Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B 21/18 Status alarms
    • H04L TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L 67/00 Network arrangements or protocols for supporting network services or applications
    • H04L 67/01 Protocols
    • H04L 67/02 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
    • H04L 67/025 Protocols based on web technology, e.g. hypertext transfer protocol [HTTP], for remote control or remote monitoring of applications
    • H04L 67/12 Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Physics & Mathematics (AREA)
  • Signal Processing (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Business, Economics & Management (AREA)
  • Emergency Management (AREA)
  • Traffic Control Systems (AREA)

Abstract

An optimization method based on a roadside perception fusion system operates in a vehicle-road cooperation mode: the vehicle-end positioning (combined high-precision GNSS, inertial navigation and high-precision map positioning) is sent to the roadside unit as a positioning truth value, and the perception accuracy of the roadside sensing equipment is determined by comparing the vehicle position acquired by the roadside sensing equipment with the positioning truth value fed back by the vehicle end. When the difference between the two is within a set range, all target positions perceived by the roadside sensing equipment are corrected from the vehicle-end positioning data using a factor-graph positioning method, which improves the accuracy of the road target data perceived at the roadside, and the correction information is also sent to the cloud platform through the roadside edge computing unit. When the difference exceeds the set allowable error range, a data-anomaly warning is reported to the platform end, so that the roadside perception data are monitored. Under vehicle-road cooperation, the invention corrects the error of the data acquired by the roadside sensing equipment with the factor-graph method, thereby ensuring the accuracy of the data acquired by the roadside sensing equipment.

Description

Roadside perception fusion system based on vehicle-road cooperation and optimization method
Technical Field
The invention relates to the field of vehicle-road cooperative roadside perception fusion, and in particular to a roadside perception fusion system and optimization method based on vehicle-road cooperation.
Background
Vehicle-road cooperation uses advanced wireless communication and new-generation internet technology to exchange dynamic real-time information between vehicles and between vehicles and the road in an all-round way. On the basis of full-time dynamic traffic information collection and fusion, it performs active vehicle safety control and cooperative road management, fully realizes effective cooperation between vehicles and roads, guarantees traffic safety and improves traffic efficiency, thereby forming a safe, efficient and environment-friendly road traffic system.
At present, the requirement on the positioning accuracy of autonomous vehicles is getting higher and higher, and with the large-scale deployment of roadside equipment, high perception accuracy directly determines how well an intelligent networked vehicle perceives obstacles in its blind areas. In practice, however, factors such as inaccurate initial calibration, loose installation and wind cause slight deviations in the orientation of the sensing equipment, which greatly affects the accuracy of the perceived data, so that the sensors show a certain deviation when perceiving the positions of road targets.
Considering that intelligent networked vehicles generally have high-precision positioning capability, the invention provides a roadside perception fusion optimization method based on vehicle-road cooperation to improve and monitor the accuracy of the data broadcast by the roadside sensing equipment.
Some existing patents:
A vehicle positioning method and device based on the fusion of vehicle-road cooperation and high-precision positioning (application No. 202011548379.1) uses vehicle-road cooperation to compensate for the shortcomings of high-precision satellite and inertial navigation; specific anchor points can be added at certain intervals in scenes with complicated roads or poor satellite coverage, so that high-precision vehicle positioning is achieved under all road conditions. The method only works on road sections with anchor-point markers; on sections without them, high-precision positioning cannot be achieved.
A positioning method and device, vehicle and auxiliary positioning system (application No. 201911241849.7) proposes transmitting position information within the range of position transmitters and deriving the current vehicle position from the geographic position carried by that information and the driving state of the vehicle. Accurate positioning is only possible when the position transmitters are deployed continuously and the vehicle stays within their coverage while driving.
An automatic-driving high-precision positioning and path planning method (application No. 201911027832.1) proposes deploying RSUs in different road scenes (urban roads and highways), coating the RSU housings with rectangular brightly coloured markers that also serve as targets for optical positioning, and achieving precise vehicle positioning with PCM optical ranging together with elevation-angle and azimuth-angle measurements.
It can be seen that the prior art does not address the change in perception accuracy caused by calibration errors and slow, slight displacement of the sensors of the roadside sensing equipment along a whole road section, which biases the perception data of the surrounding road environment provided to the networked vehicles.
Disclosure of Invention
In view of the problems raised in the background, the invention provides a roadside perception fusion optimization method based on vehicle-road cooperation, aimed at the technical difficulty that changes in perception accuracy caused by calibration errors and long-term slight displacement of the sensors of the roadside sensing equipment bias the perception data provided to networked vehicles about the surrounding road.
Because the vehicle end generally has combined high-precision GNSS, inertial navigation and high-precision map positioning and therefore high positioning accuracy, the vehicle-end position can be sent to the roadside unit as a truth value in the vehicle-road cooperation mode, and the perception accuracy of the roadside sensing equipment is determined by comparing the vehicle position acquired by the roadside sensing equipment with the positioning truth value fed back by the vehicle end. When the difference between the two is within a set range, all target positions perceived by the roadside sensing equipment are corrected from the vehicle-end positioning data with the factor-graph positioning method, improving the accuracy of the road target data perceived at the roadside, and the correction information is also sent to the cloud platform through the roadside edge computing unit (MEC); when the difference exceeds the set allowable error range, a data-anomaly warning is reported to the platform end, so that the roadside perception data are monitored.
A roadside perception fusion system based on vehicle-road cooperation comprises:
the on-board unit, which broadcasts vehicle information to the roadside unit by PC5 direct communication of vehicle-road cooperation;
the roadside units, which are installed at the roadside at certain intervals, receive the vehicle information broadcast by the on-board unit, and broadcast road and vehicle information to the outside;
the roadside sensing equipment, which acquires road and vehicle information and sends it to the roadside edge computing unit over a wired network;
the roadside edge computing units, which are distributed along the roadside and receive and process, over the wired network, the information sent by the roadside units and the roadside sensing equipment;
and the cloud control platform, which receives the structured road and vehicle data sent by the roadside edge computing units and monitors and warns of abnormal data.
An optimization method based on the roadside perception fusion system comprises the following steps (a decision sketch follows the list):
Step 1: while the networked vehicle drives on the road, the on-board unit broadcasts the vehicle information to the outside by PC5 communication of vehicle-road cooperation;
Step 2: when the vehicle enters the communication range of a roadside unit, the roadside unit receives the vehicle information broadcast by the on-board unit, takes the obtained real-time positioning information as the real-time positioning truth value of the vehicle, and sends it to the roadside edge computing unit;
Step 3: the roadside sensing equipment (radar, camera and similar devices) perceives and acquires road and vehicle information and sends it to the roadside edge computing unit;
Step 4: the roadside edge computing unit identifies, tracks and fuses the acquired road and vehicle information to obtain structured information, and the vehicle position perceived by the roadside sensing equipment is taken as the value to be checked;
Step 5: the roadside edge computing unit compares the positioning truth value that the on-board unit sent to the roadside unit with the perceived vehicle position to be checked;
Step 6: if the error between the positioning truth value and the roadside-perceived position to be checked is smaller than the set allowable error, the perception of the roadside sensing equipment is considered accurate; if the error exceeds the set allowable error but is still within the abnormal error, the position to be checked is corrected with the factor-graph method; and if the error reaches the abnormal error range, a warning is raised to the cloud platform.
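As a concrete illustration of the three-way decision in step 6, the following Python sketch compares the positioning truth value with the roadside-perceived position; the threshold values and function names are illustrative assumptions, since the patent leaves the concrete error limits to the deployment.

```python
from enum import Enum


class PerceptionStatus(Enum):
    ACCURATE = "roadside perception considered accurate"
    CORRECTED = "corrected with the factor-graph method"
    ABNORMAL = "data-anomaly warning raised to the cloud platform"


def check_roadside_perception(true_pos, perceived_pos,
                              allowable_error=0.5, abnormal_error=2.0):
    """Three-way decision of step 6 (thresholds in metres are assumed values)."""
    dx = perceived_pos[0] - true_pos[0]
    dy = perceived_pos[1] - true_pos[1]
    error = (dx * dx + dy * dy) ** 0.5
    if error < allowable_error:
        return PerceptionStatus.ACCURATE       # case 1: within the allowable error
    if error < abnormal_error:
        return PerceptionStatus.CORRECTED      # case 2: correct with the factor graph
    return PerceptionStatus.ABNORMAL           # case 3: report to the cloud platform
```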
Advantageous effects: compared with the prior art, the invention corrects, under vehicle-road cooperation, the error of the data acquired by the roadside sensing equipment with a factor-graph method, and thereby ensures the accuracy of the data that the roadside equipment broadcasts to the outside.
Drawings
FIG. 1: scene diagram of roadside sensing-equipment fusion optimization based on vehicle-road cooperation;
FIG. 2: flow chart of the roadside perception fusion optimization based on vehicle-road cooperation;
FIG. 3: flow chart of the factor-graph correction;
FIG. 4: relation diagram of the variable nodes and factor nodes.
Detailed Description
A specific embodiment of the present invention will be described in detail with reference to the accompanying drawings.
Referring to FIG. 1, the roadside perception fusion system based on vehicle-road cooperation comprises an on-board unit (OBU), roadside units (RSU), roadside edge computing units (MEC), roadside sensing equipment (radar, camera and the like) and a cloud control platform.
The on-board unit is installed in the vehicle and broadcasts vehicle information (BSM messages containing longitude and latitude, speed, heading angle, braking state, vehicle type and the like; a sketch of such a message follows this list) to the roadside unit by PC5 direct communication of vehicle-road cooperation;
the roadside units are installed at the roadside at certain intervals (generally 300-500 meters on urban roads), receive the vehicle information broadcast by the on-board unit, and broadcast road and vehicle information to the outside;
the roadside sensing equipment includes radar, cameras and similar devices; it acquires road and vehicle information and sends it to the roadside edge computing unit over a wired network;
the roadside edge computing units are distributed along the roadside and receive and process, over the wired network, the information sent by the roadside units and the roadside sensing equipment;
the cloud control platform receives the structured road and vehicle data sent by the roadside edge computing units and monitors and warns of abnormal data.
First, the vehicle end has combined high-precision GNSS, inertial navigation and high-precision map positioning, so its positioning accuracy is high, generally at centimetre level. In the present invention, the on-board unit (OBU) broadcasts its positioning data as a truth value over the PC5 communication system of vehicle-road cooperation.
When the networked vehicle drives into the communication range of a roadside unit (generally 300-500 m on urban roads), the on-board unit broadcasts its positioning truth value to the roadside unit. When the networked vehicle is at the same time within the perception range of the roadside sensing equipment, the sensed data are sent over the wired network to the roadside edge computing unit, which identifies and tracks the targets and generates structured target data. The vehicle attribute information acquired at the roadside, such as licence plate and body colour, is matched with the structured target information within the perception range to find the detection data corresponding to the networked vehicle in the roadside edge computing unit. After time synchronization by timestamp, the error caused by latency is removed (the vehicle-road cooperative communication delay is about 20 ms; at the urban maximum speed of 80 km/h the estimated position error is about 0.4 m, as checked in the sketch below), and the positioning truth value of the networked vehicle is compared with the perceived vehicle position to be checked. When the error between the truth value and the position to be checked is smaller than the set allowable error (which comprises the latency error, a reasonable calibration error, the target centre-point estimation error and so on), the perception of the roadside sensing equipment is considered accurate. When the error exceeds the set allowable error but is smaller than the maximum acceptable error, the position to be checked is corrected with the factor-graph method to improve the perception accuracy of the roadside sensing equipment.
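The 0.4 m figure quoted above follows from the stated 20 ms communication delay and the 80 km/h urban speed limit; a quick check of that arithmetic (values taken from the paragraph above):

```python
speed_kmh = 80.0                                   # maximum urban road speed
delay_s = 0.020                                    # vehicle-road communication delay
delay_error_m = speed_kmh / 3.6 * delay_s          # distance travelled during the delay
print(f"delay-induced position error = {delay_error_m:.2f} m")   # 0.44 m, i.e. about 0.4 m
```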
Referring to FIG. 2, the optimization method of the roadside perception fusion system provided by the invention comprises the following steps:
Step 1: while the networked vehicle with high-precision positioning capability drives on the road, the on-board unit broadcasts vehicle information (BSM messages containing longitude and latitude, speed, heading angle, braking state, vehicle type and the like) to the outside by PC5 communication of vehicle-road cooperation;
Step 2: when the networked vehicle enters the communication range of a roadside unit, the roadside unit receives the vehicle information broadcast by the on-board unit, takes the obtained real-time positioning information as the real-time position truth value of the vehicle, and sends it over the wired network to the roadside edge computing unit;
Step 3: the roadside sensing equipment perceives and acquires road and vehicle information such as video, lidar point clouds and millimetre-wave radar targets through the road sensors, and sends the sensor data representing this information over the wired network to the roadside edge computing unit;
Step 4: the roadside edge computing unit identifies, tracks and fuses the sensor data into structured information containing vehicle attribute features, position, speed, heading angle and so on, and the vehicle position perceived by the roadside sensing equipment is taken as the value to be checked;
Step 5: the roadside edge computing unit compares the vehicle positioning truth value that the on-board unit sent to the roadside unit with the perceived vehicle position to be checked;
Step 6: if the error between the positioning truth value and the roadside-perceived position to be checked is smaller than the set allowable error, the perception of the roadside sensing equipment is considered accurate; if the error exceeds the set allowable error but is still within the abnormal error, the position to be checked is corrected with the factor-graph method to improve the perception accuracy of the roadside sensing equipment; and if the error is larger and reaches the abnormal error range, a warning is raised to the cloud platform. The method is mainly aimed at the practical situation in which factors such as early calibration errors, loose installation and wind slightly shift the orientation of the sensing equipment, so that the sensor shows a certain deviation when perceiving target positions. In this embodiment the factor-graph method is used to correct the vehicle position acquired by the roadside sensing equipment and thus improve its perception accuracy. Referring to FIG. 3, the factor-graph method of step 6 comprises the following steps:
and S61, acquiring parameters required by the variable nodes and the factor nodes in the factor graph through the roadside sensor.
And S62, defining a variable node and a factor node.
The factor graph includes two types of nodes, namely a variable node and a factor node, in this embodiment, data sensed by the sensor is used as the variable node, and a probability relation between a data value sensed by the sensor and a vehicle positioning value and the like are used as the factor node.
The variable nodes are the vehicle poses x_i at times t_i (the perceived vehicle state, including position, velocity and heading). The pose parameters of the vehicle from t_1 to t_k are:

X_k = {x_i}, i = 1, 2, ..., k;  (1)

x_i = [p^T, v^T, θ]^T;  (2)

where p = [x, y, z] is the position vector of the vehicle, v = [v_x, v_y, v_z] is the vehicle velocity vector, and θ is the vehicle heading angle.

z_i denotes the vehicle positioning truth value received at time t_i; all vehicle positioning truth data received from t_1 to t_k are denoted Z_k:

Z_k = {z_i}, i = 1, 2, ..., k.  (3)
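For concreteness, the variable nodes x_i = [p^T, v^T, θ]^T of equation (2) and the truth values z_i of equation (3) could be held as plain arrays, as in the sketch below; the layout and the synthetic numbers are illustrative only and are not part of the patent.

```python
import numpy as np


def make_state(p, v, theta):
    """Variable node x_i = [p^T, v^T, theta]^T of equation (2): a 7-vector."""
    return np.concatenate([np.asarray(p, float),    # position (x, y, z)
                           np.asarray(v, float),    # velocity (vx, vy, vz)
                           [float(theta)]])         # heading angle

# X_k = {x_i}, i = 1..k (equation (1)); Z_k = {z_i}, i = 1..k (equation (3)).
X_k = [make_state((120.1 + i, 45.3, 0.0), (8.2, 0.1, 0.0), 1.57) for i in range(3)]
Z_k = [x[:3] + np.random.normal(0.0, 0.3, 3) for x in X_k]   # synthetic positioning truth values
```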
And S63, constructing a factor graph model based on the variable nodes and the factor nodes.
For a given set of vehicle positioning truth values Z, the most common way to estimate the probability of the perceived, unknown state variables X is maximum a posteriori (MAP) estimation, the standard estimation algorithm for factor-graph models:

X^MAP = argmax_X p(X | Z);  (4)

where p is the joint probability function.

For any factor graph, MAP inference reduces to maximizing the product of all factors:

X^MAP = argmax_X ∏_i φ_i(X_i);  (5)

where φ_i(X_i) is a factor proportional to the corresponding error term,

φ_i ∝ f_i;  (6)

f_i(X_i) ∝ exp( -(1/2) ‖ h_i(X_i) - z_i ‖²_{Σ_i} );  (7)

where h_i(X_i) is the perception (measurement) function of the perceived state variables, z_i is the actual vehicle positioning truth value, and Σ_i is the measurement noise covariance.

Under the Gaussian noise model assumption, mathematical transformation of the above turns MAP inference into the minimization of a nonlinear least-squares cost:

X^MAP = argmin_X ∑_i ‖ h_i(X_i) - z_i ‖²_{Σ_i}.  (8)

The nonlinear least-squares problem is then linearized by a Taylor expansion of h_i(X_i) around a linearization point X_i⁰:

h_i(X_i) ≈ h_i(X_i⁰) + H_i Δ_i;  (9)

where H_i is the observation Jacobian of h_i(·) evaluated at the given linearization point X_i⁰,

H_i = ∂h_i(X_i) / ∂X_i |_{X_i = X_i⁰},

and Δ_i = X_i - X_i⁰ is the state update vector.

Substituting (9) into (8) converts the problem into

Δ* = argmin_Δ ∑_i ‖ H_i Δ_i - b_i ‖²_{Σ_i};  (10)

where

b_i = z_i - h_i(X_i⁰)  (11)

is the difference between the truth value and the perceived value at the linearization point. Converting the Mahalanobis norm into a 2-norm gives:

Δ* = argmin_Δ ∑_i ‖ A_i Δ_i - b_i ‖²_2;  (12)

where:

A_i = Σ_i^{-1/2} H_i;  (13)

b_i = Σ_i^{-1/2} ( z_i - h_i(X_i⁰) ).  (14)
and S64, correcting the road side sensor sensing data value according to the factor graph model.
In fact, the optimization equation (12) is actually solving the least squares problem, i.e. solving AiInverse of (A)iIs a reversible matrix ofEach row represents a factor, each column represents a variable contained in the factor, and A is solvediThe inverse of (A) can solve XMAPI.e. find fiAnd further error correction is performed on the error within the tolerance.
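A compact numerical sketch of what solving equation (12) amounts to: each factor is whitened with Σ_i^{-1/2} as in (13)-(14), the blocks are stacked (rows are factors, columns are state variables), and the linear least-squares problem is solved for the update Δ. The measurement function, Jacobian and covariances below are simple placeholders, and numpy's pseudo-inverse stands in for whatever factorization an implementation would actually use.

```python
import numpy as np


def whiten(H_i, r_i, Sigma_i):
    """Equations (13)-(14): A_i = Sigma_i^{-1/2} H_i, b_i = Sigma_i^{-1/2} (z_i - h(x_i^0))."""
    W = np.linalg.cholesky(np.linalg.inv(Sigma_i)).T   # W^T W = Sigma_i^{-1}
    return W @ H_i, W @ r_i


def gauss_newton_step(x0, measurements, h, jacobian):
    """One linearized correction step: Delta* = argmin sum_i ||A_i Delta - b_i||_2^2, eq. (12)."""
    A_blocks, b_blocks = [], []
    for z_i, Sigma_i in measurements:
        H_i = jacobian(x0)              # observation Jacobian at the linearization point
        r_i = z_i - h(x0)               # prediction error b_i of equation (11), unwhitened
        A_i, b_i = whiten(H_i, r_i, Sigma_i)
        A_blocks.append(A_i)
        b_blocks.append(b_i)
    A = np.vstack(A_blocks)             # each row a factor, each column a state variable
    b = np.concatenate(b_blocks)
    delta = np.linalg.pinv(A) @ b       # least-squares solution of equation (12)
    return x0 + delta                   # corrected state estimate


# Toy usage: a 2-D position state observed directly, so h(x) = x and the Jacobian is I.
h = lambda x: x
J = lambda x: np.eye(2)
truth_values = [(np.array([10.4, 5.1]), 0.09 * np.eye(2)),   # OBU truth values and covariances
                (np.array([10.5, 5.0]), 0.09 * np.eye(2))]
x_corrected = gauss_newton_step(np.array([11.0, 4.6]), truth_values, h, J)
```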
The specific relation between the variable nodes and the factor nodes is shown in FIG. 4. Because the data acquired by the roadside sensing equipment accumulate error over time and drift under external conditions, the sensor data values are corrected through the fusion-localization factors attached via the added nodes z_i, while f_prior is a unary factor related only to the initial state x_1. In this way, under vehicle-road cooperation, the error of the data acquired by the roadside sensing equipment is corrected with the factor-graph method and the accuracy of the data acquired at the roadside is maintained.
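A minimal way to picture the FIG. 4 structure in code, assuming only what the description states (one unary prior factor on x_1 and one fusion-localization factor per received truth value z_i); the dictionary layout and function name are illustrative, not part of the patent.

```python
def build_factor_graph(k):
    """Variable nodes x_1..x_k, a unary prior on x_1, one fusion factor per truth value z_i."""
    variables = [f"x_{i}" for i in range(1, k + 1)]
    factors = [{"name": "f_prior", "connects": ["x_1"]}]                 # unary prior factor
    factors += [{"name": f"f_fusion_{i}", "connects": [f"x_{i}"],        # z_i corrects x_i
                 "measurement": f"z_{i}"} for i in range(1, k + 1)]
    return {"variables": variables, "factors": factors}
```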
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (3)

1. A roadside perception fusion optimization method based on vehicle-road cooperation, characterized by comprising the following steps:
Step 1: while the networked vehicle drives on the road, the on-board unit broadcasts the vehicle information to the outside by PC5 communication of vehicle-road cooperation;
Step 2: when the vehicle enters the communication range of a roadside unit, the roadside unit receives the vehicle information broadcast by the on-board unit, takes the obtained real-time positioning information as the real-time positioning truth value of the vehicle, and sends it to the roadside edge computing unit;
Step 3: the roadside sensing equipment (radar, camera and similar devices) perceives and acquires road and vehicle information and sends it to the roadside edge computing unit;
Step 4: the roadside edge computing unit identifies, tracks and fuses the information perceived by the sensing equipment to obtain structured vehicle information, and the vehicle position perceived by the roadside sensing equipment is taken as the value to be checked;
Step 5: the roadside edge computing unit compares the positioning truth value that the on-board unit sent to the roadside unit with the perceived vehicle position to be checked;
Step 6: if the error between the positioning truth value and the roadside-perceived position to be checked is smaller than the set allowable error, the perception of the roadside sensing equipment is considered accurate; if the error exceeds the set allowable error but is still within the abnormal error, the position to be checked is corrected with the factor-graph method; and if the error reaches the abnormal error range, a warning is raised to the cloud platform.
2. The roadside perception fusion optimization method according to claim 1, wherein the factor-graph method in step 6 is as follows:
the factor graph contains two types of nodes, variable nodes and factor nodes; the perceived data are the variable nodes, and the probabilistic relations between the perceived data values and the vehicle positioning truth values are the factor nodes;
the variable nodes are the vehicle poses x_i; the pose parameters of the vehicle from t_1 to t_k are:

X_k = {x_i}, i = 1, 2, ..., k;  (1)

x_i = [p^T, v^T, θ]^T;  (2)

where p = [x, y, z] is the position vector of the vehicle, v = [v_x, v_y, v_z] is the vehicle velocity vector, and θ is the vehicle heading angle;
z_i denotes the vehicle positioning truth value received at time t_i; all positioning truth data received from t_1 to t_k are denoted Z_k:

Z_k = {z_i}, i = 1, 2, ..., k;  (3)
for the given vehicle positioning truth data Z, maximum a posteriori estimation is used for the probability estimation of the perceived unknown state variables X:

X^MAP = argmax_X p(X | Z),  (4)

where p is the joint probability function;
for any factor graph, MAP inference reduces to maximizing the product of all factors:

X^MAP = argmax_X ∏_i φ_i(X_i),  (5)

where φ_i(X_i) is a factor proportional to the corresponding error term,

φ_i ∝ f_i,  (6)

f_i(X_i) ∝ exp( -(1/2) ‖ h_i(X_i) - z_i ‖²_{Σ_i} ),  (7)

where h_i(X_i) is the perception function of the perceived state variables, z_i is the actual vehicle positioning truth value, and Σ_i is the measurement noise covariance;
under the Gaussian noise model assumption, the formula is mathematically transformed and MAP inference becomes the minimization of a nonlinear least-squares cost:

X^MAP = argmin_X ∑_i ‖ h_i(X_i) - z_i ‖²_{Σ_i};  (8)

the nonlinear least-squares problem is linearized by a Taylor expansion of h_i(X_i) around a linearization point X_i⁰:

h_i(X_i) ≈ h_i(X_i⁰) + H_i Δ_i,  (9)

where H_i is the observation Jacobian of h_i(·) at the given linearization point X_i⁰,

H_i = ∂h_i(X_i) / ∂X_i |_{X_i = X_i⁰},

and Δ_i = X_i - X_i⁰ is the state update vector;
equation (8) is thereby converted into:

Δ* = argmin_Δ ∑_i ‖ H_i Δ_i - b_i ‖²_{Σ_i},  (10)

where

b_i = z_i - h_i(X_i⁰)  (11)

is the difference between the truth value and the perceived value at the linearization point; converting the Mahalanobis norm into a 2-norm yields:

Δ* = argmin_Δ ∑_i ‖ A_i Δ_i - b_i ‖²_2,  (12)

where:

A_i = Σ_i^{-1/2} H_i,  (13)

b_i = Σ_i^{-1/2} ( z_i - h_i(X_i⁰) );  (14)

equation (12) is in fact a least-squares problem, i.e. the inverse of A_i is solved; A_i is an invertible matrix in which each row represents a factor and each column represents a variable contained in that factor; solving the inverse of A_i yields X^MAP, i.e. f_i is obtained, and the error is corrected within the tolerance.
3. A roadside perception fusion system based on vehicle-road cooperation, characterized by comprising:
the on-board unit, which broadcasts vehicle information to the roadside unit by PC5 direct communication of vehicle-road cooperation;
the roadside units, which are installed at the roadside at certain intervals, receive the vehicle information broadcast by the on-board unit, and broadcast road and vehicle information to the outside;
the roadside sensing equipment, which acquires road and vehicle information by perception and sends it to the roadside edge computing unit over a wired network;
the roadside edge computing units, which are distributed along the roadside and receive and process, over the wired network, the information sent by the roadside units and the roadside sensing equipment;
and the cloud control platform, which receives the structured road and vehicle data sent by the roadside edge computing units and monitors and warns of abnormal data.
CN202111548249.2A 2021-12-17 2021-12-17 Roadside perception fusion system based on vehicle-road cooperation and optimization method Pending CN114173307A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111548249.2A CN114173307A (en) 2021-12-17 2021-12-17 Roadside perception fusion system based on vehicle-road cooperation and optimization method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111548249.2A CN114173307A (en) 2021-12-17 2021-12-17 Roadside perception fusion system based on vehicle-road cooperation and optimization method

Publications (1)

Publication Number Publication Date
CN114173307A true CN114173307A (en) 2022-03-11

Family

ID=80487363

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111548249.2A Pending CN114173307A (en) 2021-12-17 2021-12-17 Roadside perception fusion system based on vehicle-road cooperation and optimization method

Country Status (1)

Country Link
CN (1) CN114173307A (en)


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114596706A (en) * 2022-03-15 2022-06-07 阿波罗智联(北京)科技有限公司 Detection method and device of roadside sensing system, electronic equipment and roadside equipment
CN114596706B (en) * 2022-03-15 2024-05-03 阿波罗智联(北京)科技有限公司 Detection method and device of road side perception system, electronic equipment and road side equipment
CN114792469A (en) * 2022-04-06 2022-07-26 大唐高鸿智联科技(重庆)有限公司 Method and device for testing sensing system and testing equipment
CN114915940A (en) * 2022-05-13 2022-08-16 山东高速建设管理集团有限公司 Vehicle-road communication link matching method and system based on edge cloud computing
CN115188187A (en) * 2022-07-05 2022-10-14 浙江嘉兴数字城市实验室有限公司 Roadside perception data quality monitoring system and method based on vehicle-road cooperation
CN115294771A (en) * 2022-09-29 2022-11-04 智道网联科技(北京)有限公司 Monitoring method and device for road side equipment, electronic equipment and storage medium
CN115776506A (en) * 2023-02-07 2023-03-10 深圳开鸿数字产业发展有限公司 Vehicle-road cooperative data fusion system and method
CN116304994A (en) * 2023-05-22 2023-06-23 浙江交科交通科技有限公司 Multi-sensor target data fusion method, device, equipment and storage medium
CN116304994B (en) * 2023-05-22 2023-09-15 浙江交科交通科技有限公司 Multi-sensor target data fusion method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN114173307A (en) Roadside perception fusion system based on vehicle-road cooperation and optimization method
US11402841B1 (en) Approach for consolidating observed vehicle trajectories into a single representative trajectory
US11940539B2 (en) Camera-to-LiDAR calibration and validation
CN109556615B (en) Driving map generation method based on multi-sensor fusion cognition of automatic driving
CN111583630B (en) Brand-new road high-precision map rapid generation system and method based on space-time trajectory reconstruction
EP3644294B1 (en) Vehicle information storage method, vehicle travel control method, and vehicle information storage device
US9460622B1 (en) Approach for estimating the geometry of roads and lanes by using vehicle trajectories
CN109084786B (en) Map data processing method
Yoneda et al. Vehicle localization using 76GHz omnidirectional millimeter-wave radar for winter automated driving
US10384684B2 (en) Information processing apparatus
CN111653113A (en) Method, device, terminal and storage medium for determining local path of vehicle
CN107316457B (en) Method for judging whether road traffic condition accords with automatic driving of automobile
CN110920617A (en) Vehicle travel control system
CN112829753A (en) Millimeter-wave radar-based guardrail estimation method, vehicle-mounted equipment and storage medium
DE102022100068A1 (en) CONTROL OF VEHICLE PERFORMANCE BASED ON DATA RELATED TO AN ATMOSPHERIC CONDITION
DE102021132096A1 (en) VEHICLE LOCATION USING COMBINED INPUTS OF REDUNDANT LOCATION PIPELINES
CN113487915A (en) Unmanned aerial vehicle-based flight service supervision system and method
CN116572995B (en) Automatic driving method and device of vehicle and vehicle
CN112835029A (en) Unmanned-vehicle-oriented multi-sensor obstacle detection data fusion method and system
CN113227831A (en) Guardrail estimation method based on multi-sensor data fusion and vehicle-mounted equipment
CN115166721A (en) Radar and GNSS information calibration fusion method and device in roadside sensing equipment
Lee et al. Infrastructure Node-based Vehicle Localization for Autonomous Driving
DE102022104054A1 (en) THE VEHICLE CONDITION ESTIMATION IMPROVES SENSOR DATA FOR VEHICLE CONTROL AND AUTONOMOUS DRIVING
JP7298882B2 (en) Vehicle self-localization device and vehicle
CN115082562A (en) External parameter calibration method, device, equipment, server and vehicle-mounted computing equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination