
Target matching method, device and computer equipment

Info

Publication number: CN117649777A
Application number: CN202410097585.7A
Granted publication: CN117649777B
Authority: CN (China)
Prior art keywords: target, vehicle, state information, speed, difference value
Legal status: Granted; Active
Other languages: Chinese (zh)
Inventors: 胡汇泽, 杨唐涛, 王邓江, 殷旭梁
Current and original assignee: Suzhou Wanji Iov Technology Co ltd
Application filed by Suzhou Wanji Iov Technology Co ltd; priority to CN202410097585.7A

Landscapes

  • Traffic Control Systems (AREA)

Abstract

The application relates to a target matching method, a target matching device and computer equipment. The method comprises: acquiring vehicle state information acquired by a vehicle at the current moment and target state information of at least one target at a plurality of moments acquired by road side equipment; determining an allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and a preset acquisition difference duration between the vehicle and the road side equipment; and then determining a matching target of the vehicle from the targets according to the measured distance and the allowable maximum distance between each target and the vehicle. The method improves the accuracy of target matching between the intelligent network-connected vehicle and the road side system.

Description

Target matching method, device and computer equipment
Technical Field
The present disclosure relates to the field of intelligent driving technologies, and in particular, to a target matching method, device, and computer device.
Background
With the continuous development of intelligent driving technology, intelligent internet-connected vehicles can realize higher-level automatic driving in various scenes.
In the related art, the road perception capability of the intelligent network-connected vehicle can be improved through vehicle-road cooperation technology. In this case, the intelligent network-connected vehicle needs to be mapped into the road side system, that is, the intelligent network-connected vehicle needs to be matched with a target acquired by the road side equipment of the road side system.
However, in the related art, the perceived time difference between the road side equipment and the intelligent network-connected vehicle may be large, resulting in inaccurate matching between the intelligent network-connected vehicle and the target in the road side system.
Disclosure of Invention
Based on the above, it is necessary to provide a target matching method, device and computer equipment, which can improve the accuracy of target matching in an intelligent network-connected vehicle and a road side system.
In a first aspect, the present application provides a target matching method, including:
acquiring vehicle state information of a vehicle at the current moment and target state information of at least one target at a plurality of moments acquired by road side equipment;
determining the allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and the preset acquisition difference time length between the vehicle and the road side equipment;
and determining a matching target of the vehicle from the targets according to the measured distance and the allowable maximum distance between the targets and the vehicle.
In one embodiment, obtaining target state information of at least one target at a plurality of moments collected by a roadside device includes:
acquiring a plurality of candidate moments corresponding to the current moment according to a preset time interval;
Screening at least one candidate target meeting preset conditions from at least one candidate target at a plurality of candidate moments acquired by road side equipment according to vehicle state information;
and determining the target state information of at least one candidate target meeting the preset condition as the target state information of at least one target at a plurality of moments acquired by the road side equipment.
In one embodiment, the time interval includes a maximum time interval and a minimum time interval, and acquiring, according to the preset time interval, a plurality of candidate moments corresponding to the current moment includes:
and determining a plurality of candidate moments corresponding to the current moment according to the maximum time interval and the minimum time interval.
In one embodiment, according to vehicle state information, screening at least one candidate object meeting a preset condition from at least one candidate object at a plurality of candidate moments collected by road side equipment includes:
spatially aligning the vehicle state information with target state information of each candidate target;
and screening candidate targets meeting preset conditions from the candidate targets according to the aligned vehicle state information and the target state information.
In one embodiment, the aligned vehicle state information includes vehicle coordinates, vehicle speed, and vehicle heading angle; the target state information comprises target coordinates, target speed and target course angle; screening candidate targets meeting preset conditions from the candidate targets according to the aligned vehicle state information and the target state information, wherein the candidate targets comprise:
According to the coordinates of each target and the coordinates of the vehicle, respectively determining a coordinate difference value and an actual measurement distance between each candidate target and the vehicle;
determining a speed difference between each candidate target and the vehicle according to the target speeds and the vehicle speeds;
determining a heading angle difference value between each candidate target and the vehicle according to the heading angles of the targets and the heading angles of the vehicle;
and obtaining candidate targets of which the coordinate difference value, the measured distance, the speed difference value and the course angle difference value all meet preset conditions.
In one embodiment, determining the allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and the preset acquisition difference duration between the vehicle and the road side equipment comprises:
aiming at any one target, determining a distance difference value, a speed difference value and a course angle difference value between the target and the vehicle according to target state information, vehicle state information and acquisition difference time length of the target;
and determining the allowable maximum distance between the target and the vehicle according to the distance difference value, the speed difference value and the course angle difference value.
In one embodiment, the target state information includes a target speed; the vehicle state information includes a vehicle speed; determining a distance difference value and a speed difference value between the target and the vehicle according to the target state information of the target, the vehicle state information and the acquired difference time length, wherein the method comprises the following steps of:
According to the historical target speed and the target speed of the target, determining the historical average target speed of the target;
determining a distance difference value according to the historical average speed and the acquired difference duration;
determining a historical average vehicle speed of the vehicle based on the historical vehicle speed of the vehicle and the vehicle speed;
and determining a speed difference value according to the historical average target speed, the historical average vehicle speed and the acquired difference duration.
In one embodiment, the target state information includes a target length and a target heading angle; the vehicle state information comprises a vehicle course angle, and the course angle difference value between the target and the vehicle is determined according to the target state information of the target, the vehicle state information and the acquisition difference time length, and the method comprises the following steps:
according to the historical target course angle and the target course angle of the target, determining the historical average target course angle of the target;
determining a historical average vehicle course angle of the vehicle according to the historical vehicle course angle and the vehicle course angle of the vehicle;
and determining a course angle difference value according to the historical average target course angle, the historical average vehicle course angle and the target length.
In one embodiment, determining the allowable maximum distance between the target and the vehicle based on the distance difference value, the speed difference value, and the heading angle difference value comprises:
Acquiring a distance weight, a speed weight and a course angle weight between a target and a vehicle;
and determining the allowable maximum distance according to the distance difference value, the speed difference value, the course angle difference value, the distance weight, the speed weight and the course angle weight.
In one embodiment, obtaining a distance weight, a speed weight, and a heading angle weight between a target and a vehicle includes:
determining a position function relation according to the distance difference value and the distance weight parameter, the speed difference value and the speed weight parameter, the course angle difference value and the course angle weight parameter and the position difference value between a preset target and the vehicle;
determining a weight function relation according to the distance weight parameter, the speed weight parameter and the course angle weight parameter;
and solving the position function relation according to the weight function relation to obtain a distance weight, a speed weight and a course angle weight.
In one embodiment, the position difference comprises a first position difference and a second position difference; solving the position function relation according to the weight function relation to obtain a distance weight, a speed weight and a course angle weight, wherein the method comprises the following steps:
detecting whether the vehicle and the target are in the same lane;
Under the condition that the vehicle and the target are in the same lane, solving the position function relation according to the first position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight;
and under the condition that the vehicle and the target are in different lanes, solving the position function relation according to the second position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight.
In one embodiment, the vehicle state information includes vehicle coordinates; the target state information includes target coordinates; the method further comprises the steps of:
according to the vehicle coordinates of the vehicle, acquiring a local high-precision map of the position of the vehicle;
according to the local high-precision map, lane information of the vehicle and the target is obtained;
under the condition that the lane information of the vehicle and the target are consistent, determining that the vehicle and the target are in the same lane;
in the case where the lane information of the vehicle and the target are inconsistent, it is determined that the vehicle and the target are in different lanes.
In one embodiment, the vehicle state information includes vehicle coordinates and the target state information includes target coordinates; determining a matching target of the vehicle from the targets based on the measured distance and the allowed maximum distance between the targets and the vehicle, comprising:
Determining the actual measurement distance between each target and the vehicle according to the target coordinates and the vehicle coordinates of each target;
obtaining a reference target with the measured distance smaller than the corresponding allowable maximum distance from each target;
in the case that the reference target is one, determining the reference target as a matching target of the vehicle;
in the case where there are a plurality of reference targets, the reference target having the smallest distance is determined as the matching target of the vehicle.
In a second aspect, the present application further provides an object matching apparatus, including:
the acquisition module is used for acquiring vehicle state information of a vehicle at the current moment and target state information of at least one target at a plurality of moments acquired by road side equipment;
the first determining module is used for determining the allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and the preset acquisition difference duration between the vehicle and the road side equipment;
and the second determining module is used for determining a matched target of the vehicle from the targets according to the measured distance and the allowable maximum distance between the targets and the vehicle.
In a third aspect, embodiments of the present application provide a computer device comprising a memory and a processor, the memory storing a computer program, the processor implementing the steps of the method provided by any of the embodiments of the first aspect described above when the computer program is executed.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the method provided by any of the embodiments of the first aspect described above.
In a fifth aspect, embodiments of the present application also provide a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method provided by any of the embodiments of the first aspect described above.
The target matching method, the device and the computer equipment acquire the vehicle state information of the current moment acquired by the vehicle and the target state information of at least one target at a plurality of moments acquired by the road side equipment, determine the allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and the preset acquisition difference time length between the vehicle and the road side equipment, and then determine the matching target of the vehicle from each target according to the actually measured distance and the allowable maximum distance between each target and the vehicle. In the method, since the acquired difference duration can represent the maximum time difference between the vehicle and the road side equipment, the allowable maximum distance between the target and the vehicle can be determined through the vehicle state information and the target state information of the target and the acquired difference duration; and because the state information of the targets and the vehicles is considered when the allowable maximum distance between the targets and the vehicles is determined, the allowable maximum distance between each target and the vehicles is more accurate and reasonable, and the accuracy of matching the intelligent network-connected vehicles with the targets in the road side system is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the related art, the drawings that are required to be used in the embodiments or the related technical descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to the drawings without inventive effort for a person having ordinary skill in the art.
FIG. 1 is an application environment diagram of a target matching method in one embodiment;
FIG. 2 is a flow diagram of a target matching method in one embodiment;
FIG. 3 is a flow chart of a target matching method according to another embodiment;
FIG. 4 is a flow chart of a target matching method according to another embodiment;
FIG. 5 is a flow chart of a target matching method according to another embodiment;
FIG. 6 is a flow chart of a target matching method according to another embodiment;
FIG. 7 is a flow chart of a target matching method according to another embodiment;
FIG. 8 is a flow chart of a target matching method according to another embodiment;
FIG. 9 is a flow chart of a target matching method according to another embodiment;
FIG. 10 is a flow chart of a target matching method according to another embodiment;
FIG. 11 is a flow chart of a target matching method according to another embodiment;
FIG. 12 is a flow chart of a target matching method according to another embodiment;
FIG. 13 is a flow chart of a target matching method according to another embodiment;
FIG. 14 is a flow chart of a target matching method according to another embodiment;
FIG. 15 is a flow chart of a target matching method according to another embodiment;
FIG. 16 is a flow chart of a target matching method according to another embodiment;
FIG. 17 is a block diagram of an object matching device in one embodiment;
fig. 18 is an internal structural view of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The target matching method provided by the embodiment of the application can be applied to the application environment shown in fig. 1. As shown in fig. 1, the vehicle 101 and the roadside device 102 each perform wired or wireless communication with the roadside system 103 through a network; the road side system may be integrated in a Road Side Unit (RSU) or a Mobile Edge Computing (MEC) unit on the road side, may be integrated in a computer device, or may be deployed on a cloud or other network server.
The road side system may be a road side computing unit/terminal/edge server, optionally, the road side system may also be a cloud server, a vehicle-mounted computing unit/terminal of a vehicle end, and the like. The sensing device may be a road side sensing device disposed at an intersection, for example, may be an intelligent base station (also called a road side base station) at the intersection, or may also be at least one of a millimeter wave radar sensor, a laser radar sensor, a camera, and the like, and the type of the road side device is not particularly limited herein; the roadside device may include a plurality of devices.
With the continuous development of intelligent driving technology, intelligent driving systems can realize higher-level automatic driving in specific scenes (such as highways). However, in urban scenes, due to complex road conditions, there are problems such as heavy occlusion, large blind areas, and changeable driving intentions of vulnerable traffic participants (pedestrians and non-motor vehicles), which limit the development of single-vehicle intelligence. Meanwhile, road side equipment and technology are advancing continuously, and communication between the RSU of the road side system and the On Board Unit (OBU) is mature, so an Intelligent and Connected Vehicle (ICV) combined with road side sensing technology can alleviate these problems, that is, the vehicle-side sensing capability is improved by using vehicle-road cooperation technology.
Taking urban road/intersection scene as an example, a plurality of road side devices (sensing devices such as laser radar, millimeter wave radar, cameras and the like, and road side systems such as RSU, MEC and the like) are arranged in the road side, and all traffic participants in the area can be subjected to fusion tracking to acquire the state information of the traffic participants; and a plurality of intelligent network vehicles with OBUs run in the area and can exchange information with the RSU.
However, because the perception range of the road side equipment is large, the calculation objects of vehicle-road cooperation are all traffic participants in the road side system; since it cannot be known in advance which of these targets is the intelligent network-connected vehicle, the intelligent network-connected vehicle needs to be mapped into the road side system before the decision, planning and control of subsequent vehicle-road cooperative automatic driving can be performed.
The intelligent network-connected vehicle with the OBU can send its vehicle state information to the road side system. However, the road side equipment of the road side system and the vehicle-side OBU acquire state information at different times (for example, a certain intelligent driving vehicle acquires its own state at 12:15:20.001, the road side equipment perceives the target at 12:15:20.051, and the OBU sends the vehicle state to the road side system at 12:15:20.081), so the times are difficult to synchronize, which causes the problem that the matching between the intelligent network-connected vehicle and the target in the road side system is inaccurate.
Based on the above, the embodiment of the application provides a target matching method, which can solve the problem that the target acquired in the intelligent driving vehicle and the road side equipment in the road side system is not accurately matched under the condition that the time of the vehicle end and the time of the road side equipment are greatly different.
In an exemplary embodiment, as shown in fig. 2, there is provided a target matching method, which is illustrated by using the method applied to the road side system in fig. 1 as an example, and includes the following steps:
s201, acquiring vehicle state information of a vehicle at the current moment acquired by the vehicle and target state information of at least one target at a plurality of moments acquired by road side equipment.
The vehicle may be any vehicle that travels on a road.
The mode of acquiring the vehicle state information of the vehicle at the current moment can be that a plurality of sensors are installed on the vehicle, the plurality of sensors can acquire the vehicle state information of the vehicle, and the vehicle state information is sent to a road side system through an on-board unit on the vehicle. The current time may be any time at which vehicle state information of the vehicle is received.
The road side system can determine the currently received vehicle state information as the vehicle state information of the vehicle at the current moment acquired by the vehicle; the vehicle state information may include information such as a vehicle position and a vehicle speed.
The road side equipment can be a sensing equipment and can be a smart base station, a millimeter wave radar sensor or a laser radar sensor and the like.
The road side device is capable of collecting target state information for all targets within a perceived range in the road, wherein the targets may be any vehicle within the coverage area of the road side device in the road when passing through the road, and the targets include, but are not limited to, cars, trucks, electric vehicles, bicycles, tricycles, scooters, pedestrians, and the like.
In practical application, the road side device arranged in the road can sense the targets passing through the road in real time so as to acquire the target state information of all the targets in the road and upload the target state information of all the targets to the road side system. Wherein, the plurality of time instants may be time instants determined with reference to the current time instant; for example, all the time instants within the current time instant chronological time range are determined as a plurality of time instants.
Alternatively, taking the current time as the third time as an example, the first time, the second time, the third time, the fourth time, and the fifth time may be determined as a plurality of times; and acquiring vehicle state information acquired by the vehicle at a third moment and target state information of all targets acquired by the road side equipment at all moments from the first moment to the fifth moment.
S202, determining the allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and the preset acquisition difference time length between the vehicle and the road side equipment.
Because the time for collecting the state information of the road side equipment and the vehicle is different, the allowable maximum distance between each object and the vehicle can be calculated through the collecting difference time length of the road side equipment and the vehicle, the object state information of each object collected by the road side equipment and the vehicle state information collected by the vehicle.
The acquisition difference duration can be the maximum acquisition time difference allowed between the road side equipment and the vehicle; for example, the acquisition difference duration may be 200 ms. The preset acquisition difference duration between the vehicle and the road side equipment can be a value set in advance according to historical experience, or can be calculated from a plurality of simulation tests.
In one embodiment, the target state information and the vehicle state information of each target and the acquired difference time length can be input into a preset distance calculation model, and the allowable maximum distance between each target and the vehicle output by the distance calculation model is obtained through analysis of the distance calculation model.
In another embodiment, the allowable maximum distance between each target and the vehicle may be calculated according to a preset distance calculation formula; specifically, for any one of the targets, the state information of the target and the state information of the vehicle are substituted into a distance calculation formula, and the allowable maximum distance between the target and the vehicle is obtained through calculation, so that the allowable maximum distance between each target and the vehicle is determined.
S203, determining a matching target of the vehicle from the targets according to the measured distance and the allowable maximum distance between the targets and the vehicle.
The allowable maximum distance between a target and the vehicle represents the largest distance by which the target and the vehicle may be separated while still being regarded as the same object observed at slightly different times. Thus, a matching target of the vehicle can be determined from the targets based on the measured distance and the allowable maximum distance between each target and the vehicle. The matching target of the vehicle indicates the vehicle as collected by the road side equipment at the current moment, that is, the vehicle and the matching target are the same vehicle.
Specifically, determining the actual measurement distance between each target and the vehicle according to the target state information and the vehicle state information of each target; and then directly inputting the measured distance and the allowable maximum distance between each target and the vehicle into a pre-trained matching model, and respectively matching each target with the vehicle through the matching model according to the measured distance and the allowable maximum distance between each target and the vehicle, so as to finally obtain the matching target of the vehicle output by the matching model.
In another embodiment, the vehicle state information includes vehicle coordinates and the target state information includes target coordinates; as shown in fig. 3, a matching target of a vehicle is determined from among targets according to a measured distance and an allowable maximum distance between each target and the vehicle, comprising the steps of:
s301, determining the actual measurement distance between each target and the vehicle according to the target coordinates and the vehicle coordinates of each target.
Wherein, for any one of the targets, the target coordinates of the target may represent the position of the target in the road, and the vehicle coordinates may represent the position of the vehicle in the road, and therefore, the measured distance between the target and the vehicle may be determined from the target coordinates of the target and the vehicle coordinates.
The measured distance between the target and the vehicle may be calculated according to equation (1).
d_m = \sqrt{(x_v - x_t)^2 + (y_v - y_t)^2}    (1)

Wherein, d_m represents the measured distance between the target and the vehicle, x_v represents the abscissa of the vehicle, y_v represents the ordinate of the vehicle, x_t represents the abscissa of the target, and y_t represents the ordinate of the target.
S302, obtaining a reference target with the measured distance smaller than the corresponding allowable maximum distance from each target.
Based on the above mode, the measured distance and the allowed maximum distance between each target and the vehicle are sequentially calculated, then the measured distance and the allowed maximum distance of the targets are compared, and the target with the measured distance smaller than the allowed maximum distance is taken as the reference target.
Based on the above, the reference targets can be selected from among the targets.
S303, in the case that there is one reference target, determining the reference target as the matching target of the vehicle, and in the case that there are a plurality of reference targets, determining the reference target with the smallest measured distance as the matching target of the vehicle.
If there is only one reference target, that is, only one of the targets meets the screening condition, the reference target is determined to be the matching target of the vehicle.
If there are a plurality of reference targets, that is, a plurality of targets meet the screening condition, the reference target having the smallest measured distance among the reference targets may be determined as the matching target of the vehicle.
Alternatively, in the case that the number of the reference targets is plural, the matching targets of the vehicle may be screened from the reference targets according to the preset screening conditions according to the target state information of the reference targets.
According to the embodiment of the application, according to the target coordinates and the vehicle coordinates of each target, determining the actual measurement distance between each target and the vehicle, and acquiring a reference target with the actual measurement distance smaller than the corresponding allowable maximum distance from each target; in the case that the reference target is one, determining the reference target as a matching target of the vehicle; in the case where there are a plurality of reference targets, the reference target having the smallest distance is determined as the matching target of the vehicle. In the embodiment, the actual measured distance between the target and the vehicle is compared with the allowable maximum distance, and the target with the actual measured distance larger than the allowable maximum distance is screened out, so that the accuracy of target matching of the vehicle is improved.
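As a non-limiting illustration of the matching procedure of S301 to S303, the following Python sketch shows one way a road side system might implement it; the names Target, measured_distance and select_matching_target, as well as the data layout, are hypothetical and are not identifiers from the patent.

import math
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Target:
    x: float            # target abscissa in the radar coordinate system (m)
    y: float            # target ordinate in the radar coordinate system (m)
    allowed_max: float  # allowable maximum distance computed for this target (m)

def measured_distance(target: Target, vx: float, vy: float) -> float:
    # Equation (1): Euclidean distance between the target and vehicle coordinates
    return math.hypot(vx - target.x, vy - target.y)

def select_matching_target(targets: List[Target], vx: float, vy: float) -> Optional[Target]:
    # S301/S302: keep the reference targets whose measured distance is smaller
    # than their corresponding allowable maximum distance
    refs = [(measured_distance(t, vx, vy), t) for t in targets]
    refs = [(d, t) for d, t in refs if d < t.allowed_max]
    if not refs:
        return None  # no target matches the vehicle
    # S303: with one reference target it is the match; with several, take the
    # reference target with the smallest measured distance
    return min(refs, key=lambda pair: pair[0])[1]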
In the target matching method provided by the embodiment of the application, the vehicle state information of the current moment acquired by the vehicle and the target state information of at least one target at a plurality of moments acquired by the road side equipment are acquired, the allowable maximum distance between each target and the vehicle is determined according to the target state information of each target, the vehicle state information and the preset acquisition difference duration between the vehicle and the road side equipment, and then the matching target of the vehicle is determined from each target according to the actually measured distance and the allowable maximum distance between each target and the vehicle. In the method, since the acquired difference duration can represent the maximum time difference between the vehicle and the road side equipment, the allowable maximum distance between the target and the vehicle can be determined through the vehicle state information and the target state information of the target and the acquired difference duration; and because the state information of the targets and the vehicles is considered when the allowable maximum distance between the targets and the vehicles is determined, the allowable maximum distance between each target and the vehicles is more accurate and reasonable, and the accuracy of matching the intelligent network-connected vehicles with the targets in the road side system is improved.
Because the problem that target matching is inaccurate when the road side equipment and the vehicle sense time difference is large is solved, before matching, a data frame acquired at the same time with the vehicle state information can be screened through a time interval by taking the current time as a reference, and then the matching target of the vehicle is determined from at least one target in the data frame. The implementation manner of acquiring the target state information of at least one target at a plurality of moments acquired by the roadside device is described below through an embodiment, and in an exemplary embodiment, as shown in fig. 4, the acquiring the target state information of at least one target at a plurality of moments acquired by the roadside device includes:
S401, acquiring a plurality of candidate moments corresponding to the current moment according to a preset time interval.
The data frames sent by the road side equipment and the vehicle can be regarded as the same frame of data under the condition that the time interval between them is smaller than the preset time interval.
Therefore, the time at which the absolute value of the interval duration between the time and the current time is within the time interval can be determined as a plurality of candidate times corresponding to the current time with the current time as a reference. For example, the time interval is 200ms, and a time in which the absolute value of the interval duration between the time and the current time is within 200ms may be determined as a plurality of candidate times corresponding to the current time.
In one embodiment, the time interval includes a maximum time interval and a minimum time interval, and acquiring, according to the preset time interval, a plurality of candidate moments corresponding to the current moment includes: and determining a plurality of candidate moments corresponding to the current moment according to the maximum time interval and the minimum time interval.
The time at which the absolute value of the interval duration between the time and the current time is greater than the minimum time interval and less than the maximum time interval may be determined as a plurality of candidate times corresponding to the current time.
For example, the minimum time interval may be 50 ms and the maximum time interval may be 200 ms, that is, a moment is kept when the absolute value of the interval duration between that moment and the current moment is greater than 50 ms and less than 200 ms. It should be noted that the acquisition difference duration may be the maximum time interval, that is, the acquisition difference duration may be 200 ms.
In this embodiment, a plurality of candidate moments are determined according to a minimum time interval and a maximum time interval with the current moment, so that the selected plurality of candidate moments are all likely to be the same moment with the current moment, and some targets which do not meet the time condition are screened out, thereby improving the accuracy of target matching.
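A minimal sketch of this candidate-moment selection is given below, assuming timestamps expressed in seconds; the function name candidate_moments and the default thresholds (50 ms and 200 ms, taken from the example above) are illustrative assumptions only.

from typing import List

def candidate_moments(roadside_timestamps: List[float], current_time: float,
                      min_interval: float = 0.05, max_interval: float = 0.2) -> List[float]:
    # Keep the road side timestamps whose absolute offset from the current
    # moment lies between the minimum and maximum time interval, so the
    # corresponding data frames can be treated as the same frame as the
    # vehicle state information.
    return [t for t in roadside_timestamps
            if min_interval < abs(t - current_time) < max_interval]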
S402, screening at least one candidate target meeting preset conditions from at least one candidate target at a plurality of candidate moments acquired by road side equipment according to vehicle state information.
The candidate targets can be all targets acquired by the road side equipment at a plurality of candidate moments; for example, candidate targets may include, but are not limited to, intelligent networked vehicles, trucks, electric vehicles, bicycles, tricycles, scooters, pedestrians, and the like.
Therefore, at least one target meeting the preset condition can be screened from all candidate targets under a plurality of candidate moments acquired by the road side equipment.
Screening at least one target meeting a distance preset condition from at least one candidate target under a plurality of candidate moments acquired by road side equipment; specifically, the measured distance between each candidate object and the vehicle may be determined according to the vehicle state information and the object state information of the candidate object, and the candidate object whose measured distance is smaller than the preset distance threshold value is determined as the object satisfying the preset condition.
Because the vehicle is an intelligent network-connected vehicle, the preset condition may be that the target is an intelligent network-connected vehicle, that is, the road side system may determine, according to the type of the candidate target, the target of the intelligent network-connected vehicle in at least one candidate target at a plurality of candidate moments collected by the road side device as at least one candidate target satisfying the preset condition.
Optionally, after screening the candidate targets of the intelligent network-connected vehicle, continuously screening the candidate targets according to the distance between the candidate targets and the vehicle in the candidate targets of the intelligent network-connected vehicle, and determining at least one candidate target meeting the distance constraint condition.
S403, determining target state information of at least one candidate target meeting preset conditions as target state information of at least one target at a plurality of moments collected by the road side equipment.
The preset condition may be a necessary constraint condition for matching the vehicle with the target, and at least one candidate target satisfying the preset condition may be a matching target of the vehicle.
Therefore, the target state information of at least one candidate target meeting the preset condition can be directly determined as the target state information of at least one target at a plurality of moments collected by the road side equipment, and then the matching target of the vehicle is determined from the targets.
According to the target matching method provided by the embodiment of the application, a plurality of candidate moments corresponding to the current moment are obtained according to a preset time interval, at least one candidate target meeting a preset condition is screened from at least one candidate target under the plurality of candidate moments collected by the road side equipment according to the vehicle state information, and finally the target state information of the at least one candidate target meeting the preset condition is determined as the target state information of the at least one target under the plurality of moments collected by the road side equipment. In the method, a plurality of candidate moments corresponding to the current moment are screened out through time intervals, and the candidate moments are regarded as the same moment as the current moment; and then, screening is carried out again among the candidate targets collected in the candidate moments, so that at least one target which is possibly the same as the vehicle in the candidate moments is determined, and the targets collected by the road side equipment are screened for multiple times, so that the accuracy of matching of the follow-up targets is improved.
In an exemplary embodiment, as shown in fig. 5, according to vehicle status information, at least one candidate object satisfying a preset condition is screened from at least one candidate object at a plurality of candidate moments collected by a road side device, including the following steps:
S501, spatially aligning the vehicle state information with the target state information of each candidate target.
Since the vehicle state information and the target state information of each candidate target are information acquired by different devices, it is necessary to spatially align the vehicle state information and the target state information of each candidate target. Spatially aligning the vehicle state information with the target state information for each candidate target is typically aligning the current position of the vehicle with the position of the candidate target.
Thus, in one exemplary embodiment, as shown in FIG. 6, the vehicle state information includes vehicle coordinates of the vehicle in a world coordinate system; the target state information of each target comprises target coordinates of each target under a radar coordinate system; spatially aligning the vehicle state information with candidate object state information for each candidate object, comprising the steps of:
s601, a conversion matrix between a world coordinate system and a radar coordinate system is acquired.
The target coordinates collected by the road side equipment are target coordinates under a radar coordinate system, and the vehicle coordinates under a world coordinate system are collected by the vehicle; therefore, when the vehicle coordinates and the target coordinates are spatially aligned, the vehicle coordinates and the target coordinates can be converted into the same coordinate system by the conversion matrix.
The method comprises the steps that road side equipment and vehicles can be calibrated in advance, and a conversion matrix between a world coordinate system and a radar coordinate system is determined; therefore, the transformation matrix between the world coordinate system and the radar coordinate system can be obtained directly from the database.
S602, according to the transformation matrix, the vehicle coordinates in the world coordinate system are transformed into the vehicle coordinates in the radar coordinate system.
The transformation matrix between the world coordinate system and the radar coordinate system may include a rotation matrix and a translation matrix between the world coordinate system and the radar coordinate system. Therefore, the coordinates of the vehicle in the radar coordinate system can be determined according to the formula (2) and the formula (3).
T = \begin{bmatrix} R & t \\ 0^T & 1 \end{bmatrix}    (2)

P_r = T \cdot P_w    (3)

Wherein, T is the transformation matrix between the world coordinate system and the radar coordinate system, R is the rotation matrix between the world coordinate system and the radar coordinate system with a size of 3×3, t is the translation matrix between the world coordinate system and the radar coordinate system with a size of 3×1, 0^T is the transposed zero matrix with a size of 1×3, P_w is the homogeneous vehicle coordinate of the vehicle in the world coordinate system, and P_r is the vehicle coordinate of the vehicle in the radar coordinate system.
And S603, determining the vehicle coordinates in the radar coordinate system as aligned vehicle coordinates.
The vehicle coordinates in the radar coordinate system are determined as aligned vehicle coordinates.
The vehicle state information is spatially aligned with the candidate object state information of each candidate object, and it is actually to convert the vehicle coordinates of the vehicle and the object coordinates of the object into the same coordinate system.
In the target matching method provided by the embodiment of the application, a conversion matrix between a world coordinate system and a radar coordinate system is obtained, vehicle coordinates in the world coordinate system are converted into vehicle coordinates in the radar coordinate system according to the conversion matrix, and then the vehicle coordinates in the radar coordinate system are determined to be aligned vehicle coordinates. In the method, the space alignment is carried out on the vehicle coordinate and the target coordinate by directly utilizing the conversion matrix between the world coordinate system and the radar coordinate system which are calibrated in advance, so that the space alignment efficiency of the vehicle state information and the target state information is improved.
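The spatial alignment of S601 to S603 can be sketched as follows, assuming the rotation matrix R and translation vector t have been calibrated in advance; world_to_radar is a hypothetical helper name, not an identifier from the patent.

import numpy as np

def world_to_radar(p_world, R, t):
    # Equation (2): assemble the homogeneous transformation matrix
    # T = [[R, t], [0^T, 1]] from the 3x3 rotation R and the 3x1 translation t.
    T = np.eye(4)
    T[:3, :3] = np.asarray(R, dtype=float)
    T[:3, 3] = np.asarray(t, dtype=float).reshape(3)
    # Equation (3): apply T to the homogeneous vehicle coordinate in the world
    # coordinate system to obtain the aligned coordinate in the radar frame.
    p_h = np.append(np.asarray(p_world, dtype=float), 1.0)
    return (T @ p_h)[:3]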
S502, screening candidate targets meeting preset conditions from the candidate targets according to the aligned vehicle state information and the target state information.
In one exemplary embodiment, as shown in FIG. 7, the aligned vehicle state information includes vehicle coordinates, vehicle speed, and vehicle heading angle; the target state information comprises target coordinates, target speed and target course angle; screening candidate targets meeting preset conditions from the candidate targets according to the aligned vehicle state information and the target state information, wherein the method comprises the following steps:
S701, respectively determining a coordinate difference value and a measured distance between each candidate object and the vehicle according to the object coordinates and the vehicle coordinates.
The coordinate difference between the candidate object and the vehicle may be a difference between the object coordinate of the candidate object and the vehicle coordinate of the vehicle.
For example, for any one candidate target, if the vehicle coordinates are (x_v, y_v) and the target coordinates are (x_t, y_t), the coordinate difference values between the candidate target and the vehicle are \Delta x = x_t - x_v and \Delta y = y_t - y_v.
The manner of determining the measured distance between the candidate target and the vehicle in the embodiment of the present application is the same as the manner of obtaining the measured distance between the target and the vehicle in the above embodiment, and the embodiment of the present application is not described herein again.
It should be noted that, the vehicle coordinates represent the vehicle coordinates after spatial alignment, that is, the vehicle coordinates under the radar coordinate system, and the vehicle coordinates used in the target matching in the embodiment of the present application are all the vehicle coordinates under the radar coordinate system.
S702, determining a speed difference value between each candidate target and the vehicle according to the target speeds and the vehicle speeds.
For any candidate target, the vehicle speed is subtracted from the target speed of the candidate target to obtain the speed difference value between the candidate target and the vehicle; for example, if the vehicle speed is v_v and the target speed is v_t, the speed difference value between the candidate target and the vehicle is \Delta v = v_t - v_v.
S703, determining a heading angle difference value between each candidate target and the vehicle according to each target heading angle and the vehicle heading angle.
For any candidate target, the vehicle heading angle is subtracted from the target heading angle of the candidate target to obtain the heading angle difference value between the candidate target and the vehicle; for example, if the vehicle heading angle is \theta_v and the target heading angle is \theta_t, the heading angle difference value between the candidate target and the vehicle is \Delta\theta = \theta_t - \theta_v.
S704, obtaining candidate targets of which the coordinate difference value, the measured distance, the speed difference value and the course angle difference value all meet preset conditions.
The candidate targets of which the absolute values of the coordinate difference values, the absolute values of the measured distances, the absolute values of the speed difference values and the absolute values of the course angle difference values are respectively smaller than the corresponding coordinate threshold values, the displacement threshold values, the speed threshold values and the course angle threshold values in the candidate targets can be determined to be the candidate targets meeting preset conditions.
As shown in equation (4), equation (4) may be a necessary constraint for matching the target with the vehicle.
|x_t - x_v| < \delta_{xy}, \quad |y_t - y_v| < \delta_{xy}, \quad d_m < \delta_d, \quad |v_t - v_v| < \delta_v, \quad |\theta_t - \theta_v| < \delta_\theta    (4)

Wherein, \delta_{xy} is the coordinate threshold and \delta_d is the displacement threshold; considering that, no matter how large the vehicle speed is, the difference between the position reported by the vehicle and the position of the vehicle perceived by the road side equipment cannot be too large, \delta_{xy} may be taken as 4 meters and \delta_d may be taken as 5 meters; \delta_v is the speed threshold and may be taken as 1 m/s; \delta_\theta is the heading angle threshold and may be taken as 20°.
According to the method, the coordinate difference value and the actual measurement distance between each candidate target and the vehicle are respectively determined according to the target coordinates and the vehicle coordinates, the speed difference value between each candidate target and the vehicle is respectively determined according to the target speed and the vehicle speed, then the course angle difference value between each candidate target and the vehicle is determined according to the target course angle and the vehicle course angle, and finally the candidate targets, in which the coordinate difference value, the actual measurement distance, the speed difference value and the course angle difference value meet preset conditions, are obtained. In this embodiment, since the matching targets of the vehicles are the same vehicle acquired at the same time, the coordinate difference, the measured distance, the speed difference and the heading angle difference between the matching targets of the vehicles should not be too large, and this is taken as a necessary constraint condition for target matching, so that the accuracy of target matching can be improved.
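A sketch of the necessary constraint of equation (4) is shown below; the default threshold values are the examples given above, and the function and parameter names are assumptions for illustration.

def satisfies_constraints(dx: float, dy: float, d_meas: float,
                          dv: float, dtheta: float,
                          coord_thr: float = 4.0, disp_thr: float = 5.0,
                          speed_thr: float = 1.0, heading_thr: float = 20.0) -> bool:
    # A candidate target is kept only if the coordinate differences, the
    # measured distance, the speed difference and the heading angle difference
    # all stay below their respective thresholds (equation (4)).
    return (abs(dx) < coord_thr and abs(dy) < coord_thr
            and d_meas < disp_thr
            and abs(dv) < speed_thr
            and abs(dtheta) < heading_thr)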
In the target matching method provided by the embodiment of the application, the vehicle state information and the target state information of each candidate target are spatially aligned, and candidate targets meeting preset conditions are screened from the candidate targets according to the aligned vehicle state information and the aligned target state information. In the method, the vehicle state information of the vehicle and the target loading information of the candidate target are not acquired by the same acquisition equipment, and the state information is not in the same coordinate system, so that when the candidate target is compared with the vehicle, the vehicle state information and the target state information are required to be spatially aligned, the vehicle state information and the target state information can be compared in the same dimension, and the accuracy of the screened candidate target is improved.
In an exemplary embodiment, as shown in fig. 8, determining the allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information, and the preset acquisition difference duration between the vehicle and the roadside apparatus, includes the following steps:
s801, aiming at any one target, determining a distance difference value, a speed difference value and a course angle difference value between the target and the vehicle according to target state information, vehicle state information and acquisition difference duration of the target.
The target state information of the target may include a target coordinate, a target speed, and a target heading angle, and the vehicle state information includes a vehicle coordinate, a vehicle speed, and a vehicle heading angle.
Determining a distance difference value, a speed difference value and a course angle difference value between the target and the vehicle according to a pre-trained difference model; specifically, the target coordinates, the target speed, the target course angle, the vehicle coordinates, the vehicle speed and the vehicle course angle, and the acquired difference time length can be input into the difference model, and the difference model is used for analysis to obtain a distance difference value, a speed difference value and a course angle difference value between the target and the vehicle, which are output by the difference model.
S802, determining the allowable maximum distance between the target and the vehicle according to the distance difference value, the speed difference value and the course angle difference value.
Wherein the sum of the distance difference value, the speed difference value, and the heading angle difference value may be determined as an allowable maximum distance between the target and the vehicle.
The allowable maximum distance between the target and the vehicle can also be determined according to the distance difference value, the speed difference value and the course angle difference value and the corresponding weight coefficient; in one exemplary embodiment, as shown in fig. 9, determining the allowable maximum distance between the target and the vehicle based on the distance difference value, the speed difference value, and the heading angle difference value includes the steps of:
s901, obtaining the distance weight, the speed weight and the course angle weight between the target and the vehicle.
The distance weight, the speed weight and the course angle weight between the target and the vehicle can be preset according to historical experience; alternatively, the distance weight, the speed weight and the heading angle weight between the target and the vehicle are determined after testing according to a simulation experiment.
S902, determining the allowable maximum distance according to the distance difference value, the speed difference value, the course angle difference value, the distance weight, the speed weight and the course angle weight.
And the distance difference value, the speed difference value and the course angle difference value can be weighted according to the distance weight, the speed weight and the course angle weight to obtain the allowable maximum distance.
For example, the allowable maximum distance may be calculated according to equation (5).
D_{\max} = w_d \cdot \Delta d + w_v \cdot \Delta v + w_\theta \cdot \Delta\theta    (5)

Wherein, D_{\max} represents the allowable maximum distance, w_d, w_v and w_\theta respectively represent the distance weight, the speed weight and the heading angle weight, and \Delta d, \Delta v and \Delta\theta respectively represent the distance difference value, the speed difference value and the heading angle difference value.
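Equation (5) reduces to a weighted sum, as in the following sketch; allowed_max_distance and its parameter names are hypothetical, and the weights are assumed to have been obtained as described in the embodiments above.

def allowed_max_distance(delta_d: float, delta_v: float, delta_theta: float,
                         w_d: float, w_v: float, w_theta: float) -> float:
    # Equation (5): weighted combination of the distance difference, the speed
    # difference and the heading angle difference.
    return w_d * delta_d + w_v * delta_v + w_theta * delta_theta

For example, with weights of 1/3 each, the allowable maximum distance is simply the average of the three difference values.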
In the target matching method provided by the embodiment of the application, the distance weight, the speed weight and the course angle weight between the target and the vehicle are obtained, and the allowable maximum distance is determined according to the distance difference value, the speed difference value, the course angle difference value, the distance weight, the speed weight and the course angle weight. In the method, in the process of calculating the allowable maximum distance according to the distance difference value, the speed difference value and the course angle difference value, the distance weight, the speed weight and the course angle weight are also considered, so that the accuracy of the allowable maximum distance is further improved.
In the target matching method provided by the embodiment of the application, for any one target, a distance difference value, a speed difference value and a course angle difference value between the target and the vehicle are determined according to target state information, vehicle state information and acquisition difference time length of the target, and an allowable maximum distance between the target and the vehicle is determined according to the distance difference value, the speed difference value and the course angle difference value. In the method, the allowable maximum distance between the target and the vehicle is determined according to the three dimensions of the distance difference value, the speed difference value and the course angle difference value between the target and the vehicle, so that the accuracy and the comprehensiveness of calculating the allowable maximum distance are improved.
In one exemplary embodiment, as shown in FIG. 10, the target state information includes a target speed; the vehicle state information includes a vehicle speed; according to the target state information, the vehicle state information and the acquisition difference time length of the target, determining a distance difference value and a speed difference value between the target and the vehicle, wherein the method comprises the following steps of:
s1001, determining the historical average target speed of the target according to the historical target speed and the target speed of the target.
The historical target speed can be the target speed of the target acquired by the road side equipment at the historical moment; the historical target speed of the target and the average of the target speeds may be determined as the historical average target speed of the target.
For example, taking the historical target speed as the target speed of the last two frames of the history corresponding to the target collected by the roadside device as an example, the manner of determining the historical average target speed of the target may be as shown in formula (6).
$\bar{v}_{tar} = \dfrac{v_{tar} + v_{tar,1} + v_{tar,2}}{3}$ (6)
Wherein, $\bar{v}_{tar}$ represents the historical average target speed of the target, $v_{tar}$ represents the target speed of the target, and $v_{tar,1}$ and $v_{tar,2}$ represent the historical target speeds of the last two frames of the target.
S1002, determining a distance difference value according to the historical average speed and the acquisition difference duration.
The product of the historical average target speed and the acquisition difference duration may be determined as the distance difference value, that is, $\Delta_d = \bar{v}_{tar} \cdot \Delta t$, where $\Delta t$ represents the acquisition difference duration.
S1003, determining a historical average vehicle speed of the vehicle according to the historical vehicle speed and the vehicle speed of the vehicle.
The historical vehicle speed may be a vehicle speed of the vehicle that the vehicle acquired at a historical time; the historical vehicle speed of the vehicle and the average of the vehicle speeds may be determined as the historical average vehicle speed of the vehicle.
For example, taking the historical vehicle speed as the vehicle speed of the last two frames of the history corresponding to the vehicle collected by the vehicle as an example, the manner of determining the historical average vehicle speed of the vehicle may be as shown in formula (7).
$\bar{v}_{veh} = \dfrac{v_{veh} + v_{veh,1} + v_{veh,2}}{3}$ (7)
Wherein, $\bar{v}_{veh}$ represents the historical average vehicle speed of the vehicle, $v_{veh}$ represents the vehicle speed of the vehicle, and $v_{veh,1}$ and $v_{veh,2}$ represent the historical vehicle speeds of the last two frames of the vehicle.
S1004, determining a speed difference value according to the historical average target speed, the historical average vehicle speed and the acquired difference duration.
The difference between the historical average target speed and the historical average vehicle speed may be acquired, and the absolute value of this difference multiplied by the acquisition difference duration may be taken as the speed difference value, that is, $\Delta_v = \lvert \bar{v}_{tar} - \bar{v}_{veh} \rvert \cdot \Delta t$.
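A minimal sketch of steps S1001 to S1004 is given below, assuming the current speed and the last two historical frame speeds are available for both the target and the vehicle; all names and numeric values are illustrative assumptions.

```python
def historical_average(current, history):
    # Average of the current value and the historical values of the last
    # frames, as in equations (6) and (7).
    return (current + sum(history)) / (1 + len(history))


def distance_difference(avg_target_speed, delta_t):
    # Distance difference value: historical average target speed multiplied
    # by the acquisition difference duration.
    return avg_target_speed * delta_t


def speed_difference(avg_target_speed, avg_vehicle_speed, delta_t):
    # Speed difference value: absolute speed gap multiplied by the
    # acquisition difference duration.
    return abs(avg_target_speed - avg_vehicle_speed) * delta_t


# Illustrative usage; speeds in m/s and the duration in seconds are assumed.
v_tar = historical_average(22.2, [22.0, 22.4])   # target: current + last two frames
v_veh = historical_average(21.9, [21.8, 22.0])   # vehicle: current + last two frames
delta_d = distance_difference(v_tar, 0.2)
delta_v = speed_difference(v_tar, v_veh, 0.2)
```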
in the target matching method provided by the embodiment of the application, the historical average target speed of the target is determined according to the historical target speed and the target speed of the target, and the distance difference value is determined according to the historical average speed and the acquisition difference time length; then, a historical average vehicle speed of the vehicle is determined according to the historical vehicle speed and the vehicle speed of the vehicle, and then a speed difference value is determined according to the historical average target speed, the historical average vehicle speed and the acquired difference duration. According to the method, the distance difference value between the target and the vehicle can be determined through the historical average target speed and the acquired difference time length of the target, the target can be tracked better, and the accuracy of target matching is improved; in addition, the speed of the target and the speed of the vehicle can be better represented through the historical average target speed and the historical average vehicle speed, and the speed difference between the target and the vehicle can be more accurately determined according to the historical average speed and the acquired difference duration of the target and the vehicle, so that the accuracy of target matching is improved.
In one exemplary embodiment, the target state information includes a target length and a target heading angle; the vehicle state information includes a vehicle heading angle, and as shown in fig. 11, a heading angle difference value between the target and the vehicle is determined according to the target state information of the target, the vehicle state information and the acquisition difference duration, and the method includes the following steps:
S1101, determining a historical average target course angle of the target according to the historical target course angle and the target course angle of the target.
The historical target course angle can be a target course angle of the target acquired by the road side equipment at the historical moment; the historical target heading angle of the target and the average of the target heading angles may be determined as the historical average target heading angle of the target.
For example, taking the historical target course angle as the target course angle of the last two frames of the history corresponding to the target collected by the roadside device as an example, the manner of determining the historical average target course angle of the target may be as shown in formula (8).
$\bar{\theta}_{tar} = \dfrac{\theta_{tar} + \theta_{tar,1} + \theta_{tar,2}}{3}$ (8)
Wherein, $\bar{\theta}_{tar}$ represents the historical average target course angle of the target, $\theta_{tar}$ represents the target course angle of the target, and $\theta_{tar,1}$ and $\theta_{tar,2}$ represent the historical target course angles of the last two frames of the target.
S1102, determining a historical average vehicle course angle of the vehicle according to the historical vehicle course angle and the vehicle course angle of the vehicle.
The historical vehicle course angle can be the vehicle course angle of the vehicle collected at the historical moment; an average of the historical vehicle heading angle of the vehicle and the vehicle heading angle may be determined as a historical average vehicle heading angle of the vehicle.
For example, taking the historical vehicle heading angle as the vehicle heading angle of the last two frames of the history corresponding to the vehicle collected by the vehicle as an example, the manner of determining the historical average vehicle heading angle of the vehicle may be as shown in formula (9).
$\bar{\theta}_{veh} = \dfrac{\theta_{veh} + \theta_{veh,1} + \theta_{veh,2}}{3}$ (9)
Wherein, $\bar{\theta}_{veh}$ represents the historical average vehicle course angle of the vehicle, $\theta_{veh}$ represents the vehicle course angle of the vehicle, and $\theta_{veh,1}$ and $\theta_{veh,2}$ represent the historical vehicle course angles of the last two frames of the vehicle.
S1103, determining a course angle difference value according to the historical average target course angle, the historical average vehicle course angle and the target length.
The target length may represent the length of the vehicle to which the target corresponds. The course angle difference value may represent an arc length deviation between the target and the vehicle. Therefore, the course angle deviation between the vehicle and the target may first be acquired, and the course angle difference value may then be determined according to the absolute value of this deviation and the target length, as shown in formula (10).
$\Delta_\theta = \lvert \bar{\theta}_{tar} - \bar{\theta}_{veh} \rvert \cdot L$ (10)
Wherein, $L$ represents the target length.
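A minimal sketch of the arc-length style course angle difference of equation (10) follows; converting the angle gap from degrees to radians before multiplying by the target length is an assumption, since the disclosure does not state the angle unit.

```python
import math


def heading_angle_difference(avg_target_heading_deg, avg_vehicle_heading_deg,
                             target_length):
    # Course angle difference value of equation (10): absolute course angle
    # gap (converted to radians here, an assumption) times the target length.
    gap_rad = math.radians(abs(avg_target_heading_deg - avg_vehicle_heading_deg))
    return gap_rad * target_length


# Illustrative call: a 10 degree gap and a 5 m target length.
delta_theta = heading_angle_difference(95.0, 85.0, 5.0)
```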
According to the target matching method, a historical average target course angle of the target is determined according to the historical target course angle and the target course angle of the target, a historical average vehicle course angle of the vehicle is determined according to the historical vehicle course angle and the vehicle course angle, and then a course angle difference value is determined according to the historical average target course angle, the historical average vehicle course angle and the target length. In the method, the historical average target course angle and the historical average vehicle course angle can better represent the course angles of the target and the vehicle, and the arc length deviation between the target and the vehicle can be more accurately determined according to the historical average course angles of the target and the vehicle and the target length, so that the accuracy of target matching is improved.
In one exemplary embodiment, as shown in fig. 12, the distance weight, the speed weight, and the heading angle weight between the target and the vehicle are obtained, including the steps of:
s1201, determining a position function relation according to the distance difference value and the distance weight parameter, the speed difference value and the speed weight parameter, the course angle difference value and the course angle weight parameter and the position difference value between the preset target and the vehicle.
Wherein, a distance term may be determined according to the distance difference value and the distance weight parameter; a speed term may be determined according to the speed difference value and the speed weight parameter; a course angle term may be determined according to the course angle difference value and the course angle weight parameter; and the sum of the distance term, the speed term and the course angle term may represent the allowable maximum distance between the target and the vehicle. Therefore, the position function relation between the target and the vehicle can be as shown in formula (11).
$\Delta P = w_1\,\Delta_d + w_2\,\Delta_v + w_3\,\Delta_\theta$ (11)
Wherein, $\Delta P$ represents the preset position difference value between the target and the vehicle, $w_1$, $w_2$ and $w_3$ respectively represent the distance weight parameter, the speed weight parameter and the course angle weight parameter, and $\Delta_d$, $\Delta_v$ and $\Delta_\theta$ respectively represent the distance difference value, the speed difference value and the course angle difference value.
S1202, determining a weight function relation according to the distance weight parameter, the speed weight parameter and the course angle weight parameter.
Since the main evaluation index is position, the speed difference and the course angle difference between the vehicle and the target can be assigned smaller weights; meanwhile, since the course angle changes more during turning, that is, it may carry a larger error, its weight needs to be much smaller than the other two, namely $w_2 < w_1$ and $w_3 \ll w_2$. Alternatively, fixed ratio relations between the weight parameters can be chosen as the weight function relation.
And S1203, solving the position function relation according to the weight function relation to obtain a distance weight, a speed weight and a course angle weight.
The actual distance difference value, speed difference value and course angle difference value, together with the preset position difference value between the target and the vehicle, may be substituted into the position function relation, and the position function relation may then be solved jointly with the weight function relation to obtain the distance weight, the speed weight and the course angle weight.
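One possible way to carry out this joint solution is sketched below; the fixed ratios r_v and r_theta standing in for the weight function relation are hypothetical placeholders, not values from the disclosure.

```python
def solve_weights(delta_p, delta_d, delta_v, delta_theta, r_v=0.1, r_theta=0.01):
    # Solve w1, w2, w3 from the position function relation
    #   delta_p = w1 * delta_d + w2 * delta_v + w3 * delta_theta
    # under the assumed weight function relation w2 = r_v * w1 and
    # w3 = r_theta * w1 (r_v and r_theta are placeholders).
    w1 = delta_p / (delta_d + r_v * delta_v + r_theta * delta_theta)
    return w1, r_v * w1, r_theta * w1


# Illustrative call using a same-lane position difference of 4.5 m.
w1, w2, w3 = solve_weights(4.5, 4.4, 0.1, 0.9)
```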
The selection of the position difference between the target and the vehicle can be determined according to whether the target and the vehicle are in the same lane; for example, if the target and the vehicle are in the same lane, the position difference may be set larger; if the target and the vehicle are in different lanes, the position difference may be set smaller.
Thus, the position difference may comprise a first position difference and a second position difference; in an exemplary embodiment, as shown in fig. 13, the solving the location function relation according to the weight function relation, to obtain the distance weight, the speed weight and the heading angle weight, includes the following steps:
S1301, it is detected whether the vehicle and the target are in the same lane.
The target state information may include lane information, and the vehicle state information may include lane information, so that it may be determined whether the lane information of the vehicle is consistent with the lane information of the target, if the lane information of the vehicle is consistent with the lane information of the target, it is determined that the vehicle is in the same lane as the target, and if the lane information of the vehicle is inconsistent with the lane information of the target, it is determined that the vehicle is in different lanes from the target.
S1302, under the condition that the vehicle and the target are in the same lane, solving the position function relation according to the first position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight.
And under the condition that the vehicle and the target are in the same lane, the position function relation and the weight function relation are combined, the first position difference value, the distance difference value, the speed difference value and the course angle difference value of the target and the vehicle are substituted into the position function relation, and the position function relation and the weight function relation are solved to obtain the distance weight, the speed weight and the course angle weight.
For example, if the target length $L$ is 5 m, the historical average target speed $\bar{v}_{tar}$ is 80 km/h, the historical average speed difference between the historical average target speed and the historical average vehicle speed is 0.5 m/s, the historical average course angle difference between the historical average target course angle and the historical average vehicle course angle is 10°, and the acquisition difference duration $\Delta t$ takes the maximum value of the time interval, 200 ms, then the first position difference value may be taken as 4.5 m.
The distance weight, the speed weight and the course angle weight can then be obtained by jointly solving the position function relation and the weight function relation with these values.
s1303, under the condition that the vehicle and the target are in different lanes, solving the position function relation according to the second position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight.
Wherein the second position difference is smaller than the first position difference.
And under the condition that the vehicle and the target are in different lanes, the position function relation and the weight function relation are combined, the second position difference value, the distance difference value, the speed difference value and the course angle difference value of the target and the vehicle are substituted into the position function relation, and the position function relation and the weight function relation are solved to obtain the distance weight, the speed weight and the course angle weight.
For example, if the target length $L$ is 5 m, the historical average target speed $\bar{v}_{tar}$ is 80 km/h, the historical average speed difference between the historical average target speed and the historical average vehicle speed is 0.5 m/s, the historical average course angle difference between the historical average target course angle and the historical average vehicle course angle is 10°, and the acquisition difference duration $\Delta t$ takes the maximum value of the time interval, 200 ms, then the second position difference value may be taken as 3.5 m.
The distance weight, the speed weight and the course angle weight can likewise be obtained by jointly solving the position function relation and the weight function relation with these values.
in the embodiment of the application, whether the vehicle and the target are in the same lane or not is detected; under the condition that the vehicle and the target are in the same lane, solving the position function relation according to the first position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight; and under the condition that the vehicle and the target are in different lanes, solving the position function relation according to the second position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight. In the method, if the vehicle and the target are in the same lane, the influence of the adjacent lanes is not required to be considered, so that larger weight can be determined, if the vehicle and the target are in different lanes, the influence caused by different lanes is required to be considered, the weight can be set to be smaller, the distance weight, the speed weight and the course angle weight between the target and the vehicle are more accurate, and the accuracy of the allowable maximum distance is improved.
In the target matching method provided by the embodiment of the application, a position function relation is determined according to a distance difference value and a distance weight parameter, a speed difference value and a speed weight parameter, a course angle difference value and a course angle weight parameter and a position difference value between a preset target and a vehicle, a weight function relation is determined according to the distance weight parameter, the speed weight parameter and the course angle weight parameter, and then the position function relation is solved according to the weight function relation to obtain the distance weight, the speed weight and the course angle weight. According to the method, the distance weight, the speed weight and the course angle weight between the target and the vehicle can be accurately determined through the preset position difference value between the target and the vehicle and the relation among the distance weight, the speed weight and the course angle weight.
The manner in which it is determined whether the vehicle is in the same lane as the target is described below by way of one embodiment, in which the vehicle state information includes vehicle coordinates; the target state information includes target coordinates; as shown in fig. 14, this embodiment includes the steps of:
s1401, a local high-precision map of the position of the vehicle is acquired from the vehicle coordinates of the vehicle.
The road side system stores the high-precision map, and after the road side system receives the vehicle coordinates sent by the vehicle, the local high-precision map is determined according to the vehicle coordinates, wherein the local high-precision map is the local high-precision map of the position of the vehicle.
S1402 acquires lane information of the vehicle and the target based on the local high-precision map.
Since the high-precision map refers to a digital map having high precision and detail, accurate position and attribute information of roads, buildings, terrains, traffic signs, traffic lights, lane lines, and other important landmarks are generally included in the high-precision map.
Therefore, lane information corresponding to the vehicle coordinates of the vehicle can be directly searched from the local high-precision map; and searching lane information corresponding to the target coordinates of the target from the local high-precision map.
The lane information may include information such as whether the corresponding position is in a lane and the number of the lane in which it is located.
S1403, when the lane information of the vehicle and the target are identical, it is determined that the vehicle and the target are in the same lane, and when the lane information of the vehicle and the target are not identical, it is determined that the vehicle and the target are in different lanes.
If the lane information of the vehicle is consistent with the lane information of the target, determining that the vehicle and the target are in the same lane; if the lane information of the vehicle is inconsistent with the lane information of the target, determining that the vehicle and the target are in different lanes.
According to the target matching method, a local high-precision map of the position of a vehicle is obtained according to the vehicle coordinates of the vehicle, and lane information of the vehicle and a target is obtained according to the local high-precision map; under the condition that the lane information of the vehicle and the target are consistent, determining that the vehicle and the target are in the same lane; in the case where the lane information of the vehicle and the target are inconsistent, it is determined that the vehicle and the target are in different lanes. According to the method, the lane information of the vehicle and the target is determined according to the high-precision map, so that the accuracy of the lane information of the vehicle and the lane information of the target is guaranteed, and the accuracy of the distance weight, the speed weight and the course angle weight is further improved.
In the following, a detailed description will be given of how lane information of a vehicle and a target is acquired from a local high-precision map by an embodiment, in an exemplary embodiment, as shown in fig. 15, lane information of a vehicle and a target is acquired from a local high-precision map, comprising the steps of:
s1501 converts the local high-precision map into a local high-precision map in a radar coordinate system.
The local high-precision map is also a map under the world coordinate system, so the local high-precision map under the world coordinate system can be converted into the local high-precision map under the radar coordinate system according to the conversion matrix of the world coordinate system and the radar coordinate system.
The essence of converting the local high-precision map into the local high-precision map in the radar coordinate system is converting each position in the local high-precision map into a position coordinate in the radar coordinate system.
S1502, carrying out pixelation processing on a local high-precision map under a radar coordinate system to obtain a pixel table; the pixel table includes a plurality of pixel coordinates and corresponding lane information.
The local high-precision map in the radar coordinate system comprises a plurality of position coordinates and corresponding lane information, the lane information can comprise a lane number and the like, and if the position coordinates are not in a lane, the lane information can be-1.
Therefore, each position coordinate in the local high-precision map under the radar coordinate system can be pixelized to obtain the pixel coordinate corresponding to each position, so as to obtain a pixel table; the pixel table comprises a plurality of pixel coordinates and corresponding lane information.
The way of pixelating the local high-precision map in the radar coordinate system may be to convert each position coordinate in the local high-precision map in the radar coordinate system into integers, for example, by multiplying each coordinate by 10 and taking the integer part, to obtain the pixel coordinate corresponding to each position coordinate.
S1503, the vehicle coordinates of the vehicle and the target coordinates of the target are converted into pixel coordinates.
And (3) based on the same mode of pixelating the local high-precision map under the radar coordinate system, pixelating the vehicle coordinates of the vehicle and the target coordinates of the target to obtain the pixel coordinates of the vehicle and the pixel coordinates of the target.
For example, the vehicle coordinates of the vehicle are multiplied by 10 to obtain the pixel coordinates of the vehicle; the target coordinates of the target are multiplied by 10 to obtain the pixel coordinates of the target.
S1504 determines lane information corresponding to the pixel coordinates of the vehicle in the pixel table as lane information of the vehicle, and determines lane information corresponding to the pixel coordinates of the object in the pixel table as lane information of the object.
Acquiring lane information corresponding to pixel coordinates of a vehicle from a pixel table, and determining the lane information as lane information of the vehicle; lane information corresponding to pixel coordinates of the object is obtained from the pixel table, and the lane information is determined as lane information of the object.
Alternatively, if the vehicle coordinates and/or the target coordinates do not exist in the pixel table, the vehicle coordinates and/or the target coordinates may be interpolated in the pixel table to obtain lane information of the vehicle and/or the target.
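Assuming the pixel table is a mapping from integer pixel coordinates to lane numbers (with -1 meaning off-lane), steps S1501 to S1504 might be sketched as follows; the factor of 10 follows the example above, and the interpolation fallback is omitted.

```python
def to_pixel(coord, scale=10):
    # Pixelate an (x, y) coordinate in the radar coordinate system by scaling
    # it (factor of 10 in the example above) and taking integers.
    x, y = coord
    return (int(round(x * scale)), int(round(y * scale)))


def lane_of(coord, pixel_table, scale=10):
    # Look up the lane number for a coordinate; -1 means not in a lane.
    return pixel_table.get(to_pixel(coord, scale), -1)


def same_lane(vehicle_coord, target_coord, pixel_table):
    # The vehicle and the target are in the same lane when both lane numbers
    # are valid and equal.
    lane_vehicle = lane_of(vehicle_coord, pixel_table)
    lane_target = lane_of(target_coord, pixel_table)
    return lane_vehicle != -1 and lane_vehicle == lane_target


# Illustrative pixel table: pixel coordinate -> lane number.
pixel_table = {(123, 45): 2, (124, 45): 2, (130, 47): 3}
result = same_lane((12.3, 4.5), (12.4, 4.5), pixel_table)  # True in this toy case
```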
In the target matching method provided by the embodiment of the application, a local high-precision map is converted into a local high-precision map under a radar coordinate system, and the local high-precision map under the radar coordinate system is subjected to pixelation processing to obtain a pixel table; the pixel table comprises a plurality of pixel coordinates and corresponding lane information; and converting the vehicle coordinates of the vehicle and the target coordinates of the target into pixel coordinates, and finally determining lane information corresponding to the pixel coordinates of the vehicle in the pixel table as lane information of the vehicle and determining lane information corresponding to the pixel coordinates of the target in the pixel table as lane information of the target. According to the method, the pixel list is calibrated in advance according to the high-precision map, so that lane information of the vehicle and the target can be directly searched in the pixel list, and the efficiency and the accuracy of determining the lane information of the vehicle and the target are improved.
In an exemplary embodiment, the embodiment of the present application further provides a target matching method, as shown in fig. 16, where the embodiment includes the following steps:
s1601, receiving vehicle state information sent by an intelligent network-connected vehicle and target state information of all targets acquired by road side equipment at a plurality of moments within a preset time interval.
Wherein, the time intervals comprise a first time interval (50 ms) and a second time interval (200 ms). Multi-frame data acquired by the road side equipment is obtained, where the time interval between each frame of data and the vehicle state information sent by the intelligent network-connected vehicle is larger than the first time interval and smaller than the second time interval, and each frame of data comprises target state information of a plurality of targets acquired by the road side equipment.
S1602 converts the vehicle coordinates in the vehicle state information into vehicle coordinates in a radar coordinate system.
Wherein, the vehicle coordinates in the vehicle state information are multiplied by the transformation matrix between the world coordinate system and the radar coordinate system to obtain the vehicle coordinates of the intelligent network-connected vehicle in the radar coordinate system.
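A minimal sketch of this coordinate conversion, assuming a 3x3 homogeneous transformation matrix for planar coordinates; the matrix itself is calibration data not given here, and the values below are placeholders.

```python
import numpy as np


def world_to_radar(xy_world, transform):
    # Convert a planar (x, y) world coordinate into the radar coordinate
    # system using a 3x3 homogeneous transformation matrix (assumed form).
    p = np.array([xy_world[0], xy_world[1], 1.0])
    q = transform @ p
    return (q[0] / q[2], q[1] / q[2])


# Illustrative transform: a pure translation of (-100, -200) metres.
T = np.array([[1.0, 0.0, -100.0],
              [0.0, 1.0, -200.0],
              [0.0, 0.0, 1.0]])
vehicle_radar_xy = world_to_radar((112.5, 203.7), T)
```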
S1603, calculating a coordinate difference value, a speed difference value and a course angle difference value of the target and the vehicle aiming at any target, and determining the target with the coordinate difference value, the speed difference value and the course angle difference value meeting preset conditions as a reference target.
S1604, it is determined whether the vehicle and the reference target are in the same lane based on the vehicle coordinates of the vehicle, the target coordinates of the reference target, and the high-precision map.
A local high-precision map of the position of the intelligent network-connected vehicle is obtained, the local high-precision map is converted into a map under the radar coordinate system, and the converted high-precision map is pixelated to obtain a pixel table, where the pixel table comprises the pixel coordinates of each position and corresponding attribute information, and the attribute information represents the lane number of the corresponding position. The vehicle coordinates and the coordinates of each reference target under the radar coordinate system are pixelated to obtain the pixel coordinates of the vehicle and of each reference target; the attribute information, namely the lane numbers, of the vehicle and of each reference target is searched from the pixel table according to these pixel coordinates; and whether the vehicle and the reference target are in the same lane is determined according to the lane number of the vehicle and the lane number of the reference target.
S1605, for any reference object, determining a distance weight, a speed weight, and a heading angle weight between the reference object and the vehicle according to whether the reference object and the vehicle are in the same lane.
The average speed and the average course angle of the reference target over the last three frames may be determined as the historical average target speed and the historical average target course angle, respectively, and the average speed and the average course angle of the vehicle over the last three frames may be determined as the historical average vehicle speed and the historical average vehicle course angle, respectively. The numerical values of the position weight, the speed weight and the course angle weight for the case that the reference target and the vehicle are in the same lane, and for the case that they are in different lanes, are then determined according to the length of the reference target, the acquisition difference duration, the historical average target speed, the historical average target course angle, the historical average vehicle speed and the historical average vehicle course angle, together with the relation among the position weight, the speed weight and the course angle weight.
S1606, determining the allowable maximum distance between the reference target and the vehicle according to the values of the position weight, the speed weight and the heading angle weight, and the length of the reference target, the acquired difference time length, the historical average target speed, the historical average target heading angle, the historical average vehicle speed and the historical average vehicle heading angle.
S1607, determining a measured distance between the reference target and the vehicle according to the target coordinates of the reference target and the vehicle coordinates of the vehicle.
S1608, if the measured distance of the reference target is smaller than the corresponding allowable maximum distance, determining that the reference target is matched with the vehicle; if there are a plurality of reference targets matched with the vehicle, the reference target with the smallest measured distance with the vehicle is determined as the target matched with the vehicle.
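Steps S1607 and S1608 can be sketched as below, assuming the measured distance is the planar Euclidean distance between the target coordinates and the vehicle coordinates; the data layout and identifiers are assumptions.

```python
import math


def match_vehicle(vehicle_xy, candidates):
    # candidates: iterable of (target_id, (x, y), allowed_max_distance).
    # Keep only reference targets whose measured distance is below their
    # allowable maximum distance, and return the id of the closest one.
    best_id, best_dist = None, float("inf")
    for target_id, (tx, ty), allowed_max in candidates:
        measured = math.hypot(tx - vehicle_xy[0], ty - vehicle_xy[1])
        if measured < allowed_max and measured < best_dist:
            best_id, best_dist = target_id, measured
    return best_id


# Illustrative call with two reference targets; "t2" is closer, so it matches.
matched = match_vehicle((0.0, 0.0),
                        [("t1", (3.0, 1.0), 4.5), ("t2", (2.0, 0.5), 4.5)])
```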
It should be understood that, although the steps in the flowcharts related to the embodiments described above are sequentially shown as indicated by arrows, these steps are not necessarily sequentially performed in the order indicated by the arrows. The steps are not strictly limited to the order of execution unless explicitly recited herein, and the steps may be executed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of steps or a plurality of stages, which are not necessarily performed at the same time, but may be performed at different times, and the order of the steps or stages is not necessarily performed sequentially, but may be performed alternately or alternately with at least some of the other steps or stages.
Based on the same inventive concept, the embodiment of the application also provides an object matching device for realizing the above-mentioned related object matching method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitation in one or more embodiments of the target matching device provided below may refer to the limitation of the target matching method hereinabove, and will not be repeated herein.
In an exemplary embodiment, as shown in fig. 17, there is provided a target matching apparatus 1700, comprising: an acquisition module 1701, a first determination module 1702, and a second determination module 1703, wherein:
an acquiring module 1701, configured to acquire vehicle state information of a current time acquired by a vehicle and target state information of at least one target at a plurality of times acquired by a road side device;
a first determining module 1702 configured to determine an allowable maximum distance between each target and the vehicle according to target state information of each target, vehicle state information, and a preset acquisition difference duration between the vehicle and the roadside device;
a second determining module 1703 for determining a matching target of the vehicle from the targets based on the measured distance and the allowed maximum distance between the targets and the vehicle.
In one exemplary embodiment, the acquisition module 1701 includes:
the first acquisition unit is used for acquiring a plurality of candidate moments corresponding to the current moment according to a preset time interval;
the screening unit is used for screening at least one candidate target meeting preset conditions from at least one candidate target at a plurality of candidate moments acquired by the road side equipment according to the vehicle state information;
the first determining unit is used for determining target state information of at least one candidate target meeting preset conditions as target state information of at least one target at a plurality of moments acquired by the road side equipment.
In one exemplary embodiment, the time intervals include a maximum time interval and a minimum time interval; the first acquisition unit includes:
and the first determination subunit is used for determining a plurality of candidate moments corresponding to the current moment according to the maximum time interval and the minimum time interval.
In an exemplary embodiment, the screening unit includes:
an alignment subunit, configured to spatially align the vehicle state information with target state information of each candidate target;
and the screening subunit is used for screening candidate targets meeting preset conditions from the candidate targets according to the aligned vehicle state information and the target state information.
In one exemplary embodiment, the aligned vehicle state information includes vehicle coordinates, vehicle speed, and vehicle heading angle; the target state information comprises target coordinates, target speed and target course angle; the screening subunit comprises:
the third determining subunit is used for respectively determining a coordinate difference value and an actual measurement distance between each candidate target and the vehicle according to the target coordinates and the vehicle coordinates;
a fourth determination subunit, configured to determine a speed difference between each candidate target and the vehicle according to the target speed and the vehicle speed;
a fifth determination subunit, configured to determine a heading angle difference value between each candidate target and the vehicle according to each target heading angle and the vehicle heading angle;
the second acquisition subunit is used for acquiring candidate targets of which the coordinate difference value, the measured distance, the speed difference value and the course angle difference value all meet preset conditions in each candidate target.
In one exemplary embodiment, the first determining module 1702 includes:
the second determining unit is used for determining a distance difference value, a speed difference value and a course angle difference value between the target and the vehicle according to the target state information, the vehicle state information and the acquisition difference duration of the target aiming at any one target;
And a third determining unit for determining an allowable maximum distance between the target and the vehicle according to the distance difference value, the speed difference value and the heading angle difference value.
In one exemplary embodiment, the target state information includes a target speed; the vehicle state information includes a vehicle speed; the second determining unit is specifically configured to determine a historical average target speed of the target according to the historical target speed and the target speed of the target, and determine a distance difference value according to the historical average speed and the acquired difference duration; the method comprises the steps of determining a historical average vehicle speed of a vehicle according to the historical vehicle speed and the vehicle speed of the vehicle, and determining a speed difference value according to the historical average target speed, the historical average vehicle speed and the acquired difference duration.
In one embodiment, the target state information includes a target length and a target heading angle; the vehicle state information includes a vehicle heading angle; the second determining unit is specifically further configured to determine a historical average target course angle of the target according to the historical target course angle and the target course angle of the target; determining a historical average vehicle course angle of the vehicle according to the historical vehicle course angle and the vehicle course angle of the vehicle; and determining a course angle difference value according to the historical average target course angle, the historical average vehicle course angle and the target length.
In one embodiment, the third determining unit includes:
the third acquisition subunit is used for acquiring the distance weight, the speed weight and the course angle weight between the target and the vehicle;
and a sixth determining subunit, configured to determine the allowable maximum distance according to the distance difference value, the speed difference value, the heading angle difference value, the distance weight, the speed weight and the heading angle weight.
In an exemplary embodiment, the third obtaining subunit is specifically configured to determine a location function relationship according to the distance difference value and the distance weight parameter, the speed difference value and the speed weight parameter, the heading angle difference value and the heading angle weight parameter, and a preset location difference value between the target and the vehicle; determining a weight function relation according to the distance weight parameter, the speed weight parameter and the course angle weight parameter; and solving the position function relation according to the weight function relation to obtain a distance weight, a speed weight and a course angle weight.
In one exemplary embodiment, the position differences include a first position difference and a second position difference; the third acquisition subunit is specifically further configured to detect whether the vehicle and the target are in the same lane; under the condition that the vehicle and the target are in the same lane, solving the position function relation according to the first position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight; and under the condition that the vehicle and the target are in different lanes, solving the position function relation according to the second position difference value and the weight function relation to obtain a distance weight, a speed weight and a course angle weight.
In one exemplary embodiment, the vehicle state information includes vehicle coordinates; the target state information includes target coordinates; the apparatus 1700 further comprises:
the extraction module is used for acquiring a local high-precision map of the position of the vehicle according to the vehicle coordinates of the vehicle;
the calculation module is used for acquiring lane information of the vehicle and the target according to the local high-precision map;
the judging module is used for determining that the vehicle and the target are in the same lane under the condition that lane information of the vehicle and the target are consistent; in the case where the lane information of the vehicle and the target are inconsistent, it is determined that the vehicle and the target are in different lanes.
In one exemplary embodiment, the vehicle state information includes vehicle coordinates and the target state information includes target coordinates; the second determination module 1703 includes:
a fifth determining unit for determining the measured distance between each object and the vehicle according to the object coordinates and the vehicle coordinates of each object;
the second acquisition unit is used for acquiring a reference target with the measured distance smaller than the corresponding allowable maximum distance from each target;
a sixth determining unit configured to determine the reference target as a matching target of the vehicle in a case where the reference target is one; in the case where there are a plurality of reference targets, the reference target having the smallest distance is determined as the matching target of the vehicle.
The respective modules in the above-described object matching apparatus may be implemented in whole or in part by software, hardware, and combinations thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one exemplary embodiment, a computer device is provided, which may be a server, and the internal structure thereof may be as shown in fig. 18. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is for storing target match data. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a target matching method.
It will be appreciated by those skilled in the art that the structure shown in fig. 18 is merely a block diagram of a portion of the structure associated with the present application and is not limiting of the computer device to which the present application is applied, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In an exemplary embodiment, a computer device is also provided, comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
The implementation principle and technical effect of each step implemented by the processor in this embodiment are similar to those of the above-mentioned target matching method, and are not described herein again.
In one exemplary embodiment, a computer-readable storage medium is provided, on which a computer program is stored which, when being executed by a processor, carries out the steps of the method embodiments described above.
The steps of the computer program implemented when executed by the processor in this embodiment realize the principle and technical effects similar to those of the above-mentioned target matching method, and are not described herein again.
In an exemplary embodiment, a computer program product is provided, comprising a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
The steps of the computer program implemented when executed by the processor in this embodiment realize the principle and technical effects similar to those of the above-mentioned target matching method, and are not described herein again.
It should be noted that, the data (including, but not limited to, data for analysis, stored data, presented data, etc.) referred to in the present application are all information and data authorized by the user or sufficiently authorized by each party, and the collection, use, and processing of the relevant data are required to meet the relevant regulations.
Those skilled in the art will appreciate that implementing all or part of the above described methods may be accomplished by way of a computer program stored on a non-transitory computer readable storage medium, which when executed, may comprise the steps of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the various embodiments provided herein may include at least one of non-volatile and volatile memory. The nonvolatile Memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash Memory, optical Memory, high density embedded nonvolatile Memory, resistive random access Memory (ReRAM), magnetic random access Memory (Magnetoresistive Random Access Memory, MRAM), ferroelectric Memory (Ferroelectric Random Access Memory, FRAM), phase change Memory (Phase Change Memory, PCM), graphene Memory, and the like. Volatile memory can include random access memory (Random Access Memory, RAM) or external cache memory, and the like. By way of illustration, and not limitation, RAM can be in the form of a variety of forms, such as static random access memory (Static Random Access Memory, SRAM) or dynamic random access memory (Dynamic Random Access Memory, DRAM), and the like. The databases referred to in the various embodiments provided herein may include at least one of relational databases and non-relational databases. The non-relational database may include, but is not limited to, a blockchain-based distributed database, and the like. The processors referred to in the embodiments provided herein may be general purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, quantum computing-based data processing logic units, etc., without being limited thereto.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples only represent a few embodiments of the present application, which are described in more detail and are not to be construed as limiting the scope of the present application. It should be noted that it would be apparent to those skilled in the art that various modifications and improvements could be made without departing from the spirit of the present application, which would be within the scope of the present application. Accordingly, the scope of protection of the present application shall be subject to the appended claims.

Claims (15)

1. A method of object matching, the method comprising:
acquiring vehicle state information of a vehicle at the current moment and target state information of at least one target at a plurality of moments acquired by road side equipment;
determining an allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and a preset acquisition difference time length between the vehicle and the road side equipment;
And determining a matching target of the vehicle from the targets according to the measured distance and the allowable maximum distance between the targets and the vehicle.
2. The method according to claim 1, wherein the obtaining the target state information of at least one target at a plurality of moments collected by the roadside device includes:
acquiring a plurality of candidate moments corresponding to the current moment according to a preset time interval;
screening at least one candidate target meeting preset conditions from at least one candidate target at the plurality of candidate moments acquired by the road side equipment according to the vehicle state information;
and determining the target state information of at least one candidate target meeting the preset condition as the target state information of at least one target at a plurality of moments acquired by the road side equipment.
3. The method of claim 2, wherein the time intervals comprise a maximum time interval and a minimum time interval; the obtaining a plurality of candidate moments corresponding to the current moment according to a preset time interval includes:
and determining a plurality of candidate moments corresponding to the current moment according to the maximum time interval and the minimum time interval.
4. The method according to claim 2, wherein the screening at least one candidate object satisfying a preset condition from at least one candidate object at the plurality of candidate moments acquired by the roadside apparatus according to the vehicle state information comprises:
spatially aligning the vehicle state information with target state information of each candidate target;
and screening candidate targets meeting preset conditions from the candidate targets according to the aligned vehicle state information and the target state information.
5. The method of claim 4, wherein the aligned vehicle state information includes vehicle coordinates, vehicle speed, and vehicle heading angle; the target state information comprises target coordinates, target speed and target course angle; the step of screening candidate targets meeting preset conditions from the candidate targets according to the aligned vehicle state information and the target state information, includes:
according to the target coordinates and the vehicle coordinates, respectively determining a coordinate difference value and a measured distance between each candidate target and the vehicle;
determining a speed difference between each candidate target and the vehicle according to each target speed and the vehicle speed;
Determining a heading angle difference value between each candidate target and the vehicle according to each target heading angle and the vehicle heading angle;
and obtaining candidate targets of which the coordinate difference value, the measured distance, the speed difference value and the course angle difference value all meet preset conditions.
6. The method according to any one of claims 1-5, wherein determining the allowable maximum distance between each of the targets and the vehicle according to the target state information of each of the targets, the vehicle state information, and a preset acquisition difference duration between the vehicle and the roadside apparatus comprises:
for any one target, determining a distance difference value, a speed difference value and a course angle difference value between the target and the vehicle according to target state information of the target, the vehicle state information and the acquisition difference duration;
and determining an allowable maximum distance between the target and the vehicle according to the distance difference value, the speed difference value and the course angle difference value.
7. The method of claim 6, wherein the target state information comprises a target speed; the vehicle state information includes a vehicle speed; determining a distance difference value and a speed difference value between the target and the vehicle according to the target state information of the target, the vehicle state information and the acquisition difference time length, wherein the method comprises the following steps:
Determining a historical average target speed of the target according to the historical target speed and the target speed of the target;
determining the distance difference value according to the historical average speed and the acquisition difference duration;
determining a historical average vehicle speed of the vehicle based on the historical vehicle speed of the vehicle and the vehicle speed;
and determining the speed difference value according to the historical average target speed, the historical average vehicle speed and the acquisition difference duration.
8. The method of claim 6, wherein the target state information includes a target length and a target heading angle; the vehicle state information includes a vehicle heading angle; determining a heading angle difference value between the target and the vehicle according to the target state information of the target, the vehicle state information and the acquisition difference duration, wherein the method comprises the following steps:
according to the historical target course angle and the target course angle of the target, determining the historical average target course angle of the target;
determining a historical average vehicle course angle of the vehicle according to the historical vehicle course angle and the vehicle course angle of the vehicle;
and determining the course angle difference value according to the historical average target course angle, the historical average vehicle course angle and the target length.
9. The method of claim 6, wherein the determining an allowable maximum distance between the target and the vehicle based on the distance difference value, the speed difference value, and the heading angle difference value comprises:
acquiring a distance weight, a speed weight and a course angle weight between the target and the vehicle;
and determining an allowable maximum distance according to the distance difference value, the speed difference value, the course angle difference value, the distance weight, the speed weight and the course angle weight.
10. The method of claim 9, wherein the obtaining distance weights, speed weights, and heading angle weights between the target and the vehicle comprises:
determining a position function relation according to the distance difference value and the distance weight parameter, the speed difference value and the speed weight parameter, the course angle difference value and the course angle weight parameter and a position difference value between a preset target and a vehicle;
determining a weight function relation according to the distance weight parameter, the speed weight parameter and the course angle weight parameter;
and solving the position function relation according to the weight function relation to obtain the distance weight, the speed weight and the course angle weight.
11. The method of claim 10, wherein the position difference comprises a first position difference and a second position difference; solving the position function relation according to the weight function relation to obtain the distance weight, the speed weight and the course angle weight, wherein the method comprises the following steps:
detecting whether the vehicle and the target are in the same lane;
under the condition that the vehicle and the target are in the same lane, solving the position function relation according to the first position difference value and the weight function relation to obtain the distance weight, the speed weight and the heading angle weight;
and under the condition that the vehicle and the target are in different lanes, solving the position function relation according to the second position difference value and the weight function relation to obtain the distance weight, the speed weight and the heading angle weight.
12. The method of claim 11, wherein the vehicle state information comprises vehicle coordinates and the target state information comprises target coordinates; and wherein the method further comprises:
acquiring a local high-precision map of the position of the vehicle according to the vehicle coordinates of the vehicle;
acquiring lane information of the vehicle and the target according to the local high-precision map;
determining that the vehicle and the target are in the same lane under the condition that the lane information of the vehicle and the lane information of the target are consistent;
and determining that the vehicle and the target are in different lanes under the condition that the lane information of the vehicle and the lane information of the target are inconsistent.
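A sketch of the lane decision of claims 11 and 12. The `lane_lookup` callable stands in for a query against the local high-precision map and is purely hypothetical; no real map API is implied.

```python
def in_same_lane(vehicle_xy, target_xy, lane_lookup):
    """Hypothetical sketch of claim 12: compare lane information obtained
    from a local high-precision map lookup for two coordinates."""
    vehicle_lane = lane_lookup(vehicle_xy)   # e.g. a lane identifier string
    target_lane = lane_lookup(target_xy)
    # Consistent lane information -> same lane; otherwise different lanes.
    return vehicle_lane is not None and vehicle_lane == target_lane
```

The result then selects between the first and second preset position difference values of claim 11 before the weights are solved.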
13. The method of any one of claims 1-5, wherein the vehicle state information comprises vehicle coordinates and the target state information comprises target coordinates; and wherein the determining a matching target of the vehicle from the targets according to the measured distance and the allowable maximum distance between each target and the vehicle comprises:
determining the measured distance between each target and the vehicle according to the target coordinates of each target and the vehicle coordinates;
acquiring, from the targets, a reference target whose measured distance is smaller than the corresponding allowable maximum distance;
in the case that there is one reference target, determining the reference target as the matching target of the vehicle;
and in the case that there are a plurality of reference targets, determining the reference target with the smallest measured distance as the matching target of the vehicle.
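A sketch of the matching step of claim 13, assuming each candidate target is represented as a dict with illustrative `'xy'` (target coordinates) and `'max_dist'` (allowable maximum distance) fields; the field names are not from the patent.

```python
import math

def match_target(vehicle_xy, targets):
    """Hypothetical sketch of claim 13: keep targets whose measured distance
    is below their allowable maximum distance, then pick the closest one."""
    candidates = []
    for t in targets:
        # Measured distance between target coordinates and vehicle coordinates.
        measured = math.dist(vehicle_xy, t['xy'])
        if measured < t['max_dist']:
            candidates.append((measured, t))
    if not candidates:
        return None                      # no reference target at all
    # One candidate -> it is the match; several -> take the closest one.
    return min(candidates, key=lambda c: c[0])[1]
```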
14. A target matching device, the device comprising:
an acquisition module, configured to acquire vehicle state information of a vehicle at the current moment and target state information of at least one target at a plurality of moments acquired by road side equipment;
a first determining module, configured to determine an allowable maximum distance between each target and the vehicle according to the target state information of each target, the vehicle state information and an acquisition difference duration preset between the vehicle and the road side equipment;
and a second determining module, configured to determine a matching target of the vehicle from the targets according to the measured distance and the allowable maximum distance between each target and the vehicle.
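A structural sketch of the device of claim 14, wiring the three modules together as plain callables; the interfaces are assumptions made for illustration only.

```python
class TargetMatcher:
    """Hypothetical sketch of claim 14: an acquisition step, a step that
    derives per-target allowable maximum distances, and a step that selects
    the matching target."""

    def __init__(self, acquire_fn, max_distance_fn, match_fn):
        self.acquire = acquire_fn             # acquisition module
        self.max_distance = max_distance_fn   # first determining module
        self.match = match_fn                 # second determining module

    def run(self):
        vehicle_state, targets = self.acquire()
        bounds = [self.max_distance(vehicle_state, t) for t in targets]
        return self.match(vehicle_state, targets, bounds)
```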
15. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any one of claims 1 to 13.
CN202410097585.7A 2024-01-24 2024-01-24 Target matching method, device and computer equipment Active CN117649777B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410097585.7A CN117649777B (en) 2024-01-24 2024-01-24 Target matching method, device and computer equipment

Publications (2)

Publication Number Publication Date
CN117649777A (en) 2024-03-05
CN117649777B (en) 2024-04-19

Family

ID=90045354

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410097585.7A Active CN117649777B (en) 2024-01-24 2024-01-24 Target matching method, device and computer equipment

Country Status (1)

Country Link
CN (1) CN117649777B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110596694A (en) * 2019-09-20 2019-12-20 吉林大学 Complex environment radar multi-target tracking and road running environment prediction method
CN111770451A (en) * 2020-05-26 2020-10-13 同济大学 Road vehicle positioning and sensing method and device based on vehicle-road cooperation
CN112767475A (en) * 2020-12-30 2021-05-07 重庆邮电大学 Intelligent roadside sensing system based on C-V2X, radar and vision
CN113379805A (en) * 2021-08-12 2021-09-10 深圳市城市交通规划设计研究中心股份有限公司 Multi-information resource fusion processing method for traffic nodes
CN115665844A (en) * 2022-10-18 2023-01-31 中信科智联科技有限公司 Synchronization method, synchronization device, mobile equipment and road side equipment

Also Published As

Publication number Publication date
CN117649777B (en) 2024-04-19

Similar Documents

Publication Publication Date Title
KR20200121274A (en) Method, apparatus, and computer readable storage medium for updating electronic map
Bhardwaj et al. Autocalib: Automatic traffic camera calibration at scale
US8359156B2 (en) Map generation system and map generation method by using GPS tracks
CN110287276A (en) High-precision map updating method, device and storage medium
CN105206057B (en) Detection method and system based on Floating Car resident trip hot spot region
EP3769507A1 (en) Traffic boundary mapping
CN112287566B (en) Automatic driving scene library generation method and system and electronic equipment
Li et al. Knowledge-based trajectory completion from sparse GPS samples
Villanueva et al. Crowdsensing smart city parking monitoring
CN105628951A (en) Method and device for measuring object speed
CN114079665B (en) Data acquisition method, device, equipment and storage medium
CN102831766A (en) Multi-source traffic data fusion method based on multiple sensors
Wang et al. Realtime wide-area vehicle trajectory tracking using millimeter-wave radar sensors and the open TJRD TS dataset
CN104569911A (en) OBU positioning method, RSU and ETC system
CN112258850A (en) Edge side multi-sensor data fusion system of vehicle-road cooperative system
Chen et al. Enabling smart urban services with gps trajectory data
You et al. Unsupervised adaptation from repeated traversals for autonomous driving
Chen et al. A cooperative perception environment for traffic operations and control
Cao et al. An analytical model for quantifying the efficiency of traffic-data collection using instrumented vehicles
CN113012215A (en) Method, system and equipment for space positioning
Wu et al. OCR-RTPS: an OCR-based real-time positioning system for the valet parking
CN117649777B (en) Target matching method, device and computer equipment
Liu et al. An online intelligent method to calibrate radar and camera sensors for data fusing
CN116229708A (en) Perception test method of road side perception system on traffic target based on V2I
CN115188195A (en) Method and system for extracting vehicle track of urban omnidirectional intersection in real time

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant