CN114234996A - Multi-intersection multi-sensor-based track fusion method and system - Google Patents


Info

Publication number: CN114234996A (application CN202111569137.5A)
Authority: CN (China)
Prior art keywords: intersection, track, information, distance information, opposite
Legal status: Granted; Active
Other languages: Chinese (zh)
Other versions: CN114234996B (granted publication)
Inventors: 闫军, 陈芸, 王伟
Current and original assignee: Super Vision Technology Co Ltd
Filing: application CN202111569137.5A filed by Super Vision Technology Co Ltd; published as CN114234996A, granted as CN114234996B

Classifications

    • G01C21/3415 — Dynamic re-routing, e.g. recalculating the route when the user deviates from the calculated route or after detecting real-time traffic data or accidents (G PHYSICS › G01 MEASURING; TESTING › G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY › G01C21/00 Navigation › G01C21/26 specially adapted for navigation in a road network › G01C21/34 Route searching; Route guidance › G01C21/3407 specially adapted for specific applications)
    • G01C21/3446 — Details of route searching algorithms, e.g. Dijkstra, A*, arc-flags, using precalculated routes (same hierarchy, under G01C21/34)
    • G08G1/01 — Detecting movement of traffic to be counted or controlled (G PHYSICS › G08 SIGNALLING › G08G TRAFFIC CONTROL SYSTEMS › G08G1/00 Traffic control systems for road vehicles)

Abstract

The invention discloses a multi-intersection multi-sensor-based track fusion method and system, relating to the field of intelligent vehicle management at intersections. Whether an opposite intersection has been successfully fused is judged according to the fusion judgment flag bit of the opposite intersection, and/or the sequence number of the fused total track information corresponding to a first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information; different track information generation strategies are then adopted for the different fusion outcomes, so that fusion of multi-intersection data can be achieved without a complex algorithm.

Description

Multi-intersection multi-sensor-based track fusion method and system
Technical Field
The invention relates to the field of intelligent vehicle management at intersections, in particular to a track fusion method and system based on multiple intersections and multiple sensors.
Background
With the growing number of urban automobiles, road conditions have become more complex, especially in intersection areas where vehicles, non-motor vehicles and pedestrians gather. Vehicle targets at intersections are therefore usually tracked and detected with a combination of radar and camera. To better combine the target point data collected by the radar with the target point data collected by the camera, the data of the same target tracked at each intersection are generally fused.
At present, multi-intersection data fusion is usually implemented with a BP neural network algorithm. However, this algorithm is computationally heavy and has poor real-time performance, and it also places high demands on the hardware; existing multi-intersection data fusion is therefore difficult to implement and its hardware cost is high.
Disclosure of Invention
In order to solve the above technical problems, the invention provides a multi-intersection multi-sensor-based track fusion method and system, which can solve the problems of high implementation difficulty and high hardware cost of conventional multi-intersection data fusion.
In order to achieve the above object, in one aspect, the present invention provides a multi-intersection multi-sensor based track fusion method, including:
performing coordinate conversion on track distance information of targets respectively corresponding to at least two intersections to obtain track distance information after coordinate conversion;
acquiring a fusion judgment flag bit according to the track distance information, speed information and category information after coordinate conversion of a first intersection and the track distance information, speed information and category information after coordinate conversion of the opposite intersection corresponding to the first intersection, wherein the first intersection is any one of the at least two intersections;
acquiring the sequence number of the fused total track information corresponding to the first intersection according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the first intersection;
acquiring a difference judgment flag bit for the track distance information of adjacent frames according to the track distance information after coordinate conversion of the first intersection, the track distance information after coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number at the at least two intersections in the previous frame;
judging whether fusion with the opposite intersection is successful according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information;
if successful, taking the track distance information, speed information, label and category of the first intersection as the track information after fusion of the opposite intersections;
and if unsuccessful, taking the track distance information, speed information, label and category of the first intersection together with the track distance information, speed information, label and category of the opposite intersection as the track information after fusion of the opposite intersections.
Further, the step of acquiring the fusion judgment flag bit according to the track distance information, speed information and category information after coordinate conversion of the first intersection and those of the opposite intersection corresponding to the first intersection includes:
judging whether the converted track distance information meets a preset condition according to the formulas abs(Xi-Xj) <= gatex and abs(Yi-Yj) <= gatey, wherein Xi and Yi are the distance information of the first intersection, Xj and Yj are the distance information of the opposite intersection corresponding to the first intersection, abs denotes the absolute value, gatex denotes the error threshold in the x direction, and gatey denotes the error threshold in the y direction;
judging whether the speed information meets a preset condition according to the formulas Vi·Vj < 0 and abs(Vi+Vj) <= gatev, wherein Vi is the speed information of the first intersection, Vj is the speed information of the opposite intersection corresponding to the first intersection, and gatev denotes the error threshold of the speed;
judging whether the category information of the first intersection is the same as that of the opposite intersection;
and if the track distance information, the speed information and the category information all meet the preset conditions, determining that the fusion judgment flag bit is true.
Further, the step of acquiring the sequence number of the fused total track information corresponding to the first intersection according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the first intersection includes:
judging whether the label of the first intersection's track information exists in the track information respectively corresponding to the at least two intersections in the previous frame;
if so, taking the label of the first intersection's track information as the sequence number index of the fused total track information;
if not, setting the sequence number index to 0.
Further, the step of acquiring the difference judgment flag bit for the track distance information of adjacent frames according to the track distance information after coordinate conversion of the first intersection, the track distance information after coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number at the at least two intersections in the previous frame includes:
acquiring the differences between the track distance information of the first intersection and of the opposite intersection, respectively, and the track distance information corresponding to the sequence number in the previous frame;
and if these differences meet the preset difference condition, determining that the difference judgment flag bit is true; otherwise, determining that it is false.
Further, the step of judging whether fusion with the opposite intersection is successful according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information includes:
if the sequence number of the fused total track information corresponding to the first intersection is 0 and the fusion judgment flag bit is true, confirming that the fusion is successful;
and if the sequence number of the fused total track information corresponding to the first intersection is not 0, the fusion judgment flag bit is true, and the difference judgment flag bit is true, confirming that the fusion is successful.
Further, if there are multiple pairs of opposite intersections, the method further comprises:
acquiring fusion judgment flag bits according to the track distance information and category information respectively corresponding to the fused opposite intersections;
acquiring the sequence number of the fused total track information corresponding to the fused opposite intersections according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the fused opposite intersections;
acquiring a difference judgment flag bit for the track distance information of adjacent frames according to the track distance information after coordinate conversion of each fused opposite intersection and the track distance information corresponding to the sequence number at the at least two intersections in the previous frame;
judging whether the fusion of the fused opposite intersections is successful according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the fused opposite intersections, and/or the difference judgment flag bit of the adjacent-frame track distance information;
if successful, taking the fused track distance information, speed information, label and category of the first pair of opposite intersections as the track information after fusion of all the opposite intersections;
and if unsuccessful, taking the track distance information, speed information, label and category respectively corresponding to each fused pair of opposite intersections as the track information after fusion of each pair of opposite intersections.
Further, the method further comprises:
judging whether the label in the fused track information of the current frame exists in the previous frame;
if so, storing the fused track information of the current frame at the storage position corresponding to that label;
if not, adding a new track label and storing the fused track information.
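As an illustration only (the patent does not prescribe a data structure), keeping the fused tracks in a dictionary keyed by track label implements both branches of this update in one operation:

```python
def update_track_store(track_store, fused_track):
    """Store the current frame's fused track information.

    If the track's label already exists from the previous frame, the
    assignment overwrites the storage position for that label; if not,
    the assignment adds a new track label. A dict gives both behaviors.
    """
    track_store[fused_track["label"]] = fused_track
    return track_store
```

Here `track_store` and the `"label"` key are illustrative names for the storage positions and label information described above.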
In another aspect, the present invention provides a multi-intersection multi-sensor based track fusion system, including: the conversion unit is used for carrying out coordinate conversion on the track distance information of the targets respectively corresponding to the at least two intersections to obtain the track distance information after the coordinate conversion;
an obtaining unit, configured to obtain a fusion judgment flag according to track distance information, speed information, category information after coordinate conversion of a first intersection, and track distance information, speed information, and category information after coordinate conversion of an opposite intersection corresponding to the first intersection, where the first intersection is any one of the at least two intersections;
the acquiring unit is further configured to acquire the sequence number of the fused total track information corresponding to the first intersection according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the first intersection; and to acquire a difference judgment flag bit for the track distance information of adjacent frames according to the track distance information after coordinate conversion of the first intersection, the track distance information after coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number at the at least two intersections in the previous frame;
the judging unit is configured to judge whether fusion with the opposite intersection is successful according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information;
the fusion unit is configured to, if the fusion is successful, take the track distance information, speed information, label and category of the first intersection as the track information after fusion of the opposite intersections; and, if unsuccessful, to take the track distance information, speed information, label and category of the first intersection together with those of the opposite intersection as the track information after fusion of the opposite intersections.
Further, the acquiring unit is specifically configured to judge whether the converted track distance information meets a preset condition according to the formulas abs(Xi-Xj) <= gatex and abs(Yi-Yj) <= gatey, where Xi and Yi are the distance information of the first intersection, Xj and Yj are the distance information of the opposite intersection corresponding to the first intersection, abs denotes the absolute value, gatex denotes the error threshold in the x direction, and gatey denotes the error threshold in the y direction; judge whether the speed information meets a preset condition according to the formulas Vi·Vj < 0 and abs(Vi+Vj) <= gatev, where Vi is the speed information of the first intersection, Vj is the speed information of the opposite intersection corresponding to the first intersection, and gatev denotes the error threshold of the speed; judge whether the category information of the first intersection is the same as that of the opposite intersection; and, if the track distance information, the speed information and the category information all meet the preset conditions, determine that the fusion judgment flag bit is true.
Further, the acquiring unit is specifically configured to judge whether the label of the first intersection's track information exists in the track information respectively corresponding to the at least two intersections in the previous frame; if so, take the label of the first intersection's track information as the sequence number index of the fused total track information; if not, set the sequence number index to 0.
Further, the acquiring unit is specifically configured to acquire, according to the track distance information after coordinate conversion of the first intersection, the track distance information after coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number at the at least two intersections in the previous frame, the differences between the track distance information of the first intersection and of the opposite intersection, respectively, and the track distance information corresponding to the sequence number in the previous frame; and, if these differences meet the preset difference condition, determine that the difference judgment flag bit is true; otherwise, determine that it is false.
Further, the determining unit is specifically configured to confirm that the fusion is successful if the sequence number of the fused total track information corresponding to the first intersection is 0 and the fusion judgment flag bit is true; and to confirm that the fusion is successful if the sequence number of the fused total track information corresponding to the first intersection is not 0, the fusion judgment flag bit is true, and the difference judgment flag bit is true.
Further, the fusion unit is further configured to acquire fusion judgment flag bits according to the track distance information and category information respectively corresponding to the fused opposite intersections; acquire the sequence number of the fused total track information corresponding to the fused opposite intersections according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the fused opposite intersections; acquire a difference judgment flag bit for the track distance information of adjacent frames according to the track distance information after coordinate conversion of each fused opposite intersection and the track distance information corresponding to the sequence number at the at least two intersections in the previous frame; judge whether the fusion of the fused opposite intersections is successful according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the fused opposite intersections, and/or the difference judgment flag bit of the adjacent-frame track distance information; if successful, take the fused track distance information, speed information, label and category of the first pair of opposite intersections as the track information after fusion of all the opposite intersections; and, if unsuccessful, take the track distance information, speed information, label and category respectively corresponding to each fused pair of opposite intersections as the track information after fusion of each pair of opposite intersections.
Further, the system further comprises an updating unit;
the updating unit is configured to judge whether the label in the fused track information of the current frame exists in the previous frame; if so, store the fused track information of the current frame at the storage position corresponding to that label; and, if not, add a new track label and store the fused track information.
The invention provides a multi-intersection multi-sensor-based track fusion method and system, which judge whether the opposite intersections are successfully fused according to the fusion judgment flag bits of the opposite intersections, and/or the sequence number of the fused total track information corresponding to a first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information, and then adopt different track information generation strategies for the different fusion outcomes, so that fusion of multi-intersection data can be achieved without a complex algorithm.
Drawings
FIG. 1 is a flow chart of a multi-intersection multi-sensor based track fusion method provided by the invention;
FIG. 2 is a first schematic structural diagram of a multi-intersection multi-sensor-based track fusion system provided by the invention;
FIG. 3 is a second schematic structural diagram of a multi-intersection multi-sensor-based track fusion system provided by the invention;
fig. 4 is a schematic diagram of a total coordinate system of an intersection provided by the present invention.
Detailed Description
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
As shown in fig. 1, a multi-intersection multi-sensor-based track fusion method provided by an embodiment of the present invention includes the following steps:
101. Perform coordinate conversion on the track distance information of the targets respectively corresponding to at least two intersections to obtain the track distance information after coordinate conversion.
For the embodiment of the present invention, step 101 may specifically include the following. Taking a crossroad as an example: read the track distance information of the 4 intersections for the current frame, perform coordinate conversion on the 4 groups of data, and convert the distance information of the 4 groups of intersection data from the coordinate system of each single intersection to the total coordinate system of the crossroad, as shown in fig. 4; new track distance information after coordinate conversion is obtained, and the other information in the track information is kept unchanged.
Let the distance information of intersection i (i = 1, 2, 3, 4) be xi and yi, and the new distance information after coordinate conversion be Xi and Yi. Let Oi be the origin of the radar-vision coordinate system of intersection No. i, let O be the origin of the total intersection coordinate system, and let deltaxi and deltayi be the projections of the distance between Oi and O on the abscissa and ordinate directions of intersection No. i, respectively. The converted distance information is then: intersection 1: X1 = x1 - deltax1; Y1 = y1 - deltay1; intersection 2: X2 = y2 - deltay2; Y2 = deltax2 - x2; intersection 3: X3 = deltax3 - x3; Y3 = deltay3 - y3; intersection 4: X4 = deltay4 - y4; Y4 = x4 - deltax4.
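The four conversions of this embodiment can be sketched as follows (the function name and argument layout are illustrative, not part of the patent; deltax and deltay are the per-intersection offsets defined above):

```python
def to_total_coords(intersection_id, x, y, deltax, deltay):
    """Convert a target's distance information (x, y), measured in the
    radar-vision coordinate system of one intersection, into the total
    crossroad coordinate system, following the four formulas above.
    deltax / deltay are the projections of the distance between that
    intersection's origin Oi and the total origin O."""
    if intersection_id == 1:
        return (x - deltax, y - deltay)
    if intersection_id == 2:
        return (y - deltay, deltax - x)
    if intersection_id == 3:
        return (deltax - x, deltay - y)
    if intersection_id == 4:
        return (deltay - y, x - deltax)
    raise ValueError("the crossroad example has intersections 1-4 only")
```

For example, a target at (10.0, 5.0) in intersection 1's own coordinates, with offsets deltax1 = 3.0 and deltay1 = 2.0, maps to (7.0, 3.0) in the total coordinate system.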
102. Acquire a fusion judgment flag bit according to the track distance information, speed information and category information after coordinate conversion of the first intersection and the track distance information, speed information and category information after coordinate conversion of the opposite intersection corresponding to the first intersection, wherein the first intersection is any one of the at least two intersections.
For the embodiment of the present invention, step 102 may specifically include: judging whether the converted track distance information meets a preset condition according to the formulas abs(Xi-Xj) <= gatex and abs(Yi-Yj) <= gatey, wherein Xi and Yi are the distance information of the first intersection, Xj and Yj are the distance information of the opposite intersection corresponding to the first intersection, abs denotes the absolute value, gatex denotes the error threshold in the x direction, and gatey denotes the error threshold in the y direction; judging whether the speed information meets a preset condition according to the formulas Vi·Vj < 0 and abs(Vi+Vj) <= gatev, wherein Vi is the speed information of the first intersection, Vj is the speed information of the opposite intersection corresponding to the first intersection, and gatev denotes the error threshold of the speed; judging whether the category information of the first intersection is the same as that of the opposite intersection; and if the track distance information, the speed information and the category information all meet the preset conditions, determining that the fusion judgment flag bit is true.
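A minimal sketch of this flag computation (the function and its argument names are illustrative; the speed condition Vi·Vj < 0 is a reconstruction from the garbled source text, on the assumption that the same target observed from opposite intersections has speeds of opposite sign that nearly cancel):

```python
def fusion_flag(xi, yi, vi, cat_i, xj, yj, vj, cat_j,
                gatex, gatey, gatev):
    """True when the converted distances, the speeds and the categories
    of the first intersection (i) and its opposite intersection (j)
    all satisfy the preset conditions of step 102."""
    dist_ok = abs(xi - xj) <= gatex and abs(yi - yj) <= gatey
    # Opposite-direction observations of one target: speeds have
    # opposite signs and their sum stays within the speed threshold.
    speed_ok = vi * vj < 0 and abs(vi + vj) <= gatev
    cat_ok = cat_i == cat_j
    return dist_ok and speed_ok and cat_ok
```

The thresholds gatex, gatey and gatev are the error thresholds named in the text.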
103. Acquire the sequence number of the fused total track information corresponding to the first intersection according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the first intersection.
For the embodiment of the present invention, step 103 may specifically include: judging whether the label of the first intersection's track information exists in the track information respectively corresponding to the at least two intersections in the previous frame; if so, taking the label of the first intersection's track information as the sequence number index of the fused total track information; if not, setting the sequence number index to 0.
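Assuming the previous frame's labels are kept in a set-like collection (a data layout the patent does not specify), this lookup reduces to:

```python
def fused_sequence_number(prev_frame_labels, first_label):
    """Step 103: if the first intersection's label already appears in
    the previous frame's track information, reuse it as the sequence
    number index of the fused total track information; otherwise 0."""
    return first_label if first_label in prev_frame_labels else 0
```

For instance, with previous-frame labels {3, 7, 9}, label 7 yields index 7 and label 5 yields index 0.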
104. Acquire a difference judgment flag bit for the track distance information of adjacent frames according to the track distance information after coordinate conversion of the first intersection, the track distance information after coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number at the at least two intersections in the previous frame.
For the embodiment of the present invention, step 104 may specifically include: acquiring the differences between the track distance information of the first intersection and of the opposite intersection, respectively, and the track distance information corresponding to the sequence number in the previous frame; and if these differences meet the preset difference condition, determining that the difference judgment flag bit is true; otherwise, determining that it is false.
Specifically, taking the crossroad shown in fig. 4 as an example, this can be implemented by reading the track distance information X1, Y1 of intersection 1 and the track distance information X3, Y3 of intersection 3, and reading the distance information x_before and y_before at the sequence number index of the total track information after the 4 intersections were fused in the previous frame. Condition judgment is then performed on the differences of the track distance information between the two frames:
DeltaX1 = X1 - x_before; DeltaY1 = Y1 - y_before; DeltaX3 = X3 - x_before;
DeltaY3 = Y3 - y_before. Condition 1: when none of the above 4 variables is 0, require DeltaX1*DeltaX3 > 0 and DeltaY1*DeltaY3 > 0; when both requirements are satisfied, condition 1 is judged true. Condition 2: when one of the above 4 variables is 0, require abs(DeltaX1-DeltaX3) <= gatex and abs(DeltaY1-DeltaY3) <= gatey, where abs denotes the absolute value, gatex denotes the error threshold in the x direction, and gatey denotes the error threshold in the y direction; when both requirements are satisfied, condition 2 is judged true. Further, if one of condition 1 and condition 2 is judged true, the difference condition on the track distance information of the two adjacent frames is judged true, and the judgment flag flag2_xy is output as 1; if it is judged false, flag2_xy is 0.
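The two conditions and the output flag flag2_xy can be sketched as follows (illustrative only; intersection 3 stands for the opposite intersection of intersection 1, as in the example above):

```python
def diff_flag(X1, Y1, X3, Y3, x_before, y_before, gatex, gatey):
    """flag2_xy of the adjacent-frame distance-difference check: compare
    the converted positions of intersection 1 and its opposite
    intersection 3 against the previous frame's fused position."""
    dX1, dY1 = X1 - x_before, Y1 - y_before
    dX3, dY3 = X3 - x_before, Y3 - y_before
    deltas = (dX1, dY1, dX3, dY3)
    # Condition 1: no delta is zero and both intersections moved the
    # same way relative to the previous fused position.
    cond1 = all(d != 0 for d in deltas) and dX1 * dX3 > 0 and dY1 * dY3 > 0
    # Condition 2: one delta is zero and the deltas nearly match.
    cond2 = (sum(d == 0 for d in deltas) == 1
             and abs(dX1 - dX3) <= gatex and abs(dY1 - dY3) <= gatey)
    return 1 if cond1 or cond2 else 0
```

gatex and gatey are the x- and y-direction error thresholds of the text.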
105. Judging whether the opposite intersections are successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information.
For the embodiment of the present invention, step 105 may specifically include: if the sequence number of the fused total track information corresponding to the first intersection is 0 and the fusion judgment flag bit is true, confirming that the fusion is successful; and if the sequence number of the fused total track information corresponding to the first intersection is not 0, the fusion judgment flag bit is true, and the difference judgment flag bit is true, confirming that the fusion is successful.
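The decision rule of step 105 can be sketched as follows; fusion_success, index, flag1 and flag2_xy are hypothetical names standing in for the sequence number, the fusion judgment flag bit and the difference judgment flag bit of the description.

```python
def fusion_success(index, flag1, flag2_xy):
    # index == 0 marks a track not seen in the previous frame:
    # the gating flag alone decides.
    if index == 0:
        return bool(flag1)
    # For an already-tracked target, both the gating flag and the
    # adjacent-frame difference flag must agree.
    return bool(flag1) and flag2_xy == 1
```

A new track fuses on gating alone, while an existing track additionally requires consistent adjacent-frame motion.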
106a. If the fusion is successful, taking the track distance information, the speed information, the label and the category of the first intersection as the track information after the opposite intersections are fused.
106b. If the fusion is unsuccessful, taking the track distance information, the speed information, the label and the category of the first intersection together with the track distance information, the speed information, the label and the category of the opposite intersection as the track information after the opposite intersections are fused.
Further, if there are a plurality of opposite-intersection pairs, the track information of the fused pairs can be further fused, and the method includes: acquiring the fusion judgment flag bit according to the track distance information and the category information respectively corresponding to the fused opposite intersections; acquiring the sequence number of the fused total track information corresponding to the fused opposite intersections according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the fused opposite intersections; acquiring the difference judgment flag bit of the adjacent-frame track distance information according to the track distance information after the coordinate conversion of each fused opposite intersection and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections; judging whether each fused opposite intersection is successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the fused opposite intersections, and/or the difference judgment flag bit of the adjacent-frame track distance information; if successful, taking the fused track distance information, speed information, label and category of the first opposite intersection as the track information after each opposite intersection is fused; and if unsuccessful, taking the track distance information, speed information, label and category respectively corresponding to each fused opposite intersection as the track information after each opposite intersection is fused.
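Under the assumption that the two-intersection procedure is available as a pairwise routine, cascading it over several opposite-intersection pairs might be sketched as below; fuse_all and fuse_pair are hypothetical names, not identifiers from the disclosure.

```python
def fuse_all(pairs, fuse_pair):
    """Cascade a pairwise fusion routine over several fused pairs.

    pairs: list of per-pair fused track lists; fuse_pair: the
    two-intersection fusion procedure described in steps 102-106.
    """
    fused = pairs[0]
    for nxt in pairs[1:]:
        # The result of fusing one pair plays the role of the
        # "first intersection" input for the next pair.
        fused = fuse_pair(fused, nxt)
    return fused
```

This is just a left fold: each already-fused result is re-entered into the same pairwise procedure.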
Further, the embodiment of the present invention may also manage the labels of the target points across different video frames in a unified way, specifically: judging whether the label in the fused track information of the current frame exists in the previous frame; if so, storing the fused track information of the current frame at the storage position corresponding to that label; if not, adding a new track label and storing the fused track information, so that the track label of each fused target point is unique and unrepeated.
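The unified label management can be sketched as follows; update_tracks and the dict-based store are illustrative assumptions about how the storage positions might be organized.

```python
def update_tracks(store, fused_frame):
    """store: dict mapping track label -> stored track history.

    fused_frame: iterable of (label, track_info) pairs for the
    current frame's fused track information.
    """
    for label, info in fused_frame:
        if label in store:
            # Label already known from an earlier frame: append to
            # the storage position corresponding to that label.
            store[label].append(info)
        else:
            # Unseen label: open a new, unique track slot.
            store[label] = [info]
    return store
```

Because each label keys exactly one slot, every fused target keeps a unique, unrepeated track label across frames.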
The invention provides a multi-intersection multi-sensor based track fusion method, which judges whether the opposite intersections are successfully fused according to the fusion judgment flag bit of the opposite intersections, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information, and then adopts different track information generation strategies for different fusion results.
In order to implement the method provided by the embodiment of the present invention, an embodiment of the present invention provides a multi-intersection multi-sensor based track fusion system, as shown in fig. 2, the system includes: a conversion unit 21, an acquisition unit 22, a judgment unit 23, and a fusion unit 24.
The conversion unit 21 is configured to perform coordinate conversion on the track distance information of the target corresponding to each of the at least two intersections to obtain the track distance information after the coordinate conversion.
The acquiring unit 22 is configured to acquire a fusion judgment flag according to the track distance information, the speed information, and the category information after the coordinate conversion of the first intersection, and the track distance information, the speed information, and the category information after the coordinate conversion of the opposite intersection corresponding to the first intersection, where the first intersection is any one of the at least two intersections.
The acquiring unit 22 is further configured to acquire the sequence number of the fused total track information corresponding to the first intersection according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the first intersection; and to acquire the difference judgment flag bit of the adjacent-frame track distance information according to the track distance information after the coordinate conversion of the first intersection, the track distance information after the coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections.
The judging unit 23 is configured to judge whether the opposite intersections are successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information.
The fusion unit 24 is configured to, if the fusion is successful, take the track distance information, the speed information, the label and the category of the first intersection as the track information after the opposite intersections are fused; and if the fusion is unsuccessful, take the track distance information, the speed information, the label and the category of the first intersection together with the track distance information, the speed information, the label and the category of the opposite intersection as the track information after the opposite intersections are fused.
Further, the acquiring unit 22 is specifically configured to judge whether the converted track distance information meets a preset condition according to the formulas abs(Xi - Xj) <= gatex and abs(Yi - Yj) <= gatey, where Xi and Yi are the distance information of the first intersection, Xj and Yj are the distance information of the opposite intersection corresponding to the first intersection, abs represents the absolute value, gatex represents the error threshold in the x direction, and gatey represents the error threshold in the y direction; judge whether the speed information meets a preset condition according to the formula abs(Vi + Vj) <= gatev, where Vi is the speed information of the first intersection, Vj is the speed information of the opposite intersection corresponding to the first intersection, and gatev represents the error threshold of the speed; judge whether the category information of the first intersection is the same as that of the opposite intersection; and if the track distance information, the speed information and the category information all meet the preset conditions, determine that the fusion judgment flag bit is true.
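A minimal sketch of the gating performed by the acquiring unit, assuming scalar inputs; the function name fusion_flag and the default values of gatex, gatey and gatev are illustrative assumptions.

```python
def fusion_flag(xi, yi, vi, ci, xj, yj, vj, cj,
                gatex=1.5, gatey=1.5, gatev=1.0):
    near_x = abs(xi - xj) <= gatex   # distance gate, x direction
    near_y = abs(yi - yj) <= gatey   # distance gate, y direction
    # Opposite intersections observe the same target from opposite
    # directions, so their reported speeds should roughly cancel.
    same_speed = abs(vi + vj) <= gatev
    same_class = ci == cj            # category must match
    return near_x and near_y and same_speed and same_class
```

The flag is true only when distance, speed and category gates all pass, mirroring the three-part preset condition above.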
Further, the acquiring unit 22 is specifically configured to judge whether the label of the first intersection track information exists in the track information respectively corresponding to the at least two intersections in the previous frame; if so, take the label of the first intersection track information as the sequence number index of the fused total track information; if not, set the sequence number index to 0.
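The sequence-number lookup can be sketched as follows, assuming the previous frame's fused tracks are stored keyed by label; track_index is a hypothetical name.

```python
def track_index(label, prev_frame_tracks):
    # A label already present in the previous frame's fused total track
    # information reuses that label as its index; otherwise index 0
    # marks the track as new to the fused result.
    return label if label in prev_frame_tracks else 0
```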
Further, the acquiring unit 22 is specifically configured to acquire, according to the track distance information after the coordinate conversion of the first intersection, the track distance information after the coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections, the differences between the track distance information of the first intersection and of the opposite intersection and the track distance information corresponding to that sequence number in the previous frame; and if those differences meet the preset difference condition, confirm that the difference judgment flag bit is true; otherwise, confirm that it is false.
Further, the judging unit 23 is specifically configured to confirm that the fusion is successful if the sequence number of the fused total track information corresponding to the first intersection is 0 and the fusion judgment flag bit is true; and to confirm that the fusion is successful if the sequence number of the fused total track information corresponding to the first intersection is not 0, the fusion judgment flag bit is true, and the difference judgment flag bit is true.
Further, the fusion unit 24 is further configured to acquire the fusion judgment flag bit according to the track distance information and the category information respectively corresponding to the fused opposite intersections; acquire the sequence number of the fused total track information corresponding to the fused opposite intersections according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the fused opposite intersections; acquire the difference judgment flag bit of the adjacent-frame track distance information according to the track distance information after the coordinate conversion of each fused opposite intersection and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections; judge whether each fused opposite intersection is successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the fused opposite intersections, and/or the difference judgment flag bit of the adjacent-frame track distance information; if successful, take the fused track distance information, speed information, label and category of the first opposite intersection as the track information after each opposite intersection is fused; and if unsuccessful, take the track distance information, speed information, label and category respectively corresponding to each fused opposite intersection as the track information after each opposite intersection is fused.
Further, in order to ensure that the track labels of the target points after fusion are unique and unrepeated, as shown in fig. 3, the system further includes an updating unit 25.
The updating unit 25 is configured to judge whether the label in the fused track information of the current frame exists in the previous frame; if so, store the fused track information of the current frame at the storage position corresponding to that label; if not, add a new track label and store the fused track information.
The invention provides a multi-intersection multi-sensor based track fusion system, which judges whether the opposite intersections are successfully fused according to the fusion judgment flag bit of the opposite intersections, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information, and then adopts different track information generation strategies for different fusion results.
It should be understood that the specific order or hierarchy of steps in the processes disclosed is an example of exemplary approaches. Based upon design preferences, it is understood that the specific order or hierarchy of steps in the processes may be rearranged without departing from the scope of the present disclosure. The accompanying method claims present elements of the various steps in a sample order, and are not intended to be limited to the specific order or hierarchy presented.
In the foregoing detailed description, various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments of the subject matter require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus, the following claims are hereby expressly incorporated into the detailed description, with each claim standing on its own as a separate preferred embodiment of the invention.
The previous description of the disclosed embodiments is provided to enable any person skilled in the art to make or use the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the disclosure. Thus, the present disclosure is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.
What has been described above includes examples of one or more embodiments. It is, of course, not possible to describe every conceivable combination of components or methodologies for purposes of describing the aforementioned embodiments, but one of ordinary skill in the art may recognize that many further combinations and permutations of various embodiments are possible. Accordingly, the embodiments described herein are intended to embrace all such alterations, modifications and variations that fall within the scope of the appended claims. Furthermore, to the extent that the term "includes" is used in either the detailed description or the claims, such term is intended to be inclusive in a manner similar to the term "comprising" as "comprising" is interpreted when employed as a transitional word in a claim. Furthermore, any use of the term "or" in the specification or the claims is intended to mean a "non-exclusive or".
Those of skill in the art will further appreciate that the various illustrative logical blocks, units, and steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate the interchangeability of hardware and software, various illustrative components, elements, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design requirements of the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
The various illustrative logical blocks, or elements, described in connection with the embodiments disclosed herein may be implemented or performed with a general purpose processor, a digital signal processor, an Application Specific Integrated Circuit (ASIC), a field programmable gate array or other programmable logic system, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing systems, e.g., a digital signal processor and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a digital signal processor core, or any other similar configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. For example, a storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium. In the alternative, the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC, which may be located in a user terminal. In the alternative, the processor and the storage medium may reside as discrete components in a user terminal.
In one or more exemplary designs, the functions described above in connection with the embodiments of the invention may be implemented in hardware, software, firmware, or any combination thereof. If implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. Computer-readable media include both computer storage media and communication media that facilitate transfer of a computer program from one place to another. Storage media may be any available media that can be accessed by a general purpose or special purpose computer. For example, such computer-readable media can include, but are not limited to, RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to carry or store program code in the form of instructions or data structures and that can be read by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor. In addition, any connection is properly termed a computer-readable medium; thus, the software is included if it is transmitted from a website, server, or other remote source via a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wirelessly, e.g., by infrared, radio, or microwave. Disk and disc, as used herein, include compact disc, laser disc, optical disc, DVD, floppy disk and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers. Combinations of the above may also be included in computer-readable media.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are merely exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (14)

1. A multi-intersection multi-sensor-based track fusion method is characterized by comprising the following steps:
performing coordinate conversion on track distance information of targets respectively corresponding to at least two intersections to obtain track distance information after coordinate conversion;
acquiring a fusion judgment flag bit according to the track distance information, speed information and category information after the coordinate conversion of a first intersection, and the track distance information, speed information and category information after the coordinate conversion of an opposite intersection corresponding to the first intersection, wherein the first intersection is any one of the at least two intersections;
acquiring the sequence number of the fused total track information corresponding to the first intersection according to the track information respectively corresponding to at least two intersections in the previous frame and the label information of the first intersection;
acquiring a difference judgment flag bit of the adjacent-frame track distance information according to the track distance information after the coordinate conversion of the first intersection, the track distance information after the coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections;
judging whether the opposite intersections are successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information;
if the fusion is successful, taking the track distance information, the speed information, the label and the category of the first intersection as the track information after the opposite intersections are fused;
and if the fusion is unsuccessful, taking the track distance information, the speed information, the label and the category of the first intersection together with the track distance information, the speed information, the label and the category of the opposite intersection as the track information after the opposite intersections are fused.
2. The multi-intersection multi-sensor based track fusion method according to claim 1, wherein the step of obtaining the fusion judgment flag bit according to the track distance information, the speed information and the category information after the coordinate conversion of the first intersection, the track distance information, the speed information and the category information after the coordinate conversion of the opposite intersection corresponding to the first intersection comprises:
judging whether the converted track distance information meets a preset condition according to the formulas abs(Xi - Xj) <= gatex and abs(Yi - Yj) <= gatey, wherein Xi and Yi are the distance information of the first intersection, Xj and Yj are the distance information of the opposite intersection corresponding to the first intersection, abs represents the absolute value, gatex represents the error threshold in the x direction, and gatey represents the error threshold in the y direction;
judging whether the speed information meets a preset condition according to the formula abs(Vi + Vj) <= gatev, wherein Vi is the speed information of the first intersection, Vj is the speed information of the opposite intersection corresponding to the first intersection, and gatev represents the error threshold of the speed;
judging whether the category information of the first intersection is the same as that of the opposite intersection or not;
and if the track distance information, the speed information and the category information all meet the preset conditions, determining that the fusion judgment flag bit is true.
3. The multi-intersection multi-sensor based track fusion method according to claim 1, wherein the step of obtaining the sequence number of the fused total track information corresponding to the first intersection according to the track information corresponding to at least two intersections in the previous frame and the label information of the first intersection comprises:
judging whether the label of the first intersection track information exists in the track information respectively corresponding to the at least two intersections in the previous frame;
if so, taking the label of the first intersection track information as the serial number index of the total track information after fusion;
if not, the sequence number index is set to 0.
4. The multi-intersection multi-sensor based track fusion method according to claim 2 or 3, wherein the step of obtaining the difference judgment flag bit of the adjacent-frame track distance information according to the track distance information after the coordinate conversion of the first intersection, the track distance information after the coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections comprises:
acquiring, according to the track distance information after the coordinate conversion of the first intersection, the track distance information after the coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections, the differences between the track distance information of the first intersection and of the opposite intersection and the track distance information corresponding to that sequence number in the previous frame;
and if those differences meet the preset difference condition, confirming that the difference judgment flag bit is true; otherwise, confirming that it is false.
5. The multi-intersection multi-sensor based track fusion method according to claim 4, wherein the step of judging whether the opposite intersections are successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information comprises:
if the serial number of the fused total track information corresponding to the first intersection is 0 and the fusion judgment flag bit is true, confirming that the fusion is successful;
and if the serial number of the fused total track information corresponding to the first intersection is not 0, the fusion judgment flag bit is true, and the difference judgment flag bit is true, the fusion is determined to be successful.
6. The method for track fusion based on multiple intersections and multiple sensors according to claim 1, wherein if there are multiple opposite intersections, the method further comprises:
acquiring the fusion judgment flag bit according to the track distance information and the category information respectively corresponding to the fused opposite intersections;
acquiring the sequence number of the fused total track information corresponding to the fused opposite intersections according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the fused opposite intersections;
acquiring the difference judgment flag bit of the adjacent-frame track distance information according to the track distance information after the coordinate conversion of each fused opposite intersection and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections;
judging whether each fused opposite intersection is successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the fused opposite intersections, and/or the difference judgment flag bit of the adjacent-frame track distance information;
if successful, taking the fused track distance information, speed information, label and category of the first opposite intersection as the track information after each opposite intersection is fused;
and if unsuccessful, taking the track distance information, speed information, label and category respectively corresponding to each fused opposite intersection as the track information after each opposite intersection is fused.
7. The multi-intersection multi-sensor based track fusion method according to claim 1 or 6, further comprising:
judging whether the label in the fused track information of the current frame exists in the previous frame;
if so, storing the fused track information of the current frame at the storage position corresponding to that label;
if not, adding a new track label and storing the fused track information.
8. A multi-intersection multi-sensor based track fusion system, the system comprising:
the conversion unit is used for carrying out coordinate conversion on the track distance information of the targets respectively corresponding to the at least two intersections to obtain the track distance information after the coordinate conversion;
an obtaining unit, configured to obtain a fusion judgment flag according to track distance information, speed information, category information after coordinate conversion of a first intersection, and track distance information, speed information, and category information after coordinate conversion of an opposite intersection corresponding to the first intersection, where the first intersection is any one of the at least two intersections;
the acquiring unit is further configured to acquire the sequence number of the fused total track information corresponding to the first intersection according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the first intersection; and to acquire the difference judgment flag bit of the adjacent-frame track distance information according to the track distance information after the coordinate conversion of the first intersection, the track distance information after the coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections;
the judging unit is configured to judge whether the opposite intersections are successfully fused according to the fusion judgment flag bit, and/or the sequence number of the fused total track information corresponding to the first intersection, and/or the difference judgment flag bit of the adjacent-frame track distance information;
the fusion unit is configured to, if the fusion is successful, take the track distance information, the speed information, the label and the category of the first intersection as the track information after the opposite intersections are fused; and if the fusion is unsuccessful, take the track distance information, the speed information, the label and the category of the first intersection together with the track distance information, the speed information, the label and the category of the opposite intersection as the track information after the opposite intersections are fused.
9. The multi-intersection multi-sensor based track fusion system of claim 8,
the obtaining unit is specifically configured to judge whether the converted track distance information meets a preset condition according to the formulas abs(Xi - Xj) <= gatex and abs(Yi - Yj) <= gatey, where Xi and Yi are the distance information of the first intersection, Xj and Yj are the distance information of the opposite intersection corresponding to the first intersection, abs denotes the absolute value, gatex denotes the error threshold in the x direction, and gatey denotes the error threshold in the y direction; to judge whether the speed information meets a preset condition according to the formula abs(Vi + Vj) <= gatev, where Vi is the speed information of the first intersection, Vj is the speed information of the opposite intersection corresponding to the first intersection, and gatev denotes the error threshold of the speed; to judge whether the category information of the first intersection is the same as that of the opposite intersection; and, if the track distance information, the speed information, and the category information all meet the preset conditions, to determine that the fusion judgment flag bit is true.
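The three gates of claim 9 can be sketched as a single predicate. This is a hypothetical illustration, not the patent's implementation: the data layout and default thresholds are assumptions, while the names `gatex`, `gatey`, and `gatev` follow the claim. Sensors at opposite intersections face each other, so a matching target's two speed readings have opposite signs and their sum stays near zero.

```python
def fusion_flag(xi, yi, vi, ci, xj, yj, vj, cj,
                gatex=2.0, gatey=2.0, gatev=1.0):
    """Return True when a detection pair from the first intersection (i)
    and its opposite intersection (j) passes all three gates."""
    pos_ok = abs(xi - xj) <= gatex and abs(yi - yj) <= gatey
    # Opposite-facing sensors report opposite-signed speeds for one target,
    # so the sum of the two readings should be close to zero.
    vel_ok = abs(vi + vj) <= gatev
    cat_ok = ci == cj
    return pos_ok and vel_ok and cat_ok
```

A pair such as `(x=10.0, v=8.0)` against `(x=10.5, v=-7.8)` with equal categories would set the flag true; a 10 m position gap would reject it.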
10. The multi-intersection multi-sensor based track fusion system of claim 8,
the obtaining unit is specifically further configured to judge whether the label of the first intersection's track information exists in the track information respectively corresponding to the at least two intersections in the previous frame; if so, to take the label of the first intersection's track information as the sequence number index of the fused total track information; and if not, to set the sequence number index to 0.
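The sequence-number lookup of claim 10 reduces to a membership test. A minimal sketch, assuming the previous frame's tracks are kept in a mapping from label to track info (the patent does not fix the data structure):

```python
def fused_track_index(label, prev_frame_tracks):
    """Reuse an existing label as the fused-track index, or 0 for a new track.

    prev_frame_tracks: mapping from track label to stored track info
    for the previous frame.
    """
    return label if label in prev_frame_tracks else 0
```

An index of 0 therefore marks a track with no history, which claim 12 below handles with a relaxed decision rule.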
11. The multi-intersection multi-sensor based track fusion system according to claim 9 or 10,
the obtaining unit is specifically further configured to obtain, according to the track distance information after coordinate conversion of the first intersection, the track distance information after coordinate conversion of the opposite intersection corresponding to the first intersection, and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections, the difference values between the track distance information of the first intersection and of the opposite intersection, respectively, and the track distance information corresponding to the sequence number in the previous frame; and to determine that the difference value judgment flag bit is true if those difference values meet the preset difference value condition, and false otherwise.
12. The multi-intersection multi-sensor based track fusion system of claim 11,
the judging unit is specifically configured to determine that the fusion succeeds if the sequence number of the fused total track information corresponding to the first intersection is 0 and the fusion judgment flag bit is true; and to determine that the fusion succeeds if the sequence number of the fused total track information corresponding to the first intersection is not 0, the fusion judgment flag bit is true, and the difference value judgment flag bit is true.
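Claim 12's decision rule combines the three quantities above. A minimal sketch: a new track (index 0) has no history, so the gating flag alone decides; an existing track must additionally stay consistent with its own previous-frame position.

```python
def fusion_success(index, fusion_flag, diff_flag):
    """Decide whether a pair of opposite-intersection tracks fuses.

    index: sequence number of the fused total track info (0 = new track)
    fusion_flag: gating flag from the distance/speed/category test
    diff_flag: adjacent-frame difference flag
    """
    if index == 0:
        # New track: no history to check against.
        return fusion_flag
    # Existing track: gating AND history consistency both required.
    return fusion_flag and diff_flag
```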
13. The multi-intersection multi-sensor based track fusion system of claim 8,
the fusion unit is further configured to obtain a fusion judgment flag bit according to the track distance information and category information respectively corresponding to the fused pairs of opposite intersections in different directions; to obtain the sequence number of the fused total track information corresponding to each fused pair of opposite intersections according to the track information respectively corresponding to the at least two intersections in the previous frame and the label information of the fused pair; to obtain a difference value judgment flag bit for the adjacent-frame track distance information according to the track distance information after coordinate conversion of each fused pair of opposite intersections and the track distance information corresponding to the sequence number in the previous frame for the at least two intersections; to judge whether the fusion of the fused pairs of opposite intersections succeeds according to the fusion judgment flag bit and/or the sequence number of the fused total track information corresponding to each fused pair and/or the difference value judgment flag bit of the adjacent-frame track distance information; if the fusion succeeds, to take the fused track distance information, speed information, label, and category of the first pair of opposite intersections as the fused track information of all the pairs of opposite intersections; and if the fusion fails, to take the track distance information, speed information, label, and category respectively corresponding to each fused pair of opposite intersections as the fused track information of that pair.
14. The multi-intersection multi-sensor based track fusion system according to claim 8 or 13, further comprising: an updating unit;
the updating unit is configured to judge whether the label in the fused track information of the current frame exists in the previous frame; if so, to store the fused track information of the current frame at the storage position corresponding to the label; and if not, to add a new track label and store the fused track information under it.
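The updating unit of claim 14 can be sketched as a small store update. The dict-backed store and the label allocator are assumptions for illustration, not the patent's concrete data structures:

```python
def update_track_store(store, label, fused_info):
    """Write the current frame's fused track info into the track store.

    store: mapping from track label to track info (mutated in place)
    Returns the label the info was stored under.
    """
    if label in store:
        store[label] = fused_info          # existing track: overwrite its slot
        return label
    new_label = max(store, default=0) + 1  # new track: allocate the next label
    store[new_label] = fused_info
    return new_label
```

Overwriting in place keeps each track's label stable across frames, which is what lets claim 10's label lookup recover the same sequence number in the next frame.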
CN202111569137.5A 2021-12-21 Track fusion method and system based on multiple intersections and multiple sensors Active CN114234996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111569137.5A CN114234996B (en) 2021-12-21 Track fusion method and system based on multiple intersections and multiple sensors


Publications (2)

Publication Number Publication Date
CN114234996A true CN114234996A (en) 2022-03-25
CN114234996B CN114234996B (en) 2024-04-23



Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102110364A (en) * 2009-12-28 2011-06-29 日电(中国)有限公司 Traffic information processing method and traffic information processing device based on intersections and sections
JP2013088409A (en) * 2011-10-24 2013-05-13 Nissan Motor Co Ltd Travel support device for vehicle
CN109815993A (en) * 2019-01-03 2019-05-28 西北大学 Region Feature Extraction, Database and crossing recognition methods based on GPS track
CN111090095A (en) * 2019-12-24 2020-05-01 联创汽车电子有限公司 Information fusion environment perception system and perception method thereof
CN111316127A (en) * 2018-12-29 2020-06-19 深圳市大疆创新科技有限公司 Target track determining method, target tracking system and vehicle
CN112507887A (en) * 2020-12-12 2021-03-16 武汉中海庭数据技术有限公司 Intersection sign extracting and associating method and device
CN112747765A (en) * 2021-01-08 2021-05-04 重庆长安汽车股份有限公司 Path pushing method and system based on navigation and sensor fusion and storage medium
CN113627373A (en) * 2021-08-17 2021-11-09 山东沂蒙交通发展集团有限公司 Vehicle identification method based on radar-vision fusion detection


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG Jianchuan; MA Liyuan; MA Jian: "Research on Multi-Target Track Algorithms", Science Technology and Engineering, No. 24, 30 December 2006 (2006-12-30), pages 113-114 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114964138A (en) * 2022-05-11 2022-08-30 超级视线科技有限公司 Multi-intersection-based radar installation angle determination method and system
CN114964138B (en) * 2022-05-11 2023-09-26 超级视线科技有限公司 Radar installation angle determining method and system based on multiple intersections

Similar Documents

Publication Publication Date Title
CN110688902B (en) Method and device for detecting vehicle area in parking space
CN111477030B (en) Vehicle collaborative risk avoiding method, vehicle end platform, cloud end platform and storage medium
CN107977654B (en) Road area detection method, device and terminal
CN116484971A (en) Automatic driving perception self-learning method and device for vehicle and electronic equipment
CN111739338A (en) Parking management method and system based on multiple types of sensors
CN113205691A (en) Method and device for identifying vehicle position
CN114530056A (en) Parking management method and system based on positioning information and image information
CN115523934A (en) Vehicle track prediction method and system based on deep learning
CN114234996B (en) Track fusion method and system based on multiple intersections and multiple sensors
CN114234996A (en) Multi-intersection multi-sensor-based track fusion method and system
CN113450575A (en) Management method and device for roadside parking
CN110784680B (en) Vehicle positioning method and device, vehicle and storage medium
CN117128979A (en) Multi-sensor fusion method and device, electronic equipment and storage medium
CN114170836B (en) Mobile inspection parking management method and system based on parking space information
CN117152949A (en) Traffic event identification method and system based on unmanned aerial vehicle
CN116664498A (en) Training method of parking space detection model, parking space detection method, device and equipment
CN115482511A (en) Vehicle track generation method and system based on smooth vehicle motion data
CN110677491B (en) Method for estimating position of vehicle
CN114236526A (en) Single intersection multi-sensor-based track fusion method and system
CN114964138B (en) Radar installation angle determining method and system based on multiple intersections
CN116524457B (en) Parking space identification method, system, device, electronic equipment and readable storage medium
CN117289278B (en) Lane attribute determining method and device based on traffic radar
CN113033479B (en) Berth event identification method and system based on multilayer perception
CN117253355A (en) Method and system for detecting and identifying traffic signal lamp under complex intersection
CN117011807A (en) Multi-intersection camera data fusion method and system based on multi-region division

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant