CN113189583A - Time-space synchronous millimeter wave radar and visual information fusion method - Google Patents


Info

Publication number
CN113189583A
CN113189583A (application CN202110455091.8A; granted as CN113189583B)
Authority
CN
China
Prior art keywords
millimeter wave
wave radar
point
track
coordinate
Prior art date
Legal status
Granted
Application number
CN202110455091.8A
Other languages
Chinese (zh)
Other versions
CN113189583B (en)
Inventor
丁雅斌
王晨迁
李云飞
Current Assignee
Tianjin University
Original Assignee
Tianjin University
Priority date
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN202110455091.8A priority Critical patent/CN113189583B/en
Publication of CN113189583A publication Critical patent/CN113189583A/en
Application granted granted Critical
Publication of CN113189583B publication Critical patent/CN113189583B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • G01S13/02 Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 Systems determining position data of a target
    • G01S13/08 Systems for measuring distance only
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques

Abstract

The invention discloses a time-space synchronized millimeter wave radar and visual information fusion method comprising three main steps. In step one, preliminary positioning of the same-track train ahead is completed by parsing the message data of a millimeter wave radar sensor. In step two, detection of the running train's forward track and of the image position of the same-track train ahead is completed with a vision sensor based on image processing technology. In step three, millimeter wave radar information and visual information are fused by a joint calibration method based on time-space synchronization to complete accurate identification and positioning of the same-track train ahead. The method overcomes problems such as insufficient precision and poor adaptability in single-sensor detection, realizes real-time monitoring of the distance between the running train and the same-track train ahead, and improves train operation safety.

Description

Time-space synchronous millimeter wave radar and visual information fusion method
Technical Field
The invention relates to the field of millimeter wave radar and vision measurement, in particular to a track target ranging method based on millimeter wave radar and vision information fusion.
Background
To improve the safety of train operation in a rail transit system, the distance between a running train and the target ahead of it must be monitored in real time. Single-sensor detection suffers from a series of defects such as insufficient precision and poor adaptability. For example, a millimeter wave radar sensor can accurately output distance information for all targets ahead, but because this information is not presented visually, the target ahead cannot be accurately identified; moreover, because millimeter wave radar is highly sensitive to metal, it is easily disturbed by noise, producing deviations in detected target positions and missed detections, which seriously affect the real-time performance and stability of target tracking. A camera can acquire real-time image information of the target ahead, but capturing and localizing the target and obtaining the actual relative position between train and target are difficult, so real-time position detection of the target ahead is hard to satisfy.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a time-space synchronized millimeter wave radar and visual sensor information fusion method that improves target detection accuracy and the real-time performance of the ranging result.
The invention relates to a time-space synchronous millimeter wave radar and visual sensor information fusion method, which comprises the following steps:
In step one, the preliminary positioning of the target to be detected is completed by parsing the message data of the millimeter wave radar sensor, specifically as follows:
in the first substep, a millimeter wave radar sensor is installed at the head of the running train; a millimeter wave radar three-dimensional rectangular coordinate system is established with the geometric center of the radar's largest plane as the coordinate origin, the advancing direction of the running train as the Y_rw axis, the vertically upward direction as the Z_rw axis, and the rightward direction of the running train as the X_rw axis; the millimeter wave radar sensor is connected to a computer through a CAN bus and acquires the message data obtained by detecting all target trains ahead, and radar message parsing is then completed using the computer's MFC functionality and the millimeter wave radar communication protocol; the target trains ahead comprise the same-track train ahead and adjacent-track trains ahead, and the message data include the lateral distance d_x and the longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin;
In step two, detection of the running train's forward track and of the position of the same-track train ahead is completed using a camera based on image processing technology, specifically as follows:
in the first substep, a camera is installed on the head of the running train directly below the millimeter wave radar sensor, and a camera three-dimensional rectangular coordinate system X_cw-Y_cw-Z_cw is established with the optical center of the camera as the coordinate origin; each coordinate axis of the camera coordinate system is parallel to the corresponding axis of the millimeter wave radar coordinate system, and the Z_rw axis coincides with the Z_cw axis; the pitch angle, yaw angle and roll angle of the camera in the camera coordinate system are all zero; the camera is connected to the millimeter wave radar and to the computer through USB data lines; the camera collects real-time images of the scene ahead of the running train, the collected front scene images containing all target trains ahead and the running track of the running train; an image coordinate system X_p-Y_p is established with its coordinate origin at the intersection of the camera optical axis and the image plane, with X_p, Y_p along the length and width directions of the front scene image respectively;
in the second substep, straight-line detection of the running train's forward track is completed on the front scene image collected by the camera in the first substep based on the accumulated probabilistic Hough transform, and preliminary screening of the track lines is completed based on the straight-line slope;
in the third substep, track-line screening based on DBSCAN probability density clustering and queue-based track-line correction are performed on the several straight lines, including the left and right rails, obtained in the second substep, yielding the corrected line position information of the two rails, where the left and right rail lines are denoted l_left, l_right with slopes k_left, k_right respectively, and the intersection point of l_left and l_right is denoted p_0;
in the fourth substep, using the corrected line position information of the two rails obtained in the third substep, a logarithm-based line traversal mode is used to realize high-density traversal of points near the same-track train along the track direction and low-density traversal far from the same-track train, obtaining the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the left and right rail-line traversal points in the front scene image;
in the fifth substep, using the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the left and right rail traversal points obtained in the fourth substep in the front scene image, identification of the same-track train ahead is completed based on the gray-value gradient change along the traversal points of the two rail lines; a position of abrupt gray-value change exists among the left and right rail traversal points, and this position is determined as the position of the same-track train ahead;
In step three, millimeter wave radar information and visual information are fused by a joint calibration method based on time-space synchronization to complete accurate identification and ranging of the same-track train ahead, specifically as follows:
firstly, multithreading synchronization of a millimeter wave radar and a camera;
when data acquisition is carried out, a three-thread fusion mode of a millimeter wave radar data receiving thread, a camera receiving thread and a computer data processing thread is selected to realize multithreading time synchronization based on the millimeter wave radar and visual information;
in the second substep, the conversion of any radar point position from the millimeter wave radar coordinate system into the image coordinate system is obtained using the translation and rotation relations among the millimeter wave radar coordinate system, the camera coordinate system and the image coordinate system; the image position information of the bottom middle point p(x_bottom, y_bottom) of the same-track train ahead, obtained in the sixth substep of step two, is then converted into coordinates in the millimeter wave radar coordinate system, and the relative distance d_w of the bottom middle point p(x_bottom, y_bottom) in the millimeter wave radar coordinate system is finally calculated;
in the third substep, the lateral distances d_x and longitudinal distances d_y of all targets obtained by the radar in step one are first converted into the camera image coordinate system X_p-Y_p and displayed in the front scene image as radar point coordinates p_i(x_p, y_p); the lateral distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin are then converted into the relative distance in the millimeter wave radar coordinate system,

d_r = √(d_x² + d_y²);
finally, the spatial distance information of the millimeter wave radar sensor and the camera is fused to complete the screening of the radar point coordinates p_i(x_p, y_p) and obtain the radar point position information of the same-track train ahead.
The invention has the following beneficial effects:
1. The invention realizes track target ranging based on millimeter wave radar and visual information fusion, overcomes the defects of single-sensor ranging, and improves target detection accuracy;
2. The algorithms used in the invention run fast, meeting the computation-speed requirements of the ranging process and improving the real-time performance of the ranging result.
Drawings
FIG. 1 is a flow chart of track target ranging based on millimeter wave radar and visual information fusion;
FIG. 2 is a diagram illustrating clustering results based on probability density;
FIG. 3 is a schematic view of a straight line point traversal of the left and right tracks;
FIG. 4 is a schematic diagram of millimeter wave radar and camera joint calibration;
fig. 5 is a schematic diagram of radar point screening by fusion of millimeter wave radar and visual space distance information.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and examples.
As shown in the attached drawings, the time-space synchronous millimeter wave radar and visual information fusion method comprises the following steps:
In step one, the preliminary positioning of the target to be detected is completed by parsing the message data of the millimeter wave radar sensor, specifically as follows:
In the first substep, a millimeter wave radar sensor is installed at the head of the running train. A millimeter wave radar three-dimensional rectangular coordinate system is established with the geometric center of the radar's largest plane as the coordinate origin, the advancing direction of the running train as the Y_rw axis, the vertically upward direction as the Z_rw axis, and the rightward direction of the running train as the X_rw axis. The pitch angle, yaw angle and roll angle of the millimeter wave radar sensor in this coordinate system are all zero, as shown at 8 in FIG. 4. The millimeter wave radar sensor is connected to the computer through a CAN bus and acquires the message data obtained by detecting all target trains ahead. Radar message parsing is then completed using the computer's MFC functionality and the millimeter wave radar communication protocol (see ARS408_ARS404_SRR308 Communication Protocol [M]. Technical Documentation, 2019.10.01). The target trains ahead comprise the same-track train ahead (a stationary same-track train or a same-track train traveling toward the running train) and adjacent-track trains ahead (stationary trains or adjacent-track trains traveling toward the running train). The message data include the lateral distance d_x and the longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin.
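For illustration, a minimal sketch of this message-parsing step is given below in Python rather than MFC. It assumes the python-can library and the publicly documented ARS408 object-list layout (message ID 0x60B, 0.2 m resolution, offsets of −500 m and −204.6 m); these IDs and scaling factors are assumptions taken from the ARS408 specification, not values stated in this patent.

```python
# Sketch of parsing radar object messages from the CAN bus (not the patent's MFC code).
import can

OBJ_GENERAL_ID = 0x60B  # ARS408 "Object General Information" message (assumed)

def parse_object_general(msg: can.Message):
    """Decode one object message into lateral/longitudinal distances in meters."""
    raw = msg.data
    obj_id = raw[0]
    # Obj_DistLong: 13 bits, resolution 0.2 m, offset -500 m (assumed ARS408 layout)
    dist_long_raw = (raw[1] << 5) | (raw[2] >> 3)
    d_y = dist_long_raw * 0.2 - 500.0   # longitudinal distance d_y
    # Obj_DistLat: 11 bits, resolution 0.2 m, offset -204.6 m (assumed ARS408 layout)
    dist_lat_raw = ((raw[2] & 0x07) << 8) | raw[3]
    d_x = dist_lat_raw * 0.2 - 204.6    # lateral distance d_x
    return obj_id, d_x, d_y

bus = can.interface.Bus(channel="can0", bustype="socketcan")  # hypothetical channel
for msg in bus:
    if msg.arbitration_id == OBJ_GENERAL_ID:
        print(parse_object_general(msg))
```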
In step two, detection of the running train's forward track and of the position of the same-track train ahead is completed using a camera based on image processing technology, specifically as follows:
In the first substep, a camera is installed on the head of the running train directly below the millimeter wave radar sensor, typically at a spacing of 5 cm. A camera three-dimensional rectangular coordinate system X_cw-Y_cw-Z_cw is established with the optical center of the camera as the coordinate origin; each coordinate axis of the camera coordinate system is parallel to the corresponding axis of the millimeter wave radar coordinate system, and the Z_rw axis coincides with the Z_cw axis. The pitch angle, yaw angle and roll angle of the camera in the camera coordinate system are all zero, as shown at 8 in FIG. 4. The camera is connected to the millimeter wave radar and to the computer through USB data lines. The camera acquires real-time images of the scene ahead of the running train; the acquired front scene images contain all target trains ahead and the running track of the running train. An image coordinate system X_p-Y_p is established with its coordinate origin at the intersection of the camera optical axis and the image plane, as shown at 9 in FIG. 4, with X_p and Y_p along the length and width directions of the front scene image respectively.
In the second substep, straight-line detection of the running train's forward track is performed on the front scene image acquired in the first substep based on the accumulated probabilistic Hough transform (see Shan Dong, Weng Meng and Yang Hongtao, "Rapid lane line detection method based on improved probabilistic Hough transform", Computer Technology and Development [J]. 2020, 30(05)), and preliminary screening of the track lines is performed based on the straight-line slope.
The preliminary screening of the track lines proceeds as follows: according to the position information (slope and starting point) of the running track in the front scene image and the installation position of the camera, a slope threshold for the track lines is selected. The selection principle is to include the slopes of the left and right rails of the forward track while removing as many irrelevant lines (for example, near-horizontal lines) as possible. This finally yields several straight lines in point-slope form, including the left and right rails, and completes the preliminary screening of the track lines obtained from the accumulated probabilistic Hough transform.
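A minimal sketch of this substep, assuming OpenCV: Canny edges followed by the progressive probabilistic Hough transform (cv2.HoughLinesP), then slope-based pre-screening. The Canny, Hough and slope thresholds below are illustrative assumptions, not the patent's tuned values.

```python
import cv2
import numpy as np

def detect_rail_candidates(frame_bgr, k_min=0.4, k_max=10.0):
    """Return candidate rail lines in point-slope form ((x1, y1), k)."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    # accumulated (progressive) probabilistic Hough transform
    segments = cv2.HoughLinesP(edges, rho=1, theta=np.pi / 180,
                               threshold=60, minLineLength=40, maxLineGap=10)
    candidates = []
    if segments is not None:
        for x1, y1, x2, y2 in segments[:, 0]:
            if x1 == x2:          # skip exactly vertical segments (infinite slope)
                continue
            k = (y2 - y1) / (x2 - x1)
            # keep slopes plausible for rails; reject near-horizontal lines
            if k_min < abs(k) < k_max:
                candidates.append(((x1, y1), k))
    return candidates
```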
Thirdly, performing track linear screening based on DBSCAN probability density clustering and track linear correction based on queues on a plurality of linear information including the left and right tracks obtained in the second step to obtain corrected linear position information of the tracks on the two sides, wherein the linear positions of the tracks on the left and right sides are respectively represented by lleft,lrightShowing that the slopes are respectively kleft,krightTaking the straight line l of the left and right side railsleft,lrightIntersection point, in p0The specific implementation is as follows:
Step 101: take the straight lines obtained by the preliminary screening and accurately identify the left and right rails of the forward track based on the DBSCAN probability density clustering algorithm (see Abdellah IDRISSI, Altaf ALAOUI, "A Multi-Criteria Decision Method in the DBSCAN Algorithm for Better Clustering" [J]. International Journal of Advanced Computer Science and Applications, 2016), obtaining the preliminary straight-line position information of the two rails in point-slope form with slopes k_j, where j indexes the lines initially obtained by the DBSCAN clustering algorithm, j = 1, 2, …, n (n < 5). Because detection runs in real time, the line positions of the two rails drift in some periods, so step 102 is executed;
Step 102: set a queue with an empirical Length of 5 for the preliminary line position information of each rail, ensuring that all straight lines obtained by the DBSCAN clustering algorithm enter the queue each time;
Step 103: the straight-line slopes k_j of the preliminary line position information of the two rails obtained by the DBSCAN probability density clustering algorithm enter the queue in sequence, and the queue average is selected as the comparison value:

k_mean = (1 / Length) · Σ_{j=1}^{Length} k_j

Then, according to the pixel width between the two rails in the front scene image, an empirical Distance threshold of 5 (unit: pixels) is set, and the following criterion is applied to the preliminary slope of each rail line:

if |k_j − k_mean| < Distance, k_j enters the queue; otherwise k_j is discarded and k_mean is enqueued in its place.
The finally updated queue average k_mean obtained by the above criterion is the slope of the rail line, completing the correction of the slopes of both rail lines and yielding the corrected point-slope position information of the two rail lines, where the left and right rail lines are denoted l_left and l_right with slopes k_left and k_right respectively; the intersection point of l_left and l_right is denoted p_0, shown as the black dot 5 in FIG. 3.
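The following sketch illustrates this substep under stated assumptions: scikit-learn's DBSCAN groups the candidate slopes into one cluster per rail, and a fixed-length queue implements the correction criterion as reconstructed above; the eps value is an assumption.

```python
from collections import deque
import numpy as np
from sklearn.cluster import DBSCAN

LENGTH = 5        # empirical queue Length from the patent
DISTANCE = 5.0    # empirical Distance threshold from the patent

def cluster_slopes(candidates):
    """Group candidate slopes; each cluster mean is one rail's preliminary slope k_j."""
    if not candidates:
        return []
    ks = np.array([[k] for _, k in candidates])
    labels = DBSCAN(eps=0.2, min_samples=2).fit_predict(ks)  # eps is an assumption
    return [float(ks[labels == lab].mean()) for lab in set(labels) if lab != -1]

class SlopeQueue:
    """Queue-based correction of one rail line's slope across detection periods."""
    def __init__(self):
        self.q = deque(maxlen=LENGTH)

    def update(self, k_j):
        k_mean = sum(self.q) / len(self.q) if self.q else k_j
        # criterion as reconstructed above: outliers are replaced by k_mean
        self.q.append(k_j if abs(k_j - k_mean) < DISTANCE else k_mean)
        return sum(self.q) / len(self.q)   # corrected slope for this rail
```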
Fourthly, selecting the corrected linear position information of the two sides of the track obtained in the third step, and using a track linear traversal mode based on logarithm to realize high-density traversal of points near the same-track train in the track direction and low-density traversal far away from the same-track train position to obtain a point coordinate p of traversal points of the left and right linear tracks in the scene image in front of the traversal pointsleft(xleft,yleft),pright(xright,yright) The specific implementation mode is as follows:
Step 101: take the intersection point p_0 of the two rail lines l_left, l_right as the initial traversal point;
Step 102: traverse the two rail lines in the front scene image along the Y_p axis direction in FIG. 4; both rail lines share the same traversal distance

L = Width − y_p0

where y_p0 is the Y_p-axis coordinate of the intersection point p_0 and Width is the width of the front scene image;
Step 103: obtain the Y_p-axis coordinates of the straight-line traversal points of the left and right rails:

y_left = y_right = y_p0 + i·Δy, i = 1, 2, …, n    (1)

In formula (1): y_left, y_right are the Y_p-axis coordinates of the left and right rail traversal points; y_p0 is the Y_p-axis coordinate of the initial traversal point p_0 of the rail lines l_left, l_right; and k_left, k_right are the slopes of the left and right rails. The traversal interval is the same for both rails, with value Δy = log_a L; the number of traversal points is n = L / log_a L, where L is the traversal distance;
Step 104: obtain the X_p-axis coordinates of the left and right rail traversal points from the results of steps 101-103, finally obtaining the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of all logarithm-based traversal points of the left and right rails in the front scene image:

x_left = x_p0 + (y_left − y_p0) / k_left
x_right = x_p0 + (y_right − y_p0) / k_right    (2)

In formula (2): x_left, x_right are the X_p-axis coordinates of the left and right rail traversal points; y_left, y_right are their Y_p-axis coordinates; x_p0, y_p0 are the X_p- and Y_p-axis coordinates of the initial traversal point p_0; and k_left, k_right are the slopes of the left and right rail lines l_left, l_right. The implementation result is shown in FIG. 3, where 6 and 7 denote the traversal point information of the left and right rail lines respectively.
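A sketch of this traversal, under the reconstruction of formulas (1) and (2) given above; the starting point, slopes and image width passed in the usage lines are illustrative values.

```python
import math

def traverse_rail(x_p0, y_p0, k, width, a=10.0):
    """Generate traversal points along one rail line starting at p_0."""
    L = width - y_p0                 # traversal distance, as reconstructed
    dy = math.log(L, a)              # delta_y = log_a(L)
    n = int(L / dy)                  # number of traversal points
    points = []
    for i in range(1, n + 1):
        y = y_p0 + i * dy            # formula (1), as reconstructed
        x = x_p0 + (y - y_p0) / k    # formula (2): inverting the point-slope line
        points.append((x, y))
    return points

# usage with illustrative values for p_0, the slopes and the image width
left_pts = traverse_rail(x_p0=320, y_p0=100, k=-1.6, width=640)
right_pts = traverse_rail(x_p0=320, y_p0=100, k=1.6, width=640)
```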
In the fifth substep, using the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the left and right rail traversal points obtained in the fourth substep in the front scene image, identification of the same-track train ahead is completed based on the gray-value gradient change along the traversal points of the two rail lines. Because the gray value of the rails is high while the gray value of the bottom of the same-track train ahead is low, a position of abrupt gray-value change exists among the left and right rail traversal points, and this position is determined as the position of the same-track train ahead. The specific implementation is as follows:
Step 101: homogenize the gray values of the rail-line traversal points. To eliminate jitter in the traversal-point gray values, the mean gray value of 4 consecutive traversal points is computed as a new mean traversal point, with coordinates p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right);
Step 102: determine the position of the abrupt gray-value change. The coordinates p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right) at the mutation positions are taken as the coordinates of the left and right bottom points of the same-track train ahead, and their arithmetic mean is taken as the coordinate of the bottom middle point of the same-track train ahead:

x_bottom = (x_mean_left + x_mean_right) / 2
y_bottom = (y_mean_left + y_mean_right) / 2    (3)

In formula (3): x_bottom, y_bottom are the X_p- and Y_p-axis coordinates of the bottom middle point of the same-track train ahead; x_mean_left, y_mean_left are the X_p- and Y_p-axis coordinates of the left bottom point; and x_mean_right, y_mean_right are the X_p- and Y_p-axis coordinates of the right bottom point. This finally yields the bottom middle point coordinate p(x_bottom, y_bottom) of the same-track train ahead, shown as the black dot 12 in FIG. 5;
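A sketch of the gray-value homogenization and mutation search of this substep; the gradient threshold is an illustrative assumption.

```python
import numpy as np

def find_train_bottom(gray_img, points, grad_threshold=40):
    """Return the first traversal point where the averaged gray value drops sharply."""
    vals = np.array([gray_img[int(y), int(x)] for x, y in points], dtype=float)
    # homogenize: mean over 4 consecutive traversal points suppresses jitter
    mean_vals = np.convolve(vals, np.ones(4) / 4, mode="valid")
    for i in range(1, len(mean_vals)):
        if mean_vals[i - 1] - mean_vals[i] > grad_threshold:  # bright rail -> dark train
            return points[i + 3]     # index shifted by the 4-point averaging window
    return None

def bottom_middle(p_left, p_right):
    """Arithmetic mean of the left and right mutation points (formula (3))."""
    return ((p_left[0] + p_right[0]) / 2, (p_left[1] + p_right[1]) / 2)
```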
In the sixth substep, the bottom middle point with coordinate p(x_bottom, y_bottom) is selected and its position is corrected based on Kalman filtering, obtaining smoothly transitioning bottom middle point information of the same-track train ahead in each period. The specific implementation is as follows:
To handle deviations of the detected position point and periods in which the same-track train ahead is not detected, an empirical Kalman filtering distance threshold d_threshold = 50 is first set (see Alessio Gagliardi, Francesco de Gioia, Sergio Saponara, "A real-time video detection algorithm based on Kalman filter and CNN" [J]. Journal of Real-Time Image Processing, 2021). When the Euclidean distance between the bottom middle point p(x_bottom, y_bottom)_i of the same-track train ahead in the i-th period and the bottom middle point p(x_bottom, y_bottom)_{i−1} in the (i−1)-th period is greater than the distance threshold d_threshold, the bottom middle target position point of the current period is discarded and replaced by that of the previous period. Meanwhile, Kalman filtering is applied to all bottom middle target position points below the distance threshold to realize smooth transition of the point positions;
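A sketch of this gating-plus-filtering step, assuming OpenCV's cv2.KalmanFilter with a constant-velocity model; the noise covariances are illustrative assumptions.

```python
import cv2
import numpy as np

D_THRESHOLD = 50.0  # empirical Kalman gating distance from the patent (pixels)

kf = cv2.KalmanFilter(4, 2)  # state: (x, y, vx, vy), measurement: (x, y)
kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                [0, 1, 0, 1],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], np.float32)
kf.measurementMatrix = np.eye(2, 4, dtype=np.float32)
kf.processNoiseCov = 1e-3 * np.eye(4, dtype=np.float32)      # assumed noise level
kf.measurementNoiseCov = 1e-1 * np.eye(2, dtype=np.float32)  # assumed noise level

prev = None  # bottom middle point accepted in the previous period

def smooth_bottom_point(p):
    """Gate the new bottom middle point against the previous period, then filter it."""
    global prev
    if p is None or (prev is not None and
                     np.hypot(p[0] - prev[0], p[1] - prev[1]) > D_THRESHOLD):
        p = prev               # outlier or missed detection: reuse the previous point
    if p is None:
        return None
    kf.predict()
    est = kf.correct(np.array([[p[0]], [p[1]]], np.float32))
    prev = (float(est[0, 0]), float(est[1, 0]))
    return prev
```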
In step three, millimeter wave radar information and visual information are fused by a joint calibration method based on time-space synchronization to complete accurate identification and ranging of the same-track train ahead, specifically as follows:
firstly, multithreading synchronization of a millimeter wave radar and a camera;
During data acquisition, so that the millimeter wave radar and the camera acquire target data at the same time, a three-thread fusion mode of a millimeter wave radar data receiving thread, a camera receiving thread and a computer data processing thread is selected to realize multithreaded time synchronization of the millimeter wave radar and visual information (see Lu Bin et al., "Research and application of multithreading technology" [J]. Computer Research and Development, 2000, (04));
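A sketch of the three-thread fusion mode using Python's threading and queue modules; the reader functions, the fusion routine and the pairing tolerance are hypothetical stand-ins for the patent's MFC threads.

```python
import queue
import threading
import time

radar_q = queue.Queue(maxsize=10)
image_q = queue.Queue(maxsize=10)

def read_radar_frame():   # stub standing in for the CAN message parsing above
    time.sleep(0.05)
    return "radar-frame"

def read_camera_frame():  # stub standing in for camera capture
    time.sleep(0.04)
    return "image-frame"

def fuse(radar, image):   # stub standing in for the fusion of step three
    print("fused:", radar, image)

def radar_thread():
    while True:
        radar_q.put((time.monotonic(), read_radar_frame()))

def camera_thread():
    while True:
        image_q.put((time.monotonic(), read_camera_frame()))

def processing_thread(tolerance=0.05):
    while True:
        t_r, radar = radar_q.get()
        t_i, image = image_q.get()
        if abs(t_r - t_i) <= tolerance:   # timestamps close enough: synchronized pair
            fuse(radar, image)
        # otherwise the older sample is simply dropped and pairing continues

for fn in (radar_thread, camera_thread, processing_thread):
    threading.Thread(target=fn, daemon=True).start()
time.sleep(1.0)  # let the demo run briefly
```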
In the second substep, the conversion of any radar point position from the millimeter wave radar coordinate system into the image coordinate system is obtained using the translation and rotation relations among the millimeter wave radar coordinate system, the camera coordinate system and the image coordinate system (see Luo Xiao, Yao Yuan, Zhang Jinshi, "A millimeter wave radar and camera joint calibration method" [N]. Journal of Tsinghua University, Vol. 54, No. 3). The conversion relation is:

x_p = f_x · (x_rw + L_x) / (y_rw + L_y) + C_x
y_p = f_y · H / (y_rw + L_y) + C_y    (4)

In formula (4): x_p, y_p are the X_p- and Y_p-axis coordinates of a millimeter wave radar point in the image coordinate system; x_rw, y_rw are the X_rw- and Y_rw-axis coordinates of the point in the millimeter wave radar coordinate system; C_x is the offset of the camera optical axis along the X_p axis; C_y is the offset of the camera optical axis along the Y_p axis; f_x is the camera focal length along the X_p axis; f_y is the camera focal length along the Y_p axis; L_x is the spacing between the X axes of the radar and camera projection coordinate systems; L_y is the spacing between their Y axes; and H is the installation height of the camera.
Finally, formula (4) is used to convert the image position information of the bottom middle point p(x_bottom, y_bottom) of the same-track train ahead, obtained in the sixth substep of step two, into coordinates in the millimeter wave radar coordinate system, and the relative distance d_w of the bottom middle point p(x_bottom, y_bottom) in the millimeter wave radar coordinate system is then calculated.

The specific conversion steps are: first obtain the X_p-axis position x_bottom and Y_p-axis position y_bottom of the bottom middle point p(x_bottom, y_bottom); then convert x_bottom, y_bottom into the coordinates x_w, y_w in the millimeter wave radar coordinate system; finally compute the relative distance of the bottom middle point p(x_bottom, y_bottom) in the millimeter wave radar coordinate system by the formula

d_w = √(x_w² + y_w²).
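A sketch of the forward and inverse conversions, using formula (4) as reconstructed above; all calibration constants are illustrative placeholders, not calibrated values.

```python
import math

F_X, F_Y = 1000.0, 1000.0   # focal lengths along X_p, Y_p (pixels), placeholders
C_X, C_Y = 320.0, 240.0     # optical-axis offsets (pixels), placeholders
L_X, L_Y = 0.0, 0.1         # radar-to-camera axis spacings (m), placeholders
H = 1.2                     # camera installation height (m), placeholder

def radar_to_image(x_rw, y_rw):
    """Project a radar point into the image coordinate system (formula (4))."""
    x_p = F_X * (x_rw + L_X) / (y_rw + L_Y) + C_X
    y_p = F_Y * H / (y_rw + L_Y) + C_Y
    return x_p, y_p

def image_to_radar(x_p, y_p):
    """Invert formula (4) for a ground point such as the train-bottom middle point."""
    y_w = F_Y * H / (y_p - C_Y) - L_Y
    x_w = (x_p - C_X) * (y_w + L_Y) / F_X - L_X
    return x_w, y_w

def relative_distance(x_w, y_w):
    return math.hypot(x_w, y_w)     # d_w = sqrt(x_w^2 + y_w^2)
```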
In the third substep, the lateral distances d_x and longitudinal distances d_y of all targets obtained by the radar in step one are first converted into the camera image coordinate system X_p-Y_p using formula (4) and displayed in the front scene image as radar point coordinates p_i(x_p, y_p), shown as the black squares 10, 11 and 13 in FIG. 5. Then the lateral distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin in step one are converted into the relative distance in the millimeter wave radar coordinate system,

d_r = √(d_x² + d_y²).
Finally, the spatial distance information of the millimeter wave radar sensor and the camera is fused to complete the screening of the radar point coordinates p_i(x_p, y_p) and obtain the radar point position information of the same-track train ahead, as follows:
Step 101: to handle radar point data in which secondary reflection during radar detection produces multiple groups of distances for the same target, as shown at 10 and 11 in FIG. 5 (the apparent relative distance of 10 is much greater than that of 11), the lateral distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin are first converted into the relative distance in the millimeter wave radar coordinate system,

d_r = √(d_x² + d_y²),

and each d_r is then compared with the relative distance d_w of the bottom middle point p(x_bottom, y_bottom) of the same-track train in the millimeter wave radar coordinate system. If the absolute difference satisfies |d_w − d_r| < Δd_threshold, the radar point corresponding to d_r is retained; otherwise it is deleted. This realizes the coarse screening of the radar point coordinates p_i(x_p, y_p), yielding the coarsely screened radar point coordinates p_j(x_p, y_p). Since secondary reflection at least doubles the apparent relative distance of a radar point, the distance threshold is set to 1.5 times the visual relative distance: Δd_threshold = 1.5·d_w.
Step 102: to handle radar points of adjacent-track trains detected during radar detection, as shown at 13 in FIG. 5, the coarsely screened radar point coordinates p_j(x_p, y_p) in the front scene image are selected. Using the bottom middle point information p(x_bottom, y_bottom) of the same-track train in the front scene image obtained in step two as the reference center, the radar point coordinate p_j(x_p, y_p) closest to the bottom middle point p(x_bottom, y_bottom) is selected according to the minimum-Euclidean-distance constraint:

d = √((x_pi − x_bottom)² + (y_pi − y_bottom)²)    (5)

In formula (5): x_pi, y_pi are the horizontal and vertical coordinates of a radar point in the image coordinate system, and x_bottom, y_bottom are the horizontal and vertical coordinates of the bottom middle point of the same-track train ahead. The bottom middle point is shown at 12 in FIG. 5; the radar point closest to it is selected as the final radar point detection result, which removes the adjacent-track radar point 13 and finally yields the visually matched radar point 11. The relative distance information d_r corresponding to the finally screened radar point is taken as the final ranging result.
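A sketch of the two-stage screening of this substep: coarse screening by distance consistency with the visual distance d_w, then the minimum-Euclidean-distance selection of formula (5). The retain/delete direction of the coarse test follows the reading given above.

```python
import math

def screen_radar_points(radar_points, d_w, p_bottom):
    """radar_points: list of (x_p, y_p, d_r) already projected into the image."""
    delta_d_threshold = 1.5 * d_w      # secondary reflections at least double d_r
    coarse = [pt for pt in radar_points
              if abs(d_w - pt[2]) < delta_d_threshold]   # drop ghost echoes
    if not coarse:
        return None
    # minimum-Euclidean-distance constraint against the bottom middle point (formula (5))
    best = min(coarse, key=lambda pt: math.hypot(pt[0] - p_bottom[0],
                                                 pt[1] - p_bottom[1]))
    return best                        # best[2] is the final ranging result d_r
```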

Claims (6)

1. A time-space synchronization millimeter wave radar and visual information fusion method is characterized by comprising the following steps:
in step one, the preliminary positioning of the target to be detected is completed by parsing the message data of the millimeter wave radar sensor, specifically as follows:
in the first substep, a millimeter wave radar sensor is installed at the head of the running train; a millimeter wave radar three-dimensional rectangular coordinate system is established with the geometric center of the radar's largest plane as the coordinate origin, the advancing direction of the running train as the Y_rw axis, the vertically upward direction as the Z_rw axis, and the rightward direction of the running train as the X_rw axis; the pitch angle, yaw angle and roll angle of the millimeter wave radar sensor in the millimeter wave radar three-dimensional rectangular coordinate system are all zero; the millimeter wave radar sensor is connected to a computer through a CAN bus and acquires the message data obtained by detecting all target trains ahead, and radar message parsing is then completed using the computer's MFC functionality and the millimeter wave radar communication protocol; the target trains ahead comprise the same-track train ahead and adjacent-track trains ahead, and the message data include the lateral distance d_x and the longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin;
in step two, detection of the running train's forward track and of the position of the same-track train ahead is completed using a camera based on image processing technology, specifically as follows:
in the first substep, a camera is installed on the head of the running train directly below the millimeter wave radar sensor, and a camera three-dimensional rectangular coordinate system X_cw-Y_cw-Z_cw is established with the optical center of the camera as the coordinate origin; each coordinate axis of the camera coordinate system is parallel to the corresponding axis of the millimeter wave radar coordinate system, and the Z_rw axis coincides with the Z_cw axis; the pitch angle, yaw angle and roll angle of the camera in the camera coordinate system are all zero; the camera is connected to the millimeter wave radar and to the computer through USB data lines; the camera collects real-time images of the scene ahead of the running train, the collected front scene images containing all target trains ahead and the running track of the running train; an image coordinate system X_p-Y_p is established with its coordinate origin at the intersection of the camera optical axis and the image plane, with X_p, Y_p along the length and width directions of the front scene image respectively;
in the second substep, straight-line detection of the running train's forward track is completed on the front scene image collected by the camera in the first substep based on the accumulated probabilistic Hough transform, and preliminary screening of the track lines is completed based on the straight-line slope;
in the third substep, track-line screening based on DBSCAN probability density clustering and queue-based track-line correction are performed on the several straight lines, including the left and right rails, obtained in the second substep, yielding the corrected line position information of the two rails, where the left and right rail lines are denoted l_left, l_right with slopes k_left, k_right respectively, and the intersection point of l_left and l_right is denoted p_0;
in the fourth substep, using the corrected line position information of the two rails obtained in the third substep, a logarithm-based line traversal mode is used to realize high-density traversal of points near the same-track train along the track direction and low-density traversal far from the same-track train, obtaining the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the left and right rail-line traversal points in the front scene image;
in the fifth substep, using the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of the left and right rail traversal points obtained in the fourth substep in the front scene image, identification of the same-track train ahead is completed based on the gray-value gradient change along the traversal points of the two rail lines; a position of abrupt gray-value change exists among the left and right rail traversal points, and this position is determined as the position of the same-track train ahead;
in step three, millimeter wave radar information and visual information are fused by a joint calibration method based on time-space synchronization to complete accurate identification and ranging of the same-track train ahead, specifically as follows:
firstly, multithreading synchronization of a millimeter wave radar and a camera;
when data acquisition is carried out, a three-thread fusion mode of a millimeter wave radar data receiving thread, a camera receiving thread and a computer data processing thread is selected to realize multithreading time synchronization based on the millimeter wave radar and visual information;
in the second substep, the conversion of any radar point position from the millimeter wave radar coordinate system into the image coordinate system is obtained using the translation and rotation relations among the millimeter wave radar coordinate system, the camera coordinate system and the image coordinate system; the image position information of the bottom middle point p(x_bottom, y_bottom) of the same-track train ahead, obtained in the sixth substep of step two, is then converted into coordinates in the millimeter wave radar coordinate system, and the relative distance d_w of the bottom middle point p(x_bottom, y_bottom) in the millimeter wave radar coordinate system is finally calculated;
in the third substep, the lateral distances d_x and longitudinal distances d_y of all targets obtained by the radar in step one are first converted into the camera image coordinate system X_p-Y_p and displayed in the front scene image as radar point coordinates p_i(x_p, y_p); the lateral distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin are then converted into the relative distance in the millimeter wave radar coordinate system,

d_r = √(d_x² + d_y²);
finally, the spatial distance information of the millimeter wave radar sensor and the camera is fused to complete the screening of the radar point coordinates p_i(x_p, y_p) and obtain the radar point position information of the same-track train ahead.
2. The time-space synchronized millimeter wave radar and visual information fusion method of claim 1, wherein:
the specific process of the third substep of step two is as follows:
step 101: take the straight lines obtained by the preliminary screening and accurately identify the left and right rails of the forward track based on the DBSCAN probability density clustering algorithm, obtaining the preliminary straight-line position information of the two rails in point-slope form with slopes k_j, where j indexes the lines initially obtained by the DBSCAN clustering algorithm, j = 1, 2, …, n (n < 5);
step 102: set a queue with an empirical Length of 5 for the preliminary line position information of each rail, ensuring that all straight lines obtained by the DBSCAN clustering algorithm enter the queue each time;
step 103: the straight-line slopes k_j of the preliminary line position information of the two rails obtained by the DBSCAN probability density clustering algorithm enter the queue in sequence, and the queue average is selected as the comparison value:

k_mean = (1 / Length) · Σ_{j=1}^{Length} k_j

then, according to the pixel width between the two rails in the front scene image, an empirical Distance threshold of 5 is set, and the following criterion is applied to the preliminary slope of each rail line:

if |k_j − k_mean| < Distance, k_j enters the queue; otherwise k_j is discarded and k_mean is enqueued in its place;
the finally updated queue average k_mean obtained by the above criterion is the slope of the rail line, completing the correction of the slopes of both rail lines and yielding the corrected point-slope position information of the two rail lines.
3. The time-space synchronized millimeter wave radar and visual information fusion method according to claim 1 or 2, characterized in that: the fourth substep of step two is realized by the following specific method:
step 101: take the intersection point p_0 of the two rail lines l_left, l_right as the initial traversal point;
step 102: traverse the two rail lines in the front scene image along the Y_p axis direction; both rail lines share the same traversal distance L = Width − y_p0, where y_p0 is the Y_p-axis coordinate of the intersection point p_0 and Width is the width of the front scene image;
step 103: obtain the Y_p-axis coordinates of the straight-line traversal points of the left and right rails:

y_left = y_right = y_p0 + i·Δy, i = 1, 2, …, n    (1)

in formula (1): y_left, y_right are the Y_p-axis coordinates of the left and right rail traversal points; y_p0 is the Y_p-axis coordinate of the initial traversal point p_0 of the rail lines l_left, l_right; k_left, k_right are the slopes of the left and right rails; the traversal interval is the same for both rails, with value Δy = log_a L, and the number of traversal points is n = L / log_a L, where L is the traversal distance;
step 104: obtain the X_p-axis coordinates of the left and right rail traversal points from the results of steps 101-103, finally obtaining the point coordinates p_left(x_left, y_left), p_right(x_right, y_right) of all logarithm-based traversal points of the left and right rails in the front scene image:

x_left = x_p0 + (y_left − y_p0) / k_left
x_right = x_p0 + (y_right − y_p0) / k_right    (2)

in formula (2): x_left, x_right are the X_p-axis coordinates of the left and right rail traversal points; y_left, y_right are their Y_p-axis coordinates; x_p0, y_p0 are the X_p- and Y_p-axis coordinates of the initial traversal point p_0; and k_left, k_right are the slopes of the left and right rail lines l_left, l_right;
4. The time-space synchronized millimeter wave radar and visual information fusion method of claim 3, wherein: the fifth substep of step two is implemented as follows:
step 101: homogenize the gray values of the rail-line traversal points, namely compute the mean gray value of 4 consecutive traversal points as a new mean traversal point, with coordinates p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right);
step 102: determine the position of the abrupt gray-value change: the coordinates p_mean_left(x_mean_left, y_mean_left), p_mean_right(x_mean_right, y_mean_right) at the mutation positions are taken as the coordinates of the left and right bottom points of the same-track train ahead, and their arithmetic mean is taken as the coordinate of the bottom middle point of the same-track train ahead:

x_bottom = (x_mean_left + x_mean_right) / 2
y_bottom = (y_mean_left + y_mean_right) / 2    (3)

in formula (3): x_bottom, y_bottom are the X_p- and Y_p-axis coordinates of the bottom middle point of the same-track train ahead; x_mean_left, y_mean_left are the X_p- and Y_p-axis coordinates of the left bottom point; x_mean_right, y_mean_right are the X_p- and Y_p-axis coordinates of the right bottom point; this finally yields the bottom middle point coordinate p(x_bottom, y_bottom) of the same-track train ahead;
in the sixth substep, the bottom middle point with coordinate p(x_bottom, y_bottom) is corrected based on Kalman filtering to obtain smoothly transitioning bottom middle point information of the same-track train ahead in each period: an empirical Kalman filtering distance threshold d_threshold = 50 is set; when the Euclidean distance between p(x_bottom, y_bottom)_i and p(x_bottom, y_bottom)_{i−1} is greater than the distance threshold d_threshold, the bottom middle target position point of the same-track train ahead in the current period is discarded and replaced by that of the previous period; meanwhile, Kalman filtering is applied to all bottom middle target position points below the distance threshold to realize smooth transition of the point positions;
5. The time-space synchronized millimeter wave radar and visual information fusion method of claim 3, wherein: the third substep of step three comprises the following steps:
step 101: the lateral distance d_x and longitudinal distance d_y between each target ahead and the millimeter wave radar coordinate origin are first converted into the relative distance in the millimeter wave radar coordinate system,

d_r = √(d_x² + d_y²),

and each d_r is then compared with the relative distance d_w of the bottom middle point p(x_bottom, y_bottom) of the same-track train in the millimeter wave radar coordinate system; if |d_w − d_r| < Δd_threshold, the radar point corresponding to d_r is retained, otherwise it is deleted, realizing the coarse screening of the radar point coordinates p_i(x_p, y_p) and obtaining the coarsely screened radar point coordinates p_j(x_p, y_p);
step 102: the coarsely screened radar point coordinates p_j(x_p, y_p) in the front scene image are selected; using the bottom middle point information p(x_bottom, y_bottom) of the same-track train in the front scene image obtained in step two as the reference center, the radar point coordinate p_j(x_p, y_p) closest to the bottom middle point p(x_bottom, y_bottom) is selected according to the minimum-Euclidean-distance constraint as the final radar point screening result, and the relative distance information d_r corresponding to the finally screened radar point is taken as the final ranging result.
6. The time-space synchronized millimeter wave radar and visual information fusion method of claim 3, wherein: the sixth substep of step two is implemented as follows:
first, an empirical Kalman filtering distance threshold d_threshold = 50 is set; when the Euclidean distance between the bottom middle point p(x_bottom, y_bottom)_i of the same-track train ahead in the i-th period and the bottom middle point p(x_bottom, y_bottom)_{i−1} in the (i−1)-th period is greater than the distance threshold d_threshold, the bottom middle target position point of the same-track train ahead in the current period is discarded and replaced by that of the previous period; meanwhile, Kalman filtering is applied to all bottom middle target position points below the distance threshold to realize smooth transition of the point positions.
CN202110455091.8A 2021-04-26 2021-04-26 Time-space synchronization millimeter wave radar and visual information fusion method Active CN113189583B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110455091.8A CN113189583B (en) 2021-04-26 2021-04-26 Time-space synchronization millimeter wave radar and visual information fusion method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110455091.8A CN113189583B (en) 2021-04-26 2021-04-26 Time-space synchronization millimeter wave radar and visual information fusion method

Publications (2)

Publication Number Publication Date
CN113189583A (application publication) 2021-07-30
CN113189583B (granted publication) 2022-07-01

Family

ID=76978999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110455091.8A Active CN113189583B (en) 2021-04-26 2021-04-26 Time-space synchronization millimeter wave radar and visual information fusion method

Country Status (1)

Country Link
CN (1) CN113189583B (en)



Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140203959A1 (en) * 2013-01-18 2014-07-24 Caterpillar Inc. Object recognition system having radar and camera input
CN107818557A (en) * 2016-09-12 2018-03-20 德尔福技术有限公司 Enhanced camera object for automotive vehicle detects
CN107609522A (en) * 2017-09-19 2018-01-19 东华大学 A kind of information fusion vehicle detecting system based on laser radar and machine vision
CN108960183A (en) * 2018-07-19 2018-12-07 北京航空航天大学 A kind of bend target identification system and method based on Multi-sensor Fusion
US20200174112A1 (en) * 2018-12-03 2020-06-04 CMMB Vision USA Inc. Method and apparatus for enhanced camera and radar sensor fusion
WO2020134512A1 (en) * 2018-12-29 2020-07-02 南京慧尔视智能科技有限公司 Traffic detection system based on millimeter wave radar and video
WO2020216316A1 (en) * 2019-04-26 2020-10-29 纵目科技(上海)股份有限公司 Driver assistance system and method based on millimetre wave radar, terminal, and medium
CN111368706A (en) * 2020-03-02 2020-07-03 南京航空航天大学 Data fusion dynamic vehicle detection method based on millimeter wave radar and machine vision
CN111546328A (en) * 2020-04-02 2020-08-18 天津大学 Hand-eye calibration method based on three-dimensional vision measurement
CN111832410A (en) * 2020-06-09 2020-10-27 北京航空航天大学 Forward train detection method based on fusion of vision and laser radar
CN111856441A (en) * 2020-06-09 2020-10-30 北京航空航天大学 Train positioning method based on fusion of vision and millimeter wave radar
CN111461088A (en) * 2020-06-17 2020-07-28 长沙超创电子科技有限公司 Rail transit obstacle avoidance system based on image processing and target recognition

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Z. Wang, G. Yu, B. Zhou, P. Wang and X. Wu: "A Train Positioning Method Based on Vision and Millimeter-Wave Radar Data Fusion", IEEE Transactions on Intelligent Transportation Systems
丁雅斌, 彭翔, 刘则毅, 牛憨笨: "Multi-view-field depth image fusion based on generalized iso-surface extraction" (《基于广义等值面提取的多视场深度像融合》), Journal of Engineering Graphics (《工程图学学报》)
姚文韬, 沈春锋, 董文生: "An adaptive camera and lidar joint calibration algorithm" (《一种自适应摄像机与激光雷达联合标定算法》), Control Engineering of China (《控制工程》)
郑云水, 郭双全, 董昱: "Research on obstacle detection ahead of running trains based on radar measurement data" (《基于雷达测量数据的列车运行前方障碍物检测方法研究》), Journal of the China Railway Society (《铁道学报》)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114708585A (en) * 2022-04-15 2022-07-05 电子科技大学 Three-dimensional target detection method based on attention mechanism and integrating millimeter wave radar with vision
CN114708585B (en) * 2022-04-15 2023-10-10 电子科技大学 Attention mechanism-based millimeter wave radar and vision fusion three-dimensional target detection method
CN115169452A (en) * 2022-06-30 2022-10-11 北京中盛国芯科技有限公司 System and method for fusing target information based on space-time synchronization queue characteristics
CN115877328A (en) * 2023-03-06 2023-03-31 成都鹰谷米特科技有限公司 Signal receiving and transmitting method of array radar and array radar

Also Published As

Publication number Publication date
CN113189583B (en) 2022-07-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant