CN111369541A - Vehicle detection method for intelligent automobile under severe weather condition - Google Patents

Vehicle detection method for intelligent automobile under severe weather condition

Info

Publication number
CN111369541A
Authority
CN
China
Prior art keywords
vehicle
detection
target
image
targets
Prior art date
Legal status
Granted
Application number
CN202010151618.3A
Other languages
Chinese (zh)
Other versions
CN111369541B (en)
Inventor
詹军
王战古
段春光
管欣
卢萍萍
杨凯
祝怀南
仲昭辉
Current Assignee
Jilin University
Original Assignee
Jilin University
Priority date
Filing date
Publication date
Application filed by Jilin University
Priority to CN202010151618.3A
Publication of CN111369541A
Application granted
Publication of CN111369541B
Legal status: Active
Anticipated expiration

Classifications

    • G06T 7/0002: Image analysis; Inspection of images, e.g. flaw detection
    • G06F 18/24: Pattern recognition; Analysing; Classification techniques
    • G06T 7/11: Image analysis; Segmentation; Region-based segmentation
    • G06T 7/80: Image analysis; Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T 2207/10032: Image acquisition modality; Satellite or aerial image; Remote sensing
    • G06T 2207/10044: Image acquisition modality; Radar image
    • G06T 2207/10048: Image acquisition modality; Infrared image
    • G06T 2207/20081: Special algorithmic details; Training; Learning
    • G06T 2207/20221: Special algorithmic details; Image combination; Image fusion; Image merging
    • G06T 2207/30248: Subject of image; Vehicle exterior or interior
    • Y02A 90/10: Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention discloses a vehicle detection method for an intelligent automobile under severe weather conditions, which solves the problem that vehicle detection by an intelligent automobile is difficult in severe environments such as night, rain, fog and snow. Vehicle target detection and image capture are carried out by a millimeter wave radar and a thermal imaging camera respectively; the vehicle targets detected by the millimeter wave radar are projected onto the infrared thermal image captured by the thermal imaging camera through coordinate transformation to obtain the approximate position distribution of the vehicle detection targets in the infrared thermal image; the vehicle regions of interest are extracted and segmented; a DPM target detection algorithm is adopted to carry out hypothesis verification on the vehicle regions of interest and to carry out regression prediction of the vehicle detection frame; the detection targets of the millimeter wave radar and the thermal imaging camera are fused by calculating a cost matrix, and finally the fused vehicle detection targets are tracked with Kalman filtering.

Description

Vehicle detection method for intelligent automobile under severe weather condition
Technical Field
The invention relates to the field of intelligent automobile environment perception, in particular to a severe weather vehicle detection method based on information fusion of an infrared thermal image and a millimeter wave radar.
Background
Environmental awareness is the basis and precondition for intelligent automobile decision making and motion control, and vehicles are the key objects of environmental awareness as the main participants of traffic. The real-time performance and accuracy of vehicle detection in good weather conditions are already high, but vehicle detection in severe weather (night, rainy days, snowy days and foggy days) still has many challenges.
Iwasaki Y et al in his paper "road vehicle detection under vacuum environment to real road traffic surface using infrared thermographic images of significant grayscale and texture features of areas such as tires, exhaust stacks, windows, etc. vehicle detection is performed using infrared thermographic images. Wang Hai et al in the paper "Night-time vehicle sensing in a farinfrared image with deep learning" utilizes a visual saliency algorithm to extract a region of interest of a vehicle from an infrared thermal image, and then uses a deep confidence network DBN to realize the detection of the vehicle. Chui-Qin et al, thesis "infrared image vehicle detection method based on SLPP-SHOG", Zhao Ying Man et al, thesis "infrared vehicle detection based on Gabor filter and SVM classifier", Qinan et al, thesis "infrared vehicle detection technology based on visual saliency and target confidence", and Zhouyi-Jing thesis "infrared vehicle target tracking algorithm research based on SVM and mean translation", the front vehicle identification and vehicle distance detection based on infrared images of nedy's on paper, "the infrared sports vehicle target detection based on visual saliency of a fang yuan ' on paper," the unmanned vehicle night environment perception based on an improved YOLOv3 network "in the field of euphoria on paper," the front vehicle identification and vehicle distance detection based on infrared images of a nedy's on paper, "the night front vehicle detection method for a heavy truck" all utilize infrared images and adopt different methods to realize the detection of traffic vehicles. Swerting in his master thesis "vehicle detection based on convolutional neural networks in infrared images" used R-CNN and SSD models to detect vehicles in infrared thermal images. Zhao Xiaofeng et al put forward an SSD model based on incomplete window optimization in a thesis "special vehicle infrared camouflage detection method based on improved SSD", and the accuracy of vehicle detection is improved by using an infrared image method.
Ding Hong, in the thesis "Forward vehicle detection research based on millimeter wave radar and machine vision", realized vehicle detection at curves and at night by fusing millimeter wave radar with infrared images.
An analysis of the existing infrared-image vehicle detection technology reveals several shortcomings. First, complete perception information about a vehicle cannot be obtained with a thermal imaging camera alone; for example, a vehicle detection algorithm based only on infrared thermal images cannot provide the distance and motion state of the vehicle, and this information is crucial for intelligent-vehicle behavior decision and motion planning. In addition, many vehicle detection algorithms rely on region-of-interest generation; extracting the vehicle region of interest (ROI) only with image processing algorithms is easily disturbed by the surrounding environment, the quality and accuracy of ROI extraction still need to be improved, and the additional image processing easily reduces the real-time performance of the whole vehicle detection system.
Disclosure of Invention
The invention provides a vehicle detection method of an intelligent vehicle under severe weather conditions, which solves the problem that the vehicle detection of the intelligent vehicle is difficult in severe environments such as night, rainy days, foggy days and snowy days.
The purpose of the invention is realized by the following technical scheme:
a vehicle detection method of an intelligent automobile under severe weather conditions comprises the following steps:
step 1, vehicle region-of-interest generation: respectively carrying out vehicle target detection and image capture through a millimeter wave radar and a thermal imaging camera; projecting a vehicle target detected by the millimeter wave radar to an infrared thermal image captured by a thermal imaging camera through coordinate transformation to obtain approximate position distribution of the vehicle detection target in the infrared thermal image; extracting and segmenting a vehicle region of interest;
step 2, vehicle hypothesis verification: adopting a DPM target detection algorithm to carry out hypothesis verification on the vehicle region of interest and carrying out regression prediction on a detection frame of the vehicle;
step 3, target fusion and tracking: and fusing vehicle detection targets of the millimeter-wave radar and the thermal imaging camera by calculating a cost matrix, and finally tracking the fused vehicle detection targets by adopting Kalman filtering.
Further, the step 1 of generating the vehicle region of interest specifically includes the following steps:
step 1.1, respectively fixing a millimeter wave radar and a thermal imaging camera at specified positions of a vehicle;
step 1.2, calibrating the parameters of the thermal imaging camera by using the Zhang Zhengyou calibration method; according to the camera imaging principle and projection transformation, an object point p(x_w, y_w, z_w) in the environment is converted to the image coordinates p(u, v) by the following transformation:
s·[u, v, 1]^T = K·[R_c | t]·[x_w, y_w, z_w, 1]^T
where K is the camera intrinsic matrix, [R_c | t] the extrinsic rotation and translation, and s a scale factor;
step 1.3, defining the world coordinate system to be the same as the vehicle coordinate system, wherein the radar coordinate plane X_r O_r Y_r is parallel to the XOY plane of the world coordinate system, the distance between the two planes in the vertical direction is Z_o, and the offset in the Y direction is Y_o; then any point P(R, α) in the radar coordinate system is converted into the vehicle coordinate system as:
x_w = R·sin α,  y_w = R·cos α + Y_o,  z_w = Z_o
step 1.4, obtaining the conversion relation of the object point P between the radar coordinate system and the camera coordinate system according to step 1.2 and step 1.3, as follows:
s·[u, v, 1]^T = K·[R_c | t]·[R·sin α, R·cos α + Y_o, Z_o, 1]^T
step 1.5, ensuring the synchronization of information transmission of the millimeter wave radar and the thermal imaging camera by adopting a timestamp alignment mode;
step 1.6, screening vehicle targets detected by the millimeter wave radar to obtain effective vehicle targets;
step 1.7, projecting the effective vehicle targets screened in the step 1.6 to an infrared thermal image by using the coordinate conversion formula in the step 1.4 to obtain approximate position distribution of the vehicle detection targets in the infrared thermal image;
step 1.8, dynamically extracting the vehicle regions of interest in the infrared thermal image according to the vehicle position distribution;
and step 1.9, performing image enhancement on each extracted vehicle region of interest by adopting an adaptive gray enhancement algorithm, and improving the identification degree of a vehicle detection target.
Further, the step 1.8 of dynamically extracting the vehicle region of interest specifically includes the following steps:
step 1.8.1, selecting an experimental site, equally dividing the road length and drawing a line;
step 1.8.2, selecting an experimental vehicle, and respectively stopping the experimental vehicle on a drawing line of an experimental field;
step 1.8.3, keeping the camera still, recording image information of the vehicle at different distances from the camera in sequence, and extracting and recording width information of vehicle pixels in each image;
step 1.8.4, obtaining through the above steps several groups of mappings between vehicle pixel width in the image and distance, K = {(l_i, w_i)}, i ∈ [1, 20], wherein l_i represents the distance of the i-th measurement point from the camera and w_i represents the pixel width of the vehicle at the i-th measurement point;
step 1.8.5, performing regression fitting on the groups of measurement points with different functions, wherein the functional relation between the vehicle pixel width W and the distance l is:
W = 905.6 · l^(-0.9246) - 3.472
step 1.8.6, the millimeter wave radar can accurately detect the distance l of the vehicle, and substituting l into the formula of step 1.8.5 gives the pixel width W of the vehicle in the image; the statistical aspect ratio of common vehicles is 1.2, so the pixel height of the vehicle in the image is H = W/1.2; a dynamic vehicle region of interest is established with the vehicle centroid detected by the millimeter wave radar as the center and the solved pixel width W and height H as boundaries; to prevent the loss of vehicle details, the planned region of interest is further expanded by a factor of 1.2.
Further, the step 2 vehicle hypothesis verification comprises the following specific steps:
step 2.1, training sample collection and preprocessing: respectively acquiring sensing information of the radar and the thermal imaging camera under different severe environments by adopting real vehicle experiments; marking vehicle targets and non-vehicle targets in the infrared thermal images, and collecting positive and negative training sample pictures;
step 2.2, augmenting the training samples;
step 2.3, modeling and training of DPM: randomly selecting 80% of samples as training samples, and 20% of samples as testing samples; evenly dividing the training samples into a plurality of parts, and performing cross validation during training; obtaining a vehicle detection classifier through repeated training and iteration;
step 2.4, carrying out DPM feature extraction on the vehicle region of interest by adopting a multi-scale sliding window, identifying the vehicle by utilizing the trained vehicle classifier, and finally carrying out regression prediction on the detection frame of the vehicle target by adopting a non-maximum suppression algorithm.
Further, the step 3 of target fusion and tracking comprises the following specific steps:
step 3.1, assuming that the coordinates of the upper left corner and the lower right corner of the vehicle detection frame obtained in step 2 are (x_1, y_1) and (x_2, y_2) respectively, the center coordinates (x_c, y_c) of the detected vehicle can be expressed as:
x_c = (x_1 + x_2)/2,  y_c = (y_1 + y_2)/2
step 3.2, all radar detection targets and all thermal imaging camera detection targets are represented by the sets R and C respectively, where R = {(x_i^r, y_i^r), i = 1, …, m}, C = {(x_j^c, y_j^c), j = 1, …, n}, and x and y represent the abscissa and ordinate of a target in the image plane;
step 3.3, normalizing x and y of all detection targets with the maximum-minimum value method:
x = (x - x_min)/(x_max - x_min)
y = (y - y_min)/(y_max - y_min)
step 3.4, calculating a cost function value between any two sensor targets, the cost matrix being calculated as:
cost(i, j) = sqrt((x_i^r - x_j^c)^2 + (y_i^r - y_j^c)^2)
step 3.5, fusing the detection targets of the two sensors according to the calculation result of the cost function to obtain an effective vehicle detection target list;
and 3.6, directly detecting the speed, distance and angle of the vehicle by the radar, obtaining a detection frame of the vehicle by the thermal imaging camera through a DPM target detection algorithm, substituting the position information of the detection frame into the formula in the step 1.4 to obtain the height and width of the actual vehicle, and updating the fused measurement information to a target list.
And 3.7, tracking the fused vehicle target by adopting a Kalman filtering algorithm.
Through the scheme, the invention can bring the following beneficial effects:
(1) the combination of the millimeter wave radar and the thermal imaging camera is beneficial to the complementary advantages of the two sensors, and the vehicle detection performance in severe weather is greatly improved. On one hand, the perception result of the radar can provide the depth information and the motion state of the vehicle for the image; on the other hand, the vehicle detection result based on thermal imaging can assist the radar to eliminate false targets and provide contour information for the targets.
(2) Vehicle detection information of the millimeter wave radar is projected to the infrared thermal image through sensor calibration and coordinate conversion, and extraction of the vehicle region of interest is rapidly and accurately achieved. The trained DPM model only needs to calculate the images of the interested region, and compared with a detection algorithm for traversing the whole image, the DPM model greatly improves the speed of vehicle detection on one hand, effectively avoids the interference of environmental information on the other hand, and is beneficial to improving the precision of vehicle detection.
(3) A dynamic interesting region extraction window is established by adopting the idea of pixel regression fitting, the accuracy of vehicle interesting region extraction is improved, and the loss of vehicle characteristics and details is effectively avoided.
In conclusion, the vehicle detection method for intelligent automobiles under severe weather conditions provided by the invention makes full use of the advantages of the millimeter wave radar and the thermal imaging camera. On the one hand, the millimeter wave radar works in all weather and at all times of day, and provides the position information and motion state of the vehicle for the infrared thermal image; on the other hand, infrared thermal imaging withstands severe conditions such as night, fog, snow and rain, and provides information that the millimeter wave radar cannot, such as the contour and type of the vehicle. In addition, projecting the radar's vehicle detection positions onto the infrared thermal image allows the vehicle regions of interest to be extracted quickly and accurately, effectively avoids interference from environmental information, improves image processing speed and precision, and ensures the real-time performance and accuracy of vehicle detection.
Drawings
FIG. 1 is a flow chart of a vehicle detection method under severe weather conditions of an intelligent automobile according to the invention
FIG. 2 is a front view of sensor mounting and coordinate system definition
FIG. 3 is a left side view of sensor mounting and coordinate system definition
FIG. 4 is a schematic diagram illustrating the effective detection range and the dangerous area division of the millimeter wave radar
FIG. 5 is a millimeter wave radar target map
FIG. 6 is a graph of the width of a vehicle pixel versus distance fit
FIG. 7 is a dynamic region of interest generation map for a vehicle
FIG. 8 is a vehicle region-of-interest adaptive gray scale enhancement map
Detailed Description
In order to fully convey the objects, technical solutions and advantages of the present invention, the embodiments of the invention are described in detail below with reference to the drawings.
The invention provides a vehicle detection method for an intelligent vehicle under severe weather conditions, which mainly solves the difficulty of vehicle detection in severe environments such as night, rain, fog and snow. The flow chart of the whole method is shown in figure 1, and the method specifically comprises the following steps:
step 1, generating a vehicle region of interest. The vehicle detection target of the millimeter wave radar is projected to the infrared thermal image through coordinate transformation, and the extraction and segmentation of the vehicle region of interest are rapidly and accurately realized. The method specifically comprises the following steps:
Step 1.1, as shown in fig. 2, the millimeter wave radar is fixed at the center above the front bumper of the vehicle, with its emission surface facing outwards and perpendicular to the ground; the allowable deviation is +/-2 degrees in the horizontal direction and +/-1 degree in the vertical direction, and the ground clearance is 50 cm. The thermal imaging camera is fixed at the center of the vehicle roof by a bracket and, in order to capture as much road information as possible, is inclined 15 degrees towards the ground.
Step 1.2, the internal and external parameters of the thermal imaging camera are calibrated with the Zhang Zhengyou calibration method; according to the camera imaging principle and projection transformation, an object point p(x_w, y_w, z_w) in the environment is converted to the image coordinates p(u, v) by:
s·[u, v, 1]^T = K·[R_c | t]·[x_w, y_w, z_w, 1]^T
where K is the intrinsic parameter matrix, [R_c | t] the extrinsic rotation and translation, and s a scale factor.
Step 1.3, in the invention the world coordinate system is defined to be the same as the vehicle coordinate system, and the radar coordinate plane X_r O_r Y_r is parallel to the XOY plane of the world coordinate system, as shown in fig. 2 and 3. The distance between the two planes in the vertical direction is Z_o and the offset in the Y direction is Y_o; then any point P(R, α) in the radar coordinate system is converted into the vehicle coordinate system as:
x_w = R·sin α,  y_w = R·cos α + Y_o,  z_w = Z_o
Step 1.4, combining step 1.2 and step 1.3, the transformation relationship of the object point P between the radar coordinate system and the camera coordinate system is:
s·[u, v, 1]^T = K·[R_c | t]·[R·sin α, R·cos α + Y_o, Z_o, 1]^T
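The projection chain of steps 1.2 to 1.4 can be sketched as follows, assuming the standard pinhole camera model reconstructed above; the intrinsic matrix, the extrinsic rotation/translation and the mounting offsets Y_o, Z_o used here are illustrative placeholders, not the patent's calibration results.

```python
import numpy as np

# Illustrative calibration values (placeholders, not the patent's calibrated parameters)
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 256.0],
              [  0.0,   0.0,   1.0]])        # camera intrinsic matrix
R_c = np.array([[1.0, 0.0,  0.0],
                [0.0, 0.0, -1.0],
                [0.0, 1.0,  0.0]])           # maps vehicle axes (x right, y forward, z up) to camera axes
t = np.array([0.0, 0.3, 0.0])                # camera translation in camera coordinates (placeholder)
Y_o, Z_o = 2.0, -0.7                         # radar plane offsets of step 1.3 (placeholders)

def radar_to_vehicle(R_range, alpha):
    """Convert a radar detection (range R, azimuth alpha in rad) to vehicle coordinates."""
    x = R_range * np.sin(alpha)
    y = R_range * np.cos(alpha) + Y_o
    z = Z_o
    return np.array([x, y, z])

def vehicle_to_image(p_vehicle):
    """Project a 3-D point in the vehicle frame onto the image plane (pixels)."""
    p_cam = R_c @ p_vehicle + t              # extrinsic transform
    uvw = K @ p_cam                          # pinhole projection
    return uvw[:2] / uvw[2]                  # normalize by depth

# Example: a radar target 40 m ahead, 3 degrees to the side
u, v = vehicle_to_image(radar_to_vehicle(40.0, np.deg2rad(3.0)))
print(f"projected pixel: ({u:.1f}, {v:.1f})")
```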
and step 1.5, in order to ensure the information transmission time synchronization of the millimeter wave radar and the thermal imaging camera, the information transmission synchronization of the two information sensors is ensured by adopting a timestamp alignment mode based on a system local clock.
Step 1.6, the millimeter wave radar detects targets by receiving echo signals from obstacles, so in addition to vehicle targets with potential collision risk, the detected obstacles include invalid targets, stationary targets and non-dangerous targets; these do not affect the normal running of the ego vehicle and must be removed to obtain the final valid vehicle targets.
The invalid target is a false target generated by instability of radar echo signals due to vehicle bumping, electromagnetic interference and the like, the duration of the target is generally short, and the target is filtered by Kalman filtering and a life cycle algorithm.
Stationary targets refer to obstacles such as trees, guardrails and green belts in the road environment. Suppose the ego-vehicle speed is V_ego and the relative speed of a millimeter wave radar target is V_v; the absolute speed V_tar of the detected target can be expressed as:
V_tar = V_ego + V_v
Since the moving speed of a stationary object is close to 0, a speed threshold V_ref = 0.5 is set; when the detected target speed satisfies |V_tar| ≤ V_ref, the target is discarded, thereby eliminating the interference of stationary objects.
Non-dangerous targets refer to vehicle targets outside the own lane and the adjacent lanes, as shown in fig. 4. The effective detection range of the millimeter wave radar contains 5 vehicle targets, of which only Veh_2, Veh_3 and Veh_4 have potential collision risk; Veh_1 and Veh_5 are non-dangerous targets and need to be rejected. The lane width w_road of a Chinese first-class road is 3.75 m, so the lateral distance threshold w_limit is:
w_limit = (3/2)·w_road + w_vehicle/2
where w_vehicle is the vehicle width, taken as the average road-vehicle width w_vehicle = 1.8 m, giving a lateral distance threshold w_limit = 6.5 m; that is, when the lateral distance of a radar-detected target satisfies |w_detection| > w_limit, the target is discarded, thereby eliminating the interference of non-dangerous targets.
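The screening rules of step 1.6 for stationary and non-dangerous targets can be sketched as below, using the thresholds quoted above; the target record layout is an assumption, and the life-cycle filtering of invalid targets is omitted.

```python
V_REF = 0.5      # speed threshold for stationary objects (step 1.6)
W_LIMIT = 6.5    # lateral distance threshold in metres (step 1.6)

def screen_radar_targets(targets, v_ego):
    """Keep only radar targets with potential collision risk.

    targets: list of dicts with keys 'v_rel' (relative speed) and 'lateral' (lateral offset, m).
    v_ego: ego-vehicle speed.
    """
    valid = []
    for tgt in targets:
        v_abs = v_ego + tgt["v_rel"]          # absolute target speed: V_tar = V_ego + V_v
        if abs(v_abs) <= V_REF:               # stationary object -> discard
            continue
        if abs(tgt["lateral"]) > W_LIMIT:     # outside own/adjacent lanes -> discard
            continue
        valid.append(tgt)
    return valid

# Example: a stationary guardrail echo, a vehicle two lanes away, and one relevant vehicle
print(screen_radar_targets(
    [{"v_rel": -15.0, "lateral": 1.0},
     {"v_rel": 3.0, "lateral": 8.2},
     {"v_rel": -2.0, "lateral": -1.5}],
    v_ego=15.0))
```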
And step 1.7, projecting the vehicle effective target screened in the step 1.6 to an infrared thermal image by using the coordinate conversion formula of the radar and the camera in the step 1.4, and obtaining the approximate position distribution of the vehicle target in the infrared thermal image as shown in fig. 5.
Step 1.8, the size of the vehicle region of interest is dynamically changed in the image, and in order to improve the accuracy of region of interest extraction, the invention provides a dynamic region of interest extraction method based on pixel regression, which specifically comprises the following steps:
step 1.8.1, selecting an open experimental field, ensuring that the road length is more than or equal to 100m, uniformly dividing the road length into 20 parts, and drawing a transverse line every 5 m.
Step 1.8.2, statistics on common road vehicles show an average width of 1.8 m and an average width-to-height ratio of 1.2. A vehicle of this kind is selected as the experimental vehicle and parked on each drawn transverse line in turn, with the plane of the vehicle tail aligned with the transverse line at every stop.
And 1.8.3, keeping the camera still, and recording the image information of the vehicle at different distances from the camera in sequence. And an image processing tool is adopted to extract and record the width information of the vehicle pixel in each image.
Step 1.8.4, the above experiment yields a total of 20 groups of mappings between vehicle pixel width in the image and distance, K = {(l_i, w_i)}, i ∈ [1, 20], where l_i is the distance of the i-th measurement point from the camera and w_i is the pixel width of the vehicle at the i-th measurement point.
Step 1.8.5, regression fitting is performed on the 20 measurement points with different candidate functions, and the fitting results are shown in fig. 6. The power function fits best, with SSE = 3.17 and RMSE = 0.4316; the functional relation between the vehicle pixel width W and the distance l is:
W = 905.6 · l^(-0.9246) - 3.472
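The regression of step 1.8.5 can be reproduced with a generic power-function fit of the form W = a·l^b + c, as in this sketch; the measurement values below are synthetic stand-ins, since the patent's raw data are not given.

```python
import numpy as np
from scipy.optimize import curve_fit

def power_model(l, a, b, c):
    """Power-function model of vehicle pixel width vs. distance: W = a * l**b + c."""
    return a * np.power(l, b) + c

# Illustrative (not measured) width/distance pairs, 5 m spacing as in step 1.8.1
distances = np.arange(5.0, 105.0, 5.0)                              # l_i in metres
widths = 905.6 * distances ** -0.9246 - 3.472                       # generated from the fitted curve
widths += np.random.default_rng(0).normal(0.0, 0.5, widths.size)    # noise to mimic measurements

params, _ = curve_fit(power_model, distances, widths, p0=(900.0, -1.0, 0.0))
a, b, c = params
print(f"fitted model: W = {a:.1f} * l^({b:.4f}) + {c:.3f}")
```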
step 1.8.6, the millimeter wave radar can accurately detect the distance l of the vehicle, and the pixel width W of the vehicle in the image can be obtained by substituting l into the formula of step 1.8.5. From the statistical results of the aspect ratios of the common vehicles, the average aspect ratio of the vehicle is 1.2, and the pixel height H of the vehicle in the image can be obtained as W/1.2. And establishing a vehicle dynamic region of interest by taking the vehicle mass center detected by the millimeter wave radar as the center and the solved vehicle pixel width W and height H as boundaries.
Step 1.8.7, considering that there is a certain error in the vehicle type, the change of the attitude, and the vehicle centroid position provided by the radar, in order to prevent the vehicle feature missing and the target missing, the planned dynamic region of interest needs to be expanded by 1.2 times, as shown in fig. 7.
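A sketch combining steps 1.8.6 and 1.8.7: the fitted width model, the aspect ratio of 1.2 and the 1.2 expansion factor give a dynamic ROI around the projected radar centroid; the clamping to the image border and the image size are added assumptions.

```python
def dynamic_roi(center_u, center_v, distance, img_w=640, img_h=512, expand=1.2):
    """Build the dynamic vehicle ROI around the projected radar centroid.

    center_u, center_v: projected vehicle centroid in pixels.
    distance: radar range l in metres.
    Returns (u0, v0, u1, v1) clamped to the image, or None for degenerate boxes.
    """
    width = 905.6 * distance ** -0.9246 - 3.472   # fitted pixel width (step 1.8.5)
    height = width / 1.2                          # statistical aspect ratio of 1.2
    width *= expand                               # expand to avoid losing vehicle details
    height *= expand
    u0 = max(0, int(center_u - width / 2))
    v0 = max(0, int(center_v - height / 2))
    u1 = min(img_w - 1, int(center_u + width / 2))
    v1 = min(img_h - 1, int(center_v + height / 2))
    if u1 <= u0 or v1 <= v0:
        return None
    return u0, v0, u1, v1

print(dynamic_roi(320, 300, distance=40.0))
```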
Step 1.9, performing image enhancement on each extracted vehicle region of interest by adopting an adaptive gray enhancement algorithm, and improving the identification degree of a vehicle target, as shown in fig. 8.
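The patent does not name the specific adaptive gray enhancement algorithm of step 1.9; one common choice for low-contrast infrared ROIs is contrast-limited adaptive histogram equalization (CLAHE), used below as an assumed stand-in.

```python
import cv2
import numpy as np

def enhance_roi(gray_roi, clip_limit=2.0, tile_grid=(8, 8)):
    """Enhance an 8-bit grayscale ROI with CLAHE (one possible adaptive gray enhancement)."""
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid)
    return clahe.apply(gray_roi)

# Example on a synthetic low-contrast ROI
roi = np.random.default_rng(1).normal(120, 10, (64, 96)).clip(0, 255).astype(np.uint8)
enhanced = enhance_roi(roi)
print(roi.std(), enhanced.std())   # the enhanced ROI has a wider gray-level spread
```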
Step 2, vehicle hypothesis verification. Hypothesis verification is performed on the vehicle regions of interest with the classical DPM target detection algorithm, and the vehicle detection frame is predicted by regression. The method mainly comprises the following steps:
and 2.1, training sample collection and pretreatment. The real vehicle experiment is adopted to respectively collect the sensing information of the radar and the thermal imaging camera under different severe environments (night, rainy days, snowy days and foggy days). And marking the vehicle target and the non-vehicle target of the infrared thermal image by adopting a manual marking mode, and collecting 8200 positive training sample pictures (vehicle targets) and 7900 negative training sample pictures (non-vehicle targets).
Step 2.2, augmenting the training samples. To increase the amount of training data, prevent overfitting during model training and improve training accuracy, the invention augments the training samples with the following four methods: mirror flipping, contrast change, brightness change and adaptive gray scale enhancement. After augmentation, 26800 positive training samples and 25700 negative training samples are available.
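A sketch of the four augmentation operations of step 2.2 (mirror flipping, contrast change, brightness change, adaptive gray enhancement); the gain and offset values are illustrative.

```python
import cv2
import numpy as np

def augment(gray):
    """Return the four augmented variants of an 8-bit grayscale training sample."""
    mirrored = cv2.flip(gray, 1)                                  # horizontal mirror
    contrast = cv2.convertScaleAbs(gray, alpha=1.3, beta=0)       # contrast change (gain 1.3)
    brightness = cv2.convertScaleAbs(gray, alpha=1.0, beta=25)    # brightness change (+25)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))   # adaptive gray enhancement
    enhanced = clahe.apply(gray)
    return [mirrored, contrast, brightness, enhanced]

sample = np.full((64, 96), 100, dtype=np.uint8)
print([img.mean() for img in augment(sample)])
```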
Step 2.3, DPM modeling and training. The DPM target detection model is built with C++ and OpenCV. 80% of the samples are randomly selected as training samples and 20% as test samples; the training samples are equally divided into 5 parts for cross-validation during training. The vehicle detection classifier is obtained through repeated training and iteration.
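The 80/20 split and 5-fold cross-validation of step 2.3 can be sketched as follows with generic stand-in features, since the DPM training itself is done in the patent's C++/OpenCV implementation.

```python
import numpy as np
from sklearn.model_selection import train_test_split, KFold

rng = np.random.default_rng(0)
features = rng.normal(size=(1000, 36))          # stand-in feature vectors
labels = rng.integers(0, 2, size=1000)          # 1 = vehicle, 0 = non-vehicle

# 80 % training / 20 % test split (step 2.3)
x_train, x_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

# 5-fold cross-validation over the training set
folds = KFold(n_splits=5, shuffle=True, random_state=0).split(x_train)
for fold, (tr_idx, va_idx) in enumerate(folds):
    # a real run would train and validate the DPM model on these index sets
    print(f"fold {fold}: {len(tr_idx)} training samples, {len(va_idx)} validation samples")
```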
Step 2.4, DPM features are extracted from the vehicle regions of interest with a multi-scale sliding window, the vehicles are identified with the trained vehicle classifier, and finally the detection frame of each vehicle target is regressed with a non-maximum suppression algorithm.
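A sketch of the greedy non-maximum suppression used in step 2.4 to obtain the final detection frame from overlapping sliding-window responses; the IoU threshold is an illustrative value.

```python
import numpy as np

def non_max_suppression(boxes, scores, iou_thresh=0.5):
    """Greedy NMS. boxes: (N, 4) array of (x1, y1, x2, y2); scores: (N,) classifier scores."""
    order = np.argsort(scores)[::-1]
    keep = []
    while order.size > 0:
        best = order[0]
        keep.append(int(best))
        # intersection of the best box with the remaining boxes
        x1 = np.maximum(boxes[best, 0], boxes[order[1:], 0])
        y1 = np.maximum(boxes[best, 1], boxes[order[1:], 1])
        x2 = np.minimum(boxes[best, 2], boxes[order[1:], 2])
        y2 = np.minimum(boxes[best, 3], boxes[order[1:], 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        area_best = (boxes[best, 2] - boxes[best, 0]) * (boxes[best, 3] - boxes[best, 1])
        area_rest = (boxes[order[1:], 2] - boxes[order[1:], 0]) * (boxes[order[1:], 3] - boxes[order[1:], 1])
        iou = inter / (area_best + area_rest - inter)
        order = order[1:][iou <= iou_thresh]   # drop boxes that overlap the kept one too much
    return keep

boxes = np.array([[10, 10, 60, 50], [12, 12, 62, 52], [200, 80, 260, 130]], dtype=float)
scores = np.array([0.9, 0.8, 0.7])
print(non_max_suppression(boxes, scores))   # indices of the kept boxes
```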
And 3, fusing and tracking the target. And fusing detection targets of the radar and the thermal imaging camera by calculating a cost matrix, and finally tracking the fused vehicle detection target by adopting Kalman filtering. The method specifically comprises the following steps:
Step 3.1, assume that the coordinates of the upper left corner and the lower right corner of the vehicle detection frame from step 2 are (x_1, y_1) and (x_2, y_2) respectively; the center coordinates (x_c, y_c) of the detected vehicle can be expressed as:
x_c = (x_1 + x_2)/2,  y_c = (y_1 + y_2)/2
Step 3.2, all radar detection targets and all thermal imaging camera detection targets are represented by the sets R and C respectively, where R = {(x_i^r, y_i^r), i = 1, …, m}, C = {(x_j^c, y_j^c), j = 1, …, n}, and x and y represent the abscissa and ordinate of a target in the image plane.
Step 3.3, in order to eliminate the difference in the numerical distributions of the abscissa and the ordinate, x and y of all detection targets are normalized with the maximum-minimum value method:
x = (x - x_min)/(x_max - x_min)
y = (y - y_min)/(y_max - y_min)
Step 3.4, the cost function value between any two sensor targets is calculated, and the two targets with the smaller cost function value are judged to be the same target; the cost matrix is calculated as:
cost(i, j) = sqrt((x_i^r - x_j^c)^2 + (y_i^r - y_j^c)^2)
And 3.5, fusing the detection targets of the two sensors according to the calculation result of the cost function to obtain an effective vehicle detection target list.
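A sketch of the fusion in steps 3.2 to 3.5, assuming the cost of a radar/camera pair is the Euclidean distance between their normalized image-plane positions and that pairs are matched one-to-one with the Hungarian algorithm; both choices are assumptions consistent with, though not spelled out verbatim in, the text.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def normalize(points):
    """Min-max normalize x and y independently (step 3.3)."""
    pts = np.asarray(points, dtype=float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    return (pts - mins) / np.maximum(maxs - mins, 1e-9)

def fuse_targets(radar_pts, camera_pts, max_cost=0.2):
    """Associate radar and camera detections via a cost matrix of normalized distances."""
    all_pts = normalize(list(radar_pts) + list(camera_pts))
    r, c = all_pts[:len(radar_pts)], all_pts[len(radar_pts):]
    cost = np.linalg.norm(r[:, None, :] - c[None, :, :], axis=2)   # cost matrix (step 3.4)
    rows, cols = linear_sum_assignment(cost)                        # one-to-one assignment
    return [(i, j) for i, j in zip(rows, cols) if cost[i, j] <= max_cost]

radar = [(320, 300), (480, 310)]
camera = [(478, 305), (322, 298), (120, 260)]
print(fuse_targets(radar, camera))   # pairs of (radar index, camera index)
```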
And 3.6, the radar can directly detect the speed, the distance and the angle of the vehicle, the thermal imaging camera can obtain a detection frame of the vehicle through a DPM target detection algorithm, and the position information of the detection frame is substituted into the formula in the step 1.4 to obtain the height and the width of the actual vehicle. The fused vehicle detection target can simultaneously obtain the speed, distance, angle, width and height information of the vehicle.
And 3.7, finally, tracking the fused vehicle target by adopting a Kalman filtering algorithm.
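A minimal constant-velocity Kalman filter sketch for the tracking of step 3.7; the state layout (image position plus velocity) and the noise covariances are assumptions, since the patent does not specify the tracking model.

```python
import numpy as np

class VehicleTrack:
    """Constant-velocity Kalman filter over a fused target's image-plane position."""

    def __init__(self, x, y, dt=0.05):
        self.state = np.array([x, y, 0.0, 0.0])   # [x, y, vx, vy]
        self.P = np.eye(4) * 10.0                 # state covariance
        self.F = np.eye(4)                        # constant-velocity motion model
        self.F[0, 2] = self.F[1, 3] = dt
        self.H = np.eye(2, 4)                     # we only measure (x, y)
        self.Q = np.eye(4) * 0.01                 # process noise (assumed)
        self.R = np.eye(2) * 1.0                  # measurement noise (assumed)

    def predict(self):
        self.state = self.F @ self.state
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.state[:2]

    def update(self, zx, zy):
        z = np.array([zx, zy])
        y = z - self.H @ self.state               # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)  # Kalman gain
        self.state = self.state + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P

track = VehicleTrack(320, 300)
for meas in [(322, 301), (325, 303), (327, 304)]:
    track.predict()
    track.update(*meas)
print(track.state.round(2))
```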

Claims (5)

1. A vehicle detection method for an intelligent automobile under severe weather conditions, characterized by comprising the following steps:
step 1, vehicle region-of-interest generation: respectively carrying out vehicle target detection and image capture through a millimeter wave radar and a thermal imaging camera; projecting a vehicle target detected by the millimeter wave radar to an infrared thermal image captured by a thermal imaging camera through coordinate transformation to obtain approximate position distribution of the vehicle detection target in the infrared thermal image; extracting and segmenting a vehicle region of interest;
step 2, vehicle hypothesis verification: adopting a DPM target detection algorithm to carry out hypothesis verification on the vehicle region of interest and carrying out regression prediction on a detection frame of the vehicle;
step 3, target fusion and tracking: and fusing vehicle detection targets of the millimeter-wave radar and the thermal imaging camera by calculating a cost matrix, and finally tracking the fused vehicle detection targets by adopting Kalman filtering.
2. The vehicle detection method for an intelligent automobile under severe weather conditions as claimed in claim 1, wherein the step 1 of generating the vehicle region of interest specifically comprises the following steps:
step 1.1, respectively fixing a millimeter wave radar and a thermal imaging camera at specified positions of a vehicle;
step 1.2, calibrating the parameters of the thermal imaging camera by using the Zhang Zhengyou calibration method; according to the camera imaging principle and projection transformation, an object point p(x_w, y_w, z_w) in the environment is converted to the image coordinates p(u, v) by the following transformation:
s·[u, v, 1]^T = K·[R_c | t]·[x_w, y_w, z_w, 1]^T
where K is the camera intrinsic matrix, [R_c | t] the extrinsic rotation and translation, and s a scale factor;
step 1.3, defining the world coordinate system to be the same as the vehicle coordinate system, wherein the radar coordinate plane X_r O_r Y_r is parallel to the XOY plane of the world coordinate system, the distance between the two planes in the vertical direction is Z_o, and the offset in the Y direction is Y_o; then any point P(R, α) in the radar coordinate system is converted into the vehicle coordinate system as:
x_w = R·sin α,  y_w = R·cos α + Y_o,  z_w = Z_o
step 1.4, obtaining the conversion relation of the object point P between the radar coordinate system and the camera coordinate system according to step 1.2 and step 1.3, as follows:
s·[u, v, 1]^T = K·[R_c | t]·[R·sin α, R·cos α + Y_o, Z_o, 1]^T
step 1.5, ensuring the synchronization of information transmission of the millimeter wave radar and the thermal imaging camera by adopting a timestamp alignment mode;
step 1.6, screening vehicle targets detected by the millimeter wave radar to obtain effective vehicle targets;
step 1.7, projecting the effective vehicle targets screened in the step 1.6 to an infrared thermal image by using the coordinate conversion formula in the step 1.4 to obtain approximate position distribution of the vehicle detection targets in the infrared thermal image;
step 1.8, dynamically extracting a vehicle region of interest in the thermal infrared image according to vehicle distribution;
and step 1.9, performing image enhancement on each extracted vehicle region of interest by adopting an adaptive gray enhancement algorithm, and improving the identification degree of a vehicle detection target.
3. The vehicle detection method for an intelligent automobile under severe weather conditions as claimed in claim 2, wherein the step 1.8 of dynamically extracting the vehicle region of interest specifically comprises the following steps:
step 1.8.1, selecting an experimental site, equally dividing the road length and drawing a line;
step 1.8.2, selecting an experimental vehicle, and respectively stopping the experimental vehicle on a drawing line of an experimental field;
step 1.8.3, keeping the camera still, recording image information of the vehicle at different distances from the camera in sequence, and extracting and recording width information of vehicle pixels in each image;
step 1.8.4, obtaining through the above steps several groups of mappings between vehicle pixel width in the image and distance, K = {(l_i, w_i)}, i ∈ [1, 20], wherein l_i represents the distance of the i-th measurement point from the camera and w_i represents the pixel width of the vehicle at the i-th measurement point;
step 1.8.5, performing regression fitting on the groups of measurement points with different functions, wherein the functional relation between the vehicle pixel width W and the distance l is:
W = 905.6 · l^(-0.9246) - 3.472
step 1.8.6, the millimeter wave radar can accurately detect the distance l of the vehicle, and substituting l into the formula of step 1.8.5 gives the pixel width W of the vehicle in the image; taking the statistical value of the vehicle aspect ratio as 1.2, the pixel height of the vehicle in the image is H = W/1.2; a dynamic vehicle region of interest is established with the vehicle centroid detected by the millimeter wave radar as the center and the solved pixel width W and height H as boundaries; in order to prevent the loss of vehicle details, the planned region of interest is expanded by a factor of 1.2.
4. The vehicle detection method for an intelligent automobile under severe weather conditions as claimed in claim 2, wherein the step 2 vehicle hypothesis verification comprises the following specific steps:
step 2.1, training sample collection and preprocessing: respectively acquiring sensing information of the radar and the thermal imaging camera under different severe environments by adopting real vehicle experiments; marking vehicle targets and non-vehicle targets in the infrared thermal images, and collecting positive and negative training sample pictures;
step 2.2, augmenting the training samples;
step 2.3, modeling and training of DPM: randomly selecting 80% of samples as training samples, and 20% of samples as testing samples; evenly dividing the training samples into a plurality of parts, and performing cross validation during training; obtaining a vehicle detection classifier through repeated training and iteration;
step 2.4, carrying out DPM feature extraction on the vehicle region of interest by adopting a multi-scale sliding window, identifying the vehicle by utilizing the trained vehicle classifier, and finally carrying out regression prediction on the detection frame of the vehicle target by adopting a non-maximum suppression algorithm.
5. The vehicle detection method for an intelligent automobile under severe weather conditions as claimed in claim 4, wherein the step 3 of target fusion and tracking comprises the following specific steps:
step 3.1, assuming that the coordinates of the upper left corner and the lower right corner of the vehicle detection frame obtained in step 2 are (x_1, y_1) and (x_2, y_2) respectively, the center coordinates (x_c, y_c) of the detected vehicle can be expressed as:
x_c = (x_1 + x_2)/2,  y_c = (y_1 + y_2)/2
step 3.2, all radar detection targets and all thermal imaging camera detection targets are represented by the sets R and C respectively, where R = {(x_i^r, y_i^r), i = 1, …, m}, C = {(x_j^c, y_j^c), j = 1, …, n}, and x and y represent the abscissa and ordinate of a target in the image plane;
step 3.3, normalizing x and y of all detection targets with the maximum-minimum value method:
x = (x - x_min)/(x_max - x_min)
y = (y - y_min)/(y_max - y_min)
step 3.4, calculating a cost function value between any two sensor targets, the cost matrix being calculated as:
cost(i, j) = sqrt((x_i^r - x_j^c)^2 + (y_i^r - y_j^c)^2)
step 3.5, fusing the detection targets of the two sensors according to the calculation result of the cost function to obtain an effective vehicle detection target list;
step 3.6, directly detecting the speed, distance and angle of the vehicle by the radar, obtaining a detection frame of the vehicle by the thermal imaging camera through a DPM target detection algorithm, substituting the position information of the detection frame into the formula in the step 1.4 to obtain the height and width of the actual vehicle, and updating the fused measurement information to a target list;
and 3.7, tracking the fused vehicle target by adopting a Kalman filtering algorithm.
CN202010151618.3A 2020-03-06 2020-03-06 Vehicle detection method for intelligent automobile under severe weather condition Active CN111369541B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010151618.3A CN111369541B (en) 2020-03-06 2020-03-06 Vehicle detection method for intelligent automobile under severe weather condition

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010151618.3A CN111369541B (en) 2020-03-06 2020-03-06 Vehicle detection method for intelligent automobile under severe weather condition

Publications (2)

Publication Number Publication Date
CN111369541A true CN111369541A (en) 2020-07-03
CN111369541B CN111369541B (en) 2022-07-08

Family

ID=71210360

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010151618.3A Active CN111369541B (en) 2020-03-06 2020-03-06 Vehicle detection method for intelligent automobile under severe weather condition

Country Status (1)

Country Link
CN (1) CN111369541B (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112508081A (en) * 2020-12-02 2021-03-16 王刚 Vehicle identification method, device and computer readable storage medium
CN112580695A (en) * 2020-12-02 2021-03-30 王刚 Vehicle type identification method and device, storage medium and electronic equipment
CN112633274A (en) * 2020-12-21 2021-04-09 中国航天空气动力技术研究院 Sonar image target detection method and device and electronic equipment
CN112950671A (en) * 2020-08-06 2021-06-11 郑锴 Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle
CN113449632A (en) * 2021-06-28 2021-09-28 重庆长安汽车股份有限公司 Vision and radar perception algorithm optimization method and system based on fusion perception and automobile
CN113608355A (en) * 2021-08-06 2021-11-05 湖南龙特科技有限公司 Interactive display mode based on millimeter wave radar and infrared thermal imager
CN113655460A (en) * 2021-10-18 2021-11-16 长沙莫之比智能科技有限公司 Rain and snow clutter recognition method based on millimeter wave radar
US11231498B1 (en) 2020-07-21 2022-01-25 International Business Machines Corporation Concealed object detection
CN114062961A (en) * 2021-11-17 2022-02-18 吉林大学 OCC-based multi-feature demodulation method for automatic driving vehicle
CN116071707A (en) * 2023-02-27 2023-05-05 南京航空航天大学 Airport special vehicle identification method and system
CN117495695A (en) * 2023-11-10 2024-02-02 苏州清研浩远汽车科技有限公司 Low-light environment detection system based on millimeter wave and infrared image fusion

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103324936A (en) * 2013-05-24 2013-09-25 北京理工大学 Vehicle lower boundary detection method based on multi-sensor fusion
US20130335569A1 (en) * 2012-03-14 2013-12-19 Honda Motor Co., Ltd. Vehicle with improved traffic-object position detection
CN104637059A (en) * 2015-02-09 2015-05-20 吉林大学 Night preceding vehicle detection method based on millimeter-wave radar and machine vision
CN104777835A (en) * 2015-03-11 2015-07-15 武汉汉迪机器人科技有限公司 Omni-directional automatic forklift and 3D stereoscopic vision navigating and positioning method
US20160238703A1 (en) * 2015-02-16 2016-08-18 Panasonic Intellectual Property Management Co., Ltd. Object detection apparatus and method
CN107609522A (en) * 2017-09-19 2018-01-19 东华大学 A kind of information fusion vehicle detecting system based on laser radar and machine vision
CN110532896A (en) * 2019-08-06 2019-12-03 北京航空航天大学 A kind of road vehicle detection method merged based on trackside millimetre-wave radar and machine vision

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130335569A1 (en) * 2012-03-14 2013-12-19 Honda Motor Co., Ltd. Vehicle with improved traffic-object position detection
CN103324936A (en) * 2013-05-24 2013-09-25 北京理工大学 Vehicle lower boundary detection method based on multi-sensor fusion
CN104637059A (en) * 2015-02-09 2015-05-20 吉林大学 Night preceding vehicle detection method based on millimeter-wave radar and machine vision
US20160238703A1 (en) * 2015-02-16 2016-08-18 Panasonic Intellectual Property Management Co., Ltd. Object detection apparatus and method
CN104777835A (en) * 2015-03-11 2015-07-15 武汉汉迪机器人科技有限公司 Omni-directional automatic forklift and 3D stereoscopic vision navigating and positioning method
CN107609522A (en) * 2017-09-19 2018-01-19 东华大学 A kind of information fusion vehicle detecting system based on laser radar and machine vision
CN110532896A (en) * 2019-08-06 2019-12-03 北京航空航天大学 A kind of road vehicle detection method merged based on trackside millimetre-wave radar and machine vision

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
DEOKKYU KIM et al.: "Extrinsic parameter calibration of 2D radar-camera using point matching and generative optimization", 2019 19th International Conference on Control, Automation and Systems (ICCAS) *
ZHANGU WANG et al.: "A new scheme of vehicle detection for severe weather based on multi-sensor fusion", Measurement *
LIU YUXI: "Research on target recognition of intelligent driving vehicles based on machine vision", China Master's Theses Full-text Database, Engineering Science and Technology II *
ZHANG SHUWEI et al.: "Research on an autonomous navigation method for substation inspection robots based on machine vision and radar data fusion", China Master's Theses Full-text Database, Engineering Science and Technology II *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11231498B1 (en) 2020-07-21 2022-01-25 International Business Machines Corporation Concealed object detection
US11774580B2 (en) 2020-07-21 2023-10-03 International Business Machines Corporation Concealed object detection
WO2022017005A1 (en) * 2020-07-21 2022-01-27 International Business Machines Corporation Concealed object detection
CN112950671A (en) * 2020-08-06 2021-06-11 郑锴 Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle
CN112950671B (en) * 2020-08-06 2024-02-13 中国人民解放军32146部队 Real-time high-precision parameter measurement method for moving target by unmanned aerial vehicle
CN112580695A (en) * 2020-12-02 2021-03-30 王刚 Vehicle type identification method and device, storage medium and electronic equipment
CN112508081A (en) * 2020-12-02 2021-03-16 王刚 Vehicle identification method, device and computer readable storage medium
CN112633274A (en) * 2020-12-21 2021-04-09 中国航天空气动力技术研究院 Sonar image target detection method and device and electronic equipment
CN113449632A (en) * 2021-06-28 2021-09-28 重庆长安汽车股份有限公司 Vision and radar perception algorithm optimization method and system based on fusion perception and automobile
CN113449632B (en) * 2021-06-28 2023-04-07 重庆长安汽车股份有限公司 Vision and radar perception algorithm optimization method and system based on fusion perception and automobile
CN113608355B (en) * 2021-08-06 2023-07-21 湖南龙特科技有限公司 Interactive display mode based on millimeter wave radar and infrared thermal imager
CN113608355A (en) * 2021-08-06 2021-11-05 湖南龙特科技有限公司 Interactive display mode based on millimeter wave radar and infrared thermal imager
CN113655460B (en) * 2021-10-18 2022-01-07 长沙莫之比智能科技有限公司 Rain and snow clutter recognition method based on millimeter wave radar
CN113655460A (en) * 2021-10-18 2021-11-16 长沙莫之比智能科技有限公司 Rain and snow clutter recognition method based on millimeter wave radar
CN114062961A (en) * 2021-11-17 2022-02-18 吉林大学 OCC-based multi-feature demodulation method for automatic driving vehicle
CN114062961B (en) * 2021-11-17 2023-08-08 吉林大学 OCC-based multi-feature demodulation method for automatic driving vehicle
CN116071707A (en) * 2023-02-27 2023-05-05 南京航空航天大学 Airport special vehicle identification method and system
CN116071707B (en) * 2023-02-27 2023-11-28 南京航空航天大学 Airport special vehicle identification method and system
CN117495695A (en) * 2023-11-10 2024-02-02 苏州清研浩远汽车科技有限公司 Low-light environment detection system based on millimeter wave and infrared image fusion

Also Published As

Publication number Publication date
CN111369541B (en) 2022-07-08

Similar Documents

Publication Publication Date Title
CN111369541B (en) Vehicle detection method for intelligent automobile under severe weather condition
CN108983219B (en) Fusion method and system for image information and radar information of traffic scene
CN110244322B (en) Multi-source sensor-based environmental perception system and method for pavement construction robot
CN110487562B (en) Driveway keeping capacity detection system and method for unmanned driving
CN112215306B (en) Target detection method based on fusion of monocular vision and millimeter wave radar
Nieto et al. Road environment modeling using robust perspective analysis and recursive Bayesian segmentation
CN113156421A (en) Obstacle detection method based on information fusion of millimeter wave radar and camera
CN105488454A (en) Monocular vision based front vehicle detection and ranging method
CN104778444A (en) Method for analyzing apparent characteristic of vehicle image in road scene
CN105160649A (en) Multi-target tracking method and system based on kernel function unsupervised clustering
CN113848545B (en) Fusion target detection and tracking method based on vision and millimeter wave radar
CN115273034A (en) Traffic target detection and tracking method based on vehicle-mounted multi-sensor fusion
Jiang et al. Target detection algorithm based on MMW radar and camera fusion
CN107220632B (en) Road surface image segmentation method based on normal characteristic
Wang et al. An improved hough transform method for detecting forward vehicle and lane in road
CN114693909A (en) Microcosmic vehicle track sensing equipment based on multi-sensor machine vision fusion
CN112733678A (en) Ranging method, ranging device, computer equipment and storage medium
CN112052768A (en) Urban illegal parking detection method and device based on unmanned aerial vehicle and storage medium
CN115100618B (en) Multi-source heterogeneous perception information multi-level fusion characterization and target identification method
CN116699602A (en) Target detection system and method based on millimeter wave radar and camera fusion
Atiq et al. Vehicle detection and shape recognition using optical sensors: a review
Deshpande et al. Vehicle classification
CN115327572A (en) Method for detecting obstacle in front of vehicle
Serfling et al. Camera and imaging radar feature level sensorfusion for night vision pedestrian recognition
Huang et al. An efficient multi-threshold selection method for lane detection based on lidar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant