CN111583671A - Millimeter wave radar intersection traffic flow monitoring method and system - Google Patents

Millimeter wave radar intersection traffic flow monitoring method and system Download PDF

Info

Publication number
CN111583671A
Authority
CN
China
Prior art keywords
millimeter wave
wave radar
machine learning
neural network
intersection
Prior art date
Legal status
Granted
Application number
CN202010503763.3A
Other languages
Chinese (zh)
Other versions
CN111583671B (en)
Inventor
张波
吴伟
顾振飞
李想
陈智康
刘文轩
Current Assignee
Nanjing College of Information Technology
Original Assignee
Nanjing College of Information Technology
Priority date
Filing date
Publication date
Application filed by Nanjing College of Information Technology filed Critical Nanjing College of Information Technology
Priority to CN202010503763.3A priority Critical patent/CN111583671B/en
Publication of CN111583671A publication Critical patent/CN111583671A/en
Application granted granted Critical
Publication of CN111583671B publication Critical patent/CN111583671B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/065 Traffic control systems for road vehicles by counting the vehicles in a section of the road or in a parking area, i.e. comparing incoming count with outgoing count
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/417 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The invention discloses a method for monitoring intersection traffic flow with millimeter wave radar, which comprises the following steps: collecting millimeter wave radar detection information of the intersection under different scenes; preprocessing the obtained millimeter wave radar detection information to form training samples, and constructing a training set and a test set; constructing a machine learning model based on a multi-task convolutional neural network; training the machine learning model with the training set and the test set to obtain suitable model parameters; collecting millimeter wave radar detection information in real time and inputting it into the trained machine learning model for classification, thereby identifying vehicle targets and their number; and counting the change in the number of vehicles at the intersection over a period, thereby monitoring the intersection traffic flow. The invention also discloses a system for monitoring intersection traffic flow with millimeter wave radar. The method and system apply machine learning to millimeter wave radar intersection traffic flow monitoring and offer real-time performance, universality, and high efficiency.

Description

Millimeter wave radar intersection traffic flow monitoring method and system
Technical Field
The invention relates to a method and system for monitoring intersection traffic flow with millimeter wave radar, and belongs to the technical field of intelligent transportation road state monitoring.
Background
Millimeter wave radar suffers little atmospheric attenuation, penetrates smoke and dust well, and is largely unaffected by weather, making it a sensor that can work all-weather and around the clock. It also offers high resolution, good directivity, strong anti-interference capability, and good detection performance, and is therefore used in autonomous driving and intelligent transportation. Existing systems that use millimeter wave radar for traffic state monitoring rely on traditional radar detection and statistics methods and have the following problems: detection of static targets is worse than detection of dynamic targets, and the false alarm rate of detection and identification is high; the algorithms have limited computational efficiency and poor timeliness, making it difficult to meet the real-time and efficiency requirements of intersection traffic flow monitoring in intelligent transportation; and radar detection is strongly affected by background interference, while the background differs from intersection to intersection, so processing parameters must be adjusted manually and the algorithms generalize poorly.
Machine learning has strong feature extraction, representation, and classification capability; by learning from and training on data samples, a model automatically extracts key features and achieves fast and accurate identification and classification. Applying machine learning to millimeter wave radar intersection traffic flow monitoring can therefore enable fast, efficient, accurate, and general traffic flow state identification.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a machine-learning-based method for monitoring intersection traffic flow with millimeter wave radar, which applies machine learning to millimeter wave radar intersection traffic flow monitoring and offers real-time performance, universality, and high efficiency.
In order to solve the above technical problem, the technical solution adopted by the invention is as follows:
a millimeter wave radar intersection traffic monitoring method based on machine learning comprises the following steps:
collecting millimeter wave radar detection information of the intersection under different scenes;
preprocessing the obtained millimeter wave radar detection information to form training samples, and constructing a training set and a test set;
constructing a machine learning model based on a multi-task convolutional neural network;
training the machine learning model with the training set and the test set to obtain suitable model parameters;
collecting millimeter wave radar detection information in real time and inputting it into the trained machine learning model for classification, thereby identifying vehicle targets and their number;
and counting the change in the number of vehicles at the intersection over a period, thereby monitoring the intersection traffic flow.
The millimeter wave radar detection information is image information formed based on millimeter wave radar echo data.
The preprocessing specifically comprises the steps of converting millimeter wave radar detection information samples into gray-scale maps and identifying the number of vehicle targets in each gray-scale map.
The machine learning model scales each input image to different sizes, processes the image at each size with a neural network, and thereby accounts for target shapes of different sizes in the image, so that vehicle targets of different sizes are detected on a unified scale and the number of vehicle targets is classified.
The operation of the machine learning model comprises the following steps: a first detection pass is performed; a second detection pass re-examines the first-pass results over a larger area; a third pass then examines a still larger area on the basis of the filtering of the previous two passes; and finally a classification judgment is made.
The machine learning model comprises three neural networks: a first neural network, a second neural network, and a third neural network, whose inputs are 8 x 8, 16 x 16, and 32 x 32 2-channel images respectively. Each network judges whether a vehicle target is present and gives a position frame and key point positions, and the output of each network consists of three parts: whether a vehicle target is present in the image, the region of the image in which the vehicle target is located, and the number of vehicle targets contained in the image.
The machine learning model uses a center loss to make the vehicle-count classification training converge: a class center is assigned to each vehicle-count class, and images of the same vehicle-count class are drawn close to their own class center and kept far from the centers of other classes.
The different scenes cover differences in millimeter wave radar angle, vehicle type, weather, intersection background, and vehicle speed.
A machine-learning-based millimeter wave radar intersection traffic flow monitoring system comprises a data acquisition module, a data processing module, a machine learning module, a real-time detection module, and a data statistics module. The data acquisition module collects millimeter wave radar detection information of the intersection under different scenes and transmits it to the data processing module; the data processing module preprocesses the collected data and constructs a training set and a test set; the machine learning module constructs a machine learning model based on a multi-task convolutional neural network, and the data processing module transmits the training set and the test set to the machine learning module to train the model; the real-time detection module collects millimeter wave radar detection information in real time and inputs it into the trained machine learning model for classification, identifying vehicle targets and their number; and the data statistics module counts the change in the number of vehicles at the intersection over a period according to the classification results of the machine learning model.
The invention has the following beneficial effects: it provides a machine-learning-based millimeter wave radar intersection traffic flow monitoring method and system that apply machine learning to millimeter wave radar intersection traffic flow monitoring and offer real-time performance, universality, and high efficiency.
Drawings
FIG. 1 is a schematic processing flow diagram of a millimeter wave radar intersection traffic monitoring method based on machine learning according to the present invention;
FIG. 2 is a schematic diagram of a network structure of a first neural network in the multitasking convolutional neural network of the present invention;
FIG. 3 is a schematic diagram of a network structure of a second neural network in the multi-tasking convolutional neural network of the present invention;
FIG. 4 is a schematic diagram of a network structure of a third neural network in the multitasking convolutional neural network of the present invention;
FIG. 5 is a schematic diagram of a training process of the multitask convolutional neural network of the present invention.
Detailed Description
The present invention is further described with reference to the following examples, which are only used to more clearly illustrate the technical solutions of the present invention, but not to limit the scope of the present invention.
As shown in FIG. 1, the invention discloses a millimeter wave radar intersection traffic monitoring method based on machine learning, which comprises the following steps:
Step one: millimeter wave radar detection information of the intersection under different scenes is collected; the detection information is image information formed from millimeter wave radar echo data.
Step two: the data obtained in step one are preprocessed to form training samples, and a training set and a test set are constructed. The preprocessing specifically comprises converting the millimeter wave radar detection information samples into grayscale maps and identifying the number of vehicle targets in each grayscale map by manual labeling: an image containing one vehicle is labeled "1", an image containing two vehicles is labeled "2", and so on, yielding samples of image information labeled with the number of vehicle targets under different scenes. The different scenes cover differences in millimeter wave radar angle, vehicle type, weather, intersection background, and vehicle speed.
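To make the preprocessing concrete, here is a minimal Python sketch under stated assumptions: the radar echo frames are loadable as 2-D arrays from image files, and a CSV label file (its name and layout are hypothetical) maps each frame to its manually marked vehicle count.

```python
# Preprocessing sketch (assumptions: frames are image files listed in a
# "path,count" CSV file; file layout and helper names are hypothetical).
import csv
import random

import numpy as np
from PIL import Image


def to_grayscale(frame: np.ndarray) -> np.ndarray:
    """Normalize one radar echo frame to an 8-bit grayscale map."""
    lo, hi = float(frame.min()), float(frame.max())
    scaled = (frame - lo) / (hi - lo + 1e-9)
    return (scaled * 255.0).astype(np.uint8)


def build_datasets(label_csv: str, test_ratio: float = 0.2):
    """Pair each grayscale map with its manually marked vehicle count
    ("1", "2", ...) and split into a training set and a test set."""
    samples = []
    with open(label_csv, newline="") as f:
        for path, count in csv.reader(f):
            frame = np.asarray(Image.open(path), dtype=np.float32)
            samples.append((to_grayscale(frame), int(count)))
    random.shuffle(samples)
    split = int(len(samples) * (1.0 - test_ratio))
    return samples[:split], samples[split:]
```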
Step three: a machine learning model based on a multi-task convolutional neural network is constructed. The input of the machine learning model is the image information formed from millimeter wave radar echo data in step one. The model comprises three neural networks; the input image is scaled to different sizes, each size is processed by the networks, and target shapes of different sizes in the image are thereby taken into account, so that vehicle targets of different sizes are detected on a unified scale and the number of vehicle targets is classified.
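As a sketch of this multi-scale idea, the grayscale map can be rescaled into an image pyramid before detection; the scale factor and minimum size below are illustrative assumptions, not values taken from the patent.

```python
# Image-pyramid sketch: rescale the grayscale map to successively smaller
# sizes so that large and small vehicle targets are handled on one scale.
import numpy as np
from PIL import Image


def image_pyramid(gray: np.ndarray, factor: float = 0.7, min_size: int = 12):
    """Yield (scale, rescaled image) pairs down to the smallest network input."""
    h, w = gray.shape
    scale = 1.0
    while min(h * scale, w * scale) >= min_size:
        size = (int(w * scale), int(h * scale))
        yield scale, np.asarray(Image.fromarray(gray).resize(size))
        scale *= factor
```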
The three neural networks are a first, a second, and a third neural network; the output of each network is also one of the inputs of the next network, which acts as a filter and improves the overall training efficiency and accuracy of the networks. As shown in FIG. 2 to FIG. 4, the input of the first neural network is 12 × 12 × 1 data. First, 10 convolution kernels of 3 × 3 and 2 × 2 pooling produce 10 feature maps of 5 × 5; next, 16 convolution kernels of 3 × 3 × 10 produce 16 feature maps of 3 × 3; then 32 convolution kernels of 3 × 3 × 16 produce 32 feature maps of 1 × 1; finally, 2 convolution kernels of 1 × 1 × 32, 4 convolution kernels of 1 × 1 × 32, and 10 convolution kernels of 1 × 1 × 32 yield the three outputs: the target-presence judgment, the target region, and the vehicle-count classification.
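A PyTorch sketch of this first network follows; the layer widths reproduce the sizes stated above, while the activation choice (PReLU) and the pooling stride are assumptions.

```python
# First-stage network sketch: 12x12x1 input, three outputs
# (target presence, target region, vehicle-count class).
import torch
import torch.nn as nn


class FirstNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 10, 3), nn.PReLU(),   # 12x12x1 -> 10x10x10
            nn.MaxPool2d(2, 2),                # -> 5x5x10
            nn.Conv2d(10, 16, 3), nn.PReLU(),  # -> 3x3x16
            nn.Conv2d(16, 32, 3), nn.PReLU(),  # -> 1x1x32
        )
        self.presence = nn.Conv2d(32, 2, 1)    # target present / absent
        self.region = nn.Conv2d(32, 4, 1)      # target region (box offsets)
        self.count = nn.Conv2d(32, 10, 1)      # vehicle-count classification

    def forward(self, x):
        f = self.backbone(x)
        return self.presence(f), self.region(f), self.count(f)
```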
The input of the second neural network is 24 × 24 × 1 data. First, 28 convolution kernels of 3 × 3 and 3 × 3 pooling produce 28 feature maps of 11 × 11; next, 48 convolution kernels of 3 × 3 × 28 and 3 × 3 pooling produce 48 feature maps of 4 × 4; then 64 convolution kernels of 2 × 2 × 48 produce 64 feature maps of 3 × 3, and the 3 × 3 × 64 feature maps are converted into a fully connected layer of 128 dimensions; finally, fully connected layers of sizes 2, 4, and 10 are attached to obtain the three outputs: the target-presence judgment, the target region, and the vehicle-count classification.
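A corresponding sketch of the second network; the 3 × 3 pooling strides (and the ceil rounding on the first pool) are assumptions chosen so the feature-map sizes match those stated above.

```python
# Second-stage network sketch: 24x24x1 input, 128-d fully connected layer,
# three output heads of sizes 2, 4 and 10.
import torch
import torch.nn as nn


class SecondNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 28, 3), nn.PReLU(),          # 24 -> 22
            nn.MaxPool2d(3, 2, ceil_mode=True),       # -> 11x11x28
            nn.Conv2d(28, 48, 3), nn.PReLU(),         # -> 9x9x48
            nn.MaxPool2d(3, 2),                       # -> 4x4x48
            nn.Conv2d(48, 64, 2), nn.PReLU(),         # -> 3x3x64
            nn.Flatten(),
            nn.Linear(3 * 3 * 64, 128), nn.PReLU(),   # 128-d fully connected
        )
        self.presence = nn.Linear(128, 2)
        self.region = nn.Linear(128, 4)
        self.count = nn.Linear(128, 10)

    def forward(self, x):
        f = self.backbone(x)
        return self.presence(f), self.region(f), self.count(f)
```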
The input of the third neural network is 48 × 48 × 1 data. First, 32 convolution kernels of 3 × 3 and 3 × 3 pooling produce 32 feature maps of 23 × 23; next, 64 convolution kernels of 3 × 3 × 32 and 3 × 3 pooling produce 64 feature maps of 10 × 10; then 64 convolution kernels of 3 × 3 × 64 and 3 × 3 pooling produce 64 feature maps of 4 × 4, which are converted by 128 convolution kernels of 2 × 2 × 64 into 128 feature maps of 3 × 3; the 3 × 3 × 128 feature maps are then converted into a fully connected layer of 256 dimensions; finally, fully connected layers of sizes 2, 4, and 10 are attached to obtain the three outputs: the target-presence judgment, the target region, and the vehicle-count classification.
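And a sketch of the third network; again the pooling strides and ceil rounding are assumptions made to reproduce the stated feature-map sizes.

```python
# Third-stage network sketch: 48x48x1 input, 256-d fully connected layer,
# three output heads of sizes 2, 4 and 10.
import torch
import torch.nn as nn


class ThirdNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 32, 3), nn.PReLU(),          # 48 -> 46
            nn.MaxPool2d(3, 2, ceil_mode=True),       # -> 23x23x32
            nn.Conv2d(32, 64, 3), nn.PReLU(),         # -> 21x21x64
            nn.MaxPool2d(3, 2),                       # -> 10x10x64
            nn.Conv2d(64, 64, 3), nn.PReLU(),         # -> 8x8x64
            nn.MaxPool2d(3, 2, ceil_mode=True),       # -> 4x4x64
            nn.Conv2d(64, 128, 2), nn.PReLU(),        # -> 3x3x128
            nn.Flatten(),
            nn.Linear(3 * 3 * 128, 256), nn.PReLU(),  # 256-d fully connected
        )
        self.presence = nn.Linear(256, 2)
        self.region = nn.Linear(256, 4)
        self.count = nn.Linear(256, 10)

    def forward(self, x):
        f = self.backbone(x)
        return self.presence(f), self.region(f), self.count(f)
```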
As shown in FIG. 5, the operation of the machine learning model comprises the following steps: step 1, the first neural network performs the first detection pass and passes its detection results to the second neural network; step 2, the second neural network performs detection over a larger area and passes its results to the third neural network; step 3, the third neural network performs detection over a still larger area on the basis of the filtering by the first and second neural networks, and finally makes the classification judgment.
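The three-pass flow could be wired together roughly as below. This is a sketch only: the thresholds and the windows_from_maps, refine_box, and resize_crop helpers are hypothetical stand-ins for candidate-window extraction, box refinement, and cropping, and image_pyramid refers to the sketch above.

```python
# Cascade sketch: pass 1 proposes candidate windows on an image pyramid,
# passes 2 and 3 re-examine each surviving window on larger crops, and the
# third pass makes the final vehicle-count classification.
import numpy as np
import torch


def to_tensor(img: np.ndarray) -> torch.Tensor:
    return torch.from_numpy(img.astype(np.float32) / 255.0)[None, None]


def cascade_detect(gray, first_net, second_net, third_net, thr=0.6):
    candidates = []
    for scale, img in image_pyramid(gray):                             # pass 1
        presence, region, _ = first_net(to_tensor(img))
        candidates += windows_from_maps(presence, region, scale, thr)  # hypothetical helper

    survivors = []
    for box in candidates:                                             # pass 2: larger crop
        presence, region, _ = second_net(to_tensor(resize_crop(gray, box, 24)))
        if presence.softmax(-1)[0, 1] > thr:
            survivors.append(refine_box(box, region))                  # hypothetical helper

    counts = []
    for box in survivors:                                              # pass 3: classify
        presence, _, count = third_net(to_tensor(resize_crop(gray, box, 48)))
        if presence.softmax(-1)[0, 1] > thr:
            counts.append(int(count.argmax()))                         # vehicle-count class
    return survivors, counts
```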
The machine learning model uses a center loss to make the vehicle-count classification training converge: a class center is assigned to each vehicle-count class, and images of the same vehicle-count class are drawn as close as possible to their own class center and kept far from the centers of other classes.
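A minimal center-loss sketch, assuming the standard formulation (one learnable center per vehicle-count class); the feature dimension and loss weighting are assumptions.

```python
# Center-loss sketch: pull features of the same vehicle-count class toward a
# learnable per-class center; combined with the usual softmax classification
# loss, this also pushes them away from other classes' centers.
import torch
import torch.nn as nn


class CenterLoss(nn.Module):
    def __init__(self, num_classes: int = 10, feat_dim: int = 256):
        super().__init__()
        self.centers = nn.Parameter(torch.randn(num_classes, feat_dim))

    def forward(self, features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Mean squared distance between each feature and its class center.
        return ((features - self.centers[labels]) ** 2).sum(dim=1).mean() / 2


# Typical use alongside the classification loss (lambda_c is an assumption):
# total_loss = cross_entropy_loss + lambda_c * center_loss(features, labels)
```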
Step four: the machine learning model obtained in step three is trained with the training set and the test set obtained in step two to obtain suitable model parameters.
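A compact training-loop sketch for one of the three networks; the optimizer handling, loss weighting, and the DataLoader field layout are illustrative assumptions.

```python
# Training sketch: each batch supervises the three outputs jointly
# (target presence, target region, vehicle-count class).
import torch
import torch.nn.functional as F


def train_epoch(net, loader, optimizer):
    net.train()
    for images, has_target, boxes, counts in loader:   # assumed batch layout
        presence, region, count = net(images)
        loss = (F.cross_entropy(presence.flatten(1), has_target)
                + F.smooth_l1_loss(region.flatten(1), boxes)
                + F.cross_entropy(count.flatten(1), counts))
        # A center-loss term on the count branch's features (see the sketch
        # above) can be added here to sharpen the vehicle-count classes.
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
```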
Step five: millimeter wave radar detection information, obtained as in step one, is collected in real time and input into the machine learning model trained in step four for classification, and the vehicle targets and their number are identified.
and step six, counting the number change of vehicles at the intersection in the period to realize the monitoring of the traffic flow at the intersection.
The invention also discloses a machine-learning-based millimeter wave radar intersection traffic flow monitoring system, which comprises a data acquisition module, a data processing module, a machine learning module, a real-time detection module, and a data statistics module. The data acquisition module collects millimeter wave radar detection information of the intersection under different scenes and transmits it to the data processing module; the data processing module preprocesses the collected data and constructs a training set and a test set; the machine learning module constructs a machine learning model based on a multi-task convolutional neural network, and the data processing module transmits the training set and the test set to the machine learning module to train the model; the real-time detection module collects millimeter wave radar detection information in real time and inputs it into the trained machine learning model for classification, identifying vehicle targets and their number; and the data statistics module counts the change in the number of vehicles at the intersection over a period according to the classification results of the machine learning model.
The invention thus provides a machine-learning-based millimeter wave radar intersection traffic flow monitoring method and system that apply machine learning to millimeter wave radar intersection traffic flow monitoring and offer real-time performance, universality, and high efficiency.
The above description is only of the preferred embodiments of the present invention, and it should be noted that: it will be apparent to those skilled in the art that various modifications and adaptations can be made without departing from the principles of the invention and these are intended to be within the scope of the invention.

Claims (9)

1. A millimeter wave radar intersection traffic flow monitoring method, characterized by comprising the following steps:
collecting millimeter wave radar detection information of the intersection under different scenes;
preprocessing the obtained millimeter wave radar detection information to form training samples, and constructing a training set and a test set;
constructing a machine learning model based on a multi-task convolutional neural network;
training the machine learning model with the training set and the test set to obtain suitable model parameters;
collecting millimeter wave radar detection information in real time and inputting it into the trained machine learning model for classification, thereby identifying vehicle targets and their number;
and counting the change in the number of vehicles at the intersection over a period, thereby monitoring the intersection traffic flow.
2. The millimeter wave radar intersection traffic flow monitoring method according to claim 1, characterized in that: the millimeter wave radar detection information is image information formed based on millimeter wave radar echo data.
3. The millimeter wave radar intersection traffic flow monitoring method according to claim 1, characterized in that: the preprocessing specifically comprises the steps of converting millimeter wave radar detection information samples into gray-scale maps and identifying the number of vehicle targets in each gray-scale map.
4. The millimeter wave radar intersection traffic flow monitoring method according to claim 1, characterized in that: the machine learning model scales each input image to different sizes, processes the image at each size with a neural network, and thereby accounts for target shapes of different sizes in the image, so that vehicle targets of different sizes are detected on a unified scale and the number of vehicle targets is classified.
5. The millimeter wave radar intersection traffic flow monitoring method according to claim 4, characterized in that: the operation of the machine learning model comprises the following steps: a first detection pass is performed; a second detection pass re-examines the first-pass results over a larger area; a third pass then examines a still larger area on the basis of the filtering of the previous two passes; and finally a classification judgment is made.
6. The millimeter wave radar intersection traffic flow monitoring method according to claim 5, characterized in that: the machine learning model comprises three neural networks, namely a first neural network, a second neural network and a third neural network, whose inputs are 8 x 8, 16 x 16 and 32 x 32 2-channel images respectively; each network judges whether a vehicle target is present and gives a position frame and key point positions, and the output of each network consists of three parts: whether a vehicle target is present in the image, the region of the image in which the vehicle target is located, and the number of vehicle targets contained in the image.
7. The millimeter wave radar intersection traffic flow monitoring method according to claim 4, 5 or 6, characterized in that: the machine learning model uses a center loss to make the vehicle-count classification training converge: a class center is assigned to each vehicle-count class, and images of the same vehicle-count class are drawn close to their own class center and kept far from the centers of other classes.
8. The millimeter wave radar intersection traffic flow monitoring method according to claim 3, characterized in that: the different scenes cover differences in millimeter wave radar angle, vehicle type, weather, intersection background, and vehicle speed.
9. A millimeter wave radar intersection traffic flow monitoring system, characterized in that: it comprises a data acquisition module, a data processing module, a machine learning module, a real-time detection module, and a data statistics module, wherein the data acquisition module collects millimeter wave radar detection information of the intersection under different scenes and transmits it to the data processing module; the data processing module preprocesses the obtained data and constructs a training set and a test set; the machine learning module constructs a machine learning model based on a multi-task convolutional neural network, and the data processing module transmits the training set and the test set to the machine learning module to train the machine learning model; the real-time detection module collects millimeter wave radar detection information in real time and inputs it into the trained machine learning model for classification so as to identify vehicle targets and their number; and the data statistics module counts the change in the number of vehicles at the intersection over a period according to the classification results of the machine learning model.
CN202010503763.3A 2020-06-05 2020-06-05 Millimeter wave radar intersection traffic flow monitoring method and system Active CN111583671B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010503763.3A CN111583671B (en) 2020-06-05 2020-06-05 Millimeter wave radar intersection traffic flow monitoring method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010503763.3A CN111583671B (en) 2020-06-05 2020-06-05 Millimeter wave radar intersection traffic flow monitoring method and system

Publications (2)

Publication Number Publication Date
CN111583671A true CN111583671A (en) 2020-08-25
CN111583671B CN111583671B (en) 2022-05-31

Family

ID=72112777

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010503763.3A Active CN111583671B (en) 2020-06-05 2020-06-05 Millimeter wave radar intersection traffic flow monitoring method and system

Country Status (1)

Country Link
CN (1) CN111583671B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112201032A (en) * 2020-08-28 2021-01-08 武汉理工大学 Road traffic flow monitoring method, storage medium and system
CN112328651A (en) * 2020-10-19 2021-02-05 南京理工大学 Traffic target identification method based on millimeter wave radar data statistical characteristics
CN116719003A (en) * 2023-08-10 2023-09-08 利国智能科技(昆山)有限公司 Target detection method and system for millimeter wave radar detection

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005182256A (en) * 2003-12-17 2005-07-07 Sekisui Jushi Co Ltd Movable body detection/notification system
CN105678214A (en) * 2015-12-21 2016-06-15 中国石油大学(华东) Vehicle flow statistical method based on convolutional neural network vehicle model recognition in cloud environment
CN107145833A (en) * 2017-04-11 2017-09-08 腾讯科技(上海)有限公司 The determination method and apparatus of human face region
CN107909005A (en) * 2017-10-26 2018-04-13 西安电子科技大学 Personage's gesture recognition method under monitoring scene based on deep learning
CN108550269A (en) * 2018-06-01 2018-09-18 中物汽车电子扬州有限公司 Traffic flow detection system based on millimetre-wave radar and its detection method
CN109407067A (en) * 2018-10-13 2019-03-01 中国人民解放军海军航空大学 Radar moving targets detection and classification integral method based on time-frequency figure convolutional neural networks
CN109615880A (en) * 2018-10-29 2019-04-12 浙江浙大列车智能化工程技术研究中心有限公司 A kind of wagon flow measuring method based on radar image processing
CN109753874A (en) * 2018-11-28 2019-05-14 南京航空航天大学 A kind of low slow small classification of radar targets method based on machine learning
CN109658715A (en) * 2019-01-31 2019-04-19 厦门精益远达智能科技有限公司 Statistical method of traffic flow, device, equipment and the storage medium of multilane
CN110210463A (en) * 2019-07-03 2019-09-06 中国人民解放军海军航空大学 Radar target image detecting method based on Precise ROI-Faster R-CNN

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112201032A (en) * 2020-08-28 2021-01-08 武汉理工大学 Road traffic flow monitoring method, storage medium and system
CN112328651A (en) * 2020-10-19 2021-02-05 南京理工大学 Traffic target identification method based on millimeter wave radar data statistical characteristics
CN112328651B (en) * 2020-10-19 2022-09-06 南京理工大学 Traffic target identification method based on millimeter wave radar data statistical characteristics
CN116719003A (en) * 2023-08-10 2023-09-08 利国智能科技(昆山)有限公司 Target detection method and system for millimeter wave radar detection
CN116719003B (en) * 2023-08-10 2023-10-24 利国智能科技(昆山)有限公司 Target detection method and system for millimeter wave radar detection

Also Published As

Publication number Publication date
CN111583671B (en) 2022-05-31

Similar Documents

Publication Publication Date Title
CN111583671B (en) Millimeter wave radar intersection traffic flow monitoring method and system
Nie et al. Pavement Crack Detection Based on YOLO v3
CN108828621A (en) Obstacle detection and road surface partitioning algorithm based on three-dimensional laser radar
US20230108634A1 (en) Ground penetrating radar and deep learning-based underground pipeline detection method and system
CN109460709A (en) The method of RTG dysopia analyte detection based on the fusion of RGB and D information
CN109993138A (en) A kind of car plate detection and recognition methods and device
CN114359181B (en) Intelligent traffic target fusion detection method and system based on image and point cloud
CN103679214B (en) Vehicle checking method based on online Class area estimation and multiple features Decision fusion
CN107985189A (en) Towards driver's lane change Deep Early Warning method under scorch environment
CN102214290B (en) License plate positioning method and license plate positioning template training method
CN108711172A (en) Unmanned plane identification based on fine grit classification and localization method
CN113033303A (en) Method for realizing SAR image rotating ship detection based on RCIoU loss
CN110909656B (en) Pedestrian detection method and system integrating radar and camera
CN111738114A (en) Vehicle target detection method based on anchor-free accurate sampling remote sensing image
CN115861619A (en) Airborne LiDAR (light detection and ranging) urban point cloud semantic segmentation method and system of recursive residual double-attention kernel point convolution network
CN113450573A (en) Traffic monitoring method and traffic monitoring system based on unmanned aerial vehicle image recognition
CN106845458A (en) A kind of rapid transit label detection method of the learning machine that transfinited based on core
CN116258940A (en) Small target detection method for multi-scale features and self-adaptive weights
WO2022148143A1 (en) Target detection method and device
CN113469097B (en) Multi-camera real-time detection method for water surface floaters based on SSD network
CN113447902A (en) Sea surveillance radar target identification method based on machine learning
Hu et al. A simple information fusion method provides the obstacle with saliency labeling as a landmark in robotic mapping
CN116486287A (en) Target detection method and system based on environment self-adaptive robot vision system
CN115937736A (en) Small target detection method based on attention and context awareness
CN112907734B (en) TEDS fault detection method based on virtual CRH380A model and deep learning

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant