CN117452410A - Millimeter wave radar-based vehicle detection system - Google Patents

Millimeter wave radar-based vehicle detection system

Info

Publication number
CN117452410A
CN117452410A
Authority
CN
China
Prior art keywords
radar
coordinate system
image
point cloud
millimeter wave
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311395165.9A
Other languages
Chinese (zh)
Inventor
夏明飞
甄红涛
韩宁
王天
李志伟
孙华刚
贾锋
吕桐
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
32181 Troops of PLA
Original Assignee
32181 Troops of PLA
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 32181 Troops of PLA filed Critical 32181 Troops of PLA
Priority to CN202311395165.9A priority Critical patent/CN117452410A/en
Publication of CN117452410A publication Critical patent/CN117452410A/en
Pending legal-status Critical Current


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G01S13/931 Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/867 Combination of radar systems with cameras
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a vehicle detection system based on millimeter wave radar, comprising: a point cloud image acquisition module, configured to acquire radar point cloud information of a target vehicle through a millimeter wave radar and convert the radar point cloud information into a radar point cloud image of the target vehicle; a visual image acquisition module, configured to acquire a visual image of the target vehicle; an image fusion module, configured to fuse the radar point cloud image with the visual image and output a fused image; a parameter calibration module, configured to establish the relationship between the camera coordinate system and the world coordinate system, realize the conversion between the pixel coordinate system and the world coordinate system, obtain the coordinate system conversion relation, and determine the radar projection ROI region; and a decision fusion module, configured to evaluate the radar projection ROI region and the visual detection frame size according to a decision fusion rule and obtain the detection result of the target vehicle. The invention improves detection precision, effectively reduces the missed-detection rate, and can continuously and stably detect and track vehicles.

Description

Millimeter wave radar-based vehicle detection system
Technical Field
The invention relates to the technical field of intelligent vehicle detection, in particular to a vehicle detection system based on millimeter wave radar.
Background
With the continuous growth of car ownership, traffic accidents have become increasingly frequent, and improper driver operation is their main cause. While people enjoy the convenience of travelling by vehicle, the development of intelligent transportation and autonomous driving technology can greatly improve driving safety, and vehicle detection is one of its key links. To prevent traffic accidents and reduce casualties as far as possible, more and more enterprises and institutions are vigorously developing intelligent driving systems and advanced driver assistance systems, and with the continuous improvement of the technology, advanced driver assistance systems have greatly improved the driving safety of today's automobiles.
The radar sensors commonly used in intelligent vehicles are mainly ultrasonic radar, lidar and millimeter wave radar. Ultrasonic radar judges the distance to an obstacle by transmitting ultrasonic waves through the air, and within 5 meters its detection accuracy can reach the centimeter level. Because its detection range is short, ultrasonic radar is generally applied to short-range scenarios such as parking. Limited by its high cost, lidar is fitted only on some autonomous-driving laboratory vehicles and high-end models and is not configured on ordinary civilian models, whereas millimeter wave radar is equipped on most intelligent driving vehicles. Millimeter wave radar can measure the distance, speed, azimuth and other information of an obstacle relative to the radar, has a strong ability to penetrate smoke and dust, offers good resistance to environmental interference, and is inexpensive; it is therefore the environment-sensing sensor most widely used in advanced driver assistance systems of intelligent vehicles today.
Currently common vehicle detection methods include vision-based vehicle detection and lidar-point-cloud-based vehicle detection. Vision-based vehicle detection can capture real-time picture information of traffic targets, but it is easily affected by factors such as illumination and weather and cannot obtain accurate motion information of the targets; lidar performance is susceptible to severe weather and its cost is high.
Accordingly, the present invention proposes a millimeter wave radar-based vehicle detection system.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a vehicle detection system based on millimeter wave radar, so as to continuously and stably detect and track vehicles.
A millimeter wave radar-based vehicle detection system, comprising:
and the point cloud image acquisition module is used for: the method comprises the steps of acquiring radar point cloud information of a target vehicle through a millimeter wave radar, and converting the radar point cloud information to acquire a radar point cloud image of the target vehicle;
visual image acquisition module: for acquiring a visual image of a target vehicle;
and an image fusion module: the method comprises the steps of fusing the radar point cloud image with the visual image and outputting a fused image;
parameter calibration module: the method comprises the steps of establishing a connection between a camera coordinate system and a world coordinate system, realizing conversion between a pixel coordinate system and the world coordinate system, obtaining a coordinate system conversion relation, and determining a radar projection ROI region;
and a decision fusion module: and the method is used for detecting the radar projection ROI area and the visual detection frame size according to the decision fusion rule, and obtaining the detection result of the target vehicle.
Preferably, the point cloud image acquisition module includes:
a target recognition unit: configured to obtain target vehicle point cloud information by preprocessing the millimeter wave radar point cloud data;
an image conversion unit: configured to convert the target vehicle point cloud information through a calibration matrix and an RGB matrix to obtain the radar point cloud image of the target vehicle.
Preferably, the image fusion module includes:
a decision and fusion unit: configured to generate the ROI region and extract features of the visual image, complete target recognition based on the visual sensor, and output the final detection target by performing decision-level fusion on the radar point cloud image and the recognition result of the visual sensor.
Preferably, the image fusion module further comprises a fusion image acquisition unit, which is configured to sample the final detection target with convolution layers of different scales and add the sampled features to obtain a spatial attention weight matrix, and to splice the spatial attention weight matrix with the final detection target and output the fused image.
Preferably, the fusion image acquisition unit includes several convolution layers of different scales, namely a convolution layer with a 1×1 kernel, a convolution layer with a 3×3 kernel, and a convolution layer with a 5×5 kernel.
Preferably, the parameter calibration module includes:
a spatial coordinate system alignment unit: configured to align the world coordinate system, the radar coordinate system, the vision sensor coordinate system, the image coordinate system and the pixel coordinate system, wherein the radar coordinate system is established at the vehicle badge on the central axis of the vehicle where the millimeter wave radar is installed, the vision sensor coordinate system is established at the vehicle position where the monocular camera is installed, and the world coordinate system is established on the line connecting the front and rear axles of the vehicle;
a pixel coordinate system alignment unit: configured to adjust the internal and external parameters of the vision sensor respectively and obtain the rotation matrix and translation vector of the vision sensor.
Preferably, the parameter calibration module comprises a radar projection size determination unit, configured to project the radar point cloud data into the visual image according to the coordinate system conversion relation and determine the radar projection ROI according to the position coordinates of the center point.
Preferably, the coordinate system conversion relationship is:
wherein d is the relative distance detected by the radar, α is the relative angle, Y₀ is the relative height between the origin of the radar coordinate system and the origin of the world coordinate system, Z₀ is the relative longitudinal distance between the origin of the radar coordinate system and the origin of the world coordinate system, and (u, v) is the coordinate position of the radar detection point in the pixel coordinate system.
Preferably, the decision fusion rule includes:
and if the visual detection frame is completely overlapped with the radar projection ROI region, outputting a detection result.
Compared with the prior art, the invention has the following advantages and technical effects:
according to the invention, the radar features are fully utilized from different hierarchical scales by adopting a spatial attention splicing fusion method, the generated spatial attention weight matrix can more effectively control or enhance visual image information, the detection accuracy of far and near targets is improved, and the detection and identification of target vehicles in severe weather are realized;
the invention improves detection precision, effectively reduces the missed-detection rate, and can continuously and stably detect and track vehicles.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application, illustrate and explain the application and are not to be construed as limiting the application. In the drawings:
fig. 1 is a schematic diagram showing the composition of a millimeter wave radar-based vehicle detection system in an embodiment of the present invention.
Detailed Description
It should be noted that, in the case of no conflict, the embodiments and features in the embodiments may be combined with each other. The present application will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
The invention provides a vehicle detection system based on millimeter wave radar, as shown in figure 1, comprising:
and the point cloud image acquisition module is used for: the method comprises the steps of acquiring radar point cloud information of a target vehicle through a millimeter wave radar, and converting the radar point cloud information to acquire a radar point cloud image of the target vehicle;
visual image acquisition module: for acquiring a visual image of a target vehicle;
and an image fusion module: the method comprises the steps of fusing the radar point cloud image with the visual image and outputting a fused image;
parameter calibration module: the method comprises the steps of establishing a connection between a camera coordinate system and a world coordinate system, realizing conversion between a pixel coordinate system and the world coordinate system, obtaining a coordinate system conversion relation, and determining a radar projection ROI region;
and a decision fusion module: and the method is used for detecting the radar projection ROI area and the visual detection frame size according to the decision fusion rule, and obtaining the detection result of the target vehicle.
The point cloud image acquisition module comprises:
a target recognition unit: configured to obtain target vehicle point cloud information by preprocessing the millimeter wave radar point cloud data;
an image conversion unit: configured to convert the target vehicle point cloud information through a calibration matrix and an RGB matrix to obtain the radar point cloud image of the target vehicle.
The target recognition unit acquires the radar point cloud information through millimeter wave radar detection; the radar point cloud information includes the radial distance, radial speed, azimuth angle and pitch angle of the target vehicle. Based on the image conversion unit, the radar point cloud information is converted into the image domain through the calibration matrix and an RGB image conversion equation. The visual image acquisition module acquires image information of the target vehicle through a plurality of visual sensors to obtain the visual image.
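As a minimal illustration of this conversion step, the sketch below projects millimeter wave radar detections into the image plane and encodes them as an RGB point cloud image. The 3×4 calibration matrix P, the image size, the normalization ranges and the channel assignment (distance, radial speed, constant marker) are assumptions for illustration, not the patent's exact calibration matrix or RGB conversion equation.

```python
import numpy as np

def radar_to_point_cloud_image(detections, P, img_h=480, img_w=640,
                               max_range=100.0, max_speed=30.0):
    """detections: iterable of (radial_distance, radial_speed, azimuth, pitch),
    distances in meters, angles in radians. P: assumed 3x4 calibration matrix
    (camera intrinsics times radar-to-camera extrinsics)."""
    img = np.zeros((img_h, img_w, 3), dtype=np.uint8)
    for d, spd, az, el in detections:
        # Polar radar measurement to Cartesian radar-frame coordinates.
        x = d * np.cos(el) * np.sin(az)
        y = d * np.sin(el)
        z = d * np.cos(el) * np.cos(az)
        uvw = P @ np.array([x, y, z, 1.0])
        if uvw[2] <= 0:
            continue  # point is behind the camera
        u, v = int(uvw[0] / uvw[2]), int(uvw[1] / uvw[2])
        if 0 <= u < img_w and 0 <= v < img_h:
            # Assumed RGB encoding: R = normalized distance, G = normalized speed, B = marker.
            img[v, u] = (int(255 * min(d / max_range, 1.0)),
                         int(255 * min(abs(spd) / max_speed, 1.0)),
                         255)
    return img
```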
The image fusion module comprises:
a decision and fusion unit: configured to generate the region of interest (ROI) and extract features of the visual image, complete target recognition based on the visual sensor, and output the final detection target by performing decision-level fusion on the radar point cloud image and the recognition result of the visual sensor.
Feature-level fusion analyzes and processes the feature information extracted from the raw detection information of the sensors, mainly the edge features, speed and shape features of the target. First, target feature information is extracted from the radar point cloud image and the visual image respectively and fused at the feature layer; then the extracted features are screened again; finally, the RoI region is generated to complete target recognition. Feature-level fusion retains enough of the important sensor information while reducing the data volume through compression, thereby improving the real-time performance of the processing pipeline.
The image fusion module further comprises a fusion image acquisition unit, which samples the final detection target with convolution layers of different scales and adds the sampled features to obtain a spatial attention weight matrix; the spatial attention weight matrix is then spliced with the final detection target and the fused image is output.
The fusion image acquisition unit comprises several convolution layers of different scales: a convolution layer with a 1×1 kernel, a convolution layer with a 3×3 kernel, and a convolution layer with a 5×5 kernel. The unit extracts radar image features through these three convolution layers and fuses them with the visual image, allowing the fusion module to comprehensively exploit radar features at different levels and scales and thereby improving the accuracy of the fused image.
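A minimal PyTorch sketch of this multi-scale spatial attention splicing is given below. The patent specifies only the 1×1, 3×3 and 5×5 convolutions, the addition into a spatial attention weight matrix, and the splicing; the channel counts, the sigmoid gating, the requirement that radar and visual feature maps share spatial size, and the concatenation order are assumptions.

```python
import torch
import torch.nn as nn

class SpatialAttentionSplice(nn.Module):
    def __init__(self, radar_ch=3, vis_ch=64):
        super().__init__()
        # Three convolution branches with different receptive fields.
        self.conv1 = nn.Conv2d(radar_ch, vis_ch, kernel_size=1)
        self.conv3 = nn.Conv2d(radar_ch, vis_ch, kernel_size=3, padding=1)
        self.conv5 = nn.Conv2d(radar_ch, vis_ch, kernel_size=5, padding=2)

    def forward(self, radar_img, vis_feat):
        # Sum the multi-scale radar features to form the spatial attention weights.
        attn = torch.sigmoid(self.conv1(radar_img) +
                             self.conv3(radar_img) +
                             self.conv5(radar_img))
        weighted = vis_feat * attn  # control or enhance the visual information
        # Splice (concatenate) the weighted result with the visual features.
        return torch.cat([weighted, vis_feat], dim=1)

# Usage sketch: fused = SpatialAttentionSplice()(radar_point_cloud_image, visual_feature_map)
# assuming both inputs have the same height and width.
```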
In a further optimized scheme, the parameter calibration module includes:
a spatial coordinate system alignment unit: configured to align the world coordinate system, the radar coordinate system, the vision sensor coordinate system, the image coordinate system and the pixel coordinate system, wherein the radar coordinate system is established at the vehicle badge on the central axis of the vehicle where the millimeter wave radar is installed, the vision sensor coordinate system is established at the vehicle position where the monocular camera is installed, and the world coordinate system is established on the line connecting the front and rear axles of the vehicle;
a pixel coordinate system alignment unit: configured to adjust the internal and external parameters of the vision sensor respectively and obtain the rotation matrix and translation vector of the vision sensor.
After the millimeter wave radar and the vision camera are aligned in the spatial coordinate system, alignment in the time coordinate system is also required because the radar detection frequency differs from the camera detection frequency. Time alignment means selecting the road data acquired by the radar and by the camera at the same moment as the input data for information fusion. Since the data acquired in the experiment are offline data, the radar acquisition time point matching the camera acquisition time point is found through the system timestamp information and used as the starting point of the fusion algorithm. In this embodiment, the millimeter wave radar sampling frequency is 20 Hz and the vision camera sampling frequency is 30 Hz; according to the working frequencies of the two sensors, sampling the radar data every two frames and the camera data every three frames aligns the millimeter wave radar and the vision camera in the time coordinate system.
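The following sketch illustrates this timestamp-based pairing: keeping every second radar frame (20 Hz) and every third camera frame (30 Hz) yields a common 10 Hz rate, and frames are then matched by nearest system timestamp. The pairing tolerance and the (timestamp, data) frame format are assumptions.

```python
def align_frames(radar_frames, camera_frames, tol=0.05):
    """Each frame is a (timestamp_seconds, data) tuple; both lists are
    time-ordered offline recordings."""
    radar_10hz = radar_frames[::2]    # 20 Hz -> 10 Hz
    camera_10hz = camera_frames[::3]  # 30 Hz -> 10 Hz
    pairs = []
    for t_r, radar in radar_10hz:
        # Nearest camera frame by absolute timestamp difference.
        t_c, cam = min(camera_10hz, key=lambda f: abs(f[0] - t_r))
        if abs(t_c - t_r) <= tol:
            pairs.append((radar, cam))
    return pairs
```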
The parameter calibration module comprises a radar projection size determination unit, configured to project the radar point cloud data into the visual image according to the coordinate system conversion relation and determine the radar projection ROI region according to the position coordinates of the center point.
The targets returned by the millimeter wave radar are point cloud data in the radar coordinate system, whereas the targets returned by visual detection are the pixel coordinates and category information of a detection frame in the pixel coordinate system; therefore the radar point cloud data must be projected into the visual image according to the coordinate system conversion relation, and the vehicle ROI size is then determined according to the position coordinates of the center point.
The coordinate system conversion relationship is as follows:
wherein d is the relative distance detected by the radar, α is the relative angle, Y₀ is the relative height between the origin of the radar coordinate system and the origin of the world coordinate system, Z₀ is the relative longitudinal distance between the origin of the radar coordinate system and the origin of the world coordinate system, and (u, v) is the coordinate position of the radar detection point in the pixel coordinate system.
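The patent's conversion formula itself is not reproduced in this text. For orientation only, the sketch below shows one plausible conversion consistent with the symbols defined above, assuming a pinhole camera with intrinsic matrix K, a world frame aligned with the camera frame, and a target near road level; it is an assumption, not the patent's equation.

```python
import numpy as np

def radar_point_to_pixel(d, alpha, Y0, Z0, K):
    """d: radar relative distance (m), alpha: relative angle (rad),
    Y0: relative height radar origin -> world origin,
    Z0: relative longitudinal distance radar origin -> world origin,
    K: assumed 3x3 camera intrinsic matrix."""
    # Radar measurement to world coordinates (X lateral, Y vertical, Z longitudinal).
    Xw = d * np.sin(alpha)
    Yw = -Y0                      # assume the target lies on the road plane below the radar
    Zw = d * np.cos(alpha) + Z0
    # Pinhole projection into the pixel coordinate system.
    u = K[0, 0] * Xw / Zw + K[0, 2]
    v = K[1, 1] * Yw / Zw + K[1, 2]
    return u, v
```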
The point cloud data returned by the radar is projected, by default, onto the midpoint of the vehicle position in the visual image through the coordinate conversion, and the size of the ROI must then be determined from the position coordinates of this center point. The standard length, width and height of vehicles on urban roads differ according to vehicle type and brand. Referring to the standard on limits of dimensions, axle loads and masses for road vehicles, the width of common vehicles (including passenger cars, trucks, buses and the like) is between 1.6 m and 2.5 m, and the vehicle aspect ratio is usually between 0.7 and 1.3. Considering the variety of vehicles on urban roads, the vehicle ROI height is finally set to 2.4 m, the aspect ratio to 1.2, and the ROI width to 2 m. From the perspective relationship of the camera, the pixel size of a vehicle in the image plane is inversely proportional to its relative distance from the ego vehicle; that is, the farther the vehicle is from the ego vehicle, the fewer pixels it occupies in the video image.
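A small sketch of this ROI sizing rule follows: a fixed 2 m × 2.4 m vehicle box centered on the projected radar point, with a pixel size that shrinks in inverse proportion to the relative distance. The focal length value is an assumption.

```python
def radar_roi(u, v, distance_m, fx=1000.0, roi_w_m=2.0, roi_h_m=2.4):
    """Return (left, top, width, height) of the radar projection ROI in pixels,
    centered on the projected radar point (u, v)."""
    w_px = fx * roi_w_m / distance_m   # farther vehicles occupy fewer pixels
    h_px = fx * roi_h_m / distance_m
    return (u - w_px / 2, v - h_px / 2, w_px, h_px)
```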
In a further optimized scheme, the decision fusion rule comprises:
if the visual detection frame completely overlaps the radar projection ROI region, the detection result is output.
Since the millimeter wave radar reflection point projected into the visual image is not necessarily at the center of the vehicle, the vehicle ROI region generated from the radar projection point deviates somewhat from the actual vehicle position. Therefore, the decision-level fusion strategy places more trust in the rectangular vehicle-position box output by the neural network detector.
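An illustrative check for this decision fusion rule is sketched below. Reading "completely overlapped" as the visual detection frame fully covering the radar projection ROI is an assumed interpretation, and the fused output keeps the visual (neural network) box, which the strategy trusts more for position.

```python
def fuse_detection(vision_box, radar_roi_box):
    """Boxes are (left, top, width, height) in pixel coordinates.
    Returns the visual box when the fusion rule is satisfied, otherwise None."""
    vl, vt, vw, vh = vision_box
    rl, rt, rw, rh = radar_roi_box
    contained = (vl <= rl and vt <= rt and
                 vl + vw >= rl + rw and vt + vh >= rt + rh)
    return vision_box if contained else None
```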
In tracking the millimeter wave radar target, the method does not need to consider the prior statistical characteristics of the noise or the influence of dynamic changes of the external environment on filtering accuracy. A neural network algorithm is adopted to improve detection accuracy, and a detection speed of 22 frames per second (FPS) is achieved in real road-scene tests, realizing fast and accurate target detection. Based on the decision-level fusion strategy, the influence of illumination, shadow and other environmental factors on the algorithm is reduced, the detection precision is further improved, missed and false detections are reduced, and the accuracy of estimating the target vehicle's position, motion state and geometric parameters is improved.
By adopting a spatial attention splicing fusion method, the invention makes full use of radar features at different hierarchical scales; the generated spatial attention weight matrix can more effectively control or enhance visual image information, improving the detection accuracy for both distant and near targets and enabling detection and recognition of target vehicles in severe weather.
The invention improves detection precision, effectively reduces the missed-detection rate, and can continuously and stably detect and track vehicles.
The foregoing is merely a preferred embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions easily conceivable by those skilled in the art within the technical scope of the present application should be covered in the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (9)

1. A millimeter wave radar-based vehicle detection system, characterized by comprising:
and the point cloud image acquisition module is used for: the method comprises the steps of acquiring radar point cloud information of a target vehicle through a millimeter wave radar, and converting the radar point cloud information to acquire a radar point cloud image of the target vehicle;
visual image acquisition module: for acquiring a visual image of a target vehicle;
and an image fusion module: the method comprises the steps of fusing the radar point cloud image with the visual image and outputting a fused image;
parameter calibration module: the method comprises the steps of establishing a connection between a camera coordinate system and a world coordinate system, realizing conversion between a pixel coordinate system and the world coordinate system, obtaining a coordinate system conversion relation, and determining a radar projection ROI region;
and a decision fusion module: and the method is used for detecting the radar projection ROI area and the visual detection frame size according to the decision fusion rule, and obtaining the detection result of the target vehicle.
2. The millimeter wave radar-based vehicle detection system according to claim 1, wherein the point cloud image acquisition module includes:
a target recognition unit: configured to obtain target vehicle point cloud information by preprocessing the millimeter wave radar point cloud data;
an image conversion unit: configured to convert the target vehicle point cloud information through a calibration matrix and an RGB matrix to obtain the radar point cloud image of the target vehicle.
3. The millimeter wave radar-based vehicle detection system according to claim 1, wherein the image fusion module includes:
a decision and fusion unit: configured to generate the ROI region and extract features of the visual image, complete target recognition based on the visual sensor, and output the final detection target by performing decision-level fusion on the radar point cloud image and the recognition result of the visual sensor.
4. The millimeter wave radar-based vehicle detection system according to claim 3, wherein the image fusion module further comprises a fusion image acquisition unit, configured to sample the final detection target with convolution layers of different scales, add the sampled features to obtain a spatial attention weight matrix, splice the spatial attention weight matrix with the final detection target, and output the fused image.
5. The millimeter wave radar-based vehicle detection system according to claim 4, wherein the fusion image acquisition unit includes several convolution layers of different scales, namely a convolution layer with a 1×1 kernel, a convolution layer with a 3×3 kernel, and a convolution layer with a 5×5 kernel.
6. The millimeter wave radar-based vehicle detection system according to claim 1, wherein the parameter calibration module comprises:
a spatial coordinate system alignment unit: configured to align the world coordinate system, the radar coordinate system, the vision sensor coordinate system, the image coordinate system and the pixel coordinate system, wherein the radar coordinate system is established at the vehicle badge on the central axis of the vehicle where the millimeter wave radar is installed, the vision sensor coordinate system is established at the vehicle position where the monocular camera is installed, and the world coordinate system is established on the line connecting the front and rear axles of the vehicle;
a pixel coordinate system alignment unit: configured to adjust the internal and external parameters of the vision sensor respectively and obtain the rotation matrix and translation vector of the vision sensor.
7. The millimeter wave radar-based vehicle detection system according to claim 1, wherein the parameter calibration module includes a radar projection size determination unit, configured to project the radar point cloud data into the visual image according to the coordinate system conversion relation and determine the radar projection ROI region according to the position coordinates of the center point.
8. The millimeter wave radar-based vehicle detection system according to claim 7, wherein the coordinate system conversion relationship is:
wherein d is the relative distance detected by the radar, α is the relative angle, Y₀ is the relative height between the origin of the radar coordinate system and the origin of the world coordinate system, Z₀ is the relative longitudinal distance between the origin of the radar coordinate system and the origin of the world coordinate system, and (u, v) is the coordinate position of the radar detection point in the pixel coordinate system.
9. The millimeter wave radar-based vehicle detection system according to claim 1, wherein the decision fusion rule includes:
and if the visual detection frame is completely overlapped with the radar projection ROI region, outputting a detection result.
CN202311395165.9A 2023-10-25 2023-10-25 Millimeter wave radar-based vehicle detection system Pending CN117452410A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311395165.9A CN117452410A (en) 2023-10-25 2023-10-25 Millimeter wave radar-based vehicle detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311395165.9A CN117452410A (en) 2023-10-25 2023-10-25 Millimeter wave radar-based vehicle detection system

Publications (1)

Publication Number Publication Date
CN117452410A true CN117452410A (en) 2024-01-26

Family

ID=89581112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311395165.9A Pending CN117452410A (en) 2023-10-25 2023-10-25 Millimeter wave radar-based vehicle detection system

Country Status (1)

Country Link
CN (1) CN117452410A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117784121A (en) * 2024-02-23 2024-03-29 四川天府新区北理工创新装备研究院 Combined calibration method and system for road side sensor and electronic equipment
CN117854046A (en) * 2024-03-07 2024-04-09 北京理工大学前沿技术研究院 Integrated positioning system and device based on vision fusion
CN117854046B (en) * 2024-03-07 2024-05-14 北京理工大学前沿技术研究院 Integrated positioning system and device based on vision fusion

Similar Documents

Publication Publication Date Title
US20210124013A1 (en) Signal processing apparatus, signal processing method, and program
US10445928B2 (en) Method and system for generating multidimensional maps of a scene using a plurality of sensors of various types
CN117452410A (en) Millimeter wave radar-based vehicle detection system
CN111712731A (en) Target detection method and system and movable platform
CN112215306B (en) Target detection method based on fusion of monocular vision and millimeter wave radar
US20090122136A1 (en) Object detection device
JP7072641B2 (en) Road surface detection device, image display device using road surface detection device, obstacle detection device using road surface detection device, road surface detection method, image display method using road surface detection method, and obstacle detection method using road surface detection method
CN117441113A (en) Vehicle-road cooperation-oriented perception information fusion representation and target detection method
CN109993060B (en) Vehicle omnidirectional obstacle detection method of depth camera
JP2007255979A (en) Object detection method and object detector
CN113850102B (en) Vehicle-mounted vision detection method and system based on millimeter wave radar assistance
CN114495064A (en) Monocular depth estimation-based vehicle surrounding obstacle early warning method
Wang et al. Vehicle detection and width estimation in rain by fusing radar and vision
CN110379178A (en) Pilotless automobile intelligent parking method based on millimetre-wave radar imaging
CN111736153A (en) Environment detection system, method, apparatus, and medium for unmanned vehicle
CN115876198A (en) Target detection and early warning method, device, system and medium based on data fusion
Jiang et al. Target detection algorithm based on MMW radar and camera fusion
CN113989766A (en) Road edge detection method and road edge detection equipment applied to vehicle
CN117111055A (en) Vehicle state sensing method based on thunder fusion
CN114037972A (en) Target detection method, device, equipment and readable storage medium
US20240144507A1 (en) Electronic device and control method
CN115187941A (en) Target detection positioning method, system, equipment and storage medium
CN114550142A (en) Parking space detection method based on fusion of 4D millimeter wave radar and image recognition
CN113189581A (en) Millimeter wave radar and visual fusion fog penetration target recognition algorithm processing method
CN116978009A (en) Dynamic object filtering method based on 4D millimeter wave radar

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination