CN113267779A - Target detection method and system based on radar and image data fusion - Google Patents
- Publication number
- CN113267779A (application CN202110536632.XA)
- Authority
- CN
- China
- Prior art keywords
- target
- radar
- point cloud
- detection
- signal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/933—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
Abstract
The invention discloses a target detection method and system based on the fusion of radar and image data. The system comprises an inspection area environment sensing module, a millimeter-wave signal processing module, a point cloud analysis and processing module, a deep learning computer and a display module, with wired or wireless transmission between the modules. By using a millimeter-wave radar as the environment perception sensor of the inspection area, the invention effectively avoids the influence of environmental factors such as lighting on the performance of the security system; by reading and analyzing only the radar point cloud data of the sub-area where a target is located, it reduces the computation the deep learning computer spends on non-dynamic, redundant information of the inspection area, effectively improving the real-time performance of target detection.
Description
Technical Field
The invention relates to the field of power transmission line inspection, and in particular to a target detection technique based on the fusion of radar and image data.
Background
In recent years, intelligent inspection technology for power transmission lines, with modern information technology at its core, has developed rapidly. It aims to realize condition monitoring, evaluation, maintenance and risk early warning for key power transmission equipment, together with online data collection and statistics. Raising the intelligence level of power transmission line operation and maintenance, and proactively preventing and responding to transmission accidents, is of great significance to the safety of the energy Internet.
With continuous progress in aviation and positioning technology, operation and maintenance inspection methods for power transmission lines have gradually diversified, and machine-based inspection modes such as manned helicopters and unmanned aerial vehicles (UAVs) have been widely adopted for their efficiency, accuracy and broad applicability. Although these modes effectively avoid the safety hazards of manual pole-climbing inspection on site, the overall workload and resource consumption remain essentially unchanged, and autonomous UAV inspection still suffers from being time-consuming and inefficient.
To realize autonomous UAV inspection, the surrounding complex environment is usually modeled to provide position reference information for path planning; inspection targets in the modeled environment are then detected for path planning, so how to detect inspection targets in the modeled environment becomes a key problem.
Disclosure of Invention
The invention aims to solve the technical problem of insufficient target detection precision, and provides a target detection method based on the fusion of radar and image data.
The invention adopts the following technical scheme for solving the technical problems:
the invention provides a target detection method based on radar and image data fusion, which comprises the following steps:
(1) and detecting the object based on the encoder network:
a CenterNet detection network is adopted to generate preliminary detections on the image: a fully convolutional encoder-decoder backbone network, here a DLA network, extracts image features, from which the object center point, 2D size, center offset and 3D size are predicted, providing a two-dimensional bounding box and a preliminary 3D bounding box for each detected object in the scene;
(2) and performing data association based on a cone method:
the radar detection results are associated with the corresponding objects on the image plane; a frustum association method uses the target's two-dimensional bounding box together with its estimated depth and size to create a 3D RoI frustum for the target, narrowing the radar detection range that must be checked; the estimated depth, size and rotation angle of the target are then used to create an RoI around the target, further filtering out radar detections irrelevant to the target;
(3) image feature supplementation is performed based on radar detection:
the associated target features are concatenated with a feature map composed of the depth information detected from the radar data, and regression of the 3D target's depth, rotation and attributes is performed; for each radar detection associated with a target, three heat-map channels, centered at the target's center and contained inside its two-dimensional bounding box, are generated.
Further, in the target detection method provided by the invention, in step (1) the radar transmits radio waves to sense the environment and measures the reflected waves to determine the target position; each detected target is reported as a 2D point in the bird's-eye view (BEV), providing the azimuth angle and radial distance to the target.
Further, in the target detection method provided by the invention, in step (2) the 3D RoI frustum is created so that any point in the target's two-dimensional bounding box that lies beyond this frustum is ignored, narrowing the radar detection range that must be checked for association; the size of the RoI frustum is controlled by an introduced parameter δ, and when the RoI around the target is created, if multiple radar detections fall inside the RoI, the nearest point is taken as the radar detection point corresponding to the target.
Further, in the target detection method provided by the invention, in step (3) the generated heat map is connected to the image features as additional channels, and these features serve as the input of a secondary regression head to recalculate the target position; the attribute regression head estimates different attributes for different target classes, and the secondary regression head consists of three convolutional layers with 3×3 kernels followed by a 1×1 convolutional layer that generates the required output.
Further, in the target detection method provided by the invention, in step (1) the radar emits radio waves to sense the environment and measures the reflected waves to determine the target position, specifically as follows:
(101) the signal generated by the radio-frequency front end of the millimeter-wave radar is, on one hand, transmitted through the transmitting antenna as the transmit signal, radiating in different directions across the inspection area; on the other hand it serves as the local-oscillator signal and is mixed in a mixer with the echo signals, reflected by the target and environmental objects, that are received by the receiving antenna, yielding an intermediate-frequency signal; the intermediate-frequency signal carries the time delay corresponding to the radial distance between the target and the radar, from which that distance can be obtained;
(102) simultaneous multi-beam digital synthesis of the azimuth and elevation receive beams is realized by digital beamforming; frequency-modulation deskew (de-chirp) processing is applied to the multiple beam signals, with the rising and falling segments of the triangular FM waveform deskewed separately, decoupling the target distance;
(103) a fast Fourier transform is applied to the deskewed echo signal of each receive beam to obtain its spectrum, and the target distance is calculated from the spectrum using the frequency-to-range relation of the frequency-modulated continuous wave.
The invention also provides a target detection system based on the fusion of radar and image data, comprising an inspection area environment sensing module, a millimeter-wave signal processing module, a point cloud analysis and processing module, a deep learning computer and a display module, with wired or wireless transmission between the modules, wherein:
the inspection area environment sensing module consists of a millimeter wave radar and is used for transmitting a radio signal to an inspection area, receiving a reflected signal and collecting point cloud data;
the millimeter wave signal processing module is used for carrying out target detection according to the millimeter wave echo signal and acquiring target position parameters;
the point cloud analysis and processing module reads the point cloud data of the sub-area of the inspection area where the target is detected, according to the target position parameters output by the millimeter-wave signal processing module, and projects the point cloud data onto the horizontal plane to obtain a point cloud top view;
the deep learning computer detects a target through a deep learning algorithm according to the point cloud top view output by the point cloud analysis and processing module;
and the display module stores and displays the information identified in the inspection area on the platform and notifies operation and maintenance personnel to carry out maintenance.
Compared with the prior art, the technical scheme of the invention has the following technical effects:
The millimeter-wave radar serves as the environment perception sensor of the inspection area, effectively avoiding the influence of environmental factors such as lighting on the performance of the security system. The proposed method detects targets in the inspection area with the millimeter-wave radar and reads only the radar point cloud data of the sub-area where a target is located for processing and analysis, reducing the computation the deep learning computer spends on non-dynamic, redundant information of the inspection area and effectively improving the real-time performance of target detection. Target detection and classification are performed in combination with a deep learning algorithm, and the radar detection range to be checked is narrowed by creating a frustum for each target, making full use of the radar data and improving detection precision.
Drawings
Fig. 1 is a flowchart of a target detection method based on radar and image data fusion provided by the present invention.
FIG. 2 is a schematic structural diagram of a deep learning network model adopted for target detection based on radar and image data fusion.
Fig. 3 is a hardware structure diagram of a target detection method based on radar and image data fusion provided by the present invention.
Detailed Description
The technical scheme of the invention is further explained in detail by combining the attached drawings:
As shown in fig. 1, the invention provides a target detection method based on the fusion of radar and image data, comprising object detection based on an encoder network, data association based on a view-frustum method, and image feature supplementation based on radar detection. The specific flow is as follows:
(1) For object detection based on the encoder network, a CenterNet detection network generates preliminary detections on the image. A fully convolutional encoder-decoder backbone network, here a DLA network, extracts image features, from which the object center point, 2D size (width and height), center offset, 3D size and other quantities are predicted, providing an accurate 2D bounding box and a preliminary 3D bounding box for each detected object in the scene. The radar, meanwhile, emits radio waves to sense the environment and measures the reflected waves to determine target positions; each detected target is reported as a 2D point in the bird's-eye view (BEV), providing the azimuth angle and radial distance to the target.
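As a concrete illustration of these prediction heads, the following PyTorch sketch shows how a set of per-location regression heads on top of backbone features can produce the center heat map, 2D size, center offset and 3D size described above. All layer widths, head names and the class count are hypothetical, not taken from the patent:

```python
import torch
import torch.nn as nn

class CenterHeads(nn.Module):
    """Minimal sketch of CenterNet-style prediction heads (names hypothetical).

    Given backbone features (e.g. from a DLA encoder-decoder), each head
    regresses one quantity per spatial location: a per-class center heatmap,
    the 2D box size (w, h), the sub-pixel center offset, and the 3D size.
    """
    def __init__(self, in_ch=64, num_classes=3):
        super().__init__()
        def head(out_ch):
            # each head: one 3x3 conv + ReLU, then a 1x1 conv to the output
            return nn.Sequential(
                nn.Conv2d(in_ch, in_ch, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(in_ch, out_ch, 1))
        self.heatmap = head(num_classes)  # object center likelihood per class
        self.wh = head(2)                 # 2D width / height
        self.offset = head(2)             # sub-pixel center offset
        self.dim3d = head(3)              # 3D size (l, w, h)

    def forward(self, feat):
        return {"heatmap": torch.sigmoid(self.heatmap(feat)),
                "wh": self.wh(feat),
                "offset": self.offset(feat),
                "dim3d": self.dim3d(feat)}

feat = torch.randn(1, 64, 56, 56)   # stand-in for a backbone feature map
out = CenterHeads()(feat)
```

Peaks in the `heatmap` output give candidate object centers; the other heads are read out at those peak locations to assemble the 2D and preliminary 3D boxes.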
(2) Data association based on the view-frustum method associates each radar detection with the corresponding object on the image plane. The frustum association method uses the target's two-dimensional bounding box together with its estimated depth and size to create a 3D RoI frustum for the target, narrowing the radar detection range that must be checked for association; the estimated depth, size and rotation angle of the target are then used to create an RoI around the target, further filtering out radar detections irrelevant to it.
Creating the 3D RoI frustum allows any point outside the frustum to be ignored, narrowing the radar detection range that must be checked for association. The size of the RoI frustum is controlled by an introduced parameter δ, and when the RoI around the target is created, if multiple radar detections fall inside the RoI, the nearest point is taken as the radar detection point corresponding to the target.
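The frustum filtering and nearest-point rule above can be sketched as follows. The projection convention, the interpretation of δ as a relative depth tolerance, and all parameter names are assumptions for illustration only:

```python
import numpy as np

def frustum_associate(radar_pts, bbox2d, cam_K, depth_est, delta=0.1):
    """Associate radar detections with one image target (illustrative sketch).

    radar_pts : (N, 3) radar points in camera coordinates [x, y, z(depth)], metres.
    bbox2d    : target 2D box (x1, y1, x2, y2) in pixels.
    cam_K     : 3x3 camera intrinsics used to project radar points.
    depth_est : monocular depth estimate for the target, metres.
    delta     : hypothetical parameter controlling the frustum depth extent.

    Points projecting outside the 2D box, or with depth outside
    [depth_est*(1-delta), depth_est*(1+delta)], are ignored; of the
    remaining candidates the nearest one is returned, matching the
    'nearest radar detection in the RoI' rule in the text.
    """
    z = radar_pts[:, 2]
    uv = (cam_K @ radar_pts.T).T          # project points to the image plane
    uv = uv[:, :2] / uv[:, 2:3]
    x1, y1, x2, y2 = bbox2d
    in_box = (uv[:, 0] >= x1) & (uv[:, 0] <= x2) & \
             (uv[:, 1] >= y1) & (uv[:, 1] <= y2)
    in_depth = np.abs(z - depth_est) <= delta * depth_est
    cand = np.where(in_box & in_depth & (z > 0))[0]
    if cand.size == 0:
        return None                        # no radar detection associated
    return cand[np.argmin(z[cand])]        # the nearest point wins

pts = np.array([[0.0, 0.0, 10.0],   # inside the frustum, nearest
                [0.0, 0.0, 12.0],   # inside, but farther
                [5.0, 0.0, 10.0]])  # projects outside the 2D box
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
idx = frustum_associate(pts, (300, 220, 340, 260), K, 10.0, delta=0.25)
```

Because only the nearest in-frustum detection is kept, two overlapping objects with different depths each receive their own radar point, which is what eliminates the multiple-association problem described later.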
(3) Image feature supplementation based on radar detection concatenates the associated target features with a feature map composed of the depth information detected from the radar data, and performs regression of the 3D target's depth, rotation and attributes. Specifically, for each radar detection associated with a target, three heat-map channels, centered at the target's center and contained inside its 2D bounding box, are generated; the generated heat map is connected to the image features as additional channels, and these features serve as the input of a secondary regression head that recalculates the target position. The attribute regression head estimates different attributes for different target classes. The secondary regression head consists of three convolutional layers with 3×3 kernels followed by a 1×1 convolutional layer that generates the required output. The additional convolutional layers help learn higher-level features from the radar channels than the primary regression head, and the regression head results are decoded into a 3D bounding box.
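A minimal sketch of the radar heat-map channels and the secondary regression head described above is given below. The choice of radar values placed in the three channels, the channel widths, and all names are assumptions for illustration:

```python
import torch
import torch.nn as nn

def radar_heatmap(feat_hw, bbox2d, values):
    """Build the three extra channels for one associated radar detection.

    feat_hw : (H, W) spatial size of the image feature map.
    bbox2d  : (x1, y1, x2, y2) of the target on the feature map.
    values  : three radar-derived scalars (e.g. depth and two velocity
              components; the exact choice is an assumption here).

    Each channel is zero everywhere except inside the target's 2D box,
    where it holds the corresponding radar value, i.e. the heat map is
    centered on and contained inside the box as the text describes.
    """
    H, W = feat_hw
    hm = torch.zeros(3, H, W)
    x1, y1, x2, y2 = [int(v) for v in bbox2d]
    for c, v in enumerate(values):
        hm[c, y1:y2, x1:x2] = v
    return hm

class SecondaryHead(nn.Module):
    """Hypothetical secondary regression head: three 3x3 conv layers
    followed by a 1x1 conv, applied to image features concatenated
    with the radar heat-map channels."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        layers, ch = [], in_ch
        for _ in range(3):
            layers += [nn.Conv2d(ch, 64, 3, padding=1), nn.ReLU(inplace=True)]
            ch = 64
        layers += [nn.Conv2d(64, out_ch, 1)]
        self.net = nn.Sequential(*layers)

    def forward(self, img_feat, radar_hm):
        return self.net(torch.cat([img_feat, radar_hm], dim=1))

img_feat = torch.randn(1, 64, 56, 56)
hm = radar_heatmap((56, 56), (10, 10, 20, 20), (0.5, 0.1, -0.2)).unsqueeze(0)
depth = SecondaryHead(64 + 3, 1)(img_feat, hm)   # re-estimated depth map
```

The same concatenated features would feed further secondary heads for rotation and attributes; only the depth head is shown to keep the sketch short.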
The main processing steps of the invention are further described as follows:
1. Target perception and positioning based on millimeter-wave radar
(1) The signal generated by the radio-frequency front end of the millimeter-wave radar is, on one hand, transmitted through the transmitting antenna as the transmit signal, radiating in different directions across the inspection area; on the other hand it serves as the local-oscillator signal and is mixed in a mixer with the echo signals, reflected by the target and environmental objects, that are received by the receiving antenna, yielding an intermediate-frequency signal. The intermediate-frequency signal carries the time delay corresponding to the radial distance between the target and the radar, from which that distance can be obtained.
(2) Simultaneous multi-beam digital synthesis of the azimuth and elevation receive beams is realized by digital beamforming. Frequency-modulation deskew (de-chirp) processing is applied to the multiple beam signals, with the rising and falling segments of the triangular FM waveform deskewed separately, decoupling the target distance.
(3) A fast Fourier transform is applied to the deskewed echo signal of each receive beam to obtain its spectrum, and the target distance is calculated from the spectrum using the frequency-to-range relation of the frequency-modulated continuous wave.
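Steps (1) to (3) can be illustrated with a simulated de-chirped echo. After mixing, a target at range R appears as a beat tone at f_b = 2BR/(cT), so an FFT of the beat signal plus this frequency-to-range relation recovers the distance. All radar parameters below are hypothetical:

```python
import numpy as np

c = 3e8          # propagation speed, m/s
B = 150e6        # sweep bandwidth, Hz (assumed)
T = 1e-3         # duration of the rising sweep segment, s (assumed)
fs = 2e6         # ADC sample rate, Hz (assumed)
R_true = 120.0   # simulated target range, m

# Ideal de-chirped (deskewed) echo: a pure beat tone at f_b = 2*B*R/(c*T)
f_beat = 2 * B * R_true / (c * T)
t = np.arange(int(fs * T)) / fs
sig = np.cos(2 * np.pi * f_beat * t)

# Spectrum analysis by FFT, then the spectrum-to-range conversion
spec = np.abs(np.fft.rfft(sig))
freqs = np.fft.rfftfreq(sig.size, d=1 / fs)
f_peak = freqs[np.argmax(spec)]
R_est = f_peak * c * T / (2 * B)   # recovered target distance, m
```

With these parameters the FFT bin spacing is fs/N = 1 kHz, i.e. a range resolution of 1 m, so the peak bin recovers the simulated range to within one bin; a real triangular-wave system would repeat this for the falling segment and combine the two results.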
2. Radar point cloud preprocessing
To address inaccurate height information, a radar point cloud preprocessing step of pillar expansion is introduced: each radar point is expanded into a pillar of fixed size. Frustum association then maps each radar detection accurately to the center of its object and minimizes overlap. A radar detection is associated with an object only when the object has a valid ground-truth or detection box and all or part of the radar detection pillar lies inside that box. The frustum relation also prevents radar detections caused by background objects, such as buildings, from being associated with foreground objects.
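The pillar expansion step can be sketched as follows; the pillar dimensions and the axis convention (y taken as the vertical axis) are assumptions, since any fixed size covering plausible object heights works:

```python
import numpy as np

def expand_to_pillars(radar_pts, height=1.5, width=0.5):
    """Expand each radar point into a fixed-size vertical pillar (sketch).

    Radar returns carry little height information, so each 3D point
    (x, y, z) with y vertical is replaced by a pillar, represented here
    by its axis-aligned 3D extent. The 1.5 m x 0.5 m pillar size is an
    assumed default. Returns an (N, 6) array of
    [xmin, ymin, zmin, xmax, ymax, zmax].
    """
    half_w = width / 2.0
    mins = radar_pts - np.array([half_w, 0.0, half_w])   # pillar base corner
    maxs = radar_pts + np.array([half_w, height, half_w])  # pillar top corner
    return np.hstack([mins, maxs])

pts = np.array([[10.0, 0.0, 20.0]])   # a single radar point
pillars = expand_to_pillars(pts)
```

During association, a pillar counts as matched when any part of its extent falls inside the target's RoI frustum, which is far more robust than testing the original height-less point.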
3. Fusion mechanism based on radar and image features
The structure of the deep learning network model adopted for target detection based on radar and image data fusion is shown in FIG. 2. The key to the fusion mechanism is accurately associating the targets detected by the radar with targets in the scene. The center-point target detection network generates a heat map for each target class in the image; peaks in the heat map represent likely object center points, and the image features at those locations are used to estimate the other attributes of the object. To use radar information in this setting, radar-based features must be mapped to the centers of their corresponding targets on the image, which requires an accurate association between radar detections and targets in the scene.
The invention creates a 3D RoI frustum for an object using its two-dimensional bounding box and its estimated depth and size; the frustum is generated when the object has an accurate 2D bounding box, and the estimated object depth, size and rotation are used to create an RoI around the object to further filter out radar detections not relevant to it. The RoI frustum approach helps associate overlapping objects correctly and eliminates the multiple-association problem, since only the closest radar detection within the RoI frustum is associated with the target.
The invention also provides a target detection system based on the fusion of radar and image data; the hardware structure is shown in fig. 3. The system comprises an inspection area environment sensing module, a millimeter-wave signal processing module, a point cloud analysis and processing module, a deep learning computer and a display module, with wired or wireless transmission between the modules. Wherein:
the inspection area environment sensing module is composed of a millimeter wave radar, and the millimeter wave radar is used for transmitting radio signals (electromagnetic waves in a millimeter wave band) to an inspection area, receiving reflected signals and collecting point cloud data. The scanning angle range of the module is respectively 0-360 degrees of horizontal angle and 0-60 degrees of vertical angle,
the millimeter wave signal processing module performs target detection according to the millimeter wave echo signal to acquire target position parameters, and the module can adopt an embedded ARM + FPGA architecture design;
the point cloud analysis and processing module reads the point cloud data of the sub-region of the inspection region where the target is detected, according to the target position parameters output by the millimeter-wave signal processing module, and projects the point cloud onto the horizontal plane to obtain a point cloud top view; the module can adopt an embedded ARM + FPGA architecture design;
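The projection performed by this module can be sketched as a simple occupancy-grid top view; the coverage ranges, the 0.1 m grid resolution and the axis convention below are assumptions for illustration:

```python
import numpy as np

def point_cloud_top_view(points, x_range=(0, 50), y_range=(-25, 25), res=0.1):
    """Project 3D radar points onto the horizontal plane as an occupancy
    top view (sketch; ranges and resolution are assumed values).

    points : (N, 3) array [x_forward, y_left, z_up] in metres.
    Returns a 2D uint8 grid where a cell is 1 if any point falls in it,
    i.e. the 'point cloud top view' fed to the deep learning computer.
    """
    h = round((x_range[1] - x_range[0]) / res)
    w = round((y_range[1] - y_range[0]) / res)
    grid = np.zeros((h, w), dtype=np.uint8)
    # drop the z coordinate and quantize x, y onto the grid
    ix = ((points[:, 0] - x_range[0]) / res).astype(int)
    iy = ((points[:, 1] - y_range[0]) / res).astype(int)
    keep = (ix >= 0) & (ix < h) & (iy >= 0) & (iy < w)
    grid[ix[keep], iy[keep]] = 1
    return grid

pts = np.array([[10.0, 0.0, 2.0],     # inside the covered sub-region
                [100.0, 0.0, 0.0]])   # outside, silently dropped
grid = point_cloud_top_view(pts)
```

A practical variant might store point height or intensity in the cell instead of a binary flag; the binary occupancy form is the simplest input a BEV detection network can consume.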
the deep learning computer performs target detection through a deep learning algorithm on the point cloud top view output by the point cloud analysis and processing module; this module can be a computer equipped with a GPU such as an NVIDIA GeForce RTX 2080 Ti;
the display module stores and displays the information identified in the inspection area, such as 'date - longitude/latitude - tower number - faulty component - specific fault condition - fault grade', on the platform and notifies operation and maintenance personnel to carry out maintenance.
The above embodiments only illustrate the technical idea of the invention and do not limit it; any modification made on the basis of the technical scheme according to the technical idea of the invention falls within the scope of protection of the invention.
Claims (6)
1. A target detection method based on radar and image data fusion is characterized by comprising the following steps:
(1) and detecting the object based on the encoder network:
a CenterNet detection network is adopted to generate preliminary detections on the image: a fully convolutional encoder-decoder backbone network, here a DLA network, extracts image features, from which the object center point, 2D size, center offset and 3D size are predicted, providing a two-dimensional bounding box and a preliminary 3D bounding box for each detected object in the scene;
(2) and performing data association based on a cone method:
the radar detection results are associated with the corresponding objects on the image plane; a frustum association method uses the target's two-dimensional bounding box together with its estimated depth and size to create a 3D RoI frustum for the target, narrowing the radar detection range that must be checked; the estimated depth, size and rotation angle of the target are then used to create an RoI around the target, further filtering out radar detections irrelevant to the target;
(3) image feature supplementation is performed based on radar detection:
the associated target features are concatenated with a feature map composed of the depth information detected from the radar data, and regression of the 3D target's depth, rotation and attributes is performed; for each radar detection associated with a target, three heat-map channels, centered at the target's center and contained inside its two-dimensional bounding box, are generated.
2. The target detection method of claim 1, wherein in step (1) the radar transmits radio waves to sense the environment and measures the reflected waves to determine the target position; each detected target is reported as a 2D point in the bird's-eye view (BEV), providing the azimuth angle and radial distance to the target.
3. The target detection method of claim 1, wherein in step (2) the 3D RoI frustum is created and any point in the target's two-dimensional bounding box that lies beyond this frustum is ignored, narrowing the radar detection range to be checked; the size of the RoI frustum is controlled by an introduced parameter δ, and when the RoI around the target is created, if multiple radar detections fall inside the RoI, the nearest point is taken as the radar detection point corresponding to the target.
4. The target detection method of claim 1, wherein in step (3) the generated heat map is connected to the image features as additional channels, and these features serve as the input of a secondary regression head to recalculate the target position; the attribute regression head estimates different attributes for different target classes, and the secondary regression head consists of three convolutional layers with 3×3 kernels followed by a 1×1 convolutional layer that generates the required output.
5. The target detection method of claim 2, wherein in step (1) the radar emits radio waves to sense the environment and measures the reflected waves to determine the target position, specifically as follows:
(101) the signal generated by the radio-frequency front end of the millimeter-wave radar is, on one hand, transmitted through the transmitting antenna as the transmit signal, radiating in different directions across the inspection area; on the other hand it serves as the local-oscillator signal and is mixed in a mixer with the echo signals, reflected by the target and environmental objects, that are received by the receiving antenna, yielding an intermediate-frequency signal; the intermediate-frequency signal carries the time delay corresponding to the radial distance between the target and the radar, from which that distance can be obtained;
(102) simultaneous multi-beam digital synthesis of the azimuth and elevation receive beams is realized by digital beamforming; frequency-modulation deskew (de-chirp) processing is applied to the multiple beam signals, with the rising and falling segments of the triangular FM waveform deskewed separately, decoupling the target distance;
(103) a fast Fourier transform is applied to the deskewed echo signal of each receive beam to obtain its spectrum, and the target distance is calculated from the spectrum using the frequency-to-range relation of the frequency-modulated continuous wave.
6. A target detection system based on the fusion of radar and image data, characterized by comprising an inspection area environment sensing module, a millimeter-wave signal processing module, a point cloud analysis and processing module, a deep learning computer and a display module, with wired or wireless transmission between the modules, wherein:
the inspection area environment sensing module consists of a millimeter wave radar and is used for transmitting a radio signal to an inspection area, receiving a reflected signal and collecting point cloud data;
the millimeter wave signal processing module is used for carrying out target detection according to the millimeter wave echo signal and acquiring target position parameters;
the point cloud analysis and processing module reads the point cloud data of the sub-area of the inspection area where the target is detected, according to the target position parameters output by the millimeter-wave signal processing module, and projects the point cloud data onto the horizontal plane to obtain a point cloud top view;
the deep learning computer detects a target through a deep learning algorithm according to the point cloud top view output by the point cloud analysis and processing module;
and the display module is used for storing and displaying the information identified in the inspection area on the platform so as to inform operation and maintenance personnel to overhaul.
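The point-cloud top-view step in claim 6 can be sketched as a simple horizontal-plane projection: 3D points are binned into a 2D grid whose cells record, for example, the maximum point height. The grid extents, resolution, and max-height encoding below are assumptions for illustration; the patent does not specify them.

```python
import numpy as np

def point_cloud_to_top_view(points, x_range=(0.0, 50.0),
                            y_range=(-25.0, 25.0), res=0.25):
    """points: (N, 3) array of (x, y, z) in meters; returns a 2D max-height image."""
    h = int((x_range[1] - x_range[0]) / res)
    w = int((y_range[1] - y_range[0]) / res)
    top = np.zeros((h, w), dtype=np.float32)
    # Keep only points that fall inside the grid
    m = ((points[:, 0] >= x_range[0]) & (points[:, 0] < x_range[1]) &
         (points[:, 1] >= y_range[0]) & (points[:, 1] < y_range[1]))
    xs = ((points[m, 0] - x_range[0]) / res).astype(int)
    ys = ((points[m, 1] - y_range[0]) / res).astype(int)
    # Projecting onto the horizontal plane: each cell keeps its highest point
    np.maximum.at(top, (xs, ys), points[m, 2])
    return top

# Usage: three hypothetical radar points, the first two landing in one cell
pts = np.array([[10.0, 0.0, 1.2], [10.1, 0.05, 2.0], [30.0, -5.0, 0.5]])
img = point_cloud_to_top_view(pts)
print(img.shape)  # (200, 200)
print(img.max())  # 2.0
```

The resulting image can then be fed to a 2D detector, which is how a deep learning computer such as the one in claim 6 could consume the top view.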
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110536632.XA CN113267779B (en) | 2021-05-17 | 2021-05-17 | Target detection method and system based on radar and image data fusion |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113267779A true CN113267779A (en) | 2021-08-17 |
CN113267779B CN113267779B (en) | 2024-08-06 |
Family
ID=77231334
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110536632.XA Active CN113267779B (en) | 2021-05-17 | 2021-05-17 | Target detection method and system based on radar and image data fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113267779B (en) |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107609522A (en) * | 2017-09-19 | 2018-01-19 | 东华大学 | A kind of information fusion vehicle detecting system based on laser radar and machine vision |
WO2020135810A1 (en) * | 2018-12-29 | 2020-07-02 | 华为技术有限公司 | Multi-sensor data fusion method and device |
WO2020216316A1 (en) * | 2019-04-26 | 2020-10-29 | 纵目科技(上海)股份有限公司 | Driver assistance system and method based on millimetre wave radar, terminal, and medium |
CN111192295A (en) * | 2020-04-14 | 2020-05-22 | 中智行科技有限公司 | Target detection and tracking method, related device and computer readable storage medium |
CN111652097A (en) * | 2020-05-25 | 2020-09-11 | 南京莱斯电子设备有限公司 | Image millimeter wave radar fusion target detection method |
CN112712129A (en) * | 2021-01-11 | 2021-04-27 | 深圳力维智联技术有限公司 | Multi-sensor fusion method, device, equipment and storage medium |
Non-Patent Citations (1)
Title |
---|
CHEN, Sicheng: "Research on the Application of a Millimeter-Wave Radar and Video Joint Processing System in Security", China Master's Theses Full-text Database, pages 9 - 16 *
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113848825A (en) * | 2021-08-31 | 2021-12-28 | 国电南瑞南京控制系统有限公司 | AGV state monitoring system and method for flexible production workshop |
CN113848825B (en) * | 2021-08-31 | 2023-04-11 | 国电南瑞南京控制系统有限公司 | AGV state monitoring system and method for flexible production workshop |
CN114708585A (en) * | 2022-04-15 | 2022-07-05 | 电子科技大学 | Three-dimensional target detection method based on attention mechanism and integrating millimeter wave radar with vision |
CN114708585B (en) * | 2022-04-15 | 2023-10-10 | 电子科技大学 | Attention mechanism-based millimeter wave radar and vision fusion three-dimensional target detection method |
CN117017276A (en) * | 2023-10-08 | 2023-11-10 | 中国科学技术大学 | Real-time human body tight boundary detection method based on millimeter wave radar |
CN117017276B (en) * | 2023-10-08 | 2024-01-12 | 中国科学技术大学 | Real-time human body tight boundary detection method based on millimeter wave radar |
CN117152647A (en) * | 2023-11-01 | 2023-12-01 | 天津市普迅电力信息技术有限公司 | Unmanned aerial vehicle distribution network completion acceptance method based on multi-view fusion |
CN117152647B (en) * | 2023-11-01 | 2024-01-09 | 天津市普迅电力信息技术有限公司 | Unmanned aerial vehicle distribution network completion acceptance method based on multi-view fusion |
Also Published As
Publication number | Publication date |
---|---|
CN113267779B (en) | 2024-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113267779A (en) | Target detection method and system based on radar and image data fusion | |
CN109670411B (en) | Ship point cloud depth image processing method and system based on generation countermeasure network | |
CN102891453B (en) | Unmanned aerial vehicle patrolling line corridor method and device based on millimeter-wave radar | |
US12066353B2 (en) | Apparatuses and methods for gas flux measurements | |
CN110782465B (en) | Ground segmentation method and device based on laser radar and storage medium | |
CN102508219B (en) | Turbulent current target detection method of wind profiler radar | |
EP3663790A1 (en) | Method and apparatus for processing radar data | |
CN109102702A (en) | Vehicle speed measuring method based on video encoder server and Radar Signal Fusion | |
CN107783133B (en) | Anti-collision system and anti-collision method for fixed-wing unmanned aerial vehicle of millimeter wave radar | |
CN110568433A (en) | High-altitude parabolic detection method based on millimeter wave radar | |
CN111856496A (en) | Pipeline detection method and pipeline detection device | |
CN111980872B (en) | Sensor for measuring distance from wind driven generator blade to tower | |
CN107783116A (en) | Pilotless automobile complex environment anticollision millimetre-wave radar system | |
CN110596731A (en) | Active obstacle detection system and method for metro vehicle | |
CN112051568A (en) | Pitching angle measurement method of two-coordinate radar | |
CN109559525A (en) | A kind of method for monitoring overspeed based on millimetre-wave radar, device and equipment | |
CN113504525B (en) | Fog region visibility inversion method and system | |
CN107783115A (en) | The remote complex environment anticollision millimetre-wave radar system of rotor wing unmanned aerial vehicle | |
CN105445729A (en) | Unmanned aerial vehicle flight three-dimensional track precision detection method and system | |
CN104569923B (en) | Velocity restraint-based Hough transformation fast track starting method | |
CN107783123A (en) | Pilotless automobile complex environment anticollision MMW RADAR SIGNAL USING processing system and method | |
CN106772263A (en) | Surveillance radar over the ground | |
CN111679285A (en) | Optical detection method and device for aircraft wake vortex | |
CN105372650A (en) | Unmanned aerial vehicle flight path precision detection method and device | |
CN107783114A (en) | The remote complex environment anticollision MMW RADAR SIGNAL USING processing system of rotor wing unmanned aerial vehicle and method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||