CN113657270A - Unmanned aerial vehicle tracking method based on deep learning image processing technology - Google Patents

Unmanned aerial vehicle tracking method based on deep learning image processing technology Download PDF

Info

Publication number
CN113657270A
CN113657270A (application CN202110942946.XA)
Authority
CN
China
Prior art keywords
target
image
unmanned aerial
aerial vehicle
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202110942946.XA
Other languages
Chinese (zh)
Inventor
王晓跃
高丽娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Xifeng Intelligent Technology Co ltd
Original Assignee
Jiangsu Xifeng Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Xifeng Intelligent Technology Co ltd filed Critical Jiangsu Xifeng Intelligent Technology Co ltd
Priority to CN202110942946.XA priority Critical patent/CN113657270A/en
Publication of CN113657270A publication Critical patent/CN113657270A/en
Withdrawn legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/66Radar-tracking systems; Analogous systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865Combination of radar systems with lidar systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/21Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an unmanned aerial vehicle tracking method based on a deep learning image processing technology, belonging to the technical field of unmanned aerial vehicles and comprising the following specific steps: (1) collecting a target image to be identified; (2) image preprocessing; (3) target identification; (4) target fusion positioning; (5) target speed measurement; (6) tracking control. According to the invention, a YOLOv3 algorithm model is adopted to perform learning training on target images, and image enhancement processing is performed based on a CLAHE image enhancement preprocessing algorithm, which improves the target identification precision of the unmanned aerial vehicle in complex conditions such as environmental occlusion or illumination changes and avoids situations in which the unmanned aerial vehicle cannot identify the target. In addition, a laser radar and a millimeter wave radar are adopted to measure the target position and moving speed synchronously in real time, and a Kalman filtering algorithm is used to fuse the measurements into an optimal solution, which improves the positioning and tracking performance of the unmanned aerial vehicle in complex environments such as heavy snow and heavy fog and helps avoid losing the target.

Description

Unmanned aerial vehicle tracking method based on deep learning image processing technology
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to an unmanned aerial vehicle tracking method based on a deep learning image processing technology.
Background
Through retrieval, Chinese patent No. CN108010080A discloses an unmanned aerial vehicle tracking system and method in which infrared rays emitted by an infrared emission device guide the unmanned aerial vehicle to track a target beacon accurately; however, the system is susceptible to external factors, cannot track targets in complex environments, and has poor tracking performance. An unmanned aerial vehicle ("UAV") is an aircraft operated by a radio remote control device or by a self-contained program control device. From a technical point of view, UAVs include unmanned fixed-wing aircraft, unmanned vertical take-off and landing aircraft, unmanned airships, unmanned helicopters, unmanned multi-rotor aircraft, unmanned parachute-wing aircraft, and the like. Compared with manned aircraft, UAVs have the advantages of small volume, low manufacturing cost, convenient use, low requirements on the operating environment, and strong battlefield survivability. In recent years, with continuous advances in automation technology, computer vision, and related fields, UAVs have developed rapidly in the military, industrial, and civil domains. Target tracking for micro UAVs, an important branch of UAV application technology, has broad application prospects in public-safety fields such as explosion prevention, anti-terrorism, traffic monitoring, and disaster rescue; it has attracted great attention from researchers worldwide and has become one of the most active research directions in the field. However, most existing UAV tracking systems rely on GPS for positioning, cannot track targets in complex environments, and are easily affected by changing illumination, object occlusion, and wind or snow, so that the tracked target cannot be identified or is lost; an unmanned aerial vehicle tracking method based on deep learning image processing technology is therefore all the more important.
Most existing unmanned aerial vehicle tracking methods position the target with GPS technology or a laser radar and identify and track it with a traditional target algorithm, but such methods are easily affected by environmental occlusion, weather, and changing illumination; their tracking performance is poor, and the target may not be identified or may be lost. Therefore, an unmanned aerial vehicle tracking method based on a deep learning image processing technology is provided.
Disclosure of Invention
The invention aims to solve the defects in the prior art, and provides an unmanned aerial vehicle tracking method based on a deep learning image processing technology.
In order to achieve the purpose, the invention adopts the following technical scheme:
an unmanned aerial vehicle tracking method based on a deep learning image processing technology comprises the following specific steps:
(1) collecting a target image to be identified; acquiring image information shot by an airborne camera of the unmanned aerial vehicle in the flying process at a fixed frequency to obtain a target image set to be identified;
(2) image preprocessing: acquiring the target image set to be identified in the step (1), and performing image preprocessing operation on the target image set one by one through an image processing module;
(3) target identification: adopting a target identification module to perform parallel identification on the target image set to be identified after the image preprocessing, and determining a tracking target;
(4) target fusion positioning: performing target positioning on the tracking target by using a laser radar and a millimeter wave radar which are arranged on an unmanned aerial vehicle body to obtain laser radar positioning data and millimeter wave radar positioning data, and performing fusion positioning on the laser radar positioning data and the millimeter wave radar positioning data to obtain optimal target position data;
(5) measuring the speed of a target: on the basis of the optimal target position data of the tracked target, calculating the moving speed of the tracked target through a laser radar and a millimeter wave radar to obtain laser radar speed measurement data and millimeter wave radar speed measurement data, and performing fusion speed measurement on the laser radar speed measurement data and the millimeter wave radar speed measurement data to obtain the optimal target moving speed data;
(6) tracking control: and controlling the unmanned aerial vehicle to continuously track and shoot the target by using the flight control module according to the optimal target position data and the optimal target moving speed data.
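The six steps above can be sketched as one iteration of a control loop. The sketch below is purely illustrative: the capture, preprocessing, and detection functions are stubs, and the simple weighted average stands in for the Kalman-filter fusion the method actually uses; none of these names come from the patent.

```python
# Illustrative sketch of steps (1)-(6); all functions are hypothetical stand-ins.
from dataclasses import dataclass

@dataclass
class Track:
    position: tuple   # fused (x, y, z), metres
    velocity: tuple   # fused (vx, vy, vz), m/s

def capture_frames():                 # step (1): fixed-frequency capture (stub)
    return ["frame0", "frame1"]

def preprocess(frame):                # step (2): denoising + enhancement (stub)
    return frame

def detect_target(frames):            # step (3): parallel recognition (stub)
    return frames[0]

def fuse(lidar, radar, w=0.5):        # steps (4)/(5): placeholder weighted fusion
    return tuple(w * a + (1 - w) * b for a, b in zip(lidar, radar))

def track_step(lidar_pos, radar_pos, lidar_vel, radar_vel):
    frames = [preprocess(f) for f in capture_frames()]
    _target = detect_target(frames)
    pos = fuse(lidar_pos, radar_pos)  # step (4): fused target position
    vel = fuse(lidar_vel, radar_vel)  # step (5): fused target speed
    return Track(pos, vel)            # step (6): handed to flight control

t = track_step((10.0, 0.0, 5.0), (10.2, 0.1, 5.1),
               (1.0, 0.0, 0.0), (1.2, 0.0, 0.0))
print(t.position, t.velocity)
```

In a real system, `fuse` would be replaced by the Kalman filtering described in the detailed description, and step (6) would issue velocity commands to the flight control module.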
Further, the space-time synchronization module comprises a time synchronization unit and a space synchronization unit, and is used for performing space-time synchronization on data collected by the laser radar and the millimeter wave radar.
Further, the image preprocessing of step (2) includes, but is not limited to, image denoising and image enhancement.
Further, the target recognition module in step (3) is provided with a built-in depth target detection model, and the specific construction process is as follows:
s1: firstly, acquiring a large amount of target image materials, and cutting the target image materials into 416 x 416 sizes one by one to obtain a target image set;
s2: then, the target image set in step S1 is preprocessed by using a CLAHE image enhancement preprocessing algorithm;
s3: then, dividing the preprocessed target image set into a 70% training set and a 30% testing set, and manually labeling the preprocessed 70% training set;
s4: constructing a YOLOv3 algorithm model, inputting 70% of the artificially labeled training set as input data into the model for learning training to obtain a deep target detection model;
s5: and (4) acquiring the 30% test set in the step (S3), inputting the test set into the depth target detection model as input data for testing, outputting the model if the accuracy of the test result reaches 95%, otherwise, resampling until the model reaches expectation.
Further, the CLAHE image enhancement preprocessing algorithm specifically comprises the following steps:
SS 1: firstly, converting the color space of a target image from RGB into HSV;
SS 2: then, partitioning the target image, and carrying out histogram equalization operation on the brightness component of each partition in the HSV color space, wherein the operation cuts and equally divides gray level pixels exceeding a threshold value in the histogram to each gray level;
SS 3: finally, the brightness component and the original hue and saturation component are spliced and then transferred to an RGB color space to obtain an enhanced image.
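The clip-and-redistribute operation of step SS2 can be sketched on a single tile's luminance channel as follows. This is a simplified NumPy illustration: full CLAHE (e.g. OpenCV's `cv2.createCLAHE`) also bilinearly interpolates between tiles, and the clip limit of 40 here is an illustrative choice, not a value from the patent.

```python
# Sketch of SS2: clip the histogram at a threshold, redistribute the excess
# equally over all grey levels, then equalize with the clipped histogram.
import numpy as np

def clipped_equalize(tile, clip_limit=40, n_levels=256):
    """Histogram-equalize one 8-bit tile with a clipped histogram."""
    hist = np.bincount(tile.ravel(), minlength=n_levels).astype(np.float64)
    excess = np.maximum(hist - clip_limit, 0).sum()
    hist = np.minimum(hist, clip_limit) + excess / n_levels  # equal redistribution
    cdf = np.cumsum(hist)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min())        # normalise to [0, 1]
    lut = np.round(cdf * (n_levels - 1)).astype(np.uint8)    # grey-level mapping
    return lut[tile]

rng = np.random.default_rng(0)
tile = rng.integers(100, 160, size=(64, 64), dtype=np.uint8)  # low-contrast tile
out = clipped_equalize(tile)
print(tile.min(), tile.max(), "->", out.min(), out.max())     # contrast widens
```

Per SS1/SS3, this mapping would be applied to the V (brightness) component of each HSV partition, after which the enhanced V is recombined with the original H and S and converted back to RGB.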
Further, the optimal target position data and the optimal target moving speed data are obtained by performing Kalman filtering algorithm fusion on the measurement and calculation results of the laser radar and the millimeter wave radar, respectively.
Compared with the prior art, the invention has the beneficial effects that:
1. according to the unmanned aerial vehicle tracking method based on the deep learning image processing technology, a Yolov3 algorithm model is adopted to perform learning training on a target image, and image enhancement processing is performed based on a CLAHE image enhancement preprocessing algorithm, so that the target identification precision of the unmanned aerial vehicle in complex environments such as environment shielding or illumination condition change is improved; the situation that the unmanned aerial vehicle cannot identify the target is avoided;
2. according to the unmanned aerial vehicle tracking method based on the deep learning image processing technology, the laser radar and the millimeter wave radar are adopted to carry out real-time synchronous measurement on the target position and the moving speed, and Kalman filtering algorithm fusion is utilized to carry out optimal solution calculation.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention.
Fig. 1 is an overall flowchart of an unmanned aerial vehicle tracking method based on a deep learning image processing technique according to the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments.
In the description of the present invention, it is to be understood that the terms "upper", "lower", "front", "rear", "left", "right", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present invention and simplifying the description, and do not indicate or imply that the device or element being referred to must have a particular orientation, be constructed and operated in a particular orientation, and thus, should not be construed as limiting the present invention.
Referring to fig. 1, the embodiment discloses an unmanned aerial vehicle tracking method based on a deep learning image processing technology, and the tracking method specifically includes the following steps:
(1) collecting a target image to be identified; acquiring image information shot by an airborne camera of the unmanned aerial vehicle in the flying process at a fixed frequency to obtain a target image set to be identified;
specifically, the time-space synchronization module comprises a time synchronization unit and a space synchronization unit, and is used for performing time-space synchronization on data collected by the laser radar and the millimeter wave radar.
(2) Image preprocessing: acquiring the target image set to be identified in the step (1), and performing image preprocessing operation on the target image set one by one through an image processing module;
specifically, the image preprocessing includes, but is not limited to, image denoising and image enhancement.
(3) Target identification: performing parallel recognition on a target image set to be recognized after image preprocessing by adopting a target recognition module, and determining a tracking target;
(4) target fusion positioning: performing target positioning on a tracking target by using a laser radar and a millimeter wave radar which are arranged on an unmanned aerial vehicle body to obtain laser radar positioning data and millimeter wave radar positioning data, and performing fusion positioning on the laser radar positioning data and the millimeter wave radar positioning data to obtain optimal target position data;
specifically, the laser radar is a radar system that detects characteristic quantities such as the position and speed of a target by emitting laser beams. Its working principle is to emit a detection signal (laser beam) toward the target, compare the received signal (target echo) reflected from the target with the emitted signal, and, after appropriate processing, obtain relevant information about the target, such as its distance, direction, height, speed, attitude, and even shape, so as to detect, track, and identify targets such as aircraft and missiles. The laser converts an electric pulse into an optical pulse for emission, and the optical receiver restores the optical pulse reflected from the target into an electric pulse that is sent to the display. The millimeter wave radar is a radar that operates in the millimeter wave band; generally, millimeter waves occupy the frequency range of 30 to 300 GHz (wavelength 1 to 10 mm), between microwaves and centimeter waves, and a millimeter wave seeker has the characteristics of small volume, light weight, and high spatial resolution.
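The echo-comparison principle described above reduces, for ranging, to a time-of-flight calculation shared by both sensor types: range = c × round-trip time / 2. A minimal sketch, with an illustrative pulse timing value:

```python
# Time-of-flight ranging common to laser and millimeter wave radar:
# the echo travels to the target and back, so range is half the path length.
C = 299_792_458.0  # speed of light in vacuum, m/s

def range_from_echo(round_trip_s):
    """Range to the target from a pulse's round-trip time in seconds."""
    return C * round_trip_s / 2.0

print(range_from_echo(1e-6))  # a 1 microsecond round trip is roughly 150 m
```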
(5) Measuring the speed of a target: on the basis of the optimal target position data of the tracked target, calculating the moving speed of the tracked target through a laser radar and a millimeter wave radar to obtain laser radar speed measurement data and millimeter wave radar speed measurement data, and performing fusion speed measurement on the laser radar speed measurement data and the millimeter wave radar speed measurement data to obtain optimal target moving speed data;
specifically, the optimal target position data and the optimal target moving speed data are obtained by performing Kalman filtering algorithm fusion on the measurement and calculation results of the laser radar and the millimeter wave radar, respectively;
specifically, the Kalman filtering algorithm is a form of sequential data assimilation, proposed by Kalman for state estimation of random processes; its basic idea is to obtain the optimal estimate of the dynamic system's state variable at the current time from the state estimate at the previous time and the observation at the current time, and it comprises two steps, forecasting and analysis.
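The forecast/analysis cycle can be sketched in one dimension, fusing a lidar and a millimeter wave measurement of the same position. The noise variances and motion model below are illustrative assumptions, not values from the patent; a real implementation would track a multi-dimensional state.

```python
# Minimal 1-D Kalman sketch: forecast, then assimilate two sensor measurements.

def kf_update(x, p, z, r):
    """Standard Kalman analysis step: blend prior (x, p) with measurement (z, r)."""
    k = p / (p + r)                     # Kalman gain
    return x + k * (z - x), (1 - k) * p

def fuse_step(x_prev, p_prev, q, z_lidar, r_lidar, z_radar, r_radar):
    # Forecast: static motion model; process noise q inflates uncertainty.
    x, p = x_prev, p_prev + q
    # Analysis: assimilate lidar, then millimeter wave radar.
    x, p = kf_update(x, p, z_lidar, r_lidar)
    x, p = kf_update(x, p, z_radar, r_radar)
    return x, p

x, p = fuse_step(x_prev=10.0, p_prev=1.0, q=0.1,
                 z_lidar=10.4, r_lidar=0.2, z_radar=10.8, r_radar=0.5)
print(round(x, 3), round(p, 3))  # fused estimate; variance below either sensor's
```

Note that the fused variance `p` ends up smaller than either sensor's measurement variance, which is the quantitative sense in which fusion improves on a single sensor.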
(6) Tracking control: controlling the unmanned aerial vehicle to continuously track and shoot the target by using the flight control module according to the optimal target position data and the optimal target moving speed data;
in this embodiment, the laser radar and the millimeter wave radar are adopted to measure the target position and moving speed synchronously in real time, and the Kalman filtering algorithm is used to fuse the measurements into an optimal solution; compared with a traditional unmanned aerial vehicle tracking method relying on a single infrared emission device, this improves the positioning and tracking performance of the unmanned aerial vehicle in complex environments such as heavy snow and heavy fog and helps avoid losing the tracked target.
Referring to fig. 1, the embodiment discloses an unmanned aerial vehicle tracking method based on a deep learning image processing technology, including: the system comprises an unmanned aerial vehicle body, an airborne camera module, a laser radar, a millimeter wave radar, an image processing module, a target identification module, a time-space synchronization module, a central processing module and a flight control module; except for the same structure as the above embodiment, the present embodiment will specifically describe a target recognition module;
specifically, the target recognition module is internally provided with a deep target detection model, and the specific construction process is as follows: firstly, acquiring a large number of target image materials and cutting them one by one to a size of 416 x 416 to obtain a target image set; then, preprocessing the target image set by using the CLAHE image enhancement preprocessing algorithm; then, dividing the preprocessed target image set into a 70% training set and a 30% test set, and manually labeling the 70% training set; constructing a YOLOv3 algorithm model and inputting the manually labeled 70% training set into the model as input data for learning training to obtain the deep target detection model; finally, acquiring the 30% test set of step S3, inputting it into the deep target detection model as input data for testing, outputting the model if the accuracy of the test result reaches 95%, and otherwise resampling until the model meets expectations;
specifically, the CLAHE image enhancement preprocessing algorithm comprises the following specific steps: firstly, converting the color space of a target image from RGB into HSV; then, partitioning the target image, and carrying out histogram equalization operation on the brightness component of each partition in the HSV color space, wherein the operation cuts and equally divides gray level pixels exceeding a threshold value in the histogram to each gray level; finally, the brightness component and the original hue and saturation component are spliced and then transferred to an RGB color space to obtain an enhanced image;
in this embodiment, a YOLOv3 algorithm model is adopted to perform learning training on the target image, and image enhancement processing is performed based on the CLAHE image enhancement preprocessing algorithm, which improves the target identification precision of the unmanned aerial vehicle in complex environments such as environmental occlusion or illumination changes and avoids situations in which the unmanned aerial vehicle cannot identify the target.
The above description covers only preferred embodiments of the present invention, but the scope of the present invention is not limited thereto; any equivalent replacement or change of the technical solutions and inventive concepts of the present invention that a person skilled in the art could readily conceive within the technical scope disclosed herein shall fall within the scope of the present invention.

Claims (6)

1. An unmanned aerial vehicle tracking method based on a deep learning image processing technology is characterized by comprising the following specific steps:
(1) collecting a target image to be identified; acquiring image information shot by an airborne camera of the unmanned aerial vehicle in the flying process at a fixed frequency to obtain a target image set to be identified;
(2) image preprocessing: acquiring the target image set to be identified in the step (1), and performing image preprocessing operation on the target image set one by one through an image processing module;
(3) target identification: adopting a target identification module to perform parallel identification on the target image set to be identified after the image preprocessing, and determining a tracking target;
(4) target fusion positioning: performing target positioning on the tracking target by using a laser radar and a millimeter wave radar which are arranged on an unmanned aerial vehicle body to obtain laser radar positioning data and millimeter wave radar positioning data, and performing fusion positioning on the laser radar positioning data and the millimeter wave radar positioning data to obtain optimal target position data;
(5) measuring the speed of a target: on the basis of the optimal target position data of the tracked target, calculating the moving speed of the tracked target through a laser radar and a millimeter wave radar to obtain laser radar speed measurement data and millimeter wave radar speed measurement data, and performing fusion speed measurement on the laser radar speed measurement data and the millimeter wave radar speed measurement data to obtain the optimal target moving speed data;
(6) tracking control: and controlling the unmanned aerial vehicle to continuously track and shoot the target by using the flight control module according to the optimal target position data and the optimal target moving speed data.
2. The unmanned aerial vehicle tracking method based on the deep learning image processing technology as claimed in claim 1, wherein the space-time synchronization module comprises a time synchronization unit and a space synchronization unit, and is configured to perform space-time synchronization on data collected by the lidar and the millimeter wave radar.
3. The method for unmanned aerial vehicle tracking based on deep learning image processing technology as claimed in claim 2, wherein the image preprocessing of step (2) includes but is not limited to image denoising and image enhancement.
4. The unmanned aerial vehicle tracking method based on the deep learning image processing technology as claimed in claim 1, wherein the target recognition module in step (3) is embedded with a deep target detection model, and the specific construction process is as follows:
s1: firstly, acquiring a large amount of target image materials, and cutting the target image materials into 416 x 416 sizes one by one to obtain a target image set;
s2: then, the target image set in step S1 is preprocessed by using a CLAHE image enhancement preprocessing algorithm;
s3: then, dividing the preprocessed target image set into a 70% training set and a 30% testing set, and manually labeling the preprocessed 70% training set;
s4: constructing a YOLOv3 algorithm model, inputting 70% of the artificially labeled training set as input data into the model for learning training to obtain a deep target detection model;
s5: and (4) acquiring the 30% test set in the step (S3), inputting the test set into the depth target detection model as input data for testing, outputting the model if the accuracy of the test result reaches 95%, otherwise, resampling until the model reaches expectation.
5. The unmanned aerial vehicle tracking method based on the deep learning image processing technology as claimed in claim 4, wherein the CLAHE image enhancement preprocessing algorithm comprises the following specific steps:
SS 1: firstly, converting the color space of a target image from RGB into HSV;
SS 2: then, partitioning the target image, and carrying out histogram equalization operation on the brightness component of each partition in the HSV color space, wherein the operation cuts and equally divides gray level pixels exceeding a threshold value in the histogram to each gray level;
SS 3: finally, the brightness component and the original hue and saturation component are spliced and then transferred to an RGB color space to obtain an enhanced image.
6. The unmanned aerial vehicle tracking method based on the deep learning image processing technology as claimed in claim 1, wherein the optimal target position data and the optimal target moving speed data are obtained by performing Kalman filtering algorithm fusion on the measurement and calculation results of the laser radar and the millimeter wave radar, respectively.
CN202110942946.XA 2021-08-17 2021-08-17 Unmanned aerial vehicle tracking method based on deep learning image processing technology Withdrawn CN113657270A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110942946.XA CN113657270A (en) 2021-08-17 2021-08-17 Unmanned aerial vehicle tracking method based on deep learning image processing technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110942946.XA CN113657270A (en) 2021-08-17 2021-08-17 Unmanned aerial vehicle tracking method based on deep learning image processing technology

Publications (1)

Publication Number Publication Date
CN113657270A true CN113657270A (en) 2021-11-16

Family

ID=78480045

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110942946.XA Withdrawn CN113657270A (en) 2021-08-17 2021-08-17 Unmanned aerial vehicle tracking method based on deep learning image processing technology

Country Status (1)

Country Link
CN (1) CN113657270A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115390582A (en) * 2022-07-15 2022-11-25 江西理工大学 Point cloud-based multi-rotor unmanned aerial vehicle tracking and intercepting method and system
CN115379408A (en) * 2022-10-26 2022-11-22 斯润天朗(北京)科技有限公司 Scene perception-based V2X multi-sensor fusion method and device
CN115379408B (en) * 2022-10-26 2023-01-13 斯润天朗(北京)科技有限公司 Scene perception-based V2X multi-sensor fusion method and device
CN116382328A (en) * 2023-03-09 2023-07-04 南通大学 Dam intelligent detection method based on cooperation of multiple robots in water and air
CN116382328B (en) * 2023-03-09 2024-04-12 南通大学 Dam intelligent detection method based on cooperation of multiple robots in water and air
CN117111164A (en) * 2023-10-17 2023-11-24 杭州海康威视数字技术股份有限公司 Millimeter wave-based foreign matter detection method and device and electronic equipment
CN117111164B (en) * 2023-10-17 2024-01-26 杭州海康威视数字技术股份有限公司 Millimeter wave-based foreign matter detection method and device and electronic equipment

Similar Documents

Publication Publication Date Title
CN113657270A (en) Unmanned aerial vehicle tracking method based on deep learning image processing technology
US20220197281A1 (en) Intelligent decision-making method and system for unmanned surface vehicle
Alam et al. A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs)
CN108957445A (en) A kind of low-altitude low-velocity small targets detection system and its detection method
US20100305857A1 (en) Method and System for Visual Collision Detection and Estimation
US9165383B1 (en) Point cloud visualization using bi-modal color schemes based on 4D lidar datasets
CN105416584A (en) Post-disaster life tracking unmanned aerial vehicle system
CN113485450A (en) Unmanned aerial vehicle keeps away barrier system based on computer vision
CN107783119A (en) Apply the Decision fusion method in obstacle avoidance system
Zarandy et al. A novel algorithm for distant aircraft detection
CN111831010A (en) Unmanned aerial vehicle obstacle avoidance flight method based on digital space slice
CN110033490B (en) Airport low-slow small target prevention and control method based on photoelectric image automatic identification
RU130410U1 (en) RADAR DEVICE FOR IDENTIFICATION OF AIR OBJECTS
Fasano et al. Sky region obstacle detection and tracking for vision-based UAS sense and avoid
Yan et al. Moving targets detection for video SAR surveillance using multilevel attention network based on shallow feature module
Vitiello et al. Detection and tracking of non-cooperative flying obstacles using low SWaP radar and optical sensors: an experimental analysis
Rzucidło et al. Simulation studies of a vision intruder detection system
CN109815773A (en) A kind of low slow small aircraft detection method of view-based access control model
CN114859962B (en) Unmanned aerial vehicle control method with intelligent obstacle avoidance and constant-height cruising functions
CN110794391A (en) Passive positioning optimization station distribution method based on unmanned aerial vehicle cluster networking platform
CN206038902U (en) Radar early warning system
CN114019996A (en) Trapped person search and rescue system and search and rescue method
CN114545414A (en) Track management method for unmanned aerial vehicle anti-collision radar
Loffi et al. Evaluation of onboard detect-and-avoid system for sUAS BVLOS operations
CN114092522A (en) Intelligent capture tracking method for take-off and landing of airport airplane

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication
WW01 Invention patent application withdrawn after publication

Application publication date: 20211116