CN115440094B - Obstacle detection method, device and storage medium for helicopter ground proximity warning - Google Patents


Publication number
CN115440094B
CN115440094B
Authority
CN
China
Prior art keywords: obstacle, helicopter, information, detection, longitude
Prior art date
Legal status
Active
Application number
CN202210872811.5A
Other languages
Chinese (zh)
Other versions
CN115440094A
Inventor
吴旭峰
刘玉虎
陆洋
陈广永
袁杨
管凯
刘青春
周成中
王弟伟
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN202210872811.5A priority Critical patent/CN115440094B/en
Publication of CN115440094A publication Critical patent/CN115440094A/en
Application granted granted Critical
Publication of CN115440094B publication Critical patent/CN115440094B/en

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 5/00 Traffic control systems for aircraft, e.g. air-traffic control [ATC]
    • G08G 5/04 Anti-collision systems
    • G08G 5/045 Navigation or guidance aids, e.g. determination of anti-collision manoeuvres
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/774 Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Abstract

The application provides an obstacle detection method, device and storage medium for helicopter ground proximity warning. The method comprises the following steps: step 1, performing obstacle detection and tracking on the images returned by an onboard camera using an image detection neural network and a multi-target tracking algorithm to obtain the corresponding information; step 2, performing information fusion to obtain the longitude and latitude information of the obstacle; and step 3, performing data processing on the longitude and latitude information obtained in step 2. Compared with the traditional method of terrain perception through an expensive geodetic radar, the method is specifically optimized for urban scenes and obstacle detection tasks, requires no dedicated geodetic radar, and significantly reduces installation and purchase cost.

Description

Obstacle detection method, device and storage medium for helicopter ground proximity warning
Technical Field
The application belongs to the technical field of helicopter flight safety, and particularly relates to an obstacle detection method, an obstacle detection device and a storage medium for ground proximity warning of a helicopter.
Background
At present, forward-looking warning systems are widely fitted to helicopters to avoid controlled-flight-into-terrain accidents. The basic principle is that the terrain around the helicopter is obtained from an airborne terrain database and a collision-threat prediction is made from the helicopter's current position, attitude and speed; a warning is issued if a possible collision is predicted, and not otherwise. The effectiveness of the forward-looking warning system is therefore closely tied to the airborne terrain database. In most cases, the terrain database is updated on a yearly cycle. For a helicopter that frequently flies in a busy city, where buildings are built and demolished on short cycles, the information in the airborne terrain database is often not updated in time; the warning success rate of the forward-looking warning system in the city therefore drops markedly, and accidents destroying the helicopter and killing its occupants have even occurred. For helicopters flying in cities, this problem can be effectively solved if an active obstacle detection capability is available.
Common devices for active detection on helicopters include airborne radar, infrared devices, visible-light optical devices, and the like. An onboard terrain-scanning radar detects the terrain ahead of the helicopter well, but most helicopters, limited by cost, power consumption and similar constraints, carry no high-performance radar. Infrared devices adapt well to specific tasks such as power-line inspection, but the urban environment is complex, the information obtained by infrared and thermal-imaging devices lacks the obstacle's texture, and it is difficult to distinguish obstacle from background in a cluttered scene. Visible-light optical devices can obtain the obstacle's texture and, through methods such as image detection neural networks and optical flow, the obstacle's type and its position in the image, but the accurate distance between obstacle and helicopter is difficult to obtain. Laser ranging accomplishes the forward-looking obstacle detection task in the city at lower cost, but its function is limited: it returns only the relative distance between the forward obstacle and the helicopter, so it can neither predict the collision threat from the helicopter's current pose and speed nor record the obstacle's position and type for subsequent threat prediction.
Disclosure of Invention
The application aims to: solve the technical problems of low accuracy and poor timeliness of the helicopter's airborne terrain database in urban flight and similar scenarios. To overcome the shortcomings of any single sensor, the application detects the terrain in the helicopter's forward-looking direction by fusing information from radar and visible-light devices. The method is optimized for specific scenes and specific detection tasks, reduces the cost of each selected sensor, and is an active urban obstacle detection method for helicopters that balances economy and safety.
With this method, obstacle information in the helicopter's forward-looking direction can be generated from the helicopter's current position and attitude together with the image and distance information detected by the radar and the visible-light camera, and this information is updated into the airborne terrain database for the helicopter's forward-looking ground proximity warning. The method specifically comprises the following steps:
step 1, performing obstacle detection and tracking on the images returned by the onboard camera using an image detection neural network and a multi-target tracking algorithm, obtaining the corner pixel coordinates, type and sequence number of the obstacles in each image; obtaining the azimuth of each obstacle relative to the helicopter's forward-looking direction from the distance between the obstacle's corner pixel coordinates and the vertical centerline of the image together with the camera's field of view; and using that azimuth to extract the obstacle-to-helicopter distance detected by the onboard millimeter-wave radar;
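As an illustration of this azimuth computation, the following sketch assumes a simple pinhole camera model; the function name, the argument layout and the 90-degree field of view used below are illustrative, not from the patent.

```python
import math

def obstacle_azimuth_deg(x1, x2, img_width_px, hfov_deg):
    """Azimuth of an obstacle relative to the camera's forward axis.

    x1, x2:       left/right abscissae of the detection box (pixels)
    img_width_px: image width in pixels
    hfov_deg:     camera horizontal field of view in degrees
    Positive result means the obstacle lies right of the image centerline.
    """
    box_cx = 0.5 * (x1 + x2)                  # center abscissa of the box
    offset = box_cx - img_width_px / 2.0      # signed offset from centerline
    # Pinhole model: the focal length in pixels follows from the field of view
    focal_px = (img_width_px / 2.0) / math.tan(math.radians(hfov_deg) / 2.0)
    return math.degrees(math.atan2(offset, focal_px))
```

A box centered on the image centerline yields 0 degrees, and a box at the right edge yields half the field of view.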
step 2, information fusion processing is carried out to obtain longitude and latitude information of the obstacle;
and step 3, carrying out data processing on the longitude and latitude information of the obstacle obtained in the step 2, and finishing obstacle detection.
In step 1, a dataset of common urban obstacles is first produced; each obstacle in the dataset is labeled with its type and its corner pixel coordinates; the dataset is then split and normalized, being divided into a training set and a test set;
the image neural network is adjusted so that the multi-target detection network outputs the obstacle corner pixel coordinates and the obstacle type probabilities; after about 100 training iterations with a batch size of 30, the neural network model converges and its weights are obtained;
from the correspondence between the camera's field of view and pixel length, the azimuth of the obstacle relative to the helicopter's forward-looking direction is calculated from the difference between the abscissa of the center of the obstacle detection box and the abscissa of the image center;
and the radar detection distance is extracted by azimuth: all distance data of the latest timestamp within the corresponding azimuth plus the radar's azimuth accuracy error are extracted, their mean is computed, outliers are removed using the mean, and the mean is recomputed to give the final distance between the helicopter and the corresponding obstacle.
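A minimal sketch of this two-pass distance extraction, assuming the azimuth-windowed range samples have already been gathered; the function name and the k-sigma outlier rule are assumptions, since the patent does not state how outliers are identified:

```python
def fuse_radar_range(ranges, k=1.5):
    """Two-pass robust mean of radar range samples within an azimuth window.

    ranges: range samples (meters) whose azimuth falls inside the detected
            azimuth plus/minus the radar's azimuth accuracy error
    k:      samples farther than k standard deviations from the first-pass
            mean are discarded as outliers (assumed tuning constant)
    """
    if not ranges:
        return None
    mean = sum(ranges) / len(ranges)                      # first-pass mean
    std = (sum((r - mean) ** 2 for r in ranges) / len(ranges)) ** 0.5
    kept = [r for r in ranges if abs(r - mean) <= k * std] or ranges
    return sum(kept) / len(kept)                          # recomputed mean
```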
In step 1, the obstacle is detected and tracked from the forward-looking distance and image information obtained by the onboard millimeter-wave radar and the onboard camera; the obstacle sequence number and pixel coordinates are obtained through the image neural network and the multi-target tracking algorithm, and the azimuth of the obstacle relative to the helicopter's forward-looking direction is obtained from the obstacle's pixel coordinates.
The millimeter-wave radar's distance information at the corresponding azimuth is then extracted using that azimuth; the helicopter's current position, altitude and attitude are obtained from the onboard GPS, barometric altimeter and IMU (Inertial Measurement Unit, which measures an object's three-axis attitude angles, or angular rates, and acceleration); and all of the obtained information is collected and spliced together by timestamp.
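The timestamp splicing can be sketched as follows; the function name, the tuple layout and the 50 ms tolerance are illustrative assumptions:

```python
import bisect

def splice_by_timestamp(detections, nav_samples, max_dt=0.05):
    """Attach the nearest GPS/baro/IMU sample to each detection frame.

    detections:  (t, payload) tuples from the camera/radar pipeline
    nav_samples: (t, nav) tuples sorted by time t
    max_dt:      assumed maximum timestamp mismatch (seconds) to accept
    """
    times = [t for t, _ in nav_samples]
    fused = []
    for t, payload in detections:
        i = bisect.bisect_left(times, t)
        cands = [j for j in (i - 1, i) if 0 <= j < len(times)]
        if not cands:
            continue
        j = min(cands, key=lambda j: abs(times[j] - t))   # nearest sample
        if abs(times[j] - t) <= max_dt:
            fused.append((t, payload, nav_samples[j][1]))
    return fused
```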
Step 2 comprises: integrating the data returned by the onboard camera and by the onboard millimeter-wave radar to obtain all detection information frames for the obstacles, where each frame contains an obstacle sequence number, obstacle pixel coordinates, obstacle type, obstacle-to-helicopter distance and azimuth;
the correspondence between a pixel and the actual length is obtained through formula (1):
where η_pix2meter is the conversion coefficient between pixels and actual size; d_rel is the relative distance between the helicopter and the obstacle; θ is the angle between the longitude direction and the obstacle in the world coordinate system; x_1 is the abscissa of the lower-left corner of the obstacle bounding box in the image coordinate system; x_2 is the abscissa of the upper-right corner of the obstacle bounding box in the image coordinate system; h_IMG_WIDTH is the image pixel width;
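A standard pinhole approximation of this pixel-to-meter conversion can be sketched as follows; it omits the world-frame angle θ from the symbol list above, and the function names and the example field of view are assumptions:

```python
import math

def pix_to_meter(d_rel, hfov_deg, img_width_px):
    """Approximate eta_pix2meter: meters spanned by one pixel at range d_rel."""
    ground_width = 2.0 * d_rel * math.tan(math.radians(hfov_deg) / 2.0)
    return ground_width / img_width_px

def box_width_m(x1, x2, d_rel, hfov_deg, img_width_px):
    """Physical width spanned by a detection box with abscissae x1, x2."""
    return (x2 - x1) * pix_to_meter(d_rel, hfov_deg, img_width_px)
```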
the height and width of the obstacle relative to the helicopter are calculated from this correspondence between pixels and actual length, with the thickness of the obstacle set equal to its width;
the longitude and latitude of the obstacle's center point are calculated from the local correspondence between longitude/latitude and actual length together with the helicopter's longitude, latitude and altitude; the longitude and latitude of each obstacle corner are calculated from the obstacle's geometry; and the actual height of the obstacle is obtained by adding its height relative to the helicopter to the helicopter's altitude.
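The conversion from range and bearing to obstacle longitude/latitude can be sketched with a local flat-earth offset; the function name, the spherical-earth radius and the heading convention are assumptions:

```python
import math

EARTH_R = 6371000.0  # assumed mean Earth radius in meters

def obstacle_latlon(lat_deg, lon_deg, heading_deg, azimuth_deg, d_rel):
    """Latitude/longitude of the obstacle center via a flat-earth offset.

    lat_deg, lon_deg: helicopter position
    heading_deg:      forward-looking direction, degrees clockwise from north
    azimuth_deg:      obstacle azimuth relative to that direction
    d_rel:            obstacle range in meters
    """
    brg = math.radians(heading_deg + azimuth_deg)        # absolute bearing
    d_north = d_rel * math.cos(brg)
    d_east = d_rel * math.sin(brg)
    dlat = math.degrees(d_north / EARTH_R)
    dlon = math.degrees(d_east / (EARTH_R * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon
```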
Step 3 comprises: receiving from the information fusion module (103) the frames containing the longitude/latitude coordinates and height of each obstacle corner; arranging all frames with the same obstacle sequence number into the same queue; and taking the data out of the queue once the number of frames in it reaches a threshold (taken here as 5). The Mahalanobis distance reflecting each point's longitude, latitude and height is calculated with formula (2), formula (3) is used to judge whether an outlier exists, and outliers, if any, are eliminated:
where μ_k is the k-th vector of obstacle longitude, latitude and height in the queue, D_M is the Mahalanobis distance between μ_k and the vectors before it, Σ⁻¹ is the inverse of the covariance matrix of the feature vectors, P_M is the probability density at the current Mahalanobis distance, and P_Tr is the truncation probability-density threshold, taken as 0.9 in this application; the binary number b_k indicates whether the detection point is an outlier: b_k = 0 means the point is not an outlier, and b_k = 1 means it is judged to be an outlier;
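The outlier test of formulas (2) and (3) can be sketched as below. Several details are assumptions: the leave-one-out mean, the diagonal covariance (to keep the short 5-frame queues well conditioned), and the conversion of the 0.9 probability threshold into a fixed chi-square quantile:

```python
# 0.9 quantile of the chi-square distribution with 3 degrees of freedom;
# comparing the squared Mahalanobis distance against it is equivalent to
# comparing the distance's CDF value with the truncation threshold P_Tr = 0.9.
D2_TR = 6.251

def mark_outliers(points, d2_tr=D2_TR):
    """Compute the b_k flags for a queue of (lon, lat, height) vectors.

    Returns 1 for a point judged an outlier, 0 otherwise.
    """
    n, dim = len(points), len(points[0])
    flags = []
    for k in range(n):
        rest = [p for i, p in enumerate(points) if i != k]
        mean = [sum(p[j] for p in rest) / len(rest) for j in range(dim)]
        var = [sum((p[j] - mean[j]) ** 2 for p in rest) / len(rest) + 1e-9
               for j in range(dim)]
        # squared Mahalanobis distance with a diagonal Sigma
        d2 = sum((points[k][j] - mean[j]) ** 2 / var[j] for j in range(dim))
        flags.append(int(d2 > d2_tr))
    return flags
```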
after the elimination, the remaining points in the queue are fed in update-time order into formula (4) to obtain a new obstacle information queue, which is then fed into formula (5) to obtain the final output point; the queue for that obstacle sequence number is emptied and again awaits incoming obstacle data:
where μ̂⁻ is the obstacle position-and-height vector at the previous detection, μ is the obstacle longitude/latitude and height at the current detection, and μ̂ is the result of the first-order filtering; ω is the weight matrix, μ_i is the i-th item in the queue of vectors containing the obstacle longitude, latitude and height under the current sequence number, λ_i is its forgetting coefficient, and the final result is the weighted average obtained after superposing the forgetting coefficients.
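Formulas (4) and (5) can be sketched with scalar stand-ins for the weight matrix ω and the forgetting coefficients (a geometric decay favoring the newest frame); all parameter values here are assumptions:

```python
def smooth_queue(queue, omega=0.6, forget=0.9):
    """First-order filtering (formula 4 sketch) followed by a
    forgetting-weighted average (formula 5 sketch) over one obstacle's queue.

    queue: (lon, lat, height) tuples in update-time order
    """
    dim = len(queue[0])
    # formula (4) sketch: mu_hat_i = omega * mu_i + (1 - omega) * mu_hat_{i-1}
    filtered = [list(queue[0])]
    for mu in queue[1:]:
        prev = filtered[-1]
        filtered.append([omega * mu[j] + (1.0 - omega) * prev[j]
                         for j in range(dim)])
    # formula (5) sketch: forgetting-weighted average, newest weighted most
    n = len(filtered)
    weights = [forget ** (n - 1 - i) for i in range(n)]
    total = sum(weights)
    return tuple(sum(w * f[j] for w, f in zip(weights, filtered)) / total
                 for j in range(dim))
```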
In step 3, the processing is executed anew each time the queue of a given sequence number reaches the threshold; the threshold is generally taken as 5.
The application also provides an obstacle detection device for the ground proximity warning of the helicopter, which comprises a visual detection module (101), a radar detection module (102), an information fusion module (103), an obstacle data processing module (104), an airborne topography database (105) and a ground proximity warning system (106) of the helicopter;
the visual detection module (101) uses an image detection neural network and a multi-target tracking algorithm to detect and track the obstacle of the image returned by the airborne camera, so as to obtain the pixel coordinates of the corner point, the type and the serial number of the obstacle in each image; obtaining azimuth angle information of forward looking directions of the obstacle and the helicopter through the relation between the pixel coordinates of the corner points of the obstacle and the distance between the center line of the width of the image and the view angle of the camera, and extracting the distance between the obstacle and the helicopter, which is detected by a radar detection module (102), through the azimuth angle information;
the information fusion module (103) fuses the information obtained by the visual detection module (101), and then calculates the longitude and latitude range and the height of the obstacle according to the current pose of the helicopter;
the obstacle data processing module (104) divides obstacle information containing the same serial number into the same queue and processes the obstacle information, eliminates outliers and obtains processed obstacle information, the processed obstacle information is input into the onboard terrain database (105) for the helicopter ground proximity warning system (106) to use, the onboard terrain database (105) is scanned by the helicopter ground proximity warning system (106) in real time to store terrain elevation data of surrounding environment of the current position of the helicopter, collision threat prediction is carried out based on the terrain elevation data, if threat exists, an alarm prompt picture and sound are sent to the alarm display panel, and if threat does not exist, the alarm prompt picture and sound are not sent.
The device performs the steps of:
in a visual detection module (101), firstly, a common obstacle data set in a city is manufactured, each obstacle in the data set is subjected to label division and obstacle corner pixel coordinate division, after the completion, the data set is subjected to segmentation and normalization operation, and the data set is divided into a training set and a test set;
the image neural network is adjusted so that its final output is the obstacle corner pixel coordinates and the obstacle type probabilities; a suitable learning rate and step size are selected, and the model converges through iterative training, yielding weights able to detect urban obstacles rapidly and accurately;
step a2: in the radar detection module (102), using the obstacle corner pixel coordinates obtained by the visual detection module (101) and the correspondence between the camera's field of view and pixel length, the azimuth of the obstacle relative to the helicopter's forward-looking direction is calculated from the difference between the abscissa of the center of the obstacle detection box and the abscissa of the image center;
the radar detection distance is extracted by azimuth: all distance data of the latest timestamp within the corresponding azimuth plus the radar's azimuth accuracy error are extracted, their mean is computed, outliers are removed using the mean, and the mean is recomputed to give the final distance between the helicopter and the corresponding obstacle;
step a3: in the information fusion module (103), the data returned by the visual detection module (101) and by the radar detection module (102) are integrated to obtain all detection information frames for the obstacles, where each frame contains an obstacle sequence number, obstacle pixel coordinates, obstacle type, obstacle-to-helicopter distance and azimuth;
the correspondence between a pixel and the actual length is obtained through formula (1):
where η_pix2meter is the conversion coefficient between pixels and actual size; d_rel is the relative distance between the helicopter and the obstacle; θ is the angle between the longitude direction and the obstacle in the world coordinate system; x_1 is the abscissa of the lower-left corner of the obstacle bounding box in the image coordinate system; x_2 is the abscissa of the upper-right corner of the obstacle bounding box in the image coordinate system; h_IMG_WIDTH is the image pixel width;
the height and width of the obstacle relative to the helicopter are calculated from this correspondence between pixels and actual length, with the thickness of the obstacle set equal to its width;
the longitude and latitude of the obstacle's center point are calculated from the local correspondence between longitude/latitude and actual length together with the helicopter's longitude, latitude and altitude; the longitude and latitude of each obstacle corner are calculated from the obstacle's geometry; and the actual height of the obstacle is obtained by adding its height relative to the helicopter to the helicopter's altitude;
step a4: in the obstacle data processing module (104), the frames containing the longitude/latitude coordinates and height of each obstacle corner transmitted from the information fusion module (103) are received; all frames with the same obstacle sequence number are arranged into the same queue, and the data are taken out of the queue once the number of frames in it reaches a threshold (taken here as 5); the Mahalanobis distance reflecting each point's longitude, latitude and height is calculated with formula (2), formula (3) is used to judge whether an outlier exists, and outliers, if any, are eliminated:
where μ_k is the k-th vector of obstacle longitude, latitude and height in the queue, D_M is the Mahalanobis distance between μ_k and the vectors before it, Σ⁻¹ is the inverse of the covariance matrix of the feature vectors, P_M is the probability density at the current Mahalanobis distance, and P_Tr is the truncation probability-density threshold, taken as 0.9 in this application; the binary number b_k indicates whether the detection point is an outlier: b_k = 0 means the point is not an outlier, and b_k = 1 means it is judged to be an outlier;
after the elimination, the remaining points in the queue are fed in update-time order into formula (4) to obtain a new obstacle information queue, which is then fed into formula (5) to obtain the final output point; the queue for that obstacle sequence number is emptied and again awaits incoming obstacle data:
where μ̂⁻ is the obstacle position-and-height vector at the previous detection, μ is the obstacle longitude/latitude and height at the current detection, and μ̂ is the result of the first-order filtering; ω is the weight matrix, μ_i is the i-th item in the queue of vectors containing the obstacle longitude, latitude and height under the current sequence number, λ_i is its forgetting coefficient, and the final result is the weighted average obtained after superposing the forgetting coefficients;
and the final output point of the corresponding sequence number is updated into the onboard terrain database (105) for the helicopter ground proximity warning system (106) to use.
The application also provides a storage medium storing a computer program or instructions that, when executed, implement the above obstacle detection method for helicopter ground proximity warning.
Compared with the prior art, the application has the following beneficial effects:
(1) The active obstacle detection method for helicopter ground proximity warning can be applied to helicopters performing low-altitude and ultra-low-altitude flight tasks in cities, allowing the helicopter to carry out its current flight task to the greatest extent while guaranteeing flight safety. Compared with the traditional method of detecting obstacles in the helicopter's forward-looking direction by laser ranging, the method perceives the helicopter's flight area more comprehensively, and lets the pilot know the obstacle's type, longitude, latitude and height and update them into the airborne terrain database.
(2) Compared with the traditional method of terrain perception through an expensive geodetic radar, the method is specifically optimized for urban scenes and obstacle detection tasks, requires no professional geodetic radar, and markedly reduces installation and purchase cost.
Drawings
The foregoing and/or other advantages of the application will become more apparent from the following detailed description of the application when taken in conjunction with the accompanying drawings and detailed description.
FIG. 1 is a schematic diagram of the apparatus of the present application.
Fig. 2 is a diagram showing the hardware connection of the present application.
Fig. 3 is a schematic diagram of an example of active detection of an obstacle in a helicopter city of the present application.
Fig. 4 is a graph of example results of active detection of obstacles in a helicopter city in accordance with the present application.
Detailed Description
The obstacle detection device for helicopter ground proximity warning provided by the application, whose principle is shown in Fig. 1 and Fig. 2, comprises a visual detection module 101, a radar detection module 102, an information fusion module 103, an obstacle data processing module 104, an onboard terrain database 105 and a helicopter ground proximity warning system 106.
The visual detection module 101 performs obstacle detection and tracking on the images returned by the onboard camera using an image detection neural network and a multi-target tracking algorithm, obtaining the detection box, type and sequence number of the obstacles in each image; it obtains the azimuth of the obstacle relative to the helicopter's forward-looking direction from the distance between the center point of the obstacle detection box and the vertical centerline of the image together with the camera's field of view, and extracts, by that azimuth, the obstacle-to-helicopter distance detected by the radar detection module 102. The information fusion module 103 fuses the visual detection-and-tracking information with the radar distance information, and then calculates the obstacle's longitude/latitude extent and height from the helicopter's current pose. The obstacle data processing module 104 places obstacle information with the same sequence number into the same queue for processing, eliminates outliers to obtain the processed obstacle information, and inputs it into the onboard terrain database 105 for use by the helicopter ground proximity warning system 106.
The application provides an active obstacle detection method for helicopter ground proximity warning, which specifically comprises the following steps:
in step a1, in the visual detection module 101, firstly, a common obstacle data set in a city needs to be manufactured, each obstacle in the data set is subjected to label division and obstacle edge detection frame domain division, and after the completion, the data set is subjected to segmentation and normalization operations, and is divided into a training set and a test set.
Further, the image neural network is adjusted so that its final output is the corner pixel coordinates of the obstacle detection box and the obstacle type probabilities. A suitable learning rate and step size are selected, and the model converges through iterative training, yielding weights able to detect urban obstacles rapidly and accurately.
In step a2, in the radar detection module 102, using the obstacle pixel coordinates obtained by the visual detection module 101 and the correspondence between the camera's field of view and pixel length, the azimuth of the obstacle relative to the helicopter's forward-looking direction is obtained from the difference between the abscissa of the center of the obstacle detection box and the abscissa of the image center.
Further, the radar detection distance is extracted by the obtained azimuth: all distance data of the latest timestamp within the corresponding azimuth plus the radar's azimuth accuracy error are extracted, their mean is computed, outliers are removed using the mean, and the mean is recomputed to give the final distance between the helicopter and the corresponding obstacle.
In step a3, in the information fusion module 103, the data transmitted by the visual detection module 101 and the radar detection module 102 are integrated to obtain all detection information frames for a given obstacle, where each frame contains an obstacle sequence number, obstacle pixel coordinates, obstacle type, obstacle-to-helicopter distance and azimuth.
The correspondence between a pixel and the actual length is obtained through formula (1):
where η_pix2meter is the conversion coefficient between pixels and actual size; d_rel is the relative distance between the helicopter and the obstacle; θ is the angle between the longitude direction and the obstacle in the world coordinate system; x_1 is the abscissa of the lower-left corner of the obstacle bounding box in the image coordinate system; x_2 is the abscissa of the upper-right corner of the obstacle bounding box in the image coordinate system; h_IMG_WIDTH is the image pixel width;
calculating the height and width of the obstacle relative to the helicopter according to the corresponding relation between the pixel points and the actual length, and setting the thickness and the width of the obstacle to be equal;
The longitude and latitude coordinates of the obstacle center point are calculated from the correspondence between the local longitude-latitude and actual length together with the helicopter's longitude, latitude and altitude; the longitude and latitude coordinates of each obstacle corner point are then calculated from the obstacle's geometry, and the actual obstacle height is obtained by adding the obstacle's height relative to the helicopter to the helicopter's altitude;
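The "correspondence between the local longitude-latitude and the actual length" can be illustrated with a flat-Earth approximation (a sketch under assumptions; the patent does not state which geodetic model it uses, and the spherical-Earth radius and corner layout here are illustrative):

```python
import math

EARTH_R = 6371000.0  # assumed mean Earth radius, meters

def offset_latlon(lat_deg, lon_deg, north_m, east_m):
    """Shift a latitude/longitude by metric offsets using a local
    flat-Earth approximation, valid over short ranges."""
    dlat = math.degrees(north_m / EARTH_R)
    dlon = math.degrees(east_m / (EARTH_R * math.cos(math.radians(lat_deg))))
    return lat_deg + dlat, lon_deg + dlon

def obstacle_corners(center_lat, center_lon, width_m):
    """Top-view corners of an obstacle whose thickness is set equal to
    its width, centered on the computed center point."""
    h = width_m / 2.0
    top_left = offset_latlon(center_lat, center_lon, +h, -h)
    bottom_right = offset_latlon(center_lat, center_lon, -h, +h)
    return top_left, bottom_right
```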
Step a4: in the obstacle data processing module 104, the longitude-latitude coordinates and altitude information frames of each obstacle corner point transmitted from the information fusion module 103 are obtained; data frames with the same obstacle sequence number are arranged into the same queue, and when the number of frames in a queue reaches a threshold (taken here as 5), the data are taken out of the queue. The Mahalanobis distance reflecting the longitude, latitude and altitude of each point is calculated using formula (2); formula (3) is used to judge whether an outlier exists, and if so, the outlier is eliminated:
wherein,for the kth in the queueComprising the longitude and latitude of the obstacle and the height vector +.>And (4) the vector before it>Sigma of (c) is a mahalanobis distance -1 Is the covariance matrix among eigenvectors, P M P is the current Markov distance probability density Tr To truncate the probability density threshold, the application takes 0.9, binary number b k For determining whether the detected point is an outlier, b k A value of 0 indicates that the detection point is not an outlier, b k When the detection point is 1, the detection point is judged to be an outlier;
After the elimination operation is finished, the remaining points of the queue are fed, in update-time order, into formula (4) to obtain a new obstacle information queue, which is then input into formula (5) to obtain the final output point; the obstacle sequence-number queue is then emptied and waits for obstacle data to enter the queue again:
wherein,for the obstacle position height vector at the last detection, +.>For the longitude and latitude and altitude information of the obstacle during the current detection, < > the present application>The result obtained by the first-order filtering is obtained; omega is weight matrix, mu i Queue for vectors containing longitude, latitude and height of obstacle in current sequence numberItem i->Forgetting coefficient of->The obtained result is weighted and averaged after the forgetting coefficient is overlapped;
the final output points of the corresponding serial numbers are updated into the onboard terrain database 105 for the helicopter ground proximity warning system 106 to operate.
Table 1 shows example results of the multi-sensor-fusion-based active urban obstacle detection method for helicopter ground proximity warning; the method achieves a good obstacle detection effect and high accuracy in the helicopter's forward-looking direction.
TABLE 1
Obstacle 1                                            lat_top (°)    lat_bottom (°)    lon_left (°)    lon_right (°)    hgt (m)
True longitude, latitude and altitude                 31.155509      31.155179         118.235371      118.234579       147
Actively detected longitude, latitude and altitude    31.155488      31.155150         118.235401      118.234547       152.4
In Table 1, lat_top is the latitude of the upper-left corner point of the obstacle top view; lon_left is the longitude of the upper-left corner point; lat_bottom is the latitude of the lower-right corner point; lon_right is the longitude of the lower-right corner point; hgt is the obstacle height. Fig. 3 is a schematic diagram of an example of active urban obstacle detection by a helicopter according to the present application (the abscissa in Fig. 3 represents longitude and the ordinate latitude). Fig. 4 is a graph of example results of active urban obstacle detection according to the present application. The results show that the method of the present application produces a good technical effect and provides a reliable means of helicopter obstacle detection.
In a specific implementation, the present application provides a computer storage medium and a corresponding data processing unit. The computer storage medium stores a computer program which, when executed by the data processing unit, performs some or all of the steps of the obstacle detection method for helicopter ground proximity warning provided by the present application. The storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), a random access memory (RAM), or the like.
It will be apparent to those skilled in the art that the technical solutions in the embodiments of the present application may be implemented by means of a computer program and a corresponding general hardware platform. Based on such an understanding, the technical solutions may be embodied essentially in the form of a computer program, i.e. a software product, which may be stored in a storage medium and includes several instructions causing a device containing a data processing unit (which may be a personal computer, a server, a single-chip microcomputer (MCU), a network device, or the like) to perform the methods described in the embodiments or parts of the embodiments of the present application.
The present application provides an obstacle detection method, device and storage medium for helicopter ground proximity warning, and there are many ways to implement the technical solution; the above description is only a preferred embodiment of the present application. It should be noted that a person skilled in the art can make several improvements and modifications, and these should also be regarded as falling within the protection scope of the present application. Components not explicitly described in this embodiment can be implemented using the prior art.

Claims (5)

1. The obstacle detection method for the ground proximity warning of the helicopter is characterized by comprising the following steps of:
step 1, performing obstacle detection and tracking on images returned by an airborne camera by using an image detection neural network and a multi-target tracking algorithm to obtain corner pixel coordinates, an obstacle type and an obstacle sequence number of an obstacle in each image; obtaining azimuth angle information of forward looking directions of the obstacle and the helicopter through the relation between the pixel coordinates of the corner points of the obstacle and the distance between the center line of the image width and the view angle of the camera, and extracting the distance between the obstacle and the helicopter, which is obtained by detection of the airborne millimeter wave radar, through the azimuth angle information;
step 2, information fusion processing is carried out to obtain longitude and latitude information of the obstacle;
the step 2 comprises the following steps: the method comprises the steps that all detection information frames aiming at obstacles are obtained by integrating airborne camera return data and airborne millimeter wave radar return data, wherein information in the frames comprises obstacle serial numbers, obstacle pixel coordinates, obstacle types, obstacle-to-helicopter distances and azimuth angles;
the correspondence between a pixel and the actual length is obtained through the following formula (1):
wherein η_pix2meter is the conversion coefficient between pixels and actual size; d_rel is the relative distance between the helicopter and the obstacle; θ is the angle between the longitude direction and the obstacle in the world coordinate system; x_1 is the lower-left abscissa of the obstacle boundary detection frame in the image coordinate system; x_2 is the upper-right abscissa of the obstacle boundary detection frame in the image coordinate system; h_IMG_WIDTH is the image pixel width;
calculating the height and width of the obstacle relative to the helicopter according to the corresponding relation between the pixel points and the actual length, and setting the thickness and the width of the obstacle to be equal;
calculating longitude and latitude coordinates of a central point of the obstacle according to the corresponding relation between the local longitude and latitude and the actual length and the longitude and latitude and height information of the helicopter, calculating the longitude and latitude coordinates of each angular point of the obstacle according to the geometrical relation of the obstacle, and obtaining the actual height of the obstacle through superposition of the height of the helicopter and the height of the obstacle relative to the height of the helicopter;
step 3, carrying out data processing on the longitude and latitude information of the obstacle obtained in the step 2, and finishing obstacle detection;
the step 3 comprises the following steps: obtaining longitude and latitude coordinates and height information frames of each angular point position of the obstacle transmitted from the information fusion module (103), arranging data frames with the same obstacle sequence number into the same queue, and taking out data in the queue when the number of frames in the queue reaches a threshold value; calculating to obtain the mahalanobis distance quantity reflecting the longitude, latitude and height of each point by using a formula (2), judging whether an outlier exists by using a formula (3), and eliminating the outlier if the outlier exists:
wherein,for the kth in the queue including obstacle longitude and latitude and altitude vector +.>With its previous vectorSigma of (c) is a mahalanobis distance -1 Is the covariance matrix among eigenvectors, P M P is the current Markov distance probability density Tr To truncate the probability density threshold, binary number b k For determining whether the detected point is an outlier, b k A value of 0 indicates that the detection point is not an outlier, b k When the detection point is 1, the detection point is judged to be an outlier;
after the elimination operation is finished, the remaining points of the queue are fed, in update-time order, into formula (4) to obtain a new obstacle information queue, which is then input into formula (5) to obtain the final output point; the obstacle sequence-number queue is then emptied and waits for obstacle data to enter the queue again:
wherein,the height direction of the obstacle position at the last detectionQuantity (S)>For the longitude and latitude and altitude information of the obstacle during the current detection, < > the present application>The result obtained by the first-order filtering is obtained; omega is weight matrix, mu i The i < th > item in the queue which is the vector containing the longitude and latitude and the height of the obstacle in the current sequence number>Forgetting coefficient of->The result obtained by the weighted average is obtained after the forgetting coefficient is overlapped.
2. The method according to claim 1, wherein in step 1, firstly, a common obstacle data set in a city is produced, label division and obstacle corner pixel coordinate division are performed on each obstacle in the data set, and after the completion, division and normalization operations are performed on the data set, and the data set is divided into a training set and a test set;
adjusting the image neural network, and adjusting the output of the multi-target detection image neural network into the pixel coordinates of the corner points of the obstacle and the probability of the type of the obstacle; training to enable the neural network model to achieve convergence, and obtaining the weight of the neural network model;
according to the corresponding relation between the camera field angle and the pixel length, calculating to obtain the forward looking direction azimuth angle of the obstacle and the helicopter through the difference value between the horizontal coordinate of the center position of the obstacle edge detection frame and the horizontal coordinate of the center point of the image;
and extracting the radar detection distance according to the azimuth, extracting all distance data of the last time of corresponding azimuth plus radar azimuth precision error of the timestamp, carrying out average value calculation, removing outliers through the average value, and carrying out calculation again to obtain the final distance data between the helicopter and the corresponding obstacle.
3. The method of claim 2, wherein in step 3, step 3 is performed again each time the queue for the same obstacle sequence number reaches the threshold.
4. The obstacle detection device for the ground proximity warning of the helicopter is characterized by comprising a visual detection module (101), a radar detection module (102), an information fusion module (103), an obstacle data processing module (104), an airborne topography database (105) and a ground proximity warning system (106) of the helicopter;
the visual detection module (101) uses an image detection neural network and a multi-target tracking algorithm to detect and track the obstacle of the image returned by the airborne camera, so as to obtain the pixel coordinates of the corner point, the type and the serial number of the obstacle in each image; obtaining azimuth angle information of forward looking directions of the obstacle and the helicopter through the relation between the pixel coordinates of the corner points of the obstacle and the distance between the center line of the width of the image and the view angle of the camera, and extracting the distance between the obstacle and the helicopter, which is detected by a radar detection module (102), through the azimuth angle information;
the information fusion module (103) fuses the information obtained by the visual detection module (101), and then calculates the longitude and latitude range and the height of the obstacle according to the current pose of the helicopter;
the obstacle data processing module (104) divides obstacle information containing the same serial number into the same queue and processes the obstacle information, eliminates outliers and obtains processed obstacle information, the processed obstacle information is input into the onboard terrain database (105) for the helicopter ground proximity warning system (106) to use, the onboard terrain database (105) is scanned in real time by the helicopter ground proximity warning system (106) to store terrain elevation data of surrounding environment of the current position of the helicopter, collision threat prediction is carried out based on the terrain elevation data, if threat exists, warning prompt pictures and sounds are sent to the warning display panel, and if no threat exists, warning prompt pictures and sounds are not sent;
the device performs the steps of:
in a visual detection module (101), firstly, a common obstacle data set in a city is manufactured, each obstacle in the data set is subjected to label division and obstacle corner pixel coordinate division, after the completion, the data set is subjected to segmentation and normalization operation, and the data set is divided into a training set and a test set;
adjusting the image neural network, wherein the final output result of the adjustment image neural network is the barrier corner pixel coordinates and the barrier type probability; selecting the learning rate and the learning step length of the neural network model, and converging the neural network model through iterative operation to obtain the neural network model weight capable of detecting the obstacle in the city;
step a2, in the radar detection module (102), according to the pixel coordinates of the obstacle angle points obtained by the vision detection module (101), according to the corresponding relation between the camera view angle and the pixel length, calculating to obtain azimuth angles of the obstacle and the forward looking direction of the helicopter through the difference value between the horizontal coordinates of the center position of the obstacle edge detection frame and the horizontal coordinates of the center point of the image;
extracting radar detection distance according to the azimuth, extracting all distance data of the corresponding azimuth of the latest time stamp plus radar azimuth precision error, carrying out average value calculation, removing outliers through the average value, and carrying out calculation again to obtain final helicopter and corresponding obstacle distance data;
step a3, in the information fusion module (103), all detection information frames aiming at the obstacle are obtained by integrating the returned data of the visual detection module (101) and the returned data of the radar detection module (102), wherein the information in the frames comprises an obstacle sequence number, an obstacle pixel coordinate, an obstacle type, the distance between the obstacle and the helicopter and an azimuth angle;
the corresponding relation between the pixel point and the actual length is obtained through the following formula:
wherein eta pix2meter Conversion coefficients for pixels and actual sizes; d, d rel Is a helicopter and an obstacleThe relative distance of the objects; θ is the angle between the longitude direction and the obstacle in the world coordinate system; x is x 1 The left lower corner abscissa of the obstacle boundary detection frame in the image coordinate system; x is x 2 The right upper corner abscissa of the obstacle boundary detection frame in the image coordinate system; h is a IMG_WIDTH Is the image pixel width;
calculating the height and width of the obstacle relative to the helicopter according to the corresponding relation between the pixel points and the actual length, and setting the thickness and the width of the obstacle to be equal;
calculating longitude and latitude coordinates of a central point of the obstacle according to the corresponding relation between the local longitude and latitude and the actual length and the longitude and latitude and height information of the helicopter, calculating the longitude and latitude coordinates of each angular point of the obstacle according to the geometrical relation of the obstacle, and obtaining the actual height of the obstacle through superposition of the height of the helicopter and the height of the obstacle relative to the height of the helicopter;
step a4, in the obstacle data processing module (104), obtaining longitude and latitude coordinates and height information frames of each angular point position of the obstacle transmitted from the information fusion module (103), arranging each same obstacle serial number data frame into the same queue, and taking out the data in the queue when the number of frames in the queue reaches a threshold value; calculating to obtain the mahalanobis distance quantity reflecting the longitude, latitude and height of each point by using a formula (2), judging whether an outlier exists by using a formula (3), and eliminating the outlier if the outlier exists:
wherein,for the kth in the queue including obstacle longitude and latitude and altitude vector +.>With its previous vectorSigma of (c) is a mahalanobis distance -1 Is the covariance matrix among eigenvectors, P M P is the current Markov distance probability density Tr To truncate the probability density threshold, binary number b k For determining whether the detected point is an outlier, b k A value of 0 indicates that the detection point is not an outlier, when b k When the detection point is 1, the detection point is judged to be an outlier;
after the elimination operation is finished, the other points of the queue are processed according to the update time sequence input formula (4) to obtain a new obstacle information queue, then the new obstacle information queue is input into the formula (5) to obtain a final output point, the obstacle sequence number queue is emptied, and the obstacle data information input queue is waited again:
wherein,for the obstacle position height vector at the last detection, +.>For the longitude and latitude and altitude information of the obstacle during the current detection, < > the present application>The result obtained by the first-order filtering is obtained; omega is weight matrix, mu i The i < th > item in the queue which is the vector containing the longitude and latitude and the height of the obstacle in the current sequence number>Forgetting coefficient of->The obtained result is weighted and averaged after the forgetting coefficient is overlapped;
and updating the final output point with the corresponding serial number into an onboard terrain database (105) for the helicopter ground proximity warning system (106) to work.
5. A storage medium storing a computer program or instructions which, when executed, implement the method of any one of claims 1 to 3.
CN202210872811.5A 2022-07-21 2022-07-21 Obstacle detection method, device and storage medium for helicopter ground proximity warning Active CN115440094B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210872811.5A CN115440094B (en) 2022-07-21 2022-07-21 Obstacle detection method, device and storage medium for helicopter ground proximity warning

Publications (2)

Publication Number Publication Date
CN115440094A CN115440094A (en) 2022-12-06
CN115440094B true CN115440094B (en) 2023-11-07

Family

ID=84241462

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210872811.5A Active CN115440094B (en) 2022-07-21 2022-07-21 Obstacle detection method, device and storage medium for helicopter ground proximity warning

Country Status (1)

Country Link
CN (1) CN115440094B (en)

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6243482B1 (en) * 1996-02-13 2001-06-05 Dornier Gmbh Obstacle detection system for low-flying airborne craft
CN101968354A (en) * 2010-09-29 2011-02-09 清华大学 Laser detection and image identification based unmanned helicopter distance measuring method
CN105173095A (en) * 2015-09-07 2015-12-23 国网通用航空有限公司 Barrier avoiding system of helicopter
CN107402578A (en) * 2017-06-21 2017-11-28 中国科学院深圳先进技术研究院 Unmanned plane panorama obstacle cognitive method, device, equipment and storage medium
CN108227738A (en) * 2017-12-28 2018-06-29 湖北电鹰科技有限公司 A kind of unmanned plane barrier-avoiding method and system
CN109034018A (en) * 2018-07-12 2018-12-18 北京航空航天大学 A kind of low latitude small drone method for barrier perception based on binocular vision
CN110135485A (en) * 2019-05-05 2019-08-16 浙江大学 The object identification and localization method and system that monocular camera is merged with millimetre-wave radar
KR102260240B1 (en) * 2019-12-06 2021-06-03 한화시스템 주식회사 Terrain following flight method
CN113327297A (en) * 2021-06-30 2021-08-31 湖南科技大学 Deep sea seabed obstacle measuring system and recognition method based on deep learning
CN114266724A (en) * 2021-11-16 2022-04-01 中国航空工业集团公司雷华电子技术研究所 High-voltage line detection method based on radar infrared visible light image fusion
CN114428515A (en) * 2022-01-24 2022-05-03 广东电网有限责任公司 Unmanned aerial vehicle obstacle avoidance method and device, unmanned aerial vehicle and storage medium

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10453351B2 (en) * 2017-07-17 2019-10-22 Aurora Flight Sciences Corporation System and method for detecting obstacles in aerial systems

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Research and Implementation of Vision-Based UAV Highway Violation Recognition Technology; Zeng Diqing; China Master's Theses Full-text Database, Engineering Science and Technology II; pp. 9-48 *

Also Published As

Publication number Publication date
CN115440094A (en) 2022-12-06

Similar Documents

Publication Publication Date Title
US10210401B2 (en) Real time multi dimensional image fusing
CN108226951B (en) Laser sensor based real-time tracking method for fast moving obstacle
EP3745158B1 (en) Methods and systems for computer-based determining of presence of dynamic objects
KR20210111180A (en) Method, apparatus, computing device and computer-readable storage medium for positioning
US11860281B2 (en) Methods and systems for filtering data points when merging LIDAR sensor datasets
EP3875905B1 (en) Method, device and medium for detecting environmental change
EP3845927B1 (en) Merging multiple lidar point cloud data using an iterative closest point (icp) algorithm with weighting factor
US11860315B2 (en) Methods and systems for processing LIDAR sensor data
US11608058B2 (en) Method of and system for predicting future event in self driving car (SDC)
CN108844538B (en) Unmanned aerial vehicle obstacle avoidance waypoint generation method based on vision/inertial navigation
US20210197853A1 (en) Method of and system for computing data for controlling operation of self driving car (sdc)
EP4002300A1 (en) Kurtosis based pruning for sensor-fusion systems
CN114325634A (en) Method for extracting passable area in high-robustness field environment based on laser radar
CN111508282A (en) Low-altitude unmanned farmland operation flight obstacle conflict detection method
Lombaerts et al. Adaptive multi-sensor fusion based object tracking for autonomous urban air mobility operations
CN115440094B (en) Obstacle detection method, device and storage medium for helicopter ground proximity warning
US10330769B1 (en) Method and apparatus for geolocating emitters in a multi-emitter environment
CN116385997A (en) Vehicle-mounted obstacle accurate sensing method, system and storage medium
CN115965847A (en) Three-dimensional target detection method and system based on multi-modal feature fusion under cross view angle
US20220089166A1 (en) Motion state estimation method and apparatus
Dudek et al. Cloud Detection System for UAV Sense and Avoid: Cloud Distance Estimation using Triangulation
CN110838246B (en) Airborne SVS obstacle warning display method
Glozman et al. A vision-based solution to estimating time to closest point of approach for sense and avoid
Forlenza Vision based strategies for implementing Sense and Avoid capabilities onboard Unmanned Aerial Systems
CN112162550B (en) Three-dimensional target tracking method for active safety collision avoidance of automobile

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant