CN114967731A - Unmanned aerial vehicle-based automatic field personnel searching method - Google Patents

Unmanned aerial vehicle-based automatic field personnel searching method

Info

Publication number
CN114967731A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
pixel
personnel
flight
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210383003.2A
Other languages
Chinese (zh)
Inventor
李易平
郑恩辉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China Jiliang University
Original Assignee
China Jiliang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China Jiliang University
Priority to CN202210383003.2A
Publication of CN114967731A
Legal status: Pending

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/10Terrestrial scenes
    • G06V20/17Terrestrial scenes taken from planes or by drones

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Remote Sensing (AREA)
  • Evolutionary Computation (AREA)
  • Automation & Control Theory (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention discloses an unmanned aerial vehicle-based method for automatically searching for people in the field. The unmanned aerial vehicle receives a route flight instruction sent by a handheld terminal device and flies the route within a search area; during the route flight it captures infrared images of the search area with its camera, processes them, and identifies whether a person is present in the image through thermal-imaging threshold analysis and trapped-person motion analysis; according to the image recognition result, the flight attitude of the unmanned aerial vehicle is controlled to track and photograph the person, and the person's position information and environment information are acquired and sent to the handheld terminal device and a cloud server. The invention can accurately identify people trapped in the field even when there are many obstructions; by fully automatically recognizing people and acquiring surrounding environment information it enables precise rescue of trapped people, reducing both cost and manpower.

Description

Unmanned aerial vehicle-based automatic field personnel searching method
Technical Field
The invention relates to unmanned aerial vehicle applications in the field of search-and-rescue technology, and in particular to an unmanned aerial vehicle-based method for automatically searching for people in the field.
Background
In recent years, missing-person incidents have occurred from time to time in field exploration, hikers becoming trapped, and sudden natural disasters. Traditional manual search and rescue has exposed problems such as high cost and low search speed, especially in terrain too complex for vehicles and search instruments to enter, which has driven the application of intelligent unmanned aerial vehicle search-and-rescue systems to field rescue. At the present stage, however, unmanned aerial vehicle field searches are still manually controlled: carpet-style searches require search-and-rescue personnel at a base station to judge the images returned by the unmanned aerial vehicle with the naked eye, which consumes a great deal of manpower and material resources, and ordinary cameras cannot accurately distinguish trapped people from the background.
Disclosure of Invention
In order to solve the problems in the background art, the invention aims to provide an unmanned aerial vehicle-based method for automatically searching for people in the field, which can accurately identify people trapped in the field even when there are many obstructions; by fully automatically recognizing people and acquiring surrounding environment information it enables precise rescue of trapped people, reducing both cost and manpower and material resources.
The technical scheme adopted by the invention is as follows:
the unmanned aerial vehicle receives a route flight instruction sent by the handheld terminal device and flies the route within the search area;

during the route flight the unmanned aerial vehicle captures infrared images of the search area with its camera, performs image processing on them, and identifies whether a person is present in the image through thermal-imaging threshold analysis and trapped-person motion analysis;

when it is determined that a person is present in the image, the flight attitude of the unmanned aerial vehicle is controlled according to the image recognition result to track and photograph the person, and the person's position information and environment information are acquired and sent to the handheld terminal device;

and the handheld terminal device sends the position information and the environment information to a cloud server of a remote monitoring center over a 5G network.
The route flight instruction is generated after the GPS coordinates of the search area are calibrated, and the handheld terminal device sends it to the unmanned aerial vehicle through the data transmission module.
Preferably, the route flight instruction is generated by a custom APP installed on the handheld terminal, which calibrates the GPS coordinates of the search area; the calibrated waypoints are displayed on a Gaode (AMap) map. The route flight instruction is then sent to the unmanned aerial vehicle through the data transmission module, and the unmanned aerial vehicle begins executing the route task and acquiring infrared images.
Frames are extracted from the infrared video acquired by the unmanned aerial vehicle over the search area, the frame-extracted video is transcoded to H.264 format, and the transcoded video is stored under a newly generated file name.
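By way of illustration only (this sketch is not part of the disclosure), the frame-extraction and transcoding steps could look as follows in Python, using OpenCV and the ffmpeg command-line tool; the frame-skip interval, the file paths, and the choice of the libx264 encoder are assumptions, since the publication does not name the tools used.

import subprocess
import cv2

def extract_frames(video_path, step=5):
    # Yield every step-th frame of the captured infrared video.
    # step=5 is an assumed interval; the publication does not specify it.
    cap = cv2.VideoCapture(video_path)
    idx = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            yield frame
        idx += 1
    cap.release()

def transcode_to_h264(src_path, dst_path):
    # Re-encode the frame-extracted video to H.264 and store it under a new file name.
    subprocess.run(["ffmpeg", "-y", "-i", src_path, "-c:v", "libx264", dst_path], check=True)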
Histogram equalization and background denoising are applied to the infrared images of the search area in real time, and thermal-imaging threshold analysis and person motion analysis are then performed separately to obtain candidate person ROI pixel regions, wherein:
the person ROI pixel region obtained by thermal-imaging threshold analysis is marked as the Rt region;

the person ROI pixel region obtained by trapped-person motion analysis is marked as the Rm region;
the Rt region and the Rm region are fused into a union region; the union region is denoised, its target contour is extracted, and a minimum solid-line rectangular frame is drawn;

and the minimum solid-line rectangular frame is input into a pre-trained person recognition model, which outputs a recognition result.
The person recognition model can be a convolutional neural network model, and the recognition result indicates whether a person is present in the minimum solid-line rectangular frame.
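By way of illustration, a minimal Python/OpenCV sketch of this fusion-and-recognition step follows; the use of morphological opening as the denoising step and the classify_crop interface standing in for the pre-trained convolutional neural network are assumptions, since the publication does not specify them.

import cv2
import numpy as np

def detect_person_boxes(ir_image, rt_mask, rm_mask, classify_crop):
    # Fuse the Rt and Rm binary masks into a union region.
    union = cv2.bitwise_or(rt_mask, rm_mask)
    # Morphological opening as the denoising step (assumed; the text only says
    # the union region is denoised).
    kernel = np.ones((3, 3), np.uint8)
    union = cv2.morphologyEx(union, cv2.MORPH_OPEN, kernel)
    # Extract target contours and draw a minimal upright rectangle around each.
    contours, _ = cv2.findContours(union, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    person_boxes = []
    for contour in contours:
        x, y, w, h = cv2.boundingRect(contour)
        # classify_crop stands in for the pre-trained person-recognition model
        # and returns True when the crop contains a person.
        if classify_crop(ir_image[y:y + h, x:x + w]):
            person_boxes.append((x, y, w, h))
    return person_boxes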
The thermal-imaging threshold analysis performs binary segmentation of the infrared image with a preset threshold, applies an opening operation and a closing operation to the binary infrared image, and then runs a connected-domain operation to obtain the connected domains;

only connected domains whose area is larger than the reference area S computed by the following formula are kept as connected-domain pixel ROI regions, which are set as the Rt region, wherein:
S=0.0025×r×c
where r is the width of the largest-area connected domain and c is its height.
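A minimal Python/OpenCV sketch of this thermal-threshold step is given below for illustration; the binarization threshold of 200 is an assumption (the publication only says a preset threshold is used), and r and c are read here as the bounding-box width and height of the largest connected domain.

import cv2
import numpy as np

def thermal_threshold_rois(ir_gray, preset_thresh=200):
    # Binary segmentation with the preset threshold (value assumed).
    _, binary = cv2.threshold(ir_gray, preset_thresh, 255, cv2.THRESH_BINARY)
    # Opening and closing operations to clean the binary image.
    kernel = np.ones((3, 3), np.uint8)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    # Connected-domain operation; label 0 is the background.
    n, _, stats, _ = cv2.connectedComponentsWithStats(binary)
    if n <= 1:
        return []
    largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
    r = stats[largest, cv2.CC_STAT_WIDTH]
    c = stats[largest, cv2.CC_STAT_HEIGHT]
    s_ref = 0.0025 * r * c  # reference area S from the disclosure
    # Keep only connected domains whose area exceeds S; these form the Rt region.
    return [tuple(stats[i, :4]) for i in range(1, n)
            if stats[i, cv2.CC_STAT_AREA] > s_ref]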
The trapped-person motion analysis compares, for each pixel at the same position in the infrared image, the difference between its pixel values in two consecutive frames, and takes a pixel whose difference value exceeds a preset pixel-difference threshold Q as a hot pixel, the preset pixel-difference threshold being set to 16, wherein:

|I1 − I2| > Q

where I1 is the pixel value of the current frame and I2 is the pixel value of the previous frame of the pixel at the same position;
the hot pixels are grouped by connectivity into hot-pixel ROI regions; the average pixel value M of all pixels in all connected-domain pixel ROI regions obtained in the thermal-imaging threshold analysis is calculated, and the following judgment is made:

when the pixel count of a hot-pixel ROI region exceeds 5% of the pixel count of all connected-domain pixel ROI regions obtained in the thermal-imaging threshold analysis, and the average pixel value of the hot-pixel ROI region is greater than the average pixel value M, the hot-pixel ROI region is taken as an Rm region.
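For illustration, a minimal sketch of this motion-analysis judgment follows; approximating the connected-domain ROI regions by their bounding boxes when accumulating the pixel count and the mean M is an assumption about the bookkeeping, not something the publication states.

import cv2
import numpy as np

Q = 16  # preset pixel-difference threshold from the disclosure

def motion_rois(prev_gray, curr_gray, rt_boxes):
    # rt_boxes: (x, y, w, h) ROI boxes from the thermal-threshold step.
    if not rt_boxes:
        return []
    # Hot pixels: |I1 - I2| > Q between consecutive frames.
    hot = (cv2.absdiff(curr_gray, prev_gray) > Q).astype(np.uint8) * 255
    # Group hot pixels into connected regions; label 0 is the background.
    n, labels, stats, _ = cv2.connectedComponentsWithStats(hot)
    rt_pixel_count = sum(w * h for (_, _, w, h) in rt_boxes)
    # Average pixel value M over the thermal-threshold ROI regions.
    m = np.mean([curr_gray[y:y + h, x:x + w].mean() for (x, y, w, h) in rt_boxes])
    rm_regions = []
    for i in range(1, n):
        # Keep regions above 5% of the Rt pixel count whose mean exceeds M.
        if (stats[i, cv2.CC_STAT_AREA] > 0.05 * rt_pixel_count
                and curr_gray[labels == i].mean() > m):
            rm_regions.append(tuple(stats[i, :4]))
    return rm_regions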
When a person is present in the image, the pixel coordinates of the center of the minimum solid-line rectangular frame are extracted in real time as the person center point (x1, y1), and the pixel coordinates of the center of the infrared image are taken as the image center point (x0, y0), i.e. the center-point pixel position of the infrared camera on the terminal device's image display interface;

from the positional deviation of the person center point (x1, y1) and the image center point (x0, y0) along the vertical coordinate of the infrared image, the camera pitch-angle deviation θy during unmanned aerial vehicle tracking is calculated, wherein:
(formula for the pitch-angle deviation θy; it appears only as an image, BDA0003592590030000031, in the original publication)
where β is the rated pitch angle of the unmanned aerial vehicle's camera;
the corrected yaw angle δ(t) in the horizontal yaw direction of the unmanned aerial vehicle is then obtained from the tracking algorithm given by the following formula;
(formula for the corrected yaw angle δ(t); it appears only as an image, BDA0003592590030000032, in the original publication)
where δ(t) is the corrected yaw angle at time t; θx(t) is the yaw-angle deviation of the unmanned aerial vehicle at time t, calculated in the same way as the pitch-angle deviation θy; λ is an adjustment parameter; d(t) is the pre-aiming distance deviation at time t; and v(t) is the current speed of the unmanned aerial vehicle;
the pre-aiming distance deviation d (t) is obtained by calculation according to the following formula:
(formula for the pre-aiming distance deviation d(t); it appears only as an image, BDA0003592590030000033, in the original publication)
where a_max is the maximum braking deceleration of the unmanned aerial vehicle, v(t) is its current speed, Ke is a travel-distance parameter for when the unmanned aerial vehicle encounters abnormal reaction conditions (set to 0.1 in this implementation), and Rmin is the minimum turning radius of the unmanned aerial vehicle;
the pitch-angle deviation θy and the corrected yaw angle δ(t) are fed in real time into the pitch and yaw control channels of the unmanned aerial vehicle and added to the original pitch and yaw control quantities, thereby achieving accurate tracking and snapshot control.
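Because the θy and δ(t) formulas appear only as images in the publication, the following sketch assumes simple proportional forms built from the symbols defined in the text; it illustrates how the corrections would be computed and applied, and is not the patented formulas.

def tracking_corrections(person_center, image_center, beta, theta_x, lam, d_t, v_t, k_pitch=0.001):
    # person_center = (x1, y1), image_center = (x0, y0) in pixel coordinates.
    (_, y1), (_, y0) = person_center, image_center
    # Pitch-angle deviation from the vertical pixel offset, scaled by the rated
    # pitch angle beta; k_pitch and this proportional form are assumptions.
    theta_y = k_pitch * (y1 - y0) * beta
    # Corrected yaw angle: yaw deviation plus a speed-scaled preview term using
    # lam, d(t) and v(t); this structure is likewise an assumption.
    delta_t = theta_x + lam * d_t / max(v_t, 1e-6)
    return theta_y, delta_t

The returned θy and δ(t) values would then be added to the existing pitch and yaw control quantities of the two control channels, as the disclosure describes.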
During this process, adjustment of the pre-aiming distance deviation d(t) lets the unmanned aerial vehicle, in cooperation with its obstacle-avoidance module, track more stably while avoiding obstacles, keeping the target person accurately locked on the terminal device's image display interface.
While maintaining a safe altitude above the ground and a safe horizontal distance from the target person, the unmanned aerial vehicle tracks and photographs the target field person in real time and displays the infrared and visible-light monitoring views in real time; the throttle and flight attitude are controlled in real time so that the unmanned aerial vehicle hovers directly above the target person with the camera pointing vertically downward, photographs are then taken with the camera and the target person's GPS coordinates are recorded, and the unmanned aerial vehicle returns to the route task to continue searching.
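A sketch of this hover-and-snapshot sequence is given below; drone is a hypothetical flight-control interface, and every method on it (hover_over, set_gimbal_pitch, shoot_photo, gps, resume_route) is an invented placeholder name used only to make the sequence concrete, not a real SDK call.

def hover_and_record(drone, target_box, safe_altitude_m=15.0):
    # Hover directly above the target at a safe height (value assumed).
    drone.hover_over(target_box, altitude_m=safe_altitude_m)
    drone.set_gimbal_pitch(-90)          # camera pointing vertically downward
    photos = drone.shoot_photo(count=3)  # infrared and visible-light photos
    record = {"photos": photos, "gps": drone.gps()}  # record GPS coordinates
    drone.resume_route()                 # return to the route task and continue searching
    return record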
The unmanned aerial vehicle is a DJI Mavic 2 Enterprise carrying a dual-light camera, consisting of an infrared camera and a visible-light camera. From the moment the target field person is locked, the unmanned aerial vehicle transmits the repeatedly captured environment infrared photos, visible-light photos, and the GPS coordinates of the person's location to the handheld terminal device in real time.
The handheld terminal device sends the captured environment infrared photos, visible-light photos, and the person's GPS coordinates to a cloud server of a remote monitoring center over a 5G network for real-time monitoring.
The invention can accurately identify people trapped in the field even when there are many obstructions; by fully automatically recognizing people and acquiring surrounding environment information it enables precise rescue of trapped people, reducing both cost and human resources.
Compared with the prior art, the unmanned aerial vehicle-based field personnel searching method provided by the embodiment of the invention has the following beneficial effects:
the invention can set the route task on the handheld terminal in a self-defined way, and repeatedly search the areas where people may exist in the field.
The onboard dual-light camera can accurately and fully automatically identify people trapped in the field even when there are many obstructions, and can track and locate field personnel in real time while acquiring surrounding environment information, so that trapped people can be rescued precisely, reducing cost as well as manpower and material resources.
The handheld terminal device uploads field-personnel information to the remote monitoring center's database over the 5G network, so that remote monitoring personnel can further confirm the situation and adopt an effective next-step rescue plan.
Drawings
FIG. 1 is a flow chart of the method of the present invention;
FIG. 2 is a block diagram of an embodiment of the present invention;
FIG. 3 is a control flow diagram of the method of the present invention.
Detailed Description
The invention is further described below with reference to the drawings and embodiments so as to describe the technical solutions in the embodiments of the invention clearly and completely. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the invention.
As shown in fig. 2, the system architecture of the invention includes: an unmanned aerial vehicle, a remote controller, a handheld terminal device, and the monitoring center's cloud server.
The working principle of the scheme is as follows: the unmanned aerial vehicle receives, through the remote controller, a route flight instruction sent by the handheld terminal device and flies along the route within the search area; the unmanned aerial vehicle captures infrared images of the search area and sends them to the handheld terminal device through the remote controller's data transmission module, where the images are processed to identify whether a person is present; when it is determined that a person is present in the image, the remote controller controls the unmanned aerial vehicle's flight attitude to track and photograph the person's environment, and the unmanned aerial vehicle acquires the person's position information and environment information and sends them to the handheld terminal device through the remote controller's data transmission module; the handheld terminal device then sends the position information and environment information over a 5G network to the remote monitoring center's cloud server, where personnel monitor remotely.
Embodiments of the present invention and processes for implementing the same are described below with reference to fig. 1-3:
As shown in fig. 1, an unmanned aerial vehicle-based automatic field personnel searching method includes:

S1: the unmanned aerial vehicle receives a route flight instruction sent by the handheld terminal device and flies the route within the search area;

S2: infrared images of the search area are acquired and processed, and thermal-imaging threshold analysis and trapped-person motion analysis identify whether a person is present in the image;

S3: when it is determined that a person is present in the image, the flight attitude of the unmanned aerial vehicle is controlled to track and photograph the person, and the person's position information and environment information are acquired;

S4: the handheld terminal device sends the position information and the environment information to a cloud server of a remote monitoring center over a 5G network.
As shown in fig. 3, the software flow of the unmanned aerial vehicle-based automatic field personnel search system is as follows:
preferably, the air route flight instruction is generated by developing a GPS coordinate from a defined calibration search area of an APP installed on the handheld terminal, wherein a calibrated waypoint is displayed on a Gade map, then a data transmission module of the unmanned aerial vehicle sends the air route flight instruction to the unmanned aerial vehicle, and the unmanned aerial vehicle starts to execute an air route task and acquire an infrared image.
Frames are extracted from the infrared video acquired by the unmanned aerial vehicle over the search area, the frame-extracted video is transcoded to H.264 format and stored under a newly generated file name, conventional image preprocessing such as histogram equalization and background denoising is applied, and thermal-imaging threshold segmentation analysis and trapped-person motion analysis are then performed to obtain candidate person ROI pixel regions, wherein:
the person ROI pixel region from the thermal-imaging threshold analysis is marked as the Rt region;

the person ROI pixel region from the trapped-person motion analysis is marked as the Rm region;

the Rt region and the Rm region are fused into a union region Rf; Rf is denoised, its target contour is extracted, and a minimum solid-line rectangular frame is drawn for the region in real time;

and the region in the solid-line rectangular frame Rf is input into a pre-trained person recognition model, which outputs the recognition result.
The thermal-imaging threshold analysis performs binary segmentation of the image with a threshold, applies opening and closing operations to the binary image, and then runs a connected-domain operation; only connected domains whose area is larger than the computed reference area S are kept as connected-domain pixel ROI regions, which are set as the Rt region, wherein:
S=0.0025×r×c
the trapped person motion analysis is to compare the difference value between the pixel values of the front frame and the back frame of each pixel point at the same position of the infrared image, and take the pixel point with the difference value Q larger than a preset pixel difference threshold value as a hot pixel, wherein the preset pixel difference threshold value is set to be 16, wherein:
|I 1 -I 2 |>Q
wherein, I 1 Current frame threshold, I, for a pixel at the same location 2 A previous frame threshold value of a pixel point at the same position;
and calculating the average pixel value M of all pixel points in the reserved connected domain pixel ROI area by thermal imaging threshold analysis, and when the number of the pixels in the hot spot pixel ROI area is more than 5% of the number of the pixels in the reserved connected domain pixel ROI area and the average pixel value in the hot spot pixel ROI area is more than the average pixel value M, taking the hot spot pixel ROI area as an Rm area.
When a person is present in the image, the route task is interrupted, the pixel coordinates of the center of the solid-line rectangular frame are calculated in real time and recorded as the person center point (x1, y1), and the pixel coordinates of the center of the infrared image are taken as the image center point (x0, y0), i.e. the center-point pixel position of the infrared camera on the terminal device's image display interface;

from the positional deviation of the person center point (x1, y1) and the image center point (x0, y0) along the vertical coordinate of the infrared image, the camera pitch-angle deviation θy during unmanned aerial vehicle tracking is calculated, wherein:
(formula for the pitch-angle deviation θy; it appears only as an image, BDA0003592590030000061, in the original publication)
where β is the rated pitch angle of the unmanned aerial vehicle's camera gimbal;
the corrected yaw angle δ(t) in the horizontal yaw direction of the unmanned aerial vehicle is then obtained from the tracking algorithm given by the following formula;
(formula for the corrected yaw angle δ(t); it appears only as an image, BDA0003592590030000071, in the original publication)
where δ(t) is the corrected yaw angle at time t; θx(t) is the yaw-angle deviation of the unmanned aerial vehicle at time t, calculated in the same way as the pitch-angle deviation θy; λ is an adjustment parameter; d(t) is the pre-aiming distance deviation at time t; and v(t) is the current speed of the unmanned aerial vehicle;
the pre-aiming distance deviation d (t) is obtained by calculation according to the following formula:
(formula for the pre-aiming distance deviation d(t); it appears only as an image, BDA0003592590030000072, in the original publication)
where a_max is the maximum braking deceleration of the unmanned aerial vehicle, v(t) is its current speed, Ke is a travel-distance parameter for when the unmanned aerial vehicle encounters abnormal reaction conditions (set to 0.1 in this implementation), and Rmin is the minimum turning radius of the unmanned aerial vehicle;
the pitch-angle deviation θy and the corrected yaw angle δ(t) are fed in real time into the pitch and yaw control channels of the unmanned aerial vehicle and added to the original pitch and yaw control quantities, thereby achieving accurate tracking and snapshot control.
During this process, adjustment of the pre-aiming distance deviation d(t) lets the unmanned aerial vehicle, in cooperation with its obstacle-avoidance module, track more stably while avoiding obstacles, keeping the target person accurately locked on the terminal device's image display interface.
While maintaining a safe altitude above the ground and a safe horizontal distance from the target person, the unmanned aerial vehicle tracks and photographs the target field person in real time and displays the infrared and visible-light monitoring views in real time; the throttle and flight attitude are controlled in real time so that the unmanned aerial vehicle hovers directly above the target person with the camera pointing vertically downward, photographs are then taken with the camera and the target person's GPS coordinates are recorded, and the unmanned aerial vehicle returns to the route task to continue searching.
The unmanned aerial vehicle used is a DJI Mavic 2 Enterprise carrying a dual-light camera, consisting of an infrared camera and a visible-light camera. From the moment the target field person is locked, the unmanned aerial vehicle transmits the repeatedly captured environment infrared photos, visible-light photos, and the GPS coordinates of the person's location to the handheld terminal device in real time.
The handheld terminal device sends the captured environment infrared photos, visible-light photos, and the person's GPS coordinates to the cloud server of the remote monitoring center over a 5G network for real-time monitoring.
Further, after completing this operation, the unmanned aerial vehicle returns to the route task, continues executing the remaining waypoints, and keeps identifying targets in the area; finally, it judges whether the whole route task is finished and, if so, ends the route task.
while the invention has been described with reference to specific embodiments, it will be understood by those skilled in the art that the invention is not limited thereto, and may be embodied in other forms without departing from the spirit or essential characteristics thereof. Any modification which does not depart from the functional and structural principles of the present invention is intended to be included within the scope of the claims.

Claims (7)

1. An unmanned aerial vehicle-based automatic field personnel searching method is characterized by comprising the following steps:
the unmanned aerial vehicle receives a route flight instruction sent by the handheld terminal device and flies the route within the search area;

during the route flight the unmanned aerial vehicle captures infrared images of the search area with its camera, performs image processing on them, and identifies whether a person is present in the image through thermal-imaging threshold analysis and trapped-person motion analysis;

when it is determined that a person is present in the image, the flight attitude of the unmanned aerial vehicle is controlled according to the image recognition result to track and photograph the person, and the person's position information and environment information are acquired and sent to the handheld terminal device;

and the handheld terminal device sends the position information and the environment information to a cloud server of a remote monitoring center over a 5G network.
2. The unmanned aerial vehicle-based automatic field personnel searching method of claim 1, wherein the route flight instruction is generated after the GPS coordinates of the search area are calibrated.
3. The unmanned aerial vehicle-based automatic field personnel searching method of claim 1, wherein histogram equalization and background denoising are applied to the infrared image of the search area in real time, and thermal-imaging threshold analysis and person motion analysis are then performed separately to obtain person ROI pixel regions, wherein:
the person ROI pixel region obtained by thermal-imaging threshold analysis is marked as the Rt region;

the person ROI pixel region obtained by trapped-person motion analysis is marked as the Rm region;

the Rt region and the Rm region are fused into a union region; the union region is denoised, its target contour is extracted, and a minimum solid-line rectangular frame is drawn;

and the minimum solid-line rectangular frame is input into a pre-trained person recognition model, which outputs a recognition result.
4. The unmanned aerial vehicle-based automatic field personnel searching method of claim 3, wherein the thermal-imaging threshold analysis performs binary segmentation of the infrared image with a preset threshold, applies an opening operation and a closing operation to the binary infrared image, and then runs a connected-domain operation to obtain the connected domains;

only connected domains whose area is larger than the reference area S computed by the following formula are kept as connected-domain pixel ROI regions, which are set as the Rt region, wherein:
S=0.0025×r×c
where r is the width of the largest-area connected domain and c is its height.
5. The unmanned aerial vehicle-based automatic field personnel searching method of claim 4, wherein the trapped-person motion analysis compares, for each pixel at the same position in the infrared image, the difference between its pixel values in two consecutive frames, and takes a pixel whose difference value exceeds a preset pixel-difference threshold Q as a hot pixel, wherein:
|I1 − I2| > Q

where I1 is the pixel value of the current frame and I2 is the pixel value of the previous frame of the pixel at the same position;
the hot pixels are grouped by connectivity into hot-pixel ROI regions; the average pixel value M of all pixels in all connected-domain pixel ROI regions obtained in the thermal-imaging threshold analysis is calculated, and the following judgment is made: when the pixel count of a hot-pixel ROI region exceeds 5% of the pixel count of all connected-domain pixel ROI regions obtained in the thermal-imaging threshold analysis, and the average pixel value of the hot-pixel ROI region is greater than the average pixel value M, the hot-pixel ROI region is taken as an Rm region.
6. The unmanned aerial vehicle-based automatic field personnel searching method of claim 1, wherein when a person is present in the image, the pixel coordinates of the center of the minimum solid-line rectangular frame are extracted in real time as the person center point (x1, y1), and the pixel coordinates of the center of the infrared image are taken as the image center point (x0, y0);

from the positional deviation of the person center point (x1, y1) and the image center point (x0, y0) along the vertical coordinate of the infrared image, the camera pitch-angle deviation θy during unmanned aerial vehicle tracking is calculated, wherein:
(formula for the pitch-angle deviation θy; it appears only as an image, FDA0003592590020000021, in the original publication)
where β is the rated pitch angle of the unmanned aerial vehicle's camera;
the corrected yaw angle δ(t) in the horizontal yaw direction of the unmanned aerial vehicle is then obtained from the tracking algorithm given by the following formula;
(formula for the corrected yaw angle δ(t); it appears only as an image, FDA0003592590020000022, in the original publication)
where δ(t) is the corrected yaw angle at time t; θx(t) is the yaw-angle deviation of the unmanned aerial vehicle at time t, calculated in the same way as the pitch-angle deviation θy; λ is an adjustment parameter; d(t) is the pre-aiming distance deviation at time t; and v(t) is the current speed of the unmanned aerial vehicle;
the pre-aiming distance deviation d (t) is obtained by calculation according to the following formula:
(formula for the pre-aiming distance deviation d(t); it appears only as an image, FDA0003592590020000023, in the original publication)
where a_max is the maximum braking deceleration of the unmanned aerial vehicle, v(t) is its current speed, Ke is a travel-distance parameter for when the unmanned aerial vehicle encounters abnormal reaction conditions, and Rmin is the minimum turning radius of the unmanned aerial vehicle;
the pitch-angle deviation θy and the corrected yaw angle δ(t) are fed in real time into the pitch and yaw control channels of the unmanned aerial vehicle and added to the original pitch and yaw control quantities.
7. The unmanned aerial vehicle-based automatic field personnel searching method of claim 1, wherein the handheld terminal device sends the captured environment infrared photos, visible-light photos, and the person's GPS coordinates to a cloud server of a remote monitoring center over a 5G network for real-time monitoring.
CN202210383003.2A 2022-04-12 2022-04-12 Unmanned aerial vehicle-based automatic field personnel searching method Pending CN114967731A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210383003.2A CN114967731A (en) 2022-04-12 2022-04-12 Unmanned aerial vehicle-based automatic field personnel searching method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210383003.2A CN114967731A (en) 2022-04-12 2022-04-12 Unmanned aerial vehicle-based automatic field personnel searching method

Publications (1)

Publication Number Publication Date
CN114967731A (en) 2022-08-30

Family

ID=82978106

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210383003.2A Pending CN114967731A (en) 2022-04-12 2022-04-12 Unmanned aerial vehicle-based automatic field personnel searching method

Country Status (1)

Country Link
CN (1) CN114967731A (en)


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116612494A (en) * 2023-05-05 2023-08-18 交通运输部水运科学研究所 Pedestrian target detection method and device in video monitoring based on deep learning
CN116698044A (en) * 2023-08-01 2023-09-05 北京共创晶桔科技服务股份有限公司 Unmanned aerial vehicle navigation method and system
CN116698044B (en) * 2023-08-01 2023-12-08 内蒙古科比特航空科技有限公司 Unmanned aerial vehicle navigation method and system
CN117079397A (en) * 2023-09-27 2023-11-17 青海民族大学 Wild human and animal safety early warning method based on video monitoring
CN117079397B (en) * 2023-09-27 2024-03-26 青海民族大学 Wild human and animal safety early warning method based on video monitoring

Similar Documents

Publication Publication Date Title
CN111932588B (en) Tracking method of airborne unmanned aerial vehicle multi-target tracking system based on deep learning
CN109765930B (en) Unmanned aerial vehicle vision navigation
CN106874854B (en) Unmanned aerial vehicle tracking method based on embedded platform
CN114967731A (en) Unmanned aerial vehicle-based automatic field personnel searching method
CN111527463B (en) Method and system for multi-target tracking
CN115439424B (en) Intelligent detection method for aerial video images of unmanned aerial vehicle
McGee et al. Obstacle detection for small autonomous aircraft using sky segmentation
CN105979147A (en) Intelligent shooting method of unmanned aerial vehicle
Lebedev et al. Accurate autonomous uav landing using vision-based detection of aruco-marker
CN111679695B (en) Unmanned aerial vehicle cruising and tracking system and method based on deep learning technology
CN110619276B (en) Anomaly and violence detection system and method based on unmanned aerial vehicle mobile monitoring
WO2021083151A1 (en) Target detection method and apparatus, storage medium and unmanned aerial vehicle
CN112162565B (en) Uninterrupted self-main-pole tower inspection method based on multi-machine collaborative operation
CN106295695B (en) A kind of takeoff and landing process automatic tracing image pickup method and device
CN111461013B (en) Unmanned aerial vehicle-based real-time fire scene situation awareness method
Savva et al. ICARUS: Automatic autonomous power infrastructure inspection with UAVs
CN112800918A (en) Identity recognition method and device for illegal moving target
CN115115785A (en) Multi-machine cooperative three-dimensional modeling system and method for search and rescue in field mountain and forest environment
Ranjbar et al. Addressing practical challenge of using autopilot drone for asphalt surface monitoring: Road detection, segmentation, and following
CN114689030A (en) Unmanned aerial vehicle auxiliary positioning method and system based on airborne vision
Raj et al. Vision based accident vehicle identification and scene investigation
CN116894936B (en) Unmanned aerial vehicle vision-based marine target identification and positioning method and system
KR20220068606A (en) Automatic landing algorithm of drone considering partial images
CN114740878B (en) Unmanned aerial vehicle flight obstacle detection method based on computer image recognition
CN115588143A (en) Target identification and tracking method for electrical equipment based on airborne computer of unmanned aerial vehicle

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination