CN109063532A - UAV-based method for searching for missing persons in the field - Google Patents
UAV-based method for searching for missing persons in the field
- Publication number
- CN109063532A (publication number); CN201810382529.2A (application number)
- Authority
- CN
- China
- Prior art keywords
- image
- UAV
- grayscale image
- missing persons
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S19/00—Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
- G01S19/38—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
- G01S19/39—Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
- G01S19/42—Determining position
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/136—Segmentation; Edge detection involving thresholding
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/194—Segmentation; Edge detection involving foreground-background segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20092—Interactive image processing based on input by user
- G06T2207/20104—Interactive definition of region of interest [ROI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Remote Sensing (AREA)
- Multimedia (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar, Positioning & Navigation (AREA)
- Data Mining & Analysis (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Evolutionary Biology (AREA)
- Bioinformatics & Computational Biology (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Human Computer Interaction (AREA)
- Astronomy & Astrophysics (AREA)
- Artificial Intelligence (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Networks & Wireless Communication (AREA)
- Image Processing (AREA)
Abstract
The invention discloses a UAV-based method for searching for missing persons in the field. The UAV flies a predefined route in automatic cruise mode, photographing the area in which the missing persons were active, and returns home after the aerial survey is complete. During the flight, the onboard image-processing module applies image segmentation, ROI mapping, and object detection to each acquired image in real time; images in which a target is detected, together with the GPS positions recorded for them, are transmitted to the ground station through the communication module, and rescuers organize the rescue using the information returned by the UAV. For each possible missing person, the invention thus sends the location and the associated image data to the ground station via the communication module, helping rescuers determine the missing person's position.
Description
Technical field
The present invention relates to a method for searching for missing persons in the field, and in particular to a UAV-based method for searching for missing persons in the field.
Background technique
In recent years, self-organized independent travel has become popular among travel enthusiasts, and undeveloped, uninhabited areas are the places that "backpackers" most often visit. Because such trips are spontaneous and most participants have had no professional training, the excitement of exploration comes with substantial risk, and cases of travelers going missing occur repeatedly. When searching for missing persons in the field, the greatest difficulty rescuers face is locating the missing person quickly. The current conventional method is a carpet search by a large team of personnel, which is not only time- and labor-intensive but also depends heavily on subjective human judgment, making search-and-rescue work inefficient and often leading to irreparable consequences.
Summary of the invention
The technical problem to be solved by the present invention is to provide a UAV-based method for searching for missing persons in the field, addressing the deficiencies of current field-search methods.

The method of the invention comprises the following steps:

The UAV flies a predefined route in automatic cruise mode, photographing the area in which the missing persons were active, and returns home after the aerial survey is complete. During the flight, the onboard image-processing module applies image segmentation, ROI mapping, and object detection to each acquired image in real time; images in which a target is detected, together with the GPS positions recorded for them, are transmitted to the ground station through the communication module, and rescuers organize the rescue using the information returned by the UAV.
An infrared camera and a visible-light camera are mounted on the UAV; they acquire infrared grayscale images and visible-light images respectively, and together they form the image-acquisition unit. The two cameras are arranged side by side on the nose of the UAV and point in the same direction.
The real-time image segmentation performed on each acquired image by the onboard image-processing module uses an improved maximum-entropy method on the infrared grayscale image to obtain foreground regions that may represent human targets, i.e. to extract the regions of the image where a person may be present:

1) Compute the gray-level statistics of the infrared grayscale image captured by the infrared camera, build the gray-level histogram, and from it compute the probability of each gray level:

p_i = n_i / N

where p_i is the probability of the i-th gray level, i is the gray-level index, n_i is the number of pixels at the i-th gray level, and N is the total number of pixels in the image;

2) Then compute the entropies of the background B and the target O:

H(B) = -Σ_{i=0}^{t} (p_i / P_t) ln(p_i / P_t),   H(O) = -Σ_{j=t+1}^{L} (p_j / (1 - P_t)) ln(p_j / (1 - P_t)),   with P_t = Σ_{i=0}^{t} p_i

where H(B) and H(O) are the entropies of the background B and the target O, t is the preset segmentation threshold on the gray levels of the infrared grayscale image, i and j are gray-level indices, and L is the maximum gray level;

3) Then, from these entropies, set up the objective function for the optimal threshold t*:

t* = arg max_{0 ≤ t < L} [H(B) + H(O)],   subject to t ≤ α·t_o

where t_o is the gray value that represents a human body in the infrared grayscale image and α is the confidence assigned to t_o;

Finally, by Lagrangian duality, the objective function is solved with a dual optimization method to obtain the optimal threshold t*, and t* is used to split the infrared grayscale image into a foreground region and a background region.
By specially designing the objective function, and in particular by introducing the confidence level α, the invention improves the detection accuracy for human targets.
The real-time ROI mapping and object detection performed on each acquired image by the onboard image-processing module use an improved target-recognition method to decide whether a human target is present in the image, which substantially reduces the computing device's wasted work during recognition:

1) Calibrate the infrared grayscale images and visible-light images captured by the infrared camera and the visible-light camera, obtaining n pairs of matched feature points between the infrared grayscale image and the visible-light image, with n at least 8;

2) From the n feature-point pairs, compute the fundamental matrix f between the infrared grayscale image and the visible-light image from the linear system:

Af = 0

where A is the n × 9 coefficient matrix, each pair contributing one row [u'u, u'v, u', v'u, v'v, v', u, v, 1], and (u, v)^T and (u', v')^T are the coordinates of a pair of matched feature points in the infrared grayscale image and the visible-light image respectively;

3) Compute the epipolar line l in the visible-light image that corresponds to a foreground region segmented from the infrared grayscale image:

l = f·m

where m is the homogeneous coordinate of the geometric center of the foreground region extracted from the infrared grayscale image;

4) Then center a sliding detection window on each pixel of the epipolar line l, extract the HOG feature of each sliding detection window, and feed the HOG features along l to a trained support vector machine for classification, which determines whether a human target is present in the visible-light image.
By performing ROI mapping and object detection in this way, the invention detects human targets in the image better, faster, and more accurately.
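Steps 3) and 4) above, mapping a foreground center m to the epipolar line l = f·m and placing detection-window centers along that line, can be sketched as follows; the fundamental matrix F below is an arbitrary illustrative value, not one obtained from the calibration of steps 1)-2):

```python
import numpy as np

def epipolar_line(F, m):
    """Epipolar line l = F m in the visible image for a point m,
    the pixel coordinates of a foreground center in the IR image."""
    m_h = np.array([m[0], m[1], 1.0])      # homogeneous coordinates
    return F @ m_h                         # line a*x + b*y + c = 0

def window_centers(l, width, height, step=8):
    """Pixels on the line inside a width x height image, sampled every
    `step` columns; each would center one sliding HOG detection window."""
    a, b, c = l
    centers = []
    for x in range(0, width, step):
        if abs(b) < 1e-12:                 # vertical line, skip sampling by x
            continue
        y = -(a * x + c) / b
        if 0 <= y < height:
            centers.append((x, int(round(y))))
    return centers

# Illustrative fundamental matrix (in the method it comes from Af = 0).
F = np.array([[0.0, -1e-4, 0.02],
              [1e-4,  0.0, -0.03],
              [-0.02, 0.03, 1.0]])
m = (320.0, 240.0)                         # foreground center in the IR image
l = epipolar_line(F, m)
centers = window_centers(l, width=640, height=480)
```

Restricting the sliding windows to `centers` instead of the full 640 × 480 frame is exactly the saving the method describes: only pixels consistent with the IR foreground are classified.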
The trained support vector machine is obtained by extracting, from the images in a standard human-body image library, the HOG features of the pixels that represent human targets, and feeding those HOG features together with the known pixel-label information of the library images into the support vector machine for training; an RBF kernel (Gaussian kernel) is used during training.
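A simplified HOG extractor in the spirit of the feature described above can be sketched with NumPy; the cell size, bin count, and test patch are illustrative choices, and a production pipeline would normally use a library HOG implementation and an RBF-kernel SVM:

```python
import numpy as np

def hog_feature(patch, cell=8, bins=9):
    """Simplified HOG: per-cell histograms of gradient orientation,
    weighted by gradient magnitude and L2-normalized per cell."""
    patch = patch.astype(float)
    gy, gx = np.gradient(patch)
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0   # unsigned orientation

    h, w = patch.shape
    feats = []
    for cy in range(0, h - cell + 1, cell):
        for cx in range(0, w - cell + 1, cell):
            a = ang[cy:cy + cell, cx:cx + cell].ravel()
            m = mag[cy:cy + cell, cx:cx + cell].ravel()
            hist, _ = np.histogram(a, bins=bins, range=(0, 180), weights=m)
            hist /= np.linalg.norm(hist) + 1e-9    # per-cell L2 normalization
            feats.append(hist)
    return np.concatenate(feats)

# 64 x 32 window -> (64/8) * (32/8) = 32 cells -> 32 * 9 = 288 dimensions.
patch = np.zeros((64, 32))
patch[:, 16:] = 255.0                               # a single vertical edge
f = hog_feature(patch)
```

The resulting fixed-length vector is what would be fed, together with its label, to the SVM during training and to the trained SVM during detection.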
The UAV is a small solar-powered fixed-wing UAV whose wings and tail are covered with flexible photovoltaic modules, giving long endurance whenever the illumination intensity is sufficient.
The flexible photovoltaic module, in full a monocrystalline flexible solar module, is a device that converts solar energy into electricity; in this embodiment the electricity it generates is stored in the battery to power the payload. "Flexible" means that the solar panel can bend, with a bending angle of up to 30 degrees.
The predefined route is planned manually over the area to be searched in a carpet-search pattern; after the flight order is executed, the UAV searches along this route.
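The carpet-search route can be sketched as a boustrophedon (back-and-forth) waypoint list over a rectangular search area; the area size and leg spacing below are illustrative assumptions, not figures from the source:

```python
def lawnmower_waypoints(width_m, height_m, spacing_m):
    """Boustrophedon waypoints covering a width_m x height_m rectangle
    with parallel legs spacing_m apart. Returns (x, y) positions in
    meters from the area's south-west corner."""
    waypoints = []
    y = 0.0
    left_to_right = True
    while y <= height_m:
        if left_to_right:
            waypoints += [(0.0, y), (width_m, y)]
        else:
            waypoints += [(width_m, y), (0.0, y)]
        left_to_right = not left_to_right
        y += spacing_m
    return waypoints

# A 10-hectare square is about 316 m x 316 m; a 25 m leg spacing is an
# assumed value chosen to match a low-altitude camera footprint.
wps = lawnmower_waypoints(316.0, 316.0, 25.0)
```

The ground station would upload such a list as the predefined route, with spacing chosen so adjacent legs' camera footprints overlap.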
Compared with existing search methods, the beneficial effects of the present invention are:

Among existing methods for finding missing persons in the field there is as yet no technology based on UAV visual search, so the present invention opens a completely new direction.

The invention avoids the high cost and low efficiency of manual search, and its efficient recognition technique combining visible-light and infrared thermal imaging enables large-area personnel search in the field, providing a reliable basis for precise rescue work.
Brief description of the drawings
Fig. 1 is a flow diagram of the method of the present invention.
Detailed description of embodiments
The invention is further described below with reference to the drawings.
As shown in Fig. 1, a specific implementation of the invention uses a system consisting mainly of a UAV and a corresponding ground station. A small solar-powered fixed-wing UAV is chosen as the flight platform to meet the long-endurance requirement. The UAV carries a flight-control board, a power-management module, a barometer, a gyroscope, an accelerometer, and a GPS module for attitude control and navigation, plus an external communication module and an image-acquisition and recognition unit that photographs and identifies targets and returns the corresponding data.
An infrared camera and a visible-light camera are mounted on the UAV; they acquire infrared grayscale images and visible-light images respectively, and together they form the image-acquisition unit. The two cameras are arranged side by side on the nose of the UAV and point in the same direction.
The UAV is a small solar-powered fixed-wing UAV whose wings and tail are covered with flexible photovoltaic modules.
The flight-control board uses an STM32F407 as the flight-control processor; an MPU-6050, which integrates a 3-axis gyroscope and a 3-axis accelerometer, together with a geomagnetic sensor, acquires attitude data for attitude control, and the GPS module provides navigation.
The image-acquisition unit comprises visible-light and infrared thermal-imaging cameras and a controllable three-axis gimbal; the gimbal reduces the jitter of the aircraft in flight to a minimum, ensuring as far as possible that the captured images are sharp. The cameras pass the acquired image data to the onboard image-processing module for target recognition.
The image-processing module uses an NVIDIA Tegra K1 processor, whose 4-Plus-1 quad-core ARM Cortex-A15 CPU and NVIDIA Kepler GPU with 192 CUDA cores meet the image-processing workload.
The communication module uses an Iridium 9602 SBD satellite data-transmission module. The Iridium 9602 is a single-board transceiver that sends and receives data packets over the satellite channel, enabling data transfer between the UAV and the ground station in uninhabited field areas.
The power-management module steps the lithium-battery voltage down to the 5 V, 2 A supply required by the airborne platform and the flight-control unit.
The ground station receives and displays, in real time, the information transmitted by the airborne platform.
The implementation process of the invention is as follows:
1. A largely undeveloped, deserted mountain area is chosen as the test site for this embodiment. The search area is delimited as 10 hectares of forest terrain, in which 10 to 20 human dummies simulating missing persons to be rescued are randomly distributed. The route is pre-planned with the ground-station host software to cover the target area in a carpet search, with the flight altitude set at about 20-25 m. The UAV navigates autonomously along the planned route at a speed of 30-40 km/h, continuously photographing the ground at a rate of one image every 3 seconds.
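As a rough check on these flight parameters, the along-track distance between consecutive frames can be compared with the ground footprint of one frame; the camera field of view used below is an assumed value, since the source does not state one:

```python
import math

def frame_spacing_m(speed_kmh, interval_s):
    """Ground distance flown between consecutive shots."""
    return speed_kmh / 3.6 * interval_s

def footprint_m(altitude_m, fov_deg):
    """Along-track ground footprint of one frame for a given field of
    view (fov_deg is an assumed value, not a figure from the source)."""
    return 2.0 * altitude_m * math.tan(math.radians(fov_deg / 2.0))

spacing = frame_spacing_m(40.0, 3.0)   # worst case: 40 km/h, one frame / 3 s
footprint = footprint_m(20.0, 84.0)    # worst case: 20 m altitude, 84-degree FOV
overlap = footprint > spacing          # do consecutive frames overlap?
```

Under these assumed values, the roughly 33 m frame spacing stays just inside a roughly 36 m footprint, so consecutive frames still overlap slightly.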
2. During the flight, the onboard image-processing module segments each acquired image in real time. In the specific implementation, the improved maximum-entropy method is applied to the infrared grayscale image to obtain the foreground regions:

1) Compute the gray-level statistics of the infrared grayscale image captured by the infrared camera, build the gray-level histogram, and from it compute the probability of each gray level:

p_i = n_i / N

where p_i is the probability of the i-th gray level and i is the gray-level index, which for an 8-bit grayscale image usually takes the values 0-255; n_i is the number of pixels at the i-th gray level, and N is the total number of pixels in the image;

2) Then compute the entropies of the background B and the target O:

H(B) = -Σ_{i=0}^{t} (p_i / P_t) ln(p_i / P_t),   H(O) = -Σ_{j=t+1}^{L} (p_j / (1 - P_t)) ln(p_j / (1 - P_t)),   with P_t = Σ_{i=0}^{t} p_i

where H(B) and H(O) are the entropies of the background B and the target O, t is the preset segmentation threshold on the gray levels of the infrared grayscale image, i and j are gray-level indices, and L is the maximum gray level, L = 255 for an 8-bit image;

3) Then, from these entropies, set up the objective function for the optimal threshold t*:

t* = arg max_{0 ≤ t < L} [H(B) + H(O)],   subject to t ≤ α·t_o

where t_o is the gray value that represents a human body in the infrared grayscale image; it is a constant determined from standard infrared grayscale images known to contain human bodies, and α is the confidence assigned to t_o;

Finally, by Lagrangian duality, the objective function is solved with a dual optimization method to obtain the optimal threshold t*, and t* is used to split the infrared grayscale image into a foreground region and a background region.
3. The onboard image-processing module performs ROI mapping and object detection on each acquired image in real time, using the improved target-recognition method to decide whether a human target is present in the image:
1) The infrared grayscale images and visible-light images captured by the infrared camera and the visible-light camera are calibrated with the Camera Calibrator tool of the MATLAB toolbox, yielding n pairs of matched feature points between the infrared grayscale image and the visible-light image;
2) From the n feature-point pairs, the fundamental matrix f between the infrared grayscale image and the visible-light image is computed from the linear system:

Af = 0

where A is the n × 9 coefficient matrix, each pair contributing one row [u'u, u'v, u', v'u, v'v, v', u, v, 1], and (u, v)^T and (u', v')^T are the coordinates of a pair of matched feature points in the infrared grayscale image and the visible-light image respectively;
In the specific implementation, a singular value decomposition of the coefficient matrix A is performed to compute its generalized inverse A+, from which the least-squares solution for the fundamental matrix f is obtained.
3) The epipolar line l in the visible-light image that corresponds to the foreground region segmented from the infrared grayscale image is computed; the epipolar line l represents the possible locations in the visible-light image of the center of that foreground region:

l = f·m

where m is the homogeneous coordinate of the geometric center of the foreground region extracted from the infrared grayscale image;
4) A sliding detection window is then centered on each pixel of the epipolar line l, the HOG feature of each sliding detection window is extracted, and each HOG feature along l is fed to the trained support vector machine for classification, which determines whether a human target is present in the visible-light image.
The trained support vector machine is obtained by extracting, from the images in a standard human-body image library, the HOG features of the pixels that represent human targets, and feeding those HOG features together with the known pixel-label information of the library images into the support vector machine for training; an RBF kernel (Gaussian kernel) is used during training.
4. The images in which a target is detected, together with the GPS positions recorded for them, are transmitted to the ground station through the communication module, and rescuers organize the rescue using the information returned by the UAV.

When the UAV detects a target, the satellite communication module sends that image and the corresponding GPS information to the ground station, where the staff compare the data returned by the UAV and cross-check the information.
After completing the search of the target area, the UAV returns home in a straight line.
In actual tests, when the person was not occluded by obstacles, the system identified and located the persons awaiting rescue accurately; the satellite transmission module delivered the target information to the host computer reliably, and the average processing time per 1920 × 1080 image was under 1 s. Over repeated experiments with randomly distributed test subjects, the detection accuracy of the system reached 98.2%, meeting the requirements of searching for missing persons in the field.
As a result, the solar-powered UAV of the invention achieves long endurance whenever the illumination intensity is sufficient. During wide-area, long-duration operation, the UAV continuously acquires ground images through the image-acquisition unit and uses the image-processing unit to judge whether a suspected missing person is present; for each possible missing person, the location and the associated image data are sent to the ground station through the communication module, helping rescuers determine the missing person's position.
Claims (6)
1. A UAV-based method for searching for missing persons in the field, characterized by comprising the following steps: the UAV flies a predefined route in automatic cruise mode, photographing the area in which the missing persons were active, and returns home after the aerial survey is complete; during the flight, the onboard image-processing module applies image segmentation, ROI mapping, and object detection to each acquired image in real time, and the images in which a target is detected, together with the GPS positions recorded for them, are transmitted to the ground station through the communication module, where rescuers organize the rescue using the information returned by the UAV.
2. The UAV-based method for searching for missing persons in the field according to claim 1, characterized in that: an infrared camera and a visible-light camera are mounted on the UAV, arranged side by side on the nose of the UAV and pointing in the same direction.
3. The UAV-based method for searching for missing persons in the field according to claim 2, characterized in that: the real-time image segmentation performed on each acquired image by the onboard image-processing module uses the improved maximum-entropy method on the infrared grayscale image to obtain the foreground regions:
1) compute the gray-level statistics of the infrared grayscale image captured by the infrared camera, build the gray-level histogram, and from it compute the probability of each gray level:
p_i = n_i / N
where p_i is the probability of the i-th gray level, i is the gray-level index, n_i is the number of pixels at the i-th gray level, and N is the total number of pixels in the image;
2) then compute the entropies of the background B and the target O:
H(B) = -Σ_{i=0}^{t} (p_i / P_t) ln(p_i / P_t),   H(O) = -Σ_{j=t+1}^{L} (p_j / (1 - P_t)) ln(p_j / (1 - P_t)),   with P_t = Σ_{i=0}^{t} p_i
where H(B) and H(O) are the entropies of the background B and the target O, t is the preset segmentation threshold on the gray levels of the infrared grayscale image, i and j are gray-level indices, and L is the maximum gray level;
3) then, from these entropies, set up the objective function for the optimal threshold t*:
t* = arg max_{0 ≤ t < L} [H(B) + H(O)],   subject to t ≤ α·t_o
where t_o is the gray value that represents a human body in the infrared grayscale image and α is the confidence assigned to t_o;
finally, by Lagrangian duality, solve the objective function with a dual optimization method to obtain the optimal threshold t*, and use t* to split the infrared grayscale image into a foreground region and a background region.
4. The UAV-based method for searching for missing persons in the field according to claim 2, characterized in that: the real-time ROI mapping and object detection performed on each acquired image by the onboard image-processing module use the improved target-recognition method to decide whether a human target is present in the image:
1) calibrate the infrared grayscale images and visible-light images captured by the infrared camera and the visible-light camera, obtaining n pairs of matched feature points between the infrared grayscale image and the visible-light image;
2) from the n feature-point pairs, compute the fundamental matrix f between the infrared grayscale image and the visible-light image from the linear system:
Af = 0
where A is the n × 9 coefficient matrix built from the point pairs, and (u, v)^T and (u', v')^T are the coordinates of a pair of matched feature points in the infrared grayscale image and the visible-light image respectively;
3) compute the epipolar line l in the visible-light image that corresponds to a foreground region segmented from the infrared grayscale image:
l = f·m
where m is the homogeneous coordinate of the geometric center of the foreground region extracted from the infrared grayscale image;
4) then center a sliding detection window on each pixel of the epipolar line l, extract the HOG feature of each sliding detection window, and feed each HOG feature along l to the trained support vector machine for classification, thereby determining whether a human target is present in the visible-light image.
5. The UAV-based method for searching for missing persons in the field according to claim 1, characterized in that: the UAV is a small solar-powered fixed-wing UAV whose wings and tail are covered with flexible photovoltaic modules.
6. The UAV-based method for searching for missing persons in the field according to claim 1, characterized in that: the predefined route is planned manually over the area to be searched in a carpet-search pattern, and after the flight order is executed the UAV searches along this route.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810382529.2A CN109063532B (en) | 2018-04-26 | 2018-04-26 | Unmanned aerial vehicle-based method for searching field offline personnel |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810382529.2A CN109063532B (en) | 2018-04-26 | 2018-04-26 | Unmanned aerial vehicle-based method for searching field offline personnel |
Publications (2)
Publication Number | Publication Date |
---|---|
CN109063532A true CN109063532A (en) | 2018-12-21 |
CN109063532B CN109063532B (en) | 2021-12-07 |
Family
ID=64820067
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810382529.2A Active CN109063532B (en) | 2018-04-26 | 2018-04-26 | Unmanned aerial vehicle-based method for searching field offline personnel |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109063532B (en) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109696921A (en) * | 2018-12-27 | 2019-04-30 | 济南大学 | A kind of system design for searching and rescuing unmanned plane |
CN109787679A (en) * | 2019-03-15 | 2019-05-21 | 郭欣 | Police infrared arrest system and method based on multi-rotor unmanned aerial vehicle |
CN111540166A (en) * | 2020-05-09 | 2020-08-14 | 重庆工程学院 | Unmanned aerial vehicle night search system and method based on deep learning |
CN111701118A (en) * | 2020-06-24 | 2020-09-25 | 郭中华 | Blood vessel developing device for injection of hyaluronic acid |
CN111800180A (en) * | 2020-05-12 | 2020-10-20 | 萧县航迅信息技术有限公司 | Rescue target discovery system and method for field unmanned aerial vehicle |
CN115147741A (en) * | 2022-06-28 | 2022-10-04 | 慧之安信息技术股份有限公司 | Helicopter search-and-rescue assistance method based on edge computing |
CN117295009A (en) * | 2023-10-07 | 2023-12-26 | 广州精天信息科技股份有限公司 | Communication equipment deployment method and device, storage medium and intelligent terminal |
CN118018104A (en) * | 2024-04-09 | 2024-05-10 | 中科元境(江苏)文化科技有限公司 | Unmanned aerial vehicle-based data transmission method and system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105416584A (en) * | 2015-11-12 | 2016-03-23 | 广州杰赛科技股份有限公司 | Post-disaster life tracking unmanned aerial vehicle system |
CN106184753A (en) * | 2016-07-13 | 2016-12-07 | 京信通信系统(中国)有限公司 | Unmanned aerial vehicle and UAV search-and-rescue positioning method |
CN106291592A (en) * | 2016-07-14 | 2017-01-04 | 桂林长海发展有限责任公司 | Countermeasure system for small unmanned aerial vehicles |
CN106406343A (en) * | 2016-09-23 | 2017-02-15 | 北京小米移动软件有限公司 | Control method, device and system of unmanned aerial vehicle |
US9665094B1 (en) * | 2014-08-15 | 2017-05-30 | X Development Llc | Automatically deployed UAVs for disaster response |
CN106741875A (en) * | 2016-12-30 | 2017-05-31 | 天津市天安博瑞科技有限公司 | Flight search-and-rescue system and method |
Similar Documents
Publication | Title |
---|---|
CN109063532A | Method for searching for missing persons in the wild based on an unmanned aerial vehicle |
CN105318888B | Path planning method for autonomous vehicles based on UAV perception |
CN103822635B | Real-time computation of UAV flight spatial position based on visual information |
CN109683629B | UAV power-line stringing system based on integrated navigation and computer vision |
CN106403904B | Method and system for calculating landscape-scale vegetation coverage based on UAVs |
CN106813900B | Flight inspection method for civil-airport navigation lights based on UAV technology |
CN105000194A | UAV-assisted landing visual guidance method and airborne system based on ground cooperative markers |
CN106054929A | Optical-flow-based automatic landing guidance method for UAVs |
CN106292126B | Intelligent aerial-survey flight exposure control method, UAV control method and terminal |
CN104215239A | Vision-based autonomous landing guidance device and method for UAVs |
CN112326686A | UAV intelligent-cruise pavement-defect detection method, UAV and detection system |
CN108475442A | Augmented reality method for UAVs, processor and UAV |
CN104700576A | Rapid water rescue system and method |
CN106289186A | Airborne visual detection and multi-target positioning system for rotor UAVs and implementation method |
CN110832494A | Semantic generation method, device, aircraft and storage medium |
CN109116865A | Machine-vision-based UAV inspection system and method for large-scale equipment |
CN109883398A | System and method for extracting plant green volume based on UAV oblique photography |
US11341673B2 | Infrared image processing method, infrared image processing device, and infrared image processing program |
CN110104167A | Automated search-and-rescue UAV system using an infrared thermal imaging sensor, and control method |
CN105447431B | Machine-vision-based tracking and positioning method and system for docking aircraft |
CN111402324B | Target measurement method, electronic equipment and computer storage medium |
CN117636284A | Autonomous UAV landing method and device based on visual image guidance |
CN112560751A | Balcony falling-object risk detection method and system |
CN115393738A | UAV-based PAPI flight verification method and system |
CN112859907A | High-altitude detection of rocket debris under few-sample conditions based on three-dimensional special-effect simulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||