CN106289186A - Onboard visual detection and multi-target positioning system for a rotor-wing unmanned aerial vehicle, and implementation method - Google Patents
- Publication number
- CN106289186A CN106289186A CN201610839767.2A CN201610839767A CN106289186A CN 106289186 A CN106289186 A CN 106289186A CN 201610839767 A CN201610839767 A CN 201610839767A CN 106289186 A CN106289186 A CN 106289186A
- Authority
- CN
- China
- Prior art keywords
- target
- rotor-wing unmanned aerial vehicle
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/04—Interpretation of pictures
Abstract
The invention discloses an onboard visual detection and multi-target positioning system for a rotor-wing unmanned aerial vehicle, belonging to the technical field of positioning, navigation and control. The system comprises an onboard subsystem and a ground monitoring subsystem. The onboard subsystem includes a video acquisition unit, an image processing unit and an image-transmission transmitting terminal; images collected by the video acquisition unit are processed by the image processing unit and then sent through the transmitting terminal to the ground monitoring subsystem. The ground monitoring subsystem includes a ground station and an image-transmission receiving terminal connected to the ground station; the receiving terminal communicates with the transmitting terminal. The system is compact and integrates seamlessly with the rotor-wing unmanned aerial vehicle, facilitating its flight and control, so that target positioning accuracy is effectively guaranteed and multi-target positioning is realized. The invention also discloses an implementation method for the above onboard visual detection and multi-target positioning system.
Description
Technical field
The present invention relates to a dynamic positioning system, in particular to an onboard visual detection and multi-target positioning system for a rotor-wing unmanned aerial vehicle and its implementation method, belonging to the technical field of positioning, navigation and control.
Background technology
Rotor-wing unmanned aerial vehicles have the advantages of good mobility, the ability to hover and to take off and land vertically, and low cost, and have broad application prospects in fields such as military reconnaissance, power-line inspection, maritime resupply, crop protection and cargo transport. The onboard visual detection and target positioning system is an important component of a rotor-wing unmanned aerial vehicle and plays an important role when the vehicle performs its tasks. A visual system is lightweight, small in volume, low in power consumption, high in precision and well concealed; research on UAV visual systems has therefore expanded widely, covering fields such as target tracking, power-line inspection and autonomous landing.
At present, visual detection and target positioning systems take two main forms. In the first, the visual processing part is placed on the ground: after the onboard camera captures the target image, the image information is transmitted back to the ground by radio, a ground image-processing computer calculates the target information, and the result is transmitted back to the UAV for the flight control system. Such a system places almost no limits on the power, volume or processing capability of the ground equipment, but it is affected by signal transmission delay. In the second, the visual processing part is mounted on the UAV itself: this avoids the delay of transmitting information by radio and greatly improves real-time performance and reliability, but the onboard visual processing equipment must be small in volume, light in weight and low in power, which limits its computing capability. From the viewpoint of visual detection algorithms, past research at home and abroad on visual detection and positioning has mostly addressed single targets. As UAVs attract more and more attention, the occasions requiring multi-target detection and positioning keep increasing, but the prior art cannot perform multi-target positioning and can hardly meet normal usage demands.
Summary of the invention
The technical problem to be solved by the present invention is to overcome the defects of the prior art and to provide the design and implementation method of an onboard visual detection and multi-target positioning system for a rotor-wing unmanned aerial vehicle that can realize multi-target positioning.
To solve the above technical problem, the onboard visual detection and multi-target positioning system for a rotor-wing unmanned aerial vehicle provided by the present invention comprises an onboard subsystem and a ground monitoring subsystem. The onboard subsystem includes a video acquisition unit, an image processing unit and an image-transmission transmitting terminal; images collected by the video acquisition unit are processed by the image processing unit and then sent through the transmitting terminal to the ground monitoring subsystem. The ground monitoring subsystem includes a ground station and an image-transmission receiving terminal connected to the ground station; the receiving terminal communicates with the transmitting terminal.
In the present invention, the video acquisition unit includes a camera and a self-stabilizing gimbal. The camera is mounted on the gimbal, and the gimbal keeps the camera's optical axis pointing at the ground, perpendicular to it.
In the present invention, simplex communication is used between the image-transmission receiving terminal and the image-transmission transmitting terminal.
The present invention also provides an implementation method for the above onboard visual detection and multi-target positioning system, comprising the following steps:
1) Calibrate the camera using the checkerboard calibration method to obtain its intrinsic parameters, then correct each original image to be identified that the camera collects, removing distortion;
2) Convert the corrected original image from YCbCr format to HSV format;
3) Assume each original image contains circular regions of color A that differ clearly from the surrounding scene in color and size, then binarize the H, S and V channels of the image using fixed thresholds;
4) Apply a morphological opening operation to the binarized image to remove isolated noise, use the seed-growing method to find all connected regions in the image, and take the connected regions whose area matches a set threshold as regions of interest, i.e. as potential targets;
5) When the same image contains multiple identical targets, place auxiliary beacons on them so the targets can be distinguished and locked;
6) Determine each target's center using the improved Hough-transform circle-finding algorithm;
7) Solve the target position according to the camera imaging model.
5. The implementation method of the onboard visual detection and multi-target positioning system according to claim 4, characterized in that it includes a step of removing pseudo-targets from the regions of interest of step 4): exploiting the feature that a region of color B is nested inside the color-A region of a target, step 3) is performed and the area of the color-B region is judged; a region whose area meets the set threshold is a potential target, otherwise it is a pseudo-target.
6. The implementation method of the onboard visual detection and multi-target positioning system according to claim 5, characterized in that the process of placing auxiliary beacons on multiple identical targets in step 5) is: for multiple identical regions of interest each consisting of a color-B circular region nested inside a color-A circular region, beacons of color C differing in shape and number are placed inside the color-A region to mark and identify the multiple identical targets.
7. The implementation method of the onboard visual detection and multi-target positioning system according to any one of claims 4 to 6, characterized in that step 6) specifically comprises:
61) Choose three points K, L, M on circle C; KL and LM form two non-parallel chords, and the perpendicular bisectors l_KL and l_LM of KL and LM intersect at the circle center O, with KO the radius of circle C.
62) Let the coordinates of the three points be K(x1, y1), L(x2, y2), M(x3, y3); then by the point-slope form, the equations of the two perpendicular bisectors l_KL and l_LM are:
l_KL: y = k_KL·x + b_KL (1)
l_LM: y = k_LM·x + b_LM (2)
where
k_KL = -(x2 - x1)/(y2 - y1), b_KL = (y1 + y2)/2 - k_KL·(x1 + x2)/2 (3)
k_LM = -(x3 - x2)/(y3 - y2), b_LM = (y2 + y3)/2 - k_LM·(x2 + x3)/2 (4)
63) From the equations of l_KL and l_LM, the center coordinates and radius can be calculated:
x0 = (b_LM - b_KL)/(k_KL - k_LM), y0 = k_KL·x0 + b_KL, R = sqrt((x1 - x0)² + (y1 - y0)²) (5)
64) Let the total number of boundary points be N, denote each group of points by K_ij, L_ij, M_ij, and regard every Q groups of points as one cycle. Subscript i denotes the cycle number (i ∈ [1, T]) and j the group number within each cycle (j ∈ [1, Q]); the value of T determines the total number of point groups chosen. In the i-th cycle, the first group of points is determined by the initial points K_i, L_i, M_i, as given in formula (6), where k is a proportionality coefficient, i ∈ [1, T], and k = 1. Consecutive point groups differ by an identical stride value S_i, as shown in formula (7), where S_1 = S_2 = ... = S_T = N/8 and, within each cycle, the group number Q of points follows the change of N, with Q = N.
In the present invention, the value of T is 5.
The beneficial effects of the present invention are: (1) the system is compact and integrates seamlessly with the rotor-wing unmanned aerial vehicle, facilitating its flight and control, so that target positioning accuracy is effectively guaranteed and multi-target positioning is realized; (2) the self-stabilizing gimbal keeps the camera's optical axis pointing at the ground, perpendicular to it, which removes the coupling effect that motions such as changes in the UAV's attitude would otherwise impose on the camera's attitude; (3) simplex communication between the image-transmission receiving and transmitting terminals ensures reliable system operation; (4) the classical Hough-transform circle-detection algorithm is improved by exploiting the geometric properties of the circle, greatly reducing the amount of computation while preserving recognition precision and improving the algorithm's operating efficiency; the scheme and algorithm for distinguishing multiple identical targets use a redundancy mechanism that enhances reliability, giving the invention strong practical and popularization value.
Description of the drawings
Fig. 1 is the architecture diagram of the onboard visual detection and multi-target positioning system of the present invention;
Fig. 2 is the master control flow chart of the program;
Fig. 3 illustrates the differentiation of multiple targets;
Fig. 4 is a schematic diagram of the improved Hough transform;
Fig. 5 shows fitting-effect images;
Fig. 6 is the comparison curve of the solution error in the X direction;
Fig. 7 is the comparison curve of the solution error in the Y direction;
Fig. 8 shows the influence of different camera heights on precision.
Detailed description of the invention
The technical scheme of the present invention is described in further detail below with reference to the accompanying drawings.
As shown in Fig. 1, the onboard visual detection and multi-target positioning system of the present invention is built on a TI DaVinci-series TMS320DM6437 processor platform and consists of an onboard subsystem and a ground monitoring subsystem. The onboard subsystem includes a camera, a self-stabilizing gimbal, an image processing unit and an image-transmission transmitting terminal. The camera is mounted on the gimbal, which keeps the camera's optical axis pointing at the ground, perpendicular to it; the image processing unit is connected to the transmitting terminal. Images collected by the camera are processed by the image processing unit and then transferred to the ground monitoring subsystem through the transmitting terminal.
The ground monitoring subsystem consists of an image-transmission receiving terminal and a ground station. The receiving terminal communicates with the transmitting terminal in simplex mode: only the onboard subsystem sends the processed image information to the ground monitoring subsystem in real time through the transmitting terminal, and the receiving terminal is used only to receive the image information that the transmitting terminal sends. The ground station receives the images through the receiving terminal and displays them in real time, mainly to monitor whether the onboard system is running normally.
In the present invention, the camera collects image information of the target and the surrounding scene; the image processing unit processes the target image information and performs target detection, combines it with information such as the task orders and the UAV position and attitude sent by the flight control subsystem, solves the position information of the target, and then sends the target position information to the flight control system of the rotor-wing unmanned aerial vehicle itself so that it can make better flight decisions.
In the present invention, the camera is a GoPro Hero3+ action camera, whose wide-angle lens enlarges the field of view of visual detection. To overcome the influence of the UAV's attitude changes on image collection, the camera is mounted on a self-stabilizing gimbal, which guarantees that the camera's optical axis always points downward. For image transmission, a DJI 5.8 GHz image-transmission kit is selected; its transmitting terminal is mounted on the rotor-wing UAV and sends the collected images to the ground monitoring subsystem in real time, and the receiving terminal of the ground monitoring subsystem receives the images and displays them in the ground station in real time through an image capture card.
The onboard subsystem and the flight control system communicate through a serial port in duplex mode. The flight control system sends information such as the current task number and the UAV's state and position to the visual system; the onboard subsystem performs different tasks according to the different task instructions and then sends the detected target positions to the flight control system so that it can make better flight decisions.
The software architecture of the present invention makes full use of the multithreading provided by DSP/BIOS. The image processing flow serves as the main thread, while serial-port sending and receiving of data for communication with the flight control computer are two independent threads whose priority is higher than the main thread's; data reception is triggered by hardware interrupt and data sending is triggered by a timer. To speed up the input and output of the image data stream, the collection, sending and display of images in the main thread use ping-pong buffering, achieving a serial-to-parallel effect on the data stream. Because different tasks require the onboard visual system to detect different targets, the main thread selects different image processing algorithms according to the task number, realizing detection of different targets and completing different tasks. The master control flow chart of the program is shown in Fig. 2.
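As a rough illustration of the ping-pong (double-buffer) idea used in the main thread — not the DSP/BIOS implementation itself, which is platform-specific, and with class and method names that are assumptions — one buffer is filled while the other is read, and the roles swap after each write:

```python
import threading

class PingPongBuffer:
    """Two fixed buffers: the producer fills one while the consumer reads the other."""
    def __init__(self):
        self.buffers = [None, None]   # buffer 0 and buffer 1
        self.write_idx = 0            # buffer currently being filled
        self.lock = threading.Lock()

    def write(self, frame):
        # Fill the current write buffer, then swap roles.
        with self.lock:
            self.buffers[self.write_idx] = frame
            self.write_idx ^= 1       # toggle 0 <-> 1

    def read(self):
        # Read the buffer that is NOT currently being written.
        with self.lock:
            return self.buffers[self.write_idx ^ 1]
```

In the patent's design the producer would be the image-acquisition interrupt and the consumer the processing loop; here both sides simply share the object.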
The concrete processing flow of the onboard visual detection and multi-target positioning system of the present invention is as follows:
(1) After the rotor-wing UAV flies to a position near the target overhead, the subsystems are started. The camera is calibrated with Zhang Zhengyou's checkerboard calibration method to obtain its intrinsic parameters; the camera collects image information of the target and the surrounding scene, and each original image collected by the image processing unit is corrected to obtain a distortion-free image. This correction mainly removes the image distortion introduced by the camera's wide-angle lens.
(2) The original images collected by the image processing unit are in YCbCr format. To make it convenient to set color thresholds for binarizing the image, and to avoid the influence of brightness changes, the YCbCr image is converted to the HSV format, which is consistent with human vision. The HSV color space contains the hue, saturation and value information of the image in three mutually independent channels, which greatly simplifies the workload of image processing and analysis.
(3) Assume the target to be identified contains a blue region of relatively large proportion, and that the scene around the target seldom contains interference of the same color and size as the target, so the target and background can be preliminarily distinguished by color information. Here the H, S and V channels of the image are thresholded with fixed values to binarize the color image.
(4) To obtain the regions of interest in the whole image, a morphological opening operation is first applied to the binarized image to remove isolated noise; the seed-growing method is then used to find all connected regions in the image, and the connected regions whose area matches a set threshold are the regions of interest, i.e. potential targets. Because different camera-to-target distances make the target's image size vary, the threshold on connected-region area is a dynamic threshold: according to the one-to-one mapping between different camera heights and different threshold sizes calibrated in advance, a mapping table is established and the dynamic threshold is determined by table lookup. This method is well known in the art and is not described in detail here.
(5) The regions of interest may still contain some pseudo-targets. Exploiting the feature that a small red circle is nested inside the target's large blue circle, the same method as steps (3)-(4) is used to judge the area of the red ring: a region whose area meets the set threshold is a potential target, otherwise it is a pseudo-target.
(6) Distinguishing multiple identical targets in the same image is a key step in realizing multi-target detection. The present invention uses the method of placing auxiliary beacons. As shown in Fig. 3, four identical targets, each consisting of a small green circle and a blue ring, are distinguished by placing yellow beacons of different shapes and numbers in the blue ring region. The beacons are of two kinds, solid rectangles C and hollow rectangles D; for each shape, two pieces are placed in a line or four pieces in a cross, which allows the four identical targets to be identified.
The decision rule for distinguishing targets makes full use of both the number and the position information of the beacons, realizing a kind of redundancy mechanism. As shown in Fig. 3, the targets from left to right are targets No. 1 to No. 4. Taking targets No. 1 and No. 3 as an example: when the number of identified solid rectangles is 2 and the lines from the centers of the two solid rectangles to the circle center form a 180-degree angle, or when only one of the solid rectangles is identified, the target is regarded as target No. 1; when the number of solid rectangles is 2, 3 or 4, and the lines from the centers of at least two solid yellow blocks to the circle center form a 90-degree angle, it is regarded as target No. 3. Targets No. 2 and No. 4 are determined by analogy. This redundant way of distinguishing multiple targets gives the method extremely strong robustness: verified in a large number of experiments in which the rotor-wing UAV delivered goods to the four coded target areas, the method can still make the determination when some beacons are occluded or cannot be identified accurately.
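As an illustration of the angle-and-count rule above — a hypothetical reconstruction, not code from the patent; the function name and the tolerance parameter are assumptions — targets No. 1 and No. 3 can be told apart from the angular separations of the detected solid beacons about the circle center:

```python
import math

def classify_target(beacon_xy, center_xy, tol_deg=15.0):
    """Distinguish target No. 1 from No. 3 by solid-beacon geometry.

    No. 1 carries two solid rectangles in a line (~180 deg apart as seen from
    the circle center); No. 3 carries four in a cross (~90 deg neighbours).
    A single detected beacon is taken as No. 1, per the redundancy rule.
    """
    angles = [math.degrees(math.atan2(y - center_xy[1], x - center_xy[0])) % 360
              for (x, y) in beacon_xy]
    if len(angles) == 1:
        return 1
    # Pairwise angular separations, folded into [0, 180]
    seps = []
    for i in range(len(angles)):
        for j in range(i + 1, len(angles)):
            d = abs(angles[i] - angles[j])
            seps.append(min(d, 360 - d))
    if len(angles) == 2 and abs(seps[0] - 180) <= tol_deg:
        return 1
    if any(abs(s - 90) <= tol_deg for s in seps):
        return 3
    return 0  # unknown: not one of these two beacon codes
```

The same pattern would extend to the hollow-rectangle beacons for targets No. 2 and No. 4.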
(7) The Hough transform is a parameter estimation technique based on the voting principle; it realizes a mapping from image space to parameter space and, because of its obvious advantages, has attracted the common concern of many scholars and engineers at home and abroad. The present invention improves the classical Hough-transform circle-finding algorithm by making full use of the geometric properties of the circle: the perpendicular bisectors of any two non-parallel chords of a circle intersect at the circle center, which determines the parameters.
First, three points K, L, M are chosen on circle C; KL and LM form two non-parallel chords, and the perpendicular bisectors l_KL and l_LM of KL and LM intersect at the circle center O, with KO the radius of circle C, as shown in Fig. 4.
Secondly, let the coordinates of the three points be K(x1, y1), L(x2, y2), M(x3, y3); then by the point-slope form, the equations of the two perpendicular bisectors l_KL and l_LM are:
l_KL: y = k_KL·x + b_KL (1)
l_LM: y = k_LM·x + b_LM (2)
where
k_KL = -(x2 - x1)/(y2 - y1), b_KL = (y1 + y2)/2 - k_KL·(x1 + x2)/2 (3)
k_LM = -(x3 - x2)/(y3 - y2), b_LM = (y2 + y3)/2 - k_LM·(x2 + x3)/2 (4)
From the equations of l_KL and l_LM, the center coordinates and radius can be calculated:
x0 = (b_LM - b_KL)/(k_KL - k_LM), y0 = k_KL·x0 + b_KL, R = sqrt((x1 - x0)² + (y1 - y0)²) (5)
One group of points K, L, M uniquely determines one set of circle parameters [x0, y0, R]; the quality of the voting result depends on how the point groups are chosen. Considering that the storage order of the boundary points is determined by the traversal order, the method of dynamic stride is used when selecting point groups, in order to improve the algorithm's robustness to noise and to differing arc lengths.
Again, let the total number of boundary points be N, denote each group of points by K_ij, L_ij, M_ij, and regard every Q groups of points as one cycle; subscript i denotes the cycle number (i ∈ [1, T]), j the group number within each cycle (j ∈ [1, Q]), and the value of the cycle count T determines the total number of point groups chosen. In the i-th cycle, the first group of points is determined by the initial points K_i, L_i, M_i, calculated as shown in formula (6), where k is a proportionality coefficient and i ∈ [1, T]; consecutive point groups differ by an identical stride value S_i, as shown in formula (7).
In processing, the number of boundary points is generally about 100; because this is relatively few, k = 1 is taken in formula (6), and for ease of debugging S_1 = S_2 = ... = S_T = N/8 is taken, with the group number Q of points in each cycle following the change of N, here Q = N. Experimental tests show that when the cycle count T is 5, the algorithm reaches fairly high precision while keeping reasonable operating efficiency. This way of choosing point groups effectively improves robustness to noise and to shorter arcs: the target position can still be fitted when the arc is short, as shown in Figs. 5(a) and 5(b). The height of the target produces a stereoscopic effect, so the target's image differs considerably when the camera shoots from different directions, yet the algorithm can still fit the target position accurately, as shown in Figs. 5(c) and 5(d).
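The three-point geometry of formulas (1)-(5) and strided sampling of point groups can be sketched as follows. This is a simplified illustration, not the patent's exact scheme: vertical-chord degeneracies are skipped rather than handled, the three points of a group are taken a third of the boundary apart, and the voting is reduced to a plain average:

```python
import math

def circle_from_three(p1, p2, p3, eps=1e-9):
    """Center and radius from three points, via intersecting perpendicular bisectors."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    if abs(y2 - y1) < eps or abs(y3 - y2) < eps:
        return None                         # a chord is horizontal-bisector degenerate: skip
    k_kl = -(x2 - x1) / (y2 - y1)           # slope of the bisector of chord KL
    b_kl = (y1 + y2) / 2 - k_kl * (x1 + x2) / 2
    k_lm = -(x3 - x2) / (y3 - y2)
    b_lm = (y2 + y3) / 2 - k_lm * (x2 + x3) / 2
    if abs(k_kl - k_lm) < eps:
        return None                         # parallel chords: bisectors never meet
    x0 = (b_lm - b_kl) / (k_kl - k_lm)
    y0 = k_kl * x0 + b_kl
    return x0, y0, math.hypot(x1 - x0, y1 - y0)

def fit_circle(boundary, stride=None):
    """Sample point groups with a fixed stride and average the resulting circles."""
    n = len(boundary)
    stride = stride or max(1, n // 8)       # the patent takes S_i = N/8
    fits = []
    for start in range(0, n, stride):
        g = circle_from_three(boundary[start],
                              boundary[(start + n // 3) % n],
                              boundary[(start + 2 * n // 3) % n])
        if g:
            fits.append(g)
    m = len(fits)
    return tuple(sum(f[d] for f in fits) / m for d in range(3))
```

Compared with the classical Hough accumulator over (x0, y0, R), each sampled group votes with a closed-form solution, which is where the computational saving claimed above comes from.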
(8) Target position solving. The solution process is the conversion of the target center's coordinates in the pixel coordinate system to coordinates in the image coordinate system, i.e. the conversion from pixel coordinates to physical coordinates. The conversion formula is:
(u, v, 1)^T = M · (x, y, 1)^T
where (u, v, 1)^T are the homogeneous coordinates of the target center in pixel coordinates, in pixels; (x, y, 1)^T are the homogeneous coordinates of the target center in image coordinates, in millimeters; the transition matrix M is the intrinsic matrix of the camera; and (u0, v0)^T are the coordinates of the camera's optical center in the pixel coordinate system.
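A sketch of this conversion under stated assumptions: the intrinsic matrix below maps millimetre image coordinates to pixels using pixel pitches dx, dy (mm per pixel) and the principal point (u0, v0); the numeric values in the usage example are illustrative, not the patent's calibration results. Recovering physical coordinates from pixel coordinates inverts the matrix:

```python
import numpy as np

def intrinsic_matrix(dx, dy, u0, v0):
    """M maps homogeneous image coords (x, y, 1) in mm to pixel coords (u, v, 1)."""
    return np.array([[1.0 / dx, 0.0,      u0],
                     [0.0,      1.0 / dy, v0],
                     [0.0,      0.0,      1.0]])

def pixel_to_physical(u, v, M):
    """Invert (u, v, 1)^T = M (x, y, 1)^T to recover the target centre in mm."""
    x, y, _ = np.linalg.inv(M) @ np.array([u, v, 1.0])
    return x, y
```

With the physical offset of the target centre from the optical axis known on the image plane, the ground-plane offset then follows from similar triangles in the pinhole model, using the camera height reported by the flight control system.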
(9) The solved target position is sent to the flight control system for flight decision-making.
To prove the effectiveness of the technical solution of the present invention, static experiments and onboard experiments are used for verification, as follows.
(1) Static experiment. To test the precision of visual detection and positioning, and to facilitate program debugging, a static test platform was built: the camera and self-stabilizing gimbal were mounted on an inverted-L-shaped, height-adjustable support, and the target to be detected was placed on a cart that can move only forward and backward; the cart's motion changes the target's offset relative to the camera in the X or Y direction.
For thoroughness of testing, comparison tests were done separately for the X direction, the Y direction and different camera heights, as shown in Figs. 6, 7 and 8. During system operation the camera height is between 140 cm and 240 cm. When testing X-direction precision, the target's offset in the Y direction was fixed arbitrarily, and several different test points with a spacing of 10 cm were taken in the X direction; the real offset of the target at each test point was recorded and compared with the positioning result. The test results show that the visual system's solution error in the X direction is ±1 cm; by the same method, the Y-direction solution error was measured as ±1.5 cm. To test the influence of changes in camera height on positioning precision, the target's offsets in the X and Y directions were fixed arbitrarily, the target offsets solved by the camera system at different heights were recorded and compared with the real offsets, and multiple different target positions were chosen for several groups of tests; the results show that the error brought to system precision by changes in camera height is within ±4 cm.
It can be seen that when the camera height is fixed, the visual system's error in the X and Y directions is small, basically within ±1.5 cm, with the error source probably being measurement error. The influence of different camera heights on system precision mainly comes from the slight rotation of the camera around the support when its height is adjusted manually, which introduces an experimental operation error of about 2-3 cm; with this operation error removed, the error brought to detection precision by different camera heights is basically within ±2 cm.
(2) Onboard experiment. To test the robustness, real-time performance and detection precision of the visual system, an onboard visual detection and multi-target positioning system was built on a rotor unmanned helicopter converted from an 800E-class model aircraft. Verified by a large number of experiments, the onboard visual detection and multi-target positioning system of the present invention can complete the differentiation and accurate positioning of multiple targets under variable weather conditions and has very strong robustness. The visual system forms a closed loop with the flight control system and can realize real-time tracking of both static and moving targets, showing that the onboard visual detection and multi-target positioning system of the present invention has good real-time performance and high detection precision.
The present invention, taking the TMS320DM6437 as the platform, builds an onboard visual detection and target positioning system; accurate detection of multiple targets was completed on both the static platform and the onboard platform, a large number of experiments were carried out with good results, and the reliability, robustness, real-time performance and high detection precision of the system were demonstrated.
The above is only the preferred embodiment of the present invention. It should be noted that, for a person of ordinary skill in the art, several improvements and modifications can also be made without departing from the technical principles of the present invention, and these improvements and modifications should also be regarded as falling within the protection scope of the present invention.
Claims (8)
1. An onboard visual detection and multi-target positioning system for a rotor-wing unmanned aerial vehicle, characterized in that it comprises an onboard subsystem and a ground monitoring subsystem; the onboard subsystem includes a video acquisition unit, an image processing unit and an image-transmission transmitting terminal, and images collected by the video acquisition unit are processed by the image processing unit and then sent through the transmitting terminal to the ground monitoring subsystem; the ground monitoring subsystem includes a ground station and an image-transmission receiving terminal connected to the ground station, and the receiving terminal communicates with the transmitting terminal.
2. The onboard visual detection and multi-target positioning system according to claim 1, characterized in that: the video acquisition unit includes a camera and a self-stabilizing gimbal, the camera is mounted on the gimbal, and the gimbal keeps the camera's optical axis pointing at the ground, perpendicular to it.
3. The onboard visual detection and multi-target positioning system according to claim 2, characterized in that: simplex communication is used between the image-transmission receiving terminal and the image-transmission transmitting terminal.
4. A method for implementing the rotor-wing UAV airborne visual detection and multi-target positioning system according to claim 1, characterized by comprising the following steps:
1) calibrate the camera with the checkerboard calibration method to obtain the camera's intrinsic parameters, then correct each original image to be identified that the camera captures, removing lens distortion;
2) convert the corrected original image from YCbCr format to HSV format;
3) assume every original image contains A-color circular regions whose color and size differ clearly from the surrounding scene; binarize the H, S and V channels of the image using fixed thresholds;
4) apply a morphological opening to the binarized image to remove isolated noise, and use seeded region growing to find all connected regions in the image; a connected region whose area matches the set threshold size is a region of interest, i.e. a potential target;
5) when the same image contains multiple identical targets, place auxiliary beacons to distinguish and lock onto the targets;
6) determine each target's center with an improved Hough-transform circle-finding algorithm;
7) solve the target position according to the camera imaging model.
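Steps 3) and 4) above — fixed-threshold binarization of the H, S, V channels followed by seeded region growing over the binary mask — can be sketched in plain Python as follows. This is a minimal illustration, not the patent's implementation: the function names, the 4-connected growth rule, and all threshold values are assumptions.

```python
from collections import deque

def binarize_hsv(hsv_image, h_rng, s_rng, v_rng):
    """Step 3 sketch: fixed thresholds on all three channels.
    A pixel is foreground (1) only when H, S and V each fall inside
    their fixed range; the ranges are illustrative, not from the patent."""
    return [[1 if (h_rng[0] <= h <= h_rng[1] and
                   s_rng[0] <= s <= s_rng[1] and
                   v_rng[0] <= v <= v_rng[1]) else 0
             for (h, s, v) in row]
            for row in hsv_image]

def connected_regions(mask, min_area, max_area):
    """Step 4 sketch: seeded region growing (4-connected flood fill).
    Every unvisited foreground pixel seeds a region; only regions whose
    area lies inside the threshold window are kept as potential targets."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                seen[r][c] = True
                queue, region = deque([(r, c)]), []
                while queue:
                    y, x = queue.popleft()
                    region.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                if min_area <= len(region) <= max_area:
                    regions.append(region)
    return regions
```

In a full pipeline the mask would come from an undistorted frame already converted from YCbCr to HSV, and a morphological opening would run before the region growing; both are omitted here for brevity.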
The implementation method of the rotor-wing UAV airborne visual detection and multi-target positioning system according to claim 4, characterized by further comprising, in step 4), a step of removing pseudo-targets from the regions of interest: exploiting the feature that a B-color region is nested inside the A-color region of a true target, perform step 3) on the candidate and judge the area of the B-color region; a candidate whose B-region area meets the set threshold size is a potential target, otherwise it is a pseudo-target.
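The nested-region check reduces to re-binarizing the candidate's region of interest for the B color and comparing the resulting area against the expected window. A minimal sketch, with the function name and all thresholds assumed for illustration:

```python
def is_real_target(b_mask_roi, min_area, max_area):
    """Pseudo-target check (illustrative): keep a candidate only if the
    area of the nested B-color region inside its ROI falls within the
    set threshold window; otherwise reject it as a pseudo-target."""
    area = sum(sum(row) for row in b_mask_roi)  # count foreground pixels
    return min_area <= area <= max_area
```

Here `b_mask_roi` is assumed to be the binary mask produced by running the step-3 thresholding for the B color over the candidate's bounding region.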
The implementation method of the rotor-wing UAV airborne visual detection and multi-target positioning system according to claim 5, characterized in that the process of placing auxiliary beacons among multiple identical targets in step 5) is: for multiple identical regions of interest, each consisting of a B-color circular region nested inside an A-color circular region, place C-color auxiliary beacons of differing shape and number inside the A-color region, so that the multiple identical targets can be told apart.
7. The implementation method of the rotor-wing UAV airborne visual detection and multi-target positioning system according to any one of claims 4 to 6, characterized in that step 6) specifically comprises:
61) choose three points K, L and M on circle C; KL and LM form two non-parallel chords, and the perpendicular bisectors l_KL and l_LM of KL and LM intersect at the circle center O, KO being the radius of circle C;
62) let the coordinates of K, L and M be K(x1, y1), L(x2, y2), M(x3, y3); by the point-slope form, the equations of the two perpendicular bisectors l_KL and l_LM are:

l_KL: y = k_KL·x + b_KL (1)
l_LM: y = k_LM·x + b_LM (2)

where k_KL = -(x2 - x1)/(y2 - y1) and b_KL = (y1 + y2)/2 - k_KL·(x1 + x2)/2, with k_LM and b_LM obtained likewise from L and M;
63) from the equations of l_KL and l_LM, the circle-center coordinates and radius follow:

x_O = (b_LM - b_KL)/(k_KL - k_LM), y_O = k_KL·x_O + b_KL, r = √((x1 - x_O)² + (y1 - y_O)²);

64) let the total number of boundary points be N; each group of points is denoted K_ij, L_ij, M_ij, and every Q groups of points are regarded as one cycle, the subscript i denoting the cycle number (i ∈ [1, T]) and j the group number within a cycle (j ∈ [1, Q]); the value of T determines the total number of point groups selected. In the i-th cycle, the first group of points is determined by the initial points K_i, L_i, M_i, as in formula (6), in which k is a proportionality coefficient, i ∈ [1, T] and k = 1.
Adjacent point groups differ by the same stride value S_i, as shown in formula (7), where S_1 = S_2 = … = S_T = N/8; within each cycle, the group number Q of the points follows N as Q = N.
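Steps 61)–63) amount to computing the circumcenter of three boundary points. The sketch below solves the intersection of the two perpendicular bisectors in the equivalent determinant form rather than through the point-slope equations, which avoids the special case of a vertical chord (where k_KL or k_LM is undefined); the function name is illustrative.

```python
import math

def circle_from_three_points(K, L, M):
    """Center O and radius of the circle through K, L, M (steps 61-63).
    The perpendicular bisectors of chords KL and LM are intersected in
    closed form; d == 0 means the three points are collinear, so the
    chords are parallel and no circle exists."""
    (x1, y1), (x2, y2), (x3, y3) = K, L, M
    d = 2.0 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    if abs(d) < 1e-12:
        raise ValueError("K, L, M are collinear; chords KL and LM are parallel")
    a, b, c = x1**2 + y1**2, x2**2 + y2**2, x3**2 + y3**2
    xo = (a * (y2 - y3) + b * (y3 - y1) + c * (y1 - y2)) / d
    yo = (a * (x3 - x2) + b * (x1 - x3) + c * (x2 - x1)) / d
    r = math.hypot(x1 - xo, y1 - yo)  # |KO| is the radius
    return (xo, yo), r
```

In the sampling scheme of step 64), each point group (K_ij, L_ij, M_ij) would feed this function and the resulting (O, r) candidates would be accumulated, with the dominant cluster taken as the detected circle.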
The implementation method of the rotor-wing UAV airborne visual detection and multi-target positioning system according to claim 7, characterized in that the value of T is 5.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610839767.2A CN106289186B (en) | 2016-09-21 | 2016-09-21 | The airborne visual detection of rotor wing unmanned aerial vehicle and multi-target positioning system and implementation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106289186A true CN106289186A (en) | 2017-01-04 |
CN106289186B CN106289186B (en) | 2019-04-19 |
Family
ID=57711506
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610839767.2A Active CN106289186B (en) | 2016-09-21 | 2016-09-21 | The airborne visual detection of rotor wing unmanned aerial vehicle and multi-target positioning system and implementation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106289186B (en) |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1043566A2 (en) * | 1999-04-08 | 2000-10-11 | Donnelly Corporation | Vehicle compass compensation |
CN103903253A (en) * | 2012-12-28 | 2014-07-02 | 联想(北京)有限公司 | Mobile terminal positioning method and system |
CN104776848A (en) * | 2015-04-20 | 2015-07-15 | 李智 | Space target identifying, positioning and tracking method |
US20150323322A1 (en) * | 2010-06-15 | 2015-11-12 | California Institute Of Technology | Automated Vessel Navigation Using Sea State Prediction |
CN105222760A (en) * | 2015-10-22 | 2016-01-06 | 一飞智控(天津)科技有限公司 | The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method |
CN206177293U (en) * | 2016-09-21 | 2017-05-17 | 南京航空航天大学 | Rotor unmanned aerial vehicle machine carries visual detection and many object positioning system |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107576329A (en) * | 2017-07-10 | 2018-01-12 | 西北工业大学 | Fixed-wing unmanned plane based on machine vision drop guiding cooperation beacon design method |
CN107576329B (en) * | 2017-07-10 | 2020-07-03 | 西北工业大学 | Fixed wing unmanned aerial vehicle landing guiding cooperative beacon design method based on machine vision |
CN109597424A (en) * | 2017-09-30 | 2019-04-09 | 南京理工大学 | Unmanned plane line walking control system based on video image processing |
CN107911429A (en) * | 2017-11-04 | 2018-04-13 | 南京奇蛙智能科技有限公司 | A kind of online traffic flow monitoring method in unmanned plane high in the clouds based on video |
CN108255553A (en) * | 2017-12-22 | 2018-07-06 | 西安思丹德信息技术有限公司 | A kind of graphic processing method and device of telemetry ground station human-computer interaction interface |
CN108255553B (en) * | 2017-12-22 | 2023-10-13 | 西安思丹德信息技术有限公司 | Graphic processing method and device for remote measurement ground station man-machine interaction interface |
CN108563236A (en) * | 2018-06-08 | 2018-09-21 | 清华大学 | It is a kind of that type unmanned plane target tracking is received based on concentric circles feature |
CN109141232A (en) * | 2018-08-07 | 2019-01-04 | 常州好迪机械有限公司 | A kind of circle plate casting online test method based on machine vision |
CN109540834A (en) * | 2018-12-13 | 2019-03-29 | 深圳市太赫兹科技创新研究院 | A kind of cable aging monitoring method and system |
CN109921841B (en) * | 2018-12-29 | 2021-06-25 | 顺丰科技有限公司 | Unmanned aerial vehicle communication method and system |
CN109921841A (en) * | 2018-12-29 | 2019-06-21 | 顺丰科技有限公司 | The unmanned plane means of communication and system |
CN110398233A (en) * | 2019-09-04 | 2019-11-01 | 浙江中光新能源科技有限公司 | A kind of heliostat field coordinate mapping system and method based on unmanned plane |
CN111476116A (en) * | 2020-03-24 | 2020-07-31 | 南京新一代人工智能研究院有限公司 | Rotor unmanned aerial vehicle system for vehicle detection and tracking and detection and tracking method |
CN111709994A (en) * | 2020-04-17 | 2020-09-25 | 南京理工大学 | Autonomous unmanned aerial vehicle visual detection and guidance system and method |
CN111709994B (en) * | 2020-04-17 | 2022-12-20 | 南京理工大学 | Autonomous unmanned aerial vehicle visual detection and guidance system and method |
CN111879313A (en) * | 2020-07-31 | 2020-11-03 | 中国人民解放军国防科技大学 | Multi-target continuous positioning method and system based on unmanned aerial vehicle image recognition |
Also Published As
Publication number | Publication date |
---|---|
CN106289186B (en) | 2019-04-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106289186A (en) | The airborne visual detection of rotor wing unmanned aerial vehicle and multi-target positioning system and implementation method | |
CN111145545B (en) | Road traffic behavior unmanned aerial vehicle monitoring system and method based on deep learning | |
Zhao et al. | Detection, tracking, and geolocation of moving vehicle from uav using monocular camera | |
CN107202982B (en) | A kind of beacon arrangement and image processing method based on UAV position and orientation calculating | |
CN103822635B (en) | The unmanned plane during flying spatial location real-time computing technique of view-based access control model information | |
CN104217439B (en) | Indoor visual positioning system and method | |
CN109212545A (en) | Multiple source target following measuring system and tracking based on active vision | |
CN107729808A (en) | A kind of image intelligent acquisition system and method for power transmission line unmanned machine inspection | |
CN105930819A (en) | System for real-time identifying urban traffic lights based on single eye vision and GPS integrated navigation system | |
Štěpán et al. | Vision techniques for on‐board detection, following, and mapping of moving targets | |
CN108762291A (en) | A kind of method and system finding and track black winged unmanned aerial vehicle remote controller | |
CN102436738A (en) | Traffic monitoring device based on unmanned aerial vehicle (UAV) | |
CN103942273A (en) | Dynamic monitoring system and method for aerial quick response | |
CN109739254A (en) | Using the unmanned plane and its localization method of visual pattern positioning in a kind of electric inspection process | |
CN109063532A (en) | A kind of field lost contact personnel's method for searching based on unmanned plane | |
CN111061266A (en) | Night on-duty robot for real-time scene analysis and space obstacle avoidance | |
CN111679695A (en) | Unmanned aerial vehicle cruising and tracking system and method based on deep learning technology | |
CN113286081B (en) | Target identification method, device, equipment and medium for airport panoramic video | |
CN206177293U (en) | Rotor unmanned aerial vehicle machine carries visual detection and many object positioning system | |
CN106908038B (en) | A kind of monitoring device and monitoring system based on fish eye lens video camera | |
CN109668567A (en) | Polarized light orientation method under multi-cloud condition of unmanned aerial vehicle | |
CN107065027A (en) | Detection system, method, device and the equipment of source of leaks | |
CN112947550A (en) | Illegal aircraft striking method based on visual servo and robot | |
Liu et al. | Dloam: Real-time and robust lidar slam system based on cnn in dynamic urban environments | |
US11816863B2 (en) | Method and device for assisting the driving of an aircraft moving on the ground |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |