CN106774402A - Method and device for positioning an unmanned aerial vehicle - Google Patents
Method and device for positioning an unmanned aerial vehicle
- Publication number
- CN106774402A CN106774402A CN201611240082.2A CN201611240082A CN106774402A CN 106774402 A CN106774402 A CN 106774402A CN 201611240082 A CN201611240082 A CN 201611240082A CN 106774402 A CN106774402 A CN 106774402A
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial vehicle
- image
- reference image
- return flight
- feature point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/102—Simultaneous control of position or course in three dimensions specially adapted for aircraft specially adapted for vertical take-off of aircraft
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
- G05D1/12—Target-seeking control
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Image Analysis (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
A method and device for positioning an unmanned aerial vehicle (UAV). The method includes: generating a reference image during the flight of the UAV; obtaining a current image captured at the current time; and determining the current position of the UAV according to the reference image and the current image. Because the reference image and the current image are both captured during the same flight, there is a correlation between them, and the current position of the UAV can therefore be determined from the two images. Compared with the fixed resolution of prior-art map data, this allows better dynamic matching during the return flight and reduces systematic error, thereby improving return-flight positioning accuracy.
Description
Technical field
The present invention relates to the field of unmanned aerial vehicle (UAV) control, and in particular to a method and device for positioning a UAV.
Background technology
UAVs are widely used in fields such as disaster relief and scientific investigation, and the flight control system is an important component of a UAV, playing a key role in making UAVs intelligent and practical. Typically, after a UAV has completed its mission at the waypoints under the control of the flight control system, it can automatically return along its original route.
To position the UAV during the automatic return flight, the prior art usually stores third-party map data in the UAV's flight system and then uses a positioning device, such as the Global Positioning System (GPS), to position and navigate the UAV. However, the resolution of the third-party map data depends on the UAV's height above the ground: in general, the higher the UAV flies, the lower the resolution. Because the flight altitude often changes during a mission, the resolution of ground targets can differ significantly from that of the map data, the matching accuracy is low, and the positioning accuracy during the return flight is therefore poor.
Therefore, how to improve positioning accuracy is a technical problem that urgently needs to be solved.
Summary of the invention
The technical problem to be solved by the present invention is how to improve positioning accuracy.
Therefore, according to a first aspect, an embodiment of the invention discloses a method for positioning a UAV, including: generating a reference image during the flight of the UAV; obtaining a current image captured at the current time; and determining the current position of the UAV according to the reference image and the current image.
Optionally, generating the reference image includes: capturing ground images during the flight of the UAV, and stitching the ground images to obtain the reference image.
Optionally, capturing ground images during the flight of the UAV includes: capturing ground images while the UAV flies from the start position to the return position.
Optionally, before obtaining the current image captured at the current time, the method disclosed in this embodiment further includes: determining that a return flight is to be made.
Optionally, before determining that a return flight is to be made, the method disclosed in this embodiment further includes: receiving an instruction sent by a controller that instructs the UAV to return.
Optionally, after the reference image is generated, the method disclosed in this embodiment further includes: determining the reverse track of the track along which the UAV flew from the start position to the return position.
Optionally, after determining the reverse track, the method disclosed in this embodiment further includes: flying from the return position to the start position along the reverse track.
Optionally, determining the current position of the UAV according to the reference image and the current image includes: matching the current image with the reference image to obtain the motion vector of the UAV at the current time relative to the reference image; and determining, according to the motion vector, the location information of the UAV at the current time relative to the reference image, where the location information includes at least one of: the position of the UAV, the height of the UAV, the attitude of the UAV, the orientation of the UAV, the speed of the UAV, and the heading of the UAV.
Optionally, matching the current image with the reference image to obtain the motion vector of the UAV at the current time relative to the reference image includes: performing scene matching between the current image and the reference image to obtain the motion vector of the UAV at the current time relative to the reference image.
Optionally, performing scene matching between the current image and the reference image to obtain the motion vector of the UAV at the current time relative to the reference image includes: selecting feature points of the reference image, the selected feature points serving as reference feature points; determining feature points in the current image that match the reference feature points, the matched feature points serving as current feature points; and matching the current feature points with the reference feature points to obtain the motion vector of the UAV at the current time relative to the reference image.
According to a second aspect, an embodiment of the invention discloses a device for positioning a UAV, including:
a reference module, configured to generate a reference image during the flight of the UAV; an acquisition module, configured to obtain the current image captured at the current time; and a positioning module, configured to determine the current position of the UAV according to the reference image generated by the reference module and the current image obtained by the acquisition module.
Optionally, the reference module includes: a sampling unit, configured to capture ground images during the flight of the UAV; and a stitching unit, configured to stitch the ground images captured by the sampling unit to obtain the reference image.
Optionally, the sampling unit is specifically configured to capture the ground images while the UAV flies from the start position to the return position.
Optionally, the device disclosed in this embodiment further includes a determining module, configured to determine that a return flight is to be made.
Optionally, the determining module is further configured to receive an instruction sent by a controller that instructs the UAV to return.
Optionally, the device disclosed in this embodiment further includes a track module, configured to determine, after the reference module generates the reference image, the reverse track of the track along which the UAV flew from the start position to the return position.
Optionally, the device disclosed in this embodiment further includes a return module, configured to fly from the return position to the start position along the reverse track determined by the track module.
Optionally, the positioning module includes: a matching unit, configured to match the current image with the reference image to obtain the motion vector of the UAV at the current time relative to the reference image; and a location information unit, configured to determine, according to the motion vector, the location information of the UAV at the current time relative to the reference image, where the location information includes at least one of: the position of the UAV, the height of the UAV, the attitude of the UAV, the orientation of the UAV, the speed of the UAV, and the heading of the UAV.
Optionally, the matching unit is specifically configured to perform scene matching between the current image and the reference image to obtain the motion vector of the UAV at the current time relative to the reference image.
Optionally, the matching unit includes: a selection subunit, configured to select feature points of the reference image, the selected feature points serving as reference feature points; a feature point determination subunit, configured to determine feature points in the current image that match the reference feature points, the matched feature points serving as current feature points; and a vector subunit, configured to match the current feature points with the reference feature points to obtain the motion vector of the UAV at the current time relative to the reference image.
The technical solution of the present invention has the following advantages:
In the method and device for positioning a UAV provided by the embodiments of the present invention, a reference image is generated during the flight of the UAV and reflects the latest state of the ground. The current image captured at the current time is then obtained. Because both the reference image and the current image are captured during the UAV's flight, there is a correlation between them, and the current position of the UAV can therefore be determined from the reference image and the current image. In the solution of the embodiments, the reference image is generated during the flight of the UAV and the current image is also captured during the flight, so the generated reference image can dynamically compensate for the resolution differences that arise during flight. Compared with the fixed resolution of prior-art map data, this allows better dynamic matching during the return flight and reduces systematic error, thereby improving return-flight positioning accuracy.
As an optional technical solution, after the reference image is generated, the reverse track of the track along which the UAV flew from the start position to the return position is determined, so that when flying from the return position to the start position, the UAV can fly directly along the reverse track. This reduces the amount of return-path planning and improves the efficiency of determining the flight path during the return. In addition, when the UAV encounters a loss of signal or a communication failure, flying from the return position to the start position along the reverse track still allows the UAV to return to the start position smoothly.
Furthermore, when flying from the start position to the return position, the UAV usually plans a good outbound track, for example one that avoids obstacles, so returning along the reverse of the outbound track lets the UAV return along a good track.
Brief description of the drawings
To describe the technical solutions of the specific embodiments of the present invention or of the prior art more clearly, the accompanying drawings required for describing the specific embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings described below show some embodiments of the present invention, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for positioning a UAV according to an embodiment of the present invention;
Fig. 2 is a flowchart of a method for obtaining the motion vector of a UAV by scene matching according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a device for positioning a UAV according to an embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a system for positioning a UAV according to an embodiment of the present invention.
Detailed description of the embodiments
The technical solutions of the present invention are described clearly and completely below with reference to the accompanying drawings. Apparently, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
In the description of the present invention, it should be noted that orientation or position terms such as "center", "upper", "lower", "left", "right", "vertical", "horizontal", "inner", and "outer" are based on the orientations or positions shown in the drawings. They are used only to facilitate and simplify the description of the present invention, and do not indicate or imply that the referred devices or elements must have a specific orientation or be constructed and operated in a specific orientation; therefore they should not be construed as limiting the present invention. In addition, the terms "first", "second", and "third" are used only for descriptive purposes and should not be understood as indicating or implying relative importance or order.
In the description of the present invention, it should also be noted that, unless otherwise explicitly specified and limited, the terms "installation", "connected", and "connection" should be understood broadly. For example, a connection may be fixed, detachable, or integral; it may be mechanical or electrical; it may be direct, or indirect through an intermediary; it may be internal to two elements; and it may be wireless or wired. A person of ordinary skill in the art can understand the specific meanings of these terms in the present invention according to the specific circumstances.
In addition, the technical features involved in the different embodiments of the present invention described below may be combined with each other as long as they do not conflict.
To improve the positioning accuracy of a UAV in flight, this embodiment discloses a method for positioning a UAV. Referring to Fig. 1, which is a flowchart of the method, the method for positioning a UAV includes:
Step S101: generate a reference image during the flight of the UAV.
In this embodiment, ground images can be captured after the UAV takes off from the start position; the ground images captured while the UAV moves toward the waypoints are stitched, and the stitched result is used as the reference image. A ground image is an image captured by the UAV from a downward-looking viewing angle during flight, the angle between the downward viewing direction and the vertical direction being less than 90 degrees. Preferably, the viewing direction can point straight down, in which case the angle between the viewing direction and the vertical direction is 0 degrees.
The UAV can store the generated reference image for subsequent use. Optionally, after generating the reference image, the UAV can also send it to other UAVs so that they can use the reference image as well. It should be noted that, for the same UAV, because the UAV's hardware parameters do not change during flight, the reference image generated by the UAV itself can characterize the UAV's outbound track. The outbound track is the flight path along which the UAV flies from the start position to the waypoint positions.
Step S102: obtain the current image captured at the current time.
After the reference image has been generated, the current image captured at the current time can be obtained during the flight of the UAV.
Step S103: determine the current position of the UAV according to the reference image and the current image.
In this embodiment, once the reference image and the current image have been obtained, the two can be compared to obtain the difference between them; the motion vector of the UAV can be estimated from this difference, and the current position of the UAV can then be determined.
In an optional embodiment, in step S101, generating the reference image can include: capturing ground images during the flight of the UAV, and stitching the ground images to obtain the reference image. In a specific embodiment, the ground images can be captured at preset intervals, where a preset interval can be a preset time interval determined from prior knowledge, or a preset distance interval along the flight path. The preset intervals can be equal or unequal.
During stitching, a whole-image model or a segmented model can be used. Specifically, adjacent frames usually have an overlapping region, and the two overlapping frames can be combined into a single large seamless image. For two adjacent overlapping frames, it is also possible to discard the overlapping region of one frame directly, splice the non-overlapping part onto the other frame, and blend the seam region to obtain the stitched image.
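As an illustrative sketch only (the patent does not prescribe a stitching implementation), the snippet below shows how such a reference mosaic could be prototyped in Python with OpenCV's high-level stitcher; the frame file names and the choice of the SCANS mode are assumptions made for the example.

```python
# Hypothetical sketch: build a reference mosaic from downward-looking frames.
# Requires OpenCV; the frame file names below are placeholders.
import cv2

frame_files = ["frame_000.jpg", "frame_001.jpg", "frame_002.jpg"]  # assumed inputs
frames = [cv2.imread(f) for f in frame_files]

# SCANS mode suits roughly planar, nadir imagery captured at preset intervals.
stitcher = cv2.Stitcher_create(cv2.Stitcher_SCANS)
status, reference_image = stitcher.stitch(frames)

if status == cv2.Stitcher_OK:
    cv2.imwrite("reference_mosaic.jpg", reference_image)
else:
    print("Stitching failed with status", status)
```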
In the embodiment of the present invention, the ground images are captured while the UAV flies from the start position to the return position. The start position is the position where the UAV starts to take off; the return position is the position from which the UAV starts to return toward the start position after taking off. Usually the return position is the waypoint the UAV flies to; however, in a specific implementation, the return position can also be the position where the UAV receives a return instruction while flying toward a waypoint, or the position where the UAV decides that it needs to return because of special circumstances encountered on the way to a waypoint, for example insufficient battery power, loss of the GPS signal, or a UAV fault, in which case the flight control system in the UAV decides to return.
Optionally, before step S102 is performed, the method can further include:
Step S104: determine that a return flight is to be made.
In one implementation, the UAV actively decides to return, for example when it encounters special circumstances during flight that require a return, such as insufficient battery power, loss of the GPS signal, or a UAV fault, in which case the flight control system in the UAV decides to return. The UAV can also actively decide to return after it has completed its mission at the mission waypoint.
In another implementation, the UAV is commanded to return by a controller. Specifically, the UAV receives an instruction sent by the controller that instructs it to return, and after receiving the instruction, the UAV determines that a return flight is to be made. In this embodiment, the controller can be a dedicated remote controller of the UAV, or a terminal capable of remotely controlling the UAV, such as a mobile terminal, a computer, or a notebook.
To provide a return-track reference for the UAV during the return flight, optionally, after step S101 is performed, the method further includes:
Step S105: determine the reverse track of the track along which the UAV flew from the start position to the return position.
In this embodiment, for a given UAV, because none of its physical parameters change during flight, the flight path of the UAV can be determined from the attributes of the stitched images captured while it flies from the start position to the return position. In this embodiment, flying from the return position back to the start position along the outbound track forms the reverse of the track along which the UAV flew from the start position to the return position. When the UAV returns, it can perform the return operation along this reverse track.
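Purely as an illustration of returning along the reverse track (the data structures below are assumptions, not part of the patent), the outbound track can be recorded as an ordered list and replayed in reverse during the return flight:

```python
# Hypothetical sketch: replay the recorded outbound track in reverse order.
from dataclasses import dataclass
from typing import List

@dataclass
class Waypoint:           # assumed structure for a recorded track point
    lat: float
    lon: float
    alt: float            # metres above the take-off point

def reverse_track(outbound: List[Waypoint]) -> List[Waypoint]:
    """The return flight visits the outbound points in reverse order,
    starting from the return position and ending at the start position."""
    return list(reversed(outbound))

outbound_track = [Waypoint(30.000, 120.000, 0.0), Waypoint(30.001, 120.001, 50.0)]
return_track = reverse_track(outbound_track)
```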
Optionally, step S103 can specifically include: matching the current image with the reference image to obtain the motion vector of the UAV at the current time relative to the reference image; and determining, according to the motion vector, the location information of the UAV at the current time relative to the reference image.
In this embodiment, the location information includes at least one of: the position of the UAV, the height of the UAV, the attitude of the UAV, the orientation of the UAV, the speed of the UAV, and the heading of the UAV. The orientation of the UAV refers to the relative angle between the current image captured at the current time and the reference image, and the heading of the UAV refers to the actual flight direction of the UAV. When the current image is matched with the reference image, because the flight path of the return process is the reverse of the outbound track, the matching yields the motion vector of the UAV at the current time relative to the reference image. From this motion vector, information such as the position, height, attitude, and orientation of the UAV within the reference image at the current time can be obtained, and the position of the UAV at the current time can therefore be determined.
In a specific embodiment, when the current image and the reference image are matched to obtain the motion vector of the UAV at the current time relative to the reference image, scene matching can be performed between the current image and the reference image to obtain the motion vector. Specifically, refer to Fig. 2. The method shown in Fig. 2 includes:
Step S201: select feature points of the reference image, the selected feature points serving as reference feature points.
Easily identifiable points or structures, such as edge points of richly textured objects or buildings, can be selected as reference feature points. After the reference feature points are selected, they can be described mathematically, for example with a histogram of gradients or a local random binary feature descriptor.
Step S202: determine feature points in the current image that match the reference feature points, the matched feature points serving as current feature points.
In a specific embodiment, pixels in the current image can be described with the same mathematical descriptor, and the current feature points that match the reference feature points can then be determined in the current image.
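As an illustrative sketch of this kind of descriptor-based matching (the patent mentions histogram-of-gradients and local random binary features but does not mandate a specific detector), the snippet below uses ORB, a binary descriptor, with brute-force Hamming matching in OpenCV; the input image files are assumptions.

```python
# Hypothetical sketch: detect feature points in the reference image and find
# matching feature points in the current image using a binary descriptor.
import cv2

reference_image = cv2.imread("reference_mosaic.jpg", cv2.IMREAD_GRAYSCALE)  # assumed
current_image = cv2.imread("current_frame.jpg", cv2.IMREAD_GRAYSCALE)       # assumed

orb = cv2.ORB_create(nfeatures=1000)
ref_kp, ref_desc = orb.detectAndCompute(reference_image, None)  # reference feature points
cur_kp, cur_desc = orb.detectAndCompute(current_image, None)    # candidate current feature points

# Brute-force matching with Hamming distance suits binary descriptors.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(ref_desc, cur_desc), key=lambda m: m.distance)

# Matched coordinate pairs: (reference feature point, current feature point).
pairs = [(ref_kp[m.queryIdx].pt, cur_kp[m.trainIdx].pt) for m in matches[:200]]
```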
Step S203: match the current feature points with the reference feature points to obtain the motion vector of the UAV at the current time relative to the reference image.
The current feature points can be matched with the reference feature points through an affine transformation model or a projective transformation model. The affine transformation model and the projective transformation model are described below.
(1) The affine transformation model can be established as a system of equations. Specifically, the transformation model established by the system of equations is as follows:

x' = a·x + b·y + m
y' = c·x + d·y + n

where (x, y) is the coordinate of a reference feature point in the reference image, (x', y') is the coordinate of the feature point in the current image that matches that reference feature point, and a, b, c, d, m and n are the affine transformation parameters. In this embodiment, when the matched feature points comprise three non-collinear pairs, the complete set of affine transformation parameters can be computed; when there are more than three pairs of matched feature points, more accurate affine transformation parameters can be solved for by least squares.
In a specific embodiment, the affine transformation model can also be established in matrix form. Specifically, the transformation model established by the matrix is as follows:

(x', y', 1)^T = A · (x, y, 1)^T, with A = [[a1, a2, a0], [b1, b2, b0], [0, 0, 1]]

where (x, y) is the coordinate of a reference feature point in the reference image, (x', y') is the coordinate of the feature point in the current image that matches that reference feature point, and a0, a1, a2, b0, b1 and b2 are the affine transformation parameters. In this embodiment, when the matched feature points comprise three non-collinear pairs, the complete set of affine transformation parameters can be computed; when there are more than three pairs of matched feature points, more accurate affine transformation parameters can be solved for by least squares.
The affine transformation parameters computed from the affine transformation model can be used to indicate the motion vector of the UAV.
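As an illustrative sketch (not the patent's implementation), the six affine parameters can be estimated from the matched point pairs by a robust least-squares fit, for example with OpenCV's estimateAffine2D; the `pairs` variable is the assumed output of the matching sketch above.

```python
# Hypothetical sketch: estimate the 2x3 affine transformation from matched pairs.
import numpy as np
import cv2

ref_pts = np.float32([p[0] for p in pairs])  # (x, y) in the reference image
cur_pts = np.float32([p[1] for p in pairs])  # (x', y') in the current image

# Returns [[a1, a2, a0], [b1, b2, b0]]; RANSAC discards mismatched pairs before
# the least-squares refinement.
affine, inliers = cv2.estimateAffine2D(ref_pts, cur_pts, method=cv2.RANSAC)

# The translation column (a0, b0) approximates the image-plane motion vector.
motion_vector = affine[:, 2]
```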
(2) The projective transformation model can likewise be established as a system of equations. Specifically, the transformation model is established by the following equation:

(w'x', w'y', w') = (wx, wy, w) · A

where (x, y) is the coordinate of a reference feature point in the reference image, (x', y') is the coordinate of the feature point in the current image that matches that reference feature point, (wx, wy, w) and (w'x', w'y', w') are the homogeneous coordinates of (x, y) and (x', y') respectively, and A is the projective transformation matrix

A = [[a11, a12, a13], [a21, a22, a23], [a31, a32, a33]].

In a specific embodiment, the transformation matrix A can be split into four parts: the 2×2 block [[a11, a12], [a21, a22]] represents the linear transformation, [a31, a32] is used for translation, [a13, a23]^T produces the projective transformation, and a33 = 1.
The projective transformation matrix computed from the projective transformation model can be used to indicate the motion vector of the UAV.
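Again as an illustrative sketch, the 3×3 projective matrix (a homography) can be estimated robustly from the same matched pairs with OpenCV's findHomography; note that OpenCV uses the column-vector convention, so its matrix is the transpose of the row-vector form above, and `pairs` is again the assumed output of the matching sketch.

```python
# Hypothetical sketch: estimate the projective (homography) matrix and use it
# to map a reference-image point into the current image.
import numpy as np
import cv2

ref_pts = np.float32([p[0] for p in pairs])
cur_pts = np.float32([p[1] for p in pairs])

H, mask = cv2.findHomography(ref_pts, cur_pts, cv2.RANSAC, 3.0)
H = H / H[2, 2]  # normalize so the bottom-right element equals 1

x, y = ref_pts[0]
wx, wy, w = H @ np.array([x, y, 1.0])
projected_point = (wx / w, wy / w)  # location of the reference point in the current image
```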
This embodiment also discloses a device for positioning a UAV. Referring to Fig. 3, the device includes a reference module 301, an acquisition module 302 and a positioning module 303, where:
the reference module 301 is configured to generate a reference image during the flight of the UAV; the acquisition module 302 is configured to obtain the current image captured at the current time; and the positioning module 303 is configured to determine the current position of the UAV according to the reference image generated by the reference module 301 and the current image obtained by the acquisition module 302.
In an optional embodiment, the reference module 301 includes: a sampling unit, configured to capture ground images during the flight of the UAV; and a stitching unit, configured to stitch the ground images captured by the sampling unit to obtain the reference image.
In an optional embodiment, the sampling unit is specifically configured to capture the ground images while the UAV flies from the start position to the return position.
In an optional embodiment, the device further includes a determining module, configured to determine that a return flight is to be made.
In an optional embodiment, the determining module is further configured to receive an instruction sent by a controller that instructs the UAV to return.
In an optional embodiment, the device further includes a track module, configured to determine, after the reference module 301 generates the reference image, the reverse track of the track along which the UAV flew from the start position to the return position.
In an optional embodiment, the device further includes a return module, configured to fly from the return position to the start position along the reverse track determined by the track module.
In an optional embodiment, the positioning module includes: a matching unit, configured to match the current image with the reference image to obtain the motion vector of the UAV at the current time relative to the reference image; and a location information unit, configured to determine, according to the motion vector, the location information of the UAV at the current time relative to the reference image, where the location information includes at least one of: the position of the UAV, the height of the UAV, the attitude of the UAV, the orientation of the UAV, the speed of the UAV, and the heading of the UAV.
In an optional embodiment, the matching unit is further configured to perform scene matching between the current image and the reference image to obtain the motion vector of the UAV at the current time relative to the reference image.
In an optional embodiment, the matching unit includes: a selection subunit, configured to select feature points of the reference image, the selected feature points serving as reference feature points; a feature point determination subunit, configured to determine feature points in the current image that match the reference feature points, the matched feature points serving as current feature points; and a vector subunit, configured to match the current feature points with the reference feature points to obtain the motion vector of the UAV at the current time relative to the reference image.
This embodiment also discloses a UAV; refer to Fig. 4. The UAV includes a fuselage 401, an image acquisition device 402 and a processor (not shown), where:
the fuselage 401 is used to carry the components of the UAV, such as the battery, the motor, and the camera;
the image acquisition device 402 is mounted on the fuselage 401 and is used to capture images.
It should be noted that, in this embodiment, the image acquisition device 402 can be a video camera. Optionally, the image acquisition device 402 can be used for panoramic shooting. For example, the image acquisition device 402 can include a multi-lens camera, a panoramic camera, or both at the same time, so as to capture images or video from multiple angles.
The processor is configured to perform the method disclosed in the embodiment shown in Fig. 1.
In the method and device for positioning a UAV provided by this embodiment, a reference image is generated during the flight of the UAV and reflects the latest state of the ground. The current image captured at the current time is then obtained. Because both the reference image and the current image are captured during the UAV's flight, there is a correlation between them, and the current position of the UAV can be determined from the reference image and the current image. In the solution of the embodiment of the present invention, the reference image is generated during the flight of the UAV and the current image is also captured during the flight, so the generated reference image can dynamically compensate for the resolution differences that arise during flight. Compared with the fixed resolution of the prior art, this allows better dynamic matching during the return flight and reduces systematic error, thereby improving return-flight positioning accuracy.
In an optional embodiment, after the reference image is generated, the reverse track of the track along which the UAV flew from the start position to the return position is determined, so that when flying from the return position to the start position the UAV can fly directly along the reverse track. This reduces the amount of return-path planning and improves the efficiency of determining the flight path during the return. In addition, when the UAV encounters a loss of signal or a communication failure, flying from the return position to the start position along the reverse track still allows the UAV to return to the start position smoothly.
Furthermore, when flying from the start position to the return position, the UAV usually plans a good outbound track, for example one that avoids obstacles, so returning along the reverse of the outbound track lets the UAV return along a good track.
Those skilled in the art should understand that the embodiments of the present invention may be provided as a method, a system, or a computer program product. Therefore, the present invention may take the form of a hardware-only embodiment, a software-only embodiment, or an embodiment combining software and hardware. Moreover, the present invention may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) that contain computer-usable program code.
The present invention is described with reference to flowcharts and/or block diagrams of the method, the device (system) and the computer program product according to the embodiments of the present invention. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks in the flowcharts and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor or another programmable data processing device to produce a machine, so that the instructions executed by the processor of the computer or other programmable data processing device produce an apparatus for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or another programmable data processing device to work in a particular manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction apparatus that implements the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or another programmable data processing device, so that a series of operation steps is performed on the computer or other programmable device to produce computer-implemented processing, and the instructions executed on the computer or other programmable device provide steps for implementing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Obviously, the above embodiments are merely examples given for clarity of description and are not intended to limit the implementations. A person of ordinary skill in the art can make other changes or modifications in different forms on the basis of the above description. It is neither necessary nor possible to exhaustively list all implementations here, and obvious changes or modifications derived therefrom still fall within the protection scope of the present invention.
Claims (20)
1. A method for positioning an unmanned aerial vehicle (UAV), characterized by comprising:
generating a reference image during the flight of the UAV;
obtaining a current image captured at the current time; and
determining the current position of the UAV according to the reference image and the current image.
2. The method according to claim 1, characterized in that generating the reference image comprises:
capturing ground images during the flight of the UAV; and
stitching the ground images to obtain the reference image.
3. The method according to claim 2, characterized in that capturing the ground images during the flight of the UAV comprises:
capturing the ground images while the UAV flies from a start position to a return position.
4. The method according to any one of claims 1 to 3, characterized in that, before obtaining the current image captured at the current time, the method further comprises:
determining that a return flight is to be made.
5. The method according to claim 4, characterized in that, before determining that a return flight is to be made, the method further comprises:
receiving an instruction sent by a controller that instructs the UAV to return.
6. The method according to any one of claims 1 to 5, characterized in that, after the reference image is generated, the method further comprises:
determining the reverse track of the track along which the UAV flew from a start position to a return position.
7. The method according to claim 6, characterized in that, after determining the reverse track, the method further comprises:
flying from the return position to the start position along the reverse track.
8. The method according to any one of claims 1 to 7, characterized in that determining the current position of the UAV according to the reference image and the current image comprises:
matching the current image with the reference image to obtain a motion vector of the UAV at the current time relative to the reference image; and
determining, according to the motion vector, location information of the UAV at the current time relative to the reference image;
wherein the location information comprises at least one of:
the position of the UAV, the height of the UAV, the attitude of the UAV, the orientation of the UAV, the speed of the UAV, and the heading of the UAV.
9. The method according to claim 8, characterized in that matching the current image with the reference image to obtain the motion vector of the UAV at the current time relative to the reference image comprises:
performing scene matching between the current image and the reference image to obtain the motion vector of the UAV at the current time relative to the reference image.
10. The method according to claim 9, characterized in that performing scene matching between the current image and the reference image to obtain the motion vector of the UAV at the current time relative to the reference image comprises:
selecting feature points of the reference image, wherein the selected feature points serve as reference feature points;
determining feature points in the current image that match the reference feature points, wherein the matched feature points serve as current feature points; and
matching the current feature points with the reference feature points to obtain the motion vector of the UAV at the current time relative to the reference image.
11. A device for positioning an unmanned aerial vehicle (UAV), characterized by comprising:
a reference module, configured to generate a reference image during the flight of the UAV;
an acquisition module, configured to obtain a current image captured at the current time; and
a positioning module, configured to determine the current position of the UAV according to the reference image generated by the reference module and the current image obtained by the acquisition module.
12. The device according to claim 11, characterized in that the reference module comprises:
a sampling unit, configured to capture ground images during the flight of the UAV; and
a stitching unit, configured to stitch the ground images captured by the sampling unit to obtain the reference image.
13. The device according to claim 12, characterized in that the sampling unit is specifically configured to capture the ground images while the UAV flies from a start position to a return position.
14. The device according to any one of claims 11 to 13, characterized by further comprising:
a determining module, configured to determine that a return flight is to be made.
15. The device according to claim 14, characterized in that the determining module is further configured to receive an instruction sent by a controller that instructs the UAV to return.
16. The device according to any one of claims 11 to 15, characterized by further comprising:
a track module, configured to determine, after the reference module generates the reference image, the reverse track of the track along which the UAV flew from a start position to a return position.
17. The device according to claim 16, characterized by further comprising:
a return module, configured to fly from the return position to the start position along the reverse track determined by the track module.
18. The device according to any one of claims 11 to 17, characterized in that the positioning module comprises:
a matching unit, configured to match the current image with the reference image to obtain a motion vector of the UAV at the current time relative to the reference image; and
a location information unit, configured to determine, according to the motion vector, location information of the UAV at the current time relative to the reference image;
wherein the location information comprises at least one of:
the position of the UAV, the height of the UAV, the attitude of the UAV, the orientation of the UAV, the speed of the UAV, and the heading of the UAV.
19. The device according to claim 18, characterized in that the matching unit is specifically configured to perform scene matching between the current image and the reference image to obtain the motion vector of the UAV at the current time relative to the reference image.
20. The device according to claim 19, characterized in that the matching unit comprises:
a selection subunit, configured to select feature points of the reference image, wherein the selected feature points serve as reference feature points;
a feature point determination subunit, configured to determine feature points in the current image that match the reference feature points, wherein the matched feature points serve as current feature points; and
a vector subunit, configured to match the current feature points with the reference feature points to obtain the motion vector of the UAV at the current time relative to the reference image.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611240082.2A CN106774402A (en) | 2016-12-28 | 2016-12-28 | Method and device for positioning an unmanned aerial vehicle |
PCT/CN2017/072477 WO2018120350A1 (en) | 2016-12-28 | 2017-01-24 | Method and device for positioning unmanned aerial vehicle |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611240082.2A CN106774402A (en) | 2016-12-28 | 2016-12-28 | Method and device for positioning an unmanned aerial vehicle |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106774402A true CN106774402A (en) | 2017-05-31 |
Family
ID=58923493
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611240082.2A Withdrawn CN106774402A (en) | 2016-12-28 | 2016-12-28 | The method and device positioned to unmanned plane |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN106774402A (en) |
WO (1) | WO2018120350A1 (en) |
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108700892A (en) * | 2017-09-27 | 2018-10-23 | 深圳市大疆创新科技有限公司 | A kind of path method of adjustment and unmanned plane |
CN108917768A (en) * | 2018-07-04 | 2018-11-30 | 上海应用技术大学 | Unmanned plane positioning navigation method and system |
WO2019006772A1 (en) * | 2017-07-06 | 2019-01-10 | 杨顺伟 | Return flight method and device for unmanned aerial vehicle |
CN109214984A (en) * | 2017-07-03 | 2019-01-15 | 北京臻迪科技股份有限公司 | A kind of image acquiring method and device, calculate equipment at automatic positioning navigation system |
CN110243357A (en) * | 2018-03-07 | 2019-09-17 | 杭州海康机器人技术有限公司 | A kind of unmanned plane localization method, device, unmanned plane and storage medium |
CN111492326A (en) * | 2017-12-21 | 2020-08-04 | Wing航空有限责任公司 | Image-based positioning for unmanned aerial vehicles and related systems and methods |
CN111722179A (en) * | 2020-06-29 | 2020-09-29 | 河南天安润信信息技术有限公司 | Multipoint-distributed unmanned aerial vehicle signal direction finding method |
WO2021056144A1 (en) * | 2019-09-23 | 2021-04-01 | 深圳市大疆创新科技有限公司 | Method and apparatus for controlling return of movable platform, and movable platform |
TWI829005B (en) * | 2021-08-12 | 2024-01-11 | 國立政治大學 | High-altitude positioning center setting method and high-altitude positioning flight control method |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US12008910B2 (en) * | 2017-08-04 | 2024-06-11 | ideaForge Technology Pvt. Ltd | UAV system emergency path planning on communication failure |
CN113361552B (en) * | 2020-03-05 | 2024-02-20 | 西安远智电子科技有限公司 | Positioning method and device |
CN112712558A (en) * | 2020-12-25 | 2021-04-27 | 北京三快在线科技有限公司 | Positioning method and device of unmanned equipment |
CN114930194A (en) * | 2020-12-28 | 2022-08-19 | 深圳市大疆创新科技有限公司 | Method, device and equipment for determining position of movable platform |
CN114348264B (en) * | 2022-01-29 | 2022-08-02 | 国家海洋环境预报中心 | Unmanned aerial vehicle search and rescue method and system based on marine environment |
- 2016-12-28: CN CN201611240082.2A patent/CN106774402A/en not_active Withdrawn
- 2017-01-24: WO PCT/CN2017/072477 patent/WO2018120350A1/en not_active Application Discontinuation
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101046387A (en) * | 2006-08-07 | 2007-10-03 | 南京航空航天大学 | Scene matching method for raising navigation precision and simulating combined navigation system |
CN103411609A (en) * | 2013-07-18 | 2013-11-27 | 北京航天自动控制研究所 | Online composition based aircraft return route programming method |
CN104932515A (en) * | 2015-04-24 | 2015-09-23 | 深圳市大疆创新科技有限公司 | Automatic cruising method and cruising device |
CN104807456A (en) * | 2015-04-29 | 2015-07-29 | 深圳市保千里电子有限公司 | Method for automatic return flight without GPS (global positioning system) signal |
CN105487555A (en) * | 2016-01-14 | 2016-04-13 | 浙江大华技术股份有限公司 | Hovering positioning method and hovering positioning device of unmanned aerial vehicle |
CN106204443A (en) * | 2016-07-01 | 2016-12-07 | 成都通甲优博科技有限责任公司 | A kind of panorama UAS based on the multiplexing of many mesh |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109214984A (en) * | 2017-07-03 | 2019-01-15 | 北京臻迪科技股份有限公司 | A kind of image acquiring method and device, calculate equipment at automatic positioning navigation system |
CN109214984B (en) * | 2017-07-03 | 2023-03-14 | 臻迪科技股份有限公司 | Image acquisition method and device, autonomous positioning navigation system and computing equipment |
WO2019006772A1 (en) * | 2017-07-06 | 2019-01-10 | 杨顺伟 | Return flight method and device for unmanned aerial vehicle |
CN108700892A (en) * | 2017-09-27 | 2018-10-23 | 深圳市大疆创新科技有限公司 | A kind of path method of adjustment and unmanned plane |
CN111492326B (en) * | 2017-12-21 | 2024-04-19 | Wing航空有限责任公司 | Image-based positioning for unmanned aerial vehicles and related systems and methods |
CN111492326A (en) * | 2017-12-21 | 2020-08-04 | Wing航空有限责任公司 | Image-based positioning for unmanned aerial vehicles and related systems and methods |
CN110243357B (en) * | 2018-03-07 | 2021-09-10 | 杭州海康机器人技术有限公司 | Unmanned aerial vehicle positioning method and device, unmanned aerial vehicle and storage medium |
CN110243357A (en) * | 2018-03-07 | 2019-09-17 | 杭州海康机器人技术有限公司 | A kind of unmanned plane localization method, device, unmanned plane and storage medium |
CN108917768B (en) * | 2018-07-04 | 2022-03-01 | 上海应用技术大学 | Unmanned aerial vehicle positioning navigation method and system |
CN108917768A (en) * | 2018-07-04 | 2018-11-30 | 上海应用技术大学 | Unmanned plane positioning navigation method and system |
WO2021056144A1 (en) * | 2019-09-23 | 2021-04-01 | 深圳市大疆创新科技有限公司 | Method and apparatus for controlling return of movable platform, and movable platform |
CN111722179A (en) * | 2020-06-29 | 2020-09-29 | 河南天安润信信息技术有限公司 | Multipoint-distributed unmanned aerial vehicle signal direction finding method |
TWI829005B (en) * | 2021-08-12 | 2024-01-11 | 國立政治大學 | High-altitude positioning center setting method and high-altitude positioning flight control method |
Also Published As
Publication number | Publication date |
---|---|
WO2018120350A1 (en) | 2018-07-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106774402A (en) | Method and device for positioning an unmanned aerial vehicle | |
CN106643664A (en) | Method and device for positioning unmanned aerial vehicle | |
CN105627991B (en) | A kind of unmanned plane image real time panoramic joining method and system | |
AU2003244321B2 (en) | Picked-up image display method | |
AU2008322565B2 (en) | Method and apparatus of taking aerial surveys | |
CN110186433B (en) | A kind of airborne survey method and device for rejecting extra aerophotograph | |
CN109765927A (en) | A kind of unmanned plane aerial photography flight remote control system based on APP | |
CN108702444A (en) | A kind of image processing method, unmanned plane and system | |
CN110989658B (en) | High-voltage transmission line crossing inclined photographic point cloud acquisition method | |
CN110717861B (en) | Image splicing method and device, electronic equipment and computer readable storage medium | |
CN111578904B (en) | Unmanned aerial vehicle aerial surveying method and system based on equidistant spirals | |
US12067887B2 (en) | Method and system for generating aerial imaging flight path | |
CN111091622B (en) | Unmanned aerial vehicle inspection route construction method | |
CN109669474A (en) | The adaptive hovering position optimization algorithm of multi-rotor unmanned aerial vehicle based on priori knowledge | |
CN107941167B (en) | Space scanning system based on unmanned aerial vehicle carrier and structured light scanning technology and working method thereof | |
CN113875222A (en) | Shooting control method and device, unmanned aerial vehicle and computer readable storage medium | |
CN113034347A (en) | Oblique photographic image processing method, device, processing equipment and storage medium | |
CN112985398A (en) | Target positioning method and system | |
CN114564049A (en) | Unmanned aerial vehicle wide area search device and method based on deep learning | |
CN113077500A (en) | Panoramic viewpoint positioning and attitude determining method, system, equipment and medium based on plane graph | |
CN112632415A (en) | Web map real-time generation method and image processing server | |
CN110191311A (en) | A kind of real-time video joining method based on multiple no-manned plane | |
Men et al. | Cooperative Localization Method of UAVs for a Persistent Surveillance Task | |
JP4864763B2 (en) | Aerial plan support device | |
CN113592929B (en) | Unmanned aerial vehicle aerial image real-time splicing method and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
WW01 | Invention patent application withdrawn after publication | Application publication date: 20170531 |