CN103196430B - Mapping navigation method and system based on unmanned aerial vehicle flight track and visual information - Google Patents

Mapping navigation method and system based on unmanned aerial vehicle flight track and visual information

Info

Publication number
CN103196430B
CN103196430B
Authority
CN
China
Prior art keywords
unmanned aerial vehicle (UAV)
information
flight track
visual
visual image
Prior art date
Legal status
Active
Application number
CN201310153828.6A
Other languages
Chinese (zh)
Other versions
CN103196430A (en)
Inventor
戴琼海
刘慧
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN201310153828.6A
Publication of CN103196430A
Application granted
Publication of CN103196430B
Legal status: Active
Anticipated expiration

Landscapes

  • Navigation (AREA)

Abstract

The present invention proposes a mapping navigation method and system based on the flight track of an unmanned aerial vehicle (UAV) and visual information. The method comprises the following steps: collecting visual images of the UAV; analyzing the visual images with a UAV image understanding system to obtain feature information and semantic text information of the visual images; determining the UAV flight track according to the feature information and semantic text information of the visual images; generating a mapping relationship database from the visual images and the corresponding flight tracks, where the data in the database can be generated in real time by the image understanding system or loaded in advance; and navigating the UAV according to the mapping relationship database. According to the method of the embodiments of the present invention, the mapping relationship database is generated by associating flight tracks with visual images, and learning is performed on this database, thereby improving the accuracy of UAV navigation and, at the same time, the safety of UAV flight.

Description

Mapping navigation method and system based on UAV flight track and visual information
Technical field
The present invention relates to the technical field of unmanned aerial vehicles (UAVs), and in particular to a mapping navigation method and system based on UAV flight track and visual information.
Background art
Given the increasing complexity of the information environment, the diversity of information representations, the sheer volume of information, the intricate relations among pieces of information, and the demand for real-time processing, information fusion has become increasingly important as a comprehensive information processing technology. Fusing different sources of information improves data quality and extracts as much useful, complete information as possible, enabling functions such as environment recognition, target detection, localization and autonomous navigation. UAV visual navigation is a novel navigation approach that organically combines vision, flight, air traffic control and communication technologies, and is a typical application of information fusion. Existing visual navigation suffers from problems such as the large number of sensing devices required, the large volume of visual data, and the high real-time requirements on data fusion processing.
At present, navigation or tracking methods based on track fusion fall mainly into two classes: centralized track fusion and distributed track fusion. Most of these fusion methods aim at generating lower-error tracks from multi-sensor or multi-source information, or at solving the track association problem, for example the simple variance convex combination method, the weighted fusion method, and the step-by-step filtering fusion method. However, the prior art does not take the understanding of visual information into account, falls short in improving the accuracy of UAV visual navigation, and no method that fuses flight tracks with visual information for mutual matching and mapping has been reported. Unlike existing track fusion technologies, the present invention fuses the flight track with visual information in the UAV visual navigation flight environment, endowing the UAV with a mapping learning capability. This effectively improves the accuracy of the UAV image understanding system, improves real-time performance and matching, and provides reliable obstacle-avoidance navigation information for autonomous localization, path planning and safe flight of the UAV.
Summary of the invention
The object of the present invention is to solve at least one of the above technical deficiencies.
To this end, one object of the present invention is to propose a mapping navigation method based on UAV flight track and visual information.
Another object of the present invention is to propose a mapping navigation system based on UAV flight track and visual information.
To achieve the above objects, an embodiment of one aspect of the present invention proposes a mapping navigation method based on UAV flight track and visual information, comprising the following steps:
collecting visual images of the UAV; analyzing the visual images with a UAV image understanding system to obtain feature information and semantic text information of the visual images; determining the UAV flight track according to the feature information and semantic text information of the visual images; generating a mapping relationship database from the visual images and the corresponding flight tracks, wherein the mapping relationship database may comprise the flight track information, attitude, speed and strategy of the UAV under different visual images together with the corresponding visual images, for retrieval, learning and/or transmission, and the data in the mapping relationship database may be generated in real time by the image understanding system or loaded in advance; and
navigating the UAV according to the mapping relationship database.
In an embodiment of the present invention, the step of navigating the UAV according to the mapping relationship database further comprises: when a visual image collected by the UAV matches a visual image in the mapping relationship database, navigating the UAV with the coping strategy corresponding to that visual image in the mapping relationship database.
In an embodiment of the present invention, the method further comprises: when the matching degree between a visual image collected by the UAV and a visual image in the mapping relationship database is greater than a threshold, navigating the UAV by mapping learning of the coping strategy corresponding to that visual image.
In an embodiment of the present invention, the coping strategy corresponding to the visual image with the highest matching degree is taken as the final coping strategy for navigating the UAV.
In an embodiment of the present invention, the semantic text information comprises: the shape, size and motion state of objects in the visual image, their relative distance to the UAV, and the status of the surrounding background.
In an embodiment of the present invention, UAV flight tracks and the corresponding visual information are stored in the mapping relationship database, and the flight tracks are classified according to the features of the tracks or visual information, or according to the chronological order in which they occur, so that the UAV can retrieve them by data type during retrieval.
In an embodiment of the present invention, the UAV localizes itself according to the historical flight track and its corresponding visual information and the flight track of the current moment and its corresponding visual information, and performs mapping learning on the currently collected visual information, thereby realizing track prediction and path planning of the UAV.
In an embodiment of the present invention, the mapping learning further comprises: retrieving, from the mapping relationship database, flight tracks and visual information similar and relevant to the currently obtained data according to the feature recognition result and/or the related semantic interpretation, and performing mapping learning on the flight track and visual information with the highest matching degree to the feature recognition result and/or the related semantic interpretation.
In an embodiment of the present invention, when the UAV has or can obtain a map, flight track information is generated from the map and the UAV can navigate directly along the track information generated from the map; to guard against changes in the environment, the track information generated from the map can be matched and fused with the predicted track information learned by the mapping of flight tracks and visual information, generating low-error track information for navigating the UAV.
In an embodiment of the present invention, the visual information comprises: feature information and semantic text information.
According to the method of the embodiments of the present invention, the mapping relationship database is generated by associating flight tracks with visual images, and learning is performed on this database, thereby improving the accuracy of UAV navigation and, at the same time, the safety of UAV flight.
To achieve the above objects, an embodiment of another aspect of the present invention proposes a mapping navigation system based on UAV flight track and visual information, comprising: an acquisition module, configured to collect visual images of the UAV; an analysis module, configured to analyze the visual images to obtain feature information of the visual images and semantic text information matching the feature information of the visual images; a determination module, configured to determine the flight track of the UAV according to the feature information and semantic text information of the visual images; a generation module, configured to generate a mapping relationship database from the visual images and the corresponding flight tracks, wherein the mapping relationship database may comprise the flight track information, attitude, speed and strategy of the UAV under different visual images together with the corresponding visual images, for retrieval, learning and/or transmission, and the data in the mapping relationship database may be generated in real time by the image understanding system or loaded in advance; and a navigation module, configured to navigate the UAV according to the mapping relationship database.
In an embodiment of the present invention, the navigation module is further configured to, when a visual image collected by the UAV matches a visual image in the mapping relationship database, navigate the UAV with the coping strategy corresponding to that visual image in the mapping relationship database.
In an embodiment of the present invention, the system further comprises a learning guidance module, configured to, when the matching degree between a visual image collected by the UAV and a visual image in the mapping relationship database is greater than a threshold, navigate the UAV by mapping learning of the coping strategy corresponding to that visual image.
In an embodiment of the present invention, the learning guidance module is further configured to take the coping strategy corresponding to the visual image with the highest matching degree as the final coping strategy for navigating the UAV.
In an embodiment of the present invention, the semantic text information comprises: the shape, size and motion state of objects in the visual image, their relative distance to the UAV, and the status of the surrounding background.
According to the system of the embodiments of the present invention, the mapping relationship database is generated by associating flight tracks with visual images, and learning is performed on this database, thereby improving the accuracy of UAV navigation and, at the same time, the safety of UAV flight.
Additional aspects and advantages of the present invention will be given in part in the following description, will in part become apparent from the following description, or will be learned through practice of the present invention.
Brief description of the drawings
The above and/or additional aspects and advantages of the present invention will become apparent and easy to understand from the following description of the embodiments taken in conjunction with the accompanying drawings, in which:
Fig. 1 is a flowchart of a mapping navigation method based on UAV flight track and visual information according to an embodiment of the present invention;
Fig. 2 is a block diagram of a UAV image understanding system according to an embodiment of the present invention;
Fig. 3 (a)-(d) are schematic diagrams of a UAV determining its flight track according to an embodiment of the present invention;
Fig. 4 is a structural block diagram of a mapping navigation system based on UAV flight track and visual information according to an embodiment of the present invention; and
Fig. 5 is a structural block diagram of a mapping navigation system based on UAV flight track and visual information according to another embodiment of the present invention.
Detailed description of the embodiments
Embodiments of the present invention are described in detail below, and examples of the embodiments are shown in the accompanying drawings, where identical or similar reference numerals throughout denote identical or similar elements or elements having identical or similar functions. The embodiments described below with reference to the drawings are exemplary, serve only to explain the present invention, and are not to be construed as limiting the present invention.
Fig. 1 is a flowchart of a mapping navigation method based on UAV flight track and visual information according to an embodiment of the present invention. As shown in Fig. 1, the mapping navigation method based on UAV flight track and visual information according to the embodiment of the present invention comprises the following steps:
Step S101: collect visual images of the UAV.
Specifically, the visual images of the UAV during flight are collected in real time by the onboard visual sensing equipment of the UAV, for example a camera. The video image information required for visual navigation can also be obtained from other UAVs or devices.
Step S102: analyze the visual images with the UAV image understanding system to obtain the feature information of the visual images, and search the image model library for the image semantics matching the feature information of the visual images to generate semantic text information. The semantic text information comprises: the shape and size of objects in the visual image, their relative distance to the UAV, the status of the surrounding background, and so on.
Specifically, the collected video images are processed and analyzed to obtain their feature information; the image model library, which contains a rich set of images, is then searched so that the obtained video images can be recognized and the image semantics obtained; the UAV then generates semantic text information from the image semantics.
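Purely as an illustration of what the semantic text information of one image might contain, the sketch below models it as a small record; the field names and example values are assumptions made for this sketch, not terms defined in the patent.

```python
from dataclasses import dataclass

@dataclass
class SemanticTextInfo:
    """Illustrative container for the semantic text information of one visual image."""
    object_label: str            # e.g. "tree", "slit", "building", "bird"
    shape: str                   # e.g. "circle", "sphere", "square"
    size_m: float                # approximate size of the object in metres
    motion_state: str            # "static", "constant_speed" or "variable_speed"
    relative_distance_m: float   # distance from the UAV to the object
    background_status: str       # short description of the surrounding background

# Example: a narrow slit detected 15 m ahead of the UAV.
info = SemanticTextInfo("slit", "rectangle", 1.2, "static", 15.0, "open sky behind")
print(info)
```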
Fig. 2 is a block diagram of the UAV image understanding system according to an embodiment of the present invention. As shown in Fig. 2, the UAV image understanding system comprises: a real-time visual information acquisition module, for collecting the video image information required for visual navigation with the onboard visual sensing equipment and/or obtaining it via communication from other UAVs or devices; an image feature extraction module, for obtaining the features of the video image information through video image processing techniques; an image model library, for storing image model information to be retrieved by the image semantics generation module; a dictionary knowledge base, for storing keywords and their interpretations to be retrieved by the image semantics generation module; a track and visual information fusion mapping module, for establishing the association rules between flight tracks and their corresponding visual information, and for storing UAV flight tracks and the corresponding visual information for retrieval, learning and/or transmission; and an image semantics generation module, for generating semantic text information from the features of the video image information and the fused track and visual information. Fig. 2 thus illustrates the fusion mapping of UAV flight tracks and visual information in the visual navigation of the embodiment of the present invention. The UAV collects the video image information required for visual navigation with the onboard visual sensing equipment through the real-time visual information acquisition module, and/or obtains, via communication, the video image information transmitted from other UAVs or devices; the image feature extraction module obtains the features of the video image information through video image processing; the obtained feature information is matched against the images in the image model library, and key semantic interpretations are obtained through the image semantics generation module; according to the feature recognition result and/or the related semantic interpretation, similar and relevant flight tracks and visual information stored in the track and visual information fusion mapping module are retrieved, mapping learning is performed on the flight track and visual information with the highest matching degree to the current feature recognition result and/or semantic interpretation, and the predicted track corresponding to the current visual information is obtained; this can be further fused with the interpretation of the current visual information from the image semantics generation module to generate UAV navigation instructions for continued flight.
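The module flow of Fig. 2 can be summarized by a minimal sketch in which each module is replaced by a stub; all function and parameter names below are placeholders assumed for illustration, not interfaces defined by the patent.

```python
from typing import Callable, Dict, List, Tuple

def understand_image(
    frame,
    extract_features: Callable[[object], Dict],
    image_model_library: Callable[[Dict], str],
    dictionary_kb: Dict[str, str],
    fusion_mapping_retrieve: Callable[[Dict, str], List[Tuple[float, object]]],
):
    """One pass through the image understanding pipeline of Fig. 2 (sketch only)."""
    # Image feature extraction module: obtain features by video image processing.
    features = extract_features(frame)
    # Match the features against the image model library to obtain a semantic keyword.
    keyword = image_model_library(features)
    # Dictionary knowledge base: look up the interpretation of the keyword.
    interpretation = dictionary_kb.get(keyword, keyword)
    # Track / visual-information fusion mapping module: retrieve (matching degree, track) pairs.
    candidates = fusion_mapping_retrieve(features, keyword)
    best = max(candidates, key=lambda c: c[0]) if candidates else None
    # Image semantics generation module: produce semantic text and the learned predicted track.
    semantic_text = f"{keyword}: {interpretation}"
    predicted_track = best[1] if best is not None else None
    return semantic_text, predicted_track

# Toy usage with stub modules standing in for the real ones:
text, track = understand_image(
    frame=None,
    extract_features=lambda f: {"edges": 42},
    image_model_library=lambda feats: "slit",
    dictionary_kb={"slit": "narrow vertical gap that can be flown straight through"},
    fusion_mapping_retrieve=lambda feats, kw: [(0.93, "fly-straight track segment")],
)
print(text, "|", track)
```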
In an embodiment of the present invention, UAV flight tracks and the corresponding visual information are stored in the mapping relationship database, and the flight tracks are classified according to the features of the tracks or visual information, or according to the chronological order in which they occur, so that the UAV can retrieve them by data type during retrieval. The UAV localizes itself according to the historical flight track and its corresponding visual information and the flight track of the current moment and its corresponding visual information, and performs mapping learning on the currently collected visual information, thereby realizing track prediction and path planning of the UAV.
Step S103: determine the UAV flight track according to the feature information and semantic text information of the visual images.
Specifically, the motion state of obstacles is obtained from the semantic text information, for example stationary, constant-speed or variable-speed motion, as well as the shape of the obstacle, for example circle, sphere or square; in addition, semantic text information such as slit, circular hole, door, window, tree, mountain, house, bird or flying object is associated with a corresponding flight track. For example, the UAV flight track can be determined so as to pass over an obstacle according to the relative height with respect to the UAV, or to fly around the obstacle. Fig. 3 shows schematic diagrams of a UAV determining its flight track according to an embodiment of the present invention. As shown in Fig. 3 (a), (b), (c) and (d), when a slit is encountered, the track may be determined as flying straight through; when a tree or building is encountered that should not be flown over, the track is determined as a detour according to the width of the obstacle; when there is an obstacle in the air, the track is determined as flying low; and when a corner formed by a building or a mountain is encountered, the UAV turns. The detailed process can be expressed as follows: the retrieved reference track is S_r = f_r(x, y, z, I_r, t), its corresponding visual information is I_r = F_r(x', y', z', λ, t), the current visual information to be matched is I_n = F_n(x', y', z', λ, t), and the matching degree is MD_n = g(F_r, F_n), where g denotes the matching process. If MD_n > d_c, where d_c is the matching degree threshold, the predicted track information for the current visual information obtained from the reference track is S_n = f_n(x, y, z, I_n, t). Here x, y, z are track space coordinates, I_r is the reference image intensity, I_n is the current image intensity to be matched, t is time, x', y', z' are image space coordinates, and λ is the optical wavelength.
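A minimal numerical sketch of the matching step above, assuming that the matching process g is a normalized cross-correlation over image intensities and that the predicted track simply re-uses the reference track points; both choices, and the threshold value, are assumptions made only to make the expressions concrete.

```python
import numpy as np

D_C = 0.8  # matching degree threshold d_c (assumed value)

def matching_degree(F_r: np.ndarray, F_n: np.ndarray) -> float:
    """MD_n = g(F_r, F_n): here g is normalized cross-correlation (an assumption)."""
    a = (F_r - F_r.mean()).ravel()
    b = (F_n - F_n.mean()).ravel()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float(a @ b / denom) if denom > 0 else 0.0

def predict_track(reference_track: np.ndarray, F_r: np.ndarray, F_n: np.ndarray):
    """Return (S_n, MD_n) if MD_n > d_c, otherwise (None, MD_n): no prediction learned."""
    md_n = matching_degree(F_r, F_n)
    if md_n > D_C:
        # S_n = f_n(x, y, z, I_n, t): re-use the reference trajectory points,
        # now associated with the current image intensities (a simplification).
        return reference_track.copy(), md_n
    return None, md_n

# Toy example: reference track as (t, x, y, z) rows, two nearly identical 8x8 image patches.
S_r = np.array([[0, 0.0, 0.0, 10.0], [1, 5.0, 0.0, 10.0], [2, 10.0, 0.0, 12.0]])
rng = np.random.default_rng(0)
I_r = rng.random((8, 8))
I_n = I_r + 0.01 * rng.random((8, 8))
S_n, md = predict_track(S_r, I_r, I_n)
print(f"MD_n = {md:.3f}, prediction learned: {S_n is not None}")
```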
Step S104: generate a mapping relationship database from the visual images and the corresponding flight tracks, wherein the mapping relationship database may comprise the flight track information, attitude, speed and strategy of the UAV under different visual images together with the corresponding visual images, for retrieval, learning and/or transmission, and the data in the mapping relationship database may be generated in real time by the image understanding system or loaded in advance.
Specifically, for each different visual image the flight track of the UAV is determined with the corresponding coping strategy. The visual image and its corresponding coping strategy form one group of information, and the multiple coping strategies obtained from multiple different visual images are used to generate the mapping relationship database. As shown in Fig. 3, a slit and the fly-straight-through strategy form one group of information; a tree or building that should not be flown over and the detour strategy form another group. In this way, a mapping relationship database of visual images and the corresponding coping strategies is built.
In addition, the coping strategies can also be established according to the UAV flight time, by building a detailed correspondence between each historical track point and the visual information at that point, for example using the track point position coordinates as the index of the visual information at that point; alternatively, the track points and their visual information at key historical moments can be extracted to establish the correspondence, or key track segments and key visual information can be extracted to establish the correspondence. When indexing visual information by track, the attitude, height, speed and other information of the UAV at that historical moment, as well as the computation rule for obtaining the historical track at that moment from the visual information at that point, can be indexed at the same time. The association computation rules between a track and the visual information at its location can be established according to track features or regions, such as turning, climbing, flying straight, flying low or circumnavigating, or according to the text description of visual information features, such as high or low static obstacles, high or low dynamic obstacles with a certain speed, obstacles of a certain shape (e.g. circle, sphere, square), slits, trees/mountains/houses/birds/flying objects, and so on.
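For illustration, one record of the mapping relationship database described above might be keyed both by a semantic label and by rounded track-point coordinates; the schema and class names below are assumptions for this sketch rather than the patent's specification.

```python
from dataclasses import dataclass
from typing import Dict, List, Tuple

@dataclass
class MappingEntry:
    """One record of the mapping relationship database (illustrative schema)."""
    track_points: List[Tuple[float, float, float]]  # (x, y, z) historical track points
    attitude: Tuple[float, float, float]            # roll, pitch, yaw at the key moment
    speed: float                                     # m/s
    strategy: str                                    # e.g. "fly_straight", "detour", "fly_low", "turn"
    visual_features: List[float]                     # feature vector of the associated image
    semantic_label: str                              # e.g. "slit", "tree", "building_corner"

class MappingDatabase:
    """Indexes entries both by semantic label and by rounded track-point coordinates."""
    def __init__(self):
        self.by_label: Dict[str, List[MappingEntry]] = {}
        self.by_coord: Dict[Tuple[int, int, int], MappingEntry] = {}

    def add(self, entry: MappingEntry) -> None:
        self.by_label.setdefault(entry.semantic_label, []).append(entry)
        for x, y, z in entry.track_points:
            self.by_coord[(round(x), round(y), round(z))] = entry

    def retrieve(self, semantic_label: str) -> List[MappingEntry]:
        return self.by_label.get(semantic_label, [])

# Example: a slit paired with the fly-straight strategy, as in Fig. 3.
db = MappingDatabase()
db.add(MappingEntry([(0, 0, 10), (5, 0, 10)], (0, 0, 0), 4.0, "fly_straight",
                    [0.1, 0.7, 0.2], "slit"))
print(len(db.retrieve("slit")))  # -> 1
```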
In an embodiment of the present invention, the visual images collected by the UAV are compared with the visual images in the mapping relationship database; if the matching degree is greater than the threshold, the UAV is navigated by learning the coping strategy corresponding to that visual image. For example, if the UAV encounters two trees one after another that differ only in size, the UAV can learn the previous coping strategy and use it for navigation. In the matching process, the coping strategy corresponding to the visual image with the highest matching degree is taken as the final coping strategy for navigating the UAV.
According to the feature recognition result and/or the related semantic interpretation, flight tracks and visual information similar and relevant to the currently obtained data are retrieved from the mapping relationship database, and mapping learning is performed on the flight track and visual information with the highest matching degree to the feature recognition result and/or the related semantic interpretation. Specifically, the strategy rule generated from the visual information of the track whose matching degree exceeds a certain threshold and is the highest (or among the highest) is taken, and the relevant data of the current visual information are substituted into it to obtain the corresponding track for the current visual information; this is the predicted track information that the UAV learns from the track and visual information fusion mapping module. When more than one track and its visual information exceed the threshold with the highest matching degree, any one of the predicted tracks obtained by mapping learning from the track and visual information with the highest matching degree can be taken; when the time delay permits, a fusion association algorithm over the multiple predicted tracks can also be designed to fuse them into the final predicted track information, as sketched below. If within a certain period of time no track and visual information whose matching degree exceeds the threshold is retrieved, a specific character is output to indicate that no predicted track information has been learned.
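When several stored tracks exceed the matching threshold, the paragraph above allows either taking any one of them or fusing the candidate predictions; the sketch below fuses them with a matching-degree-weighted average and returns a sentinel value when nothing matches. The weighting rule, the threshold value and the sentinel are assumptions made for this illustration.

```python
import numpy as np

NO_PREDICTION = None  # sentinel returned when no stored track exceeds the threshold

def fuse_predicted_tracks(candidates, d_c=0.8):
    """Fuse candidate predicted tracks given as (matching_degree, track) pairs.

    Tracks are arrays of shape (n_points, 3). Candidates at or below the threshold
    d_c are discarded; the survivors are combined with matching-degree weights.
    """
    survivors = [(md, np.asarray(track, dtype=float))
                 for md, track in candidates if md > d_c]
    if not survivors:
        return NO_PREDICTION  # "no predicted track information has been learned"
    weights = np.array([md for md, _ in survivors])
    weights = weights / weights.sum()
    stacked = np.stack([track for _, track in survivors])  # assumes equal-length tracks
    return np.tensordot(weights, stacked, axes=1)           # weighted-average track

# Two candidate predictions from the database, both above the threshold.
t1 = [[0, 0, 10], [5, 0, 10], [10, 0, 10]]
t2 = [[0, 0, 10], [5, 1, 10], [10, 2, 11]]
print(fuse_predicted_tracks([(0.95, t1), (0.85, t2)]))
```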
In an embodiment of the present invention, when the UAV has or can obtain a map, the track generated from the map is used, and the UAV is navigated according to that track together with the fused flight track and visual information. The visual information comprises feature information and semantic text information.
Step S105: navigate the UAV according to the mapping relationship database.
Specifically, the visual images collected by the UAV in real time are compared with the visual image information in the mapping relationship database; when a collected visual image matches a visual image in the mapping relationship database, the UAV is navigated with the coping strategy corresponding to that visual image in the mapping relationship database.
According to the method of the embodiments of the present invention, the mapping relationship database is generated by associating flight tracks with visual images, and learning is performed on this database, thereby improving the accuracy of UAV navigation and, at the same time, the safety of UAV flight.
In an embodiment of the present invention, when a map is available in advance or a digital map generator produces map information, the UAV can navigate directly along the track information generated from the map; to guard against changes in the environment, the track information generated from the map can be matched and fused with the predicted track information learned by the track and visual information fusion mapping module, generating low-error track information for navigation flight. Map information can also be fused directly into the mapping relationship database of the track and visual information fusion mapping module, or into the graphic image model library of the UAV image understanding system, to be recognized and matched for generating visual navigation information.
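A minimal sketch of combining the map-generated track with the predicted track learned by the fusion mapping module, using a convex combination weighted by assumed per-track error variances in the spirit of the simple variance convex combination method mentioned in the background; this is an illustration, not the patent's prescribed fusion algorithm.

```python
import numpy as np

def fuse_map_and_predicted(map_track, predicted_track, var_map, var_pred):
    """Convex combination of two tracks of equal length (shape (n_points, 3)).

    var_map and var_pred are assumed scalar error variances of the map-generated
    and the learned predicted track; the lower-variance track gets the higher weight.
    """
    map_track = np.asarray(map_track, dtype=float)
    predicted_track = np.asarray(predicted_track, dtype=float)
    w_map = var_pred / (var_map + var_pred)   # weight of the map track
    w_pred = var_map / (var_map + var_pred)   # weight of the predicted track
    return w_map * map_track + w_pred * predicted_track

# Map says fly straight; the learned prediction bends slightly around an obstacle.
map_track = [[0, 0, 10], [5, 0, 10], [10, 0, 10]]
pred_track = [[0, 0, 10], [5, 1, 10], [10, 0, 10]]
print(fuse_map_and_predicted(map_track, pred_track, var_map=1.0, var_pred=4.0))
```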
Fig. 4 is a structural block diagram of a mapping navigation system based on UAV flight track and visual information according to an embodiment of the present invention. As shown in Fig. 4, the mapping navigation system based on UAV flight track and visual information according to the embodiment of the present invention comprises an acquisition module 100, an analysis module 200, a determination module 300, a generation module 400 and a navigation module 500.
The acquisition module 100 is configured to collect visual images of the UAV.
Specifically, the acquisition module 100 collects the visual images of the UAV during flight in real time through the onboard visual sensing equipment of the UAV, for example a camera. The video image information required for visual navigation can also be obtained from other UAVs or devices.
The analysis module 200 is configured to analyze the visual images to obtain feature information of the visual images and semantic text information matching the feature information of the visual images.
Specifically, the analysis module 200 processes the collected video images to obtain their feature information, then searches the image model library, which contains a rich set of images, so that the obtained video images can be recognized and the image semantics obtained; the UAV then generates semantic text information from the image semantics.
The determination module 300 is configured to determine the flight track of the UAV according to the feature information and semantic text information of the visual images.
Specifically, the determination module 300 obtains the motion state of obstacles from the semantic text information, for example stationary, constant-speed or variable-speed motion, as well as the shape of the obstacle, for example circle, sphere or square; in addition, semantic text information such as slit, circular hole, door, window, tree, mountain, house, bird or flying object is associated with a corresponding flight track. For example, the flight track of the UAV can be determined so as to pass over an obstacle according to the relative height with respect to the UAV, or to fly around the obstacle. Fig. 3 shows schematic diagrams of a UAV determining its flight track according to an embodiment of the present invention. As shown in Fig. 3 (a), (b), (c) and (d), when a slit is encountered, the track may be determined as flying straight through; when a tree or building is encountered that should not be flown over, the track is determined as a detour according to the width of the obstacle; when there is an obstacle in the air, the track is determined as flying low; and when a corner formed by a building or a mountain is encountered, the UAV turns. The detailed process can be expressed as follows: the retrieved reference track is S_r = f_r(x, y, z, I_r, t), its corresponding visual information is I_r = F_r(x', y', z', λ, t), the current visual information to be matched is I_n = F_n(x', y', z', λ, t), and the matching degree is MD_n = g(F_r, F_n), where g denotes the matching process. If MD_n > d_c, where d_c is the matching degree threshold, the predicted track information for the current visual information obtained from the reference track is S_n = f_n(x, y, z, I_n, t). Here x, y, z are track space coordinates, I_r is the reference image intensity, I_n is the current image intensity to be matched, t is time, x', y', z' are image space coordinates, and λ is the optical wavelength.
The generation module 400 is configured to generate the mapping relationship database from the visual images and the corresponding flight tracks, wherein the data in the mapping relationship database can be generated in real time by the image understanding system or loaded in advance. The semantic text information comprises: the shape and size of objects in the visual image, their relative distance to the UAV, and the status of the surrounding background.
Specifically, the determination module 300 determines the flight track of the UAV with the coping strategy corresponding to each different visual image, and the generation module 400 forms one group of information from each visual image and its corresponding coping strategy; the multiple coping strategies obtained from multiple different visual images are used to generate the mapping relationship database. As shown in Fig. 3, a slit and the fly-straight-through strategy form one group of information; a tree or building that should not be flown over and the detour strategy form another group. In this way, a mapping relationship database of visual images and the corresponding coping strategies is built.
In addition, the coping strategies can also be established according to the UAV flight time, by building a detailed correspondence between each historical track point and the visual information at that point, for example using the track point position coordinates as the index of the visual information at that point; alternatively, the track points and their visual information at key historical moments can be extracted to establish the correspondence, or key track segments and key visual information can be extracted to establish the correspondence. When indexing visual information by track, the attitude, height, speed and other information of the UAV at that historical moment, as well as the computation rule for obtaining the historical track at that moment from the visual information at that point, can be indexed at the same time. The association computation rules between a track and the visual information at its location can be established according to track features or regions, such as turning, climbing, flying straight, flying low or circumnavigating, or according to the text description of visual information features, such as high or low static obstacles, high or low dynamic obstacles with a certain speed, obstacles of a certain shape (e.g. circle, sphere, square), slits, trees/mountains/houses/birds/flying objects, and so on.
The navigation module 500 is configured to navigate the UAV according to the mapping relationship database.
Specifically, the navigation module 500 compares the visual images collected in real time with the visual image information in the mapping relationship database; when a collected visual image matches a visual image in the mapping relationship database, the UAV is navigated with the coping strategy corresponding to that visual image in the mapping relationship database.
Fig. 5 is a structural block diagram of a mapping navigation system based on UAV flight track and visual information according to another embodiment of the present invention. As shown in Fig. 5, the mapping navigation system based on UAV flight track and visual information according to this embodiment of the present invention further comprises a learning guidance module 600.
The learning guidance module 600 is configured to, when the matching degree between a visual image collected by the UAV and a visual image in the mapping relationship database is greater than the threshold, navigate the UAV by mapping learning of the coping strategy corresponding to that visual image. For example, if the UAV encounters two trees one after another that differ only in size, the UAV can learn the previous coping strategy and use it for navigation. In the matching process, the coping strategy corresponding to the visual image with the highest matching degree is taken as the final coping strategy for navigating the UAV.
In an embodiment of the present invention, the learning guidance module 600 takes the coping strategy corresponding to the visual image with the highest matching degree as the final coping strategy for navigating the UAV.
In an embodiment of the present invention, the analysis module 200 retrieves, from the mapping relationship database, flight tracks and visual information similar and relevant to the currently obtained data according to the feature recognition result and/or the related semantic interpretation, and the learning guidance module 600 performs mapping learning on the flight track and visual information with the highest matching degree to the feature recognition result and/or the related semantic interpretation. Specifically, the learning guidance module 600 takes the strategy rule generated from the visual information of the track whose matching degree exceeds a certain threshold and is the highest (or among the highest), and substitutes the relevant data of the current visual information into it to obtain the corresponding track for the current visual information; this is the predicted track information that the UAV learns from the track and visual information fusion mapping module. When more than one track and its visual information exceed the threshold with the highest matching degree, any one of the predicted tracks obtained by mapping learning from the track and visual information with the highest matching degree can be taken; when the time delay permits, a fusion association algorithm over the multiple predicted tracks can also be designed to fuse them into the final predicted track information. If within a certain period of time no track and visual information whose matching degree exceeds the threshold is retrieved, the learning guidance module 600 outputs a specific character to indicate that no predicted track information has been learned.
In an embodiment of the present invention, when the UAV has or can obtain a map, the track generated from the map is used, and the UAV is navigated according to that track together with the fused flight track and visual information. The visual information comprises feature information and semantic text information.
According to the system of the embodiments of the present invention, the mapping relationship database is generated by associating flight tracks with visual images, and learning is performed on this database, thereby improving the accuracy of UAV navigation and, at the same time, the safety of UAV flight.
Although the embodiments of the present invention have been illustrated and described above, it should be understood that the above embodiments are exemplary and are not to be construed as limiting the present invention; those of ordinary skill in the art can change, modify, replace and vary the above embodiments within the scope of the present invention without departing from the principle and spirit of the present invention.

Claims (13)

1. A mapping navigation method based on unmanned aerial vehicle (UAV) flight track and visual information, characterized in that it comprises the following steps:
collecting visual images of the UAV;
analyzing the visual images with a UAV image understanding system to obtain feature information and semantic text information of the visual images;
determining the UAV flight track according to the feature information and semantic text information of the visual images;
generating a mapping relationship database from the visual images and the corresponding flight tracks, wherein the mapping relationship database comprises the flight track information, attitude, speed and strategy of the UAV under different visual images together with the corresponding visual images, for retrieval, learning and/or transmission, and the data in the mapping relationship database are generated in real time by the image understanding system or loaded in advance;
navigating the UAV according to the mapping relationship database; and
when the matching degree between a visual image collected by the UAV and a visual image in the mapping relationship database is greater than a threshold, navigating the UAV by mapping learning of the coping strategy corresponding to that visual image.
2. The mapping navigation method based on UAV flight track and visual information according to claim 1, characterized in that the step of navigating the UAV according to the mapping relationship database further comprises:
when a visual image collected by the UAV matches a visual image in the mapping relationship database, navigating the UAV with the coping strategy corresponding to that visual image in the mapping relationship database.
3. The mapping navigation method based on UAV flight track and visual information according to claim 1, characterized in that the coping strategy corresponding to the visual image with the highest matching degree is taken as the final coping strategy for navigating the UAV.
4. The mapping navigation method based on UAV flight track and visual information according to claim 1, characterized in that the semantic text information comprises: the shape, size and motion state of objects in the visual image, their relative distance to the UAV, and the status of the surrounding background.
5. The mapping navigation method based on UAV flight track and visual information according to claim 1, characterized in that UAV flight tracks and the corresponding visual information are stored in the mapping relationship database, and the flight tracks are classified according to the features of the tracks or visual information, or according to the chronological order in which they occur, so that the UAV can retrieve them by data type during retrieval.
6. The mapping navigation method based on UAV flight track and visual information according to claim 1, characterized in that the UAV localizes itself according to the historical flight track and its corresponding visual information and the flight track of the current moment and its corresponding visual information, and performs mapping learning on the currently collected visual information, thereby realizing track prediction and path planning of the UAV.
7. The mapping navigation method based on UAV flight track and visual information according to claim 1, characterized in that the mapping learning further comprises:
retrieving, from the mapping relationship database, flight tracks and visual information similar and relevant to the currently obtained data according to the feature recognition result and/or the related semantic interpretation, and performing mapping learning on the flight track and visual information with the highest matching degree to the feature recognition result and/or the related semantic interpretation.
8. The mapping navigation method based on UAV flight track and visual information according to claim 1, characterized in that, when the UAV has or can obtain a map, flight track information is generated from the map and the UAV navigates directly along the track information generated from the map; to guard against changes in the environment, the track information generated from the map is matched and fused with the predicted track information learned by the mapping of flight tracks and visual information, generating low-error track information for navigating the UAV.
9. The mapping navigation method based on UAV flight track and visual information according to any one of claims 5 to 8, characterized in that the visual information comprises: feature information and semantic text information.
10. A mapping navigation system based on unmanned aerial vehicle (UAV) flight track and visual information, characterized in that it comprises:
an acquisition module, configured to collect visual images of the UAV;
an analysis module, configured to analyze the visual images to obtain feature information of the visual images and semantic text information matching the feature information of the visual images;
a determination module, configured to determine the flight track of the UAV according to the feature information and semantic text information of the visual images;
a generation module, configured to generate a mapping relationship database from the visual images and the corresponding flight tracks, wherein the mapping relationship database comprises the flight track information, attitude, speed and strategy of the UAV under different visual images together with the corresponding visual images, for retrieval, learning and/or transmission, and the data in the mapping relationship database are generated in real time by the image understanding system or loaded in advance;
a navigation module, configured to navigate the UAV according to the mapping relationship database; and
a learning guidance module, configured to, when the matching degree between a visual image collected by the UAV and a visual image in the mapping relationship database is greater than a threshold, navigate the UAV by mapping learning of the coping strategy corresponding to that visual image.
11. The mapping navigation system based on UAV flight track and visual information according to claim 10, characterized in that the navigation module is further configured to, when a visual image collected by the UAV matches a visual image in the mapping relationship database, navigate the UAV with the coping strategy corresponding to that visual image in the mapping relationship database.
12. The mapping navigation system based on UAV flight track and visual information according to claim 10, characterized in that the learning guidance module is further configured to take the coping strategy corresponding to the visual image with the highest matching degree as the final coping strategy for navigating the UAV.
13. The mapping navigation system based on UAV flight track and visual information according to claim 10, characterized in that the semantic text information comprises: the shape, size and motion state of objects in the visual image, their relative distance to the UAV, and the status of the surrounding background.
CN201310153828.6A 2013-04-27 2013-04-27 Mapping navigation method and system based on unmanned aerial vehicle flight track and visual information Active CN103196430B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310153828.6A CN103196430B (en) 2013-04-27 2013-04-27 Mapping navigation method and system based on unmanned aerial vehicle flight track and visual information

Publications (2)

Publication Number Publication Date
CN103196430A CN103196430A (en) 2013-07-10
CN103196430B (en) 2015-12-09

Family

ID=48719134

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310153828.6A Active CN103196430B (en) 2013-04-27 2013-04-27 Mapping navigation method and system based on unmanned aerial vehicle flight track and visual information

Country Status (1)

Country Link
CN (1) CN103196430B (en)

Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103941748B (en) * 2014-04-29 2016-05-25 百度在线网络技术(北京)有限公司 Autonomous navigation method and system and Map building method and system
US10168696B2 (en) 2016-03-31 2019-01-01 International Business Machines Corporation Dynamic analysis of real-time restrictions for remote controlled vehicles
CN105955291B (en) * 2016-04-29 2021-04-27 深圳市哈博森科技有限公司 Unmanned aerial vehicle flight route track recording and automatic flight control mode
CN106297237B (en) * 2016-08-17 2021-07-16 联想(北京)有限公司 Control method and electronic equipment
CN106355866A (en) * 2016-11-14 2017-01-25 徐志勇 Unmanned aerial vehicle detection image storage analytic system
CN107278262B (en) * 2016-11-14 2021-03-30 深圳市大疆创新科技有限公司 Flight trajectory generation method, control device and unmanned aerial vehicle
CN106647807B (en) * 2016-12-29 2019-12-31 上海资誉电子科技有限公司 Coping strategy generation method and coping strategy generation system for unmanned aerial vehicle
CN108496129B (en) * 2017-04-28 2021-10-01 深圳市大疆创新科技有限公司 Aircraft-based facility detection method and control equipment
CN107444665B (en) * 2017-07-24 2020-06-09 长春草莓科技有限公司 Unmanned aerial vehicle autonomous landing method
WO2019043704A1 (en) * 2017-08-29 2019-03-07 Wajnberg Adam Drone escort system
CN107450593B (en) * 2017-08-30 2020-06-12 清华大学 Unmanned aerial vehicle autonomous navigation method and system
WO2019041266A1 (en) * 2017-08-31 2019-03-07 深圳市大疆创新科技有限公司 Path planning method, aircraft, and flight system
CN107993308A (en) * 2017-09-08 2018-05-04 北京航空航天大学 A kind of stand alone type unmanned plane during flying safety monitoring and information management system
US10387727B2 (en) * 2017-09-13 2019-08-20 Wing Aviation Llc Backup navigation system for unmanned aerial vehicles
CN108088438B (en) * 2017-12-05 2021-05-14 普达迪泰(天津)智能装备科技有限公司 Unmanned aerial vehicle visual navigation test method and system
US10689110B2 (en) * 2018-02-12 2020-06-23 Wipro Limited Method and system for performing inspection and maintenance tasks of three-dimensional structures using drones
CN109064467A (en) * 2018-08-20 2018-12-21 贵州宜行智通科技有限公司 Analysis method, device and the electronic equipment of community security defence
CN109275027A (en) * 2018-09-26 2019-01-25 Tcl海外电子(惠州)有限公司 Speech output method, electronic playback devices and the storage medium of video
CN109799838B (en) * 2018-12-21 2022-04-15 金季春 Training method and system
CN109765922A (en) * 2019-03-07 2019-05-17 安徽省川佰科技有限公司 A kind of unmanned plane during flying track auxiliary method of adjustment
CN110940320A (en) * 2019-07-19 2020-03-31 华北电力大学(保定) Open stock ground monitored control system based on unmanned aerial vehicle cruises
CN111209899B (en) * 2019-12-31 2023-06-02 科大讯飞股份有限公司 Rescue material delivery method, system, device and storage medium
CN112668652A (en) * 2020-12-31 2021-04-16 哈尔滨工业大学 Method and system for identifying cluster array and motion trend in unmanned equipment confrontation
CN115248877B (en) * 2022-09-22 2023-01-17 中国电子科技集团公司第十五研究所 Multi-mode-based track text matching method
CN117452969A (en) * 2023-12-06 2024-01-26 南京瑞蓝世光电传感技术研究院有限公司 Unmanned aerial vehicle navigation method based on multi-mode data processing

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101619985A (en) * 2009-08-06 2010-01-06 上海交通大学 Service robot autonomous navigation method based on deformable topological map
CN102436738A (en) * 2011-09-26 2012-05-02 同济大学 Traffic monitoring device based on unmanned aerial vehicle (UAV)
CN102853830A (en) * 2012-09-03 2013-01-02 东南大学 Robot vision navigation method based on general object recognition

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102426019B (en) * 2011-08-25 2014-07-02 航天恒星科技有限公司 Unmanned aerial vehicle scene matching auxiliary navigation method and system

Also Published As

Publication number Publication date
CN103196430A (en) 2013-07-10

Similar Documents

Publication Publication Date Title
CN103196430B (en) Mapping navigation method and system based on unmanned aerial vehicle flight track and visual information
Liu et al. Predicting aircraft trajectories: A deep generative convolutional recurrent neural networks approach
US9911340B2 (en) Real-time system for multi-modal 3D geospatial mapping, object recognition, scene annotation and analytics
CN109597087A (en) A kind of 3D object detection method based on point cloud data
Lian et al. DeepWindow: Sliding window based on deep learning for road extraction from remote sensing images
Emmi et al. A hybrid representation of the environment to improve autonomous navigation of mobile robots in agriculture
CN106873630A (en) A kind of flight control method and device, perform equipment
CN110276972A (en) A kind of object cognitive method and system based on car networking
CN105893621A (en) Method for mining target behavior law based on multi-dimensional track clustering
CN110298330A (en) A kind of detection of transmission line polling robot monocular and localization method
CN104463909A (en) Visual target tracking method based on credibility combination map model
Xie et al. Hierarchical forest based fast online loop closure for low-latency consistent visual-inertial SLAM
Li et al. An efficient point cloud place recognition approach based on transformer in dynamic environment
Zhao et al. Autonomous Exploration Method for Fast Unknown Environment Mapping by Using UAV Equipped with Limited FOV Sensor
CN112380933A (en) Method and device for identifying target by unmanned aerial vehicle and unmanned aerial vehicle
Karaoğuz et al. An integrated model of autonomous topological spatial cognition
CN115311867B (en) Tunnel scene positioning method and device, computer equipment and storage medium
CN116719247A (en) Digital space simulation system and method
CN112380314B (en) Road network information processing method and device, storage medium and electronic equipment
Zeng et al. Robotic Relocalization Algorithm Assisted by Industrial Internet of Things and Artificial Intelligence
Salehi et al. Improving constrained bundle adjustment through semantic scene labeling
Gaspar et al. Limit characterization for visual place recognition in underwater scenes
Clachar Identifying and analyzing atypical flights by using supervised and unsupervised approaches
Barkley et al. Cooperative bayesian target detection on a real road network using aerial vehicles
Yu et al. Large-scale scene mapping and localization based on multi-sensor fusion

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant