CN109655058A - An inertial/visual intelligent integrated navigation method - Google Patents

An inertial/visual intelligent integrated navigation method

Info

Publication number
CN109655058A
CN109655058A (application CN201811579143.7A)
Authority
CN
China
Prior art keywords
navigation
rule
decision
data
inertia
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811579143.7A
Other languages
Chinese (zh)
Inventor
高嘉瑜
景鑫
李阳
王健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
CETC 20 Research Institute
Original Assignee
CETC 20 Research Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CETC 20 Research Institute filed Critical CETC 20 Research Institute
Priority to CN201811579143.7A priority Critical patent/CN109655058A/en
Publication of CN109655058A publication Critical patent/CN109655058A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G01 — MEASURING; TESTING
    • G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 — Navigation by using measurements of speed or acceleration
    • G01C21/12 — Navigation by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 — Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 — Inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The present invention provides an inertial/visual intelligent integrated navigation method. When the navigation expert system receives inputs such as navigation information and anomaly-detection results, the inference engine first starts running the rules in the knowledge base and reads the current state of each ranging source, generating intermediate facts that are saved to the integrated database. The inference engine then matches the rules in the knowledge base against the real-time state of each ranging source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and chooses the navigation sources; the explanation engine can explain and output to the user the decision basis for the currently selected fusion mode. Finally, if the fusion mode switches, a decision can be made on the performance trend of the navigation fusion modes. The invention performs intelligent management and decision-making for the vision and inertial sensors, improving the positioning accuracy and reliability of the navigation system and providing navigation parameters for all kinds of unmanned systems.

Description

An inertial/visual intelligent integrated navigation method
Technical field
The present invention relates to a navigation method and belongs to the field of integrated navigation.
Background art
Inertial navigation is fully autonomous, highly covert, immune to jamming, and provides continuous information, so the inertial navigation system (INS) is one of the core sensors with which a carrier achieves autonomous navigation control. However, INS errors grow and drift continuously over time, so a single navigation mode clearly cannot meet the accuracy, reliability, and real-time requirements of a navigation system. As visual sensors and image-processing techniques keep improving, inertial/visual integration has become an important trend. Moreover, vision-based navigation is sensitive to navigation parameters such as relative pose and can be used in enclosed or complex environments, further improving autonomous positioning and homing capability. During long-duration carrier motion, assuming a scene map is available, scene-matching navigation can be selected: scene matching achieves high positioning accuracy inside its matching areas, but because those areas are discontinuous it is difficult to obtain continuous positioning fixes, so scene matching is usually combined with inertial navigation as a long-term aid to the inertial system.
Under modern high-technology warfare conditions, fighter aircraft face increasingly severe challenges, and the requirements on navigation keep rising: the system must support a variety of mission requirements and continuously provide the most accurate positioning information under all-weather operation in harsh environments and in any circumstances. Consider the inertial/visual combination: each navigation device has its own working principle, positioning accuracy, and strengths and weaknesses, and each item of navigation information is an observation of part of the platform state. If all navigation information were fused indiscriminately, the result would certainly not be optimal and could even be worse than no combination at all.
Summary of the invention
To overcome the deficiencies of the prior art, the present invention provides an inertial/visual intelligent integrated navigation method that performs intelligent management and decision-making for the vision and inertial sensors, improving the positioning accuracy and reliability of the navigation system and providing navigation parameters for all kinds of unmanned systems.
The technical solution adopted by the present invention to solve the technical problem comprises the following steps:
Step 1: generate inertial measurement unit (IMU) simulation data from the gyroscope and accelerometer. The attitude angular rate output by the carrier in flight is ω_nb^b, the angular rate of the carrier frame relative to the geographic frame, where C_n^b is the attitude transformation matrix from the geographic frame to the carrier frame;
the ideal gyroscope output is ω_ib^b = ω_nb^b + C_n^b ω_in^n, where C_n^b ω_in^n is the projection onto the carrier frame of the angular rate of the geographic frame relative to the inertial frame;
the ideal accelerometer output is f^b = C_n^b f^n, where f^n is the specific force in the geographic frame;
random errors, including white noise, a first-order Markov error, and a random constant error, are added to the ideal gyroscope output as the simulated gyroscope output data;
a random error, namely a first-order Markov error, is added to the ideal accelerometer output as the simulated accelerometer output data.
Step 2: acquire images, extract features from the images and perform feature-point tracking, and compute visual navigation parameters in a relative coordinate frame using bundle adjustment; integrate the IMU data between two camera frames by pre-integration, register the result of the visual computation with the result of the IMU pre-integration, and align their timestamps.
Step 3: input the inertial and visual navigation information and the performance-evaluation results into the intelligent navigation expert system. The inference engine starts running the rules in the knowledge base, reads the current state of each sensing source, and generates intermediate facts that are saved to the integrated database; the inference engine then matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and outputs the basis of the current selection decision to the user. The inference engine searches the knowledge base according to the input navigation-source sensor information, and the corresponding actions are activated whenever the set conditions are met, until the reasoning is complete and the optimal navigation decision information is obtained.
Step 4: combine the scene-matching result and the IMU inertial data with an unscented Kalman filter.
The beneficial effects of the present invention are: an intelligent decision stage is added before the navigation information is combined, and the selection strategy and switching of inertial/visual integrated navigation working modes are studied, so that the integrated navigation system intelligently decides between single-sensor and integrated states; the combined system makes autonomous decisions according to sensor states, guaranteeing overall positioning performance and achieving the highest integrated navigation accuracy.
Description of the drawings
Fig. 1 is the basic block diagram of the expert system;
Fig. 2 is the reasoning flow chart of the navigation expert system;
Fig. 3 is a schematic flow chart of the steps of the present invention;
Fig. 4 is the intelligent decision flow chart.
Detailed description of the embodiments
The present invention is further explained below with reference to the accompanying drawings and embodiments; the present invention includes but is not limited to the following embodiments.
Combining the characteristics of each navigation-source sensor, the present invention proposes an intelligent integrated navigation processing method based on an expert system, exploring the use of a rule-based expert system to realize the navigation-source decision function in an integrated navigation system, with the aim of achieving a navigation-source combination with higher positioning and orientation accuracy.
As shown in Fig. 1, the navigation expert system mainly consists of an integrated database, an inference engine, and a knowledge base. When the navigation expert system receives inputs such as navigation information and anomaly-detection results, the inference engine first starts running the rules in the knowledge base, reads the current state of each ranging source, and generates intermediate facts that are saved to the integrated database. The inference engine then matches the rules in the knowledge base against the real-time state of each ranging source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and chooses the navigation sources; the explanation engine can explain and output to the user the decision basis of the currently selected fusion mode. Finally, if the fusion mode switches, a decision can be made on the performance trend of the fusion modes.
The most important components of a rule-based expert system are the knowledge base, the inference engine, and the integrated database.
Knowledge base: contains all the rules. Frames are used here as the knowledge-representation method, with heuristic search based on production rules, so the knowledge base holds a large amount of navigation-domain expertise stored in rule form.
Inference engine: the control center of the rule-based expert system. It decides which rules' premises are satisfied, so that the conclusions of those rules are activated and the corresponding operations executed; the reasoning process is in fact a process of search and matching. The inference engine searches the knowledge base according to the input navigation-source data, and the corresponding actions are activated whenever certain conditions are met, until the reasoning is complete and the optimal navigation decision information is obtained. In this work the inference engine uses forward chaining with the Rete pattern-matching algorithm: starting from the input navigation-source information, the rules are searched heuristically; if a rule's premise matches, the rule is selected and its conclusion is added to the integrated database; if the problem is not yet fully solved, reasoning continues, otherwise it exits. This algorithm greatly improves reasoning efficiency and provides fast decision responses for integrated navigation processing. The reasoning process is shown in Fig. 2.
Integrated database: contains the data needed for reasoning and stores the various intermediate states, facts, data, initial states, and goals produced during reasoning. It is the equivalent of working memory, storing user answers, known facts, and facts derived by reasoning; its contents may change dynamically from problem to problem. During reasoning, the inference engine operates on the fact table according to the execution conditions of the rules, for example deleting, adding, or modifying facts.
The intelligent part of the integrated navigation system mainly studies the selection strategy and switching of inertial/visual integrated working modes, so that the system intelligently decides between single-sensor and integrated states and the combined system makes autonomous decisions according to sensor states, guaranteeing overall positioning performance. Knowledge bases for the inertial and visual sensors are established, and factors such as navigation-source health state, current mission requirements, and the external interference environment are analyzed comprehensively to complete the study of working-mode selection and switching.
The present invention proposes an intelligent integrated navigation method with the following specific steps:
Step 1: generate simulated IMU data:
1.1) Generate inertial measurement unit (IMU) simulation data from the gyroscope and accelerometer measurement principles. The attitude angular rate output by the carrier in flight is ω_nb^b; by the Euler-angle kinematics it is obtained from the attitude-angle rates, where ω_nb^b is the angular rate of the carrier frame relative to the geographic frame and C_n^b is the attitude transformation matrix from the geographic frame to the carrier frame.
The ideal gyroscope output is
ω_ib^b = ω_nb^b + C_n^b ω_in^n,
where C_n^b ω_in^n is the projection onto the carrier frame of the angular rate of the geographic frame relative to the inertial frame.
The ideal accelerometer output is
f^b = C_n^b f^n,
where f^n is the specific force in the geographic frame.
1.2) Random errors (white noise, a first-order Markov error, and a random constant error) are added to the ideal gyroscope output as the simulated gyroscope output data.
1.3) A random error (a first-order Markov error) is added to the ideal accelerometer output as the simulated accelerometer output data.
Step 2: acquire vision data: capture and store images with the camera to generate the visual navigation data.
Step 3: initialize the IMU and the visual sensor:
3.1) The vision data are processed by extracting features from the images and performing feature-point tracking; based on structure-from-motion (SFM) techniques, the visual navigation parameters in a relative coordinate frame can be computed using bundle adjustment.
3.2) Because the IMU sampling rate is much higher than the camera frame rate, timestamp alignment must be considered in order to fuse the data; pre-integration integrates the IMU data between two camera frames together.
3.3) Using the pre-processed data, register the result of the visual computation with the result of the IMU pre-integration, align their timestamps, and initialize.
Step 4: construct the intelligent decision system.
The inertial/visual integrated navigation system combines the inertial and visual navigation-source sensors. The two sensors differ in working state, real-time performance, and geometric distribution, so each sensor must be managed and decided upon, and the currently optimal fusion mode used for the combination, to improve the positioning accuracy and reliability of the system.
4.1) For the working-mode selection strategy and switching, comprehensive analysis of the navigation-source health state, current mission requirements, and external interference environment is needed, producing judgments or decisions on navigation-source management, fusion mode, and information output mode as the performance-evaluation result. The intelligent decision flow chart is shown in Fig. 4.
4.2) Input the inertial and visual navigation information and the performance-evaluation results (anomaly-detection results, etc.) into the intelligent navigation expert system.
4.3) The inference engine first starts running the rules in the knowledge base, reads the current state of each sensing source, and generates intermediate facts that are saved to the integrated database.
4.4) The inference engine then matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, and activates and executes the corresponding rules; the explanation engine can explain and output to the user the basis of the current selection decision. Finally, if the fusion mode switches, a decision can be made on the performance trend of the fusion modes.
4.5) The inference engine searches the knowledge base according to the input navigation-source sensor information, and the corresponding actions are activated whenever certain conditions are met, until the reasoning is complete and the optimal navigation decision information is obtained; the reasoning process is in fact a process of search and matching.
Step 5: inertial/visual integration simulation: a loosely coupled integration mode is used, i.e., the scene-matching result and the IMU inertial data are combined with an unscented Kalman filter.
The embodiment of the present invention generates the required simulation data from the IMU working principle, then initializes the IMU and the visual sensor, then performs intelligent fusion-mode decision-making, and finally carries out the inertial/visual integration simulation analysis.
The specific implementation steps are as follows:
Step 1: generate simulated IMU data:
1.1) Generate inertial measurement unit (IMU) simulation data from the gyroscope and accelerometer measurement principles. The attitude angular rate output by the missile carrier in flight is ω_nb^b; by the Euler-angle kinematics it is obtained from the attitude-angle rates, where ω_nb^b is the angular rate of the carrier frame relative to the geographic frame and C_n^b is the attitude transformation matrix from the geographic frame to the carrier frame.
The ideal gyroscope output is
ω_ib^b = ω_nb^b + C_n^b ω_in^n,
where C_n^b ω_in^n is the projection onto the carrier frame of the angular rate of the geographic frame relative to the inertial frame.
The ideal accelerometer output is
f^b = C_n^b f^n,
where f^n is the specific force in the geographic frame.
1.2) Random errors (white noise, a first-order Markov error, and a random constant error) are added to the ideal gyroscope output as the simulated gyroscope output data.
1.3) A random error (a first-order Markov error) is added to the ideal accelerometer output as the simulated accelerometer output data.
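As an illustration of steps 1.2) and 1.3), the gyroscope error injection can be sketched as follows. The function name, noise magnitudes, and Markov correlation time are illustrative assumptions, not values taken from the patent.

```python
import numpy as np

def simulate_gyro(ideal_rate, dt, n,
                  white_sigma=1e-4,   # white-noise std (rad/s), assumed
                  markov_sigma=5e-5,  # first-order Markov std (rad/s), assumed
                  markov_tau=300.0,   # Markov correlation time (s), assumed
                  const_bias=1e-5):   # random constant bias scale (rad/s), assumed
    """Add white noise, a first-order Markov process, and a random
    constant bias to an ideal gyro rate, per step 1.2)."""
    rng = np.random.default_rng(0)
    bias = const_bias * rng.standard_normal(3)   # random constant error
    markov = np.zeros(3)
    phi = np.exp(-dt / markov_tau)               # Markov decay per sample
    out = np.empty((n, 3))
    for k in range(n):
        markov = (phi * markov
                  + markov_sigma * np.sqrt(1 - phi**2) * rng.standard_normal(3))
        out[k] = ideal_rate + bias + markov + white_sigma * rng.standard_normal(3)
    return out
```

The accelerometer channel of step 1.3) is the same loop with only the Markov term retained.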
Step 2: acquire vision data: capture and store images with the camera to generate the visual navigation data.
Step 3: initialize the IMU and the visual sensor:
3.1) The vision data are processed by extracting features from the images and performing feature-point tracking; based on structure-from-motion (SFM) techniques, the visual navigation parameters in a relative coordinate frame can be computed using bundle adjustment.
3.2) Because the IMU sampling rate is much higher than the camera frame rate, timestamp alignment must be considered in order to fuse the data; pre-integration integrates the IMU data between two camera frames together. By defining the relative changes in attitude, velocity, and position between two key frames, these quantities can be expressed with the IMU measurement data and are independent of the states being optimally estimated. Therefore, during iterative optimization the pre-integrated quantities need only be computed once, and when updating states only the relative changes are used to compute the state of the next key frame, which greatly reduces the computation and improves the running speed on embedded systems.
3.3) Using the pre-processed data, register the result of the visual computation with the result of the IMU pre-integration, align their timestamps, and initialize. During registration the corresponding three-dimensional information, namely the motion parameters of the camera and the structure of the scene, can be recovered from the two-dimensional image sequence, yielding a basic state estimate.
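The pre-integration described in 3.2) can be sketched as below: a simplified Euler integration that accumulates the relative rotation, velocity, and position increments between two camera frames in the axes of the first frame. Bias modeling and noise-covariance propagation, which a full implementation carries, are omitted, and the function name is an assumption.

```python
import numpy as np

def preintegrate(gyro, accel, dt):
    """Accumulate relative rotation R, velocity increment dv, and position
    increment dp from the IMU samples between two camera frames."""
    R = np.eye(3)       # rotation from first-frame body axes
    dv = np.zeros(3)    # velocity increment in first-frame axes
    dp = np.zeros(3)    # position increment in first-frame axes
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (R @ a) * dt**2
        dv = dv + (R @ a) * dt
        theta = w * dt                 # incremental rotation vector
        angle = np.linalg.norm(theta)
        if angle > 1e-12:
            k = theta / angle
            K = np.array([[0, -k[2], k[1]],
                          [k[2], 0, -k[0]],
                          [-k[1], k[0], 0]])
            # Rodrigues formula for the incremental rotation
            R = R @ (np.eye(3) + np.sin(angle) * K + (1 - np.cos(angle)) * (K @ K))
    return R, dv, dp
```

Because only these relative increments enter the optimization, they are computed once per frame pair, which is the computational saving described in 3.2).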
Step 4: construct the intelligent decision system.
The inertial/visual integrated navigation system combines the inertial and visual navigation-source sensors. The two sensors differ in working state, real-time performance, and geometric distribution, so each sensor must be managed and decided upon, and the currently optimal fusion mode used for the combination, to improve the positioning accuracy and reliability of the system.
4.1) Since the inertial and visual sensors have different positioning characteristics, the navigation sources must be abstracted and a description established for each source. For the working-mode selection strategy and switching, comprehensive analysis of factors such as navigation-source health state, current mission requirements, and the external interference environment is needed, producing judgments or decisions on navigation-source management, fusion mode, information output mode, etc. The intelligent decision flow chart is shown in Fig. 4.
4.2) Input the inertial and visual navigation information and the performance-evaluation results (anomaly-detection results, etc.) into the intelligent navigation expert system.
4.3) The inference engine first starts running the rules in the knowledge base, reads the current state of each sensing source, and generates intermediate facts that are saved to the integrated database.
4.4) The inference engine then matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, and activates and executes the corresponding rules; the explanation engine can explain and output to the user the basis of the current selection decision. Finally, if the fusion mode switches, a decision can be made on the performance trend of the fusion modes.
4.5) The inference engine searches the knowledge base according to the input navigation-source sensor information, and the corresponding actions are activated whenever certain conditions are met, until the reasoning is complete and the optimal navigation decision information is obtained; the reasoning process is in fact a process of search and matching. The main function of the inference engine is to reason about and judge the real-time state of each navigation-source sensor according to the relevant facts and rules, selecting the currently available best fusion mode and thus realizing the system's fusion-mode decision. The reasoning mechanism relies on knowledge. Knowledge is stored in the knowledge base in a prescribed form, and the storage format should make it convenient to analyze and process the working condition, performance, geometric distribution, and other states of the navigation-source sensors, realizing navigation-source management and decision reasoning.
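A minimal forward-chaining sketch of the reasoning in 4.5): rules fire when their premises match the fact base and write their conclusions back as intermediate facts until no rule fires. The fact keys, rule set, and fusion-mode labels here are illustrative assumptions; the patent does not disclose its knowledge base at this level of detail.

```python
def decide_fusion_mode(facts):
    """Naive forward chaining: each rule is a (premise, conclusion) pair;
    matching rules fire and add their conclusion to the fact base
    (the integrated database) until quiescence."""
    rules = [
        (lambda f: f.get("scene_match_ok") and f.get("ins_ok"),
         ("fusion_mode", "INS/scene-matching UKF")),
        (lambda f: not f.get("scene_match_ok") and f.get("ins_ok"),
         ("fusion_mode", "pure INS dead reckoning")),
        (lambda f: f.get("fusion_mode") == "pure INS dead reckoning",
         ("alert", "error recursion only, awaiting matching area")),
    ]
    changed = True
    while changed:
        changed = False
        for premise, (key, value) in rules:
            if premise(facts) and facts.get(key) != value:
                facts[key] = value   # intermediate fact into the database
                changed = True
    return facts
```

A production system of this shape is where a Rete network would replace the naive scan over all rules for efficiency.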
Step 5: inertial/visual integration simulation: a loosely coupled integration mode is used, i.e., the scene-matching result and the IMU inertial data are combined with an unscented Kalman filter.
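The loosely coupled combination of step 5 can be sketched as one generic unscented Kalman filter step in the usual scaled sigma-point form. This is a textbook-style sketch under stated assumptions, not the patent's own filter: the process and measurement models f and h are passed in, and for brevity the measurement sigma points reuse the predicted state points without re-drawing after process noise is added.

```python
import numpy as np

def ukf_step(x, P, f, h, z, Q, R, alpha=1e-3, beta=2.0, kappa=0.0):
    """One unscented Kalman filter step: propagate sigma points through
    the process model f, then update with measurement z via model h."""
    n = x.size
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)
    sigmas = np.vstack([x, x + S.T, x - S.T])          # 2n+1 sigma points
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)
    # Predict.
    X = np.array([f(s) for s in sigmas])
    x_pred = Wm @ X
    P_pred = Q + sum(Wc[i] * np.outer(X[i] - x_pred, X[i] - x_pred)
                     for i in range(2 * n + 1))
    # Update (measurement sigma points from the predicted state points).
    Z = np.array([h(s) for s in X])
    z_pred = Wm @ Z
    Pzz = R + sum(Wc[i] * np.outer(Z[i] - z_pred, Z[i] - z_pred)
                  for i in range(2 * n + 1))
    Pxz = sum(Wc[i] * np.outer(X[i] - x_pred, Z[i] - z_pred)
              for i in range(2 * n + 1))
    K = Pxz @ np.linalg.inv(Pzz)
    return x_pred + K @ (z - z_pred), P_pred - K @ Pzz @ K.T
```

For the loose coupling above, h would return the position components compared against the scene-matching fix, and f would propagate the inertial state.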
(1) Embodiment settings
The UAV flight time is set to 2 h, flight speed 300 m/s, inertial navigation positioning accuracy 1 n mile/h (CEP), inertial update frequency 200 Hz, and scene matching-area positioning accuracy 1 m (CEP). Considering the large computational load of scene matching, the scene-matching position update period is 1 s. In areas where scene matching fails, the system works in pure inertial mode, recursively propagating the system error states to guarantee navigation-information continuity; once scene matching recovers, the whole navigation system returns to the combined working mode. With a scene-matching update period of 1 s, the integrated positioning accuracy is better than 2.5 m.
In practice scene matching is not continuously available, so it is necessary to consider for how long scene matching can be unavailable while the integrated positioning accuracy is still guaranteed; this provides a reference for planning the spacing of matching areas and selecting continuous imagery.
While meeting the integrated positioning requirement, the scene-matching update period is adjusted until the inertial/visual integrated positioning error exceeds 5 m; the corresponding time interval can be regarded as the maximum scene-matching interval, i.e., the maximum spacing between matching areas.
Simulating under the above settings, when the combination period is less than 10 s, i.e., a flight distance of 3 km, the integrated positioning accuracy is better than 5 m (CEP); in other words, with scene matching the whole-course integrated positioning accuracy is guaranteed over a 3 km flight.
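The 3 km matching-area gap quoted above is simply the stated flight speed times the largest admissible combination period; as a quick worked check:

```python
# Values stated in the simulation setup above.
speed_mps = 300.0            # UAV flight speed
max_update_period_s = 10.0   # largest scene-matching period keeping accuracy within 5 m (CEP)

# Maximum tolerable matching-area gap, in meters.
max_gap_m = speed_mps * max_update_period_s
print(max_gap_m)
```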
The above embodiments are intended only to illustrate the technical solution of the present invention, not to limit it. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features replaced by equivalents; such modifications and replacements do not remove the essence of the corresponding technical solutions from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (1)

1. An inertial/visual intelligent integrated navigation method, characterized by comprising the following steps:
Step 1: generate inertial measurement unit (IMU) simulation data from the gyroscope and accelerometer. The attitude angular rate output by the carrier in flight is ω_nb^b, the angular rate of the carrier frame relative to the geographic frame, where C_n^b is the attitude transformation matrix from the geographic frame to the carrier frame;
the ideal gyroscope output is ω_ib^b = ω_nb^b + C_n^b ω_in^n, where C_n^b ω_in^n is the projection onto the carrier frame of the angular rate of the geographic frame relative to the inertial frame;
the ideal accelerometer output is f^b = C_n^b f^n, where f^n is the specific force in the geographic frame;
random errors, including white noise, a first-order Markov error, and a random constant error, are added to the ideal gyroscope output as the simulated gyroscope output data;
a random error, namely a first-order Markov error, is added to the ideal accelerometer output as the simulated accelerometer output data;
Step 2: acquire images, extract features from the images and perform feature-point tracking, and compute visual navigation parameters in a relative coordinate frame using bundle adjustment; integrate the IMU data between two camera frames by pre-integration, register the result of the visual computation with the result of the IMU pre-integration, and align their timestamps;
Step 3: input the inertial and visual navigation information and the performance-evaluation results into the intelligent navigation expert system; the inference engine starts running the rules in the knowledge base, reads the current state of each sensing source, and generates intermediate facts that are saved to the integrated database; the inference engine then matches the rules in the knowledge base against the real-time state of each sensing source, selects the currently optimal navigation fusion mode, activates and executes the corresponding rules, and outputs the basis of the current selection decision to the user; the inference engine searches the knowledge base according to the input navigation-source sensor information, and the corresponding actions are activated whenever the set conditions are met, until the reasoning is complete and the optimal navigation decision information is obtained;
Step 4: combine the scene-matching result and the IMU inertial data with an unscented Kalman filter.
CN201811579143.7A 2018-12-24 2018-12-24 An inertial/visual intelligent integrated navigation method Pending CN109655058A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811579143.7A CN109655058A (en) 2018-12-24 2018-12-24 An inertial/visual intelligent integrated navigation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811579143.7A CN109655058A (en) 2018-12-24 2018-12-24 An inertial/visual intelligent integrated navigation method

Publications (1)

Publication Number Publication Date
CN109655058A true CN109655058A (en) 2019-04-19

Family

ID=66115466

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811579143.7A Pending CN109655058A (en) 2018-12-24 2018-12-24 An inertial/visual intelligent integrated navigation method

Country Status (1)

Country Link
CN (1) CN109655058A (en)


Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160033280A1 * 2014-08-01 2016-02-04 Toyota Motor Engineering & Manufacturing North America, Inc. Wearable earpiece for providing social and environmental awareness
US20160305784A1 * 2015-04-17 2016-10-20 Regents Of The University Of Minnesota Iterative kalman smoother for robust 3d localization for vision-aided inertial navigation
CN106446815A (en) * 2016-09-14 2017-02-22 Zhejiang University Simultaneous positioning and map building method
CN106679648A (en) * 2016-12-08 2017-05-17 Southeast University Vision-inertia integrated SLAM (Simultaneous Localization and Mapping) method based on genetic algorithm
CN106708066A (en) * 2015-12-20 2017-05-24 CETC 20 Research Institute Autonomous landing method of unmanned aerial vehicle based on vision/inertial navigation
CN106840196A (en) * 2016-12-20 2017-06-13 Nanjing University of Aeronautics and Astronautics A kind of strap-down inertial computer testing system and implementation method
CN107014371A (en) * 2017-04-14 2017-08-04 Southeast University UAV integrated navigation method and apparatus based on the adaptive interval Kalman of extension
CN107796391A (en) * 2017-10-27 2018-03-13 Harbin Engineering University A kind of strapdown inertial navigation system/visual odometry Combinated navigation method
CN108375370A (en) * 2018-07-02 2018-08-07 Jiangsu CAS Institute of Intelligent Science and Technology Application A kind of complex navigation system towards intelligent patrol unmanned plane
CN108731670A (en) * 2018-05-18 2018-11-02 Nanjing University of Aeronautics and Astronautics Inertia/visual odometry combined navigation locating method based on measurement model optimization


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
CHENG Lin: "Application of Expert Systems in Aided Decision-Making for Integrated Navigation" (专家系统在组合导航辅助决策中的应用研究), Science & Technology Vision (《科技视界》) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111065043A (en) * 2019-10-25 2020-04-24 Chongqing University of Posts and Telecommunications System and method for fusion positioning of vehicles in tunnel based on vehicle-road communication
CN113587975A (en) * 2020-04-30 2021-11-02 EMC IP Holding Company LLC Method, apparatus and computer program product for managing application environments
CN113031040A (en) * 2021-03-01 2021-06-25 Ningxia University Positioning method and system for airport ground service vehicle
CN113949999A (en) * 2021-09-09 2022-01-18 Zhejiang Lab Indoor positioning navigation equipment and method
CN113949999B (en) * 2021-09-09 2024-01-30 Zhejiang Lab Indoor positioning navigation equipment and method
CN116531690A (en) * 2023-06-25 2023-08-04 Unit 63863 of the Chinese People's Liberation Army Forest fire extinguishing bomb with throwing device and throwing control method
CN116531690B (en) * 2023-06-25 2023-10-20 Unit 63863 of the Chinese People's Liberation Army Forest fire extinguishing bomb with throwing device and throwing control method

Similar Documents

Publication Publication Date Title
CN109655058A (en) A kind of inertia/Visual intelligent Combinated navigation method
CN106840148A (en) Wearable positioning and path guide method based on binocular camera under outdoor work environment
CN109959377A (en) A kind of robot navigation's positioning system and method
Scherer et al. Efficient onboard RGBD-SLAM for autonomous MAVs
CN110068335A (en) Unmanned aerial vehicle cluster real-time positioning method and system under GPS rejection environment
Kallmann et al. Geometric and discrete path planning for interactive virtual worlds
Chudoba et al. Exploration and mapping technique suited for visual-features based localization of mavs
CN106767791A (en) A kind of inertia/visual combination air navigation aid using the CKF based on particle group optimizing
Rao et al. Ctin: Robust contextual transformer network for inertial navigation
Tang et al. Onboard detection-tracking-localization
Herath et al. Neural inertial localization
Feng et al. S3e: A large-scale multimodal dataset for collaborative slam
Acuna et al. Moma: Visual mobile marker odometry
Deng et al. A cluster positioning architecture and relative positioning algorithm based on pigeon flock bionics
Rocha et al. Plannie: A benchmark framework for autonomous robots path planning algorithms integrated to simulated and real environments
Feng et al. Image-based trajectory tracking through unknown environments without absolute positioning
Hönig et al. Dynamic multi-target coverage with robotic cameras
Alliez et al. Indoor localization and mapping: Towards tracking resilience through a multi-slam approach
Deng et al. Entropy flow-aided navigation
Zhang et al. Exploration with global consistency using real-time re-integration and active loop closure
Waxman et al. A visual navigation system
Leng et al. An improved method for odometry estimation based on EKF and Temporal Convolutional Network
Gong et al. DeepNav: A scalable and plug-and-play indoor navigation system based on visual CNN
CN113689501A (en) Double-machine cooperative target machine positioning and tracking control method based on convergence point
Kou et al. Autonomous Navigation of UAV in Dynamic Unstructured Environments via Hierarchical Reinforcement Learning

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20190419