CN107144281B - Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision - Google Patents
Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision
- Publication number
- CN107144281B CN107144281B CN201710530501.4A CN201710530501A CN107144281B CN 107144281 B CN107144281 B CN 107144281B CN 201710530501 A CN201710530501 A CN 201710530501A CN 107144281 B CN107144281 B CN 107144281B
- Authority
- CN
- China
- Prior art keywords
- module
- flight
- unmanned aerial vehicle
- marker
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D30/00—Reducing energy consumption in communication networks
- Y02D30/70—Reducing energy consumption in communication networks in wireless communication networks
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Navigation (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses an unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision, comprising a plurality of markers arranged indoors and provided with coded information, a monocular vision system module, a sensing module and a flight control system. The monocular vision system module comprises a monocular camera and a visual information processing unit; the monocular camera acquires marker information during flight, and the visual information processing unit completes the acquisition and processing of the video data and transmits the processing result to the flight control system. The invention uses markers carrying coded information as guides: a single marker suffices as a cooperative target for identifying the pose of the unmanned aerial vehicle relative to that marker. Because of the coded information, the markers are easy to distinguish both from the environment and from one another, and the coding scheme makes them suitable for batch production, so no great effort is needed to devise markers that stand out from the environment and differ from each other.
Description
Technical Field
The invention relates to the technical field of unmanned aerial vehicle control systems, and in particular to an unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision.
Background
An unmanned aerial vehicle is an aircraft without a pilot on board. By adopting technologies such as satellite positioning, remote sensing, geospatial information, aerospace, automatic control and computer-aided analysis, unmanned aerial vehicles can serve many fields such as land administration, surveying and mapping, forestry, traffic, water conservancy and military affairs.
Unmanned aerial vehicle navigation systems are one of the key components of unmanned aerial vehicle systems. At present, the navigation technologies used by unmanned aerial vehicles mainly comprise inertial navigation, satellite navigation, Doppler navigation, visual navigation, terrain-aided navigation and geomagnetic navigation. In practical applications, a suitable navigation system must be selected according to the flight environment of the unmanned aerial vehicle, the task requirements, and the application range and use conditions of each navigation technology.
In recent years, research on unmanned aerial vehicle navigation technology has made great progress, but most navigation technologies are only suitable for outdoor space and cannot be applied to navigation in unknown indoor environments. The indoor navigation environment is unknown and complex: the unmanned aerial vehicle does not know the size of the indoor environment, the shape and distribution of the obstacles, or whether artificial reference objects exist; obstacles in the indoor environment are randomly placed or mutually occluded; and indoor lighting changes with the viewing angle of the unmanned aerial vehicle. Some domestic research has been carried out on indoor navigation control of unmanned aerial vehicles, but a unified theoretical system has not yet been formed, and some important theoretical and technical problems remain to be solved, mainly environmental modeling, unmanned aerial vehicle positioning, navigation controller design and real-time motion control.
Disclosure of Invention
The invention aims to overcome the technical defects in the prior art and provides an unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision.
The technical scheme adopted for realizing the purpose of the invention is as follows:
an unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision comprises a plurality of markers arranged indoors and provided with coding information, a monocular vision system module, a sensing module and a flight control system,
the monocular vision system module comprises a monocular camera and a visual information processing unit, wherein the monocular camera is used for acquiring marker information during flight, and the visual information processing unit is used for completing the acquisition and processing of video data and transmitting the processing result to the flight control system;
the sensing module comprises a gyroscope, an accelerometer and a relative height measuring module;
the flight control system comprises a sensor data fusion module based on an extended Kalman filter, and the sensor data fusion module receives the processing result and the flight state information from the sensor module and fuses the processing result and the flight state information to generate a flight instruction.
The output end of the sensor data fusion module based on the extended Kalman filter is connected with the input end of the flight controller, and the output end of the flight controller is connected with the input end of the data transmission module.
The relative height measuring module is a laser ranging module.
The marker is block-shaped, plate-shaped or sheet-shaped, and the coded information is distinguishable between markers.
The system further comprises a flight state indication module and a remote controller, both communicatively connected to the data transmission module.
The positioning method of the unmanned aerial vehicle indoor positioning system comprises the following steps:
1) The monocular camera acquires a video stream containing a marker and transmits it to the visual information processing unit, which extracts feature information from the coded information of the marker captured in the field of view;
2) The code corresponding to the marker, and the coordinates of the marker in the world coordinate system, are obtained from the feature information;
3) The relative pose of the camera with respect to the marker is calculated by combining the marker coordinates, the corner features and the monocular camera parameters, and the pose data and indoor navigation data of the unmanned aerial vehicle in the indoor world coordinate system are then obtained;
4) The sensor data fusion module based on the extended Kalman filter receives the pose data, the indoor navigation data and the flight state information from the sensing module, performs fusion processing, and generates a flight instruction according to the set flight track.
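Step 3) — recovering the camera pose relative to a planar marker from its corner points and the camera intrinsics — can be sketched as below. The patent does not disclose its solver; this is a standard planar-homography decomposition (H is proportional to K[r1 r2 t] for points on the marker plane Z = 0), with every number and function name illustrative rather than the patent's:

```python
import numpy as np

def homography_dlt(plane_pts, img_pts):
    """Estimate the 3x3 homography mapping marker-plane coordinates
    (Z = 0) to pixel coordinates via the direct linear transform."""
    A = []
    for (X, Y), (u, v) in zip(plane_pts, img_pts):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)

def pose_from_marker(plane_pts, img_pts, K):
    """Relative pose (R, t) of the marker in the camera frame from its
    corner correspondences, using H = s * K [r1 r2 t] on the plane Z = 0."""
    B = np.linalg.inv(K) @ homography_dlt(plane_pts, img_pts)
    lam = 1.0 / np.linalg.norm(B[:, 0])
    if B[2, 2] < 0:                      # keep the marker in front of the camera
        lam = -lam
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)          # re-project onto a rotation matrix
    R = U @ np.diag([1.0, 1.0, np.linalg.det(U @ Vt)]) @ Vt
    return R, t
```

Inverting this relative pose and composing it with the marker's stored world coordinates then gives the vehicle pose in the indoor world frame, as step 3) requires.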
The flight instruction is transmitted to the unmanned aerial vehicle power module through the flight controller and the data transmission module, and the flight controller is based on the μC/OS-II system.
The method further comprises a remote control step, wherein a remote control instruction from the remote controller is transmitted to the unmanned aerial vehicle power module through the data transmission module and is also input into the sensor data fusion module.
Compared with the prior art, the invention has the beneficial effects that:
the invention takes the marker with the coding information as the guide, only one marker is needed as the cooperation target to identify the relative pose of the unmanned aerial vehicle relative marker, and the marker is easy to distinguish from the environment because of the coding information, and is suitable for batch production because of the coding characteristic, and a great deal of effort is not needed to put out the marker which is distinguished from the environment and has the distinction between the markers. The identification itself has coding information, rather than coding a plurality of identifications after the cooperative identification is arranged on the indoor ground, so that the specific coordinates of the identification can be obtained through information matching after the coding information is read, and the calculation speed is improved. The marker with the coded information can effectively improve real-time high-precision positioning in the environment or interaction with the environment.
Drawings
Fig. 1 is a system block diagram of the unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision according to the present invention;
Fig. 2 is a diagram of the indoor navigation data solution based on cooperative targets and monocular vision according to the present invention;
Fig. 3 is a schematic diagram of the sensor data fusion module based on the extended Kalman filter according to the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and the specific examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
As shown in the figures, the unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision comprises a plurality of markers provided with coded information and arranged on the indoor floor or at a certain height, a monocular vision system module, a sensing module and a flight control system.
The monocular vision system module comprises a monocular camera and a visual information processing unit. The monocular camera is mounted on board; it acquires marker information during flight and transmits the video stream to the visual information processing unit over a CAN data bus. The visual information processing unit is a processing system with dual CPU and GPU processing cores; it completes the acquisition and processing of the video data and transmits the processing result to the flight control system. The dual-core processing system performs the data analysis, processing and calculation, which effectively improves the calculation speed.
the sensing module comprises a gyroscope, an accelerometer and a relative height measuring module; meanwhile, the system also comprises conventional sensors such as a magnetometer, a barometer and the like, and the multi-sensor is arranged to detect various types, so that the control stability is further improved; the relative height measuring module is a laser ranging module;
The flight control system comprises a sensor data fusion module based on an extended Kalman filter; the sensor data fusion module receives the processing result and the flight state information from the sensing module and fuses them to generate a flight instruction.
The extended Kalman filter is a software algorithm. The key to sensor fusion based on this algorithm is the establishment of the system model and the handling of marker switching when monocular vision passes between multiple markers, so as to ensure a smooth transition and stable flight of the unmanned aerial vehicle while the marker in view changes.
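The patent does not disclose its state model, so the fragment below is only an illustrative stand-in: it fuses an IMU-propagated constant-velocity state with vision position fixes along one axis. With this linear model the extended Kalman filter reduces to an ordinary Kalman filter; a full implementation would additionally linearize the attitude dynamics at each step. All noise levels are assumed:

```python
import numpy as np

class VisionImuFuser:
    """One-axis Kalman filter: IMU acceleration drives the prediction,
    vision position fixes drive the update. Illustrative only."""

    def __init__(self, dt):
        self.x = np.zeros(2)                      # state: [position, velocity]
        self.P = np.eye(2)
        self.F = np.array([[1.0, dt], [0.0, 1.0]])
        self.G = np.array([0.5 * dt * dt, dt])    # acceleration input model
        self.Q = 1e-4 * np.eye(2)                 # process noise (assumed)
        self.H = np.array([[1.0, 0.0]])           # vision measures position only
        self.R = np.array([[1e-2]])               # vision noise (assumed)

    def predict(self, accel):
        self.x = self.F @ self.x + self.G * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z_pos):
        y = z_pos - self.H @ self.x               # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
```

Between vision fixes the filter keeps predicting from the IMU alone, which is what bridges the gaps while the view switches from one marker to the next.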
The invention uses markers carrying coded information as guides: a single marker suffices as a cooperative target for identifying the pose of the unmanned aerial vehicle relative to that marker. Because of the coded information, the markers are easy to distinguish both from the environment and from one another, and the coding scheme makes them suitable for batch production, so no great effort is needed to devise markers that stand out from the environment and differ from each other. The marker itself carries the coded information, rather than a plurality of markers being encoded only after cooperative markers have been laid on the indoor floor; the specific coordinates of a marker can therefore be obtained by information matching as soon as its code is read, which improves the calculation speed. Markers with coded information can effectively improve real-time, high-precision positioning in the environment and interaction with the environment.
In the unmanned aerial vehicle control system, the output of the sensor data fusion module based on the extended Kalman filter is connected to the input of the flight controller, and the output of the flight controller is connected to the input of the data transmission module; the data transmission module delivers the flight instruction to the unmanned aerial vehicle power module, enabling terrain-following flight and autonomous take-off and landing over various terrain.
The marker is block-shaped, plate-shaped or sheet-shaped, and the coded information is distinguishable between markers. A marker and its coded information are generated together: once the binary coded information is determined, designed software can directly print or generate the corresponding picture while storing the corresponding code and coordinate information. Since the image of each marker itself contains its coded information, every marker picture differs from the others, and the marker picture library therefore holds all coded template pictures. Each marker has a fixed number of corner points, and the geometric relations between the corner points satisfy invariance under projective transformation; this characteristic information is used to determine whether a detected pattern is a designed marker.
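The patent does not specify its binary code design; the sketch below shows one common scheme, purely as an assumption: the marker interior is read as a square bit grid, and the ID is made invariant to in-plane rotation by taking the minimum over the four 90° readings, so a marker decodes identically whatever the vehicle's heading.

```python
import numpy as np

def grid_to_id(bits):
    """Read a square 0/1 grid row-major as one integer ID."""
    return int("".join(str(int(b)) for b in np.asarray(bits).ravel()), 2)

def canonical_id(bits):
    """Decode a marker regardless of its in-plane rotation by taking the
    smallest ID among the four 90-degree rotations of the bit grid."""
    bits = np.asarray(bits)
    return min(grid_to_id(np.rot90(bits, k)) for k in range(4))
```

Storing each canonical ID alongside the marker's surveyed world coordinates then reduces step 2) of the positioning method to a table lookup.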
Meanwhile, in order to improve controllability and human-machine interaction, the system further comprises a flight state indication module and a remote controller, both communicatively connected to the data transmission module. A ground station communicates with the data transmission module to obtain the flight state of the unmanned aerial vehicle and displays it on a mobile phone, tablet computer, computer or the like; when a control abnormality occurs, early warning can be given through the colour and flashing frequency of the navigation lights, voice prompts and the like.
Meanwhile, the invention discloses a positioning method of the above unmanned aerial vehicle indoor positioning system, comprising the following steps:
1) The monocular camera acquires a video stream containing a marker and transmits it to the visual information processing unit, which extracts feature information from the coded information of the marker captured in the field of view;
2) The code corresponding to the marker, and the coordinates of the marker in the world coordinate system, are obtained from the feature information;
3) The relative pose of the camera with respect to the marker is calculated by combining the marker coordinates, the corner features and the monocular camera parameters, and the pose data and indoor navigation data of the unmanned aerial vehicle in the indoor world coordinate system are then obtained;
4) The sensor data fusion module based on the extended Kalman filter receives the pose data, the indoor navigation data and the flight state information from the sensing module, performs fusion processing, and generates a flight instruction according to the set flight track.
The flight instruction is transmitted to the unmanned aerial vehicle power module through the flight controller and the data transmission module. The flight controller is based on the μC/OS-II system. The system further comprises a remote control module and a ground station, such as a mobile phone, tablet computer or computer; the remote control module is communicatively connected with the unmanned aerial vehicle power module through the data transmission module. Under the task scheduling provided by μC/OS-II, the flight controller judges, calculates and processes the collected sensor information, the stored relevant states and data, and the instructions and data from uplink remote control or ground-station planning sent by the radio control terminal; it then outputs instructions to the servo actuating mechanism, namely the steering engine system, controlling the control surfaces and engine throttle of the unmanned aerial vehicle so as to control its flight.
The flight path can be planned in advance by software or planned online from the perceived information about the environment. Recognition of the cooperative-target markers feeds back the position and attitude of the unmanned aerial vehicle in the indoor environment and is the basis on which the vehicle flies along the preset or online-corrected trajectory. The monocular-vision position estimate is used to control the position loop (that is, the outer loop) of the unmanned aerial vehicle; because the vehicle's inner dynamics are highly dynamic, it cannot fly stably on visual information alone, and the visual data must be fused with the inertial sensors.
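The outer (position) loop described above can be sketched as a simple proportional law running at the vision rate: it turns the position error from the marker-based estimate into a bounded velocity command for the fast, IMU-driven inner loop. The gain and limit below are illustrative, not values from the patent:

```python
def position_outer_loop(setpoint, pos_est, kp=1.5, v_max=1.0):
    """Vision-rate outer loop: position error -> clipped velocity
    command, to be tracked by the fast inner attitude/rate loop."""
    v_cmd = kp * (setpoint - pos_est)
    return max(-v_max, min(v_max, v_cmd))
```

Saturating the command at `v_max` keeps the demanded motion within what the inner loop, and the marker-tracking camera, can follow between vision updates.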
The foregoing is merely a preferred embodiment of the present invention. It should be noted that those skilled in the art may make modifications and adaptations without departing from the principles of the present invention, and such modifications and adaptations are also intended to fall within the scope of the present invention.
Claims (8)
1. An unmanned aerial vehicle indoor positioning system based on cooperative targets and monocular vision is characterized by comprising a plurality of indoor markers provided with coded information, a monocular vision system module, a sensing module and a flight control system,
the monocular vision system module comprises a monocular camera and a visual information processing unit, wherein the monocular camera is used for acquiring marker information during flight, and the visual information processing unit is used for completing the acquisition and processing of video data and transmitting the processing result to the flight control system;
the sensing module comprises a gyroscope, an accelerometer and a relative height measuring module;
the flight control system comprises a sensor data fusion module based on an extended Kalman filter, and the sensor data fusion module receives the processing result and the flight state information from the sensing module and fuses them to generate a flight instruction.
2. The unmanned aerial vehicle indoor positioning system of claim 1, wherein the output of the extended kalman filter based sensor data fusion module is connected to an input of a flight controller, and wherein the output of the flight controller is connected to an input of the data transmission module.
3. The unmanned aerial vehicle indoor positioning system of claim 1, wherein the relative height measurement module is a laser ranging module.
4. The unmanned aerial vehicle indoor positioning system of claim 1, wherein the marker is block-shaped, plate-shaped or sheet-shaped, and the coded information is distinguishable between markers.
5. The unmanned aerial vehicle indoor positioning system of claim 2, further comprising a flight status indication module and a remote controller, both communicatively coupled to the data transmission module.
6. A positioning method of the unmanned aerial vehicle indoor positioning system according to claim 1, comprising the following steps:
1) The monocular camera acquires a video stream containing a marker and transmits it to the visual information processing unit, which extracts feature information from the coded information of the marker captured in the field of view;
2) The code corresponding to the marker, and the coordinates of the marker in the world coordinate system, are obtained from the feature information;
3) The relative pose of the camera with respect to the marker is calculated by combining the marker coordinates, the corner features and the monocular camera parameters, and the pose data and indoor navigation data of the unmanned aerial vehicle in the indoor world coordinate system are then obtained;
4) The sensor data fusion module based on the extended Kalman filter receives the pose data, the indoor navigation data and the flight state information from the sensing module, performs fusion processing, and generates a flight instruction according to the set flight track.
7. The positioning method of claim 6, wherein the flight instruction is transmitted to the unmanned aerial vehicle power module via a flight controller and a data transmission module, the flight controller being based on a μC/OS-II system.
8. The positioning method according to claim 6, further comprising a remote control step, wherein a remote control command of the remote control module is transmitted to the unmanned aerial vehicle power module via the data transmission module and is input to the sensor data fusion module.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710530501.4A CN107144281B (en) | 2017-06-30 | 2017-06-30 | Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107144281A CN107144281A (en) | 2017-09-08 |
CN107144281B true CN107144281B (en) | 2023-09-12 |
Family
ID=59785556
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710530501.4A Active CN107144281B (en) | 2017-06-30 | 2017-06-30 | Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107144281B (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108592914A (en) * | 2018-04-08 | 2018-09-28 | 河南科技学院 | The positioning of complex region inspecting robot, navigation and time service method under no GPS scenario |
WO2020000386A1 (en) * | 2018-06-29 | 2020-01-02 | 深圳市大疆创新科技有限公司 | Flight control method, device and system, and storage medium |
CN109405821B (en) * | 2018-09-21 | 2020-01-03 | 北京三快在线科技有限公司 | Method and device for positioning and target equipment |
CN109977924A (en) * | 2019-04-15 | 2019-07-05 | 北京麦飞科技有限公司 | For real time image processing and system on the unmanned plane machine of crops |
CN110261877B (en) * | 2019-06-26 | 2024-06-11 | 南京航空航天大学 | Ground-air collaborative visual navigation method and device based on improved graph optimization SLAM |
CN110446159B (en) * | 2019-08-12 | 2020-11-27 | 上海工程技术大学 | System and method for accurate positioning and autonomous navigation of indoor unmanned aerial vehicle |
CN111486841B (en) * | 2020-05-19 | 2022-04-05 | 西安因诺航空科技有限公司 | Unmanned aerial vehicle navigation positioning method based on laser positioning system |
CN113720331B (en) * | 2020-12-25 | 2023-12-19 | 北京理工大学 | Multi-camera fused unmanned aerial vehicle in-building navigation positioning method |
CN112859923B (en) * | 2021-01-25 | 2022-02-18 | 西北工业大学 | Unmanned aerial vehicle vision formation flight control system |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2071353A2 (en) * | 2007-12-14 | 2009-06-17 | The Boeing Company | System and methods for autonomous tracking and surveillance |
CN101667032A (en) * | 2008-09-03 | 2010-03-10 | 中国科学院自动化研究所 | Vision-based target tracking system using unmanned helicopter |
CN106527485A (en) * | 2016-12-14 | 2017-03-22 | 天津文林科技有限公司 | Unmanned aerial vehicle (UAV) fixed-point flight control system based on photo-electro-mechanical integrated visual positioning |
Also Published As
Publication number | Publication date |
---|---|
CN107144281A (en) | 2017-09-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107144281B (en) | Unmanned aerial vehicle indoor positioning system and positioning method based on cooperative targets and monocular vision | |
WO2018032457A1 (en) | Systems and methods for augmented stereoscopic display | |
CN110426046B (en) | Unmanned aerial vehicle autonomous landing runway area obstacle judging and tracking method | |
CN103175524B (en) | A kind of position of aircraft without view-based access control model under marking environment and attitude determination method | |
CN105222760A (en) | The autonomous obstacle detection system of a kind of unmanned plane based on binocular vision and method | |
CN104298248A (en) | Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle | |
CN103353758A (en) | Indoor robot navigation device and navigation technology thereof | |
CN110097620A (en) | High-precision map creation system based on image and three-dimensional laser | |
CN112558608A (en) | Vehicle-mounted machine cooperative control and path optimization method based on unmanned aerial vehicle assistance | |
Li et al. | UAV autonomous landing technology based on AprilTags vision positioning algorithm | |
CN102190081A (en) | Vision-based fixed point robust control method for airship | |
CN116540784B (en) | Unmanned system air-ground collaborative navigation and obstacle avoidance method based on vision | |
CN114923477A (en) | Multi-dimensional space-ground collaborative map building system and method based on vision and laser SLAM technology | |
CN104965513A (en) | Son hopping robot recovery system and recovery method | |
CN109116861A (en) | A kind of unmanned aerial vehicle ground control system | |
CN110309883A (en) | A kind of unmanned plane autonomic positioning method of view-based access control model SLAM | |
CN115826622B (en) | Night co-location method for unmanned aerial vehicle group | |
CN112556695A (en) | Indoor positioning and three-dimensional modeling method and system, electronic equipment and storage medium | |
CN115755575A (en) | ROS-based double-tripod-head unmanned aerial vehicle autonomous landing method | |
CN112904895B (en) | Image-based airplane guiding method and device | |
Zhou et al. | Multi-robot real-time cooperative localization based on high-speed feature detection and two-stage filtering | |
CN113959412A (en) | Unmanned aerial vehicle docking process active visual navigation method based on infrared detection | |
WO2021072766A1 (en) | Flight control method and system, unmanned aerial vehicle, and storage medium | |
Ming et al. | Laser tracking leader-follower automatic cooperative navigation system for UAVs | |
Zhong et al. | A vision-based auxiliary system of multirotor unmanned aerial vehicles for autonomous rendezvous and docking |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||