CN110780678A - Unmanned aerial vehicle visual navigation control method based on point cloud data - Google Patents

Unmanned aerial vehicle visual navigation control method based on point cloud data

Info

Publication number
CN110780678A
CN110780678A
Authority
CN
China
Prior art keywords
aerial vehicle
unmanned aerial
flight
visual navigation
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911034573.5A
Other languages
Chinese (zh)
Inventor
王朱伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuxi Han Yong Polytron Technologies Inc
Original Assignee
Wuxi Han Yong Polytron Technologies Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuxi Han Yong Polytron Technologies Inc filed Critical Wuxi Han Yong Polytron Technologies Inc
Priority to CN201911034573.5A priority Critical patent/CN110780678A/en
Publication of CN110780678A publication Critical patent/CN110780678A/en
Pending legal-status Critical Current

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 — Simultaneous control of position or course in three dimensions
    • G05D1/101 — Simultaneous control of position or course in three dimensions specially adapted for aircraft

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses an unmanned aerial vehicle visual navigation control method based on point cloud data. A three-dimensional map model of the flight area is drawn on an intelligent terminal; the flight trajectory of the unmanned aerial vehicle is drawn on the model and stored; the model marked with the flight trajectory is transmitted to an Android chip. When the flight control module starts the unmanned aerial vehicle, the visual navigation module acquires spatial data at the vehicle's position to obtain the three-dimensional spatial coordinates of the vehicle and of the start and end points of the flight trajectory. Combining the vehicle's flight time with trajectory tracking, the visual navigation module back-calculates the trajectory flown from the initial position and continuously commands the flight control module to adjust the flight position during flight, so that the actual trajectory gradually converges to the marked trajectory until the two coincide. The system replaces positioning systems such as GPS, works normally even when the signal is poor, and widens the application range of the unmanned aerial vehicle.

Description

Unmanned aerial vehicle visual navigation control method based on point cloud data
Technical field:
The invention belongs to the technical field of data analysis, and particularly relates to an unmanned aerial vehicle visual navigation control method based on point cloud data.
Background art:
Unmanned aerial vehicles are widely used in both military and civil fields thanks to their small size, low cost, ease of use, high maneuverability and modest environmental requirements. Most existing unmanned aerial vehicles navigate with positioning systems such as GPS; when the positioning signal is poor, the vehicle frequently loses control, which limits its use.
The information disclosed in this background section is only for enhancement of understanding of the general background of the invention and should not be taken as an acknowledgement or any form of suggestion that this information forms the prior art already known to a person skilled in the art.
Summary of the invention:
The invention aims to provide an unmanned aerial vehicle visual navigation control method based on point cloud data that overcomes the above defects in the prior art.
To achieve this purpose, the invention provides an unmanned aerial vehicle visual navigation system based on point cloud data, comprising an intelligent terminal and a visual navigation module. The unmanned aerial vehicle is a Parrot AR.Drone 2.0, its flight control module is an open-source flight controller, and the visual navigation module is mounted on the unmanned aerial vehicle and communicatively connected with the flight control module. The intelligent terminal and the visual navigation module each comprise a sensor layer, an operating system layer and an application layer: the sensor layer comprises a 23-megapixel main camera, a depth perception camera and a motion-tracking camera; the operating system layer is an Android system; the application layer is a Unity3D-based app. Both the intelligent terminal and the visual navigation module use Android chips, on which the 23-megapixel main camera, the depth perception camera and the motion-tracking camera are mounted, and the depth perception camera is equipped with a laser. The intelligent terminal is communicatively connected with the visual navigation module.
An unmanned aerial vehicle visual navigation control method based on point cloud data comprises the following steps:
(1) Start the intelligent terminal, open the Unity3D-based app in the Android system, acquire three-dimensional spatial data of the area in which the unmanned aerial vehicle is to fly by emitting a laser signal from the depth perception camera, build a 3D SLAM map, and draw a three-dimensional map model of the flight area;
(2) Edit the drawn three-dimensional map model in the Unity3D-based app, draw the flight trajectory of the unmanned aerial vehicle in the three-dimensional map model, and store the flight trajectory;
(3) Communicate with the visual navigation module through the intelligent terminal, and transmit the three-dimensional map model marked with the flight trajectory in step (2) to the visual navigation module;
(4) When the flight control module starts the unmanned aerial vehicle, the visual navigation module starts and the Unity3D-based app is opened in the Android system; the sensor layer begins to work: the 23-megapixel main camera collects image information of the flight area, and the depth perception camera emits a laser signal to collect three-dimensional spatial data, yielding the spatial data of the position of the unmanned aerial vehicle. The 23-megapixel main camera and the depth perception camera transmit the data to the Android chip, a 3D SLAM map is built in the Unity3D-based app, an arbitrary point in the three-dimensional map model is chosen as a reference point, and from this reference point the three-dimensional spatial coordinates of the position of the unmanned aerial vehicle and of the start and end points of the flight trajectory are obtained;
(5) During flight, the visual navigation module collects the pose data of the unmanned aerial vehicle through the motion-tracking camera, where a pose is the pair of position and orientation. The motion-tracking camera transmits the data to the Android chip, the position and orientation of the unmanned aerial vehicle during flight are recorded in the Unity3D-based app, and, using the timer built into the Android chip, the motion trajectory flown from the initial position to the current position can be back-calculated by rewinding the record and querying it at a given timestamp, thereby performing motion tracking;
(6) The visual navigation module transmits the three-dimensional spatial coordinates of the position of the unmanned aerial vehicle and of the start and end points of the flight trajectory to the flight control module, which flies the unmanned aerial vehicle from its position to the end point of the flight trajectory. Using the motion trajectory obtained in step (5), the visual navigation module continuously commands the flight control module to adjust the flight position during flight, so that the actual flight trajectory gradually converges to the marked trajectory until the two coincide.
Preferably, the laser signals emitted by the depth perception camera in steps (1) and (3) return depth data in the form of a point cloud. The point cloud contains the three-dimensional coordinates (x, y, z) of all points in the current area; each dimension is a floating-point value giving the point's position in the coordinate frame.
Preferably, the pose data in step (5) comprises two parts: a translation vector in meters and a quaternion for rotation.
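The pose record described above — a translation vector in meters plus a rotation quaternion, stamped with the chip's timer and rewindable to a given timestamp — can be sketched as follows. This is a minimal illustration; the class and method names are my own, not from the patent.

```python
import bisect

class PoseHistory:
    """Timestamped poses: a translation vector in meters plus a
    rotation quaternion, queryable by timestamp for back-tracking."""

    def __init__(self):
        self._times = []   # monotonically increasing timestamps
        self._poses = []   # (translation (x, y, z), quaternion (w, x, y, z))

    def record(self, t, translation, quaternion):
        """Append one pose sample taken at timestamp t."""
        self._times.append(t)
        self._poses.append((translation, quaternion))

    def at(self, t):
        """Return the most recent pose recorded at or before timestamp t."""
        i = bisect.bisect_right(self._times, t) - 1
        if i < 0:
            raise ValueError("no pose recorded before timestamp")
        return self._poses[i]

    def track_back(self):
        """Replay the full recorded trajectory from the initial position."""
        return list(zip(self._times, self._poses))
```

Keeping the timestamps sorted lets `at()` answer "where was the vehicle at time t?" with a binary search, which is the query the back-calculation in step (5) relies on.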
Preferably, the process by which the visual navigation module controls the flight control module to adjust the flight position in step (6) is area learning: during flight the visual navigation module records visual feature points of the region, uses them to perform drift correction, and corrects errors in the position, orientation and motion of the unmanned aerial vehicle so that its flight trajectory matches the earlier record.
Preferably, a visual feature point is a point whose contrast is consistent between two successive laser scans of the depth perception camera.
Compared with the prior art, the invention has the following beneficial effects:
By importing the three-dimensional map model of the flight area drawn on the terminal, the visual navigation module localizes the unmanned aerial vehicle on the model in real time, replacing positioning systems such as GPS, so that the unmanned aerial vehicle works normally even when the signal is poor and its application range is widened. Comparing the flight trajectory obtained by the visual navigation module with the drawn trajectory allows the flight route to be corrected in real time.
Description of the drawings:
FIG. 1 is a flow chart of an unmanned aerial vehicle visual navigation control method based on point cloud data according to the invention;
FIG. 2 is a schematic structural block diagram of the unmanned aerial vehicle visual navigation control system based on point cloud data;
FIG. 3 is a schematic diagram of the area learning according to the present invention.
Detailed description of embodiments:
the following detailed description of specific embodiments of the invention is provided, but it should be understood that the scope of the invention is not limited to the specific embodiments.
Throughout the specification and claims, unless explicitly stated otherwise, the word "comprise", or variations such as "comprises" or "comprising", will be understood to imply the inclusion of a stated element or component but not the exclusion of any other element or component.
Example 1
As shown in fig. 2, an unmanned aerial vehicle visual navigation control system based on point cloud data comprises an intelligent terminal and a visual navigation module. The unmanned aerial vehicle is a Parrot AR.Drone 2.0, its flight control module is an open-source flight controller, and the visual navigation module is mounted on the unmanned aerial vehicle and communicatively connected with the flight control module. The intelligent terminal and the visual navigation module each comprise a sensor layer, an operating system layer and an application layer: the sensor layer comprises a 23-megapixel main camera, a depth perception camera and a motion-tracking camera; the operating system layer is an Android system; the application layer is a Unity3D-based app. Both the intelligent terminal and the visual navigation module use Android chips, on which the 23-megapixel main camera, the depth perception camera and the motion-tracking camera are mounted, and the depth perception camera is equipped with a laser. The intelligent terminal is communicatively connected with the visual navigation module. The Android chip carries Elpida LPDDR3 memory, a Myriad 1 vision coprocessor, two AMIC 16 Mbit low-power serial memory chips, an MPU-9150 nine-axis gyroscope/accelerometer, a Winbond 16 Mbit SPI flash memory, a PrimeSense 3D sensor, a SanDisk 64 GB flash memory chip, and a Snapdragon 800 processor packaged beneath the Elpida LPDDR3 memory.
As shown in fig. 1, a method for controlling visual navigation of an unmanned aerial vehicle based on point cloud data includes:
(1) Start the intelligent terminal, open the Unity3D-based app in the Android system, acquire three-dimensional spatial data of the area in which the unmanned aerial vehicle is to fly by emitting a laser signal from the depth perception camera, build a 3D SLAM map, and draw a three-dimensional map model of the flight area;
(2) Edit the drawn three-dimensional map model in the Unity3D-based app, draw the flight trajectory of the unmanned aerial vehicle in the three-dimensional map model, and store the flight trajectory;
(3) Communicate with the visual navigation module through the intelligent terminal, and transmit the three-dimensional map model marked with the flight trajectory in step (2) to the visual navigation module;
(4) When the flight control module starts the unmanned aerial vehicle, the visual navigation module starts and the Unity3D-based app is opened in the Android system; the sensor layer begins to work: the 23-megapixel main camera collects image information of the flight area, and the depth perception camera emits a laser signal to collect three-dimensional spatial data, yielding the spatial data of the position of the unmanned aerial vehicle. The 23-megapixel main camera and the depth perception camera transmit the data to the Android chip, a 3D SLAM map is built in the Unity3D-based app, an arbitrary point in the three-dimensional map model is chosen as a reference point, and from this reference point the three-dimensional spatial coordinates of the position of the unmanned aerial vehicle and of the start and end points of the flight trajectory are obtained;
(5) During flight, the visual navigation module collects the pose data of the unmanned aerial vehicle through the motion-tracking camera. A pose is the pair of position and orientation, and the pose data contains two parts: a translation vector in meters and a quaternion for rotation. The motion-tracking camera transmits the data to the Android chip, the position and orientation of the unmanned aerial vehicle during flight are recorded in the Unity3D-based app, and, using the timer built into the Android chip, the motion trajectory flown from the initial position to the current position can be back-calculated by rewinding the record and querying it at a given timestamp, thereby performing motion tracking;
(6) The visual navigation module transmits the three-dimensional spatial coordinates of the position of the unmanned aerial vehicle and of the start and end points of the flight trajectory to the flight control module, which flies the unmanned aerial vehicle from its position to the end point of the flight trajectory. Using the motion trajectory obtained in step (5), the visual navigation module continuously commands the flight control module to adjust the flight position during flight, so that the actual flight trajectory gradually converges to the marked trajectory until the two coincide.
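The correction loop of step (6) — repeatedly nudging the measured position toward the next point on the marked trajectory until the two coincide — can be sketched as a simple proportional controller. The gain, step count and convergence threshold below are hypothetical tuning values, not taken from the patent.

```python
import math

def correction_vector(current, target, gain=0.5):
    """Proportional correction toward a point on the marked trajectory.

    `current` and `target` are (x, y, z) tuples; `gain` is an assumed
    tuning parameter controlling how aggressively position is adjusted.
    """
    return tuple(gain * (t - c) for c, t in zip(current, target))

def follow(trajectory, start, gain=0.5, steps=50):
    """Iteratively move the measured position toward each waypoint
    of the marked trajectory, returning the positions reached."""
    pos = start
    path = [pos]
    for waypoint in trajectory:
        for _ in range(steps):
            delta = correction_vector(pos, waypoint, gain)
            pos = tuple(p + d for p, d in zip(pos, delta))
            if math.dist(pos, waypoint) < 0.01:  # close enough: next waypoint
                break
        path.append(pos)
    return path
```

With each iteration the remaining error shrinks by the gain factor, which is one simple way to realize the "gradually converges to the marked trajectory" behavior the step describes.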
The laser signal emitted by the depth perception camera returns depth data in the form of a point cloud. The point cloud contains the three-dimensional coordinates (x, y, z) of all points in the current area; each dimension is a floating-point value giving the point's position in the coordinate frame.
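The point-cloud layout just described — one (x, y, z) triple of floating-point values per point — can be illustrated with a minimal serializer. The little-endian float32 wire format is an assumption for illustration; the patent only states that each dimension is a floating-point value.

```python
import struct

def pack_cloud(points):
    """Serialize [(x, y, z), ...] into packed little-endian float32 bytes,
    12 bytes per point."""
    return b"".join(struct.pack("<3f", *p) for p in points)

def unpack_cloud(buf):
    """Recover the list of (x, y, z) tuples from packed float32 bytes."""
    return [struct.unpack_from("<3f", buf, i) for i in range(0, len(buf), 12)]
```

A flat float32 layout like this is a common choice for depth-sensor output because it keeps per-point cost fixed and lets the consumer index any point directly.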
The process by which the visual navigation module controls the flight control module to adjust the flight position is area learning: during flight the visual navigation module records the visual feature points in the region, uses them to perform drift correction, and corrects errors in the position, orientation and motion of the unmanned aerial vehicle so that its flight trajectory matches the earlier record. A visual feature point is a point whose contrast is consistent between two successive laser scans of the depth perception camera.
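Selecting feature points whose contrast is consistent across two successive scans might look like the sketch below. Representing a scan as a map from a quantized (x, y, z) cell to a contrast value, and the agreement tolerance, are my assumptions; the patent does not specify a data structure or threshold.

```python
def stable_features(scan_a, scan_b, tol=0.05):
    """Keep points whose contrast agrees between two successive scans.

    Each scan maps a quantized (x, y, z) cell to a contrast value;
    `tol` is an assumed agreement threshold.
    """
    return {
        cell for cell, contrast in scan_a.items()
        if cell in scan_b and abs(scan_b[cell] - contrast) <= tol
    }
```

Only points that look the same in both scans are trusted as landmarks, which is what makes them usable for the drift correction described above.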
As shown in fig. 3, when the drone passes through an area there are in fact two trajectories at once: the trajectory it should take (the true trajectory) and the trajectory measured by the device (the measured trajectory), and the deviation between them accumulates over time. When the device returns to the origin and recognizes it, the drift error is corrected: if a point on the measured trajectory is (x1, y1, z1) and the corresponding point on the true trajectory is (x0, y0, z0), then during correction the measured point (x1, y1, z1) gradually moves toward (x0, y0, z0) until the measured trajectory coincides with the true trajectory.
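The gradual movement of a measured point (x1, y1, z1) toward its true counterpart (x0, y0, z0) can be sketched as repeated linear interpolation. The per-iteration factor `alpha` and iteration count are assumed values; the patent only says the measured point moves gradually until the trajectories coincide.

```python
def correct_drift(measured, true_point, alpha=0.2, iters=20):
    """Move a measured trajectory point toward the true point step by step,
    returning the sequence of intermediate positions."""
    x, y, z = measured
    x0, y0, z0 = true_point
    history = [(x, y, z)]
    for _ in range(iters):
        # each iteration closes a fixed fraction of the remaining gap
        x += alpha * (x0 - x)
        y += alpha * (y0 - y)
        z += alpha * (z0 - z)
        history.append((x, y, z))
    return history
```

Because each iteration removes a fixed fraction of the remaining error, the measured point approaches the true point geometrically rather than jumping, matching the "gradually moves" behavior in the description.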
The foregoing descriptions of specific exemplary embodiments of the present invention have been presented for purposes of illustration and description. It is not intended to limit the invention to the precise form disclosed, and obviously many modifications and variations are possible in light of the above teaching. The exemplary embodiments were chosen and described in order to explain certain principles of the invention and its practical application to enable one skilled in the art to make and use various exemplary embodiments of the invention and various alternatives and modifications as are suited to the particular use contemplated. It is intended that the scope of the invention be defined by the claims and their equivalents.

Claims (5)

1. An unmanned aerial vehicle visual navigation control method based on point cloud data, comprising the following steps:
(1) Start the intelligent terminal, open the Unity3D-based app in the Android system, acquire three-dimensional spatial data of the area in which the unmanned aerial vehicle is to fly by emitting a laser signal from the depth perception camera, build a 3D SLAM map, and draw a three-dimensional map model of the flight area;
(2) Edit the drawn three-dimensional map model in the Unity3D-based app, draw the flight trajectory of the unmanned aerial vehicle in the three-dimensional map model, and store the flight trajectory;
(3) Communicate with the visual navigation module through the intelligent terminal, and transmit the three-dimensional map model marked with the flight trajectory in step (2) to the visual navigation module;
(4) When the flight control module starts the unmanned aerial vehicle, the visual navigation module starts and the Unity3D-based app is opened in the Android system; the sensor layer begins to work: the 23-megapixel main camera collects image information of the flight area, and the depth perception camera emits a laser signal to collect three-dimensional spatial data, yielding the spatial data of the position of the unmanned aerial vehicle. The 23-megapixel main camera and the depth perception camera transmit the data to the Android chip, a 3D SLAM map is built in the Unity3D-based app, an arbitrary point in the three-dimensional map model is chosen as a reference point, and from this reference point the three-dimensional spatial coordinates of the position of the unmanned aerial vehicle and of the start and end points of the flight trajectory are obtained;
(5) During flight, the visual navigation module collects the pose data of the unmanned aerial vehicle through the motion-tracking camera, where a pose is the pair of position and orientation. The motion-tracking camera transmits the data to the Android chip, the position and orientation of the unmanned aerial vehicle during flight are recorded in the Unity3D-based app, and, using the timer built into the Android chip, the motion trajectory flown from the initial position to the current position can be back-calculated by rewinding the record and querying it at a given timestamp, thereby performing motion tracking;
(6) The visual navigation module transmits the three-dimensional spatial coordinates of the position of the unmanned aerial vehicle and of the start and end points of the flight trajectory to the flight control module, which flies the unmanned aerial vehicle from its position to the end point of the flight trajectory. Using the motion trajectory obtained in step (5), the visual navigation module continuously commands the flight control module to adjust the flight position during flight, so that the actual flight trajectory gradually converges to the marked trajectory until the two coincide.
2. The unmanned aerial vehicle visual navigation control method based on point cloud data of claim 1, wherein the laser signals emitted by the depth perception camera in steps (1) and (3) return depth data in the form of a point cloud; the point cloud contains the three-dimensional coordinates (x, y, z) of all points in the current area, and each dimension is a floating-point value giving the point's position in the coordinate frame.
3. The unmanned aerial vehicle visual navigation control method based on point cloud data of claim 1, wherein the pose data in step (5) comprises two parts: a translation vector in meters and a quaternion for rotation.
4. The unmanned aerial vehicle visual navigation control method based on point cloud data of claim 1, wherein the process by which the visual navigation module controls the flight control module to adjust the flight position in step (6) is area learning: during flight the visual navigation module records visual feature points of the region, uses them to perform drift correction, and corrects errors in the position, orientation and motion of the unmanned aerial vehicle so that its flight trajectory matches the earlier record.
5. The unmanned aerial vehicle visual navigation control method based on point cloud data of claim 4, wherein a visual feature point is a point whose contrast is consistent between two successive laser scans of the depth perception camera.
CN201911034573.5A 2019-10-29 2019-10-29 Unmanned aerial vehicle visual navigation control method based on point cloud data Pending CN110780678A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911034573.5A CN110780678A (en) 2019-10-29 2019-10-29 Unmanned aerial vehicle visual navigation control method based on point cloud data


Publications (1)

Publication Number Publication Date
CN110780678A (en) 2020-02-11

Family

ID=69387167

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911034573.5A Pending CN110780678A (en) 2019-10-29 2019-10-29 Unmanned aerial vehicle visual navigation control method based on point cloud data

Country Status (1)

Country Link
CN (1) CN110780678A (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112327928A (en) * 2020-11-26 2021-02-05 苏州流昴飞行器技术有限公司 Unmanned aerial vehicle flight control system
CN112466490A (en) * 2020-11-27 2021-03-09 中广核工程有限公司 Nuclear power station double-layer safety shell ring corridor area defect inspection system and defect inspection method
CN112650284A (en) * 2020-12-17 2021-04-13 苏州流昴飞行器技术有限公司 Unmanned aerial vehicle autopilot system
CN113238205A (en) * 2021-05-25 2021-08-10 珠海市亿点科技有限公司 Unmanned aerial vehicle surveying and mapping point cloud data offset correction method and system based on artificial intelligence
WO2022000197A1 (en) * 2020-06-29 2022-01-06 深圳市大疆创新科技有限公司 Flight operation method, unmanned aerial vehicle, and storage medium
CN113965683A (en) * 2021-11-02 2022-01-21 郑州航空工业管理学院 Suspension-moving type large-scale project archive image acquisition equipment
CN117668574A (en) * 2024-01-31 2024-03-08 利亚德智慧科技集团有限公司 Data model optimization method, device and equipment for light shadow show and storage medium
CN112466490B (en) * 2020-11-27 2024-09-24 中广核工程有限公司 Nuclear power station double-layer containment ring gallery area defect inspection system and defect inspection method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105424026A (en) * 2015-11-04 2016-03-23 中国人民解放军国防科学技术大学 Indoor navigation and localization method and system based on point cloud tracks
CN106501829A (en) * 2016-09-26 2017-03-15 北京百度网讯科技有限公司 A kind of Navigation of Pilotless Aircraft method and apparatus
CN107957733A (en) * 2017-12-05 2018-04-24 深圳市道通智能航空技术有限公司 Flight control method, device, terminal and unmanned plane
CN108319264A (en) * 2017-12-28 2018-07-24 北京臻迪科技股份有限公司 Navigation control method, device
CN109582032A (en) * 2018-10-11 2019-04-05 天津大学 Quick Real Time Obstacle Avoiding routing resource of the multi-rotor unmanned aerial vehicle under complex environment



Similar Documents

Publication Publication Date Title
CN110780678A (en) Unmanned aerial vehicle visual navigation control method based on point cloud data
Balamurugan et al. Survey on UAV navigation in GPS denied environments
CN107356252B (en) Indoor robot positioning method integrating visual odometer and physical odometer
CN109191504A (en) A kind of unmanned plane target tracking
CN112230242B (en) Pose estimation system and method
Alonso et al. Accurate global localization using visual odometry and digital maps on urban environments
KR102664900B1 (en) Apparatus for measuring ground control point using unmanned aerial vehicle and method thereof
CN109374008A (en) A kind of image capturing system and method based on three mesh cameras
JP2022518911A (en) Generate structured map data from vehicle sensors and camera arrays
CN109753076A (en) A kind of unmanned plane vision tracing implementing method
KR101220527B1 (en) Sensor system, and system and method for preparing environment map using the same
CN106767785B (en) Navigation method and device of double-loop unmanned aerial vehicle
KR102219843B1 (en) Estimating location method and apparatus for autonomous driving
CN103207634A (en) Data fusion system and method of differential GPS (Global Position System) and inertial navigation in intelligent vehicle
CN105182992A (en) Unmanned aerial vehicle control method and device
JP2009199572A (en) Three-dimensional machine map, three-dimensional machine map generating device, navigation device, and automatic driving device
CN112577517A (en) Multi-element positioning sensor combined calibration method and system
CN110163963B (en) Mapping device and mapping method based on SLAM
CN109974713B (en) Navigation method and system based on surface feature group
CN112665584B (en) Underwater robot positioning and composition method based on multi-sensor fusion
CN107389968A (en) A kind of unmanned plane fixed-point implementation method and apparatus based on light stream sensor and acceleration transducer
CN113267191A (en) Unmanned navigation system and method based on pseudolite indoor signal map correction
TWI725611B (en) Vehicle navigation switching device for golf course self-driving cars
Springer et al. Autonomous drone landing with fiducial markers and a gimbal-mounted camera for active tracking
KR101387665B1 (en) Self-alignment driving system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200211