CN117906598A - Positioning method and device of unmanned aerial vehicle equipment, computer equipment and storage medium - Google Patents

Positioning method and device of unmanned aerial vehicle equipment, computer equipment and storage medium

Info

Publication number
CN117906598A
CN117906598A (application CN202410314224.3A)
Authority
CN
China
Prior art keywords
data
unmanned aerial
aerial vehicle
feature
vehicle equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202410314224.3A
Other languages
Chinese (zh)
Inventor
高军强 (Gao Junqiang)
赵开勇 (Zhao Kaiyong)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Qiyu Innovation Technology Co ltd
Original Assignee
Shenzhen Qiyu Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Qiyu Innovation Technology Co ltd filed Critical Shenzhen Qiyu Innovation Technology Co ltd
Priority to CN202410314224.3A priority Critical patent/CN117906598A/en
Publication of CN117906598A publication Critical patent/CN117906598A/en
Pending legal-status Critical Current

Landscapes

  • Image Analysis (AREA)

Abstract

The invention relates to a positioning method, a device, computer equipment and a storage medium for unmanned aerial vehicle equipment. By using a 3D laser radar together with vision sensors (a color camera and a thermal infrared camera) to provide complementary information about the environment, the invention solves the problem that unmanned aerial vehicle equipment cannot obtain sufficiently accurate map and positioning information in extreme weather such as heavy rain and heavy fog when only a laser radar sensor is used for simultaneous localization and mapping, and achieves the technical effect of exploiting the advantages of different sensors to improve the real-time localization and mapping performance of the unmanned aerial vehicle equipment in complex environments.

Description

Positioning method and device of unmanned aerial vehicle equipment, computer equipment and storage medium
Technical Field
The present invention relates to the field of data processing technologies, and in particular, to a method and apparatus for locating an unmanned aerial vehicle device, a computer device, and a storage medium.
Background
LiDAR SLAM refers to simultaneous localization and mapping (SLAM) using a LiDAR sensor: in an unknown environment, the lidar scans the terrain while the device simultaneously determines its own position. By continuously acquiring three-dimensional information about the environment, a SLAM algorithm can build a map and update the position of the robot, vehicle, or other device on that map. SLAM is a key technology for autonomous navigation and environment perception, and is widely applied in fields such as autonomous vehicles, robots, and unmanned aerial vehicles.
However, LiDAR SLAM may face the following challenges in heavy rain and heavy fog:
1. Scattering and absorption: the laser beam is easily scattered or absorbed by raindrops and fog, so the laser signal is weakened or distorted, which in turn affects the accuracy with which the lidar obtains terrain and environment information.
2. Multipath effects: heavy rain and fog can cause the light to be reflected multiple times, forming a multipath effect. The resulting multiple reflection points in the environment complicate the signal received by the LiDAR sensor and make it difficult to interpret and process correctly.
3. Sensor protection problems: heavy rain and fog may affect the surface of the lidar sensor, covering it with water droplets or water vapor and thereby affecting the emission and reception of the laser beam.
These factors combine such that, under extreme weather conditions, unmanned aerial vehicle devices employing LiDAR SLAM systems may not obtain sufficiently accurate map and positioning information, affecting their reliability and performance in these environments.
Disclosure of Invention
The invention mainly aims to provide a positioning method, a device, computer equipment and a storage medium for unmanned aerial vehicle equipment. By using a visual sensor to provide additional information about the environment beyond that provided by a 3D laser radar, the invention solves the problem that unmanned aerial vehicle equipment cannot obtain sufficiently accurate map and positioning information in extreme weather such as heavy rain and heavy fog when only the laser radar sensor is used for simultaneous localization and mapping, and achieves the technical effect of exploiting the advantages of different sensors to improve the real-time localization and mapping performance of the unmanned aerial vehicle equipment in complex environments.
In order to achieve the above object, the present invention provides a positioning method of an unmanned aerial vehicle device, comprising the steps of: estimating the motion state of the unmanned aerial vehicle equipment by using an inertial measurement unit; acquiring environmental data of the unmanned aerial vehicle equipment by using a plurality of preset sensors, wherein the plurality of preset sensors comprise: 3D lidar, color cameras and thermal infrared cameras; and taking the motion state as a motion equation of Kalman filtering, taking the environmental data as an observation equation of Kalman filtering, and carrying out Kalman filtering fusion processing on the motion equation and the observation equation to obtain the positioning of the unmanned aerial vehicle equipment.
Further, collecting environmental data of the unmanned aerial vehicle device by using a plurality of preset sensors, including: acquiring data of an environment where the unmanned aerial vehicle equipment is located through the 3D laser radar, and extracting characteristics of the acquired radar data to obtain first characteristic data; acquiring data of the environment where the unmanned aerial vehicle equipment is located through the color camera, and extracting features of acquired color image data to obtain second feature data; the thermal infrared camera is used for collecting data of the environment where the unmanned aerial vehicle equipment is located, and extracting characteristics of collected thermal infrared image data to obtain third characteristic data; and constructing a constraint equation for at least one of the first feature data, the second feature data and the third feature data to obtain environment data serving as an observation equation.
Further, data acquisition is performed on an environment where the unmanned aerial vehicle device is located through the 3D laser radar, and feature extraction is performed on the acquired radar data to obtain first feature data, including: acquiring data of the environment where the unmanned aerial vehicle equipment is located through the 3D laser radar to obtain radar data; and performing motion distortion correction on the radar data by using the inertial measurement unit, and performing feature extraction on the radar data subjected to the motion distortion correction to obtain first feature data.
Further, the data acquisition is performed on the environment where the unmanned aerial vehicle device is located by the color camera, and the feature extraction is performed on the acquired color image data to obtain second feature data, including: acquiring data of the environment where the unmanned aerial vehicle equipment is located through the color camera to obtain color image data; extracting key feature points in the color image data by using a FAST algorithm, and performing feature matching processing on the key feature points in the color image data by using an optical flow algorithm; constructing a projection residual equation for the color image data processed by the optical flow algorithm; and carrying out feature extraction on the color image data based on the projection residual construction result to obtain second feature data.
Further, the data acquisition is performed on the environment where the unmanned aerial vehicle device is located through the thermal infrared camera, and the feature extraction is performed on the acquired thermal infrared image data to obtain third feature data, including: and acquiring data of the environment where the unmanned aerial vehicle equipment is located through the thermal infrared camera to obtain thermal infrared image data, and extracting features of the thermal infrared image data by using a distance field to obtain third feature data.
Further, constructing a constraint equation for at least one of the first feature data, the second feature data, and the third feature data, including: screening target characteristic data for constructing a constraint equation from the first characteristic data, the second characteristic data and the third characteristic data; and constructing a constraint equation for the target characteristic data.
Further, screening target feature data for constraint equation construction from the first feature data, the second feature data and the third feature data includes: screening out feature data extracted according to the radar data and the color image data in a normal mode; screening out characteristic data extracted according to the radar data and the thermal infrared image data in a night mode; screening out characteristic data extracted according to the thermal infrared image data in a rain and snow mode; and screening out the characteristic data extracted according to the color image data in an abnormal mode.
The invention provides a positioning device of unmanned aerial vehicle equipment, which comprises: the estimation module is used for estimating the motion state of the unmanned aerial vehicle equipment by utilizing the inertial measurement unit; the acquisition module is used for acquiring environmental data of the unmanned aerial vehicle equipment by utilizing a plurality of preset sensors, and the plurality of preset sensors comprise: 3D lidar, color cameras and thermal infrared cameras; and the fusion module is used for taking the motion state as a motion equation of Kalman filtering, taking the environmental data as an observation equation of Kalman filtering, and carrying out Kalman filtering fusion processing on the motion equation and the observation equation to obtain the positioning of the unmanned aerial vehicle equipment.
The invention also provides a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of any of the methods described above when the computer program is executed.
The invention also provides a computer readable storage medium having stored thereon a computer program which when executed by a processor performs the steps of the method of any of the preceding claims.
Compared with the prior art, the positioning method, device, computer equipment and storage medium for unmanned aerial vehicle equipment supplement the environmental data collected by the color camera and the thermal infrared camera into the Kalman filtering observation equation. This avoids the technical problem that extreme weather conditions cannot be handled when only the environmental data collected by the 3D laser radar is used as the observation equation, and achieves the technical effect of exploiting the advantages of different sensors to improve the positioning performance of the unmanned aerial vehicle equipment in complex environments.
Drawings
Fig. 1 is a schematic diagram of steps of a positioning method of a drone device according to an embodiment of the present invention;
fig. 2 is a schematic diagram of step S2 of a positioning method of a drone device according to an embodiment of the present invention;
fig. 3 is a schematic diagram illustrating steps of a positioning method of a drone device according to an embodiment of the present invention;
fig. 4 is a block diagram of a positioning device of a drone device according to an embodiment of the present invention;
fig. 5 is a block diagram schematically illustrating a structure of a computer device according to an embodiment of the present invention.
The achievement of the objects, functional features and advantages of the present invention will be further described with reference to the accompanying drawings, in conjunction with the embodiments.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The positioning method of the unmanned aerial vehicle device provided by the invention is executed by a computer device.
Referring to fig. 1-2, a flow chart of a positioning method of an unmanned aerial vehicle device according to the present invention includes the following steps:
s1, estimating the motion state of the unmanned aerial vehicle equipment by using an Inertial Measurement Unit (IMU).
Acceleration and angular velocity information of the unmanned aerial vehicle equipment is acquired through the inertial measurement unit; the velocity is obtained by integrating the acceleration, and the position is obtained by integrating the velocity again, so that the motion trajectory of the device is obtained. This information can be used as the motion equation for Kalman filtering.
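By way of illustration only, the double integration described above can be written as a short dead-reckoning step. The following Python sketch is illustrative rather than a claimed implementation; the function name, the state layout, and the first-order small-angle attitude update are assumptions made for this example.

```python
import numpy as np

def imu_dead_reckoning_step(p, v, R, accel_body, gyro, dt,
                            g=np.array([0.0, 0.0, -9.81])):
    """One IMU propagation step: integrate acceleration to obtain velocity,
    then integrate velocity to obtain position. R is the 3x3 body-to-world
    rotation; accel_body and gyro are the raw IMU readings."""
    a_world = R @ accel_body + g                  # compensate gravity in the world frame
    p_new = p + v * dt + 0.5 * a_world * dt ** 2  # position from integrated velocity
    v_new = v + a_world * dt                      # velocity from integrated acceleration
    # First-order (small-angle) attitude update from the angular rate.
    wx, wy, wz = gyro * dt
    dR = np.array([[1.0, -wz,  wy],
                   [ wz, 1.0, -wx],
                   [-wy,  wx, 1.0]])
    return p_new, v_new, R @ dR
```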
S2, acquiring environmental data of the unmanned aerial vehicle equipment by using a plurality of preset sensors, wherein the plurality of preset sensors comprise: 3D lidar, color cameras and thermal infrared cameras.
The additional information about the environment provided by the 3D lidar and the vision sensors (e.g., the color camera and the thermal infrared camera) mainly comprises the locations or depth information of feature points. These data can be used as the observation equation of the Kalman filter, continuously correcting and calibrating the motion state estimated by the inertial measurement unit.
The 3D laser radar can accurately measure the distance between the unmanned aerial vehicle and surrounding objects without depending on an external light source, but in heavy rain and heavy fog its laser beams are prone to multipath reflection and to signal attenuation and distortion, so it cannot measure accurately. The thermal infrared camera can additionally provide infrared radiation information of the environment under complex conditions (such as heavy rain, heavy snow, heavy fog, and night), while the color camera can additionally provide color texture information of the environment beyond what the 3D laser radar and the thermal infrared camera supply.
And S3, taking the motion state as a motion equation of Kalman filtering, taking the environmental data as an observation equation of Kalman filtering, and carrying out Kalman filtering fusion processing on the motion equation and the observation equation to obtain the positioning of the unmanned aerial vehicle equipment.
Kalman filtering is a recursive estimation method that fuses the information of different sensors and, by iteratively updating the state estimate, provides a real-time estimate of the position and attitude of an object.
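The recursion alternates a prediction step driven by the motion equation and an update step driven by the observation equation. The following is a minimal extended Kalman filter (EKF) skeleton in Python, given as an illustrative sketch rather than the claimed implementation; the class name and argument conventions are assumptions.

```python
import numpy as np

class ExtendedKalmanFilter:
    """Minimal EKF: the IMU motion model drives predict(); sensor
    feature constraints drive update()."""

    def __init__(self, x0, P0):
        self.x = x0  # state, e.g. position, velocity, attitude
        self.P = P0  # state covariance

    def predict(self, f, F, Q):
        # f: motion model (IMU integration), F: its Jacobian, Q: process noise.
        self.x = f(self.x)
        self.P = F @ self.P @ F.T + Q

    def update(self, z, h, H, R):
        # z: observation built from sensor features, h: observation model,
        # H: its Jacobian, R: measurement noise.
        y = z - h(self.x)                    # innovation
        S = H @ self.P @ H.T + R             # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)  # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P
```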
In summary, according to the positioning method of the unmanned aerial vehicle device, the environmental data collected by the color camera and the thermal infrared camera are supplemented into the Kalman filtering observation equation. This avoids the technical problem that extreme weather conditions cannot be handled when only the environmental data collected by the 3D laser radar is used as the observation equation, and, by exploiting the advantages of different sensors, achieves the technical effect that the unmanned aerial vehicle device can still be accurately positioned under extreme weather conditions.
In one embodiment, collecting environmental data of the unmanned aerial vehicle device using a plurality of preset sensors includes:
S11, data acquisition is carried out on the environment where the unmanned aerial vehicle equipment is located through the 3D laser radar, and feature extraction is carried out on the acquired radar data, so that first feature data are obtained.
Specifically, data acquisition is performed on the environment where the unmanned aerial vehicle equipment is located through the 3D laser radar to obtain radar data, and motion distortion correction is performed on the radar data by using the inertial measurement unit. The purpose of the motion distortion correction is to compensate the distortion of the sensor data caused by the motion of the platform during a scan, so as to obtain more accurate motion trajectory information.
Further, line and plane features are extracted from the radar data after the motion distortion correction, where the line and plane features comprise straight lines and planes detected in the environment; point-to-line and point-to-plane feature constraints are then constructed from the extracted line and plane features, and these feature constraints constitute the first feature data.
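For illustration, point-to-line and point-to-plane constraints of this kind are commonly expressed as the residuals below (a generic LOAM-style formulation, given here as an assumption rather than the patent's exact equations); the filter drives these residuals toward zero.

```python
import numpy as np

def point_to_plane_residual(p_world, plane_point, plane_normal):
    """Signed distance from a motion-corrected lidar point (world frame)
    to a plane feature in the map."""
    n = plane_normal / np.linalg.norm(plane_normal)
    return float(n @ (p_world - plane_point))

def point_to_line_residual(p_world, line_point, line_dir):
    """Perpendicular distance from a motion-corrected lidar point
    to a straight-line feature in the map."""
    d = line_dir / np.linalg.norm(line_dir)
    offset = p_world - line_point
    return float(np.linalg.norm(offset - (offset @ d) * d))
```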
S12, data acquisition is carried out on the environment where the unmanned aerial vehicle equipment is located through the color camera, and feature extraction is carried out on the acquired color image data, so that second feature data are obtained.
Specifically, the color camera is used for collecting data of the environment where the unmanned aerial vehicle equipment is located to obtain color image data; key feature points in the color image data are extracted by using the FAST algorithm, and feature matching is performed on the key feature points by using an optical flow algorithm; a projection residual equation is constructed for the color image data processed by the optical flow algorithm; and feature extraction is performed on the color image data based on the projection residual construction result to obtain second feature data. In this way, the unmanned aerial vehicle equipment can better understand the movement and change of the key feature points in the color image, perform motion estimation, and realize accurate localization and mapping.
The FAST (Features from Accelerated Segment Test) algorithm rapidly detects corner points in an image; corner points are points with significant local feature changes and are commonly used for image analysis and feature matching. The optical flow algorithm tracks the movement of the key feature points extracted by the FAST algorithm across consecutive frames, helping the system understand the motion of an object or of the camera. The projection residual equation computes the residual (difference) between the actually observed position of a feature and its theoretically projected position, and is used to adjust the feature extraction model so that the projection residual is minimized.
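A compact sketch of such a visual front end, using OpenCV's FAST detector, pyramidal Lucas-Kanade optical flow, and a pinhole-camera projection residual, is given below. It is an illustration under standard assumptions (calibrated pinhole camera with intrinsic matrix K, grayscale input frames), not the patent's exact pipeline.

```python
import cv2
import numpy as np

def track_fast_features(prev_gray, cur_gray, max_pts=500):
    """Detect FAST corners in the previous frame and track them into the
    current frame with pyramidal Lucas-Kanade optical flow."""
    fast = cv2.FastFeatureDetector_create(threshold=20)
    kps = fast.detect(prev_gray, None)[:max_pts]
    if not kps:
        return np.empty((0, 2)), np.empty((0, 2))
    p0 = np.float32([kp.pt for kp in kps]).reshape(-1, 1, 2)
    p1, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, cur_gray, p0, None)
    ok = status.ravel() == 1
    return p0[ok].reshape(-1, 2), p1[ok].reshape(-1, 2)

def projection_residual(K, R_cw, t_cw, X_world, uv_observed):
    """Residual between an observed pixel and the projection of the
    corresponding 3-D point under the current pose estimate."""
    X_cam = R_cw @ X_world + t_cw           # world point into camera frame
    uv_proj = (K @ (X_cam / X_cam[2]))[:2]  # pinhole projection
    return uv_observed - uv_proj
```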
S13, acquiring data of the environment where the unmanned aerial vehicle equipment is located through the thermal infrared camera, and extracting features of the acquired thermal infrared image data to obtain third feature data.
Specifically, the thermal infrared camera is used for collecting data of the environment where the unmanned aerial vehicle equipment is located to obtain thermal infrared image data, and a distance field is used to extract features from the thermal infrared image data to obtain third feature data. This addresses the technical problem that the resolution of the thermal infrared camera is low, so feature points are difficult to extract directly.
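One common way to build such a distance field, sketched below as an assumption rather than the patent's stated method, is to extract edges from the normalized thermal image and apply a distance transform: every pixel then stores the distance to the nearest thermal edge, which yields a smooth alignment residual even when distinct feature points are scarce.

```python
import cv2
import numpy as np

def thermal_edge_distance_field(thermal_img):
    """Distance field over a thermal infrared image: each pixel holds
    the Euclidean distance to the nearest thermal edge."""
    norm = cv2.normalize(thermal_img, None, 0, 255,
                         cv2.NORM_MINMAX).astype(np.uint8)
    edges = cv2.Canny(norm, 50, 150)
    # distanceTransform measures the distance to the nearest zero pixel,
    # so invert the edge map: edge pixels become zeros.
    return cv2.distanceTransform(255 - edges, cv2.DIST_L2, 3)
```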
S14, constructing a constraint equation for at least one of the first feature data, the second feature data and the third feature data to obtain environment data serving as an observation equation.
Specifically, target feature data for constructing the constraint equation are screened out of the first feature data, the second feature data and the third feature data, and the constraint equation is constructed for the target feature data, so that different feature data are integrated into a consistent format and the accuracy of the algorithm is improved.
For example: when the unmanned aerial vehicle equipment is in a normal mode, the feature data extracted from the radar data and the color image data are screened out, and the motion state estimated by the IMU is corrected and calibrated according to these feature data; when the unmanned aerial vehicle equipment is in a night mode, the feature data extracted from the radar data and the thermal infrared image data are screened out and used to correct and calibrate the motion state estimated by the IMU; when the unmanned aerial vehicle equipment is in a rain and snow mode, the feature data extracted from the thermal infrared image data are screened out and used to correct and calibrate the motion state estimated by the IMU; and when the unmanned aerial vehicle equipment is in an abnormal mode (for example, the 3D laser radar has failed), the feature data extracted from the color image data are screened out and used to correct and calibrate the motion state estimated by the IMU.
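Expressed as code, this screening is a lookup from the operating mode to the admissible sensors; the following is a minimal sketch in which the mode names and dictionary layout are illustrative assumptions.

```python
def select_observations(mode, feature_data):
    """Pick which sensors' feature data enter the Kalman filtering
    observation equation for the current operating mode."""
    admissible = {
        "normal":    ("lidar", "color"),
        "night":     ("lidar", "thermal"),
        "rain_snow": ("thermal",),
        "abnormal":  ("color",),  # e.g. the 3D laser radar has failed
    }
    return [feature_data[s] for s in admissible[mode] if s in feature_data]
```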
Referring to fig. 3, a flow chart of a positioning method of an unmanned aerial vehicle device according to the present invention includes the following contents:
1. Perform IMU motion estimation using the IMU.
2. Correct motion distortion in the 3D laser radar data and extract line and plane features.
3. Perform color image feature extraction using the color camera.
4. Perform infrared image feature extraction using the thermal infrared camera.
5. Judge whether the features extracted by each sensor are valid.
6. Construct constraint equations for the valid features.
7. Perform EKF fusion on the data obtained by the IMU motion estimation and the data obtained by constructing the constraint equations, to obtain the positioning data (a condensed sketch of this loop follows).
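Putting the earlier sketches together, one cycle of this flow might be glued up as follows. This is illustrative code only, relying on the ExtendedKalmanFilter and select_observations sketches above; imu_motion_model and the per-sensor observation tuples are hypothetical interfaces, not the claimed implementation.

```python
def localization_step(ekf, imu_motion_model, feature_data, mode, dt):
    """One cycle of the pipeline in fig. 3. feature_data maps a sensor
    name to an observation tuple (z, h, H, R) built from its features."""
    # Step 1: the IMU motion estimate drives the EKF prediction.
    f, F, Q = imu_motion_model(dt)
    ekf.predict(f, F, Q)
    # Steps 5-6: keep only the features that are valid in the current
    # mode and fold their constraint equations into the observation update.
    for z, h, H, R in select_observations(mode, feature_data):
        # Step 7: EKF fusion of the motion and observation equations.
        ekf.update(z, h, H, R)
    return ekf.x  # fused positioning of the unmanned aerial vehicle
```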
Referring to fig. 4, a schematic structural diagram of a positioning device of unmanned aerial vehicle equipment according to the present invention includes:
the estimating module 1 is used for estimating the motion state of the unmanned aerial vehicle equipment by utilizing the inertial measurement unit;
The collection module 2 is configured to collect environmental data where the unmanned aerial vehicle device is located by using a plurality of preset sensors, where the plurality of preset sensors include: 3D lidar, color cameras and thermal infrared cameras;
And the fusion module 3 is used for taking the motion state as a motion equation of Kalman filtering, taking the environmental data as an observation equation of Kalman filtering, and carrying out Kalman filtering fusion processing on the motion equation and the observation equation to obtain the positioning of the unmanned aerial vehicle equipment.
In this embodiment, for specific implementation of each unit in the above embodiment of the apparatus, please refer to the description in the above embodiment of the method, and no further description is given here.
Referring to fig. 5, a computer device is further provided in an embodiment of the present invention. The computer device may be a server, and its internal structure may be as shown in fig. 5. The computer device includes a processor, a memory, a display screen, an input device, a network interface, and a database connected by a system bus. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program, and a database. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used to store the data involved in this embodiment. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program, when executed by the processor, implements the above method.
It will be appreciated by those skilled in the art that the architecture shown in fig. 5 is merely a block diagram of a portion of the architecture in connection with the present inventive arrangements and is not intended to limit the computer devices to which the present inventive arrangements are applicable.
An embodiment of the present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the above method. It is understood that the computer readable storage medium in this embodiment may be a volatile readable storage medium or a nonvolatile readable storage medium.
In summary, by using the 3D laser radar to provide information about the environment and using the vision sensors (such as the color camera and the thermal infrared camera) to provide additional information about the environment, the problem that the unmanned aerial vehicle device cannot obtain sufficiently accurate map and positioning information in extreme weather such as heavy rain and heavy fog when only the laser radar sensor is used for simultaneous localization and mapping is solved, and the technical effect of exploiting the advantages of different sensors to improve the real-time localization and mapping performance of the unmanned aerial vehicle device in complex environments is achieved.
Those skilled in the art will appreciate that all or part of the methods described above may be implemented by a computer program stored on a non-transitory computer-readable storage medium, which, when executed, may comprise the steps of the method embodiments described above. Any reference to memory, storage, a database, or another medium provided by the present invention and used in the embodiments may include non-volatile and/or volatile memory. The non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. The volatile memory can include random access memory (RAM) or an external cache. By way of illustration and not limitation, RAM is available in a variety of forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synclink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, apparatus, article, or method that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, apparatus, article, or method. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, apparatus, article, or method that comprises the element.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the invention, and all equivalent structures or equivalent processes using the descriptions and drawings of the present invention or direct or indirect application in other related technical fields are included in the scope of the present invention.

Claims (10)

1. The positioning method of the unmanned aerial vehicle equipment is characterized by comprising the following steps of:
estimating the motion state of the unmanned aerial vehicle equipment by using an inertial measurement unit;
Acquiring environmental data of the unmanned aerial vehicle equipment by using a plurality of preset sensors, wherein the plurality of preset sensors comprise: 3D lidar, color cameras and thermal infrared cameras;
And taking the motion state as a motion equation of Kalman filtering, taking the environmental data as an observation equation of Kalman filtering, and carrying out Kalman filtering fusion processing on the motion equation and the observation equation to obtain the positioning of the unmanned aerial vehicle equipment.
2. The method for locating a drone device according to claim 1, wherein the capturing environmental data of the drone device using a plurality of preset sensors comprises:
Acquiring data of an environment where the unmanned aerial vehicle equipment is located through the 3D laser radar, and extracting characteristics of the acquired radar data to obtain first characteristic data;
acquiring data of the environment where the unmanned aerial vehicle equipment is located through the color camera, and extracting features of acquired color image data to obtain second feature data;
The thermal infrared camera is used for collecting data of the environment where the unmanned aerial vehicle equipment is located, and extracting characteristics of collected thermal infrared image data to obtain third characteristic data;
and constructing a constraint equation for at least one of the first feature data, the second feature data and the third feature data to obtain environment data serving as an observation equation.
3. The method for positioning an unmanned aerial vehicle device according to claim 2, wherein the data acquisition is performed on the environment where the unmanned aerial vehicle device is located by the 3D laser radar, and the feature extraction is performed on the acquired radar data, so as to obtain first feature data, and the method comprises the following steps:
Acquiring data of the environment where the unmanned aerial vehicle equipment is located through the 3D laser radar to obtain radar data;
And performing motion distortion correction on the radar data by using the inertial measurement unit, and performing feature extraction on the radar data subjected to the motion distortion correction to obtain first feature data.
4. The method for positioning an unmanned aerial vehicle device according to claim 2, wherein the data acquisition is performed on the environment where the unmanned aerial vehicle device is located by the color camera, and the feature extraction is performed on the acquired color image data, so as to obtain second feature data, including:
acquiring data of the environment where the unmanned aerial vehicle equipment is located through the color camera to obtain color image data;
Extracting key feature points in the color image data by using a FAST algorithm, and performing feature matching processing on the key feature points in the color image data by using an optical flow algorithm;
constructing a projection residual equation for the color image data processed by the optical flow algorithm;
And carrying out feature extraction on the color image data based on the projection residual error construction result to obtain second feature data.
5. The method for positioning an unmanned aerial vehicle device according to claim 2, wherein the data acquisition is performed on the environment where the unmanned aerial vehicle device is located by the thermal infrared camera, and the feature extraction is performed on the acquired thermal infrared image data, so as to obtain third feature data, and the method comprises the following steps:
And acquiring data of the environment where the unmanned aerial vehicle equipment is located through the thermal infrared camera to obtain thermal infrared image data, and extracting features of the thermal infrared image data by using a distance field to obtain third feature data.
6. The positioning method of the unmanned aerial vehicle device according to claim 2, wherein constructing the constraint equation for at least one of the first feature data, the second feature data, and the third feature data comprises:
Screening target characteristic data for constructing a constraint equation from the first characteristic data, the second characteristic data and the third characteristic data;
And constructing a constraint equation for the target characteristic data.
7. The method for locating a drone device according to claim 6, wherein screening out target feature data for constraint equation construction from the first feature data, the second feature data, and the third feature data includes:
screening out feature data extracted according to the radar data and the color image data in a normal mode;
screening out characteristic data extracted according to the radar data and the thermal infrared image data in a night mode;
screening out characteristic data extracted according to the thermal infrared image data in a rain and snow mode;
and screening out the characteristic data extracted according to the color image data in an abnormal mode.
8. Positioning device of unmanned aerial vehicle equipment, its characterized in that, positioning device of unmanned aerial vehicle equipment includes:
the estimation module is used for estimating the motion state of the unmanned aerial vehicle equipment by utilizing the inertial measurement unit;
the acquisition module is used for acquiring environmental data of the unmanned aerial vehicle equipment by utilizing a plurality of preset sensors, and the plurality of preset sensors comprise: 3D lidar, color cameras and thermal infrared cameras;
And the fusion module is used for taking the motion state as a motion equation of Kalman filtering, taking the environmental data as an observation equation of Kalman filtering, and carrying out Kalman filtering fusion processing on the motion equation and the observation equation to obtain the positioning of the unmanned aerial vehicle equipment.
9. A computer device comprising a memory and a processor, the memory having stored therein a computer program, characterized in that the processor, when executing the computer program, implements the steps of the method of any of claims 1 to 7.
10. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 7.
CN202410314224.3A (priority date 2024-03-19, filing date 2024-03-19) Positioning method and device of unmanned aerial vehicle equipment, computer equipment and storage medium. Status: Pending. Publication: CN117906598A (en).

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410314224.3A CN117906598A (en) 2024-03-19 2024-03-19 Positioning method and device of unmanned aerial vehicle equipment, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117906598A 2024-04-19

Family

ID=90689362

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410314224.3A Pending CN117906598A (en) 2024-03-19 2024-03-19 Positioning method and device of unmanned aerial vehicle equipment, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117906598A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108095761A (en) * 2012-03-07 2018-06-01 齐特奥股份有限公司 Spacial alignment equipment, spacial alignment system and the method for instructing medical procedure
CN109787679A (en) * 2019-03-15 2019-05-21 郭欣 Police infrared arrest system and method based on multi-rotor unmanned aerial vehicle
CN111382683A (en) * 2020-03-02 2020-07-07 东南大学 Target detection method based on feature fusion of color camera and infrared thermal imager
CN212008943U (en) * 2020-03-26 2020-11-24 北京农业信息技术研究中心 High-flux three-dimensional scanning spectral imaging measuring device
CN112347840A (en) * 2020-08-25 2021-02-09 天津大学 Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
CN113657270A (en) * 2021-08-17 2021-11-16 江苏熙枫智能科技有限公司 Unmanned aerial vehicle tracking method based on deep learning image processing technology
CN114254696A (en) * 2021-11-30 2022-03-29 上海西虹桥导航技术有限公司 Visible light, infrared and radar fusion target detection method based on deep learning
CN116380062A (en) * 2022-12-28 2023-07-04 深圳市优必选科技股份有限公司 Robot positioning method and device, mobile robot and readable storage medium
CN116878501A (en) * 2023-07-12 2023-10-13 北京理工大学 High-precision positioning and mapping system and method based on multi-sensor fusion
CN116957360A (en) * 2023-07-31 2023-10-27 广东中科凯泽信息科技有限公司 Space observation and reconstruction method and system based on unmanned aerial vehicle
CN117036404A (en) * 2023-08-09 2023-11-10 北京理工大学 Monocular thermal imaging simultaneous positioning and mapping method and system
CN117232499A (en) * 2023-09-20 2023-12-15 清华大学苏州汽车研究院(吴江) Multi-sensor fusion point cloud map construction method, device, equipment and medium
CN117554989A (en) * 2022-08-03 2024-02-13 北京氢源智能科技有限公司 Visual fusion laser radar SLAM positioning navigation method and unmanned aerial vehicle system thereof

Similar Documents

Publication Publication Date Title
CN109887057B (en) Method and device for generating high-precision map
KR102483649B1 (en) Vehicle localization method and vehicle localization apparatus
CN111947671B (en) Method, apparatus, computing device and computer-readable storage medium for positioning
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
WO2018142900A1 (en) Information processing device, data management device, data management system, method, and program
US20200124421A1 (en) Method and apparatus for estimating position
US20080195316A1 (en) System and method for motion estimation using vision sensors
US20180075614A1 (en) Method of Depth Estimation Using a Camera and Inertial Sensor
CN110889808A (en) Positioning method, device, equipment and storage medium
WO2019208101A1 (en) Position estimating device
KR101890612B1 (en) Method and apparatus for detecting object using adaptive roi and classifier
US11151729B2 (en) Mobile entity position estimation device and position estimation method
CN111308415B (en) Online pose estimation method and equipment based on time delay
US10109074B2 (en) Method and system for inertial measurement having image processing unit for determining at least one parameter associated with at least one feature in consecutive images
CN113580134A (en) Visual positioning method, device, robot, storage medium and program product
CN113405555B (en) Automatic driving positioning sensing method, system and device
KR102288609B1 (en) Method and system for position estimation of unmanned aerial vehicle using graph structure based on multi module
CN107270904B (en) Unmanned aerial vehicle auxiliary guide control system and method based on image registration
KR102195040B1 (en) Method for collecting road signs information using MMS and mono camera
WO2021056283A1 (en) Systems and methods for adjusting a vehicle pose
CN117906598A (en) Positioning method and device of unmanned aerial vehicle equipment, computer equipment and storage medium
CN117760417B (en) Fusion positioning method and system based on 4D millimeter wave radar and IMU
WO2022179047A1 (en) State information estimation method and apparatus
US11859979B2 (en) Delta position and delta attitude aiding of inertial navigation system
CN117723055A (en) Low-cost positioning method and system for autonomous passenger parking

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination