CN113970753B - Unmanned aerial vehicle positioning control method and system based on laser radar and vision detection - Google Patents


Info

Publication number: CN113970753B
Authority: CN (China)
Prior art keywords: unit, positioning, image, unmanned aerial vehicle
Application number: CN202111157405.2A
Other languages: Chinese (zh)
Other versions: CN113970753A
Inventors: 单梁, 马苗苗, 周逸飞, 吴志强, 陈佳
Current assignee: Nanjing University of Science and Technology
Original assignee: Nanjing University of Science and Technology
Application filed by Nanjing University of Science and Technology
Legal status: Active (granted)

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01S: RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00: Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/86: Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders

Abstract

The invention discloses an unmanned aerial vehicle positioning control method and system based on laser radar and visual detection. The system comprises a main control unit, a laser radar unit, a visual detection unit, a laser ranging unit, a storage unit and a communication unit; the laser radar unit, the visual detection unit, the laser ranging unit, the storage unit and the communication unit are all connected with the main control unit. The working method is as follows: the laser radar unit obtains point cloud information of the ground target object by radar scanning and derives coarse positioning information from it; the visual detection unit performs fine positioning recognition on images acquired in real time and extracts features of the target object to obtain accurate positioning information of the target image; the main control unit transmits the received positioning information, together with the unmanned aerial vehicle height measured by the laser ranging unit, to external equipment through the communication unit. By using multiple sensors to perform coarse and fine positioning of the ground target object respectively, the invention realizes unmanned aerial vehicle positioning control, reduces the drift error caused by inertial measurement elements, and improves positioning control precision.

Description

Unmanned aerial vehicle positioning control method and system based on laser radar and vision detection
Technical Field
The invention relates to the technical field of unmanned aerial vehicle positioning, in particular to an unmanned aerial vehicle positioning control system based on laser radar and vision detection.
Background
In recent years the unmanned aerial vehicle application market has grown rapidly, and unmanned aerial vehicles are being applied in many fields, such as aerial photography, plant protection, transportation and security. As unmanned aerial vehicles become part of daily life, the application market places increasingly strict requirements on high-precision positioning technology.
Mature unmanned aerial vehicle positioning technologies include global satellite navigation, optical flow positioning and wireless ranging positioning. Global satellite navigation degrades greatly when the building environment blocks satellite signals; in optical flow positioning, the effectiveness of the optical flow and data fusion algorithms strongly affects positioning precision; and wireless ranging positioning suffers from occlusion interference, is sensitive to changes in the vehicle's own attitude, and has poor dynamic performance, so it lacks universality in unmanned aerial vehicle systems.
Disclosure of Invention
The invention aims to provide an unmanned aerial vehicle positioning control system based on laser radar and visual detection and a working method thereof, wherein the unmanned aerial vehicle positioning control system adopts a multi-sensor positioning mode, can reduce drift errors caused by inertial measurement elements and improve positioning accuracy.
The technical scheme for realizing the invention is as follows: the unmanned aerial vehicle positioning control method based on laser radar and visual detection comprises a main control unit, a laser radar unit, a visual detection unit, a laser ranging unit, a storage unit and a communication unit;
the method comprises the following steps:
step 1, a laser ranging unit measures the flying height value of an unmanned aerial vehicle in real time and transmits height information to a main control unit;
Step 2, the laser radar unit performs coarse positioning, and the main control unit transmits the coarse positioning information to the external equipment through the communication unit;
Step 3, the laser ranging unit transmits the measured flying height value of the unmanned aerial vehicle to the main control unit, and the main control unit transmits the received radar coarse positioning information to the external equipment;
Step 4, the vision detection unit performs fine positioning recognition on the image acquired in real time, performs feature extraction on the target object to obtain accurate positioning information of the target image, and transmits the fine positioning information to the main control unit:
Step 4.1, aligning a camera to a positioning cross on the upper surface of a target container, and taking an acquired image as a target image;
Step 4.2, collect images in real time. To avoid overexposed and overly dark images, a brightness self-adaptive algorithm is designed: first the image brightness value L is calculated, where B, G and R are the pixel means of the BGR channels; the brightness range is then divided into regions, with different brightness values corresponding to different regions; a different gain alpha and offset beta are set for each region, where alpha adjusts the contrast of the image and beta adjusts its brightness, f(i, j) is the pixel in row i, column j of the original image and g(i, j) is the corresponding output pixel; finally, brightness processing of the collected image is realized;
L=0.299*R+0.587*G+0.114*B
g(i,j)=α*f(i,j)+β
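The brightness adaptation of step 4.2 can be sketched as follows. The region boundaries and the (alpha, beta) pairs per region are illustrative assumptions, since the patent does not give concrete values; only the two formulas above are taken from the text.

```python
import numpy as np

# Hypothetical region table: (upper luminance bound, gain alpha, offset beta).
# The patent only says each region gets its own alpha and beta.
REGIONS = [(60, 1.4, 40), (120, 1.2, 20), (180, 1.0, 0), (256, 0.8, -20)]

def luminance(img_bgr):
    """L = 0.299*R + 0.587*G + 0.114*B, computed over the BGR channel means."""
    b, g, r = (img_bgr[..., c].mean() for c in range(3))
    return 0.299 * r + 0.587 * g + 0.114 * b

def adapt_brightness(img_bgr):
    """Pick (alpha, beta) from the region containing L, apply g = alpha*f + beta."""
    L = luminance(img_bgr)
    for bound, alpha, beta in REGIONS:
        if L < bound:
            break
    out = alpha * img_bgr.astype(np.float32) + beta
    return np.clip(out, 0, 255).astype(np.uint8)
```

A uniformly dark image (all channels at 50) falls in the first region and is lifted to 1.4*50 + 40 = 110.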
Step 4.3, the brightness-processed image is given gray-level processing, Gaussian filtering, adaptive-threshold binarization and an opening operation, using algorithms from the opencv library; the threshold thresh for the adaptive-threshold binarization is selected by averaging the image pixels after Gaussian filtering, where sum is the pixel value of the filtered image, preS is the pixel value of the original image, and the offset delta lies in the range 20-30;
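A minimal sketch of the threshold selection in step 4.3, assuming (the patent leaves the exact formula implicit) that thresh is the mean of the Gaussian-filtered pixels plus the offset delta described in the text:

```python
import numpy as np

def adaptive_thresh(filtered, delta=25):
    """Threshold from the Gaussian-filtered image: pixel mean plus an offset
    delta in 20-30. Combining mean and delta this way is our assumption."""
    assert 20 <= delta <= 30
    return float(filtered.mean()) + delta

def binarize(filtered, delta=25):
    """Binarize the filtered image against the adaptive threshold."""
    t = adaptive_thresh(filtered, delta)
    return (filtered > t).astype(np.uint8) * 255
```

In practice this would replace the fixed threshold passed to an opencv `threshold` call; the opening operation then removes small noise blobs.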
Step 4.4, a PP-YOLO target detection algorithm is adopted; on the COCO data set its mean Average Precision (mAP) over many target classes reaches 45.2%, at a detection speed of 72.9 FPS (frames per second). Since only the specific cross target needs to be recognized, the multi-class detection of PP-YOLO is simplified to single-class detection, which increases the detection speed;
Step 4.5, creating a cross positioning mark data set, and expanding the data set by adopting an image enhancement method, namely rotating, overturning, cutting, scaling and deforming, so as to improve the generalization capability of the PP-YOLO model training;
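The data-set expansion in step 4.5 can be sketched with numpy; rotation and flipping are shown, while the cropping, scaling and deformation variants mentioned in the text are omitted for brevity:

```python
import numpy as np

def augment(img):
    """Expand one labelled cross-mark image into several variants
    (rotations and flips) to improve model generalization."""
    out = [img]
    for k in (1, 2, 3):            # 90, 180, 270 degree rotations
        out.append(np.rot90(img, k))
    out.append(np.fliplr(img))     # horizontal flip
    out.append(np.flipud(img))     # vertical flip
    return out
```

Each augmented image would be paired with a correspondingly transformed bounding-box label before PP-YOLO training.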
step 4.6, performing target detection by adopting a trained model, and identifying a cross target;
step 4.7, detecting the cross profile of the identified cross target to obtain a cross center coordinate B (x 1, y 1);
Step 4.8, comparing the cross center coordinates A (x 0, y 0) of the target image with the cross center coordinates B (x 1, y 1) of the image obtained by real-time acquisition and detection to obtain offset in the x and y directions, and using the offset as horizontal position positioning information;
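Steps 4.7-4.8 can be sketched as follows. The patent finds the cross center by contour detection on the PP-YOLO result; here a simple foreground-centroid computation stands in for that contour step, which is an assumption for illustration only.

```python
import numpy as np

def cross_center(binary):
    """Centroid of the foreground (nonzero) pixels of a binary cross mask,
    returned as (x, y) = (column, row); a stand-in for contour detection."""
    ys, xs = np.nonzero(binary)
    return float(xs.mean()), float(ys.mean())

def horizontal_offset(target_img, live_img):
    """Offset (dx, dy) between target cross center A(x0, y0) and the live
    cross center B(x1, y1); used as horizontal position information."""
    x0, y0 = cross_center(target_img)
    x1, y1 = cross_center(live_img)
    return x1 - x0, y1 - y0
```

For a cross centered at pixel (2, 2) in the target image and at (3, 3) in the live image, the offset is (1, 1) pixel in x and y.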
Step 5, the main control unit correlates the accurate positioning information and the unmanned aerial vehicle height value measured by the laser ranging unit with the actual position of the unmanned aerial vehicle, and transmits the positioning information to external equipment through the communication unit;
And 6, the main control unit stores the coarse positioning, the fine positioning and the height information into the storage unit.
Furthermore, the laser radar unit performs coarse positioning, specifically:
step 2.1, scanning the 16-line radar at a fixed distance right above the container to obtain and store point cloud information of a target position;
step 2.2, obtaining an output homogeneous transformation matrix T = [R p; 0 1] by the ICP algorithm, and thence the position information, where R is the rotation matrix and p the translation vector; the current point cloud X acquired in real time by the 16-line radar is mapped onto the target point cloud Y by T;
T*X=Y
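The transform-fitting step at the heart of ICP can be sketched as below. This is one least-squares fit with known correspondences (the Kabsch/SVD solution), shown in 2-D for brevity; full ICP iterates nearest-neighbour matching and this fit, and the radar case is 3-D.

```python
import numpy as np

def fit_transform(X, Y):
    """Fit the homogeneous matrix T = [[R, p], [0, 1]] with T @ X_h = Y_h,
    where X, Y are (N, 2) point sets with known correspondences."""
    cx, cy = X.mean(axis=0), Y.mean(axis=0)
    H = (X - cx).T @ (Y - cy)            # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T                       # rotation (Kabsch solution)
    if np.linalg.det(R) < 0:             # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    p = cy - R @ cx                      # translation
    T = np.eye(3)
    T[:2, :2], T[:2, 2] = R, p
    return T
```

For Y equal to X rotated by 90 degrees and shifted by (2, 3), the fit recovers that rotation and translation exactly, so T*X = Y holds as in the equation above.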
Furthermore, the invention provides an unmanned aerial vehicle positioning control system in which the laser radar unit, the visual detection unit, the laser ranging unit, the storage unit and the communication unit are all connected with the main control unit; the laser radar unit obtains coarse positioning information from the point cloud; the visual detection unit performs fine positioning recognition on images acquired in real time to obtain accurate positioning information of the target image; the laser ranging unit measures the flying height of the unmanned aerial vehicle; the main control unit receives the coarse positioning, height and fine positioning information; the storage unit stores the radar point cloud, the height information and the image positioning information, and the stored data can be read out by external equipment; the communication unit connects the main control unit to external communication equipment.
Furthermore, in the laser radar unit an RS-LIDAR-16 radar is mounted at the bottom of the four-rotor unmanned aerial vehicle body; the 16-line radar scans the area below the unmanned aerial vehicle, with a container as the target object.
Still further, in the vision detection unit a camera of model OpenMV4 is mounted at the bottom of the unmanned aerial vehicle; when the camera is aimed at the positioning cross on the upper surface of the target container, an image is acquired as the target image.
Still further, the master control unit employs an STM32F407 microcontroller.
Furthermore, the laser radar unit adopts an RS-LIDAR-16 laser radar; it communicates with the main control unit through a W5500 Ethernet module connected over an SPI interface, and transmits positioning information to the main control unit via the TCP/IP protocol.
Still further, the laser ranging unit employs a PANFEE L-40 laser ranging sensor.
Still further, the communication unit comprises a WIFI module of model ATK-ESP8226.
The invention has the following beneficial effects.
Compared with the prior art, the invention has these remarkable advantages: (1) Strong illumination adaptability. The brightness self-adaptive algorithm solves the problem of difficult target recognition under varying illumination intensities, rather than only under a single fixed brightness; and because traditional image processing algorithms are used, processing is fast and completes within 50 ms. (2) High precision. Whereas outdoor unmanned aerial vehicle positioning systems adopt GPS positioning, the invention adopts laser combined with vision, which reduces the drawbacks of GPS signal transmission delay and large positioning error: the laser ranging sensor and laser radar sensor have very high precision and can accurately position the target object beyond 1.5 m, while the visual positioning unit realizes high-precision positioning of the target object at close range (0-1.5 m) and acquires its accurate horizontal position information.
Drawings
FIG. 1 is a schematic diagram of the structure of the present invention;
FIG. 2 is a flow chart of the method of the present invention;
FIG. 3 is a laser radar positioning flow chart of the method of the present invention;
FIG. 4 is a visual positioning flow chart of the method of the present invention;
FIG. 5 is a graph showing the preprocessing effect in visual positioning according to the method of the present invention;
FIG. 6 is a graph showing the effect of PP-YOLO detection in visual positioning according to the present invention.
Detailed Description
A working method of an unmanned aerial vehicle positioning control system based on laser radar and vision detection comprises the following steps:
Step 1, a laser radar unit scans in real time to obtain point cloud information at a current position, and coarse positioning information is obtained by processing the current point cloud information and is transmitted to a main control unit;
Step 2, the main control unit transmits the coarse positioning information to the external equipment through the communication unit;
step 3, the laser ranging unit transmits the measured flying height value of the unmanned aerial vehicle to the main control unit;
Step 4, the vision detection unit performs fine positioning identification on the image acquired in real time, performs feature extraction on the target object to obtain accurate positioning information of the target image, and transmits the fine positioning information to the main control unit;
Step 5, the main control unit correlates the accurate positioning information and the unmanned aerial vehicle height value measured by the laser ranging unit with the actual position of the unmanned aerial vehicle, and transmits the positioning information to external equipment through the communication unit;
And 6, the main control unit stores the coarse positioning, the fine positioning and the height information into the storage unit.
Further, the radar positioning in step 1 includes:
The RS-LIDAR-16 radar is arranged at the bottom of a four-rotor unmanned aerial vehicle body, a 16-line radar scans the area below the unmanned aerial vehicle, and a container is used as a target object;
The 16-line radar scans a fixed distance above the container to obtain and store point cloud information of a target position;
An output homogeneous transformation matrix T = [R p; 0 1] is obtained by the ICP algorithm, and thence the position information, where R is the rotation matrix and p the translation vector; the current point cloud X acquired in real time by the 16-line radar is mapped onto the target point cloud Y by T.
T*X=Y
Further, the visual positioning in step 4 includes:
A camera of model OpenMV4 is mounted at the bottom of the unmanned aerial vehicle body; when the camera is aligned with the positioning cross on the upper surface of the target container, an image is acquired as the target image;
Images are collected in real time. To avoid overexposed and overly dark images, a brightness self-adaptive algorithm is designed: first the image brightness value L is calculated, where B, G and R are the pixel means of the BGR channels; the brightness range is then divided into regions, with different brightness values corresponding to different regions; a different gain alpha and offset beta are set for each region, where alpha adjusts the contrast of the image and beta adjusts its brightness, f(i, j) is the pixel in row i, column j of the original image and g(i, j) is the corresponding output pixel; finally, brightness processing of the collected image is realized;
L=0.299*R+0.587*G+0.114*B
g(i,j)=α*f(i,j)+β
The brightness-processed image is given gray-level processing, Gaussian filtering, adaptive-threshold binarization and an opening operation, using algorithms from the opencv library; the threshold thresh for the adaptive-threshold binarization is selected by averaging the image pixels after Gaussian filtering, where sum is the pixel value of the filtered image and preS is the pixel value of the original image, and a deviation delta in the range 20-30 is added;
A PP-YOLO target detection algorithm is adopted; on the COCO data set its mean Average Precision (mAP) over many target classes reaches 45.2%, at a detection speed of 72.9 FPS (frames per second). Since only the specific cross target needs to be recognized, the multi-class detection of PP-YOLO is simplified to single-class detection, which increases the detection speed;
creating a cross positioning mark data set, and expanding the data set by adopting an image enhancement method, namely rotating, overturning, cutting, zooming and deforming, so as to improve the generalization capability of the training of the PP-YOLO model;
performing target detection by adopting a trained model, and identifying a cross target;
Detecting the cross profile of the identified cross target to obtain a cross center coordinate B (x 1, y 1);
And comparing the cross center coordinates A (x 0, y 0) of the target image with the cross center coordinates B (x 1, y 1) of the image obtained by real-time acquisition and detection to obtain offset in the x and y directions, and using the offset as horizontal position positioning information.
The invention will be described in further detail with reference to the drawings and the specific examples.
Examples
Referring to fig. 1, the unmanned aerial vehicle positioning control system based on laser radar and visual detection comprises a main control unit, a laser radar unit, a visual detection unit, a laser ranging unit, a storage unit and a communication unit, where the laser radar unit, the visual detection unit, the laser ranging unit, the storage unit and the communication unit are all connected with the main control unit; the laser radar unit obtains coarse positioning information from the point cloud; the visual detection unit performs fine positioning recognition on images acquired in real time to obtain accurate positioning information of the target image; the laser ranging unit measures the flying height of the unmanned aerial vehicle; the main control unit receives the coarse positioning, height and fine positioning information; the storage unit stores the radar point cloud, the height information and the image positioning information, and the stored data can be read out by external equipment; the communication unit connects the main control unit to external communication equipment.
The main control unit adopts an STM32F407 microcontroller, whose rich peripheral resources and efficient processing speed meet the real-time, speed and stability requirements of a system with several sensor modules. The laser radar unit adopts an RS-LIDAR-16 laser radar; it communicates with the main control unit through a W5500 Ethernet module connected over an SPI interface, and transmits positioning information to the main control unit via TCP/IP. The visual detection unit adopts an OpenMV4 camera, which integrates a 32-bit ARM STM32H7 processor with FPU running at up to 480 MHz, giving fast image processing; its image sensor is an OV7725 photosensitive chip. This unit communicates with the main control unit through a serial port, sending it the positioning information. The laser ranging unit adopts a PANFEE L-40 laser ranging sensor with a 40 m range, 1 mm resolution and repeatability of ±1 mm, a stable and accurate sensor; it communicates with the main control unit through a serial port to transmit the height information. The communication unit comprises a WIFI module of model ATK-ESP8226; the module supports a UART data communication interface, so the main control unit connects to it through a serial port to exchange data with external communication equipment.
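The patent does not specify the wire format used between the main control unit and external equipment over the WIFI serial link, so the packet layout below is entirely hypothetical; it merely illustrates packing the mode flag, horizontal offsets and height into a fixed-size frame for a UART.

```python
import struct

# Hypothetical little-endian layout: mode flag (0 = coarse, 1 = fine),
# x/y offsets as float32 pixels, height as int32 millimetres.
PACKET_FMT = "<Bffi"

def pack_position(mode, dx, dy, height_mm):
    """Serialize one positioning report into a 13-byte frame."""
    return struct.pack(PACKET_FMT, mode, dx, dy, height_mm)

def unpack_position(payload):
    """Recover (mode, dx, dy, height_mm) from a frame."""
    return struct.unpack(PACKET_FMT, payload)
```

A real deployment would add framing bytes and a checksum on top of this payload before writing it to the UART.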
With reference to fig. 2, the working method of the unmanned aerial vehicle positioning control system with laser radar and visual detection of the invention comprises the following steps:
step 1, a laser ranging unit measures the flying height value of an unmanned aerial vehicle in real time and transmits height information to a main control unit;
step 2, the main control unit judges according to the height information; if the height value is greater than 2 m, the radar positioning unit works, as shown in fig. 3, specifically as follows:
Step 2.1, the 16-line radar scans from a fixed distance of 2 m directly above the container to obtain and store the point cloud information of the target position;
Step 2.2, an output matrix T is obtained by the ICP algorithm, and thence the position information, where R is the rotation matrix and p the translation vector; the current point cloud X acquired in real time by the 16-line radar is mapped onto the target point cloud Y by the homogeneous transformation matrix T.
T*X=Y
Step 3, the main control unit transmits the received radar coarse positioning information to external equipment;
step 4, if the height value received by the main control unit is less than or equal to 2 m, the visual positioning unit works, as shown in fig. 4, specifically as follows:
step 4.1, aligning a camera to a positioning cross on the upper surface of a target container, and taking an acquired image as a target image;
Step 4.2, collect images in real time. To avoid overexposed and overly dark images, a brightness self-adaptive algorithm is designed: first the image brightness value L is calculated, where B, G and R are the pixel means of the BGR channels; the brightness range is then divided into regions, with different brightness values corresponding to different regions; a different gain alpha and offset beta are set for each region, where alpha adjusts the contrast of the image and beta adjusts its brightness, f(i, j) is the pixel in row i, column j of the original image and g(i, j) is the corresponding output pixel; finally, brightness processing of the collected image is realized;
L=0.299*R+0.587*G+0.114*B
g(i,j)=α*f(i,j)+β
Step 4.3, the brightness-processed image is given gray-level processing, Gaussian filtering, adaptive-threshold binarization and an opening operation, using algorithms from the opencv library; the threshold thresh for the adaptive-threshold binarization is selected by averaging the image pixels after Gaussian filtering, where sum is the pixel value of the filtered image, preS is the pixel value of the original image, and the offset delta lies in the range 20-30;
Step 4.4, after steps 4.1-4.3 the target image yields the preprocessed image shown in fig. 5. This is a binary image in which the outline of the cross target is clearly visible, which facilitates the subsequent detection. The preprocessed image is passed to a PP-YOLO target detection algorithm; on the COCO data set the algorithm's mean Average Precision (mAP) over many target classes reaches 45.2%, at a detection speed of 72.9 FPS (frames per second). Since only the specific cross target needs to be recognized, the multi-class detection of PP-YOLO is simplified to single-class detection, which increases the detection speed;
Step 4.5, creating a cross positioning mark data set, and expanding the data set by adopting an image enhancement method, namely rotating, overturning, cutting, scaling and deforming, so as to improve the generalization capability of the PP-YOLO model training;
step 4.6, performing target detection by adopting a trained model, and identifying a cross target;
Step 4.7, the cross profile of the cross target identified by the PP-YOLO algorithm is detected to obtain the cross center coordinate B (x1, y1); as the detection effect diagram in fig. 6 shows, the center of the cross is accurately identified from the preprocessed image of fig. 5;
Step 4.8, comparing the cross center coordinates A (x 0, y 0) of the target image with the cross center coordinates B (x 1, y 1) of the image obtained by real-time acquisition and detection to obtain offset in the x and y directions, and using the offset as horizontal position positioning information;
Step 5, the main control unit correlates the accurate positioning information and the unmanned aerial vehicle height value measured by the laser ranging unit with the actual position of the unmanned aerial vehicle, and transmits the positioning information to external equipment through the communication unit;
And 6, the main control unit stores the coarse positioning, the fine positioning and the height information into the storage unit.
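The altitude-based switch between coarse and fine positioning described in steps 2 and 4 of the embodiment can be sketched as:

```python
def positioning_mode(height_m):
    """Sensor selection from the embodiment: 16-line radar coarse positioning
    above 2 m, camera fine positioning at or below 2 m."""
    return "radar_coarse" if height_m > 2.0 else "vision_fine"
```

The main control unit would evaluate this on every height reading from the laser ranging unit and dispatch to the radar or vision pipeline accordingly.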

Claims (9)

1. The unmanned aerial vehicle positioning control method based on laser radar and visual detection comprises a main control unit, a laser radar unit, a visual detection unit, a laser ranging unit, a storage unit and a communication unit;
The method is characterized by comprising the following steps of:
step 1, a laser ranging unit measures the flying height value of an unmanned aerial vehicle in real time and transmits height information to a main control unit;
Step 2, the laser radar unit performs coarse positioning, and the main control unit transmits the coarse positioning information to the external equipment through the communication unit;
Step 3, the laser ranging unit transmits the measured flying height value of the unmanned aerial vehicle to the main control unit, and the main control unit transmits the received radar coarse positioning information to the external equipment;
Step 4, the vision detection unit performs fine positioning recognition on the image acquired in real time, performs feature extraction on the target object to obtain accurate positioning information of the target image, and transmits the fine positioning information to the main control unit:
Step 4.1, aligning a camera to a positioning cross on the upper surface of a target container, and taking an acquired image as a target image;
Step 4.2, acquiring images in real time and processing their brightness; the brightness self-adaptive algorithm is as follows: first the image brightness value L is calculated, where B, G and R are the pixel means of the BGR channels; the brightness range is then divided into regions, with different brightness values corresponding to different regions; a different gain alpha and offset beta are set for each region, where alpha adjusts the contrast of the image and beta adjusts its brightness, f(i, j) is the pixel in row i, column j of the original image and g(i, j) is the corresponding output pixel; finally, brightness processing of the acquired image is realized;
L=0.299*R+0.587*G+0.114*B
g(i,j)=α*f(i,j)+β
Step 4.3, the brightness-processed image is given gray-level processing, Gaussian filtering, adaptive-threshold binarization and an opening operation, using algorithms from the opencv library; the threshold thresh for the adaptive-threshold binarization is selected by averaging the image pixels after Gaussian filtering, where sum is the pixel value of the filtered image, preS is the pixel value of the original image, and the offset delta lies in the range 20-30;
Step 4.4, adopting a PP-YOLO target detection algorithm; since only the cross target of a specific object needs to be recognized, the multi-class detection of PP-YOLO is simplified to single-class detection, which increases the detection speed;
Step 4.5, creating a cross positioning mark data set, and expanding the data set by adopting an image enhancement method, namely rotating, overturning, cutting, scaling and deforming, so as to improve the generalization capability of the PP-YOLO model training;
step 4.6, performing target detection by adopting a trained model, and identifying a cross target;
step 4.7, detecting the cross profile of the identified cross target to obtain a cross center coordinate B (x 1, y 1);
Step 4.8, comparing the cross center coordinates A (x 0, y 0) of the target image with the cross center coordinates B (x 1, y 1) of the image obtained by real-time acquisition and detection to obtain offset in the x and y directions, and using the offset as horizontal position positioning information;
Step 5, the main control unit associates the fine positioning information and the unmanned aerial vehicle height measured by the laser ranging unit with the actual position of the unmanned aerial vehicle, and transmits the positioning information to external equipment through the communication unit;
Step 6, the main control unit stores the coarse positioning, fine positioning and height information in the storage unit.
2. The unmanned aerial vehicle positioning control method based on the laser radar and the visual detection according to claim 1, wherein the laser radar unit performs coarse positioning, specifically:
step 2.1, the 16-line radar scans from a fixed distance directly above the container to obtain and store the point cloud information of the target position;
step 2.2, obtaining the output matrix T by the ICP algorithm and thereby the position information, where T is the homogeneous transformation matrix composed of a rotation transformation matrix R and a translation transformation matrix p; the current point cloud information X acquired in real time by the 16-line radar is mapped onto the target point cloud information Y by T:
T*X=Y
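The relation T·X = Y in claim 2 can be illustrated with one alignment step of ICP. The sketch below recovers T = [[R, p], [0, 1]] via SVD for point clouds with known correspondences; a full ICP iterates this step with nearest-neighbor matching, which is omitted here, and the synthetic clouds are illustrative only:

```python
import numpy as np

def fit_transform(X, Y):
    # One ICP alignment step with known correspondences: find the 4x4
    # homogeneous matrix T = [[R, p], [0, 1]] minimizing ||T*X - Y||
    cx, cy = X.mean(axis=0), Y.mean(axis=0)
    H = (X - cx).T @ (Y - cy)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    p = cy - R @ cx
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, p
    return T

def apply(T, X):
    # Apply the homogeneous transform: T * [x; 1]
    Xh = np.hstack([X, np.ones((len(X), 1))])
    return (T @ Xh.T).T[:, :3]

# Synthetic check: translate a small cloud and recover the motion
X = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
Y = X + np.array([2., -1., 0.5])
T = fit_transform(X, Y)
print(np.allclose(apply(T, X), Y))  # True
```

For a pure translation the recovered R is the identity and p equals the displacement, which matches the patent's reading of T as rotation R plus translation p.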
3. An unmanned aerial vehicle positioning control system using the unmanned aerial vehicle positioning control method according to claim 1 or 2, wherein the laser radar unit, the vision detection unit, the laser ranging unit, the storage unit and the communication unit are all connected with the main control unit; the laser radar unit obtains coarse positioning information from the point cloud information; the vision detection unit performs fine positioning identification on the image acquired in real time to obtain accurate positioning information relative to the target image; the laser ranging unit measures the flying height of the unmanned aerial vehicle; the main control unit receives the coarse positioning, height and fine positioning information; the storage unit stores the radar point cloud information, the height information and the image positioning information, and the stored data can be read out by an external device; the communication unit connects the main control unit to external communication equipment.
4. The unmanned aerial vehicle positioning control system of claim 3, wherein the laser radar unit mounts an RS-LIDAR-16 radar on the bottom of the four-rotor unmanned aerial vehicle, the 16-line radar scans the area below the unmanned aerial vehicle, and the cargo box is used as the target object.
5. A drone positioning control system according to claim 3, wherein the vision detection unit mounts a camera of model OpenMV4 at the bottom of the drone fuselage, and an image captured when the camera is aimed at the positioning cross on the upper surface of the target cargo box is taken as the target image.
6. A drone positioning control system according to claim 3, wherein the master control unit employs an STM32F407 microcontroller.
7. The unmanned aerial vehicle positioning control system of claim 3, wherein the laser radar unit is an RS-LIDAR-16 laser radar that communicates with the main control unit through a W5500 Ethernet module connected over an SPI interface, and transmits positioning information to the main control unit using the TCP/IP protocol.
8. A drone positioning control system according to claim 3, wherein the laser ranging unit employs PANFEE L-40 laser ranging sensors.
9. A drone positioning control system according to claim 3, wherein the communication unit comprises a WIFI module, model ATK-ESP8266.
CN202111157405.2A 2021-09-30 2021-09-30 Unmanned aerial vehicle positioning control method and system based on laser radar and vision detection Active CN113970753B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111157405.2A CN113970753B (en) 2021-09-30 2021-09-30 Unmanned aerial vehicle positioning control method and system based on laser radar and vision detection


Publications (2)

Publication Number Publication Date
CN113970753A CN113970753A (en) 2022-01-25
CN113970753B true CN113970753B (en) 2024-04-30

Family

ID=79587043


Country Status (1)

Country Link
CN (1) CN113970753B (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105823478A (en) * 2016-03-14 2016-08-03 武汉卓拔科技有限公司 Autonomous obstacle avoidance navigation information sharing and using method
WO2017177533A1 (en) * 2016-04-12 2017-10-19 深圳市龙云创新航空科技有限公司 Method and system for controlling laser radar based micro unmanned aerial vehicle
GB201715590D0 (en) * 2017-09-26 2017-11-08 Cambridge Consultants Delivery system
CN107450577A (en) * 2017-07-25 2017-12-08 天津大学 UAV Intelligent sensory perceptual system and method based on multisensor
CN108827306A (en) * 2018-05-31 2018-11-16 北京林业大学 A kind of unmanned plane SLAM navigation methods and systems based on Multi-sensor Fusion
CN109444911A (en) * 2018-10-18 2019-03-08 哈尔滨工程大学 A kind of unmanned boat waterborne target detection identification and the localization method of monocular camera and laser radar information fusion
CN110926474A (en) * 2019-11-28 2020-03-27 南京航空航天大学 Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method
WO2020237693A1 (en) * 2019-05-31 2020-12-03 华南理工大学 Multi-source sensing method and system for water surface unmanned equipment
CN112347840A (en) * 2020-08-25 2021-02-09 天津大学 Vision sensor laser radar integrated unmanned aerial vehicle positioning and image building device and method
CN113358665A (en) * 2021-05-25 2021-09-07 同济大学 Unmanned aerial vehicle tunnel defect detection method and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200301015A1 (en) * 2019-03-21 2020-09-24 Foresight Ai Inc. Systems and methods for localization


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Vehicle recognition method based on lidar information and monocular vision information; Yan Yao; Li Chunshu; Journal of Hebei University of Technology; 2019-12-15 (06); 16-22 *
Research on an automatic UAV coal-inventory system for open-air coal storage yards; Cheng Jian; Zu Fengshou; Wang Dongwei; Mao Shaowen; Ma Yonghui; Qian Jiansheng; Coal Science and Technology; 2016-12-31 (05); 162-167 *


Similar Documents

Publication Publication Date Title
CN109945858B (en) Multi-sensing fusion positioning method for low-speed parking driving scene
AU2018282302B2 (en) Integrated sensor calibration in natural scenes
CN111436216B (en) Method and system for color point cloud generation
CN104215239B (en) Guidance method using vision-based autonomous unmanned plane landing guidance device
CN110842940A (en) Building surveying robot multi-sensor fusion three-dimensional modeling method and system
CN109270534A (en) A kind of intelligent vehicle laser sensor and camera online calibration method
CN112669393A (en) Laser radar and camera combined calibration method
CN109472831A (en) Obstacle recognition range-measurement system and method towards road roller work progress
CN109212545A (en) Multiple source target following measuring system and tracking based on active vision
CN106197422A (en) A kind of unmanned plane based on two-dimensional tag location and method for tracking target
CN112577517A (en) Multi-element positioning sensor combined calibration method and system
CN111815717A (en) Multi-sensor fusion external parameter combination semi-autonomous calibration method
CN110297234B (en) Networked large-area passive air target intersection determination method and system
CN115683062B (en) Territorial space planning detection analysis system
CN113419235A (en) Unmanned aerial vehicle positioning method based on millimeter wave radar
CN114544006B (en) Low-altitude remote sensing image correction system and method based on ambient illumination condition
CN113655803A (en) System and method for calibrating course of rotor unmanned aerial vehicle in tunnel environment based on vision
CN113970753B (en) Unmanned aerial vehicle positioning control method and system based on laser radar and vision detection
CN117310627A (en) Combined calibration method applied to vehicle-road collaborative road side sensing system
CN109146936B (en) Image matching method, device, positioning method and system
CN113947141B (en) Roadside beacon sensing system of urban intersection scene
CN115471555A (en) Unmanned aerial vehicle infrared inspection pose determination method based on image feature point matching
CN110738706B (en) Rapid robot visual positioning method based on track conjecture
CN114066972A (en) Unmanned aerial vehicle autonomous positioning method based on monocular vision
CN111521996A (en) Laser radar installation calibration method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant