CN110926474B - Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method - Google Patents


Info

Publication number
CN110926474B
CN110926474B (application CN201911188082.6A)
Authority
CN
China
Prior art keywords
information
building
satellite
skyline
boundary
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911188082.6A
Other languages
Chinese (zh)
Other versions
CN110926474A (en)
Inventor
孙蕊
傅麟霞
彭聪
何伟
曹阳威
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201911188082.6A
Publication of CN110926474A
Application granted
Publication of CN110926474B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88 Lidar systems specially adapted for specific applications
    • G01S17/93 Lidar systems specially adapted for specific applications for anti-collision purposes
    • G01S17/933 Lidar systems specially adapted for specific applications for anti-collision purposes of aircraft or spacecraft
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01 Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13 Receivers
    • G01S19/22 Multipath-related issues
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 Determining position
    • G01S19/45 Determining position by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement

Abstract

The invention discloses a satellite/vision/laser combined UAV positioning and navigation method for urban canyon environments, belonging to the technical field of measurement and testing. In the method, an omnidirectional infrared camera obtains a sky image; part of the data in the 3D city information is extracted to construct an ideal city skyline database; the building skyline information extracted from the infrared image is compared with the ideal city skyline database to obtain the horizontal position of the unmanned aerial vehicle, while its height is determined by laser ranging. A building boundary sky plot is then constructed from the preliminary position of the unmanned aerial vehicle and the building boundary information, and superposed with the satellite sky plot to derive a multipath decision rule, after which the final position of the unmanned aerial vehicle is resolved. The method avoids interference from lighting conditions such as light attenuation at night, reduces computational complexity, makes image-processing feature extraction easy to implement, reduces the error and computation caused by direct scene matching, and meets the all-weather flight and operation requirements of the unmanned aerial vehicle.

Description

Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method
Technical Field
The invention discloses a satellite/vision/laser combined urban canyon environment UAV positioning and navigation method, and belongs to the technical field of measurement and testing.
Background
In recent years, with the development and popularization of commercial and civil unmanned aerial vehicles (UAVs), UAVs have been applied in numerous fields, such as urban express delivery in high-density logistics areas, emergency-rescue UAV auxiliary systems for urban emergencies, and dynamic urban traffic monitoring for smart-city construction; the positioning and navigation problems of urban UAVs therefore urgently need to be solved. Currently, Global Navigation Satellite Systems (GNSS) provide a low-cost, continuous, global positioning solution and have become essential positioning and navigation equipment for UAVs. In an urban canyon environment, however, streets are narrow and buildings dense and generally tall, so GNSS signals are easily shielded or reflected, and the positioning and navigation accuracy of the UAV cannot meet the requirements of flight and operation.
In an open area, the signals received by a GNSS receiver are line-of-sight (LOS) signals, i.e. direct signals that reach the user without obstruction. In a dense urban canyon, however, GNSS signals are easily blocked or reflected, so the receiver may receive direct signals, reflected signals, or a mixture of the two; the resulting multipath effect can introduce large errors into the final positioning result. In an urban canyon environment it is therefore essential to identify and reject the multipath signals among those received by the GNSS receiver in order to improve the navigation and positioning accuracy of the UAV.
Obst et al. propose a lightweight probabilistic positioning algorithm for multipath detection that uses ray tracing over 3D map information to determine satellite visibility; it can improve the accuracy and integrity of an urban single-frequency GPS receiver without additional physical sensors. Conte et al. propose a UAV navigation method combining an inertial sensor, a visual odometer and scene matching, addressing the fact that urban-canyon GPS signals are rendered unusable by multipath and other effects. Sim et al. propose a navigation parameter estimation system based on an airborne image sensor, composed of a relative position estimator and an absolute position estimator: the relative part updates the UAV position by stereo-modelling two consecutive image frames, while the absolute part corrects the relative estimation error by scene matching against a reference image or a Digital Elevation Model (DEM). Chen et al. propose a probability-map-based navigation algorithm that combines laser radar, 3D map information and GNSS satellites to generate trajectory estimates and sense the surrounding environment, reducing the influence of multipath on pseudorange positioning. Groves et al. propose judging satellite visibility by shadow matching based on 3D map information: the map predicts where each satellite's signal is receivable, and judging whether a signal is line-of-sight delimits the user's possible position, improving positioning accuracy in urban canyons. Soundaraj et al. propose an algorithm combining optical flow with data-driven image classification that performs real-time 3D positioning and navigation using only the images captured by a single camera, enabling obstacle avoidance and navigation in narrow indoor environments.
In general, the main methods currently used in urban canyons to improve UAV positioning and navigation accuracy are: combining a visual sensor with other sensors such as LiDAR (light detection and ranging) and an inertial measurement unit (IMU) to construct real-time three-dimensional environment information and matching it against 3D map information to obtain an accurate position estimate; using 3D map information with algorithms such as shadow matching or ray tracing to infer satellite availability and screen satellite signals so that the positioning result is accurate; and using a visual sensor, via video image processing and image matching techniques, to correct positioning drift and assist satellite positioning, improving positioning accuracy.
Although these existing vision-assisted urban UAV positioning and navigation methods each achieve results, most adopt high-performance sensors. Some visual sensors, such as monocular cameras, are easily affected by the ambient light environment, which limits the application scenarios, and three-dimensional reconstruction with visual sensors tends to have poor real-time performance. Equipment such as binocular cameras is costly, which hinders its popularization in the UAV application field; real-time map construction is also prone to large errors and requires repeated training to obtain good results. Furthermore, back-estimating the visibility of every satellite from 3D map information is computationally complex and degrades the real-time performance of the system.
Disclosure of Invention
The invention aims to provide a satellite/vision/laser combined urban canyon environment UAV positioning and navigation method that judges the type of each received signal by combining vision, laser and a 2D city ideal skyline database built from 3D map information. It overcomes the high computational complexity and lighting interference of existing urban UAV positioning and navigation methods, improves positioning accuracy, and solves the technical problem of insufficient UAV positioning and navigation accuracy caused by high-rise shielding and reflection of GNSS satellite signals in the urban canyon environment.
The invention adopts the following technical scheme to realize this aim:
a satellite/vision/laser combined UAV positioning and navigation method for an urban canyon environment comprises the steps of extracting feature information of a building boundary and a road boundary from 3D map information to construct a 2D urban ideal skyline database, extracting building skyline features from a sky image shot by an airborne infrared camera, globally matching the building skyline features in the 2D urban ideal skyline database to determine the horizontal position of an unmanned aerial vehicle, measuring the altitude information of the unmanned aerial vehicle by using a laser range finder to further determine the initial spatial position information of the unmanned aerial vehicle, calculating the altitude angle information and the azimuth angle information of the building boundary line according to the initial spatial position information of the unmanned aerial vehicle and the 2D urban ideal skyline database, constructing a building boundary sky map according to the altitude angle information and the azimuth angle information of the building boundary line, and converting the altitude angle and the azimuth angle of a satellite calculated by a receiver into a coordinate system where the building boundary sky map is located to obtain the satellite sky map, and superposing the building boundary space-sky diagram and the satellite space-sky diagram to determine a multipath judgment rule, eliminating multipath signals in the received signals by using the multipath judgment rule, and calculating the final position by using the received information after the multipath signals are eliminated.
Further, in the satellite/vision/laser combined urban canyon environment UAV positioning and navigation method, the 2D city ideal skyline database includes: urban building boundary length information, building boundary point coordinate information, building identification information, road width information, road boundary point coordinate information and road identification information.
Further, in the satellite/vision/laser combined urban canyon environment UAV positioning and navigation method, the method for extracting the building skyline features from the sky image shot by the airborne infrared camera is as follows: the sky image shot by the airborne infrared camera is preprocessed; pixel points with a large brightness-change span are captured from the preprocessed sky image with a Sobel boundary detection operator; the captured pixel points are coordinate-converted to obtain the coordinates of the building boundary points; and the shape information and length-ratio information of the building skyline boundary are extracted from the coordinate information of the building boundary points.
Further, in the satellite/vision/laser combined urban canyon environment UAV positioning and navigation method, the method of globally matching the building skyline features in the 2D city ideal skyline database to determine the horizontal position of the drone is as follows: the corner point information of the building skyline features is matched with the building boundary point coordinate information in the 2D city ideal skyline database; once the matching degree of the corner point information meets the requirement, the length-ratio information of the building skyline features is matched with the urban building boundary length information in the database; and the horizontal position of the drone is determined from the position coordinates, in the global map, of the urban building boundary line with the highest length matching degree.
Further, in the satellite/vision/laser combined urban canyon environment UAV positioning and navigation method, the method for constructing the building boundary sky plot from the altitude-angle information and azimuth-angle information of the building boundary lines is as follows: the boundary point coordinate information, altitude-angle information and azimuth-angle information of the building boundary lines in the east-north-up coordinate system are projected onto the EON plane to obtain the building boundary sky plot.
Further, in the satellite/vision/laser combined urban canyon environment UAV positioning and navigation method, the method of superposing the building boundary sky plot and the satellite sky plot to determine the multipath decision rule is as follows: the two sky plots are superposed with their north directions and 90°-altitude centres aligned to obtain a superposed sky plot; a satellite falling below a building boundary line is a satellite shielded by the building, and the signal of the shielded satellite is a multipath signal.
By adopting the above technical scheme, the invention has the following beneficial effects. The omnidirectional infrared camera acquires the image information, avoiding the errors caused by changing light conditions such as light attenuation at night. Only part of the data in the 3D city information is extracted in advance to construct the ideal city skyline database, reducing the computational complexity to a certain extent; extracting the building skyline features by processing the infrared images is easy to implement, and global shape matching in the ideal skyline database yields the preliminary horizontal information of the unmanned aerial vehicle, reducing the error and computation caused by direct scene matching and meeting the all-weather flight and operation requirements of the unmanned aerial vehicle. Superposing the building boundary sky plot and the satellite sky plot makes the multipath judgment simple, and the computation of deriving the multipath decision rule is lower than in conventional positioning and navigation schemes.
Drawings
Fig. 1 is a flow chart of positioning and navigation of a drone based on a satellite/vision/laser combination in an urban canyon environment.
FIG. 2 shows the distribution of the pixels surrounding a pixel D0.
FIG. 3 is a schematic diagram of the calculation of the altitude and azimuth angles of the building boundary lines.
Fig. 4 is a schematic diagram of a building boundary sky plot.
Fig. 5 is a schematic view of the sky plot superposition.
Detailed Description
The technical scheme of the invention is explained in detail below with reference to the attached drawings. The specific scheme of the satellite/vision/laser combined UAV positioning and navigation method in an urban canyon environment is shown in fig. 1 and comprises the following steps.
(I) constructing a city skyline database
The characteristic information of buildings and roads is extracted from the 3D map information to construct the 2D city ideal skyline database in advance. The data contained in the database mainly comprise: city building boundary length information, building boundary point coordinate information, building identification information, road width information, road boundary point coordinate information, road identification information, and the like.
(II) acquiring the preliminary spatial position of the unmanned aerial vehicle
An omnidirectional infrared camera shoots the sky directly above the unmanned aerial vehicle. Given the imaging principle of the omnidirectional infrared camera, the camera must be calibrated and the obtained infrared image processed to reduce the influence of lens distortion on feature extraction; features are then extracted from the shapes presented in the infrared image. Under the infrared camera the sky appears distinctly black and is clearly distinguishable from the near-white patterns presented by building shapes, so after the infrared image is grayed the building skyline feature data can be captured accurately by day or by night. The feature extraction principle and method are as follows:
the method can be summarized as follows: according to the shape characteristics presented by the shot infrared image, a sobel boundary detection operator is adopted to capture the characteristic data of the building skyline by combining an image preprocessing technology, and the following steps are developed in detail.
Before the image is smoothed it must be grayed, i.e. the R, G and B values of every pixel are made equal. A weighted-average method is used for the graying: because the human eye has different sensitivities to red, green and blue, for each pixel point (u, v) the gray value is
Gray(u, v) = R′ = G′ = B′ = 0.299R + 0.587G + 0.114B
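As an illustrative sketch (not taken from the patent itself), the graying step can be written in a few lines of Python, assuming the image is an H × W × 3 RGB array:

    import numpy as np

    def to_gray(rgb):
        # Weighted-average graying: R' = G' = B' = 0.299R + 0.587G + 0.114B
        r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
        return 0.299 * r + 0.587 * g + 0.114 * b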
After graying, the image is smoothed so that it becomes suitable for the Sobel operator. Gaussian blur convolves the original image to reduce abrupt gradients, smooth sudden changes of brightness and improve image quality. For an image of size a × b, the Gaussian blur is computed with the two-dimensional Gaussian function
G(x, y) = (1 / (2πσ²)) · e^(−(x² + y²) / (2σ²)),
where σ is the standard deviation of the Gaussian distribution; the larger σ is, the smoother (more blurred) the resulting image. After the image has been smoothed, boundary detection can be carried out with the Sobel operator: boundary detection searches for pixel points whose brightness-change span is large according to a suitably set threshold M, i.e. a pixel satisfying |T| > M is regarded as a boundary point.
The vertical and horizontal convolution kernels of the Sobel operator are, respectively:

    [ -1  -2  -1 ]        [ -1   0   1 ]
    [  0   0   0 ]  and   [ -2   0   2 ]
    [  1   2   1 ]        [ -1   0   1 ]
For a pixel point (u, v), denoted D0, the distribution of its surrounding pixels is shown in fig. 2:
the gradient of the luminance change | T | at the pixel point (u, v) is:
|T|=|(D6+2D7+D8)-(D1+2D2+D3)|+|(D3+2D5+D8)-(D1+2D4+D6)|,
The pixel positions of the boundary points are obtained by comparing the brightness-change gradient with the set threshold M. Because the image processing involves the camera coordinate system, the world coordinate system and the pixel coordinate system, the coordinates must be converted before the feature data are extracted. The shape features of the building boundary lines are then extracted, giving their shape information, and the length-ratio information of the boundary lines is calculated with the plane distance formula; the shape information and the length-ratio information together form the building skyline feature data extracted from the infrared image.
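A minimal sketch of this boundary-point capture, assuming NumPy and SciPy are available; the kernels reproduce the |T| formula above for the neighbourhood layout D1 D2 D3 / D4 D0 D5 / D6 D7 D8, and σ and M are free parameters:

    import numpy as np
    from scipy.ndimage import convolve, gaussian_filter

    # Sobel kernels consistent with
    # |T| = |(D6+2D7+D8)-(D1+2D2+D3)| + |(D3+2D5+D8)-(D1+2D4+D6)|
    K_VERT = np.array([[-1, -2, -1],
                       [ 0,  0,  0],
                       [ 1,  2,  1]], dtype=float)
    K_HORZ = np.array([[-1,  0,  1],
                       [-2,  0,  2],
                       [-1,  0,  1]], dtype=float)

    def boundary_points(gray, sigma=1.0, M=100.0):
        # Gaussian smoothing, then |T| per pixel, then thresholding.
        smoothed = gaussian_filter(gray, sigma)
        t = np.abs(convolve(smoothed, K_VERT)) + np.abs(convolve(smoothed, K_HORZ))
        return np.argwhere(t > M)  # (row, col) coordinates of boundary pixels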
Next, based on the idea of "shape matching", global feature matching is performed in the 2D city ideal skyline database using the building skyline feature data extracted from the infrared image. From the city building boundary length information, building boundary point coordinate information and road boundary point coordinate information in the database, the length-ratio information and boundary point information of each ideal building skyline are obtained through the distance formula and matched against the feature information extracted from the image: first it is judged whether the corner points (i.e. the intersections of two boundary lines) match to the required degree; then the conformity of the boundary lines is judged; finally the candidates are compared and screened by their length-ratio information. The position coordinates, in the global map, of the ideal building boundary line with the highest matching degree give the horizontal position of the unmanned aerial vehicle in the global map. The altitude of the unmanned aerial vehicle is measured with a laser range finder, finally determining its preliminary spatial position information.
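The matching step might be sketched as follows; the database record fields ('corners', 'ratios', 'position') and the distance-based score are assumptions for illustration, since the patent does not fix them:

    import numpy as np

    def match_skyline(img_corners, img_ratios, database):
        # Global "shape matching" of the extracted skyline features against
        # every candidate boundary line in the 2D city ideal skyline database.
        best, best_score = None, float("inf")
        for cand in database:
            # Corner check first: the candidate must present the same
            # number of corners as the image skyline.
            if len(cand["corners"]) != len(img_corners):
                continue
            # Then compare the segment length-ratio vectors.
            score = np.linalg.norm(np.asarray(cand["ratios"]) -
                                   np.asarray(img_ratios))
            if score < best_score:
                best, best_score = cand, score
        # Position of the best-matching boundary line in the global map.
        return None if best is None else best["position"]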
(III) sky plot superposition and NLOS signal determination
The building boundary sky plot of the area where the unmanned aerial vehicle is located (for example, a square of side 40 m centred on the unmanned aerial vehicle) can be calculated from the horizontal position and height information of the unmanned aerial vehicle together with the building boundary length information, the road width information and the building height information contained in the building boundary point information of the 2D city ideal skyline database. The building boundary sky plot contains the altitude angle and azimuth angle of each building boundary line. Because the 2D city ideal skyline database is constructed from 3D map information, the distance formula and inverse trigonometric functions can be applied to the geometric structure formed by the building boundary coordinate points and the unmanned aerial vehicle coordinate point obtained in step (II); the principle is shown in fig. 3, and the specific calculation process is as follows:
First, from the coordinates of the unmanned aerial vehicle position, the height of the unmanned aerial vehicle, and the building boundary point coordinate information and building height information in the database, the altitude angle and azimuth angle of each building boundary line relative to the position of the unmanned aerial vehicle are easily solved with coordinate relations and the distance formula; for example, boundary point A has altitude angle ∠1 and azimuth angle 0°, and boundary point B has altitude angle ∠1 and azimuth angle ∠2. After the altitude and azimuth angles of the building boundary lines are solved, the XuYuZu body coordinate system of the unmanned aerial vehicle shown in fig. 3 is transferred to the east-north-up coordinate system; the coordinate transfer process is as follows:
wherein, the rotation matrix Q of the above process is:
(the coordinate-transfer formula and the rotation matrix Q appear as formula images in the original document)
The body coordinate system of the unmanned aerial vehicle has now been transferred to the east-north-up (ENU) coordinate system, and the altitude and azimuth angles are correspondingly expressed in the ENU frame.
The building boundary line coordinate information and its altitude-angle and azimuth-angle information in the ENU frame are projected onto the EON plane to obtain the building boundary sky plot; setting the north direction as the 0° direction gives the sky plot sketched in fig. 4, in which the boundary of the grey area represents the building boundary line and the circular rings represent altitude angles, decreasing by 10° per ring from the centre (90°) to the outermost circle.
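A sketch of the altitude/azimuth computation and the sky-plot projection for one boundary point, assuming ENU coordinates in metres (the helper names are illustrative, not from the patent):

    import numpy as np

    def corner_el_az(uav_e, uav_n, uav_u, pt_e, pt_n, pt_u):
        # Altitude and azimuth (deg) of a building boundary point as seen
        # from the UAV, via the plane distance formula and inverse trig.
        horiz = np.hypot(pt_e - uav_e, pt_n - uav_n)
        el = np.degrees(np.arctan2(pt_u - uav_u, horiz))
        az = np.degrees(np.arctan2(pt_e - uav_e, pt_n - uav_n)) % 360.0  # 0 deg = north
        return el, az

    def to_sky_plot(el, az):
        # EON-plane projection: the radius grows as the altitude angle falls
        # from 90 deg at the centre to 0 deg at the outermost circle.
        r = 90.0 - el
        return r * np.sin(np.radians(az)), r * np.cos(np.radians(az))  # (E, N)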
Similarly, the satellite altitude and azimuth angles calculated by the receiver are coordinate-converted so that they, too, lie in the ENU frame, which gives the satellite sky plot. The building boundary sky plot and the satellite sky plot are then superposed by aligning their north directions and their 90°-altitude centres; the resulting superposition is shown schematically in fig. 5.
Comparing the altitude and azimuth angles of the building boundary lines with those of the satellites gives the satellite signal shielding condition and identifies the invisible satellites, which determines the multipath decision rule: if a satellite position falls in the shielded area below the building boundary, i.e. a non-sky area, its signal is judged to be a multipath signal and is rejected; otherwise it is judged to be a usable satellite signal. As shown, satellites ST2 and ST4 fall below the building boundary, i.e. in the shadow part of the buildings, so the received signals of these two satellites are NLOS signals and should be rejected.
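In effect the rule is an azimuth-dependent elevation mask. A minimal sketch, assuming the building boundary sky plot has been sampled as (azimuth, boundary elevation) pairs:

    import numpy as np

    def boundary_elevation(az, mask_az, mask_el):
        # Boundary elevation (deg) at azimuth az, interpolated from samples;
        # mask_az must be sorted ascending over [0, 360).
        return np.interp(az % 360.0, mask_az, mask_el, period=360.0)

    def usable_satellites(sats, mask_az, mask_el):
        # Keep satellites above the boundary line (sky area); those below it
        # (e.g. ST2 and ST4 in fig. 5) are classed as multipath and rejected.
        return [s for s in sats
                if s["el"] > boundary_elevation(s["az"], mask_az, mask_el)]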
(IV) Final position information resolution
The received signals are judged with the above rule, the signals identified as multipath are rejected, and a least-squares positioning solution is computed from the remaining usable satellite signals. For the residual GNSS errors, the Klobuchar model corrects the ionospheric error, the Hopfield model corrects the tropospheric error, and the precise ephemeris corrects the satellite clock error, yielding the finally output position information.
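For reference, the least-squares step can be sketched as a standard iterative pseudorange solution (not the patent's exact implementation; the atmospheric and clock corrections are assumed to have already been applied to the pseudoranges, and at least four satellites remain):

    import numpy as np

    def ls_position(sat_pos, pr, x0=np.zeros(3), iters=10):
        # sat_pos: (K, 3) ECEF positions of the remaining LOS satellites;
        # pr: (K,) corrected pseudoranges in metres.
        # Solves for receiver position and clock bias [x, y, z, c*dt].
        x = np.append(np.asarray(x0, dtype=float), 0.0)
        for _ in range(iters):
            rho = np.linalg.norm(sat_pos - x[:3], axis=1)   # geometric ranges
            H = np.hstack([(x[:3] - sat_pos) / rho[:, None],
                           np.ones((len(pr), 1))])          # design matrix
            dx, *_ = np.linalg.lstsq(H, pr - (rho + x[3]), rcond=None)
            x += dx
            if np.linalg.norm(dx) < 1e-4:                   # converged
                break
        return x[:3], x[3]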

Claims (4)

1. A satellite/vision/laser combined positioning and navigation method for an urban canyon environment UAV (unmanned aerial vehicle), comprising:
extracting feature information of a building boundary and a road boundary from 3D map information to construct a 2D city ideal skyline database, wherein the 2D city ideal skyline database comprises: urban building boundary length information, building boundary point coordinate information, building identification information, road width information, road boundary point coordinate information and road identification information;
extracting building skyline features from a sky image shot by an airborne infrared camera, and globally matching the building skyline features in a 2D city ideal skyline database to determine the horizontal position of the unmanned aerial vehicle: matching corner point information of building skyline features with building boundary point coordinate information in a 2D city ideal skyline database, matching length proportion information of the building skyline features with city building boundary length information in the 2D city ideal skyline database after the matching degree of the corner point information meets requirements, and determining the horizontal position of the unmanned aerial vehicle according to position coordinates of the city building boundary line with the highest matching degree of the length information in a global map;
the method comprises the steps that altitude information of the unmanned aerial vehicle is measured by a laser range finder, so that preliminary spatial position information of the unmanned aerial vehicle is determined, altitude angle information and azimuth angle information of a building boundary line are calculated according to the preliminary spatial position information of the unmanned aerial vehicle and a 2D city ideal skyline database, and a building boundary sky plot is constructed according to the altitude angle information and the azimuth angle information of the building boundary line;
and converting the satellite altitude angle and azimuth angle calculated by the receiver into the coordinate system of the building boundary sky plot to obtain the satellite sky plot, superposing the building boundary sky plot and the satellite sky plot to determine a multipath decision rule, rejecting the multipath signals in the received signals with the multipath decision rule, and resolving the final position with the received information remaining after the multipath signals are rejected.
2. The satellite/vision/laser combined positioning and navigation method for the urban canyon environment UAV according to claim 1, wherein the method for extracting the building skyline features from the sky image shot by the airborne infrared camera comprises: preprocessing the sky image shot by the airborne infrared camera, capturing pixel points with a large brightness-change span from the preprocessed sky image with a Sobel boundary detection operator, performing coordinate conversion on the captured pixel points to obtain the coordinates of the building boundary points, and extracting the shape information and length-ratio information of the building skyline boundary from the coordinate information of the building boundary points.
3. The satellite/vision/laser combined positioning and navigation method for the urban canyon environment UAV according to claim 1, wherein the method for constructing the building boundary sky plot from the altitude-angle information and azimuth-angle information of the building boundary lines comprises: projecting the boundary point coordinate information, altitude-angle information and azimuth-angle information of the building boundary lines in the east-north-up coordinate system onto the EON plane to obtain the building boundary sky plot.
4. The satellite/vision/laser combined positioning and navigation method for the urban canyon environment UAV according to claim 1, wherein the method of superposing the building boundary sky plot and the satellite sky plot to determine the multipath decision rule comprises: superposing the two sky plots with their north directions and 90°-altitude centres aligned to obtain a superposed sky plot, wherein a satellite falling below a building boundary line is a satellite shielded by the building, and the signal of the shielded satellite is a multipath signal.
CN201911188082.6A 2019-11-28 2019-11-28 Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method Active CN110926474B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911188082.6A CN110926474B (en) 2019-11-28 2019-11-28 Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911188082.6A CN110926474B (en) 2019-11-28 2019-11-28 Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method

Publications (2)

Publication Number Publication Date
CN110926474A CN110926474A (en) 2020-03-27
CN110926474B (en) 2021-09-03

Family

ID=69846789

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911188082.6A Active CN110926474B (en) 2019-11-28 2019-11-28 Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method

Country Status (1)

Country Link
CN (1) CN110926474B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021223107A1 (en) * 2020-05-06 2021-11-11 深圳市大疆创新科技有限公司 Signal processing method, electronic device and computer-readable storage medium
CN113739797A (en) * 2020-05-31 2021-12-03 华为技术有限公司 Visual positioning method and device
CN112164114B (en) * 2020-09-23 2022-05-20 天津大学 Outdoor active camera repositioning method based on skyline matching
CN112835083B (en) * 2020-12-31 2023-08-01 广州南方卫星导航仪器有限公司 Combined navigation system
CN113031041B (en) * 2021-03-11 2022-06-10 南京航空航天大学 Urban canyon integrated navigation and positioning method based on skyline matching
CN113112544B (en) * 2021-04-09 2022-07-19 国能智慧科技发展(江苏)有限公司 Personnel positioning abnormity detection system based on intelligent Internet of things and big data
CN113325450A (en) * 2021-05-07 2021-08-31 Oppo广东移动通信有限公司 Positioning method, positioning device, electronic equipment and storage medium
CN113359168A (en) * 2021-05-19 2021-09-07 北京数研科技发展有限公司 Vision-assisted GNSS non-line-of-sight signal suppression method
CN113504553B (en) * 2021-06-29 2024-03-29 南京航空航天大学 GNSS positioning method based on accurate 3D city model in urban canyon
CN113820697B (en) * 2021-09-09 2024-03-26 中国电子科技集团公司第五十四研究所 Visual positioning method based on city building features and three-dimensional map
CN115212489A (en) * 2022-07-20 2022-10-21 中国矿业大学 Unmanned aerial vehicle fire-fighting rescue decision-making auxiliary system for forest fire
CN116755126B (en) * 2023-08-15 2023-11-14 北京航空航天大学 Beidou real-time accurate positioning method based on three-dimensional model mapping matching

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103822635A (en) * 2014-03-05 2014-05-28 北京航空航天大学 Visual information based real-time calculation method of spatial position of flying unmanned aircraft
CN104848867A (en) * 2015-05-13 2015-08-19 北京工业大学 Pilotless automobile combination navigation method based on vision screening
CN105022401A (en) * 2015-07-06 2015-11-04 南京航空航天大学 SLAM method through cooperation of multiple quadrotor unmanned planes based on vision
CN106371114A (en) * 2015-07-23 2017-02-01 现代自动车株式会社 Positioning apparatus and method for vehicle
CN106970398A (en) * 2017-03-27 2017-07-21 中国电建集团西北勘测设计研究院有限公司 Take the satellite visibility analysis and ephemeris forecasting procedure of satellite obstruction conditions into account
CN107064974A (en) * 2017-02-28 2017-08-18 广东工业大学 A kind of localization method and system for suppressing urban canyons multipath satellite-signal
CN109285177A (en) * 2018-08-24 2019-01-29 西安建筑科技大学 A kind of digital city skyline extracting method
CN109991640A (en) * 2017-12-29 2019-07-09 上海司南卫星导航技术股份有限公司 A kind of integrated navigation system and its localization method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IL164650A0 (en) * 2004-10-18 2005-12-18 Odf Optronics Ltd An application for the extraction and modeling of skyline for the purpose of orientation and stabilization
US20170146990A1 (en) * 2015-11-19 2017-05-25 Caterpillar Inc. Augmented communication and positioning using unmanned aerial vehicles
US20190271550A1 (en) * 2016-07-21 2019-09-05 Intelligent Technologies International, Inc. System and Method for Creating, Updating, and Using Maps Generated by Probe Vehicles
CN107966724B (en) * 2017-11-27 2019-06-14 Nanjing University of Aeronautics and Astronautics Satellite positioning method in urban canyons aided by a 3D city model

Also Published As

Publication number Publication date
CN110926474A (en) 2020-03-27

Similar Documents

Publication Publication Date Title
CN110926474B (en) Satellite/vision/laser combined urban canyon environment UAV positioning and navigation method
WO2021248636A1 (en) System and method for detecting and positioning autonomous driving object
CN103822635B (en) The unmanned plane during flying spatial location real-time computing technique of view-based access control model information
US10909395B2 (en) Object detection apparatus
CN110859044A (en) Integrated sensor calibration in natural scenes
CN107917699B (en) Method for improving aerial three quality of mountain landform oblique photogrammetry
WO2010088290A1 (en) Tight optical intergation (toi) of images with gps range measurements
CN106408601A (en) GPS-based binocular fusion positioning method and device
CN112740225B (en) Method and device for determining road surface elements
KR20210034253A (en) Method and device to estimate location
CN113031041A (en) Urban canyon integrated navigation and positioning method based on skyline matching
Bai et al. Real-time GNSS NLOS detection and correction aided by sky-pointing camera and 3D LiDAR
CN112461204B (en) Method for satellite to dynamic flying target multi-view imaging combined calculation of navigation height
US11460302B2 (en) Terrestrial observation device having location determination functionality
Guo et al. Accurate calibration of a self-developed vehicle-borne LiDAR scanning system
Chellappa et al. On the positioning of multisensor imagery for exploitation and target recognition
CN113296133A (en) Device and method for realizing position calibration based on binocular vision measurement and high-precision positioning fusion technology
CN113340272A (en) Ground target real-time positioning method based on micro-group of unmanned aerial vehicle
Gu et al. SLAM with 3dimensional-GNSS
Koppanyi et al. Experiences with acquiring highly redundant spatial data to support driverless vehicle technologies
Pritt et al. Automated georegistration of motion imagery
Gakne Improving the accuracy of GNSS receivers in urban canyons using an upward-facing camera
CN112050830B (en) Motion state estimation method and device
Wei Multi-sources fusion based vehicle localization in urban environments under a loosely coupled probabilistic framework
Ishii et al. Autonomous UAV flight using the Total Station Navigation System in Non-GNSS Environments

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant