CN110108984B - Spatial relationship synchronization method for multiple sensors of power line patrol laser radar system - Google Patents


Info

Publication number
CN110108984B
CN110108984B (application CN201910439825.6A)
Authority
CN
China
Prior art keywords
angle
deviation
coordinate system
flight
point
Prior art date
Legal status
Active
Application number
CN201910439825.6A
Other languages
Chinese (zh)
Other versions
CN110108984A (en)
Inventor
张兴华
李庭坚
张建刚
苏国磊
姜诚
张福
罗望春
陈佳乐
余德全
李翔
莫兵兵
Current Assignee
Maintenance and Test Center of Extra High Voltage Power Transmission Co
Original Assignee
Maintenance and Test Center of Extra High Voltage Power Transmission Co
Priority date
Filing date
Publication date
Application filed by Maintenance and Test Center of Extra High Voltage Power Transmission Co filed Critical Maintenance and Test Center of Extra High Voltage Power Transmission Co
Priority to CN201910439825.6A priority Critical patent/CN110108984B/en
Publication of CN110108984A publication Critical patent/CN110108984A/en
Application granted granted Critical
Publication of CN110108984B publication Critical patent/CN110108984B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G01C21/005 Navigation; navigational instruments with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/165 Dead reckoning by integrating acceleration or speed, i.e. inertial navigation, combined with non-inertial navigation instruments
    • G01C21/18 Stabilised platforms, e.g. by gyroscope
    • G01R31/085 Locating faults in cables, transmission lines or networks, in power transmission or distribution lines, e.g. overhead
    • G01S17/86 Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
    • G01S17/89 Lidar systems specially adapted for mapping or imaging
    • G01S19/14 Receivers specially adapted for specific applications

Abstract

The invention discloses a spatial relationship synchronization method for the multiple sensors of a power line patrol laser radar system, relating to the technical field of power line patrol and comprising the following steps. S1: fixedly integrating a laser radar, a GPS and an IMU on an aircraft. S2: using the aircraft to perform calibration-field test flights to acquire multi-sensor data, and obtaining the attitude placement parameters from the laser radar equipment to the inertial navigation coordinate system: the pitch placement angle deviation, the roll placement angle deviation and the yaw placement angle deviation. S3: calibrating the IMU, GPS and fixed equipment coordinate systems using the attitude placement parameters. S4: calibrating the spatial relationship between the digital camera and the laser point cloud, and solving the final laser point cloud to complete the spatial relationship synchronization of the digital camera and the laser point cloud.

Description

Spatial relationship synchronization method for multiple sensors of power line patrol laser radar system
Technical Field
The invention relates to the technical field of power inspection, in particular to a spatial relationship synchronization method for multiple sensors of a power line inspection laser radar system.
Background
In recent years, with the rapid construction of ultra-high-voltage and extra-high-voltage transmission lines in China, there are more and more high-voltage, high-power, long-distance transmission lines crossing increasingly complex terrain, and aerial line patrol has been adopted by many grid operating units in China as an efficient inspection technique.
Conventional aerial inspection mainly uses helicopters and unmanned aerial vehicles as platforms carrying devices such as a laser radar (lidar) system, a Global Positioning System (GPS) receiver, an Inertial Measurement Unit (IMU), and a digital camera. During data post-processing, inaccurate results arise from the uncalibrated spatial position relationships among these devices. At present the relative positions of the devices are fixed, but the accuracy of the data still cannot meet the requirements of power line inspection.
Disclosure of Invention
The invention aims to provide a spatial relationship synchronization method for the multiple sensors of a power line patrol laser radar system, so that the spatial relationships of the sensors are synchronized and the acquired data are more accurate.
In order to solve the above problem, a first aspect of the present invention provides a spatial relationship synchronization method for multiple sensors of a power line patrol lidar system, including:
S1: fixedly integrating a laser radar, a GPS and an IMU on an aircraft;
S2: using the aircraft to perform calibration-field test flights to acquire multi-sensor data, and obtaining the attitude placement parameters from the laser radar equipment to the inertial navigation coordinate system: the pitch placement angle deviation, roll placement angle deviation and yaw placement angle deviation;
S3: calibrating the IMU, GPS and fixed equipment coordinate systems using the attitude placement parameters;
S4: calibrating the spatial relationship between the digital camera and the laser point cloud, and solving the final laser point cloud.
In a further technical solution, step S2 specifically includes:
S21: the flight routes are planned in a '#'-shaped (cross-hatch) pattern, the same strip is flown in opposite directions, and the strip overlap is not less than 30%; on the premise that the ground objects are covered by the imaging field of view, the distance between the center points of selected reference buildings and the flight nadir is made as large as possible. To better correct the heading placement deviation, forward and backward flights are used, ensuring that the laser radar on opposite routes images both side edges of the buildings; the pitch placement angle deviation is:

Δφ = arctan( D / (2H) )

where D is the difference between the center positions of the same ground object obtained from the forward and backward flights, and H is the average flying height, the scanning heights of the forward and backward flights being essentially the same;
S22: after the pitch deviation has been separated, the pitch correction is applied and the laser point cloud data are regenerated, so that the point cloud is no longer affected by the pitch placement angle deviation. The roll placement angle deviation tilts a scanning foot-point line that should be horizontal on the horizontal plane. To separate the roll placement angle deviation, a flat linear ground feature is used together with an artificial calibration target and flown in opposite directions; the angle between the two groups of scan lines over the same ground object is the roll placement angle deviation Δω. The roll placement angle deviation is:

Δω = arctan( (z_L − z_R) / (2H·tanθ) )

where z_L − z_R is the elevation difference of the same artificial platform (same-name point) between the forward and backward flights, H is the flight height, and θ is the scanning angle;
S23: after the roll placement angle deviation has been separated, new point cloud data are recalculated; these data no longer contain the pitch or roll placement angle deviations, and the yaw placement angle deviation is then separated using this point cloud. The yaw placement angle deviation Δκ shifts the position of the center of the scanned object and deforms it in the horizontal direction. The deviation is separated using a pyramid-shaped calibration target with opposite-direction or one-way repeated flight data; the yaw placement angle deviation is:

Δκ = arctan( S / (2D) )

where S is the offset between the positions of point A in the two flights, and D is the distance from point A to the aircraft nadir.
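As a minimal computational sketch of the three placement-angle formulas above (assuming the arctangent forms as given; all function and variable names are illustrative, not taken from the patent), the deviations can be evaluated as follows:

```python
import math

def pitch_placement_deviation(d_center: float, flight_height: float) -> float:
    """Pitch placement angle deviation from the offset D of the same ground
    object between forward and backward flights at average flying height H:
    delta_phi = arctan(D / (2H)).  Result in radians."""
    return math.atan2(d_center, 2.0 * flight_height)

def roll_placement_deviation(dz: float, flight_height: float, scan_angle: float) -> float:
    """Roll placement angle deviation from the elevation difference z_L - z_R
    of a same-name point between opposite flights, flight height H and
    scanning angle theta: delta_omega = arctan((z_L - z_R) / (2H tan(theta)))."""
    return math.atan2(dz, 2.0 * flight_height * math.tan(scan_angle))

def yaw_placement_deviation(s_offset: float, d_nadir: float) -> float:
    """Yaw placement angle deviation from the horizontal offset S of point A
    between the two flights and its distance D from the nadir:
    delta_kappa = arctan(S / (2D))."""
    return math.atan2(s_offset, 2.0 * d_nadir)
```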
In a further technical solution, step S3 specifically includes:
converting the equipment coordinate system into the inertial navigation coordinate system, with the conversion relation:

X_I = R · X_E + T

where R and T are the rotation matrix and the coordinate offset of the equipment in the inertial navigation coordinate system, obtained from X_GPS, the coordinates of the IMU in the GPS space coordinate system, and from R(Δφ, Δω, Δκ), the attitude placement parameters from the laser radar equipment to the inertial navigation coordinate system, comprising the pitch, roll and yaw placement angle deviations; X_E and X_I are the coordinates of the equipment point in the equipment and inertial navigation coordinate systems respectively.
In a further technical solution, step S4 specifically includes:
S41: calculating the placement angle between the digital camera and the IMU. The process is as follows: for laser points solved near the aircraft nadir, the influence of the laser radar scanning angle can be neglected; the distance computed from the GPS coordinates of the equipment and the GPS-measured coordinates of the control point is taken as the laser radar ranging value, and the position deviation of the solved control point is attributed to errors in the placement parameters. Depending on the accuracy requirement, the position placement parameters may be used directly, the angle placement parameters may be calculated, or an iterative calculation may be performed to improve the accuracy of the placement parameters;

the angle placement parameters are obtained by converting the GPS-measured coordinates of the control points at the aircraft nadir and the laser-radar-solved coordinates into the inertial navigation coordinate system and calculating the rotation angle between them, which is the equipment placement angle; the laser radar placement angle values are solved using a Rodrigues matrix of the form:

R = (I + S)(I − S)⁻¹

where I is the identity matrix and S is the antisymmetric matrix formed from the three independent rotation parameters;
S42: converting the pixel coordinates of the digital image into space coordinates, with the conversion model (the collinearity equations):

x = x0 − f · [ a1(Xi − Xs) + b1(Yi − Ys) + c1(Zi − Zs) ] / [ a3(Xi − Xs) + b3(Yi − Ys) + c3(Zi − Zs) ] + Δx

y = y0 − f · [ a2(Xi − Xs) + b2(Yi − Ys) + c2(Zi − Zs) ] / [ a3(Xi − Xs) + b3(Yi − Ys) + c3(Zi − Zs) ] + Δy

where: x and y are the coordinates of the image point in the image space coordinate system; x0, y0 is the principal point; f is the focal length; Xi, Yi, Zi are the spatial coordinates corresponding to the image pixel; Xs, Ys, Zs are the translation parameters; ai, bi, ci are the rotation matrix elements, described by 3 attitude angles, the 3 attitude angles being the small-angle parameters to be solved;

Δx and Δy are the image point corrections, i.e. the camera distortion correction for the radial and tangential distortion of the lens, with the distortion correction model:

Δx = (x − x0)(k1·r² + k2·r⁴) + p1·[ r² + 2(x − x0)² ] + 2p2·(x − x0)(y − y0)

Δy = (y − y0)(k1·r² + k2·r⁴) + p2·[ r² + 2(y − y0)² ] + 2p1·(x − x0)(y − y0)

where: Δx, Δy are the image point corrections; x and y are the coordinates of the image point in the image space coordinate system; x0, y0 is the principal point;

r² = (x − x0)² + (y − y0)², r being the distance from the image point to the principal point, k1, k2 the radial distortion coefficients and p1, p2 the tangential distortion coefficients.
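The sketch below shows, under assumed conventions (an omega-phi-kappa rotation sequence, a rotation taken from object space to image space, and the distortion form above; all function names are illustrative), how a ground point could be projected into image coordinates with the collinearity equations and the distortion correction:

```python
import numpy as np

def rotation_from_angles(omega: float, phi: float, kappa: float) -> np.ndarray:
    """Rotation matrix whose elements a_i, b_i, c_i are described by the three
    attitude angles (an omega-phi-kappa sequence is assumed here)."""
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(omega), -np.sin(omega)],
                   [0, np.sin(omega),  np.cos(omega)]])
    Ry = np.array([[ np.cos(phi), 0, np.sin(phi)],
                   [0, 1, 0],
                   [-np.sin(phi), 0, np.cos(phi)]])
    Rz = np.array([[np.cos(kappa), -np.sin(kappa), 0],
                   [np.sin(kappa),  np.cos(kappa), 0],
                   [0, 0, 1]])
    return Rx @ Ry @ Rz

def lens_distortion(x, y, x0, y0, k1, k2, p1, p2):
    """Radial + tangential distortion corrections (dx, dy) for image point (x, y)."""
    xb, yb = x - x0, y - y0
    r2 = xb * xb + yb * yb
    dx = xb * (k1 * r2 + k2 * r2 ** 2) + p1 * (r2 + 2 * xb * xb) + 2 * p2 * xb * yb
    dy = yb * (k1 * r2 + k2 * r2 ** 2) + p2 * (r2 + 2 * yb * yb) + 2 * p1 * xb * yb
    return dx, dy

def ground_to_image(X, R, Xs, f, x0, y0, dist=(0.0, 0.0, 0.0, 0.0)):
    """Collinearity projection of ground point X into image coordinates,
    with R taken as the rotation from object space to image space."""
    u = R @ (np.asarray(X, float) - np.asarray(Xs, float))
    x = x0 - f * u[0] / u[2]
    y = y0 - f * u[1] / u[2]
    dx, dy = lens_distortion(x, y, x0, y0, *dist)
    return x + dx, y + dy
```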
the working principle of the invention is introduced: lidar (light laser Detection and ranging) is a short term for laser Detection and ranging systems.
Radar using a laser as a radiation source. Lidar is a product of a combination of laser technology and radar technology. The device consists of a transmitter, an antenna, a receiver, a tracking frame, information processing and the like. The transmitter is various lasers, such as a carbon dioxide laser, a neodymium-doped yttrium aluminum garnet laser, a semiconductor laser, a wavelength tunable solid laser and the like; the antenna is an optical telescope; the receiver employs various forms of photodetectors such as photomultiplier tubes, semiconductor photodiodes, avalanche photodiodes, infrared and visible light multiplexed detection devices, and the like. The laser radar adopts 2 working modes of pulse or continuous wave, and the detection method includes direct detection and heterodyne detection.
Since the first photograph was taken by Daguerre and Niépce in 1839, photographs have been used to produce planimetric maps (X, Y). By 1901 Fourcade had invented stereoscopic photogrammetric observation, making it possible to obtain three-dimensional ground data (X, Y, Z) from two-dimensional photographs. For the past hundred years, stereophotogrammetry has remained the most accurate and reliable technique for acquiring three-dimensional ground data, and it is an important technique for surveying and mapping national basic-scale topographic maps.
LIDAR is a system that integrates laser, Global Positioning System (GPS) and Inertial Navigation System (INS) technologies to acquire data and generate accurate digital elevation models (DEMs). The combination of these three technologies makes it possible to locate the footprint of the laser beam on the object with high accuracy. Such systems are divided into topographic LIDAR systems, now increasingly mature, for obtaining ground DEMs, and hydrographic LIDAR systems, already mature and in use, for obtaining underwater DEMs; what they have in common is the use of laser light for detection and measurement, which is exactly what the word LIDAR stands for: Light Detection And Ranging.
The laser has very accurate ranging capability, with ranging accuracy of a few centimetres; the accuracy of a LIDAR system depends not only on the laser itself but also on intrinsic factors such as the synchronization of the laser, the GPS and the Inertial Measurement Unit (IMU). With the development of commercial GPS and IMU technology, obtaining high-precision data from moving platforms (e.g. aircraft) with LIDAR has become feasible and widely used.
The LIDAR system includes a single-beam narrowband laser and a receiving system. The laser generates and emits a light pulse, which strikes the object, is reflected back and is finally received by the receiver. The receiver accurately measures the travel time of the light pulse from emission to return. Because light pulses travel at the speed of light, the receiver always receives the reflection of the previous pulse before the next pulse is sent out. Given that the speed of light is known, the travel time can be converted into a distance measurement. Combining the height of the laser, the laser scanning angle, the position of the laser obtained from GPS and the laser emission direction obtained from the INS, the coordinates X, Y and Z of each ground spot can be calculated accurately. The laser pulse rate can range from a few pulses per second to tens of thousands of pulses per second; for example, a system operating at ten thousand pulses per second records six hundred thousand points in one minute. In general, the ground spot spacing of LIDAR systems varies from 2 to 4 m.
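As a simplified illustration of the range computation just described (a sketch only: it assumes a level platform and a purely across-track scan, and ignores the full attitude/georeferencing chain; the names are illustrative):

```python
import math

C = 299_792_458.0  # speed of light in vacuum, m/s

def pulse_range(travel_time_s: float) -> float:
    """Convert the two-way travel time of a laser pulse into a one-way range."""
    return C * travel_time_s / 2.0

def ground_point(sensor_xyz, heading_rad, scan_angle_rad, rng):
    """Footprint coordinates for an idealised level platform: the beam is
    deflected across track by the scan angle, otherwise it points straight down."""
    x0, y0, z0 = sensor_xyz
    horiz = rng * math.sin(scan_angle_rad)   # across-track horizontal offset
    drop = rng * math.cos(scan_angle_rad)    # vertical component of the range
    # across-track direction taken 90 degrees to the right of the heading
    x = x0 + horiz * math.cos(heading_rad + math.pi / 2)
    y = y0 + horiz * math.sin(heading_rad + math.pi / 2)
    return x, y, z0 - drop
```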
Lidar is a radar system operating in the infrared-to-ultraviolet spectral range, with a principle and construction very similar to a laser rangefinder. Detection with laser pulses is referred to as pulsed lidar, and detection with continuous-wave laser beams as continuous-wave lidar. Lidar is used to measure the position (distance and angle), motion state (velocity, vibration and attitude) and shape of a target accurately, and to detect, identify, discriminate and track targets. Over years of effort, scientists have developed fire-control lidar, detection lidar, missile-guidance lidar, target-range measurement lidar, navigation lidar and so on.
An Inertial Measurement Unit (IMU) is a device that measures the three-axis attitude angles (or angular rates) and the acceleration of an object. In general an IMU contains three single-axis accelerometers and three single-axis gyroscopes; the accelerometers detect the acceleration signals of the object along three independent axes of the carrier coordinate system, while the gyroscopes detect the angular rate of the carrier relative to the navigation coordinate system. From the angular rate and acceleration of the object in three-dimensional space, its attitude can then be solved, which is of great value in navigation.
To increase reliability, additional sensors may be provided for each axis. The IMU is generally mounted at the centre of gravity of the object being measured.
IMUs are mostly used in devices requiring motion control, such as automobiles and robots. They are also used where attitude is needed for precise displacement reckoning, such as the inertial navigation equipment of submarines, aircraft, missiles and spacecraft.
The drawback of a geomagnetic sensor is that its absolute reference is the field line of the Earth's magnetic field, which covers a wide area but has low strength (only a few tenths of a gauss) and is very easily disturbed by other magnetic objects; fusing in the instantaneous angle from the Z-axis gyroscope makes the system data more stable. Acceleration is measured along the gravity direction: in the absence of external accelerations, the ROLL/PITCH attitude angles can be output accurately, and these angles have no accumulated error and remain accurate over longer time scales. The disadvantage of measuring angles with an accelerometer is that it actually uses MEMS technology to detect the tiny deformation caused by inertial force; since inertial force is indistinguishable from gravity, the accelerometer cannot separate gravitational acceleration from external acceleration, and when the system undergoes accelerated motion in three-dimensional space its output is incorrect.
The angular velocity output by the gyroscope is an instantaneous quantity and cannot be used directly for attitude stabilisation; it must be integrated over time to obtain an angle, and the resulting angle increment is added to the initial angle to obtain the target angle. The smaller the integration interval Dt, the more accurate the output angle. However, by its very principle the gyroscope measures only relative to itself, with no absolute reference outside the system, and Dt cannot be made infinitely small, so the accumulated integration error grows rapidly with time and the output angle eventually diverges from reality; a gyroscope can therefore only work over relatively short time scales.
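The gyro/accelerometer trade-off just described is commonly handled with a complementary filter; the sketch below only illustrates that idea (the blending gain alpha and all names are illustrative, not taken from the patent):

```python
import math

def accel_roll_pitch(ax, ay, az):
    """Roll and pitch (radians) from the measured gravity direction; only valid
    when external accelerations are negligible, as noted above."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def complementary_update(roll, pitch, gyro_p, gyro_q, accel_xyz, dt, alpha=0.98):
    """One filter step: propagate with the gyro rates (accurate short-term but
    drifting), then blend in the accelerometer angles (noisy but drift-free)."""
    roll_gyro = roll + gyro_p * dt
    pitch_gyro = pitch + gyro_q * dt
    roll_acc, pitch_acc = accel_roll_pitch(*accel_xyz)
    roll_new = alpha * roll_gyro + (1.0 - alpha) * roll_acc
    pitch_new = alpha * pitch_gyro + (1.0 - alpha) * pitch_acc
    return roll_new, pitch_new
```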
The inertial measurement unit (IMU) belongs to strapdown inertial navigation. The system consists of two acceleration sensors and three rate sensors (gyros); the accelerometers sense the acceleration components of the aircraft relative to the local vertical, while the rate sensors sense the aircraft's angular information. The sub-assembly mainly comprises two A/D converters (AD7716BS) and a 64K E²PROM memory (X25650). The A/D converters take the analogue signals from the IMU sensors, convert them into digital form and, after computation by a CPU, output the pitch angle, bank angle and sideslip angle. The E²PROM mainly stores the linearity calibration curves of the IMU sensors together with their part numbers and serial numbers; at start-up, the processing unit reads the linearity parameters from the E²PROM to provide initial information for the subsequent angle computation.
GPS, the Global Positioning System, is a new-generation satellite navigation and positioning system developed in the United States beginning in the 1970s and fully deployed in 1994, with all-weather, real-time, three-dimensional navigation and positioning capability at sea, on land and in the air. GPS consists of three segments: the space constellation, the ground control segment and the user equipment. GPS measurement can quickly, efficiently and accurately provide three-dimensional coordinates of point, line and area features and other related information; it is all-weather, high-precision, automated and cost-effective, and is widely applied in fields as diverse as military use, civil navigation (ships, aircraft, vehicles, etc.), geodesy, photogrammetry, field survey and exploration, land-use survey, precision agriculture, and daily life (personnel tracking, leisure and entertainment). Combined with modern communication technology, GPS has moved the measurement of surface three-dimensional coordinates from static to dynamic and from data post-processing to real-time positioning and navigation, greatly extending its range and depth of application. Carrier-phase differential GPS greatly improves relative positioning accuracy, reaching centimetre level over small areas. In addition, GPS measurement places far more flexible requirements than conventional methods on inter-station visibility and network geometry, and can be used for control networks of every order. GPS total-station-style receivers are widely applied in topographic and land surveying, all kinds of engineering work, and deformation and subsidence monitoring, showing great advantages in accuracy, efficiency and cost.
The basic positioning principle of GPS is as follows: each satellite continuously broadcasts its ephemeris parameters and time information; after receiving this information, the user's receiver computes its own three-dimensional position, velocity and time.
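A minimal sketch of this positioning principle (receiver position and clock bias estimated from pseudoranges to satellites at known positions by Gauss-Newton least squares; atmospheric and relativistic corrections are ignored, and the function is illustrative rather than part of the patent):

```python
import numpy as np

def solve_position(sat_pos_ecef, pseudoranges, iters=10):
    """Estimate receiver ECEF position (x, y, z) and clock-bias range c*dt
    from pseudoranges to N >= 4 satellites with known ECEF positions (N x 3)."""
    sats = np.asarray(sat_pos_ecef, float)
    rho = np.asarray(pseudoranges, float)
    state = np.zeros(4)                       # x, y, z, c*dt; start at Earth's centre
    for _ in range(iters):
        diff = state[:3] - sats
        dist = np.linalg.norm(diff, axis=1)
        residual = rho - (dist + state[3])    # measured minus predicted pseudorange
        # Jacobian of the predicted pseudorange w.r.t. (x, y, z, c*dt)
        J = np.hstack([diff / dist[:, None], np.ones((len(rho), 1))])
        delta, *_ = np.linalg.lstsq(J, residual, rcond=None)
        state += delta
        if np.linalg.norm(delta) < 1e-4:      # converged
            break
    return state[:3], state[3]
```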
The technical scheme of the invention has the following beneficial technical effects: the spatial relationship of the multiple sensors of the power line patrol laser radar system is synchronized, so that more accurate data can be obtained, and the calculation of subsequent point cloud data is facilitated.
Drawings
Fig. 1 is an internal integrated installation layout diagram of a lidar power patrol nacelle according to embodiment 1 of the invention;
fig. 2 is a schematic view of the laser radar pitch placement angle calibration according to embodiment 1 of the present invention;
fig. 3 is a schematic diagram of the laser radar roll placement angle calibration according to embodiment 1 of the present invention;
fig. 4 is a schematic view of the laser radar yaw placement angle calibration according to embodiment 1 of the present invention;
fig. 5 is a flowchart of the laser radar placement angle calibration according to embodiment 1 of the present invention.
Reference numerals: 1: an aircraft; 2: a laser radar; 3: a digital camera; 4: an inertial measurement unit.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail with reference to the accompanying drawings in conjunction with the following detailed description. It should be understood that the description is intended to be exemplary only, and is not intended to limit the scope of the present invention. Moreover, in the following description, descriptions of well-known structures and techniques are omitted so as to not unnecessarily obscure the concepts of the present invention.
In order to realize spatial relationship synchronization of the multiple sensors of the power line patrol laser radar system, the system adopts an integral integration scheme in which the multiple sensors are rigidly integrated. Fig. 1 shows the internal integrated installation layout of the laser radar power patrol nacelle.
The spatial relationships among the multiple sensors of the integrally integrated power line patrol laser radar system are fixed, which ensures that these spatial relationships remain stable after a single flight calibration and provides the basis for realizing the spatial relationship synchronization of the multiple sensors.
Using the multi-sensor data acquired in a single calibration-field test flight, the spatial relationships among the multiple sensors of the power line patrol laser radar system are calibrated synchronously; this comprises calibrating the spatial coordinate relationship among the laser radar, the GPS and the IMU, and calibrating the spatial relationship between the digital camera and the laser point cloud.
1) IMU, GPS and fixed equipment coordinate system calibration
The equipment coordinate system is a right-handed rectangular coordinate system with the laser emission point of the equipment as the origin, the laser scanning direction as the Y axis, the direction of platform motion as the X axis, and the Z axis pointing downwards. The IMU uses the inertial navigation coordinate system, also called the north-up-east coordinate system, a local coordinate system that changes with the local north direction in different regions of the Earth. The GPS uses a global geographic coordinate system, with elevation given as the normal height or height above sea level; it can be converted by projection into a space rectangular coordinate system, namely the WGS-84 coordinate system.
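The conversion from GPS geographic coordinates into the WGS-84 space rectangular (ECEF) system mentioned above can be sketched with the standard closed-form formula below; the sketch assumes an ellipsoidal height is used and ignores the separation between normal height and ellipsoidal height:

```python
import math

# WGS-84 ellipsoid constants
WGS84_A = 6378137.0                      # semi-major axis, m
WGS84_F = 1.0 / 298.257223563            # flattening
WGS84_E2 = WGS84_F * (2.0 - WGS84_F)     # first eccentricity squared

def geodetic_to_ecef(lat_deg: float, lon_deg: float, h: float):
    """Geodetic latitude/longitude (degrees) and ellipsoidal height (m) to
    WGS-84 Earth-centered, Earth-fixed rectangular coordinates (m)."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    n = WGS84_A / math.sqrt(1.0 - WGS84_E2 * math.sin(lat) ** 2)  # prime vertical radius
    x = (n + h) * math.cos(lat) * math.cos(lon)
    y = (n + h) * math.cos(lat) * math.sin(lon)
    z = (n * (1.0 - WGS84_E2) + h) * math.sin(lat)
    return x, y, z
```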
The equipment and inertial navigation coordinate systems are denoted E-XYZ_e0 and I-XYZ_i0 respectively, and the initial attitude rotation of their fixed relation in the geodetic coordinate system is denoted R_0. The transformation of the equipment coordinate system E-XYZ_e0 into the inertial navigation coordinate system I-XYZ_i0 can be expressed as:

X_i0 = R · x_e0 + T

where x_e0 and X_i0 are the coordinates of the equipment point in the equipment and inertial navigation coordinate systems respectively, and R and T are the rotation matrix and the coordinate offset of the equipment in the inertial navigation coordinate system. These are built up from: X_GPS, the coordinates of the IMU in the GPS space coordinate system; R(Δφ, Δω, Δκ), the attitude placement parameters (pitch, roll and yaw placement angles) from the laser radar equipment to the IMU coordinate system; T_M, the position placement parameters from the laser radar equipment to the IMU coordinate system, which are obtained by accurate measurement when the equipment is integrated; and the placement parameters from the IMU to the GPS coordinate system, which can be measured directly when the system is installed. The overall calibration procedure for the laser radar placement angles is shown in fig. 5. The attitude placement parameters are obtained as follows:
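A sketch of the usual georeferencing chain implied by this description, under assumptions: the placement-angle rotation and lever arm take a point from the device frame into the IMU frame, then the IMU attitude and IMU/GPS position take it into the navigation frame. The rotation order, frame conventions and all names here are illustrative rather than taken from the patent:

```python
import numpy as np

def rot_zyx(roll: float, pitch: float, yaw: float) -> np.ndarray:
    """Rotation matrix from roll/pitch/yaw (radians), applied in yaw-pitch-roll order."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def device_to_nav(x_dev, boresight_rpy, lever_arm, imu_rpy, imu_pos_nav):
    """Map a point from the device (laser) frame into the inertial navigation
    frame: placement-angle rotation and lever arm into the IMU frame, then
    IMU attitude and IMU/GPS position into the navigation frame."""
    R_bore = rot_zyx(*boresight_rpy)        # device -> IMU (placement angles)
    R_imu = rot_zyx(*imu_rpy)               # IMU -> navigation frame (attitude)
    x_imu = R_bore @ np.asarray(x_dev, float) + np.asarray(lever_arm, float)
    return R_imu @ x_imu + np.asarray(imu_pos_nav, float)
```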
1. Pitch placement angle
As shown in fig. 2, when the pitch placement angle is calibrated the flight routes are planned in a '#'-shaped pattern, the same strip is flown in opposite directions, the strip overlap is not less than 30%, and, on the premise that the ground objects are covered by the imaging field of view, the distance between the center points of selected reference buildings and the flight nadir is made as large as possible. To better correct the heading placement deviation, forward and backward flights are used, ensuring that the laser radar on opposite routes images both side edges of the buildings. The pitch placement angle deviation is:

Δφ = arctan( D / (2H) )

where D is the difference between the center positions of the same ground object obtained from the forward and backward flights, and H is the average flying height, assuming the scanning heights of the forward and backward flights are essentially the same.
2. Roll placement angle
Fig. 3 shows a schematic diagram of the laser radar roll placement angle calibration. After the pitch deviation has been separated, the pitch correction is applied and the laser point cloud data are regenerated, so that the point cloud is no longer affected by the pitch placement angle deviation. The roll placement angle deviation tilts a scanning foot-point line that should be horizontal on the horizontal plane. To separate the roll placement angle deviation, a flat linear ground feature is used together with an artificial calibration target and flown in opposite directions, and the angle between the two groups of scan lines over the same ground object is the roll placement angle deviation Δω. The roll placement angle deviation is:

Δω = arctan( (z_L − z_R) / (2H·tanθ) )

where z_L − z_R is the elevation difference of the same artificial platform (same-name point) between the forward and backward flights, H is the flight height, and θ is the scanning angle.
3. Yaw placement angle
Fig. 4 shows a schematic diagram of the laser radar yaw placement angle calibration. After the roll placement angle deviation has been separated, new point cloud data are recalculated; these data no longer contain the pitch or roll placement angle deviations, and the yaw placement angle deviation is then separated using this point cloud. The yaw placement angle deviation Δκ shifts the position of the center of the scanned object and deforms it in the horizontal direction. The deviation can be separated using a pyramid-shaped calibration target with opposite-direction or one-way repeated flight data. The yaw placement angle deviation is:

Δκ = arctan( S / (2D) )

where S is the offset between the positions of point A in the two flights, and D is the distance between point A and the flight nadir.
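As a quick numerical illustration of these three formulas (all observation values below are hypothetical, chosen only to show the order of magnitude of typical placement angle deviations):

```python
import math

# Hypothetical calibration-flight observations (illustrative values only)
D, H = 1.2, 300.0                        # 1.2 m center offset at 300 m flying height
dz, theta = 0.25, math.radians(20.0)     # 0.25 m elevation difference at a 20 deg scan angle
S, D_nadir = 0.8, 150.0                  # 0.8 m offset of point A, 150 m from the nadir

delta_phi = math.degrees(math.atan(D / (2 * H)))                       # pitch deviation
delta_omega = math.degrees(math.atan(dz / (2 * H * math.tan(theta))))  # roll deviation
delta_kappa = math.degrees(math.atan(S / (2 * D_nadir)))               # yaw deviation

print(f"pitch placement deviation ~ {delta_phi:.3f} deg")
print(f"roll placement deviation  ~ {delta_omega:.3f} deg")
print(f"yaw placement deviation   ~ {delta_kappa:.3f} deg")
```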
2) Digital camera and laser point cloud spatial relationship calibration
A direct spatial relationship can be established for the digital image acquired by the digital camera through ground control points or laser point cloud feature points; the conversion model between the pixel coordinates of the digital image and the corresponding spatial coordinates is as follows:
x = x0 − f · [ a1(Xi − Xs) + b1(Yi − Ys) + c1(Zi − Zs) ] / [ a3(Xi − Xs) + b3(Yi − Ys) + c3(Zi − Zs) ] + Δx

y = y0 − f · [ a2(Xi − Xs) + b2(Yi − Ys) + c2(Zi − Zs) ] / [ a3(Xi − Xs) + b3(Yi − Ys) + c3(Zi − Zs) ] + Δy

where: x and y are the coordinates of the image point in the image space coordinate system; x0, y0 is the principal point; f is the focal length; Xi, Yi, Zi are the spatial coordinates corresponding to the image pixel; Xs, Ys, Zs are the translation parameters; ai, bi, ci are the rotation matrix elements, which can be described by 3 attitude angles, the 3 attitude angles being the small-angle parameters to be solved.

Δx and Δy are the image point corrections, i.e. the camera distortion correction for the radial and tangential distortion of the lens, with the distortion correction model:

Δx = (x − x0)(k1·r² + k2·r⁴) + p1·[ r² + 2(x − x0)² ] + 2p2·(x − x0)(y − y0)

Δy = (y − y0)(k1·r² + k2·r⁴) + p2·[ r² + 2(y − y0)² ] + 2p1·(x − x0)(y − y0)

where: Δx, Δy are the image point corrections; x and y are the coordinates of the image point in the image space coordinate system; x0, y0 is the principal point;

r² = (x − x0)² + (y − y0)², r being the distance from the image point to the principal point, k1, k2 the radial distortion coefficients and p1, p2 the tangential distortion coefficients.
because the camera and the IMU adopt different angle measurement systems, in order to directly use the attitude angle directly acquired by the IMU, the arrangement angle between the camera and the IMU needs to be determined first, and the arrangement calculation process is as follows:
the laser point cloud resolved on (near) the aircraft bottom point is influenced by the scanning angle of the laser radar to be negligible, the GPS coordinate of the equipment and the GPS measurement coordinate of the control point are taken to calculate the distance to be used as the ranging value of the laser radar, and the deviation of the resolved control point position is caused by the error of the setting parameter. According to the precision requirement, the position setting parameters can be directly used, the angle setting parameters can be calculated, or iterative calculation can be carried out, so that the precision of the setting parameters is improved.
The angle placement parameters are obtained by converting the GPS-measured coordinates of a control point at the aircraft nadir and the laser-radar-solved coordinates (the distance from the laser radar GPS position to the control point is used as the range measurement) into the local inertial coordinate system, and computing the rotation angle between them, which is the equipment placement angle. The laser radar placement angle values are solved using a Rodrigues matrix of the form:

R = (I + S)(I − S)⁻¹

where I is the identity matrix and S is the antisymmetric matrix formed from the three independent rotation parameters.
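The sketch below constructs such a rotation matrix from three independent parameters via the Cayley form stated above; the sign convention of the antisymmetric matrix entries is an assumption:

```python
import numpy as np

def rodrigues_matrix(a: float, b: float, c: float) -> np.ndarray:
    """Rotation matrix from three independent Rodrigues parameters,
    R = (I + S)(I - S)^-1 with S antisymmetric; R is orthogonal by construction."""
    S = np.array([[0.0,  -c,   b],
                  [ c,  0.0,  -a],
                  [-b,   a,  0.0]])
    I = np.eye(3)
    return (I + S) @ np.linalg.inv(I - S)

# quick self-check: R should satisfy R @ R.T == I (up to rounding)
R = rodrigues_matrix(0.01, -0.02, 0.005)
assert np.allclose(R @ R.T, np.eye(3))
```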
the synchronization of the space coordinate relationship between the digital camera image and the laser point cloud is realized through settlement.
It is to be understood that the above-described embodiments of the present invention merely illustrate or explain the principles of the invention and are not to be construed as limiting the invention. Therefore, any modification, equivalent replacement, improvement and the like made without departing from the spirit and scope of the present invention should be included in the protection scope of the present invention. Further, it is intended that the appended claims cover all such variations and modifications as fall within the scope and boundaries of the appended claims or the equivalents of such scope and boundaries.

Claims (3)

1. A spatial relationship synchronization method for multiple sensors of a power line patrol laser radar system, characterized by comprising the following steps:
S1: fixedly integrating a laser radar, a GPS and an IMU on an aircraft;
S2: using the aircraft to perform calibration-field test flights to acquire multi-sensor data, and obtaining the attitude placement parameters from the laser radar equipment to the inertial navigation coordinate system: the pitch placement angle deviation, roll placement angle deviation and yaw placement angle deviation;
S3: calibrating the IMU, GPS and fixed equipment coordinate systems using the attitude placement parameters;
S4: calibrating the spatial relationship between the digital camera and the laser point cloud, and solving the final laser point cloud to complete the spatial relationship synchronization of the digital camera and the laser point cloud;
wherein step S2 further includes:
S21: the flight routes of the aircraft are planned in a '#'-shaped pattern, the same strip is flown in opposite directions, the strip overlap is not less than 30%, and a certain distance is kept between the center points of selected reference buildings and the flight nadir on the premise that the ground objects are covered by the imaging field of view; laser radar imaging data of opposite routes are acquired by forward and backward flights of the aircraft, obtaining information on both side edges of the buildings; the pitch placement angle deviation is:

Δφ = arctan( D / (2H) )

where D is the difference between the center positions of the same ground object obtained from the forward and backward flights, and H is the average flying height, the scanning heights of the forward and backward flights being consistent;
S22: after the pitch deviation has been separated, the pitch correction is applied and the laser point cloud data are regenerated, so that the point cloud is no longer affected by the pitch placement angle deviation; the roll placement angle deviation tilts a scanning foot-point line that should be horizontal on the horizontal plane; using a flat linear ground feature together with an artificial calibration target and flying in opposite directions, the angle between the two groups of scan lines over the same ground object is obtained as the roll placement angle deviation Δω; the roll placement angle deviation is:

Δω = arctan( (z_L − z_R) / (2H·tanθ) )

where z_L − z_R is the elevation difference of the same artificial platform between the forward and backward flights, H is the flight height, and θ is the scanning angle;
S23: after the roll placement angle deviation has been separated, new point cloud data are recalculated; these data no longer contain the pitch or roll placement angle deviations, and the yaw placement angle deviation is separated using this point cloud; the yaw placement angle deviation Δκ shifts the position of the center of the scanned object, so that it deforms in the horizontal direction; the deviation is separated using a pyramid-shaped calibration target with opposite-direction or one-way repeated flight data; the yaw placement angle deviation is:

Δκ = arctan( S / (2D) )

where S is the offset between the positions of point A in the two flights, and D is the distance between point A and the aircraft nadir.
2. The spatial relationship synchronization method for multiple sensors of a power line patrol lidar system according to claim 1, wherein the step S3 specifically comprises:
converting the equipment coordinate system into the inertial navigation coordinate system, with the conversion relation:

X_I = R · X_E + T

where R and T are the rotation matrix and the coordinate offset of the equipment in the inertial navigation coordinate system, obtained from X_GPS, the coordinates of the IMU in the GPS space coordinate system, and from R(Δφ, Δω, Δκ), the attitude placement parameters from the laser radar equipment to the inertial navigation coordinate system, comprising the pitch, roll and yaw placement angle deviations; X_E and X_I are the coordinates of the equipment point in the equipment and inertial navigation coordinate systems respectively.
3. The spatial relationship synchronization method for multiple sensors of a power line patrol lidar system according to claim 2, wherein the step S4 specifically comprises:
S41: calculating the placement angle between the digital camera and the IMU; the process is as follows: for laser points solved near the aircraft nadir, the influence of the laser radar scanning angle can be neglected; the distance computed from the GPS coordinates of the equipment and the GPS-measured coordinates of the control point is taken as the laser radar ranging value, and the position deviation of the solved control point is attributed to errors in the placement parameters; depending on the accuracy requirement, the position placement parameters are used directly, the angle placement parameters are calculated, or an iterative calculation is performed, so that the accuracy of the placement parameters is improved;
the angle placement parameters are obtained by converting the GPS-measured coordinates of the control points at the aircraft nadir and the laser-radar-solved coordinates into the inertial navigation coordinate system and calculating the rotation angle between them, which is the equipment placement angle; the laser radar placement angle values are solved using a Rodrigues matrix;
S42: converting the pixel coordinates of the digital image into space coordinates, with the conversion model:

x = x0 − f · [ a1(Xi − Xs) + b1(Yi − Ys) + c1(Zi − Zs) ] / [ a3(Xi − Xs) + b3(Yi − Ys) + c3(Zi − Zs) ] + Δx

y = y0 − f · [ a2(Xi − Xs) + b2(Yi − Ys) + c2(Zi − Zs) ] / [ a3(Xi − Xs) + b3(Yi − Ys) + c3(Zi − Zs) ] + Δy

where: x and y are the coordinates of the image point in the image space coordinate system; x0, y0 is the principal point; f is the focal length; Xi, Yi, Zi are the spatial coordinates corresponding to the image pixel; Xs, Ys, Zs are the translation parameters; ai, bi, ci are the rotation matrix elements, which can be described by 3 attitude angles, the 3 attitude angles being the small-angle parameters to be solved;

Δx and Δy are the image point corrections, i.e. the camera distortion correction for the radial and tangential distortion of the lens, with the distortion correction model:

Δx = (x − x0)(k1·r² + k2·r⁴) + p1·[ r² + 2(x − x0)² ] + 2p2·(x − x0)(y − y0)

Δy = (y − y0)(k1·r² + k2·r⁴) + p2·[ r² + 2(y − y0)² ] + 2p1·(x − x0)(y − y0)

where: Δx, Δy are the image point corrections; x and y are the coordinates of the image point in the image space coordinate system; x0, y0 is the principal point;

r² = (x − x0)² + (y − y0)², where r is the distance from the image point to the principal point, k1, k2 are the radial distortion coefficients and p1, p2 are the tangential distortion coefficients.
CN201910439825.6A 2019-05-24 2019-05-24 Spatial relationship synchronization method for multiple sensors of power line patrol laser radar system Active CN110108984B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910439825.6A CN110108984B (en) 2019-05-24 2019-05-24 Spatial relationship synchronization method for multiple sensors of power line patrol laser radar system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910439825.6A CN110108984B (en) 2019-05-24 2019-05-24 Spatial relationship synchronization method for multiple sensors of power line patrol laser radar system

Publications (2)

Publication Number Publication Date
CN110108984A CN110108984A (en) 2019-08-09
CN110108984B true CN110108984B (en) 2021-07-16

Family

ID=67492085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910439825.6A Active CN110108984B (en) 2019-05-24 2019-05-24 Spatial relationship synchronization method for multiple sensors of power line patrol laser radar system

Country Status (1)

Country Link
CN (1) CN110108984B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112396662A (en) * 2019-08-13 2021-02-23 杭州海康威视数字技术股份有限公司 Method and device for correcting conversion matrix
CN110764117B (en) * 2019-10-31 2022-10-11 成都圭目机器人有限公司 Method for calibrating relative position of detection robot antenna and sensor based on total station
CN111856492B (en) * 2020-06-22 2021-04-23 北京驭光科技发展有限公司 Dynamic ship height measuring method and device
CN111896949B (en) * 2020-07-15 2024-02-27 河海大学 Dynamic monitoring system and monitoring method for valley amplitude deformation of high arch dam
CN112762899B (en) * 2021-01-08 2023-03-24 中国南方电网有限责任公司超高压输电公司南宁监控中心 Fusion method of laser point cloud and BIM model with video information in visual transformer substation
CN113359810B (en) * 2021-07-29 2024-03-15 东北大学 Unmanned aerial vehicle landing area identification method based on multiple sensors
CN114067533A (en) * 2021-11-27 2022-02-18 四川大学 Geological disaster photographing monitoring and early warning method
CN114755693B (en) * 2022-06-15 2022-09-16 天津大学四川创新研究院 Infrastructure facility measuring system and method based on multi-rotor unmanned aerial vehicle
CN114966634A (en) * 2022-07-11 2022-08-30 高德软件有限公司 Laser ranging system calibration method, device and computer program product
CN116087925B (en) * 2023-04-07 2023-06-20 深圳煜炜光学科技有限公司 Method, device, equipment and storage medium for correcting quadrature error angle

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101914881A (en) * 2010-07-27 2010-12-15 唐粮 Method for rapidly measuring foundation pile control net (CPIII) of rapid transit railway
CN101949715A (en) * 2010-08-10 2011-01-19 武汉武大卓越科技有限责任公司 Multi-sensor integrated synchronous control method and system for high-precision time-space data acquisition
CN103434610A (en) * 2013-09-03 2013-12-11 哈尔滨工程大学 Docking positioning guiding method for offshore oil drilling platform
CN104112363A (en) * 2014-07-04 2014-10-22 西安交通大学 Multi-sensing-data space-time synchronization method and road multi-sensing-data vehicle-mounted acquisition system
CN104180793A (en) * 2014-08-27 2014-12-03 北京建筑大学 Device and method for obtaining mobile spatial information for digital city construction
CN104268935A (en) * 2014-09-18 2015-01-07 华南理工大学 Feature-based airborne laser point cloud and image data fusion system and method
CN106054185A (en) * 2016-05-23 2016-10-26 北京航空航天大学 Airborne double antenna InSAR base line calculating method based on distributed POS
CN106780629A (en) * 2016-12-28 2017-05-31 杭州中软安人网络通信股份有限公司 A kind of three-dimensional panorama data acquisition, modeling method
CN107656286A (en) * 2017-09-26 2018-02-02 武汉大学 Object localization method and system under big beveled distal end observing environment
CN107807365A (en) * 2017-10-20 2018-03-16 国家林业局昆明勘察设计院 Small-sized digital photography there-dimensional laser scanning device for the unmanned airborne vehicle in low latitude
CN108490433A (en) * 2018-02-07 2018-09-04 哈尔滨工业大学 Deviation Combined estimator and compensation method and system when sky based on Sequential filter
CN109579898A (en) * 2018-12-25 2019-04-05 佛山科学技术学院 A kind of intelligence manufacture sensing data spatial calibration method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Fast registration of airborne LiDAR point cloud data with synchronously acquired imagery using placement angle calibration; Zhong Liang et al.; Geomatics and Information Science of Wuhan University; 30 September 2011; Vol. 36, No. 9; pp. 1035-1037 *

Also Published As

Publication number Publication date
CN110108984A (en) 2019-08-09

Similar Documents

Publication Publication Date Title
CN110108984B (en) Spatial relationship synchronization method for multiple sensors of power line patrol laser radar system
GREJNER‐BRZEZINSKA Direct exterior orientation of airborne imagery with GPS/INS system: Performance analysis
KR100860767B1 (en) Apparatus and method for digital mapping manufacture using airborne laser surveying data
Nagai et al. UAV-borne 3-D mapping system by multisensor integration
JP6560337B2 (en) Remote image stabilization and display
EP1019862B1 (en) Method and apparatus for generating navigation data
US9194954B2 (en) Method for geo-referencing an imaged area
RU2531802C2 (en) Method of determination of geographical coordinates of image points in sar images
CN109032153B (en) Unmanned aerial vehicle autonomous landing method and system based on photoelectric-inertial combined guidance
Xie et al. Design and data processing of China's first spaceborne laser altimeter system for earth observation: GaoFen-7
AU2006228080B1 (en) Increasing measurement rate in time of flight measurement apparatuses
CN106871932A (en) The in-orbit sensing calibration method of satellite borne laser based on Pyramidal search terrain match
Miller et al. 3-D site mapping with the CMU autonomous helicopter
WO2020150388A1 (en) Apparatuses, systems, and methods for gas flux measurements with mobile platforms
CN104729482A (en) Ground tiny target detection system and ground tiny target detection method based on airship
KR100571120B1 (en) Three dimentional survey system which use the laser apparatus
CN110360986B (en) Portable star catalogue local topography mapping system
CN104251994B (en) Long baselines laser ranging is realized without control point satellite Precise Position System and method
CN103245948B (en) Image match navigation method for double-area image formation synthetic aperture radars
Kordić et al. Spatial data performance test of mid-cost UAS with direct georeferencing
CN115202383A (en) Multi-dimensional track expression and generation method for unmanned aerial vehicle
JPH09126761A (en) Earth shape measuring device
Campbell Application of airborne laser scanner-aerial navigation
Cramer et al. Data capture
Mandlburger et al. Evaluation of Consumer-Grade and Survey-Grade UAV-LIDAR

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant