CN114693807B - Method and system for reconstructing mapping data of power transmission line image and point cloud - Google Patents


Info

Publication number
CN114693807B
CN114693807B
Authority
CN
China
Prior art keywords
image
current image
coordinates
pixel
point cloud
Prior art date
Legal status
Active
Application number
CN202210405718.3A
Other languages
Chinese (zh)
Other versions
CN114693807A (en)
Inventor
毛锋
戴永东
王茂飞
姚建光
高超
吴奇伟
王神玉
仲坚
张泽
鞠玲
翁蓓蓓
Current Assignee
State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center
State Grid Jiangsu Electric Power Co Ltd
Taizhou Power Supply Co of State Grid Jiangsu Electric Power Co Ltd
Original Assignee
State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center
State Grid Jiangsu Electric Power Co Ltd
Taizhou Power Supply Co of State Grid Jiangsu Electric Power Co Ltd
Priority date
Filing date
Publication date
Application filed by State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center, State Grid Jiangsu Electric Power Co Ltd, Taizhou Power Supply Co of State Grid Jiangsu Electric Power Co Ltd filed Critical State Grid Jiangsu Electric Power Co ltd Innovation And Innovation Center
Priority to CN202210405718.3A priority Critical patent/CN114693807B/en
Publication of CN114693807A publication Critical patent/CN114693807A/en
Application granted granted Critical
Publication of CN114693807B publication Critical patent/CN114693807B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/46 Indirect determination of position data
    • G01S17/48 Active triangulation systems, i.e. using the transmission and reflection of electromagnetic waves other than radio waves
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/243 Classification techniques relating to the number of classes
    • G06F18/2433 Single-class perspective, e.g. one-against-all classification; Novelty detection; Outlier detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/66 Analysis of geometric attributes of image moments or centre of gravity
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Electromagnetism (AREA)
  • Data Mining & Analysis (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a method and a system for reconstructing mapping data between a power transmission line image and a point cloud. The reconstruction method comprises the following steps: collecting a current image through a shooting device; acquiring pre-stored mapping data, wherein the mapping data comprises an established mapping relation between pixel coordinates of a reference image and point cloud space coordinates of point cloud data; comparing the current image with the reference image to judge whether the current image is abnormal; when the current image is abnormal, calibrating the shooting device to obtain parameter information; and reconstructing the mapping data according to the parameter information. The method improves the accuracy of ranging tasks performed with the current image and corrects ranging errors.

Description

Method and system for reconstructing mapping data of power transmission line image and point cloud
Technical Field
The invention relates to the technical field of data processing, in particular to a method and a system for reconstructing mapping data of an electric transmission line image and point cloud.
Background
The laser radar scanning technology is an emerging three-dimensional data acquisition technology, and massive point cloud data can be quickly acquired by using laser radar scanners which are mounted on different platforms such as a tripod, an automobile, an airplane and a satellite. The point cloud data contains rich information such as longitude and latitude coordinates, intensity, repeated echo and color of each point, and has relevant application in the fields of mapping, forestry, agriculture, digital cities and the like.
The visible-light two-dimensional pictures acquired by an unmanned aerial vehicle can be restored to a three-dimensional power scene through a reconstruction process, but a certain deviation exists between that scene and the actual situation. The technical advantage of laser radar point cloud mapping is that it can accurately restore the spatial information of the power scene and measure distances; its disadvantage is that it cannot restore the color information in the power scene and has poor visual effect, so accurate power-object classification is difficult to perform using laser radar point cloud data alone. Therefore, in practical application, pixels in a two-dimensional image are often mapped to point cloud space coordinates in laser point cloud data: for the image and the point cloud data of the same power transmission line site, a mapping relationship can be established between target pixel coordinates in the image and the space coordinates of the corresponding target points in the point cloud data. Based on this mapping relationship, a distance measurement task can be performed; for example, the actual real-world distance between the targets corresponding to two target pixel points in the image can be determined.
However, after the photographing device becomes abnormal due to aging, displacement, or other causes, it outputs an abnormal image. The mapping relationship established between the original point cloud data and the reference image can no longer be applied to this abnormal image; using the abnormal image for ranging tasks produces a large ranging error, and ranging accuracy cannot be guaranteed. Patent document CN102982548A provides a multi-view stereoscopic video acquisition system and a camera parameter calibration method thereof: acquire the internal and external parameters of each camera in the system; acquire multi-viewpoint images of a common scene at the same time through each camera, and detect and match feature points of the multi-viewpoint images to obtain matching points among the viewpoint images; reconstruct the three-dimensional space point cloud coordinates of the matching points using the camera parameters; according to the three-dimensional space point cloud coordinates and the internal and external parameters of the camera, obtain a reprojection error using sparse bundle adjustment, and optimize the reprojection error together with the internal and external parameters; judge whether secondary optimization is needed according to the optimized reprojection error; and judge whether to recalibrate the parameters according to the secondary optimization result. However, that method cannot solve the problem that ranging tasks are affected because point cloud coordinates cannot be accurately obtained for an abnormal image caused by aging or displacement of the shooting device.
Disclosure of Invention
The invention provides a method and a system for reconstructing mapping data of a power transmission line image and a point cloud, which can adjust the existing mapping data and re-establish the mapping relation for the current image, thereby improving the accuracy of ranging tasks performed with the current image and correcting ranging errors.
A reconstruction method of mapping data of a power transmission line image and a point cloud comprises the following steps:
collecting a current image through a shooting device;
acquiring pre-stored mapping data, wherein the mapping data comprises a mapping relation between pixel coordinates of an established reference image and point cloud space coordinates of point cloud data;
comparing the current image with the reference image, and judging whether the current image is abnormal or not;
when the current image is abnormal, calibrating the shooting device to obtain parameter information;
and reconstructing the mapping data according to the parameter information.
Further, comparing the current image with the reference image to determine whether the current image is abnormal, including:
selecting a characteristic point in the current image and comparing the pixel coordinate of the characteristic point in the current image with the pixel coordinate of the corresponding characteristic point in the reference image; if the pixel coordinates are not equal, determining the current image to be an abnormal image, and if the pixel coordinates are equal, determining the current image to be normal.
Further, determining whether the current image is abnormal further includes:
and calculating a plurality of groups of offsets between the pixel coordinates of a plurality of feature points in the current image and the pixel coordinates of the corresponding feature points in the reference image; if the groups of offsets are equal, determining that the current image is offset, and if the groups of offsets are unequal, determining that the current image is distorted.
Further, the parameter information includes internal parameters, external parameters and distortion parameters of the photographing device.
Further, reconstructing the mapping data according to the parameter information includes:
and for the offset current image, reconstructing the mapping data according to the external parameters, the internal parameters, the pixel coordinates of the feature points in the current image, and the space coordinates of the corresponding point clouds in the point cloud data.
Further, reconstructing the mapping data according to the parameter information includes:
and reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters for the distorted current image.
Further, reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters includes:
calculating, according to the internal parameters and the external parameters, the normal pixel coordinates in a normal image corresponding to the space coordinates of the feature point cloud in the point cloud data;
calculating, according to the distortion parameters, the target pixel coordinates in the distorted current image corresponding to the normal pixel coordinates;
and reconstructing the mapping data according to the internal parameters, the external parameters, the target pixel coordinates and the space coordinates of the feature point cloud.
Further, the distortion parameters include radial distortion coefficients and tangential distortion coefficients, and the target pixel coordinates are calculated by the following formulas:

x" = x' × (1 + k1 × r² + k2 × r⁴) + 2 × p1 × x' × y' + p2 × (r² + 2 × x'²);

y" = y' × (1 + k1 × r² + k2 × r⁴) + 2 × p2 × x' × y' + p1 × (r² + 2 × y'²);

u_d = f_x × x" + c_x; v_d = f_y × y" + c_y;

wherein r² = x'² + y'², r being the radial distance of the normalized image coordinates; k1 and k2 are the radial distortion coefficients; p1 and p2 are the tangential distortion coefficients; (u_d, v_d) are the pixel coordinates in the distorted current image and (u, v) are the normal pixel coordinates in the normal image; x' and y' are the intermediate quantities obtained when transforming from the camera coordinate system to the image coordinate system; x" and y" are the distorted coordinates; (c_x, c_y) is the offset of the optical axis relative to the center of the projection-plane coordinates; and f_x and f_y are the focal lengths of the camera.
Further, after calculating that the normal pixel coordinates correspond to the target pixel coordinates in the distorted current image, the method further includes:
and rounding the coordinates of the target pixels which are not integers.
The system for reconstructing the mapping data of the power transmission line image and the point cloud comprises a shooting device and a server, wherein the server comprises a processor and a storage device, the storage device stores a plurality of instructions, and the processor is used for reading the instructions and executing the method.
The method and the system for reconstructing the mapping data of the power transmission line image and the point cloud provided by the invention at least comprise the following beneficial effects:
the current image acquired by the shooting device can be effectively identified abnormally, so that the mapping data are reconstructed according to abnormal conditions (offset or distortion), the mapping relation between the current image shot by the shooting device and the point cloud data is reestablished, the accuracy of a ranging task performed by the current image is improved, and a ranging error is corrected.
Drawings
Fig. 1 is a flowchart of an embodiment of a method for reconstructing mapping data of an electric transmission line image and a point cloud provided by the present invention.
Fig. 2 is a schematic diagram of a distorted image in an application scenario according to the method for reconstructing mapping data of an electric transmission line image and a point cloud provided by the invention.
Fig. 3 is a schematic structural diagram of an embodiment of a device for reconstructing mapping data of an electric transmission line image and a point cloud according to the present invention.
Fig. 4 is a flowchart of an embodiment of a reconstruction system for mapping data of an electric transmission line image and a point cloud provided by the present invention.
Detailed Description
In order to better understand the above technical solutions, the following detailed description will be given with reference to the accompanying drawings and specific embodiments.
To facilitate an understanding of the present application, some concepts related to the present application will be described first.
LIDAR (Laser Imaging Detection and Ranging): outgoing light (e.g., a laser beam) with a wavelength of around 900 nm is emitted; after encountering an obstacle, the light is reflected back, and the processing unit calculates the distance between the obstacle and the lidar based on the time difference between the reflected light and the outgoing light. In addition, the processing unit may estimate the reflectivity of the target from the cross-sectional condition of the received reflected-light signal. Airborne laser radar has the advantages of small volume, high integration, and a wide range of application scenarios.
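The time-of-flight principle described here can be sketched in a few lines (a minimal illustration of the principle only; the function name and constant are not from the patent):

```python
# Time-of-flight ranging: the distance to the obstacle is half the
# round-trip path travelled at the speed of light.
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def lidar_range(time_of_flight_s: float) -> float:
    """Distance (m) to the obstacle from the pulse round-trip time (s)."""
    return SPEED_OF_LIGHT * time_of_flight_s / 2.0
```

For example, a round-trip time of 1 microsecond corresponds to roughly 150 m.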
Point cloud data (point cloud data) refers to a set of vectors in a three-dimensional coordinate system. The scan data is recorded in the form of dots, each dot containing three-dimensional coordinates, some possibly containing color information (e.g., red, green, blue) or reflectance Intensity information (Intensity).
In the related art, mapping can be performed using an airborne laser radar or similar means, and the resulting map contains the position information of many objects. However, point cloud data lacks the intuitiveness of image data; if image data could be used directly to range a target object, the convenience of power transmission line inspection would be effectively improved.
In order to achieve accurate measurement directly from the transmission line image, spatial coordinate information is required for each pixel in the image. By fusing the two types of data (image data and point cloud data) for the same target object, the fused transmission line image can be used to measure clearance distances and the like of the transmission line, with accuracy reaching the sub-meter level.
Referring to fig. 1, in some embodiments, a method for reconstructing mapping data of an electric transmission line image and a point cloud is provided, including:
s1, acquiring a current image through a shooting device;
s2, acquiring pre-stored mapping data, wherein the mapping data comprises a mapping relation between pixel coordinates of an established reference image and point cloud space coordinates of point cloud data;
s3, comparing the current image with the reference image, and judging whether the current image is abnormal or not;
s4, calibrating the shooting device when the current image is abnormal, and obtaining parameter information;
s5, reconstructing the mapping data according to the parameter information.
Specifically, in step S1, the photographing device may be mounted on an electric tower on the power transmission line, and may photograph an image of the corresponding power transmission line.
Further, in step S2, an image photographed by the photographing device in a normal state is used to establish the mapping relationship with the point cloud data; this image is the reference image. The point cloud data is obtained through the lidar, a mapping relationship is established between the pixel coordinates of the reference image and the point cloud space coordinates of the point cloud data, and the result is stored as the mapping data.
In this step, the mapping relationship between the reference image and the point cloud data includes a mapping relationship between the coordinates of the target pixel in the reference image and the spatial coordinates of the target point in the point cloud data. The target point may be a characteristic point (e.g., a corner point, an end point, a vertex, a center point, etc.) of a device (e.g., a cross arm, an insulator, etc.) on the transmission line site. For example, the mapping data may include a mapping between the pixel coordinates of the center point of an insulator in the reference image and the spatial coordinates (e.g., world coordinates including longitude, latitude, and altitude) of that center point in the point cloud data: the pixel coordinates (u1, v1) of the insulator center point in the reference image correspond to the spatial coordinates (X1, Y1, Z1) of the same center point in the point cloud data. The mapping relationship may be a functional relationship: after the coordinates of a target pixel in the reference image are obtained, the spatial coordinates of the corresponding target point in the point cloud data can be obtained according to the mapping relationship, and conversely, after the spatial coordinates of a target point in the point cloud data are obtained, the coordinates of the corresponding target pixel in the reference image can be obtained.
It can be understood that once the mapping data is obtained, the mapping relationship between target pixel coordinates in the reference image and target point spatial coordinates in the point cloud data is available: for a target pixel point selected in the reference image, its spatial coordinates can be obtained using the mapping data, so the actual real-world distance between two target pixel points in the reference image can be determined. Ranging tasks can therefore be performed from the reference image, which is suitable for monitoring the power transmission line site. When the photographing device is installed as a monitoring camera on the transmission line site and no abnormality occurs due to aging, displacement, or the like, the images it captures are relatively stable and unchanged; that is, the current image photographed by the device is equivalent to the reference image, and the ranging task can be performed on the current image using the established mapping relation between the reference image and the point cloud data. However, once the photographing device becomes abnormal due to aging, displacement, or the like, the current image it captures is abnormal and no longer identical to the reference image; performing the ranging task with this abnormal image produces a large ranging error, and ranging accuracy cannot be guaranteed.
Therefore, it is necessary to determine, from the current image acquired by the photographing device, whether an abnormality has occurred in the photographing device.
Further, in step S3, comparing the current image with the reference image, and determining whether the current image is abnormal includes:
selecting a characteristic point in the current image and comparing the pixel coordinate of the characteristic point in the current image with the pixel coordinate of the corresponding characteristic point in the reference image; if the pixel coordinates are not equal, determining the current image to be an abnormal image, and if the pixel coordinates are equal, determining the current image to be normal.
Further, determining whether the current image is abnormal further includes:
and calculating a plurality of groups of offsets between the pixel coordinates of a plurality of feature points in the current image and the pixel coordinates of the corresponding feature points in the reference image; if the groups of offsets are equal, determining that the current image is offset, and if the groups of offsets are unequal, determining that the current image is distorted.
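The decision rule above (equal per-point offsets imply a pure shift, unequal offsets imply distortion) can be sketched as follows; the tolerance value and the (N, 2) array layout are illustrative assumptions, not values from the patent:

```python
import numpy as np

def classify_anomaly(cur_pts, ref_pts, tol=0.5):
    """Classify the current image by comparing feature-point pixel
    coordinates against the reference image.
    Returns 'normal' (coordinates unchanged), 'offset' (all per-point
    offsets equal, i.e. a pure shift) or 'distorted' (offsets differ)."""
    offsets = np.asarray(cur_pts, float) - np.asarray(ref_pts, float)
    if np.allclose(offsets, 0.0, atol=tol):
        return "normal"
    # every feature point moved by (roughly) the same vector
    if np.allclose(offsets, offsets[0], atol=tol):
        return "offset"
    return "distorted"
```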
The distortion may include lens distortion, for example, barrel distortion and pincushion distortion, among others. Lens distortion is a generic term for perspective distortion inherent to optical lenses, that is, distortion due to perspective. Referring to fig. 2, barrel distortion and pincushion distortion are shown.
The wide-angle lens brings barrel distortion while providing a wide field of view and special shooting effects. Barrel distortion does not affect imaging sharpness, but it affects the positional accuracy of the imaging, which can introduce errors, even erroneous decisions, into image analysis and image measurement. The barrel distortion a wide-angle lens brings to a vision system is nonlinear: the deformation is smaller at the center of the image and grows larger farther from the center. Pincushion distortion is a phenomenon in which the picture is "shrunk" toward the middle by the lens; it is most easily perceived when using a telephoto lens or the telephoto end of a zoom lens. In power transmission line monitoring scenes, telephoto or wide-angle lenses are often required because the distance between electric towers is long, so image distortion caused by lens distortion is significant.
Further, in step S4, when the current image is abnormal, the photographing device is calibrated, so as to obtain parameter information. In this step, the photographing device may be calibrated by using a checkerboard calibration method to obtain parameter information of the photographing device.
Specifically, calibrating the photographing device with the checkerboard calibration method includes: preparing a checkerboard of known size; shooting the checkerboard at different angles with the photographing device to obtain a group of images; detecting feature points in the images, such as the calibration-board corner points, to obtain the pixel coordinate values of the corner points; calculating the physical coordinate values of the corner points from the known checkerboard size and the origin of the world coordinate system; and calibrating the camera with the pixel coordinates of each corner point and its physical coordinates in the world coordinate system to obtain the internal parameter matrix, the external parameter matrix and the distortion parameters of the camera.
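This checkerboard procedure maps naturally onto OpenCV's standard calibration API; the sketch below is illustrative, with an assumed pattern size and square size rather than values from this patent:

```python
import numpy as np

def checkerboard_object_points(pattern=(9, 6), square_mm=25.0):
    """Physical (world) coordinates of the inner corners, with the origin
    at one corner of the board, as in the calibration step above."""
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm
    return objp

def calibrate(images, pattern=(9, 6), square_mm=25.0):
    """Detect corner pixel coordinates in each grayscale view and solve for
    the intrinsic matrix, distortion coefficients and per-view extrinsics."""
    import cv2  # OpenCV, assumed available

    objp = checkerboard_object_points(pattern, square_mm)
    obj_pts, img_pts = [], []
    for img in images:
        found, corners = cv2.findChessboardCorners(img, pattern)
        if found:
            obj_pts.append(objp)
            img_pts.append(corners)
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_pts, img_pts, images[0].shape[::-1], None, None)
    return K, dist, rvecs, tvecs
```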
The parameter information includes internal parameters, external parameters and distortion parameters of the photographing device.
Further, in step S5, reconstructing the mapping data according to the parameter information for the current image that is shifted but not distorted, including:
and for the offset current image, reconstructing the mapping data according to the external parameters, the internal parameters, the pixel coordinates of the feature points in the current image, and the space coordinates of the corresponding point clouds in the point cloud data.
Specifically, the internal parameters include: the physical size dx of each pixel along the image horizontal axis x; the physical size dy of each pixel along the image vertical axis y; the warping factor r of the image physical coordinates; the focal length f; and the numbers of pixels u0 and v0 by which the image center pixel coordinates are offset horizontally and vertically from the image origin pixel coordinates. That is, (u0, v0) represents the pixel coordinates of the intersection of the camera optical axis and the image plane.
The external parameters include the rotation matrix R and the translation vector T used to convert from the spatial coordinate system to the camera coordinate system.
Further, in the case that the current image is not distorted, for the same target, the spatial coordinates of the target in the point cloud data, the shooting center of the shooting device, and the image coordinates of the target in the current image are collinear. Based on this collinear relationship, the collinearity expression shown in formula (1) is constructed:

x = -f × [a1(X_A - X_S) + b1(Y_A - Y_S) + c1(Z_A - Z_S)] / [a3(X_A - X_S) + b3(Y_A - Y_S) + c3(Z_A - Z_S)];

y = -f × [a2(X_A - X_S) + b2(Y_A - Y_S) + c2(Z_A - Z_S)] / [a3(X_A - X_S) + b3(Y_A - Y_S) + c3(Z_A - Z_S)]; (1)

In formula (1), (x, y) are the image-plane coordinates of the target point, (u0, v0) are the pixel coordinates of the intersection of the camera optical axis and the image plane, and f is the focal length. (X_S, Y_S, Z_S) are the coordinates of the center of the shooting device in the point cloud coordinate system, and (X_A, Y_A, Z_A) are the spatial coordinates of the target point in the point cloud data. a_i, b_i, c_i (i = 1, 2, 3) are the elements of the rotation matrix of the image.
The expression of the rotation matrix R is shown in formula (2):

R = | a1  b1  c1 |
    | a2  b2  c2 |
    | a3  b3  c3 |; (2)
since the image plane coordinates (x, y) of the target point and the pixel coordinates (u, v) of the target point can be transformed correspondingly. By calibrating the camera, the internal participation and external parameters are known, i.e. the known (u 0 ,v 0 )、f、(X s ,Y S ,Z S ). In this way, according to the spatial coordinates (X A ,Y A ,Z A ) Then the target pixel coordinates (u, v) in the current image can be determined. In this way, a mapping relationship between the coordinates of the target pixel in the current image shot by the shooting device and the spatial coordinates of the target point in the point cloud data is constructed.
In another embodiment, according to the internal parameters and the external parameters in the parameter information, a mapping relationship between the current image shot by the shooting device and the point cloud data can be constructed according to the spatial transformation relationship.
The conversion relationship between the pixel coordinates and the spatial coordinates can be expressed as shown in formula (3):

Z_c × [u, v, 1]^T = | f/dx   0    u0 |
                    |  0    f/dy  v0 | × [R | T] × [X_w, Y_w, Z_w, 1]^T; (3)
                    |  0     0    1  |

In formula (3), dx and dy represent the physical size of each pixel along the image horizontal axis x and the vertical axis y respectively, (u0, v0) are the pixel coordinates of the intersection of the optical axis of the photographing device and the image plane, and f is the focal length of the photographing device; these are the internal parameters. (u, v) are the pixel coordinates of the target point, and (X_w, Y_w, Z_w) are the spatial coordinates of the target point in the point cloud data. R represents the rotation matrix, T the translation vector, and Z_c the depth of the target point in the camera coordinate system.
In some embodiments, the rotation matrix R may also be represented as the product of three elementary rotations, as shown in formula (4):

R = R_y(φ) × R_x(ω) × R_z(κ); (4)

where φ, ω and κ are the attitude angles, i.e., the angles by which the camera coordinate axes are rotated around the y-axis, the x-axis and the z-axis of the point cloud coordinate system respectively. The translation vector is represented as shown in formula (5).
T = [t_x, t_y, t_z]; (5)

where t_x, t_y, t_z respectively represent the position coordinates of the camera center in the point cloud coordinate system.
Since the internal and external parameters of the shooting device are known, given the spatial coordinates (X_w, Y_w, Z_w) of the target point in the point cloud data, the target pixel coordinates (u, v) in the current image can be determined. In this way, a mapping relationship is constructed between the target pixel coordinates in the current image shot by the shooting device and the spatial coordinates of the target point in the point cloud data.
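As a concrete illustration of this projection, the sketch below maps a point-cloud coordinate to a pixel with NumPy. The intrinsic matrix, rotation, and translation values here are made-up placeholders for illustration, not parameters from any actual calibration, and the function name is ours.

```python
import numpy as np

def project_point(K, R, T, Xw):
    """Project a 3-D point from the point cloud frame into pixel coordinates
    using the pinhole model of formula (3): world -> camera -> image -> pixel."""
    Xc = R @ np.asarray(Xw, dtype=float) + T      # world frame -> camera frame
    x, y = Xc[0] / Xc[2], Xc[1] / Xc[2]           # normalized image-plane coords
    u = K[0, 0] * x + K[0, 2]                     # u = (f/d_x)*x + u_0
    v = K[1, 1] * y + K[1, 2]                     # v = (f/d_y)*y + v_0
    return u, v

# Illustrative intrinsics: f/d_x = f/d_y = 1000 px, principal point (640, 360).
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)       # camera axes aligned with the point cloud axes
T = np.zeros(3)     # camera center at the point cloud origin

u, v = project_point(K, R, T, [1.0, 0.5, 10.0])
print(u, v)  # approximately 740.0 410.0
```

A point 10 m in front of the camera and offset (1.0, 0.5) m laterally lands 100 and 50 pixels away from the principal point, as expected from the 1000 px focal length.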
Further, in step S5, if the current image is distorted, reconstructing the mapping data according to the parameter information, including:
and reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters for the distorted current image.
Specifically, reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters, including:
calculating, according to the internal parameters and the external parameters, the normal pixel coordinates in the normal image that correspond to the spatial coordinates of the feature point cloud in the point cloud data; specifically, in this step, the spatial coordinates of the target point in the point cloud data can be mapped to the normal pixel coordinates in the normal image using formula (3), according to the internal parameters and the external parameters in the parameter information;
calculating the target pixel coordinates in the current image, corresponding to the distortion, of the normal pixel coordinates according to the distortion parameters;
the reconstruction of the mapping data is performed based on the internal parameters, external parameters, target pixel coordinates, and spatial coordinates of the feature point cloud, and specifically, the reconstruction of the mapping data may be performed based on the above-described expressions (1) - (5).
The distortion mathematical model is explained below.
The quantities involved are: the internal parameter matrix A of the camera (determined by d_x, d_y, u_0, v_0, and f), the external parameter matrix [R|T], the distortion coefficients [k1, k2, k3, p1, p2], the physical dimensions d_x and d_y of one pixel, the focal length f, and the distortion factor r of the physical coordinates of the image.
The radial distortion mathematical model is shown in formula (6):

x_d = x'×(1 + k1×r^2 + k2×r^4 + k3×r^6);
y_d = y'×(1 + k1×r^2 + k2×r^4 + k3×r^6); (6)

where r^2 = x'^2 + y'^2. The radial distortion is larger at the edges of the image.
The tangential distortion mathematical model is shown in formula (7):

x_d = x' + 2×p1×x'×y' + p2×(r^2 + 2×x'^2);
y_d = y' + p1×(r^2 + 2×y'^2) + 2×p2×x'×y'; (7)

where k1, k2, k3, p1, p2 are the five distortion parameters, u, v are the pixel coordinates in the distorted image, and u', v' are the corrected pixel coordinates.
It will be appreciated that x_c, y_c, z_c are the coordinates of a pixel point in the camera coordinate system, and x', y' are the intermediate quantities from the camera coordinate system to the image coordinate system. The normal position coordinates of a pixel in the pixel coordinate system of the image can be expressed as formulas (8) to (10):

x' = x_c / z_c; (8)
y' = y_c / z_c; (9)
u = f_x×x' + c_x, v = f_y×y' + c_y; (10)

x" and y" are the distortion position coordinates, which can be expressed as formulas (11) to (13):
x" = x'×(1 + k1×r^2 + k2×r^4) + 2×p1×x'×y' + p2×(r^2 + 2×x'^2); (11)
y" = y'×(1 + k1×r^2 + k2×r^4) + 2×p2×x'×y' + p1×(r^2 + 2×y'^2); (12)
u_d = f_x×x" + c_x, v_d = f_y×y" + c_y; (13)

where r^2 = x'^2 + y'^2, r represents the distortion factor of the physical coordinates of the image, k1 and k2 are the radial deformation coefficients, p1 and p2 are the tangential deformation coefficients, u_d, v_d are the pixel coordinates in the distorted current image, u, v are the normal pixel coordinates in the normal image, x', y' are the intermediate quantities from the camera coordinate system to the image coordinate system, x" and y" are the distortion position coordinates, (c_x, c_y) is the offset of the optical axis from the center of the projection plane coordinates, and f_x and f_y are the focal lengths of the camera.
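A minimal sketch of this distortion step, applied to normalized camera coordinates, is shown below. The function names and coefficient values are illustrative assumptions, not part of the patent's disclosure.

```python
def distort_normalized(xp, yp, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to normalized
    camera coordinates (x', y'), per formulas (11)-(12)."""
    r2 = xp * xp + yp * yp
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xpp = xp * radial + 2.0 * p1 * xp * yp + p2 * (r2 + 2.0 * xp * xp)
    ypp = yp * radial + 2.0 * p2 * xp * yp + p1 * (r2 + 2.0 * yp * yp)
    return xpp, ypp

def to_pixels(xpp, ypp, fx, fy, cx, cy):
    """Map distorted normalized coordinates to pixel coordinates (formula (13))."""
    return fx * xpp + cx, fy * ypp + cy

# Sanity check: with all coefficients zero the mapping is the identity.
assert distort_normalized(0.1, 0.2, 0.0, 0.0, 0.0, 0.0) == (0.1, 0.2)
```

Note that the radial term grows with r^2, matching the observation above that radial distortion is largest at the image edges.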
In order to determine an image without distortion based on the known distorted image, its mapping relationship may be deduced by a distortion model.
The relationship between the normal image imgR and the distorted image imgD is shown in formulas (14) to (16):

x" = x'×(1 + k1×r^2 + k2×r^4) + 2×p1×x'×y' + p2×(r^2 + 2×x'^2); (14)
y" = y'×(1 + k1×r^2 + k2×r^4) + 2×p2×x'×y' + p1×(r^2 + 2×y'^2); (15)
u_d = f_x×x" + c_x, v_d = f_y×y" + c_y; (16)

where r^2 = x'^2 + y'^2, r represents the distortion factor of the physical coordinates of the image, k1 and k2 are the radial deformation coefficients, p1 and p2 are the tangential deformation coefficients, u_d, v_d are the pixel coordinates in the distorted current image, u, v are the normal pixel coordinates in the normal image, x', y' are the intermediate quantities from the camera coordinate system to the image coordinate system, x" and y" are the distortion position coordinates, (c_x, c_y) is the offset of the optical axis from the center of the projection plane coordinates, and f_x and f_y are the focal lengths of the shooting device in pixels, which are generally equal.
Since the current image shot by the shooting device has been determined to be distorted, the current image is the distorted abnormal image. Thus, after the normal pixel coordinates (u, v) in the normal image are calculated in the above steps, the target pixel coordinates (u_d, v_d) in the current image corresponding to (u, v) can be calculated according to formulas (14) to (16), thereby constructing the mapping relationship between the target pixel coordinates in the current image shot by the shooting device and the spatial coordinates of the target point in the point cloud data.
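One way to realize this normal-pixel-to-distorted-pixel mapping is sketched below. The camera parameter values are made-up placeholders, and the function name is an assumption of ours.

```python
def normal_to_distorted_pixel(u, v, fx, fy, cx, cy, k1, k2, p1, p2):
    """Map a normal-image pixel (u, v) to the corresponding pixel (u_d, v_d)
    in the distorted current image, per formulas (14)-(16)."""
    xp = (u - cx) / fx                  # back to normalized coordinate x'
    yp = (v - cy) / fy                  # normalized coordinate y'
    r2 = xp * xp + yp * yp
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    xpp = xp * radial + 2.0 * p1 * xp * yp + p2 * (r2 + 2.0 * xp * xp)
    ypp = yp * radial + 2.0 * p2 * xp * yp + p1 * (r2 + 2.0 * yp * yp)
    return fx * xpp + cx, fy * ypp + cy  # (u_d, v_d)

ud, vd = normal_to_distorted_pixel(700.0, 400.0, 1000.0, 1000.0, 640.0, 360.0,
                                   0.1, 0.0, 0.0, 0.0)
print(ud, vd)  # approximately 700.03 400.02 (pixel pushed slightly outward)
```

With a small positive k1, a pixel away from the principal point is displaced outward, as the radial model predicts; the principal point itself maps to itself.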
It will be appreciated that the normal pixel coordinates (u, v) of the normal image are integers, for example (1, 1), but the target pixel coordinates in the current image calculated to correspond to them may be non-integers, for example (1.1, 1.4).
In some embodiments, when the calculated target pixel coordinate is a non-integer pixel coordinate, an integer pixel coordinate adjacent to it in the current image is taken as the target pixel coordinate; for example, the nearest integer pixel may be chosen by rounding. When the calculated target pixel coordinate is (1.1, 1.2), the integer pixel coordinate (1, 1) in the current image may be taken as the target pixel coordinate. In this way, a mapping relationship between the target pixel coordinates in the current image and the spatial coordinates of the target point in the point cloud data can be constructed.
In another embodiment, when the calculated target pixel coordinate is a non-integer pixel coordinate, the current image is scaled up, and one integer pixel coordinate corresponding to the non-integer pixel coordinate is selected as the target pixel coordinate in the scaled up current image. For example, when the calculated target pixel coordinates are (1.1, 1.2), the current image may be enlarged 10 times, and the integer pixel coordinates (11, 12) in the enlarged current image may be selected as the target pixel coordinates. In this way, a mapping relationship between the coordinates of the target pixel in the enlarged current image and the spatial coordinates of the target point in the point cloud data can be constructed.
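The two strategies above can be sketched as follows. Note that Python's built-in round() rounds half-to-even, which is one reasonable way to choose the adjacent integer pixel; the function names are ours.

```python
def round_to_pixel(u, v):
    """Strategy 1: take the integer pixel in the current image nearest to
    the non-integer target pixel coordinate."""
    return round(u), round(v)

def scale_to_pixel(u, v, scale=10):
    """Strategy 2: enlarge the current image by `scale` and take the
    corresponding integer pixel coordinate in the enlarged image."""
    return round(u * scale), round(v * scale)

print(round_to_pixel(1.1, 1.2))   # (1, 1)
print(scale_to_pixel(1.1, 1.2))   # (11, 12)
```

Strategy 2 trades memory for sub-pixel precision: a 10x enlargement preserves one decimal place of the computed coordinate instead of discarding it.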
As can be seen from this embodiment, the method provided in this embodiment of the present application may adjust the mapping relationship between the reference image and the point cloud data that are established in the original mapping data by using the newly constructed mapping relationship between the current image and the point cloud data, so as to obtain adjusted mapping data. Based on the mapping relation between the current image and the point cloud data in the adjusted mapping data, the current image shot by the shooting device can be utilized to perform a ranging task, and the ranging precision is high and the error is small.
The method provided by the embodiment at least comprises the following beneficial effects:
an abnormality in the current image acquired by the shooting device can be effectively identified, so that the mapping data is reconstructed according to the type of abnormality (offset or distortion) and the mapping relationship between the current image shot by the shooting device and the point cloud data is re-established, which improves the accuracy of ranging tasks performed with the current image and corrects ranging errors.
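For reference, the calibration step that produces the internal parameters, external parameters, and distortion parameters can be sketched with OpenCV's standard checkerboard workflow. The 9x6 corner pattern, 25 mm square size, and function names are assumptions for illustration, not values from the patent.

```python
import numpy as np

def board_object_points(pattern=(9, 6), square=0.025):
    """World coordinates of the checkerboard inner corners on the board plane
    (Z = 0), spaced `square` metres apart, origin at one corner."""
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square
    return objp

def calibrate_from_checkerboard(images, pattern=(9, 6), square=0.025):
    """Detect checkerboard corners in each photo and solve for the camera's
    internal parameter matrix, distortion coefficients [k1 k2 p1 p2 k3],
    and per-view rotation/translation (external parameters)."""
    import cv2  # deferred so board_object_points works without OpenCV installed
    objp = board_object_points(pattern, square)
    obj_pts, img_pts, size = [], [], None
    for img in images:
        gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_pts.append(objp)   # known physical corner coordinates
            img_pts.append(corners)  # detected pixel corner coordinates
            size = gray.shape[::-1]
    # Returns (rms_error, K, dist_coeffs, rotation_vecs, translation_vecs).
    return cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
```

Each pair of (physical corner coordinates, detected pixel coordinates) constrains the projection of formula (3) plus the distortion model, which is exactly the data the checkerboard procedure of this document collects.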
Further, referring to fig. 3, in some embodiments, there is further provided a device for reconstructing mapping data of an electric transmission line image and a point cloud, including:
an acquisition module 201, configured to acquire a current image through a photographing device;
a data acquisition module 202, configured to acquire pre-stored mapping data, where the mapping data includes a mapping relationship between a pixel coordinate of an established reference image and a point cloud space coordinate of point cloud data;
a judging module 203, configured to compare the current image with the reference image, and judge whether the current image is abnormal;
the calibration module 204 is used for calibrating the shooting device when the current image is abnormal, so as to obtain parameter information;
and a reconstruction module 205, configured to reconstruct the mapping data according to the parameter information.
Specifically, the judging module 203 is further configured to:
selecting a feature point in the current image and comparing the pixel coordinates of the feature point in the current image with the pixel coordinates of the corresponding feature point in the reference image; if the pixel coordinates are equal, the current image is determined to be a normal image, and if the pixel coordinates are not equal, the current image is determined to be an abnormal image.
The judging module 203 is further configured to:
and calculating a plurality of groups of offset of pixel coordinates of a plurality of feature points in the current image and corresponding pixel coordinates of a plurality of feature points in the reference image, if the plurality of groups of offset are equal, determining that the current image is offset, and if the plurality of groups of offset are unequal, determining that the current image is distorted.
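The offset-versus-distortion judgment can be sketched as follows. The tolerance value and function name are assumptions of ours, since real corner detections are never exactly equal pixel-for-pixel.

```python
def classify_anomaly(current_pts, reference_pts, tol=0.5):
    """Classify an abnormal image: 'offset' if all feature points moved by
    (approximately) the same vector, 'distorted' if the offsets differ."""
    offsets = [(cu - ru, cv - rv)
               for (cu, cv), (ru, rv) in zip(current_pts, reference_pts)]
    du0, dv0 = offsets[0]
    uniform = all(abs(du - du0) <= tol and abs(dv - dv0) <= tol
                  for du, dv in offsets)
    return "offset" if uniform else "distorted"

ref = [(100, 100), (200, 150), (300, 260)]
shifted = [(105, 103), (205, 153), (305, 263)]   # uniform (+5, +3) shift
warped = [(105, 103), (202, 158), (309, 255)]    # non-uniform displacement
print(classify_anomaly(shifted, ref))  # offset
print(classify_anomaly(warped, ref))   # distorted
```

A uniform shift indicates the camera pose changed (external parameters must be re-derived), while non-uniform displacement indicates lens distortion, which is why the two cases are reconstructed differently above.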
The parameter information comprises internal parameters, external parameters and distortion parameters of the shooting device.
Further, the reconstruction module 205 is further configured to:
and reconstructing the mapping data for the offset current image according to the external parameters, the internal parameters, the pixel coordinates of the feature points in the current image, and the spatial coordinates of the point clouds in the corresponding point cloud data.
Further, the reconstruction module 205 is further configured to:
and reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters for the distorted current image.
Further, the reconstruction module 205 is further configured to:
calculating that the space coordinates of the characteristic point cloud in the point cloud data correspond to normal pixel coordinates in the normal image according to the internal parameters and the external parameters;
calculating the target pixel coordinates in the current image, corresponding to the distortion, of the normal pixel coordinates according to the distortion parameters;
and reconstructing the mapping data according to the internal parameters, the external parameters, the target pixel coordinates and the space coordinates of the characteristic point cloud.
Please refer to the above embodiments for a specific reconstruction method, and the description thereof is omitted.
Referring to fig. 4, in some embodiments, a system for reconstructing mapping data of an electric transmission line image and a point cloud is further provided, including a photographing device 301 and a server 302, where the server 302 includes a processor 3021 and a storage device 3022, the storage device 3022 stores a plurality of instructions, and the processor 3021 is configured to read the plurality of instructions and execute the method described above.
The processor may be a central processing unit (Central Processing Unit, CPU), but may also be other general purpose processors, digital signal processors (Digital Signal Processor, DSP), application specific integrated circuits (Application Specific Integrated Circuit, ASIC), field programmable gate arrays (Field-Programmable Gate Array, FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The storage devices may include various types of storage units, such as system memory, read Only Memory (ROM), and persistent storage.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. It is therefore intended that the following claims be interpreted as including the preferred embodiments and all such alterations and modifications as fall within the scope of the invention. It will be apparent to those skilled in the art that various modifications and variations can be made to the present invention without departing from the spirit or scope of the invention. Thus, it is intended that the present invention also include such modifications and alterations insofar as they come within the scope of the appended claims or the equivalents thereof.

Claims (10)

1. The method for reconstructing the mapping data of the power transmission line image and the point cloud is characterized by comprising the following steps:
collecting a current image through a shooting device;
acquiring pre-stored mapping data, wherein the mapping data comprises a mapping relation between pixel coordinates of an established reference image and point cloud space coordinates of point cloud data;
comparing the current image with the reference image, and judging whether the current image is abnormal or not;
when the current image is abnormal, calibrating the shooting device to obtain parameter information;
reconstructing the mapping data according to the parameter information;
comparing the current image with the reference image, and judging whether the current image is abnormal or not, wherein the method comprises the following steps:
selecting a feature point in the current image and comparing the pixel coordinates of the feature point in the current image with the pixel coordinates of the corresponding feature point in the reference image; if the pixel coordinates are equal, determining the current image to be a normal image, and if the pixel coordinates are not equal, determining the current image to be an abnormal image;
judging whether the current image is abnormal or not, and further comprising:
calculating a plurality of groups of offset of pixel coordinates of a plurality of feature points in the current image and corresponding pixel coordinates of a plurality of feature points in the reference image, if the plurality of groups of offset are equal, determining that the current image is offset, and if the plurality of groups of offset are unequal, determining that the current image is distorted;
the parameter information comprises internal parameters, external parameters and distortion parameters of the shooting device;
reconstructing the mapping data according to the parameter information, including:
reconstructing, for the distorted current image, the mapping data according to the internal parameters, the external parameters and the distortion parameters;
reconstructing the mapping data according to the internal parameters, the external parameters and the distortion parameters, including:
calculating that the space coordinates of the characteristic point cloud in the point cloud data correspond to normal pixel coordinates in the normal image according to the internal parameters and the external parameters;
calculating the target pixel coordinates in the current image, corresponding to the distortion, of the normal pixel coordinates according to the distortion parameters;
reconstructing the mapping data according to the internal parameters, the external parameters, the target pixel coordinates and the space coordinates of the feature point cloud;
the distortion parameters comprise radial deformation coefficients and tangential deformation coefficients, and the target pixel coordinates are calculated by the following formula:
x" = x'×(1 + k1×r^2 + k2×r^4) + 2×p1×x'×y' + p2×(r^2 + 2×x'^2);
y" = y'×(1 + k1×r^2 + k2×r^4) + 2×p2×x'×y' + p1×(r^2 + 2×y'^2);
wherein r^2 = x'^2 + y'^2, r represents the distortion factor of the physical coordinates of the image, k1 and k2 are radial deformation coefficients, p1 and p2 are tangential deformation coefficients, u_d and v_d are the pixel coordinates in the distorted current image, u and v are the normal pixel coordinates in the normal image, x' and y' are the intermediate quantities from the camera coordinate system to the image coordinate system, x" and y" are the coordinates of the distortion location, (c_x, c_y) is the offset of the optical axis to the center of the projection plane coordinates, and f_x and f_y are the focal lengths of the camera.
2. The method of claim 1, wherein reconstructing the mapping data from the parameter information comprises:
and reconstructing the mapping data for the current image with offset according to the pixel coordinates of the feature points in the external reference, the internal reference and the current image and the space coordinates of point clouds in the corresponding point cloud data.
3. The method of claim 1, further comprising, after calculating that the normal pixel coordinates correspond to target pixel coordinates in the distorted current image:
and rounding the coordinates of the target pixels which are not integers.
4. The method of claim 1, wherein the camera is mounted on an electric tower of an electric transmission line.
5. The method of claim 1, wherein the point cloud data is acquired by a lidar.
6. The method of claim 1, wherein the camera is calibrated using a checkerboard calibration method to obtain parameter information.
7. The method of claim 6, wherein calibrating the camera using a checkerboard calibration method comprises:
preparing a checkerboard, shooting the checkerboard with known size at different angles by using a shooting device to obtain a group of images, detecting calibration board corner points in the images to obtain pixel coordinate values of the calibration board corner points, calculating to obtain physical coordinate values of the calibration board corner points according to the known checkerboard size and the origin of a world coordinate system, and calibrating the shooting device to obtain an internal and external parameter matrix and distortion parameters of the shooting device by using the pixel coordinate of each calibration board corner point and the physical coordinate of each calibration board corner point under the world coordinate system.
8. The method of claim 1, wherein the internal parameters include the physical size of each pixel on the horizontal axis of the image, the physical size of each pixel on the vertical axis of the image, the warping factor r of the physical coordinates of the image, the focal length f, and the numbers of horizontal and vertical pixels between the center pixel coordinates of the image and the origin pixel coordinates of the image.
9. The method of claim 1, wherein the external parameters include a rotation matrix and translation vector of the spatial coordinate system converted to the camera coordinate system.
10. A system for reconstructing mapping data of an image of a transmission line and a point cloud, comprising a photographing device and a server, wherein the server comprises a processor and a storage device, the storage device stores a plurality of instructions, and the processor is configured to read the plurality of instructions and execute the method according to any one of claims 1-9.
CN202210405718.3A 2022-04-18 2022-04-18 Method and system for reconstructing mapping data of power transmission line image and point cloud Active CN114693807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210405718.3A CN114693807B (en) 2022-04-18 2022-04-18 Method and system for reconstructing mapping data of power transmission line image and point cloud

Publications (2)

Publication Number Publication Date
CN114693807A CN114693807A (en) 2022-07-01
CN114693807B true CN114693807B (en) 2024-02-06

Family

ID=82142473

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210405718.3A Active CN114693807B (en) 2022-04-18 2022-04-18 Method and system for reconstructing mapping data of power transmission line image and point cloud

Country Status (1)

Country Link
CN (1) CN114693807B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117437303B (en) * 2023-12-18 2024-02-23 江苏尚诚能源科技有限公司 Method and system for calibrating camera external parameters

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102436660A (en) * 2011-11-08 2012-05-02 北京新岸线网络技术有限公司 Automatic correction method and device of 3D camera image
CN105222788A (en) * 2015-09-30 2016-01-06 清华大学 The automatic correcting method of the aircraft course deviation shift error of feature based coupling
CN109410207A (en) * 2018-11-12 2019-03-01 贵州电网有限责任公司 A kind of unmanned plane line walking image transmission line faultlocating method based on NCC feature
CN113887641A (en) * 2021-10-11 2022-01-04 山东信通电子股份有限公司 Hidden danger target determination method, device and medium based on power transmission channel
CN114050650A (en) * 2021-11-12 2022-02-15 国网冀北电力有限公司电力科学研究院 Intelligent tower based on power transmission line regional autonomous system architecture
CN114255396A (en) * 2021-11-01 2022-03-29 南方电网数字电网研究院有限公司 Power transmission line environment reconstruction method, system and device and controller

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8194936B2 (en) * 2008-04-25 2012-06-05 University Of Iowa Research Foundation Optimal registration of multiple deformed images using a physical model of the imaging distortion


Also Published As

Publication number Publication date
CN114693807A (en) 2022-07-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant