CN112037192A - Method for collecting burial depth information in town gas public pipeline installation process - Google Patents


Publication number
CN112037192A
CN112037192A (application CN202010889767.XA)
Authority
CN
China
Prior art keywords
image
coordinate system
camera
point
information
Prior art date
Legal status (assumption, not a legal conclusion): Pending
Application number
CN202010889767.XA
Other languages
Chinese (zh)
Inventor
高智勇
王金金
高建民
谢军太
张晓明
韩毅
田云祥
Current Assignee
Shaanxi Special Equipment Inspection And Testing Institute
Xian Jiaotong University
Original Assignee
Shaanxi Special Equipment Inspection And Testing Institute
Xian Jiaotong University
Priority date
Filing date
Publication date
Application filed by Shaanxi Special Equipment Inspection And Testing Institute, Xian Jiaotong University filed Critical Shaanxi Special Equipment Inspection And Testing Institute
Priority to CN202010889767.XA priority Critical patent/CN112037192A/en
Publication of CN112037192A publication Critical patent/CN112037192A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 5/70
    • G06T 5/80
    • G06T 7/70 Determining position or orientation of objects or cameras
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20048 Transform domain processing
    • G06T 2207/20061 Hough transform

Abstract

The invention discloses a method for acquiring burial depth information during the installation of town gas public pipelines, providing a digital solution to the problems of complex field burial depth detection and difficult supervision. The image acquisition function of a portable terminal is exploited during construction, which reduces the requirements on detection equipment and environment and offers strong real-time operability in the field. The measurement steps are standardized, reducing interference from human factors and avoiding errors caused by lens distortion while preserving the integrity of effective information. The method uses a field marker post as the standard object. First, the Exif information contained in the image is extracted and the relevant preprocessing is completed to improve the signal-to-noise ratio of the image; then the coordinate values of the feature points/control points in the image are obtained by Hough transform, and the conversion relationship between ideal points and distorted points is established through a camera perspective model. The orientation of the camera is judged, the control points are located accurately, and the correction is completed. Finally, the burial depth of the pipeline is calculated from the proportional relationship between the pipeline burial depth and the known marker post, guaranteeing digital monitoring and quality inspection during the installation of urban public gas pipelines.

Description

Method for collecting burial depth information in town gas public pipeline installation process
Technical Field
The invention belongs to the technical field of town gas public pipeline installation, and particularly relates to a method for acquiring burial depth information in the town gas public pipeline installation process.
Background
The town gas pipe network differs from long-distance pipelines and has a unique complexity: compared with long-distance pipelines, town gas pipelines are more vulnerable to third-party damage. The planning of gas pipe networks in third-tier urban areas of China is particularly disordered, and the construction and service standards of these pipelines differ greatly from those of long-distance pipelines. Most gas pipe networks are buried underground, close to main traffic roads and in crowded downtown areas, and the conveyed medium is flammable and explosive gas. In addition, the gas pipe network operates continuously for long periods, and its accidents are generally characterized by suddenness, unpredictability and high harmfulness. According to an analysis of town gas pipeline engineering and in-service pipelines in Shaanxi province, pipe leakage caused by insufficient pipeline burial depth has become an important factor affecting the safe operation of gas pipelines, especially in bridge, river channel, crossing and similar areas. The causes of insufficient burial depth can be roughly summarized as follows: 1) backfilling not carried out properly, so that the backfill soil settles; 2) water and soil loss during the rainy or flood season; 3) insufficient excavation depth in bridge, river channel and similar sections; 4) large elevation differences at crossings, leaving insufficient cover at ridge tops and slope toes; 5) the pipe trench not backfilled in time after excavation, with trench-bottom silting and trench collapse reducing the burial depth.
At present, research on insufficient pipeline burial depth mostly remains at the design and management level, emphasizing strict supervision and system specifications before, during and after construction. Addressing the non-standard design, unqualified quality and unscientific technology of public pressure pipeline installation, one study offers optimization suggestions from the perspective of government supervision, such as optimized pipeline installation design, strict inspection and implementation of supervision responsibility; from the perspective of burial depth design, Wang Yikun et al. simulate and analyze the influencing factors at each depth, such as ground temperature distribution, soil cover load and construction cost, and propose a scheme for determining a reasonable burial depth. In actual construction, the check points for pipeline burial depth include measurement during trench excavation, spot checks during trench inspection, inspection of concealed works after the pipeline enters the trench, and measurement of digital pipeline data. The installation process mainly refers to the trench-lowering stage of the pipeline, during which the pipe-top elevation must be measured several times and corrected promptly when it does not meet requirements; before the trench is backfilled, the pipe-top elevation must be measured again to ensure that the burial depth meets requirements. At the present stage, manual measurement with a tape measure is still used; digital automatic acquisition cannot be realized, and supervisors can hardly achieve comprehensive supervision.
Disclosure of Invention
The invention aims to provide a solution for digitally acquiring the buried depth of a gas pipeline in a construction site, which is a method for acquiring large-scene plane geometric elements based on image measurement.
In order to achieve the purpose, the invention adopts the following technical scheme: a buried depth information acquisition method in the town gas public pipeline installation process comprises the following steps:
step 1), acquiring field image information to obtain an original image;
step 2), extracting Exif information of the original image obtained in the step 1);
step 3), preprocessing the original image obtained in the step 1) to obtain a clear binarized image of the marker post;
step 4), performing classical Hough transform on the binary image obtained in the step 3) to obtain end points of horizontal and vertical benchmarks on the image surface and intersection point coordinate values of the end points;
step 5), constructing a perspective geometric model according to a world coordinate system and image coordinate system transformation relation in the camera imaging principle;
and 6) correcting the perspective geometric distortion in the image by combining the camera parameters, formulating a calculation flow chart, and calculating the buried depth of the pipeline by using the corrected coordinates and the known size of the marker post.
Further, in the step 1), the original image is acquired according to a unified acquisition flow. A unified, standardized measurement flow copes with the complex background environment, the obvious size differences between actual measurement scenes, and the large span of the burial depth range of the pipeline to be measured.
Further, the Exif information of the original image includes time, geographical position and camera parameters.
Further, the field measurement in the step 1) adopts a marker post as a standard object for measurement, the vertical marker post is contacted with the bottom of the pipe ditch or the top of the pipeline, and the horizontal marker post is coplanar with the ground plane and is arranged perpendicular to the vertical marker post; the optical axis of the camera is perpendicular to the plane of the marker post, so that the image plane is parallel to the plane of the marker post, and the view field covers the whole marker post, thereby completing image acquisition.
Through the design of the steps, the specification can meet the requirements of on-site quick response and real-time measurement, and simultaneously contains geometric element information and standard substances required by single-image measurement.
Further, the Exif information in the step 2) is a set of shooting parameters embedded in a fixed file format within a JPEG image file, including shooting parameters, time, place, camera brand and model, color coding, and similar information, where the shooting parameters include aperture, shutter, ISO and focal length; the Exif information is placed in the header of the JPG file, and in .NET it can be read through a PropertyItem object.
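As noted, Exif data lives in the JPEG header, inside the APP1 segment. The following stdlib-only sketch (a simplified check, not a full parser) tests whether a byte stream carries such a segment; the sample bytes are fabricated for illustration:

```python
import struct

def has_exif(data: bytes) -> bool:
    """Scan JPEG segment markers for an APP1 segment carrying Exif."""
    if data[:2] != b"\xff\xd8":            # SOI marker missing: not a JPEG
        return False
    i = 2
    while i + 4 <= len(data) and data[i] == 0xFF:
        marker = data[i + 1]
        (length,) = struct.unpack(">H", data[i + 2:i + 4])
        if marker == 0xE1 and data[i + 4:i + 10] == b"Exif\x00\x00":
            return True                    # APP1 segment with Exif signature
        i += 2 + length                    # length field counts itself, skip segment
    return False

# Minimal fabricated JPEG prefix containing an Exif APP1 segment:
sample = b"\xff\xd8\xff\xe1\x00\x08Exif\x00\x00"
print(has_exif(sample))   # True
```

In practice a library such as exifread (used later in the embodiment) handles the parsing; this sketch only illustrates where the claim says the data resides.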
Further, the image preprocessing flow in step 3) includes: calculating the image gradient, binarization, filling and filtering. Gradient calculation extracts the edge information in the image; a filling operator performs edge connection where the marker-post edges are segmented and not closed, yielding continuous and complete marker-post information; and the image is mean-filtered with a 4 x 4 convolution kernel. Background noise is thereby eliminated, the signal-to-noise ratio is improved, and the accuracy of the subsequent Hough transform increases.
Further, step 4) performs classical Hough transformation, converts the coordinate space into a linear parameter space, extracts peak parameters, detects a straight line in the image, outputs an endpoint coordinate, and then obtains an intersection coordinate value according to a general equation of the straight line.
Further, in the step 5), firstly, a camera imaging model needs to be established, a relationship between a world coordinate system and an image coordinate system is obtained, a 3D real object point in the world coordinate system is converted to a 2D image used for image processing and taking a pixel as a unit, and a forward projection process is as follows:
firstly, defining a world coordinate system, and establishing a 3D coordinate system (U, V, W) by taking an actual object as a center; determining an optical axis and an optical center of a camera, and establishing a camera coordinate system (X, Y, Z) according to a right-hand system principle; converting points in a world coordinate system into a camera coordinate system through an external parameter matrix, wherein the external parameter matrix comprises a rotation matrix R and a translation matrix T;
$$\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} = \begin{bmatrix} R & T \\ \mathbf{0}^T & 1 \end{bmatrix} \begin{bmatrix} U \\ V \\ W \\ 1 \end{bmatrix}$$

wherein

$$\begin{bmatrix} R & T \\ \mathbf{0}^T & 1 \end{bmatrix}$$

is marked as $M_{ext}$.
Secondly, using the camera intrinsic parameter f, the focal length (the distance from the imaging plane to the optical center, measured along the camera principal axis), points in the camera coordinate system are converted into the imaging coordinate system, giving 2D coordinate points (x, y) on the photosensitive device; introducing a homogeneous coordinate system yields:
$$Z \begin{bmatrix} x \\ y \\ 1 \end{bmatrix} = \begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix} \begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix}$$

wherein

$$\begin{bmatrix} f & 0 & 0 & 0 \\ 0 & f & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}$$

is marked as $M_{project}$.
Finally, the imaging coordinate system is converted to the image pixel coordinate system through the camera intrinsic parameters Ox, Oy (the translation from the center of the imaging coordinate system to the center of the pixel coordinate system) and Sx, Sy (the sampling rates in the x- and y-axis directions when the image formed on the photosensitive device is sampled), giving the image pixel coordinates (u, v);
$$\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 1/S_x & 0 & O_x \\ 0 & 1/S_y & O_y \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}$$

where the matrix is marked as $M_{affine}$.
Thereby obtaining the conversion relation between the image shot on site and the real physical space point, and recording as:
$$P(u,v,1) = M_{affine} M_{project} M_{ext}\, P(U,V,W,1)$$
wherein, P (u, v,1) is the coordinates of the pixel points of the digital image; p (U, V, W,1) is the coordinate of the object in the world coordinate system.
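This forward projection chain can be sketched numerically. All concrete values below (f, R, T, Ox, Oy, Sx, Sy and the world point) are illustrative assumptions, not data from the patent:

```python
import numpy as np

# Extrinsic matrix M_ext: rotation R (identity here) and translation T
R = np.eye(3)
T = np.array([0.0, 0.0, 5.0])          # camera placed 5 units from the object
M_ext = np.eye(4)
M_ext[:3, :3] = R
M_ext[:3, 3] = T

# Projection matrix M_project with focal length f
f = 4.0
M_project = np.array([[f, 0, 0, 0],
                      [0, f, 0, 0],
                      [0, 0, 1, 0]], dtype=float)

# Affine matrix M_affine: sampling rates S_x, S_y and center offsets O_x, O_y
Sx = Sy = 0.01
Ox = Oy = 400.0
M_affine = np.array([[1 / Sx, 0,      Ox],
                     [0,      1 / Sy, Oy],
                     [0,      0,      1]], dtype=float)

def project(world_point):
    """Map a 3D world point (U, V, W) to pixel coordinates (u, v)."""
    P = np.append(world_point, 1.0)           # homogeneous (U, V, W, 1)
    p = M_affine @ M_project @ M_ext @ P      # homogeneous pixel coordinates
    return p[:2] / p[2]                       # perspective divide

u, v = project(np.array([1.0, 2.0, 0.0]))
print(u, v)   # 480.0 560.0
```

The perspective divide at the end is what makes the mapping nonlinear in depth; everything before it is a product of the three matrices named in the text.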
The measurement specification requires that no wide-angle lens be used when collecting images, so nonlinear lens distortion caused by radial curvature change and collinearity error of the optical lens can be neglected. However, the field space is limited and imaging often has to be carried out at a certain inclination angle; the optical axis of the photographic objective is then not perpendicular to the measured plane, geometric distortion arises on the image plane from the perspective principle, the proportions of the image change, and the influence on the measurement result is not negligible.
Further, in the step 6): let u and v be the coordinates of the distorted image point and u 'and v' be the coordinates of the ideal image point, the correction model is represented as follows:
P(u′,v′)=f(u,v)
and f is determined by the geometric projection relation between the established distorted image and the ideal image, and the image with the correct proportion is obtained by correcting each pixel position in the distorted image through numerical calculation.
The following points need attention in the correction process: first, determine the camera intrinsic parameter, namely the focal length f, and unify the nominal values; second, preliminarily judge the orientation of the camera, select a control point as the standard point, and calculate the external parameter θ, i.e. the angle between the camera optical axis and the object plane; third, reasonably select the coordinate range of the corrected output, saving computation cost and improving detection efficiency.
Compared with the prior art, the invention has the following beneficial technical effects:
the invention discloses a buried depth information acquisition method in the town gas public pipeline installation process, and provides a digital solution for the problems of complex field buried depth detection and high supervision difficulty: the image acquisition function of the portable terminal in the construction process is skillfully utilized, the requirements on detection equipment and environment are reduced, and the field real-time operability is strong; and the measurement steps are standardized, so that the interference of human factors is reduced and the error caused by the distortion of the original lens is avoided under the condition of ensuring the integrity of effective information. The method comprises the steps of adopting a field marker post as a standard object, firstly extracting Exif information contained in an image, further finishing relevant preprocessing, and improving the signal-to-noise ratio of the image; then, coordinate values of the feature points/control points in the image are obtained through Hough transformation; and establishing a conversion relation between the ideal point and the distortion point through a camera perspective model. And judging the orientation of the camera, finding the control point accurately and finishing correction. Finally, the pipeline buried depth value is calculated by utilizing the proportional relation between the pipeline buried depth and the known mark post, and the measurement precision mainly depends on the correction accuracy. The digital real-time monitoring can be realized by embedding the digital real-time monitoring system into a terminal.
Furthermore, the identification of the burial depth in the quality control of the pipeline installation process is a new requirement of digital supervision under the actual engineering background, and the prior art has no targeted research and measurement specification. Therefore, the invention has certain technical guidance significance.
Drawings
Fig. 1 is a flow chart of automatic buried depth recognition.
Fig. 2 is a schematic diagram of two-dimensional planar imaging of a camera.
Fig. 3 is a perspective geometric distortion model of a camera.
FIG. 4 is a flow chart of a computer correcting an original image.
Fig. 5 is a flow chart of an algorithm for determining the external parameter θ.
Fig. 6 shows the original image Exif information extraction result.
Fig. 7 is a graph comparing the results of image preprocessing.
Fig. 8 is a graph of the result of detecting straight lines by hough transform.
Fig. 9 is a control point method calibration schematic.
Fig. 10 is a graph of the correction results.
Detailed Description
The invention is described in further detail below with reference to the accompanying drawings:
as shown in fig. 1 to 8, the method for collecting burial depth information during town gas public pipeline installation mainly comprises the following steps. First, the image is acquired digitally so that the required physical data can be extracted from the geometric elements of a single-view image. Second, Exif information is extracted to obtain the camera intrinsic parameters and the shooting time and place, enabling monitoring in both the time and space dimensions. The core step is calibrating the image: a geometric distortion model of the camera is established, the transformation relationship between the distorted image and the ideal image is derived, and a computer corrects the image point by point. During the correction calculation, the camera position must first be judged from the coordinate values of points in image space and used as a control point for parameter solution; therefore, after image preprocessing, Hough transform is used for line detection, the benchmark lines are fitted, and the coordinates of each feature point are solved. Given the complex field environment, the images must be preprocessed before the Hough transform to ensure accurate line detection. The method makes full use of the field intelligent terminal and a unified, standard measurement requirement to simplify a complex problem; compared with traditional image measurement it needs no repeated measurements and, while meeting the measurement precision requirement, no camera calibration or three-dimensional modeling, making it efficient and convenient and opening the way to digital real-time monitoring.
The invention specifically comprises the following steps:
step 1), acquiring field image information according to a uniform and standard measurement process. The problems that the background environment is complex, the size difference of an actual measurement scene is obvious, the span of the buried depth range of the pipeline to be measured is large and the like are solved;
step 2), extracting Exif information of the original image, wherein the Exif information comprises effective information such as time, geographic position, camera parameters and the like;
step 3), preprocessing the image to obtain a clear binarized image of the marker post;
step 4), acquiring end points of horizontal and vertical benchmarks on the image surface and intersection point coordinate values thereof through classical Hough transformation;
step 5), constructing a perspective geometric model according to a world coordinate system and image coordinate system transformation relation in the camera imaging principle;
and 6) correcting the perspective geometric distortion in the image by combining the camera parameters, formulating a calculation flow chart, and calculating the buried depth of the pipeline by using the corrected coordinates and the known size of the marker post.
The accuracy of final calculation is mainly determined by three key steps of geometric correction model, preprocessing and Hough transform straight line fitting.
The imaging principle of the camera is shown in fig. 2; the transformation between the digital image pixel coordinates P(u, v, 1) and a point P(U, V, W, 1) in the actual physical coordinate system is:
$$P(u,v,1) = M_{affine} M_{project} M_{ext}\, P(U,V,W,1)$$
where $M_{affine}$ represents the affine/rigid transformation matrix, $M_{project}$ the projective transformation matrix (whose parameters come from the camera interior, such as the focal length), and $M_{ext}$ the camera extrinsic matrix, containing the rotation and translation parameters. These transformations can be viewed as linear transformations whose rules remain constant during the correction of perspective geometric distortion, so only the relationship between the distortion point and the ideal point needs to be derived; no parameter calibration is required.
Then, the following calibration model is set up: the x-axis and y-axis of the image plane coordinate system are shown in fig. 3, where the y-axis passes through the origin and is perpendicular to the picture plane. Under ideal photographing conditions, the optical axis of the camera is perpendicular to the object plane, and when distortion occurs, the actual object plane is not perpendicular to the optical axis, namely theta is not equal to 90 degrees. The coordinates of the point A on the object plane on the imaging plane before and after the distortion are respectively (x, y) and (x ', y'), d is the distance from the object point A on the object plane to the intersection point of the optical axis and the object plane, L is the distance from the ideal object plane to the lens, the focal length of the lens is recorded as f, and the geometrical relationship can be obtained by the optical imaging principle:
$$\frac{x'}{f} = \frac{d}{L} \quad (1)$$

$$l = d \sin\theta, \qquad L_A = L - d \cos\theta \quad (2)$$

$$\frac{x}{f} = \frac{l}{L_A} \quad (3)$$
simultaneous equations (1), (2), (3) yield the relationship between the distorted coordinate x and the ideal coordinate x' as:
$$x = \frac{f x' \sin\theta}{f - x' \cos\theta} \quad (4)$$
since the y coordinate axis is always perpendicular to the optical axis, the corresponding image point coordinates y and y' depend only on the distance of the lens from the plane in which the object point is perpendicular to the optical axis, so that:
$$\frac{y}{y'} = \frac{L}{L - d \cos\theta} \quad (5)$$
simultaneous equations (2), (3), (5) can be rewritten as:
$$y = \frac{f y'}{f - x' \cos\theta} \quad (6)$$
the coordinate transformation relationship between the distorted image and the ideal image can be known from the formulas (4) and (6).
On the basis of this model, geometric correction can be performed by computer. Starting from the coordinates of each pixel of the ideal image, the corresponding coordinates in the distorted image are obtained with equations (4) and (6), and the gray value of the corresponding distorted pixel is assigned to the ideal pixel. A loop over all pixels yields the corrected image.
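A minimal numerical sketch of this per-pixel correction follows, using equations (4) and (6); the image contents, f and θ values are illustrative, and treating the image center as the coordinate origin is an assumption for the sketch:

```python
import math
import numpy as np

def correct_image(G, f, theta):
    """For each ideal pixel (x', y'), compute its distorted source (x, y)
    via equations (4) and (6) and copy the gray value over
    (nearest-neighbor resampling)."""
    h, w = G.shape
    out = np.zeros_like(G)
    cx, cy = w / 2.0, h / 2.0                # image center as origin (assumption)
    for yi in range(h):
        for xi in range(w):
            xp = xi - cx                      # ideal coordinate x'
            yp = yi - cy                      # ideal coordinate y'
            denom = f - xp * math.cos(theta)
            if abs(denom) < 1e-9:
                continue
            x = f * xp * math.sin(theta) / denom      # equation (4)
            y = f * yp / denom                        # equation (6)
            xs, ys = int(round(x + cx)), int(round(y + cy))
            if 0 <= xs < w and 0 <= ys < h:
                out[yi, xi] = G[ys, xs]
    return out

# Sanity check: at theta = 90 deg the optical axis is perpendicular to the
# object plane, so the mapping must be the identity.
img = np.arange(16, dtype=float).reshape(4, 4)
assert np.allclose(correct_image(img, f=100.0, theta=math.pi / 2), img)
```

Iterating over ideal pixels and sampling the distorted image (inverse mapping) avoids holes in the output, which is why the loop runs in that direction.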
The flow of the correction algorithm is shown in fig. 4, and the input parameters include: the original image G (x, y), the focal length f of the camera and the included angle theta between the optical axis of the camera and the actual object plane. The camera focal length f described above may be obtained from Exif information of an image; when unit normalization is carried out, the required size of the original image is also obtained by Exif information; and the determination of the value of θ is shown in fig. 5:
the hough transform detects a horizontal straight line L1: a. the1(x1,y1),B1(x2,y2) Vertical straight line L2: a. the2(x3,y3),B2(x4,y4) And Cross of two straight lines is O (x)0,y0). Due to the limitation of field conditions, the camera shooting horizon is above the horizontal marker post, so that a 'relatively low point', namely a near point, is selected as a control point for calibration; the shooting height of the camera is a depression shot, and a 'relative high point' is selected as a control point. The angle between the camera optical axis and the object plane X, Y can be calculated by equation (6).
Coordinate axes X, Y are independent of each other and thus can be traversed column by column, respectively, to complete the calibration. In the calculation, the focal length f and the coordinate unit need to be normalized. Finally, calculating the burial depth according to the corrected coordinates:
$$H = \frac{l_v}{l_h}\, L_0$$

where $l_v$ and $l_h$ are the corrected pixel lengths of the vertical and horizontal marker posts and $L_0$ is the known physical length of the horizontal marker post.
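With corrected coordinates in hand, the depth computation reduces to a ratio against the known marker post. A toy example with made-up pixel lengths:

```python
def burial_depth(vertical_px, horizontal_px, benchmark_m):
    """Scale the vertical marker-post pixel length by the known
    physical length of the horizontal marker post."""
    return vertical_px / horizontal_px * benchmark_m

# If a 1.0 m horizontal post spans 200 px and the vertical post spans 320 px
# in the corrected image, the burial depth is:
print(burial_depth(320, 200, 1.0))   # 1.6
```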
example (b):
the example selects the image collected by the current place of Han-Tou pipe trench construction.
The method comprises the following steps: extracting Exif information of an image
The buried depth of the on-site gas pipeline is measured according to the standard of the unified requirement, an intelligent terminal is adopted to collect images, and the Exif information of an original image (non-compressed) is extracted and stored. The method mainly comprises the size of an original image, a shooting focal length, a geographic position and time information.
Step two: image pre-processing
Firstly, gradient transformation is applied to the image to find the edge information in the horizontal and vertical directions; then a global gradient threshold is set and the image is binarized; conventional filtering then smooths out the noise in the image; and filling in the two-dimensional directions addresses the edge gaps in the image, giving continuous edge information.
Step three: detecting straight line by Hough transform, calculating intersection point, and calculating parameters
Performing classical Hough transformation on the image to obtain a Hough matrix, namely a linear parameter matrix; then searching a peak point in the parameter matrix, namely finding out a straight line position; determining a linear equation by using the end point coordinates of the straight line; finally, simultaneous equations are used to calculate the intersection point of straight lines; the external parameter θ is calculated as mentioned above.
Step four: correcting perspective geometric distortion of image and calculating burial depth
According to the conversion relation between the ideal points and the distortion points in the perspective geometric distortion model, an original image and calculation parameters are input, and the gray value of the distortion points in the original image is assigned to the coordinates of the corresponding ideal points according to the computer correction process. And finally, outputting the ideal coordinates of the target range and calculating the burial depth.
1. Extracting Exif information of an image
The image is read and its Exif information extracted with the exifread library; the tags are traversed and stored in a dictionary variable. During extraction, it must first be judged whether the required Exif information exists: for example, extraction of the GPSInfo information fails when the image has been compressed or the camera did not record it at shooting time. Images that do not meet the requirements are discarded and the user is prompted to "upload an image that meets the requirements again". The Exif information extracted in this example is shown in fig. 6.
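The validity check described here can be sketched as follows. The tag names mirror common keys as exposed by exifread, but the exact `REQUIRED_TAGS` set and the helper name are assumptions for illustration:

```python
# Hedged sketch: validate that the tag dictionary extracted from an image
# contains everything the measurement needs, rejecting stripped/compressed images.
REQUIRED_TAGS = ("EXIF FocalLength", "EXIF DateTimeOriginal", "GPS GPSLatitude")

def validate_exif(tags):
    """Return the list of missing required tags (empty list == usable image)."""
    return [t for t in REQUIRED_TAGS if t not in tags]

tags_ok = {"EXIF FocalLength": "4",
           "EXIF DateTimeOriginal": "2020:08:28 10:00:00",
           "GPS GPSLatitude": "[34, 15, 57]"}
tags_stripped = {"EXIF FocalLength": "4"}     # e.g. GPS data lost in compression

assert validate_exif(tags_ok) == []
missing = validate_exif(tags_stripped)
if missing:
    print("Please upload an image that meets the requirements again:", missing)
```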
2. Image pre-processing
Fig. 7.a is a field measurement image of this example. The preprocessing mainly prepares for the subsequent Hough transform: to detect the straight lines, i.e. the edge information of the marker posts, gradient calculation is first performed on the image. Sobel operators in the x and y directions are defined and convolved with the original image to compute the gradient/edge information $g_x$, $g_y$ in the image, as shown in fig. 7.b.
$$S_x = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix}, \qquad S_y = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix}$$
Then, a gradient threshold value is set, binarization is carried out on the image, and interference of weak edges with small gradient changes can be removed. Here, it is set that:
$$T_x = th \times \max(g_x), \qquad T_y = th \times \max(g_y)$$

Taking $th = 0.3$: pixels whose gradient exceeds $T_x$ are assigned 1 and the rest 0; the same applies in the y direction. The binarization results are shown in fig. 7.c.
For the edge-gap problem, detection and filling are carried out from the horizontal and vertical directions respectively to obtain continuous edges. Noise arising during filling is removed by mean filtering, and the final image preprocessing result is shown in fig. 7.d.
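A numpy-only sketch of this gradient-plus-threshold stage (the Sobel kernel and th = 0.3 follow the text; the test image is a synthetic bright stripe, not field data):

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1],
                    [-2, 0, 2],
                    [-1, 0, 1]], dtype=float)

def convolve2d(img, k):
    """Plain valid-mode 2D correlation, sufficient for a 3x3 kernel."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(h - 2):
        for j in range(w - 2):
            out[i, j] = np.sum(img[i:i + 3, j:j + 3] * k)
    return out

def binarize_edges(img, th=0.3):
    """Sobel gradients in x and y, thresholded at th * max as in the text."""
    gx = np.abs(convolve2d(img, SOBEL_X))
    gy = np.abs(convolve2d(img, SOBEL_X.T))
    bx = (gx > th * gx.max()).astype(np.uint8)
    by = (gy > th * gy.max()).astype(np.uint8)
    return bx | by

# Synthetic image: a bright vertical stripe, so a vertical edge survives.
img = np.zeros((8, 8))
img[:, 4:] = 255.0
edges = binarize_edges(img)
print(edges.sum() > 0)   # True
```

The global threshold relative to the maximum gradient is what suppresses weak edges with small gradient changes, as the preceding step describes.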
3. Detecting straight line by Hough transform, calculating intersection point, and calculating parameters
A straight line through a point $(x_i, y_i)$ in the image x-y coordinate space is expressed as $y_i = a x_i + b$, where the parameter $a$ is the slope and $b$ the intercept. Infinitely many straight lines pass through $(x_i, y_i)$; taking $a$, $b$ as variables, the line can be written as $b = -x_i a + y_i$, which transforms the coordinate plane into the parameter plane and completes the Hough transform.
Each point on the straight line through (xi, yi) and (xj, yj) in image coordinate space corresponds to a straight line in the parameter space a-b; these parameter-space lines intersect at the point (a0, b0), and a0, b0 are exactly the parameters of the line determined by (xi, yi) and (xj, yj) in the image x-y space. In other words, parameter-space lines that intersect at one point represent collinear points in image space. The code implementation steps are as follows:
(1) Perform the Hough transform on the binary image with the hough() function to obtain the Hough matrix.
(2) Find the peak points in the Hough matrix with the houghpeaks() function.
(3) Obtain the straight-line information in the original binary image from the results of the previous two steps with the houghlines() function.
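The three steps above can be mirrored with a minimal standalone Hough accumulator. This is an illustrative re-implementation, not the patent's code; in practice a library routine such as OpenCV's `cv2.HoughLines` would normally be used.

```python
import numpy as np

def hough_lines(binary, n_theta=180):
    """Minimal standard Hough transform: every foreground pixel votes for
    all (rho, theta) lines through it; the strongest accumulator cell is
    returned, mirroring the hough()/houghpeaks()/houghlines() pipeline."""
    h, w = binary.shape
    diag = int(np.ceil(np.hypot(h, w)))
    thetas = np.deg2rad(np.arange(n_theta) - 90)  # theta in [-90, 90) degrees
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    ys, xs = np.nonzero(binary)
    for x, y in zip(xs, ys):
        rhos = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        acc[rhos, np.arange(n_theta)] += 1
    r, t = np.unravel_index(acc.argmax(), acc.shape)
    return r - diag, np.rad2deg(thetas[t])  # (rho, theta in degrees)
```

For a horizontal row of pixels at y = 2 the peak lands at rho = -2, theta = -90 degrees in this parameterisation.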
The processing results are shown in figs. 8.a and 8.b. From the detected end-point information of the two straight lines and the calculation flow of fig. 5, the external parameter θ is computed; in this example:
θx=-3.23°
θy=73.91°
4. correcting image perspective geometric distortion
According to the transformation relationship between ideal points and distortion points given by the camera geometric correction model, the computer performs geometric distortion correction following the flow chart of fig. 4: first, the original image G(x, y), the camera focal length f and the angle θ between the camera optical axis and the actual object plane are input; then the range of the output target coordinates (x′, y′) is determined, and a loop traverses this whole range, performing the coordinate-transform assignment.
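The traverse-and-assign loop of the flow chart can be sketched generically. This is an assumption-laden sketch: `map_fn` stands in for the perspective mapping built from f and θ, which the patent defines through its correction model; nearest-neighbour sampling is used for brevity.

```python
import numpy as np

def warp_inverse(src, map_fn, out_shape):
    """Generic inverse-mapping resampler: for every target pixel (x', y'),
    map_fn returns the source coordinates (x, y); the source is sampled
    nearest-neighbour. This is the loop structure of the fig. 4 flow chart."""
    h, w = out_shape
    out = np.zeros(out_shape, dtype=src.dtype)
    for yp in range(h):
        for xp in range(w):
            x, y = map_fn(xp, yp)
            xi, yi = int(round(x)), int(round(y))
            if 0 <= yi < src.shape[0] and 0 <= xi < src.shape[1]:
                out[yp, xp] = src[yi, xi]
    return out
```

Inverse mapping (target to source) is chosen here because it guarantees every output pixel receives a value, avoiding the holes a forward mapping would leave.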
In the calculation, the measurements are first normalized. The pixel coordinates of a digital image are represented by row and column numbers, so the minimum unit is the distance between two adjacent pixels. In this example, the actual focal length of the camera is 4 mm and the 35 mm-equivalent focal length is 23 mm, both obtained from the camera parameters in the Exif data. Since the frame width of the image on 35 mm film is 24 mm and the original image size is 800 × 586, the normalized camera lens focal length is:
f = 23 / 24 × 800 ≈ 766.7 (pixels)
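Numerically, this normalization is a unit conversion from millimetres on the film frame to pixels on the image. A sketch, using the example's stated values (24 mm frame width, 800 px image width):

```python
def normalized_focal_px(f35_mm, frame_width_mm, image_width_px):
    """35 mm-equivalent focal length expressed in pixel units:
    scale by image width (px) over film frame width (mm)."""
    return f35_mm / frame_width_mm * image_width_px
```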
In this example, the upper end point and the right end point serve as the control points for the vertical and horizontal directions respectively. Taking the horizontal direction as an example: let the coordinates of end point A of the horizontal marker post on the original image be (xA, yA) and those of point B be (xB, yB), as shown in fig. 9; after correction, A′ has coordinates (x′A, y′A) and B′ has coordinates (x′B, y′B). Point B is the control point, whose coordinates are unchanged by the correction, so after correction:
y′A=y′B=yB (9)
Combining this with expression (6), θ can be calculated; the vertical direction is corrected in the same way. The corrected result is shown in fig. 10.
The corrected horizontal marker-post line L′1 and vertical marker-post line L′2 are combined as simultaneous linear equations to obtain the intersection coordinates (cross′x, cross′y). With the known original marker-post height L0, the burial depth of the pipeline is calculated as:
h = L0 × (y′D − cross′y) / (y′D − y′C)

where (x′C, y′C) and (x′D, y′D) are the corrected upper and lower end points of the vertical marker post.
In the figure, the marker-post height is 200 cm, and the depth down to the pipeline is calculated to be 92.34 cm. Combining this with the pipe diameter gives the precise burial depth.
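The last two numeric steps, intersecting the two corrected marker-post lines and scaling by the known post height, can be sketched as follows. The helper names are illustrative; the below-ground-fraction assumption follows the measurement setup described in claim 4 (vertical post from trench bottom upward, horizontal post at ground level).

```python
import math

def line_intersection(p1, p2, p3, p4):
    """Intersection of the line through p1, p2 with the line through p3, p4
    (2-D points), via the standard determinant formula."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    d = (x1 - x2) * (y3 - y4) - (y1 - y2) * (x3 - x4)
    if d == 0:
        raise ValueError("parallel lines")
    a = x1 * y2 - y1 * x2
    b = x3 * y4 - y3 * x4
    return ((a * (x3 - x4) - (x1 - x2) * b) / d,
            (a * (y3 - y4) - (y1 - y2) * b) / d)

def burial_depth(post_top, post_bottom, cross, post_len_cm):
    """Depth = real post length x below-ground pixel fraction, where `cross`
    is the intersection of the vertical post with the horizontal ground post."""
    full = math.dist(post_top, post_bottom)
    below = math.dist(cross, post_bottom)
    return post_len_cm * below / full
```

With a 200 cm post whose midpoint sits at ground level, the routine returns a 100 cm depth, matching the proportional reading described above.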
In conclusion, the method for automatically acquiring burial depth during town gas public pipeline installation provided by the invention establishes a unified image-acquisition standard around the image-detection scheme, monitors the time and place of field detection through Exif information, and supplies camera parameters for the subsequent calculation. In the burial-depth calculation, the image is first corrected for distortion, then preprocessed and Hough-transformed. The flow is simple and efficient, and realizes automatic acquisition and digital monitoring of pipeline burial depth on the installation site.

Claims (9)

1. A buried depth information acquisition method in the town gas public pipeline installation process is characterized by comprising the following steps:
step 1), acquiring field image information to obtain an original image;
step 2), extracting Exif information of the original image obtained in the step 1);
step 3), preprocessing the original image obtained in step 1) to obtain a clear binarized image of the marker posts;
step 4), performing classical Hough transform on the binarized image obtained in step 3) to obtain the end points of the horizontal and vertical marker posts on the image plane and the coordinate values of their intersection;
step 5), constructing a perspective geometric model according to a world coordinate system and image coordinate system transformation relation in the camera imaging principle;
and 6) correcting the perspective geometric distortion in the image by combining the camera parameters, formulating a calculation flow chart, and calculating the buried depth of the pipeline by using the corrected coordinates and the known size of the marker post.
2. The method for collecting the burial depth information in the town gas public pipeline installation process according to claim 1, wherein in step 1) the original images are acquired following one and the same acquisition procedure.
3. The method for collecting burial depth information in the town gas public pipeline installation process as claimed in claim 1, wherein Exif information of the original image comprises time, geographical position and camera parameters.
4. The method for collecting the burial depth information in the town gas public pipeline installation process according to claim 1, wherein the field measurement in step 1) uses marker posts as the standard: the vertical marker post contacts the bottom of the pipe trench or the top of the pipeline, and the horizontal marker post is coplanar with the ground plane and placed perpendicular to the vertical marker post; the optical axis of the camera is perpendicular to the marker-post plane so that the image plane is parallel to it, and the field of view covers the whole marker posts, completing the image acquisition.
5. The method for collecting the burial depth information in the urban gas public pipeline installation process according to claim 1, wherein the Exif information in step 2) is a group of shooting parameters embedded in a fixed file format within a JPEG image file, including aperture, shutter, ISO and focal length, as well as time, place, camera brand and model, colour coding and the like; the Exif information is placed in the header of the JPG file, and in .NET it is acquired using a PropertyItem object.
6. The method for collecting the burial depth information in the town gas public pipeline installation process according to claim 1, wherein the image preprocessing in step 3) comprises: calculating the image gradient, binarization, filling and filtering; gradient calculation extracts the edge information in the image, and a filling operator connects the edges to deal with segmented, unclosed marker-post edges, yielding continuous and complete marker-post information; the image is then mean-filtered with a 4 × 4 convolution kernel.
7. The method for collecting the buried depth information in the town gas public pipeline installation process according to claim 1, wherein the step 4) is to perform classical Hough transformation, convert a coordinate space into a linear parameter space, extract a peak parameter, detect a straight line in an image, output an end point coordinate, and then calculate an intersection coordinate value according to a general equation of the straight line.
8. The method for collecting the buried depth information in the town gas public pipeline installation process according to claim 1, wherein in step 5), a camera imaging model is first established and the relationship from the world coordinate system to the image coordinate system is obtained, converting a 3D object point in the world coordinate system into the 2D pixel-based image used for image processing; the forward projection process is as follows:
firstly, defining a world coordinate system, and establishing a 3D coordinate system (U, V, W) by taking an actual object as a center; determining an optical axis and an optical center of a camera, and establishing a camera coordinate system (X, Y, Z) according to a right-hand system principle; converting points in a world coordinate system into a camera coordinate system through an external parameter matrix, wherein the external parameter matrix comprises a rotation matrix R and a translation matrix T;
[X, Y, Z, 1]ᵀ = [R, T; 0, 1] · [U, V, W, 1]ᵀ

wherein [R, T; 0, 1] is denoted Mext.
Secondly, using the camera internal parameter f, the focal length (the distance from the imaging plane to the optical centre, measured along the camera principal axis), points in the camera coordinate system are converted into the imaging coordinate system, giving the 2D coordinate point (x, y) on the photosensitive device; introducing homogeneous coordinates yields:
Z · [x, y, 1]ᵀ = [f, 0, 0, 0; 0, f, 0, 0; 0, 0, 1, 0] · [X, Y, Z, 1]ᵀ

wherein [f, 0, 0, 0; 0, f, 0, 0; 0, 0, 1, 0] is denoted Mproject.
Finally, the imaging coordinate system is converted into the image pixel coordinate system through the camera internal parameters Ox, Oy (the translation from the centre of the imaging coordinate system to the origin of the pixel coordinate system) and Sx, Sy (the sampling rates in the x- and y-axis directions when the image on the imaging plane/photosensitive device is sampled), yielding the image pixel coordinates (u, v);
[u, v, 1]ᵀ = [1/Sx, 0, Ox; 0, 1/Sy, Oy; 0, 0, 1] · [x, y, 1]ᵀ

wherein [1/Sx, 0, Ox; 0, 1/Sy, Oy; 0, 0, 1] is denoted Maffine.
Thereby obtaining the conversion relation between the image shot on site and the real physical space point, and recording as:
P(u,v,1)=MaffineMprojectMextP(U,V,W,1)
wherein, P (u, v,1) is the coordinates of the pixel points of the digital image; p (U, V, W,1) is the coordinate of the object in the world coordinate system.
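The three matrices of claim 8 compose into a single world-to-pixel mapping; a minimal numeric sketch follows. The symbols track the claim, while the concrete matrix layouts are the standard pinhole forms assumed here, not reproduced from the patent.

```python
import numpy as np

def pixel_from_world(Pw, R, T, f, Ox, Oy, Sx, Sy):
    """Compose the three transforms of claim 8: world -> camera (M_ext),
    camera -> imaging plane (M_project), imaging -> pixel (M_affine),
    then dehomogenise to get (u, v)."""
    M_ext = np.eye(4)
    M_ext[:3, :3] = R
    M_ext[:3, 3] = T
    M_project = np.array([[f, 0, 0, 0],
                          [0, f, 0, 0],
                          [0, 0, 1, 0]], dtype=float)
    M_affine = np.array([[1 / Sx, 0, Ox],
                         [0, 1 / Sy, Oy],
                         [0, 0, 1]], dtype=float)
    p = M_affine @ M_project @ M_ext @ np.append(np.asarray(Pw, float), 1.0)
    return p[:2] / p[2]  # (u, v)
```

With an identity pose and unit intrinsics the mapping reduces to the familiar perspective division (u, v) = (f·X/Z, f·Y/Z).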
9. The method for collecting the burial depth information in the town gas public pipeline installation process according to claim 8, wherein in step 6): letting u, v be the coordinates of a distorted image point and u′, v′ the coordinates of the ideal image point, the correction model is expressed as follows:
P(u′,v′)=f(u,v)
and f is determined by the geometric projection relation between the established distorted image and the ideal image, and the image with the correct proportion is obtained by correcting each pixel position in the distorted image through numerical calculation.
CN202010889767.XA 2020-08-28 2020-08-28 Method for collecting burial depth information in town gas public pipeline installation process Pending CN112037192A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010889767.XA CN112037192A (en) 2020-08-28 2020-08-28 Method for collecting burial depth information in town gas public pipeline installation process

Publications (1)

Publication Number Publication Date
CN112037192A true CN112037192A (en) 2020-12-04

Family

ID=73587703

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010889767.XA Pending CN112037192A (en) 2020-08-28 2020-08-28 Method for collecting burial depth information in town gas public pipeline installation process

Country Status (1)

Country Link
CN (1) CN112037192A (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20090104350A (en) * 2008-03-31 2009-10-06 성균관대학교산학협력단 Image Processing Method and Apparatus for Detecting the Lines of Images
CN101936761A (en) * 2009-06-30 2011-01-05 宝山钢铁股份有限公司 Visual measuring method of stockpile in large-scale stock ground
US20140193039A1 (en) * 2013-01-07 2014-07-10 Ronald M. Wexler System and method of measuring distances related to an object
CN106405660A (en) * 2016-08-26 2017-02-15 国脉科技股份有限公司 Buried depth detecting device and method of communication pipeline
CN108759973A (en) * 2018-04-28 2018-11-06 南京昊控软件技术有限公司 A kind of water level measurement method
CN109285145A (en) * 2018-08-12 2019-01-29 浙江农林大学 The more plants of standing tree height measurement methods based on smart phone
CN109448043A (en) * 2018-10-22 2019-03-08 浙江农林大学 Standing tree height extracting method under plane restriction
CN109443480A (en) * 2018-11-02 2019-03-08 南京邮电大学 Gauge positioning and water level measurement method based on image procossing
CN109842756A (en) * 2017-11-28 2019-06-04 东莞市普灵思智能电子有限公司 A kind of method and system of lens distortion correction and feature extraction
CN110807355A (en) * 2019-09-12 2020-02-18 天津大学 Pointer instrument detection and reading identification method based on mobile robot
CN110866545A (en) * 2019-10-30 2020-03-06 中国地质大学(武汉) Method and system for automatically identifying pipeline target in ground penetrating radar data
CN111292371A (en) * 2020-02-21 2020-06-16 廖赟 Intelligent water level detection equipment, intelligent water level monitoring management device and detection method
CN111476787A (en) * 2020-04-23 2020-07-31 中科开创(广州)智能科技发展有限公司 Automatic reading method and device for adaptive distortion of pointer meter

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
LIHUIWANG 等: "Measurement of harvesting width of intelligent combine harvester by improved probabilistic Hough transform algorithm", 《MEASUREMENT》 *
张振 等: "标准双色水尺的图像法水位测量", 《仪器仪表学报》 *
沈忙作: "用计算机校正图像的透视几何畸变", 《光电工程》 *
聂鼎 等: "基于图像处理的小型水库水位测量方法", 《水电能源科学》 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113111827A (en) * 2021-04-22 2021-07-13 北京房江湖科技有限公司 Construction monitoring method and device, electronic equipment and storage medium
CN114812414A (en) * 2022-04-12 2022-07-29 西安交通大学 Composite measuring device for surface defects of inner diameter and inner wall of pipeline
CN114812414B (en) * 2022-04-12 2024-01-12 西安交通大学 Composite measuring device for inner diameter and inner wall surface defects of pipeline

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20201204