CN114088063A - Pier local scour terrain measurement method based on mobile terminal - Google Patents


Info

Publication number
CN114088063A
CN114088063A
Authority
CN
China
Prior art keywords
points
camera
parameters
mobile terminal
coordinates
Prior art date
Legal status
Granted
Application number
CN202111214414.0A
Other languages
Chinese (zh)
Other versions
CN114088063B (en)
Inventor
王志华
陈启刚
李文虎
陈猇
张汁
王振
Current Assignee
Qinghai Traffic Engineering Technology Service Center
Beijing Jiaotong University
Original Assignee
Qinghai Traffic Engineering Technology Service Center
Beijing Jiaotong University
Priority date
Filing date
Publication date
Application filed by Qinghai Traffic Engineering Technology Service Center and Beijing Jiaotong University
Priority to CN202111214414.0A
Publication of CN114088063A
Application granted
Publication of CN114088063B
Legal status: Active

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00: Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04: Interpretation of pictures
    • G01C 11/06: Interpretation of pictures by comparison of two or more pictures of the same area
    • G01C 11/08: Interpretation of pictures by comparison of two or more pictures of the same area, the pictures not being supported in the same relative position as when they were taken
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/80: Geometric correction
    • G06T 7/00: Image analysis
    • G06T 7/70: Determining position or orientation of objects or cameras
    • G06T 7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00: Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/30: Assessment of water resources


Abstract

The invention provides a method for measuring the local scour terrain around a bridge pier using a mobile terminal. The method comprises the following steps: drawing several straight lines on the pier surface near the riverbed to lay out control points, and establishing a world coordinate system from those control points; taking one photograph with the mobile terminal at each of several camera stations surrounding the pier, such that any point in the local scour pit appears in at least 2 photographs taken from different viewing angles; solving for the camera intrinsic parameters, distortion parameters and extrinsic parameters at the moment each photograph was taken, from the matched feature points in every pair of adjacent photographs; forming a dense point cloud from the spatial coordinates of all measuring points; and selecting and entering, through the screen of the mobile terminal, the world coordinates of at least three control points in sequence, then projecting the dense point cloud into the world coordinate system to obtain the local scour terrain in world coordinates. By taking and processing photographs on a mobile terminal, the embodiment of the invention measures the local scour terrain of a bridge pier, with the advantages of freedom from line-of-sight restrictions, high efficiency, ease of operation and high precision.

Description

Pier local scour terrain measurement method based on mobile terminal
Technical Field
The invention relates to the technical field of bridge pier measurement, and in particular to a method for measuring the local scour terrain of a bridge pier based on a mobile terminal.
Background
After a bridge is built across a natural river, the piers standing in the channel disturb the local flow; flow structures such as horseshoe vortices that form around the piers erode the local riverbed, creating local scour pits around the piers. Local scour reduces the bearing capacity of the pier and can even undermine the foundation and destabilize the pier, seriously threatening the operational safety of the bridge; it is the leading cause of flood damage to bridges. To assess the risk of such damage and to design scour countermeasures, the local scour terrain around the piers must be measured.
In small and medium-sized rivers in the arid regions of western China, water typically flows only for short periods during floods, and the riverbed is dry before and after a flood. Local scour terrain can therefore be measured without underwater operations, using conventional land surveying techniques.
At present, the main prior-art methods for surveying local scour terrain around piers on a dry riverbed are GPS-RTK, the total station and the three-dimensional laser scanner. GPS-RTK requires an open view of the sky to receive satellite signals, but local scour surveys are usually carried out under the bridge, where the superstructure often blocks good satellite reception. The total station offers high accuracy and needs no satellite signal, but requires a clear line of sight between instrument and target; because the scour pit wraps around the pier, some target points are occluded no matter where the instrument is set up. The three-dimensional laser scanner is likewise occluded by the pier and the bridge superstructure when scanning the scour terrain around a pier.
Beyond these drawbacks, prior-art methods for measuring local pier scour also suffer from expensive equipment, poor portability and a need for specialist operators, and thus struggle to meet the field requirements of efficiency, portability and ease of use.
Disclosure of Invention
The embodiment of the invention provides a method for measuring the local scour terrain of a bridge pier based on a mobile terminal, so that such terrain can be measured effectively.
In order to achieve the purpose, the invention adopts the following technical scheme.
A method for measuring local scour terrain of a bridge pier based on a mobile terminal comprises the following steps:
Step S1, drawing a vertical line on the pier surface near the riverbed and drawing at least 3 short horizontal lines across it; each intersection of a short horizontal line with the vertical line is a control point;
s2, taking a picture at each of a plurality of machine positions surrounding a bridge pier by using a mobile terminal, wherein each picture covers a part of a local flushing pit, any point in the local flushing pit appears in at least 2 pictures with different visual angles, and at least one picture is used for completely taking a scale mark image;
Step S3, identifying feature points in each photograph taken by the mobile terminal in turn, finding all matched feature points corresponding to the same spatial point between every two adjacent photographs (these matches are called homonymous, or same-name, points), and solving for the camera intrinsic parameters, distortion parameters and extrinsic parameters at the moment each photograph was taken from the coordinates of the homonymous points;
Step S4, solving for the spatial coordinates of measuring points that uniformly cover the whole survey area, using the image disparity of the same spatial point across different photographs together with the camera intrinsic parameters, extrinsic parameters, distortion parameters and all the photographs, and forming a dense point cloud from the spatial coordinates of all measuring points;
Step S5, finding any photograph that captures the scale marks, aligning the scale marks with the vertical axis of the world coordinate system, taking any control point on the scale marks as the coordinate origin, selecting and entering the world coordinates of at least three control points in sequence through the mobile terminal screen, and projecting the dense point cloud into the world coordinate system to obtain the local scour terrain in world coordinates.
Preferably, step S1 specifically includes: drawing a vertical line of length not less than 0.5 m on the pier surface near the riverbed and drawing at least 3 short horizontal lines across it; each intersection of a short horizontal line with the vertical line is a control point.
Preferably, step S2 specifically includes: taking one photograph with the mobile terminal at each of several camera stations surrounding the pier, keeping the screen in landscape or portrait orientation while shooting, such that each photograph covers part of the local scour pit, any point in the scour pit appears in at least 2 photographs taken from different viewing angles, and each vertical line and short horizontal line is captured completely in at least one photograph.
Preferably, the step S3 specifically includes:
Step S301, let the world coordinates of a measuring point on the local scour terrain be (X, Y, Z) and its pixel coordinates in a photograph taken by the mobile terminal be (u, v); the imaging function between pixel coordinates and world coordinates is established as:
(x_c, y_c, z_c)^T = R (X, Y, Z)^T + T

x' = x_c / z_c,  y' = y_c / z_c,  r^2 = x'^2 + y'^2,
x'' = x' (1 + K_1 r^2 + K_2 r^4 + K_3 r^6),  y'' = y' (1 + K_1 r^2 + K_2 r^4 + K_3 r^6)

u = f_x x'' + s y'' + c_x,  v = f_y y'' + c_y
where f_x and f_y are the focal lengths of the camera lens in pixel units, c_x and c_y are the coordinates of the camera optical center in pixel units, s is the non-orthogonality coefficient, and f_x, f_y, c_x, c_y and s are collectively called the camera intrinsic parameters; z_c is the distance from the measuring point to the camera, called the scale factor; K_1, K_2 and K_3 are the radial distortion coefficients of the camera; R is the 3 × 3 rotation matrix of the camera relative to world coordinates and T is its 3 × 1 translation vector, together called the camera extrinsic parameters, with 6 independent parameters per camera station;
Step S302, reading the camera focal length f and the resolution of M × N pixels from the exchangeable image file format (EXIF) information stored in a photograph taken by the mobile terminal, and, with pixel size μ, computing initial estimates of the camera intrinsic parameters as:
f_x = f_y = f/μ,  c_x = M/2,  c_y = N/2,  s = 1
Step S303, identifying feature points photograph by photograph with the SIFT, SURF, ORB or FAST algorithm, then matching homonymous points between different photographs with an optical-flow method;
Step S304, letting the number of homonymous points between two adjacent photographs be n pairs; substituting the pixel coordinates of each pair into the imaging function yields 4 equations, so n pairs yield 4n equations. Setting the scale factor z_c in the imaging function to 1, all camera parameters can be solved from the homonymous points once the number of equations is no less than the number of unknowns, i.e. n ≥ 20;
Step S305, averaging the intrinsic and distortion parameters solved from every pair of photographs, and taking the averages as the finally determined camera intrinsic and distortion parameters;
Step S306, substituting the finally determined intrinsic and distortion parameters into the imaging function, constructing a system of equations from the homonymous-point coordinates between every pair of photographs, and solving it for the extrinsic parameters of the camera at each photograph.
Preferably, the step S4 specifically includes:
Step S401, selecting in turn any two photographs with overlapping fields of view and applying stereo epipolar rectification according to their camera extrinsic parameters, so that the same spatial point appears in the same row of the rectified photographs;
Step S402, computing a depth map from the two rectified photographs;
Step S403, computing from the depth map, with the following formulas, the spatial coordinates (X', Y', Z') of the measuring points corresponding to pixels that uniformly cover the whole survey area:
Z' = d z_c
X' = (u − c_x) Z' / f_x
Y' = (v − c_y) Z' / f_y
and obtaining a dense spatial point cloud from the spatial coordinates of all the measuring points.
Preferably, the step S5 specifically includes:
Step S501, letting the spatial coordinates of a control point in the dense point cloud be (X', Y', Z') and its real-world coordinates entered through the mobile terminal screen be (X, Y, Z), the coordinate conversion function between them is:
(X, Y, Z)^T = S R (X', Y', Z')^T + T
where S is a scaling factor, R is a 3 × 3 rotation matrix and T is a 3 × 1 translation vector, for a total of 7 independent variables;
Step S502, substituting the world coordinates of at least 3 control points into the coordinate conversion function, establishing a system of no fewer than 9 equations, and solving for the scaling factor S, rotation matrix R and translation vector T;
Step S503, transforming all points of the dense point cloud in turn with the scaling factor S, rotation matrix R and translation vector T to obtain the dense point cloud of the local scour terrain in the world coordinate system.
According to the technical scheme provided by the embodiment of the invention, photographs are taken and processed on a mobile terminal to measure the local scour terrain of a bridge pier, with the advantages of freedom from line-of-sight restrictions, portability, high efficiency, ease of operation and high precision.
Additional aspects and advantages of the invention will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention, and those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a processing flow chart of a method for measuring local scour terrain of a bridge pier based on a mobile terminal according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of the scene in which a mobile phone photographs the local scour terrain around a pier, according to an embodiment.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or coupled. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood by those skilled in the art that, unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
To facilitate understanding of the embodiments of the present invention, several specific embodiments are further explained below with reference to the drawings; these embodiments are not to be construed as limiting the invention.
The embodiment of the invention measures the terrain of the local scour pit around a bridge pier by photographing with a mobile terminal such as a mobile phone. Because visible light penetrates turbid water poorly and refracts at the water surface, underwater terrain is difficult to photograph accurately with a phone; measurement under dry-riverbed conditions is therefore recommended in practice.
Example one
The processing flow of the method for measuring the local scour terrain of a bridge pier based on a mobile terminal is shown in Fig. 1 and comprises the following steps:
Step S1, drawing scale marks: draw a vertical line of length not less than 0.5 m on the pier surface near the riverbed and draw at least 3 short horizontal lines across it; each intersection of a short horizontal line with the vertical line is a control point.
Step S2, taking photographs with a mobile terminal such as a mobile phone: take one photograph at each of several camera stations surrounding the pier, keeping the screen in landscape or portrait orientation; the photograph from each station covers part of the local scour pit, any point in the scour pit appears in at least 2 photographs from different viewing angles, and each vertical line and short horizontal line is captured completely in at least one photograph.
Step S3, registering the photographs: identify feature points in each photograph taken by the mobile terminal in turn, find all matched feature points corresponding to the same spatial point between every two adjacent photographs (these matches are called homonymous points), and then solve for the camera intrinsic parameters, distortion parameters and extrinsic parameters at the moment each photograph was taken from the homonymous-point coordinates.
Feature points are points that appear in the same or very similar form in other images of the same scene; they are generally points where the image gray value changes sharply, or points of large curvature in the image.
Step S4, generating a dense point cloud: using the camera intrinsic parameters, extrinsic parameters, distortion parameters and all the photographs taken, solve for the world coordinates of measuring points uniformly covering the whole survey area from the image disparity of the same spatial point across different photographs, and form a dense point cloud.
Step S5, obtaining the local scour terrain: find any photograph that captures the scale marks drawn in step S1, align the scale marks with the vertical axis of the world coordinate system, take any control point on the scale marks as the coordinate origin, select and enter the world coordinates of at least three control points in sequence through the mobile terminal screen, and project the dense point cloud generated in step S4 into the newly established world coordinate system; the projected dense point cloud represents the local scour terrain in world coordinates.
Wherein the step S3 includes the following substeps:
Step S301, let the world coordinates of a measuring point on the local scour terrain be (X, Y, Z), these being the coordinates of the measuring point in the world coordinate system, and let its pixel coordinates in a photograph taken by the mobile phone be (u, v); the imaging function between pixel coordinates and world coordinates is established as:
(x_c, y_c, z_c)^T = R (X, Y, Z)^T + T

x' = x_c / z_c,  y' = y_c / z_c,  r^2 = x'^2 + y'^2,
x'' = x' (1 + K_1 r^2 + K_2 r^4 + K_3 r^6),  y'' = y' (1 + K_1 r^2 + K_2 r^4 + K_3 r^6)

u = f_x x'' + s y'' + c_x,  v = f_y y'' + c_y
where f_x and f_y are the focal lengths of the camera lens in pixel units, c_x and c_y are the coordinates of the camera optical center in pixel units, and s is the non-orthogonality coefficient; these 5 parameters are collectively called the camera intrinsic parameters. z_c is the distance from the point to the camera, commonly called the scale factor; K_1, K_2 and K_3 are the radial distortion coefficients of the camera; R is the 3 × 3 rotation matrix of the camera relative to world coordinates and T is its 3 × 1 translation vector, together called the camera extrinsic parameters, with 6 independent parameters per camera station. Before camera calibration, the intrinsic parameters, distortion parameters and extrinsic parameters are all unknown.
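The imaging function above can be sketched directly. The following is a minimal illustration of the standard pinhole model with radial distortion, using the symbol names from the text; it is an illustrative sketch, not the patent's implementation, and the default parameter values are assumptions:

```python
import numpy as np

def project(Xw, R, T, fx, fy, cx, cy, s=0.0, K1=0.0, K2=0.0, K3=0.0):
    """Project a world point (X, Y, Z) to pixel coordinates (u, v).

    R: 3x3 rotation, T: length-3 translation (camera extrinsics);
    fx, fy, cx, cy, s: intrinsics; K1..K3: radial distortion coefficients.
    """
    xc, yc, zc = R @ np.asarray(Xw, float) + np.asarray(T, float)  # camera frame
    xp, yp = xc / zc, yc / zc                         # normalized coordinates
    r2 = xp * xp + yp * yp
    d = 1.0 + K1 * r2 + K2 * r2**2 + K3 * r2**3       # radial distortion factor
    xd, yd = xp * d, yp * d
    u = fx * xd + s * yd + cx                         # pixel coordinates
    v = fy * yd + cy
    return u, v
```

With R set to the identity, T to zero and no distortion, the point (0, 0, 1) on the optical axis lands exactly at the principal point (c_x, c_y).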
Step S302, obtain the camera focal length f and the resolution of M × N pixels from the EXIF (exchangeable image file format) information stored in a photograph taken by the mobile phone; with pixel size μ, the estimated values of the camera intrinsic parameters are computed as:
f_x = f_y = f/μ,  c_x = M/2,  c_y = N/2,  s = 1
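These estimates follow directly from the EXIF fields. A small sketch (assuming the focal length is given in mm and the pixel size in μm, so the two must be put in the same unit before dividing; skew s = 1 as stated in the text):

```python
def estimate_intrinsics(f_mm, pixel_um, width_px, height_px):
    """Initial intrinsic estimates from EXIF data: f_x = f_y = f/mu,
    principal point at the image center, skew s = 1 (as in the text)."""
    f_px = f_mm * 1000.0 / pixel_um   # focal length converted to pixel units
    return {"fx": f_px, "fy": f_px,
            "cx": width_px / 2.0, "cy": height_px / 2.0, "s": 1}
```

For the iPhone XS Max figures quoted in Example two (f = 26 mm equivalent, μ = 14 μm, 4032 × 3024 pixels), this gives f_x = f_y ≈ 1857.1, c_x = 2016 and c_y = 1512, matching the estimates listed there.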
Step S303, identify feature points photograph by photograph with the SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), ORB (Oriented FAST and Rotated BRIEF) or FAST (Features from Accelerated Segment Test) algorithm, then complete the matching of homonymous points between different photographs with an optical-flow method.
Step S304, let the number of homonymous points between two adjacent photographs be n pairs. Substituting the pixel coordinates of each pair into the imaging function established in step S301 yields 4 equations, so n pairs yield 4n equations. Before calibration against the real-world coordinates established at the survey site, the scale factor z_c in the imaging function is set to 1; the unknowns of the system then comprise the 3n world coordinates of the homonymous points, 5 camera intrinsic parameters, 3 distortion parameters and 12 extrinsic parameters for the two camera stations, 3n + 20 in total. When the number of equations is no less than the number of unknowns, i.e. n ≥ 20, all camera parameters can be solved from the homonymous points.
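The counting argument in step S304 can be checked mechanically. A tiny sketch (illustrative only):

```python
def bundle_counts(n_pairs):
    """Equations vs. unknowns for one photo pair with n homonymous-point pairs.

    Each pair of image observations gives 4 equations; the unknowns are the
    3n point coordinates plus 5 intrinsics, 3 distortion coefficients and
    2 x 6 extrinsics (with z_c fixed to 1), i.e. 3n + 20 in total.
    """
    equations = 4 * n_pairs
    unknowns = 3 * n_pairs + 5 + 3 + 2 * 6
    return equations, unknowns

# smallest n with at least as many equations as unknowns
n_min = next(n for n in range(1, 100) if bundle_counts(n)[0] >= bundle_counts(n)[1])
```

The net gain per point pair is 4 − 3 = 1 equation, so the 20 camera unknowns are covered exactly at n = 20, matching the n ≥ 20 condition in the text.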
Step S305, to improve the accuracy of the camera intrinsic and distortion parameters, ensemble-average the values solved from every pair of photographs and take the averages as the final camera intrinsic and distortion parameters.
Step S306, substitute the finally determined intrinsic and distortion parameters into the imaging function established in step S301, construct a system of equations from the homonymous-point coordinates between every pair of photographs, and solve it for the extrinsic parameters of the camera at each photograph.
Wherein the step S4 includes the following sub-steps:
and S401, sequentially selecting any two photos with mutually overlapped view fields, and performing stereo epipolar line correction on the photos by adopting a conventional algorithm according to corresponding camera external parameters of the two photos so that the same space point basically appears in the same row of the corrected photos.
Step S402, compute a depth map from the two rectified photographs with a conventional algorithm.
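Step S402 leaves the depth-map algorithm open ("a conventional algorithm"). One minimal illustration is block matching by sum of squared differences (SSD) along the rows of a rectified pair; this is a toy sketch of the idea, not the method the patent mandates:

```python
import numpy as np

def ssd_disparity(left, right, max_disp, half_win=2):
    """Per-pixel disparity by SSD block matching along epipolar (row) lines.

    left, right: rectified grayscale images as 2-D float arrays.
    Returns an integer disparity map (0 where no full window fits).
    """
    h, w = left.shape
    disp = np.zeros((h, w), dtype=int)
    for y in range(half_win, h - half_win):
        for x in range(half_win + max_disp, w - half_win):
            patch = left[y - half_win:y + half_win + 1, x - half_win:x + half_win + 1]
            best, best_d = np.inf, 0
            for d in range(max_disp + 1):   # candidate shift along the row
                cand = right[y - half_win:y + half_win + 1,
                             x - d - half_win:x - d + half_win + 1]
                cost = np.sum((patch - cand) ** 2)
                if cost < best:
                    best, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

On a synthetic pair in which the right image is the left shifted by 3 pixels, the recovered disparity is 3 wherever the window fits; metric depth then follows from the baseline and focal length.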
Step S403, compute the spatial coordinates (X', Y', Z') of the measuring point corresponding to each pixel from the depth map with the following formulas, obtaining a dense spatial point cloud:
Z' = d z_c
X' = (u − c_x) Z' / f_x
Y' = (v − c_y) Z' / f_y
Wherein the step S5 includes the following substeps:
Step S501, let the spatial coordinates of a control point in the dense point cloud be (X', Y', Z') and its real-world coordinates entered through the screen of the mobile terminal be (X, Y, Z). The two coordinate systems are related by translation, rotation and scaling, described by the following coordinate conversion function:
(X, Y, Z)^T = S R (X', Y', Z')^T + T
where S is the scaling factor, R is a 3 × 3 rotation matrix and T is a 3 × 1 translation vector, for a total of 7 independent variables.
Step S502, substitute the world coordinates of at least 3 control points into the coordinate conversion function of step S501, establish a system of no fewer than 9 equations, and solve for the scaling factor S, rotation matrix R and translation vector T.
Step S503, transform all points of the dense point cloud in turn with the now-known scaling factor S, rotation matrix R and translation vector T according to the coordinate conversion function of step S501, obtaining a new dense point cloud that accurately represents the local scour terrain.
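Steps S501 to S503 amount to estimating a 7-parameter similarity transform from control-point correspondences. The patent does not specify a solver; the sketch below uses the closed-form SVD (Umeyama-style) solution as one possible instance, with `src` holding the point-cloud coordinates of the control points and `dst` the world coordinates entered on the screen (both names are illustrative):

```python
import numpy as np

def fit_similarity(src, dst):
    """Fit dst ~= S * R @ src + T for point sets of shape (n, 3), n >= 3."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(0), dst.mean(0)
    A, B = src - mu_s, dst - mu_d                    # centered point sets
    U, sig, Vt = np.linalg.svd(B.T @ A / len(src))   # cross-covariance SVD
    d = np.sign(np.linalg.det(U @ Vt))               # keep R a proper rotation
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt
    S = np.trace(np.diag(sig) @ D) / (A ** 2).sum() * len(src)
    T = mu_d - S * R @ mu_s
    return S, R, T

def apply_similarity(S, R, T, pts):
    """Map every row of pts with the similarity transform (step S503)."""
    return S * (np.asarray(pts, float) @ R.T) + T
```

Given the three or more control points of step S1, `fit_similarity` recovers S, R and T exactly when the correspondences are noise-free, and `apply_similarity` then maps the whole dense point cloud into the world coordinate system.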
Example two
A schematic diagram of the scene in which a mobile phone photographs the local scour terrain around a bridge pier according to this embodiment is shown in Fig. 2. The method for measuring the local scour terrain of a bridge pier based on a mobile terminal according to this embodiment includes the following steps:
Step S1, drawing scale marks: with a marker pen, chalk or a stone, draw a vertical line of length not less than 0.5 m on the pier surface near the riverbed and draw at least 3 short horizontal lines across it; each intersection of a short horizontal line with the vertical line is a control point. In this embodiment, the survey object is the local scour pit around a pier of a highway bridge in a river channel; the pier is a pile-column structure with an upper column 1.2 m in diameter and a lower pile foundation 1.4 m in diameter, and the pile top is about 0.3 m above the original riverbed. On site, chalk and a ruler were used to draw a 0.6 m scale line on the column surface, with control points marked at its top, bottom and midpoint.
Step S2, taking photographs with a mobile phone: as shown in Fig. 2, one photograph is taken with the mobile phone 21 at each of several camera stations 23 surrounding the pier 22, keeping the phone screen in landscape or portrait orientation; the photograph from each station covers part of the local scour pit 24, any point in the scour pit appears in at least 2 photographs from different viewing angles, and at least one photograph clearly and completely captures the scale marks. In this embodiment an iPhone XS Max was used for photographing: the camera resolution is 4032 × 3024 pixels, the 35 mm-equivalent focal length is 26 mm, and the pixel size is 14 μm. Because the local scour pit appears mainly on the upstream face of the pier, 7 photographs were taken around the upstream face.
Step S3, registering the photographs: identify feature points in each photograph taken by the mobile phone in turn, find all matched feature points corresponding to the same spatial point between every two adjacent photographs (called homonymous points), and then solve for the camera intrinsic parameters, distortion parameters and extrinsic parameters at the moment each photograph was taken from the homonymous-point coordinates.
The step S3 includes the following sub-steps:
Step S301, let the coordinates of any measuring point on the surface of the local scour pit in the world coordinate system be (X, Y, Z), and let the pixel coordinates of its image in a photograph taken by the mobile phone be (u, v). The imaging function between pixel coordinates and world coordinates is established as:
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = R\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + T$$

$$x' = \frac{x_c}{z_c}\bigl(1 + K_1 r^2 + K_2 r^4 + K_3 r^6\bigr),\qquad y' = \frac{y_c}{z_c}\bigl(1 + K_1 r^2 + K_2 r^4 + K_3 r^6\bigr),\qquad r^2 = \Bigl(\frac{x_c}{z_c}\Bigr)^2 + \Bigl(\frac{y_c}{z_c}\Bigr)^2$$

$$u = f_x x' + s\,y' + c_x,\qquad v = f_y y' + c_y$$
wherein fx and fy are the focal lengths of the camera lens in pixel units, cx and cy are the coordinates of the camera optical centre in pixel units, and s is the non-orthogonality coefficient; these 5 parameters are collectively called the camera intrinsic parameters. zc is the distance from the point to the camera, commonly referred to as the scale factor; K1, K2 and K3 are the radial distortion coefficients of the camera; R is a 3 × 3 rotation matrix relative to world coordinates, T is the 3 × 1 translation vector of the camera relative to world coordinates, and R and T are collectively called the camera extrinsic parameters, with 6 independent parameters at each camera position. Before camera calibration, the intrinsic parameters, distortion parameters and extrinsic parameters are all unknown.
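As a numerical illustration of the imaging function above, the following sketch projects a world point to pixel coordinates under the pinhole model with radial distortion. The intrinsics, pose and test point are hypothetical values for illustration only, not calibration results from the patent:

```python
import numpy as np

def project(P_world, K, dist, R, T):
    """Project a 3-D world point to pixel coordinates using the pinhole
    model with radial distortion.
    K: 3x3 intrinsic matrix [[fx, s, cx], [0, fy, cy], [0, 0, 1]]
    dist: (K1, K2, K3) radial distortion coefficients
    R, T: extrinsic parameters (world -> camera)."""
    # World -> camera coordinates
    Pc = R @ P_world + T
    zc = Pc[2]                      # scale factor: distance along the optical axis
    x, y = Pc[0] / zc, Pc[1] / zc   # normalized image coordinates
    # Apply radial distortion
    r2 = x * x + y * y
    k1, k2, k3 = dist
    factor = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xd, yd = x * factor, y * factor
    # Normalized -> pixel coordinates via the intrinsic matrix
    u = K[0, 0] * xd + K[0, 1] * yd + K[0, 2]
    v = K[1, 1] * yd + K[1, 2]
    return u, v

# Hypothetical parameters for illustration
K = np.array([[1857.1, 0.0, 2016.0],
              [0.0, 1857.1, 1512.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
T = np.array([0.0, 0.0, 5.0])       # camera 5 m in front of the world origin
u, v = project(np.array([0.1, -0.2, 0.0]), K, (0.0, 0.0, 0.0), R, T)
```

With zero distortion the result reduces to the linear pinhole projection, which makes the example easy to check by hand.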
S302, from the EXIF information stored in a photo taken by the mobile phone, obtain the focal length f and the resolution M × N pixels of the phone camera; with the pixel size denoted μ, the estimated values of the camera intrinsic parameters are calculated as:
fx=fy=f/μ,cx=M/2,cy=N/2,s=1
For the mobile phone camera parameters of this embodiment, the intrinsic parameter estimates are:
fx=fy=1857.1,cx=2016,cy=1512,s=1
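The estimates of S302 can be reproduced directly from the embodiment's EXIF-derived values (a sketch; the 14 μm figure is the 35 mm equivalent pixel size quoted above):

```python
# Estimate the intrinsic parameters from EXIF data, as in step S302.
# Values are those quoted for the embodiment's handset: 26 mm equivalent
# focal length, 4032 x 3024 resolution, 14 um equivalent pixel size.
f_mm, mu_mm = 26.0, 0.014          # focal length and pixel size, in mm
M, N = 4032, 3024                  # photo resolution in pixels
fx = fy = f_mm / mu_mm             # focal length in pixel units
cx, cy = M / 2, N / 2              # optical centre assumed at the image centre
s = 1                              # non-orthogonality coefficient
print(round(fx, 1), cx, cy)        # -> 1857.1 2016.0 1512.0
```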
S303, identify the feature points photo by photo using the SIFT, SURF, ORB or FAST algorithm, then complete the matching of homonymous points between different photos using an optical-flow method. In this embodiment, feature points are identified with the SURF algorithm through the SurfFeatureDetector class of the OpenCV open-source library, and homonymous-point matching based on the optical-flow method is completed with the calcOpticalFlowPyrLK function.
S304, let the number of homonymous points between two adjacent photos be n pairs. Substituting the pixel coordinates of each pair of homonymous points into the imaging function established in step S301 yields 4 equations, so 4n equations can be established from the n pairs. Before calibration against the real-world coordinates established at the measurement site, the scale factor zc in the imaging function is set to 1; the unknowns of the equation system then comprise the 3n world coordinates of the homonymous points, 5 camera intrinsic parameters, 3 distortion parameters and 12 extrinsic parameters for the two camera positions, 3n + 20 in total. When the number of equations is not less than the number of unknowns, i.e. n ≥ 20, all camera parameters can be solved from the homonymous points. In this embodiment, if fewer than 20 homonymous points are found between two adjacent photos, the pair of photos is not used.
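The counting argument of S304 can be checked with a short script (the equation system itself is nonlinear and is solved numerically in practice; this only verifies the degree-of-freedom bookkeeping):

```python
# Check the counting argument of step S304: n pairs of homonymous points
# give 4n equations; the unknowns are 3n world coordinates plus 5 intrinsic,
# 3 distortion and 2 x 6 extrinsic parameters (3n + 20 in total).
def solvable(n):
    equations = 4 * n
    unknowns = 3 * n + 5 + 3 + 12
    return equations >= unknowns

# Smallest number of homonymous-point pairs for which the system is
# not underdetermined
min_n = next(n for n in range(1, 100) if solvable(n))
```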
S305, in order to improve the solution precision of the camera intrinsic and distortion parameters, the intrinsic and distortion parameters solved from every two photos are averaged, and the averages are taken as the final camera intrinsic and distortion parameters.
S306, according to the coordinates of the homonymous points between every two photos, the finally determined camera intrinsic and distortion parameters are substituted into the imaging function established in S301 to construct an equation system, which is solved to obtain the accurate extrinsic parameters at the time the mobile phone took each photo.
S4, generating a dense point cloud covering the measurement area: using the solved intrinsic, extrinsic and distortion parameters and all the photos taken by the mobile phone, a dense point cloud uniformly covering the whole measurement area is solved from the image disparity of the same spatial point in different photos.
The step S4 includes the following sub-steps:
S401, select any two photos with overlapping fields of view in turn and, using the corresponding camera extrinsic parameters of the two photos, perform stereo epipolar rectification with the Hartley or Bouguet algorithm so that the same spatial point appears in the same row of the rectified photos. This embodiment uses the stereoRectify function in the OpenCV open-source library for stereo epipolar rectification.
S402, calculate a depth map from the two epipolar-rectified photos using a conventional algorithm. In this embodiment, the SGBM algorithm is used: the disparity map is computed from the two rectified images with the StereoSGBM class in the OpenCV open-source library, and is then converted into a depth map with the following formula:
$$d = \frac{f_x L_b}{D}$$

where d is the depth value of each pixel of the depth map, Lb is the distance between the optical centres of the cameras of the two photos (the baseline), and D is the disparity value of each pixel of the disparity map.
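A minimal sketch of the disparity-to-depth conversion above; the focal length and baseline are illustrative values, not calibration results:

```python
import numpy as np

def disparity_to_depth(disparity, fx, baseline):
    """Convert a disparity map into a depth map with d = fx * Lb / D.
    disparity: per-pixel disparity D in pixels (invalid pixels are 0);
    fx: focal length in pixel units; baseline: optical-centre distance Lb."""
    depth = np.zeros_like(disparity, dtype=float)
    valid = disparity > 0                      # avoid division by zero
    depth[valid] = fx * baseline / disparity[valid]
    return depth

D = np.array([[100.0, 50.0],
              [0.0, 25.0]])                    # toy disparity map, pixels
depth = disparity_to_depth(D, fx=1857.1, baseline=0.5)
```

Pixels with zero disparity (no match) are left at depth 0 so they can be masked out of the point cloud.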
S403, calculate the spatial coordinates of the measuring point corresponding to each pixel from the depth map using the following formulas to obtain a dense spatial point cloud:
$$Z' = d\,z_c,\qquad X' = \frac{(u - c_x)\,Z'}{f_x},\qquad Y' = \frac{(v - c_y)\,Z'}{f_y}$$
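The back-projection of S403 can be sketched as follows; the intrinsics and depth map are toy values, and zc = 1 reflects the scale chosen during registration in S304 (the true scale is restored in step S5):

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy, zc=1.0):
    """Back-project a depth map to a point cloud:
    Z' = d * zc, X' = (u - cx) Z' / fx, Y' = (v - cy) Z' / fy."""
    v, u = np.indices(depth.shape)            # pixel row (v) and column (u)
    Z = depth * zc
    X = (u - cx) * Z / fx
    Y = (v - cy) * Z / fy
    return np.stack([X, Y, Z], axis=-1)       # H x W x 3 point cloud

depth = np.full((2, 2), 10.0)                 # toy constant-depth map
pts = depth_to_points(depth, fx=1000.0, fy=1000.0, cx=0.5, cy=0.5)
```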
S5, obtaining the local scour terrain: find any one of the photos in which the scale mark drawn in step S1 is visible, make the scale mark coincide with the vertical axis of the world coordinate system, take any control point on the scale mark as the coordinate origin, select and input the real coordinate values of at least three control points in turn through the mobile phone screen, and project the dense point cloud generated in step S4 into the newly established world coordinate system; the projected dense point cloud then represents the real local scour terrain.
The step S5 includes the following sub-steps:
S501, let the spatial coordinates of a control point in the dense point cloud be (X', Y', Z') and its real-world coordinates input through the mobile phone screen be (X, Y, Z). The two coordinate systems are related by translation, rotation and scaling, described by the following coordinate conversion function:
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = S\,R\begin{bmatrix} X' \\ Y' \\ Z' \end{bmatrix} + T$$
where S is the scaling factor, R is a 3 × 3 rotation matrix, and T is a 3 × 1 translation vector, for a total of 7 independent variables.
S502, substituting the world coordinates of more than 3 control points into the coordinate conversion function in S501, establishing an equation set containing not less than 9 equations, and solving a scaling factor S, a rotation matrix R and a translation vector T.
S503, transform all points in the point cloud in turn with the known scaling factor S, rotation matrix R and translation vector T according to the coordinate conversion function in S501 to obtain a new dense point cloud that accurately represents the local scour terrain.
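One standard closed-form way to recover S, R and T from three or more control-point pairs, as required in S502, is Umeyama's least-squares similarity estimation. The patent does not name its solver, so the following is a sketch under that assumption:

```python
import numpy as np

def similarity_transform(src, dst):
    """Estimate scale S, rotation R and translation T with dst ~ S R src + T
    from paired control points (Umeyama's closed-form least squares).
    src, dst: (n, 3) arrays, n >= 3 non-collinear points."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)          # cross-covariance matrix
    U, Sig, Vt = np.linalg.svd(cov)
    D = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        D[2, 2] = -1                          # guard against reflections
    R = U @ D @ Vt
    S = np.trace(np.diag(Sig) @ D) / (src_c ** 2).sum(axis=1).mean()
    T = mu_d - S * R @ mu_s
    return S, R, T

# Toy check: a known scale-rotate-translate mapping is recovered.
src = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
theta = np.pi / 6
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
dst = 2.0 * src @ R_true.T + np.array([1., 2., 3.])
S, R, T = similarity_transform(src, dst)
```

With more than three control points the same code gives the least-squares fit, which damps the effect of small errors in the coordinates entered on the phone screen.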
In the embodiment of the invention, the photos are taken on the measurement site with an iPhone XS Max, copied to a computer after field work is finished, and the terrain calculation is completed by software written in C++ and installed on the computer.
In summary, the embodiment of the invention realizes measurement of the local scour terrain around a bridge pier on a dry riverbed by taking and processing photos with a mobile terminal such as a mobile phone, and offers advantages including freedom from visibility constraints, portability, high efficiency, ease of operation and high precision.
Those of ordinary skill in the art will understand that: the figures are merely schematic representations of one embodiment, and the blocks or flow diagrams in the figures are not necessarily required to practice the present invention.
From the above description of the embodiments, it is clear to those skilled in the art that the present invention can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which may be stored in a storage medium, such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the method according to the embodiments or some parts of the embodiments.
The embodiments in the present specification are described in a progressive manner; the same and similar parts among the embodiments may be referred to each other, and each embodiment focuses on its differences from the others. In particular, since the apparatus and system embodiments are substantially similar to the method embodiments, they are described relatively simply, and the relevant points can be found in the partial descriptions of the method embodiments. The above-described apparatus and system embodiments are merely illustrative: the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the embodiment. One of ordinary skill in the art can understand and implement this without inventive effort.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (6)

1. A method for measuring local scour terrain of a bridge pier based on a mobile terminal is characterized by comprising the following steps:
step S1, drawing a vertical line at the position on the surface of the pier close to the river bed, respectively drawing not less than 3 horizontal short lines on the vertical line, wherein the intersection point of the horizontal short line and the vertical line is a control point;
S2, taking one photo with the mobile terminal at each of a plurality of camera positions surrounding the bridge pier, wherein each photo covers part of a local scour pit, any point in the local scour pit appears in at least 2 photos with different viewing angles, and at least one photo completely captures the scale-mark image;
S3, identifying feature points in each photo taken by the mobile terminal in turn, finding all matched feature points corresponding to the same spatial point among the feature points of every two adjacent photos, taking the matched feature points as homonymous points, and calculating the camera intrinsic parameters, distortion parameters and extrinsic parameters at the time each photo was taken from the coordinates of the homonymous points;
S4, solving the spatial coordinates of measuring points uniformly covering the whole measurement area from the image disparities of the same spatial points in different photos according to the camera intrinsic parameters, extrinsic parameters, distortion parameters and all the photos, and forming a dense point cloud from the spatial coordinates of all the measuring points;
and S5, finding any photo with the scale mark, making the scale mark coincide with the vertical axis of a world coordinate system, taking any control point on the scale mark as the coordinate origin, selecting and inputting the world coordinate values of at least three control points in turn through the mobile terminal screen, and projecting the dense point cloud into the world coordinate system to obtain the local scour terrain in the world coordinate system.
2. The method according to claim 1, wherein the step S1 specifically includes: drawing a vertical line at the position on the surface of the pier close to the riverbed, wherein the length of the vertical line is not less than 0.5m, respectively drawing not less than 3 horizontal short lines on the vertical line, and the intersection point of the horizontal short lines and the vertical line is a control point.
3. The method according to claim 1, wherein the step S2 specifically includes: taking one photo with the mobile terminal at each of a plurality of camera positions surrounding the bridge pier, with the screen of the mobile terminal kept in landscape or portrait orientation while shooting, wherein the photo from each position covers part of the local scour pit, any point in the local scour pit appears in at least 2 photos with different viewing angles, and each vertical line and each short horizontal line is completely captured in at least one photo.
4. The method according to claim 1, 2 or 3, wherein the step S3 specifically comprises:
S301, letting the world coordinates of a measuring point on the local scour terrain be (X, Y, Z) and the pixel coordinates of the measuring point in a photo taken by the mobile terminal be (u, v), and establishing the imaging function between the pixel coordinates and the world coordinates as:
$$\begin{bmatrix} x_c \\ y_c \\ z_c \end{bmatrix} = R\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} + T$$

$$x' = \frac{x_c}{z_c}\bigl(1 + K_1 r^2 + K_2 r^4 + K_3 r^6\bigr),\qquad y' = \frac{y_c}{z_c}\bigl(1 + K_1 r^2 + K_2 r^4 + K_3 r^6\bigr),\qquad r^2 = \Bigl(\frac{x_c}{z_c}\Bigr)^2 + \Bigl(\frac{y_c}{z_c}\Bigr)^2$$

$$u = f_x x' + s\,y' + c_x,\qquad v = f_y y' + c_y$$
wherein fx and fy are the focal lengths of the camera lens in pixel units, cx and cy are the coordinates of the camera optical centre in pixel units, s is the non-orthogonality coefficient, and fx, fy, cx, cy and s are collectively referred to as the camera intrinsic parameters; zc is the distance from the measuring point to the camera and is called the scale factor; K1, K2 and K3 are the radial distortion coefficients of the camera; R is a 3 × 3 rotation matrix relative to world coordinates, T is the 3 × 1 translation vector of the camera relative to world coordinates, R and T are collectively referred to as the camera extrinsic parameters, and each camera position has 6 independent parameters;
step S302, obtaining the camera focal length f and the resolution M × N pixels of the mobile terminal from the exchangeable image file format (EXIF) information stored in a photo taken by the mobile terminal, and, with the pixel size denoted μ, calculating the estimated values of the camera intrinsic parameters as:
fx=fy=f/μ,cx=M/2,cy=N/2,s=1
step S303, identifying the feature points photo by photo using the SIFT, SURF, ORB or FAST algorithm, and completing the matching of homonymous points between different photos using an optical-flow method;
S304, letting the number of homonymous points between two adjacent photos be n pairs, substituting the pixel coordinates of each pair of homonymous points into the imaging function to obtain 4 equations, establishing 4n equations from the n pairs of homonymous points, and setting the scale factor zc in the imaging function to 1, whereupon the unknowns of the equation system comprise the 3n world coordinates of all the homonymous points, 5 camera intrinsic parameters, 3 distortion parameters and 12 extrinsic parameters for the two photos, 3n + 20 in total; when the number of equations is not less than the number of unknowns, i.e. n ≥ 20, all camera parameters are solved from the homonymous points;
S305, averaging the intrinsic parameters and distortion parameters solved from every two photos, and taking the averages as the finally determined camera intrinsic parameters and distortion parameters;
and S306, substituting the finally determined camera intrinsic parameters and distortion parameters into the imaging function according to the coordinates of the homonymous points between every two photos to construct an equation system, and solving the equation system to obtain the corresponding extrinsic parameters at the time the camera took each photo.
5. The method according to claim 4, wherein the step S4 specifically includes:
s401, sequentially selecting any two photos with mutually overlapped view fields, and performing stereo epipolar line correction on the photos according to corresponding camera external parameters of the two photos to enable the same space point to appear in the same row of the corrected photos;
s402, calculating a depth map according to the two pictures after epipolar line correction;
S403, calculating the spatial coordinates (X', Y', Z') of the measuring points corresponding to the pixels uniformly covering the whole measurement area from the depth map using the following formulas;
$$Z' = d\,z_c,\qquad X' = \frac{(u - c_x)\,Z'}{f_x},\qquad Y' = \frac{(v - c_y)\,Z'}{f_y}$$
and obtaining dense space point cloud according to the space coordinates of all the measuring points.
6. The method according to claim 5, wherein the step S5 specifically comprises:
step S501, letting the spatial coordinates of a control point in the dense point cloud be (X', Y', Z') and the real-world coordinates input through the mobile terminal screen be (X, Y, Z), the coordinate conversion function between the spatial coordinates and the real-world coordinates being:
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = S\,R\begin{bmatrix} X' \\ Y' \\ Z' \end{bmatrix} + T$$
wherein S is a scaling factor, R is a rotation matrix of 3 × 3, T is a translation vector of 3 × 1, and 7 independent variables are counted;
S502, substituting the world coordinates of more than 3 control points into the coordinate conversion function, establishing an equation system containing not less than 9 equations, and solving the scaling factor S, the rotation matrix R and the translation vector T;
and S503, transforming all points in the dense point cloud in turn according to the scaling factor S, the rotation matrix R and the translation vector T to obtain the dense point cloud of the local scour terrain in the world coordinate system.
CN202111214414.0A 2021-10-19 2021-10-19 Pier local scour terrain measurement method based on mobile terminal Active CN114088063B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111214414.0A CN114088063B (en) 2021-10-19 2021-10-19 Pier local scour terrain measurement method based on mobile terminal


Publications (2)

Publication Number Publication Date
CN114088063A true CN114088063A (en) 2022-02-25
CN114088063B CN114088063B (en) 2024-02-02


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116844142A (en) * 2023-08-28 2023-10-03 四川华腾公路试验检测有限责任公司 Bridge foundation scouring identification and assessment method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2747032A1 (en) * 2012-12-21 2014-06-25 3D Reality Maps GmbH Method for the photorealistic visualisation of a three-dimensional terrain data set in real-time
KR101486467B1 (en) * 2014-07-22 2015-01-28 주식회사 지오스토리 Map data updating system for confirming topography change by gps coordinate data
CN104501779A (en) * 2015-01-09 2015-04-08 中国人民解放军63961部队 High-accuracy target positioning method of unmanned plane on basis of multi-station measurement
KR20170038569A (en) * 2015-09-30 2017-04-07 엘지전자 주식회사 Mobile terminal and method for controlling the same
JP2017207438A (en) * 2016-05-20 2017-11-24 アジア航測株式会社 Topographic change analysis method
KR101813203B1 (en) * 2017-05-23 2017-12-28 (주)미도지리정보 Digital map update system according to change of terrain environment
CN109754429A (en) * 2018-12-14 2019-05-14 东南大学 A kind of deflection of bridge structure measurement method based on image
CN110180186A (en) * 2019-05-28 2019-08-30 北京奇思妙想信息技术有限公司 A kind of topographic map conversion method and system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
FEROZ, SAINAB: "UAV-Based Remote Sensing Applications for Bridge Condition Assessment", REMOTE SENSING, vol. 13, no. 9
PENG Guoping: "Study on the relationship between the flow structure around bridge piers and scour terrain response", China Masters' Theses Full-text Database, Basic Sciences, no. 1
ZHAO Zhiwen: "Application of structure-from-motion multi-view stereo reconstruction in terrain measurement of river engineering models", Journal of Sediment Research, vol. 44, no. 5


Also Published As

Publication number Publication date
CN114088063B (en) 2024-02-02


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant