CN114812514B - Tidal bore tide head line form and tide head propulsion speed on-site measurement method - Google Patents

Tidal bore tide head line form and tide head propulsion speed on-site measurement method

Info

Publication number
CN114812514B
CN114812514B (application number CN202210394952.0A)
Authority
CN
China
Prior art keywords
tide
tidal bore
image
head
head line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210394952.0A
Other languages
Chinese (zh)
Other versions
CN114812514A (en)
Inventor
杨元平
陈甫源
张芝永
何昆
陈刚
王瑞锋
陈韬霄
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang Institute of Hydraulics and Estuary
Original Assignee
Zhejiang Institute of Hydraulics and Estuary
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang Institute of Hydraulics and Estuary filed Critical Zhejiang Institute of Hydraulics and Estuary
Priority to CN202210394952.0A priority Critical patent/CN114812514B/en
Publication of CN114812514A publication Critical patent/CN114812514A/en
Application granted granted Critical
Publication of CN114812514B publication Critical patent/CN114812514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 21/00 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
    • G01B 21/02 Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant for measuring length, width, or thickness
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/04 Interpretation of pictures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S 19/38 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S 19/39 Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S 19/42 Determining position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/80 Geometric correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/13 Edge detection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T 7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T 7/337 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10024 Color image
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A 90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A 90/30 Assessment of water resources

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a field measurement method for the tidal bore head line form and the tide head advancing speed. A high-resolution camera carried by an unmanned aerial vehicle is flown above the tidal bore and photographs the tide head at fixed intervals, yielding tide head images covering the whole measured river reach. The images are corrected for distortion and converted into a unified coordinate system to form a complete orthographic image of the tide head line at each moment. Using the distinct boundary features at the tide head line, its position in each orthophoto is identified, giving the plan position of the tide head line moment by moment; the tide head advancing speed is then obtained from the line positions at successive moments. The method obtains the overall plan form of the tide head line and the tide head advancing speed with wide coverage, high efficiency and in a single campaign, supports analysis of the bore advancing speed over the tidal flats and in the main channel, provides technical support for safe public tide watching, allows the impact of the bore on the river reach to be analysed, and is of great significance for the construction and protection of bore-affected river reaches.

Description

Tidal bore tide head line form and tide head propulsion speed on-site measurement method
Technical Field
The invention belongs to the technical field of tidal bore on-site measurement methods, and particularly relates to a tidal bore head line form and tidal bore head propulsion speed on-site measurement method.
Background
As a tidal bore advances up a river channel, the tide head develops various plan-line patterns such as the crossed bore, the one-line bore and the returning bore. The plan geometry of the tide head line and the advancing speed of each point along it are closely related to the bore height, the water depth ahead of the bore and the underwater topography, and both are key subjects of tidal bore research. Restricted by observation techniques and conditions, existing work relies mainly on near-shore fixed-point observation and shore-based photography, which can only yield the average advancing speed of the tide head line near the bank; the instantaneous advancing speed along the whole tide head line is missing and its overall plan geometry cannot be analysed quantitatively. The method of this patent quantitatively resolves the plan geometry of the tide head line and the advancing speed Ci of each point along the whole line, providing field observation results for studies of tidal bore dynamics and morphological evolution. By measuring the plan form, advancing speed and advancing direction of the tidal bore in the field, the impact of the bore on structures such as embankments, spur dikes, bridges and wharves along the river reach can be analysed; the method can also serve as a tidal bore monitoring technique for observing and warning of bore movement during tide watching, providing technical support for public tide-watching safety.
The foregoing background is intended to help those of ordinary skill in the art understand the prior art close to the present invention and to facilitate understanding of the inventive concept and technical solution; it should not be used to assess the novelty of the present application unless there is explicit evidence that it was disclosed before the filing date of the present application.
Disclosure of Invention
The object of the invention is to provide a field measurement method for the tidal bore head line form and the tide head advancing speed that can obtain, with wide coverage, high efficiency and in a single campaign, the overall plan form of the tide head line and the tide head advancing speed at different spatial positions and moments. This facilitates analysis of the bore advancing speed over the tidal flats and in the main channel, provides technical support for public tide-watching safety, allows the impact of the bore on structures such as embankments, spur dikes, bridges and wharves to be analysed, and is of great significance for the construction and protection of bore-affected river reaches.
In order to achieve the above object, the present invention provides the following technical solutions.
A tidal bore tide head line shape and tide head advancing speed field measuring method comprises the following steps:
step 1: an unmanned aerial vehicle carrying a high-resolution camera, a GPS positioning instrument and a range finder is flown above the tidal bore, and the tide head is photographed at intervals Δt to obtain tide head images of the whole measured river reach;
step 2: the tide head images are corrected and, using the camera parameters, converted into a unified coordinate system to form a complete orthographic image of the tide head line at each moment;
step 3: using the distinct boundary features at the tide head line position, the position of the tide head line in the orthographic image is identified by boundary identification, and the plan position of the tide head line is analysed moment by moment;
step 4: the tide head advancing speed is obtained from the tide head line positions at successive moments.
Further, in step 1, at least one unmanned aerial vehicle is provided.
Further, in step 1, the tide head image of the whole measured river reach can be obtained by having several unmanned aerial vehicles photograph cooperatively, each covering a segment, and stitching the segments together.
Further, in step 1, the unmanned aerial vehicle hovers at a fixed point with the main optical axis of the camera kept vertical to the ground (water surface); the camera is triggered to photograph, spatial positioning is performed by GPS at the same time, the range finder measures the distance between the aircraft and the water surface, and the camera shooting parameters are obtained from the known mounting relationship between the camera, the GPS and the range finder.
Further, in step 1, the camera is calibrated before the photographic survey to determine the distortion correction parameters, which are then used to correct the images.
Further, in the step 2, the correction of the tidal bore head image comprises image radial distortion correction and image tangential distortion correction.
Further, the image radial distortion correction formula is:
x₀ = x(1 + k₁r² + k₂r⁴ + k₃r⁶)
y₀ = y(1 + k₁r² + k₂r⁴ + k₃r⁶)
where (x₀, y₀) is the original pixel position of the distorted point on the original image, (x, y) is the pixel position on the corrected image, r is the radial distance of the point from the distortion centre, and k₁, k₂, k₃ are distortion correction parameters related to the camera lens.
Further, the image tangential distortion correction formula is:
x₀ = x + [2p₁y + p₂(r² + 2x²)]
y₀ = y + [2p₂x + p₁(r² + 2y²)]
where (x₀, y₀) is the original pixel position of the distorted point on the original image, (x, y) is the pixel position on the corrected image, and p₁, p₂ are distortion correction parameters related to the camera lens.
Existing cameras can keep perspective distortion low but cannot eliminate it completely; in particular, the lens edges introduce varying degrees of distortion, and tangential distortion arises when the lens is not parallel to the camera sensor plane (imaging plane) or image plane. Correcting the pictures for these distortions to a large extent improves the accuracy of the subsequent registration and identification.
Further, in step 2, the camera parameters include camera space coordinates, object distance, and distance.
Further, in step 2, converting the processed images into a unified coordinate system means combining absolute registration and relative registration of the images so that all photographed images are placed in the same coordinate system.
Absolute registration means that a coordinate system is defined first and every image is registered to that coordinate grid; that is, each component image is geometrically rectified so that the coordinate systems are unified. This can be done by computer with manual assistance.
Relative registration means that one of the images is chosen as the reference and the other related images are registered to it, which can be done automatically by a matching algorithm. The coordinate system of the reference may be arbitrary, but when the coordinates of the reference image are known, an image matched to it automatically shares the reference image's coordinate system.
Further, converting the processed images into a unified coordinate system specifically comprises: performing absolute registration on the first image and registering subsequently photographed images relative to the first image; images registered in this way share the coordinate system of the first image, so absolute registration is achieved for them as well.
Furthermore, fixed objects are selected for relative registration; objects whose position changes or that deform during shooting cannot be used for relative registration.
Further, in step 3, the position of the tide head line in the orthophoto image is identified by boundary identification as follows:
First, the picture is read into the tide-line identification module and the original colour picture is converted into a grey-scale picture using the formula
Gray = 0.299R + 0.587G + 0.114B
where Gray is the grey-scale value and R, G, B are the red, green and blue channel values of the colour image.
Edges are then detected and identified with the Canny operator, which first applies Gaussian filtering [the filtering formula is given as an image in the original]. After Gaussian filtering, the image gradient is computed, non-maximum suppression and double-threshold screening are applied, and edge linking yields the object edges, i.e. the tidal bore head line.
Further, the Gaussian filter is the kernel given as an image in the original.
further, in step 4, the advancing speed of the tide head includes the advancing speed of the tide head at each point on the tide head line, the advancing speed of the whole tide head line, and the advancing speed of the tide head line at each moment.
Further, in step 4, the specific steps of obtaining the tide head advancing speed from the tide head line positions at successive moments are as follows: all images are converted into the actual coordinate system; after the tide head line has been identified at every moment, the advancing speed of the tide head at any point P on the line can be calculated. Overlaying the tide head lines at times tᵢ and tᵢ₊₁, the advance of the tide head at P between the two moments is dSᵢ, and the advancing speed at P is Cᵢ = dSᵢ/(tᵢ₊₁ - tᵢ). Applying the same calculation to every point on the line gives the advancing-speed distribution along the whole tide head line, and repeating it for each pair of moments gives the advancing speed of the tide head line at each moment.
By combining the advancing speed of each point on the tide head line with the known extent not yet reached by the bore, the arrival time and bore speed at every point of the river reach can be calculated. This facilitates observation and early warning of bore movement during tide watching, provides technical support for public tide-watching safety, and allows the impact of the bore on structures such as embankments, spur dikes, bridges and wharves to be analysed, which is of great significance for the construction and protection of dikes, river beds and other works along bore-affected river reaches.
The above-mentioned preferable conditions can be combined with each other to obtain a specific embodiment on the basis of common knowledge in the art.
The beneficial effects of the invention are as follows:
carrying a high-definition camera on an unmanned plane to measure the tidal bore plane geometry and the tidal bore line advancing speed, (1) having wide coverage and high measuring efficiency, and obtaining the tidal bore line overall plane shape at one time; (2) simultaneously acquiring the advancing speed of the tide head at each position of the tide head line; (3) The tidal bore head propulsion speeds of different spatial positions are obtained, so that tidal flat ground tidal bore propulsion speed analysis of the main tank is facilitated; (4) Shooting can be tracked, and shooting measurement can be performed when tidal bore moves to and the tidal bore moves to; (5) The actual measurement is convenient, the implementation is flexible, the unmanned aerial vehicle measurement is not limited by the position, the measurement can be flexibly carried out, the observation can be carried out on a fixed river reach, and the observation can be carried out on a temporary needed measurement river reach; (6) Calculating the time for tidal bore to reach each point in the river reach and the tidal bore speed, facilitating the observation and early warning of tidal bore movement during tidal bore observation and providing technical support for public tidal bore observation safety; (7) The impact of the tidal bore river reach on structures such as embankments, spur dams, bridges, wharfs and the like can be analyzed, and the method has important significance for construction and protection of the tidal bore river reach such as dykes and dams, water bottoms and the like.
With the above technical solution the invention achieves its object, remedies the deficiencies of the prior art, and is reasonably designed and convenient to operate.
Drawings
The foregoing and/or other objects, features, advantages and embodiments of the invention will be apparent from the following description taken in conjunction with the accompanying drawings in which:
FIG. 1 is a schematic view of a multi-shot tidal bore;
FIG. 2 is a tidal bore diagram;
FIG. 3 is a schematic diagram of absolute registration coordinate transformation;
FIG. 4 is a schematic diagram of a tidal bore head identification process;
FIG. 5 is a schematic view of a tidal head line;
fig. 6 is a schematic view of the calculation of the tidal head line advancing speed.
Detailed Description
Unless defined otherwise, technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. Although methods and tools similar or equivalent to those described herein can be used in the practice or testing of the present invention, suitable methods and tools are described herein; other suitable methods and tools known in the art may also be used. The tools, methods and examples described herein are illustrative only and are not intended to be limiting. All publications, patent applications, patents, provisional applications, database entries and other references mentioned herein are incorporated herein by reference in their entirety; in case of conflict, the present specification, including definitions, will control.
The present invention is described in detail below.
Example 1:
This example of the field measurement method for the tidal bore head line form and the tide head advancing speed uses an unmanned aerial vehicle carrying a high-resolution camera, a GPS positioning instrument and a range finder, flown above the tidal bore, to photograph the tide head at intervals Δt and obtain tide head images of the whole measured river reach. The specific procedure is as follows:
As shown in fig. 1 and fig. 2, photographs are taken from an unmanned aerial vehicle carrying a high-resolution camera, with positioning by GPS and ranging by a range finder. The camera is calibrated before the photographic survey and the distortion correction parameters are determined. When the bore arrives, unmanned aerial vehicle D1 flies above the tidal bore, hovers at a fixed point with the main optical axis of the camera kept vertical to the ground (water surface), and photographs at the fixed time interval Δt. While photographing, spatial positioning is performed by GPS and the range finder measures the distance to the water surface; the camera shooting parameters are obtained from the known mounting relationship between the camera, the GPS and the range finder.
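The patent does not spell out how the shooting parameters are turned into plan coordinates. As one possible reading, the Python sketch below assumes a nadir-pointing pinhole camera whose image x axis points east, with the flight height taken from the range finder and the camera position from the GPS (projected into a plane coordinate system); all names and values are illustrative, not taken from the patent.

    def pixel_to_ground(u, v, cam_east, cam_north, height_m, focal_px, cx, cy):
        """Convert an image pixel (u, v) to plan (east, north) coordinates for a
        nadir-pointing camera hovering height_m above the water surface (range-finder
        reading) at plan position (cam_east, cam_north) from the GPS. Assumes the
        image x axis points east and image rows increase towards the south."""
        gsd = height_m / focal_px              # ground sample distance, metres per pixel
        east = cam_east + (u - cx) * gsd
        north = cam_north - (v - cy) * gsd     # image rows increase downwards
        return east, north

    # Hovering 150 m above the water with a 3000-pixel focal length, one pixel
    # covers 150 / 3000 = 0.05 m on the water surface.
    print(pixel_to_ground(2200, 900, 500123.0, 3356789.0, 150.0, 3000.0, 1920.0, 1080.0))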
When the tide head line is about to move out of the photographing range, an identically equipped unmanned aerial vehicle D2 can be flown to the next observation point in advance to carry out the same observation. The overlap between the photos taken by D2 and the already-registered photos is used for relative registration; once this registration is completed, the later photos from D2 are registered to the registered set and share the same coordinate system, which reduces stitching misalignment. If the photos from D1, D2, … were placed by their absolute coordinates alone, unavoidable errors between those absolute coordinates would cause the photos to deviate from one another; combining absolute registration with relative registration therefore greatly reduces stitching misalignment and improves measurement accuracy.
Likewise, when the tide head line is about to move out of the shooting range of the D2 camera, a third unmanned aerial vehicle can take off to shoot in relay, and similarly a fourth, fifth and so on, until the combined shooting range of the aircraft covers the whole tide head line at the same time. When several aircraft shoot together, they keep the same shooting interval and synchronise their shooting times.
Example 2:
Because the camera imaging process introduces perspective-related image distortion, the photos are corrected for distortion before registration and identification. On the basis of the previous example, the tide head images are corrected and, using the camera parameters (including the camera spatial coordinates, object distance and distance), converted into a unified coordinate system to form a complete orthographic image of the tide head line at each moment, as follows:
(1) Image radial distortion correction:
Existing camera technology can keep perspective distortion low but cannot eliminate it completely; in particular, the lens edges introduce varying degrees of distortion. The radial distortion correction formula applied in this embodiment is as follows:
x₀ = x(1 + k₁r² + k₂r⁴ + k₃r⁶)
y₀ = y(1 + k₁r² + k₂r⁴ + k₃r⁶)
where (x₀, y₀) is the original pixel position of the distorted point on the original image, (x, y) is the pixel position on the corrected image, and k₁, k₂, k₃ are distortion correction parameters related to the camera lens.
(2) Correction of tangential distortion of an image:
tangential distortion is generated by the lens itself being non-parallel to the camera sensor plane (imaging plane) or image plane, and the correction formula is as follows:
x₀ = x + [2p₁y + p₂(r² + 2x²)]
y₀ = y + [2p₂x + p₁(r² + 2y²)]
where (x₀, y₀) is the original pixel position of the distorted point on the original image, (x, y) is the pixel position on the corrected image, and p₁, p₂ are distortion correction parameters related to the camera lens.
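As a concrete illustration of this embodiment's correction step, the sketch below applies OpenCV's standard Brown-Conrady distortion model through cv2.undistort; the intrinsic and distortion values are placeholders standing in for the calibration results, not values taken from the patent.

    import cv2
    import numpy as np

    # Placeholder intrinsics (pixels) and distortion coefficients from camera calibration.
    fx, fy, cx, cy = 3666.0, 3666.0, 1920.0, 1080.0
    k1, k2, k3, p1, p2 = -0.12, 0.03, 0.0, 1e-4, -2e-4

    camera_matrix = np.array([[fx, 0, cx],
                              [0, fy, cy],
                              [0,  0,  1]], dtype=np.float64)
    dist_coeffs = np.array([k1, k2, p1, p2, k3], dtype=np.float64)  # OpenCV's ordering

    img = cv2.imread("bore_frame_0001.jpg")                  # one tidal bore photo
    undistorted = cv2.undistort(img, camera_matrix, dist_coeffs)
    cv2.imwrite("bore_frame_0001_undistorted.jpg", undistorted)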
Example 3:
On the basis of the foregoing examples, the processed images are converted into a unified coordinate system by combining absolute registration with relative registration so that all photographed images share the same coordinate system, specifically as follows:
(1) Absolute registration of the first picture:
As shown in fig. 3, a transformation in the two-dimensional plane requires two translation parameters dx and dy along the x and y directions and a rotation parameter θ about the normal to the plane; in addition there is a scale factor m. The transformation relationship (given as an image in the original) has the form of a two-dimensional similarity transform
x = m(x′cosθ - y′sinθ) + dx
y = m(x′sinθ + y′cosθ) + dy
where (x, y) is the coordinate vector of a control feature point in the XOY coordinate system (the actual coordinate system) and (x′, y′) is its coordinate vector in the X′O′Y′ frame (the pixel frame).
The coordinate system conversion takes the pixel coordinate system as the source coordinate system and the actual coordinate system as the target coordinate system. The coordinates of the control points P1 and P2 are known in the XOY coordinate system (target coordinate system) as (x₁, y₁) and (x₂, y₂), and in the X′O′Y′ coordinate system (source coordinate system) as (x′₁, y′₁) and (x′₂, y′₂).
First the origin of the X′O′Y′ coordinate system is translated to P1, giving a new coordinate system X″O″Y″ in which the coordinates of P1 and P2 are (0, 0) and (x′₂ - x′₁, y′₂ - y′₁), i.e. P1 lies at the origin of X″O″Y″. The translation parameters dx and dy, the rotation parameter θ and the scale parameter m can then be obtained by direct calculation. The specific formulas are as follows:
translation parameters: [formula given as an image in the original]
rotation parameter: [formula given as an image in the original]
scale ratio parameter: [formula given as an image in the original]
Solving these four parameters realises the transfer from the source coordinate system to the target coordinate system.
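The closed-form expressions for dx, dy, θ and m appear only as images in the source text. The Python sketch below shows one standard way to recover them from the two control points under the similarity-transform model described above; it is a reconstruction for illustration, and the function names are not taken from the patent.

    import math

    def similarity_from_two_points(p1_xy, p2_xy, p1_uv, p2_uv):
        """Recover the absolute-registration parameters (dx, dy, theta, m) of a
        2D similarity transform from two control points, given their coordinates
        in the target XOY system (p*_xy) and in the pixel X'O'Y' system (p*_uv)."""
        (x1, y1), (x2, y2) = p1_xy, p2_xy
        (u1, v1), (u2, v2) = p1_uv, p2_uv
        # rotation: difference between the segment directions in the two systems
        theta = math.atan2(y2 - y1, x2 - x1) - math.atan2(v2 - v1, u2 - u1)
        # scale: ratio of the segment lengths
        m = math.hypot(x2 - x1, y2 - y1) / math.hypot(u2 - u1, v2 - v1)
        # translation: whatever remains after rotating and scaling P1
        dx = x1 - m * (u1 * math.cos(theta) - v1 * math.sin(theta))
        dy = y1 - m * (u1 * math.sin(theta) + v1 * math.cos(theta))
        return dx, dy, theta, m

    def to_target(u, v, dx, dy, theta, m):
        """Map a pixel coordinate (u, v) into the target XOY system."""
        x = dx + m * (u * math.cos(theta) - v * math.sin(theta))
        y = dy + m * (u * math.sin(theta) + v * math.cos(theta))
        return x, y

With more than two control points, the same four parameters can be estimated by least squares, which makes the absolute registration less sensitive to the error of any single control point.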
(2) Relative registration is performed on the rest of the pictures:
After the absolute registration of the first image is completed, the remaining images shot later are registered relative to the first image; images registered in this way share the coordinate system of the first image, so absolute registration is achieved for them as well. Note that relative registration must use fixed objects (e.g. the bank); objects that move or deform during shooting (e.g. floats, ships or water-surface spray) cannot be used for relative registration.
Relative registration of images extracts features from the two images, finds matching homonymous points (feature point pairs) by similarity measurement, obtains the image-space coordinate transformation parameters from the matched point pairs, and finally registers the images with those transformation parameters. The relative registration is typically fitted by a suitable polynomial describing the translation, rotation and affine transformation between the two images. A feature-based method is used to determine the registration control points (RCPs), obtain the image registration mapping relationship and determine the coefficients of the transfer polynomial; image matching algorithms are well documented in the literature and are not described in detail here.
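Since the patent leaves the concrete matching algorithm open, the following Python sketch shows one common feature-based choice: ORB keypoints, brute-force matching and a RANSAC-estimated similarity transform in OpenCV. The optional shore_mask argument reflects the earlier requirement that only fixed objects (e.g. the banks) be used for relative registration; the function name and parameter values are illustrative, not prescribed by the patent.

    import cv2
    import numpy as np

    def register_to_reference(ref_gray, new_gray, shore_mask=None):
        """Relative registration of a newly shot frame against the reference frame:
        detect ORB features (optionally only inside shore_mask, i.e. on fixed
        objects), match them, estimate a similarity transform with RANSAC and
        warp the new frame into the reference frame's coordinate system."""
        orb = cv2.ORB_create(4000)
        k_ref, d_ref = orb.detectAndCompute(ref_gray, shore_mask)
        k_new, d_new = orb.detectAndCompute(new_gray, shore_mask)
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d_new, d_ref), key=lambda mt: mt.distance)[:500]
        src = np.float32([k_new[mt.queryIdx].pt for mt in matches]).reshape(-1, 1, 2)
        dst = np.float32([k_ref[mt.trainIdx].pt for mt in matches]).reshape(-1, 1, 2)
        warp, _ = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
        h, w = ref_gray.shape
        return cv2.warpAffine(new_gray, warp, (w, h))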
Example 4:
On the basis of the previous examples, the position of the tide head line in the orthographic image is identified from its distinct boundary features by boundary identification, and the plan position of the tide head line is analysed moment by moment, as shown in fig. 4 and fig. 5, specifically as follows:
First, the picture is read into the tide-line identification module and the original colour picture is converted into a grey-scale picture using the formula
Gray = 0.299R + 0.587G + 0.114B
where Gray is the grey-scale value and R, G, B are the red, green and blue channel values of the colour image.
Edges are then detected and identified with the Canny operator, which first applies Gaussian filtering [the filtering formula is given as an image in the original]. After Gaussian filtering, the image gradient is computed, non-maximum suppression and double-threshold screening are applied, and edge linking yields the object edges, i.e. the tidal bore head line.
The Gaussian filter is the kernel given as an image in the original.
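A minimal Python/OpenCV sketch of this identification step is given below. The Gaussian kernel size and Canny thresholds are illustrative values that would be tuned on real bore imagery, and keeping the longest detected contour as the tide head line is an assumption added for the example rather than a rule stated in the patent.

    import cv2

    # Convert the orthophoto to grey scale (cv2 uses Gray = 0.299R + 0.587G + 0.114B),
    # smooth with a Gaussian filter, then run the Canny detector; gradient computation,
    # non-maximum suppression and double-threshold hysteresis are built into cv2.Canny.
    ortho = cv2.imread("orthophoto_t0.jpg")
    gray = cv2.cvtColor(ortho, cv2.COLOR_BGR2GRAY)
    smoothed = cv2.GaussianBlur(gray, (5, 5), sigmaX=1.4)
    edges = cv2.Canny(smoothed, threshold1=50, threshold2=150)

    # Keep the longest connected edge as the candidate tide head line.
    contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_NONE)
    tide_head_line = max(contours, key=lambda c: cv2.arcLength(c, False)) if contours else None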
example 5:
On the basis of the foregoing examples, the specific steps for obtaining the tide head advancing speed from the tide head line positions at successive moments are as follows: as shown in fig. 6, all images are converted into the actual coordinate system; after the tide head line has been identified at every moment, the advancing speed of the tide head at any point P on the line can be calculated. Overlaying the tide head lines at times tᵢ and tᵢ₊₁, the advance of the tide head at P between the two moments is dSᵢ, and the advancing speed at P is Cᵢ = dSᵢ/(tᵢ₊₁ - tᵢ). Applying the same calculation to every point on the line gives the advancing-speed distribution along the whole tide head line, and repeating it for each pair of moments gives the advancing speed of the tide head line at each moment.
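A minimal Python sketch of this calculation for two digitised tide head lines in the actual coordinate system is given below; it reads dSᵢ as the nearest-point distance between the overlaid lines, which is one simple interpretation of fig. 6 rather than a measurement rule fixed by the patent.

    import numpy as np

    def tide_head_speeds(line_t0, line_t1, dt_seconds):
        """For every point P of the tide head line at time t_i (line_t0, an (N, 2)
        array of plan coordinates in metres), take the distance dS_i to the nearest
        point of the line at t_(i+1) (line_t1) as the local advance of the tide head,
        and return C_i = dS_i / dt for each point."""
        line_t0 = np.asarray(line_t0, dtype=float)
        line_t1 = np.asarray(line_t1, dtype=float)
        diff = line_t0[:, None, :] - line_t1[None, :, :]     # pairwise offsets
        ds = np.sqrt((diff ** 2).sum(axis=2)).min(axis=1)    # dS_i for each point
        return ds / dt_seconds                               # advancing speed C_i in m/s

Measuring the advance along the local normal of the tide head line would be a refinement of the same idea.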
By combining the advancing speed of each point on the tide head line with the known extent not yet reached by the bore, the arrival time and bore speed at every point of the river reach can be calculated. This facilitates observation and early warning of bore movement during tide watching, provides technical support for public tide-watching safety, and allows the impact of the bore on structures such as embankments, spur dikes, bridges and wharves to be analysed, which is of great significance for the construction and protection of dikes, river beds and other works along bore-affected river reaches.
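The early-warning use just described reduces to a simple estimate once the local advancing speed is known; the short sketch below illustrates it, with distances and speeds chosen purely as example values.

    def arrival_times(distances_m, speeds_mps):
        """Early-warning estimate: for each point of interest not yet reached by the
        bore, the expected arrival time is the remaining distance divided by the
        advancing speed measured at the nearest point of the tide head line,
        assuming that speed stays roughly constant over the remaining stretch."""
        return [d / c for d, c in zip(distances_m, speeds_mps)]

    # e.g. two viewing platforms 600 m and 1500 m ahead, local speeds 5 m/s and 6 m/s
    print(arrival_times([600.0, 1500.0], [5.0, 6.0]))   # [120.0, 250.0] seconds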
The conventional technology in the above embodiments is known to those skilled in the art, and thus is not described in detail herein.
The specific embodiments described herein are offered by way of example only to illustrate the spirit of the invention. Various modifications or additions to the described embodiments may be made by those skilled in the art to which the invention pertains or may be substituted in a similar manner without departing from the spirit of the invention or beyond the scope of the appended claims.
While the invention has been described in detail and with reference to specific embodiments thereof, it will be apparent to one skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope thereof.
While the above detailed description has shown, described, and pointed out novel features as applied to various embodiments, it will be understood that various omissions, substitutions, and changes in the form and details of the device or method illustrated may be made without departing from the spirit of the disclosure. In addition, the various features and methods described above may be used independently of one another, or may be combined in various ways. All possible combinations and subcombinations are intended to fall within the scope of this disclosure. Many of the embodiments described above include similar components, and thus, these similar components are interchangeable in different embodiments. While the invention has been disclosed in the context of certain embodiments and examples, it will be understood by those skilled in the art that the invention extends beyond the specifically disclosed embodiments to other alternative embodiments and/or uses and obvious modifications and equivalents thereof. Therefore, the present invention is not intended to be limited by the specific disclosure of the preferred embodiments herein.
Matters not described in detail in the present invention are well-known techniques.

Claims (8)

1. A tidal bore tide head line shape and tide head propelling speed field measurement method is characterized by comprising the following steps:
step 1: an unmanned aerial vehicle carrying a high-resolution camera, a GPS positioning instrument and a range finder is flown above the tidal bore, and the tide head is photographed at intervals Δt to obtain tide head images of the whole measured river reach;
step 2: the tide head images are corrected and, using the camera parameters, converted into a unified coordinate system to form a complete orthographic image of the tide head line at each moment;
step 3: using the distinct boundary features at the tide head line position, the position of the tide head line in the orthographic image is identified by boundary identification, and the plan position of the tide head line is analysed moment by moment;
step 4: the tide head advancing speed is obtained from the tide head line positions at successive moments;
in the step 2, the correction of the tidal bore head image comprises image radial distortion correction and image tangential distortion correction:
the image radial distortion correction formula is:
x₀ = x(1 + k₁r² + k₂r⁴ + k₃r⁶)
y₀ = y(1 + k₁r² + k₂r⁴ + k₃r⁶)
where (x₀, y₀) is the original pixel position of the distorted point on the original image, (x, y) is the pixel position on the corrected image, and k₁, k₂, k₃ are distortion correction parameters related to the camera lens;
the image tangential distortion correction formula is:
x₀ = x + [2p₁y + p₂(r² + 2x²)]
y₀ = y + [2p₂x + p₁(r² + 2y²)]
where (x₀, y₀) is the original pixel position of the distorted point on the original image, (x, y) is the pixel position on the corrected image, and p₁, p₂ are distortion correction parameters related to the camera lens;
in step 2, converting the processed images into a unified coordinate system to form a complete orthographic image of the tide head line at each moment specifically comprises: performing absolute registration on the first image and registering subsequently photographed images relative to the first image, so that images registered in this way share the coordinate system of the first image and absolute registration is likewise achieved;
when the tide head line is about to move out of the photographing range, an identically equipped unmanned aerial vehicle D2 can be flown to the next observation point in advance to carry out the same observation; the overlap between the photos taken by D2 and those of the first unmanned aerial vehicle D1 is used for relative registration, and once this registration is completed the later photos from D2 are registered to the already-registered photos, reducing stitching misalignment and giving them the same coordinate system;
when the tide head line is about to move out of the shooting range of the D2 camera, a third unmanned aerial vehicle can take off to shoot in relay, and similarly a fourth, fifth and so on, until the combined shooting range of the aircraft covers the whole tide head line at the same time; when several aircraft shoot together, they keep the same shooting interval and synchronise their shooting times.
2. The method according to claim 1, characterized in that: in step 1, the tide head image of the whole measured river reach can be obtained by having several unmanned aerial vehicles photograph cooperatively, each covering a segment, and stitching the segments together.
3. The method according to claim 1 or 2, characterized in that: in step 1, the unmanned aerial vehicle hovers at a fixed point with the main optical axis of the camera vertical to the water surface, the camera is triggered to photograph, spatial positioning is performed by GPS while photographing, the range finder measures the distance between the aircraft and the water surface, and the camera shooting parameters are obtained from the known mounting relationship between the camera, the GPS and the range finder.
4. The method according to claim 1 or 2, characterized in that: in step 1, before the photographic survey the camera is calibrated and the distortion correction parameters are determined, and these parameters are used to correct the images.
5. The method according to claim 1 or 2, characterized in that: in step 2, the camera parameters include camera space coordinates, object distance, and distance.
6. The method according to claim 1 or 2, characterized in that: in step 3, the specific steps of identifying the position of the tide head line in the orthophoto image by boundary identification are as follows:
firstly, the picture is read into the tide-line identification module and the original colour picture is converted into a grey-scale picture using the formula
Gray = 0.299R + 0.587G + 0.114B
where Gray is the grey-scale value and R, G, B are the red, green and blue channel values of the colour image;
edges are then detected and identified with the Canny operator, which first applies Gaussian filtering [the filtering formula is given as an image in the original]; after Gaussian filtering, the image gradient is computed, non-maximum suppression and double-threshold screening are applied, and edge linking yields the object edges, i.e. the tidal bore head line.
7. The method according to claim 6, wherein: the Gaussian filter is the kernel given as an image in the original.
8. The method according to claim 1 or 2, characterized in that: in step 4, the specific steps of obtaining the tide head advancing speed from the tide head line positions at successive moments are: all images are converted into the actual coordinate system; after the tide head line has been identified at every moment, the advancing speed of the tide head at any point P on the line can be calculated; overlaying the tide head lines at times tᵢ and tᵢ₊₁, the advance of the tide head at P between the two moments is dSᵢ, and the advancing speed at P is Cᵢ = dSᵢ/(tᵢ₊₁ - tᵢ); the same calculation applied to every point on the line gives the advancing-speed distribution along the whole tide head line, and repeating it for each pair of moments gives the advancing speed of the tide head line at each moment.
CN202210394952.0A 2022-04-15 2022-04-15 Tidal bore tide head line form and tide head propulsion speed on-site measurement method Active CN114812514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210394952.0A CN114812514B (en) 2022-04-15 2022-04-15 Tidal bore tide head line form and tide head propulsion speed on-site measurement method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210394952.0A CN114812514B (en) 2022-04-15 2022-04-15 Tidal bore tide head line form and tide head propulsion speed on-site measurement method

Publications (2)

Publication Number Publication Date
CN114812514A CN114812514A (en) 2022-07-29
CN114812514B true CN114812514B (en) 2023-06-16

Family

ID=82537318

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210394952.0A Active CN114812514B (en) 2022-04-15 2022-04-15 Tidal bore tide head line form and tide head propulsion speed on-site measurement method

Country Status (1)

Country Link
CN (1) CN114812514B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040022348A (en) * 2002-09-05 2004-03-12 한국전자통신연구원 Image-based lens distortion correction method and apparatus
JP2017133902A (en) * 2016-01-27 2017-08-03 国立大学法人 千葉大学 Wave measurement device and target detection device
CN110388899A (en) * 2019-06-04 2019-10-29 浙江省水利河口研究院 The personal module calculated for tidal bore flow velocity vertical characteristics
KR102113791B1 (en) * 2018-12-26 2020-05-21 단국대학교 산학협력단 Flow estimation method of streams based on non-uniform flow with remote sensed imagery
CN111914695A (en) * 2020-07-16 2020-11-10 河海大学 Tidal bore monitoring method based on machine vision
CN112819249A (en) * 2021-02-26 2021-05-18 自然资源部第二海洋研究所 Tidal current harmonic analysis and calculation method based on sailing ADCP observation ocean current data
CN112880645A (en) * 2021-02-20 2021-06-01 自然资源部第一海洋研究所 Sea wave surface three-dimensional model construction system and method based on three-dimensional mapping mode

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9852516B2 (en) * 2015-01-30 2017-12-26 Raytheon Company Apparatus and processes for estimating river velocity
US11010509B2 (en) * 2018-05-23 2021-05-18 Nvidia Corporation Systems and methods for computer simulation of detailed waves for large-scale water simulation
CN112614076A (en) * 2020-12-29 2021-04-06 广东省傲来科技有限公司 Method and device for correcting lens MTF distortion
CN113344953B (en) * 2021-04-21 2023-07-04 中国计量大学 Machine vision tidal bore flow velocity measurement method based on unmanned aerial vehicle

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20040022348A (en) * 2002-09-05 2004-03-12 한국전자통신연구원 Image-based lens distortion correction method and apparatus
JP2017133902A (en) * 2016-01-27 2017-08-03 国立大学法人 千葉大学 Wave measurement device and target detection device
KR102113791B1 (en) * 2018-12-26 2020-05-21 단국대학교 산학협력단 Flow estimation method of streams based on non-uniform flow with remote sensed imagery
CN110388899A (en) * 2019-06-04 2019-10-29 浙江省水利河口研究院 The personal module calculated for tidal bore flow velocity vertical characteristics
CN111914695A (en) * 2020-07-16 2020-11-10 河海大学 Tidal bore monitoring method based on machine vision
CN112880645A (en) * 2021-02-20 2021-06-01 自然资源部第一海洋研究所 Sea wave surface three-dimensional model construction system and method based on three-dimensional mapping mode
CN112819249A (en) * 2021-02-26 2021-05-18 自然资源部第二海洋研究所 Tidal current harmonic analysis and calculation method based on sailing ADCP observation ocean current data

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Radial basis function network-based model of the Qiantang River tidal bore and its application; Zeng Jian; Sun Zhilin; Xiong Shaolong; Chen Gang; Journal of Zhejiang University (Engineering Science), No. 9; full text *
Study on the propagation speed of the tidal bore in the Qiantang River estuary; Xie Dongfeng; Pan Cunhong; Lu Haiyan; Wang Lihui; Tang Ziwen; Journal of Zhejiang University (Engineering Science), No. 6; full text *

Also Published As

Publication number Publication date
CN114812514A (en) 2022-07-29

Similar Documents

Publication Publication Date Title
CN113137920B (en) Underwater measurement equipment and underwater measurement method
CN111311679B (en) Free floating target pose estimation method based on depth camera
CN106485751B (en) Unmanned aerial vehicle photographic imaging and data processing method and system applied to foundation pile detection
CN106556414B (en) A kind of automatic digital orientation method of laser scanner
CN113269671B (en) Bridge apparent panorama generating method based on local and global features
CN106996748A (en) Wheel diameter measuring method based on binocular vision
CN103993548A (en) Multi-camera stereoscopic shooting based pavement damage crack detection system and method
CN116758136B (en) Real-time online identification method, system, equipment and medium for cargo volume
CN112232319A (en) Scanning splicing method based on monocular vision positioning
CN103411587A (en) Positioning and attitude-determining method and system
CN104930976A (en) Portable crack length-measuring apparatus and method
CN114972447A (en) Water body surface flow trace measuring method based on unmanned aerial vehicle photographing
CN102999895B (en) Method for linearly solving intrinsic parameters of camera by aid of two concentric circles
CN114812514B (en) Tidal bore tide head line form and tide head propulsion speed on-site measurement method
Nakatani et al. 3D visual modeling of hydrothermal chimneys using a rotary laser scanning system
CN115984766A (en) Rapid monocular vision three-dimensional target detection method for underground coal mine
CN103196431B (en) Integral aerial triangulation method for airborne laser scanning point cloud and optical image
Calantropio et al. Photogrammetric underwater and UAS surveys of archaeological sites: The case study of the roman shipwreck of Torre Santa Sabina
Miled et al. Hybrid online mobile laser scanner calibration through image alignment by mutual information
CN116824079A (en) Three-dimensional entity model construction method and device based on full-information photogrammetry
CN115690380B (en) Registration method and system
CN116258832A (en) Shovel loading volume acquisition method and system based on three-dimensional reconstruction of material stacks before and after shovel loading
Zhang et al. Automatic processing of Chinese GF-1 wide field of view images
CN112284293B (en) Method for measuring space non-cooperative target fine three-dimensional morphology
CN114913439A (en) Lake surface blue algae concentration detection compensation method based on unmanned aerial vehicle remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant