CN117495918A - River water surface optical flow estimation method based on illumination self-adaptive ORB operator - Google Patents

River water surface optical flow estimation method based on illumination self-adaptive ORB operator

Info

Publication number
CN117495918A
CN117495918A CN202311520874.5A
Authority
CN
China
Prior art keywords
optical flow
adaptive
pixel
orb
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311520874.5A
Other languages
Chinese (zh)
Inventor
张振
赵慧娟
黎君亮
刘海韵
王慧斌
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hohai University HHU
Original Assignee
Hohai University HHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hohai University HHU filed Critical Hohai University HHU
Priority to CN202311520874.5A priority Critical patent/CN117495918A/en
Publication of CN117495918A publication Critical patent/CN117495918A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/269Analysis of motion using gradient-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/25Determination of region of interest [ROI] or a volume of interest [VOI]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/46Descriptors for shape, contour or point-related descriptors, e.g. scale invariant feature transform [SIFT] or bags of words [BoW]; Salient regional features
    • G06V10/462Salient features, e.g. scale invariant feature transforms [SIFT]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/62Extraction of image or video features relating to a temporal dimension, e.g. time-based feature extraction; Pattern tracking
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/40Scenes; Scene-specific elements in video content
    • G06V20/46Extracting features or characteristics from the video content, e.g. video fingerprints, representative shots or key frames
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a river water surface optical flow estimation method based on an illumination-adaptive ORB operator, which extracts feature points with an adaptive-threshold ORB operator. The illumination-adaptive ORB threshold is obtained by multiplying a global adaptive threshold based on KSW entropy and a local adaptive threshold based on image contrast by their respective weight coefficients and summing. The method first inputs the RGB images of two adjacent video frames, grays them, crops the ROI, and constructs an image pyramid. It then computes the illumination-adaptive ORB threshold layer by layer and pixel by pixel on the first-frame pyramid, and uses this threshold to judge and extract ORB feature points with a FAST optimization scheme. Finally, the feature points are quality-ranked with a Harris scoring algorithm, a number of high-quality feature points are selected, the coordinates to which each feature point moves in the second frame are computed with the sparse LK optical flow method, and the optical flow motion vectors of the feature points across the two adjacent frames are obtained. The invention is suitable for river water surface flow field observation under complex and changeable illumination.

Description

River water surface optical flow estimation method based on illumination self-adaptive ORB operator
Technical Field
The invention relates to the technical field of image-based flow field measurement, and in particular to a river water surface optical flow estimation method based on an illumination-adaptive ORB operator.
Background
Prototype observation of river flow fields provides essential basic data for flood and drought disaster monitoring and early warning, safe operation of reservoir dams, defect control of dike and slope engineering, and verification and calibration of river dynamics models, and bears directly on scientific decision-making for flood control and the prevention of hydraulic engineering accident risks. Flow field measurement based on the particle image velocimetry (PIV, Particle Image Velocimetry) principle offers non-contact, instantaneous, full-field flow velocity measurement and has been applied to laboratory flumes, river models, and river prototype observation. By analyzing particle image sequences, the magnitude, direction, characteristics, and distribution of local fluid displacement and velocity are obtained, greatly improving the ability to measure various complex flows in the laboratory. However, estimating fluid motion vectors from particle images is the core difficulty of PIV, and the implementation depends not only on the hardware system but also on the characteristics of the fluid under measurement. Flow field measurement based on large-scale particle image velocimetry (LSPIV, Large Scale Particle Image Velocimetry) dispenses with artificial tracer particles and instead locates natural floating objects on the river to compute a global surface flow field: it estimates tracer motion in the flow field using gray-scale correlation matching and builds a time-averaged flow field model by vector averaging. However, its spatial cross-correlation computation is heavy and its velocity field resolution is low. Flow field measurement based on large-scale particle tracking velocimetry (LSPTV, Large Scale Particle Tracking Velocimetry) works well with sparse particles: it locates low-density, larger tracer particles to obtain their spatial distribution in the image, derives sub-pixel particle centroid positions, and estimates the flow field by matching particle centroids across time frames. But it cannot handle higher particle concentrations in the flow field and is sensitive to manually set parameters. Flow field measurement based on space-time image velocimetry (STIV, Space-Time Image Velocimetry) offers high spatial resolution and strong real-time performance; it synthesizes space-time images and computes a one-dimensional time-averaged flow field from the principal texture direction detected by Fourier transform. But its temporal resolution is limited and it is sensitive to complex illumination.
Optical flow methods were originally proposed in the field of computer vision, mainly for estimating apparent rigid motion from image sequences. Because they can recover dense velocity vector fields from image pairs, they have also been used to estimate river motion scenes. In the field of fluid motion estimation, Horn and Schunck proposed the classical variational optical flow estimation model based on the brightness constancy assumption. However, the brightness constancy constraint is too idealized for practical conditions, so the traditional H-S optical flow method is very sensitive to illumination changes and its optical flow is not robust.
To improve robustness to illumination variation, the computer vision field has refined the classical variational optical flow method; for example, structure-texture decomposition is adopted as preprocessing, decomposing an image into a structure part and a texture part containing fine-scale details, and then using the texture part in place of the original gray image for subsequent optical flow calculation. The feature optical flow method adopts the same structure-texture decomposition idea: it decomposes image information into target feature points and background, and estimates river motion from the positional change of the same feature points between adjacent frames of the image sequence in the time domain. The main stages of the feature optical flow method are feature extraction and tracking; common feature extraction algorithms include SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), FAST (Features from Accelerated Segment Test), and ORB (Oriented FAST and Rotated BRIEF). SIFT and SURF achieve scale and rotation invariance by constructing a scale space and computing a dominant feature orientation, but they are sensitive to illumination changes and slow to compute. The FAST algorithm determines feature points from the gray-level structure of a candidate pixel and its surrounding pixels, and is simple and fast. The ORB algorithm further improves on FAST by adding an image pyramid, giving the algorithm scale invariance and a further speed gain. Both the FAST and ORB algorithms detect feature points with a single fixed threshold set from manual experience; when the illumination on the river surface changes and illumination unevenness, local flare, and the like appear on the water surface, the threshold parameter must be adjusted manually to ensure that enough feature points are extracted for optical flow calculation.
Disclosure of Invention
Purpose of the invention: the invention aims to provide a river water surface optical flow estimation method based on an illumination-adaptive ORB operator, which uses global KSW entropy and local gray contrast to obtain a global adaptive threshold and a local adaptive threshold respectively, thereby achieving illumination adaptivity, and which can stably extract river water surface feature points for water surface optical flow estimation under complex field illumination conditions.
The technical scheme is as follows: the river water surface optical flow estimation method based on the illumination self-adaptive ORB operator comprises the following steps:
(1) Graying the RGB images of two adjacent river water surface video frames, and cropping the ROI images of the river water surface area;
(2) Downsampling the two frames of ROI images according to a scale factor s to construct an image pyramid, so that the images have scale invariance;
(3) For the first-frame image pyramid, calculating an illumination-adaptive ORB threshold T layer by layer and pixel by pixel, and extracting M feature points from the first frame image by the ORB feature extraction method;
(4) Quality-ranking the M feature points with a Harris scoring algorithm, and removing feature points whose Harris corner response values are smaller than 0 to obtain N high-quality feature points;
(5) Estimating the optical flow motion vectors by the sparse LK optical flow method.
Further, in step (3), the illumination adaptive ORB threshold T is
T = α*T1 + β*T2
wherein α and β are weight coefficients, T1 is the global adaptive threshold calculated layer by layer for the first-frame image pyramid, and T2 is the local adaptive threshold calculated for each pixel.
Further, the global adaptive threshold T1 is calculated by the KSW entropy method, specifically as follows:
let the gray level interval of the gray level distribution histogram of the pyramid image be [0, L], normalize the frequencies of occurrence of all gray levels, and divide the gray level interval into L1 = [0, t] and L2 = [t+1, L]; the sum S of the entropies corresponding to L1 and L2 is
S = H(L1) + H(L2)
then the global adaptive threshold T1 of the current image pyramid is
T1 = |t_max - t_min|
wherein t_max is the gray value corresponding to the maximum of the entropy sum S, and t_min is the gray value corresponding to the minimum of the entropy sum.
Further, the entropy values corresponding to L1 and L2 are H(L1) and H(L2):
H(L1) = - Σ_{i ∈ L1} (p_i / P_L1) ln(p_i / P_L1)
H(L2) = - Σ_{i ∈ L2} (p_i / P_L2) ln(p_i / P_L2)
wherein p_i is the frequency of occurrence of the i-th gray level in the gray level interval, P_L1 is the sum of the frequencies of occurrence of all gray levels in L1, and P_L2 is the sum of the frequencies of occurrence of all gray levels in L2.
Further, the local adaptive threshold T2 is calculated as follows:
after removing the edge pixels of odd pixel width from each layer of the pyramid image, draw a circle with the current pixel point P as the center and m pixels as the radius, obtain the gray values of the n pixels on the circle, remove the maximum and minimum values to suppress outliers, and average the remaining pixels to obtain the local adaptive threshold T2 of the current pixel:
T2 = (Σ_{i=1..n} I_i - I_max - I_min) / (n - 2)
wherein I_i is the gray value of pixel i on the circle, I_max and I_min are respectively the maximum and minimum gray values among the n pixels, and m and n are positive integers.
Preferably, m is 3 and n is 16.
Preferably, in step (3), in order to speed up detection, the FAST optimization scheme is adopted in the process of judging the feature points.
Further, the FAST optimization scheme is specifically as follows:
calculate respectively the absolute differences between the gray value of the current pixel P and the gray values of the 4 circle pixels on its horizontal and vertical lines; if at least 3 of the absolute differences are larger than the illumination-adaptive ORB threshold T, continue judging the other pixels on the circle, otherwise directly discard the pixel. The marking value P_l of the current pixel P is
P_l = 1 if |I_0 - I_i| > T holds for at least 3 of i ∈ {1, 5, 9, 13}, otherwise P_l = 0
wherein, if P_l equals 1, the pixel is retained and marked as a feature point; otherwise the pixel is discarded and marked as 0; I_0 is the gray value of the current pixel P, and I_i is the gray value of the pixel at the i-th position on the circle.
Further, in step (5), the optical flow motion vector is estimated by the sparse LK optical flow method as follows:
the high-quality feature points on the first frame image are tracked by the sparse LK optical flow method, the coordinates to which they move on the second frame image are determined, and their optical flow motion vectors across the two frames are calculated.
Further, the optical flow motion vectors of the high-quality feature points are as follows:
v_x = (x_2 - x_1) / Δt
v_y = (y_2 - y_1) / Δt
wherein v_x is the transverse pixel optical flow motion vector of a high-quality feature point, v_y is the longitudinal pixel optical flow motion vector, Δt is the time interval between the two adjacent frames, x_1 and y_1 are the transverse and longitudinal pixel coordinates of the high-quality feature point on the first frame image, and x_2 and y_2 are the transverse and longitudinal pixel coordinates on the second frame image.
Beneficial effects: compared with the prior art, the invention has the notable advantage that the river water surface optical flow estimation method with the illumination-adaptive ORB operator requires no manual parameter tuning. It computes an adaptive threshold for each pixel of the ROI image from the global KSW entropy and the local gray contrast, so that when the illumination on the river surface changes unevenly, a sufficient number of river surface feature points can still be stably detected and extracted for subsequent sparse LK optical flow estimation, keeping the error of the optical flow estimation within a bounded range. This solves the problem that, under complex field illumination on the river surface, illumination changes make the river surface unevenly lit and an ORB feature point extraction algorithm with a single fixed threshold requires manual parameter adjustment to obtain the feature points of the ROI image.
Drawings
FIG. 1 is a flow chart of the present invention;
FIG. 2 is a diagram of current pixel and neighborhood circular ring pixel information;
FIG. 3 is the first of two adjacent video frames;
FIG. 4 is the second of two adjacent video frames;
FIG. 5 is a schematic view of the ROI area of a first frame of video image;
FIG. 6 is a feature point detection result of the illumination adaptive ORB algorithm on the first frame ROI image;
FIG. 7 shows the feature point detection result of the fixed threshold ORB algorithm on the first frame ROI image;
FIG. 8 is a feature point detection result of the FAST algorithm on the first frame of ROI image;
FIG. 9 is an optical flow diagram computed by the illumination-adaptive ORB algorithm;
FIG. 10 is an optical flow diagram computed by the fixed-threshold ORB algorithm;
FIG. 11 is an optical flow diagram computed by the FAST algorithm;
FIG. 12 is a diagram of the coordinate positions of the manually marked feature points on the first frame image;
FIG. 13 is a diagram of the positions in the second frame image of the feature points calculated by the LK optical flow method;
FIG. 14 is a diagram of the coordinate positions of the manually marked feature points after moving to the second frame image.
Detailed Description
The invention is further described below with reference to the accompanying drawings.
As shown in FIG. 1, the present invention is embodied as follows:
Step 1: take two adjacent RGB frames from a surveillance video of a field river scene, as shown in FIGS. 3 and 4. Convert the two RGB frames into gray images and crop the ROI image of the river water surface area. The illumination in the ROI is complex, as shown in FIG. 5: region 1 is the ROI used to calculate the river water surface flow, all of region 1 lies in the shadow cast on the water by shoreline trees, and region 2, contained in region 1, shows water surface glare caused by direct sunlight.
Step2: the two frames of ROI images are downsampled proportionally. Taking each frame of original high-resolution ROI image as a first layer, and downsampling according to a scale factor s=1.2 to construct an 8-layer pyramid image so as to ensure that the image has scale invariance;
Step 3: traverse each pyramid layer and each pixel, calculate the adaptive ORB threshold T for each pixel, and extract feature points with the ORB feature extraction method according to the threshold T, setting the maximum number of feature points to M = 3000. During feature point judgment, the FAST optimization scheme is used to speed up detection.
The ORB adaptive threshold T is obtained by multiplying the global adaptive threshold T1 and the local adaptive threshold T2 by their respective weight coefficients and summing, as shown in formula (1):
T = α*T1 + β*T2 (1)
where α and β are weight coefficients; extensive experiments show that feature point extraction works best with α = 0.2 and β = 0.2.
Step3.1: calculating a global adaptive threshold T for each layer of image of the first frame of ROI image pyramid 1 Global adaptive threshold T 1 The method is obtained by calculation according to KSW entropy of the current pyramid image, and comprises the following specific implementation steps:
firstly, calculating a one-dimensional gray level distribution histogram of the current pyramid image, dividing the one-dimensional gray level distribution histogram into 0-L total L+1 gray levels, and carrying out normalization processing on the occurrence frequency of all the gray levels. Then, a gray value t is preset, at which time t divides the gray range into L 1 =[0,t],L 2 =[t+1,L]Two parts, L 1 The frequency of occurrence of each gray level in (a) is { p } 1 ,p 2 ,p 3 ,…p t },L 2 The frequency of occurrence of each gray level in (a) is { p } t+1 ,p t+2 ,p t+3 ,…p L }. For L 1 Frequency advance of all gray levels in (3)Line summation, noted asThen L is 1 And L 2 Entropy values corresponding to the two gray scale ranges are H (L 1 ) And H (L) 2 ) The specific calculation formulas are shown in the formulas (2) and (3).
Sum the entropy values of the two parts to obtain the entropy sum S, computed as in formula (4):
S = H(L1) + H(L2) (4)
Then traverse every gray value t over the gray level range 0 to L and compute the corresponding entropy sum S from the formulas above; the gray value corresponding to the maximum entropy sum is t_max, and the gray value corresponding to the minimum entropy sum is t_min.
The global adaptive threshold T1 is then the absolute value of the difference between the two, as shown in formula (5):
T1 = |t_max - t_min| (5)
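The following NumPy sketch is one way to implement the KSW-entropy global threshold T1 under the definitions above. The natural logarithm and the skipping of empty histogram classes are assumptions; the patent does not specify either.

```python
# Sketch of Step 3.1: KSW-entropy global adaptive threshold T1.
import numpy as np

def ksw_global_threshold(img, L=255):
    """Sweep t over [0, L], compute S(t) = H(L1) + H(L2) per formulas (2)-(4),
    and return T1 = |t_max - t_min| per formula (5)."""
    hist = np.bincount(img.ravel(), minlength=L + 1).astype(np.float64)
    p = hist / hist.sum()                       # normalized gray-level frequencies
    S = np.full(L + 1, -np.inf)
    for t in range(L + 1):
        P1, P2 = p[:t + 1].sum(), p[t + 1:].sum()
        if P1 <= 0 or P2 <= 0:                  # one class empty: S undefined, skip
            continue
        q1 = p[:t + 1][p[:t + 1] > 0] / P1      # class-conditional distributions
        q2 = p[t + 1:][p[t + 1:] > 0] / P2
        S[t] = -(q1 * np.log(q1)).sum() - (q2 * np.log(q2)).sum()
    t_max = int(np.argmax(S))                   # gray value maximizing S
    t_min = int(np.argmin(np.where(np.isfinite(S), S, np.inf)))  # minimizing S
    return abs(t_max - t_min)                   # T1 per formula (5)
```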
Step3.2: calculating a global adaptive threshold T of each layer for a first frame image pyramid 1 Then, removing 7 pixel wide edge pixel points from each layer of pyramid, traversing all the rest pixel points, and calculating a local self-adaptive threshold T for each pixel point 2 Every time a local adaptive threshold T is acquired 2 And judging whether the current pixel point is a characteristic point or not according to the improved ORB self-adaptive threshold T. Locally adaptive threshold T 2 The specific calculation method of (2) is as follows:
as shown in FIG. 2, the gray values of 16 pixels on the circle are obtained by taking the current pixel as the center of a circle and 3 pixels as the radius, and the gray values are removedMaximum and minimum values are used for eliminating abnormal values, and the local self-adaptive threshold T of the current pixel point is obtained by averaging the rest 14 pixel points 2 . The specific calculation formula is shown as formula (6).
Wherein I is i For the gray value of the ith pixel point on the circumference of the current pixel point, I max And I min Respectively the maximum value and the minimum value of gray scale in 16 pixel points on the circumference, and n is a fixed value 16.
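A sketch of the local threshold T2 of Step 3.2, assuming the standard 16-pixel radius-3 Bresenham circle used by FAST; the patent refers to this circle via FIG. 2 but does not enumerate the offsets.

```python
# Sketch of Step 3.2: local adaptive threshold T2 per formula (6).
import numpy as np

# Offsets of the 16 pixels on the radius-3 FAST circle, starting at
# position 1 (top) and proceeding clockwise; standard FAST-16 layout.
CIRCLE16 = [(0, -3), (1, -3), (2, -2), (3, -1), (3, 0), (3, 1), (2, 2), (1, 3),
            (0, 3), (-1, 3), (-2, 2), (-3, 1), (-3, 0), (-3, -1), (-2, -2), (-1, -3)]

def local_threshold(img, x, y):
    """T2 at pixel (x, y): mean gray value of the 16 circle pixels after
    dropping the maximum and minimum, i.e. dividing by n - 2 = 14."""
    vals = np.array([int(img[y + dy, x + dx]) for dx, dy in CIRCLE16])
    return (vals.sum() - vals.max() - vals.min()) / (len(vals) - 2)
```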
Step3.3: because ORB is improved on the basis of FAST, the specific feature point judging rule still keeps consistent with FAST, so that in order to accelerate the calculation efficiency, a FAST rapid optimization scheme is used when judging whether the current pixel point is a feature point or not. The specific implementation rule is that whether the absolute value of the difference between the gray value of the pixel point at the 1 st, 5 th, 9 th and 13 th positions on the circumference of the current pixel point shown in the figure 1 and the gray value of the current pixel point is larger than the self-adaptive threshold T obtained by previous calculation is tested preferentially, if at least 3 pixel points in the 4 pixel points meet the condition, the rest pixel points on the circumference are further judged, otherwise, the pixel points are discarded, and a specific formula is shown in a formula (7).
Wherein P is the current pixel point of the image, P l If P is equal to 1, reserving the pixel point and marking the pixel point as a characteristic point; otherwise, the pixel point is discarded. I 0 For the gray value of the current pixel point, I i And (3) the gray value of the pixel point at the ith position on the circumference, wherein T is the improved ORB adaptive threshold value calculated by the method.
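A sketch of the Step 3.3 judgment with the FAST optimization scheme. The 4-point pre-test follows formula (7); the full segment test afterwards (a contiguous run of at least 9 circle pixels differing by more than T) is the conventional FAST-9 rule and is an assumption here, since the patent only states that the remaining circle pixels are judged further.

```python
# Sketch of Step 3.3: FAST-accelerated feature point judgment.
# Uses CIRCLE16 from the T2 sketch; T is the adaptive ORB threshold.
def is_feature_point(img, x, y, T):
    """4-point pre-test on positions 1, 5, 9, 13, then a full segment test
    (contiguous run of >= 9 circle pixels with |I0 - Ii| > T; assumed FAST-9)."""
    i0 = int(img[y, x])
    compass = [CIRCLE16[0], CIRCLE16[4], CIRCLE16[8], CIRCLE16[12]]
    hits = sum(abs(i0 - int(img[y + dy, x + dx])) > T for dx, dy in compass)
    if hits < 3:
        return False                      # P_l = 0: discarded by the pre-test
    diffs = [abs(i0 - int(img[y + dy, x + dx])) > T for dx, dy in CIRCLE16]
    diffs += diffs                        # duplicate to handle wrap-around runs
    run = best = 0
    for d in diffs:
        run = run + 1 if d else 0
        best = max(best, run)
    return best >= 9
```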
Step4: after detecting m=3000 feature points, respectively calculating Harris corner response function values of each feature point by using a Harris scoring algorithm, and carrying out quality sequencing on all feature points according to the magnitude of corner responses of the feature points. The response function of the corner point is a large positive number, the response function of the edge is a large negative number, and the response function of the flat area is a small value, so that the characteristic point with larger response function value of the Harris corner point is regarded as a high-quality characteristic point. Characteristic points with Harris corner response values smaller than 0 are marked as error characteristic points and removed, and then excellent characteristic points with N=500 characteristic points with largest Harris corner response function values are reserved, wherein a distribution diagram of the excellent characteristic points is shown in fig. 6. The circle marks the position of the feature point.
The distribution of feature points extracted by comparison algorithm I, the fixed-threshold ORB feature point detection algorithm, is shown in FIG. 7; the circles mark the positions of feature points, and the squares mark feature points successfully extracted by the method of the present invention but missed by this comparison algorithm.
The distribution of feature points extracted by comparison algorithm II, the FAST feature point detection algorithm, is shown in FIG. 8; the circles mark the positions of feature points, and the squares mark feature points successfully extracted by the method of the present invention but missed by this comparison algorithm.
As can be seen from FIGS. 6 to 8, the algorithm of the present invention extracts all of the clearly observable, continuously moving floating objects in the ROI area, while the comparison algorithms miss some of them: the square-marked feature points all correspond to obvious floaters on the water surface that should be extracted for optical flow estimation, yet both comparison algorithms fail to extract them.
To verify the stability of the algorithms, the three methods were used to detect feature points on the 750 frames of a 30 s video containing the frame of FIG. 3, followed by LK optical flow tracking. The number of extracted feature points, the number of erroneous feature points, and the number of successfully tracked feature points were averaged over the 750 frames; the results are shown in Table 1.
Table 1 Performance comparison of the three detection algorithms
As shown in Table 1, the illumination-adaptive ORB feature point detection method of the present invention extracts a sufficient number of feature points for calculation, while the FAST method extracts 7 erroneous feature points. The feature points of all three methods show good tracking success rates.
Step5: after obtaining n=500 high-quality feature points on the first frame image, inputting two frames of pyramid images and the high-quality feature points on the first frame image, tracking the high-quality feature points on the first frame image by using a sparse LK optical flow method, and estimating the position of the high-quality feature points moving to the second image. For the tracked feature points, pixel coordinates P on the first frame image are calculated according to the feature points i (x 1 ,y 1 ) And its motion to pixel coordinate P on the second frame image i (x 2 ,y 2 ) And calculating optical flow motion vectors of the two adjacent images. Wherein i represents the index of the feature point, x 1 Representing the transverse pixel coordinates, y, of the feature points on the first frame image 1 Representing the longitudinal pixel coordinates, x, of the feature point on the first frame image 2 Representing the transverse pixel coordinates, y, of the feature points on the second frame image 2 Representing the vertical pixel coordinates of the feature point on the second frame image. Specific calculation formulas for calculating the optical flow motion vector of the feature point according to the corresponding image coordinates of the same feature point on the two frames of images are shown as formula (8) and formula (9).
In the formulas (8) and (9), v x Transverse pixel optical flow motion vector v as feature point y The invention takes the longitudinal pixel optical flow motion vector of the feature point, delta t is the time interval of two adjacent frames of images
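A sketch of Step 5 using OpenCV's pyramidal sparse LK tracker. winSize is a common default and maxLevel = 7 matches the 8-layer pyramid, but neither value is specified by the patent.

```python
# Sketch of Step 5: sparse LK tracking and optical flow motion vectors.
import cv2
import numpy as np

def lk_flow_vectors(img1, img2, points, dt):
    """Track the high-quality points from frame 1 into frame 2, then form
    (v_x, v_y) = ((x2 - x1)/dt, (y2 - y1)/dt) per formulas (8) and (9)."""
    p1 = np.float32(points).reshape(-1, 1, 2)
    p2, status, _err = cv2.calcOpticalFlowPyrLK(img1, img2, p1, None,
                                                winSize=(21, 21), maxLevel=7)
    ok = status.ravel() == 1                     # keep successfully tracked points
    d = (p2[ok] - p1[ok]).reshape(-1, 2)         # per-point (dx, dy) in pixels
    return d / dt                                # columns: v_x, v_y (pixels/second)

# e.g. for 25 fps video, dt = 1/25 s between adjacent frames.
```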
LK optical flow was computed for the feature points extracted by the adaptive ORB, fixed-threshold ORB, and FAST algorithms. The optical flow result of the method of the present invention is shown in FIG. 9, the optical flow computed from the fixed-threshold ORB algorithm in FIG. 10, and the optical flow computed from the FAST algorithm in FIG. 11. The dots mark the end points of the optical flow vectors, and the lines show the motion trajectories of the feature points across the two adjacent frames. For ease of visualization, the optical flow in the images is magnified to 11 times its original size. As can be seen from FIGS. 9, 10 and 11, all three algorithms compute a certain number of optical flow vectors for the feature points extracted in the feature extraction stage; by the principle that optical flows at the same y pixel coordinate should be essentially consistent in magnitude, the optical flow computed by the method of the present invention agrees best with the actual situation.
Five feature points were selected from the 500 to compute optical flow vectors for performance evaluation: feature points a, b, c and d are 4 high-quality feature points among the 500, and feature point e is a feature point successfully extracted by the method of the present invention but missed by both comparison algorithms. Since no ground truth exists for the motion of feature points in river water surface images, the coordinate positions of the feature points on the two adjacent frames were marked manually, on the principle that the neighborhood of a feature point remains essentially unchanged across adjacent frames, and the manual marks were used as ground truth for evaluation. The manually marked positions of the 5 feature points on the first frame are shown in FIG. 12, and the manually marked ground-truth coordinates of the feature points after moving to the second frame, under the same consistent-motion principle, are shown in FIG. 13. The positions to which the feature points move in the second frame as computed by the LK optical flow method are marked in FIG. 14. In region 3 of FIGS. 12, 13 and 14, the distinct black-marked pixel on the left is the coordinate position of feature point a and the one on the right is that of feature point b; the distinct black-marked pixel in region 4 is feature point c; the distinct black-marked pixel in region 5 is feature point d; and the distinct white-marked pixel in region 6 is feature point e. The coordinates of the 5 feature points on the first-frame ROI image computed by the three algorithms, the manually estimated coordinates on the first frame, the second-frame coordinates computed by the LK optical flow method from the first-frame coordinates, and the manually estimated second-frame coordinates are listed in Table 2.
Table 2 Computed coordinates of the experimental feature points on the two frames in the field river scene
As shown in Table 2, the 5 feature points extracted by the adaptive ORB algorithm yield valid optical flow motion estimates: the per-axis estimation error of the position coordinates of the 5 experimental feature points is within 1 pixel, with a maximum of 0.56 pixels.
According to formulas (8) and (9), the optical flow motion vectors of the 5 experimental feature points extracted by the illumination-adaptive ORB algorithm and the manually estimated optical flow motion vectors were computed, together with the error between the two; the results are shown in Table 3.
Table 3 Optical flow results of the experimental feature points on the two adjacent frames in the field river scene
As shown in Table 3, the v_x optical flow components of the 5 experimental feature points computed by the adaptive ORB algorithm differ little from one another; relative to the manually estimated ground-truth optical flow vectors, the error is within 6 pixels per second and slightly larger than the manually marked values, consistent with the rule that similar floaters at the same distance from the starting point keep essentially the same flow velocity. The experimental data fully demonstrate the feasibility of the river water surface optical flow estimation method based on the illumination-adaptive ORB operator.

Claims (10)

1. A river water surface optical flow estimation method based on an illumination self-adaptive ORB operator is characterized by comprising the following steps:
(1) Graying the RGB images of two adjacent river water surface video frames, and cropping the ROI images of the river water surface area;
(2) Downsampling two frames of ROI images according to a scale factor s to construct an image pyramid;
(3) For the first-frame image pyramid, calculating an illumination-adaptive ORB threshold T layer by layer and pixel by pixel, and extracting M feature points from the first frame image by the ORB feature extraction method;
(4) Quality-ranking the M feature points with a Harris scoring algorithm, and removing feature points whose Harris corner response values are smaller than 0 to obtain N high-quality feature points;
(5) Estimating the optical flow motion vectors by the sparse LK optical flow method.
2. The method for estimating the optical flow of the river water surface based on the illumination-adaptive ORB operator according to claim 1, wherein in the step (3), the illumination-adaptive ORB threshold T is
T = α*T1 + β*T2
wherein α and β are weight coefficients, T1 is the global adaptive threshold calculated layer by layer for the first-frame image pyramid, and T2 is the local adaptive threshold calculated for each pixel.
3. The method for estimating the river water surface optical flow based on the illumination self-adaptive ORB operator according to claim 2, wherein the global adaptive threshold T1 is calculated by the KSW entropy method, specifically as follows:
let the gray level interval of the gray level distribution histogram of the pyramid image be [0, L], normalize the frequencies of occurrence of all gray levels, and divide the gray level interval into L1 = [0, t] and L2 = [t+1, L]; the sum S of the entropies corresponding to L1 and L2 is
S = H(L1) + H(L2)
then the global adaptive threshold T1 of the current image pyramid is
T1 = |t_max - t_min|
wherein t_max is the gray value corresponding to the maximum of the entropy sum S, and t_min is the gray value corresponding to the minimum of the entropy sum.
4. The river water surface optical flow estimation method based on the illumination self-adaptive ORB operator according to claim 3, wherein the entropy values corresponding to L1 and L2 are H(L1) and H(L2):
H(L1) = - Σ_{i ∈ L1} (p_i / P_L1) ln(p_i / P_L1)
H(L2) = - Σ_{i ∈ L2} (p_i / P_L2) ln(p_i / P_L2)
wherein p_i is the frequency of occurrence of the i-th gray level in the gray level interval, P_L1 is the sum of the frequencies of occurrence of all gray levels in L1, and P_L2 is the sum of the frequencies of occurrence of all gray levels in L2.
5. The method for estimating the river water surface optical flow based on the illumination self-adaptive ORB operator according to claim 2, wherein the local adaptive threshold T2 is calculated as follows:
after removing the edge pixels of odd pixel width from each layer of the pyramid image, draw a circle with the current pixel point P as the center and m pixels as the radius, obtain the gray values of the n pixels on the circle, remove the maximum and minimum values to suppress outliers, and average the remaining pixels to obtain the local adaptive threshold T2 of the current pixel:
T2 = (Σ_{i=1..n} I_i - I_max - I_min) / (n - 2)
wherein I_i is the gray value of pixel i on the circle, I_max and I_min are respectively the maximum and minimum gray values among the n pixels, and m and n are positive integers.
6. The method for estimating the river water surface optical flow based on the illumination-adaptive ORB operator according to claim 5, wherein m is 3 and n is 16.
7. The method for estimating the river water surface optical flow based on the illumination-adaptive ORB operator according to claim 6, wherein in the step (3), a FAST optimization scheme is adopted in the process of judging the feature points.
8. The method for estimating the river water surface optical flow based on the illumination self-adaptive ORB operator according to claim 7, wherein the FAST optimization scheme is specifically as follows:
calculate respectively the absolute differences between the gray value of the current pixel P and the gray values of the 4 circle pixels on its horizontal and vertical lines; if at least 3 of the absolute differences are larger than the illumination-adaptive ORB threshold T, continue judging the other pixels on the circle, otherwise directly discard the pixel; the marking value P_l of the current pixel P is
P_l = 1 if |I_0 - I_i| > T holds for at least 3 of i ∈ {1, 5, 9, 13}, otherwise P_l = 0
wherein, if P_l equals 1, the pixel is retained and marked as a feature point; otherwise the pixel is discarded and marked as 0; I_0 is the gray value of the current pixel P, and I_i is the gray value of the pixel at the i-th position on the circle.
9. The method for estimating the river water surface optical flow based on the illumination self-adaptive ORB operator according to claim 1, wherein in step (5), the optical flow motion vector is estimated by the sparse LK optical flow method as follows:
the high-quality feature points on the first frame image are tracked by the sparse LK optical flow method, the coordinates to which they move on the second frame image are determined, and their optical flow motion vectors across the two frames are calculated.
10. The method for estimating the river water surface optical flow based on the illumination self-adaptive ORB operator according to claim 9, wherein the optical flow motion vectors of the high-quality feature points are as follows:
v_x = (x_2 - x_1) / Δt
v_y = (y_2 - y_1) / Δt
wherein v_x is the transverse pixel optical flow motion vector of a high-quality feature point, v_y is the longitudinal pixel optical flow motion vector, Δt is the time interval between the two adjacent frames, x_1 and y_1 are the transverse and longitudinal pixel coordinates of the high-quality feature point on the first frame image, and x_2 and y_2 are the transverse and longitudinal pixel coordinates on the second frame image.
CN202311520874.5A 2023-11-15 2023-11-15 River water surface optical flow estimation method based on illumination self-adaptive ORB operator Pending CN117495918A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311520874.5A CN117495918A (en) 2023-11-15 2023-11-15 River water surface optical flow estimation method based on illumination self-adaptive ORB operator

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311520874.5A CN117495918A (en) 2023-11-15 2023-11-15 River water surface optical flow estimation method based on illumination self-adaptive ORB operator

Publications (1)

Publication Number Publication Date
CN117495918A true CN117495918A (en) 2024-02-02

Family

ID=89670548

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311520874.5A Pending CN117495918A (en) 2023-11-15 2023-11-15 River water surface optical flow estimation method based on illumination self-adaptive ORB operator

Country Status (1)

Country Link
CN (1) CN117495918A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117714691A (en) * 2024-02-05 2024-03-15 佳木斯大学 AR augmented reality piano teaching is with self-adaptation transmission system
CN117714691B (en) * 2024-02-05 2024-04-12 佳木斯大学 AR augmented reality piano teaching is with self-adaptation transmission system

Similar Documents

Publication Publication Date Title
KR101996992B1 (en) Apparatus and Method for Measuring Flow Velocity of River using Optical Flow Image Processing
CN104361582B (en) Method of detecting flood disaster changes through object-level high-resolution SAR (synthetic aperture radar) images
CN105809715B (en) A kind of visual movement object detection method adding up transformation matrices based on interframe
CN110728697A (en) Infrared dim target detection tracking method based on convolutional neural network
CN111369597B (en) Particle filter target tracking method based on multi-feature fusion
CN105913396A (en) Noise estimation-based image edge preservation mixed de-noising method
CN113034452B (en) Weldment contour detection method
CN117495918A (en) River water surface optical flow estimation method based on illumination self-adaptive ORB operator
CN108711149B (en) Mineral rock granularity detection method based on image processing
CN110246151B (en) Underwater robot target tracking method based on deep learning and monocular vision
CN109376740A (en) A kind of water gauge reading detection method based on video
CN106204617B (en) Adapting to image binarization method based on residual image histogram cyclic shift
CN110717900B (en) Pantograph abrasion detection method based on improved Canny edge detection algorithm
CN108470338A (en) A kind of water level monitoring method
CN108038856B (en) Infrared small target detection method based on improved multi-scale fractal enhancement
CN115761563A (en) River surface flow velocity calculation method and system based on optical flow measurement and calculation
CN111753693A (en) Target detection method in static scene
CN116758528A (en) Acrylic emulsion color change identification method based on artificial intelligence
CN109671084B (en) Method for measuring shape of workpiece
CN115641767A (en) Unmanned ship perception experiment platform device
CN113899349B (en) Sea wave parameter detection method, equipment and storage medium
CN116934808A (en) River surface flow velocity measurement method based on water surface floater target tracking
CN115424249B (en) Self-adaptive detection method for small and weak targets in air under complex background
CN104240268B (en) A kind of pedestrian tracting method based on manifold learning and rarefaction representation
CN116466104A (en) Video current measurement method and device based on LK tracking and gray statistics feature method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination