CN113808168A - Underwater pipeline positioning and tracking method based on image processing and Kalman filtering - Google Patents

Underwater pipeline positioning and tracking method based on image processing and Kalman filtering

Info

Publication number
CN113808168A
Authority
CN
China
Prior art keywords
target
image
tracking point
max
underwater pipeline
Prior art date
Legal status
Pending
Application number
CN202111098451.XA
Other languages
Chinese (zh)
Inventor
郭雨薇
刘俊
Current Assignee
Shanghai Dianji University
Original Assignee
Shanghai Dianji University
Priority date
Filing date
Publication date
Application filed by Shanghai Dianji University
Priority to CN202111098451.XA
Publication of CN113808168A
Legal status: Pending

Classifications

    • G06T 7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T 5/10 Image enhancement or restoration using non-spatial domain filtering
    • G06T 5/40 Image enhancement or restoration using histogram techniques
    • G06T 5/77 Retouching; Inpainting; Scratch removal
    • G06T 7/277 Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/90 Determination of colour characteristics
    • G06T 2207/20056 Discrete and fast Fourier transform [DFT, FFT]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an underwater pipeline positioning and tracking method based on image processing and Kalman filtering, which addresses the harsh working environment, high cost and low efficiency of existing underwater pipeline inspection. The technical scheme is as follows: first, the difference between the foreground and the background is increased by a hue correction method; then an HSV-based color space model is adopted and the target color components are used as a processing-area lookup table to construct a foreground-area search box; the search box contains the connected domain formed by the target pixels, and a target tracking point establishment mechanism is set up; finally, a Kalman filter is constructed to process the resulting target tracking point state matrix.

Description

Underwater pipeline positioning and tracking method based on image processing and Kalman filtering
Technical Field
The invention relates to an underwater pipeline detection technology, in particular to an underwater pipeline positioning and tracking method based on image processing and Kalman filtering.
Background
Underwater pipelines are often built to meet the needs of energy and communication transmission, and take various forms such as water supply pipes, oil and gas transportation pipelines and optical cables. Underwater pipelines, particularly subsea pipelines, are typically designed and built to high standards for their design lifetime. Nevertheless, damage and breakage inevitably occur during this period owing to design, manufacturing process, construction and the service environment. In particular, under strong hydrodynamic forces, unstable seabed conditions and external human factors, accidents such as pipeline damage, oil and gas leakage, optical cable breakage and communication interruption have very serious consequences once they occur. According to the current standards and international conventions for submarine pipeline systems at home and abroad, submarine pipelines are inspected regularly (annual inspection) and specially to ensure safe operation and prolong service life.
However, current underwater pipeline inspection suffers from a harsh working environment, high cost and great difficulty, so a simple and effective pipeline positioning and tracking technique is urgently needed.
Disclosure of Invention
The invention aims to provide an underwater pipeline positioning and tracking method based on image processing and Kalman filtering that is simple and effective, has a small computational load, and is more real-time and accurate.
The technical purpose of the invention is realized by the following technical scheme:
the underwater pipeline positioning and tracking method based on image processing and Kalman filtering comprises the following steps:
S1, enhancing the distinction between the foreground and the background by a hue correction method, and restoring the underwater image;
S2, adopting an HSV-based color space model, using the target color components as a processing-area lookup table to construct a foreground-area search box, setting up a target tracking point establishment mechanism, and determining the target tracking point;
and S3, constructing a Kalman filter, processing the obtained target tracking point state matrix, predicting the position of the next frame of the moving target, and positioning the underwater pipeline position.
In summary, the invention has the following beneficial effects:
by utilizing the underwater image tone correction technology, the original tone of the underwater image is recovered to a certain extent, HSV space color components are limited, and a target search box is simply and effectively obtained. Meanwhile, the search range of the target tracking point is narrowed, the Kalman filter predicts the position of the moving target possibly appearing in the next frame image, the tracking accuracy is enhanced, the calculated amount is less, and the real-time performance and the accuracy of a target tracking algorithm are considered.
Drawings
FIG. 1 is a schematic overall flow diagram of the present invention;
FIG. 2 is a schematic diagram of the result after underwater image restoration.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings.
According to one or more embodiments, an underwater pipeline positioning and tracking method based on image processing and Kalman filtering is disclosed, as shown in FIG. 1, comprising the following steps:
S1, enhancing the distinction between the foreground and the background by a hue correction method, and restoring the underwater image;
S2, adopting an HSV-based color space model, using the target color components as a processing-area lookup table to construct a foreground-area search box, setting up a target tracking point establishment mechanism, and determining the target tracking point;
and S3, constructing a Kalman filter, processing the obtained target tracking point state matrix, predicting the position of the next frame of the moving target, and positioning the underwater pipeline position.
Step S1, the underwater image restoration specifically includes:
the most frequently occurring color (dominant hue) in a uniformly illuminated underwater image can be an estimate of the color shift of the image, which is subtracted from the original image to obtain an achromatic hue image that conforms to the gray world assumption. To find the dominant hue, the average fourier frequency of the input underwater image of the three channels is calculated and the inverse fourier transform is applied to its maximum. The method comprises the following specific steps:
S11, calculate the average Fourier frequency of the input original image, and obtain the deviated hue over the image range from the Fourier transform:
S111, input the original image I with three-channel components I_k, k ∈ {R, G, B}; the output is the hue image T_k;
S112, calculate the Fourier frequency of each input channel component: FI_k ← F(I_k), where F denotes the Fourier transform;
S113, calculate the average frequency: avgFI ← (FI_R + FI_G + FI_B)/3;
S114, determine the maximum-frequency filter: Filt ← [avgFI = max(avgFI)];
S115, apply the maximum-frequency filter to the input image: FT_k ← FI_k · Filt;
S116, calculate the hue intensity: T_k ← F⁻¹(FT_k), where F⁻¹ denotes the inverse Fourier transform.
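By way of illustration, steps S111 to S116 can be sketched in NumPy as follows; the function and variable names are illustrative rather than taken from the patent, and the maximum of avgFI is interpreted here as the frequency of maximum magnitude:

```python
import numpy as np

def dominant_hue(image):
    """Steps S111-S116: estimate the dominant (deviated) hue of an underwater image.

    image: array of shape (H, W, 3) holding the R, G, B channels in [0, 255].
    Returns the hue image T of the same shape.
    """
    # S112: Fourier frequency of each input channel, FI_k = F(I_k)
    FI = np.stack([np.fft.fft2(image[:, :, k].astype(float)) for k in range(3)], axis=-1)

    # S113: average frequency avgFI = (FI_R + FI_G + FI_B) / 3
    avgFI = FI.mean(axis=-1)

    # S114: maximum-frequency filter Filt = [avgFI == max(avgFI)]
    mag = np.abs(avgFI)
    filt = (mag == mag.max()).astype(float)

    # S115: apply the filter to every channel, FT_k = FI_k * Filt
    FT = FI * filt[:, :, None]

    # S116: hue intensity T_k = F^-1(FT_k), taking the real part of the inverse transform
    return np.stack([np.real(np.fft.ifft2(FT[:, :, k])) for k in range(3)], axis=-1)
```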
After calculating the deviated color tone in the whole image range, the following steps are carried out:
S12, subtract the deviated hue from the original image to obtain a color-balanced image;
S13, process the output color-balanced image by linear histogram stretching to restore the correct hue intensity range. The basic formula is:

I_out = (I_in - a) · (d - c) / (b - a) + c

where I_in and I_out represent the input and output pixel intensities respectively, a and b are the minimum and maximum intensities of the input image, and c and d are the minimum and maximum intensities of the target output image. c and d are set to 0 and 255, while a and b are chosen within the 0 to 5% and 95 to 100% ranges of the gray histogram of the input image, according to the brightness level of the original image. The processing result is shown in FIG. 2.
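A minimal sketch of steps S12 and S13 follows, reusing the dominant_hue helper and the numpy import from the sketch above; the 1% and 99% percentiles are assumptions chosen within the 0 to 5% and 95 to 100% ranges stated in the text:

```python
def restore_underwater_image(image):
    """Steps S12-S13: subtract the deviated hue, then apply linear histogram stretching."""
    # S12: subtract the deviated tone to obtain a colour-balanced image
    balanced = image.astype(float) - dominant_hue(image)

    # S13: linear histogram stretching I_out = (I_in - a) * (d - c) / (b - a) + c
    c, d = 0.0, 255.0                       # target output intensity range
    out = np.empty_like(balanced)
    for k in range(3):
        chan = balanced[:, :, k]
        a = np.percentile(chan, 1)          # a chosen within the 0-5% tail of the histogram
        b = np.percentile(chan, 99)         # b chosen within the 95-100% tail
        out[:, :, k] = np.clip((chan - a) * (d - c) / (b - a) + c, 0, 255)
    return out.astype(np.uint8)
```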
Because the underwater visual range is short and color cast is present, the original hue of the underwater image is restored to a certain extent by this underwater hue correction technique, which facilitates the extraction and use of the foreground region in the subsequent steps.
In step S2, the target tracking point determination specifically includes:
the values of the color approximation in the RGB components differ significantly under the interference of a harsh environment. The HSV color space model can express the brightness, the tone and the vividness of colors very intuitively, has three measurement standards and is more reliable in the contrast between colors. Therefore, we choose to segment the foreground under HSV color space. The range of the tone-corrected image in HSV and RGB space is shown in table 1.
Component    H        S        V
Pipeline     97-150   140-255  39-210
Table 1
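As an illustration, the foreground mask can be obtained from the Table 1 thresholds with OpenCV; this is a sketch assuming the restored frame is an 8-bit BGR image, with illustrative function and variable names:

```python
import cv2
import numpy as np

def pipeline_mask(restored_bgr):
    """Binary foreground mask of the pipeline using the HSV ranges of Table 1."""
    hsv = cv2.cvtColor(restored_bgr, cv2.COLOR_BGR2HSV)
    lower = np.array([97, 140, 39], dtype=np.uint8)     # lower H, S, V bounds
    upper = np.array([150, 255, 210], dtype=np.uint8)   # upper H, S, V bounds
    return cv2.inRange(hsv, lower, upper)                # 255 where the pixel is in range
```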
The target color components are used as a processing-area lookup table, and the pixels on the boundary of the target area are used to construct a pre-search box: Hough line detection is used to find the sets of left and right boundary lines l_l = {(u_l1, v_l1), (u_l2, v_l2), …, (u_ln, v_ln)} and l_r = {(u_r1, v_r1), (u_r2, v_r2), …, (u_rn, v_rn)}.
Define a matrix target1 of size [m, 1] to mark the target pixels in the y direction; the maximum and minimum marked positions are u_max' and u_min' respectively. Similarly, define a matrix target2 of size [1, n] to mark the target pixels in the x direction; the maximum and minimum marked positions are v_max' and v_min' respectively.
Constrain [u_min', u_max'] ∈ l_l⁺ ∩ l_r⁻ and [v_min', v_max'] ∈ l_l⁺ ∩ l_r⁻ to obtain the new interval ranges [u_min, u_max] and [v_min, v_max]. Map the intervals [u_min, u_max], [v_min, v_max] back onto the original image to obtain the search window.
Determining the search window fixes its center p and its size m × n; p is taken at the midpoint of the intervals [u_min, u_max] and [v_min, v_max], and m × n as their extents.
Then find the connected-domain centroid s with the minimum distance to p; denote this minimum distance by δ and set a threshold ε. If δ ≤ ε, the current s(u, v) is taken as the target tracking point. Since acquiring an image frame, preprocessing the image and computing the search center all take time, when the system has not received processed data or δ > ε, p(u, v) is set as the target tracking point, the search window is expanded outward by 2 unit pixels, and the next search is awaited.
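This tracking-point mechanism can be sketched as follows, assuming the binary mask above, treating u as the column coordinate and v as the row coordinate, and taking the window center as the interval midpoints; the helper name and the default ε are illustrative:

```python
def select_tracking_point(mask, u_min, u_max, v_min, v_max, epsilon=10.0):
    """Pick the target tracking point inside the search window [u_min,u_max] x [v_min,v_max].

    Returns (point, bounds): the connected-domain centroid closest to the window
    center p if its distance delta <= epsilon, otherwise p itself with the window
    expanded outward by 2 unit pixels for the next search.
    """
    # search-window center p from the interval bounds
    p = np.array([(u_min + u_max) / 2.0, (v_min + v_max) / 2.0])

    window = mask[v_min:v_max + 1, u_min:u_max + 1]
    n_labels, _, _, centroids = cv2.connectedComponentsWithStats(window)

    best, delta = None, np.inf
    for i in range(1, n_labels):                      # label 0 is the background
        s = centroids[i] + np.array([u_min, v_min])   # back to full-image coordinates
        dist = np.linalg.norm(s - p)
        if dist < delta:
            best, delta = s, dist

    if best is not None and delta <= epsilon:
        return best, (u_min, u_max, v_min, v_max)     # centroid becomes the tracking point
    # no usable centroid or delta > epsilon: fall back to p and grow the window
    return p, (u_min - 2, u_max + 2, v_min - 2, v_max + 2)
```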
By constraining the HSV color components, the target search box is obtained simply and effectively, and the search range of the target tracking point is narrowed at the same time. The computational load of determining the target tracking point is small, and through the update transformation of the search window the color information of the target pixels is not the sole basis for segmentation.
Step S3, the Kalman filtering process specifically includes:
Construct and initialize the Kalman filter parameters and the target state vector, where the target state vector is x = [u, v, v_u, v_v]^T. Here u and v denote the coordinate components of the target tracking point along the x and y axes of the rectangular image coordinate system Oxy, and v_u, v_v denote the average velocities of the target tracking point along the x and y axes over the time interval Δt.
The mathematical representation of the dynamic system state space and the associated computation formulas in Kalman filtering are as follows:
System state equation:
x_k = A x_{k-1} + v_k
Prediction error covariance matrix:
P'_k = A P_{k-1} A^T + Q
Kalman gain:
K_k = P'_k H^T (H P'_k H^T + R)^{-1}
Updated observation variable:
x'_k = K_k (z_k - H x_{k-1})
Updated measurement error:
P_k = (1 - K_k H) P'_k
where A is the state transition matrix of the system, and x_{k-1} and x_k are the target state vectors at times k-1 and k.
v_k, the control vector at time k, represents the influence of the control quantity v on the current state.
P_k and P_{k-1} represent the a posteriori estimate covariance at times k and k-1, respectively.
P'_k is the a priori estimate covariance at time k, i.e. the covariance of the predicted state.
Q represents the noise between the state transition matrix and the actual process transition, also called the system process-noise covariance matrix; Q is set as small as possible.
H is the system observation matrix.
R is the covariance of the measurement noise.
z_k is the observation vector, composed of the actual observed value and the observation noise: z_k = H x_k + w_k.
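A compact NumPy sketch of this filter for the state x = [u, v, v_u, v_v]^T is given below. The explicit numerical forms of A, H, Q and R are not reproduced in the text above, so the constant-velocity transition, the position-only observation matrix and the noise magnitudes are assumptions, and the correction step uses the standard Kalman update form:

```python
import numpy as np

class PipelineKalmanTracker:
    """Constant-velocity Kalman filter for the tracking-point state x = [u, v, vu, vv]^T."""

    def __init__(self, u0, v0, dt=1.0):
        self.x = np.array([u0, v0, 0.0, 0.0], dtype=float)  # initial state vector
        self.P = np.eye(4)                                   # posterior estimate covariance
        self.A = np.array([[1, 0, dt, 0],                    # assumed constant-velocity
                           [0, 1, 0, dt],                    # state transition matrix A
                           [0, 0, 1,  0],
                           [0, 0, 0,  1]], dtype=float)
        self.H = np.array([[1, 0, 0, 0],                     # assumed observation matrix H:
                           [0, 1, 0, 0]], dtype=float)       # only the position (u, v) is measured
        self.Q = 1e-4 * np.eye(4)                            # small process-noise covariance Q
        self.R = 1e-1 * np.eye(2)                            # measurement-noise covariance R

    def predict(self):
        """Predict where the tracking point may appear in the next frame."""
        self.x = self.A @ self.x                             # x_k = A x_{k-1}
        self.P = self.A @ self.P @ self.A.T + self.Q         # P'_k = A P_{k-1} A^T + Q
        return self.x[:2]                                    # predicted (u, v)

    def update(self, z):
        """Correct the prediction with the measured tracking point z = (u, v)."""
        z = np.asarray(z, dtype=float)
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)             # Kalman gain K_k
        self.x = self.x + K @ (z - self.H @ self.x)          # state correction
        self.P = (np.eye(4) - K @ self.H) @ self.P           # P_k = (I - K_k H) P'_k
        return self.x[:2]
```

In each frame, predict() gives the expected tracking-point position used to place the next search window, and update() corrects the state with the tracking point obtained in step S2.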
the Kalman filter predicts the position of the moving target possibly appearing in the next frame image, and the tracking accuracy is enhanced.
In summary, the discrimination between the foreground and the background is increased by a hue correction method; then an HSV-based color space model is adopted and the target color components are used as a processing-area lookup table to construct a foreground-area search box. The search box contains the connected domain formed by the target pixels, and a target tracking point establishment mechanism is set up: the connected-domain centroid s closest to the center of the search box is found; if the minimum distance is not greater than the set threshold, the current s is taken as the target tracking point, otherwise the center of the search box is used by default and the search area is enlarged. Finally, a Kalman filter is constructed to process the resulting target tracking point state matrix, so that the position of the underwater pipeline is located more accurately. The overall computational load is small, and both the real-time performance and the accuracy of the target tracking algorithm are taken into account.
This embodiment is intended only to explain the present invention and does not limit it. Those skilled in the art may, after reading this specification, modify the embodiment as needed without inventive contribution, and such modifications remain protected by patent law within the scope of the claims of the present invention.

Claims (6)

1. An underwater pipeline positioning and tracking method based on image processing and Kalman filtering is characterized by comprising the following steps:
S1, enhancing the distinction between the foreground and the background by a hue correction method, and restoring the underwater image;
S2, adopting an HSV-based color space model, using the target color components as a processing-area lookup table to construct a foreground-area search box, setting up a target tracking point establishment mechanism, and determining the target tracking point;
and S3, constructing a Kalman filter, processing the obtained target tracking point state matrix, predicting the position of the next frame of the moving target, and positioning the underwater pipeline position.
2. The underwater pipeline positioning and tracking method based on image processing and Kalman filtering according to claim 1, wherein the step S1 specifically includes:
S11, calculating the average Fourier frequency of the input original image, and obtaining the deviated hue over the image range from the Fourier transform;
S12, subtracting the deviated hue from the original image to obtain a color-balanced image;
S13, processing the output color-balanced image by linear histogram stretching to restore the correct hue intensity range.
3. The underwater pipeline positioning and tracking method based on image processing and Kalman filtering according to claim 2, wherein obtaining the deviated hue by Fourier transform calculation is specifically:
S111, inputting the original image I with three-channel components I_k, k ∈ {R, G, B}, the output being the hue image T_k;
S112, calculating the Fourier frequency of each input channel component: FI_k ← F(I_k), where F denotes the Fourier transform;
S113, calculating the average frequency: avgFI ← (FI_R + FI_G + FI_B)/3;
S114, determining the maximum-frequency filter: Filt ← [avgFI = max(avgFI)];
S115, applying the maximum-frequency filter to the input image: FT_k ← FI_k · Filt;
S116, calculating the hue intensity: T_k ← F⁻¹(FT_k), where F⁻¹ denotes the inverse Fourier transform.
4. The underwater pipeline positioning and tracking method based on image processing and Kalman filtering according to claim 1, characterized in that the determination of the target tracking point is specifically:
segmenting the foreground in an HSV color space, taking a target color component as a processing area query table according to the range of the image after tone correction corresponding to the underwater pipeline in the HSV and RGB spaces, and constructing pixel points of the boundary of a target area into a pre-search frame;
determining the center, the size and the pixel mass center of a search window, and determining a target tracking point according to a set threshold condition.
5. The underwater pipeline positioning and tracking method based on image processing and Kalman filtering according to claim 4, characterized in that the establishment of the target tracking point establishment mechanism is specifically as follows:
using Hough line detection to find the sets of left and right boundary lines l_l = {(u_l1, v_l1), (u_l2, v_l2), …, (u_ln, v_ln)} and l_r = {(u_r1, v_r1), (u_r2, v_r2), …, (u_rn, v_rn)};
defining a matrix target1 of size [m, 1] to mark the target pixels in the y direction, the maximum and minimum marked positions being u_max' and u_min' respectively;
defining a matrix target2 of size [1, n] to mark the target pixels in the x direction, the maximum and minimum marked positions being v_max' and v_min' respectively;
constraining [u_min', u_max'] ∈ l_l⁺ ∩ l_r⁻ and [v_min', v_max'] ∈ l_l⁺ ∩ l_r⁻ to obtain the new interval ranges [u_min, u_max], [v_min, v_max];
mapping the intervals [u_min, u_max], [v_min, v_max] back onto the original image to obtain the search window;
determining the search window fixes its center p and its size m × n, p being taken at the midpoint of the intervals and m × n as their extents;
finding out a connected domain centroid s with the minimum distance from p, expressing the minimum distance as delta, setting a threshold value epsilon, and taking the current s (u, v) as a target tracking point if delta is less than or equal to epsilon;
when the system does not receive the processing data or delta > epsilon, taking p (u, v) as a target tracking point, the search window expands outwards by 2 unit pixels and waits for the next search.
6. The underwater pipeline positioning and tracking method based on image processing and Kalman filtering according to claim 1, characterized in that constructing a Kalman filter for processing specifically comprises:
constructing and initializing the Kalman filter parameters and the target state vector, the target state vector being x = [u, v, v_u, v_v]^T; u and v respectively represent the coordinate components of the target tracking point along the x and y axes of the rectangular image coordinate system Oxy, and v_u, v_v respectively represent the average velocities of the target tracking point along the x and y axes over the time interval Δt;
inputting the system state vector x_k and the prediction error covariance matrix P'_k:
x_k = A x_{k-1} + v_k
P'_k = A P_{k-1} A^T + Q
calculating the Kalman filter gain coefficient K_k:
K_k = P'_k H^T (H P'_k H^T + R)^{-1}
updating the observation variable and updating the measurement error:
x'_k = K_k (z_k - H x_{k-1}),
P_k = (1 - K_k H) P'_k
correcting the state vector x_k to obtain a corrected value, and simultaneously correcting the prediction error covariance matrix P'_k to obtain the Kalman filtering mean-square-error matrix P_k;
Waiting for the next image data reading.
CN202111098451.XA 2021-09-18 2021-09-18 Underwater pipeline positioning and tracking method based on image processing and Kalman filtering Pending CN113808168A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111098451.XA CN113808168A (en) 2021-09-18 2021-09-18 Underwater pipeline positioning and tracking method based on image processing and Kalman filtering

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111098451.XA CN113808168A (en) 2021-09-18 2021-09-18 Underwater pipeline positioning and tracking method based on image processing and Kalman filtering

Publications (1)

Publication Number Publication Date
CN113808168A 2021-12-17

Family

ID=78939658

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111098451.XA Pending CN113808168A (en) 2021-09-18 2021-09-18 Underwater pipeline positioning and tracking method based on image processing and Kalman filtering

Country Status (1)

Country Link
CN (1) CN113808168A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140017318A (en) * 2012-07-31 2014-02-11 서울과학기술대학교 산학협력단 Fire image recognition and pursuit method using kalman filter
CN104992451A (en) * 2015-06-25 2015-10-21 河海大学 Improved target tracking method
CN106355602A (en) * 2016-08-26 2017-01-25 杨百川 Multi-target locating and tracking video monitoring method
CN107609557A (en) * 2017-08-24 2018-01-19 华中科技大学 A kind of readings of pointer type meters recognition methods
CN107657623A (en) * 2017-08-28 2018-02-02 北京工业大学 A kind of river course line detecting system and method for unmanned plane
CN109870173A (en) * 2019-04-11 2019-06-11 中国石油化工股份有限公司 A kind of track correct method of the submarine pipeline inertial navigation system based on checkpoint
CN109993166A (en) * 2019-04-03 2019-07-09 同济大学 The readings of pointer type meters automatic identifying method searched based on scale
CN113052872A (en) * 2021-03-12 2021-06-29 浙江大学 Underwater moving object tracking method based on sonar image
CN113379789A (en) * 2021-06-11 2021-09-10 天津大学 Moving target tracking method in complex environment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
LIU, YIDAN, ET AL.: "An underwater image enhancement method for different illumination conditions based on color tone correction and fusion-based descattering", 《SENSORS》, pages 1 - 22 *
SHI, LIWEI, ET AL.: "An underwater pipeline tracking system for amphibious spherical robots", 《2017 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION (ICMA)》, pages 1390 - 1395 *
LIANG, Juan, et al.: "Automatic tracking algorithm based on Camshift and Kalman filtering", 《微型机与应用》 (Microcomputer & Its Applications), pages 28 - 31 *

Similar Documents

Publication Publication Date Title
CN106404793B (en) Bearing sealing element defect detection method based on vision
CN108280823B (en) Method and system for detecting weak edge flaws on optical cable surface in industrial production
CN109993710B (en) Underwater image denoising method based on generation countermeasure network
CN109118446B (en) Underwater image restoration and denoising method
CN105865329B (en) The acquisition system and method for the bundled round steel end face center coordinate of view-based access control model
CN109214380B (en) License plate inclination correction method
CN108663026B (en) Vibration measuring method
CN109816645B (en) Automatic detection method for steel coil loosening
CN108846844B (en) Sea surface target detection method based on sea antenna
CN109523528B (en) Power transmission line extraction method based on unmanned aerial vehicle binocular vision SGC algorithm
CN111709968B (en) Low-altitude target detection tracking method based on image processing
WO2019010916A1 (en) Workpiece three-dimensional point cloud data smoothing method
CN106600580A (en) Hough transform-based abnormal recognition method and system of power line
CN116597392A (en) Hydraulic oil impurity identification method based on machine vision
CN115631116B (en) Aircraft power inspection system based on binocular vision
CN111667470A (en) Industrial pipeline flaw detection inner wall detection method based on digital image
CN113971669A (en) Three-dimensional detection system applied to pipeline damage identification
CN112529853A (en) Method and device for detecting damage of netting of underwater aquaculture net cage
US20160035107A1 (en) Moving object detection
CN116823839A (en) Pipeline leakage detection method based on thermal infrared image
CN111191192A (en) Data denoising method and device and storage medium
CN107993193B (en) Tunnel lining image splicing method based on illumination equalization and surf algorithm improvement
CN113808168A (en) Underwater pipeline positioning and tracking method based on image processing and Kalman filtering
CN113052794A (en) Image definition recognition method based on edge features
CN109816710B (en) Parallax calculation method for binocular vision system with high precision and no smear

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination