CN112465895A - Bubble volume calculation method in air tightness detection based on computer vision - Google Patents

Bubble volume calculation method in air tightness detection based on computer vision Download PDF

Info

Publication number
CN112465895A
CN112465895A CN202011357032.9A CN202011357032A CN112465895A CN 112465895 A CN112465895 A CN 112465895A CN 202011357032 A CN202011357032 A CN 202011357032A CN 112465895 A CN112465895 A CN 112465895A
Authority
CN
China
Prior art keywords
bubble
bubbles
volume
calculating
stable
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202011357032.9A
Other languages
Chinese (zh)
Inventor
鲁腊福
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Henan Yaolan Intelligent Technology Co ltd
Original Assignee
Henan Yaolan Intelligent Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Henan Yaolan Intelligent Technology Co ltd filed Critical Henan Yaolan Intelligent Technology Co ltd
Priority to CN202011357032.9A priority Critical patent/CN112465895A/en
Publication of CN112465895A publication Critical patent/CN112465895A/en
Withdrawn legal-status Critical Current

Links

Images

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Geometry (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of computer vision, in particular to a method for calculating the volume of bubbles in air tightness detection based on computer vision, which comprises the steps of correcting and obtaining the average velocity of the bubbles by using a bubble velocity model, obtaining a most stable bubble diagram by using a steady-state measurement model and edge detection, and correcting and calculating the volume of normal bubbles by using a bubble volume model; the invention utilizes the computer vision technology to detect the bubble volume in the air tightness test, thereby greatly improving the detection efficiency, simultaneously reducing the interference of impurities such as noise bubbles and the like by obtaining the most stable bubble diagram, and greatly improving the volume calculation precision; in addition, compared with the method that high-cost equipment such as a high-speed camera is used for collecting the bubble image, the bubble image can be collected through a common camera, the method is simple to operate, wide in application range and low in cost.

Description

Bubble volume calculation method in air tightness detection based on computer vision
Technical Field
The invention relates to the technical field of computer vision, in particular to a method for calculating the volume of bubbles in air tightness detection based on computer vision.
Background
At present, domestic and foreign enterprises mostly adopt a soaking bubble detection method to detect the air tightness of a workpiece, namely, the water depth and the soaking time correspond to all levels of IP waterproof grade tests, and the size of a leakage air hole is calculated through analyzing and estimating the volume of the bubble.
However, most of the existing bubble volume detection methods do not consider the interference of the surrounding environment, which results in large error and low reliability, such as: impurities with the shape similar to bubbles cannot be well removed; in addition, most of the traditional air tightness detection methods collect bubbles through a high-speed camera, and the high-speed camera is expensive, complex to operate and has no wide applicability.
Disclosure of Invention
The invention provides a method for calculating the volume of bubbles in air tightness detection based on computer vision, and solves the technical problems that the existing bubble volume detection precision cannot be guaranteed, the reliability is low, the detection equipment is relatively complex, and the cost is high.
In order to solve the technical problems, the invention provides a method for calculating the volume of bubbles in air tightness detection based on computer vision, which comprises the following steps:
s1, inputting bubble images obtained by using different exposure durations into a semantic segmentation neural network, and obtaining an average bubble rate according to the obtained semantic segmentation effect images of different exposure durations and a bubble rate model;
s2, segmenting the bubble images acquired under the initial exposure duration by setting frame numbers, and performing frame difference superposition on all the bubble images in each segment to obtain a bubble track line graph of each segment;
s3, obtaining a bubble track line width and a track line inclination value according to the bubble track line graph, and calculating to obtain a track line width ratio according to the bubble track line width and a standard bubble track line width;
s4, based on the track line width ratio and the track line inclination value, obtaining a steady state measurement coefficient by using a steady state measurement model, selecting the time length of the section corresponding to the maximum steady state measurement coefficient as a stable time value, taking the bubble image contained in the stable time value as a stable bubble image, and simultaneously recording the volume change value in the corresponding section;
s5, selecting the most stable bubble image in all the stable bubble images by using an edge detection algorithm, and obtaining the bubble motion length according to the most stable bubble image and the bubble trajectory line image;
s6, calculating to obtain single bubble floating time according to the bubble movement length and the bubble average rate;
and S7, inputting the stable time value, the bubble floating time and the volume change value into a bubble volume model to obtain the single normal bubble volume.
Wherein, before the step S1, the bubble image needs to be preprocessed;
the semantic segmentation neural network employs an encoder-decoder infrastructure.
Further, the step S1 is specifically:
s11, setting initial exposure time, sequentially increasing preset exposure step length to adjust the exposure time, and inputting bubble images under different exposure time lengths into a semantic segmentation neural network to obtain semantic segmentation effect graphs of different exposure time lengths;
s12, obtaining the coordinates of the top ends of the bubbles at the bottom ends of the bubbles in the two adjacent frames on the same horizontal plane according to the semantic segmentation effect graph, obtaining the smear length of the bubbles in the two adjacent frames and the bubble width of the current frame by using the minimum circumscribed rectangle, and calculating the difference value of the smear length of the bubbles in the two adjacent frames;
s13, calculating the moving speed of the bubbles according to the difference value of the smear lengths of the bubbles;
s14, calculating the distance height of the bubbles according to the coordinates of the top ends of the bubbles of the two adjacent frames;
and S15, inputting the moving speed of the bubbles, the difference value of smear lengths of the bubbles, the distance height of the bubbles and the width of the bubbles into a pre-established bubble rate model to obtain the average rate of the bubbles.
And the two adjacent frames comprise the current frame and the next frame of the semantic segmentation effect graph.
Further, the step S5 further includes:
acquiring the number of noise bubbles by using the edge detection algorithm;
and acquiring the area of the normal bubbles and the area of the noise bubbles through the connected domain, calculating the ratio of the areas of the normal bubbles and the noise bubbles, and inputting the ratio of the areas of the normal bubbles and the noise bubbles into the bubble volume model.
Further, the steady state metric model is:
Figure BDA0002802892250000031
in the formula, CaThe steady state metric coefficient is represented, d represents the ratio of the widths of the track lines, epsilon represents the inclination value of the track lines, and alpha and beta represent the weight coefficients.
Further, the condition for satisfying the most stable bubble map at least includes:
a. none or minimal noise bubbles in all of the stable bubble maps;
b. the bottom end of the bubble which is about to emerge out of the water surface is substantially on the same horizontal line with the water surface.
Further, the bubble volume model is:
Figure BDA0002802892250000032
in the formula, VqIndicating the normal bubble volume, av indicates the volume change value, T indicates the settling time value, T indicates the individual bubble emergence time, S indicates the bubble area ratio, and a indicates the noise bubble number.
According to the bubble volume calculating method in the air tightness detection based on the computer vision, provided by the invention, the bubble volume is calculated through a semantic segmentation neural network, a steady-state measurement model and the like, so that the problems that the existing bubble volume detection precision cannot be guaranteed, the reliability is low, the detection equipment is relatively complex and the cost is high are solved; according to the method, the average speed of the bubbles is corrected, so that the error of the data is reduced, and the accuracy of the data is improved; in addition, the most stable bubble image is obtained through the segmentation of the bubble image and the edge detection, and the bubble volume is calculated, so that the interference of noise bubbles is reduced, the detection accuracy and reliability are greatly improved, a high-speed camera is not required to be used for collecting the image, the equipment cost is saved, and the application range is wider.
Drawings
FIG. 1 is a schematic flow chart of a method for calculating a volume of a bubble in a gas tightness test based on computer vision according to an embodiment of the present invention;
FIG. 2 is a simplified schematic diagram of the bottom ends of two adjacent frames of bubbles in step S12 being located at substantially the same horizontal plane;
fig. 3 is a simplified schematic diagram of the most stable bubble map satisfying another condition provided by an embodiment of the present invention.
Detailed Description
The embodiments of the present invention will be described in detail below with reference to the accompanying drawings, which are given solely for the purpose of illustration and are not to be construed as limitations of the invention, including the drawings which are incorporated herein by reference and for illustration only and are not to be construed as limitations of the invention, since many variations thereof are possible without departing from the spirit and scope of the invention.
Aiming at the problems that the existing bubble volume detection precision cannot be guaranteed, the reliability is low, the detection equipment is relatively complex and the cost is high, the embodiment of the invention provides a bubble volume calculation method in air tightness detection based on computer vision, as shown in figure 1, the bubble volume calculation method comprises the following steps:
s1, inputting bubble images obtained by using different exposure durations into a semantic segmentation neural network, and obtaining an average bubble rate according to the obtained semantic segmentation effect graphs of the different exposure durations and a bubble rate model, wherein the method specifically comprises the following steps:
s11, setting initial exposure time, sequentially increasing preset exposure step length to adjust the exposure time, and inputting bubble images under different exposure time lengths into a semantic segmentation neural network to obtain semantic segmentation effect graphs of different exposure time lengths;
in order to ensure accurate detection results, the embodiment of the invention needs to acquire multi-frame bubble images by adjusting the exposure time in a state of good and stable water definition; in addition, because even in the same detection video, there may be a plurality of different illumination intensities, in this embodiment, first, the collected multi-frame bubble image is converted into a bubble gray scale image, and the overall average gray scale of a single-frame bubble image is calculated through a gray scale histogram, and then, according to a set gray scale interval, the bubble image with an overall brightness that is too high or too low is filtered out, so as to ensure that the overall average gray scale of the bubble image for detection is within a gray scale interval range, for example: setting a gray level interval to be [50, 220], and filtering out bubble images with the integral average gray level lower than 50 or higher than 220 in the embodiment; the gray scale interval can be selected by one skilled in the art according to specific implementation situations.
In this embodiment, the semantic segmentation neural network adopts an encoder-decoder infrastructure, and the specific training content is as follows:
1) marking the data set by using the acquired bubble image as the data set to obtain a label image, wherein the bubble mark is 1, and the other marks are 0; wherein, 80% of the data set is randomly selected as a training set, and the rest 20% is selected as a verification set.
2) Inputting the bubble image and the label image into a semantic segmentation neural network, wherein an encoder is used for extracting image characteristics and converting the number of channels into the number of categories; then, a semantic segmentation effect map having a size equal to that of the input bubble image is output by the decoder.
3) And training the semantic segmentation neural network by adopting a cross entropy loss function.
It should be noted that the region of interest in the embodiment of the present invention is mainly a bubble region, and therefore, to reduce workload, an image in which the overall average gray level of the bubble image is within the gray threshold range, but the average gray level of the bubble region is too high or too low needs to be further filtered, specifically including: dividing the row of the bubbles in the obtained semantic segmentation effect graph into specific areas, and calculating the average gray level of the bubbles in the specific areas; then, filtering out the semantic segmentation effect graph of the average bubble gray level outside the gray level interval, thereby eliminating the semantic segmentation effect graph with excessively bright or excessively dark bubble areas and increasing the accuracy of the detection result.
S12. As shown in figure 2, for the convenience of observation, figure 2 shows the bubbles of two adjacent frames inOne graph shows that two bubbles with the bottom ends of the bubbles basically in the same horizontal plane in two adjacent frames are selected according to the semantic segmentation effect graph, and the bubble top coordinate y of the two bubbles is obtained1、y2Simultaneously, the minimum external rectangle is utilized to obtain the bubble smear length l of the two bubbles1、l2And the bubble width w of the current frame; length l of smear of air bubble1、l2Making a difference to obtain a difference delta l of the bubble smear lengths of the two bubbles;
and the two adjacent frames comprise the current frame and the next frame of the semantic segmentation effect graph.
It should be noted that, in the specific implementation process, two bubbles whose bottom ends are substantially at the same horizontal plane may not exist in the bubbles of two adjacent frames, and at this time, the semantic segmentation effect map may be sequentially searched until the semantic segmentation effect map of another frame whose bottom end is substantially at the same horizontal plane as the bubble of the current frame is found.
S13, calculating the moving speed of the bubbles according to the difference value delta l of the smear lengths of the bubbles and the exposure time interval, wherein the specific formula is as follows:
Figure BDA0002802892250000061
in the formula (I), the compound is shown in the specification,
Figure BDA0002802892250000062
representing the moving speed of the bubble,. DELTA.l representing the difference in smear length of the bubble, t0Indicating the exposure time interval for two frames.
S14, calculating the distance height of the bubble according to the coordinates of the top end of the bubble of the two adjacent frames, specifically:
Figure BDA0002802892250000063
wherein h represents a bubble distance height, y1Representing the bubble top coordinate, y, of the current frame2Indicating the bubble top coordinates of the next frame.
S15, inputting the moving speed of the bubbles, the difference value of smear lengths of the bubbles, the distance height of the bubbles and the width of the bubbles into a pre-established bubble rate model to obtain the average rate of the bubbles, wherein the bubble rate model is as follows:
Figure BDA0002802892250000064
in the formula (I), the compound is shown in the specification,
Figure BDA0002802892250000065
which represents the average rate of the bubbles,
Figure BDA0002802892250000066
representing the moving speed of the air bubble, h representing the distance height of the air bubble, delta l representing the smear length difference of the air bubble, w representing the width of the air bubble, delta and gamma representing weight coefficients, h0、Δl0、w0Respectively representing the bubble distance height, the bubble smear length difference, the bubble width h of the semantic segmentation effect graph under the initial exposure time0、Δl0、w0The obtaining method is the same as described above, and details are not repeated in this embodiment.
The embodiment comprehensively considers the factors influencing the speed, such as the size of the bubbles, the rising height and the like, corrects the influences through the bubble rate model to obtain a more scientific bubble average rate, can be used for analyzing the motion of any section of the bubbles in water, greatly improves the detection precision, and provides a basis for calculating the volume of the bubbles.
S2, segmenting the bubble images acquired under the initial exposure duration by setting frame numbers, and performing frame difference superposition on all the bubble images in each segment to obtain a bubble track line graph of each segment;
it should be noted that, as the longer the bubble smear is, the more the bubble trajectory obtained by the frame difference addition method is interfered by the noise bubble, the embodiment selects the continuous multi-frame bubble image in the initial exposure time, and meanwhile, in order to ensure that the obtained bubble trajectory is more accurate and reduce the influence of the noise bubble on the bubble trajectory, the embodiment of the present invention segments the continuous multi-frame bubble image by the set number of frames.
S3, obtaining a bubble track line width and a track line inclination value according to the bubble track line graph, and calculating to obtain a track line width ratio according to the bubble track line width and a standard bubble track line width;
according to the embodiment of the invention, the width of the bubble track line in the bubble track line chart is obtained through the minimum external rectangle, and meanwhile, according to the coordinates of the upper end point and the lower end point of the bubble track line in the bubble track line chart, the sine value of the track line inclination angle, namely the track line inclination value, is obtained through calculation;
in addition, the present embodiment is based on the formula:
Figure BDA0002802892250000071
calculating to obtain a track line width ratio;
wherein d represents the width ratio of the track line, and the value range is [ 0-1 ]];wgRepresents the bubble trajectory line width; w is abIndicating the standard bubble trajectory line width.
The standard bubble trajectory line is a trajectory line corresponding to the bubble rising in the water in a vertical straight line.
S4, based on the track line width ratio and the track line inclination value, obtaining a steady state measurement coefficient by using a steady state measurement model, obtaining the maximum steady state measurement coefficient through comparison, taking the time length of the section corresponding to the maximum steady state measurement coefficient as a stable time value, taking the bubble image contained in the corresponding section as a stable bubble image, and simultaneously recording the volume change value in the corresponding section;
wherein the steady state metric model is:
Figure BDA0002802892250000072
in the formula, CaTo representThe steady state measurement coefficient is a normalized steady state measurement coefficient; d represents the ratio of the widths of the track lines; epsilon represents a trajectory tilt value, α and β represent weighting coefficients, and α and β are preferably 0.5 and 0.5 in this embodiment.
In this embodiment, CaThe closer to 1, the more stable the bubble motion state in the corresponding segment of the bubble trajectory line graph is.
S5, selecting the most stable bubble image in all the stable bubble images by using an edge detection algorithm, and obtaining the bubble movement length according to the most stable bubble image and the bubble trajectory line;
it should be noted that the steady state metric coefficient can only approach 1 infinitely for two reasons: the first is the presence of bubbles that deviate from the ideal trajectory, the second is the presence of noise bubbles; however, in the two cases, only the second kind of bubble volume will affect the detected bubble volume, so in order to obtain a more accurate bubble volume calculation result, this embodiment analyzes all the stable bubble maps of the obtained corresponding segments frame by frame to reduce the effect of the noise bubbles on the bubble volume calculation result;
in this embodiment, an edge detection algorithm is used to perform frame-by-frame analysis, and the specific process is as follows:
s51, inputting the stable bubble graph into the semantic segmentation neural network to obtain a stable bubble segmentation graph,
s52, obtaining the coordinates of the edge points of the bubbles in the stable bubble segmentation graph through a Canny edge detection algorithm;
s53, selecting pixel points of which the horizontal coordinate values are in the middle value in the bubble edge points, setting the pixel points as horizontal starting points, traversing the horizontal coordinates of other bubble edge points to obtain the horizontal axis distance from the horizontal starting points in the horizontal axis direction, judging that noise exists in the stable bubble segmentation graph if the horizontal axis distance exceeds a preset horizontal axis distance threshold, screening all noise points of which the horizontal axis distance exceeds the horizontal axis distance threshold, and entering the next step;
s54, selecting a pixel point with the minimum vertical coordinate from all the noise points, setting the pixel point as a vertical starting point, traversing the vertical coordinates of other noise points in the vertical axis direction to obtain the maximum vertical distance from the other noise points to the vertical starting point in the vertical axis direction, judging whether the maximum vertical distance is greater than a vertical axis distance threshold value or not, and if the maximum vertical distance is smaller than the vertical axis distance threshold value, judging that the stable bubble segmentation graph only contains one noise bubble; if the number of the noise bubbles is larger than the preset value, estimating the number of the noise bubbles, wherein a specific calculation formula is as follows:
Figure BDA0002802892250000081
wherein A represents the number of noise bubbles, D represents the maximum longitudinal distance,
Figure BDA0002802892250000082
represents a vertical axis distance threshold, which is preferentially set to the longest smear length of the bubble in the stable bubble segmentation map in this embodiment;
in this embodiment, a stable bubble segmentation map without the noise bubbles or with the minimum noise bubbles is used as a most stable bubble segmentation map, and the corresponding stable bubble map is used as a most stable bubble map, and in addition, in order to obtain a single bubble emergence time, as shown in fig. 3, the most stable bubble map further needs to satisfy another condition: the bottom end of the bubble Q which is about to completely emerge out of the water surface at present and the water surface are basically positioned on the same horizontal line; in this embodiment, a difference is made between frames of the most stable bubble diagram and the bubble trajectory diagram to obtain a bubble gap diagram, then, for the most stable bubble diagram and the bubble gap diagram, the next bubble to be emerged from the water surface is selected, the minimum circumscribed rectangle is used to obtain the bubble length F of the bubble, the bubble gap length G from the top end of the bubble to the water surface, and the bubble length F and the bubble gap length G are added to obtain the bubble movement length.
S55, acquiring the area of a normal bubble and the area of a noise bubble in the most stable bubble segmentation graph through a connected domain, calculating the bubble area ratio of the normal bubble and the noise bubble, and inputting the bubble area ratio into the bubble volume model; the bubble area in the present embodiment is a circular area of the space bubble on the two-dimensional plane.
For a simple illustration:
in this embodiment, 200 continuous frames of bubble images are selected, the 200 frames of bubble images are numbered by 1-200, the number of the continuous frames 1-20 is counted as a first section, 2-21 is counted as a second section, and so on, until 181-200 is the last section, 20 frames of bubble images of each section are superposed through frame differences to obtain a bubble trajectory line graph, then the bubble trajectory line graph is subjected to the steps S3 and S4 to obtain 20 frames of the stable bubble map of the corresponding section, and the 20 frames of the stable bubble map are subjected to frame-by-frame analysis through an edge detection algorithm to obtain the most stable bubble map.
S6, calculating to obtain single bubble floating time according to the bubble movement length and the bubble average rate, wherein the specific formula is as follows:
Figure BDA0002802892250000091
wherein t represents a single bubble emergence time, B represents a bubble movement length,
Figure BDA0002802892250000092
indicating the average rate of bubbles.
S7, inputting the stable time value, the bubble floating time and the volume change value into a bubble volume model to obtain a single normal bubble volume, wherein the method specifically comprises the following steps:
this embodiment assumes that the volume of each normal bubble is the same and the normal bubble volume is VqEach noise bubble has the same volume and the volume of the noise bubble is VnAnd then:
according to the formula of sphere volume
Figure BDA0002802892250000093
Obtaining the normal bubble radius
Figure BDA0002802892250000094
Further obtaining the normal bubble area Sq=π*Rq 2And obtaining the area of the noise bubble according to the bubble area ratio S by the same method:
Figure BDA0002802892250000101
therefore, the temperature of the molten metal is controlled,
Figure BDA0002802892250000102
the bubble volume model is then:
Figure BDA0002802892250000103
in the formula, VqIndicating the normal bubble volume, av indicates the volume change value, T indicates the settling time value, T indicates the individual bubble emergence time, S indicates the bubble area ratio, and a indicates the noise bubble number.
The bubble volume calculation method in the air tightness detection based on the computer vision comprises the steps of obtaining the average velocity of bubbles by using different exposure time lengths, segmenting a bubble image, obtaining the most stable bubbles according to a steady-state measurement model and edge detection, further obtaining the bubble emergence time, and finally obtaining the normal bubble volume by using a bubble volume model; the problems that the existing bubble volume detection precision cannot be guaranteed, the reliability is low, the detection equipment is relatively complex and the cost is high are solved; according to the embodiment of the invention, a steady-state measurement model is established through different characteristics of a bubble track line graph obtained by frame difference superposition under the initial exposure time and a standard bubble track line so as to select a time period during which bubbles rise stably, further optimize the bubbles in the time period, well eliminate impurities with shapes similar to the bubbles, and improve the accuracy of a volume calculation result; in addition, compared with the method that high-temperature equipment such as a high-speed camera is used for collecting the bubble image, the bubble image can be collected through a common camera, the operation is simple, and the cost is greatly reduced.
The above embodiments are preferred embodiments of the present invention, but the present invention is not limited to the above embodiments, and any other changes, modifications, substitutions, combinations, and simplifications which do not depart from the spirit and principle of the present invention should be construed as equivalents thereof, and all such changes, modifications, substitutions, combinations, and simplifications are intended to be included in the scope of the present invention.

Claims (9)

1. A bubble volume calculation method in air tightness detection based on computer vision is characterized by comprising the following steps:
s1, inputting bubble images obtained by using different exposure durations into a semantic segmentation neural network, and obtaining an average bubble rate according to the obtained semantic segmentation effect images of different exposure durations and a bubble rate model;
s2, segmenting the bubble images acquired under the initial exposure duration by setting frame numbers, and performing frame difference superposition on all the bubble images in each segment to obtain a bubble track line graph of each segment;
s3, obtaining a bubble track line width and a track line inclination value according to the bubble track line graph, and calculating to obtain a track line width ratio according to the bubble track line width and a standard bubble track line width;
s4, based on the track line width ratio and the track line inclination value, obtaining a steady state measurement coefficient by using a steady state measurement model, selecting the time length of the section corresponding to the maximum steady state measurement coefficient as a stable time value, taking the bubble image contained in the stable time value as a stable bubble image, and simultaneously recording the volume change value in the corresponding section;
s5, selecting the most stable bubble image in all the stable bubble images by using an edge detection algorithm, and obtaining the bubble motion length according to the most stable bubble image and the bubble trajectory line image;
s6, calculating to obtain single bubble floating time according to the bubble movement length and the bubble average rate;
and S7, inputting the stable time value, the bubble floating time and the volume change value into a bubble volume model to obtain the single normal bubble volume.
2. The method for calculating the volume of the air bubbles in the air tightness test based on the computer vision as claimed in claim 1, wherein the step S1 is specifically as follows:
s11, setting initial exposure time, sequentially increasing preset exposure step length to adjust the exposure time, and inputting bubble images under different exposure time lengths into a semantic segmentation neural network to obtain semantic segmentation effect graphs of different exposure time lengths;
s12, obtaining the coordinates of the top ends of the bubbles at the bottom ends of the bubbles in the two adjacent frames on the same horizontal plane according to the semantic segmentation effect graph, obtaining the smear length of the bubbles in the two adjacent frames and the bubble width of the current frame by using the minimum circumscribed rectangle, and calculating the difference value of the smear length of the bubbles in the two adjacent frames;
s13, calculating the moving speed of the bubbles according to the difference value of the smear lengths of the bubbles;
s14, calculating the distance height of the bubbles according to the coordinates of the top ends of the bubbles of the two adjacent frames;
and S15, inputting the moving speed of the bubbles, the difference value of smear lengths of the bubbles, the distance height of the bubbles and the width of the bubbles into a pre-established bubble rate model to obtain the average rate of the bubbles.
3. A method of calculating the volume of a gas bubble in a computer vision based hermeticity test as claimed in claim 1, wherein: the two adjacent frames comprise the current frame and the next frame of the semantic segmentation effect graph.
4. The method for calculating the volume of the air bubbles in the air tightness test based on the computer vision as claimed in claim 1, wherein the step S5 further comprises:
acquiring the number of noise bubbles by using the edge detection algorithm;
and acquiring the area of the normal bubbles and the area of the noise bubbles through the connected domain, calculating the ratio of the areas of the normal bubbles and the noise bubbles, and inputting the ratio of the areas of the normal bubbles and the noise bubbles into the bubble volume model.
5. The method of claim 1, wherein the steady state metric model is:
Figure FDA0002802892240000021
in the formula, CaThe steady state metric coefficient is represented, d represents the ratio of the widths of the track lines, epsilon represents the inclination value of the track lines, and alpha and beta represent the weight coefficients.
6. The method of claim 5, wherein the satisfying of the most stable bubble map condition at least comprises:
a. none or minimal noise bubbles in all of the stable bubble maps;
b. the bottom end of the bubble which is about to emerge out of the water surface is substantially on the same horizontal line with the water surface.
7. The method of claim 4, wherein the bubble volume model is:
Figure FDA0002802892240000031
in the formula, VqIndicating the normal bubble volume, av indicates the volume change value, T indicates the settling time value, T indicates the individual bubble emergence time, S indicates the bubble area ratio, and a indicates the noise bubble number.
8. A method of calculating the volume of a gas bubble in a computer vision based hermeticity test as claimed in claim 1, wherein: before the step S1, the bubble image needs to be preprocessed.
9. A method of calculating the volume of a gas bubble in a computer vision based hermeticity test as claimed in claim 1, wherein: the semantic segmentation neural network employs an encoder-decoder infrastructure.
CN202011357032.9A 2020-11-27 2020-11-27 Bubble volume calculation method in air tightness detection based on computer vision Withdrawn CN112465895A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011357032.9A CN112465895A (en) 2020-11-27 2020-11-27 Bubble volume calculation method in air tightness detection based on computer vision

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011357032.9A CN112465895A (en) 2020-11-27 2020-11-27 Bubble volume calculation method in air tightness detection based on computer vision

Publications (1)

Publication Number Publication Date
CN112465895A true CN112465895A (en) 2021-03-09

Family

ID=74809059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011357032.9A Withdrawn CN112465895A (en) 2020-11-27 2020-11-27 Bubble volume calculation method in air tightness detection based on computer vision

Country Status (1)

Country Link
CN (1) CN112465895A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113066076A (en) * 2021-04-12 2021-07-02 北京理工大学 Rubber tube leakage detection method, device, equipment and storage medium
CN113610802A (en) * 2021-08-06 2021-11-05 宿迁旺春机械制造有限公司 Water surface stability detection method, device and equipment based on artificial intelligence
CN114113535A (en) * 2021-12-13 2022-03-01 哈尔滨理工大学 Device and method for measuring area of underwater explosion bubble of small equivalent explosive
CN117409007A (en) * 2023-12-15 2024-01-16 深圳市什方智造科技有限公司 Method, device, equipment and medium for determining laminating degree of battery heating film

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113066076A (en) * 2021-04-12 2021-07-02 北京理工大学 Rubber tube leakage detection method, device, equipment and storage medium
CN113066076B (en) * 2021-04-12 2022-08-26 北京理工大学 Rubber tube leakage detection method, device, equipment and storage medium
CN113610802A (en) * 2021-08-06 2021-11-05 宿迁旺春机械制造有限公司 Water surface stability detection method, device and equipment based on artificial intelligence
CN114113535A (en) * 2021-12-13 2022-03-01 哈尔滨理工大学 Device and method for measuring area of underwater explosion bubble of small equivalent explosive
CN114113535B (en) * 2021-12-13 2023-06-16 哈尔滨理工大学 Method for measuring area of underwater explosion bubble of small equivalent explosive
CN117409007A (en) * 2023-12-15 2024-01-16 深圳市什方智造科技有限公司 Method, device, equipment and medium for determining laminating degree of battery heating film
CN117409007B (en) * 2023-12-15 2024-04-12 深圳市什方智造科技有限公司 Method, device, equipment and medium for determining laminating degree of battery heating film

Similar Documents

Publication Publication Date Title
CN112465895A (en) Bubble volume calculation method in air tightness detection based on computer vision
CN113971779B (en) Water gauge automatic reading method based on deep learning
CN107507173B (en) No-reference definition evaluation method and system for full-slice image
CN113592861B (en) Bridge crack detection method based on dynamic threshold
CN114627117B (en) Knitted fabric defect detection method and system based on projection method
CN101710425B (en) Self-adaptive pre-segmentation method based on gray scale and gradient of image and gray scale statistic histogram
US8457351B2 (en) Image object detection using separate ranges from both image detections
CN109376740A (en) A kind of water gauge reading detection method based on video
CN109409290B (en) Thermometer verification reading automatic identification system and method
CN116452598B (en) Axle production quality rapid detection method and system based on computer vision
CN110533686B (en) Method and system for judging whether line frequency of linear array camera is matched with object motion speed
CN114638827A (en) Visual detection method and device for impurities of lubricating oil machinery
CN115170570B (en) Fabric fuzzing and pilling detection method based on gray level run matrix
CN107730521B (en) Method for rapidly detecting ridge type edge in image
CN104200457A (en) Wide-angle camera shooting based discrete type canopy leaf area index detection system and method
CN111354047B (en) Computer vision-based camera module positioning method and system
CN110766683B (en) Pearl finish grade detection method and system
CN107392206B (en) Method for segmenting embossed characters of steel rail under working condition
CN111698420A (en) Automatic focusing method for image analyzer
CN115546072B (en) Image distortion correction method
CN111161264B (en) Method for segmenting TFT circuit image with defects
CN112686876A (en) Water steady-state visual detection method and system based on artificial intelligence
CN115984360A (en) Method and system for calculating length of dry beach based on image processing
CN106909897B (en) Text image inversion rapid detection method
CN111583341B (en) Cloud deck camera shift detection method

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication
WW01 Invention patent application withdrawn after publication

Application publication date: 20210309