CN113762122B - Raindrop detection algorithm based on stroboscopic photo

Info

Publication number: CN113762122B
Application number: CN202111008853.6A
Authority: CN (China)
Prior art keywords: image, image point, raindrop, strobe, pixel value
Legal status: Active (granted)
Other languages: Chinese (zh)
Other versions: CN113762122A
Inventor
张�杰
陆小虎
刘淑
徐喜东
Current Assignee / Original Assignee: China Shipbuilding Pengli Nanjing Atmospheric And Ocean Information System Co ltd
Application filed by China Shipbuilding Pengli Nanjing Atmospheric And Ocean Information System Co ltd
Priority to CN202111008853.6A
Publication of CN113762122A (application) and CN113762122B (grant); application granted; legal status: Active


Classifications

    • G06T5/73
    • G06T5/20 Image enhancement or restoration by the use of local operators
    • G06T7/13 Edge detection
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/30192 Weather; Meteorology
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The application discloses a raindrop detection algorithm based on stroboscopic photos, which comprises a raindrop detection algorithm and a spectrum analysis algorithm based on the raindrop data. The raindrop detection algorithm comprises clutter filtering, image sharpening, target detection and positioning analysis. First, the original photo is processed by clutter filtering and image sharpening into image data with clear targets and a simple data structure. Then the coordinate position and diameter of every raindrop in the image are calculated by edge detection, positioning analysis and other methods. The spectrum analysis algorithm comprises two steps: raindrop target association and statistical analysis. The multiple images of the same raindrop are first merged into a single point of data through association analysis, and the raindrop spectrum data are then obtained through statistical analysis and cluster analysis of a large number of raindrops across multiple pictures. The algorithm thus withstands environmental noise such as background, illumination and dust as well as conditions such as wind speed, wind direction and rainfall intensity, and the raindrop diameter and speed detection data are accurate and reliable.

Description

Raindrop detection algorithm based on stroboscopic photo
Technical Field
The application relates to the technical field of meteorological monitoring, in particular to a raindrop detection algorithm based on stroboscopic photos.
Background
The raindrop spectrometer is a common piece of meteorological monitoring equipment. The traditional raindrop spectrometer adopts a laser detection method, which has certain limitations. With the development and popularization of photographic equipment and image processing technology, a raindrop spectrometer based on image processing can obtain images of raindrops more intuitively and accurately, and can produce efficient and accurate raindrop spectrum data through image processing and statistical methods. A raindrop spectrometer based on image processing uses stroboscopic photos as the basis for data acquisition and analysis; how to identify the raindrops in the photo through an efficient and accurate image processing algorithm, and how to analyze and process the raindrop data, is the key that determines the performance of the raindrop spectrometer.
Currently, target recognition technology based on image processing is very mature and widely applied in production and daily life, but its application to raindrop detection involves several particularities. Raindrops are transparent objects, so the snapshot is strongly affected by environmental noise such as background, illumination and dust. Raindrops are fast-moving targets influenced by wind speed, wind direction and rainfall intensity, so their positions in the snapshots are highly random. Moreover, in a raindrop photo obtained by stroboscopic photography, each raindrop leaves multiple images. A general image processing algorithm does not take these complications into account at the same time, so the calculation errors of parameters such as raindrop radius and falling speed are large and the reliability is low.
Disclosure of Invention
Aiming at the defects of the prior art, the application provides a raindrop detection algorithm based on stroboscopic photos that overcomes the influence of environmental noise such as background, illumination and dust as well as conditions such as wind speed, wind direction and rainfall, so that the raindrop diameter and speed detection data are accurate and reliable.
In order to solve the technical problems, the application adopts the following technical scheme:
a raindrop detection algorithm based on stroboscopic photos comprises the following steps.
Step 1, shooting a raindrop image: the industrial camera and the stroboscopic light source are started, the stroboscopic light source carries out stroboscopic light filling on raindrops entering the raindrop measuring hole, and the raindrops entering the raindrop measuring hole are imaged on the black background plate; and the telecentric lens of the industrial camera shoots the imaging on the black background plate to form a raindrop image.
And 2, preprocessing an image, which specifically comprises the following steps.
Step 2A, clutter filtering: perform clutter filtering on the raindrop image formed by shooting in step 1.
Step 2B, sharpening: carry out binarization sharpening treatment on the raindrop image subjected to clutter filtering in step 2A.
Step 2C, establishing a two-dimensional coordinate system: for the raindrop image sharpened in step 2B, take the upper left corner point as the coordinate origin o, the top edge of the raindrop image as the x direction and the left edge of the raindrop image as the y direction, and establish the two-dimensional coordinate system xoy; in this coordinate system, each pixel of the raindrop image has a length of 1 in both the x and y directions.
Step 3, acquiring n segment sequences and m image points: starting from the coordinate origin o, scan and traverse the pixel values in the raindrop image with the two-dimensional coordinate system one by one, from left to right and from top to bottom, to obtain n segment sequences, the position information of the i-th segment sequence being [i, (x_si, y_si), (x_ei, y_ei)], where 1 ≤ i ≤ n; (x_si, y_si) is the start coordinate corresponding to the start pixel value of the i-th segment sequence; (x_ei, y_ei) is the end coordinate corresponding to the end pixel value of the i-th segment sequence; and y_si = y_ei. Image points are established while the n segment sequences are acquired, giving m image points in total. The segment sequences in each image point must satisfy: their x-direction coordinates overlap, and their y-direction coordinates increase by +1 from one segment to the next. The start and end coordinates of all segment sequences in each image point form the contour of the corresponding image point.
Step 4, obtaining the center coordinates and the diameter of each image point: calculate the center coordinates and the diameter of each image point according to its contour obtained in step 3.
Step 5, acquiring associated image points of raindrops: traversing the central coordinate of each image point and the diameter of the corresponding image point by adopting a target association algorithm, and associating the image points which are positioned on the same vertical straight line, have the diameters close to each other and are equidistant to each other to be used as associated image points of the same raindrop; the number of associated image points per raindrop is not less than 3.
Step 6, calculating the diameter of the raindrops: the raindrop diameter is the average diameter of all associated image points of the corresponding raindrop.
Step 7, calculating the raindrop falling speed V, with the specific calculation formula: V = S · f; where f is the frequency of the strobe light source and S is the average distance between all the associated image points of the corresponding raindrop.
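As a worked example of this formula (the spacing value is an illustrative assumption, not a figure from the application): with a strobe frequency f = 200 Hz and an average spacing S = 25 mm between adjacent associated image points, the falling speed is V = S · f = 0.025 m × 200 Hz = 5 m/s.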
In step 3, the method for acquiring the n-segment sequences and m image points comprises the following steps.
Step 31, pixel value scanning traversal: from the origin o of coordinates, pixel values in a raindrop image having a two-dimensional coordinate system are scanned one by one in order from left to right and from top to bottom.
Step 32, acquiring the 1st segment sequence and the 1st image point: in the first row of the raindrop image, when the pixel value first changes from 0 to 1, that pixel value 1 is the start pixel value of the 1st segment sequence, and the start coordinate (x_s1, y_s1) is recorded. Continuing the traversal, in the first row of the raindrop image, when the pixel value first changes from 1 to 0, the preceding pixel value 1 is the end pixel value of the 1st segment sequence, and the end coordinate (x_e1, y_e1) is recorded. Thus the 1st segment sequence position information [1, (x_s1, y_s1), (x_e1, y_e1)] is obtained, and the 1st segment sequence is recorded as the 1st image point.
Step 33, acquiring the 2nd segment sequence and the 2nd image point: after the 1st segment sequence is obtained, continue traversing the pixel values; in the first or second row of the raindrop image, when the pixel value changes from 0 to 1, that pixel value 1 is the start pixel value of the 2nd segment sequence, and the start coordinate (x_s2, y_s2) is recorded. Continuing the traversal, when the pixel value changes from 1 to 0 in the same row of the raindrop image, the preceding pixel value 1 is the end pixel value of the 2nd segment sequence, and the end coordinate (x_e2, y_e2) is recorded. Thus the 2nd segment sequence position information [2, (x_s2, y_s2), (x_e2, y_e2)] is obtained. Whether the 2nd segment sequence belongs to the 1st image point is then judged as follows: when the x-direction coordinates (x_s2, x_e2) of the 2nd segment sequence overlap the x-direction coordinates (x_s1, x_e1) of the 1st segment sequence and y_s2 - y_s1 = 1, the 2nd segment sequence belongs to the 1st image point; otherwise the 2nd segment sequence is recorded as the 2nd image point.
Step 34, acquiring the i-th segment sequence and the j-th image point: after the 2nd segment sequence is obtained, continue traversing the pixel values; in a given row of the raindrop image, when the pixel value changes from 0 to 1, that pixel value 1 is the start pixel value of the i-th segment sequence, and the start coordinate (x_si, y_si) is recorded, where 2 ≤ i ≤ n. Continuing the traversal, when the pixel value changes from 1 to 0 in the same row of the raindrop image, the preceding pixel value 1 is the end pixel value of the i-th segment sequence, and the end coordinate (x_ei, y_ei) is recorded. Thus the i-th segment sequence position information [i, (x_si, y_si), (x_ei, y_ei)] is obtained. If the first i-1 segment sequences have been assigned to j-1 image points, whether the i-th segment sequence belongs to one of these j-1 image points is judged as follows: when the x-direction coordinates of the i-th segment sequence overlap the x-direction coordinates of the last segment sequence of any one of the previous j-1 image points and the y-direction coordinate increases by 1, the i-th segment sequence belongs to that image point; otherwise the i-th segment sequence is recorded as the j-th image point.
Step 35, acquiring the n-th segment sequence: continue the pixel value traversal until one complete pass is finished, obtaining n segment sequences and m image points in total; the n-th segment sequence position information is [n, (x_sn, y_sn), (x_en, y_en)], and the start and end coordinates of all segment sequences in each image point form the contour of the corresponding image point.
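The following Python sketch illustrates one possible implementation of steps 31 to 35 on a binarized raindrop image (a 2D array of 0/1 pixel values). The function and variable names and the data layout are illustrative assumptions, not the application's reference code.

```python
def extract_segments_and_points(img):
    """img: rows of 0/1 pixel values (the binarized raindrop image).
    Returns (segments, points): segments as (x_s, y, x_e) runs of 1-pixels,
    points as lists of segment indices, following steps 31 to 35."""
    segments = []                  # [(x_s, y, x_e), ...], in scan order
    points = []                    # each image point: list of indices into segments
    for y, row in enumerate(img):              # top to bottom
        x = 0
        while x < len(row):                    # left to right
            if row[x] == 1:                    # pixel value changes 0 -> 1
                x_s = x
                while x < len(row) and row[x] == 1:
                    x += 1
                x_e = x - 1                    # pixel value changed 1 -> 0
                idx = len(segments)
                segments.append((x_s, y, x_e))
                # attach to an image point whose last segment is in the row
                # above (y increases by 1) and whose x-range overlaps this run
                for pt in points:
                    lx_s, ly, lx_e = segments[pt[-1]]
                    if y - ly == 1 and x_s <= lx_e and x_e >= lx_s:
                        pt.append(idx)
                        break
                else:
                    points.append([idx])       # otherwise start a new image point
            else:
                x += 1
    return segments, points
```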
In step 4, assume that the j-th image point contains a segment sequences in total, and that its center coordinate is (x_j, y_j) and its diameter is d_j, where 1 ≤ j ≤ m; the calculation formulas of x_j, y_j and d_j are respectively as follows:
where k denotes the k-th of the a segment sequences, 1 ≤ k ≤ a, and the position information of the k-th segment sequence is [k, (x_sk, y_sk), (x_ek, y_ek)]; after x_j, y_j and d_j are obtained, the position information of the j-th image point is recorded as [j, (x_j, y_j), d_j].
In step 5, a method for acquiring raindrop associated image points by adopting a target association algorithm comprises the following steps:
step 51, input the 1 st image point, and regard the 1 st image point as the 1 st point of the first strobe strip.
Step 52, inputting the 2 nd image point, and determining whether the 2 nd image point is the associated image point on the first strobe strip, wherein the specific determining method is as follows:
condition one: the diameter of the 2 nd image point is similar to the diameter of the 1 st image point.
Condition II: the included angle between the center of the 2 nd image point and the vertical direction of the central connecting line of the 1 st image point is smaller than 30 degrees.
When the first condition and the second condition are satisfied at the same time, the 2 nd image point is the associated image point on the first strobe strip line, and the distance D between the 2 nd image point and the 1 st image point is calculated; otherwise, the 2 nd image point is taken as the 1 st point of the second stroboscopic striping.
Step 53, input the l-th image point, where 3 ≤ l ≤ m, and judge whether the l-th image point is an associated image point on an existing strobe strip line; the specific judging method is as follows:
step 53A: the diameter of the first image point is similar to the diameter of the image point on the existing strobe strip H.
Step 53B: after step 53A is satisfied, the judgment is performed as follows:
(1) If there is only one image point on the existing strobe strip line H and the included angle between the line connecting the center of the l-th image point to the center of that existing image point and the vertical direction is smaller than 30 degrees, the l-th image point is taken as an associated image point on the existing strobe strip line H.
(2) If there is only one image point on the existing strobe strip line H and the included angle between the line connecting the center of the l-th image point to the center of that existing image point and the vertical direction exceeds 30 degrees, the l-th image point is taken as the 1st image point on a new strobe strip line.
(3) If there are at least two image points with spacing D on the existing strobe strip line H, the center of the l-th image point lies on the strobe strip line H, and the distance between the l-th image point and the adjacent image point on the strobe strip line H is D, the l-th image point is taken as an associated image point on the existing strobe strip line H.
(4) If there are at least two image points with spacing D on the existing strobe strip line H, and either the center of the l-th image point does not lie on the strobe strip line H, or it lies on the strobe strip line H but the distance to the adjacent image point on the strobe strip line H is not D, the l-th image point is taken as the 1st image point on a new strobe strip line.
Step 54, according to the method of step 53, traversing all image points until all m image points are traversed, and obtaining a plurality of strobe strip lines.
Step 55, counting the number of associated image points in each strobe strip obtained in step 54, and deleting all strobe strips with the number of associated image points lower than 3, thereby obtaining strobe strips with the number of associated image points not less than 3, wherein one strobe strip corresponds to one raindrop.
In step 53B (3), when there are at least two image points with spacing D on the existing strobe strip line H, let the center coordinates of the last two image points be (x_{l-2}, y_{l-2}) and (x_{l-1}, y_{l-1}), and the center coordinate of the l-th image point be (x_l, y_l); when x_l and y_l simultaneously satisfy:
2x_{l-1} = x_{l-2} + x_l
2y_{l-1} = y_{l-2} + y_l
the l-th image point is an associated image point on the existing strobe strip line H.
In step 5, the strobe frequency f of the strobe light source is 200Hz, and the associated image points of each raindrop are 4.
The algorithm also comprises a step 8 of calculating the local wind speed V_x, with the specific calculation formula: V_x = S_x · f; where f is the frequency of the strobe light source and S_x is the x-direction offset distance between two adjacent associated image points of the corresponding raindrop.
The application has the following beneficial effects:
1. in the application, the industrial camera with the telecentric lens, the power supply, the stroboscopic light source, the controller and the black background plate are all embedded and integrated in the raindrop measuring box, and the application has the advantages of small volume, simple structure and strong applicability.
2. Raindrops fall at an essentially constant speed that generally does not exceed 10 m/s. In the present application, the strobe frequency f of the strobe light source is greater than 200 Hz, so even at a raindrop speed of 10 m/s each raindrop in every photographed picture is guaranteed to have at least 4 image points. Because each raindrop has no fewer than four image points, its radius and falling speed can be accurately measured; the measurement precision is high, the test data are reliable and the error is small.
3. The stroboscopic light source includes a plurality of concentric, equidistantly arranged annular light bands of different colours, so the RGB values of the image points of every raindrop in every photographed photo are high and the definition is high, which further allows the radius and falling speed of the raindrops to be measured accurately; the measurement precision is high, the test data are reliable and the error is small.
4. According to the application, the raindrop measuring holes are all positioned in the object distance range of the telecentric lens, so that raindrops falling into the raindrop measuring holes can be accurately captured by the telecentric lens, the testing accuracy is high, and the tested raindrop range is large.
5. The application relates to a raindrop detection algorithm based on stroboscopic photos and a spectrum analysis algorithm based on the raindrop data. The raindrop detection algorithm comprises clutter filtering, image sharpening, target detection and positioning analysis. First, the original photo is processed by clutter filtering and image sharpening into image data with clear targets and a simple data structure. Then the coordinate position and diameter of every raindrop in the image are calculated by edge detection, positioning analysis and other methods. The algorithm thus withstands environmental noise such as background, illumination and dust as well as conditions such as wind speed, wind direction and rainfall intensity, and the raindrop diameter and speed detection data are accurate and reliable.
Drawings
Fig. 1 is a top view of an embedded raindrop spectrometer device of the present application.
Fig. 2 is a front view of an embedded raindrop spectrometer device of the present application.
FIG. 3 is a schematic diagram of the layout position of the strobe light source in the present application.
Fig. 4 is a schematic diagram of an image processing procedure of the same raindrop in a photographed image. Wherein, the image a represents the photographed original image, the image b represents the image after clutter filtering, the image c represents the image after contour detection, and the image d represents the image after positioning analysis.
Fig. 5 shows a schematic diagram of a raindrop target detection flow.
FIG. 6 shows a schematic flow chart of the target association algorithm.
Fig. 7 shows a schematic diagram of an established two-dimensional coordinate system.
Reference numerals in the drawings:
1-an industrial camera; 2-telecentric lens; 3-a raindrop measuring box; 4-raindrop measuring holes;
5-stroboscopic light source; 51-white light band; 52-red light strip; 53-green light band; 54-blue light bands;
6-black background plate; 7-a power supply; 8-a controller;
91-a first image point; 92-a second image point; 93-a third image point; 94-fourth image point.
Detailed Description
The application will be described in further detail with reference to the accompanying drawings and specific preferred embodiments.
In the description of the present application, it should be understood that the terms "left", "right", "upper", "lower", etc. indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, are merely for convenience in describing the present application and simplifying the description, and do not indicate or imply that the apparatus or element being referred to must have a specific orientation, be configured and operated in a specific orientation, and "first", "second", etc. do not indicate the importance of the components, and thus are not to be construed as limiting the present application. The specific dimensions adopted in the present embodiment are only for illustrating the technical solution, and do not limit the protection scope of the present application.
As shown in fig. 1, an embedded raindrop spectrometer device includes a raindrop measuring box 3, an industrial camera 1, a strobe light source 5, a raindrop measuring hole 4, a black background plate 6, a power supply 7, a controller 8, and a monitoring device.
The black background plate 6 is arranged on the inner wall surface of one side wall of the raindrop measuring box 3; in this embodiment, it is preferable to mount the inner wall surface of the right side wall. The black background plate is preferably made of a resinous acrylic pigment.
The rain drop measuring box 3 is preferably a rectangular box body, a light source mounting plate is preferably inserted into the inner cavity on the left side of the rain drop measuring box 3, and the light source mounting plate preferably divides the rain drop measuring box 3 into an instrument accommodating cavity and a rain drop measuring cavity.
The raindrop measurement holes 4 are provided adjacent to the black background plate 6, and penetrate the top and bottom plates of the raindrop measurement box (i.e., the raindrop measurement chamber). The raindrop measurement well 4 is preferably a square well with a side length of 20 cm.
The industrial camera 1 is mounted on a side wall of the rain drop measuring box 3 opposite to the black background plate 6 and is located on the central axis of the black background plate 6. In the present embodiment, the industrial camera 1 is preferably installed in the center of the left inner wall of the raindrop measurement box 3 and is located in the instrument accommodation chamber.
The industrial camera 1 has a telecentric lens 2 facing a black background plate 6, and the telecentric lens 2 passes through and is mounted at the center of the light source mounting plate.
The object distance of the telecentric lens is in the range of 30-40cm, so that the center of the raindrop measurement aperture 4 is located within the object distance of the telecentric lens 2. In this embodiment, the telecentric lens is a variable object distance lens, having a maximum object distance of 40cm and a minimum object distance of 30 cm.
The square raindrop measurement hole has a proximal side wall and a distal side wall, both parallel to the black background plate; the distance from the proximal side wall to the telecentric lens is less than or equal to the minimum object distance, and the distance from the distal side wall to the telecentric lens is greater than or equal to the maximum object distance. If raindrops fell strictly vertically, the distance from the proximal side wall to the telecentric lens could simply equal the minimum object distance and the distance from the distal side wall the maximum object distance. However, because of wind most raindrops fall with some inclination, so to increase the probability of raindrops entering the raindrop measurement hole, the side length of the raindrop measurement hole is made larger than the difference between the maximum and minimum object distances. In this embodiment, the side length of the raindrop measurement hole is 20 cm and the difference between the maximum and minimum object distances is 10 cm; that is, relative to the central object distance of the telecentric lens, the hole extends an extra 5 cm towards each of the proximal and distal side walls.
The periphery of the telecentric lens 2 is provided with stroboscopic light sources 5 with multiple colors in concentric rings.
The number of image points that the same raindrop leaves in each picture shot through the telecentric lens can be controlled by controlling the strobe frequency of the strobe light source. Raindrops fall at an essentially constant speed that generally does not exceed 10 m/s. In the present application, the strobe frequency f of the strobe light source is greater than 200 Hz, so even at a raindrop speed of 10 m/s each raindrop in every photographed picture is guaranteed to have at least 4 image points. Because each raindrop has no fewer than four image points, its radius and falling speed can be accurately measured; the measurement precision is high, the test data are reliable and the error is small.
Further, the stroboscopic light source comprises a plurality of annular light bands which are concentrically and equidistantly arranged and have different colors. Further, the number of the annular lamp bands is four, and the four annular lamp bands are respectively a white lamp band 51, a red lamp band 52, a green lamp band 53 and a blue lamp band 54 from outside to inside.
Normally, only the white light band is used by default. When either of the following situations is found, the device judges that the rainfall is heavy, turns off the white light band, and flashes the red, green and blue annular light bands simultaneously, so that the RGB values of the image points of every raindrop in every photographed photo are high and the definition is high; this further allows the radius and falling speed of the raindrops to be measured accurately, with high measurement precision, reliable test data and small error.
A. The number of raindrops in a single photo exceeds 10, i.e. the number of images exceeds 40.
B. One photograph is taken every second, and in not fewer than 10 photographs per minute there are raindrop spacings of less than 10 mm.
The power supply 7 and the controller 8 are both arranged in the instrument accommodating cavity of the raindrop measuring box, and each annular light band is connected to the power supply through its own change-over switch. Because every annular light band is controlled by an independent change-over switch, each band can be switched on and off and operated independently.
Further, a waterproof power supply interface and a waterproof network port are arranged on the raindrop measuring box. The power supply interface is used for charging the power supply; the industrial camera and the power supply are both connected with a controller, the model of the controller is preferably PQCJ228, and the controller is also connected with the monitoring equipment in a wireless or wired mode through a network port.
Before measurement, the embedded raindrop spectrometer device is placed on the ground of the region to be measured; raindrops fall into the raindrop measurement cavity through the raindrop measurement hole in the top plate and flow away through the opening in the bottom plate, so that water does not accumulate in the raindrop measurement cavity.
Then, a power supply and a controller are started, the power supply supplies power to the stroboscopic light source and the industrial camera, and the stroboscopic frequency f of the stroboscopic light source is larger than 200Hz. In this embodiment, the strobe frequency f=200 Hz of the strobe light source is such that each raindrop in each photograph taken has 4 to 5 image points, preferably 4 image points.
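A brief consistency check of the image-point count (assuming the photographed field height is of the order of the 20 cm raindrop measurement hole, which the application does not state explicitly): at the maximum raindrop speed of 10 m/s and f = 200 Hz, the spacing between successive images of one raindrop is 10 m/s ÷ 200 Hz = 5 cm, and 20 cm ÷ 5 cm = 4, so each raindrop leaves roughly 4 to 5 images per photo, in line with the figure quoted above.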
As shown in fig. 5 and 6, a raindrop detection algorithm based on a stroboscopic photo includes the following steps.
Step 1, shooting a raindrop image: the industrial camera and the stroboscopic light source are started, the stroboscopic light source carries out stroboscopic light filling on raindrops entering the raindrop measuring hole, and the raindrops entering the raindrop measuring hole are imaged on the black background plate; and the telecentric lens of the industrial camera shoots the imaging on the black background plate to form a raindrop image.
And 2, preprocessing an image, which specifically comprises the following steps.
Step 2A, clutter filtering: perform clutter filtering on the raindrop image formed by shooting in step 1.
Step 2B, sharpening: carry out binarization sharpening treatment on the raindrop image subjected to clutter filtering in step 2A.
Step 2C, establishing a two-dimensional coordinate system: for the raindrop image sharpened in step 2B, take the upper left corner point as the coordinate origin o, the top edge of the raindrop image as the x direction and the left edge of the raindrop image as the y direction, and establish the two-dimensional coordinate system xoy, as shown in fig. 7; in this coordinate system, each pixel of the raindrop image has a length of 1 in both the x and y directions.
Step 3, acquiring n segment sequences and m image points: starting from the coordinate origin o, scan and traverse the pixel values in the raindrop image with the two-dimensional coordinate system one by one, from left to right and from top to bottom, to obtain n segment sequences, the position information of the i-th segment sequence being [i, (x_si, y_si), (x_ei, y_ei)], where 1 ≤ i ≤ n; (x_si, y_si) is the start coordinate corresponding to the start pixel value of the i-th segment sequence; (x_ei, y_ei) is the end coordinate corresponding to the end pixel value of the i-th segment sequence; and y_si = y_ei. Image points are established while the n segment sequences are acquired, giving m image points in total. The segment sequences in each image point must satisfy: their x-direction coordinates overlap, and their y-direction coordinates increase by +1 from one segment to the next. The start and end coordinates of all segment sequences in each image point form the contour of the corresponding image point.
The method for acquiring the n-segment sequences and the m image points specifically comprises the following steps.
Step 31, pixel value scanning traversal: from the origin o of coordinates, pixel values in a raindrop image having a two-dimensional coordinate system are scanned one by one in order from left to right and from top to bottom.
Step 32, acquiring the 1st segment sequence and the 1st image point: in the first row of the raindrop image, when the pixel value first changes from 0 to 1, that pixel value 1 is the start pixel value of the 1st segment sequence, and the start coordinate (x_s1, y_s1) is recorded. Continuing the traversal, in the first row of the raindrop image, when the pixel value first changes from 1 to 0, the preceding pixel value 1 is the end pixel value of the 1st segment sequence, and the end coordinate (x_e1, y_e1) is recorded. Thus the 1st segment sequence position information [1, (x_s1, y_s1), (x_e1, y_e1)] is obtained, and the 1st segment sequence is recorded as the 1st image point.
Step 33, acquiring the 2nd segment sequence and the 2nd image point: after the 1st segment sequence is obtained, continue traversing the pixel values; in the first or second row of the raindrop image, when the pixel value changes from 0 to 1, that pixel value 1 is the start pixel value of the 2nd segment sequence, and the start coordinate (x_s2, y_s2) is recorded. Continuing the traversal, when the pixel value changes from 1 to 0 in the same row of the raindrop image, the preceding pixel value 1 is the end pixel value of the 2nd segment sequence, and the end coordinate (x_e2, y_e2) is recorded. Thus the 2nd segment sequence position information [2, (x_s2, y_s2), (x_e2, y_e2)] is obtained. Whether the 2nd segment sequence belongs to the 1st image point is then judged as follows: when the x-direction coordinates (x_s2, x_e2) of the 2nd segment sequence overlap the x-direction coordinates (x_s1, x_e1) of the 1st segment sequence and y_s2 - y_s1 = 1, the 2nd segment sequence belongs to the 1st image point; otherwise the 2nd segment sequence is recorded as the 2nd image point.
Step 34, acquiring the i-th segment sequence and the j-th image point: after the 2nd segment sequence is obtained, continue traversing the pixel values; in a given row of the raindrop image, when the pixel value changes from 0 to 1, that pixel value 1 is the start pixel value of the i-th segment sequence, and the start coordinate (x_si, y_si) is recorded, where 2 ≤ i ≤ n. Continuing the traversal, when the pixel value changes from 1 to 0 in the same row of the raindrop image, the preceding pixel value 1 is the end pixel value of the i-th segment sequence, and the end coordinate (x_ei, y_ei) is recorded. Thus the i-th segment sequence position information [i, (x_si, y_si), (x_ei, y_ei)] is obtained. If the first i-1 segment sequences have been assigned to j-1 image points, whether the i-th segment sequence belongs to one of these j-1 image points is judged as follows: when the x-direction coordinates of the i-th segment sequence overlap the x-direction coordinates of the last segment sequence of any one of the previous j-1 image points and the y-direction coordinate increases by 1, the i-th segment sequence belongs to that image point; otherwise the i-th segment sequence is recorded as the j-th image point.
Step 35, acquiring the n-th segment sequence: continue the pixel value traversal until one complete pass is finished, obtaining n segment sequences and m image points in total; the n-th segment sequence position information is [n, (x_sn, y_sn), (x_en, y_en)], and the start and end coordinates of all segment sequences in each image point form the contour of the corresponding image point.
Step 4, obtaining the center coordinates and the diameter of each image point: calculate the center coordinates and the diameter of each image point according to its contour obtained in step 3.
Assume that the j-th image point contains a segment sequences in total, and that its center coordinate is (x_j, y_j) and its diameter is d_j, where 1 ≤ j ≤ m; the calculation formulas of x_j, y_j and d_j are respectively as follows:
where k denotes the k-th of the a segment sequences, 1 ≤ k ≤ a, and the position information of the k-th segment sequence is [k, (x_sk, y_sk), (x_ek, y_ek)]; after x_j, y_j and d_j are obtained, the position information of the j-th image point is recorded as [j, (x_j, y_j), d_j].
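The calculation formula images themselves are not reproduced in this text. One reconstruction that is consistent with the worked example that follows, and should be read as an inference rather than the application's verbatim formulas, is:

```latex
% inferred from the worked example below, not the application's verbatim formulas
x_j = \tfrac{1}{2}\Bigl(\min_{1\le k\le a} x_{sk} + \max_{1\le k\le a} x_{ek}\Bigr),\qquad
y_j = \tfrac{1}{2}\bigl(y_{s1} + y_{sa}\bigr),\qquad
d_j = \max\Bigl(\max_{1\le k\le a} x_{ek} - \min_{1\le k\le a} x_{sk},\; y_{sa} - y_{s1}\Bigr) + 1
```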
In this embodiment, it is assumed that the 1st image point contains a = 9 segment sequences, with center coordinate (x_1, y_1) and diameter d_1; the position information of the 9 segment sequences is respectively:
[1,(8,5),(12,5)]
[2,(7,6),(13,6)]
[3,(6,7),(14,7)]
[4,(6,8),(14,8)]
[5,(6,9),(14,9)]
[6,(6,10),(14,10)]
[7,(6,11),(14,11)]
[8,(7,12),(13,12)]
[9,(6,13),(12,13)]
Thus, according to the above calculation formulas, x_1 = 10, y_1 = 9 and the diameter d_1 = 9.
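A minimal Python check of this worked example, using the reconstructed formulas above (again an assumption, since the original formula images are missing from this text):

```python
# Segment sequences of the 1st image point: (k, (x_s, y_s), (x_e, y_e))
segments = [
    (1, (8, 5), (12, 5)), (2, (7, 6), (13, 6)), (3, (6, 7), (14, 7)),
    (4, (6, 8), (14, 8)), (5, (6, 9), (14, 9)), (6, (6, 10), (14, 10)),
    (7, (6, 11), (14, 11)), (8, (7, 12), (13, 12)), (9, (6, 13), (12, 13)),
]

x_min = min(s[1][0] for s in segments)                # leftmost start x  = 6
x_max = max(s[2][0] for s in segments)                # rightmost end x   = 14
y_top, y_bot = segments[0][1][1], segments[-1][1][1]  # first/last rows   = 5, 13

x1 = (x_min + x_max) // 2                             # -> 10
y1 = (y_top + y_bot) // 2                             # -> 9
d1 = max(x_max - x_min, y_bot - y_top) + 1            # -> 9
print(x1, y1, d1)                                     # 10 9 9, matching the text
```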
The steps 1 to 4 are raindrop target detection processes.
Step 5, acquiring associated image points of raindrops: traversing the central coordinate of each image point and the diameter of the corresponding image point by adopting a target association algorithm shown in fig. 6, and associating the image points which are positioned on the same vertical straight line, have the diameters close to each other and are equidistant to each other to be used as associated image points of the same raindrop; the number of associated image points per raindrop is not less than 3. In this embodiment, the frequency f of the strobe light source is preferably 200Hz, and the number of associated image points of each raindrop is 4.
The method for acquiring the raindrop associated image point by adopting the target association algorithm preferably comprises the following steps.
Step 51, input the 1 st image point, and regard the 1 st image point as the 1 st point of the first strobe strip.
Step 52, inputting the 2 nd image point, and determining whether the 2 nd image point is the associated image point on the first strobe strip, wherein the specific determining method is as follows:
condition one: the diameter of the 2 nd image point is similar to the diameter of the 1 st image point.
Condition II: the included angle between the center of the 2 nd image point and the vertical direction of the central connecting line of the 1 st image point is smaller than 30 degrees.
When the first condition and the second condition are satisfied at the same time, the 2 nd image point is the associated image point on the first strobe strip line, and the distance D between the 2 nd image point and the 1 st image point is calculated; otherwise, the 2 nd image point is taken as the 1 st point of the second stroboscopic striping.
Step 53, input the l-th image point, where 3 ≤ l ≤ m, and judge whether the l-th image point is an associated image point on an existing strobe strip line; the specific judging method is as follows:
step 53A: the diameter of the first image point is similar to the diameter of the image point on the existing strobe strip H.
Step 53B: after step 53A is satisfied, the judgment is performed as follows:
(1) If there is only one image point on the existing strobe strip line H and the included angle between the line connecting the center of the l-th image point to the center of that existing image point and the vertical direction is smaller than 30 degrees, the l-th image point is taken as an associated image point on the existing strobe strip line H.
(2) If there is only one image point on the existing strobe strip line H and the included angle between the line connecting the center of the l-th image point to the center of that existing image point and the vertical direction exceeds 30 degrees, the l-th image point is taken as the 1st image point on a new strobe strip line.
(3) If there are at least two image points with spacing D on the existing strobe strip line H, the center of the l-th image point lies on the strobe strip line H, and the distance between the l-th image point and the adjacent image point on the strobe strip line H is D, the l-th image point is taken as an associated image point on the existing strobe strip line H.
That is, if the center coordinates of the last two image points on the strobe strip line H are (x_{l-2}, y_{l-2}) and (x_{l-1}, y_{l-1}), and the center coordinate of the l-th image point is (x_l, y_l), then when x_l and y_l simultaneously satisfy:
2x_{l-1} = x_{l-2} + x_l
2y_{l-1} = y_{l-2} + y_l
the l-th image point is an associated image point on the existing strobe strip line H.
(4) If there are at least two image points with spacing D on the existing strobe strip line H, and either the center of the l-th image point does not lie on the strobe strip line H, or it lies on the strobe strip line H but the distance to the adjacent image point on the strobe strip line H is not D, the l-th image point is taken as the 1st image point on a new strobe strip line.
Step 54, according to the method of step 53, traversing all image points until all m image points are traversed, and obtaining a plurality of strobe strip lines.
Step 55, counting the number of associated image points in each strobe strip obtained in step 54, and deleting all strobe strips with the number of associated image points lower than 3, thereby obtaining strobe strips with the number of associated image points not less than 3, wherein one strobe strip corresponds to one raindrop.
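The sketch below outlines the target association of steps 51 to 55 in Python. The diameter-similarity tolerance and the small tolerance used in place of the exact equality tests are assumptions; the application only states "similar diameters", an included angle smaller than 30 degrees, and the equidistance condition 2x_{l-1} = x_{l-2} + x_l, 2y_{l-1} = y_{l-2} + y_l.

```python
import math

def associate(points, diam_tol=0.2, angle_deg=30.0, dist_tol=1.5):
    """points: (x, y, d) image points in input order.
    Returns the strobe strip lines that keep at least 3 associated image points."""
    strips = []                                   # each strip line: list of (x, y, d)
    for x, y, d in points:
        placed = False
        for strip in strips:
            # step 53A: similar diameter (the 20% tolerance is an assumption)
            if abs(d - strip[-1][2]) > diam_tol * strip[-1][2]:
                continue
            if len(strip) == 1:
                # steps 52 / 53B(1): centre-to-centre line within 30 deg of vertical
                dx, dy = x - strip[0][0], y - strip[0][1]
                if dy > 0 and math.degrees(math.atan2(abs(dx), dy)) < angle_deg:
                    strip.append((x, y, d))
                    placed = True
                    break
            else:
                # step 53B(3): collinear, equidistant continuation,
                # i.e. 2*x_{l-1} = x_{l-2} + x_l and 2*y_{l-1} = y_{l-2} + y_l
                (xa, ya, _), (xb, yb, _) = strip[-2], strip[-1]
                if (abs(2 * xb - (xa + x)) <= dist_tol
                        and abs(2 * yb - (ya + y)) <= dist_tol):
                    strip.append((x, y, d))
                    placed = True
                    break
        if not placed:
            strips.append([(x, y, d)])            # steps 52 / 53B(2)/(4): new strip line
    return [s for s in strips if len(s) >= 3]     # step 55: keep strips with >= 3 points
```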
Step 6, calculating the diameter of the raindrops: the raindrop diameter is the average diameter of all associated image points of the corresponding raindrop.
Step 7, calculating the raindrop falling speed V, with the specific calculation formula: V = S · f; where f is the frequency of the strobe light source and S is the average distance between all the associated image points of the corresponding raindrop.
Step 8, calculating the local wind speed V_x, with the specific calculation formula: V_x = S_x · f; where f is the frequency of the strobe light source and S_x is the x-direction offset distance between two adjacent associated image points of the corresponding raindrop.
In this embodiment, the included angle between the line connecting the center of the l-th image point to the centers of the existing image points on the strobe strip line H and the vertical direction is smaller than 30 degrees, so the local wind speed V_x satisfies: V_x < tan(30°) · f · D.
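A short sketch of steps 6 to 8, assuming image-point coordinates and diameters are still in pixel units and that a pixel-to-metre scale factor of the telecentric lens is supplied separately (the application does not give this factor here, so it is left as a parameter); S is interpreted as the mean spacing of adjacent associated image points:

```python
import math

def raindrop_parameters(strip, f=200.0, metres_per_pixel=1.0):
    """strip: associated image points of one raindrop as (x, y, d) tuples in pixel
    units, ordered top to bottom; f: strobe frequency in Hz; metres_per_pixel:
    assumed scale factor. Returns (diameter, fall speed V, horizontal speed Vx)."""
    # step 6: raindrop diameter = average diameter of the associated image points
    diameter = metres_per_pixel * sum(p[2] for p in strip) / len(strip)
    # step 7: V = S * f, with S the mean centre-to-centre spacing of adjacent points
    gaps = [math.hypot(b[0] - a[0], b[1] - a[1]) for a, b in zip(strip, strip[1:])]
    v = metres_per_pixel * f * sum(gaps) / len(gaps)
    # step 8: Vx = Sx * f, with Sx the mean x-direction offset of adjacent points
    x_offsets = [abs(b[0] - a[0]) for a, b in zip(strip, strip[1:])]
    vx = metres_per_pixel * f * sum(x_offsets) / len(x_offsets)
    return diameter, v, vx
```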
The preferred embodiments of the present application have been described in detail above, but the present application is not limited to the specific details of the above embodiments, and various equivalent changes can be made to the technical solution of the present application within the scope of the technical concept of the present application, and all the equivalent changes belong to the protection scope of the present application.

Claims (7)

1. A raindrop detection algorithm based on stroboscopic photos is characterized in that: the method comprises the following steps:
step 1, shooting a raindrop image: the industrial camera and the stroboscopic light source are started, the stroboscopic light source carries out stroboscopic light filling on raindrops entering the raindrop measuring hole, and the raindrops entering the raindrop measuring hole are imaged on the black background plate; shooting imaging on a black background plate by a telecentric lens of an industrial camera to form a raindrop image;
step 2, preprocessing an image, which specifically comprises the following steps:
step 2A, clutter filtering: performing clutter filtering on the raindrop image formed by shooting in the step 1;
step 2B, sharpening: carrying out binarization sharpening treatment on the raindrop image subjected to clutter filtering in the step 2A;
step 2C, establishing a two-dimensional coordinate system: for the raindrop image sharpened in step 2B, taking the upper left corner point as the coordinate origin o, the top edge of the raindrop image as the x direction and the left edge of the raindrop image as the y direction, and establishing the two-dimensional coordinate system xoy; in this coordinate system, each pixel of the raindrop image has a length of 1 in both the x and y directions;
step 3, acquiring n segment sequences and m image points: starting from the coordinate origin o, scanning and traversing the pixel values in the raindrop image with the two-dimensional coordinate system one by one, from left to right and from top to bottom, to obtain n segment sequences, the position information of the i-th segment sequence being [i, (x_si, y_si), (x_ei, y_ei)], where 1 ≤ i ≤ n; (x_si, y_si) is the start coordinate corresponding to the start pixel value of the i-th segment sequence; (x_ei, y_ei) is the end coordinate corresponding to the end pixel value of the i-th segment sequence; and y_si = y_ei; image points are established while the n segment sequences are acquired, giving m image points in total; the segment sequences in each image point must satisfy: their x-direction coordinates overlap, and their y-direction coordinates increase by +1 from one segment to the next; the start and end coordinates of all segment sequences in each image point form the contour of the corresponding image point;
step 4, obtaining the center coordinates of each image point and the diameter of each image point: calculating the center coordinates and the diameters of the image points according to the outline of each image point obtained in the step 3;
step 5, acquiring associated image points of raindrops: traversing the central coordinate of each image point and the diameter of the corresponding image point by adopting a target association algorithm, and associating the image points which are positioned on the same vertical straight line, have the diameters close to each other and are equidistant to each other to be used as associated image points of the same raindrop; the number of associated image points of each raindrop is not less than 3;
step 6, calculating the diameter of the raindrops: the raindrop diameter is the average diameter of all relevant image points of the corresponding raindrops;
step 7, calculating the raindrop falling speed V, with the specific calculation formula: V = S · f; where f is the frequency of the strobe light source and S is the average distance between all the associated image points of the corresponding raindrop.
2. The stroboscopic photo-based raindrop detection algorithm of claim 1, wherein: in step 3, the method for acquiring the n-segment sequence and m image points comprises the following steps:
step 31, pixel value scanning traversal: scanning and traversing pixel values in a raindrop image with a two-dimensional coordinate system one by one from the origin o of coordinates in order from left to right and from top to bottom;
step 32, acquiring the 1st segment sequence and the 1st image point: in the first row of the raindrop image, when the pixel value first changes from 0 to 1, that pixel value 1 is the start pixel value of the 1st segment sequence, and the start coordinate (x_s1, y_s1) corresponding to the start pixel value of the 1st segment sequence is recorded; continuing the traversal, in the first row of the raindrop image, when the pixel value first changes from 1 to 0, the preceding pixel value 1 is the end pixel value of the 1st segment sequence, and the end coordinate (x_e1, y_e1) is recorded; thus the 1st segment sequence position information [1, (x_s1, y_s1), (x_e1, y_e1)] is obtained, and the 1st segment sequence is recorded as the 1st image point;
step 33, acquiring the 2nd segment sequence and the 2nd image point: after the 1st segment sequence is obtained, continuing to traverse the pixel values; in the first or second row of the raindrop image, when the pixel value changes from 0 to 1, that pixel value 1 is the start pixel value of the 2nd segment sequence, and the start coordinate (x_s2, y_s2) is recorded; continuing the traversal, when the pixel value changes from 1 to 0 in the same row of the raindrop image, the preceding pixel value 1 is the end pixel value of the 2nd segment sequence, and the end coordinate (x_e2, y_e2) is recorded; thus the 2nd segment sequence position information [2, (x_s2, y_s2), (x_e2, y_e2)] is obtained; whether the 2nd segment sequence belongs to the 1st image point is judged as follows: when the x-direction coordinates (x_s2, x_e2) of the 2nd segment sequence overlap the x-direction coordinates (x_s1, x_e1) of the 1st segment sequence and y_s2 - y_s1 = 1, the 2nd segment sequence belongs to the 1st image point; otherwise the 2nd segment sequence is recorded as the 2nd image point;
step 34, acquiring the i-th segment sequence and the j-th image point: after the 2nd segment sequence is obtained, continuing to traverse the pixel values; in a given row of the raindrop image, when the pixel value changes from 0 to 1, that pixel value 1 is the start pixel value of the i-th segment sequence, and the start coordinate (x_si, y_si) is recorded, where 2 ≤ i ≤ n; continuing the traversal, when the pixel value changes from 1 to 0 in the same row of the raindrop image, the preceding pixel value 1 is the end pixel value of the i-th segment sequence, and the end coordinate (x_ei, y_ei) is recorded; thus the i-th segment sequence position information [i, (x_si, y_si), (x_ei, y_ei)] is obtained; if the first i-1 segment sequences have been assigned to j-1 image points, whether the i-th segment sequence belongs to one of these j-1 image points is judged as follows: when the x-direction coordinates of the i-th segment sequence overlap the x-direction coordinates of the last segment sequence of any one of the previous j-1 image points and the y-direction coordinate increases by 1, the i-th segment sequence belongs to that image point; otherwise the i-th segment sequence is recorded as the j-th image point;
step 35, obtaining an nth segment sequence: continuing pixel value traversal until one traversal is completed, and obtaining n segments of sequences and m image points altogether; wherein the n-th sequence position information is [ n, (x) sn ,y sn ),(x en ,y en )]The start and end coordinates of all segment sequences in each image point constitute the contour of the corresponding image point.
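For illustration only, the row-by-row scan of steps 32–35 can be sketched as follows (a minimal Python sketch, assuming the raindrop image is a NumPy array of 0/1 pixel values; the function and variable names are hypothetical and not part of the claims):

import numpy as np

def extract_segments_and_points(binary_img):
    # Scan a 0/1 raindrop image row by row, collect run-length segment sequences
    # (seg_id, (x_s, y_s), (x_e, y_e)), and attach a segment to an existing image
    # point when its x-range overlaps the last segment of that point and it lies
    # exactly one row below it; otherwise it starts a new image point.
    segments = []           # all segment sequences in scan order
    points = []             # each image point is the list of its segments
    last_seg_of_point = []  # last segment of each image point, for adjacency tests

    height, width = binary_img.shape
    for y in range(height):
        x = 0
        while x < width:
            if binary_img[y, x] == 1:
                x_s = x
                while x < width and binary_img[y, x] == 1:
                    x += 1
                x_e = x - 1
                seg = (len(segments) + 1, (x_s, y), (x_e, y))
                segments.append(seg)

                attached = False
                for p, (_, (px_s, py), (px_e, _py)) in enumerate(last_seg_of_point):
                    if y - py == 1 and x_s <= px_e and x_e >= px_s:
                        points[p].append(seg)
                        last_seg_of_point[p] = seg
                        attached = True
                        break
                if not attached:            # start a new image point
                    points.append([seg])
                    last_seg_of_point.append(seg)
            else:
                x += 1
    return segments, points

Each entry of points then holds the segment sequences of one image point, whose start and end coordinates form that image point's contour, as stated in step 35.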
3. The stroboscopic photo-based raindrop detection algorithm of claim 1, wherein: in step 4, it is assumed that the j-th image point comprises a segment sequences in total and has center coordinates (x_j, y_j) and diameter d_j, where 1 ≤ j ≤ m; the calculation formulas of x_j, y_j and d_j are then respectively as follows:
wherein k denotes the k-th segment sequence among the a segment sequences, 1 ≤ k ≤ a, and the position information of the k-th segment sequence is [k, (x_sk, y_sk), (x_ek, y_ek)]; after x_j, y_j and d_j are obtained, the j-th image point position information is recorded as [j, (x_j, y_j), d_j].
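The formulas referenced in this claim appear only as images in the source text and are not reproduced here. As a hedged illustration, one plausible reading is to take the center as the pixel centroid of the image point's segment sequences and the diameter as the equivalent-circle diameter of its area; the sketch below uses that assumed reading and hypothetical names, and is not necessarily the claimed formula:

import math

def point_center_and_diameter(segments):
    # Hypothetical reconstruction: center = pixel centroid of all runs,
    # diameter = equivalent-circle diameter of the covered area.
    # Each segment is (k, (x_s, y_s), (x_e, y_e)).
    area = 0
    sum_x = 0.0
    sum_y = 0.0
    for _, (x_s, y_s), (x_e, y_e) in segments:
        run_len = x_e - x_s + 1               # pixels covered by this run
        area += run_len
        sum_x += run_len * (x_s + x_e) / 2.0  # run midpoint, weighted by run length
        sum_y += run_len * y_s                # all pixels of a run lie in one row
    x_j = sum_x / area
    y_j = sum_y / area
    d_j = 2.0 * math.sqrt(area / math.pi)     # equivalent-circle diameter
    return (x_j, y_j), d_j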
4. The stroboscopic photo-based raindrop detection algorithm of claim 1, wherein: in step 5, the method for acquiring raindrop-associated image points with the target association algorithm is as follows:
step 51, inputting the 1st image point and taking it as the 1st point of the first strobe strip line;
step 52, inputting the 2nd image point and judging whether the 2nd image point is an associated image point on the first strobe strip line, the specific judging method being:
condition one: the diameter of the 2nd image point is similar to that of the 1st image point;
condition two: the included angle between the line connecting the centers of the 2nd and 1st image points and the vertical direction is smaller than 30 degrees;
when condition one and condition two are satisfied simultaneously, the 2nd image point is an associated image point on the first strobe strip line, and the distance D between the 2nd image point and the 1st image point is calculated; otherwise, the 2nd image point is taken as the 1st point of a second strobe strip line;
step 53, inputting the l-th image point, where 3 ≤ l ≤ m, and judging whether the l-th image point is an associated image point on an existing strobe strip line, the specific judging method being:
step 53A: the diameter of the l-th image point is similar to that of the image points on an existing strobe strip line H;
step 53B: after step 53A is satisfied, the judgment proceeds as follows:
(1) if there is only one image point on the existing strobe strip line H and the included angle between the line connecting the center of the l-th image point with the center of that existing image point and the vertical direction is smaller than 30 degrees, the l-th image point is taken as an associated image point on the existing strobe strip line H;
(2) if there is only one image point on the existing strobe strip line H and the included angle between the line connecting the center of the l-th image point with the center of that existing image point and the vertical direction exceeds 30 degrees, the l-th image point is taken as the 1st image point of a new strobe strip line;
(3) if there are at least two image points spaced at distance D on the existing strobe strip line H, the center of the l-th image point lies on the strobe strip line H, and the distance between the l-th image point and its adjacent image point on the strobe strip line H is D, the l-th image point is taken as an associated image point on the existing strobe strip line H;
(4) if there are at least two image points spaced at distance D on the existing strobe strip line H, and either the center of the l-th image point does not lie on the strobe strip line H, or it lies on the strobe strip line H but the distance between the l-th image point and its adjacent image point on the strobe strip line H is not D, the l-th image point is taken as the 1st image point of a new strobe strip line;
step 54, traversing all remaining image points according to the method of step 53 until all m image points have been traversed, obtaining a plurality of strobe strip lines;
step 55, counting the number of associated image points on each strobe strip line obtained in step 54 and deleting every strobe strip line whose number of associated image points is lower than 3, thereby obtaining the strobe strip lines with no fewer than 3 associated image points, where one strobe strip line corresponds to one raindrop.
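A rough Python sketch of this association procedure is given below; the diameter-similarity, 30-degree and spacing tolerances are illustrative assumptions, the names are hypothetical, and each input image point is given as ((x, y), d):

import math

def associate_image_points(points, angle_tol_deg=30.0, diam_tol=0.2, dist_tol=1.5):
    # Chain image points, taken in scan order, into strobe strip lines: a point
    # joins a line when its diameter is similar to the line's last point, the
    # segment joining their centers is within ~30 degrees of vertical, and (once
    # the line holds two points) the spacing stays close to its first spacing D.
    strips = []  # each strip: {"pts": [((x, y), d), ...], "D": spacing or None}

    def diameters_similar(d1, d2):
        return abs(d1 - d2) <= diam_tol * max(d1, d2)

    def near_vertical(p, q):
        dx, dy = q[0] - p[0], q[1] - p[1]
        return math.degrees(math.atan2(abs(dx), abs(dy))) < angle_tol_deg

    for center, diam in points:
        placed = False
        for strip in strips:
            last_center, last_diam = strip["pts"][-1]
            if not diameters_similar(diam, last_diam):
                continue
            if not near_vertical(last_center, center):
                continue
            spacing = math.dist(last_center, center)
            if strip["D"] is None:
                strip["D"] = spacing            # first spacing on this line defines D
                strip["pts"].append((center, diam))
                placed = True
                break
            if abs(spacing - strip["D"]) <= dist_tol:
                strip["pts"].append((center, diam))
                placed = True
                break
        if not placed:
            strips.append({"pts": [(center, diam)], "D": None})

    # step 55: keep only strip lines with at least 3 associated image points
    return [s for s in strips if len(s["pts"]) >= 3]

Each strip line surviving the final filter is then treated as one raindrop, in line with step 55.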
5. The stroboscopic photo-based raindrop detection algorithm of claim 4, wherein: in item (3) of step 53B, when there are at least two image points spaced at distance D on the existing strobe strip line H, the center coordinates of its last two image points are (x_(l-2), y_(l-2)) and (x_(l-1), y_(l-1)), and the center coordinate of the l-th image point is (x_l, y_l); when x_l and y_l simultaneously satisfy:
2x_(l-1) = x_(l-2) + x_l
2y_(l-1) = y_(l-2) + y_l
the l-th image point is an associated image point on the existing strobe strip line H.
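A minimal sketch of this midpoint test, assuming integer pixel coordinates (with sub-pixel centers a small tolerance would be needed instead of exact equality); the function name is hypothetical:

def is_next_on_strip(prev2, prev1, cand):
    # The candidate center (x_l, y_l) extends the strip line when the previous
    # point is exactly the midpoint of its two neighbours, i.e.
    # 2*x_(l-1) == x_(l-2) + x_l and 2*y_(l-1) == y_(l-2) + y_l.
    (x2, y2), (x1, y1), (xl, yl) = prev2, prev1, cand
    return 2 * x1 == x2 + xl and 2 * y1 == y2 + yl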
6. The stroboscopic photo-based raindrop detection algorithm of claim 1, wherein: in step 5, the strobe frequency f of the strobe light source is 200 Hz, and the number of associated image points of each raindrop is 4.
7. The stroboscopic photo-based raindrop detection algorithm of claim 1, wherein: the algorithm further comprises a step 8 of calculating the local wind speed V_x, with the specific calculation formula V_x = S_x · f, where f is the frequency of the strobe light source and S_x is the x-direction offset distance between two adjacent associated image points of the corresponding raindrop.
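As a worked example with assumed numbers (not taken from the source): if the strobe frequency is f = 200 Hz and two adjacent associated image points of a raindrop are offset by S_x = 2.5 mm (0.0025 m) in the x direction, then V_x = S_x · f = 0.0025 m × 200 Hz = 0.5 m/s.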
CN202111008853.6A 2021-08-31 2021-08-31 Raindrop detection algorithm based on stroboscopic photo Active CN113762122B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111008853.6A CN113762122B (en) 2021-08-31 2021-08-31 Raindrop detection algorithm based on stroboscopic photo

Publications (2)

Publication Number Publication Date
CN113762122A CN113762122A (en) 2021-12-07
CN113762122B true CN113762122B (en) 2023-10-13

Family

ID=78792030

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111008853.6A Active CN113762122B (en) 2021-08-31 2021-08-31 Raindrop detection algorithm based on stroboscopic photo

Country Status (1)

Country Link
CN (1) CN113762122B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114648711B (en) * 2022-04-11 2023-03-10 成都信息工程大学 Clustering-based cloud particle sub-image false target filtering method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4668838B2 (en) * 2006-05-16 2011-04-13 株式会社日本自動車部品総合研究所 Raindrop detection device and wiper control device
EP2542451B1 (en) * 2010-03-04 2014-09-03 Valeo Schalter und Sensoren GmbH Method of fog and raindrop detection on a windscreen and driving assistance device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111485A (en) * 2014-07-18 2014-10-22 中国科学院合肥物质科学研究院 Stereo imaging based observation method for raindrop size distribution and other rainfall micro physical characteristics
CN104976960A (en) * 2015-06-11 2015-10-14 西北农林科技大学 Raindrop physical property observation method and device
CN106970024A (en) * 2017-03-16 2017-07-21 中南大学 Gauge detection distance-finding method and system based on camera and controllable stroboscopic light source
CN107527316A (en) * 2017-08-14 2017-12-29 马鞍山雨甜医疗科技有限公司 The method and system of arbitrfary point structure cloud data on two-dimensional ultrasound image sequence
CN109115186A (en) * 2018-09-03 2019-01-01 山东科技大学 A kind of 360 ° for vehicle-mounted mobile measuring system can measure full-view image generation method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"Some Aspects of the Scattering of Light and Microwaves on Non-Spherical Raindrops"; Victor V. Sterlyadkin; Atmosphere; full text *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
Address after: 211106 No. 32, Changqing street, Jiangning Development Zone, Nanjing, Jiangsu
Applicant after: China Shipbuilding Pengli (Nanjing) Atmospheric and Ocean Information System Co.,Ltd.
Address before: 211106 No. 32, Changqing street, Jiangning Development Zone, Nanjing, Jiangsu
Applicant before: CSIC PRIDE (NANJING) ATMOSPHERE MARINE INFORMATION SYSTEM Co.,Ltd.
GR01 Patent grant