CN112288648B - Rapid interpolation display method based on visibility automatic observation

Rapid interpolation display method based on visibility automatic observation

Info

Publication number: CN112288648B
Authority: CN (China)
Prior art keywords: light source, brightness, image, visibility, observation
Legal status: Active (granted)
Application number: CN202011148745.4A
Other languages: Chinese (zh)
Other versions: CN112288648A
Inventors: 雷鸣 (Lei Ming), 武国良 (Wu Guoliang), 姜罕盛 (Jiang Hansheng), 梁健 (Liang Jian), 王琪 (Wang Qi), 赵玉娟 (Zhao Yujuan), 王艺 (Wang Yi)
Current assignee: Tianjin Meteorological Information Center Tianjin Meteorological Archives
Original assignee: Tianjin Meteorological Information Center Tianjin Meteorological Archives
Priority/filing date: 2020-10-23
Publication of application CN112288648A: 2021-01-29
Publication of granted patent CN112288648B: 2022-10-11

Classifications

    • G06T 5/70: Image enhancement or restoration; denoising, smoothing
    • G06T 3/4007: Geometric image transformations; scaling of whole images or parts thereof based on interpolation, e.g. bilinear interpolation
    • G06T 7/11: Image analysis; region-based segmentation
    • G06T 7/73: Image analysis; determining position or orientation of objects or cameras using feature-based methods
    • G06T 7/90: Image analysis; determination of colour characteristics
    • G06T 2207/10016: Image acquisition modality; video, image sequence

Landscapes

  • Engineering & Computer Science
  • Physics & Mathematics
  • General Physics & Mathematics
  • Theoretical Computer Science
  • Computer Vision & Pattern Recognition
  • Image Processing

Abstract

The invention provides a rapid interpolation display method based on visibility automatic observation, comprising the following steps: capturing a video image through a program to obtain the picture of the video window; locating the light source in the video window picture in preparation for subsequent positioning and computation; judging whether fast interpolation is needed: if not, the surface-light-source brightness is read directly; if so, the image is first magnified; removing image noise with a morphological algorithm to obtain the surface-light-source brightness; and calculating the visibility from the final surface-light-source brightness value. The method allows a single camera system to serve two different observation modes, daytime and nighttime, greatly reducing equipment cost and system design complexity; it resolves the conflict between the different focal-length requirements of daytime and nighttime observation; and it effectively improves the accuracy of the system's visibility observation.

Description

Rapid interpolation display method based on visibility automatic observation
Technical Field
The invention belongs to the technical field of image display, and particularly relates to a rapid interpolation display method based on visibility automatic observation.
Background
This work studies a fast image-interpolation algorithm that allows a single camera system to serve two different observation modes, daytime and nighttime, greatly reducing equipment cost and system design complexity. It also resolves the conflict between the different focal-length requirements of daytime and nighttime observation: if the camera focal length is increased, the light-source spot in the video window is enlarged, which benefits nighttime observation, but the sky background is inevitably magnified as well and may disappear from the video picture, severely affecting daytime observation. This has long been an important open problem in research on video visibility automatic observation systems (DPVS). The current workaround is to run two separate video visibility systems, one for each mode; this is clearly not the best approach, since it adds extra cost. To address this problem, a completely new solution is proposed herein.
Disclosure of Invention
In view of this, the present invention aims to provide a rapid interpolation display method based on visibility automatic observation, so that a single camera system suffices for both daytime and nighttime video observation.

To achieve the above purpose, the technical solution of the invention is realized as follows:
a quick interpolation display method based on visibility automatic observation comprises the following steps:
s1, intercepting a video image through a program to obtain a picture of a video window;
s2, searching the light source position in the video window picture, and preparing for subsequent positioning and operation;
s3, judging whether fast interpolation is needed or not, and if not, only obtaining the brightness of the surface light source; if yes, the image needs to be amplified;
s4, removing image noise by using an ecological algorithm according to the amplified image obtained in the S3, so as to obtain the brightness of the surface light source;
and S5, obtaining a final brightness value of the surface light source according to the S4, and calculating the visibility by using the final brightness value of the surface light source.
Further, the decision whether to perform fast interpolation in step S3 is made as follows: the program judges the picture brightness, and when the brightness indicates that night has fallen, it switches to fast interpolation.
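As an illustration only, this brightness judgment can be sketched in Python with OpenCV; the threshold value and the function name are assumptions, since the patent does not state how the picture brightness is thresholded:

```python
import cv2
import numpy as np

# Assumed mean-gray threshold below which the picture is treated as night;
# the patent only says the program judges the picture brightness.
NIGHT_BRIGHTNESS_THRESHOLD = 50  # gray level, 0-255

def needs_fast_interpolation(frame_bgr: np.ndarray) -> bool:
    """Return True when the picture brightness indicates night observation."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return float(gray.mean()) < NIGHT_BRIGHTNESS_THRESHOLD
```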
Further, the interpolation algorithm used in step S3 to magnify the image by fast interpolation is as follows:

Let the function y = f(x) take the values y0 and y1 at two nodes x0 and x1, and let the straight line through the two points (x0, y0), (x1, y1) be y = L(x); then the function value y corresponding to any point x is:

    L(x) = y0 + ((y1 - y0) / (x1 - x0)) (x - x0)

Since the surface light source has almost uniform gray values, the slope term (y1 - y0) / (x1 - x0) in the formula above is a relatively small value; the formula can therefore be approximated as:

    y ≈ y0

Because only the surface light source is of interest, and the other areas of the image are not, the formula is finally expressed as:

    y = y0, when |ΔΦ| ≤ 5
    y = ȳ,  when |ΔΦ| > 5

wherein ΔΦ = y1 - y0, the condition |ΔΦ| ≤ 5 means that a pixel is interpolated only when the difference is at most 5, and ȳ is the average pixel value of the light-source region, which can be obtained once in advance from the image.
Further, the process of removing image noise with the morphological algorithm in step S4 is as follows:

Using the property that the opening and closing operations of mathematical morphology eliminate image detail, more of the effective information of the surface light source is retained, as follows:

Let f be the original image and b the structuring element; the closing of the set f by the structuring element b is written f • b, and the opening is written f ∘ b.

The closing operation is computed as:

    f • b = (f ⊕ b) ⊖ b

The opening operation is computed as:

    f ∘ b = (f ⊖ b) ⊕ b

where ⊕ denotes dilation and ⊖ denotes erosion.
further, the visibility in step S5 is calculated according to the following formula:
the formula for calculating daytime visibility is as follows:
Figure BDA0002740538320000034
wherein L is 1 Distance between near target and camera, L 2 Distance between distant object and camera, B g1 Brightness of sky background corresponding to near object, B t1 Brightness of a near object, B g2 Brightness of sky background corresponding to far objectDegree B t2 Is the brightness of the distant object.
The formula for calculating night visibility is as follows:
Figure BDA0002740538320000035
wherein, B t10 Then the true brightness of the light source 1, B t1 The brightness of the light source 1, B t20 Then the true brightness of the light source 2, B t2 The brightness of the light source 2, B b1 Brightness of black body 1, B b2 Is the brightness of the black body 2.
Compared with the prior art, the rapid interpolation display method based on visibility automatic observation has the following advantages:

(1) The method allows a single camera system to serve two different observation modes, daytime and nighttime, greatly reducing equipment cost and system design complexity. At the same time, it resolves the conflict between the different focal-length requirements of daytime and nighttime observation: if the camera focal length were increased, the light-source spot in the video window would be enlarged, which benefits nighttime observation, but the sky background would inevitably be magnified as well and could disappear from the video picture, severely affecting daytime observation.

(2) The method effectively improves the uniformity of the pixel values in the target region, eliminates interference noise, and improves the accuracy of the system's visibility observation.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate an embodiment of the invention and, together with the description, serve to explain the invention and not to limit the invention. In the drawings:
FIG. 1 is a flow chart of an interpolation algorithm according to an embodiment of the present invention;
FIG. 2 is a flow chart of the one-camera-as-two fast interpolation observation process according to an embodiment of the present invention;
FIG. 3 is a comparison graph of interpolation before and after 2 times magnification according to an embodiment of the present invention;
FIG. 4 is a diagram of a light source area according to an embodiment of the present invention;
FIG. 5 is an enlarged isometric view of an original light source region according to an embodiment of the present invention;
FIG. 6 is a comparison graph of source region interpolation and filtering results according to an embodiment of the present invention;
FIG. 7 is a comparison graph of interpolation before and after 2 times magnification after the improved algorithm according to the embodiment of the present invention;
FIG. 8 is an enlarged isometric view of the light source region after the improved algorithm of the present invention;
FIG. 9 is an enlarged isometric view of a light source region according to an embodiment of the invention;
FIG. 10 is a comparison of daytime video windows according to an embodiment of the present invention;
FIG. 11 is a detail comparison diagram of the daytime front light according to the embodiment of the present invention;
FIG. 12 is a comparison of night video windows according to an embodiment of the present invention;
FIG. 13 is a comparison view of the details of the night front light according to the embodiment of the present invention;
FIG. 14 is a comparison diagram illustrating details of a night rear light according to an embodiment of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict.
In the description of the present invention, it is to be understood that the terms "center", "longitudinal", "lateral", "up", "down", "front", "back", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on those shown in the drawings, and are used only for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and thus, are not to be construed as limiting the present invention. Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first," "second," etc. may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "coupled" are to be construed broadly: for example, as a fixed connection, a removable connection, or an integral connection; as a mechanical or an electrical connection; as a direct connection or an indirect connection through intervening media; or as internal communication between two elements. The specific meanings of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific situation.
The present invention will be described in detail below with reference to embodiments and the attached drawings.
As shown in FIGS. 1 to 14, a rapid interpolation display method based on visibility automatic observation comprises the following steps:

S1, capturing a video image through a program to obtain the picture of the video window;

S2, locating the light-source position in the video window picture, in preparation for subsequent positioning and computation;

S3, judging whether fast interpolation is needed; if not, the surface-light-source brightness is read directly; if so, the image is first magnified;

S4, removing image noise from the magnified image obtained in S3 with a morphological algorithm, thereby obtaining the surface-light-source brightness;

S5, obtaining the final surface-light-source brightness value from S4 and calculating the visibility from it.
The decision whether to perform fast interpolation in step S3 is made as follows: the program judges the picture brightness, and when the brightness indicates that night has fallen, it switches to fast interpolation.
The interpolation algorithm used in step S3 to magnify the image by fast interpolation is as follows:

Let the function y = f(x) take the values y0 and y1 at two nodes x0 and x1, and let the straight line through the two points (x0, y0), (x1, y1) be y = L(x); then the function value y corresponding to any point x is:

    L(x) = y0 + ((y1 - y0) / (x1 - x0)) (x - x0)

Since the surface light source has almost uniform gray values, the slope term (y1 - y0) / (x1 - x0) in the formula above is a relatively small value; the formula can therefore be approximated as:

    y ≈ y0

Because only the surface light source is of interest, and the other areas of the image are not, the formula is finally expressed as:

    y = y0, when |ΔΦ| ≤ 5
    y = ȳ,  when |ΔΦ| > 5

wherein ΔΦ = y1 - y0, the condition |ΔΦ| ≤ 5 means that a pixel is interpolated only when the difference is at most 5, and ȳ is the average pixel value of the light-source region, which can be obtained once in advance from the image.
The process of removing image noise with the morphological algorithm in step S4 is as follows:

Using the property that the opening and closing operations of mathematical morphology eliminate image detail, more of the effective information of the surface light source is retained, as follows:

Let f be the original image and b the structuring element; the closing of the set f by the structuring element b is written f • b, and the opening is written f ∘ b.

The closing operation is computed as:

    f • b = (f ⊕ b) ⊖ b

The opening operation is computed as:

    f ∘ b = (f ⊖ b) ⊕ b

where ⊕ denotes dilation and ⊖ denotes erosion.
the visibility in the step S5 is calculated according to the following formula:
the formula for calculating daytime visibility is as follows:
Figure BDA0002740538320000077
wherein L is 1 Distance between near target and camera, L 2 Distance between distant object and camera, B g1 Brightness of sky background corresponding to near object, B t1 Brightness of a near object, B g2 Brightness of sky background corresponding to distant object, B t2 Is the brightness of the distant object.
The formula for calculating night visibility is as follows:
Figure BDA0002740538320000081
wherein, B t10 Then the true brightness of the light source 1, B t1 The brightness of the light source 1, B t20 Then the true brightness of the light source 2, B t2 The brightness of the light source 2, B b1 Brightness of black body 1, B b2 Is the brightness of the black body 2.
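For illustration, both calculations can be written in Python under the formulas as reconstructed above; the function names are hypothetical, the variable names follow the definitions in the text, and 3.912 = -ln(0.02) is the usual 2% contrast-threshold constant:

```python
import math

def daytime_visibility(L1, L2, Bg1, Bt1, Bg2, Bt2):
    """Dual-target daytime visibility: the apparent contrast of the near
    target decays to that of the far target over the baseline L2 - L1."""
    c1 = (Bg1 - Bt1) / Bg1  # apparent contrast of the near target
    c2 = (Bg2 - Bt2) / Bg2  # apparent contrast of the far target
    return 3.912 * (L2 - L1) / math.log(c1 / c2)

def night_visibility(L1, L2, Bt10, Bt1, Bt20, Bt2, Bb1, Bb2):
    """Dual-light-source night visibility: apparent brightness above the
    black body, normalized by the true brightness of each light source."""
    t1 = (Bt1 - Bb1) / Bt10  # normalized apparent brightness, light source 1
    t2 = (Bt2 - Bb2) / Bt20  # normalized apparent brightness, light source 2
    return 3.912 * (L2 - L1) / math.log(t1 / t2)
```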
The specific method comprises the following steps:
1.1 Uniformity of the light source is an important condition of light-source design, and the target light source adopted by the system is a highly uniform surface light source (the technique for obtaining a uniform surface light source is not repeated here). After interpolation magnification, the uniformity of the surface light source should be preserved as far as possible, while the interpolation speed must also meet certain requirements; this study therefore adopts a linear interpolation algorithm with a relatively high calculation speed. The algorithm is prone to two kinds of distortion, staircase artifacts at image edges and edge blurring, but this does not affect the study, because only the region of the surface light source is of interest, not its edge portion. The algorithm is therefore improved as follows:
Let the function y = f(x) take the values y0 and y1 at two nodes x0 and x1, and let the straight line through the two points (x0, y0), (x1, y1) be y = L(x); then the function value y corresponding to any point x is:

    L(x) = y0 + ((y1 - y0) / (x1 - x0)) (x - x0)    (1)

Because the target is a surface light source, the corresponding gray values are almost equal, so the slope term (y1 - y0) / (x1 - x0) in formula (1) is a relatively small value, and the formula can be approximated as y ≈ y0. Because only the surface light source is of interest, and the other areas of the image are not, the formula is expressed as:

    y = y0, when |ΔΦ| ≤ 5
    y = 0,  when |ΔΦ| > 5    (2)

wherein ΔΦ = y1 - y0, and the condition |ΔΦ| ≤ 5 means that a pixel is interpolated only when the difference is at most 5;
By using this method, the surface light source of interest can be greatly enlarged, while other, non-contiguous areas that are not of interest are suppressed to a certain degree, which serves the subsequent image-matching identification and accurate location of the light-source position.

The system thus achieves the effect of simulating two video systems with a single one, solving the problem of inaccurate calibration caused by a small light-source area and ensuring fast and accurate target positioning. The effect of this algorithm is shown in FIG. 3.

For an effective comparison, the light-source regions are interpolated and placed at the same scale, as shown in FIG. 5; in reality, the light-source region before interpolation is half the size of the region after interpolation. As shown in FIG. 4, the range of the light-source region is greatly expanded, which benefits the subsequent positioning of the light-source region and the acquisition of pixel values.
1.2 Further improvement of the algorithm

Note that image magnification also amplifies image noise, and only the light-source part is of interest, not the details of other parts of the image. Therefore, using the property that the opening and closing operations of mathematical morphology eliminate image detail, more of the effective information of the surface light source is retained. Let f be the original image and b the structuring element; the closing of the set f by the structuring element b is written f • b, and the opening is written f ∘ b.

The closing operation is computed as:

    f • b = (f ⊕ b) ⊖ b

The opening operation is computed as:

    f ∘ b = (f ⊖ b) ⊕ b

where ⊕ denotes dilation and ⊖ denotes erosion.
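A minimal sketch of this filtering in Python with OpenCV, using the square 5 × 5 structuring element selected in the experimental section; applying the opening before the closing is an assumption, as the text specifies only a single pass of open-close filtering:

```python
import cv2
import numpy as np

def open_close_filter(light_source_img: np.ndarray) -> np.ndarray:
    """One pass of morphological open-close filtering of a light-source
    sub-image with a square 5x5 structuring element b."""
    b = np.ones((5, 5), np.uint8)
    # opening: erosion followed by dilation, removes small bright noise
    opened = cv2.morphologyEx(light_source_img, cv2.MORPH_OPEN, b)
    # closing: dilation followed by erosion, fills small dark holes
    closed = cv2.morphologyEx(opened, cv2.MORPH_CLOSE, b)
    return closed
```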
to verify the validity of the algorithm, the stability of the algorithm during observation can be checked by a large number of experimental comparative analyses. Taking the screenshots of 12-day and 12-month segments of 2018 (00, 03, 07, 10, 12, 15, 17, 20, 23, respectively), each time is divided into three groups, and each time is sequentially from left to right: original light source region, post-difference light source region, post-filter light source region.
It should be noted that: since the pixel value is directly made zero when | Δ Φ | > 5, i.e. y =0. So that small black spots appear in fig. 6. However, by looking at the detail comparison of the light source area, it can be found that: although the local noise of the light source area is increased after interpolation, the pixel consistency of the middle area is effectively improved although the edge area is shown in fig. 6, the pixel values are distributed in an original disordered point shape to be in an even area shape with equal pixel values, and the disorder is greatly improved.
However, a great deal of subsequent practical experimental observation shows that the filtering effect is ideal in most cases, and small black spots artificially generated can be eliminated through filtering; however, in some specific cases, there is still some noise that affects the smoothness of the active area, such as: at the position of the light source area, when small black spots are densely arranged and even large block black spots are formed, all noises cannot be eliminated by filtering, and the small black spots artificially generated can influence the subsequent calculation.
Therefore, to further remove this hidden danger to the observation, formula (2) is modified: instead of the previous direct, simple zero assignment, the corresponding pixel is assigned the average pixel value of the light-source region:

    y = y0, when |ΔΦ| ≤ 5
    y = ȳ,  when |ΔΦ| > 5    (3)

wherein ΔΦ = y1 - y0 and ȳ is the average pixel value of the light-source region, which can be obtained once in advance from the image. The overall calculation is therefore slightly slower than the direct zero assignment of formula (2), but since formula (3) only involves the pixel mean of the small light-source region, the speed is not much affected.
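For illustration, formula (3) can be sketched as a 2x magnifier in Python; the two-dimensional traversal, in which each inserted pixel compares the two original samples bounding it horizontally, vertically or diagonally, is an assumed generalization of the one-dimensional derivation above:

```python
import numpy as np

def fast_interp_2x(sub: np.ndarray) -> np.ndarray:
    """2x magnification of a grayscale light-source sub-image by the
    thresholded fast interpolation of formula (3): an inserted pixel
    between neighbours y0 and y1 takes y0 when |y1 - y0| <= 5, and the
    region mean otherwise."""
    mean_val = sub.mean()               # average pixel value, obtained once
    h, w = sub.shape
    out = np.zeros((2 * h, 2 * w), dtype=sub.dtype)
    out[::2, ::2] = sub                 # keep the original samples
    for i in range(2 * h):
        for j in range(2 * w):
            if i % 2 == 0 and j % 2 == 0:
                continue                # original sample, already copied
            y0 = sub[i // 2, j // 2]    # left/upper bounding node
            y1 = sub[min(i // 2 + i % 2, h - 1),
                     min(j // 2 + j % 2, w - 1)]  # right/lower bounding node
            dphi = int(y1) - int(y0)    # delta-phi, computed without overflow
            out[i, j] = y0 if abs(dphi) <= 5 else mean_val
    return out
```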
The effect of the above algorithm is shown in FIGS. 7 to 9.

In FIG. 9, from left to right, are: the original image of the light-source region, the interpolated image, and the filtered interpolated image. It is evident that, as in FIG. 5, the pixel values of the interpolated image are clearly more uniform and smoother than those of the original image, and filtering improves the uniformity further.

To further increase the interpolation speed, the interpolation range is restricted. Because only the pixel values of the two light-source regions are of interest, sub-images at the two light-source positions are cropped as soon as the regions are located, and only these much smaller sub-images are interpolated. The overall calculation speed of the system is therefore not greatly affected.
The implementation flow is shown in FIG. 1:

S1, first capture a video image to obtain the picture of the video window;

S2, search for the light-source position in the video window picture, in preparation for subsequent positioning and computation;

S3, if the light-source position cannot be found, return to step S2 and continue searching for the correct light-source position; once the correct position is found, locate the exact position of the light-source frame from the geometric characteristics of the light source, and at the same time calculate the mean pixel value of the light-source region in preparation for the subsequent calculation;

S4, according to the light-source frame position located in S3, crop the respective sub-images of the front and rear light sources and interpolate these two light-source sub-images;

S5, obtain the final light-source region brightness values from S4, providing accurate data for the subsequent visibility calculation.
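This flow can be summarized in a Python skeleton; the five helper callables stand in for the steps described above and are placeholders rather than APIs defined by the patent:

```python
def observe_once(capture, locate_light_sources, interpolate, filter_noise,
                 compute_visibility):
    """Skeleton of the FIG. 1 flow using placeholder helpers."""
    frame = capture()                        # S1: grab the video-window picture
    while True:
        boxes = locate_light_sources(frame)  # S2/S3: locate light-source frames
        if boxes:
            break                            # found: positions and region mean
        frame = capture()                    # not found: keep searching
    brightness = []
    for x, y, w, h in boxes:                 # S4: front and rear light sources
        sub = frame[y:y + h, x:x + w]        # crop only the needed sub-image
        sub = filter_noise(interpolate(sub)) # magnify, then open-close filter
        brightness.append(float(sub.mean())) # S5: final brightness values
    return compute_visibility(brightness)    # data for visibility calculation
```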
1.3 Validation analysis of algorithm quality

To further verify the effectiveness of the algorithm, the data are checked using the standard deviation: whether the light-source region is more uniform and smooth before and after the algorithm is measured by the deviation of the gray values in the region from the average gray value. The standard deviation is calculated as:

    σ = sqrt( (1/N) Σ_{i=1..N} (x_i - μ)² )

wherein N is the number of pixels in the image, μ is the average gray value of the image, and x_i is a single pixel value in the image; i runs through the image from left to right and from top to bottom.
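For reference, the same statistic in Python, in the population form above (dividing by N):

```python
import numpy as np

def region_stddev(region: np.ndarray) -> float:
    """Standard deviation of the gray values of a light-source region:
    square root of the mean squared deviation from the mean gray value."""
    vals = region.astype(np.float64)
    mu = vals.mean()
    # equivalent shortcut: float(region.std())
    return float(np.sqrt(((vals - mu) ** 2).mean()))
```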
The observation data of January 2019 are analyzed separately at high, medium and low visibility levels. For ease of analysis, the foregoing rules are followed: the observation data are divided into four groups from low to high visibility, namely below 2 km, 2-5 km, 5-10 km and 10-15 km.

Since the statistics of the different data segments are very similar, only the statistical analysis of the low-visibility segment (below 2 km) is given here; the detailed results are shown in the tables below. The statistical results of the other three data segments are not repeated.
TABLE 1 Comparative analysis of original, interpolated and filtered data (front light source)

[Table 1 is reproduced as an image in the original publication.]

TABLE 2 Comparative analysis of original, interpolated and filtered data (rear light source)

[Table 2 is reproduced as an image in the original publication.]
The total number of samples in the tables above is 1225; the original, interpolated and filtered figures for each light-source region are means, i.e. the final averages over the 1225 samples.

In the original image, the front light-source region is 59×59 pixels and the rear light-source region is 15×15 pixels. For the performance comparison, the N value of the front light source is taken in [10, 45] and that of the rear light source in [5, 10], so as to avoid the areas where the edges may be distorted and to select the relatively stable central region.

The tables show clearly that the standard deviation decreases from the original value to the interpolated value, indicating that the interpolation algorithm is effective, and decreases again, to a certain extent, from interpolation to filtering; the further decrease from a single filtering to repeated filtering (5 passes), however, is comparatively small, which indirectly confirms that the earlier decision to abandon repeated filtering was correct.

The data also show that the interpolated data are more uniform than the original data, by a factor of at least 1.25 and at most 5.55, and that the uniformity of the filtered data improves further, by a factor of at least 1.49 and at most 11.01 relative to the original data. The proposed algorithm therefore ensures accurate extraction of the pixel values of the light-source region, which benefits the subsequent observation and calculation of atmospheric visibility.
2. One-camera-as-two fast interpolation observation process

By fixing the camera at the focal length better suited to daytime observation, the surface-light-source brightness information needed to calculate visibility at night can be obtained simply by magnifying the image through the interpolation algorithm, on the premise of preserving good daytime observation. The flow is shown in FIG. 2:
s1, intercepting a video image through a program to obtain a picture of a video window;
s2, searching the position of a light source in a video window picture, and preparing for subsequent positioning and operation;
s3, judging whether a quick difference value needs to be carried out or not, and if not, only obtaining the brightness of the surface light source; if yes, the image needs to be amplified by N times;
s4, removing image noise by using an ecological algorithm according to the amplified image obtained in the S3, so as to obtain the brightness of the surface light source;
s5, obtaining a final brightness value of the surface light source according to the S4, and calculating the visibility by using the final brightness value of the surface light source;
in the above flow, only a small range of the light source is intercepted, and only the required area is processed, thereby further improving the processing speed.
3. Experimental analysis

Combining the characteristics of the surface light source, a square 5 × 5 operator b, similar in structure to the observed light-source region, is selected to process the image; the results are shown in FIGS. 10 to 14.

A large number of observation tests show that the interpolated image needs only a single pass of open-close filtering; repeated open-close filtering improves the uniformity of the light-source region only to a limited extent while consuming considerable time. Subsequent experiments therefore uniformly apply a single open-close filtering to the image.

As shown in FIGS. 10 to 14, the noise points around the positioning target are removed well, and the pixels at the light-source position in the magnified window remain consistent with the original pixels, so the subsequent visibility calculation is not affected; the noise of the picture is effectively removed and the pixel values of the light source appear much 'smoother', providing the necessary support for subsequent positioning and good observation results.
To check the effectiveness of the algorithm, the visibility obtained by ordinary observation is compared with the visibility observed after magnification by the present algorithm. For comparison, only the observation results of DPVS and the LT31 (an atmospheric transmissometer) are analyzed, to verify whether the visibility observation is improved.

From the data of November 2019, random samples are drawn by observation distance. For the analysis, the observation data are divided into four groups from low to high visibility: below 2 km, 2-5 km, 5-10 km and 10-15 km. First, the average of the DPVS and LT31 observations is taken as the standard value, and the relative error and relative standard deviation of DPVS and of LT31 against the standard value are calculated for each data segment. The relative standard deviation R is calculated as follows:
    R = (1 / x̄) · sqrt( Σ_{i=1..n} (x_i - x̄)² / (n - 1) ) × 100%

wherein x̄ is the mean of the n observations x_i.
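A small Python equivalent of this reconstruction, using the sample form (dividing by n - 1); the helper name is illustrative:

```python
import numpy as np

def relative_standard_deviation(samples) -> float:
    """Relative standard deviation R: sample standard deviation divided
    by the mean, expressed in percent."""
    x = np.asarray(samples, dtype=np.float64)
    return float(x.std(ddof=1) / x.mean() * 100.0)
```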
the specific analytical results are shown in Table 3.
TABLE 3 Comparative data analysis for the filtering mode

[Table 3 is reproduced as an image in the original publication.]
Here, DPVS' denotes the observation result obtained when the light-source brightness is extracted with the interpolation magnification algorithm.
To further examine the performance of the algorithms, the time consumed by a single observation under each algorithm is compared in Table 4:

TABLE 4 Comparative analysis of single-observation time in different modes

[Table 4 is reproduced as an image in the original publication.]
Here, the original algorithm refers to the normal visibility observation process; the algorithm before process adjustment refers to adding the fast interpolation but processing an entire frame of video image; the present algorithm applies the fast interpolation only to the light-source regions. The image magnification is 2 times in all cases.
4. Experimental results

The data comparison and analysis show that the system improves the accuracy of visibility observation, although the improvement is limited, amounting to only a few percent. The focus of the system, however, is on resolving the conflict between the different focal-length requirements of daytime and nighttime observation. The experiments show that, on the premise of preserving observation effectiveness (the observation effect is even slightly better), the algorithm resolves this conflict: if the camera focal length were increased, the light-source spot in the video window would be enlarged, which benefits nighttime observation, but the sky background would inevitably be magnified and could disappear from the video picture, severely affecting daytime observation.

Instead, by fixing the camera's observation parameters in the state preferred for daytime observation, the system can use the algorithm to magnify the video window directly to a suitable state as needed, without changing camera hardware parameters such as the focal length (to improve speed, only the region where each light source is located is extracted during observation, so that the light-source brightness information can be acquired conveniently and quickly), and carry out effective observation. With the algorithm added, a single observation takes slightly longer than with the original algorithm, but the increase is at the millisecond level; compared with the 1-minute interval between single observations it is negligible, and the observation effect is improved.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (4)

1. A rapid interpolation display method based on visibility automatic observation, characterized by comprising the following steps:

S1, capturing a video image through a program to obtain the picture of the video window;

S2, locating the light-source position in the video window picture, in preparation for subsequent positioning and computation;

S3, judging whether fast interpolation is needed; if not, the surface-light-source brightness is read directly; if so, the image is first magnified;

S4, removing image noise from the magnified image obtained in S3 with a morphological algorithm, thereby obtaining the surface-light-source brightness;

S5, obtaining the final surface-light-source brightness value from S4 and calculating the visibility from it;
the interpolation algorithm for amplifying the image by using the fast interpolation in the step S3 is as follows:
let the function y = f (x) at two nodes x 0 ,x 1 Respectively, the function values of (a) are y 0 ,y 1
Let two points (x) pass 0 ,y 0 )、(x 1 ,y 1 ) Y = L (x), then it can be known that the function value y corresponding to any point x is:
Figure FDA0003794570780000011
since the surface light source has almost the same gray scale value, the above formula
Figure FDA0003794570780000012
Is a relatively small value;
thus, the above formula can be expressed as:
Figure FDA0003794570780000013
because only the area light source is concerned, other areas of the image are not concerned; therefore, the above formula is expressed as:
Figure FDA0003794570780000021
wherein Δ Φ = y 1 -y 0 The absolute value of delta phi is less than or equal to 5 pixel interpolation,
Figure FDA0003794570780000022
the value is an average pixel value of the source region, which can be obtained from the image once in advance.
2. The rapid interpolation display method based on visibility automatic observation according to claim 1, characterized in that the decision whether to perform fast interpolation in step S3 is made as follows: the program judges the picture brightness, and when the brightness indicates that night has fallen, it switches to fast interpolation.
3. The rapid interpolation display method based on visibility automatic observation according to claim 1, characterized in that the process of removing image noise with the morphological algorithm in step S4 is as follows:

using the property that the opening and closing operations of mathematical morphology eliminate image detail, more of the effective information of the surface light source is retained; let f be the original image and b the structuring element, the closing of the set f by the structuring element b being written f • b and the opening f ∘ b;

the closing operation is computed as:

    f • b = (f ⊕ b) ⊖ b

the opening operation is computed as:

    f ∘ b = (f ⊖ b) ⊕ b

where ⊕ denotes dilation and ⊖ denotes erosion.
4. The rapid interpolation display method based on visibility automatic observation according to claim 1, characterized in that the visibility in step S5 is calculated according to the following formulas:

the daytime visibility is calculated as:

    V = 3.912 (L2 - L1) / ln[ ((Bg1 - Bt1) / Bg1) / ((Bg2 - Bt2) / Bg2) ]

wherein L1 is the distance between the near target and the camera, L2 is the distance between the far target and the camera, Bg1 is the brightness of the sky background corresponding to the near target, Bt1 is the brightness of the near target, Bg2 is the brightness of the sky background corresponding to the far target, and Bt2 is the brightness of the far target;

the night visibility is calculated as:

    V = 3.912 (L2 - L1) / ln[ ((Bt1 - Bb1) / Bt10) / ((Bt2 - Bb2) / Bt20) ]

wherein Bt10 is the true brightness of light source 1, Bt1 is the observed brightness of light source 1, Bt20 is the true brightness of light source 2, Bt2 is the observed brightness of light source 2, Bb1 is the brightness of black body 1, and Bb2 is the brightness of black body 2.
CN202011148745.4A, filed 2020-10-23: Rapid interpolation display method based on visibility automatic observation; granted as CN112288648B (Active).

Priority Applications (1)

Application Number | Priority Date | Filing Date | Title
CN202011148745.4A | 2020-10-23 | 2020-10-23 | Rapid interpolation display method based on visibility automatic observation
Publications (2)

Publication Number | Publication Date
CN112288648A | 2021-01-29
CN112288648B | 2022-10-11

Family

ID=74424174

Family Applications (1)

Application Number | Title | Priority Date | Filing Date
CN202011148745.4A (Active) | Rapid interpolation display method based on visibility automatic observation | 2020-10-23 | 2020-10-23

Country Status (1)

Country | Link
CN | CN112288648B

Citations (2)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
CN102156972A * | 2011-04-19 | 2011-08-17 | 清华大学 (Tsinghua University) | Image tilt correction method and system
CN110335280A * | 2019-07-05 | 2019-10-15 | 湖南联信科技有限公司 (Hunan Lianxin Technology Co., Ltd.) | Financial document image segmentation and correction method based on a mobile terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party

Publication number | Priority date | Publication date | Assignee | Title
WO2011111079A1 | 2010-03-11 | 2011-09-15 | Datalogic Scanning Group S.r.l. | Image capturing device
CN101936900A | 2010-06-12 | 2011-01-05 | 北京中科卓视科技有限责任公司 (Beijing Zhongke Zhuoshi Technology Co., Ltd.) | Video-based visibility detecting system
CN101957309B | 2010-08-17 | 2012-11-07 | 招商局重庆交通科研设计院有限公司 (China Merchants Chongqing Communications Research & Design Institute Co., Ltd.) | All-weather video measurement method for visibility
CN108181307B | 2017-12-06 | 2020-12-01 | 中国气象局北京城市气象研究所 (Institute of Urban Meteorology, China Meteorological Administration, Beijing) | Visibility measuring system and method


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party

Title
"Real-time global illumination rendering of variable materials" (可变材质的实时全局光照明绘制); Sun Xin (孙鑫) et al.; Journal of Software (《软件学报》); 2008-04-15 (No. 04); full text *
"Motion analysis of a rifle bolt under high-speed photography based on sub-pixel edge detection" (基于亚像素边缘检测的高速摄影下枪机运动分析); Yang Yifan (杨一帆) et al.; Journal of Electronic Measurement and Instrumentation (《电子测量与仪器学报》); 2018-11-15 (No. 11); full text *
"High-precision measurement of light-spot centroids based on image processing technology" (基于图像处理技术的光斑质心高精度测量); Xia Aili (夏爱利) et al.; Journal of Optoelectronics·Laser (《光电子·激光》); 2011-10-15 (No. 10); full text *
"Online image inspection method for the machining quality of three-dimensional parts" (立体零件加工质量的在线图像检测方法); Shi Bo (史博) et al.; Chinese Journal of Scientific Instrument (《仪器仪表学报》); 2009-10-15 (No. 10); full text *
"Full-resolution accurate extraction of line-structured-light stripe centers" (线结构光条纹中心的全分辨率精确提取); Xiong Huiyuan (熊会元) et al.; Optics and Precision Engineering (《光学精密工程》); 2009-05-15 (No. 05); full text *

Also Published As

Publication number | Publication date
CN112288648A | 2021-01-29

Similar Documents

Publication Publication Date Title
CN110335204B (en) Thermal imaging image enhancement method
CN116309570A (en) Titanium alloy bar quality detection method and system
CN105049734A (en) License camera capable of giving shooting environment shooting prompt and shooting environment detection method
CN115841434A (en) Infrared image enhancement method for gas concentration analysis
WO2022178653A1 (en) Biochip image analysis method and apparatus, and computer device and storage medium
CN112991287B (en) Automatic indentation measurement method based on full convolution neural network
CN117764864B (en) Nuclear magnetic resonance tumor visual detection method based on image denoising
CN115047610B (en) Chromosome karyotype analysis device and method for automatically fitting microscopic focusing plane
CN111369523A (en) Method, system, device and medium for detecting cell stacking in microscopic image
CN109255799B (en) Target tracking method and system based on spatial adaptive correlation filter
CN117557820B (en) Quantum dot optical film damage detection method and system based on machine vision
CN111754548A (en) Multi-scale correlation filtering target tracking method and device based on response discrimination
CN111179184B (en) Fish-eye image effective region extraction method based on random sampling consistency
CN111292256B (en) Texture enhancement algorithm based on microscopic hyperspectral imaging
CN110766657B (en) Laser interference image quality evaluation method
CN112288648B (en) Rapid interpolation display method based on visibility automatic observation
CN116416268A (en) Method and device for detecting edge position of lithium battery pole piece based on recursion dichotomy
US10453189B2 (en) Process and device for direct measurements of plant stomata
CN114241023A (en) Sub-pixel image registration method and device and terminal equipment
CN116563298B (en) Cross line center sub-pixel detection method based on Gaussian fitting
CN110020999B (en) Uncooled infrared thermal image self-adaptive mapping method based on homomorphic filtering
US8249381B2 (en) Image based correction for unwanted light signals in a specific region of interest
CN112233060B (en) Screening method, device, equipment and medium for digital pathological image abnormal samples
CN115049549A (en) Infrared image strip noise removal method based on robust estimation
CN112146834A (en) Method and device for measuring structural vibration displacement

Legal Events

Code | Title
PB01 | Publication
SE01 | Entry into force of request for substantive examination
GR01 | Patent grant