CN113192066A - Device and method for all-weather visibility estimation method of expressway - Google Patents
- Publication number
- CN113192066A (application CN202110591389.1A)
- Authority
- CN
- China
- Prior art keywords
- target
- brightness
- black
- visibility
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T7/0002 — Image analysis; inspection of images, e.g. flaw detection
- G06T5/70
- G06T5/90
- G06T7/11 — Segmentation; region-based segmentation
- G06T2207/10004 — Image acquisition modality: still image; photographic image
- G06T2207/20021 — Special algorithmic details: dividing image into blocks, subimages or windows
- G06T2207/30192 — Subject of image: Earth observation; weather; meteorology
- G06T2207/30252 — Subject of image: vehicle exterior; vicinity of vehicle
Abstract
The invention relates to the field of optoelectronics, and in particular to a device and method for all-weather visibility estimation on expressways. The device comprises several targets installed beside the road and an image processing apparatus connected to them; each target consists of a back plate and light shielding plates, and the image processing apparatus consists of a camera and a computer connected to the camera. The target back plate is formed by joining two squares: the upper square has a black central area and a white outer area, while the lower square has a white central area and a black outer area. Light shielding plates, painted black, are mounted on the left and right sides of the back plate. The invention has the following advantages: 1. A passive target with black and white parts is used; because the target is passive, the light-decay and power-supply problems of active targets are avoided. 2. Single-ambient-light data are filtered out of complex ambient light.
Description
Technical Field
The invention relates to the field of optoelectronics, and in particular to a device and method for all-weather visibility estimation on expressways.
Background
Current visibility estimation mainly uses the following methods:
(1) Fog visibility estimation based on color features. The RGB color space is first converted to HSV, features of the HSV channels are extracted, and the weather is classified by these color features into fog-free, light-fog and dense-fog conditions.
(2) Methods based on combining multiple sensors. Several sensors are used, including a fog-penetrating camera and temperature and humidity sensors; the presence of fog is judged comprehensively by comparing the histograms of images with and without fog penetration, combined with temperature and humidity.
(3) Methods based on the dark channel prior, whose underlying theory is the fog degradation model. A dark-channel image is first extracted from the foggy image and a transmittance image is estimated from it. Visibility is then obtained either directly, by labeling the visibility of original foggy images and using the transmittance image, or by marking observation points in the image and computing the scene visibility from their depth information together with the transmittance image.
(4) Multi-target methods, which estimate visibility from the brightness differences between targets placed at different distances.
However, the prior art has the following disadvantages: 1. The color-feature and dark-channel methods handle only daytime scenes and cannot estimate visibility at night. 2. Multi-sensor approaches are complex and costly. 3. In multi-light-source methods, the environment affects light sources at different distances differently, and the consistency and stability of source brightness are hard to guarantee, causing a loss of accuracy; moreover, electric power is hard to obtain along an expressway, light sources decay over time, and later maintenance costs are relatively high.
Disclosure of Invention
The invention mainly solves the following technical problems in the prior art:
1. All-weather visibility estimation: the existing fog degradation model applies only to daytime and cannot handle situations such as observation points being hard to see at night.
2. Problems of active targets: over time, an active target suffers a degree of light decay; a further important problem is that electric power is difficult to obtain along an expressway.
3. Influence of complex ambient light on an expressway at night: at night the dominant light sources are vehicle high beams, which vary in spectrum, intensity, and distance to the targets, and several sources may be present at once, all of which strongly affect the result.
To this end, a device and method for an all-weather expressway visibility estimation method are provided.
The invention mainly adopts the following technical scheme:
the device for the all-weather visibility estimation method of the expressway is characterized by comprising a plurality of targets arranged beside the expressway and an image processing device connected with the targets, wherein the targets comprise a light shielding plate and a back plate, and the image processing device comprises a camera and a computer connected with the camera; the target backboard is formed by splicing two squares, the middle area of the square above is black, and the outer area is white; the middle area of the square below is white, the outer area is black, the shading plates are arranged on the left side and the right side of the back plate, and the surfaces of the shading plates are coated with black.
In the device for the all-weather expressway visibility estimation method, the target back plate is formed by joining two squares with a side length of 100 cm. The upper square has a white background with a black 80-cm square in its center; the lower square has a black background with a white 80-cm square in its center. The light shielding plates are mounted on the left and right sides of the back plate at an angle of 120 degrees to the back plate.
In the above device, in the image processing apparatus, the computer acquires the camera picture over the network, calculates the brightness values of the central black area of the upper square and of the central white area of the lower square on the target back plate in the picture, and estimates the visibility at that moment from the brightness values of the two areas.
An all-weather visibility estimation method for a highway is characterized by comprising the following steps:
The daytime visibility estimation is based on the following formula:
where d1 and d2 are the distances of targets M1 and M2 from the camera, and Bg_i and Bblack_i denote the brightness values of the sky background and of the black area of target i, respectively;
Night visibility estimation is based on the following formula:
where B0, B1, B2, B′1, B′2, Bblack1 and Bblack2 denote, respectively, the illumination intensity of the vehicle lamp, the lamp illumination received by target M1, the lamp illumination received by target M2, the brightness of the white area of M1 observed from the camera, the brightness of the white area of M2 observed from the camera, the brightness of the black area of M1 observed from the camera, and the brightness of the black area of M2 observed from the camera; d1 and d2 are the distances of targets M1 and M2 from the camera.
Before calculating visibility, the method further comprises judging whether the current moment is day or night:
where t represents the current time; within the transition interval, the decision is made by comparing the mean sky brightness (sky_mean) with the mean ground brightness (ground_mean): if sky_mean > ground_mean it is daytime, otherwise it is night.
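The day/night decision can be sketched as follows (a minimal sketch; the function and variable names are illustrative, not from the patent):

```python
import numpy as np

def is_daytime(sky_region, ground_region):
    """Day/night decision: daytime iff the mean sky brightness
    (sky_mean) exceeds the mean ground brightness (ground_mean)."""
    sky_mean = float(np.mean(sky_region))
    ground_mean = float(np.mean(ground_region))
    return sky_mean > ground_mean
```

In practice this comparison would only need to be consulted within the transition interval around dawn and dusk mentioned above.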
Before calculating visibility, the method further comprises judging whether the current frame satisfies the single-light-source condition:
the road surface area between the two targets in the image is divided into n parts and the mean brightness mean_i of each part is calculated; if mean1 < mean2 < mean3 < mean4 < mean5 < … < mean_n < bright_thresh, the current frame satisfies the single-light-source condition. bright_thresh defaults to 180; this upper bound rejects frames in which the lamp is so bright that the target brightness saturates, while the strictly increasing means hold only when the light source is not located in the area between the two targets.
After the visibility at a moment has been obtained, the method further smooths a single measuring point in the time domain before output:
V̄_t = α·V̄_(t−1) + (1 − α)·V_t
where V̄_t denotes the final output visibility value at time t, V_t the visibility value calculated from the single-frame image at time t, and α the smoothing coefficient.
Therefore, the invention has the following advantages:
the invention has the following advantages:
1. Target design: a passive target with black and white parts is used. Unlike a conventional target, a white part is added; since white reflects the full light spectrum, the white area can serve as a reflective light source at night. Light shielding plates on both sides of the target reduce the influence of side light sources. Because the target is passive, the light-decay and power-supply problems of active targets are avoided.
2. Filtering single-ambient-light data out of complex ambient light: the road surface area between the two targets is divided into several blocks, the mean brightness of each block is calculated, and the relationship among these means is checked against the single-ambient-light condition, so that only valid single-ambient-light data are used for the visibility calculation.
Drawings
Fig. 1 is a schematic front view of the target.
Fig. 2 is an overall plot of the visibility estimated over a whole day.
FIG. 3 is an all-weather visibility estimation overview flow diagram.
Detailed Description
The technical scheme of the invention is further specifically described by the following embodiments and the accompanying drawings.
Embodiment:
the invention relates to a method and a device for estimating all-weather visibility of a highway. Wherein the target color and shape, the target and the mounting position of the image capturing device are shown in fig. 1. The target is installed on the right side of the road, and the image acquisition equipment is installed on the portal frame and is higher than the target.
The present invention installs several targets at different distances; taking two targets as an example, the following formulas are derived from the positional relationship of the devices in Fig. 1.
First, daytime visibility
For daytime, the invention estimates visibility with the dual luminance difference method. The luminance difference between the sky background and the black area of target i is attenuated as e^(−σdi) over the distance di to the camera, so (Bg1 − Bblack1)/(Bg2 − Bblack2) = e^(σ(d2−d1)), giving:
V_d = 3/σ = 3(d2 − d1) / ln[(Bg1 − Bblack1)/(Bg2 − Bblack2)]   Equation (1)
where d1 and d2 are the distances of target 1 and target 2 from the camera, and Bg_i and Bblack_i denote the brightness values of the sky background and of the black area of target i, respectively.
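The daytime computation of equation (1) can be exercised numerically as follows (an illustrative sketch; the function name, brightness values and distances are made-up, not from the patent):

```python
import math

def daytime_visibility(bg1, bblack1, bg2, bblack2, d1, d2):
    """Dual luminance difference, equation (1): the sky-minus-black
    luminance difference of target i decays as e^(-sigma*d_i), so the
    log-ratio over the two targets yields sigma, and V_d = 3/sigma."""
    sigma = math.log((bg1 - bblack1) / (bg2 - bblack2)) / (d2 - d1)
    return 3.0 / sigma

# Made-up example: targets 50 m and 100 m from the camera
v = daytime_visibility(bg1=200.0, bblack1=80.0, bg2=200.0, bblack2=120.0,
                       d1=50.0, d2=100.0)
```

Note that when the sky brightness is the same for both targets, only the two black-area readings and the two distances are actually needed.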
Second, visibility at night
The dual light source method is an effective way to measure visibility at night: the atmospheric extinction coefficient is obtained by measuring the brightness of two target light sources that have the same true brightness but are attenuated over air columns of different lengths. Let the true illumination intensity of a target light source be B; after attenuation over a path of length d it is observed as B′, where B′ = B·e^(−σd) + B_d = B·e^(−σd) + Bblack, with B_d the air-column brightness and Bblack the black-body brightness. Because the black body is at the same distance from the camera as the light source, the black-body brightness equals the air-column brightness.
The invention exploits the property that white reflects the full spectrum, so the reflected light of the target white area serves as the target light source. The visibility formula is derived as follows:
assuming that no luminophor is arranged in the middle area of the two targets, the light illumination intensity of the automobile is B0Then the vehicle light shines on the target.
The lamp illumination attenuated over the path to target 1 is: B1 = B0·e^(−σd3)   Equation (2)
The lamp illumination attenuated over the path to target 2 is: B2 = B0·e^(−σd4)   Equation (3)
After reflection by the white part of target 1, the camera detects the brightness of the white area of target 1 as:
B′1 = B1·e^(−σd1) + Bblack1   Equation (4)
After reflection by the white part of target 2, the camera detects the brightness of the white area of target 2 as:
B′2 = B2·e^(−σd2) + Bblack2   Equation (5)
Substituting equations (2) and (3) into equations (4) and (5), respectively, gives:
B′1 − Bblack1 = B0·e^(−σd3)·e^(−σd1) = B0·e^(−σ(d3+d1))   Equation (6)
B′2 − Bblack2 = B0·e^(−σd4)·e^(−σd2) = B0·e^(−σ(d4+d2))   Equation (7)
Dividing equation (6) by equation (7) and taking the logarithm gives:
ln[(B′1 − Bblack1)/(B′2 − Bblack2)] = σ[(d4 + d2) − (d3 + d1)]   Equation (8)
The atmospheric extinction coefficient then follows from equation (8):
σ = ln[(B′1 − Bblack1)/(B′2 − Bblack2)] / [(d4 + d2) − (d3 + d1)]   Equation (9)
and the visibility at this moment is calculated from equation (9):
V_d = 3/σ   Equation (10)
Here B0, B1, B2, B′1, B′2, Bblack1 and Bblack2 denote, respectively, the illumination intensity of the vehicle lamp, the lamp illumination received by target 1, the lamp illumination received by target 2, the brightness of the white area of target 1 observed from the camera, the brightness of the white area of target 2 observed from the camera, the brightness of the black area of target 1 observed from the camera, and the brightness of the black area of target 2 observed from the camera. d1, d2, d3 and d4 denote the distance from target 1 to the camera, from target 2 to the camera, from target 1 to the vehicle lamp, and from target 2 to the vehicle lamp, respectively. σ is the extinction coefficient and V_d the visibility value.
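The chain from equations (6)–(7) to equations (9)–(10) can be collected into a small routine (a sketch under the stated geometry; function and variable names are illustrative, and the numeric example is synthetic):

```python
import math

def night_visibility(b1w, bblack1, b2w, bblack2, d1, d2, d3, d4):
    """Equations (9)-(10): lamp light travels d3 (resp. d4) to a target
    and d1 (resp. d2) back to the camera, so the white-minus-black
    brightness decays as e^(-sigma*(d3+d1)) resp. e^(-sigma*(d4+d2))."""
    ratio = (b1w - bblack1) / (b2w - bblack2)            # from (6)/(7)
    sigma = math.log(ratio) / ((d4 + d2) - (d3 + d1))    # equation (9)
    return 3.0 / sigma                                   # equation (10)

# Consistency check with a synthetic extinction coefficient sigma = 0.01:
B0, sigma = 1000.0, 0.01
d1, d2, d3, d4 = 50.0, 100.0, 30.0, 80.0
b1w = B0 * math.exp(-sigma * (d3 + d1)) + 12.0   # Bblack1 = 12.0
b2w = B0 * math.exp(-sigma * (d4 + d2)) + 7.0    # Bblack2 = 7.0
v = night_visibility(b1w, 12.0, b2w, 7.0, d1, d2, d3, d4)
```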
Comparing equation (1) with equation (10), their forms are very similar; only the light source differs: in daytime the sky brightness is used, while at night the reflected brightness of the target white area is used.
Third, the overall implementation flow of the invention is shown in Fig. 3; the steps are as follows:
1. An image is captured with the image acquisition device.
2. The mean brightness values of the white and black areas of the two targets are calculated.
This step yields the parameter Bblack_i of equation (1) and the parameters B′1, B′2, Bblack1 and Bblack2 of equation (10), from which the visibility values are computed. Here Bblack_i denotes the brightness of the central black area of the upper square of the i-th target, and B′i the brightness of the central white area of the lower square of the i-th target.
The mean brightness of a target white or black area is calculated as:
mean = (1/(m·n)) · Σ_(x=1..m) Σ_(y=1..n) gray(x, y)
where gray denotes the image block of the region to be measured on the target, and m and n are the width and height of that region in pixels.
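The per-region mean is a plain average over an m × n grayscale block; a minimal sketch (names are illustrative):

```python
import numpy as np

def region_mean(gray):
    """Mean brightness of a rectangular grayscale block of width m and
    height n pixels, as used for the target black/white areas and for
    the road-surface segments."""
    block = np.asarray(gray, dtype=np.float64)
    m, n = block.shape
    return float(block.sum() / (m * n))  # equivalent to block.mean()
```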
3. Judge whether the current moment is daytime or night.
4. If it is daytime, the sky brightness value is calculated and the visibility value is then calculated according to equation (1).
5. If it is night, judge whether the current frame satisfies the single-light-source condition; if it does, calculate the visibility value according to equation (10).
6. The individual measurement points are smoothed in the time domain, and the smoothed result is taken as the current visibility output value.
The judgment in step 5 of whether the current frame satisfies the single-light-source condition proceeds as follows:
the road surface region between the two targets in the image is divided into n parts (r1, r2, r3, r4, r5 and r6 in Fig. 1), and the mean brightness mean_i of each part is calculated; if mean1 < mean2 < mean3 < mean4 < mean5 < mean6 < bright_thresh, the current frame satisfies the single-light-source condition. bright_thresh defaults to 180; this upper bound rejects frames in which the lamp is so bright that the target brightness saturates, while the strictly increasing means hold only when the light source is not located in the area between the two targets.
This step reduces the influence of ambient light and is used in night visibility estimation. Experiments show that the visibility error of the method is relatively small when only a single light source is present at night, so data that do not satisfy the single-light-source condition are filtered out to reduce the calculation error.
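The single-light-source test can be written directly from the inequality chain (a sketch; the `bright_thresh` default of 180 is taken from the text, other names are illustrative):

```python
def single_light_source(segment_means, bright_thresh=180.0):
    """True iff mean1 < mean2 < ... < mean_n < bright_thresh, i.e. the
    road-segment brightness increases strictly toward the far target and
    never reaches the saturation guard bright_thresh."""
    chain = list(segment_means) + [bright_thresh]
    return all(a < b for a, b in zip(chain, chain[1:]))
```

Frames failing this test would simply be discarded before applying equation (10).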
In step 6, a single measuring point is smoothed in the time domain as:
V̄_t = α·V̄_(t−1) + (1 − α)·V_t
where V̄_t denotes the final output visibility value at time t, V_t the visibility value calculated from the single-frame image at time t, and α the smoothing coefficient (default 0.5).
This step also reduces error: for a fixed location, visibility cannot change abruptly, so the smoothed value over a period of time is used as the output, reducing output error. The step is used both in daytime and at night: equations (1) and (10) give the visibility at a single moment, and step 6 averages the results of several consecutive moments as the output.
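The temporal smoothing of step 6 corresponds to a first-order (exponential) filter; the sketch below assumes the recurrence V̄_t = α·V̄_(t−1) + (1 − α)·V_t with the stated default α = 0.5 (the exact form of the patent's smoothing equation is not recoverable from the extraction, so this form is an assumption):

```python
def smooth_visibility(values, alpha=0.5):
    """Exponentially smooth a sequence of per-frame visibility values;
    the first value initialises the filter state, and each later output
    is alpha * previous_output + (1 - alpha) * current_value."""
    out = []
    state = None
    for v in values:
        state = v if state is None else alpha * state + (1.0 - alpha) * v
        out.append(state)
    return out
```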
The specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.
Claims (7)
1. A device for an all-weather expressway visibility estimation method, characterized by comprising several targets installed beside the road and an image processing apparatus connected to them, wherein each target comprises light shielding plates and a back plate, and the image processing apparatus comprises a camera and a computer connected to the camera; the target back plate is formed by joining two squares, the upper square having a black central area and a white outer area, and the lower square a white central area and a black outer area; the light shielding plates are mounted on the left and right sides of the back plate and their surfaces are painted black.
2. The device according to claim 1, characterized in that the target back plate is formed by joining two squares with a side length of 100 cm; the upper square has a white background with a black 80-cm square in its center, and the lower square a black background with a white 80-cm square in its center; the light shielding plates are mounted on the left and right sides of the back plate at an angle of 120 degrees to the back plate.
3. The device according to claim 1, characterized in that in the image processing apparatus the computer acquires the camera picture over the network, calculates the brightness values of the black area of the upper square and of the white area of the lower square on the target back plate in the picture, and estimates the visibility at that moment from the brightness values of the two areas.
4. An all-weather visibility estimation method for a highway is characterized by comprising the following steps:
The daytime visibility estimation is based on the following formula:
where d1 and d2 are the distances of targets M1 and M2 from the camera, and Bg_i and Bblack_i denote the brightness values of the sky background and of the black area of target i, respectively;
Night visibility estimation is based on the following formula:
where B0, B1, B2, B′1, B′2, Bblack1 and Bblack2 denote, respectively, the illumination intensity of the vehicle lamp, the lamp illumination received by target M1, the lamp illumination received by target M2, the brightness of the white area of M1 observed from the camera, the brightness of the white area of M2 observed from the camera, the brightness of the black area of M1 observed from the camera, and the brightness of the black area of M2 observed from the camera; d1 and d2 are the distances of targets M1 and M2 from the camera.
5. The method for estimating all-weather visibility on a highway according to claim 4, further comprising the step of judging whether the current time is day or night before calculating visibility:
wherein t represents the current time; within the transition interval, whether it is day or night is judged by comparing the mean sky brightness (sky_mean) with the mean ground brightness (ground_mean): if sky_mean > ground_mean it is daytime, otherwise it is night.
6. The method of claim 4, further comprising the step of determining whether the current frame satisfies the condition of single light source before calculating visibility:
dividing the road surface area between the two targets in the image into n parts and calculating the mean brightness of each part; if mean1 < mean2 < mean3 < mean4 < mean5 < … < mean_n < bright_thresh, the current frame satisfies the single-light-source condition; bright_thresh defaults to 180, an upper bound that rejects frames in which the lamp is so bright that the target brightness saturates, while the strictly increasing means hold only when the light source is not located in the area between the two targets.
7. The method according to claim 4, characterized in that, after the visibility at a moment has been obtained and before it is output, a single measuring point is smoothed in the time domain in the following way:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110591389.1A CN113192066A (en) | 2021-05-28 | 2021-05-28 | Device and method for all-weather visibility estimation method of expressway |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110591389.1A CN113192066A (en) | 2021-05-28 | 2021-05-28 | Device and method for all-weather visibility estimation method of expressway |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113192066A true CN113192066A (en) | 2021-07-30 |
Family
ID=76985912
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110591389.1A Pending CN113192066A (en) | 2021-05-28 | 2021-05-28 | Device and method for all-weather visibility estimation method of expressway |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113192066A (en) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1580738A (en) * | 2003-08-04 | 2005-02-16 | 成都易航信息科技有限公司 | Visibility measuring method and visibility monitoring instrument
CN1804588A (en) * | 2006-01-06 | 2006-07-19 | 成都易航信息科技有限公司 | Self-calibrating atmosphere visibility measuring method and measuring system thereof |
CN101614675A (en) * | 2009-07-06 | 2009-12-30 | 中国气象局北京城市气象研究所 | The visibility measurement system and method |
CN101936900A (en) * | 2010-06-12 | 2011-01-05 | 北京中科卓视科技有限责任公司 | Video-based visibility detecting system |
CN101957309A (en) * | 2010-08-17 | 2011-01-26 | 招商局重庆交通科研设计院有限公司 | All-weather video measurement method for visibility |
CN104777103A (en) * | 2015-04-15 | 2015-07-15 | 西安灏通节能工程设备有限公司 | Sight distance visibility meter and measuring method thereof |
CN105021528A (en) * | 2015-07-15 | 2015-11-04 | 安徽皖通科技股份有限公司 | Road weather detection device based on videos |
CN108663368A (en) * | 2018-05-11 | 2018-10-16 | 长安大学 | A kind of system and method for real-time monitoring freeway network night entirety visibility |
JP2019159518A (en) * | 2018-03-09 | 2019-09-19 | 株式会社国際電気通信基礎技術研究所 | Visual state detection apparatus, visual state detection method, and visual state detection program |
- 2021-05-28: application CN202110591389.1A filed (CN); published as CN113192066A; status: pending
Non-Patent Citations (1)
Title |
---|
Li Xiaolei (李小磊): "Design of an Expressway Visibility Detection System Based on Digital Image Processing", China Master's Theses Full-text Database, Engineering Science and Technology II *
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US7016045B2 (en) | Video camera-based visibility measurement system | |
US7899207B2 (en) | Image-based visibility measurement | |
US9883148B2 (en) | Color mask for an image sensor of a vehicle camera | |
KR101364727B1 (en) | Method and apparatus for detecting fog using the processing of pictured image | |
CN105424655B (en) | A kind of visibility detecting method based on video image | |
US9659237B2 (en) | Imaging through aerosol obscurants | |
US20160189354A1 (en) | Image processing system, image processing device, and image processing method | |
CN112288736B (en) | Visibility estimation method based on images | |
CN107505291B (en) | Method for estimating visibility through single image | |
CN112365467B (en) | Foggy image visibility estimation method based on single image depth estimation | |
CN1580738A (en) | Visibility measuring method and visibility monitoring instrument | |
CN109886920A (en) | A kind of greasy weather stage division, greasy weather hierarchy system | |
CN103453890A (en) | Nighttime distance measuring method based on taillight detection | |
CN111275698B (en) | Method for detecting visibility of road in foggy weather based on unimodal offset maximum entropy threshold segmentation | |
WO2010002379A1 (en) | Digital camera control system | |
US20230196544A1 (en) | Method for detecting anomalies on or in a surface of a structure | |
CN113192066A (en) | Device and method for all-weather visibility estimation method of expressway | |
CN112419272B (en) | Method and system for quickly estimating visibility of expressway in foggy weather | |
CN116433513A (en) | Road monitoring video defogging method, system, electronic equipment and storage medium | |
Meng et al. | Highway visibility detection method based on surveillance video | |
CN113408415A (en) | Detection and display system for airport visibility and runway visual range based on image recognition technology | |
Zhao-Zheng et al. | Real-time video detection of road visibility conditions | |
CN113436134A (en) | Visibility measuring method of panoramic camera and panoramic camera applying same | |
CN215117795U (en) | Visibility determination device under low light level condition | |
CN117037007B (en) | Aerial photographing type road illumination uniformity checking method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination ||