CN107328777A - Method and device for measuring atmospheric visibility at night - Google Patents

Method and device for measuring atmospheric visibility at night

Info

Publication number
CN107328777A
CN107328777A (application CN201710543433.5A)
Authority
CN
China
Prior art keywords
visibility
gray level image
measured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710543433.5A
Other languages
Chinese (zh)
Inventor
于平平
王震洲
郄岩
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei University of Science and Technology
Original Assignee
Hebei University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei University of Science and Technology
Priority to CN201710543433.5A
Publication of CN107328777A
Legal status: Pending

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/22 Matching criteria, e.g. proximity measures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/26 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • G06V10/273 Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion removing elements interfering with the pattern to be recognised
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/20 Image preprocessing
    • G06V10/30 Noise filtering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/48 Extraction of image or video features by mapping characteristic values of the pattern into a parameter space, e.g. Hough transformation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00 Features of devices classified in G01N21/00
    • G01N2201/02 Mechanical
    • G01N2201/021 Special mounting in general
    • G01N2201/0216 Vehicle borne

Abstract

The invention discloses a method and device for measuring atmospheric visibility at night, in the field of visibility detection. The device includes a camera, headlamps and an intelligent module. The camera is mounted on a camera frame; the headlamps are located in the region ahead of the camera and mounted on lamp brackets. The intelligent module is connected to the control end of the headlamps and switches them on and off; it is also connected to the camera, controls the camera to take photographs or video, and collects and processes the data gathered by the camera to obtain the night-time atmospheric visibility. Through video acquisition, image processing and image analysis, the method and device detect visibility at night, in particular the dynamic detection of visibility on expressways at night; operation is simple, and operating and maintenance costs are low.

Description

Method and device for measuring atmospheric visibility at night
Technical field
The present invention relates to the field of visibility detection, and in particular to a method and device for measuring atmospheric visibility at night.
Background art
Visibility is a key factor affecting traffic safety. Compared with daytime, a significant proportion of traffic accidents occur at night, especially on expressways: when fog reduces visibility and speed is not lowered accordingly, accidents easily occur. Accurate and efficient detection of night-time visibility can therefore give drivers a reference for safe driving and reduce the probability of traffic accidents.
Because of the limited lighting conditions, visibility is harder to measure at night than during the day. At present the common detection methods are visual estimation and instrument measurement. Visual estimation is subjective and poorly standardized, and cannot meet the need for round-the-clock, real-time detection. Instrument measurement mainly uses atmospheric transmissometers, scatterometers and automatic laser visibility meters; these instruments can generally only measure visibility over a small range, are unsuitable for dynamic detection of visibility on an expressway at night without external light sources, and in practice the equipment is complex and cumbersome to operate, with high operating and maintenance costs.
In recent years, detecting visibility from video images has become a research hotspot. Algorithms for video-based visibility detection fall mainly into three classes: the dual differential luminance method, the camera self-calibration method and the dark channel prior method. All three are suited to daytime visibility and cannot be used to detect visibility at night. In contrast to daytime, observation of a road scene in fog at night depends largely on artificial light sources. On the basis of the dual differential luminance method, some researchers have proposed a dual light source method for night-time visibility: by measuring the brightness of two light sources of identical intrinsic brightness after attenuation through air columns of different lengths, the atmospheric extinction coefficient is obtained and the visibility value derived. This method requires a fixed test site with fixed illumination at night, and the uniformity of the two light sources must be guaranteed; it is therefore unsuitable for dynamic detection of visibility on an expressway at night without external light sources.
Summary of the invention
The technical problem to be solved by the invention is to provide a method and device for measuring atmospheric visibility at night which, through video acquisition, image processing and image analysis, detect visibility at night, in particular the dynamic detection of visibility on expressways, with simple operation and low operating and maintenance costs.
To solve the above technical problem, the technical solution adopted is as follows. A device for measuring atmospheric visibility at night includes a camera, headlamps and an intelligent module. The camera is mounted on a camera frame; the headlamps are located in the region ahead of the camera and mounted on lamp brackets. The intelligent module is connected to the control end of the headlamps and switches them on and off; it is also connected to the camera, controls the camera to take photographs or video, and collects and processes the data gathered by the camera to obtain the night-time atmospheric visibility.
In a further technical scheme, the camera is fixed by the camera frame to the front of the roof of a car, the headlamps are the car's headlights, and the intelligent module is a handheld terminal.
In a further technical scheme, the method proceeds as follows:
S1 establish a standard library of gray-level images;
S2 obtain the gray-level image corresponding to the visibility to be measured;
S3 compare the gray-level image corresponding to the visibility to be measured with the standard library;
S4 determine the visibility to be measured.
Step S1, establishing the standard library of gray-level images, includes the following steps.
S101 image acquisition
A frame is selected at random from the data gathered by the intelligent module from the camera (1).
S102 grayscale conversion
The image acquired in step S101 is converted to grayscale by the weighted average method. The conversion formula is
P(i,j) = W_R·R(i,j) + W_G·G(i,j) + W_B·B(i,j)
where P(i,j) is the gray value of pixel (i,j) in the converted gray-level image; R(i,j), G(i,j) and B(i,j) are the red, green and blue components; and W_R, W_G and W_B are the weights of the red, green and blue components, with W_R + W_G + W_B = 1.
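As an illustration, the weighted-average conversion of step S102 can be sketched in pure Python, with nested lists standing in for an RGB image (the helper name `to_gray` and the toy image are ours, not from the patent; a real implementation would operate on NumPy/OpenCV arrays):

```python
def to_gray(rgb_image, wr=0.30, wg=0.59, wb=0.11):
    """Weighted-average grayscale: P(i,j) = W_R*R(i,j) + W_G*G(i,j) + W_B*B(i,j)."""
    assert abs(wr + wg + wb - 1.0) < 1e-9  # the weights must sum to 1
    return [[wr * r + wg * g + wb * b for (r, g, b) in row]
            for row in rgb_image]

# A 2x2 RGB "image": pure red, green, blue and white pixels.
img = [[(255, 0, 0), (0, 255, 0)],
       [(0, 0, 255), (255, 255, 255)]]
gray = to_gray(img)  # white maps to 255.0; red to about 76.5
```

With the default weights, a white pixel keeps its full intensity while red, green and blue contribute in proportion to the eye's sensitivity.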
In a further technical scheme, step S1 also includes the following steps.
S103 averaging
For a given visibility, FN frames are acquired according to acquisition step S101 and each is converted to grayscale according to step S102; the resulting FN gray-level images are then averaged to obtain the gray-level image for that visibility. The averaging formula is P(i,j) = Σ_m P_m(i,j) / FN, m = 1, 2, ..., FN, where P_m is the gray value of pixel (i,j) in the m-th gray-level image obtained after the S102 grayscale conversion.
S104 establish the gray-level images for the different reference visibilities
Under each reference visibility, the gray-level image for that visibility is obtained according to averaging step S103, and together these images form the standard library of gray-level images.
In a further technical scheme, FN = 5, W_R = 0.30, W_G = 0.59, W_B = 0.11.
In a further technical scheme, step S2, obtaining the gray-level image corresponding to the visibility to be measured, includes the following steps.
S201 acquire the image for the visibility to be measured
Under the preset test conditions, one frame is acquired according to acquisition step S101.
S202 grayscale conversion of the image for the visibility to be measured
The image acquired in step S201 is converted to grayscale according to step S102.
S203 non-local means denoising
The gray-level image obtained in step S202 is denoised by non-local means, yielding the gray-level image corresponding to the visibility to be measured.
In a further technical scheme, step S2 also includes the following step.
S204 lane-line removal
If the gray-level image obtained in denoising step S203 contains lane lines, they are removed: Canny edge detection extracts the edge information of the lane lines, the Hough transform detects the straight lines, and the lane lines are extracted and removed, giving a lane-line-free gray-level image corresponding to the visibility to be measured.
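The line-detection half of S204 can be illustrated with a minimal Hough accumulator over already-extracted edge points (a pure-Python sketch of the voting idea only; a real implementation would use OpenCV's `cv2.Canny` and `cv2.HoughLines`, and the function name and thresholds here are ours):

```python
import math

def hough_lines(edge_points, n_theta=180, threshold=5):
    """Minimal Hough transform: vote each edge point (x, y) into
    (rho, theta) bins and keep bins that reach the vote threshold."""
    acc = {}
    for x, y in edge_points:
        for t in range(n_theta):
            theta = math.pi * t / n_theta
            rho = round(x * math.cos(theta) + y * math.sin(theta))
            acc[(rho, t)] = acc.get((rho, t), 0) + 1
    return {(rho, t): v for (rho, t), v in acc.items() if v >= threshold}

# Five collinear edge points on the vertical line x = 3, as a lane-line
# edge might produce after Canny detection.
points = [(3, y) for y in range(5)]
lines = hough_lines(points)
# The bin (rho=3, theta index 0) collects all five votes.
```

Once the dominant (rho, theta) bins are known, the pixels lying on those lines can be masked out of the gray-level image, which is the removal step the patent describes.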
In a further technical scheme, step S3 compares the gray-level image corresponding to the visibility to be measured with each gray-level image in the standard library in turn, computing a zero-mean normalized cross-correlation coefficient for each. The formula is
ZNCC_i = (I'_0)^T I'_i / (N·√(D(I_0)·D(I_i)))
where I_0 is the gray vector of the input image and I'_0 = I_0 − E(I_0); I_i is the gray vector of the i-th reference image and I'_i = I_i − E(I_i); N is the number of pixels; and E, D and T denote the mean, the variance and the transpose respectively.
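The S3 comparison reduces to correlating mean-removed gray vectors. A minimal sketch in pure Python (the exact normalisation constant in the patent's formula was lost in extraction, so the standard ZNCC form is assumed; `zncc` is our name):

```python
def zncc(a, b):
    """Zero-mean normalized cross-correlation of two equal-length gray vectors."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    da = [x - ma for x in a]   # I'_0 = I_0 - E(I_0)
    db = [y - mb for y in b]   # I'_i = I_i - E(I_i)
    num = sum(x * y for x, y in zip(da, db))
    den = (sum(x * x for x in da) * sum(y * y for y in db)) ** 0.5
    return num / den if den else 0.0

# Identical shapes correlate perfectly even under brightness/contrast changes.
v = [10.0, 20.0, 30.0, 40.0]
w = [2 * x + 5 for x in v]   # affine transform of v
score = zncc(v, w)           # 1.0 up to rounding
```

Invariance to affine gray-level changes is what makes ZNCC suitable here: the reference images and the test image were shot under different absolute illumination.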
In a further technical scheme, step S4 determines the visibility to be measured as the visibility corresponding to the largest zero-mean normalized cross-correlation value obtained in comparison step S3.
In a further technical scheme, step S4 determines the range of the visibility to be measured as the range between the visibilities corresponding to the two largest zero-mean normalized cross-correlation values obtained in comparison step S3.
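Both S4 variants (the single best match, or the range spanned by the top two matches) amount to ranking the ZNCC scores. A sketch with illustrative names and made-up scores (the infinity library entry is omitted to keep the keys numeric):

```python
def determine_visibility(scores):
    """scores: {visibility_in_metres: zncc_value} against the standard library.
    Returns (best_match, (low, high)): the single best visibility and the
    range spanned by the two highest-scoring reference images."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    best = ranked[0]
    low, high = sorted(ranked[:2])
    return best, (low, high)

# Example scores against a reference library (values are made up).
scores = {20: 0.41, 50: 0.55, 100: 0.82, 200: 0.88, 500: 0.60}
best, vis_range = determine_visibility(scores)  # 200, (100, 200)
```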
The beneficial effects of the above technical solution are as follows.
First, operation is simple and operating and maintenance costs are low. The equipment used is a camera and headlamps, so the demands on the light source are low and no special light source is needed; moreover, the camera and headlamps are adjacent, so no long air column and no distant light source are required. The equipment is therefore easy to operate and cheap to run and maintain.
Second, visibility can be detected in both fixed and mobile modes. For fixed detection, the camera and headlamps are mounted on the ground and the visibility of the current environment can be measured; for mobile detection, the camera is fixed by the camera frame to the front of the car roof, the headlamps are the car's headlights, the handheld terminal is fixed at the front of the car, and the visibility of the region can be detected dynamically.
Third, the averaging in S103 reduces or eliminates the influence of external and accidental factors on the gray-level image. With FN = 5, such influences are well suppressed while system efficiency remains good, avoiding the slower response that an excessively large FN would cause.
Fourth, because the human eye is most sensitive to green and least sensitive to blue, extensive experiments taking agreement with human vision as the criterion determined the optimal weight combination W_R = 0.30, W_G = 0.59, W_B = 0.11.
Fifth, the non-local means denoising in S203 reduces or eliminates the influence of external and accidental factors on the gray-level image corresponding to the visibility to be measured.
Sixth, the lane-line removal in S204 eliminates the influence of lane lines on the gray-level image corresponding to the visibility to be measured.
Seventh, in summary, with the technical solution of the invention, visibility can be detected in fixed or mobile mode; detection is fast, efficient and accurate; operation is simple; and operating and maintenance costs are low.
Brief description of the drawings
The invention is described in further detail below with reference to the accompanying drawings, tables and embodiments.
Fig. 1 is a structural diagram of the device of the invention;
Fig. 2 shows four gray-level images from the standard library of the invention;
Fig. 3 is the gray-level image corresponding to the visibility to be measured in embodiment 2 of the invention;
Fig. 4 is the lane-line-free gray-level image corresponding to the visibility to be measured in embodiment 2 of the invention;
Fig. 5 shows the zero-mean normalized cross-correlation values from comparing the gray-level image corresponding to the visibility to be measured with the standard library in embodiment 2 of the invention.
Key: 1 camera; 2 headlamp; 3 horizontal distance; 4 vertical distance.
Detailed description of the embodiments
The technical solutions in the embodiments of the invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are evidently only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from these embodiments without creative effort fall within the scope of protection of the invention.
Many details are set out in the following description to facilitate a thorough understanding of the invention, but the invention can also be implemented in ways other than those described here, and those skilled in the art can make similar extensions without departing from its spirit; the invention is therefore not limited to the specific embodiments disclosed below.
Embodiment 1:
As shown in Fig. 1, embodiment 1 discloses a device for measuring atmospheric visibility at night. It includes a camera 1, headlamps 2 and an intelligent module; the intelligent module includes a controller, an arithmetic unit, a memory and a display. The camera 1 is fixed to the ground by a camera frame; the headlamps 2 are located in front of and below the camera 1 and fixed to the ground by lamp brackets. The controller is connected to the control end of the headlamps 2 and switches them on and off; it is also connected to the camera 1 and controls it to take photographs or video. The memory collects the data gathered by the camera 1, the arithmetic unit processes those data, and the display reports the detected night-time atmospheric visibility.
There are 2 headlamps 2 and 1 camera 1. The horizontal distance 3 between the camera 1 and the headlamps 2 is 1.4 metres, and the vertical distance 4 is 0.5 metres; the normal of the camera 1 lens is level with the line of the horizontal distance 3, with an allowable error of ±5 degrees; the distance between the two headlamps 2 is 1.6 m ± 0.1 m; and the headlamps 2 comply with GB7454-87 "Technical regulations for the use and beam adjustment of motor vehicle headlamps".
Embodiment 2:
Embodiment 2 discloses a device for measuring atmospheric visibility at night. It includes a camera 1, headlamps 2 and a handheld terminal. The camera 1 is fixed by a camera frame to the front of the car roof; the headlamps 2 are the car's headlights; the handheld terminal is fixed at the front of the car. The handheld terminal is connected to the headlight control end and switches the headlights on and off; it is also connected to the camera 1, controls it to take photographs or video, and collects and processes the data gathered by the camera 1 to obtain the night-time atmospheric visibility.
There are 2 headlights and 1 camera 1. The horizontal distance 3 between the camera 1 and the headlights is 1.4 metres, and the vertical distance 4 is 0.5 metres; the normal of the camera 1 lens is horizontal, with an allowable error of ±5 degrees; the distance between the two headlights is 1.6 m ± 0.1 m; and the headlights comply with GB7454-87 "Technical regulations for the use and beam adjustment of motor vehicle headlamps".
Embodiment 3:
As shown in Fig. 2 embodiment 3 discloses a kind of method for measuring atmospheric visibility at night, including step is as follows:
S1 sets up gray level image java standard library
Current environment is night depletion region, does not interfere with thing, the scene without other lighting apparatus.It is with embodiment 1 Basis, in the case where visibility is respectively the weather of 20 meters, 50 meters, 100 meters, 200 meters, 500 meters and infinity, shoots correspondence respectively The video image of visibility.
Scheme respectively from visibility to be gathered in 20 meters, 50 meters, 100 meters, 200 meters, 500 meters and the video image of infinity As 5 frames, the frame of image 30 is gathered altogether, and gray processing processing, average treatment, so as to form ash are carried out successively respectively to the image selected Spend graphics standard storehouse.
Below, using visibility as 20 meters of situation, further is made to collection image, gray processing processing and average process step Explain.
S101 image acquisition
A frame is selected at random by the controller from the data, stored in the memory, that the camera 1 gathered at 20 metres visibility.
S102 grayscale conversion
The arithmetic unit converts the image acquired in step S101 to grayscale by the weighted average method. The conversion formula is
P(i,j) = W_R·R(i,j) + W_G·G(i,j) + W_B·B(i,j)
where P(i,j) is the gray value of pixel (i,j) in the converted gray-level image; R(i,j), G(i,j) and B(i,j) are the red, green and blue components; and W_R, W_G and W_B are the weights of the red, green and blue components, with W_R + W_G + W_B = 1. Because the human eye is most sensitive to green and least sensitive to blue, extensive experiments taking agreement with human vision as the criterion determined the optimal weight combination W_R = 0.30, W_G = 0.59, W_B = 0.11.
S103 averaging
At 20 metres visibility, acquisition step S101 and grayscale-conversion step S102 are repeated in turn 5 times, giving 5 gray-level images P_1, P_2, P_3, P_4, P_5. These are averaged to obtain the gray-level image for 20 metres visibility. The averaging formula is P(i,j) = Σ_m P_m(i,j)/5, m = 1, 2, 3, 4, 5, where P_m is the gray value of pixel (i,j) in the m-th gray-level image obtained after the S102 grayscale conversion.
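The pixel-wise averaging of S103 can be sketched as follows (pure Python, frames as nested lists; the function name and the toy frames are ours):

```python
def average_frames(frames):
    """P(i,j) = sum_m P_m(i,j) / FN over FN equally sized gray frames."""
    fn = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[i][j] for f in frames) / fn for j in range(cols)]
            for i in range(rows)]

# Five 1x2 gray frames; averaging suppresses frame-to-frame fluctuation.
frames = [[[10, 20]], [[12, 18]], [[11, 22]], [[9, 20]], [[13, 20]]]
mean_frame = average_frames(frames)  # [[11.0, 20.0]]
```

Averaging FN independent frames reduces the standard deviation of zero-mean noise by a factor of √FN, which is the rationale for this step.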
S104 establish the gray-level images for the different reference visibilities
For visibilities of 20 metres, 50 metres, 100 metres, 200 metres, 500 metres and infinity respectively, the corresponding gray-level image is obtained according to averaging step S103, and together these images form the standard library of gray-level images.
S2 obtain the gray-level image corresponding to the visibility to be measured
Using equipment of the same model and parameter settings as in step S1, the visibility of the current open area at night is measured; the area has no obstructions and no other lighting equipment.
S201 acquire the image for the visibility to be measured
Under the preset test conditions, one frame is acquired according to acquisition step S101.
S202 grayscale conversion of the image for the visibility to be measured
The image acquired in step S201 is converted to grayscale according to step S102.
S203 non-local means denoising
The gray-level image obtained in step S202 is denoised by non-local means, yielding the gray-level image corresponding to the visibility to be measured. The denoising algorithm is based on image self-similarity: when computing the similarity weights, an image block centred on each pixel, rather than the single pixel itself, is used for the gray-value comparison; the block centred on the pixel being processed is called the similarity window. Let the image before denoising be P,
P = {p(i) | i ∈ I}
and let the similarity window N_i of pixel i be the image sub-block centred on i. The denoised image is
P̂(i) = Σ_{j∈I} w(i,j)·p(j)
where P̂(i) is the filtered value at pixel i in the denoised image, and the weight w(i,j) depends on the degree of similarity between the windows N_i and N_j.
The weights satisfy 0 ≤ w(i,j) ≤ 1 and Σ_j w(i,j) = 1, and are given by
w(i,j) = (1/Z(i))·exp(−‖u(N_i) − u(N_j)‖²₂,ₐ / h²)
where Z(i) is the normalization factor
Z(i) = Σ_j exp(−‖u(N_i) − u(N_j)‖²₂,ₐ / h²)
Here u(N_i) denotes the set of pixels in the similarity window N_i, ‖·‖₂,ₐ is the Gauss-weighted Euclidean distance with Gaussian standard deviation a > 0, and h is the filtering parameter controlling the smoothness of the image.
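The non-local means idea can be illustrated compactly on a 1-D signal (a pure-Python sketch: a uniform rather than Gauss-weighted patch distance is used for brevity, patches are clamped at the borders, and all names other than h are ours):

```python
import math

def nlm_1d(signal, half_window=1, h=10.0):
    """Non-local means on a 1-D signal: each sample becomes a weighted
    average of all samples, weights from similarity of patches N_i, N_j."""
    n = len(signal)

    def patch(i):  # similarity window N_i, clamped at the borders
        return [signal[min(max(i + k, 0), n - 1)]
                for k in range(-half_window, half_window + 1)]

    out = []
    for i in range(n):
        pi = patch(i)
        weights = []
        for j in range(n):
            pj = patch(j)
            d2 = sum((a - b) ** 2 for a, b in zip(pi, pj)) / len(pi)
            weights.append(math.exp(-d2 / (h * h)))
        z = sum(weights)  # normalization factor Z(i), so weights sum to 1
        out.append(sum(w * s for w, s in zip(weights, signal)) / z)
    return out

noisy = [10.0, 10.4, 9.6, 10.2, 9.8, 10.0]
smooth = nlm_1d(noisy)  # values pulled toward the common level of about 10
```

Unlike a local mean filter, the weights come from patch similarity, so repeated structures reinforce each other while dissimilar regions contribute little.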
S3 compare the gray-level image corresponding to the visibility to be measured with the standard library
The gray-level image corresponding to the visibility to be measured is compared with each gray-level image in the standard library in turn by computing the corresponding zero-mean normalized cross-correlation coefficient (ZNCC) for each, using the formula
ZNCC_i = (I'_0)^T I'_i / (N·√(D(I_0)·D(I_i)))
where I_0 is the gray vector of the input image and I'_0 = I_0 − E(I_0); I_i is the gray vector of the i-th reference image and I'_i = I_i − E(I_i); N is the number of pixels; and E, D and T denote the mean, the variance and the transpose respectively.
S4 determine the visibility to be measured
The visibility to be measured is the visibility corresponding to the gray-level image in the standard library with the highest zero-mean normalized cross-correlation coefficient against the gray-level image corresponding to the visibility to be measured, i.e. the visibility corresponding to the largest of the 6 ZNCC values from comparison step S3.
Embodiment 4:
As shown in Figs. 2-4, embodiment 4 discloses a method for measuring atmospheric visibility at night, with the following steps.
S2 obtain the gray-level image corresponding to the visibility to be measured
On the basis of embodiment 2, with the car travelling on an expressway, the preset test environment contains lane lines and no other lighting equipment.
The image for the visibility to be measured is acquired according to step S201 of embodiment 3 and converted to grayscale according to step S202; after non-local means denoising step S203, the following step is carried out.
S204 lane-line removal
Canny edge detection extracts the edge information of the lane lines, the Hough transform detects the straight lines, and the lane lines are extracted and removed, giving a lane-line-free gray-level image corresponding to the visibility to be measured.
The gray-level image corresponding to the visibility to be measured is then compared with the standard library according to comparison step S3 of embodiment 3, the standard library being the one established in embodiment 3.
S4 determine the visibility range to be measured
The visibility range to be measured is the range between the visibilities corresponding to the two gray-level images in the standard library with the highest zero-mean normalized cross-correlation coefficients against the gray-level image corresponding to the visibility to be measured, i.e. the range between the visibilities corresponding to the two largest of the 6 ZNCC values from comparison step S3.
As shown in Fig. 5, the ZNCC values corresponding to visibilities of 100 metres and 200 metres are the highest, so the visibility range is determined to be 100-200 metres. According to Article 81 of the implementing regulations of China's Road Traffic Safety Law, when a motor vehicle travelling on an expressway encounters low-visibility weather such as fog, rain, snow, dust or hail, the following rules apply: when visibility is below 200 metres the speed limit is 60 km/h; below 100 metres, 40 km/h; below 50 metres, 20 km/h. Given the detected visibility range, the driver is therefore advised to keep the speed below 60 km/h.
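The speed rules quoted above are simple thresholds and can be encoded directly (the function name and the choice to key the advice on the lower bound of the detected range are ours, not from the patent):

```python
def advised_speed_limit(visibility_m):
    """Expressway speed limit (km/h) for a given visibility in metres,
    per the quoted Article 81 rules; None when no fog rule applies."""
    if visibility_m < 50:
        return 20
    if visibility_m < 100:
        return 40
    if visibility_m < 200:
        return 60
    return None

# For a detected range of 100-200 m, conservative advice uses the
# lower bound of the range: 100 m is below 200 m, so 60 km/h applies.
advice = advised_speed_limit(100)  # 60 km/h
```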
The data in Fig. 5 are a set of test data obtained with the technical solution of embodiment 4 when the announced visibility was 170 metres, showing that the technical solution of the invention can conveniently and effectively detect visibility on an expressway dynamically.

Claims (10)

1. A device for measuring atmospheric visibility at night, characterised in that it includes a camera (1), headlamps (2) and an intelligent module; the camera (1) is fixed by a camera frame; the headlamps (2) are located in the region ahead of the camera (1) and fixed by lamp brackets; the intelligent module is connected to the control end of the headlamps (2) and switches them on and off; and the intelligent module is connected to the camera (1), controls the camera (1) to take photographs or video, and collects and processes the data gathered by the camera (1) to obtain the night-time atmospheric visibility.
2. The device for measuring atmospheric visibility at night according to claim 1, characterised in that the camera (1) is fixed by the camera frame to the front of the car roof, the headlamps (2) are the car's headlights, and the intelligent module is a handheld terminal.
3. a kind of method for measuring atmospheric visibility at night, it is characterised in that:Step is as follows,
S1 sets up gray level image java standard library;
S2 obtains the corresponding gray level image of visibility to be measured;
S3 compares the corresponding gray level image of visibility to be measured and gray level image java standard library;
S4 determines visibility to be measured;
wherein establishing the grayscale image standard library in S1 comprises the following steps:
S101, image acquisition:
one frame is randomly selected from the data collected by the intelligent module from the camera (1);
S102, grayscale conversion:
the image obtained in the S101 image acquisition step is converted to grayscale using the weighted-average method, with the conversion formula
P(i, j) = W_R·R(i, j) + W_G·G(i, j) + W_B·B(i, j)
where P(i, j) is the gray value of pixel (i, j) in the converted grayscale image; R(i, j), G(i, j), and B(i, j) are the red, green, and blue components, respectively; and W_R, W_G, and W_B are the weights of the red, green, and blue components, satisfying W_R + W_G + W_B = 1.
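As a minimal sketch of the weighted-average grayscale conversion in step S102, using the weights later fixed in claim 5 (the function name and the tiny test image are illustrative, not from the patent):

```python
import numpy as np

# Weights from claim 5, satisfying W_R + W_G + W_B = 1
W_R, W_G, W_B = 0.30, 0.59, 0.11

def to_gray(rgb: np.ndarray) -> np.ndarray:
    """Weighted-average conversion: P(i,j) = W_R*R + W_G*G + W_B*B."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return W_R * r + W_G * g + W_B * b

# Pure-red 2x2 image: every gray value becomes 0.30 * 100 = 30
frame = np.zeros((2, 2, 3))
frame[..., 0] = 100.0
gray = to_gray(frame)
```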
4. The method for measuring atmospheric visibility at night according to claim 3, characterized in that establishing the grayscale image standard library in S1 further comprises the following steps:
S103, averaging:
under a specific visibility, FN frames are acquired according to the S101 image acquisition step, and each frame is converted to grayscale according to the S102 grayscale conversion step; the resulting FN grayscale frames are then averaged to obtain the grayscale image for that specific visibility; the averaging formula is P(i, j) = Σ P_m(i, j)/FN, where m = 1, 2, 3, 4, 5, … and P_m is the gray value of pixel (i, j) in the m-th grayscale frame obtained from the S102 grayscale conversion;
S104, establishing grayscale images for different specific visibilities:
under each different specific visibility, the corresponding grayscale image is obtained according to the S103 averaging step, thereby forming the grayscale image standard library.
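Step S103's frame averaging can be sketched as follows, with FN = 5 as in claim 5 (the synthetic constant frames are illustrative only):

```python
import numpy as np

def average_gray_frames(frames):
    """P(i,j) = sum_m P_m(i,j) / FN over FN grayscale frames, which
    suppresses per-frame sensor noise for one calibrated visibility."""
    stack = np.stack(frames, axis=0).astype(np.float64)
    return stack.mean(axis=0)

# FN = 5 frames per claim 5 (tiny synthetic frames: every pixel averages to 30)
frames = [np.full((2, 2), v) for v in (10.0, 20.0, 30.0, 40.0, 50.0)]
avg = average_gray_frames(frames)
```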
5. The method for measuring atmospheric visibility at night according to claim 4, characterized in that FN = 5, W_R = 0.30, W_G = 0.59, and W_B = 0.11.
6. The method for measuring atmospheric visibility at night according to claim 3, characterized in that obtaining the grayscale image corresponding to the visibility to be measured in S2 comprises the following steps:
S201, acquiring an image of the visibility to be measured:
under the preset test environment, one frame is acquired according to the S101 image acquisition step;
S202, grayscale conversion of the image of the visibility to be measured:
the image obtained in the S201 acquisition step is converted to grayscale according to the S102 grayscale conversion step;
S203, non-local means denoising:
the grayscale image of the preset test environment obtained in the S202 grayscale conversion step is denoised by non-local means, yielding the grayscale image corresponding to the visibility to be measured.
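The patent does not give its non-local means parameters. The sketch below is a bare-bones NLM denoiser with assumed patch size, search window, and filtering strength, intended only to show the principle behind step S203: each pixel is replaced by a weighted average of pixels whose surrounding patches look similar.

```python
import numpy as np

def nlm_denoise(img, patch=1, search=2, h=10.0):
    """Minimal non-local means sketch. Weights are
    exp(-||patch difference||^2 / h^2); all parameters are
    illustrative assumptions, not the patent's values."""
    pad = patch + search
    padded = np.pad(img.astype(np.float64), pad, mode='reflect')
    out = np.zeros(img.shape, dtype=np.float64)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            ci, cj = i + pad, j + pad
            ref = padded[ci - patch:ci + patch + 1, cj - patch:cj + patch + 1]
            wsum, acc = 0.0, 0.0
            # Search window around the current pixel
            for di in range(-search, search + 1):
                for dj in range(-search, search + 1):
                    ni, nj = ci + di, cj + dj
                    cand = padded[ni - patch:ni + patch + 1,
                                  nj - patch:nj + patch + 1]
                    w = np.exp(-np.sum((ref - cand) ** 2) / h ** 2)
                    wsum += w
                    acc += w * padded[ni, nj]
            out[i, j] = acc / wsum
    return out

# A constant image has identical patches everywhere, so it is unchanged.
flat = np.full((4, 4), 7.0)
out = nlm_denoise(flat)
```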
7. The method for measuring atmospheric visibility at night according to claim 6, characterized in that obtaining the grayscale image corresponding to the visibility to be measured in S2 further comprises the following step:
S204, lane-line removal:
if a lane line is present in the grayscale image obtained in the S203 non-local means denoising step, it is removed as follows: Canny edge detection is applied to obtain the edge information of the lane line, straight lines are detected by the Hough transform, and the lane line is extracted and removed, yielding a grayscale image, free of lane lines, corresponding to the visibility to be measured.
8. The method for measuring atmospheric visibility at night according to claim 3, characterized in that comparing the grayscale image corresponding to the visibility to be measured with the grayscale image standard library in S3 comprises comparing that grayscale image with each grayscale image in the standard library and computing the zero-mean normalized cross-correlation (ZNCC) for each, with the formula
R(I'_0, I'_i) = [E(I'_0 · I'_i) − E(I'_0) · E(I'_i)] / [D(I'_0) · D(I'_i)] = (I'_0 · I'_i^T) / (N · D(I'_0) · D(I'_i))
where I_0 and I'_0 are the gray vectors of the input image before and after mean removal, I_i and I'_i are the gray vectors of the i-th reference image before and after mean removal, and E, D, and T denote the mean, the variance, and the transpose, respectively.
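The ZNCC of claim 8 can be sketched directly in NumPy. This is the usual ZNCC form, normalizing by the standard deviations of the mean-removed gray vectors so the result lies in [-1, 1]; it assumes non-constant images (nonzero denominator), and the function name is illustrative.

```python
import numpy as np

def zncc(img0, img_i):
    """Zero-mean normalized cross-correlation between the image under
    test (img0) and one standard-library image (img_i). Assumes
    neither image is constant (denominator > 0)."""
    a = img0.astype(np.float64).ravel()
    b = img_i.astype(np.float64).ravel()
    a -= a.mean()  # mean-removed gray vector I'_0
    b -= b.mean()  # mean-removed gray vector I'_i
    denom = np.sqrt(np.sum(a ** 2)) * np.sqrt(np.sum(b ** 2))
    return float(np.dot(a, b) / denom)

x = np.arange(9.0).reshape(3, 3)
s_same = zncc(x, x)    # identical images correlate perfectly: 1.0
s_anti = zncc(x, -x)   # negated image anti-correlates: -1.0
```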
9. The method for measuring atmospheric visibility at night according to claim 8, characterized in that determining the visibility to be measured in S4 comprises taking, among the S3 comparisons of the grayscale image corresponding to the visibility to be measured against the grayscale image standard library, the visibility corresponding to the maximum zero-mean normalized cross-correlation value as the visibility to be measured.
10. The method for measuring atmospheric visibility at night according to claim 8, characterized in that determining the visibility to be measured in S4 comprises taking the range between the visibilities corresponding to the two largest zero-mean normalized cross-correlation values in the S3 comparisons as the range of the visibility to be measured.
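The selection rules of claims 9 and 10 amount to an argmax over the per-visibility ZNCC scores. A sketch with illustrative scores (the score values below are made up, not the Figure 5 data):

```python
def estimate_visibility(scores):
    """Claim 9: the single best match. `scores` maps calibrated
    visibility (meters) -> ZNCC value against the image under test."""
    return max(scores, key=scores.get)

def estimate_visibility_range(scores):
    """Claim 10: the two highest-scoring entries bound the visibility."""
    top2 = sorted(scores, key=scores.get, reverse=True)[:2]
    return min(top2), max(top2)

# Illustrative scores: 100 m and 200 m score highest, so the range is 100-200 m
scores = {50: 0.61, 100: 0.88, 200: 0.83, 500: 0.40}
best = estimate_visibility(scores)
rng = estimate_visibility_range(scores)
```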
CN201710543433.5A 2017-07-05 2017-07-05 A kind of method and device that atmospheric visibility is measured at night Pending CN107328777A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710543433.5A CN107328777A (en) 2017-07-05 2017-07-05 A kind of method and device that atmospheric visibility is measured at night

Publications (1)

Publication Number Publication Date
CN107328777A true CN107328777A (en) 2017-11-07

Family

ID=60196399

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710543433.5A Pending CN107328777A (en) 2017-07-05 2017-07-05 A kind of method and device that atmospheric visibility is measured at night

Country Status (1)

Country Link
CN (1) CN107328777A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101281142A (en) * 2007-12-28 2008-10-08 深圳先进技术研究院 Method for measuring atmosphere visibility
CN101957309A (en) * 2010-08-17 2011-01-26 招商局重庆交通科研设计院有限公司 All-weather video measurement method for visibility
CN102854138A (en) * 2012-03-23 2013-01-02 中国气象局北京城市气象研究所 Visibility measuring system and method based on digital camera shooting method
CN102930508A (en) * 2012-08-30 2013-02-13 西安电子科技大学 Image residual signal based non-local mean value image de-noising method
CN103839234A (en) * 2014-02-21 2014-06-04 西安电子科技大学 Double-geometry nonlocal average image denoising method based on controlled nuclear
CN104634784A (en) * 2013-11-08 2015-05-20 中国电信股份有限公司 Atmospheric visibility monitoring method and device
CN104978715A (en) * 2015-05-11 2015-10-14 中国科学院光电技术研究所 Non-local mean value image denoising method based on filter window and parameter adaption
CN105335947A (en) * 2014-05-26 2016-02-17 富士通株式会社 Image de-noising method and image de-noising apparatus


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108181307A (en) * 2017-12-06 2018-06-19 中国气象局北京城市气象研究所 A kind of Visibility Measures System and method
CN109241831A (en) * 2018-07-26 2019-01-18 东南大学 A kind of greasy weather at night visibility classification method based on image analysis
CN109241831B (en) * 2018-07-26 2021-11-26 东南大学 Night fog visibility classification method based on image analysis
CN109584575A (en) * 2018-12-19 2019-04-05 山东交通学院 A kind of road safety speed limit prompt system and method based on visibility analysis

Similar Documents

Publication Publication Date Title
CN106919915B (en) Map road marking and road quality acquisition device and method based on ADAS system
CN104266823B (en) Based on safety depending on the tunnel portal section lighting criteria measuring method and its system on daytime recognized
CN104011737B (en) Method for detecting mist
CN105424655B (en) A kind of visibility detecting method based on video image
CN105261018B (en) Visibility detecting method based on optical model and dark primary priori theoretical
RU2571368C1 (en) Device for detecting three-dimensional objects, method of detecting three-dimensional objects
CN103714343B (en) Under laser line generator lighting condition, the pavement image of twin-line array collected by camera splices and homogenizing method
CN107315095B (en) More vehicle automatic speed-measuring methods with illumination adaptability based on video processing
CN103808723A (en) Exhaust gas blackness automatic detection device for diesel vehicles
CN105574552A (en) Vehicle ranging and collision early warning method based on monocular vision
CN107240079A (en) A kind of road surface crack detection method based on image procossing
CN101936900A (en) Video-based visibility detecting system
JP2007234019A (en) Vehicle image area specifying device and method for it
CN104634784B (en) atmospheric visibility monitoring method and device
CN105139347A (en) Polarized image defogging method combined with dark channel prior principle
CN107328777A (en) A kind of method and device that atmospheric visibility is measured at night
CN105628194B (en) A kind of road lighting quality on-site measurement method
CN106778534A (en) Surrounding environment method for recognition of lamplight in a kind of vehicle traveling
CN105466556A (en) Tunnel brightness detection system
CN110645973A (en) Vehicle positioning method
CN112419745A (en) Highway group fog early warning system based on degree of depth fusion network
CN105631825B (en) Based on the image defogging method for rolling guiding
CN108229447B (en) High beam light detection method based on video stream
CN201740736U (en) Visibility detection system based on video
Hautière et al. Experimental validation of dedicated methods to in-vehicle estimation of atmospheric visibility distance

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20171107