CN107464418B - Intelligent traffic management system - Google Patents
Intelligent traffic management system
- Publication number
- CN107464418B (application CN201710714546.7A)
- Authority
- CN
- China
- Prior art keywords
- image
- defogged
- evaluation
- processing
- foggy day
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/01—Detecting movement of traffic to be counted or controlled
- G06T5/73—
Abstract
The invention provides an intelligent traffic management system comprising a monitoring platform and a user terminal that exchange data via wireless communication. The monitoring platform comprises a camera, which acquires monitoring images, and an image sharpening subsystem, which processes the monitoring images captured in foggy weather. The image sharpening subsystem comprises an image storage module for storing the foggy-day images acquired by the camera, an image processing module for defogging the stored foggy-day images, and a performance evaluation module for evaluating the performance of the image processing module. The beneficial effects of the invention are that it sharpens foggy-day images, evaluates the processing performance, and thereby improves the level of traffic management.
Description
Technical Field
The invention relates to the technical field of traffic management, in particular to an intelligent traffic management system.
Background
With the development of urban traffic, higher requirements are placed on traffic management: a monitoring platform must be able to acquire usable monitoring information even under low visibility.
In a foggy environment, the scattering of light by atmospheric particles severely degrades images acquired by outdoor vision systems, causing reduced contrast, color distortion and loss of detail. Taking video monitoring as an example, fog greatly reduces the visibility of the monitored site, so the video information obtained by the vision system is often not accurate enough, which greatly inconveniences monitoring work. In addition, with increasing environmental pollution in recent years, haze has become more frequent and causes the same problems as fog. In this disclosure, foggy and hazy conditions are collectively referred to as foggy days. Prior-art processing of foggy-day images is poor, and the processing effect cannot be evaluated effectively.
Disclosure of Invention
In view of the above problems, the present invention aims to provide an intelligent traffic management system.
The purpose of the invention is realized by adopting the following technical scheme:
the intelligent traffic management system comprises a monitoring platform and a user terminal that exchange data via wireless communication. The monitoring platform comprises a camera, which acquires monitoring images, and an image sharpening subsystem, which processes the monitoring images captured in foggy weather. The image sharpening subsystem comprises an image storage module for storing the foggy-day images acquired by the camera, an image processing module for defogging the stored foggy-day images, and a performance evaluation module for evaluating the performance of the image processing module.
The invention has the beneficial effects that: the method realizes the clear processing of the foggy day image and the evaluation of the processing performance, and improves the traffic management level.
Drawings
The invention is further illustrated by the accompanying drawings; however, the embodiments shown in the drawings do not limit the invention in any way, and a person skilled in the art can derive other drawings from the following drawings without inventive effort.
FIG. 1 is a schematic structural view of the present invention;
reference numerals:
monitoring platform 1, user terminal 2.
Detailed Description
The invention is further described with reference to the following examples.
Referring to FIG. 1, the intelligent traffic management system of this embodiment includes a monitoring platform 1 and a user terminal 2 that exchange data via wireless communication. The monitoring platform 1 includes a camera, which acquires monitoring images, and an image sharpening subsystem, which processes the monitoring images captured in foggy weather. The image sharpening subsystem includes an image storage module for storing the foggy-day images acquired by the camera, an image processing module for defogging the stored foggy-day images, and a performance evaluation module for evaluating the performance of the image processing module.
This embodiment sharpens foggy-day images, evaluates the processing performance, and improves the level of traffic management.
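The module structure described above can be sketched as a simple processing pipeline. The class and method names below are illustrative, not taken from the patent, and the contrast-stretch "defogger" is only a placeholder for the two-model fusion described later:

```python
import numpy as np

class ImageStorageModule:
    """Stores foggy-day frames acquired by the camera (in-memory stand-in)."""
    def __init__(self):
        self.frames = []

    def store(self, frame):
        self.frames.append(np.asarray(frame, dtype=float))

class ImageProcessingModule:
    """Placeholder defogger: a contrast stretch stands in for the two-model fusion."""
    def defog(self, frame):
        lo, hi = frame.min(), frame.max()
        return (frame - lo) / (hi - lo) if hi > lo else frame

class PerformanceEvaluationModule:
    """Scores a defogged frame; here, simply its contrast (std of intensities)."""
    def evaluate(self, frame):
        return float(frame.std())

class MonitoringPlatform:
    """Wires camera output through storage, defogging and evaluation."""
    def __init__(self):
        self.storage = ImageStorageModule()
        self.processor = ImageProcessingModule()
        self.evaluator = PerformanceEvaluationModule()

    def handle_frame(self, frame):
        self.storage.store(frame)
        clear = self.processor.defog(self.storage.frames[-1])
        return clear, self.evaluator.evaluate(clear)
```

The point of the sketch is the data flow (camera frame, storage, defogging, evaluation), not the placeholder algorithms.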
Preferably, the image processing module comprises a first defogging unit, a second defogging unit and a fusion defogging unit. The first defogging unit processes the foggy-day image according to a first atmospheric scattering model to obtain a primary defogged image; the second defogging unit processes the foggy-day image according to a second atmospheric scattering model to obtain a secondary defogged image; and the fusion defogging unit fuses the primary and secondary defogged images to obtain a defogged clear image.
The preferred embodiment achieves a good defogging effect.
Preferably, the foggy-day image is processed according to the first atmospheric scattering model in the following manner:
step 1, establish the first atmospheric scattering model:
L(x) = R_1(x)z(x) + B[1 - z(x)]
in the above formula, x represents the spatial coordinate of an image pixel, L(x) represents the acquired foggy-day image, B represents the atmospheric illumination, z(x) represents the medium propagation function, which reflects the penetration ability of light, and R_1(x) represents the primary defogged image;
step 2, obtain the dark primary color image of R_1(x):
R_1^{da}(x) = min_{y∈Ω_a(x)} min_{c∈{r,g,b}} R_1^c(y)
in the above formula, R_1^{da}(x) represents the dark primary color image of R_1(x), R_1^c(y) represents one channel of R_1(y), r, g, b represent the red, green and blue channels of the image R_1(x), and Ω_a(x) represents a square region centered at x with side length a, where a is 2% of the longest side of the image;
step 3, take the minimum of the first atmospheric scattering model over the local region Ω_a(x) and over each channel:
min_{y∈Ω_a(x)} min_c [L^c(y)/B^c] = z(x) · min_{y∈Ω_a(x)} min_c [R_1^c(y)/B^c] + 1 - z(x)
in the above formula, L^c(y) denotes a channel of the foggy-day image, and B^c denotes a channel of the atmospheric illumination;
step 4, since by the dark primary color principle R_1^{da}(x) tends to 0, the medium propagation function can be obtained:
z(x) = 1 - δ · min_{y∈Ω_a(x)} min_c [L^c(y)/B^c]
in the above formula, δ is a fog retention factor used to adjust the degree of fog retained;
the first defogging unit of this preferred embodiment has simplified the atmosphere scattering model in the past through establishing first atmosphere scattering model, helps improving computational efficiency, and when filtering first atmosphere scattering model, the filtering area carries out self-adjustment according to the image size, has guaranteed the defogging effect of different images.
Preferably, the foggy-day image is processed according to the second atmospheric scattering model in the following manner:
step 1, establish the second atmospheric scattering model:
L(x) = R_2(x)[1 - q(x)/B] + q(x)
in the above formula, x represents the spatial coordinate of an image pixel, L(x) represents the acquired foggy-day image, B represents the atmospheric illumination, q(x) represents the atmospheric dissipation function, which reflects the influence of ambient light on scene imaging, and R_2(x) represents the secondary defogged image;
step 2, perform a white balance operation on the foggy-day image using the following formula:
L'(x) = L(x)/B
where L'(x) represents the foggy-day image after the white balance operation; the minimum color component of the white-balanced foggy-day image is defined as the atmospheric dissipation function:
q(x) = min_{c∈{r,g,b}} L'^c(x)
in the above formula, L'^c(x) denotes a color channel of the white-balanced foggy-day image, and the minimum is taken over the three color channels r, g, b; the secondary defogged image is then recovered as
R_2(x) = [L'(x) - q(x)] / [1 - q(x)];
fusion processing is then performed on the primary defogged image and the secondary defogged image to obtain R(x), the defogged clear image.
The second defogging unit of this preferred embodiment simplifies earlier atmospheric scattering models by establishing the second atmospheric scattering model, which helps improve computational efficiency; obtaining the fog-free clear image through the fusion algorithm further improves the defogging effect and yields a clearer image.
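Under the same assumptions as before (B known, RGB values normalized to [0, 1]), the second model and the fusion step can be sketched as follows. The equal-weight average is an illustrative choice of this sketch, not the patent's fusion formula, and the clipping of q(x) is a numerical safeguard:

```python
import numpy as np

def dehaze_second_model(img, atmos):
    """White-balance L'(x) = L(x)/B, take the minimum color component as the
    atmospheric dissipation function q(x), then invert the model for R2(x)."""
    lw = img / atmos                       # white balance: divide out atmospheric light
    q = lw.min(axis=2)                     # q(x) = min over r, g, b of L'^c(x)
    q = np.clip(q, 0.0, 0.99)              # keep the denominator away from zero
    return (lw - q[..., None]) / (1.0 - q[..., None])

def fuse(r1, r2, w=0.5):
    """Illustrative fusion: convex combination of the two defogged images."""
    return w * r1 + (1.0 - w) * r2
```

On a uniformly gray frame the minimum color component equals every channel, so the recovered R2 is zero everywhere; color differences between channels survive the subtraction, which is what lets scene detail through.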
Preferably, the performance evaluation module comprises a first evaluation unit, a second evaluation unit and a fusion evaluation unit. The first evaluation unit determines a first evaluation factor of the defogged clear image; the second evaluation unit determines a second evaluation factor of the defogged clear image; and the fusion evaluation unit evaluates the defogged clear image according to the first and second evaluation factors.
The preferred embodiment realizes the accurate evaluation of the performance of the image processing module by a multi-evaluation factor fusion mode.
Preferably, the first evaluation factor P_1 of the defogged clear image is determined from n_1 and n_2, which represent the numbers of visible edges in the acquired image and in the defogged clear image, respectively, so that P_1 increases as defogging restores more visible edges;
the second evaluation factor P_2 of the defogged clear image is determined from m_1 and m_2, which represent the numbers of black pixel points and white pixel points in the defogged clear image, respectively.
The evaluation of the defogged clear image specifically comprises: calculating a comprehensive evaluation factor P from the first evaluation factor and the second evaluation factor; the larger the comprehensive evaluation factor, the better the defogging effect and the clearer the image.
At present, the defogging effect is evaluated mainly by subjective visual inspection: repeated experiments are required in which observers rate image quality, which is time- and labor-consuming, is easily influenced by subjective factors such as an observer's professional background, and therefore lacks reliability. The preferred embodiment quantitatively describes the sharpening effect of the defogging algorithm and realizes objective evaluation of the defogging effect; because the evaluation module combines several evaluation factors, the evaluation is highly reliable.
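A simple objective proxy for the two evaluation factors can be sketched as follows. The gradient threshold and the black/white cutoffs are illustrative parameters, and the n_2/n_1 edge ratio is this sketch's stand-in for P_1 (the patent's exact formulas are not reproduced here):

```python
import numpy as np

def visible_edges(gray, thresh=0.01):
    """Count pixels whose gradient magnitude exceeds a visibility threshold."""
    gy, gx = np.gradient(gray)
    return int(np.count_nonzero(np.hypot(gx, gy) > thresh))

def edge_gain(hazy, dehazed, thresh=0.01):
    """n2 / n1: visible edges after vs. before defogging (stand-in for P1)."""
    n1 = max(visible_edges(hazy, thresh), 1)   # guard against n1 = 0
    return visible_edges(dehazed, thresh) / n1

def saturation_count(gray, lo=0.02, hi=0.98):
    """m1 + m2: pixels clipped to near-black or near-white (inputs to P2)."""
    return int(np.count_nonzero(gray < lo) + np.count_nonzero(gray > hi))
```

A low-contrast hazy frame has few gradients above the threshold, so a successful defogging run yields an edge gain above 1, while a large saturation count flags over-aggressive contrast stretching.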
The monitoring platform described above was used to process five groups of foggy-day images, denoted image groups 1 to 5, each containing 10 foggy-day images. The average image defogging time and the image defogging effect of each group were measured; compared with the prior art, the results show the following benefits:
average reduction of image defogging time | Image defogging effect improvement | |
Group of images 1 | 29% | 21% |
Image 2 group | 27% | 23% |
Image 3 group | 26% | 25% |
4 groups of images | 25% | 27% |
Image 5 group | 24% | 29% |
Finally, it should be noted that the above embodiments are intended only to illustrate the technical solutions of the present invention, not to limit its scope of protection. Although the invention has been described in detail with reference to preferred embodiments, those skilled in the art will understand that modifications or equivalent substitutions may be made to the technical solutions of the invention without departing from their spirit and scope.
Claims (2)
1. An intelligent traffic management system, characterized by comprising a monitoring platform and a user terminal, wherein the monitoring platform and the user terminal exchange data via wireless communication; the monitoring platform comprises a camera and an image sharpening subsystem, the camera being used to acquire monitoring images and the image sharpening subsystem being used to process the monitoring images captured in foggy weather; the image sharpening subsystem comprises an image storage module for storing the foggy-day images acquired by the camera, an image processing module for defogging the stored foggy-day images, and a performance evaluation module for evaluating the performance of the image processing module;
the image processing module comprises a first defogging unit, a second defogging unit and a fusion defogging unit, wherein the first defogging unit is used for processing a foggy image according to a first atmospheric scattering model to obtain a primary defogged image, the second defogging unit is used for processing the foggy image according to a second atmospheric scattering model to obtain a secondary defogged image, and the fusion defogging unit is used for conducting fusion processing on the primary defogged image and the secondary defogged image to obtain a defogged clear image;
the foggy day image is processed according to the first atmospheric scattering model by adopting the following method:
step 1, establishing a first atmospheric scattering model:
L(x)=1+R1(x)z(x)+B[1-z(x)]
in the above formula, x represents the spatial coordinates of the image pixel, l (x) represents the acquired foggy day image, B represents the atmospheric illumination, z (x) represents the medium propagation function for reflecting the light penetration ability, and R (x) represents the medium propagation function1(x) Representing a primary defogged image;
step 2, obtaining R1(x) Dark primary color image of (1):
in the above formula, R1 da(x) Represents R1(x) Dark primary color image of R1 c(y) represents R1 da(x) One channel of (1), R, g, b, respectively, represents the image R1(x) A red channel, a green channel and a blue channel,representing a square area with x as the center and a side length a, wherein a is 2% of the maximum side of the image;
step 3, carrying out local area size on the first atmospheric scattering model to obtainAnd minimum for each channel:
in the above formula, Lc(y) denotes a channel in the foggy day image, BcA channel representing atmospheric illumination;
step 4, known from the principle of dark primary color, R1 da(x) Tend to be0, the medium propagation function can be obtained:
in the formula, delta is a fog retention factor and is used for adjusting the degree of retained fog;
and the foggy-day image is processed according to the second atmospheric scattering model by adopting the following method:
step 1, establishing the second atmospheric scattering model:
L(x) = R_2(x)[1 - q(x)/B] + q(x)
in the above formula, x represents the spatial coordinate of an image pixel, L(x) represents the acquired foggy-day image, B represents the atmospheric illumination, q(x) represents the atmospheric dissipation function, which reflects the influence of ambient light on scene imaging, and R_2(x) represents the secondary defogged image;
step 2, performing a white balance operation on the foggy-day image by adopting the following formula:
L'(x) = L(x)/B
wherein L'(x) represents the foggy-day image after the white balance operation; the minimum color component of the white-balanced foggy-day image is defined as the atmospheric dissipation function:
q(x) = min_{c∈{r,g,b}} L'^c(x)
in the above formula, L'^c(x) denotes a color channel of the white-balanced foggy-day image, and the minimum is taken over the three color channels r, g, b; the secondary defogged image is then recovered as
R_2(x) = [L'(x) - q(x)] / [1 - q(x)];
fusion processing is performed on the primary defogged image and the secondary defogged image to obtain R(x), the defogged clear image;
the performance evaluation module comprises a first evaluation unit, a second evaluation unit and a fusion evaluation unit, wherein the first evaluation unit determines a first evaluation factor of the defogged clear image, the second evaluation unit determines a second evaluation factor of the defogged clear image, and the fusion evaluation unit evaluates the defogged clear image according to the first and second evaluation factors;
the first evaluation factor P_1 of the defogged clear image is determined from n_1 and n_2, which represent the numbers of visible edges in the acquired image and in the defogged clear image, respectively;
the second evaluation factor P_2 of the defogged clear image is determined from m_1 and m_2, which represent the numbers of black pixel points and white pixel points in the defogged clear image, respectively.
2. The intelligent traffic management system according to claim 1, wherein the evaluation of the defogged clear image specifically comprises: calculating a comprehensive evaluation factor P from the first evaluation factor and the second evaluation factor, wherein the larger the comprehensive evaluation factor, the better the defogging effect and the clearer the image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710714546.7A CN107464418B (en) | 2017-08-18 | 2017-08-18 | Intelligent traffic management system |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107464418A CN107464418A (en) | 2017-12-12 |
CN107464418B true CN107464418B (en) | 2021-03-19 |
Family
ID=60550085
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710714546.7A Active CN107464418B (en) | 2017-08-18 | 2017-08-18 | Intelligent traffic management system |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107464418B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103021177A (en) * | 2012-11-05 | 2013-04-03 | 北京理工大学 | Method and system for processing traffic monitoring video image in foggy day |
KR20140083602A (en) * | 2012-12-26 | 2014-07-04 | 금오공과대학교 산학협력단 | Device and method for visibility enhancement using fusion of dehazing and retinex |
CN103955905A (en) * | 2014-05-13 | 2014-07-30 | 北京邮电大学 | Rapid wavelet transformation and weighted image fusion single-image defogging method |
WO2014174765A1 (en) * | 2013-04-26 | 2014-10-30 | コニカミノルタ株式会社 | Image capture device and image capture method |
CN104715623A (en) * | 2015-04-08 | 2015-06-17 | 王蕾 | Congestion index detection system for traffic intersection in front of signal lamp |
CN104867121A (en) * | 2015-06-08 | 2015-08-26 | 武汉理工大学 | Fast image defogging method based on dark channel prior and Retinex theory |
CN105574819A (en) * | 2015-06-25 | 2016-05-11 | 宇龙计算机通信科技(深圳)有限公司 | Real-time image defogging method and apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
2021-01-29 | TA01 | Transfer of patent application right | Address after: West block, 4th floor, building 702, PengJi Industrial Zone, Liantang, Xianhu Road, Luohu District, Shenzhen, Guangdong 518001. Applicant after: SHENZHEN PENGCHENG TRANSPORTATION NETWORK Co.,Ltd. Address before: No. 27, Dongqiao East Road, Zengjiang street, Zengcheng District, Guangzhou, Guangdong 510000. Applicant before: Pan Jinwen
| GR01 | Patent grant | |