CN107464418B - Intelligent traffic management system - Google Patents

Intelligent traffic management system Download PDF

Info

Publication number
CN107464418B
CN107464418B CN201710714546.7A CN201710714546A CN107464418B CN 107464418 B CN107464418 B CN 107464418B CN 201710714546 A CN201710714546 A CN 201710714546A CN 107464418 B CN107464418 B CN 107464418B
Authority
CN
China
Prior art keywords
image
defogged
evaluation
processing
foggy day
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710714546.7A
Other languages
Chinese (zh)
Other versions
CN107464418A (en
Inventor
潘金文
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SHENZHEN PENGCHENG TRANSPORTATION NETWORK Co.,Ltd.
Original Assignee
Shenzhen Pengcheng Transportation Network Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Pengcheng Transportation Network Co ltd filed Critical Shenzhen Pengcheng Transportation Network Co ltd
Priority to CN201710714546.7A priority Critical patent/CN107464418B/en
Publication of CN107464418A publication Critical patent/CN107464418A/en
Application granted granted Critical
Publication of CN107464418B publication Critical patent/CN107464418B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/01Detecting movement of traffic to be counted or controlled
    • G06T5/73

Abstract

The invention provides an intelligent traffic management system, which comprises a monitoring platform and a user terminal, wherein the monitoring platform and the user terminal complete data exchange through wireless communication, the monitoring platform comprises a camera and an image sharpening processing subsystem, the camera is used for acquiring a monitoring image, and the image sharpening processing subsystem is used for processing the monitoring image in a foggy day; the image sharpening processing subsystem comprises an image storage module, an image processing module and a performance evaluation module, wherein the image storage module is used for storing the foggy day images acquired by the camera, the image processing module is used for carrying out defogging processing on the stored foggy day images, and the performance evaluation module is used for evaluating the performance of the image processing module. The invention has the beneficial effects that: the method realizes the clear processing of the foggy day image and the evaluation of the processing performance, and improves the traffic management level.

Description

Intelligent traffic management system
Technical Field
The invention relates to the technical field of traffic management, in particular to an intelligent traffic management system.
Background
With the development of urban traffic, higher requirements are put forward on traffic management, and a monitoring platform is required to be capable of acquiring monitoring information with low visibility.
In a foggy day environment, due to the scattering effect of atmospheric particles in the air on light, a foggy day image acquired by an outdoor vision system is seriously degraded, and the conditions of contrast reduction, color distortion, detail loss and the like occur. Taking video monitoring as an example, due to the fact that fog is diffused, the visibility of a monitoring site is greatly reduced, video information obtained through a visual system is often not accurate enough, and great inconvenience is brought to monitoring work. In addition, with the increase of environmental pollution in recent years, haze is increased, and the same problem as that in the fog is brought about. In the research, the foggy days and the haze days are collectively called as the foggy days, the processing effect of the foggy day images in the prior art is poor, and the processing effect cannot be effectively evaluated.
Disclosure of Invention
In view of the above problems, the present invention aims to provide an intelligent traffic management system.
The purpose of the invention is realized by adopting the following technical scheme:
the intelligent traffic management system comprises a monitoring platform and a user terminal, wherein the monitoring platform and the user terminal complete data exchange through wireless communication, the monitoring platform comprises a camera and an image sharpening processing subsystem, the camera is used for acquiring a monitoring image, and the image sharpening processing subsystem is used for processing the monitoring image in the foggy days; the image sharpening processing subsystem comprises an image storage module, an image processing module and a performance evaluation module, wherein the image storage module is used for storing the foggy day images acquired by the camera, the image processing module is used for carrying out defogging processing on the stored foggy day images, and the performance evaluation module is used for evaluating the performance of the image processing module.
The invention has the beneficial effects that: the method realizes the clear processing of the foggy day image and the evaluation of the processing performance, and improves the traffic management level.
Drawings
The invention is further illustrated by means of the attached drawings, but the embodiments in the drawings do not constitute any limitation to the invention, and for a person skilled in the art, other drawings can be obtained on the basis of the following drawings without inventive effort.
FIG. 1 is a schematic structural view of the present invention;
reference numerals:
monitoring platform 1, user terminal 2.
Detailed Description
The invention is further described with reference to the following examples.
Referring to fig. 1, the intelligent traffic management system of the embodiment includes a monitoring platform 1 and a user terminal 2, where the monitoring platform 1 and the user terminal 2 complete data exchange through wireless communication, the monitoring platform 1 includes a camera and an image sharpening processing subsystem, the camera is used to obtain a monitoring image, and the image sharpening processing subsystem is used to process the monitoring image in a foggy day; the image sharpening processing subsystem comprises an image storage module, an image processing module and a performance evaluation module, wherein the image storage module is used for storing the foggy day images acquired by the camera, the image processing module is used for carrying out defogging processing on the stored foggy day images, and the performance evaluation module is used for evaluating the performance of the image processing module.
The embodiment realizes the clear processing of the foggy day image and the evaluation of the processing performance, and improves the traffic management level.
Preferably, the image processing module comprises a first defogging unit, a second defogging unit and a fusion defogging unit, the first defogging unit is used for processing the foggy image according to the first atmospheric scattering model to obtain a primary defogged image, the second defogging unit is used for processing the foggy image according to the second atmospheric scattering model to obtain a secondary defogged image, and the fusion defogging unit is used for conducting fusion processing on the primary defogged image and the secondary defogged image to obtain a defogged clear image.
The preferred embodiment has good defogging effect.
Preferably, the processing of the foggy day image according to the first atmospheric scattering model is performed in the following manner:
step 1, establishing a first atmospheric scattering model:
L(x)=1+R1(x)z(x)+B[1-z(x)]
in the above formula, x represents the spatial coordinates of the image pixel, l (x) represents the acquired foggy day image, B represents the atmospheric illumination, z (x) represents the medium propagation function for reflecting the light penetration ability, and R (x) represents the medium propagation function1(x) Representing a primary defogged image;
step 2, obtaining R1(x) Dark primary color image of (1):
Figure BDA0001383486150000021
in the above formula, R1 da(x) Represents R1(x) Dark primary color image of R1 c(y) represents R1 da(x) One channel of (1), R, g, b, respectively, represents the image R1(x) A red channel, a green channel and a blue channel,
Figure BDA0001383486150000022
representing a square area with x as the center and a side length a, wherein a is 2% of the maximum side of the image;
step 3, carrying out local area size on the first atmospheric scattering model to obtain
Figure BDA0001383486150000023
And minimum for each channel:
Figure BDA0001383486150000031
in the above formula, Lc(y) denotes a channel in the foggy day image, BcA channel representing atmospheric illumination;
step 4, known from the principle of dark primary color, R1 da(x) Towards 0, the medium propagation function can be obtained:
Figure BDA0001383486150000032
in the formula, delta is a fog retention factor and is used for adjusting the degree of retained fog;
step 5, solving the primary defogged image according to the first atmospheric scattering model:
Figure BDA0001383486150000033
the first defogging unit of this preferred embodiment has simplified the atmosphere scattering model in the past through establishing first atmosphere scattering model, helps improving computational efficiency, and when filtering first atmosphere scattering model, the filtering area carries out self-adjustment according to the image size, has guaranteed the defogging effect of different images.
Preferably, the processing of the foggy day image according to the second atmospheric scattering model is performed in the following manner:
step 1, establishing a second atmospheric scattering model:
Figure BDA0001383486150000034
in the formula, x represents an image pixel space coordinate, L (x) represents an acquired foggy day image, B represents atmospheric illumination, q (x) represents an atmospheric dissipation function and is used for reflecting the influence of ambient light on scene imaging, and R2(x) Representing a secondary defogged image;
step 2, carrying out white balance operation on the foggy day image by adopting the following formula:
Figure BDA0001383486150000035
defining the minimum color component of the foggy day image as an atmospheric dissipation function:
Figure BDA0001383486150000036
in the above formula, Lc' (x) denotes the minimum color component of the foggy day image in the three color channels r, g, b;
step 3, solving a secondary defogged image according to an atmospheric dissipation function:
Figure BDA0001383486150000037
and performing fusion processing on the primary defogged image and the secondary defogged image by adopting the following formula:
Figure BDA0001383486150000038
in the above formula, r (x) represents a defogged clear image.
The second defogging unit in the preferred embodiment simplifies the previous atmospheric scattering model by establishing the second atmospheric scattering model, contributes to improving the calculation efficiency, and is favorable for improving the defogging effect and obtaining a clearer image by acquiring the fog-free clear image through the fusion algorithm.
Preferably, the performance evaluation module includes a first evaluation unit, a second evaluation unit and a fusion evaluation unit, the first evaluation unit is configured to determine a first evaluation factor of the defogged clear image, the second evaluation unit is configured to determine a second evaluation factor of the defogged clear image, and the fusion evaluation unit is configured to evaluate the defogged clear image according to the first evaluation factor and the second evaluation factor.
The preferred embodiment realizes the accurate evaluation of the performance of the image processing module by a multi-evaluation factor fusion mode.
Preferably, the determining the first evaluation factor of the defogged clear image specifically includes:
Figure BDA0001383486150000041
in the above formula, P1First evaluation factor, n, for a haze-free sharp image1And n2Respectively representing the number of visible edges in the collected image and the defogged clear image;
the determining of the second evaluation factor of the defogged clear image specifically comprises the following steps:
Figure BDA0001383486150000042
in the above formula, P2Second evaluation factor, m, for a haze-free sharp image1And m2Respectively representing the number of black pixel points and white pixel points of the defogged clear image.
The evaluation of the defogged clear image specifically comprises the following steps: calculating a comprehensive evaluation factor P according to the first evaluation factor and the second evaluation factor:
Figure BDA0001383486150000043
the larger the comprehensive evaluation factor is, the better the defogging effect is, and the clearer the image is.
At present, subjective visual evaluation is mainly used for evaluating the defogging effect, repeated experiments are needed for many times, an observer evaluates the image quality, time and labor are consumed, the evaluation is easily influenced by subjective factors such as the professional background of the observer, and the evaluation method does not have good reliability. The optimized embodiment quantitatively describes the clear effect of the defogging algorithm, realizes the objective evaluation of the defogging effect, and the evaluation module comprehensively considers various evaluation factors, so that the evaluation reliability is high.
The monitoring platform is adopted to process the foggy day images, 5 groups of foggy day images are selected to be processed, the images are respectively an image 1 group, an image 2 group, an image 3 group, an image 4 group and an image 5 group, each group contains 10 foggy day images, and the average value of the image defogging time and the image defogging effect of each group are counted, compared with the prior art, the monitoring platform has the following beneficial effects:
average reduction of image defogging time Image defogging effect improvement
Group of images 1 29% 21%
Image 2 group 27% 23%
Image 3 group 26% 25%
4 groups of images 25% 27%
Image 5 group 24% 29%
Finally, it should be noted that the above embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the protection scope of the present invention, although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions can be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (2)

1. An intelligent traffic management system is characterized by comprising a monitoring platform and a user terminal, wherein the monitoring platform and the user terminal complete data exchange through wireless communication, the monitoring platform comprises a camera and an image sharpening processing subsystem, the camera is used for acquiring a monitoring image, and the image sharpening processing subsystem is used for processing the monitoring image in the foggy days; the image sharpening processing subsystem comprises an image storage module, an image processing module and a performance evaluation module, wherein the image storage module is used for storing the foggy day images acquired by the camera, the image processing module is used for carrying out defogging processing on the stored foggy day images, and the performance evaluation module is used for evaluating the performance of the image processing module;
the image processing module comprises a first defogging unit, a second defogging unit and a fusion defogging unit, wherein the first defogging unit is used for processing a foggy image according to a first atmospheric scattering model to obtain a primary defogged image, the second defogging unit is used for processing the foggy image according to a second atmospheric scattering model to obtain a secondary defogged image, and the fusion defogging unit is used for conducting fusion processing on the primary defogged image and the secondary defogged image to obtain a defogged clear image;
the foggy day image is processed according to the first atmospheric scattering model by adopting the following method:
step 1, establishing a first atmospheric scattering model:
L(x)=1+R1(x)z(x)+B[1-z(x)]
in the above formula, x represents the spatial coordinates of the image pixel, l (x) represents the acquired foggy day image, B represents the atmospheric illumination, z (x) represents the medium propagation function for reflecting the light penetration ability, and R (x) represents the medium propagation function1(x) Representing a primary defogged image;
step 2, obtaining R1(x) Dark primary color image of (1):
Figure FDA0002629722580000011
in the above formula, R1 da(x) Represents R1(x) Dark primary color image of R1 c(y) represents R1 da(x) One channel of (1), R, g, b, respectively, represents the image R1(x) A red channel, a green channel and a blue channel,
Figure FDA0002629722580000012
representing a square area with x as the center and a side length a, wherein a is 2% of the maximum side of the image;
step 3, carrying out local area size on the first atmospheric scattering model to obtain
Figure FDA0002629722580000013
And minimum for each channel:
Figure FDA0002629722580000014
in the above formula, Lc(y) denotes a channel in the foggy day image, BcA channel representing atmospheric illumination;
step 4, known from the principle of dark primary color, R1 da(x) Tend to be0, the medium propagation function can be obtained:
Figure FDA0002629722580000015
in the formula, delta is a fog retention factor and is used for adjusting the degree of retained fog;
step 5, solving the primary defogged image according to the first atmospheric scattering model:
Figure FDA0002629722580000021
and processing the foggy day image according to the second atmospheric scattering model by adopting the following method:
step 1, establishing a second atmospheric scattering model:
Figure FDA0002629722580000022
in the formula, x represents an image pixel space coordinate, L (x) represents an acquired foggy day image, B represents atmospheric illumination, q (x) represents an atmospheric dissipation function and is used for reflecting the influence of ambient light on scene imaging, and R2(x) Representing a secondary defogged image;
step 2, carrying out white balance operation on the foggy day image by adopting the following formula:
Figure FDA0002629722580000023
l' (x) represents a foggy day image after the white balance operation;
defining the minimum color component of the foggy day image as an atmospheric dissipation function:
Figure FDA0002629722580000024
in the above formula, Lc' (x) representsThe minimum color components of the foggy day image in the r, g and b color channels;
step 3, solving a secondary defogged image according to an atmospheric dissipation function:
Figure FDA0002629722580000025
and performing fusion processing on the primary defogged image and the secondary defogged image by adopting the following formula:
Figure FDA0002629722580000026
in the above formula, r (x) represents a defogged clear image;
the performance evaluation module comprises a first evaluation unit, a second evaluation unit and a fusion evaluation unit, wherein the first evaluation unit is used for determining a first evaluation factor of the defogged clear image, the second evaluation unit is used for determining a second evaluation factor of the defogged clear image, and the fusion evaluation unit is used for evaluating the defogged clear image according to the first evaluation factor and the second evaluation factor;
the determining of the first evaluation factor of the defogged clear image specifically comprises the following steps:
Figure FDA0002629722580000027
in the above formula, P1First evaluation factor, n, for a haze-free sharp image1And n2Respectively representing the number of visible edges in the collected image and the defogged clear image;
the determining of the second evaluation factor of the defogged clear image specifically comprises the following steps:
Figure FDA0002629722580000028
in the above formula, P2Second evaluation factor, m, for a haze-free sharp image1And m2Respectively representing the number of black pixel points and white pixel points of the defogged clear image.
2. The intelligent traffic management system according to claim 1, wherein the evaluation of the defogged clear images is specifically: calculating a comprehensive evaluation factor P according to the first evaluation factor and the second evaluation factor:
Figure FDA0002629722580000031
the larger the comprehensive evaluation factor is, the better the defogging effect is, and the clearer the image is.
CN201710714546.7A 2017-08-18 2017-08-18 Intelligent traffic management system Active CN107464418B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710714546.7A CN107464418B (en) 2017-08-18 2017-08-18 Intelligent traffic management system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710714546.7A CN107464418B (en) 2017-08-18 2017-08-18 Intelligent traffic management system

Publications (2)

Publication Number Publication Date
CN107464418A CN107464418A (en) 2017-12-12
CN107464418B true CN107464418B (en) 2021-03-19

Family

ID=60550085

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710714546.7A Active CN107464418B (en) 2017-08-18 2017-08-18 Intelligent traffic management system

Country Status (1)

Country Link
CN (1) CN107464418B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103021177A (en) * 2012-11-05 2013-04-03 北京理工大学 Method and system for processing traffic monitoring video image in foggy day
KR20140083602A (en) * 2012-12-26 2014-07-04 금오공과대학교 산학협력단 Device and method for visibility enhancement using fusion of dehazing and retinex
CN103955905A (en) * 2014-05-13 2014-07-30 北京邮电大学 Rapid wavelet transformation and weighted image fusion single-image defogging method
WO2014174765A1 (en) * 2013-04-26 2014-10-30 コニカミノルタ株式会社 Image capture device and image capture method
CN104715623A (en) * 2015-04-08 2015-06-17 王蕾 Congestion index detection system for traffic intersection in front of signal lamp
CN104867121A (en) * 2015-06-08 2015-08-26 武汉理工大学 Fast image defogging method based on dark channel prior and Retinex theory
CN105574819A (en) * 2015-06-25 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Real-time image defogging method and apparatus

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103021177A (en) * 2012-11-05 2013-04-03 北京理工大学 Method and system for processing traffic monitoring video image in foggy day
KR20140083602A (en) * 2012-12-26 2014-07-04 금오공과대학교 산학협력단 Device and method for visibility enhancement using fusion of dehazing and retinex
WO2014174765A1 (en) * 2013-04-26 2014-10-30 コニカミノルタ株式会社 Image capture device and image capture method
CN103955905A (en) * 2014-05-13 2014-07-30 北京邮电大学 Rapid wavelet transformation and weighted image fusion single-image defogging method
CN104715623A (en) * 2015-04-08 2015-06-17 王蕾 Congestion index detection system for traffic intersection in front of signal lamp
CN104867121A (en) * 2015-06-08 2015-08-26 武汉理工大学 Fast image defogging method based on dark channel prior and Retinex theory
CN105574819A (en) * 2015-06-25 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Real-time image defogging method and apparatus

Also Published As

Publication number Publication date
CN107464418A (en) 2017-12-12

Similar Documents

Publication Publication Date Title
CN106910175B (en) Single image defogging algorithm based on deep learning
CN101908210B (en) Method and system for color image defogging treatment
CN110378849B (en) Image defogging and rain removing method based on depth residual error network
CN102170574B (en) Real-time video defogging system
CN102063706B (en) Rapid defogging method
CN107292830B (en) Low-illumination image enhancement and evaluation method
CN108765342A (en) A kind of underwater image restoration method based on improvement dark
CN103049888A (en) Image/video demisting method based on combination of dark primary color of atmospheric scattered light
TW201610912A (en) Method and system for image haze removal based on hybrid dark channel prior
CN105374013A (en) Method and image processing apparatus for image visibility restoration on the base of dual dark channel prior
CN109993804A (en) A kind of road scene defogging method generating confrontation network based on condition
CN110544213A (en) Image defogging method based on global and local feature fusion
CN109389569B (en) Monitoring video real-time defogging method based on improved DehazeNet
CN105989583B (en) A kind of image defogging method
CN102609909A (en) Method and device for defogging single image
CN108629750A (en) A kind of night defogging method, terminal device and storage medium
CN103778605A (en) Greasy weather image enhancement method
Ji et al. Real-time enhancement of the image clarity for traffic video monitoring systems in haze
CN107464418B (en) Intelligent traffic management system
CN110197465B (en) Foggy image enhancement method
CN110807743B (en) Image defogging method based on convolutional neural network
CN112070691A (en) Image defogging method based on U-Net
CN117197068A (en) Mist concentration estimation method, device, equipment and storage medium
CN104537623A (en) Image fog-removing method and device based on image segmentation
CN107454319A (en) Image processing method, device, mobile terminal and computer-readable recording medium

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20210129

Address after: 518001 west block, 4th floor, building 702, PengJi Industrial Zone, Liantang, Xianhu Road, Luohu District, Shenzhen City, Guangdong Province

Applicant after: SHENZHEN PENGCHENG TRANSPORTATION NETWORK Co.,Ltd.

Address before: No.27, Dongqiao East Road, Zengjiang street, Zengcheng District, Guangzhou, Guangdong 510000

Applicant before: Pan Jinwen

GR01 Patent grant
GR01 Patent grant