CN111340151A - Weather phenomenon recognition system and method for assisting automatic driving of vehicle - Google Patents

Weather phenomenon recognition system and method for assisting automatic driving of vehicle

Info

Publication number
CN111340151A
CN111340151A (application CN202010446015.6A)
Authority
CN
China
Prior art keywords
weather
image
vehicle
sky
ground
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010446015.6A
Other languages
Chinese (zh)
Other versions
CN111340151B (en)
Inventor
夏景明 (Xia Jingming)
宣大伟 (Xuan Dawei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
YANCHENG DONGFANG TIANCHENG MACHINERY Co.,Ltd.
Original Assignee
Nanjing University of Information Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing University of Information Science and Technology filed Critical Nanjing University of Information Science and Technology
Priority to CN202010446015.6A
Publication of CN111340151A
Application granted
Publication of CN111340151B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 Interaction between the driver and the control system
    • B60W50/14 Means for informing the driver, warning the driver or prompting a driver intervention
    • B60W2050/146 Display means

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Health & Medical Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a weather phenomenon recognition system for assisting automatic driving of a vehicle. The system comprises an ARM processor together with a vehicle-mounted camera, a remote server, a vehicle-mounted LCD display screen and a communication module that are each connected to the ARM processor. The vehicle-mounted camera captures weather images of the scene around the vehicle in real time. The ARM processor divides each weather image into two parts: a sky image containing sky-region weather information and a ground image containing ground-region weather information. The remote server feeds the received weather image, sky image and ground image into a weather phenomenon recognition model and identifies the weather phenomenon type corresponding to the current weather image. Sky local features, ground local features and global features are extracted separately, fused, and passed to a classifier that recognizes six weather phenomena: sunny, cloudy, snowy, rainy, foggy, and sand and dust. This reduces the dependence on the number of samples, so a high-accuracy weather phenomenon recognition result can be obtained from a medium-sized weather image data set.

Description

Weather phenomenon recognition system and method for assisting automatic driving of vehicle
Technical Field
The invention relates to the technical field of image recognition, in particular to a weather phenomenon recognition system and method for assisting automatic driving of a vehicle.
Background
Severe weather conditions have a significant impact on vehicle operation. Adverse conditions such as rain, snow, fog, and sand and dust reduce visibility and the road friction coefficient, which can lead to traffic congestion and serious traffic accidents and therefore poses a large potential risk. A weather phenomenon recognition system that assists automatic driving can help avoid serious accidents by monitoring the weather environment in real time and making comprehensive use of traffic information, and it improves driving efficiency under severe weather. For example, speed limits can be applied in bad weather, the driver can be prompted to keep a safe following distance, and the wipers can be opened automatically in rain. Automatic recognition of weather conditions therefore has important application value in traffic early warning, driver assistance and intelligent transportation systems.
Traditional weather recognition is mainly sensor based: weather data are acquired by multiple sensors, including lidar, meteorological instruments and cameras, and the recognition result is obtained by fusing the sensor data with images. However, installing and maintaining these sensors is labor and material intensive. In addition, complex external environments degrade the accuracy of the sensors, and weather conditions vary rapidly in space and time, so a fixed sensor installation cannot recognize the weather everywhere in real time.
In recent years, with the development of intelligent transportation systems, monitoring devices have been installed along roads, and weather recognition methods based on image processing and machine vision have gradually emerged. With the rapid progress of deep learning, convolutional neural networks (CNNs) perform remarkably well on machine vision tasks such as image classification, object detection and semantic segmentation. A CNN can extract rich, abstract and deep semantic information from a weather image and therefore outperforms traditional weather recognition methods to a large extent. However, such deep learning methods usually require very large data sets and can only be trained effectively on high-end GPUs, which makes them expensive for weather condition recognition. In addition, because of the complexity of background information in weather images, existing CNN models still have shortcomings in discriminating abstract weather information. For these reasons, it is currently difficult to deploy such methods widely on terminal devices in the traffic field.
Disclosure of Invention
The invention aims to provide a weather phenomenon recognition system and a method for assisting automatic driving of a vehicle.
In order to achieve the above purpose, with reference to fig. 1, the present invention provides a weather phenomenon recognition system for assisting automatic driving of a vehicle, where the recognition system includes an ARM processor, and a vehicle-mounted camera, a remote server, a vehicle-mounted LCD display screen, and a communication module, which are respectively connected to the ARM processor; the ARM processor, the vehicle-mounted camera, the vehicle-mounted LCD display screen and the communication module are carried on a vehicle;
the vehicle-mounted camera shoots weather images of a scene where the vehicle is located in real time according to a control instruction of the ARM processor, and the shot images are transmitted to the ARM processor;
after preprocessing the weather image, the ARM processor divides it into two parts, a sky image containing sky-region weather information and a ground image containing ground-region weather information, and transmits them to the remote server through the communication module;
a weather phenomenon recognition model is installed in the remote server and comprises a three-channel convolutional neural network, a feature fusion module, a fully connected layer and a Softmax classifier connected in sequence; the three-channel convolutional neural network comprises a first convolutional neural network branch for extracting sky features, a second convolutional neural network branch for extracting global features and a third convolutional neural network branch for extracting ground features; the feature fusion module fuses the sky, ground and global features extracted by the three-channel convolutional neural network; the fully connected layer and the Softmax classifier identify the weather phenomenon type from the fused feature information;
the remote server feeds the received weather image, sky image and ground image into the weather phenomenon recognition model, the weather image into the second convolutional neural network branch, the sky image into the first convolutional neural network branch and the ground image into the third convolutional neural network branch, identifies the weather phenomenon type corresponding to the current weather image, and returns the recognition result to the ARM processor through the communication module;
and the ARM processor displays the received identification result to a user through a vehicle-mounted LCD display screen.
As a preferred example, the identified weather phenomenon type is quantized and the quantized result is transmitted to an automatic driving servo system.
As a preferred example, the feature fusion module fuses the sky features, the ground features and the global features extracted by the three channels by using a Concat function.
As a preferred example, the communication module includes a 4G/5G communication device.
As a preferable example, the weather phenomenon types include six types, namely sunny days, cloudy days, snowy days, rainy days, foggy days and sand and dust.
As a preferred example, the first convolutional neural network branch and the third convolutional neural network branch adopt a CNN5 convolutional neural network composed of 5 convolutional layers and 1 Max pooling layer, and the second convolutional neural network branch adopts a Resnet15 residual network composed of 1 convolutional layer, 1 Max pooling layer and 4 sets of residual modules.
As a preferable example, the CNN5 convolutional neural network comprises, connected in sequence: an input layer [224 × 112] × 3, a convolutional layer [7 × 7]: 32/(2, 1), a Max pooling layer [7 × 7]/2, a convolutional layer [3 × 3]: 64/2, a convolutional layer [3 × 3]: 128/2, a convolutional layer [3 × 3]: 256/2 and a convolutional layer [3 × 3]: 512/2.
As a preferred example, the Resnet15 residual network comprises, connected in sequence: an input layer [224 × 224] × 3, a convolutional layer [3 × 3]: 32/2, a Max pooling layer [3 × 3]/2, a residual module [1 × 1]: 64/2, a residual module [1 × 1]: 128/2, a residual module [1 × 1]: 256/2 and a residual module [1 × 1]: 512/2. This network greatly reduces the recognition time, so the weather phenomenon recognition system can be used for automatic-driving parameter decisions with high real-time requirements.
The invention also provides a weather phenomenon recognition method for assisting automatic driving of a vehicle, the method comprising:
s1, shooting weather images in real time by adopting a vehicle-mounted camera;
s2, after the weather image is preprocessed, the weather image is divided into two parts: a sky image including sky region weather information and a ground image including ground region weather information;
s3, extracting sky features from the sky image, extracting ground features from the ground image, extracting global features from the weather image, fusing the extracted sky features, ground features and global features, importing the fused sky features, ground features and global features into a Softmax classifier for classifying weather phenomena, and identifying weather phenomenon categories;
and S4, quantizing the identified weather phenomenon type, and transmitting the quantized result to an automatic driving servo system to assist automatic driving of the vehicle.
Compared with the prior art, the technical scheme of the invention has the following remarkable beneficial effects:
(1) A three-channel convolutional neural network is used to extract sky local features, ground local features and global features separately; the features are fused and fed into a classifier that recognizes six weather phenomena (sunny, cloudy, snowy, rainy, foggy, and sand and dust). This reduces the dependence on the number of samples, so a high-accuracy weather phenomenon recognition result can be achieved on a medium-sized weather image data set.
(2) Test results show that the method achieves good recognition accuracy and speed and can be used for automatic-driving parameter decisions with high real-time requirements. Compared with conventional deep learning methods, it reaches an average recognition accuracy of 91.34% and has the fastest recognition speed, processing up to 163.3 weather images per second.
(3) Adding two CNN branches that separately extract sky and ground features greatly improves the recognition accuracy for cloudy and rainy weather.
(4) The ARM processor uses the shooting parameters of the vehicle-mounted camera to preprocess and segment the weather image, which makes segmentation more efficient.
It should be understood that all combinations of the foregoing concepts and additional concepts described in greater detail below can be considered as part of the inventive subject matter of this disclosure unless such concepts are mutually inconsistent. In addition, all combinations of claimed subject matter are considered a part of the presently disclosed subject matter.
The foregoing and other aspects, embodiments and features of the present teachings can be more fully understood from the following description taken in conjunction with the accompanying drawings. Additional aspects of the present invention, such as features and/or advantages of exemplary embodiments, will be apparent from the description which follows, or may be learned by practice of specific embodiments in accordance with the teachings of the present invention.
Drawings
The drawings are not intended to be drawn to scale. In the drawings, each identical or nearly identical component that is illustrated in various figures may be represented by a like numeral. For purposes of clarity, not every component may be labeled in every drawing. Embodiments of various aspects of the present invention will now be described, by way of example, with reference to the accompanying drawings, in which:
FIG. 1 is a schematic configuration diagram of the weather phenomenon recognition system for assisting automatic driving of a vehicle according to the present invention.
FIG. 2 shows the 3C-CNN weather phenomenon recognition model in the remote server.
FIG. 3 is the confusion matrix of 3C-CNN weather recognition.
FIG. 4 is a graph of recognition accuracy versus number of training epochs for the different methods.
FIG. 5 is a histogram comparing the average recognition accuracy of the different methods.
FIG. 6 is a histogram comparing the recognition speed of the different methods.
FIG. 7 is a schematic diagram of the medium-sized weather image dataset "weather dataset-6".
Detailed Description
In order to better understand the technical content of the present invention, specific embodiments are described below with reference to the accompanying drawings.
Detailed description of the preferred embodiment
With reference to fig. 1, the present invention provides a weather phenomenon recognition system for assisting automatic driving of a vehicle, wherein the recognition system includes an ARM processor, and a vehicle-mounted camera, a remote server, a vehicle-mounted LCD display screen and a communication module respectively connected to the ARM processor; the ARM processor, the vehicle-mounted camera, the vehicle-mounted LCD display screen and the communication module are carried on a vehicle.
The vehicle-mounted camera shoots weather images of the scene where the vehicle is located in real time according to the control instruction of the ARM processor, and the shot images are transmitted to the ARM processor.
After preprocessing the weather image, the ARM processor divides it into two parts, a sky image containing sky-region weather information and a ground image containing ground-region weather information, and transmits them to the remote server through the communication module.
A weather phenomenon recognition model is installed in the remote server and comprises a three-channel convolutional neural network, a feature fusion module, a fully connected layer and a Softmax classifier connected in sequence; the three-channel convolutional neural network comprises a first convolutional neural network branch for extracting sky features, a second convolutional neural network branch for extracting global features and a third convolutional neural network branch for extracting ground features; the feature fusion module fuses the sky, ground and global features extracted by the three-channel convolutional neural network; the fully connected layer and the Softmax classifier identify the weather phenomenon type from the fused feature information.
The remote server feeds the received weather image, sky image and ground image into the weather phenomenon recognition model, the weather image into the second convolutional neural network branch, the sky image into the first convolutional neural network branch and the ground image into the third convolutional neural network branch, identifies the weather phenomenon type corresponding to the current weather image, and returns the recognition result to the ARM processor through the communication module.
And the ARM processor displays the received identification result to a user through a vehicle-mounted LCD display screen.
The hardware of the weather phenomenon recognition system consists of the vehicle-mounted camera, the ARM processor, the remote server and the vehicle-mounted LCD display screen. The vehicle-mounted camera collects weather images while the vehicle is driving. The ARM processor receives the camera images and preprocesses them: each image is compressed and divided into an upper half, which mainly contains the sky, and a lower half, which mainly contains the ground, and the segmented sky image, ground image and original weather image are transmitted to the remote server through the 4G/5G communication module. The remote server hosts the trained weather phenomenon recognition model (the deep learning network 3C-CNN), which can recognize six weather phenomena: sunny, cloudy, snowy, rainy, foggy, and sand and dust. The 3C-CNN consists of two CNN5 convolutional neural networks and one Resnet15 residual network in parallel. The remote server feeds the received sky image, ground image and weather image into the three channels of the 3C-CNN: the upper-half sky image and the lower-half ground image go to the two CNN5 convolutional neural networks, and the original weather image goes to the Resnet15 residual network. The local features extracted by the two CNN5 channels and the global features are fused with a Concat function, and the weather recognition result is output through a fully connected layer and a Softmax classifier. The remote server returns the recognition result to the ARM processor through the 4G/5G communication module. The ARM processor quantizes the result, representing the six weather phenomena (sunny, cloudy, snowy, rainy, foggy, and sand and dust) with the digits 1-6, and transmits it through a serial port to the vehicle's automatic driving servo system, so that the vehicle can set its driving parameters automatically according to the weather. Compared with the prior art, the weather phenomenon recognition module based on the three-channel convolutional neural network accurately recognizes the six weather phenomena with high recognition accuracy and speed, and adding the two CNN branches that separately extract sky and ground features greatly improves the recognition accuracy for cloudy and rainy weather.
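To make the preprocessing step concrete, the following is a minimal sketch of the ARM-side capture, compression, sky/ground split and upload, assuming a Python environment with OpenCV and the requests library; the image size, JPEG encoding and server endpoint URL are illustrative assumptions and are not specified in the patent.

```python
# Minimal sketch of the ARM-side preprocessing described above (assumptions:
# OpenCV capture, JPEG compression, a hypothetical HTTP endpoint on the server).
import cv2
import requests

SERVER_URL = "http://remote-server.example/recognize"  # hypothetical endpoint

def capture_and_split(camera_index=0, size=(224, 224)):
    """Grab one frame, resize/compress it, and split it into sky and ground halves."""
    cap = cv2.VideoCapture(camera_index)
    ok, frame = cap.read()
    cap.release()
    if not ok:
        raise RuntimeError("camera read failed")
    frame = cv2.resize(frame, size)                  # "compress" the weather image
    h = frame.shape[0] // 2
    sky, ground = frame[:h], frame[h:]               # upper half: sky, lower half: ground
    return frame, sky, ground

def send_to_server(frame, sky, ground):
    """Encode the three images as JPEG and post them over the 4G/5G link."""
    enc = lambda img: cv2.imencode(".jpg", img)[1].tobytes()
    files = {"weather": enc(frame), "sky": enc(sky), "ground": enc(ground)}
    return requests.post(SERVER_URL, files=files, timeout=5).json()

if __name__ == "__main__":
    result = send_to_server(*capture_and_split())
    print("recognized weather class:", result)
```

The split simply halves the frame along its height, matching the description of an upper (sky) half and a lower (ground) half.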
The functional modules in the weather phenomenon recognition system are further described below with reference to the accompanying drawings.
(1) Training of the 3C-CNN neural network model on the remote server
First, the three-channel convolutional neural network 3C-CNN is built in Python on top of the Keras library; the network structure is shown in FIG. 2. The 3C-CNN contains three convolutional neural network branches, each processing an input image of a different region. The image blocks of the upper and lower regions are fed into channel 1 and channel 3 respectively; both channels are CNN5 convolutional neural networks consisting of 5 convolutional layers and 1 Max pooling layer and extract the sky features and the ground features. The whole image is fed into the residual network ResNet15 branch in channel 2 to extract global features. The local features and global features extracted by the three branches are then fused with a Concat function, and the weather recognition result is finally output through a fully connected layer and a Softmax classifier. The structural parameters of the network model are listed in Table 1.
Table 1: 3C-CNN network architecture parameters (the table is reproduced as an image in the original filing and is not included here).
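For concreteness, the following is a minimal Keras sketch of a three-channel network of the kind described above and parameterized in Table 1: two CNN5 branches for the sky and ground half-images, a ResNet15-style branch for the full image, Concat fusion, a fully connected layer and a Softmax classifier. The half-image orientation (112 × 224 × 3), the internal layout of the residual modules, the global-average-pooling used to flatten each branch and the 256-unit fully connected layer are assumptions, not the authors' exact implementation.

```python
# Sketch of a 3C-CNN-style model in Keras (an approximation of the architecture
# described above; residual-block details and flattening choices are assumed).
from tensorflow.keras import layers, models

def cnn5_branch(name, input_shape=(112, 224, 3)):
    """CNN5 branch: 5 conv layers + 1 max-pooling layer (sky or ground half-image)."""
    inp = layers.Input(shape=input_shape, name=name + "_in")
    x = layers.Conv2D(32, 7, strides=(2, 1), padding="same", activation="relu")(inp)
    x = layers.MaxPooling2D(pool_size=7, strides=2, padding="same")(x)
    for f in (64, 128, 256, 512):
        x = layers.Conv2D(f, 3, strides=2, padding="same", activation="relu")(x)
    return inp, layers.GlobalAveragePooling2D()(x)

def residual_module(x, filters):
    """One residual module with a 1x1 projection shortcut and stride 2 (assumed layout)."""
    shortcut = layers.Conv2D(filters, 1, strides=2, padding="same")(x)
    y = layers.Conv2D(filters, 3, strides=2, padding="same", activation="relu")(x)
    y = layers.Conv2D(filters, 3, padding="same")(y)
    return layers.Activation("relu")(layers.Add()([y, shortcut]))

def resnet15_branch(input_shape=(224, 224, 3)):
    """ResNet15-style global branch: 1 conv + max pooling + 4 residual modules."""
    inp = layers.Input(shape=input_shape, name="global_in")
    x = layers.Conv2D(32, 3, strides=2, padding="same", activation="relu")(inp)
    x = layers.MaxPooling2D(pool_size=3, strides=2, padding="same")(x)
    for f in (64, 128, 256, 512):
        x = residual_module(x, f)
    return inp, layers.GlobalAveragePooling2D()(x)

def build_3c_cnn(num_classes=6):
    sky_in, sky_feat = cnn5_branch("sky")
    glob_in, glob_feat = resnet15_branch()
    gnd_in, gnd_feat = cnn5_branch("ground")
    fused = layers.Concatenate()([sky_feat, glob_feat, gnd_feat])   # feature fusion (Concat)
    x = layers.Dense(256, activation="relu")(fused)                 # fully connected layer (size assumed)
    out = layers.Dense(num_classes, activation="softmax")(x)        # Softmax classifier
    return models.Model([sky_in, glob_in, gnd_in], out, name="three_channel_cnn")

model = build_3c_cnn()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```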
To address the general lack of publicly available large-scale weather data sets for image recognition, a medium-sized weather image data set, "weather dataset-6", was constructed for the weather condition recognition task; its composition is shown in FIG. 7. The data set contains 6185 weather images in total, far fewer than the number of samples required by traditional image recognition methods. The images are divided into six classes (cloudy, foggy, rainy, sand and dust, snowy and sunny), and each class is split into a training set and a test set: 4800 images are used for training and 1385 for testing. Each image is uniformly resized to 256 × 256 pixels. Most images were collected from the Internet and originate from aerial photography, surveillance cameras, news reports, traffic-accident records, dashcams and so on; they were screened as required, were shot from multiple angles and cover a variety of complex scenes, and the number of images per class is relatively large, so the data set has a degree of generalization and universality.
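The data set described above has a conventional one-folder-per-class layout, so it can be loaded and split into the three network inputs on the fly; the sketch below assumes a recent TensorFlow/Keras installation and hypothetical directory names.

```python
# Sketch of loading a "weather dataset-6"-style directory (one subfolder per class,
# images stored at 256x256) and splitting each image into the three network inputs.
# Folder names, batch size and the 224x224 resize for the global branch are assumptions.
import tensorflow as tf

IMG, HALF = 224, 112  # global input 224x224, half-images 112x224 (assumed orientation)

def make_ds(root, batch_size=32):
    base = tf.keras.utils.image_dataset_from_directory(
        root, image_size=(256, 256), batch_size=batch_size, label_mode="categorical")

    def split(images, labels):
        images = tf.image.resize(images, (IMG, IMG)) / 255.0   # scale to [0, 1] (assumed)
        sky, ground = images[:, :HALF], images[:, HALF:]       # upper / lower halves
        return (sky, images, ground), labels

    return base.map(split)

train_ds = make_ds("weather_dataset_6/train")
test_ds = make_ds("weather_dataset_6/test")
```

With the model sketched after Table 1, training is then a single call such as model.fit(train_ds, validation_data=test_ds, epochs=...).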
The weather recognition model is first trained on a GPU remote server with the training images of the weather image data set "weather dataset-6"; the test images are then fed into the trained model for weather recognition, and the classification result (one of the six weather conditions: cloudy, foggy, rainy, sand and dust, snowy, sunny) is output. The recognition accuracy is computed from the classification results on the test images, and the confusion matrix of the weather recognition is drawn as shown in FIG. 3. The values on the diagonal of the confusion matrix give the recognition accuracy of each class: 86.18% for cloudy, 91.86% for foggy, 87.06% for rainy, 98.25% for sand and dust, 90.95% for snowy and 94.12% for sunny, with an average recognition accuracy of 91.34%.
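The per-class accuracies on the diagonal of a confusion matrix such as FIG. 3 follow directly from the test-set predictions; below is a minimal sketch using scikit-learn (assumed to be available), with integer class labels in the order listed above.

```python
# Minimal sketch: per-class accuracy from a confusion matrix (scikit-learn assumed).
from sklearn.metrics import confusion_matrix

classes = ["cloudy", "foggy", "rainy", "sand and dust", "snowy", "sunny"]

def per_class_accuracy(y_true, y_pred):
    cm = confusion_matrix(y_true, y_pred, labels=list(range(len(classes))))
    acc = cm.diagonal() / cm.sum(axis=1)          # row-normalised diagonal
    for name, a in zip(classes, acc):
        print(f"{name}: {a:.2%}")
    print(f"average: {acc.mean():.2%}")
    return cm, acc

# Example (replace with real test labels and model predictions):
# cm, acc = per_class_accuracy(test_labels, model.predict(test_ds).argmax(axis=1))
```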
To evaluate the 3C-CNN-based weather recognition method, the recognition performance of the 3C-CNN model on the medium-sized data set "weather dataset-6" was compared with other deep learning methods (AlexNet, VGG16 and ResNet50). FIG. 4 shows the training curves and FIG. 5 the average recognition accuracy of the methods: the proposed 3C-CNN model achieves the highest average accuracy of 91.34%, followed in order by ResNet50, AlexNet and VGG16. FIG. 6 shows the recognition speed, where FPS denotes the number of weather images recognized per second; the proposed 3C-CNN model is the fastest, reaching 163.3 FPS, followed in order by AlexNet, VGG16 and ResNet50. The test results show that the weather phenomenon recognition method based on the three-channel convolutional neural network 3C-CNN offers good recognition accuracy and speed and can be used for automatic-driving parameter decisions with high real-time requirements.
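Recognition speed in FPS (images recognized per second), as compared in FIG. 6, can be measured by timing batched inference; a minimal sketch with random inputs shaped like the three 3C-CNN channels (shapes and batch size assumed as above):

```python
# Minimal sketch: measuring recognition speed (images per second) of a Keras model
# with random inputs shaped like the three 3C-CNN channels (sizes assumed).
import time
import numpy as np

def measure_fps(model, n_images=1000, batch_size=32):
    sky = np.random.rand(n_images, 112, 224, 3).astype("float32")
    full = np.random.rand(n_images, 224, 224, 3).astype("float32")
    ground = np.random.rand(n_images, 112, 224, 3).astype("float32")
    model.predict([sky, full, ground], batch_size=batch_size, verbose=0)   # warm-up
    start = time.perf_counter()
    model.predict([sky, full, ground], batch_size=batch_size, verbose=0)
    return n_images / (time.perf_counter() - start)

# print(f"recognition speed: {measure_fps(model):.1f} images/s")
```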
(2) Working process of the vehicle-mounted real-time weather phenomenon recognition system for assisting automatic driving
In actual use, the vehicle-mounted camera first collects weather images while the vehicle is driving. The ARM processor receives the camera images and preprocesses them: each image is compressed and split evenly into an upper half, mainly containing the sky, and a lower half, mainly containing the ground. The ARM processor then transmits the segmented images and the original image to the remote server through the 4G/5G communication module. The remote server, which hosts the trained weather recognition model, feeds the received segmented images and the original image into the three channels of the 3C-CNN model for weather recognition, and returns the result to the ARM processor through the 4G/5G communication module. The ARM processor quantizes the recognition result, representing the six weather phenomena (sunny, cloudy, snowy, rainy, foggy, and sand and dust) with the digits 1-6, and transmits it to the vehicle's automatic driving servo system, which automatically sets the driving parameters according to the recognized weather.
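A minimal sketch of the final quantization and hand-off step, assuming the pyserial package and a hypothetical serial port name; the digits 1-6 follow the order in which the six weather phenomena are listed in the text.

```python
# Sketch of quantizing the recognition result to a digit 1-6 and sending it to the
# autopilot servo system over a serial port (pyserial assumed; port name hypothetical).
import serial  # pyserial

WEATHER_CODES = {"sunny": 1, "cloudy": 2, "snowy": 3, "rainy": 4, "foggy": 5, "sand and dust": 6}

def send_weather_code(weather: str, port: str = "/dev/ttyS0", baud: int = 115200) -> int:
    code = WEATHER_CODES[weather]
    with serial.Serial(port, baudrate=baud, timeout=1) as link:
        link.write(str(code).encode("ascii"))      # e.g. b"4" for rainy
    return code

# send_weather_code("rainy")   # the servo system then adjusts the driving parameters
```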
Detailed description of the invention
Based on the weather phenomenon recognition system, the invention also provides a weather phenomenon recognition method for assisting automatic driving of the vehicle, and the recognition method comprises the following steps:
and S1, shooting the weather image in real time by adopting the vehicle-mounted camera.
S2, after the weather image is preprocessed, the weather image is divided into two parts: a sky image containing sky region weather information and a ground image including ground region weather information.
And S3, extracting sky features from the sky image, extracting ground features from the ground image, extracting global features from the weather image, fusing the extracted sky features, ground features and global features, importing the fused sky features, ground features and global features into a Softmax classifier for classifying weather phenomena, and identifying weather phenomenon categories.
And S4, quantizing the identified weather phenomenon type, and transmitting the quantized result to the automatic driving servo system to assist automatic driving of the vehicle.
In this disclosure, aspects of the present invention are described with reference to the accompanying drawings, in which a number of illustrative embodiments are shown. Embodiments of the present disclosure are not necessarily defined to include all aspects of the invention. It should be appreciated that the various concepts and embodiments described above, as well as those described in greater detail below, may be implemented in any of numerous ways, as the disclosed concepts and embodiments are not limited to any one implementation. In addition, some aspects of the present disclosure may be used alone, or in any suitable combination with other aspects of the present disclosure.
Although the present invention has been described with reference to the preferred embodiments, it is not intended to be limited thereto. Those skilled in the art can make various changes and modifications without departing from the spirit and scope of the invention. Therefore, the protection scope of the present invention should be determined by the appended claims.

Claims (9)

1. A weather phenomenon recognition system for assisting automatic driving of a vehicle is characterized by comprising an ARM processor, and a vehicle-mounted camera, a remote server, a vehicle-mounted LCD display screen and a communication module which are respectively connected with the ARM processor; the ARM processor, the vehicle-mounted camera, the vehicle-mounted LCD display screen and the communication module are carried on a vehicle;
the vehicle-mounted camera shoots weather images of a scene where the vehicle is located in real time according to a control instruction of the ARM processor, and the shot images are transmitted to the ARM processor;
after the ARM processor preprocesses the weather image, the weather image is divided into two parts: the sky image containing sky area weather information and the ground image containing ground area weather information are transmitted to a remote server through a communication module;
a weather phenomenon recognition model is installed in the remote server and comprises a three-channel convolutional neural network, a feature fusion module, a fully connected layer and a Softmax classifier which are sequentially connected; the three-channel convolutional neural network comprises a first convolutional neural network branch for extracting sky features, a second convolutional neural network branch for extracting global features and a third convolutional neural network branch for extracting ground features; the feature fusion module is used for fusing sky features, ground features and global features extracted by the three-channel convolutional neural network; the fully connected layer and the Softmax classifier are used for identifying the weather phenomenon type from the fused feature information;
the remote server leads the received weather image, the sky image and the ground image into a weather phenomenon identification model, wherein the weather image is led into a second convolution neural network branch, the sky image is led into a first convolution neural network branch, the ground image is led into a third convolution neural network branch, the weather phenomenon type corresponding to the current weather image is identified and obtained, and the identification result is returned to the ARM processor through a communication module;
and the ARM processor displays the received identification result to a user through a vehicle-mounted LCD display screen.
2. The weather phenomenon recognition system for assisting automatic driving of a vehicle according to claim 1, wherein the recognized weather phenomenon type is quantized and then a result of the quantization is transmitted to an automatic driving servo system.
3. The weather phenomenon identification system for assisting vehicle autopilot according to claim 1, characterized in that the feature fusion module fuses three channel extracted sky features, ground features and global features using a Concat function.
4. The weather phenomenon recognition system for assisting automatic driving of a vehicle according to claim 1, wherein the communication module includes a 4G/5G communication device.
5. The weather phenomenon recognition system for assisting automatic driving of a vehicle according to claim 1, wherein the weather phenomenon types include six types of sunny days, cloudy days, snowy days, rainy days, foggy days, and sand and dust.
6. The weather phenomenon recognition system for assisting vehicle autopilot according to claim 1, characterized in that the first convolutional neural network branch and the third convolutional neural network branch employ a CNN5 convolutional neural network composed of 5 convolutional layers and 1 Max pooling layer, and the second convolutional neural network branch employs a Resnet15 residual network composed of 1 convolutional layer, 1 Max pooling layer, and 4 sets of residual modules.
7. The weather phenomenon recognition system for assisting automatic driving of a vehicle of claim 6, wherein the CNN5 convolutional neural network comprises, connected in sequence: an input layer [224 × 112] × 3, a convolutional layer [7 × 7]: 32/(2, 1), a Max pooling layer [7 × 7]/2, a convolutional layer [3 × 3]: 64/2, a convolutional layer [3 × 3]: 128/2, a convolutional layer [3 × 3]: 256/2 and a convolutional layer [3 × 3]: 512/2.
8. The weather phenomenon recognition system for assisting automatic driving of a vehicle according to claim 6, wherein the Resnet15 residual network comprises, connected in sequence: an input layer [224 × 224] × 3, a convolutional layer [3 × 3]: 32/2, a Max pooling layer [3 × 3]/2, a residual module [1 × 1]: 64/2, a residual module [1 × 1]: 128/2, a residual module [1 × 1]: 256/2 and a residual module [1 × 1]: 512/2.
9. A weather phenomenon recognition method for assisting automatic driving of a vehicle, characterized in that the recognition method comprises:
s1, shooting weather images in real time by adopting a vehicle-mounted camera;
s2, after the weather image is preprocessed, the weather image is divided into two parts: a sky image including sky region weather information and a ground image including ground region weather information;
s3, extracting sky features from the sky image, extracting ground features from the ground image, extracting global features from the weather image, fusing the extracted sky features, ground features and global features, importing the fused sky features, ground features and global features into a Softmax classifier for classifying weather phenomena, and identifying weather phenomenon categories;
and S4, quantizing the identified weather phenomenon type, and transmitting the quantized result to an automatic driving servo system to assist automatic driving of the vehicle.
CN202010446015.6A 2020-05-25 2020-05-25 Weather phenomenon recognition system and method for assisting automatic driving of vehicle Active CN111340151B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010446015.6A CN111340151B (en) 2020-05-25 2020-05-25 Weather phenomenon recognition system and method for assisting automatic driving of vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010446015.6A CN111340151B (en) 2020-05-25 2020-05-25 Weather phenomenon recognition system and method for assisting automatic driving of vehicle

Publications (2)

Publication Number Publication Date
CN111340151A true CN111340151A (en) 2020-06-26
CN111340151B CN111340151B (en) 2020-08-25

Family

ID=71183027

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010446015.6A Active CN111340151B (en) 2020-05-25 2020-05-25 Weather phenomenon recognition system and method for assisting automatic driving of vehicle

Country Status (1)

Country Link
CN (1) CN111340151B (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112854950A (en) * 2021-01-28 2021-05-28 东风汽车集团股份有限公司 Automobile window self-adaptive lifting method and system based on perception fusion
CN113392804A (en) * 2021-07-02 2021-09-14 昆明理工大学 Multi-angle-based traffic police target data set scene construction method and system
CN113642614A (en) * 2021-07-23 2021-11-12 西安理工大学 Basic weather type classification method based on deep network
CN114120025A (en) * 2021-09-29 2022-03-01 吉林大学 Deep learning-based weather identification and degree quantification method
CN114299726A (en) * 2021-12-31 2022-04-08 象谱信息产业有限公司 Highway severe weather identification method based on artificial intelligence
CN115690519A (en) * 2022-11-30 2023-02-03 北京中环高科环境治理有限公司 Black carbon remote measuring method, device and system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108549929A (en) * 2018-03-29 2018-09-18 河海大学 A kind of photovoltaic power prediction technique based on deep layer convolutional neural networks
CN109492668A (en) * 2018-10-10 2019-03-19 华中科技大学 MRI based on multichannel convolutive neural network not same period multi-mode image characterizing method
CN110766333A (en) * 2019-10-29 2020-02-07 北京依派伟业数码科技有限公司 Intelligent processing method and system for weather phenomenon information

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108549929A (en) * 2018-03-29 2018-09-18 河海大学 A kind of photovoltaic power prediction technique based on deep layer convolutional neural networks
CN109492668A (en) * 2018-10-10 2019-03-19 华中科技大学 MRI based on multichannel convolutive neural network not same period multi-mode image characterizing method
CN110766333A (en) * 2019-10-29 2020-02-07 北京依派伟业数码科技有限公司 Intelligent processing method and system for weather phenomenon information

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
许朴 (Xu Pu) et al.: "Design of a General-Purpose Boundary-Layer Meteorological Data Acquisition System" (边界层通用气象数据采集系统设计), Computer Knowledge and Technology (电脑知识与技术) *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112854950A (en) * 2021-01-28 2021-05-28 东风汽车集团股份有限公司 Automobile window self-adaptive lifting method and system based on perception fusion
CN113392804A (en) * 2021-07-02 2021-09-14 昆明理工大学 Multi-angle-based traffic police target data set scene construction method and system
CN113642614A (en) * 2021-07-23 2021-11-12 西安理工大学 Basic weather type classification method based on deep network
CN114120025A (en) * 2021-09-29 2022-03-01 吉林大学 Deep learning-based weather identification and degree quantification method
CN114299726A (en) * 2021-12-31 2022-04-08 象谱信息产业有限公司 Highway severe weather identification method based on artificial intelligence
CN115690519A (en) * 2022-11-30 2023-02-03 北京中环高科环境治理有限公司 Black carbon remote measuring method, device and system
CN115690519B (en) * 2022-11-30 2023-08-04 北京中环高科环境治理有限公司 Black carbon telemetry method, device and system

Also Published As

Publication number Publication date
CN111340151B (en) 2020-08-25

Similar Documents

Publication Publication Date Title
CN111340151B (en) Weather phenomenon recognition system and method for assisting automatic driving of vehicle
CN110660222B (en) Intelligent environment-friendly electronic snapshot system for black-smoke road vehicle
CN110717387B (en) Real-time vehicle detection method based on unmanned aerial vehicle platform
Xia et al. ResNet15: weather recognition on traffic road with deep convolutional neural network
CN113723377B (en) Traffic sign detection method based on LD-SSD network
CN111274942A (en) Traffic cone identification method and device based on cascade network
CN108416316B (en) Detection method and system for black smoke vehicle
CN113762209A (en) Multi-scale parallel feature fusion road sign detection method based on YOLO
CN113052106A (en) Airplane take-off and landing runway identification method based on PSPNet network
CN112784834A (en) Automatic license plate identification method in natural scene
CN111507196A (en) Vehicle type identification method based on machine vision and deep learning
WO2021026855A1 (en) Machine vision-based image processing method and device
CN112785610B (en) Lane line semantic segmentation method integrating low-level features
CN114550023A (en) Traffic target static information extraction device
CN113139615A (en) Unmanned environment target detection method based on embedded equipment
CN112509321A (en) Unmanned aerial vehicle-based driving control method and system for urban complex traffic situation and readable storage medium
CN111160282B (en) Traffic light detection method based on binary Yolov3 network
CN112700653A (en) Method, device and equipment for judging illegal lane change of vehicle and storage medium
CN115457420B (en) Vehicle weight detection method based on low contrast ratio at night when unmanned aerial vehicle shoots
CN116052090A (en) Image quality evaluation method, model training method, device, equipment and medium
CN114419018A (en) Image sampling method, system, device and medium
CN114565597A (en) Nighttime road pedestrian detection method based on YOLOv3-tiny-DB and transfer learning
CN113850112A (en) Road condition identification method and system based on twin neural network
CN113159153A (en) License plate recognition method based on convolutional neural network
CN113343817A (en) Unmanned vehicle path detection method and device for target area and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20201209

Address after: 8 Shuguang Road, Xuefu Town, Yandu District, Yancheng City, Jiangsu Province

Patentee after: YANCHENG DONGFANG TIANCHENG MACHINERY Co.,Ltd.

Address before: 210044 No. 219 Ning six road, Jiangbei new district, Nanjing, Jiangsu

Patentee before: NANJING University OF INFORMATION SCIENCE & TECHNOLOGY

TR01 Transfer of patent right