CN110837800A - Port severe weather-oriented target detection and identification method - Google Patents


Info

Publication number
CN110837800A
Authority
CN
China
Prior art keywords
vehicle
weather
bicycle
image
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911072396.XA
Other languages
Chinese (zh)
Inventor
张祖锋
殷嘉伦
刘凯
闵文芳
杨迪海
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Changjia Fashion (suzhou) Intelligent Technology Co Ltd
Original Assignee
Changjia Fashion (suzhou) Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Changjia Fashion (suzhou) Intelligent Technology Co Ltd filed Critical Changjia Fashion (suzhou) Intelligent Technology Co Ltd
Priority to CN201911072396.XA priority Critical patent/CN110837800A/en
Publication of CN110837800A publication Critical patent/CN110837800A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/56 - Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01W - METEOROLOGY
    • G01W1/00 - Meteorology
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/04 - Architecture, e.g. interconnection topology
    • G06N3/045 - Combinations of networks
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06N - COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 - Computing arrangements based on biological models
    • G06N3/02 - Neural networks
    • G06N3/08 - Learning methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V2201/00 - Indexing scheme relating to image or video recognition or understanding
    • G06V2201/07 - Target detection
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A - TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 - Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 - Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Abstract

The invention discloses a target detection and identification method for severe port weather, comprising the following steps: step 1, build a Mask R-CNN classifier model from pre-collected severe-weather traffic scene images using a convolutional neural network; step 2, perform target detection and identification with the established Mask R-CNN classifier model. In severe weather such as fog, the color image captured by the vehicle-mounted camera is first defogged, and deep learning is then used to identify moving and static targets against the image background previously obscured by fog, effectively improving the target recognition rate and accuracy.

Description

Port severe weather-oriented target detection and identification method
Technical Field
The invention belongs to the technical field of automatic driving, and particularly relates to a target detection and identification method for severe weather of a port.
Background
Technical problems in the field of autonomous driving have become a focus of attention for universities, enterprises, and research institutions at home and abroad. Among them, autonomous-driving environment perception under severe port weather is one of the difficult problems.
SAE International classifies autonomous driving into six levels, L0 to L5, and ports are a typical bounded scenario for L3 autonomous driving applications. At present, deep learning methods are mostly used to recognize moving and static targets in autonomous-driving environment perception; under good weather conditions, their recognition rate and accuracy can meet the requirements of autonomous driving in ports. In severe weather with poor visibility, such as rain, snow, or fog, however, the camera's effective visibility drops, so the recognition rate and accuracy decline, autonomous driving becomes impossible or performs poorly, and the requirements of autonomous driving in the port operating environment are difficult to meet.
Disclosure of Invention
The object of the invention is to provide a target detection and identification method for autonomous driving in severe port weather that is simple in structure, easy to operate, and achieves a high target recognition rate and accuracy in the severe-weather port operating environment.
The technical scheme of the invention is as follows:
a target detection and identification method for severe weather of a port comprises the following steps:
step 1, build a Mask R-CNN classifier model from pre-collected severe-weather traffic scene images using a convolutional neural network, where building the Mask R-CNN classifier model specifically comprises the following steps:
(1) a single vehicle travels on the road to be surveyed in the port while the vehicle-mounted camera collects image data;
(2) label the dynamic and static targets in the image data with a labeling tool to obtain image features;
(3) extract the image features with a ResNet network, train with the Mask R-CNN neural network, and tune the parameters to obtain the Mask R-CNN classifier model;
step 2, perform target detection and identification based on the established Mask R-CNN classifier model, specifically comprising the following steps:
(4) after the single vehicle receives a foggy-day signal, it travels on the road to be surveyed in the port while the vehicle-mounted camera collects original color foggy images;
(5) defog the collected original color foggy images to obtain defogged images;
(6) extract image features from the defogged images with the ResNet network, then detect and identify the dynamic and static targets with the trained Mask R-CNN classifier.
In the above technical solution, the defogging treatment in step (5) comprises the following steps:
(5-1) for the original color foggy image I captured by the vehicle-mounted camera, of M × N pixels, compute the minimum light intensity f(x) over the color channels of each pixel, assemble the per-pixel channel minima into an M × N matrix, and apply minimum-value filtering to this matrix to generate the dark-channel grayscale image, computed as:

f(x) = \min_{y \in \Phi(x)} \left( \min_{D \in \{r,g,b\}} I_D(y) \right)

where I_D denotes each color channel of the color image and Φ(x) is an n × n window centered on pixel x;
(5-2) take the brightest 0.1% of pixels in the dark-channel grayscale image; at those pixel positions, take the pixel value of the point with the highest light intensity in the original foggy image I as the atmospheric light value C, then compute the transmittance t(x): if t(x) is less than the threshold ξ, set t(x) = ξ; otherwise t(x) is given by:

t(x) = 1 - \lambda \min_{y \in \Phi(x)} \left( \min_{D} \frac{I_D(y)}{C_D} \right)

where λ is a constant and ξ is the transmittance threshold;
(5-3) recover the defogged clear image J(x), computed as:

J(x) = \frac{I(x) - C}{\max(t(x), \xi)} + C
in the above technical solution, the labeling tool is labelme.
The invention also aims to provide a single-vehicle environment perception system for severe port weather, comprising a vehicle-mounted camera, a laser radar, an inertial navigation device, and a processor, the processor being electrically connected to the vehicle-mounted camera, the laser radar, and the inertial navigation device;
the vehicle-mounted camera collects image data of the single vehicle's surroundings while it travels on the road to be surveyed;
the laser radar collects three-dimensional laser point-cloud data of the single vehicle's surroundings;
the inertial navigation device acquires the running attitude and speed of the single vehicle;
the processor receives and separately processes the image data, the three-dimensional laser point-cloud data, and the running attitude and speed; based on the Mask R-CNN classifier model of claim 1 built into the processor in advance, it extracts image features to detect road signs and static and dynamic targets, and it computes the single vehicle's position from its running attitude and speed to localize the vehicle.
In this technical scheme, the vehicle-mounted camera is mounted in front of the windshield of the single vehicle, the laser radar is mounted in the middle of the roof, the inertial navigation device is mounted on the roof, and the processor is mounted inside the vehicle.
In the above technical solution, the road signs include containers in the port, container trucks, pedestrians, lane lines, traffic signs, and signal lights.
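The localization step above, computing the single vehicle's position from the attitude and speed reported by the inertial navigation device, can be illustrated with a minimal dead-reckoning update. This is a sketch under assumed planar kinematics; the function name and signature are illustrative, not from the patent, and a real system would fuse this estimate with lidar and map constraints.

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, dt):
    """Advance a planar position estimate one time step from the
    heading (running attitude) and speed given by inertial navigation.
    Returns the new (x, y) in meters."""
    return (x + speed_mps * dt * math.cos(heading_rad),
            y + speed_mps * dt * math.sin(heading_rad))

# Example: 10 m/s due "east" for 1 s moves the vehicle 10 m along x.
print(dead_reckon(0.0, 0.0, 0.0, 10.0, 1.0))
```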
The invention also aims to provide an automatic driving system for severe port weather, comprising a general control system, a single-vehicle automatic driving system, and a data transmission system;
the general control system macroscopically regulates the states of vehicles, weather, and working conditions in the port and comprises a weather detection system, a working-condition system, and a vehicle scheduling system; the weather detection system monitors weather data in the port in real time and feeds it back to the vehicle scheduling system, the working-condition system sends vehicle scheduling instructions to the vehicle scheduling system according to the port's production requirements, and the vehicle scheduling system, after receiving the weather data and scheduling instructions from the weather detection and working-condition systems, sends control instructions to the single-vehicle automatic driving system on each vehicle;
the single-vehicle automatic driving system receives and executes the control instructions sent by the vehicle scheduling system and feeds the vehicle's state back to the general control system; it comprises a decision system, a control system, and the single-vehicle environment perception system of claim 4; the environment perception system collects current image, laser, and position data around the single vehicle, the decision system computes decision information from these data and transmits it to the control system, and the control system receives the decision information to control the operation of the vehicle;
the data transmission system handles data transmission between the general control system and the single-vehicle automatic driving systems.
In the above technical scheme, the weather data monitored by the weather detection system covers sunny, cloudy, heavy rain, moderate rain, light rain, heavy snow, moderate snow, light snow, and foggy conditions; the weather detection system can also measure the illumination conditions and camera visibility to determine whether a foggy-day signal should be issued.
In this technical scheme, the weather detection system judges foggy weather as follows: when air visibility is less than 2 kilometers, a foggy-day signal is issued; when air visibility is greater than 2 kilometers, a non-foggy-day signal is issued.
In the above technical solution, the production requirements include the type, number, and destination of the required vehicles.
The invention has the advantages and positive effects that:
1. In severe weather such as fog, the color image captured by the vehicle-mounted camera is defogged, and deep learning is then used to identify moving and static targets against the image background previously obscured by fog, effectively improving the target recognition rate and accuracy.
2. An automatic-driving general control system for severe port weather is established; it macroscopically monitors and regulates the vehicle, weather, and working-condition states of the whole port and issues different vehicle scheduling instructions according to the weather data, ensuring the safety of vehicle operations in the port.
Drawings
FIG. 1 is a flow chart of target detection and identification for the single-vehicle environment perception system of the invention;
FIG. 2 is a block diagram of the automatic driving system of the invention.
Detailed Description
The present invention will be described in further detail with reference to specific examples. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the scope of the invention in any way.
Example 1
As shown in FIG. 1, the single-vehicle environment perception system for ports comprises a vehicle-mounted camera, a laser radar, an inertial navigation device, and a processor; the processor is electrically connected to the vehicle-mounted camera, the laser radar, and the inertial navigation device, and all four are mounted on the single vehicle.
Further, the vehicle-mounted camera is mounted in front of the windshield of the single vehicle, the laser radar is mounted in the middle of the roof, the inertial navigation device is mounted on the roof, and the processor is mounted inside the vehicle.
The vehicle-mounted camera is a Point Grey monocular camera (model BFLY-PGE-23S6C-C, Sony IMX249 sensor, 1/1.2", 5.86 μm pixels, 30 fps) used to collect image data of the single vehicle's surroundings in the port;
the laser radar is a 16-beam lidar (model Velodyne VLP-16) used to collect three-dimensional laser point-cloud data of the single vehicle's surroundings;
the inertial navigation device (model BDStar NPOS220) acquires the running attitude and speed of the single vehicle;
the processor (an industrial PC) receives and processes the image data, the three-dimensional laser point-cloud data, and the running attitude and speed, detects road signs and static and dynamic targets, and localizes the single vehicle.
A target detection and identification method based on the above single-vehicle environment perception system specifically comprises the following steps:
step 1, build a Mask R-CNN classifier model from pre-collected severe-weather traffic scene images using a convolutional neural network, where building the model specifically comprises the following steps:
(1) a single vehicle travels on the road to be surveyed in the port while the vehicle-mounted camera collects image data;
(2) label the common dynamic and static targets in the image data (such as containers, trucks, workers, lane lines, traffic signs, and signal lights) with the labeling tool labelme to obtain image features;
(3) extract the image features with a ResNet network, train with the Mask R-CNN neural network, and tune the parameters to obtain the optimal Mask R-CNN classifier model.
Further, during training with the Mask R-CNN neural network, pooling layers are used to reduce the amount of computation, and the parameter settings are adjusted to obtain the optimal Mask R-CNN classifier model.
step 2, perform target detection and identification based on the established Mask R-CNN classifier model, specifically comprising the following steps:
(4) after the single vehicle receives a foggy-day signal and a speed-limit instruction, it travels on the road to be surveyed in the port at the specified speed while the vehicle-mounted camera collects original color foggy images;
(5) defog the collected original color foggy images to obtain defogged images;
(6) extract image features from the defogged images with the ResNet network, then detect and identify the dynamic and static targets with the trained Mask R-CNN classifier.
Example 2
Based on embodiment 1, the defogging treatment in step (5) specifically comprises the following steps:
(5-1) for the original color foggy image I captured by the vehicle-mounted camera, of M × N pixels, compute the minimum light intensity f(x) over the color channels (r, g, b) of each pixel, assemble the per-pixel channel minima into an M × N matrix (i.e., an image), and apply minimum-value filtering to this matrix to generate the dark-channel grayscale image, computed as:

f(x) = \min_{y \in \Phi(x)} \left( \min_{D \in \{r,g,b\}} I_D(y) \right)

where I_D denotes each color channel of the color image and Φ(x) is an n × n window centered on pixel x (n is chosen according to the actual situation);
(5-2) take the brightest 0.1% of pixels in the dark-channel grayscale image obtained in step (5-1); at those pixel positions, take the pixel value of the point with the highest light intensity in the original foggy image I as the atmospheric light value C, then compute the transmittance t(x): if t(x) is less than the threshold ξ, set t(x) = ξ; otherwise t(x) is given by:

t(x) = 1 - \lambda \min_{y \in \Phi(x)} \left( \min_{D} \frac{I_D(y)}{C_D} \right)

where λ = 0.95 and the threshold ξ is set according to the actual conditions.
(5-3) recover the defogged clear image J(x), computed as:

J(x) = \frac{I(x) - C}{\max(t(x), \xi)} + C
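Steps (5-1) to (5-3) amount to the dark-channel-prior dehazing procedure, which can be sketched in NumPy as below. The function name, window size, and default ξ are illustrative assumptions; a practical implementation would replace the explicit minimum-filter loops with a fast filter such as `scipy.ndimage.minimum_filter`.

```python
import numpy as np

def dehaze(I, window=15, lam=0.95, xi=0.1):
    """Dark-channel-prior defogging sketch of steps (5-1)-(5-3).

    I      : H x W x 3 float image in [0, 1]
    window : side n of the minimum-filter window Phi (scene-dependent)
    lam    : constant lambda (0.95 in the embodiment)
    xi     : lower threshold on the transmittance t(x)
    """
    H, W, _ = I.shape
    r = window // 2

    # (5-1) per-pixel channel minimum, then an n x n minimum filter over
    # an edge-padded copy -> dark-channel grayscale image.
    chan_min = I.min(axis=2)
    padded = np.pad(chan_min, r, mode="edge")
    dark = np.empty_like(chan_min)
    for y in range(H):
        for x in range(W):
            dark[y, x] = padded[y:y + window, x:x + window].min()

    # (5-2) atmospheric light C: among the brightest 0.1% of dark-channel
    # pixels, take the brightest pixel of the original image I.
    k = max(1, int(dark.size * 0.001))
    flat_idx = np.argsort(dark.ravel())[-k:]
    candidates = I.reshape(-1, 3)[flat_idx]
    C = candidates[candidates.sum(axis=1).argmax()]

    # transmittance t(x) = 1 - lam * min-filtered min_D(I_D / C_D),
    # clamped below at xi.
    norm_min = (I / C).min(axis=2)
    padded_n = np.pad(norm_min, r, mode="edge")
    t = np.empty_like(norm_min)
    for y in range(H):
        for x in range(W):
            t[y, x] = 1.0 - lam * padded_n[y:y + window, x:x + window].min()
    t = np.maximum(t, xi)

    # (5-3) recover J(x) = (I(x) - C) / max(t(x), xi) + C.
    J = (I - C) / t[..., None] + C
    return np.clip(J, 0.0, 1.0)
```

The clamp `max(t(x), ξ)` keeps the division stable in dense fog, where the estimated transmittance approaches zero.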
example 3
As shown in FIG. 2, the automatic driving system for severe port weather of the invention comprises a general control system, a single-vehicle automatic driving system, and a data transmission system;
the general control system macroscopically regulates the states of vehicles, weather, and working conditions in the port; it comprises a weather detection system, a working-condition system, and a vehicle scheduling system and is located in the port command room;
the weather detection system monitors weather data in the port in real time and feeds it back to the vehicle scheduling system over a wireless network;
the working-condition system (a working-condition platform built on the operating system in the port command room) sends vehicle scheduling instructions to the vehicle scheduling system over a wireless network according to the port's production requirements;
further, the production requirements include the type, number, and destination of the required vehicles.
After receiving the weather data and the vehicle scheduling instructions from the weather detection and working-condition systems, the vehicle scheduling system sends control instructions to the single-vehicle automatic driving system on each vehicle.
Further, the weather data monitored by the weather detection system (a weather detection platform built on the operating system in the port command room) covers sunny, cloudy, heavy rain, moderate rain, light rain, heavy snow, moderate snow, light snow, and foggy conditions; the weather detection system can also measure the illumination conditions and camera visibility to determine whether a foggy-day signal should be issued.
Further, the weather detection system judges foggy weather as follows: when air visibility is less than 2 kilometers, a foggy-day signal is issued; when air visibility is greater than 2 kilometers, a non-foggy-day signal is issued.
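The 2 km visibility rule can be stated directly in code. This is a sketch; the function name and boolean return convention are illustrative, not from the patent.

```python
def fog_signal(visibility_km: float) -> bool:
    """Weather-detection rule described above: air visibility below
    2 km yields a foggy-day signal; otherwise a non-foggy-day signal."""
    return visibility_km < 2.0

# Example: 1.5 km visibility triggers the foggy-day signal.
print(fog_signal(1.5))
```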
Further, the road signs include containers in the port, container trucks, pedestrians, lane lines, traffic signs, and signal lights.
The single-vehicle automatic driving system receives and executes the control instructions sent by the vehicle scheduling system and feeds the vehicle's state back to the general control system. It comprises an environment perception system, a decision system, and a control system: the environment perception system collects current image, laser, and position data around the single vehicle, the decision system computes decision information from these data and transmits it to the control system, and the control system receives the decision information to control the operation of the vehicle. A single-vehicle automatic driving system is installed on each vehicle.
The data transmission system handles data transmission between the general control system and the single-vehicle automatic driving systems over a wireless network.
Example 4
Based on embodiment 3, the automatic driving system for severe port weather operates as follows:
S1, the weather detection system and the working-condition system send the monitored weather data and a vehicle scheduling instruction (for example, where in the port a vehicle needs to be dispatched to transport goods) to the vehicle scheduling system;
S2, after receiving the weather data and the vehicle scheduling instruction, the vehicle scheduling system sends a control instruction over the wireless network to the idle vehicle closest to the production site;
S3, after the single-vehicle automatic driving system on the vehicle receives the control instruction over the wireless network, it feeds the vehicle's current state back to the general control system through its environment perception, decision, and control systems. If the control instruction contains a foggy-day signal and a speed-limit signal, the industrial PC in the single-vehicle environment perception system defogs the color images captured by the vehicle-mounted camera; if no foggy-day signal is received, the vehicle operates normally. The environment perception system collects current image, laser, and position data around the vehicle and detects and identifies targets in the port operating environment, and the control system controls the operation of the vehicle.
The invention has been described in an illustrative manner, and it is to be understood that any simple variations, modifications or other equivalent changes which can be made by one skilled in the art without departing from the spirit of the invention fall within the scope of the invention.

Claims (10)

1. A target detection and identification method for severe port weather, characterized by comprising the following steps:
step 1, build a Mask R-CNN classifier model from pre-collected severe-weather traffic scene images using a convolutional neural network, where building the Mask R-CNN classifier model specifically comprises the following steps:
(1) a single vehicle travels on the road to be surveyed in the port while the vehicle-mounted camera collects image data;
(2) label the dynamic and static targets in the image data with a labeling tool to obtain image features;
(3) extract the image features with a ResNet network, train with the Mask R-CNN neural network, and tune the parameters to obtain the Mask R-CNN classifier model;
step 2, perform target detection and identification based on the established Mask R-CNN classifier model, specifically comprising the following steps:
(4) after the single vehicle receives a foggy-day signal, it travels on the road to be surveyed in the port while the vehicle-mounted camera collects original color foggy images;
(5) defog the collected original color foggy images to obtain defogged images;
(6) extract image features from the defogged images with the ResNet network, then detect and identify the dynamic and static targets with the trained Mask R-CNN classifier.
2. The target detection and identification method of claim 1, wherein the defogging treatment in step (5) comprises the following steps:
(5-1) for the original color foggy image I captured by the vehicle-mounted camera, of M × N pixels, compute the minimum light intensity f(x) over the color channels of each pixel, assemble the per-pixel channel minima into an M × N matrix, and apply minimum-value filtering to this matrix to generate the dark-channel grayscale image, computed as:

f(x) = \min_{y \in \Phi(x)} \left( \min_{D \in \{r,g,b\}} I_D(y) \right)

where I_D denotes each color channel of the color image and Φ(x) is an n × n window centered on pixel x;
(5-2) take the brightest 0.1% of pixels in the dark-channel grayscale image; at those pixel positions, take the pixel value of the point with the highest light intensity in the original foggy image I as the atmospheric light value C, then compute the transmittance t(x): if t(x) is less than the threshold ξ, set t(x) = ξ; otherwise t(x) is given by:

t(x) = 1 - \lambda \min_{y \in \Phi(x)} \left( \min_{D} \frac{I_D(y)}{C_D} \right)

where λ is a constant and ξ is the transmittance threshold;
(5-3) recover the defogged clear image J(x), computed as:

J(x) = \frac{I(x) - C}{\max(t(x), \xi)} + C
3. the method of object detection and identification as claimed in claim 1, wherein: the marking tool is labelme.
4. A single-vehicle environment perception system for severe port weather, characterized in that: it comprises a vehicle-mounted camera, a laser radar, an inertial navigation device, and a processor, the processor being electrically connected to the vehicle-mounted camera, the laser radar, and the inertial navigation device;
the vehicle-mounted camera collects image data of the single vehicle's surroundings while it travels on the road to be surveyed;
the laser radar collects three-dimensional laser point-cloud data of the single vehicle's surroundings;
the inertial navigation device acquires the running attitude and speed of the single vehicle;
the processor receives and separately processes the image data, the three-dimensional laser point-cloud data, and the running attitude and speed; based on the Mask R-CNN classifier model of claim 1 built into the processor in advance, it extracts image features to detect road signs and static and dynamic targets, and computes the single vehicle's position from its running attitude and speed to localize the vehicle.
5. The single-vehicle environment perception system of claim 4, wherein: the vehicle-mounted camera is mounted in front of the windshield of the single vehicle, the laser radar is mounted in the middle of the roof, the inertial navigation device is mounted on the roof, and the processor is mounted inside the vehicle.
6. The single-vehicle environment perception system of claim 5, wherein: the road signs include containers in the port, container trucks, pedestrians, lane lines, traffic signs, and signal lights.
7. The utility model provides a towards automatic driving system of harbour bad weather which characterized in that: the system comprises a master control system, a single-vehicle automatic driving system and a data transmission system;
the general control system is used for macroscopically regulating and controlling the states of vehicles, weather and working conditions in a port, and comprises a weather detection system, a working condition system and a vehicle scheduling system, wherein the weather detection system is used for monitoring the weather data in the port in real time and feeding the weather data back to the vehicle scheduling system, the working condition system sends a vehicle scheduling instruction to the vehicle scheduling system according to the production requirement of the port, and the vehicle scheduling system sends a control instruction to a single-vehicle automatic driving system on a single vehicle after receiving the weather data and the vehicle scheduling instruction sent by the weather detection system and the working condition system;
the single-vehicle automatic driving system is used for receiving and completing a control instruction sent by the vehicle scheduling system and feeding back a single-vehicle state to the master control system, and comprises a decision system, a control system and the single-vehicle environment sensing system of claim 4; the bicycle environment sensing system is used for acquiring current image data, laser data and position data around a bicycle, the decision making system is used for calculating and analyzing according to the image data, the laser data and the position data to obtain decision making information and transmitting the decision making information to the control system, and the control system is used for receiving the decision making information to control the operation of the bicycle;
and the data transmission system is used for data transmission between the master control system and the single-vehicle automatic driving system.
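The control flow of claim 7 (weather feedback plus a production-side dispatch request combined into a control instruction for one vehicle) might be sketched as follows. All class, field and policy names here are illustrative assumptions, not terminology from the patent:

```python
from dataclasses import dataclass, field

# Hypothetical message types; names are illustrative, not from the claims.
@dataclass
class WeatherReport:
    condition: str        # e.g. "sunny", "heavy_rain", "fog"
    visibility_km: float

@dataclass
class DispatchInstruction:
    vehicle_id: str
    destination: str

@dataclass
class VehicleSchedulingSystem:
    """Receives weather data from the weather detection system and dispatch
    requests from the working condition system, then issues one control
    instruction to the automatic driving system on the target vehicle."""
    issued: list = field(default_factory=list)

    def dispatch(self, weather: WeatherReport,
                 instruction: DispatchInstruction) -> dict:
        command = {
            "vehicle_id": instruction.vehicle_id,
            "destination": instruction.destination,
            "weather": weather.condition,
            # Limiting speed below 2 km visibility is an illustrative
            # policy choice, not something the claims specify.
            "speed_limited": weather.visibility_km < 2.0,
        }
        self.issued.append(command)
        return command

scheduler = VehicleSchedulingSystem()
cmd = scheduler.dispatch(
    WeatherReport(condition="fog", visibility_km=1.5),
    DispatchInstruction(vehicle_id="truck-07", destination="berth-3"),
)
```

The sketch keeps the scheduling system as the single point where weather and production inputs meet, matching the claim's description of the master control system.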
8. The automatic driving system of claim 7, wherein: the weather data monitored by the weather detection system cover sunny, cloudy, heavy rain, moderate rain, light rain, heavy snow, light snow and foggy conditions; the weather detection system can also detect the illumination condition and the visibility at the camera, so as to analyze and judge whether a foggy-day signal should be issued.
9. The automatic driving system of claim 7, wherein: the weather detection system judges foggy weather as follows: when the air visibility is less than 2 kilometers, a foggy-day signal is output; when the air visibility is greater than 2 kilometers, a non-foggy-day signal is output.
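Claim 9's visibility threshold reduces to a one-line rule. The function name is an assumption; note also that the claim leaves the behavior at exactly 2 km unspecified, and this sketch treats that boundary as non-foggy:

```python
FOG_VISIBILITY_THRESHOLD_KM = 2.0  # threshold stated in claim 9

def fog_signal(visibility_km: float) -> bool:
    """Return True (foggy-day signal) when air visibility is below 2 km,
    False (non-foggy-day signal) otherwise. The claim does not specify
    the case of exactly 2 km; this sketch maps it to non-foggy."""
    return visibility_km < FOG_VISIBILITY_THRESHOLD_KM
```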
10. The automatic driving system of claim 7, wherein: the production requirements include the type, number and destination of the vehicles required.
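The production requirement of claim 10 is a small record with three fields; a minimal sketch, with field names chosen here for illustration only:

```python
from dataclasses import dataclass

@dataclass
class ProductionRequirement:
    """Per claim 10: the type, number and destination of the vehicles
    required by the port. Field names are illustrative assumptions."""
    vehicle_type: str
    vehicle_count: int
    destination: str

req = ProductionRequirement(
    vehicle_type="container_truck",
    vehicle_count=4,
    destination="yard-A",
)
```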
CN201911072396.XA 2019-11-05 2019-11-05 Port severe weather-oriented target detection and identification method Pending CN110837800A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911072396.XA CN110837800A (en) 2019-11-05 2019-11-05 Port severe weather-oriented target detection and identification method

Publications (1)

Publication Number Publication Date
CN110837800A true CN110837800A (en) 2020-02-25

Family

ID=69574594

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911072396.XA Pending CN110837800A (en) 2019-11-05 2019-11-05 Port severe weather-oriented target detection and identification method

Country Status (1)

Country Link
CN (1) CN110837800A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065133A (en) * 2013-01-21 2013-04-24 信帧电子技术(北京)有限公司 Method and device for detecting pedestrian in foggy weather
CN107380163A (en) * 2017-08-15 2017-11-24 上海电气自动化设计研究所有限公司 Automobile intelligent alarm forecasting system and its method based on magnetic navigation
CN107985188A (en) * 2016-10-27 2018-05-04 中国科学院沈阳自动化研究所 A kind of greasy weather road is prevented hitting intelligent guide method and system
CN109166314A (en) * 2018-09-29 2019-01-08 河北德冠隆电子科技有限公司 Road conditions awareness apparatus and bus or train route cooperative system based on omnidirectional tracking detection radar
CN110210354A (en) * 2019-05-23 2019-09-06 南京邮电大学 A kind of detection of haze weather traffic mark with know method for distinguishing
CN110263706A (en) * 2019-06-19 2019-09-20 南京邮电大学 A kind of haze weather Vehicular video Detection dynamic target and know method for distinguishing

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
TIAN Qing: "Pedestrian Detection Method Based on Dark Channel Dehazing and Deep Learning", Laser & Optoelectronics Progress *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111487975A (en) * 2020-04-30 2020-08-04 畅加风行(苏州)智能科技有限公司 Intelligent networking system-based automatic port truck formation method and system
CN111967332A (en) * 2020-07-20 2020-11-20 禾多科技(北京)有限公司 Visibility information generation method and device for automatic driving
CN111967332B (en) * 2020-07-20 2021-08-31 禾多科技(北京)有限公司 Visibility information generation method and device for automatic driving
CN111861923A (en) * 2020-07-21 2020-10-30 济南大学 Target identification method and system based on lightweight residual error network image defogging
CN112101316B (en) * 2020-11-17 2022-03-25 北京中科原动力科技有限公司 Target detection method and system
CN112101316A (en) * 2020-11-17 2020-12-18 北京中科原动力科技有限公司 Target detection method and system
CN112183788A (en) * 2020-11-30 2021-01-05 华南理工大学 Domain adaptive equipment operation detection system and method
WO2022111219A1 (en) * 2020-11-30 2022-06-02 华南理工大学 Domain adaptation device operation and maintenance system and method
CN112801225A (en) * 2021-04-01 2021-05-14 中国人民解放军国防科技大学 Automatic driving multi-sensor fusion sensing method and system under limit working condition
CN112801225B (en) * 2021-04-01 2021-06-18 中国人民解放军国防科技大学 Automatic driving multi-sensor fusion sensing method and system under limit working condition
CN113468963A (en) * 2021-05-31 2021-10-01 山东信通电子股份有限公司 Road raise dust identification method and equipment
CN113597168A (en) * 2021-08-02 2021-11-02 安徽信息工程学院 Imaging identification auxiliary structure under severe weather
CN113597168B (en) * 2021-08-02 2022-09-27 安徽信息工程学院 Imaging identification auxiliary structure under severe weather
WO2023184460A1 (en) * 2022-03-31 2023-10-05 华为技术有限公司 Out-of-focus detection method and related apparatus
CN117218517A (en) * 2023-11-08 2023-12-12 诺比侃人工智能科技(成都)股份有限公司 Outdoor moving object detection system in rainy and snowy weather
CN117218517B (en) * 2023-11-08 2024-01-26 诺比侃人工智能科技(成都)股份有限公司 Outdoor moving object detection system in rainy and snowy weather

Similar Documents

Publication Publication Date Title
CN110837800A (en) Port severe weather-oriented target detection and identification method
CN105512623B (en) Based on multisensor travelling in fog day vision enhancement and visibility early warning system and method
CN112700470B (en) Target detection and track extraction method based on traffic video stream
CN102724482B (en) Based on the intelligent vision sensing network moving target relay tracking system of GPS and GIS
CN105844257A (en) Early warning system based on machine vision driving-in-fog road denoter missing and early warning method
CN110738081B (en) Abnormal road condition detection method and device
CN111243274A (en) Road collision early warning system and method for non-internet traffic individuals
CN112419773A (en) Vehicle-road cooperative unmanned control system based on cloud control platform
CN110097783A (en) Vehicle early warning method and system
CN104157160A (en) Vehicle drive control method and device as well as vehicle
CN113851017A (en) Pedestrian and vehicle identification and early warning multifunctional system based on road side RSU
CN114387785A (en) Safety management and control method and system based on intelligent highway and storable medium
Jiang et al. Target detection algorithm based on MMW radar and camera fusion
US20230419688A1 (en) Ambiguous Lane Detection Event Miner
CN112810619A (en) Radar-based method for identifying front target vehicle of assistant driving system
US10930145B2 (en) Traffic system for predicting and providing traffic signal switching timing
CN112750170A (en) Fog feature identification method and device and related equipment
CN111882924A (en) Vehicle testing system, driving behavior judgment control method and accident early warning method
CN113111876A (en) Method and system for obtaining evidence of traffic violation
US11748664B1 (en) Systems for creating training data for determining vehicle following distance
CN110647863B (en) Visual signal acquisition and analysis system for intelligent driving
CN116486359A (en) All-weather-oriented intelligent vehicle environment sensing network self-adaptive selection method
CN116630891A (en) Traffic abnormal event detection system and method
CN107329471B (en) A kind of intelligent decision system of automatic driving vehicle
US11935309B2 (en) Determining traffic light labels and classification quality from infrastructure signals

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200225