CN108645854B - System and method for monitoring whole visibility of highway network in real time - Google Patents
- Publication number
- CN108645854B CN108645854B CN201810448104.7A CN201810448104A CN108645854B CN 108645854 B CN108645854 B CN 108645854B CN 201810448104 A CN201810448104 A CN 201810448104A CN 108645854 B CN108645854 B CN 108645854B
- Authority
- CN
- China
- Prior art keywords
- vehicle
- visibility
- speed
- arm processor
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/95—Radar or analogous systems specially adapted for specific applications for meteorological use
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02A—TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
- Y02A90/00—Technologies having an indirect contribution to adaptation to climate change
- Y02A90/10—Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation
Abstract
The invention belongs to the technical field of highway visibility monitoring and discloses a system and a method for monitoring the overall visibility of a highway network in real time. The system comprises a control center and a plurality of visibility-monitoring devices, each fixed to a corresponding vehicle. Each device comprises: a camera mounted at the center of the vehicle's front windshield, a radar sensor mounted at the center of the front bumper and facing the direction of travel, an ARM processor mounted in the engine compartment, a GPS module mounted beside the steering wheel, and a 4G communication module. The system features low cost, suitability for large-scale deployment, intelligence, automation, no need for driver operation, and high reliability.
Description
Technical Field
The invention belongs to the technical field of highway visibility monitoring, and particularly relates to a system and a method for monitoring the overall visibility of a highway network in real time.
Background
Highway visibility has a direct influence on traffic safety: countless traffic accidents are caused by low visibility, bringing heavy casualties and property losses. From the perspective of highway safety management, taking different measures according to different visibility conditions is an important way to ensure safe traffic. Whatever countermeasure is adopted, the essential prerequisite is to obtain visibility data; only then can measures be taken purposefully.
At present, the common approach is to use visibility sensors fixed at the roadside, which collect visibility data in real time based on optical principles and send it back to a control center. This works well, but visibility sensors are expensive and can be deployed at only a few points. China's highway network is extensive, so this approach yields visibility at only a few locations and cannot reflect conditions across the whole road network. Limited by cost, it cannot be extended to cover all road segments in the short term.
Disclosure of Invention
In view of the above problems, the present invention aims to provide a system and a method for monitoring the overall visibility of a highway network in real time, featuring low cost, suitability for large-scale deployment, intelligence, automation, no need for driver operation, and high reliability.
To achieve this purpose, the invention adopts the following technical solutions.
The first technical solution is as follows:
a system for monitoring the overall visibility of a highway network in real time comprises a control center and a plurality of visibility-monitoring devices, each fixed to a corresponding vehicle and comprising: a camera mounted at the center of the vehicle's front windshield, a radar sensor mounted at the center of the front bumper and facing the direction of travel, an ARM processor mounted in the engine compartment, a GPS module mounted beside the steering wheel, and a 4G communication module;
the signal output end of the camera is connected to the first signal input end of the ARM processor, the signal output end of the radar sensor to the second signal input end, and the signal output end of the GPS module to the third signal input end; the signal output end of the ARM processor is connected to the signal input end of the 4G communication module, whose signal output end is in communication connection with the control center.
The first technical solution of the invention has the following features and further improvements:
(1) In each visibility-monitoring device:
the camera collects the road image in front of the own vehicle in real time and sends it to the ARM processor;
the radar sensor collects, in real time, the relative distance and relative speed between the own vehicle and each preceding vehicle and sends them to the ARM processor;
the GPS module collects the own vehicle's longitude and latitude in real time and sends them to the ARM processor;
the ARM processor extracts the contour of each preceding vehicle from the road image sent by the camera and thereby determines the coordinates (X1, Y1) of each contour's center in the image;
the ARM processor further determines, from the relative distance and relative speed sent by the radar sensor, each preceding vehicle whose speed exceeds that of the own vehicle and acquires its radar coordinates (X2, Y2);
the ARM processor further pairs the coordinates (X1, Y1) of each contour center with the radar coordinates (X2, Y2) to identify the same preceding vehicle, faster than the own vehicle, captured simultaneously by the camera and the radar sensor;
the ARM processor further determines, as the visibility of the current road segment, the relative distance between that vehicle and the own vehicle reported by the radar sensor at the last sampling moment before the vehicle disappears from the road image sent by the camera;
the ARM processor also sends the longitude and latitude from the GPS module and the visibility of the current road segment to the 4G communication module;
and the 4G communication module sends the longitude and latitude of the own vehicle and the visibility of the current road segment to the control center.
(2) The control center compiles the visibility of each road segment of the highway network from the longitude/latitude and visibility data sent by the visibility-monitoring devices via their 4G communication modules, and displays the per-segment visibility to vehicles on the network through an electronic map.
(3) The pairing of the coordinates (X1, Y1) of each contour center with the radar coordinates (X2, Y2) of each faster preceding vehicle, to identify the same preceding vehicle captured simultaneously by the camera and the radar sensor, specifically comprises:
the camera and the radar sensor are fixedly mounted on the vehicle, so a fixed functional relation exists between the coordinate system of the road image collected by the camera and the coordinate system of the preceding vehicles detected by the radar sensor;
when the ARM processor finds a coordinate (X1, Y1) among the contour centers and a coordinate (X2, Y2) among the radar targets that satisfy this fixed functional relation, the two coordinates are successfully paired, identifying the same preceding vehicle, faster than the own vehicle, captured simultaneously by the camera and the radar sensor.
The second technical solution is as follows:
a method for monitoring the overall visibility of a highway network in real time, said method being applied to a system as defined in the first technical solution, said system comprising a control center and a plurality of devices for monitoring visibility, each device for monitoring visibility being fixed to a corresponding vehicle, said method comprising:
for each device that monitors visibility, wherein,
the method comprises the following steps that a camera collects road images in front of a self-vehicle in real time and sends the road images to an ARM processor;
the radar sensor collects the relative distance and the relative speed between the self-vehicle and each front vehicle in real time and sends the relative distance and the relative speed between the self-vehicle and each front vehicle to the ARM processor;
the GPS module acquires longitude and latitude information of the self-vehicle in real time and sends the longitude and latitude information of the self-vehicle to the ARM processor;
the ARM processor acquires the contour of each front vehicle in the road image according to the road image sent by the camera, so as to determine the coordinates (X1, Y1) of the contour center of each front vehicle in the road image;
the ARM processor determines each front vehicle with the speed greater than the speed of the vehicle according to the relative distance and the relative speed between the vehicle and each front vehicle sent by the radar sensor, and acquires the radar coordinates (X2, Y2) of each front vehicle with the speed greater than the speed of the vehicle;
the ARM processor is used for pairing coordinates (X1, Y1) of the contour center of each front vehicle in a road image with radar coordinates (X2, Y2) of each front vehicle with the speed higher than the speed of the vehicle to obtain the same front vehicle which is simultaneously acquired by the camera and the radar sensor and has the speed higher than the speed of the vehicle;
the ARM processor determines the relative distance between the same front vehicle and the own vehicle sent by the radar sensor at the last sampling moment as the visibility of the current road according to the last sampling moment before the same front vehicle disappears in the road image sent by the camera;
the ARM processor sends the longitude and latitude information of the vehicle and the visibility of the current road section sent by the GPS module to the 4G communication module;
and the 4G communication module sends the longitude and latitude information of the vehicle and the visibility of the current road section sent by the ARM processor to the control center.
The second technical solution of the invention has the following features and further improvements:
(1) The pairing of the coordinates (X1, Y1) of each contour center with the radar coordinates (X2, Y2) of each faster preceding vehicle, to identify the same preceding vehicle captured simultaneously by the camera and the radar sensor, specifically comprises:
the camera and the radar sensor are fixedly mounted on the vehicle, so a fixed functional relation exists between the coordinate system of the road image collected by the camera and the coordinate system of the preceding vehicles detected by the radar sensor;
when the ARM processor finds a coordinate (X1, Y1) among the contour centers and a coordinate (X2, Y2) among the radar targets that satisfy this fixed functional relation, the two coordinates are successfully paired, identifying the same preceding vehicle, faster than the own vehicle, captured simultaneously by the camera and the radar sensor.
(2) The control center compiles the visibility of each road segment of the highway network from the longitude/latitude and visibility data sent by the visibility-monitoring devices via their 4G communication modules, and displays the per-segment visibility to vehicles on the network through an electronic map.
The sensors and processors involved in the technical solutions of the invention are already widely used in current automobiles, so popularizing the invention in practice requires no additional hardware purchases. Wide-area visibility monitoring of the highway network can be achieved simply by running the method of the invention on an on-board processor; the process is fully automatic and requires no action from the driver. The invention can quickly and accurately monitor the visibility of the highway network nationwide and has broad application prospects.
Drawings
To more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present invention; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a schematic circuit structure diagram of a device for monitoring visibility in a system for monitoring overall visibility of a highway network in real time according to an embodiment of the present invention;
fig. 2 is a schematic installation diagram of a device for monitoring visibility in a system for monitoring overall visibility of a highway network in real time according to an embodiment of the present invention;
fig. 3 is a schematic flow chart of a method for monitoring the overall visibility of a highway network in real time according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art without creative effort shall fall within the protection scope of the present invention.
The embodiment of the invention provides a system for monitoring the overall visibility of a highway network in real time, comprising a control center and a plurality of visibility-monitoring devices, each fixed to a corresponding vehicle. Fig. 1 is a schematic circuit diagram of such a device. Each device comprises an ARM9 processor 3 mounted in a free space beneath the engine hood of the own vehicle 6; the ARM9 processor 3 exchanges signals with the outside through wires. The specific model of the ARM9 processor 3 is S3C2410. In the embodiments of the invention, a preceding vehicle means any other vehicle located in front of the own vehicle 6.
A camera 2 is fixed at the center of the front windshield of the vehicle. Referring to fig. 2, the camera 2 is glued below the center of the front windshield with its lens facing horizontally forward, and is used to collect road images ahead of the vehicle. The camera 2 is a Zhongxing YJS-01 USB 2.0 camera with 6 million effective pixels. With reference to fig. 1, the camera 2 is connected to the USB interface of the ARM9 processor 3 through a USB data cable and sends the collected road images to the ARM9 processor 3.
In the embodiment of the invention, a radar sensor 1 is further fixed on the outside of the vehicle's air-intake grille (fastened with fine-thread bolts). The radar sensor 1 transmits signals ahead of the vehicle to detect the relative distance, relative angle and relative speed between each preceding vehicle and the own vehicle. With reference to fig. 1, the signal output end of the radar sensor 1 is electrically connected to a signal input end of the ARM9 processor 3. After collecting the relative distance, relative angle and relative speed of a preceding vehicle, the radar sensor sends them to the ARM9 processor. In this embodiment, the radar sensor is an ESR radar sensor.
With reference to fig. 1, a GPS module 4 is further provided to collect the position of the own vehicle 6, specifically its longitude and latitude. The signal output end of the GPS module 4 is electrically connected to an I/O interface of the ARM9 processor 3.
In the embodiment of the invention, a 4G communication module 5 is also provided. The 4G communication module 5 is electrically connected to the ARM9 processor 3 and sends data packets to the control center through the mobile communication network.
Since the invention belongs to the field of vehicle safety, good real-time performance is desirable. The working frequency of the system is set to 10 Hz, which basically meets the real-time requirements of an active safety system.
With reference to fig. 3, the following describes in detail the working process of the method for monitoring the overall visibility of the highway network in real time according to the present invention:
s1: after the system is started, the camera acquires road images in front of the vehicle in real time, the radar sensor acquires the relative distance, the relative angle and the relative speed between the vehicle and the vehicle in front in real time, and the GPS module acquires the longitude and latitude information of the vehicle in real time. The ARM9 processor is used for receiving road images from the camera, relative distance, relative angle and relative speed from the radar sensor and longitude and latitude information collected by the GPS module in real time.
S2: the ARM9 processor carries out edge contour analysis on the real-time road image and extracts a contour image of a front vehicle. For each vehicle contour, the coordinates of the center of the contour in the image are calculated (X1, Y1).
S3: the ARM9 processor analyzes the radar data, and according to the positive and negative of the relative speed, the front vehicle data of each front vehicle are eliminated, and only the front vehicle data with the speed higher than that of the front vehicle are reserved.
S4: the radar target and the image data are paired. For a vehicle with a faster forward speed than the own vehicle, the radar returns the coordinate position of the vehicle (X2, Y2). Because the radar sensor and the camera simultaneously collect the front, the radar data (X2, Y2) and the camera data (X1, Y1) of the front vehicle can be simultaneously acquired as long as the front vehicle is within the monitoring range of the radar and the camera. After the radar sensor and the camera are fixed, a certain corresponding relation exists between (X1, Y1) and (X2, Y2) and is expressed by f (X, Y) function.
For each acquisition of 1 group (X1, Y1) and 1 group (X2, Y2), if the 2 groups of data belong to the same preceding vehicle and are acquired by the radar sensor and the camera at the same time, the function f (X, Y) should be satisfied between (X1, Y1) and (X2, Y2). Therefore, if the function f (X, Y) is satisfied between (X1, Y1) and (X2, Y2), it indicates that the vehicle in front has been tracked and acquired by the radar sensor and the camera simultaneously, and the data pairing of the radar sensor and the camera is successful. If the function f (X, Y) is not satisfied between (X1, Y1) and (X2, Y2), the pairing fails, and the data of the preceding vehicle is discarded.
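As a concrete illustration of the pairing test in S4, the sketch below assumes that the fixed relation f maps radar coordinates into the image plane and that a small pixel tolerance decides whether f(X, Y) is "satisfied". The function names, the tolerance value, and the affine form of f used in the test are assumptions, not taken from the patent:

```python
def satisfies_f(image_pt, radar_pt, f, tol=10.0):
    """True if the radar coordinates (X2, Y2), mapped through the fixed
    camera-radar relation f, land near the image contour center (X1, Y1)."""
    px, py = f(*radar_pt)
    return abs(px - image_pt[0]) <= tol and abs(py - image_pt[1]) <= tol

def pair_targets(image_centers, radar_targets, f, tol=10.0):
    """Pair image contour centers with radar targets; detections with no
    counterpart are discarded, as step S4 prescribes for failed pairings."""
    pairs = []
    for img in image_centers:
        for rad in radar_targets:
            if satisfies_f(img, rad, f, tol):
                pairs.append((img, rad))
                break
    return pairs
```

In practice f would be obtained by calibrating the fixed mounting of the camera and radar once, then reused for every frame.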
S5: for a preceding vehicle object successfully paired, since the speed of the object is higher than that of the own vehicle, the object gradually moves away from the own vehicle and gradually disappears in the image. In the case of low visibility, when the target disappears in the image, the radar sensor can still detect the data of the target and continue to return to the relative speed, the relative angle and the relative distance between the target and the own vehicle.
Therefore, for the target which is successfully paired, the pairing will continue to be successful in a subsequent period of time, and only radar data is left until image data disappears for a certain time. At this time, the last pairing data before disappearance is taken out, and the distance d between the vehicle and the target collected by the radar sensor at the sampling time is obtained. d is the visibility data of the road section.
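The rule in S5, namely taking the radar distance from the last sample at which the target was still visible in the image, can be sketched as follows (the tuple-based sample representation is an assumption):

```python
def visibility_from_track(samples):
    """samples: chronological (visible_in_image, radar_distance_m) pairs
    for one paired target. The visibility d is the radar distance at the
    last sampling instant before the target vanished from the image."""
    d = None
    for visible, distance in samples:
        if visible:
            d = distance
    return d
```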
In this process, the method considers only vehicles within 200 meters. Beyond 200 meters, image processing becomes unreliable and the system does not process the target. Moreover, when visibility exceeds 200 meters, driving conditions are relatively good and the control center need not take strict management measures.
S6: and packaging and sending the visibility data and the latitude and longitude data. After the visibility data d is acquired, the microprocessor packs the d value and longitude and latitude data acquired by the GPS and sends the d value and the longitude and latitude data to the control center in a remote and real-time mode by using the 4G communication module.
S7: with the scheme, each vehicle running on the expressway can be a self vehicle, and the visibility data can be collected on any road section, so that when numerous self vehicles send the visibility data to the control center on different road sections, the control center can acquire countless visibility data. Meanwhile, by utilizing longitude and latitude data sent together with the visibility data, the control center can quickly and accurately analyze the visibility characteristics of the whole highway network and prompt drivers or correspondingly manage the highway by other means.
Those of ordinary skill in the art will understand that all or part of the steps of the method embodiments may be implemented by hardware executing program instructions; the program may be stored in a computer-readable storage medium and, when executed, performs the steps of the method embodiments. The storage medium includes any medium capable of storing program code, such as ROM, RAM, magnetic disks or optical disks.
The above description covers only specific embodiments of the present invention, but the protection scope of the invention is not limited to them. Any change or substitution that a person skilled in the art can readily conceive within the technical scope disclosed herein shall be covered by the invention. The protection scope of the present invention shall therefore be subject to the appended claims.
Claims (2)
1. A system for monitoring the whole visibility of a highway network in real time is characterized by comprising a control center and a plurality of devices for monitoring the visibility, wherein each device for monitoring the visibility is fixed on a corresponding vehicle, and each device for monitoring the visibility comprises: the system comprises a camera arranged in the center of a front windshield of the vehicle, a radar sensor arranged in the center of a front bumper of the vehicle and facing the advancing direction of the vehicle, an ARM processor arranged in an engine compartment of the vehicle, a GPS module arranged on the side surface of a steering wheel of the vehicle and a 4G communication module;
the signal output end of the camera is connected with the first signal input end of the ARM processor, the signal output end of the radar sensor is connected with the second signal input end of the ARM processor, the signal output end of the GPS module is connected with the third signal input end of the ARM processor, the signal output end of the ARM processor is connected with the signal input end of the 4G communication module, and the signal output end of the 4G communication module is in communication connection with the control center;
in each of the devices for monitoring visibility:
the camera is used for acquiring a road image in front of the vehicle in real time and sending the road image to the ARM processor;
the radar sensor is used for acquiring the relative distance and the relative speed between the vehicle and each vehicle in front in real time and sending the relative distance and the relative speed between the vehicle and each vehicle in front to the ARM processor;
the GPS module is used for acquiring longitude and latitude information of the self-vehicle in real time and sending the longitude and latitude information of the self-vehicle to the ARM processor;
the ARM processor is used for acquiring the contour of each front vehicle in the road image according to the road image sent by the camera, so as to determine the coordinates (X1, Y1) of the contour center of each front vehicle in the road image;
the ARM processor is further used for determining each front vehicle with the speed greater than the speed of the vehicle according to the relative distance and the relative speed between the vehicle and each front vehicle, which are sent by the radar sensor, and acquiring the radar coordinates (X2, Y2) of each front vehicle with the speed greater than the speed of the vehicle;
the ARM processor is further configured to pair the coordinates (X1, Y1) of the contour center of each preceding vehicle in the road image with the radar coordinates (X2, Y2) of each preceding vehicle whose speed is greater than that of the host vehicle, thereby identifying the same preceding vehicle, faster than the host vehicle, that is captured simultaneously by the camera and the radar sensor;
the ARM processor is further configured to determine the last sampling instant before that same preceding vehicle disappears from the road images sent by the camera, and to take the relative distance between it and the host vehicle reported by the radar sensor at that instant as the visibility of the current road section;
the ARM processor is further configured to send the host vehicle's longitude and latitude, received from the GPS module, together with the visibility of the current road section to the 4G communication module;
the 4G communication module is configured to forward the host vehicle's longitude and latitude and the visibility of the current road section, as sent by the ARM processor, to the control center;
the control center is configured to aggregate the visibility of each road section of the highway network from the longitude/latitude and visibility reports sent by the visibility-monitoring devices through their 4G communication modules, and to display the visibility of each road section to the vehicles on the highway network via an electronic map;
the pairing, by the ARM processor, of the coordinates (X1, Y1) of the contour center of each preceding vehicle in the road image with the radar coordinates (X2, Y2) of each preceding vehicle faster than the host vehicle, to identify the same preceding vehicle captured simultaneously by the camera and the radar sensor, specifically comprises:
the camera and the radar sensor are fixedly mounted on the host vehicle, so that a fixed functional relation exists between the coordinate system of the road image captured by the camera and the coordinate system in which the radar sensor locates preceding vehicles;
when coordinates (X1, Y1) of a contour center in the road image and radar coordinates (X2, Y2) of a preceding vehicle faster than the host vehicle satisfy this fixed functional relation, the ARM processor pairs the two successfully, thereby identifying the same preceding vehicle, faster than the host vehicle, captured simultaneously by the camera and the radar sensor.
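The coordinate-pairing step above can be sketched as follows. The patent does not disclose the "fixed functional relation" between the radar and image coordinate systems; the affine map, its coefficients, and the pixel tolerance below are purely illustrative assumptions standing in for a real camera-radar calibration:

```python
# Sketch of pairing camera contour centers with radar targets.
# The affine map and all numeric values are hypothetical placeholders
# for the fixed mounting calibration described in the claims.

def project_radar_to_image(x2, y2, affine=(12.0, 0.0, 320.0, 0.0, -9.0, 400.0)):
    """Map a radar coordinate (X2, Y2) into image pixel coordinates."""
    a, b, tx, c, d, ty = affine
    return (a * x2 + b * y2 + tx, c * x2 + d * y2 + ty)

def pair_targets(contour_centers, radar_targets, tol=20.0):
    """Match each faster-than-host radar target to the nearest contour center.

    contour_centers: list of (X1, Y1) pixel coordinates from the camera image
    radar_targets:   list of (target_id, (X2, Y2)) radar coordinates
    Returns a list of (target_id, contour_index) pairs whose projected radar
    position lies within `tol` pixels of a contour center.
    """
    pairs = []
    for tid, (x2, y2) in radar_targets:
        px, py = project_radar_to_image(x2, y2)
        best, best_d = None, tol
        for i, (x1, y1) in enumerate(contour_centers):
            d = ((x1 - px) ** 2 + (y1 - py) ** 2) ** 0.5
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            pairs.append((tid, best))
    return pairs
```

A nearest-neighbour match with a tolerance is used here instead of an exact equality test, since real contour centers and projected radar positions never coincide exactly.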
2. A method for real-time monitoring of the overall visibility of a highway network, applied to the system of claim 1, the system comprising a control center and a plurality of visibility-monitoring devices, each fixed to a corresponding vehicle, the method comprising:
for each visibility-monitoring device:
the camera captures road images ahead of the host vehicle in real time and sends them to the ARM processor;
the radar sensor measures the relative distance and relative speed between the host vehicle and each preceding vehicle in real time and sends them to the ARM processor;
the GPS module acquires the host vehicle's longitude and latitude in real time and sends them to the ARM processor;
the ARM processor extracts the contour of each preceding vehicle from the road image sent by the camera and thereby determines the coordinates (X1, Y1) of each contour center;
the ARM processor determines, from the relative distance and relative speed sent by the radar sensor, each preceding vehicle whose speed is greater than that of the host vehicle, and acquires its radar coordinates (X2, Y2);
the ARM processor pairs the coordinates (X1, Y1) of each contour center with the radar coordinates (X2, Y2) of each preceding vehicle faster than the host vehicle, thereby identifying the same preceding vehicle, faster than the host vehicle, captured simultaneously by the camera and the radar sensor;
the ARM processor determines the last sampling instant before that same preceding vehicle disappears from the road images sent by the camera, and takes the relative distance between it and the host vehicle reported by the radar sensor at that instant as the visibility of the current road section;
the ARM processor sends the host vehicle's longitude and latitude, received from the GPS module, together with the visibility of the current road section to the 4G communication module;
and the 4G communication module sends the host vehicle's longitude and latitude and the visibility of the current road section to the control center.
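The visibility-determination step of the method, taking the radar range to a faster lead vehicle at the last sampling instant before it fades from the camera image, can be sketched as follows; the per-sample record format and the numbers in the test are illustrative assumptions only:

```python
# Sketch of the visibility estimate: as a faster lead vehicle pulls away,
# the radar range at the last instant it is still visible in the camera
# image approximates the meteorological visibility of the road section.

def estimate_visibility(samples):
    """samples: time-ordered list of (in_image, radar_range_m) tuples for one
    camera/radar-paired lead vehicle that is pulling away from the host.

    Returns the radar range (m) at the last sampling instant before the
    vehicle disappeared from the image, or None if it never disappears
    within the record (visibility exceeds the observed ranges).
    """
    last_visible_range = None
    for in_image, radar_range in samples:
        if in_image:
            last_visible_range = radar_range   # still visible: remember range
        elif last_visible_range is not None:
            return last_visible_range          # just vanished: range = visibility
    return None
```

Returning None when the vehicle never vanishes reflects that such a record gives only a lower bound on visibility, not an estimate.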
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810448104.7A CN108645854B (en) | 2018-05-11 | 2018-05-11 | System and method for monitoring whole visibility of highway network in real time |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108645854A CN108645854A (en) | 2018-10-12 |
CN108645854B true CN108645854B (en) | 2020-11-27 |
Family
ID=63754536
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810448104.7A Active CN108645854B (en) | 2018-05-11 | 2018-05-11 | System and method for monitoring whole visibility of highway network in real time |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108645854B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109709541A (en) * | 2018-12-26 | 2019-05-03 | 杭州奥腾电子股份有限公司 | Method for handling falsely detected targets in a vehicle environment-perception fusion system |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101382497A (en) * | 2008-10-06 | 2009-03-11 | 南京大学 | Visibility detecting method based on monitoring video of traffic condition |
CN102124370A (en) * | 2008-08-12 | 2011-07-13 | 大陆汽车有限公司 | Method for detecting expansive static object |
CN103134800A (en) * | 2013-02-07 | 2013-06-05 | 安徽皖通科技股份有限公司 | Road weather detection system based on video |
JP2014109945A (en) * | 2012-12-03 | 2014-06-12 | Fuji Heavy Ind Ltd | Vehicle driving support control device |
CN104290753A (en) * | 2014-09-29 | 2015-01-21 | 长安大学 | Device for tracking and predicting the motion state of a preceding vehicle, and prediction method thereof |
CN105527251A (en) * | 2014-10-22 | 2016-04-27 | 姜海梅 | Highway visibility monitoring and early warning system based on cloud computing platform |
CN106225789A (en) * | 2016-07-12 | 2016-12-14 | 武汉理工大学 | Onboard navigation system with high safety and guidance method thereof |
US9739881B1 (en) * | 2016-03-24 | 2017-08-22 | RFNAV, Inc. | Low cost 3D radar imaging and 3D association method from low count linear arrays for all weather autonomous vehicle navigation |
CN107161097A (en) * | 2017-04-06 | 2017-09-15 | 南京航空航天大学 | Vehicle running intelligent security system based on triones navigation system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4608631B2 (en) * | 2005-06-08 | 2011-01-12 | 国立大学法人名古屋大学 | Image processing device for vehicle, driving support device |
2018-05-11: CN application CN201810448104.7A, patent CN108645854B/en, status Active
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN202134106U (en) | Intelligent vehicle-mounted road condition collector | |
CN202018743U (en) | Express way safety distance early warning system based on GPS (global positioning system) and 3G wireless communication | |
CN108663368B (en) | System and method for monitoring whole night visibility of highway network in real time | |
CN112466141A (en) | Vehicle-road-collaboration-oriented intelligent network connection end equipment interaction method, system and storage medium | |
CN104239889A (en) | Vehicle passenger number monitor, vehicle passenger number monitoring method, and computer-readable recording medium | |
CN109544725B (en) | Event-driven-based automatic driving accident intelligent processing method | |
CN113724531B (en) | Intersection human-vehicle road cooperation early warning system and method under Internet of vehicles environment | |
CN203217630U (en) | Safe driving management device of big dipper/GPS (global position system) dual-mode vehicle | |
CN110083099B (en) | Automatic driving architecture system meeting automobile function safety standard and working method | |
CN202018744U (en) | Vehicle safety driving assisting system based on monocular distance measurement | |
CN103578242A (en) | Private car driving monitoring and alarm system and method thereof | |
CN112441019A (en) | Intelligent networking automobile running monitoring system and method for vehicle-road cooperation | |
CN108645854B (en) | System and method for monitoring whole visibility of highway network in real time | |
CN210081719U (en) | Road inspection robot | |
CN111284401A (en) | Multi-sensor vehicle right turning blind area detection early warning system | |
CN206672365U (en) | Road traffic management system applied to smart city | |
CN203415100U (en) | Flood prevention early warning device and system | |
CN108648479B (en) | Device and method for reminding night foggy road section in real time by using electronic map | |
CN108749708B (en) | Device and method for automatically turning on fog lamp of vehicle under low visibility | |
CN108896062B (en) | Device and method for reminding foggy road section in real time by using electronic map | |
CN112572427A (en) | Vehicle information processing method and device and vehicle | |
CN108638957B (en) | Device and method for automatically turning on fog lamp of vehicle under low visibility at night | |
CN114655122A (en) | Intelligent vehicle obstacle avoidance and emergency braking system based on laser radar | |
CN108023964A (en) | A kind of new-energy automobile car networking system | |
JP7115872B2 (en) | Drive recorder and image recording method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||